From cb80e6922a3c56e3f1b4050296b31b8a635492c2 Mon Sep 17 00:00:00 2001 From: Natik Gadzhi Date: Tue, 7 May 2024 08:19:33 -0700 Subject: [PATCH] [tools] prettier rules for .md + formatting cleanup --- .prettierignore | 2 + .prettierrc | 3 +- CODE_OF_CONDUCT.md | 1 + CONTRIBUTING.md | 1 + CONTRIBUTORS.md | 838 +++++++++--------- README.md | 11 +- airbyte-cdk/java/airbyte-cdk/README.md | 2 +- .../cdk/integrations/base/ssh/readme.md | 24 +- airbyte-cdk/python/CHANGELOG.md | 359 +++++++- .../destinations/vector_db_based/README.md | 24 +- .../airbyte_cdk/sources/file_based/README.md | 87 +- .../sources/streams/concurrent/README.md | 10 +- airbyte-cdk/python/sphinx-docs.md | 48 +- airbyte-ci/connectors/base_images/README.md | 54 +- .../connectors/ci_credentials/README.md | 25 +- airbyte-ci/connectors/common_utils/README.md | 1 + .../test_migration_files/extra-header.md | 2 +- .../test_migration_files/missing-entry.md | 2 +- .../test_migration_files/out-of-order.md | 2 +- airbyte-ci/connectors/connectors_qa/README.md | 3 + .../src/connectors_qa/checks/documentation.py | 2 +- .../connectors_qa/templates/qa_checks.md.j2 | 10 +- airbyte-ci/connectors/live-tests/README.md | 124 ++- .../connectors/metadata_service/lib/README.md | 7 +- .../lib/tests/fixtures/doc.md | 2 +- .../templates/connector_nightly_report.md | 8 +- .../connectors/pipelines/CONTRIBUTING.md | 182 ++-- airbyte-ci/connectors/pipelines/README.md | 32 +- .../pipelines/pipelines/helpers/changelog.py | 8 +- .../pipelines/tests/test_changelog.py | 5 + .../changelog_header_no_newline.md | 15 +- .../changelog_header_no_separator.md | 11 +- .../initial_files/no_changelog_header.md | 9 +- .../initial_files/valid_changelog_at_end.md | 15 +- .../valid_changelog_in_middle.md | 18 +- ...ate_version_date_valid_changelog_at_end.md | 9 +- ..._version_date_valid_changelog_in_middle.md | 10 +- ...upicate_versions_valid_changelog_at_end.md | 9 +- ...cate_versions_valid_changelog_in_middle.md | 10 +- .../duplicate_entry_valid_changelog_at_end.md | 9 +- ...plicate_entry_valid_changelog_in_middle.md | 10 +- ...existing_entries_valid_changelog_at_end.md | 9 +- ...sting_entries_valid_changelog_in_middle.md | 10 +- .../single_insert_valid_changelog_at_end.md | 9 +- ...single_insert_valid_changelog_in_middle.md | 10 +- .../dbt-project-template/README.md | 2 +- .../resources/test_simple_streams/README.md | 3 +- .../base-normalization/setup/snowflake.md | 3 +- .../connector-acceptance-test/CHANGELOG.md | 131 ++- .../bases/connector-acceptance-test/README.md | 22 +- .../connector-templates/README.md | 6 +- .../connector-templates/generator/README.md | 30 +- .../source_{{snakeCase name}}/schemas/TODO.md | 21 +- .../destination-harness/README.md | 1 + .../source-harness/README.md | 2 +- .../destination-amazon-sqs/README.md | 25 +- .../destination-amazon-sqs/bootstrap.md | 24 +- .../connectors/destination-astra/README.md | 26 +- .../destination-aws-datalake/README.md | 15 +- .../destination-azure-blob-storage/README.md | 9 +- .../connectors/destination-bigquery/README.md | 17 +- .../connectors/destination-chroma/README.md | 29 +- .../destination-clickhouse/README.md | 17 +- .../destination-clickhouse/bootstrap.md | 1 - .../connectors/destination-convex/README.md | 11 +- .../connectors/destination-cumulio/README.md | 25 +- .../connectors/destination-databend/README.md | 25 +- .../destination-databricks/README.md | 18 +- .../connectors/destination-dev-null/README.md | 15 +- .../connectors/destination-duckdb/README.md | 28 +- 
.../connectors/destination-dynamodb/README.md | 17 +- .../connectors/destination-e2e-test/README.md | 18 +- .../destination-elasticsearch/README.md | 17 +- .../destination-elasticsearch/bootstrap.md | 23 +- .../connectors/destination-firebolt/README.md | 25 +- .../destination-firebolt/bootstrap.md | 6 +- .../destination-firestore/README.md | 24 +- .../connectors/destination-gcs/README.md | 5 +- .../destination-google-sheets/README.md | 25 +- .../connectors/destination-iceberg/README.md | 4 +- .../destination-iceberg/bootstrap.md | 1 - .../connectors/destination-kafka/README.md | 17 +- .../connectors/destination-kvdb/README.md | 42 +- .../destination-langchain/README.md | 25 +- .../destination-langchain/bootstrap.md | 9 +- .../destination-meilisearch/README.md | 25 +- .../connectors/destination-milvus/README.md | 35 +- .../destination-milvus/bootstrap.md | 7 +- .../README.md | 20 +- .../connectors/destination-pinecone/README.md | 35 +- .../destination-pinecone/bootstrap.md | 9 +- .../connectors/destination-qdrant/README.md | 29 +- .../connectors/destination-rabbitmq/README.md | 25 +- .../connectors/destination-redis/README.md | 17 +- .../connectors/destination-redis/bootstrap.md | 9 +- .../connectors/destination-redshift/README.md | 4 +- .../connectors/destination-s3-glue/README.md | 17 +- .../connectors/destination-s3/README.md | 1 + .../destination-sftp-json/README.md | 25 +- .../destination-snowflake/README.md | 2 + .../connectors/destination-sqlite/README.md | 25 +- .../destination-starburst-galaxy/README.md | 11 +- .../connectors/destination-teradata/README.md | 17 +- .../connectors/destination-timeplus/README.md | 11 +- .../destination-typesense/README.md | 25 +- .../connectors/destination-vectara/README.md | 38 +- .../connectors/destination-weaviate/README.md | 35 +- .../connectors/destination-xata/README.md | 25 +- .../destination-yellowbrick/README.md | 17 + .../source-activecampaign/README.md | 19 +- .../connectors/source-adjust/README.md | 22 +- .../connectors/source-aha/README.md | 19 +- .../connectors/source-aircall/README.md | 19 +- .../connectors/source-airtable/README.md | 35 +- .../connectors/source-alpha-vantage/README.md | 19 +- .../connectors/source-amazon-ads/README.md | 35 +- .../source-amazon-seller-partner/README.md | 35 +- .../connectors/source-amazon-sqs/README.md | 25 +- .../connectors/source-amazon-sqs/bootstrap.md | 16 +- .../connectors/source-amplitude/README.md | 35 +- .../connectors/source-apify-dataset/README.md | 35 +- .../connectors/source-appfollow/README.md | 19 +- .../source-apple-search-ads/README.md | 19 +- .../source-apple-search-ads/bootstrap.md | 15 +- .../connectors/source-appsflyer/README.md | 25 +- .../connectors/source-asana/README.md | 25 +- .../connectors/source-ashby/README.md | 19 +- .../connectors/source-auth0/README.md | 35 +- .../connectors/source-avni/README.md | 24 +- .../source-aws-cloudtrail/README.md | 32 +- .../source-azure-blob-storage/README.md | 57 +- .../connectors/source-azure-table/README.md | 27 +- .../connectors/source-babelforce/README.md | 19 +- .../connectors/source-bamboo-hr/README.md | 35 +- .../connectors/source-bigcommerce/README.md | 19 +- .../integration_tests/README.md | 1 + .../connectors/source-bing-ads/README.md | 35 +- .../connectors/source-bing-ads/bootstrap.md | 33 +- .../connectors/source-braintree/README.md | 25 +- .../connectors/source-braze/README.md | 19 +- .../connectors/source-breezometer/README.md | 19 +- .../connectors/source-callrail/README.md | 19 +- .../connectors/source-captain-data/README.md | 
19 +- .../connectors/source-cart/BOOTSTRAP.md | 9 +- .../connectors/source-cart/README.md | 35 +- .../connectors/source-chargebee/README.md | 35 +- .../connectors/source-chargify/README.md | 19 +- .../connectors/source-chartmogul/README.md | 25 +- .../connectors/source-chartmogul/bootstrap.md | 18 +- .../BOOTSTRAP.md | 3 +- .../ReadMe.md | 3 +- .../connectors/source-clickhouse/BOOTSTRAP.md | 3 +- .../connectors/source-clickhouse/ReadMe.md | 3 +- .../connectors/source-clickup-api/README.md | 19 +- .../connectors/source-clockify/README.md | 35 +- .../connectors/source-close-com/README.md | 25 +- .../connectors/source-close-com/bootstrap.md | 8 +- .../connectors/source-coda/README.md | 35 +- .../connectors/source-coin-api/README.md | 35 +- .../source-coingecko-coins/README.md | 19 +- .../connectors/source-coinmarketcap/README.md | 19 +- .../connectors/source-commcare/README.md | 25 +- .../source_commcare/schemas/TODO.md | 21 +- .../connectors/source-commercetools/README.md | 19 +- .../connectors/source-configcat/README.md | 19 +- .../connectors/source-confluence/README.md | 35 +- .../connectors/source-convertkit/README.md | 19 +- .../connectors/source-convex/README.md | 25 +- .../connectors/source-copper/README.md | 35 +- .../connectors/source-customer-io/README.md | 19 +- .../connectors/source-datadog/README.md | 19 +- .../connectors/source-datascope/BOOTSTRAP.md | 10 +- .../connectors/source-datascope/README.md | 19 +- .../connectors/source-db2/CHANGELOG.md | 1 + .../connectors/source-db2/README.md | 3 +- .../source-declarative-manifest/README.md | 26 +- .../connectors/source-delighted/README.md | 35 +- .../connectors/source-dixa/README.md | 19 +- .../connectors/source-dockerhub/README.md | 35 +- .../connectors/source-dockerhub/bootstrap.md | 2 +- .../connectors/source-dremio/README.md | 19 +- .../connectors/source-drift/README.md | 35 +- .../connectors/source-dynamodb/README.md | 19 +- .../source-e2e-test-cloud/README.md | 20 +- .../connectors/source-e2e-test/README.md | 26 +- .../connectors/source-elasticsearch/README.md | 21 +- .../source-emailoctopus/BOOTSTRAP.md | 3 +- .../connectors/source-emailoctopus/README.md | 35 +- .../connectors/source-everhour/README.md | 19 +- .../source-exchange-rates/README.md | 19 +- .../source-facebook-marketing/BOOTSTRAP.md | 7 +- .../source-facebook-marketing/README.md | 35 +- .../source_facebook_marketing/README.md | 23 +- .../source-facebook-pages/README.md | 26 +- .../connectors/source-faker/README.md | 35 +- .../connectors/source-fastbill/README.md | 35 +- .../connectors/source-fauna/README.md | 33 +- .../connectors/source-fauna/bootstrap.md | 1 - .../connectors/source-fauna/overview.md | 3 +- .../connectors/source-file/README.md | 35 +- .../README.md | 25 +- .../bootstrap.md | 49 +- .../connectors/source-firebolt/README.md | 25 +- .../connectors/source-firebolt/bootstrap.md | 10 +- .../connectors/source-flexport/README.md | 19 +- .../connectors/source-flexport/bootstrap.md | 8 +- .../connectors/source-freshcaller/README.md | 24 +- .../connectors/source-freshsales/README.md | 24 +- .../connectors/source-freshservice/README.md | 35 +- .../connectors/source-fullstory/README.md | 19 +- .../connectors/source-gainsight-px/README.md | 19 +- .../connectors/source-gcs/README.md | 35 +- .../connectors/source-genesys/README.md | 26 +- .../connectors/source-getlago/README.md | 19 +- .../connectors/source-github/README.md | 35 +- .../source-github/fixtures/README.md | 7 + .../connectors/source-gitlab/README.md | 35 +- .../connectors/source-glassfrog/README.md | 
35 +- .../connectors/source-gnews/README.md | 19 +- .../connectors/source-gocardless/README.md | 19 +- .../connectors/source-gong/README.md | 35 +- .../connectors/source-google-ads/BOOTSTRAP.md | 21 +- .../connectors/source-google-ads/README.md | 35 +- .../README.md | 35 +- .../README.md | 32 +- .../source-google-analytics-v4/README.md | 35 +- .../source-google-directory/README.md | 25 +- .../connectors/source-google-drive/README.md | 33 +- .../README.md | 35 +- .../source-google-search-console/BOOTSTRAP.md | 40 +- .../source-google-search-console/README.md | 35 +- .../credentials/README.md | 1 - .../connectors/source-google-sheets/README.md | 35 +- .../source-google-webfonts/README.md | 35 +- .../source-google-webfonts/bootstrap.md | 8 +- .../connectors/source-greenhouse/README.md | 35 +- .../connectors/source-gridly/README.md | 25 +- .../connectors/source-gutendex/README.md | 19 +- .../connectors/source-gutendex/bootstrap.md | 2 +- .../source_gutendex/schemas/TODO.md | 9 +- .../connectors/source-harness/README.md | 19 +- .../connectors/source-harvest/README.md | 35 +- .../connectors/source-hellobaton/README.md | 19 +- .../connectors/source-hubplanner/README.md | 19 +- .../connectors/source-hubspot/README.md | 35 +- .../connectors/source-insightly/README.md | 35 +- .../connectors/source-instagram/README.md | 35 +- .../connectors/source-instatus/README.md | 19 +- .../connectors/source-intercom/README.md | 35 +- .../connectors/source-intruder/README.md | 19 +- .../connectors/source-ip2whois/README.md | 35 +- .../connectors/source-iterable/README.md | 35 +- .../connectors/source-jira/README.md | 35 +- .../connectors/source-k6-cloud/README.md | 35 +- .../connectors/source-kafka/README.md | 17 +- .../connectors/source-klarna/README.md | 35 +- .../connectors/source-klaus-api/README.md | 43 +- .../connectors/source-klaviyo/README.md | 35 +- .../connectors/source-kyriba/README.md | 35 +- .../connectors/source-kyve/README.md | 11 +- .../connectors/source-launchdarkly/README.md | 19 +- .../connectors/source-lemlist/README.md | 19 +- .../connectors/source-lever-hiring/README.md | 25 +- .../connectors/source-linkedin-ads/README.md | 35 +- .../source-linkedin-pages/README.md | 25 +- .../source-linkedin-pages/bootstrap.md | 2 +- .../connectors/source-linnworks/README.md | 4 +- .../connectors/source-lokalise/README.md | 19 +- .../connectors/source-looker/README.md | 26 +- .../connectors/source-mailchimp/README.md | 35 +- .../connectors/source-mailerlite/README.md | 19 +- .../connectors/source-mailersend/README.md | 19 +- .../connectors/source-mailgun/README.md | 35 +- .../connectors/source-mailjet-mail/README.md | 19 +- .../connectors/source-mailjet-sms/README.md | 35 +- .../connectors/source-marketo/README.md | 35 +- .../connectors/source-marketo/bootstrap.md | 20 +- .../connectors/source-merge/README.md | 19 +- .../connectors/source-metabase/README.md | 31 +- .../connectors/source-metabase/bootstrap.md | 6 +- .../source-microsoft-dataverse/README.md | 25 +- .../source-microsoft-onedrive/README.md | 35 +- .../source-microsoft-sharepoint/README.md | 35 +- .../source-microsoft-teams/README.md | 25 +- .../connectors/source-mixpanel/README.md | 35 +- .../connectors/source-monday/README.md | 35 +- .../connectors/source-mongodb-v2/README.md | 27 +- .../connectors/source-mssql/README.md | 6 +- .../connectors/source-my-hours/README.md | 25 +- .../connectors/source-my-hours/bootstrap.md | 10 +- .../connectors/source-mysql/README.md | 14 +- .../connectors/source-n8n/README.md | 19 +- 
.../connectors/source-nasa/README.md | 19 +- .../connectors/source-netsuite/README.md | 25 +- .../connectors/source-news-api/README.md | 19 +- .../connectors/source-newsdata/README.md | 35 +- .../connectors/source-notion/README.md | 35 +- .../connectors/source-notion/bootstrap.md | 1 - .../connectors/source-nytimes/README.md | 35 +- .../connectors/source-okta/README.md | 15 +- .../connectors/source-omnisend/README.md | 19 +- .../connectors/source-onesignal/README.md | 19 +- .../source-open-exchange-rates/README.md | 35 +- .../connectors/source-openweather/README.md | 35 +- .../connectors/source-opsgenie/README.md | 35 +- .../connectors/source-oracle/BOOTSTRAP.md | 6 +- .../connectors/source-oracle/README.md | 9 + .../connectors/source-orb/README.md | 25 +- .../connectors/source-orb/bootstrap.md | 16 +- .../connectors/source-orbit/README.md | 35 +- .../connectors/source-oura/README.md | 19 +- .../source-outbrain-amplify/README.md | 25 +- .../source-outbrain-amplify/bootstrap.md | 39 +- .../connectors/source-outreach/README.md | 35 +- .../connectors/source-pagerduty/README.md | 19 +- .../connectors/source-pardot/README.md | 25 +- .../connectors/source-partnerstack/README.md | 19 +- .../source-paypal-transaction/CHANGELOG.md | 6 +- .../source-paypal-transaction/README.md | 147 +-- .../connectors/source-paystack/BOOTSTRAP.md | 5 +- .../connectors/source-paystack/README.md | 25 +- .../connectors/source-pendo/README.md | 35 +- .../connectors/source-persistiq/README.md | 19 +- .../connectors/source-pexels-api/README.md | 24 +- .../connectors/source-pexels-api/bootstrap.md | 20 +- .../connectors/source-pinterest/README.md | 35 +- .../connectors/source-pinterest/bootstrap.md | 29 +- .../connectors/source-pipedrive/README.md | 19 +- .../source-pivotal-tracker/README.md | 25 +- .../connectors/source-plaid/README.md | 25 +- .../connectors/source-plausible/BOOTSTRAP.md | 1 + .../connectors/source-plausible/README.md | 19 +- .../connectors/source-pocket/README.md | 35 +- .../connectors/source-pocket/bootstrap.md | 2 +- .../connectors/source-pokeapi/README.md | 19 +- .../source-polygon-stock-api/README.md | 35 +- .../connectors/source-postgres/README.md | 6 +- .../integration_tests/README.md | 9 +- .../connectors/source-posthog/README.md | 25 +- .../connectors/source-postmarkapp/README.md | 35 +- .../connectors/source-prestashop/README.md | 35 +- .../connectors/source-primetric/README.md | 25 +- .../connectors/source-public-apis/README.md | 19 +- .../connectors/source-punk-api/README.md | 25 +- .../connectors/source-punk-api/bootstrap.md | 10 +- .../connectors/source-pypi/README.md | 35 +- .../source-python-http-tutorial/README.md | 24 +- .../schemas/TODO.md | 21 +- .../connectors/source-qonto/README.md | 16 +- .../connectors/source-qualaroo/README.md | 19 +- .../connectors/source-quickbooks/README.md | 19 +- .../connectors/source-railz/README.md | 19 +- .../source-rd-station-marketing/README.md | 25 +- .../connectors/source-recharge/README.md | 35 +- .../connectors/source-recreation/BOOTSTRAP.md | 15 +- .../connectors/source-recreation/README.md | 35 +- .../connectors/source-recruitee/README.md | 19 +- .../connectors/source-recurly/README.md | 4 +- .../integration_tests/README.md | 1 + .../connectors/source-reply-io/README.md | 19 +- .../connectors/source-retently/README.md | 35 +- .../connectors/source-ringcentral/README.md | 19 +- .../connectors/source-rki-covid/README.md | 27 +- .../connectors/source-rki-covid/bootstrap.md | 44 +- .../source_rki_covid/schemas/TODO.md | 21 +- 
.../connectors/source-rocket-chat/README.md | 19 +- .../source-rocket-chat/rocket-chat.md | 24 +- .../connectors/source-rss/README.md | 23 +- .../connectors/source-s3/README.md | 35 +- .../connectors/source-salesforce/BOOTSTRAP.md | 34 +- .../connectors/source-salesforce/README.md | 35 +- .../connectors/source-salesloft/README.md | 25 +- .../source-sap-fieldglass/README.md | 19 +- .../source-scaffold-java-jdbc/README.md | 19 +- .../source-scaffold-source-http/README.md | 23 +- .../schemas/TODO.md | 21 +- .../source-scaffold-source-python/README.md | 26 +- .../connectors/source-secoda/README.md | 19 +- .../connectors/source-sendgrid/README.md | 35 +- .../connectors/source-sendinblue/README.md | 19 +- .../connectors/source-senseforce/README.md | 19 +- .../connectors/source-sentry/README.md | 35 +- .../connectors/source-sentry/bootstrap.md | 14 +- .../connectors/source-serpstat/README.md | 19 +- .../connectors/source-sftp-bulk/README.md | 35 +- .../connectors/source-sftp/README.md | 17 +- .../connectors/source-shopify/README.md | 35 +- .../connectors/source-shortio/README.md | 19 +- .../connectors/source-slack/README.md | 35 +- .../connectors/source-smaily/README.md | 19 +- .../connectors/source-smartengage/README.md | 35 +- .../connectors/source-smartsheets/README.md | 19 +- .../source-snapchat-marketing/README.md | 35 +- .../connectors/source-snowflake/CHANGELOG.md | 1 + .../connectors/source-snowflake/README.md | 10 +- .../integration_tests/README.md | 1 + .../connectors/source-sonar-cloud/README.md | 35 +- .../connectors/source-spacex-api/README.md | 24 +- .../connectors/source-spacex-api/bootstrap.md | 9 +- .../connectors/source-square/README.md | 35 +- .../source_square/schemas/TODO.md | 9 +- .../connectors/source-statuspage/README.md | 19 +- .../connectors/source-strava/README.md | 35 +- .../connectors/source-strava/bootstrap.md | 24 +- .../connectors/source-stripe/README.md | 35 +- .../source-survey-sparrow/README.md | 35 +- .../connectors/source-surveycto/README.md | 31 +- .../connectors/source-surveymonkey/README.md | 35 +- .../connectors/source-tempo/README.md | 19 +- .../connectors/source-teradata/README.md | 19 +- .../source-the-guardian-api/README.md | 19 +- .../connectors/source-tidb/README.md | 19 +- .../source-tiktok-marketing/README.md | 35 +- .../source-tiktok-marketing/bootstrap.md | 100 ++- .../connectors/source-timely/README.md | 35 +- .../connectors/source-tmdb/README.md | 24 +- .../connectors/source-tmdb/bootstrap.md | 11 +- .../connectors/source-todoist/README.md | 35 +- .../connectors/source-toggl/README.md | 19 +- .../connectors/source-tplcentral/README.md | 25 +- .../connectors/source-trello/README.md | 19 +- .../connectors/source-trustpilot/README.md | 25 +- .../source-tvmaze-schedule/README.md | 19 +- .../source-twilio-taskrouter/README.md | 35 +- .../connectors/source-twilio/README.md | 35 +- .../connectors/source-twitter/README.md | 19 +- .../connectors/source-tyntec-sms/README.md | 19 +- .../connectors/source-typeform/README.md | 35 +- .../connectors/source-unleash/README.md | 19 +- .../connectors/source-us-census/README.md | 25 +- .../connectors/source-vantage/README.md | 19 +- .../source-visma-economic/README.md | 35 +- .../connectors/source-vitally/README.md | 19 +- .../connectors/source-waiteraid/README.md | 26 +- .../connectors/source-waiteraid/bootstrap.md | 7 +- .../connectors/source-weatherstack/README.md | 25 +- .../connectors/source-webflow/README.md | 25 +- .../connectors/source-whisky-hunter/README.md | 19 +- .../source-whisky-hunter/bootstrap.md | 
18 +- .../source-wikipedia-pageviews/README.md | 19 +- .../connectors/source-woocommerce/README.md | 19 +- .../connectors/source-workable/README.md | 19 +- .../connectors/source-workramp/README.md | 19 +- .../connectors/source-wrike/README.md | 19 +- .../connectors/source-xero/README.md | 25 +- .../connectors/source-xkcd/README.md | 25 +- .../connectors/source-xkcd/bootstrap.md | 2 +- .../source-yahoo-finance-price/README.md | 35 +- .../source-yandex-metrica/README.md | 35 +- .../connectors/source-yotpo/README.md | 19 +- .../connectors/source-younium/README.md | 35 +- .../source-youtube-analytics/README.md | 25 +- .../source-zapier-supported-storage/README.md | 35 +- .../connectors/source-zendesk-chat/README.md | 35 +- .../connectors/source-zendesk-sell/README.md | 19 +- .../source-zendesk-sunshine/README.md | 35 +- .../source-zendesk-support/README.md | 35 +- .../connectors/source-zendesk-talk/README.md | 35 +- .../connectors/source-zenefits/README.md | 35 +- .../connectors/source-zenloop/README.md | 35 +- .../connectors/source-zoho-crm/README.md | 25 +- .../connectors/source-zoom/README.md | 26 +- .../sso-providers/azure-entra-id.md | 12 +- docs/access-management/sso-providers/okta.md | 2 + docs/access-management/sso.md | 1 - docs/api-documentation.md | 14 +- .../configuring-connections.md | 26 +- .../dbt-cloud-integration.md | 18 +- .../manage-airbyte-cloud-notifications.md | 57 +- .../manage-connection-state.md | 9 +- .../managing-airbyte-cloud/manage-credits.md | 36 +- .../manage-data-residency.md | 15 +- .../manage-schema-changes.md | 71 +- .../review-connection-status.md | 48 +- .../review-sync-history.md | 39 +- docs/community/code-of-conduct.md | 35 +- docs/community/getting-support.md | 38 +- docs/connector-development/README.md | 1 + docs/connector-development/best-practices.md | 38 +- .../cdk-python/README.md | 2 +- .../cdk-python/basic-concepts.md | 5 +- .../cdk-python/full-refresh-stream.md | 2 +- .../cdk-python/http-streams.md | 15 +- .../cdk-python/incremental-stream.md | 21 +- .../cdk-python/python-concepts.md | 1 - .../resumable-full-refresh-stream.md | 8 +- .../cdk-python/schemas.md | 13 +- .../cdk-python/stream-slices.md | 1 - .../config-based/advanced-topics.md | 12 +- .../config-based/low-code-cdk-overview.md | 1 + .../tutorial/0-getting-started.md | 2 +- .../config-based/tutorial/1-create-source.md | 4 +- .../tutorial/2-install-dependencies.md | 3 +- .../3-connecting-to-the-API-source.md | 18 +- .../config-based/tutorial/4-reading-data.md | 4 +- .../tutorial/5-incremental-reads.md | 12 +- .../config-based/tutorial/6-testing.md | 2 +- .../authentication.md | 197 ++-- .../error-handling.md | 230 ++--- .../incremental-syncs.md | 158 ++-- .../understanding-the-yaml-file/pagination.md | 134 +-- .../partition-router.md | 106 +-- .../record-selector.md | 163 ++-- .../understanding-the-yaml-file/reference.md | 93 +- .../request-options.md | 94 +- .../understanding-the-yaml-file/requester.md | 68 +- .../yaml-overview.md | 116 +-- .../connector-builder-ui/authentication.md | 52 +- .../connector-builder-compatibility.md | 35 +- .../connector-builder-ui/error-handling.md | 23 +- .../connector-builder-ui/overview.md | 32 +- .../connector-builder-ui/pagination.md | 11 +- .../connector-builder-ui/partitioning.md | 74 +- .../connector-metadata-file.md | 42 +- .../connector-specification-reference.md | 7 +- .../connector-development/debugging-docker.md | 70 +- .../migration-to-base-image.md | 23 +- .../connector-development/schema-reference.md | 18 +- .../testing-connectors/README.md 
| 11 +- .../connector-acceptance-tests-reference.md | 72 +- .../2-reading-a-page.md | 5 +- .../build-a-connector-the-hard-way.md | 2 +- docs/connector-development/ux-handbook.md | 22 +- docs/contributing-to-airbyte/README.md | 17 +- .../change-cdk-connector.md | 36 +- .../issues-and-requests.md | 2 +- .../resources/code-formatting.md | 15 +- .../resources/developing-locally.md | 31 +- .../resources/developing-on-docker.md | 26 +- .../resources/pull-requests-handbook.md | 18 +- .../resources/qa-checks.md | 134 +-- .../submit-new-connector.md | 25 +- docs/contributing-to-airbyte/writing-docs.md | 76 +- docs/deploying-airbyte/docker-compose.md | 3 +- docs/deploying-airbyte/local-deployment.md | 10 +- docs/deploying-airbyte/on-aws-ec2.md | 1 + docs/deploying-airbyte/on-aws-ecs.md | 6 +- docs/deploying-airbyte/on-cloud.md | 1 - .../on-kubernetes-via-helm.md | 8 +- docs/deploying-airbyte/on-oci-vm.md | 2 +- docs/deploying-airbyte/on-plural.md | 7 +- docs/deploying-airbyte/on-restack.md | 33 +- docs/developer-guides/licenses/README.md | 4 +- .../developer-guides/licenses/elv2-license.md | 1 - docs/developer-guides/licenses/examples.md | 14 +- docs/developer-guides/licenses/mit-license.md | 1 - docs/enterprise-setup/README.md | 14 +- docs/enterprise-setup/api-access-config.md | 30 +- docs/enterprise-setup/implementation-guide.md | 217 +++-- .../upgrading-from-community.md | 69 +- docs/integrations/custom-connectors.md | 3 +- docs/integrations/destinations/README.md | 1 - docs/integrations/destinations/astra.md | 18 +- .../integrations/destinations/aws-datalake.md | 20 +- .../destinations/bigquery-migrations.md | 4 +- docs/integrations/destinations/bigquery.md | 4 +- docs/integrations/destinations/chroma.md | 38 +- .../destinations/clickhouse-migrations.md | 23 +- docs/integrations/destinations/clickhouse.md | 2 +- docs/integrations/destinations/dev-null.md | 2 +- docs/integrations/destinations/duckdb.md | 22 +- docs/integrations/destinations/e2e-test.md | 26 +- .../destinations/elasticsearch.md | 87 +- docs/integrations/destinations/firestore.md | 5 +- docs/integrations/destinations/gcs.md | 17 +- .../destinations/google-sheets.md | 24 +- .../destinations/langchain-migrations.md | 2 +- docs/integrations/destinations/langchain.md | 49 +- .../destinations/mariadb-columnstore.md | 53 +- docs/integrations/destinations/milvus.md | 64 +- .../destinations/mssql-migrations.md | 2 +- docs/integrations/destinations/mssql.md | 2 +- .../destinations/mysql-migrations.md | 2 +- .../destinations/oracle-migrations.md | 3 +- docs/integrations/destinations/oracle.md | 2 +- docs/integrations/destinations/pinecone.md | 84 +- .../destinations/postgres-migrations.md | 2 +- docs/integrations/destinations/postgres.md | 7 +- docs/integrations/destinations/qdrant.md | 41 +- .../destinations/redshift-migrations.md | 2 +- docs/integrations/destinations/redshift.md | 4 +- docs/integrations/destinations/s3-glue.md | 5 +- docs/integrations/destinations/s3.md | 12 +- .../destinations/snowflake-migrations.md | 2 +- docs/integrations/destinations/snowflake.md | 4 +- docs/integrations/destinations/teradata.md | 16 +- docs/integrations/destinations/vectara.md | 48 +- .../destinations/weaviate-migrations.md | 1 - docs/integrations/destinations/weaviate.md | 86 +- docs/integrations/destinations/yellowbrick.md | 15 +- docs/integrations/destinations/yugabytedb.md | 40 +- .../locating-files-local-destination.md | 3 +- docs/integrations/sources/activecampaign.md | 30 +- docs/integrations/sources/adjust.md | 3 +- 
docs/integrations/sources/aha.md | 20 +- docs/integrations/sources/aircall.md | 16 +- .../sources/airtable-migrations.md | 3 +- docs/integrations/sources/airtable.md | 41 +- docs/integrations/sources/alpha-vantage.md | 25 +- .../sources/amazon-ads-migrations.md | 49 +- docs/integrations/sources/amazon-ads.md | 85 +- .../amazon-seller-partner-migrations.md | 65 +- .../sources/amazon-seller-partner.md | 13 +- docs/integrations/sources/amplitude.md | 101 ++- .../sources/apify-dataset-migrations.md | 1 + docs/integrations/sources/apify-dataset.md | 46 +- .../sources/appfollow-migrations.md | 2 +- docs/integrations/sources/appfollow.md | 8 +- docs/integrations/sources/appstore.md | 56 +- docs/integrations/sources/asana.md | 12 +- docs/integrations/sources/ashby.md | 4 +- docs/integrations/sources/auth0.md | 18 +- docs/integrations/sources/avni.md | 4 +- docs/integrations/sources/aws-cloudtrail.md | 20 +- .../sources/azure-blob-storage.md | 15 +- docs/integrations/sources/azure-table.md | 15 +- docs/integrations/sources/babelforce.md | 31 +- docs/integrations/sources/bamboo-hr.md | 51 +- docs/integrations/sources/bigcommerce.md | 2 +- docs/integrations/sources/bigquery.md | 2 +- .../sources/bing-ads-migrations.md | 20 +- docs/integrations/sources/bing-ads.md | 184 ++-- docs/integrations/sources/braintree.md | 56 +- docs/integrations/sources/braze.md | 18 +- docs/integrations/sources/breezometer.md | 27 +- docs/integrations/sources/callrail.md | 31 +- docs/integrations/sources/cart.md | 26 +- docs/integrations/sources/chargebee.md | 131 +-- docs/integrations/sources/chargify.md | 10 +- docs/integrations/sources/chartmogul.md | 37 +- docs/integrations/sources/clickhouse.md | 84 +- docs/integrations/sources/clickup-api.md | 42 +- docs/integrations/sources/clockify.md | 18 +- docs/integrations/sources/close-com.md | 100 +-- docs/integrations/sources/cockroachdb.md | 74 +- docs/integrations/sources/coda.md | 12 +- docs/integrations/sources/coin-api.md | 24 +- docs/integrations/sources/coingecko-coins.md | 14 +- docs/integrations/sources/commcare.md | 6 +- docs/integrations/sources/commercetools.md | 45 +- docs/integrations/sources/configcat.md | 28 +- docs/integrations/sources/confluence.md | 20 +- docs/integrations/sources/convertkit.md | 26 +- docs/integrations/sources/copper.md | 18 +- docs/integrations/sources/courier.md | 2 +- docs/integrations/sources/customer-io.md | 30 +- docs/integrations/sources/datadog.md | 66 +- docs/integrations/sources/datascope.md | 8 +- docs/integrations/sources/db2.md | 62 +- docs/integrations/sources/delighted.md | 28 +- docs/integrations/sources/dixa.md | 2 +- docs/integrations/sources/dockerhub.md | 33 +- docs/integrations/sources/dremio.md | 24 +- docs/integrations/sources/drift.md | 22 +- docs/integrations/sources/drupal.md | 9 +- docs/integrations/sources/dv-360.md | 2 +- docs/integrations/sources/dynamodb.md | 28 +- docs/integrations/sources/e2e-test-cloud.md | 2 +- docs/integrations/sources/e2e-test.md | 41 +- docs/integrations/sources/elasticsearch.md | 10 +- docs/integrations/sources/emailoctopus.md | 29 +- docs/integrations/sources/everhour.md | 8 +- docs/integrations/sources/exchange-rates.md | 20 +- .../sources/facebook-marketing-migrations.md | 27 +- .../sources/facebook-marketing.md | 50 +- .../sources/facebook-pages-migrations.md | 19 +- docs/integrations/sources/facebook-pages.md | 36 +- docs/integrations/sources/fastbill.md | 32 +- docs/integrations/sources/fauna.md | 75 +- docs/integrations/sources/file.md | 29 +- 
.../sources/firebase-realtime-database.md | 39 +- docs/integrations/sources/firebolt.md | 10 +- docs/integrations/sources/flexport.md | 42 +- docs/integrations/sources/freshcaller.md | 36 +- docs/integrations/sources/freshdesk.md | 2 +- .../sources/freshsales-migrations.md | 2 +- docs/integrations/sources/freshsales.md | 44 +- docs/integrations/sources/freshservice.md | 72 +- docs/integrations/sources/gainsight-px.md | 14 +- docs/integrations/sources/gcs.md | 36 +- docs/integrations/sources/genesys.md | 11 +- docs/integrations/sources/getlago.md | 36 +- docs/integrations/sources/github.md | 30 +- .../integrations/sources/gitlab-migrations.md | 82 +- docs/integrations/sources/gitlab.md | 2 +- docs/integrations/sources/glassfrog.md | 56 +- docs/integrations/sources/gnews.md | 10 +- docs/integrations/sources/gocardless.md | 32 +- docs/integrations/sources/gong.md | 16 +- .../sources/google-ads-migrations.md | 20 +- docs/integrations/sources/google-ads.md | 37 +- .../google-analytics-data-api-migrations.md | 28 +- .../sources/google-analytics-data-api.md | 24 +- ...oogle-analytics-v4-service-account-only.md | 100 +-- .../sources/google-analytics-v4.md | 96 +- docs/integrations/sources/google-directory.md | 41 +- docs/integrations/sources/google-drive.md | 24 +- .../sources/google-pagespeed-insights.md | 28 +- .../sources/google-search-console.md | 115 +-- docs/integrations/sources/google-sheets.md | 72 +- docs/integrations/sources/google-webfonts.md | 16 +- .../sources/google-workspace-admin-reports.md | 48 +- docs/integrations/sources/greenhouse.md | 44 +- docs/integrations/sources/gutendex.md | 35 +- docs/integrations/sources/harness.md | 30 +- .../sources/harvest-migrations.md | 5 +- docs/integrations/sources/harvest.md | 58 +- docs/integrations/sources/hellobaton.md | 6 +- docs/integrations/sources/http-request.md | 6 +- docs/integrations/sources/hubplanner.md | 21 +- .../sources/hubspot-migrations.md | 38 +- docs/integrations/sources/hubspot.md | 136 +-- docs/integrations/sources/insightly.md | 102 ++- .../sources/instagram-migrations.md | 41 +- docs/integrations/sources/instagram.md | 6 +- docs/integrations/sources/instatus.md | 49 +- docs/integrations/sources/intercom.md | 101 +-- docs/integrations/sources/intruder.md | 24 +- docs/integrations/sources/ip2whois.md | 27 +- docs/integrations/sources/iterable.md | 2 +- docs/integrations/sources/jenkins.md | 37 +- docs/integrations/sources/jira-migrations.md | 24 +- docs/integrations/sources/jira.md | 8 +- docs/integrations/sources/k6-cloud.md | 28 +- docs/integrations/sources/kafka.md | 68 +- docs/integrations/sources/klarna.md | 25 +- docs/integrations/sources/klaus-api.md | 4 +- .../sources/klaviyo-migrations.md | 4 +- docs/integrations/sources/klaviyo.md | 4 +- docs/integrations/sources/kustomer-singer.md | 2 +- docs/integrations/sources/kyriba.md | 18 +- docs/integrations/sources/kyve.md | 15 +- docs/integrations/sources/launchdarkly.md | 28 +- docs/integrations/sources/lemlist.md | 12 +- docs/integrations/sources/lever-hiring.md | 36 +- .../sources/linkedin-ads-migrations.md | 31 +- docs/integrations/sources/linkedin-ads.md | 50 +- docs/integrations/sources/linkedin-pages.md | 86 +- docs/integrations/sources/linnworks.md | 36 +- docs/integrations/sources/lokalise.md | 20 +- docs/integrations/sources/looker.md | 119 ++- docs/integrations/sources/low-code.md | 40 +- docs/integrations/sources/magento.md | 1 - .../sources/mailchimp-migrations.md | 42 +- docs/integrations/sources/mailchimp.md | 108 +-- 
docs/integrations/sources/mailerlite.md | 26 +- docs/integrations/sources/mailgun.md | 4 +- docs/integrations/sources/mailjet-mail.md | 30 +- docs/integrations/sources/mailjet-sms.md | 24 +- docs/integrations/sources/marketo.md | 3 +- docs/integrations/sources/merge.md | 16 +- .../sources/metabase-migrations.md | 33 +- docs/integrations/sources/metabase.md | 48 +- .../sources/microsoft-dataverse.md | 8 +- .../sources/microsoft-dynamics-ax.md | 1 - .../microsoft-dynamics-customer-engagement.md | 1 - .../sources/microsoft-dynamics-gp.md | 1 - .../sources/microsoft-dynamics-nav.md | 1 - .../sources/microsoft-onedrive.md | 80 +- .../sources/microsoft-sharepoint.md | 49 +- docs/integrations/sources/microsoft-teams.md | 144 +-- docs/integrations/sources/mixpanel.md | 2 +- .../integrations/sources/monday-migrations.md | 64 +- docs/integrations/sources/monday.md | 44 +- .../sources/mongodb-v2-migrations.md | 14 +- docs/integrations/sources/mongodb-v2.md | 92 +- docs/integrations/sources/mssql-migrations.md | 10 +- docs/integrations/sources/mssql.md | 2 +- docs/integrations/sources/my-hours.md | 32 +- docs/integrations/sources/mysql-migrations.md | 3 +- docs/integrations/sources/mysql.md | 28 +- .../sources/mysql/mysql-troubleshooting.md | 10 +- docs/integrations/sources/nasa.md | 17 +- docs/integrations/sources/netsuite.md | 62 +- docs/integrations/sources/news-api.md | 12 +- docs/integrations/sources/newsdata.md | 22 +- .../integrations/sources/notion-migrations.md | 4 +- docs/integrations/sources/notion.md | 18 +- docs/integrations/sources/nytimes.md | 28 +- docs/integrations/sources/okta.md | 44 +- docs/integrations/sources/omnisend.md | 26 +- docs/integrations/sources/onesignal.md | 2 +- .../sources/open-exchange-rates.md | 34 +- docs/integrations/sources/openweather.md | 38 +- docs/integrations/sources/opsgenie.md | 50 +- .../integrations/sources/oracle-peoplesoft.md | 7 +- .../integrations/sources/oracle-siebel-crm.md | 7 +- docs/integrations/sources/oracle.md | 4 +- docs/integrations/sources/orb.md | 59 +- docs/integrations/sources/orbit.md | 40 +- docs/integrations/sources/oura.md | 4 +- docs/integrations/sources/outbrain-amplify.md | 68 +- docs/integrations/sources/outreach.md | 26 +- docs/integrations/sources/pagerduty.md | 32 +- docs/integrations/sources/pardot.md | 58 +- .../sources/paypal-transaction-migrations.md | 3 +- .../sources/paypal-transaction.md | 268 +++--- docs/integrations/sources/paystack.md | 43 +- docs/integrations/sources/pendo.md | 31 +- docs/integrations/sources/persistiq.md | 2 +- .../sources/pinterest-migrations.md | 1 + .../sources/pipedrive-migrations.md | 3 +- docs/integrations/sources/pipedrive.md | 60 +- docs/integrations/sources/pivotal-tracker.md | 6 +- docs/integrations/sources/plausible.md | 25 +- docs/integrations/sources/pocket.md | 23 +- docs/integrations/sources/pokeapi.md | 2 +- .../integrations/sources/polygon-stock-api.md | 35 +- docs/integrations/sources/postgres.md | 56 +- .../sources/postgres/cloud-sql-postgres.md | 21 +- .../postgres/postgres-troubleshooting.md | 24 +- docs/integrations/sources/posthog.md | 6 +- docs/integrations/sources/postmarkapp.md | 21 +- docs/integrations/sources/prestashop.md | 22 +- docs/integrations/sources/primetric.md | 6 +- docs/integrations/sources/public-apis.md | 36 +- docs/integrations/sources/punk-api.md | 12 +- docs/integrations/sources/pypi.md | 22 +- docs/integrations/sources/qonto.md | 2 +- docs/integrations/sources/qualaroo.md | 27 +- .../sources/quickbooks-migrations.md | 1 + 
docs/integrations/sources/quickbooks.md | 34 +- docs/integrations/sources/railz.md | 6 +- .../sources/rd-station-marketing.md | 45 +- docs/integrations/sources/recharge.md | 62 +- docs/integrations/sources/recreation.md | 49 +- docs/integrations/sources/recruitee.md | 18 +- .../sources/recurly-migrations.md | 16 +- docs/integrations/sources/recurly.md | 79 +- docs/integrations/sources/redshift.md | 4 +- docs/integrations/sources/retently.md | 28 +- docs/integrations/sources/ringcentral.md | 22 +- docs/integrations/sources/rki-covid.md | 50 +- docs/integrations/sources/rocket-chat.md | 20 +- docs/integrations/sources/rss-migrations.md | 13 +- docs/integrations/sources/rss.md | 33 +- docs/integrations/sources/s3-migrations.md | 21 +- docs/integrations/sources/s3.md | 30 +- docs/integrations/sources/salesforce.md | 45 +- docs/integrations/sources/sap-business-one.md | 1 - docs/integrations/sources/sap-fieldglass.md | 16 +- docs/integrations/sources/search-metrics.md | 66 +- docs/integrations/sources/secoda.md | 22 +- .../sources/sendgrid-migrations.md | 36 +- docs/integrations/sources/sendgrid.md | 55 +- docs/integrations/sources/sendinblue.md | 22 +- docs/integrations/sources/senseforce.md | 36 +- docs/integrations/sources/sentry.md | 2 +- docs/integrations/sources/serpstat.md | 27 +- .../sources/sftp-bulk-migrations.md | 17 +- docs/integrations/sources/sftp-bulk.md | 2 +- docs/integrations/sources/sftp.md | 4 +- .../sources/shopify-migrations.md | 66 +- docs/integrations/sources/shopify.md | 87 +- docs/integrations/sources/shortio.md | 11 +- docs/integrations/sources/slack-migrations.md | 7 +- docs/integrations/sources/slack.md | 39 +- docs/integrations/sources/smaily.md | 32 +- docs/integrations/sources/smartengage.md | 31 +- docs/integrations/sources/smartsheets.md | 80 +- .../sources/snapchat-marketing.md | 56 +- docs/integrations/sources/snowflake.md | 4 +- docs/integrations/sources/sonar-cloud.md | 33 +- docs/integrations/sources/spacex-api.md | 10 +- docs/integrations/sources/spree-commerce.md | 5 +- docs/integrations/sources/square.md | 2 +- docs/integrations/sources/statuspage.md | 30 +- docs/integrations/sources/strava.md | 24 +- .../integrations/sources/stripe-migrations.md | 11 +- docs/integrations/sources/stripe.md | 220 ++--- docs/integrations/sources/sugar-crm.md | 9 +- docs/integrations/sources/survey-sparrow.md | 35 +- docs/integrations/sources/surveycto.md | 10 +- docs/integrations/sources/surveymonkey.md | 25 +- docs/integrations/sources/talkdesk-explore.md | 37 +- docs/integrations/sources/teradata.md | 28 +- docs/integrations/sources/tidb.md | 83 +- docs/integrations/sources/tiktok-marketing.md | 4 +- docs/integrations/sources/timely.md | 19 +- docs/integrations/sources/tmdb.md | 13 +- docs/integrations/sources/todoist.md | 21 +- docs/integrations/sources/toggl.md | 30 +- docs/integrations/sources/trello.md | 28 +- docs/integrations/sources/trustpilot.md | 24 +- docs/integrations/sources/tvmaze-schedule.md | 11 +- .../integrations/sources/twilio-taskrouter.md | 12 +- docs/integrations/sources/twilio.md | 138 +-- docs/integrations/sources/twitter.md | 10 +- docs/integrations/sources/tyntec-sms.md | 10 +- .../sources/typeform-migrations.md | 2 +- docs/integrations/sources/typeform.md | 36 +- docs/integrations/sources/unleash.md | 14 +- docs/integrations/sources/us-census.md | 4 + docs/integrations/sources/vantage.md | 24 +- docs/integrations/sources/victorops.md | 30 +- docs/integrations/sources/visma-economic.md | 56 +- docs/integrations/sources/waiteraid.md | 18 +- 
docs/integrations/sources/weatherstack.md | 24 +- docs/integrations/sources/webflow.md | 12 +- docs/integrations/sources/whisky-hunter.md | 29 +- .../sources/wikipedia-pageviews.md | 4 +- docs/integrations/sources/wordpress.md | 1 - docs/integrations/sources/workable.md | 25 +- docs/integrations/sources/wrike.md | 29 +- docs/integrations/sources/xero.md | 16 +- .../sources/yahoo-finance-price.md | 16 +- docs/integrations/sources/yandex-metrica.md | 16 +- docs/integrations/sources/yotpo.md | 18 +- docs/integrations/sources/younium.md | 14 +- .../integrations/sources/youtube-analytics.md | 85 +- .../sources/zapier-supported-storage.md | 9 +- docs/integrations/sources/zencart.md | 1 - docs/integrations/sources/zendesk-chat.md | 4 +- docs/integrations/sources/zendesk-sell.md | 79 +- docs/integrations/sources/zendesk-sunshine.md | 58 +- .../sources/zendesk-support-migrations.md | 2 +- docs/integrations/sources/zendesk-support.md | 18 +- docs/integrations/sources/zendesk-talk.md | 48 +- docs/integrations/sources/zenefits.md | 16 +- docs/integrations/sources/zenloop.md | 53 +- docs/integrations/sources/zoho-crm.md | 36 +- docs/integrations/sources/zoom-migrations.md | 10 +- docs/integrations/sources/zoom.md | 66 +- docs/integrations/sources/zuora.md | 129 ++- docs/operating-airbyte/security.md | 31 +- docs/operator-guides/browsing-output-logs.md | 23 +- docs/operator-guides/collecting-metrics.md | 87 +- .../operator-guides/configuring-airbyte-db.md | 32 +- docs/operator-guides/configuring-airbyte.md | 4 +- .../configuring-connector-resources.md | 11 +- docs/operator-guides/reset.md | 20 +- docs/operator-guides/scaling-airbyte.md | 1 - docs/operator-guides/telemetry.md | 3 + .../README.md | 1 - .../transformations-with-airbyte.md | 15 +- .../transformations-with-dbt.md | 5 +- .../transformations-with-sql.md | 67 +- docs/operator-guides/upgrading-airbyte.md | 12 +- .../using-custom-connectors.md | 61 +- .../using-dagster-integration.md | 20 +- docs/operator-guides/using-kestra-plugin.md | 18 +- docs/operator-guides/using-prefect-task.md | 11 +- .../using-the-airflow-airbyte-operator.md | 15 +- docs/readme.md | 14 +- docs/reference/README.md | 2 +- docs/reference/api/README.md | 6 +- docs/release_notes/april_2023.md | 42 +- docs/release_notes/august_2022.md | 46 +- docs/release_notes/december_2022.md | 46 +- docs/release_notes/december_2023.md | 12 +- docs/release_notes/february_2023.md | 26 +- docs/release_notes/february_2024.md | 9 +- docs/release_notes/january_2023.md | 32 +- docs/release_notes/january_2024.md | 12 +- docs/release_notes/july_2022.md | 51 +- docs/release_notes/july_2023.md | 3 +- docs/release_notes/june_2023.md | 3 +- docs/release_notes/march_2023.md | 63 +- docs/release_notes/march_2024.md | 8 +- docs/release_notes/may_2023.md | 39 +- docs/release_notes/november_2022.md | 29 +- docs/release_notes/november_2023.md | 13 +- docs/release_notes/october_2022.md | 27 +- docs/release_notes/october_2023.md | 7 +- docs/release_notes/september_2022.md | 28 +- docs/release_notes/september_2023.md | 9 +- .../upgrading_to_destinations_v2.md | 20 +- docs/snowflake-native-apps/event-sharing.md | 5 + .../facebook-marketing.md | 60 +- docs/snowflake-native-apps/linkedin-ads.md | 54 +- docs/terraform-documentation.md | 6 +- docs/understanding-airbyte/README.md | 1 - .../airbyte-protocol-docker.md | 12 +- .../airbyte-protocol-versioning.md | 11 +- .../understanding-airbyte/airbyte-protocol.md | 13 +- .../beginners-guide-to-catalog.md | 25 +- docs/understanding-airbyte/cdc.md | 40 +- 
 .../database-data-catalog.md | 186 ++--
 docs/understanding-airbyte/heartbeats.md | 28 +-
 docs/understanding-airbyte/high-level-view.md | 22 +-
 docs/understanding-airbyte/jobs.md | 55 +-
 .../json-avro-conversion.md | 62 +-
 docs/understanding-airbyte/operations.md | 19 +-
 .../schemaless-sources-and-destinations.md | 12 +-
 .../supported-data-types.md | 33 +-
 docs/understanding-airbyte/tech-stack.md | 33 +-
 .../core-concepts/basic-normalization.md | 18 +-
 .../using-airbyte/core-concepts/namespaces.md | 66 +-
 docs/using-airbyte/core-concepts/readme.md | 20 +-
 .../sync-modes/full-refresh-append.md | 48 +-
 .../sync-modes/full-refresh-overwrite.md | 28 +-
 .../core-concepts/sync-schedules.md | 34 +-
 .../core-concepts/typing-deduping.md | 1 +
 .../getting-started/add-a-destination.md | 6 +-
 .../getting-started/add-a-source.md | 1 -
 docs/using-airbyte/getting-started/readme.md | 2 +-
 .../getting-started/set-up-a-connection.md | 8 +-
 docs/using-airbyte/workspaces.md | 19 +-
 tools/internal/README.md | 12 +-
 tools/openapi2jsonschema/README.md | 8 +-
 tools/site/README.md | 6 +-
 1001 files changed, 17056 insertions(+), 11500 deletions(-)

diff --git a/.prettierignore b/.prettierignore
index 8193c5583a6..9579ba1a2fc 100644
--- a/.prettierignore
+++ b/.prettierignore
@@ -1 +1,3 @@
 airbyte-integrations/bases/base-normalization/integration_tests/normalization_test_output
+airbyte-ci/connectors/pipelines/tests/test_changelog/result_files
+airbyte-integrations/bases/connector-acceptance-test/unit_tests/data/docs
diff --git a/.prettierrc b/.prettierrc
index b556b2b63c6..31cda2d9257 100644
--- a/.prettierrc
+++ b/.prettierrc
@@ -3,8 +3,7 @@
     {
       "files": "*.md",
       "options": {
-        "printWidth": 100,
-        "proseWrap": "always"
+        "proseWrap": "preserve"
       }
     }
   ]
diff --git a/CODE_OF_CONDUCT.md b/CODE_OF_CONDUCT.md
index df6d0baa677..f8a0700270e 100644
--- a/CODE_OF_CONDUCT.md
+++ b/CODE_OF_CONDUCT.md
@@ -1,2 +1,3 @@
 # Code of conduct
+
 View in [docs.airbyte.io](https://docs.airbyte.com/project-overview/code-of-conduct)
diff --git a/CONTRIBUTING.md b/CONTRIBUTING.md
index 85512b1d4af..39fecef295d 100644
--- a/CONTRIBUTING.md
+++ b/CONTRIBUTING.md
@@ -1,2 +1,3 @@
 # Contributing
+
 View on [docs.airbyte.io](https://docs.airbyte.io/contributing-to-airbyte)
diff --git a/CONTRIBUTORS.md b/CONTRIBUTORS.md
index fa4e306a7d7..1eecf14b8c5 100644
--- a/CONTRIBUTORS.md
+++ b/CONTRIBUTORS.md
@@ -1,428 +1,428 @@
 # Contributors
-* [69mb](https://github.com/69mb) -* [a-honcharenko](https://github.com/a-honcharenko) -* [aadityasinha-dotcom](https://github.com/aadityasinha-dotcom) -* [aaronsteers](https://github.com/aaronsteers) -* [aazam-gh](https://github.com/aazam-gh) -* [abaerptc](https://github.com/abaerptc) -* [aballiet](https://github.com/aballiet) -* [achaussende](https://github.com/achaussende) -* [ad-m](https://github.com/ad-m) -* [adam-bloom](https://github.com/adam-bloom) -* [adamf](https://github.com/adamf) -* [adamschmidt](https://github.com/adamschmidt) -* [AetherUnbound](https://github.com/AetherUnbound) -* [afranzi](https://github.com/afranzi) -* [agrass](https://github.com/agrass) -* [ahmed-buksh](https://github.com/ahmed-buksh) -* [airbyte-jenny](https://github.com/airbyte-jenny) -* [ajmhatch](https://github.com/ajmhatch) -* [ajzo90](https://github.com/ajzo90) -* [akashkulk](https://github.com/akashkulk) -* [akulgoel96](https://github.com/akulgoel96) -* [alafanechere](https://github.com/alafanechere) -* [alallema](https://github.com/alallema) -* [albert-marrero](https://github.com/albert-marrero) -* 
[alex-danilin](https://github.com/alex-danilin) -* [alex-gron](https://github.com/alex-gron) -* [alexander-marquardt](https://github.com/alexander-marquardt) -* [AlexanderBatoulis](https://github.com/AlexanderBatoulis) -* [alexandertsukanov](https://github.com/alexandertsukanov) -* [alexandr-shegeda](https://github.com/alexandr-shegeda) -* [alexchouraki](https://github.com/alexchouraki) -* [AlexJameson](https://github.com/AlexJameson) -* [alexnikitchuk](https://github.com/alexnikitchuk) -* [Alihassanc5](https://github.com/Alihassanc5) -* [Allexik](https://github.com/Allexik) -* [alovew](https://github.com/alovew) -* [AM-I-Human](https://github.com/AM-I-Human) -* [amaliaroye](https://github.com/amaliaroye) -* [ambirdsall](https://github.com/ambirdsall) -* [aminamos](https://github.com/aminamos) -* [amitku](https://github.com/amitku) -* [Amruta-Ranade](https://github.com/Amruta-Ranade) -* [anamargaridarl](https://github.com/anamargaridarl) -* [andnig](https://github.com/andnig) -* [andresbravog](https://github.com/andresbravog) -* [andrewlreeve](https://github.com/andrewlreeve) -* [andreyAtBB](https://github.com/andreyAtBB) -* [andriikorotkov](https://github.com/andriikorotkov) -* [andrzejdackiewicz](https://github.com/andrzejdackiewicz) -* [andyjih](https://github.com/andyjih) -* [AndyTwiss](https://github.com/AndyTwiss) -* [animer3009](https://github.com/animer3009) -* [anna-geller](https://github.com/anna-geller) -* [annalvova05](https://github.com/annalvova05) -* [antixar](https://github.com/antixar) -* [antonioneto-hotmart](https://github.com/antonioneto-hotmart) -* [anujgupta0711](https://github.com/anujgupta0711) -* [Anurag870](https://github.com/Anurag870) -* [anushree-agrawal](https://github.com/anushree-agrawal) -* [apostoltego](https://github.com/apostoltego) -* [archangelic](https://github.com/archangelic) -* [arimbr](https://github.com/arimbr) -* [arnaudjnn](https://github.com/arnaudjnn) -* [ArneZsng](https://github.com/ArneZsng) -* [arsenlosenko](https://github.com/arsenlosenko) -* [artem1205](https://github.com/artem1205) -* [artusiep](https://github.com/artusiep) -* [asafepy](https://github.com/asafepy) -* [asyarif93](https://github.com/asyarif93) -* [augan-rymkhan](https://github.com/augan-rymkhan) -* [Auric-Manteo](https://github.com/Auric-Manteo) -* [avaidyanatha](https://github.com/avaidyanatha) -* [avida](https://github.com/avida) -* [avirajsingh7](https://github.com/avirajsingh7) -* [axaysagathiya](https://github.com/axaysagathiya) -* [azhard](https://github.com/azhard) -* [b4stien](https://github.com/b4stien) -* [bala-ceg](https://github.com/bala-ceg) -* [bazarnov](https://github.com/bazarnov) -* [bbugh](https://github.com/bbugh) -* [bcbeidel](https://github.com/bcbeidel) -* [bdashrad](https://github.com/bdashrad) -* [benmoriceau](https://github.com/benmoriceau) -* [BenoitFayolle](https://github.com/BenoitFayolle) -* [BenoitHugonnard](https://github.com/BenoitHugonnard) -* [bgroff](https://github.com/bgroff) -* [Bhupesh-V](https://github.com/Bhupesh-V) -* [BirdboyBolu](https://github.com/BirdboyBolu) -* [bjgbeelen](https://github.com/bjgbeelen) -* [bkrausz](https://github.com/bkrausz) -* [bleonard](https://github.com/bleonard) -* [bnchrch](https://github.com/bnchrch) -* [bobvanluijt](https://github.com/bobvanluijt) -* [brebuanirello-equinix](https://github.com/brebuanirello-equinix) -* [BrentSouza](https://github.com/BrentSouza) -* [brianjlai](https://github.com/brianjlai) -* [brunofaustino](https://github.com/brunofaustino) -* [bstrawson](https://github.com/bstrawson) 
-* [btkcodedev](https://github.com/btkcodedev) -* [burmecia](https://github.com/burmecia) -* [bzAmin](https://github.com/bzAmin) -* [calebfornari](https://github.com/calebfornari) -* [cameronwtaylor](https://github.com/cameronwtaylor) -* [camro](https://github.com/camro) -* [carlkibler](https://github.com/carlkibler) -* [carlonuccio](https://github.com/carlonuccio) -* [catpineapple](https://github.com/catpineapple) -* [cgardens](https://github.com/cgardens) -* [chadthman](https://github.com/chadthman) -* [chandrasekharan98](https://github.com/chandrasekharan98) -* [ChristoGrab](https://github.com/ChristoGrab) -* [ChristopheDuong](https://github.com/ChristopheDuong) -* [ciancullinan](https://github.com/ciancullinan) -* [cirdes](https://github.com/cirdes) -* [cjwooo](https://github.com/cjwooo) -* [clnoll](https://github.com/clnoll) -* [cobobrien](https://github.com/cobobrien) -* [coetzeevs](https://github.com/coetzeevs) -* [colesnodgrass](https://github.com/colesnodgrass) -* [collinscangarella](https://github.com/collinscangarella) -* [cpdeethree](https://github.com/cpdeethree) -* [CrafterKolyan](https://github.com/CrafterKolyan) -* [cstruct](https://github.com/cstruct) -* [ct-martin](https://github.com/ct-martin) -* [cuyk](https://github.com/cuyk) -* [cynthiaxyin](https://github.com/cynthiaxyin) -* [CyprienBarbault](https://github.com/CyprienBarbault) -* [czuares](https://github.com/czuares) -* [Daemonxiao](https://github.com/Daemonxiao) -* [dainiussa](https://github.com/dainiussa) -* [dalo390](https://github.com/dalo390) -* [damianlegawiec](https://github.com/damianlegawiec) -* [dandpz](https://github.com/dandpz) -* [daniel-cortez-stevenson](https://github.com/daniel-cortez-stevenson) -* [danieldiamond](https://github.com/danieldiamond) -* [Danucas](https://github.com/Danucas) -* [danvass](https://github.com/danvass) -* [darian-heede](https://github.com/darian-heede) -* [darynaishchenko](https://github.com/darynaishchenko) -* [DavidSpek](https://github.com/DavidSpek) -* [davinchia](https://github.com/davinchia) -* [davydov-d](https://github.com/davydov-d) -* [dbyzero](https://github.com/dbyzero) -* [ddoyediran](https://github.com/ddoyediran) -* [deepansh96](https://github.com/deepansh96) -* [delenamalan](https://github.com/delenamalan) -* [denis-sokolov](https://github.com/denis-sokolov) -* [dependabot[bot]](https://github.com/apps/dependabot) -* [dictcp](https://github.com/dictcp) -* [didistars328](https://github.com/didistars328) -* [digambar-t7](https://github.com/digambar-t7) -* [dijonkitchen](https://github.com/dijonkitchen) -* [dizel852](https://github.com/dizel852) -* [dmateusp](https://github.com/dmateusp) -* [domzae](https://github.com/domzae) -* [DoNotPanicUA](https://github.com/DoNotPanicUA) -* [Dracyr](https://github.com/Dracyr) -* [drrest](https://github.com/drrest) -* [dtt101](https://github.com/dtt101) -* [edbizarro](https://github.com/edbizarro) -* [edgao](https://github.com/edgao) -* [edmundito](https://github.com/edmundito) -* [efimmatytsin](https://github.com/efimmatytsin) -* [eliziario](https://github.com/eliziario) -* [elliottrabac](https://github.com/elliottrabac) -* [emmaling27](https://github.com/emmaling27) -* [erica-airbyte](https://github.com/erica-airbyte) -* [erohmensing](https://github.com/erohmensing) -* [etsybaev](https://github.com/etsybaev) -* [eugene-kulak](https://github.com/eugene-kulak) -* [evantahler](https://github.com/evantahler) -* [ffabss](https://github.com/ffabss) -* [flash1293](https://github.com/flash1293) -* 
[franviera92](https://github.com/franviera92) -* [freimer](https://github.com/freimer) -* [FUT](https://github.com/FUT) -* [gaart](https://github.com/gaart) -* [ganpatagarwal](https://github.com/ganpatagarwal) -* [gargatuma](https://github.com/gargatuma) -* [gergelylendvai](https://github.com/gergelylendvai) -* [girarda](https://github.com/girarda) -* [git-phu](https://github.com/git-phu) -* [github-actions[bot]](https://github.com/apps/github-actions) -* [Gitznik](https://github.com/Gitznik) -* [gordalina](https://github.com/gordalina) -* [gosusnp](https://github.com/gosusnp) -* [grebessi](https://github.com/grebessi) -* [grishick](https://github.com/grishick) -* [grubberr](https://github.com/grubberr) -* [gvillafanetapia](https://github.com/gvillafanetapia) -* [h7kanna](https://github.com/h7kanna) -* [haithem-souala](https://github.com/haithem-souala) -* [haoranyu](https://github.com/haoranyu) -* [harshithmullapudi](https://github.com/harshithmullapudi) -* [heade](https://github.com/heade) -* [hehex9](https://github.com/hehex9) -* [helderco](https://github.com/helderco) -* [henriblancke](https://github.com/henriblancke) -* [Hesperide](https://github.com/Hesperide) -* [hillairet](https://github.com/hillairet) -* [himanshuc3](https://github.com/himanshuc3) -* [hntan](https://github.com/hntan) -* [htrueman](https://github.com/htrueman) -* [hydrosquall](https://github.com/hydrosquall) -* [iberchid](https://github.com/iberchid) -* [igrankova](https://github.com/igrankova) -* [igsaf2](https://github.com/igsaf2) -* [Imbruced](https://github.com/Imbruced) -* [irynakruk](https://github.com/irynakruk) -* [isaacharrisholt](https://github.com/isaacharrisholt) -* [isalikov](https://github.com/isalikov) -* [itaseskii](https://github.com/itaseskii) -* [jacqueskpoty](https://github.com/jacqueskpoty) -* [Jagrutiti](https://github.com/Jagrutiti) -* [jamakase](https://github.com/jamakase) -* [jartek](https://github.com/jartek) -* [jbfbell](https://github.com/jbfbell) -* [jcowanpdx](https://github.com/jcowanpdx) -* [jdclarke5](https://github.com/jdclarke5) -* [jdpgrailsdev](https://github.com/jdpgrailsdev) -* [jeremySrgt](https://github.com/jeremySrgt) -* [jhajajaas](https://github.com/jhajajaas) -* [jhammarstedt](https://github.com/jhammarstedt) -* [jnr0790](https://github.com/jnr0790) -* [joelluijmes](https://github.com/joelluijmes) -* [johnlafleur](https://github.com/johnlafleur) -* [JonsSpaghetti](https://github.com/JonsSpaghetti) -* [jonstacks](https://github.com/jonstacks) -* [jordan-glitch](https://github.com/jordan-glitch) -* [josephkmh](https://github.com/josephkmh) -* [jrhizor](https://github.com/jrhizor) -* [juliachvyrova](https://github.com/juliachvyrova) -* [JulianRommel](https://github.com/JulianRommel) -* [juliatournant](https://github.com/juliatournant) -* [justinbchau](https://github.com/justinbchau) -* [juweins](https://github.com/juweins) -* [jzcruiser](https://github.com/jzcruiser) -* [kaklakariada](https://github.com/kaklakariada) -* [karinakuz](https://github.com/karinakuz) -* [kattos-aws](https://github.com/kattos-aws) -* [KayakinKoder](https://github.com/KayakinKoder) -* [keu](https://github.com/keu) -* [kgrover](https://github.com/kgrover) -* [kimerinn](https://github.com/kimerinn) -* [koconder](https://github.com/koconder) -* [koji-m](https://github.com/koji-m) -* [krishnaglick](https://github.com/krishnaglick) -* [krisjan-oldekamp](https://github.com/krisjan-oldekamp) -* [ksengers](https://github.com/ksengers) -* [kzzzr](https://github.com/kzzzr) -* 
[lazebnyi](https://github.com/lazebnyi) -* [leo-schick](https://github.com/leo-schick) -* [letiescanciano](https://github.com/letiescanciano) -* [lgomezm](https://github.com/lgomezm) -* [lideke](https://github.com/lideke) -* [lizdeika](https://github.com/lizdeika) -* [lmossman](https://github.com/lmossman) -* [maciej-nedza](https://github.com/maciej-nedza) -* [macmv](https://github.com/macmv) -* [Mainara](https://github.com/Mainara) -* [makalaaneesh](https://github.com/makalaaneesh) -* [makyash](https://github.com/makyash) -* [malikdiarra](https://github.com/malikdiarra) -* [marcelopio](https://github.com/marcelopio) -* [marcosmarxm](https://github.com/marcosmarxm) -* [mariamthiam](https://github.com/mariamthiam) -* [masonwheeler](https://github.com/masonwheeler) -* [masyagin1998](https://github.com/masyagin1998) -* [matter-q](https://github.com/matter-q) -* [maxi297](https://github.com/maxi297) -* [MaxKrog](https://github.com/MaxKrog) -* [mdibaiee](https://github.com/mdibaiee) -* [mfsiega-airbyte](https://github.com/mfsiega-airbyte) -* [michaelnguyen26](https://github.com/michaelnguyen26) -* [michel-tricot](https://github.com/michel-tricot) -* [mickaelandrieu](https://github.com/mickaelandrieu) -* [midavadim](https://github.com/midavadim) -* [mildbyte](https://github.com/mildbyte) -* [misteryeo](https://github.com/misteryeo) -* [mkhokh-33](https://github.com/mkhokh-33) -* [mlavoie-sm360](https://github.com/mlavoie-sm360) -* [mmolimar](https://github.com/mmolimar) -* [mohamagdy](https://github.com/mohamagdy) -* [mohitreddy1996](https://github.com/mohitreddy1996) -* [monai](https://github.com/monai) -* [mrhallak](https://github.com/mrhallak) -* [Muriloo](https://github.com/Muriloo) -* [mustangJaro](https://github.com/mustangJaro) -* [Mykyta-Serbynevskyi](https://github.com/Mykyta-Serbynevskyi) -* [n0rritt](https://github.com/n0rritt) -* [nastra](https://github.com/nastra) -* [nataliekwong](https://github.com/nataliekwong) -* [natalyjazzviolin](https://github.com/natalyjazzviolin) -* [nauxliu](https://github.com/nauxliu) -* [nguyenaiden](https://github.com/nguyenaiden) -* [NipunaPrashan](https://github.com/NipunaPrashan) -* [Nmaxime](https://github.com/Nmaxime) -* [noahkawasaki-airbyte](https://github.com/noahkawasaki-airbyte) -* [noahkawasakigoogle](https://github.com/noahkawasakigoogle) -* [novotl](https://github.com/novotl) -* [ntucker](https://github.com/ntucker) -* [octavia-squidington-iii](https://github.com/octavia-squidington-iii) -* [olivermeyer](https://github.com/olivermeyer) -* [omid](https://github.com/omid) -* [oreopot](https://github.com/oreopot) -* [pabloescoder](https://github.com/pabloescoder) -* [panhavad](https://github.com/panhavad) -* [pecalleja](https://github.com/pecalleja) -* [pedroslopez](https://github.com/pedroslopez) -* [perangel](https://github.com/perangel) -* [peter279k](https://github.com/peter279k) -* [PhilipCorr](https://github.com/PhilipCorr) -* [philippeboyd](https://github.com/philippeboyd) -* [Phlair](https://github.com/Phlair) -* [pmossman](https://github.com/pmossman) -* [po3na4skld](https://github.com/po3na4skld) -* [PoCTo](https://github.com/PoCTo) -* [postamar](https://github.com/postamar) -* [prasrvenkat](https://github.com/prasrvenkat) -* [prateekmukhedkar](https://github.com/prateekmukhedkar) -* [proprefenetre](https://github.com/proprefenetre) -* [Pwaldi](https://github.com/Pwaldi) -* [rach-r](https://github.com/rach-r) -* [ramonvermeulen](https://github.com/ramonvermeulen) -* [ReptilianBrain](https://github.com/ReptilianBrain) -* 
[rileybrook](https://github.com/rileybrook) -* [RobertoBonnet](https://github.com/RobertoBonnet) -* [robgleason](https://github.com/robgleason) -* [RobLucchi](https://github.com/RobLucchi) -* [rodireich](https://github.com/rodireich) -* [roisinbolt](https://github.com/roisinbolt) -* [roman-romanov-o](https://github.com/roman-romanov-o) -* [roman-yermilov-gl](https://github.com/roman-yermilov-gl) -* [ron-damon](https://github.com/ron-damon) -* [rparrapy](https://github.com/rparrapy) -* [ryankfu](https://github.com/ryankfu) -* [sajarin](https://github.com/sajarin) -* [samos123](https://github.com/samos123) -* [sarafonseca-123](https://github.com/sarafonseca-123) -* [sashaNeshcheret](https://github.com/sashaNeshcheret) -* [SatishChGit](https://github.com/SatishChGit) -* [sbjorn](https://github.com/sbjorn) -* [schlattk](https://github.com/schlattk) -* [scottleechua](https://github.com/scottleechua) -* [sdairs](https://github.com/sdairs) -* [sergei-solonitcyn](https://github.com/sergei-solonitcyn) -* [sergio-ropero](https://github.com/sergio-ropero) -* [sh4sh](https://github.com/sh4sh) -* [shadabshaukat](https://github.com/shadabshaukat) -* [sherifnada](https://github.com/sherifnada) -* [Shishir-rmv](https://github.com/Shishir-rmv) -* [shrodingers](https://github.com/shrodingers) -* [shyngysnurzhan](https://github.com/shyngysnurzhan) -* [siddhant3030](https://github.com/siddhant3030) -* [sivankumar86](https://github.com/sivankumar86) -* [snyk-bot](https://github.com/snyk-bot) -* [SofiiaZaitseva](https://github.com/SofiiaZaitseva) -* [sophia-wiley](https://github.com/sophia-wiley) -* [SPTKL](https://github.com/SPTKL) -* [subhamX](https://github.com/subhamX) -* [subodh1810](https://github.com/subodh1810) -* [suhomud](https://github.com/suhomud) -* [supertopher](https://github.com/supertopher) -* [swyxio](https://github.com/swyxio) -* [tbcdns](https://github.com/tbcdns) -* [tealjulia](https://github.com/tealjulia) -* [terencecho](https://github.com/terencecho) -* [thanhlmm](https://github.com/thanhlmm) -* [thomas-vl](https://github.com/thomas-vl) -* [timroes](https://github.com/timroes) -* [tirth7777777](https://github.com/tirth7777777) -* [tjirab](https://github.com/tjirab) -* [tkorenko](https://github.com/tkorenko) -* [tolik0](https://github.com/tolik0) -* [topefolorunso](https://github.com/topefolorunso) -* [trowacat](https://github.com/trowacat) -* [tryangul](https://github.com/tryangul) -* [TSkrebe](https://github.com/TSkrebe) -* [tuanchris](https://github.com/tuanchris) -* [tuliren](https://github.com/tuliren) -* [tyagi-data-wizard](https://github.com/tyagi-data-wizard) -* [tybernstein](https://github.com/tybernstein) -* [TymoshokDmytro](https://github.com/TymoshokDmytro) -* [tyschroed](https://github.com/tyschroed) -* [ufou](https://github.com/ufou) -* [Upmitt](https://github.com/Upmitt) -* [VitaliiMaltsev](https://github.com/VitaliiMaltsev) -* [vitaliizazmic](https://github.com/vitaliizazmic) -* [vladimir-remar](https://github.com/vladimir-remar) -* [vovavovavovavova](https://github.com/vovavovavovavova) -* [wallies](https://github.com/wallies) -* [winar-jin](https://github.com/winar-jin) -* [wissevrowl](https://github.com/wissevrowl) -* [Wittiest](https://github.com/Wittiest) -* [wjwatkinson](https://github.com/wjwatkinson) -* [Xabilahu](https://github.com/Xabilahu) -* [xiaohansong](https://github.com/xiaohansong) -* [xpuska513](https://github.com/xpuska513) -* [yahu98](https://github.com/yahu98) -* [yannibenoit](https://github.com/yannibenoit) -* 
[yaroslav-dudar](https://github.com/yaroslav-dudar) -* [yaroslav-hrytsaienko](https://github.com/yaroslav-hrytsaienko) -* [YatsukBogdan1](https://github.com/YatsukBogdan1) -* [ycherniaiev](https://github.com/ycherniaiev) -* [yevhenii-ldv](https://github.com/yevhenii-ldv) -* [YiyangLi](https://github.com/YiyangLi) -* [YowanR](https://github.com/YowanR) -* [yuhuishi-convect](https://github.com/yuhuishi-convect) -* [yurii-bidiuk](https://github.com/yurii-bidiuk) -* [Zawar92](https://github.com/Zawar92) -* [zestyping](https://github.com/zestyping) -* [Zirochkaa](https://github.com/Zirochkaa) -* [zkid18](https://github.com/zkid18) -* [zuc](https://github.com/zuc) -* [zzstoatzz](https://github.com/zzstoatzz) -* [zzztimbo](https://github.com/zzztimbo) +- [69mb](https://github.com/69mb) +- [a-honcharenko](https://github.com/a-honcharenko) +- [aadityasinha-dotcom](https://github.com/aadityasinha-dotcom) +- [aaronsteers](https://github.com/aaronsteers) +- [aazam-gh](https://github.com/aazam-gh) +- [abaerptc](https://github.com/abaerptc) +- [aballiet](https://github.com/aballiet) +- [achaussende](https://github.com/achaussende) +- [ad-m](https://github.com/ad-m) +- [adam-bloom](https://github.com/adam-bloom) +- [adamf](https://github.com/adamf) +- [adamschmidt](https://github.com/adamschmidt) +- [AetherUnbound](https://github.com/AetherUnbound) +- [afranzi](https://github.com/afranzi) +- [agrass](https://github.com/agrass) +- [ahmed-buksh](https://github.com/ahmed-buksh) +- [airbyte-jenny](https://github.com/airbyte-jenny) +- [ajmhatch](https://github.com/ajmhatch) +- [ajzo90](https://github.com/ajzo90) +- [akashkulk](https://github.com/akashkulk) +- [akulgoel96](https://github.com/akulgoel96) +- [alafanechere](https://github.com/alafanechere) +- [alallema](https://github.com/alallema) +- [albert-marrero](https://github.com/albert-marrero) +- [alex-danilin](https://github.com/alex-danilin) +- [alex-gron](https://github.com/alex-gron) +- [alexander-marquardt](https://github.com/alexander-marquardt) +- [AlexanderBatoulis](https://github.com/AlexanderBatoulis) +- [alexandertsukanov](https://github.com/alexandertsukanov) +- [alexandr-shegeda](https://github.com/alexandr-shegeda) +- [alexchouraki](https://github.com/alexchouraki) +- [AlexJameson](https://github.com/AlexJameson) +- [alexnikitchuk](https://github.com/alexnikitchuk) +- [Alihassanc5](https://github.com/Alihassanc5) +- [Allexik](https://github.com/Allexik) +- [alovew](https://github.com/alovew) +- [AM-I-Human](https://github.com/AM-I-Human) +- [amaliaroye](https://github.com/amaliaroye) +- [ambirdsall](https://github.com/ambirdsall) +- [aminamos](https://github.com/aminamos) +- [amitku](https://github.com/amitku) +- [Amruta-Ranade](https://github.com/Amruta-Ranade) +- [anamargaridarl](https://github.com/anamargaridarl) +- [andnig](https://github.com/andnig) +- [andresbravog](https://github.com/andresbravog) +- [andrewlreeve](https://github.com/andrewlreeve) +- [andreyAtBB](https://github.com/andreyAtBB) +- [andriikorotkov](https://github.com/andriikorotkov) +- [andrzejdackiewicz](https://github.com/andrzejdackiewicz) +- [andyjih](https://github.com/andyjih) +- [AndyTwiss](https://github.com/AndyTwiss) +- [animer3009](https://github.com/animer3009) +- [anna-geller](https://github.com/anna-geller) +- [annalvova05](https://github.com/annalvova05) +- [antixar](https://github.com/antixar) +- [antonioneto-hotmart](https://github.com/antonioneto-hotmart) +- [anujgupta0711](https://github.com/anujgupta0711) +- [Anurag870](https://github.com/Anurag870) 
+- [anushree-agrawal](https://github.com/anushree-agrawal) +- [apostoltego](https://github.com/apostoltego) +- [archangelic](https://github.com/archangelic) +- [arimbr](https://github.com/arimbr) +- [arnaudjnn](https://github.com/arnaudjnn) +- [ArneZsng](https://github.com/ArneZsng) +- [arsenlosenko](https://github.com/arsenlosenko) +- [artem1205](https://github.com/artem1205) +- [artusiep](https://github.com/artusiep) +- [asafepy](https://github.com/asafepy) +- [asyarif93](https://github.com/asyarif93) +- [augan-rymkhan](https://github.com/augan-rymkhan) +- [Auric-Manteo](https://github.com/Auric-Manteo) +- [avaidyanatha](https://github.com/avaidyanatha) +- [avida](https://github.com/avida) +- [avirajsingh7](https://github.com/avirajsingh7) +- [axaysagathiya](https://github.com/axaysagathiya) +- [azhard](https://github.com/azhard) +- [b4stien](https://github.com/b4stien) +- [bala-ceg](https://github.com/bala-ceg) +- [bazarnov](https://github.com/bazarnov) +- [bbugh](https://github.com/bbugh) +- [bcbeidel](https://github.com/bcbeidel) +- [bdashrad](https://github.com/bdashrad) +- [benmoriceau](https://github.com/benmoriceau) +- [BenoitFayolle](https://github.com/BenoitFayolle) +- [BenoitHugonnard](https://github.com/BenoitHugonnard) +- [bgroff](https://github.com/bgroff) +- [Bhupesh-V](https://github.com/Bhupesh-V) +- [BirdboyBolu](https://github.com/BirdboyBolu) +- [bjgbeelen](https://github.com/bjgbeelen) +- [bkrausz](https://github.com/bkrausz) +- [bleonard](https://github.com/bleonard) +- [bnchrch](https://github.com/bnchrch) +- [bobvanluijt](https://github.com/bobvanluijt) +- [brebuanirello-equinix](https://github.com/brebuanirello-equinix) +- [BrentSouza](https://github.com/BrentSouza) +- [brianjlai](https://github.com/brianjlai) +- [brunofaustino](https://github.com/brunofaustino) +- [bstrawson](https://github.com/bstrawson) +- [btkcodedev](https://github.com/btkcodedev) +- [burmecia](https://github.com/burmecia) +- [bzAmin](https://github.com/bzAmin) +- [calebfornari](https://github.com/calebfornari) +- [cameronwtaylor](https://github.com/cameronwtaylor) +- [camro](https://github.com/camro) +- [carlkibler](https://github.com/carlkibler) +- [carlonuccio](https://github.com/carlonuccio) +- [catpineapple](https://github.com/catpineapple) +- [cgardens](https://github.com/cgardens) +- [chadthman](https://github.com/chadthman) +- [chandrasekharan98](https://github.com/chandrasekharan98) +- [ChristoGrab](https://github.com/ChristoGrab) +- [ChristopheDuong](https://github.com/ChristopheDuong) +- [ciancullinan](https://github.com/ciancullinan) +- [cirdes](https://github.com/cirdes) +- [cjwooo](https://github.com/cjwooo) +- [clnoll](https://github.com/clnoll) +- [cobobrien](https://github.com/cobobrien) +- [coetzeevs](https://github.com/coetzeevs) +- [colesnodgrass](https://github.com/colesnodgrass) +- [collinscangarella](https://github.com/collinscangarella) +- [cpdeethree](https://github.com/cpdeethree) +- [CrafterKolyan](https://github.com/CrafterKolyan) +- [cstruct](https://github.com/cstruct) +- [ct-martin](https://github.com/ct-martin) +- [cuyk](https://github.com/cuyk) +- [cynthiaxyin](https://github.com/cynthiaxyin) +- [CyprienBarbault](https://github.com/CyprienBarbault) +- [czuares](https://github.com/czuares) +- [Daemonxiao](https://github.com/Daemonxiao) +- [dainiussa](https://github.com/dainiussa) +- [dalo390](https://github.com/dalo390) +- [damianlegawiec](https://github.com/damianlegawiec) +- [dandpz](https://github.com/dandpz) +- 
[daniel-cortez-stevenson](https://github.com/daniel-cortez-stevenson) +- [danieldiamond](https://github.com/danieldiamond) +- [Danucas](https://github.com/Danucas) +- [danvass](https://github.com/danvass) +- [darian-heede](https://github.com/darian-heede) +- [darynaishchenko](https://github.com/darynaishchenko) +- [DavidSpek](https://github.com/DavidSpek) +- [davinchia](https://github.com/davinchia) +- [davydov-d](https://github.com/davydov-d) +- [dbyzero](https://github.com/dbyzero) +- [ddoyediran](https://github.com/ddoyediran) +- [deepansh96](https://github.com/deepansh96) +- [delenamalan](https://github.com/delenamalan) +- [denis-sokolov](https://github.com/denis-sokolov) +- [dependabot[bot]](https://github.com/apps/dependabot) +- [dictcp](https://github.com/dictcp) +- [didistars328](https://github.com/didistars328) +- [digambar-t7](https://github.com/digambar-t7) +- [dijonkitchen](https://github.com/dijonkitchen) +- [dizel852](https://github.com/dizel852) +- [dmateusp](https://github.com/dmateusp) +- [domzae](https://github.com/domzae) +- [DoNotPanicUA](https://github.com/DoNotPanicUA) +- [Dracyr](https://github.com/Dracyr) +- [drrest](https://github.com/drrest) +- [dtt101](https://github.com/dtt101) +- [edbizarro](https://github.com/edbizarro) +- [edgao](https://github.com/edgao) +- [edmundito](https://github.com/edmundito) +- [efimmatytsin](https://github.com/efimmatytsin) +- [eliziario](https://github.com/eliziario) +- [elliottrabac](https://github.com/elliottrabac) +- [emmaling27](https://github.com/emmaling27) +- [erica-airbyte](https://github.com/erica-airbyte) +- [erohmensing](https://github.com/erohmensing) +- [etsybaev](https://github.com/etsybaev) +- [eugene-kulak](https://github.com/eugene-kulak) +- [evantahler](https://github.com/evantahler) +- [ffabss](https://github.com/ffabss) +- [flash1293](https://github.com/flash1293) +- [franviera92](https://github.com/franviera92) +- [freimer](https://github.com/freimer) +- [FUT](https://github.com/FUT) +- [gaart](https://github.com/gaart) +- [ganpatagarwal](https://github.com/ganpatagarwal) +- [gargatuma](https://github.com/gargatuma) +- [gergelylendvai](https://github.com/gergelylendvai) +- [girarda](https://github.com/girarda) +- [git-phu](https://github.com/git-phu) +- [github-actions[bot]](https://github.com/apps/github-actions) +- [Gitznik](https://github.com/Gitznik) +- [gordalina](https://github.com/gordalina) +- [gosusnp](https://github.com/gosusnp) +- [grebessi](https://github.com/grebessi) +- [grishick](https://github.com/grishick) +- [grubberr](https://github.com/grubberr) +- [gvillafanetapia](https://github.com/gvillafanetapia) +- [h7kanna](https://github.com/h7kanna) +- [haithem-souala](https://github.com/haithem-souala) +- [haoranyu](https://github.com/haoranyu) +- [harshithmullapudi](https://github.com/harshithmullapudi) +- [heade](https://github.com/heade) +- [hehex9](https://github.com/hehex9) +- [helderco](https://github.com/helderco) +- [henriblancke](https://github.com/henriblancke) +- [Hesperide](https://github.com/Hesperide) +- [hillairet](https://github.com/hillairet) +- [himanshuc3](https://github.com/himanshuc3) +- [hntan](https://github.com/hntan) +- [htrueman](https://github.com/htrueman) +- [hydrosquall](https://github.com/hydrosquall) +- [iberchid](https://github.com/iberchid) +- [igrankova](https://github.com/igrankova) +- [igsaf2](https://github.com/igsaf2) +- [Imbruced](https://github.com/Imbruced) +- [irynakruk](https://github.com/irynakruk) +- [isaacharrisholt](https://github.com/isaacharrisholt) 
+- [isalikov](https://github.com/isalikov) +- [itaseskii](https://github.com/itaseskii) +- [jacqueskpoty](https://github.com/jacqueskpoty) +- [Jagrutiti](https://github.com/Jagrutiti) +- [jamakase](https://github.com/jamakase) +- [jartek](https://github.com/jartek) +- [jbfbell](https://github.com/jbfbell) +- [jcowanpdx](https://github.com/jcowanpdx) +- [jdclarke5](https://github.com/jdclarke5) +- [jdpgrailsdev](https://github.com/jdpgrailsdev) +- [jeremySrgt](https://github.com/jeremySrgt) +- [jhajajaas](https://github.com/jhajajaas) +- [jhammarstedt](https://github.com/jhammarstedt) +- [jnr0790](https://github.com/jnr0790) +- [joelluijmes](https://github.com/joelluijmes) +- [johnlafleur](https://github.com/johnlafleur) +- [JonsSpaghetti](https://github.com/JonsSpaghetti) +- [jonstacks](https://github.com/jonstacks) +- [jordan-glitch](https://github.com/jordan-glitch) +- [josephkmh](https://github.com/josephkmh) +- [jrhizor](https://github.com/jrhizor) +- [juliachvyrova](https://github.com/juliachvyrova) +- [JulianRommel](https://github.com/JulianRommel) +- [juliatournant](https://github.com/juliatournant) +- [justinbchau](https://github.com/justinbchau) +- [juweins](https://github.com/juweins) +- [jzcruiser](https://github.com/jzcruiser) +- [kaklakariada](https://github.com/kaklakariada) +- [karinakuz](https://github.com/karinakuz) +- [kattos-aws](https://github.com/kattos-aws) +- [KayakinKoder](https://github.com/KayakinKoder) +- [keu](https://github.com/keu) +- [kgrover](https://github.com/kgrover) +- [kimerinn](https://github.com/kimerinn) +- [koconder](https://github.com/koconder) +- [koji-m](https://github.com/koji-m) +- [krishnaglick](https://github.com/krishnaglick) +- [krisjan-oldekamp](https://github.com/krisjan-oldekamp) +- [ksengers](https://github.com/ksengers) +- [kzzzr](https://github.com/kzzzr) +- [lazebnyi](https://github.com/lazebnyi) +- [leo-schick](https://github.com/leo-schick) +- [letiescanciano](https://github.com/letiescanciano) +- [lgomezm](https://github.com/lgomezm) +- [lideke](https://github.com/lideke) +- [lizdeika](https://github.com/lizdeika) +- [lmossman](https://github.com/lmossman) +- [maciej-nedza](https://github.com/maciej-nedza) +- [macmv](https://github.com/macmv) +- [Mainara](https://github.com/Mainara) +- [makalaaneesh](https://github.com/makalaaneesh) +- [makyash](https://github.com/makyash) +- [malikdiarra](https://github.com/malikdiarra) +- [marcelopio](https://github.com/marcelopio) +- [marcosmarxm](https://github.com/marcosmarxm) +- [mariamthiam](https://github.com/mariamthiam) +- [masonwheeler](https://github.com/masonwheeler) +- [masyagin1998](https://github.com/masyagin1998) +- [matter-q](https://github.com/matter-q) +- [maxi297](https://github.com/maxi297) +- [MaxKrog](https://github.com/MaxKrog) +- [mdibaiee](https://github.com/mdibaiee) +- [mfsiega-airbyte](https://github.com/mfsiega-airbyte) +- [michaelnguyen26](https://github.com/michaelnguyen26) +- [michel-tricot](https://github.com/michel-tricot) +- [mickaelandrieu](https://github.com/mickaelandrieu) +- [midavadim](https://github.com/midavadim) +- [mildbyte](https://github.com/mildbyte) +- [misteryeo](https://github.com/misteryeo) +- [mkhokh-33](https://github.com/mkhokh-33) +- [mlavoie-sm360](https://github.com/mlavoie-sm360) +- [mmolimar](https://github.com/mmolimar) +- [mohamagdy](https://github.com/mohamagdy) +- [mohitreddy1996](https://github.com/mohitreddy1996) +- [monai](https://github.com/monai) +- [mrhallak](https://github.com/mrhallak) +- [Muriloo](https://github.com/Muriloo) 
+- [mustangJaro](https://github.com/mustangJaro) +- [Mykyta-Serbynevskyi](https://github.com/Mykyta-Serbynevskyi) +- [n0rritt](https://github.com/n0rritt) +- [nastra](https://github.com/nastra) +- [nataliekwong](https://github.com/nataliekwong) +- [natalyjazzviolin](https://github.com/natalyjazzviolin) +- [nauxliu](https://github.com/nauxliu) +- [nguyenaiden](https://github.com/nguyenaiden) +- [NipunaPrashan](https://github.com/NipunaPrashan) +- [Nmaxime](https://github.com/Nmaxime) +- [noahkawasaki-airbyte](https://github.com/noahkawasaki-airbyte) +- [noahkawasakigoogle](https://github.com/noahkawasakigoogle) +- [novotl](https://github.com/novotl) +- [ntucker](https://github.com/ntucker) +- [octavia-squidington-iii](https://github.com/octavia-squidington-iii) +- [olivermeyer](https://github.com/olivermeyer) +- [omid](https://github.com/omid) +- [oreopot](https://github.com/oreopot) +- [pabloescoder](https://github.com/pabloescoder) +- [panhavad](https://github.com/panhavad) +- [pecalleja](https://github.com/pecalleja) +- [pedroslopez](https://github.com/pedroslopez) +- [perangel](https://github.com/perangel) +- [peter279k](https://github.com/peter279k) +- [PhilipCorr](https://github.com/PhilipCorr) +- [philippeboyd](https://github.com/philippeboyd) +- [Phlair](https://github.com/Phlair) +- [pmossman](https://github.com/pmossman) +- [po3na4skld](https://github.com/po3na4skld) +- [PoCTo](https://github.com/PoCTo) +- [postamar](https://github.com/postamar) +- [prasrvenkat](https://github.com/prasrvenkat) +- [prateekmukhedkar](https://github.com/prateekmukhedkar) +- [proprefenetre](https://github.com/proprefenetre) +- [Pwaldi](https://github.com/Pwaldi) +- [rach-r](https://github.com/rach-r) +- [ramonvermeulen](https://github.com/ramonvermeulen) +- [ReptilianBrain](https://github.com/ReptilianBrain) +- [rileybrook](https://github.com/rileybrook) +- [RobertoBonnet](https://github.com/RobertoBonnet) +- [robgleason](https://github.com/robgleason) +- [RobLucchi](https://github.com/RobLucchi) +- [rodireich](https://github.com/rodireich) +- [roisinbolt](https://github.com/roisinbolt) +- [roman-romanov-o](https://github.com/roman-romanov-o) +- [roman-yermilov-gl](https://github.com/roman-yermilov-gl) +- [ron-damon](https://github.com/ron-damon) +- [rparrapy](https://github.com/rparrapy) +- [ryankfu](https://github.com/ryankfu) +- [sajarin](https://github.com/sajarin) +- [samos123](https://github.com/samos123) +- [sarafonseca-123](https://github.com/sarafonseca-123) +- [sashaNeshcheret](https://github.com/sashaNeshcheret) +- [SatishChGit](https://github.com/SatishChGit) +- [sbjorn](https://github.com/sbjorn) +- [schlattk](https://github.com/schlattk) +- [scottleechua](https://github.com/scottleechua) +- [sdairs](https://github.com/sdairs) +- [sergei-solonitcyn](https://github.com/sergei-solonitcyn) +- [sergio-ropero](https://github.com/sergio-ropero) +- [sh4sh](https://github.com/sh4sh) +- [shadabshaukat](https://github.com/shadabshaukat) +- [sherifnada](https://github.com/sherifnada) +- [Shishir-rmv](https://github.com/Shishir-rmv) +- [shrodingers](https://github.com/shrodingers) +- [shyngysnurzhan](https://github.com/shyngysnurzhan) +- [siddhant3030](https://github.com/siddhant3030) +- [sivankumar86](https://github.com/sivankumar86) +- [snyk-bot](https://github.com/snyk-bot) +- [SofiiaZaitseva](https://github.com/SofiiaZaitseva) +- [sophia-wiley](https://github.com/sophia-wiley) +- [SPTKL](https://github.com/SPTKL) +- [subhamX](https://github.com/subhamX) +- 
[subodh1810](https://github.com/subodh1810) +- [suhomud](https://github.com/suhomud) +- [supertopher](https://github.com/supertopher) +- [swyxio](https://github.com/swyxio) +- [tbcdns](https://github.com/tbcdns) +- [tealjulia](https://github.com/tealjulia) +- [terencecho](https://github.com/terencecho) +- [thanhlmm](https://github.com/thanhlmm) +- [thomas-vl](https://github.com/thomas-vl) +- [timroes](https://github.com/timroes) +- [tirth7777777](https://github.com/tirth7777777) +- [tjirab](https://github.com/tjirab) +- [tkorenko](https://github.com/tkorenko) +- [tolik0](https://github.com/tolik0) +- [topefolorunso](https://github.com/topefolorunso) +- [trowacat](https://github.com/trowacat) +- [tryangul](https://github.com/tryangul) +- [TSkrebe](https://github.com/TSkrebe) +- [tuanchris](https://github.com/tuanchris) +- [tuliren](https://github.com/tuliren) +- [tyagi-data-wizard](https://github.com/tyagi-data-wizard) +- [tybernstein](https://github.com/tybernstein) +- [TymoshokDmytro](https://github.com/TymoshokDmytro) +- [tyschroed](https://github.com/tyschroed) +- [ufou](https://github.com/ufou) +- [Upmitt](https://github.com/Upmitt) +- [VitaliiMaltsev](https://github.com/VitaliiMaltsev) +- [vitaliizazmic](https://github.com/vitaliizazmic) +- [vladimir-remar](https://github.com/vladimir-remar) +- [vovavovavovavova](https://github.com/vovavovavovavova) +- [wallies](https://github.com/wallies) +- [winar-jin](https://github.com/winar-jin) +- [wissevrowl](https://github.com/wissevrowl) +- [Wittiest](https://github.com/Wittiest) +- [wjwatkinson](https://github.com/wjwatkinson) +- [Xabilahu](https://github.com/Xabilahu) +- [xiaohansong](https://github.com/xiaohansong) +- [xpuska513](https://github.com/xpuska513) +- [yahu98](https://github.com/yahu98) +- [yannibenoit](https://github.com/yannibenoit) +- [yaroslav-dudar](https://github.com/yaroslav-dudar) +- [yaroslav-hrytsaienko](https://github.com/yaroslav-hrytsaienko) +- [YatsukBogdan1](https://github.com/YatsukBogdan1) +- [ycherniaiev](https://github.com/ycherniaiev) +- [yevhenii-ldv](https://github.com/yevhenii-ldv) +- [YiyangLi](https://github.com/YiyangLi) +- [YowanR](https://github.com/YowanR) +- [yuhuishi-convect](https://github.com/yuhuishi-convect) +- [yurii-bidiuk](https://github.com/yurii-bidiuk) +- [Zawar92](https://github.com/Zawar92) +- [zestyping](https://github.com/zestyping) +- [Zirochkaa](https://github.com/Zirochkaa) +- [zkid18](https://github.com/zkid18) +- [zuc](https://github.com/zuc) +- [zzstoatzz](https://github.com/zzstoatzz) +- [zzztimbo](https://github.com/zzztimbo) ```shell -p=1; -while true; do +p=1; +while true; do s=$(curl "https://api.github.com/repos/airbytehq/airbyte/contributors?page=$p") || break [ "0" = $(echo $s | jq length) ] && break - echo $s | jq -r '.[] | "* [" + .login + "](" + .html_url + ")"' + echo $s | jq -r '.[] | "* [" + .login + "](" + .html_url + ")"' p=$((p+1)) done | sort -f ``` diff --git a/README.md b/README.md index 5a58cb7cab8..c5a039f5c88 100644 --- a/README.md +++ b/README.md @@ -34,11 +34,12 @@ We believe that only an **open-source solution to data movement** can cover the _Screenshot taken from [Airbyte Cloud](https://cloud.airbyte.com/signup)_. ### Getting Started -* [Deploy Airbyte Open Source](https://docs.airbyte.com/quickstart/deploy-airbyte) or set up [Airbyte Cloud](https://docs.airbyte.com/cloud/getting-started-with-airbyte-cloud) to start centralizing your data. 
-* Create connectors in minutes with our [no-code Connector Builder](https://docs.airbyte.com/connector-development/connector-builder-ui/overview) or [low-code CDK](https://docs.airbyte.com/connector-development/config-based/low-code-cdk-overview). -* Explore popular use cases in our [tutorials](https://airbyte.com/tutorials). -* Orchestrate Airbyte syncs with [Airflow](https://docs.airbyte.com/operator-guides/using-the-airflow-airbyte-operator), [Prefect](https://docs.airbyte.com/operator-guides/using-prefect-task), [Dagster](https://docs.airbyte.com/operator-guides/using-dagster-integration), [Kestra](https://docs.airbyte.com/operator-guides/using-kestra-plugin) or the [Airbyte API](https://reference.airbyte.com/reference/start). -* Easily transform loaded data with [SQL](https://docs.airbyte.com/operator-guides/transformation-and-normalization/transformations-with-sql) or [dbt](https://docs.airbyte.com/operator-guides/transformation-and-normalization/transformations-with-dbt). + +- [Deploy Airbyte Open Source](https://docs.airbyte.com/quickstart/deploy-airbyte) or set up [Airbyte Cloud](https://docs.airbyte.com/cloud/getting-started-with-airbyte-cloud) to start centralizing your data. +- Create connectors in minutes with our [no-code Connector Builder](https://docs.airbyte.com/connector-development/connector-builder-ui/overview) or [low-code CDK](https://docs.airbyte.com/connector-development/config-based/low-code-cdk-overview). +- Explore popular use cases in our [tutorials](https://airbyte.com/tutorials). +- Orchestrate Airbyte syncs with [Airflow](https://docs.airbyte.com/operator-guides/using-the-airflow-airbyte-operator), [Prefect](https://docs.airbyte.com/operator-guides/using-prefect-task), [Dagster](https://docs.airbyte.com/operator-guides/using-dagster-integration), [Kestra](https://docs.airbyte.com/operator-guides/using-kestra-plugin) or the [Airbyte API](https://reference.airbyte.com/reference/start). +- Easily transform loaded data with [SQL](https://docs.airbyte.com/operator-guides/transformation-and-normalization/transformations-with-sql) or [dbt](https://docs.airbyte.com/operator-guides/transformation-and-normalization/transformations-with-dbt). Try it out yourself with our [demo app](https://demo.airbyte.io/), visit our [full documentation](https://docs.airbyte.com/) and learn more about [recent announcements](https://airbyte.com/blog-categories/company-updates). See our [registry](https://connectors.airbyte.com/files/generated_reports/connector_registry_report.html) for a full list of connectors already available in Airbyte or Airbyte Cloud. diff --git a/airbyte-cdk/java/airbyte-cdk/README.md b/airbyte-cdk/java/airbyte-cdk/README.md index f31f4a90ea3..85e25aaa8e0 100644 --- a/airbyte-cdk/java/airbyte-cdk/README.md +++ b/airbyte-cdk/java/airbyte-cdk/README.md @@ -173,7 +173,7 @@ corresponds to that version. 
### Java CDK | Version | Date | Pull Request | Subject | -|:--------| :--------- | :--------------------------------------------------------- |:---------------------------------------------------------------------------------------------------------------------------------------------------------------| +| :------ | :--------- | :--------------------------------------------------------- | :------------------------------------------------------------------------------------------------------------------------------------------------------------- | | 0.33.1 | 2024-05-03 | [\#37824](https://github.com/airbytehq/airbyte/pull/37824) | Add a unit test for cursor based sync | | 0.33.0 | 2024-05-03 | [\#36935](https://github.com/airbytehq/airbyte/pull/36935) | Destinations: Enable non-safe-casting DV2 tests | | 0.32.0 | 2024-05-03 | [\#36929](https://github.com/airbytehq/airbyte/pull/36929) | Destinations: Assorted DV2 changes for mysql | diff --git a/airbyte-cdk/java/airbyte-cdk/core/src/main/kotlin/io/airbyte/cdk/integrations/base/ssh/readme.md b/airbyte-cdk/java/airbyte-cdk/core/src/main/kotlin/io/airbyte/cdk/integrations/base/ssh/readme.md index f72da8f4384..d711f51b70d 100644 --- a/airbyte-cdk/java/airbyte-cdk/core/src/main/kotlin/io/airbyte/cdk/integrations/base/ssh/readme.md +++ b/airbyte-cdk/java/airbyte-cdk/core/src/main/kotlin/io/airbyte/cdk/integrations/base/ssh/readme.md @@ -1,10 +1,13 @@ # Developing an SSH Connector ## Goal + Easy development of any connector that needs the ability to connect to a resource via SSH Tunnel. ## Overview + Our SSH connector support is designed to be easy to plug into any existing connector. There are a few major pieces to consider: + 1. Add SSH Configuration to the Spec - for SSH, we need to take in additional configuration, so we need to inject extra fields into the connector configuration. 2. Add SSH Logic to the Connector - before the connector code begins to execute we need to start an SSH tunnel. This library provides logic to create that tunnel (and clean it up). 3. Acceptance Testing - it is a good practice to include acceptance testing for the SSH version of a connector for at least one of the SSH types (password or ssh key). While unit testing for the SSH functionality exists in this package (coming soon), high-level acceptance testing to make sure this feature works with the individual connector belongs in the connector. @@ -12,40 +15,47 @@ Our SSH connector support is designed to be easy to plug into any existing conne ## How To ### Add SSH Configuration to the Spec + 1. The `SshHelpers` class provides 2 helper functions that injects the SSH configuration objects into a spec JsonSchema for an existing connector. Usually the `spec()` method for a connector looks like `Jsons.deserialize(MoreResources.readResource("spec.json"), ConnectorSpecification.class);`. These helpers are just injecting the ssh spec (`ssh-tunnel-spec.json`) into that spec. 2. You may need to update tests to reflect that new fields have been added to the spec. Usually updating the tests just requires using these helpers in the tests. ### Add SSH Logic to the Connector + 1. This package provides a Source decorated class to make it easy to add SSH logic to an existing source. Simply pass the source you want to wrap into the constructor of the `SshWrappedSource`. That class also requires two other fields: `hostKey` and `portKey`. Both of these fields are pointers to fields in the connector specification. 
The `hostKey` is a pointer to the field that hold the host of the resource you want to connect and `portKey` is the port. In a simple case, where the host name for a connector is just defined in the top-level `host` field, then `hostKey` would simply be: `["host"]`. If that field is nested, however, then it might be: `["database", "configuration", "host"]`. ### Acceptance Testing + 1. The only difference between existing acceptance testing and acceptance testing with SSH is that the configuration that is used for testing needs to contain additional fields. You can see the `Postgres Source ssh key creds` in lastpass to see an example of what that might look like. Those credentials leverage an existing bastion host in our test infrastructure. (As future work, we want to get rid of the need to use a static bastion server and instead do it in docker so we can run it all locally.) ## Misc ### How to wrap the protocol in an SSH Tunnel + For `spec()`, `check()`, and `discover()` wrapping the connector in an SSH tunnel is easier to think about because when they return all work is done and the tunnel can be closed. Thus, each of these methods can simply be wrapped in a try-with-resource of the SSH Tunnel. For `read()` and `write()` they return an iterator and consumer respectively that perform work that must happen within the SSH Tunnel after the method has returned. Therefore, the `close` function on the iterator and consumer have to handle closing the SSH tunnel; the methods themselves cannot just be wrapped in a try-with-resource. This is handled for you by the `SshWrappedSource`, but if you need to implement any of this manually you must take it into account. ### Name Mangling + One of the least intuitive pieces of the SSH setup to follow is the replacement of host names and ports. The reason `SshWrappedSource` needs to know how to get the hostname and port of the database you are trying to connect to is that when it builds the SSH tunnel that forwards to the database, it needs to know the hostname and port so that the tunnel forwards requests to the right place. After the SSH tunnel is established and forwarding to the database, the connector code itself runs. There's a trick here though! The connector should NOT try to connect to the hostname and port of the database. Instead, it should be trying to connect to `localhost` and whatever port we are forwarding to the database. The `SshTunnel#sshWrap` removes the original host and port from the configuration for the connector and replaces it with `localhost` and the correct port. So from the connector code's point of view it is just operating on localhost. There is a tradeoff here. -* (Good) The way we have structured this allows users to configure a connector in the UI in a way that it is intuitive to user. They put in the host and port they think about referring to the database as (they don't need to worry about any of the localhost version). -* (Good) The connector code does not need to know anything about SSH, it can just operate on the host and port it gets (and we let SSH Tunnel handle swapping the names for us) which makes writing a connector easier. -* (Bad) The downside is that the `SshTunnel` logic is more complicated because it is absorbing all of this name swapping so that neither user nor connector developer need to worry about it. In our estimation, the good outweighs the extra complexity incurred here. +- (Good) The way we have structured this allows users to configure a connector in the UI in a way that it is intuitive to user. 
They put in the host and port they think about referring to the database as (they don't need to worry about any of the localhost version). +- (Good) The connector code does not need to know anything about SSH, it can just operate on the host and port it gets (and we let SSH Tunnel handle swapping the names for us) which makes writing a connector easier. +- (Bad) The downside is that the `SshTunnel` logic is more complicated because it is absorbing all of this name swapping so that neither user nor connector developer need to worry about it. In our estimation, the good outweighs the extra complexity incurred here. + +### Acceptance Testing via ssh tunnel using SshBastion and JdbcDatabaseContainer in Docker -### Acceptance Testing via ssh tunnel using SshBastion and JdbcDatabaseContainer in Docker 1. The `SshBastion` class provides 3 helper functions: `initAndStartBastion()` to initialize and start an SSH bastion server in a Docker test container and create a new `Network` for the bastion and the tested JDBC container; `getTunnelConfig()`, which returns a `JsonNode` with all necessary configuration to establish the ssh tunnel (connection configuration for integration tests is now taken directly from container settings and does not require a real database connection); and `stopAndCloseContainers` to stop and close the SshBastion and JdbcDatabaseContainer at the end of the test. ## Future Work -* Add unit / integration testing for `ssh` package. -* Restructure spec so that instead of having `SSH Key Authentication` or `Password Authentication` options for `tunnel_method`, just have an `SSH` option and then within that `SSH` option have a `oneOf` for password or key. This is blocked because we cannot use `oneOf`s nested in `oneOf`s. -* Improve the process of acceptance testing by allowing doing acceptance testing using a bastion running in a docker container instead of having to use dedicated infrastructure and a static database. + +- Add unit / integration testing for `ssh` package. +- Restructure spec so that instead of having `SSH Key Authentication` or `Password Authentication` options for `tunnel_method`, just have an `SSH` option and then within that `SSH` option have a `oneOf` for password or key. This is blocked because we cannot use `oneOf`s nested in `oneOf`s. +- Improve the process of acceptance testing by allowing doing acceptance testing using a bastion running in a docker container instead of having to use dedicated infrastructure and a static database.
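To make the wrapping step described under "Add SSH Logic to the Connector" concrete, here is a minimal sketch of exposing an SSH-wrapped source. It is illustrative only: the `wrap` helper and the top-level `host`/`port` spec fields are assumptions, the import paths are inferred from this readme's location, and the constructor shape (delegate source plus the `hostKey`/`portKey` paths) follows the description above rather than quoting this repository's code.

```java
// Minimal sketch, assuming SshWrappedSource(delegate, hostKey, portKey) as described
// in this readme; the wrap() helper and the top-level "host"/"port" spec fields are
// illustrative assumptions, not code taken from this repository.
import io.airbyte.cdk.integrations.base.Source;
import io.airbyte.cdk.integrations.base.ssh.SshWrappedSource;
import java.util.List;

public final class SshWrappingExample {

  private SshWrappingExample() {}

  // Wrap an existing source so spec/check/discover/read all run through an SSH tunnel.
  // hostKey/portKey point at the spec fields holding the host and port; for a spec
  // with top-level "host" and "port" fields they are single-element paths.
  public static Source wrap(final Source delegate) {
    return new SshWrappedSource(delegate, List.of("host"), List.of("port"));
  }
}
```

Because `SshTunnel#sshWrap` rewrites the configuration to point at `localhost` and the forwarded port, the delegate source itself stays free of any SSH-specific logic.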
diff --git a/airbyte-cdk/python/CHANGELOG.md b/airbyte-cdk/python/CHANGELOG.md index c4441300091..a1cc8ad80fb 100644 --- a/airbyte-cdk/python/CHANGELOG.md +++ b/airbyte-cdk/python/CHANGELOG.md @@ -1,913 +1,1206 @@ # Changelog ## 0.86.3 + File-based CDK: allow to merge schemas with nullable object values ## 0.86.2 + Fix schemas merge for nullable object types ## 0.86.1 + Fix schemas merge for nullable object types ## 0.86.0 -Expose airbyte_cdk.__version__ and pin airbyte-protocol-models dependency to + +Expose airbyte_cdk.**version** and pin airbyte-protocol-models dependency to ## 0.85.0 + Connector builder: read input state if it exists ## 0.84.0 -Remove package which was deprecated 2021 or earlier + +Remove package which was deprecated 2021 or earlier ## 0.83.1 + Concurrent CDK: if exception is AirbyteTracedException, raise this and not StreamThreadException ## 0.83.0 + Low-code: Add JwtAuthenticator ## 0.82.0 + Connector builder: emit state messages ## 0.81.8 + Concurrent CDK: Break Python application with status 1 on exception ## 0.81.7 + Concurrent CDK: Fix to update partitioned state only when partition is successful ## 0.81.6 + Upgrade to recent version of langchain ## 0.81.5 + Updated langchain version and add langchain_core as a dependency ## 0.81.4 -Adding stream_descriptor as part of AirbyteTracedException.__init__ + +Adding stream_descriptor as part of AirbyteTracedException.**init** ## 0.81.3 + Republish print buffer after previous pypi attempt timed out ## 0.81.2 + Fix concurrent CDK printing by flushing the print buffer for every message ## 0.81.1 + Concurrent CDK: add logging on exception ## 0.81.0 + Unpin airbyte-protocol-models library ## 0.80.0 + Concurrent CDK: support partitioned states ## 0.79.2 + Concurrent CDK: Print error messages properly so that they can be categorized ## 0.79.1 + Dummy patch to test new publishing flow fixes ## 0.79.0 + Update release process of airbyte-cdk and source-declarative manifest ## 0.78.9 + Fix CDK version mismatch introduced in 0.78.8 ## 0.78.8 + Update error messaging/type for missing streams. 
Note: version mismatch, please use 0.78.9 instead ## 0.78.6 -low-code: add backward compatibility for old close slice behavior + +low-code: add backward compatibility for old close slice behavior ## 0.78.5 + low-code: fix stop_condition instantiation in the cursor pagination ## 0.78.4 + low-code: Add last_record and last_page_size interpolation variables to pagination ## 0.78.3 + Fix dependencies for file-based extras ## 0.78.2 -low-code: fix retrieving partition key for legacy state migration + +low-code: fix retrieving partition key for legacy state migration ## 0.78.1 + connector-builder: return full url-encoded URL instead of separating parameters ## 0.78.0 + low-code: Allow state migration with CustomPartitionRouter ## 0.77.2 + Emit state recordCount as float instead of integer ## 0.77.1 -Fix empty , , extras packages + +Fix empty , , extras packages ## 0.77.0 + low-code: Add string interpolation filter ## 0.76.0 + Migrate Python CDK to Poetry ## 0.75.0 + low-code: Add StateMigration component ## 0.74.0 + Request option params are allowed to be an array ## 0.73.0 + set minimum python version to 3.9 ## 0.72.2 + Connector Builder: have schema fields be nullable by default except from PK and cursor field ## 0.72.1 + low code: add refresh_token_error handler to DeclarativeOauth2Authenticator ## 0.72.0 + low-code: Allow defining custom schema loaders ## 0.71.0 + Declarative datetime-based cursors now only derive state values from records that were read ## 0.70.2 + low-code: remove superfluous sleep ## 0.70.1 + File-based CDK: Fix tab delimiter configuration in CSV file type ## 0.70.0 + testing ## 0.69.2 + low-code: improve error message when a custom component cannot be found ## 0.69.1 + Update mock server test entrypoint wrapper to use per-stream state ## 0.69.0 + Include recordCount in stream state messages and final state message for full refresh syncs ## 0.68.4 + low-code: update cartesian stream slice to emit typed StreamSlice ## 0.68.3 + Low-code: adding a default value if a stream slice is None during read_records ## 0.68.2 + low-code: remove parent cursor compoent from incremental substreams' state message ## 0.68.1 + no-op republish of 0.68.0 ## 0.68.0 + low-code: Allow page size to be defined with string interpolation ## 0.67.3 + CDK: upgrade pyarrow ## 0.67.2 + File CDK: Update parquet parser to handle values that resolve to None ## 0.67.1 + Fix handling of tab-separated CSVs ## 0.67.0 + Low-code: Add CustomRecordFilter ## 0.66.0 + Low-code: Add interpolation for request options ## 0.65.0 + low-code: Allow connectors to ignore stream slicer request options on paginated requests ## 0.64.1 - ## 0.64.0 + Low-code: Add filter to RemoveFields ## 0.63.2 + Correct handling of custom max_records limits in connector_builder ## 0.63.1 -File-based CDK: fix record enqueuing + +File-based CDK: fix record enqueuing ## 0.63.0 + Per-stream error reporting and continue syncing on error by default ## 0.62.2 + mask access key when logging refresh response ## 0.62.1 + [ISSUE #34910] add headers to HttpResponse for test framework ## 0.62.0 + File-based CDK: functionality to make incremental syncs concurrent ## 0.61.2 + [ISSUE #34755] do not propagate parameters on JSON schemas ## 0.61.1 + Align version in CDK Dockerfile to be consistent. Before this change, the docker images was mistakenly pinned to version 0.58.5. 
## 0.61.0 + File-based CDK: log warning on no sync mode instead of raising exception ## 0.60.2 + Improve error messages for concurrent CDK ## 0.60.1 + Emit state when no partitions are generated for ccdk and update StateBuilder ## 0.60.0 + File-based CDK: run full refresh syncs with concurrency ## 0.59.2 + Fix CCDK overlapping message due to print in entrypoint ## 0.59.1 + Fix concurrent CDK deadlock ## 0.59.0 + Fix state message handling when running concurrent syncs ## 0.58.9 + concurrent-cdk: improve resource usage when reading from substreams ## 0.58.8 + CDK: HttpRequester can accept http_method in str format, which is required by custom low code components ## 0.58.7 - ## 0.58.6 + File CDK: Added logic to emit logged `RecordParseError` errors and raise the single `AirbyteTracebackException` in the end of the sync, instead of silent skipping the parsing errors. PR: https://github.com/airbytehq/airbyte/pull/32589 ## 0.58.5 + Handle private network exception as config error ## 0.58.4 + Add POST method to HttpMocker ## 0.58.3 + fix declarative oauth initialization ## 0.58.2 + Integration tests: adding debug mode to improve logging ## 0.58.1 + Add schema normalization to declarative stream ## 0.58.0 + Concurrent CDK: add state converter for ISO timestamps with millisecond granularity ## 0.57.8 + add SelectiveAuthenticator ## 0.57.7 + File CDK: Support raw txt file ## 0.57.6 + Adding more tooling to cover source-stripe events stream ## 0.57.5 + Raise error on passing unsupported value formats as query parameters ## 0.57.4 + Vector DB CDK: Refactor embedders, File based CDK: Handle 422 errors properly in document file type parser ## 0.57.3 + Vector DB CDK: Refactor embedders, File based CDK: Handle 422 errors properly in document file type parser ## 0.57.2 + Update airbyte-protocol ## 0.57.1 + Improve integration tests tooling ## 0.57.0 + low-code: cache requests sent for parent streams File-based CDK: Add support for automatic primary key for document file type format File-based CDK: Add support for remote parsing of document file type format via API Vector DB CDK: Fix bug with embedding tokens with special meaning like `<|endoftext|>` ## 0.56.1 + no-op to verify pypi publish flow ## 0.56.0 + Allow for connectors to continue syncing when a stream fails ## 0.55.5 + File-based CDK: hide source-defined primary key; users can define primary keys in the connection's configuration ## 0.55.4 + Source Integration tests: decoupling entrypoint wrapper from pytest ## 0.55.3 + First iteration of integration tests tooling (http mocker and response builder) ## 0.55.2 + concurrent-cdk: factory method initializes concurrent source with default number of max tasks ## 0.55.1 + Vector DB CDK: Add omit_raw_text flag ## 0.55.0 + concurrent cdk: read multiple streams concurrently ## 0.54.0 + low-code: fix injection of page token if first request ## 0.53.9 -Fix of generate the error message using _try_get_error based on list of errors + +Fix of generate the error message using \_try_get_error based on list of errors ## 0.53.8 + Vector DB CDK: Remove CDC records, File CDK: Update unstructured parser ## 0.53.7 + low-code: fix debug logging when using --debug flag ## 0.53.6 + Increase maximum_attempts_to_acquire to avoid crashing in acquire_call ## 0.53.5 + File CDK: Improve stream config appearance ## 0.53.4 + Concurrent CDK: fix futures pruning ## 0.53.3 + Fix spec schema generation for File CDK and Vector DB CDK and allow skipping invalid files in document file parser ## 0.53.2 + Concurrent CDK: Increase 
connection pool size to allow for 20 max workers ## 0.53.1 + Concurrent CDK: Improve handling of future to avoid memory leak and improve performances ## 0.53.0 + Add call rate functionality ## 0.52.10 + Fix class SessionTokenAuthenticator for CLASS_TYPES_REGISTRY mapper ## 0.52.9 + File CDK: Improve file type detection in document file type parser ## 0.52.8 + Concurrent CDK: incremental (missing state conversion). Outside of concurrent specific work, this includes the following changes: -* Checkpointing state was acting on the number of records per slice. This has been changed to consider the number of records per syncs -* `Source.read_state` and `Source._emit_legacy_state_format` are now classmethods to allow for developers to have access to the state before instantiating the source + +- Checkpointing state was acting on the number of records per slice. This has been changed to consider the number of records per syncs +- `Source.read_state` and `Source._emit_legacy_state_format` are now classmethods to allow for developers to have access to the state before instantiating the source ## 0.52.7 + File CDK: Add pptx support ## 0.52.6 -make parameter as not required for default backoff handler + +make parameter as not required for default backoff handler ## 0.52.5 + use in-memory cache if no file path is provided ## 0.52.4 + File CDK: Add unstructured parser ## 0.52.3 + Update source-declarative-manifest base image to update Linux alpine and Python ## 0.52.2 - ## 0.52.1 + Add max time for backoff handler ## 0.52.0 + File CDK: Add CustomFileBasedException for custom errors ## 0.51.44 + low-code: Allow connector developers to specify the type of an added field ## 0.51.43 + concurrent cdk: fail fast if a partition raises an exception ## 0.51.42 + File CDK: Avoid listing all files for check command ## 0.51.41 + Vector DB CDK: Expose stream identifier logic, add field remapping to processing | File CDK: Emit analytics message for used streams ## 0.51.40 -Add filters for base64 encode and decode in Jinja Interpolation + +Add filters for base64 encode and decode in Jinja Interpolation ## 0.51.39 + Few bug fixes for concurrent cdk ## 0.51.38 + Add ability to wrap HTTP errors with specific status codes occurred during access token refresh into AirbyteTracedException ## 0.51.37 + Enable debug logging when running availability check ## 0.51.36 + Enable debug logging when running availability check ## 0.51.35 + File CDK: Allow configuring number of tested files for schema inference and parsability check ## 0.51.34 + Vector DB CDK: Fix OpenAI compatible embedder when used without api key ## 0.51.33 + Vector DB CDK: Improve batching process ## 0.51.32 + Introduce experimental ThreadBasedConcurrentStream ## 0.51.31 + Fix initialize of token_expiry_is_time_of_expiration field ## 0.51.30 + Add new token_expiry_is_time_of_expiration property for AbstractOauth2Authenticator for indicate that token's expiry_in is a time of expiration ## 0.51.29 + Coerce read_records to iterable in http availabilty strategy ## 0.51.28 + Add functionality enabling Page Number/Offset to be set on the first request ## 0.51.27 + Fix parsing of UUID fields in avro files ## 0.51.26 + Vector DB CDK: Fix OpenAI embedder batch size ## 0.51.25 -Add configurable OpenAI embedder to cdk and add cloud environment helper + +Add configurable OpenAI embedder to cdk and add cloud environment helper ## 0.51.24 + Fix previous version of request_cache clearing ## 0.51.23 + Fix request_cache clearing and move it to tmp folder ## 0.51.22 + Vector DB 
CDK: Adjust batch size for Azure embedder to current limits ## 0.51.21 + Change Error message if Stream is not found ## 0.51.20 + Vector DB CDK: Add text splitting options to document processing ## 0.51.19 + Ensuring invalid user-provided urls does not generate sentry issues ## 0.51.18 + Vector DB CDK adjustments: Prevent failures with big records and OpenAI embedder ## 0.51.17 + [ISSUE #30353] File-Based CDK: remove file_type from stream config ## 0.51.16 + Connector Builder: fix datetime format inference for str parsable as int but not isdecimal ## 0.51.15 + Vector DB CDK: Add Azure OpenAI embedder ## 0.51.14 + File-based CDK: improve error message for CSV parsing error ## 0.51.13 + File-based CDK: migrated parsing error to config error to avoid sentry alerts ## 0.51.12 + Add from-field embedder to vector db CDK ## 0.51.11 + FIle-based CDK: Update spec and fix autogenerated headers with skip after ## 0.51.10 + Vector DB CDK adjustments: Fix id generation, improve config spec, add base test case ## 0.51.9 + [Issue #29660] Support empty keys with record selection ## 0.51.8 + Add vector db CDK helpers ## 0.51.7 + File-based CDK: allow user to provided column names for CSV files ## 0.51.6 + File-based CDK: allow for extension mismatch ## 0.51.5 + File-based CDK: Remove CSV noisy log ## 0.51.4 + Source-S3 V4: feature parity rollout ## 0.51.3 + File-based CDK: Do not stop processing files in slice on error ## 0.51.2 + Check config against spec in embedded sources and remove list endpoint from connector builder module ## 0.51.1 + low-code: allow formatting datetime as milliseconds since unix epoch ## 0.51.0 + File-based CDK: handle legacy options ## 0.50.2 + Fix title and description of datetime_format fields ## 0.50.1 + File-based CDK cursor and entrypoint updates ## 0.50.0 + Low code CDK: Decouple SimpleRetriever and HttpStream ## 0.49.0 + Add utils for embedding sources in other Python applications ## 0.48.0 + Relax pydantic version requirement and update to protocol models version 0.4.0 ## 0.47.5 + Support many format for cursor datetime ## 0.47.4 + File-based CDK updates ## 0.47.3 + Connector Builder: Ensure we return when there are no slices ## 0.47.2 + low-code: deduplicate query params if they are already encoded in the URL ## 0.47.1 + Fix RemoveFields transformation issue ## 0.47.0 + Breaking change: Rename existing SessionTokenAuthenticator to LegacySessionTokenAuthenticator and make SessionTokenAuthenticator more generic ## 0.46.1 + Connector builder: warn if the max number of records was reached ## 0.46.0 + Remove pyarrow from main dependency and add it to extras ## 0.45.0 + Fix pyyaml and cython incompatibility ## 0.44.4 + Connector builder: Show all request/responses as part of the testing panel ## 0.44.3 + [ISSUE #27494] allow for state to rely on transformed field ## 0.44.2 + Ensuring the state value format matches the cursor value from the record ## 0.44.1 + Fix issue with incremental sync following data feed release ## 0.44.0 + Support data feed like incremental syncs ## 0.43.3 + Fix return type of RecordFilter: changed from generator to list ## 0.43.2 + Connector builder module: serialize request body as string ## 0.43.1 + Fix availability check to handle HttpErrors which happen during slice extraction ## 0.43.0 + Refactoring declarative state management ## 0.42.1 + Error message on state per partition state discrepancy ## 0.42.0 + Supporting state per partition given incremental sync and partition router ## 0.41.0 + Use x-www-urlencoded for access token refresh requests ## 
0.40.5 -Replace with when making oauth calls + +Replace with when making oauth calls ## 0.40.4 + Emit messages using message repository ## 0.40.3 + Add utils for inferring datetime formats ## 0.40.2 + Add a metadata field to the declarative component schema ## 0.40.1 + make DatetimeBasedCursor.end_datetime optional ## 0.40.0 + Remove SingleUseRefreshTokenOAuthAuthenticator from low code CDK and add generic injection capabilities to ApiKeyAuthenticator ## 0.39.4 + Connector builder: add latest connector config control message to read calls ## 0.39.3 + Add refresh token update capabilities to OAuthAuthenticator ## 0.39.2 + Make step and cursor_granularity optional ## 0.39.1 + Improve connector builder error messages ## 0.39.0 + Align schema generation in SchemaInferrer with Airbyte platform capabilities ## 0.38.0 + Allow nested objects in request_body_json ## 0.37.0 + low-code: Make refresh token in oauth authenticator optional ## 0.36.5 + Unfreeze requests version and test new pipeline ## 0.36.4 + low-code: use jinja sandbox and restrict some methods ## 0.36.3 + pin the version of the requests library ## 0.36.2 + Support parsing non UTC dates and Connector Builder set slice descriptor ## 0.36.1 + low-code: fix add field transformation when running from the connector builder ## 0.36.0 + Emit stream status messages ## 0.35.4 + low-code: remove now_local() macro because it's too unpredictable ## 0.35.3 + low-code: alias stream_interval and stream_partition to stream_slice in jinja context ## 0.35.2 + Connector builder scrubs secrets from raw request and response ## 0.35.1 + low-code: Add title, description, and examples for all fields in the manifest schema ## 0.35.0 + low-code: simplify session token authenticator interface ## 0.34.3 + low-code: fix typo in ManifestDeclarativeSource ## 0.34.2 + Emit slice log messages when running the connector builder ## 0.34.1 + set slice and pages limit when reading from the connector builder module ## 0.34.0 + Low-Code CDK: Enable use of SingleUseRefreshTokenAuthenticator ## 0.33.2 + low-code: fix duplicate stream slicer update ## 0.33.1 + Low-Code CDK: make RecordFilter.filter_records as generator ## 0.33.0 + Enable oauth flow for low-code connectors ## 0.32.0 + Remove unexpected error swallowing on abstract source's check method ## 0.31.1 + connector builder: send stacktrace when error on read ## 0.31.0 + Add connector builder module for handling Connector Builder server requests ## 0.30.4 + CDK's read command handler supports Connector Builder list_streams requests ## 0.30.3 + Fix reset pagination issue on test reads ## 0.30.2 -* Low-code CDK: Override refresh_access_token logic DeclarativeOAuthAuthenticator + +- Low-code CDK: Override refresh_access_token logic DeclarativeOAuthAuthenticator ## 0.30.1 + Releasing using the new release flow. No change to the CDK per se ## 0.30.0 + OAuth: retry refresh access token requests ## 0.29.3 + Low-Code CDK: duration macro added ## 0.29.2 + support python3.8 ## 0.29.1 + Publishing Docker image for source-declarative-manifest ## 0.29.0 + **Breaking changes: We have promoted the low-code CDK to Beta. 
This release contains a number of breaking changes intended to improve the overall usability of the language by reorganizing certain concepts, renaming, reducing some field duplication, and removal of fields that are seldom used.** -The changes are: -* Deprecated the concept of Stream Slicers in favor of two individual concepts: Incremental Syncs, and Partition Routers: - * Stream will define an `incremental_sync` field which is responsible for defining how the connector should support incremental syncs using a cursor field. `DatetimeStreamSlicer` has been renamed to `DatetimeBasedCursor` and can be used for this field. - * `Retriever`s will now define a `partition_router` field. The remaining slicers are now called `SubstreamPartitionRouter` and `ListPartitionRouter`, both of which can be used here as they already have been. - * The `CartesianProductStreamSlicer` because `partition_router` can accept a list of values and will generate that same cartesian product by default. -* `$options` have been renamed to `$parameters` -* Changed the notation for component references to the JSON schema notation (`$ref: "#/definitions/requester"`) -* `DefaultPaginator` no longer has a `url_base` field. Moving forward, paginators will derive the `url_base` from the `HttpRequester`. There are some unique cases for connectors that implement a custom `Retriever`. -* `primary_key` and `name` no longer need to be defined on `Retriever`s or `Requester`s. They will be derived from the stream’s definition -* Streams no longer define a `stream_cursor_field` and will derive it from the `incremental_sync` component. `checkpoint_interval` has also been deprecated -* DpathExtractor `field_pointer` has been renamed to `field_path` -* `RequestOption` can no longer be used with with `inject_into` set to `path`. There is now a dedicated `RequestPath` component moving forward. +The changes are: + +- Deprecated the concept of Stream Slicers in favor of two individual concepts: Incremental Syncs, and Partition Routers: + - Stream will define an `incremental_sync` field which is responsible for defining how the connector should support incremental syncs using a cursor field. `DatetimeStreamSlicer` has been renamed to `DatetimeBasedCursor` and can be used for this field. + - `Retriever`s will now define a `partition_router` field. The remaining slicers are now called `SubstreamPartitionRouter` and `ListPartitionRouter`, both of which can be used here as they already have been. + - The `CartesianProductStreamSlicer` because `partition_router` can accept a list of values and will generate that same cartesian product by default. +- `$options` have been renamed to `$parameters` +- Changed the notation for component references to the JSON schema notation (`$ref: "#/definitions/requester"`) +- `DefaultPaginator` no longer has a `url_base` field. Moving forward, paginators will derive the `url_base` from the `HttpRequester`. There are some unique cases for connectors that implement a custom `Retriever`. +- `primary_key` and `name` no longer need to be defined on `Retriever`s or `Requester`s. They will be derived from the stream’s definition +- Streams no longer define a `stream_cursor_field` and will derive it from the `incremental_sync` component. `checkpoint_interval` has also been deprecated +- DpathExtractor `field_pointer` has been renamed to `field_path` +- `RequestOption` can no longer be used with with `inject_into` set to `path`. There is now a dedicated `RequestPath` component moving forward. 
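To make the renames above concrete, here is a small illustrative sketch (not part of the original changelog) of a stream definition expressed as a Python mapping. Every field value is a placeholder rather than something taken from a real connector; the only point is to show `$parameters`, `incremental_sync` with `DatetimeBasedCursor`, `partition_router`, and the `$ref` notation side by side.

```python
# Illustrative only: a post-0.29.0 style stream definition sketched as a Python dict.
# All names and values below are placeholders, not taken from a real connector.
stream_definition = {
    "type": "DeclarativeStream",
    "$parameters": {"name": "orders"},  # formerly "$options"
    "incremental_sync": {  # replaces the DatetimeStreamSlicer-based stream slicer
        "type": "DatetimeBasedCursor",
        "cursor_field": "updated_at",
        "start_datetime": "2021-01-01T00:00:00Z",
        "datetime_format": "%Y-%m-%dT%H:%M:%SZ",
    },
    "retriever": {
        "$ref": "#/definitions/retriever",  # JSON schema style component reference
        "partition_router": {  # the remaining slicers are now partition routers
            "type": "ListPartitionRouter",
            "values": ["EU", "US"],
            "cursor_field": "region",
        },
    },
}
```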
## 0.28.1 -Low-Code CDK: fix signature _parse_records_and_emit_request_and_responses + +Low-Code CDK: fix signature \_parse_records_and_emit_request_and_responses ## 0.28.0 + Low-Code: improve day_delta macro and MinMaxDatetime component ## 0.27.0 + Make HttpAvailabilityStrategy default for HttpStreams ## 0.26.0 + Low-Code CDK: make DatetimeStreamSlicer.step as InterpolatedString ## 0.25.2 + Low-Code: SubstreamSlicer.parent_key - dpath support added ## 0.25.1 + Fix issue when trying to log stream slices that are non-JSON-serializable ## 0.25.0 + Use dpath.util.values method to parse response with nested lists ## 0.24.0 + Use dpath.util.values method to parse response with nested lists ## 0.23.0 + Limiting the number of HTTP requests during a test read ## 0.22.0 + Surface the resolved manifest in the CDK ## 0.21.0 + Add AvailabilityStrategy concept and use check_availability within CheckStream ## 0.20.2 + Add missing package in previous patch release ## 0.20.1 + Handle edge cases for CheckStream - checking connection to empty stream, and checking connection to substream with no parent records ## 0.20.0 + Low-Code: Refactor low-code to use Pydantic model based manifest parsing and component creation ## 0.19.1 + Low-code: Make documentation_url in the Spec be optional ## 0.19.0 + Low-Code: Handle forward references in manifest ## 0.18.1 + Allow for CustomRequester to be defined within declarative manifests ## 0.18.0 + Adding `cursor_granularity` to the declarative API of DatetimeStreamSlicer ## 0.17.0 + Add utility class to infer schemas from real records ## 0.16.3 + Do not eagerly refresh access token in `SingleUseRefreshTokenOauth2Authenticator` [#20923](https://github.com/airbytehq/airbyte/pull/20923) ## 0.16.2 + Fix the naming of OAuthAuthenticator ## 0.16.1 + Include declarative_component_schema.yaml in the publish to PyPi ## 0.16.0 + Start validating low-code manifests using the declarative_component_schema.yaml file ## 0.15.0 + Reverts additions from versions 0.13.0 and 0.13.3. ## 0.14.0 + Low-code: Add token_expiry_date_format to OAuth Authenticator. Resolve ref schema ## 0.13.3 + Fixed `StopIteration` exception for empty streams while `check_availability` runs. ## 0.13.2 + Low-code: Enable low-code CDK users to specify schema inline in the manifest ## 0.13.1 + Low-code: Add `SessionTokenAuthenticator` ## 0.13.0 + Add `Stream.check_availability` and `Stream.AvailabilityStrategy`. Make `HttpAvailabilityStrategy` the default `HttpStream.AvailabilityStrategy`. ## 0.12.4 + Lookback window should applied when a state is supplied as well ## 0.12.3 + Low-code: Finally, make `OffsetIncrement.page_size` interpolated string or int ## 0.12.2 + Revert breaking change on `read_config` while keeping the improvement on the error message ## 0.12.0 + Improve error readability when reading JSON config files ## 0.11.3 + Low-code: Log response error message on failure ## 0.11.2 + Low-code: Include the HTTP method used by the request in logging output of the `airbyte-cdk` ## 0.11.1 + Low-code: Fix the component manifest schema to and validate check instead of checker ## 0.11.0 + Declare a new authenticator `SingleUseRefreshTokenOauth2Authenticator` that can perform connector configuration mutation and emit `AirbyteControlMessage.ConnectorConfig`. 
## 0.10.0 + Low-code: Add `start_from_page` option to a PageIncrement class ## 0.9.5 + Low-code: Add jinja macro `format_datetime` ## 0.9.4 + Low-code: Fix reference resolution for connector builder ## 0.9.3 + Low-code: Avoid duplicate HTTP query in `simple_retriever` ## 0.9.2 + Low-code: Make `default_paginator.page_token_option` optional ## 0.9.1 + Low-code: Fix filtering vars in `InterpolatedRequestInputProvider.eval_request_inputs` ## 0.9.0 + Low-code: Allow `grant_type` to be specified for OAuthAuthenticator ## 0.8.1 + Low-code: Don't update cursor for non-record messages and fix default loader for connector builder manifests ## 0.8.0 + Low-code: Allow for request and response to be emitted as log messages ## 0.7.1 + Low-code: Decouple yaml manifest parsing from the declarative source implementation ## 0.7.0 + Low-code: Allow connector specifications to be defined in the manifest ## 0.6.0 + Low-code: Add support for monthly and yearly incremental updates for `DatetimeStreamSlicer` ## 0.5.4 + Low-code: Get response.json in a safe way ## 0.5.3 + Low-code: Replace EmptySchemaLoader with DefaultSchemaLoader to retain backwards compatibility Low-code: Evaluate backoff strategies at runtime ## 0.5.2 + Low-code: Allow for read even when schemas are not defined for a connector yet ## 0.4.2 + Low-code: Fix off by one error with the stream slicers ## 0.4.1 + Low-code: Fix a few bugs with the stream slicers ## 0.4.0 + Low-code: Add support for custom error messages on error response filters ## 0.3.0 -Publish python typehints via `py.typed` file. + +Publish python typehints via `py.typed` file. ## 0.2.3 + - Propagate options to InterpolatedRequestInputProvider ## 0.2.2 + - Report config validation errors as failed connection status during `check`. - Report config validation errors as `config_error` failure type. @@ -1083,7 +1376,7 @@ Publish python typehints via `py.typed` file. ## 0.1.66 -- Call init_uncaught_exception_handler from AirbyteEntrypoint.__init__ and Destination.run_cmd +- Call init_uncaught_exception_handler from AirbyteEntrypoint.**init** and Destination.run_cmd - Add the ability to remove & add records in YAML-based sources ## 0.1.65 diff --git a/airbyte-cdk/python/airbyte_cdk/destinations/vector_db_based/README.md b/airbyte-cdk/python/airbyte_cdk/destinations/vector_db_based/README.md index b07b42e9457..09668b61e96 100644 --- a/airbyte-cdk/python/airbyte_cdk/destinations/vector_db_based/README.md +++ b/airbyte-cdk/python/airbyte_cdk/destinations/vector_db_based/README.md @@ -11,27 +11,27 @@ To use these helpers, install the CDK with the `vector-db-based` extra: pip install airbyte-cdk[vector-db-based] ``` - The helpers can be used in the following way: -* Add the config models to the spec of the connector -* Implement the `Indexer` interface for your specific database -* In the check implementation of the destination, initialize the indexer and the embedder and call `check` on them -* In the write implementation of the destination, initialize the indexer, the embedder and pass them to a new instance of the writer. Then call the writers `write` method with the iterable for incoming messages + +- Add the config models to the spec of the connector +- Implement the `Indexer` interface for your specific database +- In the check implementation of the destination, initialize the indexer and the embedder and call `check` on them +- In the write implementation of the destination, initialize the indexer, the embedder and pass them to a new instance of the writer. 
Then call the writers `write` method with the iterable for incoming messages If there are no connector-specific embedders, the `airbyte_cdk.destinations.vector_db_based.embedder.create_from_config` function can be used to get an embedder instance from the config. This is how the components interact: ```text -┌─────────────┐ -│MyDestination│ -└┬────────────┘ -┌▽───────────────────────────────┐ -│Writer │ -└┬─────────┬──────────┬──────────┘ +┌─────────────┐ +│MyDestination│ +└┬────────────┘ +┌▽───────────────────────────────┐ +│Writer │ +└┬─────────┬──────────┬──────────┘ ┌▽───────┐┌▽────────┐┌▽────────────────┐ │Embedder││MyIndexer││DocumentProcessor│ └────────┘└─────────┘└─────────────────┘ ``` -Normally, only the `MyDestination` class and the `MyIndexer` class has to be implemented specifically for the destination. The other classes are provided as is by the helpers. \ No newline at end of file +Normally, only the `MyDestination` class and the `MyIndexer` class has to be implemented specifically for the destination. The other classes are provided as is by the helpers. diff --git a/airbyte-cdk/python/airbyte_cdk/sources/file_based/README.md b/airbyte-cdk/python/airbyte_cdk/sources/file_based/README.md index 469260b0cbd..ea3c20d4ce9 100644 --- a/airbyte-cdk/python/airbyte_cdk/sources/file_based/README.md +++ b/airbyte-cdk/python/airbyte_cdk/sources/file_based/README.md @@ -1,20 +1,24 @@ ## Behavior The Airbyte protocol defines the actions `spec`, `discover`, `check` and `read` for a source to be compliant. Here is the high-level description of the flow for a file-based source: -* spec: calls AbstractFileBasedSpec.documentation_url and AbstractFileBasedSpec.schema to return a ConnectorSpecification. -* discover: calls Source.streams, and subsequently Stream.get_json_schema; this uses Source.open_file to open files during schema discovery. -* check: Source.check_connection is called from the entrypoint code (in the main CDK). -* read: Stream.read_records calls Stream.list_files which calls Source.list_matching_files, and then also uses Source.open_file to parse records from the file handle. + +- spec: calls AbstractFileBasedSpec.documentation_url and AbstractFileBasedSpec.schema to return a ConnectorSpecification. +- discover: calls Source.streams, and subsequently Stream.get_json_schema; this uses Source.open_file to open files during schema discovery. +- check: Source.check_connection is called from the entrypoint code (in the main CDK). +- read: Stream.read_records calls Stream.list_files which calls Source.list_matching_files, and then also uses Source.open_file to parse records from the file handle. ## How to Implement Your Own + To create a file-based source a user must extend three classes – AbstractFileBasedSource, AbstractFileBasedSpec, and AbstractStreamReader – to create an implementation for the connector’s specific storage system. They then initialize a FileBasedSource with the instance of AbstractStreamReader specific to their storage system. The abstract classes house the vast majority of the logic required by file-based sources. For example, when extending AbstractStreamReader, users only have to implement three methods: -* list_matching_files: lists files matching the glob pattern(s) provided in the config. -* open_file: returns a file handle for reading. -* config property setter: concrete implementations of AbstractFileBasedStreamReader's config setter should assert that `value` is of the correct config type for that type of StreamReader. 
+ +- list_matching_files: lists files matching the glob pattern(s) provided in the config. +- open_file: returns a file handle for reading. +- config property setter: concrete implementations of AbstractFileBasedStreamReader's config setter should assert that `value` is of the correct config type for that type of StreamReader. The result is that an implementation of a source might look like this: + ``` class CustomStreamReader(AbstractStreamReader): def open_file(self, remote_file: RemoteFile) -> FileHandler: @@ -47,41 +51,50 @@ For more information, feel free to check the docstrings of each classes or check ## Supported File Types ### Avro + Avro is a serialization format developed by [Apache](https://avro.apache.org/docs/). Avro configuration options for the file-based CDK: -* `double_as_string`: Whether to convert double fields to strings. This is recommended if you have decimal numbers with a high degree of precision because there can be a loss precision when handling floating point numbers. + +- `double_as_string`: Whether to convert double fields to strings. This is recommended if you have decimal numbers with a high degree of precision because there can be a loss precision when handling floating point numbers. ### CSV + CSV is a format loosely described by [RFC 4180](https://www.rfc-editor.org/rfc/rfc4180). The format is quite flexible which leads to a ton of options to consider: -* `delimiter`: The character delimiting individual cells in the CSV data. By name, CSV is comma separated so the default value is `,` -* `quote_char`: When quoted fields are used, it is possible for a field to span multiple lines, even when line breaks appear within such field. The default quote character is `"`. -* `escape_char`: The character used for escaping special characters. -* `encoding`: The character encoding of the file. By default, `UTF-8` -* `double_quote`: Whether two quotes in a quoted CSV value denote a single quote in the data. -* `quoting_behavior`: The quoting behavior determines when a value in a row should have quote marks added around it. -* `skip_rows_before_header`: The number of rows to skip before the header row. For example, if the header row is on the 3rd row, enter 2 in this field. -* `skip_rows_after_header`: The number of rows to skip after the header row. -* `autogenerate_column_names`: If your CSV does not have a header row, the file-based CDK will need this enable to generate column names. -* `null_values`: As CSV does not explicitly define a value for null values, the user can specify a set of case-sensitive strings that should be interpreted as null values. -* `true_values`: As CSV does not explicitly define a value for positive boolean, the user can specify a set of case-sensitive strings that should be interpreted as true values. -* `false_values`: As CSV does not explicitly define a value for negative boolean, the user can specify a set of case-sensitive strings that should be interpreted as false values. + +- `delimiter`: The character delimiting individual cells in the CSV data. By name, CSV is comma separated so the default value is `,` +- `quote_char`: When quoted fields are used, it is possible for a field to span multiple lines, even when line breaks appear within such field. The default quote character is `"`. +- `escape_char`: The character used for escaping special characters. +- `encoding`: The character encoding of the file. By default, `UTF-8` +- `double_quote`: Whether two quotes in a quoted CSV value denote a single quote in the data. 
+- `quoting_behavior`: The quoting behavior determines when a value in a row should have quote marks added around it. +- `skip_rows_before_header`: The number of rows to skip before the header row. For example, if the header row is on the 3rd row, enter 2 in this field. +- `skip_rows_after_header`: The number of rows to skip after the header row. +- `autogenerate_column_names`: If your CSV does not have a header row, the file-based CDK will need this enabled to generate column names. +- `null_values`: As CSV does not explicitly define a value for null values, the user can specify a set of case-sensitive strings that should be interpreted as null values. +- `true_values`: As CSV does not explicitly define a value for positive boolean, the user can specify a set of case-sensitive strings that should be interpreted as true values. +- `false_values`: As CSV does not explicitly define a value for negative boolean, the user can specify a set of case-sensitive strings that should be interpreted as false values. ### JSONL -[JSONL](https://jsonlines.org/) (or JSON Lines) is a format where each row is a JSON object. There are no configuration option for this format. For backward compatibility reasons, the JSONL parser currently supports multiline objects even though this is not part of the JSONL standard. Following some data gathering, we reserve the right to remove the support for this. Given that files have multiline JSON objects, performances will be slow. + +[JSONL](https://jsonlines.org/) (or JSON Lines) is a format where each row is a JSON object. There are no configuration options for this format. For backward compatibility reasons, the JSONL parser currently supports multiline objects even though this is not part of the JSONL standard. Following some data gathering, we reserve the right to remove the support for this. Given that files have multiline JSON objects, performance will be slow. ### Parquet + Parquet is a file format defined by [Apache](https://parquet.apache.org/). Configuration options are: -* `decimal_as_float`: Whether to convert decimal fields to floats. There is a loss of precision when converting decimals to floats, so this is not recommended. + +- `decimal_as_float`: Whether to convert decimal fields to floats. There is a loss of precision when converting decimals to floats, so this is not recommended. ### Document file types (PDF, DOCX, Markdown) For file share source connectors, the `unstructured` parser can be used to parse document file types. The textual content of the whole file will be parsed as a single record with a `content` field containing the text encoded as markdown. To use the unstructured parser, the libraries `poppler` and `tesseract` need to be installed on the system running the connector. For example, on Ubuntu, you can install them with the following command: + ``` apt-get install -y tesseract-ocr poppler-utils ``` on Mac, you can install these via brew: + ``` brew install poppler brew install tesseract @@ -92,32 +105,35 @@ brew install tesseract Having a schema allows for the file-based CDK to take action when there is a discrepancy between a record and what are the expected types of the record fields. Schema can be either inferred or user provided. -* If the user defines it a format using JSON types, inference will not apply. Input schemas are a key/value pair of strings describing column name and data type. Supported types are `["string", "number", "integer", "object", "array", "boolean", "null"]`. For example, `{"col1": "string", "col2": "boolean"}`.
-* If the user enables schemaless sync, schema will `{"data": "object"}` and therefore emitted records will look like `{"data": {"col1": val1, …}}`. This is recommended if the contents between files in the stream vary significantly, and/or if data is very nested. -* Else, the file-based CDK will infer the schema depending on the file type. Some file formats defined the schema as part of their metadata (like Parquet), some do on the record-level (like Avro) and some don't have any explicit typing (like JSON or CSV). Note that all CSV values are inferred as strings except where we are supporting legacy configurations. Any file format that does not define their schema on a metadata level will require the file-based CDK to iterate to a number of records. There is a limit of bytes that will be consumed in order to infer the schema. + +- If the user defines the schema in a format using JSON types, inference will not apply. Input schemas are a key/value pair of strings describing column name and data type. Supported types are `["string", "number", "integer", "object", "array", "boolean", "null"]`. For example, `{"col1": "string", "col2": "boolean"}`. +- If the user enables schemaless sync, the schema will be `{"data": "object"}` and therefore emitted records will look like `{"data": {"col1": val1, …}}`. This is recommended if the contents between files in the stream vary significantly, and/or if data is very nested. +- Else, the file-based CDK will infer the schema depending on the file type. Some file formats define the schema as part of their metadata (like Parquet), some do so on the record level (like Avro) and some don't have any explicit typing (like JSON or CSV). Note that all CSV values are inferred as strings except where we are supporting legacy configurations. Any file format that does not define its schema at the metadata level will require the file-based CDK to iterate over a number of records. There is a limit on the number of bytes that will be consumed in order to infer the schema. ### Validation Policies + Users will be required to select one of 3 different options, in the event that records are encountered that don’t conform to the schema. -* Skip nonconforming records: check each record to see if it conforms to the user-input or inferred schema; skip the record if it doesn't conform. We keep a count of the number of records in each file that do and do not conform and emit a log message with these counts once we’re done reading the file. -* Emit all records: emit all records, even if they do not conform to the user-provided or inferred schema.
Columns that don't exist in the configured catalog probably won't be available in the destination's table since that's the current behavior. + Only error if there are conflicting field types or malformed rows. +- Stop the sync and wait for schema re-discovery: if a record is encountered that does not conform to the configured catalog’s schema, we log a message and stop the whole sync. Note: this option is not recommended if the files have very different columns or datatypes, because the inferred schema may vary significantly at discover time. When the `schemaless` is enabled, validation will be skipped. ## Breaking Changes (compared to previous S3 implementation) -* [CSV] Mapping of type `array` and `object`: before, they were mapped as `large_string` and hence casted as strings. Given the new changes, if `array` or `object` is specified, the value will be casted as `array` and `object` respectively. -* [CSV] Before, a string value would not be considered as `null_values` if the column type was a string. We will now start to cast string columns with values matching `null_values` to null. -* [CSV] `decimal_point` option is deprecated: It is not possible anymore to use another character than `.` to separate the integer part from non-integer part. Given that the float is format with another character than this, it will be considered as a string. -* [Parquet] `columns` option is deprecated: You can use Airbyte column selection in order to have the same behavior. We don't expect it, but this could have impact on the performance as payload could be bigger. +- [CSV] Mapping of type `array` and `object`: before, they were mapped as `large_string` and hence casted as strings. Given the new changes, if `array` or `object` is specified, the value will be casted as `array` and `object` respectively. +- [CSV] Before, a string value would not be considered as `null_values` if the column type was a string. We will now start to cast string columns with values matching `null_values` to null. +- [CSV] `decimal_point` option is deprecated: It is not possible anymore to use another character than `.` to separate the integer part from non-integer part. Given that the float is format with another character than this, it will be considered as a string. +- [Parquet] `columns` option is deprecated: You can use Airbyte column selection in order to have the same behavior. We don't expect it, but this could have impact on the performance as payload could be bigger. ## Incremental syncs + The file-based connectors supports the following [sync modes](https://docs.airbyte.com/cloud/core-concepts#connection-sync-modes): | Feature | Supported? | -| :--------------------------------------------- |:-----------| +| :--------------------------------------------- | :--------- | | Full Refresh Sync | Yes | | Incremental Sync | Yes | | Replicate Incremental Deletes | No | @@ -127,7 +143,8 @@ The file-based connectors supports the following [sync modes](https://docs.airby We recommend you do not manually modify files that are already synced. The connector has file-level granularity, which means adding or modifying a row in a CSV file will trigger a re-sync of the content of that file. -### Incremental sync +### Incremental sync + After the initial sync, the connector only pulls files that were modified since the last sync. The connector checkpoints the connection states when it is done syncing all files for a given timestamp. The connection's state only keeps track of the last 10 000 files synced. 
If more than 10 000 files are synced, the connector won't be able to rely on the connection state to deduplicate files. In this case, the connector will initialize its cursor to the minimum between the earliest file in the history, or 3 days ago. diff --git a/airbyte-cdk/python/airbyte_cdk/sources/streams/concurrent/README.md b/airbyte-cdk/python/airbyte_cdk/sources/streams/concurrent/README.md index 436230cbd61..6970c3acd05 100644 --- a/airbyte-cdk/python/airbyte_cdk/sources/streams/concurrent/README.md +++ b/airbyte-cdk/python/airbyte_cdk/sources/streams/concurrent/README.md @@ -1,7 +1,7 @@ ## Breaking Changes & Limitations -* [bigger scope than Concurrent CDK] checkpointing state was acting on the number of records per slice. This has been changed to consider the number of records per syncs -* `Source.read_state` and `Source._emit_legacy_state_format` are now classmethods to allow for developers to have access to the state before instantiating the source -* send_per_stream_state is always True for Concurrent CDK -* Using stream_state during read_records: The concern is that today, stream_instance.get_updated_state is called on every record and read_records on every slice. The implication is that the argument stream_state passed to read_records will have the value after the last stream_instance.get_updated_state of the previous slice. For Concurrent CDK, this is not possible as slices are processed in an unordered way. -* Cursor fields can only be data-time formatted as epoch. Eventually, we want to move to ISO 8601 as it provides more flexibility but for the first iteration on Stripe, it was easier to use the same format that was already used +- [bigger scope than Concurrent CDK] checkpointing state was acting on the number of records per slice. This has been changed to consider the number of records per sync +- `Source.read_state` and `Source._emit_legacy_state_format` are now classmethods to allow for developers to have access to the state before instantiating the source +- send_per_stream_state is always True for Concurrent CDK +- Using stream_state during read_records: The concern is that today, stream_instance.get_updated_state is called on every record and read_records on every slice. The implication is that the argument stream_state passed to read_records will have the value after the last stream_instance.get_updated_state of the previous slice. For Concurrent CDK, this is not possible as slices are processed in an unordered way. +- Cursor fields can only be date-time formatted as epoch. Eventually, we want to move to ISO 8601 as it provides more flexibility but for the first iteration on Stripe, it was easier to use the same format that was already used diff --git a/airbyte-cdk/python/sphinx-docs.md b/airbyte-cdk/python/sphinx-docs.md index 9cb2eae8e57..055055cf61a 100644 --- a/airbyte-cdk/python/sphinx-docs.md +++ b/airbyte-cdk/python/sphinx-docs.md @@ -1,72 +1,75 @@ # Sphinx Docs -We're using the [Sphinx](https://www.sphinx-doc.org/) library in order +We're using the [Sphinx](https://www.sphinx-doc.org/) library in order to automatically generate the docs for the [airbyte-cdk](https://pypi.org/project/airbyte-cdk/). ## Updating the docs structure (manually) Documentation structure is set in `airbyte-cdk/python/reference_docs/_source`, using the `.rst` files.
-See [reStructuredText docs](https://www.sphinx-doc.org/en/master/usage/restructuredtext/basics.html) +See [reStructuredText docs](https://www.sphinx-doc.org/en/master/usage/restructuredtext/basics.html) for the key concepts. -Note that `index.rst` is the main index file, where we do define the layout of the main +Note that `index.rst` is the main index file, where we do define the layout of the main docs page and relation to other sections. Each time a new module added to `airbyte-cdk/python/airbyte_cdk` module you'll need to update the Sphinx rst schema. Let's dive into using an example: + - Assuming we're going to add a new package `airbyte_cdk/new_package`; - Let this file contain a few modules: `airbyte_cdk/new_package/module1.py` and `airbyte_cdk/new_package/module2.py`; -- The above structure should be in `rst` config as: +- The above structure should be in `rst` config as: - Add this block directly into `index.rst`: + ``` .. toctree:: :maxdepth: 2 :caption: New Package - + api/airbyte_cdk.new_package ``` + - Add a new file `api/airbyte_cdk.new_package.rst` with the following content: + ``` Submodules ---------- - + airbyte\_cdk.new\_package.module1 module -------------------------------------------- - + .. automodule:: airbyte_cdk.new_package.module1 :members: :undoc-members: :show-inheritance: - + .. automodule:: airbyte_cdk.new_package.module2 :members: :undoc-members: :show-inheritance: - + Module contents --------------- - + .. automodule:: airbyte_cdk.models :members: :undoc-members: :show-inheritance: ``` -For more examples see `airbyte-cdk/python/reference_docs/_source` +For more examples see `airbyte-cdk/python/reference_docs/_source` and read the [docs](https://www.sphinx-doc.org/en/master/usage/restructuredtext/basics.html). ## Updating the docs structure (automatically) -It's also possible to generate `.rst` files automatically using `generate_rst_schema.py` script. +It's also possible to generate `.rst` files automatically using `generate_rst_schema.py` script. You should also update this script in order to change the docs appearance or structure. -To generate the docs, -run `python generate_rst_schema.py -o _source/api ../../python/airbyte_cdk -f -t _source/templates` -from the `airbyte-cdk/python/reference_docs` root. - +To generate the docs, +run `python generate_rst_schema.py -o _source/api ../../python/airbyte_cdk -f -t _source/templates` +from the `airbyte-cdk/python/reference_docs` root. ## Building the docs locally @@ -77,18 +80,17 @@ This build could be useful on each `airbyte-cdk` update, especially if the packa - Run `make html` from the `airbyte-cdk/python/reference_docs` root; - Check out the `airbyte-cdk/python/reference_docs/_build` for the new documentation built. - ## Publishing to Read the Docs -Our current sphinx docs setup is meant to be published to [readthedocs](https://readthedocs.org/). -So it may be useful to check our docs published at https://airbyte-cdk.readthedocs.io/en/latest/ +Our current sphinx docs setup is meant to be published to [readthedocs](https://readthedocs.org/). +So it may be useful to check our docs published at https://airbyte-cdk.readthedocs.io/en/latest/ for the last build in case if the airbyte-cdk package was updated. -Publishing process is automatic and implemented via the GitHub incoming webhook. +Publishing process is automatic and implemented via the GitHub incoming webhook. See https://docs.readthedocs.io/en/stable/webhooks.html. 
-To check build logs and state, check the https://readthedocs.org/projects/airbyte-cdk/builds/. +To check build logs and state, check the https://readthedocs.org/projects/airbyte-cdk/builds/. You may also run build manually here if needed. -Publishing configuration is placed to `.readthedocs.yaml`. -See https://docs.readthedocs.io/en/stable/config-file/v2.html for the config description. \ No newline at end of file +Publishing configuration is placed to `.readthedocs.yaml`. +See https://docs.readthedocs.io/en/stable/config-file/v2.html for the config description. diff --git a/airbyte-ci/connectors/base_images/README.md b/airbyte-ci/connectors/base_images/README.md index 9aea896e936..dfdbc9d4e11 100644 --- a/airbyte-ci/connectors/base_images/README.md +++ b/airbyte-ci/connectors/base_images/README.md @@ -6,18 +6,17 @@ Our connector build pipeline ([`airbyte-ci`](https://github.com/airbytehq/airbyt Our base images are declared in code, using the [Dagger Python SDK](https://dagger-io.readthedocs.io/en/sdk-python-v0.6.4/). - [Python base image code declaration](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/base_images/base_images/python/bases.py) -- ~Java base image code declaration~ *TODO* - +- ~Java base image code declaration~ _TODO_ ## Where are the Dockerfiles? + Our base images are not declared using Dockerfiles. They are declared in code using the [Dagger Python SDK](https://dagger-io.readthedocs.io/en/sdk-python-v0.6.4/). We prefer this approach because it allows us to interact with base images container as code: we can use python to declare the base images and use the full power of the language to build and test them. However, we do artificially generate Dockerfiles for debugging and documentation purposes. - - ### Example for `airbyte/python-connector-base`: + ```dockerfile FROM docker.io/python:3.9.18-slim-bookworm@sha256:44b7f161ed03f85e96d423b9916cdc8cb0509fb970fd643bdbc9896d49e1cad0 RUN ln -snf /usr/share/zoneinfo/Etc/UTC /etc/localtime @@ -31,55 +30,56 @@ RUN sh -c apt-get update && apt-get install -y tesseract-ocr=5.3.0-2 poppler-uti RUN mkdir /usr/share/nltk_data ``` - - ## Base images - ### `airbyte/python-connector-base` -| Version | Published | Docker Image Address | Changelog | -|---------|-----------|--------------|-----------| -| 1.2.0 | ✅| docker.io/airbyte/python-connector-base:1.2.0@sha256:c22a9d97464b69d6ef01898edf3f8612dc11614f05a84984451dde195f337db9 | Add CDK system dependencies: nltk data, tesseract, poppler. | -| 1.1.0 | ✅| docker.io/airbyte/python-connector-base:1.1.0@sha256:bd98f6505c6764b1b5f99d3aedc23dfc9e9af631a62533f60eb32b1d3dbab20c | Install socat | -| 1.0.0 | ✅| docker.io/airbyte/python-connector-base:1.0.0@sha256:dd17e347fbda94f7c3abff539be298a65af2d7fc27a307d89297df1081a45c27 | Initial release: based on Python 3.9.18, on slim-bookworm system, with pip==23.2.1 and poetry==1.6.1 | - +| Version | Published | Docker Image Address | Changelog | +| ------- | --------- | --------------------------------------------------------------------------------------------------------------------- | ---------------------------------------------------------------------------------------------------- | +| 1.2.0 | ✅ | docker.io/airbyte/python-connector-base:1.2.0@sha256:c22a9d97464b69d6ef01898edf3f8612dc11614f05a84984451dde195f337db9 | Add CDK system dependencies: nltk data, tesseract, poppler. 
| +| 1.1.0 | ✅ | docker.io/airbyte/python-connector-base:1.1.0@sha256:bd98f6505c6764b1b5f99d3aedc23dfc9e9af631a62533f60eb32b1d3dbab20c | Install socat | +| 1.0.0 | ✅ | docker.io/airbyte/python-connector-base:1.0.0@sha256:dd17e347fbda94f7c3abff539be298a65af2d7fc27a307d89297df1081a45c27 | Initial release: based on Python 3.9.18, on slim-bookworm system, with pip==23.2.1 and poetry==1.6.1 | ## How to release a new base image version (example for Python) ### Requirements -* [Docker](https://docs.docker.com/get-docker/) -* [Poetry](https://python-poetry.org/docs/#installation) -* Dockerhub logins + +- [Docker](https://docs.docker.com/get-docker/) +- [Poetry](https://python-poetry.org/docs/#installation) +- Dockerhub logins ### Steps + 1. `poetry install` -2. Open `base_images/python/bases.py`. +2. Open `base_images/python/bases.py`. 3. Make changes to the `AirbytePythonConnectorBaseImage`, you're likely going to change the `get_container` method to change the base image. 4. Implement the `container` property which must return a `dagger.Container` object. 5. **Recommended**: Add new sanity checks to `run_sanity_check` to confirm that the new version is working as expected. 6. Cut a new base image version by running `poetry run generate-release`. You'll need your DockerHub credentials. It will: - - Prompt you to pick which base image you'd like to publish. - - Prompt you for a major/minor/patch/pre-release version bump. - - Prompt you for a changelog message. - - Run the sanity checks on the new version. - - Optional: Publish the new version to DockerHub. - - Regenerate the docs and the registry json file. + +- Prompt you to pick which base image you'd like to publish. +- Prompt you for a major/minor/patch/pre-release version bump. +- Prompt you for a changelog message. +- Run the sanity checks on the new version. +- Optional: Publish the new version to DockerHub. +- Regenerate the docs and the registry json file. + 7. Commit and push your changes. 8. Create a PR and ask for a review from the Connector Operations team. **Please note that if you don't publish your image while cutting the new version you can publish it later with `poetry run publish `.** No connector will use the new base image version until its metadata is updated to use it. If you're not fully confident with the new base image version please: - - please publish it as a pre-release version - - try out the new version on a couple of connectors - - cut a new version with a major/minor/patch bump and publish it - - This steps can happen in different PRs. +- please publish it as a pre-release version +- try out the new version on a couple of connectors +- cut a new version with a major/minor/patch bump and publish it +- This steps can happen in different PRs. ## Running tests locally + ```bash poetry run pytest # Static typing checks diff --git a/airbyte-ci/connectors/ci_credentials/README.md b/airbyte-ci/connectors/ci_credentials/README.md index 05e68356de8..40585f8e9c0 100644 --- a/airbyte-ci/connectors/ci_credentials/README.md +++ b/airbyte-ci/connectors/ci_credentials/README.md @@ -1,6 +1,7 @@ # CI Credentials CLI tooling to read and manage GSM secrets: + - `write-to-storage` download a connector's secrets locally in the connector's `secrets` folder - `update-secrets` uploads new connector secret version that were locally updated. @@ -43,26 +44,31 @@ pipx install git+https://github.com/airbytehq/airbyte.git#subdirectory=airbyte-c This command installs `ci_credentials` and makes it globally available in your terminal. 
> [!Note] +> > - `--force` is required to ensure updates are applied on subsequent installs. > - `--python=python3.10` is required to ensure the correct python version is used. ## Get GSM access + Download a Service account json key that has access to Google Secrets Manager. `ci_credentials` expects `GCP_GSM_CREDENTIALS` to be set in environment to be able to access secrets. ### Create Service Account -* Go to https://console.cloud.google.com/iam-admin/serviceaccounts/create?project=dataline-integration-testing -* In step #1 `Service account details`, set a name and a relevant description -* In step #2 `Grant this service account access to project`, select role `Owner` (there is a role that is more scope but I based this decision on others `-testing` service account) + +- Go to https://console.cloud.google.com/iam-admin/serviceaccounts/create?project=dataline-integration-testing +- In step #1 `Service account details`, set a name and a relevant description +- In step #2 `Grant this service account access to project`, select role `Owner` (there is a role that is more scope but I based this decision on others `-testing` service account) ### Create Service Account Token -* Go to https://console.cloud.google.com/iam-admin/serviceaccounts?project=dataline-integration-testing -* Find your service account and click on it -* Go in the tab "KEYS" -* Click on "ADD KEY -> Create new key" and select JSON. This will download a file on your computer + +- Go to https://console.cloud.google.com/iam-admin/serviceaccounts?project=dataline-integration-testing +- Find your service account and click on it +- Go in the tab "KEYS" +- Click on "ADD KEY -> Create new key" and select JSON. This will download a file on your computer ### Setup ci_credentials -* In your .zshrc, add: `export GCP_GSM_CREDENTIALS=$(cat )` + +- In your .zshrc, add: `export GCP_GSM_CREDENTIALS=$(cat )` ## Development @@ -75,9 +81,11 @@ pipx install --editable airbyte-ci/connectors/ci_credentials/ This is useful when you are making changes to the package and want to test them in real-time. > [!Note] +> > - The package name is `ci_credentials`, not `airbyte-ci`. You will need this when uninstalling or reinstalling. ## Usage + After installation, you can use the `ci_credentials` command in your terminal. ## Run it @@ -101,6 +109,7 @@ VERSION=dev ci_credentials all write-to-storage ``` ### Update secrets + To upload to GSM newly updated configurations from `airbyte-integrations/connectors/source-bings-ads/secrets/updated_configurations`: ```bash diff --git a/airbyte-ci/connectors/common_utils/README.md b/airbyte-ci/connectors/common_utils/README.md index 9565733d106..8d268898f0b 100644 --- a/airbyte-ci/connectors/common_utils/README.md +++ b/airbyte-ci/connectors/common_utils/README.md @@ -3,5 +3,6 @@ `common_utils` is a Python package that provides common utilities that are used in other `airbyte-ci` tools, such as `ci_credentials` and `base_images`. 
Currently: + - Logger - GCS API client diff --git a/airbyte-ci/connectors/connector_ops/tests/test_migration_files/extra-header.md b/airbyte-ci/connectors/connector_ops/tests/test_migration_files/extra-header.md index 20d0e6e5633..02a23ff5bd1 100644 --- a/airbyte-ci/connectors/connector_ops/tests/test_migration_files/extra-header.md +++ b/airbyte-ci/connectors/connector_ops/tests/test_migration_files/extra-header.md @@ -10,4 +10,4 @@ This is something ## Upgrading to 1.0.0 -This is extra \ No newline at end of file +This is extra diff --git a/airbyte-ci/connectors/connector_ops/tests/test_migration_files/missing-entry.md b/airbyte-ci/connectors/connector_ops/tests/test_migration_files/missing-entry.md index 6bc3ef77252..cf642efdc26 100644 --- a/airbyte-ci/connectors/connector_ops/tests/test_migration_files/missing-entry.md +++ b/airbyte-ci/connectors/connector_ops/tests/test_migration_files/missing-entry.md @@ -2,4 +2,4 @@ ## Upgrading to 1.0.0 -This is something \ No newline at end of file +This is something diff --git a/airbyte-ci/connectors/connector_ops/tests/test_migration_files/out-of-order.md b/airbyte-ci/connectors/connector_ops/tests/test_migration_files/out-of-order.md index 12e6bdd3705..dc2caf839c4 100644 --- a/airbyte-ci/connectors/connector_ops/tests/test_migration_files/out-of-order.md +++ b/airbyte-ci/connectors/connector_ops/tests/test_migration_files/out-of-order.md @@ -6,4 +6,4 @@ This is something ## Upgrading to 2.0.0 -This is something else \ No newline at end of file +This is something else diff --git a/airbyte-ci/connectors/connectors_qa/README.md b/airbyte-ci/connectors/connectors_qa/README.md index 5639bc648c9..61b6599e793 100644 --- a/airbyte-ci/connectors/connectors_qa/README.md +++ b/airbyte-ci/connectors/connectors_qa/README.md @@ -105,6 +105,7 @@ poe type_check ```bash poe lint ``` + ## Changelog ### 1.3.1 @@ -120,6 +121,7 @@ Added `CheckConnectorMaxSecondsBetweenMessagesValue` check that verifies presenc Added `ValidateBreakingChangesDeadlines` check that verifies the minimal compliance of breaking change rollout deadline. ### 1.1.0 + Introduced the `Check.run_on_released_connectors` flag. ### 1.0.4 @@ -141,4 +143,5 @@ Fix access to connector types: it should be accessed from the `Connector.connect - Make `CheckPublishToPyPiIsEnabled` run on source connectors only. ### 1.0.0 + Initial release of `connectors-qa` package. diff --git a/airbyte-ci/connectors/connectors_qa/src/connectors_qa/checks/documentation.py b/airbyte-ci/connectors/connectors_qa/src/connectors_qa/checks/documentation.py index 289f1d7e5f3..6e6a0f29082 100644 --- a/airbyte-ci/connectors/connectors_qa/src/connectors_qa/checks/documentation.py +++ b/airbyte-ci/connectors/connectors_qa/src/connectors_qa/checks/documentation.py @@ -15,7 +15,7 @@ class DocumentationCheck(Check): class CheckMigrationGuide(DocumentationCheck): name = "Breaking changes must be accompanied by a migration guide" - description = "When a breaking change is introduced, we check that a migration guide is available. It should be stored under `./docs/integrations/s/-migrations.md`.\nThis document should contain a section for each breaking change, in order of the version descending. It must explain users which action to take to migrate to the new version." + description = "When a breaking change is introduced, we check that a migration guide is available. It should be stored under `./docs/integrations/s/-migrations.md`.\nThis document should contain a section for each breaking change, in order of the version descending. 
It must explain users which action to take to migrate to the new version." def _run(self, connector: Connector) -> CheckResult: breaking_changes = get(connector.metadata, "releases.breakingChanges") diff --git a/airbyte-ci/connectors/connectors_qa/src/connectors_qa/templates/qa_checks.md.j2 b/airbyte-ci/connectors/connectors_qa/src/connectors_qa/templates/qa_checks.md.j2 index 2e5f670eeb9..a8ee6255f39 100644 --- a/airbyte-ci/connectors/connectors_qa/src/connectors_qa/templates/qa_checks.md.j2 +++ b/airbyte-ci/connectors/connectors_qa/src/connectors_qa/templates/qa_checks.md.j2 @@ -5,15 +5,15 @@ These checks are running in our CI/CD pipeline and are used to ensure a connecto Meeting these standards means that the connector will be able to be safely integrated into the Airbyte platform and released to registries (DockerHub, Pypi etc.). You can consider these checks as a set of guidelines to follow when developing a connector. They are by no mean replacing the need for a manual review of the connector codebase and the implementation of good test suites. - {% for category, checks in checks_by_category.items() %} ## {{ category.value }} {% for check in checks %} ### {{ check.name }} -*Applies to the following connector types: {{ ', '.join(check.applies_to_connector_types) }}* -*Applies to the following connector languages: {{ ', '.join(check.applies_to_connector_languages) }}* -*Applies to connector with {{ ', '.join(check.applies_to_connector_support_levels) if check.applies_to_connector_support_levels else 'any' }} support level* + +_Applies to the following connector types: {{ ', '.join(check.applies_to_connector_types) }}_ +_Applies to the following connector languages: {{ ', '.join(check.applies_to_connector_languages) }}_ +_Applies to connector with {{ ', '.join(check.applies_to_connector_support_levels) if check.applies_to_connector_support_levels else 'any' }} support level_ {{ check.description }} -{%- endfor %} {% endfor %} +{%- endfor %} diff --git a/airbyte-ci/connectors/live-tests/README.md b/airbyte-ci/connectors/live-tests/README.md index 972d9eefe7c..53f1884d953 100644 --- a/airbyte-ci/connectors/live-tests/README.md +++ b/airbyte-ci/connectors/live-tests/README.md @@ -3,12 +3,14 @@ This project contains utilities for running connector tests against live data. ## Requirements -* `docker` -* `Python ^3.10` -* `pipx` -* `poetry` + +- `docker` +- `Python ^3.10` +- `pipx` +- `poetry` ## Install + ```bash # From airbyte-ci/connectors/live-tests poetry install @@ -39,19 +41,22 @@ Options: This command is made to run any of the following connector commands against one or multiple connector images. **Available connector commands:** -* `spec` -* `check` -* `discover` -* `read` or `read_with_state` (requires a `--state-path` to be passed) + +- `spec` +- `check` +- `discover` +- `read` or `read_with_state` (requires a `--state-path` to be passed) It will write artifacts to an output directory: -* `stdout.log`: The collected standard output following the command execution -* `stderr.log`: The collected standard error following the c -* `http_dump.txt`: An `mitmproxy` http stream log. Can be consumed with `mitmweb` (version `9.0.1`) for debugging. -* `airbyte_messages.db`: A DuckDB database containing the messages produced by the connector. -* `airbyte_messages`: A directory containing `.jsonl` files for each message type (logs, records, traces, controls, states etc.) produced by the connector. 
+ +- `stdout.log`: The collected standard output following the command execution +- `stderr.log`: The collected standard error following the c +- `http_dump.txt`: An `mitmproxy` http stream log. Can be consumed with `mitmweb` (version `9.0.1`) for debugging. +- `airbyte_messages.db`: A DuckDB database containing the messages produced by the connector. +- `airbyte_messages`: A directory containing `.jsonl` files for each message type (logs, records, traces, controls, states etc.) produced by the connector. #### Example + Let's run `debug` to check the output of `read` on two different versions of the same connector: ```bash @@ -99,22 +104,27 @@ poetry run live-tests debug read \ ``` ##### Consuming `http_dump.mitm` + You can install [`mitmproxy`](https://mitmproxy.org/): + ```bash pipx install mitmproxy ``` And run: + ```bash mitmweb --rfile=http_dump.mitm ``` ## Regression tests + We created a regression test suite to run tests to compare the outputs of connector commands on different versions of the same connector. ## Tutorial(s) -* [Loom Walkthrough (Airbyte Only)](https://www.loom.com/share/97c49d7818664b119cff6911a8a211a2?sid=4570a5b6-9c81-4db3-ba33-c74dc5845c3c) -* [Internal Docs (Airbyte Only)](https://docs.google.com/document/d/1pzTxJTsooc9iQDlALjvOWtnq6yRTvzVtbkJxY4R36_I/edit) + +- [Loom Walkthrough (Airbyte Only)](https://www.loom.com/share/97c49d7818664b119cff6911a8a211a2?sid=4570a5b6-9c81-4db3-ba33-c74dc5845c3c) +- [Internal Docs (Airbyte Only)](https://docs.google.com/document/d/1pzTxJTsooc9iQDlALjvOWtnq6yRTvzVtbkJxY4R36_I/edit) ### How to Use @@ -123,6 +133,7 @@ We created a regression test suite to run tests to compare the outputs of connec You can run the existing test suites with the following command: #### With local connection objects (`config.json`, `catalog.json`, `state.json`) + ```bash poetry run pytest src/live_tests/regression_tests \ --connector-image=airbyte/source-faker \ @@ -134,6 +145,7 @@ poetry run pytest src/live_tests/regression_tests \ ``` #### Using a live connection + The live connection objects will be fetched. ```bash @@ -142,27 +154,28 @@ The live connection objects will be fetched. --target-version=dev \ --control-version=latest \ --pr-url= # The URL of the PR you are testing - ``` +``` You can also pass local connection objects path to override the live connection objects with `--config-path`, `--state-path` or `--catalog-path`. #### Test artifacts + The test suite run will produce test artifacts in the `/tmp/regression_tests_artifacts/` folder. **They will get cleared after each test run on prompt exit. Please do not copy them elsewhere in your filesystem as they contain sensitive data that are not meant to be stored outside of your debugging session!** ##### Artifacts types -* `report.html`: A report of the test run. -* `stdout.log`: The collected standard output following the command execution -* `stderr.log`: The collected standard error following the command execution -* `http_dump.mitm`: An `mitmproxy` http stream log. Can be consumed with `mitmweb` (version `>=10`) for debugging. -* `http_dump.har`: An `mitmproxy` http stream log in HAR format (a JSON encoded version of the mitm dump). -* `airbyte_messages`: A directory containing `.jsonl` files for each message type (logs, records, traces, controls, states etc.) produced by the connector. -* `duck.db`: A DuckDB database containing the messages produced by the connector. -* `dagger.log`: The log of the Dagger session, useful for debugging errors unrelated to the tests. 
+ +- `report.html`: A report of the test run. +- `stdout.log`: The collected standard output following the command execution +- `stderr.log`: The collected standard error following the command execution +- `http_dump.mitm`: An `mitmproxy` http stream log. Can be consumed with `mitmweb` (version `>=10`) for debugging. +- `http_dump.har`: An `mitmproxy` http stream log in HAR format (a JSON encoded version of the mitm dump). +- `airbyte_messages`: A directory containing `.jsonl` files for each message type (logs, records, traces, controls, states etc.) produced by the connector. +- `duck.db`: A DuckDB database containing the messages produced by the connector. +- `dagger.log`: The log of the Dagger session, useful for debugging errors unrelated to the tests. **Tests can also write specific artifacts like diffs under a directory named after the test function.** - ``` /tmp/regression_tests_artifacts └── session_1710754231 @@ -235,110 +248,137 @@ The test suite run will produce test artifacts in the `/tmp/regression_tests_art │   ├── stderr.log │   └── stdout.log └── dagger.log - ``` +``` #### HTTP Proxy and caching + We use a containerized `mitmproxy` to capture the HTTP traffic between the connector and the source. Connector command runs produce `http_dump.mitm` (can be consumed with `mitmproxy` (version `>=10`) for debugging) and `http_dump.har` (a JSON encoded version of the mitm dump) artifacts. The traffic recorded on the control connector is passed to the target connector proxy to cache the responses for requests with the same URL. This is useful to avoid hitting the source API multiple times when running the same command on different versions of the connector. ### Custom CLI Arguments -| Argument | Description | Required/Optional | -|----------------------------|---------------------------------------------------------------------------------------------------------------------------|-------------------| -| `--connector-image` | Docker image name of the connector to debug (e.g., `airbyte/source-faker:latest`, `airbyte/source-faker:dev`). | Required | -| `--control-version` | Version of the control connector for regression testing. | Required | -| `--target-version` | Version of the connector being tested. (Defaults to dev) | Optional | -| `--pr-url` | URL of the pull request being tested. | Required | -| `--connection-id` | ID of the connection for live testing. If not provided, a prompt will appear to choose. | Optional | -| `--config-path` | Path to the custom source configuration file. | Optional | -| `--catalog-path` | Path to the custom configured catalog file. | Optional | -| `--state-path` | Path to the custom state file. | Optional | -| `--http-cache` | Use the HTTP cache for the connector. | Optional | -| `--run-id` | Unique identifier for the test run. If not provided, a timestamp will be used. | Optional | -| `--auto-select-connection` | Automatically select a connection for testing. | Optional | -| `--stream` | Name of the stream to test. Can be specified multiple times to test multiple streams. | Optional | -| `--should-read-with-state` | Specify whether to read with state. If not provided, a prompt will appear to choose. 
| Optional | - +| Argument | Description | Required/Optional | +| -------------------------- | -------------------------------------------------------------------------------------------------------------- | ----------------- | +| `--connector-image` | Docker image name of the connector to debug (e.g., `airbyte/source-faker:latest`, `airbyte/source-faker:dev`). | Required | +| `--control-version` | Version of the control connector for regression testing. | Required | +| `--target-version` | Version of the connector being tested. (Defaults to dev) | Optional | +| `--pr-url` | URL of the pull request being tested. | Required | +| `--connection-id` | ID of the connection for live testing. If not provided, a prompt will appear to choose. | Optional | +| `--config-path` | Path to the custom source configuration file. | Optional | +| `--catalog-path` | Path to the custom configured catalog file. | Optional | +| `--state-path` | Path to the custom state file. | Optional | +| `--http-cache` | Use the HTTP cache for the connector. | Optional | +| `--run-id` | Unique identifier for the test run. If not provided, a timestamp will be used. | Optional | +| `--auto-select-connection` | Automatically select a connection for testing. | Optional | +| `--stream` | Name of the stream to test. Can be specified multiple times to test multiple streams. | Optional | +| `--should-read-with-state` | Specify whether to read with state. If not provided, a prompt will appear to choose. | Optional | ## Changelog ### 0.17.0 + Enable running in GitHub actions. ### 0.16.0 + Enable running with airbyte-ci. ### 0.15.0 + Automatic retrieval of connection objects for regression tests. The connection id is not required anymore. ### 0.14.2 + Fix KeyError when target & control streams differ. ### 0.14.1 + Improve performance when reading records per stream. ### 0.14.0 + Track usage via Segment. ### 0.13.0 + Show test docstring in the test report. ### 0.12.0 + Implement a test to compare schema inferred on both control and target version. ### 0.11.0 + Create a global duckdb instance to store messages produced by the connector in target and control version. ### 0.10.0 + Show record count per stream in report and list untested streams. ### 0.9.0 + Make the regressions tests suite better at handling large connector outputs. ### 0.8.1 + Improve diff output. ### 0.8.0 + Regression tests: add an HTML report. ### 0.7.0 + Improve the proxy workflow and caching logic + generate HAR files. ### 0.6.6 + Exit pytest if connection can't be retrieved. ### 0.6.6 + Cleanup debug files when prompt is closed. ### 0.6.5 + Improve ConnectorRunner logging. ### 0.6.4 + Add more data integrity checks to the regression tests suite. ### 0.6.3 + Make catalog diffs more readable. ### 0.6.2 + Clean up regression test artifacts on any exception. ### 0.6.1 + Modify diff output for `discover` and `read` tests. ### 0.5.1 + Handle connector command execution errors. ### 0.5.0 + Add new tests and confirmation prompts. ### 0.4.0 + Introduce DuckDB to store the messages produced by the connector. ### 0.3.0 + Pass connection id to the regression tests suite. ### 0.2.0 + Declare the regression tests suite. ### 0.1.0 + Implement initial primitives and a `debug` command to run connector commands and persist the outputs to local storage. 
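As a closing note on the live-tests artifacts described above (for both `debug` runs and regression test runs), they are plain files and can be inspected with a few lines of Python once a session has finished. The sketch below is illustrative only: the `session_*` directory naming is taken from the artifact tree shown earlier, but the per-message-type file name `records.jsonl` and the exact record layout are assumptions; keep any such analysis inside your debugging session, since the artifacts contain sensitive data and are cleared on prompt exit.

```python
import json
from pathlib import Path

# Layout based on the artifact tree shown above; the "records.jsonl" file name
# and the message shape are assumptions made for this illustration.
ARTIFACTS_DIR = Path("/tmp/regression_tests_artifacts")


def count_records_per_stream(session_dir: Path) -> dict[str, int]:
    counts: dict[str, int] = {}
    # Walk every records file produced by either the control or target connector run.
    for records_file in session_dir.rglob("records.jsonl"):
        for line in records_file.read_text().splitlines():
            message = json.loads(line)
            stream = message.get("record", {}).get("stream", "unknown")
            counts[stream] = counts.get(stream, 0) + 1
    return counts


if __name__ == "__main__":
    # Pick the most recent session directory and print a per-stream record count.
    latest_session = sorted(ARTIFACTS_DIR.glob("session_*"))[-1]
    print(count_records_per_stream(latest_session))
```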
diff --git a/airbyte-ci/connectors/metadata_service/lib/README.md b/airbyte-ci/connectors/metadata_service/lib/README.md index 5b2016c3f92..777ca46ecc9 100644 --- a/airbyte-ci/connectors/metadata_service/lib/README.md +++ b/airbyte-ci/connectors/metadata_service/lib/README.md @@ -10,7 +10,6 @@ To use this submodule, it is recommended that you use Poetry to manage dependenc poetry install ``` - ## Generating Models This submodule includes a tool for generating Python models from JSON Schema specifications. To generate the models, we use the library [datamodel-code-generator](https://github.com/koxudaxi/datamodel-code-generator). The generated models are stored in `models/generated`. @@ -24,13 +23,14 @@ poetry run poe generate-models This will read the JSON Schema specifications in `models/src` and generate Python models in `models/generated`. - ## Running Tests + ```bash poetry run pytest ``` ## Validating Metadata Files + To be considered valid, a connector must have a metadata.yaml file which must conform to the [ConnectorMetadataDefinitionV0](./metadata_service/models/src/ConnectorMetadataDefinitionV0.yaml) schema, and a documentation file. The paths to both files must be passed to the validate command. @@ -42,6 +42,7 @@ poetry run metadata_service validate tests/fixtures/metadata_validate/valid/meta ## Useful Commands ### Replicate Production Data in your Development Bucket + This will replicate all the production data to your development bucket. This is useful for testing the metadata service with real up to date data. _💡 Note: A prerequisite is you have [gsutil](https://cloud.google.com/storage/docs/gsutil) installed and have run `gsutil auth login`_ @@ -53,6 +54,7 @@ TARGET_BUCKET= poetry poe replicate-prod ``` ### Copy specific connector version to your Development Bucket + This will copy the specified connector version to your development bucket. This is useful for testing the metadata service with a specific version of a connector. _💡 Note: A prerequisite is you have [gsutil](https://cloud.google.com/storage/docs/gsutil) installed and have run `gsutil auth login`_ @@ -62,6 +64,7 @@ TARGET_BUCKET= CONNECTOR="airbyte/source-stripe" VERSION="3.17. ``` ### Promote Connector Version to Latest + This will promote the specified connector version to the latest version in the registry. This is useful for creating a mocked registry in which a prerelease connector is treated as if it was already published. 
_💡 Note: A prerequisite is you have [gsutil](https://cloud.google.com/storage/docs/gsutil) installed and have run `gsutil auth login`_ diff --git a/airbyte-ci/connectors/metadata_service/lib/tests/fixtures/doc.md b/airbyte-ci/connectors/metadata_service/lib/tests/fixtures/doc.md index 8c70aca8990..3ee6db6d8d8 100644 --- a/airbyte-ci/connectors/metadata_service/lib/tests/fixtures/doc.md +++ b/airbyte-ci/connectors/metadata_service/lib/tests/fixtures/doc.md @@ -1 +1 @@ -# The test doc for metadata_validate \ No newline at end of file +# The test doc for metadata_validate diff --git a/airbyte-ci/connectors/metadata_service/orchestrator/orchestrator/templates/connector_nightly_report.md b/airbyte-ci/connectors/metadata_service/orchestrator/orchestrator/templates/connector_nightly_report.md index 7ff69711a70..256746be8ba 100644 --- a/airbyte-ci/connectors/metadata_service/orchestrator/orchestrator/templates/connector_nightly_report.md +++ b/airbyte-ci/connectors/metadata_service/orchestrator/orchestrator/templates/connector_nightly_report.md @@ -6,18 +6,16 @@ Url: {{ last_action_url }} Run time: {{ last_action_run_time }} +CONNECTORS: total: {{ total_connectors }} -CONNECTORS: total: {{ total_connectors }} - -Sources: total: {{ source_stats["total"] }} / tested: {{ source_stats["tested"] }} / success: {{ source_stats["success"] }} ({{ source_stats["success_percent"] }}%) +Sources: total: {{ source_stats["total"] }} / tested: {{ source_stats["tested"] }} / success: {{ source_stats["success"] }} ({{ source_stats["success_percent"] }}%) Destinations: total: {{ destination_stats["total"] }} / tested: {{ destination_stats["tested"] }} / success: {{ destination_stats["success"] }} ({{ destination_stats["success_percent"] }}%) -**FAILED LAST BUILD ONLY - {{ failed_last_build_only_count }} connectors:** +**FAILED LAST BUILD ONLY - {{ failed_last_build_only_count }} connectors:** {{ failed_last_build_only }} - **FAILED TWO LAST BUILDS - {{ failed_last_build_two_builds_count }} connectors:** {{ failed_last_build_two_builds }} diff --git a/airbyte-ci/connectors/pipelines/CONTRIBUTING.md b/airbyte-ci/connectors/pipelines/CONTRIBUTING.md index 46878f840f7..718d91529a6 100644 --- a/airbyte-ci/connectors/pipelines/CONTRIBUTING.md +++ b/airbyte-ci/connectors/pipelines/CONTRIBUTING.md @@ -1,26 +1,25 @@ - ## What is `airbyte-ci`? `airbyte-ci` is a CLI written as a python package which is made to execute CI operations on the `airbyte` repo. It is heavily using the [Dagger](https://dagger.cloud/) library to build and orchestrate Docker containers programatically. It enables a centralized and programmatic approach at executing CI logics which can seamlessly run both locally and in remote CI environments. You can read more why we are using Dagger and the benefit it has provided in this [blog post](https://dagger.io/blog/airbyte-use-case) - ## When is a contribution to `airbyte-ci` a good fit for your use case? -* When you want to make global changes to connectors artifacts and build logic. -* When you want to execute something made to run both in CI or for local development. As airbyte-ci logic relies on container orchestration you can have reproducible environment and execution both locally and in a remote CI environment. -* When you want to orchestrate the tests and release of an internal package in CI. +- When you want to make global changes to connectors artifacts and build logic. +- When you want to execute something made to run both in CI or for local development. 
As airbyte-ci logic relies on container orchestration you can have reproducible environment and execution both locally and in a remote CI environment. +- When you want to orchestrate the tests and release of an internal package in CI. ## Who can I ask help from? The tool has been maintained by multiple Airbyters. Our top contributors who can help you figuring the best approach to implement your use case are: -* [@alafanechere](https://github.com/alafanechere). -* [@postamar](https://github.com/postamar) -* [@erohmensing](https://github.com/erohmensing) -* [@bnchrch](https://github.com/bnchrch) -* [@stephane-airbyte](https://github.com/stephane-airbyte) + +- [@alafanechere](https://github.com/alafanechere). +- [@postamar](https://github.com/postamar) +- [@erohmensing](https://github.com/erohmensing) +- [@bnchrch](https://github.com/bnchrch) +- [@stephane-airbyte](https://github.com/stephane-airbyte) ## Where is the code? @@ -29,7 +28,7 @@ The code is currently available in the `airbytehq/airbyte` repo under [ `airbyte ## What use cases it currently supports According to your need you might want to introduce a new logic to an existing flow or create a new one. -Here are the currently supported use cases. Feel free to grab them as example if you want to craft a new flow, or modify an existing one. If you are not sure about which direction to take feel free to ask advices (see [*Who Can I ask help?*](## Who can I ask help from?) from section). +Here are the currently supported use cases. Feel free to grab them as example if you want to craft a new flow, or modify an existing one. If you are not sure about which direction to take feel free to ask advices (see [*Who Can I ask help?*](## Who can I ask help from?) from section). | Command group | Feature | Command | Entrypoint path | | ------------------------------------------------------------------------------------------------------------------------------------------------- | -------------------------------------------------------- | --------------------------------------------- | ----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------- | @@ -53,26 +52,32 @@ There are multiple way to have dev install of the tool. Feel free to grab the on **Please note that all the install mode lead to an editable install. There's no need to re-install the tool following a code change**. ### System requirements -* `Python` > 3.10 -* [`Poetry`](https://python-poetry.org/) or [`pipx`](https://github.com/pypa/pipx) + +- `Python` > 3.10 +- [`Poetry`](https://python-poetry.org/) or [`pipx`](https://github.com/pypa/pipx) ### Installation options + There are many ways to install Python tools / packages. For most users we recommend you use `make` but `pipx` and `poetry` are also viable options + #### With `make` + ```bash # From airbyte repo root: make tools.airbyte-ci-dev.install - ``` +``` #### With `pipx` + ```bash # From airbyte-ci/connectors/pipelines: pipx install --editable --force . ``` #### With `poetry` + ⚠️ This places you in a python environment specific to airbyte-ci. This can be a problem if you are developing airbyte-ci and testing/using your changes in another python project. 
```bash
@@ -81,17 +86,19 @@ poetry install
poetry shell
```

-
## Main libraries used in the tool

### [Click](https://click.palletsprojects.com/en/8.1.x/)
-This is a python light CLI framework we use to declare entrypoint. You'll interact with it if you have to deal with commands, command groups, option, arguments etc.
+
+This is a lightweight Python CLI framework we use to declare entrypoints. You'll interact with it if you have to deal with commands, command groups, options, arguments, etc.

### [Dagger](https://dagger-io.readthedocs.io/en/sdk-python-v0.9.6/)
+
This is an SDK to build, execute and interact with Docker containers in Python. It's basically a nice API on top of [BuildKit](https://docs.docker.com/build/buildkit/).

We use containers to wrap the majority of `airbyte-ci` operations as it allows us to:
-* Execute language agnostic operations: you can execute bash commands, gradle tasks, etc. in containers with Python. Pure magic!
-* Benefit from caching by default. You can consider a Dagger operation a "line in a Dockerfile". Each operation is cached by BuildKit if the inputs of the operation did not change.
-* As Dagger exposes async APIs we can easily implement concurrent logic. This is great for performance.
+
+- Execute language agnostic operations: you can execute bash commands, gradle tasks, etc. in containers with Python. Pure magic!
+- Benefit from caching by default. You can consider a Dagger operation a "line in a Dockerfile". Each operation is cached by BuildKit if the inputs of the operation did not change.
+- As Dagger exposes async APIs we can easily implement concurrent logic. This is great for performance.

**Please note that we are currently using v0.9.6 of Dagger. The library is under active development so please refer to [this specific version documentation](https://dagger-io.readthedocs.io/en/sdk-python-v0.9.6/) if you want an accurate view of the available APIs.**

@@ -102,9 +109,9 @@ As Dagger exposes async APIs we use `anyio` (and the `asyncer` wrapper sometimes

## Design principles

-*The principles set out below are ideals, but the first iterations on the project did not always respect them. Don't be surprised if you see code that contradicts what we're about to say (tech debt...).*
+_The principles set out below are ideals, but the first iterations on the project did not always respect them. Don't be surprised if you see code that contradicts what we're about to say (tech debt...)._

-### `airbyte-ci` is *just* an orchestrator
+### `airbyte-ci` is _just_ an orchestrator

Ideally the steps declared in airbyte-ci pipeline do not contain any business logic themselves. They call external projects, within containers, which contains the business logic.

@@ -113,8 +120,9 @@ Following this principles will help in decoupling airbyte-ci from other project
Maintaining business logic in smaller projects also increases velocity, as introducing a new logic would not require changing airbyte-ci and, which is already a big project in terms of code lines.

#### Good examples of this principle
-* `connectors-qa`: We want to run specific static checks on all our connectors: we introduced a specific python package ([`connectors-qa`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/connectors_qa/README.md#L1))which declares and run the checks on connectors. We orchestrate the run of this package inside the [QaChecks](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/pipelines/airbyte_ci/connectors/test/steps/common.py#L122) step. This class is just aware of the tool location, its entry point, and what has to be mounted to the container for the command to run.
-* Internal package testing: We expose an `airbyte-ci test` command which can run a CI pipeline on an internal poetry package. The pipeline logic is declared at the package level with `poe` tasks in the package `pyproject.toml`. `airbyte-ci` is made aware about what is has to run by parsing the content of the `[tool.airbyte_ci]` section of the `pyproject.toml`file. [Example](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/base_images/pyproject.toml#L39)
+
+- `connectors-qa`: We want to run specific static checks on all our connectors: we introduced a specific python package ([`connectors-qa`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/connectors_qa/README.md#L1)) which declares and runs the checks on connectors. We orchestrate the run of this package inside the [QaChecks](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/pipelines/airbyte_ci/connectors/test/steps/common.py#L122) step. This class is just aware of the tool location, its entry point, and what has to be mounted to the container for the command to run.
+- Internal package testing: We expose an `airbyte-ci test` command which can run a CI pipeline on an internal poetry package. The pipeline logic is declared at the package level with `poe` tasks in the package `pyproject.toml`. `airbyte-ci` is made aware of what it has to run by parsing the content of the `[tool.airbyte_ci]` section of the `pyproject.toml` file. [Example](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/base_images/pyproject.toml#L39)
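To make the orchestrator idea concrete, here is a minimal sketch of a step-like function that builds a container and runs an external tool inside it with the Dagger Python SDK. Only the `dagger.Connection` / `with_exec` pattern reflects how the SDK is used; the base image, the `pip install connectors-qa` step and the `--help` invocation are illustrative assumptions, not the actual `QaChecks` implementation.

```python
import anyio
import dagger


async def run_external_tool() -> str:
    # Open a connection to the Dagger engine (Python SDK, v0.9.x style).
    async with dagger.Connection(dagger.Config()) as client:
        container = (
            client.container()
            .from_("python:3.10-slim")
            # The business logic lives in the external tool, not in airbyte-ci:
            # we only assemble the container and run a command inside it.
            .with_exec(["pip", "install", "connectors-qa"])  # hypothetical install step
            .with_exec(["connectors-qa", "--help"])  # hypothetical invocation
        )
        # Evaluating stdout() triggers the (cached) execution of the pipeline above.
        return await container.stdout()


if __name__ == "__main__":
    print(anyio.run(run_external_tool))
```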

### No command or pipeline should be language specific

@@ -124,45 +132,52 @@ We oftentimes have to introduce new flows for connectors / CDK. Even if the need
The `airbyte-ci connectors build` command can build multiple connectors of different languages in a single execution. The higher level [`run_connector_build_pipeline` function](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/pipelines/airbyte_ci/connectors/build_image/steps/__init__.py#L36) is connector language agnostic and calls connector language specific sub pipelines according to the connector language.
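As an illustration of that language agnostic dispatch, here is a simplified, self-contained sketch. The classes below are hypothetical stand-ins, not the real `BuildConnectorImages` steps; only the idea (the entrypoint picks a language specific sub pipeline and stays unaware of its internals) is taken from the text above.

```python
from dataclasses import dataclass
from enum import Enum

import anyio


class ConnectorLanguage(Enum):
    PYTHON = "python"
    JAVA = "java"


@dataclass
class Connector:
    name: str
    language: ConnectorLanguage


class BuildPythonConnectorImage:
    """Hypothetical stand-in for the Python-specific build step."""

    async def run(self, connector: Connector) -> str:
        return f"built {connector.name} with the Python build steps"


class BuildJavaConnectorImage:
    """Hypothetical stand-in for the Java (Gradle) build step."""

    async def run(self, connector: Connector) -> str:
        return f"built {connector.name} with the Gradle build steps"


# The higher level entrypoint stays language agnostic: it only dispatches.
BUILD_STEPS = {
    ConnectorLanguage.PYTHON: BuildPythonConnectorImage,
    ConnectorLanguage.JAVA: BuildJavaConnectorImage,
}


async def run_connector_build(connector: Connector) -> str:
    return await BUILD_STEPS[connector.language]().run(connector)


if __name__ == "__main__":
    result = anyio.run(run_connector_build, Connector("source-faker", ConnectorLanguage.PYTHON))
    print(result)
```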
-We have per-language submodules in which language specific `BuildConnectorImages` classes are implemented: -* [`python_connectors.py`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/pipelines/airbyte_ci/connectors/build_image/steps/python_connectors.py) -* [`java_connectors.py`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/pipelines/airbyte_ci/connectors/build_image/steps/java_connectors.py#L14) +We have per-language submodules in which language specific `BuildConnectorImages` classes are implemented: + +- [`python_connectors.py`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/pipelines/airbyte_ci/connectors/build_image/steps/python_connectors.py) +- [`java_connectors.py`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/pipelines/airbyte_ci/connectors/build_image/steps/java_connectors.py#L14) ### Pipelines are functions, steps are classes A pipeline is a function: -* instantiating and running steps -* collecting step results and acting according to step results -* returning a report + +- instantiating and running steps +- collecting step results and acting according to step results +- returning a report A step is a class which inheriting from the `Step` base class: -* Can be instantiated with parameters -* Has a `_run` method which: - * Performs one or multiple operations according to input parameter and context values - * Returns a `StepResult` which can have a `succeeded`, `failed` or `skipped` `StepStatus` + +- Can be instantiated with parameters +- Has a `_run` method which: + - Performs one or multiple operations according to input parameter and context values + - Returns a `StepResult` which can have a `succeeded`, `failed` or `skipped` `StepStatus` **Steps should ideally not call other steps and the DAG of steps can be understand by reading the pipeline function.** #### Step examples: - * [`PytestStep`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/pipelines/airbyte_ci/connectors/test/steps/python_connectors.py#L29) - * [`GradleTask`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/pipelines/airbyte_ci/steps/gradle.py#L21) + +- [`PytestStep`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/pipelines/airbyte_ci/connectors/test/steps/python_connectors.py#L29) +- [`GradleTask`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/pipelines/airbyte_ci/steps/gradle.py#L21) + #### Pipelines examples: -* [`run_connector_publish_pipeline`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/pipelines/airbyte_ci/connectors/publish/pipeline.py#L296) -* [`run_connector_test_pipeline`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/pipelines/airbyte_ci/connectors/test/pipeline.py#L48) + +- [`run_connector_publish_pipeline`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/pipelines/airbyte_ci/connectors/publish/pipeline.py#L296) +- [`run_connector_test_pipeline`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/pipelines/airbyte_ci/connectors/test/pipeline.py#L48) ## Main classes ### [`PipelineContext`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/pipelines/models/contexts/pipeline_context.py#L33) (and 
[`ConnectorContext`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/pipelines/airbyte_ci/connectors/context.py#L33), [`PublishConnectorContext`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/pipelines/airbyte_ci/connectors/publish/context.py#L19)) -Pipeline contexts are instantiated on each command execution and produced according to the CLI inputs. We populate this class with global configuration, helpers and attributes that are accessed during pipeline and step execution. +Pipeline contexts are instantiated on each command execution and produced according to the CLI inputs. We populate this class with global configuration, helpers and attributes that are accessed during pipeline and step execution. It has, for instance, the following attributes: -* The dagger client -* The list of modified files on the branch -* A `connector` attribute -* A `get_connector_dir` method to interact with the connector -* Global secrets to connect to protected resources -* A `is_ci` attribute to know if the current execution is a local or CI one. + +- The dagger client +- The list of modified files on the branch +- A `connector` attribute +- A `get_connector_dir` method to interact with the connector +- Global secrets to connect to protected resources +- A `is_ci` attribute to know if the current execution is a local or CI one. We use `PipelineContext` with context managers so that we can easily handle setup and teardown logic of context (like producing a `Report`) @@ -171,25 +186,26 @@ We use `PipelineContext` with context managers so that we can easily handle setu `Step` is an abstract class. It is meant to be inherited for implementation of pipeline steps which are use case specific. `Step` exposes a public `run` method which calls a private `_run` method wrapped with progress logger and a retry mechanism. When declaring a `Step` child class you are expected to: -* declare a `title` attribute or `property` -* implement the `_run` method which should return a `StepResult` object. You are free to override the `Step` methods if needed. + +- declare a `title` attribute or `property` +- implement the `_run` method which should return a `StepResult` object. You are free to override the `Step` methods if needed. ### [`Result` / `StepResult`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/pipelines/models/steps.py#L86) The `Result` class (and its subclasses) are meant to characterize the result of a `Step` execution. 
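Before listing the fields a `Result` carries, here is a heavily simplified, hypothetical sketch of the `Step` / `StepResult` contract described above. The real classes live in `pipelines/models/steps.py` and carry more attributes and behavior (retries, progress logging, Dagger access through the context); everything below, including the `CheckMetadataFile` example step, is illustrative only.

```python
from dataclasses import dataclass
from enum import Enum
from pathlib import Path

import anyio


class StepStatus(Enum):
    SUCCESS = "succeeded"
    FAILURE = "failed"
    SKIPPED = "skipped"


@dataclass
class StepResult:
    status: StepStatus
    stdout: str = ""
    stderr: str = ""


class Step:
    title = "unnamed step"

    async def run(self) -> StepResult:
        # The real Step.run wraps _run with progress logging and a retry
        # mechanism; this sketch only delegates.
        return await self._run()

    async def _run(self) -> StepResult:
        raise NotImplementedError


class CheckMetadataFile(Step):
    """Hypothetical step: check that a connector directory contains metadata.yaml."""

    title = "Check metadata.yaml presence"

    def __init__(self, connector_dir: Path) -> None:
        self.connector_dir = connector_dir

    async def _run(self) -> StepResult:
        if (self.connector_dir / "metadata.yaml").exists():
            return StepResult(StepStatus.SUCCESS, stdout="metadata.yaml found")
        return StepResult(StepStatus.FAILURE, stderr="metadata.yaml is missing")


if __name__ == "__main__":
    result = anyio.run(CheckMetadataFile(Path(".")).run)
    print(result.status, result.stdout or result.stderr)
```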
`Result` objects are build with: -* `StepStatus` (success/failure/skipped) -* `stderr`: The standard error of the operation execution -* `stdout` : The standard output of the operation execution -* `excinfo`: An Exception instance if you want to handle an operation error -* `output`: Any object you'd like to attach to the result for reuse in other Steps -* `artifacts`: Any object produced by the Step that you'd like to attach to the `Report` + +- `StepStatus` (success/failure/skipped) +- `stderr`: The standard error of the operation execution +- `stdout` : The standard output of the operation execution +- `excinfo`: An Exception instance if you want to handle an operation error +- `output`: Any object you'd like to attach to the result for reuse in other Steps +- `artifacts`: Any object produced by the Step that you'd like to attach to the `Report` ### [`Report`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/pipelines/models/reports.py#L34) A `Report` object is instantiated on `PipelineContext` teardown with a collection of step results. It is meant to persists execution results as json / html locally and in remote storage to share them with users or other automated processes. - ## Github Action orchestration A benefit of declaring CI logic in a centralized python package is that our CI logic can be agnostic from the CI platform it runs on. We are currently using GitHub actions. This section will explain how we run `airbyte-ci` in GitHub actions. @@ -197,16 +213,18 @@ A benefit of declaring CI logic in a centralized python package is that our CI l ### Multiple workflows re-using the same actions Each CI use case has its own Github Action worfklow: -* [Connector testing](https://github.com/airbytehq/airbyte/blob/master/.github/workflows/connectors_tests.yml#L1) -* [Connector publish](https://github.com/airbytehq/airbyte/blob/master/.github/workflows/publish_connectors.yml#L1) -* [Internal package testing](https://github.com/airbytehq/airbyte/blob/master/.github/workflows/airbyte-ci-tests.yml#L1) -* etc. + +- [Connector testing](https://github.com/airbytehq/airbyte/blob/master/.github/workflows/connectors_tests.yml#L1) +- [Connector publish](https://github.com/airbytehq/airbyte/blob/master/.github/workflows/publish_connectors.yml#L1) +- [Internal package testing](https://github.com/airbytehq/airbyte/blob/master/.github/workflows/airbyte-ci-tests.yml#L1) +- etc. They all use the [`run-airbyte-ci` re-usable action](https://github.com/airbytehq/airbyte/blob/master/.github/actions/run-airbyte-ci/action.yml#L1)to which they provide the `airbyte-ci` command the workflow should run and other environment specific options. 
The `run-airbyte-ci` action does the following: -* [Pull Dagger image and install airbyte-ci from binary (or sources if the tool was changed on the branch)](https://github.com/airbytehq/airbyte/blob/master/.github/actions/run-airbyte-ci/action.yml#L105) -* [Run the airbyte-ci command passed as an input with other options also passed as inputs](https://github.com/airbytehq/airbyte/blob/main/.github/actions/run-airbyte-ci/action.yml#L111) + +- [Pull Dagger image and install airbyte-ci from binary (or sources if the tool was changed on the branch)](https://github.com/airbytehq/airbyte/blob/master/.github/actions/run-airbyte-ci/action.yml#L105) +- [Run the airbyte-ci command passed as an input with other options also passed as inputs](https://github.com/airbytehq/airbyte/blob/main/.github/actions/run-airbyte-ci/action.yml#L111) ## A full example: breaking down the execution flow of a connector test pipeline @@ -215,12 +233,14 @@ Let's describe and follow what happens when we run: **This command is meant to run tests on connectors that were modified on the branch.** Let's assume I modified the `source-faker` connector. + ### 1. The `airbyte-ci` command group On command execution the [`airbyte-ci` command group](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/pipelines/cli/airbyte_ci.py#L186) acts as the main entrypoint. It is: -* Provisioning the click context object with options values, that can be accessed in downstream commands. -* Checking if the local docker configuration is correct -* Wrapping the command execution with `dagger run` to get their nice terminal UI (unless `--disable-dagger-run` is passed) + +- Provisioning the click context object with options values, that can be accessed in downstream commands. +- Checking if the local docker configuration is correct +- Wrapping the command execution with `dagger run` to get their nice terminal UI (unless `--disable-dagger-run` is passed) ### 2. The `connectors` command subgroup @@ -229,13 +249,15 @@ It continues to populate the click context with other connectors specific option **It also computes the list of modified files on the branch and attach this list to the click context.** The `get_modified_files` function basically performs a `git diff` between the current branch and the `--diffed-branch` . ### 3. Reaching the `test` command -After going through the command groups we finally reach the actual command the user wants to execute: the [`test` command](https://github.com/airbytehq/airbyte/blob/main/airbyte-ci/connectors/pipelines/pipelines/airbyte_ci/connectors/test/commands.py#L72). + +After going through the command groups we finally reach the actual command the user wants to execute: the [`test` command](https://github.com/airbytehq/airbyte/blob/main/airbyte-ci/connectors/pipelines/pipelines/airbyte_ci/connectors/test/commands.py#L72). This function: -* Sends a pending commit status check to Github when we are running in CI -* Determines which steps should be skipped or kept according to user inputs (by building a `RunStepOptions` object) -* Instantiate one `ConnectorContext` per connector under test: we only modified `source-faker` so we'll have a single `ConnectorContext` to work with. 
-* Call `run_connectors_pipelines` with the `ConnectorContext`s and + +- Sends a pending commit status check to Github when we are running in CI +- Determines which steps should be skipped or kept according to user inputs (by building a `RunStepOptions` object) +- Instantiate one `ConnectorContext` per connector under test: we only modified `source-faker` so we'll have a single `ConnectorContext` to work with. +- Call `run_connectors_pipelines` with the `ConnectorContext`s and #### 4. Globally dispatching pipeline logic in `run_connectors_pipeline` @@ -243,17 +265,19 @@ This function: `run_connectors_pipeline`, as its taking a pipeline callable, it has no specific pipeline logic. This function: -* Instantiates the dagger client -* Create a task group to concurrently run the pipeline callable: we'd concurrently run test pipeline on multiple connectors if multiple connectors were modified. -* The concurrency of the pipeline is control via a semaphore object. + +- Instantiates the dagger client +- Create a task group to concurrently run the pipeline callable: we'd concurrently run test pipeline on multiple connectors if multiple connectors were modified. +- The concurrency of the pipeline is control via a semaphore object. #### 5. Actually running the pipeline in [`run_connector_test_pipeline`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/pipelines/airbyte_ci/connectors/test/pipeline.py#L48) -*Reminder: this function is called for each connector selected for testing. It takes a `ConnectorContext` and a `Semaphore` as inputs.* +_Reminder: this function is called for each connector selected for testing. It takes a `ConnectorContext` and a `Semaphore` as inputs._ The specific steps to run in the pipeline for a connector is determined by the output of the [`get_test_steps`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/pipelines/airbyte_ci/connectors/test/pipeline.py#L32) function which is building a step tree according to the connector language. **You can for instance check the declared step tree for python connectors [here](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/pipelines/airbyte_ci/connectors/test/steps/python_connectors.py#L249).**: + ```python def get_test_steps(context: ConnectorContext) -> STEP_TREE: """ @@ -292,7 +316,7 @@ def get_test_steps(context: ConnectorContext) -> STEP_TREE: ] ``` -After creating the step tree (a.k.a a *DAG*) it enters the `Semaphore` and `PipelineContext` context manager to execute the steps to run with `run_steps`. `run_steps` executes steps concurrently according to their dependencies. +After creating the step tree (a.k.a a _DAG_) it enters the `Semaphore` and `PipelineContext` context manager to execute the steps to run with `run_steps`. `run_steps` executes steps concurrently according to their dependencies. Once the steps are executed we get step results. We can build a `ConnectorReport` from these results. The report is finally attached to the `context` so that it gets persisted on `context` teardown. @@ -329,12 +353,14 @@ async def run_connector_test_pipeline(context: ConnectorContext, semaphore: anyi ``` #### 6. 
`ConnectorContext` teardown + Once the context manager is exited (when we exit the `async with context` block) the [`ConnectorContext.__aexit__` function is executed](https://github.com/airbytehq/airbyte/blob/main/airbyte-ci/connectors/pipelines/pipelines/airbyte_ci/connectors/context.py#L237) This function: -* Determines the global success or failure state of the pipeline according to the StepResults -* Uploads connector secrets back to GSM if they got updated -* Persists the report to disk -* Prints the report to the console -* Uploads the report to remote storage if we're in CI -* Updates the per connector commit status check + +- Determines the global success or failure state of the pipeline according to the StepResults +- Uploads connector secrets back to GSM if they got updated +- Persists the report to disk +- Prints the report to the console +- Uploads the report to remote storage if we're in CI +- Updates the per connector commit status check diff --git a/airbyte-ci/connectors/pipelines/README.md b/airbyte-ci/connectors/pipelines/README.md index bc4fef4841a..e7d975fbd5b 100644 --- a/airbyte-ci/connectors/pipelines/README.md +++ b/airbyte-ci/connectors/pipelines/README.md @@ -298,14 +298,14 @@ flowchart TD #### Options | Option | Multiple | Default value | Description | -| ------------------------------------------------------- | -------- | ------------- | -------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------- | +| ------------------------------------------------------- | -------- | ------------- | -------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------- | ------------------------------------------------------------------------------------------- | | `--skip-step/-x` | True | | Skip steps by id e.g. `-x unit -x acceptance` | | `--only-step/-k` | True | | Only run specific steps by id e.g. `-k unit -k acceptance` | | `--fail-fast` | False | False | Abort after any tests fail, rather than continuing to run additional tests. Use this setting to confirm a known bug is fixed (or not), or when you only require a pass/fail result. | | `--code-tests-only` | True | False | Skip any tests not directly related to code updates. For instance, metadata checks, version bump checks, changelog verification, etc. Use this setting to help focus on code quality during development. | | `--concurrent-cat` | False | False | Make CAT tests run concurrently using pytest-xdist. Be careful about source or destination API rate limits. | | `--.=` | True | | You can pass extra parameters for specific test steps. More details in the extra parameters section below | -| `--ci-requirements` | False | | | Output the CI requirements as a JSON payload. It is used to determine the CI runner to use. +| `--ci-requirements` | False | | | Output the CI requirements as a JSON payload. It is used to determine the CI runner to use. | Note: @@ -461,12 +461,12 @@ remoteRegistries: ### `connectors up_to_date` command -Meant to be run on a cron script. +Meant to be run on a cron script. 
Actions: -* Upgrades dependecies to the current versions -* Can make a pull request and bump version, changelog +- Upgrades dependecies to the current versions +- Can make a pull request and bump version, changelog ``` Usage: airbyte-ci connectors up_to_date [OPTIONS] @@ -484,19 +484,19 @@ Options: Get source-openweather up to date. If there are changes, bump the version and add to changelog: -* `airbyte-ci connectors --name=source-openweather up_to_date`: upgrades main dependecies -* `airbyte-ci connectors --name=source-openweather up_to_date --dev`: forces update if there are only dev changes -* `airbyte-ci connectors --name=source-openweather up_to_date --dep pytest@^8.10 --dep airbyte-cdk@0.80.0`: allows update to toml files as well -* `airbyte-ci connectors --name=source-openweather up_to_date --pull`: make a pull request for it +- `airbyte-ci connectors --name=source-openweather up_to_date`: upgrades main dependecies +- `airbyte-ci connectors --name=source-openweather up_to_date --dev`: forces update if there are only dev changes +- `airbyte-ci connectors --name=source-openweather up_to_date --dep pytest@^8.10 --dep airbyte-cdk@0.80.0`: allows update to toml files as well +- `airbyte-ci connectors --name=source-openweather up_to_date --pull`: make a pull request for it - ### Other things it could do +### Other things it could do -* upgrade it the latest base image -* make sure it's the newest version of pytest -* do a `poetry update` to update everything else -* make the pull requests on a well known branch, replacing the last one if still open -* bump the toml and metadata and changelog -* also bump the manifest version of the CDK +- upgrade it the latest base image +- make sure it's the newest version of pytest +- do a `poetry update` to update everything else +- make the pull requests on a well known branch, replacing the last one if still open +- bump the toml and metadata and changelog +- also bump the manifest version of the CDK ### `connectors bump_version` command diff --git a/airbyte-ci/connectors/pipelines/pipelines/helpers/changelog.py b/airbyte-ci/connectors/pipelines/pipelines/helpers/changelog.py index bbe724c4832..b435a488ed1 100644 --- a/airbyte-ci/connectors/pipelines/pipelines/helpers/changelog.py +++ b/airbyte-ci/connectors/pipelines/pipelines/helpers/changelog.py @@ -35,13 +35,13 @@ class ChangelogEntry: def __eq__(self, other: object) -> bool: if not isinstance(other, ChangelogEntry): return False - retVal = ( + entry_matches = ( self.date == other.date and self.version == other.version and self.pr_number == other.pr_number and self.comment == other.comment ) - return retVal + return entry_matches def __ne__(self, other: object) -> bool: return not (self.__eq__(other)) @@ -103,6 +103,10 @@ class Changelog: self.new_entries.add(ChangelogEntry(date, version, pull_request_number, comment)) def to_markdown(self) -> str: + """ + Generates the complete markdown content for the changelog, + including both original and new entries, sorted by version, date, pull request number, and comment. 
+ """ all_entries = set(self.original_entries.union(self.new_entries)) sorted_entries = sorted( sorted( diff --git a/airbyte-ci/connectors/pipelines/tests/test_changelog.py b/airbyte-ci/connectors/pipelines/tests/test_changelog.py index dcb54b47eed..3222830cd04 100644 --- a/airbyte-ci/connectors/pipelines/tests/test_changelog.py +++ b/airbyte-ci/connectors/pipelines/tests/test_changelog.py @@ -16,6 +16,11 @@ pytestmark = [ PATH_TO_INITIAL_FILES = Path("airbyte-ci/connectors/pipelines/tests/test_changelog/initial_files") PATH_TO_RESULT_FILES = Path("airbyte-ci/connectors/pipelines/tests/test_changelog/result_files") + +# When WRITE_TO_RESULT_FILE is set to True, all tests below will generate the resulting markdown +# and write it back to the fixture files. +# This is useful when you changed the source files and need to regenrate the fixtures. +# The comparison against target will still fail, but it will succeed on the subsequent test run. WRITE_TO_RESULT_FILE = False diff --git a/airbyte-ci/connectors/pipelines/tests/test_changelog/initial_files/changelog_header_no_newline.md b/airbyte-ci/connectors/pipelines/tests/test_changelog/initial_files/changelog_header_no_newline.md index 91dbd6fe3bc..3ef88ff3c15 100644 --- a/airbyte-ci/connectors/pipelines/tests/test_changelog/initial_files/changelog_header_no_newline.md +++ b/airbyte-ci/connectors/pipelines/tests/test_changelog/initial_files/changelog_header_no_newline.md @@ -1,10 +1,11 @@ # Postgres Airbyte's certified Postgres connector offers the following features: -* Replicate data from tables, views and materilized views. Other data objects won't be replicated to the destination like indexes, permissions. -| Version | Date | Pull Request | Subject | -|---------|------------|----------------------------------------------------------|----------------------------------------------------------------------------------------------------------------------------------------------------------------------------| -| 3.3.3 | 2024-01-26 | [34573](https://github.com/airbytehq/airbyte/pull/34573) | Adopt CDK v0.16.0 | -| 3.3.2 | 2024-01-24 | [34465](https://github.com/airbytehq/airbyte/pull/34465) | Check xmin only if user selects xmin sync mode. | -| 3.3.1 | 2024-01-10 | [34119](https://github.com/airbytehq/airbyte/pull/34119) | Adopt java CDK version 0.11.5. | -| 3.3.0 | 2023-12-19 | [33437](https://github.com/airbytehq/airbyte/pull/33437) | Remove LEGACY state flag | | + +- Replicate data from tables, views and materilized views. Other data objects won't be replicated to the destination like indexes, permissions. + | Version | Date | Pull Request | Subject | + |---------|------------|----------------------------------------------------------|----------------------------------------------------------------------------------------------------------------------------------------------------------------------------| + | 3.3.3 | 2024-01-26 | [34573](https://github.com/airbytehq/airbyte/pull/34573) | Adopt CDK v0.16.0 | + | 3.3.2 | 2024-01-24 | [34465](https://github.com/airbytehq/airbyte/pull/34465) | Check xmin only if user selects xmin sync mode. | + | 3.3.1 | 2024-01-10 | [34119](https://github.com/airbytehq/airbyte/pull/34119) | Adopt java CDK version 0.11.5. 
| + | 3.3.0 | 2023-12-19 | [33437](https://github.com/airbytehq/airbyte/pull/33437) | Remove LEGACY state flag | diff --git a/airbyte-ci/connectors/pipelines/tests/test_changelog/initial_files/changelog_header_no_separator.md b/airbyte-ci/connectors/pipelines/tests/test_changelog/initial_files/changelog_header_no_separator.md index 0bb5c1c5874..2f8ba0ddb51 100644 --- a/airbyte-ci/connectors/pipelines/tests/test_changelog/initial_files/changelog_header_no_separator.md +++ b/airbyte-ci/connectors/pipelines/tests/test_changelog/initial_files/changelog_header_no_separator.md @@ -1,10 +1,11 @@ # Postgres Airbyte's certified Postgres connector offers the following features: -* Replicate data from tables, views and materilized views. Other data objects won't be replicated to the destination like indexes, permissions. -| Version | Date | Pull Request | Subject | +- Replicate data from tables, views and materilized views. Other data objects won't be replicated to the destination like indexes, permissions. + +| Version | Date | Pull Request | Subject | | 3.3.3 | 2024-01-26 | [34573](https://github.com/airbytehq/airbyte/pull/34573) | Adopt CDK v0.16.0 | -| 3.3.2 | 2024-01-24 | [34465](https://github.com/airbytehq/airbyte/pull/34465) | Check xmin only if user selects xmin sync mode. | -| 3.3.1 | 2024-01-10 | [34119](https://github.com/airbytehq/airbyte/pull/34119) | Adopt java CDK version 0.11.5. | -| 3.3.0 | 2023-12-19 | [33437](https://github.com/airbytehq/airbyte/pull/33437) | Remove LEGACY state flag | | +| 3.3.2 | 2024-01-24 | [34465](https://github.com/airbytehq/airbyte/pull/34465) | Check xmin only if user selects xmin sync mode. | +| 3.3.1 | 2024-01-10 | [34119](https://github.com/airbytehq/airbyte/pull/34119) | Adopt java CDK version 0.11.5. | +| 3.3.0 | 2023-12-19 | [33437](https://github.com/airbytehq/airbyte/pull/33437) | Remove LEGACY state flag | diff --git a/airbyte-ci/connectors/pipelines/tests/test_changelog/initial_files/no_changelog_header.md b/airbyte-ci/connectors/pipelines/tests/test_changelog/initial_files/no_changelog_header.md index e8dc6156152..7bf38a05bbf 100644 --- a/airbyte-ci/connectors/pipelines/tests/test_changelog/initial_files/no_changelog_header.md +++ b/airbyte-ci/connectors/pipelines/tests/test_changelog/initial_files/no_changelog_header.md @@ -1,10 +1,11 @@ # Postgres Airbyte's certified Postgres connector offers the following features: -* Replicate data from tables, views and materilized views. Other data objects won't be replicated to the destination like indexes, permissions. + +- Replicate data from tables, views and materilized views. Other data objects won't be replicated to the destination like indexes, permissions. |---------|------------|----------------------------------------------------------|----------------------------------------------------------------------------------------------------------------------------------------------------------------------------| | 3.3.3 | 2024-01-26 | [34573](https://github.com/airbytehq/airbyte/pull/34573) | Adopt CDK v0.16.0 | -| 3.3.2 | 2024-01-24 | [34465](https://github.com/airbytehq/airbyte/pull/34465) | Check xmin only if user selects xmin sync mode. | -| 3.3.1 | 2024-01-10 | [34119](https://github.com/airbytehq/airbyte/pull/34119) | Adopt java CDK version 0.11.5. | -| 3.3.0 | 2023-12-19 | [33437](https://github.com/airbytehq/airbyte/pull/33437) | Remove LEGACY state flag | | +| 3.3.2 | 2024-01-24 | [34465](https://github.com/airbytehq/airbyte/pull/34465) | Check xmin only if user selects xmin sync mode. 
| +| 3.3.1 | 2024-01-10 | [34119](https://github.com/airbytehq/airbyte/pull/34119) | Adopt java CDK version 0.11.5. | +| 3.3.0 | 2023-12-19 | [33437](https://github.com/airbytehq/airbyte/pull/33437) | Remove LEGACY state flag | diff --git a/airbyte-ci/connectors/pipelines/tests/test_changelog/initial_files/valid_changelog_at_end.md b/airbyte-ci/connectors/pipelines/tests/test_changelog/initial_files/valid_changelog_at_end.md index 954709e5679..b74f996a783 100644 --- a/airbyte-ci/connectors/pipelines/tests/test_changelog/initial_files/valid_changelog_at_end.md +++ b/airbyte-ci/connectors/pipelines/tests/test_changelog/initial_files/valid_changelog_at_end.md @@ -1,11 +1,12 @@ # Postgres Airbyte's certified Postgres connector offers the following features: -* Replicate data from tables, views and materilized views. Other data objects won't be replicated to the destination like indexes, permissions. -| Version | Date | Pull Request | Subject | -|---------|------------|----------------------------------------------------------|----------------------------------------------------------------------------------------------------------------------------------------------------------------------------| -| 3.3.3 | 2024-01-26 | [34573](https://github.com/airbytehq/airbyte/pull/34573) | Adopt CDK v0.16.0 | -| 3.3.2 | 2024-01-24 | [34465](https://github.com/airbytehq/airbyte/pull/34465) | Check xmin only if user selects xmin sync mode. | -| 3.3.1 | 2024-01-10 | [34119](https://github.com/airbytehq/airbyte/pull/34119) | Adopt java CDK version 0.11.5. | -| 3.3.0 | 2023-12-19 | [33437](https://github.com/airbytehq/airbyte/pull/33437) | Remove LEGACY state flag | | +- Replicate data from tables, views and materilized views. Other data objects won't be replicated to the destination like indexes, permissions. + +| Version | Date | Pull Request | Subject | +| ------- | ---------- | -------------------------------------------------------- | ----------------------------------------------- | +| 3.3.3 | 2024-01-26 | [34573](https://github.com/airbytehq/airbyte/pull/34573) | Adopt CDK v0.16.0 | +| 3.3.2 | 2024-01-24 | [34465](https://github.com/airbytehq/airbyte/pull/34465) | Check xmin only if user selects xmin sync mode. | +| 3.3.1 | 2024-01-10 | [34119](https://github.com/airbytehq/airbyte/pull/34119) | Adopt java CDK version 0.11.5. | +| 3.3.0 | 2023-12-19 | [33437](https://github.com/airbytehq/airbyte/pull/33437) | Remove LEGACY state flag | diff --git a/airbyte-ci/connectors/pipelines/tests/test_changelog/initial_files/valid_changelog_in_middle.md b/airbyte-ci/connectors/pipelines/tests/test_changelog/initial_files/valid_changelog_in_middle.md index 91d499c5180..38d22f55c64 100644 --- a/airbyte-ci/connectors/pipelines/tests/test_changelog/initial_files/valid_changelog_in_middle.md +++ b/airbyte-ci/connectors/pipelines/tests/test_changelog/initial_files/valid_changelog_in_middle.md @@ -1,12 +1,14 @@ # Postgres Airbyte's certified Postgres connector offers the following features: -* Replicate data from tables, views and materilized views. Other data objects won't be replicated to the destination like indexes, permissions. 
-| Version | Date | Pull Request | Subject | -|---------|------------|----------------------------------------------------------|----------------------------------------------------------------------------------------------------------------------------------------------------------------------------| -| 3.3.3 | 2024-01-26 | [34573](https://github.com/airbytehq/airbyte/pull/34573) | Adopt CDK v0.16.0 | -| 3.3.2 | 2024-01-24 | [34465](https://github.com/airbytehq/airbyte/pull/34465) | Check xmin only if user selects xmin sync mode. | -| 3.3.1 | 2024-01-10 | [34119](https://github.com/airbytehq/airbyte/pull/34119) | Adopt java CDK version 0.11.5. | -| 3.3.0 | 2023-12-19 | [33437](https://github.com/airbytehq/airbyte/pull/33437) | Remove LEGACY state flag | | -Laurem Ipsum blah blah \ No newline at end of file +- Replicate data from tables, views and materilized views. Other data objects won't be replicated to the destination like indexes, permissions. + +| Version | Date | Pull Request | Subject | +| ------- | ---------- | -------------------------------------------------------- | ----------------------------------------------- | +| 3.3.3 | 2024-01-26 | [34573](https://github.com/airbytehq/airbyte/pull/34573) | Adopt CDK v0.16.0 | +| 3.3.2 | 2024-01-24 | [34465](https://github.com/airbytehq/airbyte/pull/34465) | Check xmin only if user selects xmin sync mode. | +| 3.3.1 | 2024-01-10 | [34119](https://github.com/airbytehq/airbyte/pull/34119) | Adopt java CDK version 0.11.5. | +| 3.3.0 | 2023-12-19 | [33437](https://github.com/airbytehq/airbyte/pull/33437) | Remove LEGACY state flag | + +Laurem Ipsum blah blah diff --git a/airbyte-ci/connectors/pipelines/tests/test_changelog/result_files/dupicate_version_date_valid_changelog_at_end.md b/airbyte-ci/connectors/pipelines/tests/test_changelog/result_files/dupicate_version_date_valid_changelog_at_end.md index ec82a0c5ea4..a82790ed851 100644 --- a/airbyte-ci/connectors/pipelines/tests/test_changelog/result_files/dupicate_version_date_valid_changelog_at_end.md +++ b/airbyte-ci/connectors/pipelines/tests/test_changelog/result_files/dupicate_version_date_valid_changelog_at_end.md @@ -1,13 +1,14 @@ # Postgres Airbyte's certified Postgres connector offers the following features: -* Replicate data from tables, views and materilized views. Other data objects won't be replicated to the destination like indexes, permissions. -| Version | Date | Pull Request | Subject | -|---------|------------|----------------------------------------------------------|----------------------------------------------------------------------------------------------------------------------------------------------------------------------------| +- Replicate data from tables, views and materilized views. Other data objects won't be replicated to the destination like indexes, permissions. + +| Version | Date | Pull Request | Subject | +| ------- | ---------- | -------------------------------------------------------- | ----------------------------------------------- | | 3.4.0 | 2024-03-01 | [123457](https://github.com/airbytehq/airbyte/pull/123457) | test2 | | 3.4.0 | 2024-03-01 | [123456](https://github.com/airbytehq/airbyte/pull/123456) | test1 | | 3.3.3 | 2024-01-26 | [34573](https://github.com/airbytehq/airbyte/pull/34573) | Adopt CDK v0.16.0 | | 3.3.2 | 2024-01-24 | [34465](https://github.com/airbytehq/airbyte/pull/34465) | Check xmin only if user selects xmin sync mode. 
| | 3.3.1 | 2024-01-10 | [34119](https://github.com/airbytehq/airbyte/pull/34119) | Adopt java CDK version 0.11.5. | -| 3.3.0 | 2023-12-19 | [33437](https://github.com/airbytehq/airbyte/pull/33437) | Remove LEGACY state flag | | +| 3.3.0 | 2023-12-19 | [33437](https://github.com/airbytehq/airbyte/pull/33437) | Remove LEGACY state flag | diff --git a/airbyte-ci/connectors/pipelines/tests/test_changelog/result_files/dupicate_version_date_valid_changelog_in_middle.md b/airbyte-ci/connectors/pipelines/tests/test_changelog/result_files/dupicate_version_date_valid_changelog_in_middle.md index a2d9a316778..52fa93f9ff6 100644 --- a/airbyte-ci/connectors/pipelines/tests/test_changelog/result_files/dupicate_version_date_valid_changelog_in_middle.md +++ b/airbyte-ci/connectors/pipelines/tests/test_changelog/result_files/dupicate_version_date_valid_changelog_in_middle.md @@ -1,14 +1,16 @@ # Postgres Airbyte's certified Postgres connector offers the following features: -* Replicate data from tables, views and materilized views. Other data objects won't be replicated to the destination like indexes, permissions. -| Version | Date | Pull Request | Subject | -|---------|------------|----------------------------------------------------------|----------------------------------------------------------------------------------------------------------------------------------------------------------------------------| +- Replicate data from tables, views and materilized views. Other data objects won't be replicated to the destination like indexes, permissions. + +| Version | Date | Pull Request | Subject | +| ------- | ---------- | -------------------------------------------------------- | ----------------------------------------------- | | 3.4.0 | 2024-03-01 | [123457](https://github.com/airbytehq/airbyte/pull/123457) | test2 | | 3.4.0 | 2024-03-01 | [123456](https://github.com/airbytehq/airbyte/pull/123456) | test1 | | 3.3.3 | 2024-01-26 | [34573](https://github.com/airbytehq/airbyte/pull/34573) | Adopt CDK v0.16.0 | | 3.3.2 | 2024-01-24 | [34465](https://github.com/airbytehq/airbyte/pull/34465) | Check xmin only if user selects xmin sync mode. | | 3.3.1 | 2024-01-10 | [34119](https://github.com/airbytehq/airbyte/pull/34119) | Adopt java CDK version 0.11.5. | -| 3.3.0 | 2023-12-19 | [33437](https://github.com/airbytehq/airbyte/pull/33437) | Remove LEGACY state flag | | +| 3.3.0 | 2023-12-19 | [33437](https://github.com/airbytehq/airbyte/pull/33437) | Remove LEGACY state flag | + Laurem Ipsum blah blah diff --git a/airbyte-ci/connectors/pipelines/tests/test_changelog/result_files/dupicate_versions_valid_changelog_at_end.md b/airbyte-ci/connectors/pipelines/tests/test_changelog/result_files/dupicate_versions_valid_changelog_at_end.md index 843738afdc3..f9c07afa296 100644 --- a/airbyte-ci/connectors/pipelines/tests/test_changelog/result_files/dupicate_versions_valid_changelog_at_end.md +++ b/airbyte-ci/connectors/pipelines/tests/test_changelog/result_files/dupicate_versions_valid_changelog_at_end.md @@ -1,13 +1,14 @@ # Postgres Airbyte's certified Postgres connector offers the following features: -* Replicate data from tables, views and materilized views. Other data objects won't be replicated to the destination like indexes, permissions. 
-| Version | Date | Pull Request | Subject | -|---------|------------|----------------------------------------------------------|----------------------------------------------------------------------------------------------------------------------------------------------------------------------------| +- Replicate data from tables, views and materilized views. Other data objects won't be replicated to the destination like indexes, permissions. + +| Version | Date | Pull Request | Subject | +| ------- | ---------- | -------------------------------------------------------- | ----------------------------------------------- | | 3.4.0 | 2024-03-02 | [123457](https://github.com/airbytehq/airbyte/pull/123457) | test2 | | 3.4.0 | 2024-03-01 | [123456](https://github.com/airbytehq/airbyte/pull/123456) | test1 | | 3.3.3 | 2024-01-26 | [34573](https://github.com/airbytehq/airbyte/pull/34573) | Adopt CDK v0.16.0 | | 3.3.2 | 2024-01-24 | [34465](https://github.com/airbytehq/airbyte/pull/34465) | Check xmin only if user selects xmin sync mode. | | 3.3.1 | 2024-01-10 | [34119](https://github.com/airbytehq/airbyte/pull/34119) | Adopt java CDK version 0.11.5. | -| 3.3.0 | 2023-12-19 | [33437](https://github.com/airbytehq/airbyte/pull/33437) | Remove LEGACY state flag | | +| 3.3.0 | 2023-12-19 | [33437](https://github.com/airbytehq/airbyte/pull/33437) | Remove LEGACY state flag | diff --git a/airbyte-ci/connectors/pipelines/tests/test_changelog/result_files/dupicate_versions_valid_changelog_in_middle.md b/airbyte-ci/connectors/pipelines/tests/test_changelog/result_files/dupicate_versions_valid_changelog_in_middle.md index 2e22f199994..cfd750b9b02 100644 --- a/airbyte-ci/connectors/pipelines/tests/test_changelog/result_files/dupicate_versions_valid_changelog_in_middle.md +++ b/airbyte-ci/connectors/pipelines/tests/test_changelog/result_files/dupicate_versions_valid_changelog_in_middle.md @@ -1,14 +1,16 @@ # Postgres Airbyte's certified Postgres connector offers the following features: -* Replicate data from tables, views and materilized views. Other data objects won't be replicated to the destination like indexes, permissions. -| Version | Date | Pull Request | Subject | -|---------|------------|----------------------------------------------------------|----------------------------------------------------------------------------------------------------------------------------------------------------------------------------| +- Replicate data from tables, views and materilized views. Other data objects won't be replicated to the destination like indexes, permissions. + +| Version | Date | Pull Request | Subject | +| ------- | ---------- | -------------------------------------------------------- | ----------------------------------------------- | | 3.4.0 | 2024-03-02 | [123457](https://github.com/airbytehq/airbyte/pull/123457) | test2 | | 3.4.0 | 2024-03-01 | [123456](https://github.com/airbytehq/airbyte/pull/123456) | test1 | | 3.3.3 | 2024-01-26 | [34573](https://github.com/airbytehq/airbyte/pull/34573) | Adopt CDK v0.16.0 | | 3.3.2 | 2024-01-24 | [34465](https://github.com/airbytehq/airbyte/pull/34465) | Check xmin only if user selects xmin sync mode. | | 3.3.1 | 2024-01-10 | [34119](https://github.com/airbytehq/airbyte/pull/34119) | Adopt java CDK version 0.11.5. 
| -| 3.3.0 | 2023-12-19 | [33437](https://github.com/airbytehq/airbyte/pull/33437) | Remove LEGACY state flag | | +| 3.3.0 | 2023-12-19 | [33437](https://github.com/airbytehq/airbyte/pull/33437) | Remove LEGACY state flag | + Laurem Ipsum blah blah diff --git a/airbyte-ci/connectors/pipelines/tests/test_changelog/result_files/duplicate_entry_valid_changelog_at_end.md b/airbyte-ci/connectors/pipelines/tests/test_changelog/result_files/duplicate_entry_valid_changelog_at_end.md index 47ffbeac1a7..2064364cae3 100644 --- a/airbyte-ci/connectors/pipelines/tests/test_changelog/result_files/duplicate_entry_valid_changelog_at_end.md +++ b/airbyte-ci/connectors/pipelines/tests/test_changelog/result_files/duplicate_entry_valid_changelog_at_end.md @@ -1,12 +1,13 @@ # Postgres Airbyte's certified Postgres connector offers the following features: -* Replicate data from tables, views and materilized views. Other data objects won't be replicated to the destination like indexes, permissions. -| Version | Date | Pull Request | Subject | -|---------|------------|----------------------------------------------------------|----------------------------------------------------------------------------------------------------------------------------------------------------------------------------| +- Replicate data from tables, views and materilized views. Other data objects won't be replicated to the destination like indexes, permissions. + +| Version | Date | Pull Request | Subject | +| ------- | ---------- | -------------------------------------------------------- | ----------------------------------------------- | | 3.4.0 | 2024-03-01 | [123456](https://github.com/airbytehq/airbyte/pull/123456) | test | | 3.3.3 | 2024-01-26 | [34573](https://github.com/airbytehq/airbyte/pull/34573) | Adopt CDK v0.16.0 | | 3.3.2 | 2024-01-24 | [34465](https://github.com/airbytehq/airbyte/pull/34465) | Check xmin only if user selects xmin sync mode. | | 3.3.1 | 2024-01-10 | [34119](https://github.com/airbytehq/airbyte/pull/34119) | Adopt java CDK version 0.11.5. | -| 3.3.0 | 2023-12-19 | [33437](https://github.com/airbytehq/airbyte/pull/33437) | Remove LEGACY state flag | | +| 3.3.0 | 2023-12-19 | [33437](https://github.com/airbytehq/airbyte/pull/33437) | Remove LEGACY state flag | diff --git a/airbyte-ci/connectors/pipelines/tests/test_changelog/result_files/duplicate_entry_valid_changelog_in_middle.md b/airbyte-ci/connectors/pipelines/tests/test_changelog/result_files/duplicate_entry_valid_changelog_in_middle.md index fe7ff8cce83..404d5a0965a 100644 --- a/airbyte-ci/connectors/pipelines/tests/test_changelog/result_files/duplicate_entry_valid_changelog_in_middle.md +++ b/airbyte-ci/connectors/pipelines/tests/test_changelog/result_files/duplicate_entry_valid_changelog_in_middle.md @@ -1,13 +1,15 @@ # Postgres Airbyte's certified Postgres connector offers the following features: -* Replicate data from tables, views and materilized views. Other data objects won't be replicated to the destination like indexes, permissions. -| Version | Date | Pull Request | Subject | -|---------|------------|----------------------------------------------------------|----------------------------------------------------------------------------------------------------------------------------------------------------------------------------| +- Replicate data from tables, views and materilized views. Other data objects won't be replicated to the destination like indexes, permissions. 
+ +| Version | Date | Pull Request | Subject | +| ------- | ---------- | -------------------------------------------------------- | ----------------------------------------------- | | 3.4.0 | 2024-03-01 | [123456](https://github.com/airbytehq/airbyte/pull/123456) | test | | 3.3.3 | 2024-01-26 | [34573](https://github.com/airbytehq/airbyte/pull/34573) | Adopt CDK v0.16.0 | | 3.3.2 | 2024-01-24 | [34465](https://github.com/airbytehq/airbyte/pull/34465) | Check xmin only if user selects xmin sync mode. | | 3.3.1 | 2024-01-10 | [34119](https://github.com/airbytehq/airbyte/pull/34119) | Adopt java CDK version 0.11.5. | -| 3.3.0 | 2023-12-19 | [33437](https://github.com/airbytehq/airbyte/pull/33437) | Remove LEGACY state flag | | +| 3.3.0 | 2023-12-19 | [33437](https://github.com/airbytehq/airbyte/pull/33437) | Remove LEGACY state flag | + Laurem Ipsum blah blah diff --git a/airbyte-ci/connectors/pipelines/tests/test_changelog/result_files/existing_entries_valid_changelog_at_end.md b/airbyte-ci/connectors/pipelines/tests/test_changelog/result_files/existing_entries_valid_changelog_at_end.md index be064c1fb03..b497d1a2e5d 100644 --- a/airbyte-ci/connectors/pipelines/tests/test_changelog/result_files/existing_entries_valid_changelog_at_end.md +++ b/airbyte-ci/connectors/pipelines/tests/test_changelog/result_files/existing_entries_valid_changelog_at_end.md @@ -1,11 +1,12 @@ # Postgres Airbyte's certified Postgres connector offers the following features: -* Replicate data from tables, views and materilized views. Other data objects won't be replicated to the destination like indexes, permissions. -| Version | Date | Pull Request | Subject | -|---------|------------|----------------------------------------------------------|----------------------------------------------------------------------------------------------------------------------------------------------------------------------------| +- Replicate data from tables, views and materilized views. Other data objects won't be replicated to the destination like indexes, permissions. + +| Version | Date | Pull Request | Subject | +| ------- | ---------- | -------------------------------------------------------- | ----------------------------------------------- | | 3.3.3 | 2024-01-26 | [34573](https://github.com/airbytehq/airbyte/pull/34573) | Adopt CDK v0.16.0 | | 3.3.2 | 2024-01-24 | [34465](https://github.com/airbytehq/airbyte/pull/34465) | Check xmin only if user selects xmin sync mode. | | 3.3.1 | 2024-01-10 | [34119](https://github.com/airbytehq/airbyte/pull/34119) | Adopt java CDK version 0.11.5. | -| 3.3.0 | 2023-12-19 | [33437](https://github.com/airbytehq/airbyte/pull/33437) | Remove LEGACY state flag | | +| 3.3.0 | 2023-12-19 | [33437](https://github.com/airbytehq/airbyte/pull/33437) | Remove LEGACY state flag | diff --git a/airbyte-ci/connectors/pipelines/tests/test_changelog/result_files/existing_entries_valid_changelog_in_middle.md b/airbyte-ci/connectors/pipelines/tests/test_changelog/result_files/existing_entries_valid_changelog_in_middle.md index 2873736244b..e4d5be7fd6d 100644 --- a/airbyte-ci/connectors/pipelines/tests/test_changelog/result_files/existing_entries_valid_changelog_in_middle.md +++ b/airbyte-ci/connectors/pipelines/tests/test_changelog/result_files/existing_entries_valid_changelog_in_middle.md @@ -1,12 +1,14 @@ # Postgres Airbyte's certified Postgres connector offers the following features: -* Replicate data from tables, views and materilized views. 
Other data objects won't be replicated to the destination like indexes, permissions. -| Version | Date | Pull Request | Subject | -|---------|------------|----------------------------------------------------------|----------------------------------------------------------------------------------------------------------------------------------------------------------------------------| +- Replicate data from tables, views and materilized views. Other data objects won't be replicated to the destination like indexes, permissions. + +| Version | Date | Pull Request | Subject | +| ------- | ---------- | -------------------------------------------------------- | ----------------------------------------------- | | 3.3.3 | 2024-01-26 | [34573](https://github.com/airbytehq/airbyte/pull/34573) | Adopt CDK v0.16.0 | | 3.3.2 | 2024-01-24 | [34465](https://github.com/airbytehq/airbyte/pull/34465) | Check xmin only if user selects xmin sync mode. | | 3.3.1 | 2024-01-10 | [34119](https://github.com/airbytehq/airbyte/pull/34119) | Adopt java CDK version 0.11.5. | -| 3.3.0 | 2023-12-19 | [33437](https://github.com/airbytehq/airbyte/pull/33437) | Remove LEGACY state flag | | +| 3.3.0 | 2023-12-19 | [33437](https://github.com/airbytehq/airbyte/pull/33437) | Remove LEGACY state flag | + Laurem Ipsum blah blah diff --git a/airbyte-ci/connectors/pipelines/tests/test_changelog/result_files/single_insert_valid_changelog_at_end.md b/airbyte-ci/connectors/pipelines/tests/test_changelog/result_files/single_insert_valid_changelog_at_end.md index 47ffbeac1a7..2064364cae3 100644 --- a/airbyte-ci/connectors/pipelines/tests/test_changelog/result_files/single_insert_valid_changelog_at_end.md +++ b/airbyte-ci/connectors/pipelines/tests/test_changelog/result_files/single_insert_valid_changelog_at_end.md @@ -1,12 +1,13 @@ # Postgres Airbyte's certified Postgres connector offers the following features: -* Replicate data from tables, views and materilized views. Other data objects won't be replicated to the destination like indexes, permissions. -| Version | Date | Pull Request | Subject | -|---------|------------|----------------------------------------------------------|----------------------------------------------------------------------------------------------------------------------------------------------------------------------------| +- Replicate data from tables, views and materilized views. Other data objects won't be replicated to the destination like indexes, permissions. + +| Version | Date | Pull Request | Subject | +| ------- | ---------- | -------------------------------------------------------- | ----------------------------------------------- | | 3.4.0 | 2024-03-01 | [123456](https://github.com/airbytehq/airbyte/pull/123456) | test | | 3.3.3 | 2024-01-26 | [34573](https://github.com/airbytehq/airbyte/pull/34573) | Adopt CDK v0.16.0 | | 3.3.2 | 2024-01-24 | [34465](https://github.com/airbytehq/airbyte/pull/34465) | Check xmin only if user selects xmin sync mode. | | 3.3.1 | 2024-01-10 | [34119](https://github.com/airbytehq/airbyte/pull/34119) | Adopt java CDK version 0.11.5. 
| -| 3.3.0 | 2023-12-19 | [33437](https://github.com/airbytehq/airbyte/pull/33437) | Remove LEGACY state flag | | +| 3.3.0 | 2023-12-19 | [33437](https://github.com/airbytehq/airbyte/pull/33437) | Remove LEGACY state flag | diff --git a/airbyte-ci/connectors/pipelines/tests/test_changelog/result_files/single_insert_valid_changelog_in_middle.md b/airbyte-ci/connectors/pipelines/tests/test_changelog/result_files/single_insert_valid_changelog_in_middle.md index fe7ff8cce83..404d5a0965a 100644 --- a/airbyte-ci/connectors/pipelines/tests/test_changelog/result_files/single_insert_valid_changelog_in_middle.md +++ b/airbyte-ci/connectors/pipelines/tests/test_changelog/result_files/single_insert_valid_changelog_in_middle.md @@ -1,13 +1,15 @@ # Postgres Airbyte's certified Postgres connector offers the following features: -* Replicate data from tables, views and materilized views. Other data objects won't be replicated to the destination like indexes, permissions. -| Version | Date | Pull Request | Subject | -|---------|------------|----------------------------------------------------------|----------------------------------------------------------------------------------------------------------------------------------------------------------------------------| +- Replicate data from tables, views and materilized views. Other data objects won't be replicated to the destination like indexes, permissions. + +| Version | Date | Pull Request | Subject | +| ------- | ---------- | -------------------------------------------------------- | ----------------------------------------------- | | 3.4.0 | 2024-03-01 | [123456](https://github.com/airbytehq/airbyte/pull/123456) | test | | 3.3.3 | 2024-01-26 | [34573](https://github.com/airbytehq/airbyte/pull/34573) | Adopt CDK v0.16.0 | | 3.3.2 | 2024-01-24 | [34465](https://github.com/airbytehq/airbyte/pull/34465) | Check xmin only if user selects xmin sync mode. | | 3.3.1 | 2024-01-10 | [34119](https://github.com/airbytehq/airbyte/pull/34119) | Adopt java CDK version 0.11.5. | -| 3.3.0 | 2023-12-19 | [33437](https://github.com/airbytehq/airbyte/pull/33437) | Remove LEGACY state flag | | +| 3.3.0 | 2023-12-19 | [33437](https://github.com/airbytehq/airbyte/pull/33437) | Remove LEGACY state flag | + Laurem Ipsum blah blah diff --git a/airbyte-integrations/bases/base-normalization/dbt-project-template/README.md b/airbyte-integrations/bases/base-normalization/dbt-project-template/README.md index 0444b3be9f8..13e812383e9 100644 --- a/airbyte-integrations/bases/base-normalization/dbt-project-template/README.md +++ b/airbyte-integrations/bases/base-normalization/dbt-project-template/README.md @@ -16,4 +16,4 @@ 1. You should find `profiles.yml` and a bunch of other dbt files/folders created there. 1. To check everything is setup properly: `dbt debug --profiles-dir=$(pwd) --project-dir=$(pwd)` 1. You can modify the `.sql` files and run `dbt run --profiles-dir=$(pwd) --project-dir=$(pwd)` too -1. You can inspect compiled dbt `.sql` files before they are run in the destination engine in `normalize/build/compiled` or `normalize/build/run` folders \ No newline at end of file +1. 
You can inspect compiled dbt `.sql` files before they are run in the destination engine in `normalize/build/compiled` or `normalize/build/run` folders diff --git a/airbyte-integrations/bases/base-normalization/integration_tests/resources/test_simple_streams/README.md b/airbyte-integrations/bases/base-normalization/integration_tests/resources/test_simple_streams/README.md index a5bb335e4b1..87e59f2f33e 100644 --- a/airbyte-integrations/bases/base-normalization/integration_tests/resources/test_simple_streams/README.md +++ b/airbyte-integrations/bases/base-normalization/integration_tests/resources/test_simple_streams/README.md @@ -1,9 +1,10 @@ # test_simple_streams -## Exchange Rate +## Exchange Rate This test suite is focusing on testing a simple stream (non-nested) of data similar to `source-exchangerates` using two different `destination_sync_modes`: + - `incremental` + `overwrite` with stream `exchange_rate` - `incremental` + `append_dedup` with stream `dedup_exchange_rate` diff --git a/airbyte-integrations/bases/base-normalization/setup/snowflake.md b/airbyte-integrations/bases/base-normalization/setup/snowflake.md index 2bf2bc58e51..b536c67950b 100644 --- a/airbyte-integrations/bases/base-normalization/setup/snowflake.md +++ b/airbyte-integrations/bases/base-normalization/setup/snowflake.md @@ -25,9 +25,10 @@ CREATE SCHEMA INTEGRATION_TEST_NORMALIZATION.TEST_SCHEMA; ``` If you ever need to start over, use this: + ```sql DROP DATABASE IF EXISTS INTEGRATION_TEST_NORMALIZATION; DROP USER IF EXISTS INTEGRATION_TEST_USER_NORMALIZATION; DROP ROLE IF EXISTS INTEGRATION_TESTER_NORMALIZATION; DROP WAREHOUSE IF EXISTS INTEGRATION_TEST_WAREHOUSE_NORMALIZATION; -``` \ No newline at end of file +``` diff --git a/airbyte-integrations/bases/connector-acceptance-test/CHANGELOG.md b/airbyte-integrations/bases/connector-acceptance-test/CHANGELOG.md index a245577563d..8d72333c0e7 100644 --- a/airbyte-integrations/bases/connector-acceptance-test/CHANGELOG.md +++ b/airbyte-integrations/bases/connector-acceptance-test/CHANGELOG.md @@ -1,42 +1,55 @@ # Changelog ## 3.7.0 + Add `validate_state_messages` to TestBasicRead.test_read:: Validate that all states contain neither legacy state emissions nor missing source stats in the state message. ## 3.6.0 + Relaxing CATs validation when a stream has a primary key defined. ## 3.5.0 + Add `validate_stream_statuses` to TestBasicRead.test_read:: Validate all statuses for all streams in the catalogs were emitted in correct order. ## 3.4.0 -Add TestConnectorDocumentation suite for validating connectors documentation structure and content. + +Add TestConnectorDocumentation suite for validating connectors documentation structure and content. ## 3.3.3 -Аix `NoAdditionalPropertiesValidator` if no type found in `items` + +Аix `NoAdditionalPropertiesValidator` if no type found in `items` ## 3.3.2 + Fix TestBasicRead.test_read.validate_schema: set `additionalProperties` to False recursively for objects. ## 3.3.1 -Fix TestSpec.test_oauth_is_default_method to skip connectors that doesn't have predicate_key object. + +Fix TestSpec.test_oauth_is_default_method to skip connectors that doesn't have predicate_key object. ## 3.3.0 + Add `test_certified_connector_has_allowed_hosts` and `test_certified_connector_has_suggested_streams` tests to the `connector_attribute` test suite ## 3.2.0 + Add TestBasicRead.test_all_supported_file_types_present, which validates that all supported file types are present in the sandbox account for certified file-based connectors. 
## 3.1.0 + Add TestSpec.test_oauth_is_default_method test with OAuth is default option validation. ## 3.0.1 + Upgrade to Dagger 0.9.6 ## 3.0.0 + Upgrade to Dagger 0.9.5 ## 2.2.0 + Add connector_attribute test suite and stream primary key validation ## 2.1.4 @@ -62,343 +75,455 @@ Support loading it from its Dagger container id for better performance. Install pytest-xdist to support running tests in parallel. ## 2.0.2 + Make `test_two_sequential_reads` handle namespace property in stream descriptor. ## 2.0.1 + Changing `format` or `airbyte_type` in a field definition of a schema or specification is now a breaking change. ## 2.0.0 + Update test_incremental.test_two_sequential_reads to be unaware of the contents of the state message. This is to support connectors that have a custom implementation of a cursor. ## 1.0.4 + Fix edge case in skip_backward_compatibility_tests_fixture on discovery: if the current config structure is not compatible with the previous connector version, the discovery command failing and the previous connector version catalog could not be retrieved. ## 1.0.3 + Add tests for display_type property ## 1.0.2 + Fix bug in skip_backward_compatibility_tests_fixture, the previous connector version could not be retrieved. ## 1.0.1 + Pin airbyte-protocol-model to <1.0.0. ## 1.0.0 + Bump to Python 3.10, use dagger instead of docker-py in the ConnectorRunner. ## 0.11.5 + Changing test output and adding diff to test_read ## 0.11.4 + Relax checking of `oneOf` common property and allow optional `default` keyword additional to `const` keyword. ## 0.11.3 + Refactor test_oauth_flow_parameters to validate advanced_auth instead of the deprecated authSpecification ## 0.11.2 + Do not enforce spec.json/spec.yaml ## 0.11.1 + Test connector image labels and make sure they are set correctly and match metadata.yaml. ## 0.11.0 + Add backward_compatibility.check_if_field_removed test to check if a field has been removed from the catalog. ## 0.10.8 + Increase the connection timeout to Docker client to 2 minutes ([context](https://github.com/airbytehq/airbyte/issues/27401)) ## 0.10.7 + Fix on supporting arrays in the state (ensure string are parsed as string and not int) ## 0.10.6 + Supporting arrays in the state by allowing ints in cursor_paths ## 0.10.5 + Skipping test_catalog_has_supported_data_types as it is failing on too many connectors. Will first address globally the type/format problems at scale and then re-enable it. ## 0.10.4 + Fixing bug: test_catalog_has_supported_data_types should support stream properties having `/` in it. ## 0.10.3 + Fixing bug: test_catalog_has_supported_data_types , integer is a supported airbyte type. ## 0.10.2 + Fixing bug: test_catalog_has_supported_data_types was failing when a connector stream property is named 'type'. ## 0.10.1 + Reverting to 0.9.0 as the latest version. 0.10.0 was released with a bug failing CAT on a couple of connectors. ## 0.10.0 + Discovery test: add validation that fails if the declared types/format/airbyte_types in the connector's streams properties are not [supported data types](https://docs.airbyte.com/understanding-airbyte/supported-data-types/) or if their combination is invalid. ## 0.9.0 + Basic read test: add validation that fails if undeclared columns are present in records. Add `fail_on_extra_fields` input parameter to ignore this failure if desired. ## 0.8.0 + Spec tests: Make sure grouping and ordering properties are used in a consistent way. 
## 0.7.2 + TestConnection: assert that a check with `exception` status emits a trace message. ## 0.7.1 + Discovery backward compatibility tests: handle errors on previous connectors catalog retrieval. Return None when the discovery failed. It should unblock the situation when tests fails even if you bypassed backward compatibility tests. ## 0.7.0 + Basic read test: add `ignored_fields`, change configuration format by adding optional `bypass_reason` [#22996](https://github.com/airbytehq/airbyte/pull/22996) ## 0.6.1 + Fix docker API - "Error" is optional. [#22987](https://github.com/airbytehq/airbyte/pull/22987) ## 0.6.0 + Allow passing custom environment variables to the connector under test. [#22937](https://github.com/airbytehq/airbyte/pull/22937). ## 0.5.3 + Spec tests: Make `oneOf` checks work for nested `oneOf`s. [#22395](https://github.com/airbytehq/airbyte/pull/22395) ## 0.5.2 + Check that `emitted_at` increases during subsequent reads. [#22291](https://github.com/airbytehq/airbyte/pull/22291) ## 0.5.1 + Fix discovered catalog caching for different configs. [#22301](https://github.com/airbytehq/airbyte/pull/22301) ## 0.5.0 + Re-release of 0.3.0 [#21451](https://github.com/airbytehq/airbyte/pull/21451) # Renamed image from `airbyte/source-acceptance-test` to `airbyte/connector-acceptance-test` - Older versions are only available under the old name ## 0.4.0 + Revert 0.3.0 ## 0.3.0 + (Broken) Add various stricter checks for specs (see PR for details). [#21451](https://github.com/airbytehq/airbyte/pull/21451) ## 0.2.26 + Check `future_state` only for incremental streams. [#21248](https://github.com/airbytehq/airbyte/pull/21248) ## 0.2.25 + Enable bypass reason for future state test config.[#20549](https://github.com/airbytehq/airbyte/pull/20549) ## 0.2.24 + Check for nullity of docker runner in `previous_discovered_catalog_fixture`.[#20899](https://github.com/airbytehq/airbyte/pull/20899) ## 0.2.23 + Skip backward compatibility tests on specifications if actual and previous specifications and discovered catalogs are identical.[#20435](https://github.com/airbytehq/airbyte/pull/20435) ## 0.2.22 + Capture control messages to store and use updated configurations. [#19979](https://github.com/airbytehq/airbyte/pull/19979). ## 0.2.21 + Optionally disable discovered catalog caching. [#19806](https://github.com/airbytehq/airbyte/pull/19806). ## 0.2.20 + Stricter integer field schema validation. [#19820](https://github.com/airbytehq/airbyte/pull/19820). ## 0.2.19 + Test for exposed secrets: const values can not hold secrets. [#19465](https://github.com/airbytehq/airbyte/pull/19465). ## 0.2.18 + Test connector specification against exposed secret fields. [#19124](https://github.com/airbytehq/airbyte/pull/19124). ## 0.2.17 + Make `incremental.future_state` mandatory in `high` `test_strictness_level`. [#19085](https://github.com/airbytehq/airbyte/pull/19085/). ## 0.2.16 + Run `basic_read` on the discovered catalog in `high` `test_strictness_level`. [#18937](https://github.com/airbytehq/airbyte/pull/18937). ## 0.2.15 + Make `expect_records` mandatory in `high` `test_strictness_level`. [#18497](https://github.com/airbytehq/airbyte/pull/18497/). ## 0.2.14 + Fail basic read in `high` `test_strictness_level` if no `bypass_reason` is set on empty_streams. [#18425](https://github.com/airbytehq/airbyte/pull/18425/). ## 0.2.13 + Fail tests in `high` `test_strictness_level` if all tests are not configured. [#18414](https://github.com/airbytehq/airbyte/pull/18414/). 
## 0.2.12 + Declare `bypass_reason` field in test configuration. [#18364](https://github.com/airbytehq/airbyte/pull/18364). ## 0.2.11 + Declare `test_strictness_level` field in test configuration. [#18218](https://github.com/airbytehq/airbyte/pull/18218). ## 0.2.10 + Bump `airbyte-cdk~=0.2.0` ## 0.2.9 + Update tests after protocol change making `supported_sync_modes` a required property of `AirbyteStream` [#15591](https://github.com/airbytehq/airbyte/pull/15591/) ## 0.2.8 + Make full refresh tests tolerant to new records in a sequential read.[#17660](https://github.com/airbytehq/airbyte/pull/17660/) ## 0.2.7 + Fix a bug when a state is evaluated once before used in a loop of `test_read_sequential_slices` [#17757](https://github.com/airbytehq/airbyte/pull/17757/) ## 0.2.6 + Backward compatibility hypothesis testing: disable "filtering too much" health check. [#17871](https://github.com/airbytehq/airbyte/pull/17871) ## 0.2.5 + Unit test `test_state_with_abnormally_large_values` to check state emission testing is working. [#17791](https://github.com/airbytehq/airbyte/pull/17791) ## 0.2.4 + Make incremental tests compatible with per stream states.[#16686](https://github.com/airbytehq/airbyte/pull/16686/) ## 0.2.3 + Backward compatibility tests: improve `check_if_type_of_type_field_changed` to make it less radical when validating specs and allow `'str' -> ['str', '']` type changes.[#16429](https://github.com/airbytehq/airbyte/pull/16429/) ## 0.2.2 + Backward compatibility tests: improve `check_if_cursor_field_was_changed` to make it less radical and allow stream addition to catalog.[#15835](https://github.com/airbytehq/airbyte/pull/15835/) ## 0.2.1 + Don't fail on updating `additionalProperties`: fix IndexError [#15532](https://github.com/airbytehq/airbyte/pull/15532/) ## 0.2.0 + Finish backward compatibility syntactic tests implementation: check that cursor fields were not changed. [#15520](https://github.com/airbytehq/airbyte/pull/15520/) ## 0.1.62 + Backward compatibility tests: add syntactic validation of catalogs [#15486](https://github.com/airbytehq/airbyte/pull/15486/) ## 0.1.61 + Add unit tests coverage computation [#15443](https://github.com/airbytehq/airbyte/pull/15443/). ## 0.1.60 + Backward compatibility tests: validate fake previous config against current connector specification. [#15367](https://github.com/airbytehq/airbyte/pull/15367) ## 0.1.59 + Backward compatibility tests: add syntactic validation of specs [#15194](https://github.com/airbytehq/airbyte/pull/15194/). ## 0.1.58 + Bootstrap spec backward compatibility tests. Add fixtures to retrieve a previous connector version spec [#14954](https://github.com/airbytehq/airbyte/pull/14954/). ## 0.1.57 + Run connector from its image `working_dir` instead of from `/data`. ## 0.1.56 + Add test case in `TestDiscovery` and `TestConnection` to assert `additionalProperties` fields are set to true if they are declared [#14878](https://github.com/airbytehq/airbyte/pull/14878/). ## 0.1.55 + Add test case in `TestDiscovery` to assert `supported_sync_modes` stream field in catalog is set and not empty. ## 0.1.54 + Fixed `AirbyteTraceMessage` test case to make connectors fail more reliably. ## 0.1.53 + Add more granular incremental testing that walks through syncs and verifies records according to cursor value. ## 0.1.52 + Add test case for `AirbyteTraceMessage` emission on connector failure: [#12796](https://github.com/airbytehq/airbyte/pull/12796/). ## 0.1.51 + - Add `threshold_days` option for lookback window support in incremental tests. 
- Update CDK to prevent warnings when encountering new `AirbyteTraceMessage`s. ## 0.1.50 + Added support for passing a `.yaml` file as `spec_path`. ## 0.1.49 + Fixed schema parsing when a JSONschema `type` was not present - we now assume `object` if the `type` is not present. ## 0.1.48 + Add checking that oneOf common property has only `const` keyword, no `default` and `enum` keywords: [#11704](https://github.com/airbytehq/airbyte/pull/11704) ## 0.1.47 + Added local test success message containing git hash: [#11497](https://github.com/airbytehq/airbyte/pull/11497) ## 0.1.46 + Fix `test_oneof_usage` test: [#9861](https://github.com/airbytehq/airbyte/pull/9861) ## 0.1.45 + Check for not allowed keywords `allOf`, `not` in connectors schema: [#9851](https://github.com/airbytehq/airbyte/pull/9851) ## 0.1.44 + Fix incorrect name of `primary_keys` attribute: [#9768](https://github.com/airbytehq/airbyte/pull/9768) ## 0.1.43 + `TestFullRefresh` test can compare records using PKs: [#9768](https://github.com/airbytehq/airbyte/pull/9768) ## 0.1.36 + Add assert that `spec.json` file does not have any `$ref` in it: [#8842](https://github.com/airbytehq/airbyte/pull/8842) ## 0.1.32 + Add info about skipped failed tests in `/test` command message on GitHub: [#8691](https://github.com/airbytehq/airbyte/pull/8691) ## 0.1.31 + Take `ConfiguredAirbyteCatalog` from discover command by default ## 0.1.30 + Validate if each field in a stream has appeared at least once in some record. ## 0.1.29 + Add assert that output catalog does not have any `$ref` in it ## 0.1.28 + Print stream name when incremental sync tests fail ## 0.1.27 + Add ignored fields for full refresh test (unit tests) ## 0.1.26 + Add ignored fields for full refresh test ## 0.1.25 + Fix incorrect nested structures compare. ## 0.1.24 + Improve message about errors in the stream's schema: [#6934](https://github.com/airbytehq/airbyte/pull/6934) ## 0.1.23 + Fix incorrect auth init flow check defect. ## 0.1.22 + Fix checking schemas with root `$ref` keyword ## 0.1.21 + Fix rootObject oauth init parameter check ## 0.1.20 + Add oauth init flow parameter verification for spec. ## 0.1.19 + Assert a non-empty overlap between the fields present in the record and the declared json schema. ## 0.1.18 + Fix checking date-time format against nullable field. ## 0.1.17 + Fix serialize function for acceptance-tests: [#5738](https://github.com/airbytehq/airbyte/pull/5738) ## 0.1.16 + Fix for flake8-ckeck for acceptance-tests: [#5785](https://github.com/airbytehq/airbyte/pull/5785) ## 0.1.15 + Add detailed logging for acceptance tests: [5392](https://github.com/airbytehq/airbyte/pull/5392) ## 0.1.14 + Fix for NULL datetime in MySQL format (i.e. `0000-00-00`): [#4465](https://github.com/airbytehq/airbyte/pull/4465) ## 0.1.13 + Replace `validate_output_from_all_streams` with `empty_streams` param: [#4897](https://github.com/airbytehq/airbyte/pull/4897) ## 0.1.12 + Improve error message when data mismatches schema: [#4753](https://github.com/airbytehq/airbyte/pull/4753) ## 0.1.11 + Fix error in the naming of method `test_match_expected` for class `TestSpec`. ## 0.1.10 + Add validation of input config.json against spec.json. ## 0.1.9 + Add configurable validation of schema for all records in BasicRead test: [#4345](https://github.com/airbytehq/airbyte/pull/4345) The validation is ON by default. To disable validation for the source you need to set `validate_schema: off` in the config file. 
## 0.1.8 + Fix cursor_path to support nested and absolute paths: [#4552](https://github.com/airbytehq/airbyte/pull/4552) ## 0.1.7 + Add: `test_spec` additionally checks if Dockerfile has `ENV AIRBYTE_ENTRYPOINT` defined and equal to space_joined `ENTRYPOINT` ## 0.1.6 + Add test whether PKs present and not None if `source_defined_primary_key` defined: [#4140](https://github.com/airbytehq/airbyte/pull/4140) ## 0.1.5 + Add configurable timeout for the acceptance tests: [#4296](https://github.com/airbytehq/airbyte/pull/4296) diff --git a/airbyte-integrations/bases/connector-acceptance-test/README.md b/airbyte-integrations/bases/connector-acceptance-test/README.md index 9ca4822380e..15937f7d80f 100644 --- a/airbyte-integrations/bases/connector-acceptance-test/README.md +++ b/airbyte-integrations/bases/connector-acceptance-test/README.md @@ -1,17 +1,21 @@ # Connector Acceptance Tests (CAT) + This package gathers multiple test suites to assess the sanity of any Airbyte connector. It is shipped as a [pytest](https://docs.pytest.org/en/7.1.x/) plugin and relies on pytest to discover, configure and execute tests. Test-specific documentation can be found [here](https://docs.airbyte.com/connector-development/testing-connectors/connector-acceptance-tests-reference/). ## Configuration + The acceptance tests are configured via the `acceptance-test-config.yml` YAML file, which is passed to the plugin via the `--acceptance-test-config` option. ## Running the acceptance tests locally + Note there are MANY ways to do this at this time, but we are working on consolidating them. Which method you choose to use depends on the context you are in. Pre-requisites: + - Setting up a Service Account for Google Secrets Manager (GSM) access. See [here](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/ci_credentials/README.md) - Ensuring that you have the `GCP_GSM_CREDENTIALS` environment variable set to the contents of your GSM service account key file. - [Poetry](https://python-poetry.org/docs/#installation) installed @@ -22,6 +26,7 @@ Pre-requisites: _Note: Install instructions for airbyte-ci are [here](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md) _ **This runs connector acceptance and other tests that run in our CI** + ```bash airbyte-ci connectors --name= test ``` @@ -66,15 +71,15 @@ poetry install poetry run pytest -p connector_acceptance_test.plugin --acceptance-test-config=../../connectors/source-faker --pdb ``` - ### Manually + 1. `cd` into your connector project (e.g. `airbyte-integrations/connectors/source-pokeapi`) 2. Edit `acceptance-test-config.yml` according to your need. Please refer to our [Connector Acceptance Test Reference](https://docs.airbyte.com/connector-development/testing-connectors/connector-acceptance-tests-reference/) if you need details about the available options. 3. Build the connector docker image ( e.g.: `airbyte-ci connectors --name=source-pokeapi build`) 4. Use one of the following ways to run tests (**from your connector project directory**) - ## Developing on the acceptance tests + You may want to iterate on the acceptance test project itself: adding new tests, fixing a bug etc. These iterations are more conveniently achieved by remaining in the current directory. @@ -82,14 +87,14 @@ These iterations are more conveniently achieved by remaining in the current dire 2. 
Run the unit tests on the acceptance tests themselves: `poetry run pytest unit_tests` (add the `--pdb` option if you want to enable the debugger on test failure) 3. To run specific unit test(s), add `-k` to the above command, e.g. `poetry run python -m pytest unit_tests -k 'test_property_can_store_secret'`. You can use wildcards `*` here as well. 4. Make the changes you want: - * Global pytest fixtures are defined in `./connector_acceptance_test/conftest.py` - * Existing test modules are defined in `./connector_acceptance_test/tests` - * `acceptance-test-config.yaml` structure is defined in `./connector_acceptance_test/config.py` + - Global pytest fixtures are defined in `./connector_acceptance_test/conftest.py` + - Existing test modules are defined in `./connector_acceptance_test/tests` + - `acceptance-test-config.yaml` structure is defined in `./connector_acceptance_test/config.py` 5. Unit test your changes by adding tests to `./unit_tests` 6. Run the unit tests on the acceptance tests again: `poetry run pytest unit_tests`, make sure the coverage did not decrease. You can bypass slow tests by using the `slow` marker: `poetry run pytest unit_tests -m "not slow"`. 7. Manually test the changes you made by running acceptance tests on a specific connector: - * First build the connector to ensure your local image is up-to-date: `airbyte-ci connectors --name=source-pokeapi build` - * Then run the acceptance tests on the connector: `poetry run pytest -p connector_acceptance_test.plugin --acceptance-test-config=../../connectors/source-pokeapi` + - First build the connector to ensure your local image is up-to-date: `airbyte-ci connectors --name=source-pokeapi build` + - Then run the acceptance tests on the connector: `poetry run pytest -p connector_acceptance_test.plugin --acceptance-test-config=../../connectors/source-pokeapi` 8. Make sure you updated `docs/connector-development/testing-connectors/connector-acceptance-tests-reference.md` according to your changes 9. Update the project changelog `airbyte-integrations/bases/connector-acceptance-test/CHANGELOG.md` 10. Open a PR on our GitHub repository @@ -98,8 +103,9 @@ These iterations are more conveniently achieved by remaining in the current dire 13. Merge your PR ## Migrating `acceptance-test-config.yml` to latest configuration format + We introduced changes in the structure of `acceptance-test-config.yml` files in version 0.2.12. -The *legacy* configuration format is still supported but should be deprecated soon. +The _legacy_ configuration format is still supported but should be deprecated soon. To migrate a legacy configuration to the latest configuration format please run: ```bash diff --git a/airbyte-integrations/connector-templates/README.md b/airbyte-integrations/connector-templates/README.md index 8fd7b6461e6..43c90164974 100644 --- a/airbyte-integrations/connector-templates/README.md +++ b/airbyte-integrations/connector-templates/README.md @@ -1,6 +1,6 @@ # Connector templates -This directory contains templates used to bootstrap developing new connectors, as well as a generator module which generates code using the templates as input. +This directory contains templates used to bootstrap developing new connectors, as well as a generator module which generates code using the templates as input. -See the `generator/` directory to get started writing a new connector. -Other directories contain templates used to bootstrap a connector. +See the `generator/` directory to get started writing a new connector. 
+Other directories contain templates used to bootstrap a connector. diff --git a/airbyte-integrations/connector-templates/generator/README.md b/airbyte-integrations/connector-templates/generator/README.md index 38f79f4c1bc..6605785f2b1 100644 --- a/airbyte-integrations/connector-templates/generator/README.md +++ b/airbyte-integrations/connector-templates/generator/README.md @@ -1,6 +1,6 @@ # Connector generator -This module generates code to bootstrap your connector development. +This module generates code to bootstrap your connector development. ## Getting started @@ -12,7 +12,8 @@ npm run generate ``` ### Using Docker -If you don't want to install `npm` you can run the generator using Docker: + +If you don't want to install `npm` you can run the generator using Docker: ``` ./generate.sh @@ -21,26 +22,27 @@ If you don't want to install `npm` you can run the generator using Docker: ## Contributions ### Testing connector templates -To test that the templates generate valid code, we follow a slightly non-obvious strategy. Since the templates -themselves do not contain valid Java/Python/etc.. syntax, we can't build them directly. -At the same time, due to the way Gradle works (where phase 1 is "discovering" all the projects that need to be + +To test that the templates generate valid code, we follow a slightly non-obvious strategy. Since the templates +themselves do not contain valid Java/Python/etc.. syntax, we can't build them directly. +At the same time, due to the way Gradle works (where phase 1 is "discovering" all the projects that need to be built and phase 2 is running the build), it's not very ergonomic to have one Gradle task generate a module -from each template, build it in the same build lifecycle, then remove it. +from each template, build it in the same build lifecycle, then remove it. -So we use the following strategy: +So we use the following strategy: -1. Locally, generate an empty connector using the generator module (call the generated connector something like `java-jdbc-scaffolding`) +1. Locally, generate an empty connector using the generator module (call the generated connector something like `java-jdbc-scaffolding`) 1. Check the generated module into source control -Then, [in CI](https://github.com/airbytehq/airbyte/blob/master/.github/workflows/gradle.yml), we test two invariants: +Then, [in CI](https://github.com/airbytehq/airbyte/blob/master/.github/workflows/gradle.yml), we test two invariants: 1. There is no diff between the checked in module, and a module generated during using the latest version of the templater 1. The checked in module builds successfully -Together, these two invariants guarantee that the templates produce a valid module. +Together, these two invariants guarantee that the templates produce a valid module. -The way this is performed is as follows: +The way this is performed is as follows: -1. [in CI ](https://github.com/airbytehq/airbyte/blob/master/.github/workflows/gradle.yml) we trigger the task `:airbyte-integrations:connector-templates:generator:generateScaffolds`. This task deletes the checked in `java-jdbc-scaffolding`. Then the task generates a fresh instance of the module with the same name `java-jdbc-scaffolding`. -1. We run a `git diff`. If there is a diff, then fail the build (this means the latest version of the templates produce code which has not been manually reviewed by someone who checked them in intentionally). Steps 1 & 2 test the first invariant. -1. 
Separately, in `settings.gradle`, the `java-jdbc-scaffolding` module is registered as a java submodule. This causes it to be built as part of the normal build cycle triggered in CI. If the generated code does not compile for whatever reason, the build will fail on building the `java-jdbc-scaffolding` module. +1. [in CI ](https://github.com/airbytehq/airbyte/blob/master/.github/workflows/gradle.yml) we trigger the task `:airbyte-integrations:connector-templates:generator:generateScaffolds`. This task deletes the checked in `java-jdbc-scaffolding`. Then the task generates a fresh instance of the module with the same name `java-jdbc-scaffolding`. +1. We run a `git diff`. If there is a diff, then fail the build (this means the latest version of the templates produce code which has not been manually reviewed by someone who checked them in intentionally). Steps 1 & 2 test the first invariant. +1. Separately, in `settings.gradle`, the `java-jdbc-scaffolding` module is registered as a java submodule. This causes it to be built as part of the normal build cycle triggered in CI. If the generated code does not compile for whatever reason, the build will fail on building the `java-jdbc-scaffolding` module. diff --git a/airbyte-integrations/connector-templates/source-python/source_{{snakeCase name}}/schemas/TODO.md b/airbyte-integrations/connector-templates/source-python/source_{{snakeCase name}}/schemas/TODO.md index cf1efadb3c9..0037aeb60d8 100644 --- a/airbyte-integrations/connector-templates/source-python/source_{{snakeCase name}}/schemas/TODO.md +++ b/airbyte-integrations/connector-templates/source-python/source_{{snakeCase name}}/schemas/TODO.md @@ -1,20 +1,25 @@ # TODO: Define your stream schemas -Your connector must describe the schema of each stream it can output using [JSONSchema](https://json-schema.org). -The simplest way to do this is to describe the schema of your streams using one `.json` file per stream. You can also dynamically generate the schema of your stream in code, or you can combine both approaches: start with a `.json` file and dynamically add properties to it. - +Your connector must describe the schema of each stream it can output using [JSONSchema](https://json-schema.org). + +The simplest way to do this is to describe the schema of your streams using one `.json` file per stream. You can also dynamically generate the schema of your stream in code, or you can combine both approaches: start with a `.json` file and dynamically add properties to it. + The schema of a stream is the return value of `Stream.get_json_schema`. - + ## Static schemas + By default, `Stream.get_json_schema` reads a `.json` file in the `schemas/` directory whose name is equal to the value of the `Stream.name` property. In turn `Stream.name` by default returns the name of the class in snake case. Therefore, if you have a class `class EmployeeBenefits(HttpStream)` the default behavior will look for a file called `schemas/employee_benefits.json`. You can override any of these behaviors as you need. Important note: any objects referenced via `$ref` should be placed in the `shared/` directory in their own `.json` files. - + ## Dynamic schemas + If you'd rather define your schema in code, override `Stream.get_json_schema` in your stream class to return a `dict` describing the schema using [JSONSchema](https://json-schema.org). 
-## Dynamically modifying static schemas -Override `Stream.get_json_schema` to run the default behavior, edit the returned value, then return the edited value: +## Dynamically modifying static schemas + +Override `Stream.get_json_schema` to run the default behavior, edit the returned value, then return the edited value: + ``` def get_json_schema(self): schema = super().get_json_schema() @@ -22,4 +27,4 @@ def get_json_schema(self): return schema ``` -Delete this file once you're done. Or don't. Up to you :) +Delete this file once you're done. Or don't. Up to you :) diff --git a/airbyte-integrations/connectors-performance/destination-harness/README.md b/airbyte-integrations/connectors-performance/destination-harness/README.md index a3c4d09c47d..c24f8c96154 100644 --- a/airbyte-integrations/connectors-performance/destination-harness/README.md +++ b/airbyte-integrations/connectors-performance/destination-harness/README.md @@ -6,6 +6,7 @@ This component is used by the `/connector-performance` GitHub action and is used destination connectors on a number of datasets. Associated files are: +
  • Main.java - the main entrypoint for the harness
  • PerformanceTest.java - sets up the destination connector, sends records to it, and measures throughput
  • run-harness-process.yaml - kubernetes file that processes dynamic arguments and runs the harness diff --git a/airbyte-integrations/connectors-performance/source-harness/README.md b/airbyte-integrations/connectors-performance/source-harness/README.md index 6b61bcb4d7d..fc8a0a8fdcc 100644 --- a/airbyte-integrations/connectors-performance/source-harness/README.md +++ b/airbyte-integrations/connectors-performance/source-harness/README.md @@ -2,5 +2,5 @@ Performance harness for source connectors. -This component is used by the `/connector-performance` GitHub action and is used in order to test throughput of +This component is used by the `/connector-performance` GitHub action and is used in order to test throughput of source connectors on a number of datasets. diff --git a/airbyte-integrations/connectors/destination-amazon-sqs/README.md b/airbyte-integrations/connectors/destination-amazon-sqs/README.md index 2856f60b1ae..05857ae25ff 100644 --- a/airbyte-integrations/connectors/destination-amazon-sqs/README.md +++ b/airbyte-integrations/connectors/destination-amazon-sqs/README.md @@ -6,22 +6,27 @@ For information about how to use this connector within Airbyte, see [the documen ## Local development ### Prerequisites + **To iterate on this connector, make sure to complete this prerequisites section.** #### Minimum Python version required `= 3.7.0` #### Build & Activate Virtual Environment and install dependencies + From this connector directory, create a virtual environment: + ``` python -m venv .venv ``` This will generate a virtualenv for this module in `.venv/`. Make sure this venv is active in your development environment of choice. To activate it from the terminal, run: + ``` source .venv/bin/activate pip install -r requirements.txt ``` + If you are in an IDE, follow your IDE's instructions to activate the virtualenv. Note that while we are installing dependencies from `requirements.txt`, you should only edit `setup.py` for your dependencies. `requirements.txt` is @@ -30,6 +35,7 @@ If this is mumbo jumbo to you, don't worry about it, just put your deps in `setu should work as you expect. #### Create credentials + **If you are a community contributor**, follow the instructions in the [documentation](https://docs.airbyte.io/integrations/destinations/amazon-sqs) to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `destination_amazon_sqs/spec.json` file. Note that the `secrets` directory is gitignored by default, so there is no danger of accidentally checking in sensitive information. @@ -39,6 +45,7 @@ See `integration_tests/sample_config.json` for a sample config file. and place them into `secrets/config.json`. ### Locally running the connector + ``` python main.py spec python main.py check --config secrets/config.json @@ -48,9 +55,10 @@ python main.py read --config secrets/config.json --catalog integration_tests/con ### Locally running the connector docker image - #### Build + **Via [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md) (recommended):** + ```bash airbyte-ci connectors --name=destination-amazon-sqs build ``` @@ -58,12 +66,15 @@ airbyte-ci connectors --name=destination-amazon-sqs build An image will be built with the tag `airbyte/destination-amazon-sqs:dev`. **Via `docker build`:** + ```bash docker build -t airbyte/destination-amazon-sqs:dev . 
``` #### Run + Then run any of the connector commands as follows: + ``` docker run --rm airbyte/destination-amazon-sqs:dev spec docker run --rm -v $(pwd)/secrets:/secrets airbyte/destination-amazon-sqs:dev check --config /secrets/config.json @@ -72,23 +83,30 @@ cat messages.jsonl | docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integr ``` ## Testing + You can run our full test suite locally using [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md): + ```bash airbyte-ci connectors --name=destination-amazon-sqs test ``` ### Customizing acceptance Tests + Customize `acceptance-test-config.yml` file to configure tests. See [Connector Acceptance Tests](https://docs.airbyte.com/connector-development/testing-connectors/connector-acceptance-tests-reference) for more information. If your connector requires to create or destroy resources for use during acceptance tests create fixtures for it and place them inside integration_tests/acceptance.py. ## Dependency Management + All of your dependencies should go in `setup.py`, NOT `requirements.txt`. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development. We split dependencies between two groups, dependencies that are: -* required for your connector to work need to go to `MAIN_REQUIREMENTS` list. -* required for the testing need to go to `TEST_REQUIREMENTS` list + +- required for your connector to work need to go to `MAIN_REQUIREMENTS` list. +- required for the testing need to go to `TEST_REQUIREMENTS` list ### Publishing a new version of the connector + You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? + 1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=destination-amazon-sqs test` 2. Bump the connector version in `metadata.yaml`: increment the `dockerImageTag` value. Please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors). 3. Make sure the `metadata.yaml` content is up to date. @@ -96,4 +114,3 @@ You've checked out the repo, implemented a million dollar feature, and you're re 5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention). 6. Pat yourself on the back for being an awesome contributor. 7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. - diff --git a/airbyte-integrations/connectors/destination-amazon-sqs/bootstrap.md b/airbyte-integrations/connectors/destination-amazon-sqs/bootstrap.md index ce91ec1ef14..6e13b9920ce 100644 --- a/airbyte-integrations/connectors/destination-amazon-sqs/bootstrap.md +++ b/airbyte-integrations/connectors/destination-amazon-sqs/bootstrap.md @@ -1,24 +1,29 @@ # Amazon SQS Destination ## What + This is a connector for producing messages to an [Amazon SQS Queue](https://docs.aws.amazon.com/AWSSimpleQueueService/latest/SQSDeveloperGuide/welcome.html) ## How + ### Sending messages -Amazon SQS allows messages to be sent individually or in batches. Currently, this Destination only supports sending messages individually. This can + +Amazon SQS allows messages to be sent individually or in batches. Currently, this Destination only supports sending messages individually. 
This can have performance implications if sending high volumes of messages. #### Message Body + By default, the SQS Message body is built using the AirbyteMessageRecord's 'data' property. -If the **message_body_key** config item is set, we use the value as a key within the AirbyteMessageRecord's 'data' property. This could be +If the **message_body_key** config item is set, we use the value as a key within the AirbyteMessageRecord's 'data' property. This could be improved to handle nested keys by using JSONPath syntax to lookup values. For example, given the input Record: + ``` -{ - "data": - { +{ + "data": + { "parent_key": { "nested_key": "nested_value" }, @@ -28,8 +33,9 @@ For example, given the input Record: ``` With no **message_body_key** set, the output SQS Message body will be + ``` -{ +{ "parent_key": { "nested_key": "nested_value" }, @@ -38,6 +44,7 @@ With no **message_body_key** set, the output SQS Message body will be ``` With **message_body_key** set to `parent_key`, the output SQS Message body will be + ``` { "nested_key": "nested_value" @@ -45,15 +52,18 @@ With **message_body_key** set to `parent_key`, the output SQS Message body will ``` #### Message attributes + The airbyte_emmited_at timestamp is added to every message as an Attribute by default. This could be improved to allow the user to set Attributes through the UI, or to take keys from the Record as Attributes. #### FIFO Queues -A Queue URL that ends with '.fifo' **must** be a valid FIFO Queue. When the queue is FIFO, the *message_group_id* property is required. + +A Queue URL that ends with '.fifo' **must** be a valid FIFO Queue. When the queue is FIFO, the _message_group_id_ property is required. Currently, a unique uuid4 is generated as the dedupe ID for every message. This could be improved to allow the user to specify a path in the Record to use as a dedupe ID. ### Credentials + Requires an AWS IAM Access Key ID and Secret Key. This could be improved to add support for configured AWS profiles, env vars etc. diff --git a/airbyte-integrations/connectors/destination-astra/README.md b/airbyte-integrations/connectors/destination-astra/README.md index 94fea87af40..18174297ae7 100644 --- a/airbyte-integrations/connectors/destination-astra/README.md +++ b/airbyte-integrations/connectors/destination-astra/README.md @@ -6,18 +6,21 @@ For information about how to use this connector within Airbyte, see [the documen ## Local development ### Prerequisites + **To iterate on this connector, make sure to complete this prerequisites section.** #### Minimum Python version required `= 3.9.0` ### Installing the connector + From this connector directory, run: + ```bash poetry install --with dev ``` - #### Create credentials + **If you are a community contributor**, follow the instructions in the [documentation](https://docs.airbyte.com/integrations/destinations/astra) to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `destination_astra/spec.json` file. Note that the `secrets` directory is gitignored by default, so there is no danger of accidentally checking in sensitive information. @@ -27,6 +30,7 @@ See `integration_tests/sample_config.json` for a sample config file. and place them into `secrets/config.json`. 
### Locally running the connector + ``` python main.py spec python main.py check --config secrets/config.json @@ -36,6 +40,7 @@ python main.py write --config secrets/config.json --catalog integration_tests/co ### Locally running the connector docker image #### Use `airbyte-ci` to build your connector + The Airbyte way of building this connector is to use our `airbyte-ci` tool. You can follow install instructions [here](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md#L1). Then running the following command will build your connector: @@ -43,15 +48,18 @@ Then running the following command will build your connector: ```bash airbyte-ci connectors --name destination-astra build ``` + Once the command is done, you will find your connector image in your local docker registry: `airbyte/destination-astra:dev`. ##### Customizing our build process + When contributing on our connector you might need to customize the build process to add a system dependency or set an env var. You can customize our build process by adding a `build_customization.py` module to your connector. This module should contain a `pre_connector_install` and `post_connector_install` async function that will mutate the base image and the connector container respectively. It will be imported at runtime by our build process and the functions will be called if they exist. Here is an example of a `build_customization.py` module: + ```python from __future__ import annotations @@ -71,6 +79,7 @@ async def post_connector_install(connector_container: Container) -> Container: ``` #### Build your own connector image + This connector is built using our dynamic built process in `airbyte-ci`. The base image used to build it is defined within the metadata.yaml file under the `connectorBuildOptions`. The build logic is defined using [Dagger](https://dagger.io/) [here](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/pipelines/builds/python_connectors.py). @@ -79,6 +88,7 @@ It does not rely on a Dockerfile. If you would like to patch our connector and build your own a simple approach would be to: 1. Create your own Dockerfile based on the latest version of the connector image. + ```Dockerfile FROM airbyte/destination-astra:latest @@ -89,16 +99,21 @@ RUN pip install ./airbyte/integration_code # ENV AIRBYTE_ENTRYPOINT "python /airbyte/integration_code/main.py" # ENTRYPOINT ["python", "/airbyte/integration_code/main.py"] ``` + Please use this as an example. This is not optimized. 2. Build your image: + ```bash docker build -t airbyte/destination-astra:dev . 
# Running the spec command against your patched connector docker run airbyte/destination-astra:dev spec -```` +``` + #### Run + Then run any of the connector commands as follows: + ``` docker run --rm airbyte/destination-astra:dev spec docker run --rm -v $(pwd)/secrets:/secrets airbyte/destination-astra:dev check --config /secrets/config.json @@ -112,7 +127,9 @@ cat messages.jsonl | docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integr ### Unit Tests To run unit tests locally, from the connector directory run: ``` + poetry run pytest -s unit_tests + ``` ### Integration Tests @@ -120,10 +137,12 @@ There are two types of integration tests: Acceptance Tests (Airbyte's test suite #### Custom Integration tests Place custom tests inside `integration_tests/` folder, then, from the connector root, run ``` + poetry run pytest -s integration_tests + ``` #### Acceptance Tests -Coming soon: +Coming soon: ### Using `airbyte-ci` to run tests See [airbyte-ci documentation](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md#connectors-test-command) @@ -141,3 +160,4 @@ You've checked out the repo, implemented a million dollar feature, and you're re 1. Create a Pull Request. 1. Pat yourself on the back for being an awesome contributor. 1. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. +``` diff --git a/airbyte-integrations/connectors/destination-aws-datalake/README.md b/airbyte-integrations/connectors/destination-aws-datalake/README.md index 72fe3deb31c..ceb77ddc219 100644 --- a/airbyte-integrations/connectors/destination-aws-datalake/README.md +++ b/airbyte-integrations/connectors/destination-aws-datalake/README.md @@ -55,9 +55,10 @@ python main.py read --config secrets/config.json --catalog integration_tests/con ### Locally running the connector docker image - #### Build + **Via [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md) (recommended):** + ```bash airbyte-ci connectors --name=destination-aws-datalake build ``` @@ -65,6 +66,7 @@ airbyte-ci connectors --name=destination-aws-datalake build An image will be built with the tag `airbyte/destination-aws-datalake:dev`. **Via `docker build`:** + ```bash docker build -t airbyte/destination-aws-datalake:dev . ``` @@ -80,14 +82,16 @@ docker run --rm -v $(pwd)/secrets:/secrets airbyte/destination-aws-datalake:dev cat messages.jsonl | docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integration_tests airbyte/destination-aws-datalake:dev write --config /secrets/config.json --catalog /integration_tests/configured_catalog.json ``` - ## Testing + You can run our full test suite locally using [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md): + ```bash airbyte-ci connectors --name=destination-aws-datalake test ``` ### Customizing acceptance Tests + Customize `acceptance-test-config.yml` file to configure tests. See [Connector Acceptance Tests](https://docs.airbyte.com/connector-development/testing-connectors/connector-acceptance-tests-reference) for more information. If your connector requires to create or destroy resources for use during acceptance tests create fixtures for it and place them inside integration_tests/acceptance.py. @@ -97,11 +101,13 @@ All of your dependencies should go in `setup.py`, NOT `requirements.txt`. 
The re We split dependencies between two groups, dependencies that are: -* required for your connector to work need to go to `MAIN_REQUIREMENTS` list. -* required for the testing need to go to `TEST_REQUIREMENTS` list +- required for your connector to work need to go to `MAIN_REQUIREMENTS` list. +- required for the testing need to go to `TEST_REQUIREMENTS` list ### Publishing a new version of the connector + You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? + 1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=destination-aws-datalake test` 2. Bump the connector version in `metadata.yaml`: increment the `dockerImageTag` value. Please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors). 3. Make sure the `metadata.yaml` content is up to date. @@ -109,4 +115,3 @@ You've checked out the repo, implemented a million dollar feature, and you're re 5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention). 6. Pat yourself on the back for being an awesome contributor. 7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. - diff --git a/airbyte-integrations/connectors/destination-azure-blob-storage/README.md b/airbyte-integrations/connectors/destination-azure-blob-storage/README.md index 9c5e25ec868..67a200236c7 100644 --- a/airbyte-integrations/connectors/destination-azure-blob-storage/README.md +++ b/airbyte-integrations/connectors/destination-azure-blob-storage/README.md @@ -13,21 +13,24 @@ As a community contributor, you will need access to Azure to run the integration - Feel free to modify the config files with different settings in the acceptance test file (e.g. `AzureBlobStorageJsonlDestinationAcceptanceTest.java`, method `getFormatConfig`), as long as they follow the schema defined in [spec.json](src/main/resources/spec.json). ## Airbyte Employee + - Access the `Azure Blob Storage Account` secrets on Last Pass. - Replace the `config.json` under `sample_secrets`. - Rename the directory from `sample_secrets` to `secrets`. ### Infra setup + 1. Log in to the [Azure portal](https://portal.azure.com/#home) using the `integration-test@airbyte.io` account 1. Go to [Storage Accounts](https://portal.azure.com/#view/HubsExtension/BrowseResource/resourceType/Microsoft.Storage%2FStorageAccounts) 1. Create a new storage account with a reasonable name (currently `airbyteteststorage`), under the `integration-test-rg` resource group. - 1. In the `Redundancy` setting, choose `Locally-redundant storage (LRS)`. - 1. Hit `Review` (you can leave all the other settings as the default) and then `Create`. +1. In the `Redundancy` setting, choose `Locally-redundant storage (LRS)`. +1. Hit `Review` (you can leave all the other settings as the default) and then `Create`. 1. Navigate into that storage account -> `Containers`. Make a new container with a reasonable name (currently `airbytetescontainername`). 1. Then go back up to the storage account -> `Access keys`. This is the `azure_blob_storage_account_key` config field. - 1. There are two keys; use the first one. We don't need 100% uptime on our integration tests, so there's no need to alternate between the two keys. +1. There are two keys; use the first one. 
We don't need 100% uptime on our integration tests, so there's no need to alternate between the two keys. ## Add New Output Format + - Add a new enum in `AzureBlobStorageFormat'. - Modify `spec.json` to specify the configuration of this new format. - Update `AzureBlobStorageFormatConfigs` to be able to construct a config for this new format. diff --git a/airbyte-integrations/connectors/destination-bigquery/README.md b/airbyte-integrations/connectors/destination-bigquery/README.md index f911b3a4541..3689307d679 100644 --- a/airbyte-integrations/connectors/destination-bigquery/README.md +++ b/airbyte-integrations/connectors/destination-bigquery/README.md @@ -1,12 +1,15 @@ ## Local development #### Building via Gradle + From the Airbyte repository root, run: + ``` ./gradlew :airbyte-integrations:connectors:destination-bigquery:build ``` #### Create credentials + **If you are a community contributor**, generate the necessary credentials and place them in `secrets/config.json` conforming to the spec file in `src/main/resources/spec.json`. Note that the `secrets` directory is git-ignored by default, so there is no danger of accidentally checking in sensitive information. @@ -15,16 +18,20 @@ Note that the `secrets` directory is git-ignored by default, so there is no dang ### Locally running the connector docker image #### Build + Build the connector image via Gradle: ``` ./gradlew :airbyte-integrations:connectors:destination-bigquery:buildConnectorImage ``` + Once built, the docker image name and tag on your host will be `airbyte/destination-bigquery:dev`. the Dockerfile. #### Run + Then run any of the connector commands as follows: + ``` docker run --rm airbyte/destination-bigquery:dev spec docker run --rm -v $(pwd)/secrets:/secrets airbyte/destination-bigquery:dev check --config /secrets/config.json @@ -33,22 +40,29 @@ docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integrat ``` ## Testing + We use `JUnit` for Java tests. ### Unit and Integration Tests + Place unit tests under `src/test/io/airbyte/integrations/destinations/bigquery`. #### Acceptance Tests + Airbyte has a standard test suite that all destination connectors must pass. Implement the `TODO`s in `src/test-integration/java/io/airbyte/integrations/destinations/BigQueryDestinationAcceptanceTest.java`. ### Using gradle to run tests + All commands should be run from airbyte project root. To run unit tests: + ``` ./gradlew :airbyte-integrations:connectors:destination-bigquery:unitTest ``` + To run acceptance and custom integration tests: + ``` ./gradlew :airbyte-integrations:connectors:destination-bigquery:integrationTest ``` @@ -56,7 +70,9 @@ To run acceptance and custom integration tests: ## Dependency Management ### Publishing a new version of the connector + You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? + 1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=destination-bigquery test` 2. Bump the connector version in `metadata.yaml`: increment the `dockerImageTag` value. Please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors). 3. Make sure the `metadata.yaml` content is up to date. @@ -64,4 +80,3 @@ You've checked out the repo, implemented a million dollar feature, and you're re 5. 
Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention). 6. Pat yourself on the back for being an awesome contributor. 7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. - diff --git a/airbyte-integrations/connectors/destination-chroma/README.md b/airbyte-integrations/connectors/destination-chroma/README.md index eb27467da10..ae1157cc2e8 100644 --- a/airbyte-integrations/connectors/destination-chroma/README.md +++ b/airbyte-integrations/connectors/destination-chroma/README.md @@ -6,17 +6,21 @@ For information about how to use this connector within Airbyte, see [the documen ## Local development ### Prerequisites + **To iterate on this connector, make sure to complete this prerequisites section.** #### Minimum Python version required `= 3.7.0` ### Installing the connector + From this connector directory, run: + ```bash poetry install --with dev ``` #### Create credentials + **If you are a community contributor**, follow the instructions in the [documentation](https://docs.airbyte.com/integrations/destinations/chroma) to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `destination_chroma/spec.json` file. Note that the `secrets` directory is gitignored by default, so there is no danger of accidentally checking in sensitive information. @@ -26,6 +30,7 @@ See `integration_tests/sample_config.json` for a sample config file. and place them into `secrets/config.json`. ### Locally running the connector + ``` python main.py spec python main.py check --config secrets/config.json @@ -34,9 +39,10 @@ python main.py write --config secrets/config.json --catalog integration_tests/co ### Locally running the connector docker image - #### Build + **Via [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md) (recommended):** + ```bash airbyte-ci connectors --name=destination-chroma build ``` @@ -44,12 +50,15 @@ airbyte-ci connectors --name=destination-chroma build An image will be built with the tag `airbyte/destination-chroma:dev`. **Via `docker build`:** + ```bash docker build -t airbyte/destination-chroma:dev . ``` #### Run + Then run any of the connector commands as follows: + ``` docker run --rm airbyte/destination-chroma:dev spec docker run --rm -v $(pwd)/secrets:/secrets airbyte/destination-chroma:dev check --config /secrets/config.json @@ -58,35 +67,46 @@ cat messages.jsonl | docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integr ``` ## Testing + You can run our full test suite locally using [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md): + ```bash airbyte-ci connectors --name=destination-chroma test ``` ### Unit Tests + To run unit tests locally, from the connector directory run: + ``` poetry run pytest -s unit_tests ``` ### Integration Tests + To run integration tests locally, make sure you have a secrets/config.json as explained above, and then run: + ``` poetry run pytest -s integration_tests -``` +``` ### Customizing acceptance Tests + Customize `acceptance-test-config.yml` file to configure tests. See [Connector Acceptance Tests](https://docs.airbyte.com/connector-development/testing-connectors/connector-acceptance-tests-reference) for more information. 
If your connector requires to create or destroy resources for use during acceptance tests create fixtures for it and place them inside integration_tests/acceptance.py. ## Dependency Management + All of your dependencies should go in `setup.py`, NOT `requirements.txt`. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development. We split dependencies between two groups, dependencies that are: -* required for your connector to work need to go to `MAIN_REQUIREMENTS` list. -* required for the testing need to go to `TEST_REQUIREMENTS` list + +- required for your connector to work need to go to `MAIN_REQUIREMENTS` list. +- required for the testing need to go to `TEST_REQUIREMENTS` list ### Publishing a new version of the connector + You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? + 1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=destination-chroma test` 2. Bump the connector version in `metadata.yaml`: increment the `dockerImageTag` value. Please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors). 3. Make sure the `metadata.yaml` content is up to date. @@ -94,4 +114,3 @@ You've checked out the repo, implemented a million dollar feature, and you're re 5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention). 6. Pat yourself on the back for being an awesome contributor. 7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. - diff --git a/airbyte-integrations/connectors/destination-clickhouse/README.md b/airbyte-integrations/connectors/destination-clickhouse/README.md index 5ce61a36118..4a97f34009e 100644 --- a/airbyte-integrations/connectors/destination-clickhouse/README.md +++ b/airbyte-integrations/connectors/destination-clickhouse/README.md @@ -6,12 +6,15 @@ For information about how to use this connector within Airbyte, see [the User Do ## Local development #### Building via Gradle + From the Airbyte repository root, run: + ``` ./gradlew :airbyte-integrations:connectors:destination-clickhouse:build ``` #### Create credentials + **If you are a community contributor**, generate the necessary credentials and place them in `secrets/config.json` conforming to the spec file in `src/main/resources/spec.json`. Note that the `secrets` directory is git-ignored by default, so there is no danger of accidentally checking in sensitive information. @@ -20,16 +23,20 @@ Note that the `secrets` directory is git-ignored by default, so there is no dang ### Locally running the connector docker image #### Build + Build the connector image via Gradle: ``` ./gradlew :airbyte-integrations:connectors:destination-clickhouse:buildConnectorImage ``` + Once built, the docker image name and tag on your host will be `airbyte/destination-clickhouse:dev`. the Dockerfile. #### Run + Then run any of the connector commands as follows: + ``` docker run --rm airbyte/destination-clickhouse:dev spec docker run --rm -v $(pwd)/secrets:/secrets airbyte/destination-clickhouse:dev check --config /secrets/config.json @@ -38,22 +45,29 @@ docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integrat ``` ## Testing + We use `JUnit` for Java tests. 
### Unit and Integration Tests + Place unit tests under `src/test/io/airbyte/integrations/destinations/clickhouse`. #### Acceptance Tests + Airbyte has a standard test suite that all destination connectors must pass. Implement the `TODO`s in `src/test-integration/java/io/airbyte/integrations/destinations/clickhouseDestinationAcceptanceTest.java`. ### Using gradle to run tests + All commands should be run from airbyte project root. To run unit tests: + ``` ./gradlew :airbyte-integrations:connectors:destination-clickhouse:unitTest ``` + To run acceptance and custom integration tests: + ``` ./gradlew :airbyte-integrations:connectors:destination-clickhouse:integrationTest ``` @@ -61,7 +75,9 @@ To run acceptance and custom integration tests: ## Dependency Management ### Publishing a new version of the connector + You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? + 1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=destination-clickhouse test` 2. Bump the connector version in `metadata.yaml`: increment the `dockerImageTag` value. Please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors). 3. Make sure the `metadata.yaml` content is up to date. @@ -69,4 +85,3 @@ You've checked out the repo, implemented a million dollar feature, and you're re 5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention). 6. Pat yourself on the back for being an awesome contributor. 7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. - diff --git a/airbyte-integrations/connectors/destination-clickhouse/bootstrap.md b/airbyte-integrations/connectors/destination-clickhouse/bootstrap.md index c728bde55a2..c1463730f51 100644 --- a/airbyte-integrations/connectors/destination-clickhouse/bootstrap.md +++ b/airbyte-integrations/connectors/destination-clickhouse/bootstrap.md @@ -15,4 +15,3 @@ This destination connector uses ClickHouse official JDBC driver, which uses HTTP ## API Reference The ClickHouse reference documents: [https://clickhouse.com/docs/en/](https://clickhouse.com/docs/en/) - diff --git a/airbyte-integrations/connectors/destination-convex/README.md b/airbyte-integrations/connectors/destination-convex/README.md index dc91b1ed511..19e0cccba5d 100644 --- a/airbyte-integrations/connectors/destination-convex/README.md +++ b/airbyte-integrations/connectors/destination-convex/README.md @@ -54,9 +54,10 @@ python main.py write --config secrets/config.json --catalog integration_tests/co ### Locally running the connector docker image - #### Build + **Via [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md) (recommended):** + ```bash airbyte-ci connectors --name=destination-convex build ``` @@ -64,6 +65,7 @@ airbyte-ci connectors --name=destination-convex build An image will be built with the tag `airbyte/destination-convex:dev`. **Via `docker build`:** + ```bash docker build -t airbyte/destination-convex:dev . 
``` @@ -79,14 +81,16 @@ docker run --rm -v $(pwd)/secrets:/secrets airbyte/destination-convex:dev check cat messages.jsonl | docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integration_tests airbyte/destination-convex:dev write --config /secrets/config.json --catalog /integration_tests/configured_catalog.json ``` - ## Testing + You can run our full test suite locally using [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md): + ```bash airbyte-ci connectors --name=destination-convex test ``` ### Customizing acceptance Tests + Customize `acceptance-test-config.yml` file to configure tests. See [Connector Acceptance Tests](https://docs.airbyte.com/connector-development/testing-connectors/connector-acceptance-tests-reference) for more information. If your connector requires to create or destroy resources for use during acceptance tests create fixtures for it and place them inside integration_tests/acceptance.py. @@ -99,7 +103,9 @@ We split dependencies between two groups, dependencies that are: - required for the testing need to go to `TEST_REQUIREMENTS` list ### Publishing a new version of the connector + You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? + 1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=destination-convex test` 2. Bump the connector version in `metadata.yaml`: increment the `dockerImageTag` value. Please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors). 3. Make sure the `metadata.yaml` content is up to date. @@ -107,4 +113,3 @@ You've checked out the repo, implemented a million dollar feature, and you're re 5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention). 6. Pat yourself on the back for being an awesome contributor. 7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. - diff --git a/airbyte-integrations/connectors/destination-cumulio/README.md b/airbyte-integrations/connectors/destination-cumulio/README.md index 62261106b05..9372535c849 100644 --- a/airbyte-integrations/connectors/destination-cumulio/README.md +++ b/airbyte-integrations/connectors/destination-cumulio/README.md @@ -6,22 +6,27 @@ For information about how to use this connector within Airbyte, see [the documen ## Local development ### Prerequisites + **To iterate on this connector, make sure to complete this prerequisites section.** #### Minimum Python version required `= 3.7.0` #### Build & Activate Virtual Environment and install dependencies + From this connector directory, create a virtual environment: + ``` python -m venv .venv ``` This will generate a virtualenv for this module in `.venv/`. Make sure this venv is active in your development environment of choice. To activate it from the terminal, run: + ``` source .venv/bin/activate pip install -r requirements.txt ``` + If you are in an IDE, follow your IDE's instructions to activate the virtualenv. Note that while we are installing dependencies from `requirements.txt`, you should only edit `setup.py` for your dependencies. `requirements.txt` is @@ -30,6 +35,7 @@ If this is mumbo jumbo to you, don't worry about it, just put your deps in `setu should work as you expect. 
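If the `setup.py` versus `requirements.txt` split is unclear, the sketch below shows the general shape such a file takes, with runtime and test dependencies kept in the `MAIN_REQUIREMENTS` and `TEST_REQUIREMENTS` lists described under Dependency Management further down. The package name and version pins are placeholders, not this connector's actual requirements.

```python
# Illustrative sketch only: the dependency names and pins below are placeholders,
# not this connector's real requirements.
from setuptools import find_packages, setup

MAIN_REQUIREMENTS = [
    "airbyte-cdk",  # runtime dependencies the connector imports
]

TEST_REQUIREMENTS = [
    "pytest~=6.2",  # dependencies used only by the test suite
]

setup(
    name="destination_cumulio",
    packages=find_packages(),
    install_requires=MAIN_REQUIREMENTS,
    extras_require={"tests": TEST_REQUIREMENTS},
)
```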
#### Create credentials + **If you are a community contributor**, follow the instructions in the [documentation](https://docs.airbyte.com/integrations/destinations/cumulio) to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `destination_cumulio/spec.json` file. Note that the `secrets` directory is gitignored by default, so there is no danger of accidentally checking in sensitive information. @@ -39,6 +45,7 @@ See `integration_tests/sample_config.json` for a sample config file. and place them into `secrets/config.json`. ### Locally running the connector + ``` python main.py spec python main.py check --config secrets/config.json @@ -47,9 +54,10 @@ python main.py write --config secrets/config.json --catalog integration_tests/co ### Locally running the connector docker image - #### Build + **Via [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md) (recommended):** + ```bash airbyte-ci connectors --name=destination-cumulio build ``` @@ -57,12 +65,15 @@ airbyte-ci connectors --name=destination-cumulio build An image will be built with the tag `airbyte/destination-cumulio:dev`. **Via `docker build`:** + ```bash docker build -t airbyte/destination-cumulio:dev . ``` #### Run + Then run any of the connector commands as follows: + ``` docker run --rm airbyte/destination-cumulio:dev spec docker run --rm -v $(pwd)/secrets:/secrets airbyte/destination-cumulio:dev check --config /secrets/config.json @@ -71,23 +82,30 @@ cat messages.jsonl | docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integr ``` ## Testing + You can run our full test suite locally using [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md): + ```bash airbyte-ci connectors --name=destination-cumulio test ``` ### Customizing acceptance Tests + Customize `acceptance-test-config.yml` file to configure tests. See [Connector Acceptance Tests](https://docs.airbyte.com/connector-development/testing-connectors/connector-acceptance-tests-reference) for more information. If your connector requires to create or destroy resources for use during acceptance tests create fixtures for it and place them inside integration_tests/acceptance.py. ## Dependency Management + All of your dependencies should go in `setup.py`, NOT `requirements.txt`. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development. We split dependencies between two groups, dependencies that are: -* required for your connector to work need to go to `MAIN_REQUIREMENTS` list. -* required for the testing need to go to `TEST_REQUIREMENTS` list + +- required for your connector to work need to go to `MAIN_REQUIREMENTS` list. +- required for the testing need to go to `TEST_REQUIREMENTS` list ### Publishing a new version of the connector + You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? + 1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=destination-cumulio test` 2. Bump the connector version in `metadata.yaml`: increment the `dockerImageTag` value. Please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors). 3. Make sure the `metadata.yaml` content is up to date. 
@@ -95,4 +113,3 @@ You've checked out the repo, implemented a million dollar feature, and you're re 5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention). 6. Pat yourself on the back for being an awesome contributor. 7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. - diff --git a/airbyte-integrations/connectors/destination-databend/README.md b/airbyte-integrations/connectors/destination-databend/README.md index 9b50cd9ffbf..36004d6b5f6 100644 --- a/airbyte-integrations/connectors/destination-databend/README.md +++ b/airbyte-integrations/connectors/destination-databend/README.md @@ -6,22 +6,27 @@ For information about how to use this connector within Airbyte, see [the documen ## Local development ### Prerequisites + **To iterate on this connector, make sure to complete this prerequisites section.** #### Minimum Python version required `= 3.7.0` #### Build & Activate Virtual Environment and install dependencies + From this connector directory, create a virtual environment: + ``` python -m venv .venv ``` This will generate a virtualenv for this module in `.venv/`. Make sure this venv is active in your development environment of choice. To activate it from the terminal, run: + ``` source .venv/bin/activate pip install -r requirements.txt ``` + If you are in an IDE, follow your IDE's instructions to activate the virtualenv. Note that while we are installing dependencies from `requirements.txt`, you should only edit `setup.py` for your dependencies. `requirements.txt` is @@ -30,6 +35,7 @@ If this is mumbo jumbo to you, don't worry about it, just put your deps in `setu should work as you expect. #### Create credentials + **If you are a community contributor**, follow the instructions in the [documentation](https://docs.airbyte.io/integrations/destinations/databend) to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `destination_databend/spec.json` file. Note that the `secrets` directory is gitignored by default, so there is no danger of accidentally checking in sensitive information. @@ -39,6 +45,7 @@ See `integration_tests/sample_config.json` for a sample config file. and place them into `secrets/config.json`. ### Locally running the connector + ``` python main.py spec python main.py check --config secrets/config.json @@ -48,9 +55,10 @@ python main.py read --config secrets/config.json --catalog integration_tests/con ### Locally running the connector docker image - #### Build + **Via [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md) (recommended):** + ```bash airbyte-ci connectors --name=destination-databend build ``` @@ -58,12 +66,15 @@ airbyte-ci connectors --name=destination-databend build An image will be built with the tag `airbyte/destination-databend:dev`. **Via `docker build`:** + ```bash docker build -t airbyte/destination-databend:dev . 
``` #### Run + Then run any of the connector commands as follows: + ``` docker run --rm airbyte/destination-databend:dev spec docker run --rm -v $(pwd)/secrets:/secrets airbyte/destination-databend:dev check --config /secrets/config.json @@ -72,23 +83,30 @@ cat messages.jsonl | docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integr ``` ## Testing + You can run our full test suite locally using [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md): + ```bash airbyte-ci connectors --name=destination-databend test ``` ### Customizing acceptance Tests + Customize `acceptance-test-config.yml` file to configure tests. See [Connector Acceptance Tests](https://docs.airbyte.com/connector-development/testing-connectors/connector-acceptance-tests-reference) for more information. If your connector requires to create or destroy resources for use during acceptance tests create fixtures for it and place them inside integration_tests/acceptance.py. ## Dependency Management + All of your dependencies should go in `setup.py`, NOT `requirements.txt`. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development. We split dependencies between two groups, dependencies that are: -* required for your connector to work need to go to `MAIN_REQUIREMENTS` list. -* required for the testing need to go to `TEST_REQUIREMENTS` list + +- required for your connector to work need to go to `MAIN_REQUIREMENTS` list. +- required for the testing need to go to `TEST_REQUIREMENTS` list ### Publishing a new version of the connector + You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? + 1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=destination-databend test` 2. Bump the connector version in `metadata.yaml`: increment the `dockerImageTag` value. Please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors). 3. Make sure the `metadata.yaml` content is up to date. @@ -96,4 +114,3 @@ You've checked out the repo, implemented a million dollar feature, and you're re 5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention). 6. Pat yourself on the back for being an awesome contributor. 7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. - diff --git a/airbyte-integrations/connectors/destination-databricks/README.md b/airbyte-integrations/connectors/destination-databricks/README.md index 4f2162f728f..8d0b07a06e7 100644 --- a/airbyte-integrations/connectors/destination-databricks/README.md +++ b/airbyte-integrations/connectors/destination-databricks/README.md @@ -4,17 +4,21 @@ This is the repository for the Databricks destination connector in Java. For information about how to use this connector within Airbyte, see [the User Documentation](https://docs.airbyte.io/integrations/destinations/databricks). ## Databricks JDBC Driver + This connector requires a JDBC driver to connect to Databricks cluster. Before using this connector, you must agree to the [JDBC ODBC driver license](https://databricks.com/jdbc-odbc-driver-license). 
This means that you can only use this driver to connector third party applications to Apache Spark SQL within a Databricks offering using the ODBC and/or JDBC protocols. ## Local development #### Building via Gradle + From the Airbyte repository root, run: + ``` ./gradlew :airbyte-integrations:connectors:destination-databricks:build ``` #### Create credentials + **If you are a community contributor**, you will need access to AWS S3, Azure blob storage, and Databricks cluster to run the integration tests: - Create a Databricks cluster. See [documentation](https://docs.databricks.com/clusters/create.html). @@ -34,16 +38,20 @@ From the Airbyte repository root, run: ### Locally running the connector docker image #### Build + Build the connector image via Gradle: ``` ./gradlew :airbyte-integrations:connectors:destination-databricks:buildConnectorImage ``` + Once built, the docker image name and tag on your host will be `airbyte/destination-databricks:dev`. the Dockerfile. #### Run + Then run any of the connector commands as follows: + ``` docker run --rm airbyte/destination-databricks:dev spec docker run --rm -v $(pwd)/secrets:/secrets airbyte/destination-databricks:dev check --config /secrets/config.json @@ -52,22 +60,29 @@ docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integrat ``` ## Testing + We use `JUnit` for Java tests. ### Unit and Integration Tests + Place unit tests under `src/test/io/airbyte/integrations/destinations/databricks`. #### Acceptance Tests + Airbyte has a standard test suite that all destination connectors must pass. Implement the `TODO`s in `src/test-integration/java/io/airbyte/integrations/destinations/databricksDestinationAcceptanceTest.java`. ### Using gradle to run tests + All commands should be run from airbyte project root. To run unit tests: + ``` ./gradlew :airbyte-integrations:connectors:destination-databricks:unitTest ``` + To run acceptance and custom integration tests: + ``` ./gradlew :airbyte-integrations:connectors:destination-databricks:integrationTest ``` @@ -75,7 +90,9 @@ To run acceptance and custom integration tests: ## Dependency Management ### Publishing a new version of the connector + You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? + 1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=destination-databricks test` 2. Bump the connector version in `metadata.yaml`: increment the `dockerImageTag` value. Please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors). 3. Make sure the `metadata.yaml` content is up to date. @@ -83,4 +100,3 @@ You've checked out the repo, implemented a million dollar feature, and you're re 5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention). 6. Pat yourself on the back for being an awesome contributor. 7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. 
- diff --git a/airbyte-integrations/connectors/destination-dev-null/README.md b/airbyte-integrations/connectors/destination-dev-null/README.md index a0969564d32..95f0acc09cc 100644 --- a/airbyte-integrations/connectors/destination-dev-null/README.md +++ b/airbyte-integrations/connectors/destination-dev-null/README.md @@ -1,11 +1,13 @@ # Destination Dev Null -This destination is a "safe" version of the [E2E Test destination](https://docs.airbyte.io/integrations/destinations/e2e-test). It only allows the "silent" mode. +This destination is a "safe" version of the [E2E Test destination](https://docs.airbyte.io/integrations/destinations/e2e-test). It only allows the "silent" mode. ## Local development #### Building via Gradle + From the Airbyte repository root, run: + ``` ./gradlew :airbyte-integrations:connectors:destination-dev-null:build ``` @@ -13,16 +15,20 @@ From the Airbyte repository root, run: ### Locally running the connector docker image #### Build + Build the connector image via Gradle: ``` ./gradlew :airbyte-integrations:connectors:destination-dev-null:buildConnectorImage ``` + Once built, the docker image name and tag on your host will be `airbyte/destination-dev-null:dev`. the Dockerfile. #### Run + Then run any of the connector commands as follows: + ``` docker run --rm airbyte/destination-dev-null:dev spec docker run --rm -v $(pwd)/secrets:/secrets airbyte/destination-dev-null:dev check --config /secrets/config.json @@ -31,12 +37,16 @@ docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integrat ``` ### Using gradle to run tests + All commands should be run from airbyte project root. To run unit tests: + ``` ./gradlew :airbyte-integrations:connectors:destination-dev-null:unitTest ``` + To run acceptance and custom integration tests: + ``` ./gradlew :airbyte-integrations:connectors:destination-dev-null:integrationTest ``` @@ -44,7 +54,9 @@ To run acceptance and custom integration tests: ## Dependency Management ### Publishing a new version of the connector + You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? + 1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=destination-dev-null test` 2. Bump the connector version in `metadata.yaml`: increment the `dockerImageTag` value. Please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors). 3. Make sure the `metadata.yaml` content is up to date. @@ -52,4 +64,3 @@ You've checked out the repo, implemented a million dollar feature, and you're re 5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention). 6. Pat yourself on the back for being an awesome contributor. 7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. 
- diff --git a/airbyte-integrations/connectors/destination-duckdb/README.md b/airbyte-integrations/connectors/destination-duckdb/README.md index a43524f06dd..8857bce3546 100644 --- a/airbyte-integrations/connectors/destination-duckdb/README.md +++ b/airbyte-integrations/connectors/destination-duckdb/README.md @@ -6,23 +6,28 @@ For information about how to use this connector within Airbyte, see [the documen ## Local development ### Prerequisites + **To iterate on this connector, make sure to complete this prerequisites section.** #### Minimum Python version required `= 3.7.0` #### Build & Activate Virtual Environment and install dependencies + From this connector directory, create a virtual environment: + ``` python -m venv .venv ``` This will generate a virtualenv for this module in `.venv/`. Make sure this venv is active in your development environment of choice. To activate it from the terminal, run: + ``` source .venv/bin/activate python -m pip install --upgrade pip pip install -r requirements.txt ``` + If you are in an IDE, follow your IDE's instructions to activate the virtualenv. Note that while we are installing dependencies from `requirements.txt`, you should only edit `setup.py` for your dependencies. `requirements.txt` is @@ -31,6 +36,7 @@ If this is mumbo jumbo to you, don't worry about it, just put your deps in `setu should work as you expect. #### Create credentials + **If you are a community contributor**, follow the instructions in the [documentation](https://docs.airbyte.io/integrations/destinations/duckdb) to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `destination_duckdb/spec.json` file. Note that the `secrets` directory is gitignored by default, so there is no danger of accidentally checking in sensitive information. @@ -40,6 +46,7 @@ See `integration_tests/sample_config.json` for a sample config file. and place them into `secrets/config.json`. ### Locally running the connector + ``` python main.py spec python main.py check --config integration_tests/config.json @@ -47,26 +54,28 @@ python main.py discover --config integration_tests/config.json cat integration_tests/messages.jsonl| python main.py write --config integration_tests/config.json --catalog integration_tests/configured_catalog.json ``` - ### Locally running the connector docker image - #### Build + **Via [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md) (recommended):** + ```bash airbyte-ci connectors --name=destination-duckdb build [--architecture=...] ``` - An image will be built with the tag `airbyte/destination-duckdb:dev`. **Via `docker build`:** + ```bash docker build -t airbyte/destination-duckdb:dev . 
``` #### Run + Then run any of the connector commands as follows: + ``` docker run --rm airbyte/destination-duckdb:dev spec docker run --rm -v $(pwd)/secrets:/secrets airbyte/destination-duckdb:dev check --config /secrets/config.json @@ -74,25 +83,31 @@ docker run --rm -v $(pwd)/secrets:/secrets airbyte/destination-duckdb:dev check cat integration_tests/messages.jsonl | docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integration_tests airbyte/destination-duckdb:dev write --config /secrets/config.json --catalog /integration_tests/configured_catalog.json ``` - ## Testing + You can run our full test suite locally using [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md): + ```bash airbyte-ci connectors --name=destination-duckdb test ``` ### Customizing acceptance Tests + Customize `acceptance-test-config.yml` file to configure tests. See [Connector Acceptance Tests](https://docs.airbyte.com/connector-development/testing-connectors/connector-acceptance-tests-reference) for more information. If your connector requires to create or destroy resources for use during acceptance tests create fixtures for it and place them inside integration_tests/acceptance.py. ## Dependency Management + All of your dependencies should go in `setup.py`, NOT `requirements.txt`. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development. We split dependencies between two groups, dependencies that are: -* required for your connector to work need to go to `MAIN_REQUIREMENTS` list. -* required for the testing need to go to `TEST_REQUIREMENTS` list + +- required for your connector to work need to go to `MAIN_REQUIREMENTS` list. +- required for the testing need to go to `TEST_REQUIREMENTS` list ### Publishing a new version of the connector + You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? + 1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=destination-duckdb test` 2. Bump the connector version in `metadata.yaml`: increment the `dockerImageTag` value. Please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors). 3. Make sure the `metadata.yaml` content is up to date. @@ -100,4 +115,3 @@ You've checked out the repo, implemented a million dollar feature, and you're re 5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention). 6. Pat yourself on the back for being an awesome contributor. 7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. 
- diff --git a/airbyte-integrations/connectors/destination-dynamodb/README.md b/airbyte-integrations/connectors/destination-dynamodb/README.md index 4bc24d1abef..4212ae2575b 100644 --- a/airbyte-integrations/connectors/destination-dynamodb/README.md +++ b/airbyte-integrations/connectors/destination-dynamodb/README.md @@ -6,12 +6,15 @@ For information about how to use this connector within Airbyte, see [the User Do ## Local development #### Building via Gradle + From the Airbyte repository root, run: + ``` ./gradlew :airbyte-integrations:connectors:destination-dynamodb:build ``` #### Create credentials + **If you are a community contributor**, generate the necessary credentials and place them in `secrets/config.json` conforming to the spec file in `src/main/resources/spec.json`. Note that the `secrets` directory is git-ignored by default, so there is no danger of accidentally checking in sensitive information. @@ -20,16 +23,20 @@ Note that the `secrets` directory is git-ignored by default, so there is no dang ### Locally running the connector docker image #### Build + Build the connector image via Gradle: ``` ./gradlew :airbyte-integrations:connectors:destination-dynamodb:buildConnectorImage ``` + Once built, the docker image name and tag on your host will be `airbyte/destination-dynamodb:dev`. the Dockerfile. #### Run + Then run any of the connector commands as follows: + ``` docker run --rm airbyte/destination-dynamodb:dev spec docker run --rm -v $(pwd)/secrets:/secrets airbyte/destination-dynamodb:dev check --config /secrets/config.json @@ -38,22 +45,29 @@ docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integrat ``` ## Testing + We use `JUnit` for Java tests. ### Unit and Integration Tests + Place unit tests under `src/test/io/airbyte/integrations/destinations/dynamodb`. #### Acceptance Tests + Airbyte has a standard test suite that all destination connectors must pass. Implement the `TODO`s in `src/test-integration/java/io/airbyte/integrations/destinations/dynamodbDestinationAcceptanceTest.java`. ### Using gradle to run tests + All commands should be run from airbyte project root. To run unit tests: + ``` ./gradlew :airbyte-integrations:connectors:destination-dynamodb:unitTest ``` + To run acceptance and custom integration tests: + ``` ./gradlew :airbyte-integrations:connectors:destination-dynamodb:integrationTest ``` @@ -61,7 +75,9 @@ To run acceptance and custom integration tests: ## Dependency Management ### Publishing a new version of the connector + You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? + 1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=destination-dynamodb test` 2. Bump the connector version in `metadata.yaml`: increment the `dockerImageTag` value. Please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors). 3. Make sure the `metadata.yaml` content is up to date. @@ -69,4 +85,3 @@ You've checked out the repo, implemented a million dollar feature, and you're re 5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention). 6. Pat yourself on the back for being an awesome contributor. 7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. 
- diff --git a/airbyte-integrations/connectors/destination-e2e-test/README.md b/airbyte-integrations/connectors/destination-e2e-test/README.md index 0303b6c0e58..c4837fe1af1 100644 --- a/airbyte-integrations/connectors/destination-e2e-test/README.md +++ b/airbyte-integrations/connectors/destination-e2e-test/README.md @@ -5,27 +5,34 @@ This is the repository for the Null destination connector in Java. For informati ## Local development #### Building via Gradle + From the Airbyte repository root, run: + ``` ./gradlew :airbyte-integrations:connectors:destination-e2e-test:build ``` #### Create credentials + No credential is needed for this connector. ### Locally running the connector docker image #### Build + Build the connector image via Gradle: ``` ./gradlew :airbyte-integrations:connectors:destination-e2e-test:buildConnectorImage ``` + Once built, the docker image name and tag on your host will be `airbyte/destination-e2e-test:dev`. the Dockerfile. #### Run + Then run any of the connector commands as follows: + ``` docker run --rm airbyte/destination-e2e-test:dev spec docker run --rm -v $(pwd)/secrets:/secrets airbyte/destination-e2e-test:dev check --config /secrets/config.json @@ -34,25 +41,33 @@ docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integrat ``` #### Cloud variant + The cloud variant of this connector is Dev Null Destination. It only allows the "silent" mode. When this mode is changed, please make sure that the Dev Null Destination is updated and published accordingly as well. ## Testing + We use `JUnit` for Java tests. ### Unit and Integration Tests + Place unit tests under `src/test/io/airbyte/integrations/destinations/e2e-test`. #### Acceptance Tests + Airbyte has a standard test suite that all destination connectors must pass. See example(s) in `src/test-integration/java/io/airbyte/integrations/destinations/e2e-test/`. ### Using gradle to run tests + All commands should be run from airbyte project root. To run unit tests: + ``` ./gradlew :airbyte-integrations:connectors:destination-e2e-test:unitTest ``` + To run acceptance and custom integration tests: + ``` ./gradlew :airbyte-integrations:connectors:destination-e2e-test:integrationTest ``` @@ -60,7 +75,9 @@ To run acceptance and custom integration tests: ## Dependency Management ### Publishing a new version of the connector + You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? + 1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=destination-e2e-test test` 2. Bump the connector version in `metadata.yaml`: increment the `dockerImageTag` value. Please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors). 3. Make sure the `metadata.yaml` content is up to date. @@ -68,4 +85,3 @@ You've checked out the repo, implemented a million dollar feature, and you're re 5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention). 6. Pat yourself on the back for being an awesome contributor. 7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. 
- diff --git a/airbyte-integrations/connectors/destination-elasticsearch/README.md b/airbyte-integrations/connectors/destination-elasticsearch/README.md index b693a900f41..4a0a9cfdba4 100644 --- a/airbyte-integrations/connectors/destination-elasticsearch/README.md +++ b/airbyte-integrations/connectors/destination-elasticsearch/README.md @@ -6,12 +6,15 @@ For information about how to use this connector within Airbyte, see [the User Do ## Local development #### Building via Gradle + From the Airbyte repository root, run: + ``` ./gradlew :airbyte-integrations:connectors:destination-elasticsearch:build ``` #### Create credentials + **If you are a community contributor**, generate the necessary credentials and place them in `secrets/config.json` conforming to the spec file in `src/main/resources/spec.json`. Note that the `secrets` directory is git-ignored by default, so there is no danger of accidentally checking in sensitive information. @@ -20,16 +23,20 @@ Note that the `secrets` directory is git-ignored by default, so there is no dang ### Locally running the connector docker image #### Build + Build the connector image via Gradle: ``` ./gradlew :airbyte-integrations:connectors:destination-elasticsearch:buildConnectorImage ``` + Once built, the docker image name and tag on your host will be `airbyte/destination-elasticsearch:dev`. the Dockerfile. #### Run + Then run any of the connector commands as follows: + ``` docker run --rm airbyte/destination-elasticsearch:dev spec docker run --rm -v $(pwd)/secrets:/secrets airbyte/destination-elasticsearch:dev check --config /secrets/config.json @@ -38,22 +45,29 @@ docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integrat ``` ## Testing + We use `JUnit` for Java tests. ### Unit and Integration Tests + Place unit tests under `src/test/io/airbyte/integrations/destinations/elasticsearch`. #### Acceptance Tests + Airbyte has a standard test suite that all destination connectors must pass. Implement the `TODO`s in `src/test-integration/java/io/airbyte/integrations/destinations/elasticsearchDestinationAcceptanceTest.java`. ### Using gradle to run tests + All commands should be run from airbyte project root. To run unit tests: + ``` ./gradlew :airbyte-integrations:connectors:destination-elasticsearch:unitTest ``` + To run acceptance and custom integration tests: + ``` ./gradlew :airbyte-integrations:connectors:destination-elasticsearch:integrationTest ``` @@ -61,7 +75,9 @@ To run acceptance and custom integration tests: ## Dependency Management ### Publishing a new version of the connector + You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? + 1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=destination-elasticsearch test` 2. Bump the connector version in `metadata.yaml`: increment the `dockerImageTag` value. Please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors). 3. Make sure the `metadata.yaml` content is up to date. @@ -69,4 +85,3 @@ You've checked out the repo, implemented a million dollar feature, and you're re 5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention). 6. Pat yourself on the back for being an awesome contributor. 7. 
Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. - diff --git a/airbyte-integrations/connectors/destination-elasticsearch/bootstrap.md b/airbyte-integrations/connectors/destination-elasticsearch/bootstrap.md index b7f614cba5d..fc60ccd133c 100644 --- a/airbyte-integrations/connectors/destination-elasticsearch/bootstrap.md +++ b/airbyte-integrations/connectors/destination-elasticsearch/bootstrap.md @@ -1,33 +1,34 @@ # Elasticsearch Destination Elasticsearch is a Lucene based search engine that's a type of NoSql storage. -Documents are created in an `index`, similar to a `table`in a relation database. +Documents are created in an `index`, similar to a `table` in a relational database. The documents are structured with fields that may contain nested complex structures. -[Read more about Elastic](https://elasticsearch.org/) +[Read more about Elastic](https://elasticsearch.org/) This connector maps an incoming `stream` to an Elastic `index`. When using destination sync mode `append` and `append_dedup`, an `upsert` operation is performed against the Elasticsearch index. -When using `overwrite`, the records/docs are place in a temp index, then cloned to the target index. +When using `overwrite`, the records/docs are placed in a temp index, then cloned to the target index. The target index is deleted first, if it exists before the sync. The [ElasticsearchConnection.java](./src/main/java/io/airbyte/integrations/destination/elasticsearch/ElasticsearchConnection.java) -handles the communication with the Elastic server. +handles the communication with the Elastic server. This uses the `elasticsearch-java` rest client from the Elasticsearch team - [https://github.com/elastic/elasticsearch-java/](https://github.com/elastic/elasticsearch-java/) -The [ElasticsearchAirbyteMessageConsumerFactory.java](./src/main/java/io/airbyte/integrations/destination/elasticsearch/ElasticsearchAirbyteMessageConsumerFactory.java) -contains the logic for organizing a batch of records and reporting progress. +The [ElasticsearchAirbyteMessageConsumerFactory.java](./src/main/java/io/airbyte/integrations/destination/elasticsearch/ElasticsearchAirbyteMessageConsumerFactory.java) +contains the logic for organizing a batch of records and reporting progress. The `namespace` and stream `name` are used to generate an index name. -The index is created if it doesn't exist, but no other index configuration is done at this time. +The index is created if it doesn't exist, but no other index configuration is done at this time. Elastic will determine the type of data by detection. You can create an index ahead of time for field type customization. Basic authentication and API key authentication are supported. -## Development -See the Elasticsearch client tests for examples on how to use the library. +## Development -[https://github.com/elastic/elasticsearch-java/blob/main/java-client/src/test/java/co/elastic/clients/elasticsearch/end_to_end/RequestTest.java](https://github.com/elastic/elasticsearch-java/blob/main/java-client/src/test/java/co/elastic/clients/elasticsearch/end_to_end/RequestTest.java) \ No newline at end of file +See the Elasticsearch client tests for examples on how to use the library.
+ +[https://github.com/elastic/elasticsearch-java/blob/main/java-client/src/test/java/co/elastic/clients/elasticsearch/end_to_end/RequestTest.java](https://github.com/elastic/elasticsearch-java/blob/main/java-client/src/test/java/co/elastic/clients/elasticsearch/end_to_end/RequestTest.java) diff --git a/airbyte-integrations/connectors/destination-firebolt/README.md b/airbyte-integrations/connectors/destination-firebolt/README.md index d19fb11dc8a..fa3df8a7d62 100644 --- a/airbyte-integrations/connectors/destination-firebolt/README.md +++ b/airbyte-integrations/connectors/destination-firebolt/README.md @@ -6,22 +6,27 @@ For information about how to use this connector within Airbyte, see [the documen ## Local development ### Prerequisites + **To iterate on this connector, make sure to complete this prerequisites section.** #### Minimum Python version required `= 3.7.0` #### Build & Activate Virtual Environment and install dependencies + From this connector directory, create a virtual environment: + ``` python -m venv .venv ``` This will generate a virtualenv for this module in `.venv/`. Make sure this venv is active in your development environment of choice. To activate it from the terminal, run: + ``` source .venv/bin/activate pip install -r requirements.txt ``` + If you are in an IDE, follow your IDE's instructions to activate the virtualenv. Note that while we are installing dependencies from `requirements.txt`, you should only edit `setup.py` for your dependencies. `requirements.txt` is @@ -30,6 +35,7 @@ If this is mumbo jumbo to you, don't worry about it, just put your deps in `setu should work as you expect. #### Create credentials + **If you are a community contributor**, follow the instructions in the [documentation](https://docs.airbyte.io/integrations/destinations/firebolt) to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `destination_firebolt/spec.json` file. Note that the `secrets` directory is gitignored by default, so there is no danger of accidentally checking in sensitive information. @@ -39,6 +45,7 @@ See `integration_tests/sample_config.json` for a sample config file. and place them into `secrets/config.json`. ### Locally running the connector + ``` python main.py spec python main.py check --config secrets/config.json @@ -48,9 +55,10 @@ cat integration_tests/messages.jsonl | python main.py write --config secrets/con ### Locally running the connector docker image - #### Build + **Via [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md) (recommended):** + ```bash airbyte-ci connectors --name=destination-firebolt build ``` @@ -58,12 +66,15 @@ airbyte-ci connectors --name=destination-firebolt build An image will be built with the tag `airbyte/destination-firebolt:dev`. **Via `docker build`:** + ```bash docker build -t airbyte/destination-firebolt:dev . 
``` #### Run + Then run any of the connector commands as follows: + ``` docker run --rm airbyte/destination-firebolt:dev spec docker run --rm -v $(pwd)/secrets:/secrets airbyte/destination-firebolt:dev check --config /secrets/config.json @@ -72,23 +83,30 @@ cat integration_tests/messages.jsonl | docker run --rm -v $(pwd)/secrets:/secret ``` ## Testing + You can run our full test suite locally using [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md): + ```bash airbyte-ci connectors --name=destination-firebolt test ``` ### Customizing acceptance Tests + Customize `acceptance-test-config.yml` file to configure tests. See [Connector Acceptance Tests](https://docs.airbyte.com/connector-development/testing-connectors/connector-acceptance-tests-reference) for more information. If your connector requires to create or destroy resources for use during acceptance tests create fixtures for it and place them inside integration_tests/acceptance.py. ## Dependency Management + All of your dependencies should go in `setup.py`, NOT `requirements.txt`. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development. We split dependencies between two groups, dependencies that are: -* required for your connector to work need to go to `MAIN_REQUIREMENTS` list. -* required for the testing need to go to `TEST_REQUIREMENTS` list + +- required for your connector to work need to go to `MAIN_REQUIREMENTS` list. +- required for the testing need to go to `TEST_REQUIREMENTS` list ### Publishing a new version of the connector + You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? + 1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=destination-firebolt test` 2. Bump the connector version in `metadata.yaml`: increment the `dockerImageTag` value. Please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors). 3. Make sure the `metadata.yaml` content is up to date. @@ -96,4 +114,3 @@ You've checked out the repo, implemented a million dollar feature, and you're re 5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention). 6. Pat yourself on the back for being an awesome contributor. 7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. - diff --git a/airbyte-integrations/connectors/destination-firebolt/bootstrap.md b/airbyte-integrations/connectors/destination-firebolt/bootstrap.md index dade5200d2d..5184f2553ec 100644 --- a/airbyte-integrations/connectors/destination-firebolt/bootstrap.md +++ b/airbyte-integrations/connectors/destination-firebolt/bootstrap.md @@ -2,7 +2,7 @@ ## Overview -Firebolt is a cloud data warehouse purpose-built to provide sub-second analytics performance on massive, terabyte-scale data sets. +Firebolt is a cloud data warehouse purpose-built to provide sub-second analytics performance on massive, terabyte-scale data sets. Firebolt has two main concepts: Databases, which denote the storage of data and Engines, which describe the compute layer on top of a Database. 
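For illustration only, the Database/Engine pair described above is what the connector's credentials ultimately point at. Below is a minimal, hedged sketch using the `firebolt-sdk` DB API interface; the database, engine, and credential values are placeholders, and the exact keyword arguments (username/password vs. service-account auth) differ between SDK versions, so treat this as a sketch rather than the connector's actual code.

```python
# Hedged sketch, not the connector's implementation: connecting to Firebolt
# via firebolt-sdk's PEP 249 (DB API) interface. All values are placeholders.
from firebolt.db import connect

connection = connect(
    database="my_database",        # Database: where the data is stored
    engine_name="my_engine",       # Engine: the compute that queries/loads it
    username="user@example.com",   # auth kwargs differ across SDK versions
    password="********",
)

cursor = connection.cursor()
cursor.execute("SELECT 1")
print(cursor.fetchall())
connection.close()
```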
@@ -18,5 +18,5 @@ This connector uses [firebolt-sdk](https://pypi.org/project/firebolt-sdk/), whic ## Notes -* Integration testing requires the user to have a running engine. Spinning up an engine can take a while so this ensures a faster iteration on the connector. -* S3 is generally faster writing strategy and should be preferred. \ No newline at end of file +- Integration testing requires the user to have a running engine. Spinning up an engine can take a while so this ensures a faster iteration on the connector. +- S3 is generally faster writing strategy and should be preferred. diff --git a/airbyte-integrations/connectors/destination-firestore/README.md b/airbyte-integrations/connectors/destination-firestore/README.md index 448c941fe0a..8869c869d68 100644 --- a/airbyte-integrations/connectors/destination-firestore/README.md +++ b/airbyte-integrations/connectors/destination-firestore/README.md @@ -6,22 +6,27 @@ For information about how to use this connector within Airbyte, see [the documen ## Local development ### Prerequisites + **To iterate on this connector, make sure to complete this prerequisites section.** #### Minimum Python version required `= 3.7.0` #### Build & Activate Virtual Environment and install dependencies + From this connector directory, create a virtual environment: + ``` python -m venv .venv ``` This will generate a virtualenv for this module in `.venv/`. Make sure this venv is active in your development environment of choice. To activate it from the terminal, run: + ``` source .venv/bin/activate pip install -r requirements.txt ``` + If you are in an IDE, follow your IDE's instructions to activate the virtualenv. Note that while we are installing dependencies from `requirements.txt`, you should only edit `setup.py` for your dependencies. `requirements.txt` is @@ -40,6 +45,7 @@ See `integration_tests/sample_config.json` for a sample config file. and place them into `secrets/config.json`. ### Locally running the connector + ``` python main.py spec python main.py check --config secrets/config.json @@ -49,9 +55,10 @@ python main.py read --config secrets/config.json --catalog integration_tests/con ### Locally running the connector docker image - #### Build + **Via [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md) (recommended):** + ```bash airbyte-ci connectors --name=destination-firestore build ``` @@ -59,12 +66,15 @@ airbyte-ci connectors --name=destination-firestore build An image will be built with the tag `airbyte/destination-firestore:dev`. **Via `docker build`:** + ```bash docker build -t airbyte/destination-firestore:dev . ``` #### Run + Then run any of the connector commands as follows: + ``` docker run --rm airbyte/destination-firestore:dev spec docker run --rm -v $(pwd)/secrets:/secrets airbyte/destination-firestore:dev check --config /secrets/config.json @@ -73,23 +83,30 @@ cat messages.jsonl | docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integr ``` ## Testing + You can run our full test suite locally using [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md): + ```bash airbyte-ci connectors --name=destination-firestore test ``` ### Customizing acceptance Tests + Customize `acceptance-test-config.yml` file to configure tests. See [Connector Acceptance Tests](https://docs.airbyte.com/connector-development/testing-connectors/connector-acceptance-tests-reference) for more information. 
If your connector requires to create or destroy resources for use during acceptance tests create fixtures for it and place them inside integration_tests/acceptance.py. ## Dependency Management + All of your dependencies should go in `setup.py`, NOT `requirements.txt`. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development. We split dependencies between two groups, dependencies that are: -* required for your connector to work need to go to `MAIN_REQUIREMENTS` list. -* required for the testing need to go to `TEST_REQUIREMENTS` list + +- required for your connector to work need to go to `MAIN_REQUIREMENTS` list. +- required for the testing need to go to `TEST_REQUIREMENTS` list ### Publishing a new version of the connector + You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? + 1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=destination-firestore test` 2. Bump the connector version in `metadata.yaml`: increment the `dockerImageTag` value. Please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors). 3. Make sure the `metadata.yaml` content is up to date. @@ -97,4 +114,3 @@ You've checked out the repo, implemented a million dollar feature, and you're re 5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention). 6. Pat yourself on the back for being an awesome contributor. 7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. - diff --git a/airbyte-integrations/connectors/destination-gcs/README.md b/airbyte-integrations/connectors/destination-gcs/README.md index 62ae22ab7d5..02029267bf0 100644 --- a/airbyte-integrations/connectors/destination-gcs/README.md +++ b/airbyte-integrations/connectors/destination-gcs/README.md @@ -15,13 +15,15 @@ As a community contributor, you can follow these steps to run integration tests. ## Airbyte Employee - Access the `SECRET_DESTINATION-GCS__CREDS` secrets on SecretManager, and put it in `sample_secrets/config.json`. -_ Access the `SECRET_DESTINATION-GCS_NO_MULTIPART_ROLE_CREDS` secrets on SecretManager, and put it in `sample_secrets/insufficient_roles_config.json`. + \_ Access the `SECRET_DESTINATION-GCS_NO_MULTIPART_ROLE_CREDS` secrets on SecretManager, and put it in `sample_secrets/insufficient_roles_config.json`. - Rename the directory from `sample_secrets` to `secrets`. ### GCP Service Account for Testing + Two service accounts have been created in our GCP for testing this destination. Both of them have access to Cloud Storage through HMAC keys. The keys are persisted together with the connector integration test credentials in LastPass. - Account: `gcs-destination-connector-test@dataline-integration-testing.iam.gserviceaccount.com` + - This account has the required permission to pass the integration test. Note that the uploader needs `storage.multipartUploads` permissions, which may not be intuitive. - Role: `GCS Destination User` - Permissions: @@ -48,6 +50,7 @@ Two service accounts have been created in our GCP for testing this destination. - LastPass entry: `destination gcs creds (no multipart permission)` ## Add New Output Format + - Add a new enum in `S3Format`. 
- Modify `spec.json` to specify the configuration of this new format. - Update `S3FormatConfigs` to be able to construct a config for this new format. diff --git a/airbyte-integrations/connectors/destination-google-sheets/README.md b/airbyte-integrations/connectors/destination-google-sheets/README.md index 419e1fcc606..71830638545 100644 --- a/airbyte-integrations/connectors/destination-google-sheets/README.md +++ b/airbyte-integrations/connectors/destination-google-sheets/README.md @@ -6,22 +6,27 @@ For information about how to use this connector within Airbyte, see [the documen ## Local development ### Prerequisites + **To iterate on this connector, make sure to complete this prerequisites section.** #### Minimum Python version required `= 3.9.0` #### Build & Activate Virtual Environment and install dependencies + From this connector directory, create a virtual environment: + ``` python -m venv .venv ``` This will generate a virtualenv for this module in `.venv/`. Make sure this venv is active in your development environment of choice. To activate it from the terminal, run: + ``` source .venv/bin/activate pip install -r requirements.txt ``` + If you are in an IDE, follow your IDE's instructions to activate the virtualenv. Note that while we are installing dependencies from `requirements.txt`, you should only edit `setup.py` for your dependencies. `requirements.txt` is @@ -30,6 +35,7 @@ If this is mumbo jumbo to you, don't worry about it, just put your deps in `setu should work as you expect. #### Create credentials + **If you are a community contributor**, follow the instructions in the [documentation](https://docs.airbyte.io/integrations/destinations/google-sheets) to generate the necessary credentials. Then create a file `secrets/config_oauth.json` conforming to the `destination_google_sheets/spec.json` file. Note that the `secrets` directory is gitignored by default, so there is no danger of accidentally checking in sensitive information. @@ -39,6 +45,7 @@ See `integration_tests/sample_config_oauth.json` for a sample config file. and place them into `secrets/config.json`. ### Locally running the connector + ``` python main.py spec python main.py check --config secrets/config_oauth.json @@ -48,9 +55,10 @@ cat integration_tests/test_data/messages.txt | python main.py write --config sec ### Locally running the connector docker image - #### Build + **Via [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md) (recommended):** + ```bash airbyte-ci connectors --name=destination-google-sheets build ``` @@ -58,12 +66,15 @@ airbyte-ci connectors --name=destination-google-sheets build An image will be built with the tag `airbyte/destination-google-sheets:dev`. **Via `docker build`:** + ```bash docker build -t airbyte/destination-google-sheets:dev . ``` #### Run + Then run any of the connector commands as follows: + ``` docker run --rm airbyte/destination-google-sheets:dev spec docker run --rm -v $(pwd)/secrets:/secrets airbyte/destination-google-sheets:dev check --config /secrets/config_oauth.json @@ -72,23 +83,30 @@ cat integration_tests/test_data/messages.txt | docker run --rm -v $(pwd)/secrets ``` ## Testing + You can run our full test suite locally using [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md): + ```bash airbyte-ci connectors --name=destination-google-sheets test ``` ### Customizing acceptance Tests + Customize `acceptance-test-config.yml` file to configure tests. 
See [Connector Acceptance Tests](https://docs.airbyte.com/connector-development/testing-connectors/connector-acceptance-tests-reference) for more information. If your connector requires to create or destroy resources for use during acceptance tests create fixtures for it and place them inside integration_tests/acceptance.py. ## Dependency Management + All of your dependencies should go in `setup.py`, NOT `requirements.txt`. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development. We split dependencies between two groups, dependencies that are: -* required for your connector to work need to go to `MAIN_REQUIREMENTS` list. -* required for the testing need to go to `TEST_REQUIREMENTS` list + +- required for your connector to work need to go to `MAIN_REQUIREMENTS` list. +- required for the testing need to go to `TEST_REQUIREMENTS` list ### Publishing a new version of the connector + You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? + 1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=destination-google-sheets test` 2. Bump the connector version in `metadata.yaml`: increment the `dockerImageTag` value. Please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors). 3. Make sure the `metadata.yaml` content is up to date. @@ -96,4 +114,3 @@ You've checked out the repo, implemented a million dollar feature, and you're re 5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention). 6. Pat yourself on the back for being an awesome contributor. 7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. - diff --git a/airbyte-integrations/connectors/destination-iceberg/README.md b/airbyte-integrations/connectors/destination-iceberg/README.md index be2a860e782..2e82cce6b73 100644 --- a/airbyte-integrations/connectors/destination-iceberg/README.md +++ b/airbyte-integrations/connectors/destination-iceberg/README.md @@ -32,7 +32,6 @@ the [instructions](https://docs.airbyte.io/connector-development#using-credentia Build the connector image via Gradle: - ``` ./gradlew :airbyte-integrations:connectors:destination-iceberg:buildConnectorImage ``` @@ -83,7 +82,9 @@ To run acceptance and custom integration tests: ## Dependency Management ### Publishing a new version of the connector + You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? + 1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=destination-iceberg test` 2. Bump the connector version in `metadata.yaml`: increment the `dockerImageTag` value. Please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors). 3. Make sure the `metadata.yaml` content is up to date. @@ -91,4 +92,3 @@ You've checked out the repo, implemented a million dollar feature, and you're re 5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention). 6. Pat yourself on the back for being an awesome contributor. 7. 
Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. - diff --git a/airbyte-integrations/connectors/destination-iceberg/bootstrap.md b/airbyte-integrations/connectors/destination-iceberg/bootstrap.md index c9ae78bceb3..6e822b4acb0 100644 --- a/airbyte-integrations/connectors/destination-iceberg/bootstrap.md +++ b/airbyte-integrations/connectors/destination-iceberg/bootstrap.md @@ -9,4 +9,3 @@ Spark, Trino, PrestoDB, Flink, Hive and Impala using a high-performance table fo The Iceberg reference documents: [https://iceberg.apache.org/docs/latest/api/](https://iceberg.apache.org/docs/latest/api/) - diff --git a/airbyte-integrations/connectors/destination-kafka/README.md b/airbyte-integrations/connectors/destination-kafka/README.md index 56aabe505cd..0a96711caa0 100644 --- a/airbyte-integrations/connectors/destination-kafka/README.md +++ b/airbyte-integrations/connectors/destination-kafka/README.md @@ -6,12 +6,15 @@ For information about how to use this connector within Airbyte, see [the User Do ## Local development #### Building via Gradle + From the Airbyte repository root, run: + ``` ./gradlew :airbyte-integrations:connectors:destination-kafka:build ``` #### Create credentials + **If you are a community contributor**, generate the necessary credentials and place them in `secrets/config.json` conforming to the spec file in `src/main/resources/spec.json`. Note that the `secrets` directory is git-ignored by default, so there is no danger of accidentally checking in sensitive information. @@ -20,16 +23,20 @@ Note that the `secrets` directory is git-ignored by default, so there is no dang ### Locally running the connector docker image #### Build + Build the connector image via Gradle: ``` ./gradlew :airbyte-integrations:connectors:destination-kafka:buildConnectorImage ``` + Once built, the docker image name and tag on your host will be `airbyte/destination-kafka:dev`. the Dockerfile. #### Run + Then run any of the connector commands as follows: + ``` docker run --rm airbyte/destination-kafka:dev spec docker run --rm -v $(pwd)/secrets:/secrets airbyte/destination-kafka:dev check --config /secrets/config.json @@ -38,22 +45,29 @@ docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integrat ``` ## Testing + We use `JUnit` for Java tests. ### Unit and Integration Tests + Place unit tests under `src/test/io/airbyte/integrations/destinations/kafka`. #### Acceptance Tests + Airbyte has a standard test suite that all destination connectors must pass. Implement the `TODO`s in `src/test-integration/java/io/airbyte/integrations/destinations/kafkaDestinationAcceptanceTest.java`. ### Using gradle to run tests + All commands should be run from airbyte project root. To run unit tests: + ``` ./gradlew :airbyte-integrations:connectors:destination-kafka:unitTest ``` + To run acceptance and custom integration tests: + ``` ./gradlew :airbyte-integrations:connectors:destination-kafka:integrationTest ``` @@ -61,7 +75,9 @@ To run acceptance and custom integration tests: ## Dependency Management ### Publishing a new version of the connector + You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? + 1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=destination-kafka test` 2. Bump the connector version in `metadata.yaml`: increment the `dockerImageTag` value. 
Please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors). 3. Make sure the `metadata.yaml` content is up to date. @@ -69,4 +85,3 @@ You've checked out the repo, implemented a million dollar feature, and you're re 5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention). 6. Pat yourself on the back for being an awesome contributor. 7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. - diff --git a/airbyte-integrations/connectors/destination-kvdb/README.md b/airbyte-integrations/connectors/destination-kvdb/README.md index b834894111b..712bb1de188 100644 --- a/airbyte-integrations/connectors/destination-kvdb/README.md +++ b/airbyte-integrations/connectors/destination-kvdb/README.md @@ -5,22 +5,27 @@ This is the repository for the [Kvdb](https://kvdb.io) destination connector, wr ## Local development ### Prerequisites + **To iterate on this connector, make sure to complete this prerequisites section.** #### Minimum Python version required `= 3.7.0` #### Build & Activate Virtual Environment and install dependencies + From this connector directory, create a virtual environment: + ``` python -m venv .venv ``` This will generate a virtualenv for this module in `.venv/`. Make sure this venv is active in your development environment of choice. To activate it from the terminal, run: + ``` source .venv/bin/activate pip install -r requirements.txt ``` + If you are in an IDE, follow your IDE's instructions to activate the virtualenv. Note that while we are installing dependencies from `requirements.txt`, you should only edit `setup.py` for your dependencies. `requirements.txt` is @@ -29,12 +34,15 @@ If this is mumbo jumbo to you, don't worry about it, just put your deps in `setu should work as you expect. #### Building via Gradle + From the Airbyte repository root, run: + ``` ./gradlew :airbyte-integrations:connectors:destination-kvdb:build ``` #### Create credentials + **If you are a community contributor**, generate the necessary credentials from [Kvdb](https://kvdb.io/docs/api/), and then create a file `secrets/config.json` conforming to the `destination_kvdb/spec.json` file. Note that the `secrets` directory is gitignored by default, so there is no danger of accidentally checking in sensitive information. See `integration_tests/sample_config.json` for a sample config file. @@ -43,6 +51,7 @@ See `integration_tests/sample_config.json` for a sample config file. and place them into `secrets/config.json`. ### Locally running the connector + ``` python main.py spec python main.py check --config secrets/config.json @@ -52,10 +61,10 @@ python main.py read --config secrets/config.json --catalog integration_tests/con ### Locally running the connector docker image - - #### Build + **Via [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md) (recommended):** + ```bash airbyte-ci connectors --name=destination-kvdb build ``` @@ -63,51 +72,71 @@ airbyte-ci connectors --name=destination-kvdb build An image will be built with the tag `airbyte/destination-kvdb:dev`. **Via `docker build`:** + ```bash docker build -t airbyte/destination-kvdb:dev . 
``` + #### Run + Then run any of the connector commands as follows: + ``` docker run --rm airbyte/destination-kvdb:dev spec docker run --rm -v $(pwd)/secrets:/secrets airbyte/destination-kvdb:dev check --config /secrets/config.json # messages.jsonl is a file containing line-separated JSON representing AirbyteMessages cat messages.jsonl | docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integration_tests airbyte/destination-kvdb:dev write --config /secrets/config.json --catalog /integration_tests/configured_catalog.json ``` + ## Testing - Make sure to familiarize yourself with [pytest test discovery](https://docs.pytest.org/en/latest/goodpractices.html#test-discovery) to know how your test files and methods should be named. + +Make sure to familiarize yourself with [pytest test discovery](https://docs.pytest.org/en/latest/goodpractices.html#test-discovery) to know how your test files and methods should be named. First install test dependencies into your virtual environment: + ``` pip install .[tests] ``` + ### Unit Tests + To run unit tests locally, from the connector directory run: + ``` python -m pytest unit_tests ``` ### Integration Tests + There are two types of integration tests: Acceptance Tests (Airbyte's test suite for all destination connectors) and custom integration tests (which are specific to this connector). + #### Custom Integration tests + Place custom tests inside `integration_tests/` folder, then, from the connector root, run + ``` python -m pytest integration_tests ``` + #### Acceptance Tests + You can run our full test suite locally using [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md): + ```bash airbyte-ci connectors --name=destination-kvdb test ``` - ## Dependency Management + All of your dependencies should go in `setup.py`, NOT `requirements.txt`. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development. We split dependencies between two groups, dependencies that are: -* required for your connector to work need to go to `MAIN_REQUIREMENTS` list. -* required for the testing need to go to `TEST_REQUIREMENTS` list + +- required for your connector to work need to go to `MAIN_REQUIREMENTS` list. +- required for the testing need to go to `TEST_REQUIREMENTS` list ### Publishing a new version of the connector + You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? + 1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=destination-kvdb test` 2. Bump the connector version in `metadata.yaml`: increment the `dockerImageTag` value. Please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors). 3. Make sure the `metadata.yaml` content is up to date. @@ -115,4 +144,3 @@ You've checked out the repo, implemented a million dollar feature, and you're re 5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention). 6. Pat yourself on the back for being an awesome contributor. 7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. 
- diff --git a/airbyte-integrations/connectors/destination-langchain/README.md b/airbyte-integrations/connectors/destination-langchain/README.md index 76903c7373f..89db8d0567d 100644 --- a/airbyte-integrations/connectors/destination-langchain/README.md +++ b/airbyte-integrations/connectors/destination-langchain/README.md @@ -6,22 +6,27 @@ For information about how to use this connector within Airbyte, see [the documen ## Local development ### Prerequisites + **To iterate on this connector, make sure to complete this prerequisites section.** #### Minimum Python version required `= 3.10.0` #### Build & Activate Virtual Environment and install dependencies + From this connector directory, create a virtual environment: + ``` python -m venv .venv ``` This will generate a virtualenv for this module in `.venv/`. Make sure this venv is active in your development environment of choice. To activate it from the terminal, run: + ``` source .venv/bin/activate pip install -r requirements.txt ``` + If you are in an IDE, follow your IDE's instructions to activate the virtualenv. Note that while we are installing dependencies from `requirements.txt`, you should only edit `setup.py` for your dependencies. `requirements.txt` is @@ -30,6 +35,7 @@ If this is mumbo jumbo to you, don't worry about it, just put your deps in `setu should work as you expect. #### Create credentials + **If you are a community contributor**, follow the instructions in the [documentation](https://docs.airbyte.com/integrations/destinations/langchain) to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `destination_langchain/spec.json` file. Note that the `secrets` directory is gitignored by default, so there is no danger of accidentally checking in sensitive information. @@ -39,6 +45,7 @@ See `integration_tests/sample_config.json` for a sample config file. and place them into `secrets/config.json`. ### Locally running the connector + ``` python main.py spec python main.py check --config secrets/config.json @@ -48,9 +55,10 @@ python main.py read --config secrets/config.json --catalog integration_tests/con ### Locally running the connector docker image - #### Build + **Via [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md) (recommended):** + ```bash airbyte-ci connectors --name=destination-langchain build ``` @@ -58,12 +66,15 @@ airbyte-ci connectors --name=destination-langchain build An image will be built with the tag `airbyte/destination-langchain:dev`. **Via `docker build`:** + ```bash docker build -t airbyte/destination-langchain:dev . ``` #### Run + Then run any of the connector commands as follows: + ``` docker run --rm airbyte/destination-langchain:dev spec docker run --rm -v $(pwd)/secrets:/secrets airbyte/destination-langchain:dev check --config /secrets/config.json @@ -72,23 +83,30 @@ cat messages.jsonl | docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integr ``` ## Testing + You can run our full test suite locally using [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md): + ```bash airbyte-ci connectors --name=destination-langchain test ``` ### Customizing acceptance Tests + Customize `acceptance-test-config.yml` file to configure tests. See [Connector Acceptance Tests](https://docs.airbyte.com/connector-development/testing-connectors/connector-acceptance-tests-reference) for more information. 
If your connector requires to create or destroy resources for use during acceptance tests create fixtures for it and place them inside integration_tests/acceptance.py. ## Dependency Management + All of your dependencies should go in `setup.py`, NOT `requirements.txt`. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development. We split dependencies between two groups, dependencies that are: -* required for your connector to work need to go to `MAIN_REQUIREMENTS` list. -* required for the testing need to go to `TEST_REQUIREMENTS` list + +- required for your connector to work need to go to `MAIN_REQUIREMENTS` list. +- required for the testing need to go to `TEST_REQUIREMENTS` list ### Publishing a new version of the connector + You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? + 1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=destination-langchain test` 2. Bump the connector version in `metadata.yaml`: increment the `dockerImageTag` value. Please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors). 3. Make sure the `metadata.yaml` content is up to date. @@ -96,4 +114,3 @@ You've checked out the repo, implemented a million dollar feature, and you're re 5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention). 6. Pat yourself on the back for being an awesome contributor. 7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. - diff --git a/airbyte-integrations/connectors/destination-langchain/bootstrap.md b/airbyte-integrations/connectors/destination-langchain/bootstrap.md index 2554eaf1caf..3a3135af938 100644 --- a/airbyte-integrations/connectors/destination-langchain/bootstrap.md +++ b/airbyte-integrations/connectors/destination-langchain/bootstrap.md @@ -1,9 +1,10 @@ # Langchain Destination Connector Bootstrap This destination does three things: -* Split records into chunks and separates metadata from text data -* Embeds text data into an embedding vector -* Stores the metadata and embedding vector in a vector database + +- Split records into chunks and separates metadata from text data +- Embeds text data into an embedding vector +- Stores the metadata and embedding vector in a vector database The record processing is using the text split components from https://python.langchain.com/docs/modules/data_connection/document_transformers/. @@ -27,4 +28,4 @@ The DocArrayHnswSearch integration is storing the vector metadata in a local fil The Chroma integration is storing the vector metadata in a local file in the local root (`/local` in the container, `/tmp/airbyte_local` on the host), similar to the DocArrayHnswSearch. This is called the "persistent client" mode in Chroma. The integration is mostly using langchains abstraction, but it can also dedupe records and reset streams independently. -You can use the `test_local.py` file to check whether the pipeline works as expected. \ No newline at end of file +You can use the `test_local.py` file to check whether the pipeline works as expected. 
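To make the split/embed/store steps above concrete, here is a rough, illustrative sketch of the chunking step using langchain's text splitter (the same family of components linked above). The record shape, field names, and chunk sizes are invented for the example; the real connector derives them from its spec and configuration.

```python
# Illustrative only: split a record's text fields into chunks and carry the
# remaining fields along as metadata, mirroring what the destination does conceptually.
from langchain.text_splitter import RecursiveCharacterTextSplitter

record = {"title": "Getting started", "body": "a long text field " * 200, "author": "jane"}
text_fields = ["body"]  # which fields hold text to embed (placeholder)

metadata = {k: v for k, v in record.items() if k not in text_fields}
splitter = RecursiveCharacterTextSplitter(chunk_size=1000, chunk_overlap=100)

chunks = [
    {"text": chunk, "metadata": metadata}
    for field in text_fields
    for chunk in splitter.split_text(str(record[field]))
]
# Each chunk would then be embedded and written to the configured vector store.
print(len(chunks), chunks[0]["metadata"])
```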
diff --git a/airbyte-integrations/connectors/destination-meilisearch/README.md b/airbyte-integrations/connectors/destination-meilisearch/README.md index 207e2898208..1eb33b5d9aa 100644 --- a/airbyte-integrations/connectors/destination-meilisearch/README.md +++ b/airbyte-integrations/connectors/destination-meilisearch/README.md @@ -6,22 +6,27 @@ For information about how to use this connector within Airbyte, see [the documen ## Local development ### Prerequisites + **To iterate on this connector, make sure to complete this prerequisites section.** #### Minimum Python version required `= 3.7.0` #### Build & Activate Virtual Environment and install dependencies + From this connector directory, create a virtual environment: + ``` python -m venv .venv ``` This will generate a virtualenv for this module in `.venv/`. Make sure this venv is active in your development environment of choice. To activate it from the terminal, run: + ``` source .venv/bin/activate pip install -r requirements.txt ``` + If you are in an IDE, follow your IDE's instructions to activate the virtualenv. Note that while we are installing dependencies from `requirements.txt`, you should only edit `setup.py` for your dependencies. `requirements.txt` is @@ -30,6 +35,7 @@ If this is mumbo jumbo to you, don't worry about it, just put your deps in `setu should work as you expect. #### Create credentials + **If you are a community contributor**, follow the instructions in the [documentation](https://docs.airbyte.io/integrations/destinations/meilisearch) to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `destination_meilisearch/spec.json` file. Note that the `secrets` directory is gitignored by default, so there is no danger of accidentally checking in sensitive information. @@ -39,6 +45,7 @@ See `integration_tests/sample_config.json` for a sample config file. and place them into `secrets/config.json`. ### Locally running the connector + ``` python main.py spec python main.py check --config secrets/config.json @@ -48,9 +55,10 @@ python main.py read --config secrets/config.json --catalog integration_tests/con ### Locally running the connector docker image - #### Build + **Via [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md) (recommended):** + ```bash airbyte-ci connectors --name=destination-meilisearch build ``` @@ -58,12 +66,15 @@ airbyte-ci connectors --name=destination-meilisearch build An image will be built with the tag `airbyte/destination-meilisearch:dev`. **Via `docker build`:** + ```bash docker build -t airbyte/destination-meilisearch:dev . ``` #### Run + Then run any of the connector commands as follows: + ``` docker run --rm airbyte/destination-meilisearch:dev spec docker run --rm -v $(pwd)/secrets:/secrets airbyte/destination-meilisearch:dev check --config /secrets/config.json @@ -72,23 +83,30 @@ cat messages.jsonl | docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integr ``` ## Testing + You can run our full test suite locally using [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md): + ```bash airbyte-ci connectors --name=destination-meilisearch test ``` ### Customizing acceptance Tests + Customize `acceptance-test-config.yml` file to configure tests. See [Connector Acceptance Tests](https://docs.airbyte.com/connector-development/testing-connectors/connector-acceptance-tests-reference) for more information. 
If your connector requires to create or destroy resources for use during acceptance tests create fixtures for it and place them inside integration_tests/acceptance.py. ## Dependency Management + All of your dependencies should go in `setup.py`, NOT `requirements.txt`. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development. We split dependencies between two groups, dependencies that are: -* required for your connector to work need to go to `MAIN_REQUIREMENTS` list. -* required for the testing need to go to `TEST_REQUIREMENTS` list + +- required for your connector to work need to go to `MAIN_REQUIREMENTS` list. +- required for the testing need to go to `TEST_REQUIREMENTS` list ### Publishing a new version of the connector + You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? + 1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=destination-meilisearch test` 2. Bump the connector version in `metadata.yaml`: increment the `dockerImageTag` value. Please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors). 3. Make sure the `metadata.yaml` content is up to date. @@ -96,4 +114,3 @@ You've checked out the repo, implemented a million dollar feature, and you're re 5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention). 6. Pat yourself on the back for being an awesome contributor. 7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. - diff --git a/airbyte-integrations/connectors/destination-milvus/README.md b/airbyte-integrations/connectors/destination-milvus/README.md index b37491365c3..57d9133b37f 100644 --- a/airbyte-integrations/connectors/destination-milvus/README.md +++ b/airbyte-integrations/connectors/destination-milvus/README.md @@ -6,17 +6,21 @@ For information about how to use this connector within Airbyte, see [the documen ## Local development ### Prerequisites + **To iterate on this connector, make sure to complete this prerequisites section.** #### Minimum Python version required `= 3.9.0` ### Installing the connector + From this connector directory, run: + ```bash poetry install --with dev ``` #### Create credentials + **If you are a community contributor**, follow the instructions in the [documentation](https://docs.airbyte.com/integrations/destinations/milvus) to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `destination_langchain/spec.json` file. Note that the `secrets` directory is gitignored by default, so there is no danger of accidentally checking in sensitive information. @@ -26,6 +30,7 @@ See `integration_tests/sample_config.json` for a sample config file. and place them into `secrets/config.json`. ### Locally running the connector + ``` python main.py spec python main.py check --config secrets/config.json @@ -34,8 +39,8 @@ python main.py write --config secrets/config.json --catalog integration_tests/co ### Locally running the connector docker image - #### Use `airbyte-ci` to build your connector + The Airbyte way of building this connector is to use our `airbyte-ci` tool. 
You can follow install instructions [here](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md#L1). Then running the following command will build your connector: @@ -43,15 +48,18 @@ Then running the following command will build your connector: ```bash airbyte-ci connectors --name=destination-milvus build ``` + Once the command is done, you will find your connector image in your local docker registry: `airbyte/destination-milvus:dev`. ##### Customizing our build process + When contributing on our connector you might need to customize the build process to add a system dependency or set an env var. You can customize our build process by adding a `build_customization.py` module to your connector. This module should contain a `pre_connector_install` and `post_connector_install` async function that will mutate the base image and the connector container respectively. It will be imported at runtime by our build process and the functions will be called if they exist. Here is an example of a `build_customization.py` module: + ```python from __future__ import annotations @@ -71,6 +79,7 @@ async def post_connector_install(connector_container: Container) -> Container: ``` #### Build your own connector image + This connector is built using our dynamic built process in `airbyte-ci`. The base image used to build it is defined within the metadata.yaml file under the `connectorBuildOptions`. The build logic is defined using [Dagger](https://dagger.io/) [here](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/pipelines/builds/python_connectors.py). @@ -79,6 +88,7 @@ It does not rely on a Dockerfile. If you would like to patch our connector and build your own a simple approach would be to: 1. Create your own Dockerfile based on the latest version of the connector image. + ```Dockerfile FROM airbyte/destination-milvus:latest @@ -89,16 +99,21 @@ RUN pip install ./airbyte/integration_code # ENV AIRBYTE_ENTRYPOINT "python /airbyte/integration_code/main.py" # ENTRYPOINT ["python", "/airbyte/integration_code/main.py"] ``` + Please use this as an example. This is not optimized. 2. Build your image: + ```bash docker build -t airbyte/destination-milvus:dev . # Running the spec command against your patched connector docker run airbyte/destination-milvus:dev spec ``` + #### Run + Then run any of the connector commands as follows: + ``` docker run --rm airbyte/destination-langchain:dev spec docker run --rm -v $(pwd)/secrets:/secrets airbyte/destination-langchain:dev check --config /secrets/config.json @@ -107,35 +122,46 @@ cat messages.jsonl | docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integr ``` ## Testing + You can run our full test suite locally using [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md): + ```bash airbyte-ci connectors --name=destination-milvus test ``` ### Unit Tests + To run unit tests locally, from the connector directory run: + ``` poetry run pytest -s unit_tests ``` ### Integration Tests + To run integration tests locally, make sure you have a secrets/config.json as explained above, and then run: + ``` poetry run pytest -s integration_tests -``` +``` ### Customizing acceptance Tests + Customize `acceptance-test-config.yml` file to configure tests. See [Connector Acceptance Tests](https://docs.airbyte.com/connector-development/testing-connectors/connector-acceptance-tests-reference) for more information. 
If your connector requires to create or destroy resources for use during acceptance tests create fixtures for it and place them inside integration_tests/acceptance.py. ## Dependency Management + All of your dependencies should go in `setup.py`, NOT `requirements.txt`. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development. We split dependencies between two groups, dependencies that are: -* required for your connector to work need to go to `MAIN_REQUIREMENTS` list. -* required for the testing need to go to `TEST_REQUIREMENTS` list + +- required for your connector to work need to go to `MAIN_REQUIREMENTS` list. +- required for the testing need to go to `TEST_REQUIREMENTS` list ### Publishing a new version of the connector + You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? + 1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=destination-milvus test` 2. Bump the connector version in `metadata.yaml`: increment the `dockerImageTag` value. Please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors). 3. Make sure the `metadata.yaml` content is up to date. @@ -143,4 +169,3 @@ You've checked out the repo, implemented a million dollar feature, and you're re 5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention). 6. Pat yourself on the back for being an awesome contributor. 7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. - diff --git a/airbyte-integrations/connectors/destination-milvus/bootstrap.md b/airbyte-integrations/connectors/destination-milvus/bootstrap.md index 02a2cee7d2a..3ff5bb91cf6 100644 --- a/airbyte-integrations/connectors/destination-milvus/bootstrap.md +++ b/airbyte-integrations/connectors/destination-milvus/bootstrap.md @@ -1,9 +1,10 @@ # Milvus Destination Connector Bootstrap This destination does three things: -* Split records into chunks and separates metadata from text data -* Embeds text data into an embedding vector -* Stores the metadata and embedding vector in a vector database + +- Split records into chunks and separates metadata from text data +- Embeds text data into an embedding vector +- Stores the metadata and embedding vector in a vector database The record processing is using the text split components from https://python.langchain.com/docs/modules/data_connection/document_transformers/. diff --git a/airbyte-integrations/connectors/destination-mongodb-strict-encrypt/README.md b/airbyte-integrations/connectors/destination-mongodb-strict-encrypt/README.md index 7f0c78e9808..1fd26972b8c 100644 --- a/airbyte-integrations/connectors/destination-mongodb-strict-encrypt/README.md +++ b/airbyte-integrations/connectors/destination-mongodb-strict-encrypt/README.md @@ -10,16 +10,16 @@ As a community contributor, you will need access to a MongoDB to run tests. 2. Go to the `Database Access` page and add new database user with read and write permissions 3. Add new database with default collection 4. 
Add host, port or cluster_url, database name, username and password to `secrets/credentials.json` file - ``` - { - "database": "database_name", - "user": "user", - "password": "password", - "cluster_url": "cluster_url", - "host": "host", - "port": "port" - } - ``` + ``` + { + "database": "database_name", + "user": "user", + "password": "password", + "cluster_url": "cluster_url", + "host": "host", + "port": "port" + } + ``` ## Airbyte Employee diff --git a/airbyte-integrations/connectors/destination-pinecone/README.md b/airbyte-integrations/connectors/destination-pinecone/README.md index 6ea64e53430..2b24e7d288a 100644 --- a/airbyte-integrations/connectors/destination-pinecone/README.md +++ b/airbyte-integrations/connectors/destination-pinecone/README.md @@ -5,17 +5,21 @@ This is the repository for the Pinecone destination connector, written in Python ## Local development ### Prerequisites + **To iterate on this connector, make sure to complete this prerequisites section.** #### Minimum Python version required `= 3.9.0` ### Installing the connector + From this connector directory, run: + ```bash poetry install --with dev ``` #### Create credentials + **If you are a community contributor**, follow the instructions in the [documentation](https://docs.airbyte.com/integrations/destinations/pinecone) to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `destination_pinecone/spec.json` file. Note that the `secrets` directory is gitignored by default, so there is no danger of accidentally checking in sensitive information. @@ -25,6 +29,7 @@ See `integration_tests/sample_config.json` for a sample config file. and place them into `secrets/config.json`. ### Locally running the connector + ``` python main.py spec python main.py check --config secrets/config.json @@ -33,8 +38,8 @@ python main.py write --config secrets/config.json --catalog integration_tests/co ### Locally running the connector docker image - #### Use `airbyte-ci` to build your connector + The Airbyte way of building this connector is to use our `airbyte-ci` tool. You can follow install instructions [here](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md#L1). Then running the following command will build your connector: @@ -42,15 +47,18 @@ Then running the following command will build your connector: ```bash airbyte-ci connectors --name=destination-pinecone build ``` + Once the command is done, you will find your connector image in your local docker registry: `airbyte/destination-pinecone:dev`. ##### Customizing our build process + When contributing on our connector you might need to customize the build process to add a system dependency or set an env var. You can customize our build process by adding a `build_customization.py` module to your connector. This module should contain a `pre_connector_install` and `post_connector_install` async function that will mutate the base image and the connector container respectively. It will be imported at runtime by our build process and the functions will be called if they exist. Here is an example of a `build_customization.py` module: + ```python from __future__ import annotations @@ -70,6 +78,7 @@ async def post_connector_install(connector_container: Container) -> Container: ``` #### Build your own connector image + This connector is built using our dynamic built process in `airbyte-ci`. The base image used to build it is defined within the metadata.yaml file under the `connectorBuildOptions`. 
The build logic is defined using [Dagger](https://dagger.io/) [here](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/pipelines/builds/python_connectors.py). @@ -78,6 +87,7 @@ It does not rely on a Dockerfile. If you would like to patch our connector and build your own a simple approach would be to: 1. Create your own Dockerfile based on the latest version of the connector image. + ```Dockerfile FROM airbyte/destination-pinecone:latest @@ -88,16 +98,21 @@ RUN pip install ./airbyte/integration_code # ENV AIRBYTE_ENTRYPOINT "python /airbyte/integration_code/main.py" # ENTRYPOINT ["python", "/airbyte/integration_code/main.py"] ``` + Please use this as an example. This is not optimized. 2. Build your image: + ```bash docker build -t airbyte/destination-pinecone:dev . # Running the spec command against your patched connector docker run airbyte/destination-pinecone:dev spec ``` + #### Run + Then run any of the connector commands as follows: + ``` docker run --rm airbyte/destination-pinecone:dev spec docker run --rm -v $(pwd)/secrets:/secrets airbyte/destination-pinecone:dev check --config /secrets/config.json @@ -106,35 +121,46 @@ cat messages.jsonl | docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integr ``` ## Testing + You can run our full test suite locally using [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md): + ```bash airbyte-ci connectors --name=destination-pinecone test ``` ### Unit Tests + To run unit tests locally, from the connector directory run: + ``` poetry run pytest -s unit_tests ``` ### Integration Tests + To run integration tests locally, make sure you have a secrets/config.json as explained above, and then run: + ``` poetry run pytest -s integration_tests -``` +``` ### Customizing acceptance Tests + Customize `acceptance-test-config.yml` file to configure tests. See [Connector Acceptance Tests](https://docs.airbyte.com/connector-development/testing-connectors/connector-acceptance-tests-reference) for more information. If your connector requires to create or destroy resources for use during acceptance tests create fixtures for it and place them inside integration_tests/acceptance.py. ## Dependency Management + All of your dependencies should go in `setup.py`, NOT `requirements.txt`. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development. We split dependencies between two groups, dependencies that are: -* required for your connector to work need to go to `MAIN_REQUIREMENTS` list. -* required for the testing need to go to `TEST_REQUIREMENTS` list + +- required for your connector to work need to go to `MAIN_REQUIREMENTS` list. +- required for the testing need to go to `TEST_REQUIREMENTS` list ### Publishing a new version of the connector + You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? + 1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=destination-pinecone test` 2. Bump the connector version in `metadata.yaml`: increment the `dockerImageTag` value. Please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors). 3. Make sure the `metadata.yaml` content is up to date. @@ -142,4 +168,3 @@ You've checked out the repo, implemented a million dollar feature, and you're re 5. 
Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention). 6. Pat yourself on the back for being an awesome contributor. 7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. - diff --git a/airbyte-integrations/connectors/destination-pinecone/bootstrap.md b/airbyte-integrations/connectors/destination-pinecone/bootstrap.md index cd6d535124d..f7e05d59394 100644 --- a/airbyte-integrations/connectors/destination-pinecone/bootstrap.md +++ b/airbyte-integrations/connectors/destination-pinecone/bootstrap.md @@ -1,8 +1,9 @@ # Pinecone Destination Connector Bootstrap This destination does three things: -* Split records into chunks and separates metadata from text data -* Embeds text data into an embedding vector -* Stores the metadata and embedding vector in Pinecone -The record processing is using the text split components from https://python.langchain.com/docs/modules/data_connection/document_transformers/. \ No newline at end of file +- Split records into chunks and separates metadata from text data +- Embeds text data into an embedding vector +- Stores the metadata and embedding vector in Pinecone + +The record processing is using the text split components from https://python.langchain.com/docs/modules/data_connection/document_transformers/. diff --git a/airbyte-integrations/connectors/destination-qdrant/README.md b/airbyte-integrations/connectors/destination-qdrant/README.md index 45a3f2ff188..61db40715fa 100644 --- a/airbyte-integrations/connectors/destination-qdrant/README.md +++ b/airbyte-integrations/connectors/destination-qdrant/README.md @@ -6,17 +6,21 @@ For information about how to use this connector within Airbyte, see [the documen ## Local development ### Prerequisites + **To iterate on this connector, make sure to complete this prerequisites section.** #### Minimum Python version required `= 3.10.0` ### Installing the connector + From this connector directory, run: + ```bash poetry install --with dev ``` #### Create credentials + **If you are a community contributor**, follow the instructions in the [documentation](https://docs.airbyte.com/integrations/destinations/qdrant) to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `destination_qdrant/spec.json` file. Note that the `secrets` directory is gitignored by default, so there is no danger of accidentally checking in sensitive information. @@ -26,6 +30,7 @@ See `integration_tests/sample_config.json` for a sample config file. and place them into `secrets/config.json`. ### Locally running the connector + ``` python main.py spec python main.py check --config secrets/config.json @@ -34,9 +39,10 @@ python main.py write --config secrets/config.json --catalog integration_tests/co ### Locally running the connector docker image - #### Build + **Via [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md) (recommended):** + ```bash airbyte-ci connectors --name=destination-qdrant build ``` @@ -44,12 +50,15 @@ airbyte-ci connectors --name=destination-qdrant build An image will be built with the tag `airbyte/destination-qdrant:dev`. **Via `docker build`:** + ```bash docker build -t airbyte/destination-qdrant:dev . 
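# Optional sanity check (not part of the documented steps): confirm the freshly
# built image responds by printing its spec.
docker run --rm airbyte/destination-qdrant:dev spec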
``` #### Run + Then run any of the connector commands as follows: + ``` docker run --rm airbyte/destination-qdrant:dev spec docker run --rm -v $(pwd)/secrets:/secrets airbyte/destination-qdrant:dev check --config /secrets/config.json @@ -58,35 +67,46 @@ cat messages.jsonl | docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integr ``` ## Testing + You can run our full test suite locally using [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md): + ```bash airbyte-ci connectors --name=destination-qdrant test ``` ### Unit Tests + To run unit tests locally, from the connector directory run: + ``` poetry run pytest -s unit_tests ``` ### Integration Tests + To run integration tests locally, make sure you have a secrets/config.json as explained above, and then run: + ``` poetry run pytest -s integration_tests -``` +``` ### Customizing acceptance Tests + Customize `acceptance-test-config.yml` file to configure tests. See [Connector Acceptance Tests](https://docs.airbyte.com/connector-development/testing-connectors/connector-acceptance-tests-reference) for more information. If your connector requires to create or destroy resources for use during acceptance tests create fixtures for it and place them inside integration_tests/acceptance.py. ## Dependency Management + All of your dependencies should go in `setup.py`, NOT `requirements.txt`. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development. We split dependencies between two groups, dependencies that are: -* required for your connector to work need to go to `MAIN_REQUIREMENTS` list. -* required for the testing need to go to `TEST_REQUIREMENTS` list + +- required for your connector to work need to go to `MAIN_REQUIREMENTS` list. +- required for the testing need to go to `TEST_REQUIREMENTS` list ### Publishing a new version of the connector + You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? + 1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=destination-qdrant test` 2. Bump the connector version in `metadata.yaml`: increment the `dockerImageTag` value. Please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors). 3. Make sure the `metadata.yaml` content is up to date. @@ -94,4 +114,3 @@ You've checked out the repo, implemented a million dollar feature, and you're re 5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention). 6. Pat yourself on the back for being an awesome contributor. 7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. 
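The acceptance-test note above points contributors at `integration_tests/acceptance.py` without showing what goes in it. Here is a minimal sketch, following the usual Airbyte convention of a session-scoped `connector_setup` fixture; the setup and teardown bodies are placeholders, not this connector's actual requirements:

```python
# integration_tests/acceptance.py -- minimal sketch, not the connector's real file.
import pytest

# Registers the connector acceptance test plugin (standard Airbyte convention).
pytest_plugins = ("connector_acceptance_test.plugin",)


@pytest.fixture(scope="session", autouse=True)
def connector_setup():
    """Create any external resources the acceptance tests need, then tear them down."""
    # Placeholder: create a temporary collection / index / bucket here.
    yield
    # Placeholder: delete the temporary resources here.
```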
- diff --git a/airbyte-integrations/connectors/destination-rabbitmq/README.md b/airbyte-integrations/connectors/destination-rabbitmq/README.md index f6952028a51..0cdf8aff02b 100644 --- a/airbyte-integrations/connectors/destination-rabbitmq/README.md +++ b/airbyte-integrations/connectors/destination-rabbitmq/README.md @@ -6,22 +6,27 @@ For information about how to use this connector within Airbyte, see [the documen ## Local development ### Prerequisites + **To iterate on this connector, make sure to complete this prerequisites section.** #### Minimum Python version required `= 3.7.0` #### Build & Activate Virtual Environment and install dependencies + From this connector directory, create a virtual environment: + ``` python -m venv .venv ``` This will generate a virtualenv for this module in `.venv/`. Make sure this venv is active in your development environment of choice. To activate it from the terminal, run: + ``` source .venv/bin/activate pip install -r requirements.txt ``` + If you are in an IDE, follow your IDE's instructions to activate the virtualenv. Note that while we are installing dependencies from `requirements.txt`, you should only edit `setup.py` for your dependencies. `requirements.txt` is @@ -30,6 +35,7 @@ If this is mumbo jumbo to you, don't worry about it, just put your deps in `setu should work as you expect. #### Create credentials + **If you are a community contributor**, follow the instructions in the [documentation](https://docs.airbyte.io/integrations/destinations/rabbitmq) to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `destination_rabbitmq/spec.json` file. Note that the `secrets` directory is gitignored by default, so there is no danger of accidentally checking in sensitive information. @@ -39,6 +45,7 @@ See `integration_tests/sample_config.json` for a sample config file. and place them into `secrets/config.json`. ### Locally running the connector + ``` python main.py spec python main.py check --config secrets/config.json @@ -48,9 +55,10 @@ python main.py read --config secrets/config.json --catalog integration_tests/con ### Locally running the connector docker image - #### Build + **Via [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md) (recommended):** + ```bash airbyte-ci connectors --name=destination-rabbitmq build ``` @@ -58,12 +66,15 @@ airbyte-ci connectors --name=destination-rabbitmq build An image will be built with the tag `airbyte/destination-rabbitmq:dev`. **Via `docker build`:** + ```bash docker build -t airbyte/destination-rabbitmq:dev . ``` #### Run + Then run any of the connector commands as follows: + ``` docker run --rm airbyte/destination-rabbitmq:dev spec docker run --rm -v $(pwd)/secrets:/secrets airbyte/destination-rabbitmq:dev check --config /secrets/config.json @@ -72,23 +83,30 @@ cat messages.jsonl | docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integr ``` ## Testing + You can run our full test suite locally using [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md): + ```bash airbyte-ci connectors --name=destination-rabbitmq test ``` ### Customizing acceptance Tests + Customize `acceptance-test-config.yml` file to configure tests. See [Connector Acceptance Tests](https://docs.airbyte.com/connector-development/testing-connectors/connector-acceptance-tests-reference) for more information. 
If your connector requires to create or destroy resources for use during acceptance tests create fixtures for it and place them inside integration_tests/acceptance.py. ## Dependency Management + All of your dependencies should go in `setup.py`, NOT `requirements.txt`. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development. We split dependencies between two groups, dependencies that are: -* required for your connector to work need to go to `MAIN_REQUIREMENTS` list. -* required for the testing need to go to `TEST_REQUIREMENTS` list + +- required for your connector to work need to go to `MAIN_REQUIREMENTS` list. +- required for the testing need to go to `TEST_REQUIREMENTS` list ### Publishing a new version of the connector + You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? + 1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=destination-rabbitmq test` 2. Bump the connector version in `metadata.yaml`: increment the `dockerImageTag` value. Please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors). 3. Make sure the `metadata.yaml` content is up to date. @@ -96,4 +114,3 @@ You've checked out the repo, implemented a million dollar feature, and you're re 5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention). 6. Pat yourself on the back for being an awesome contributor. 7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. - diff --git a/airbyte-integrations/connectors/destination-redis/README.md b/airbyte-integrations/connectors/destination-redis/README.md index ab09827ef20..cfc6a67dca9 100644 --- a/airbyte-integrations/connectors/destination-redis/README.md +++ b/airbyte-integrations/connectors/destination-redis/README.md @@ -6,12 +6,15 @@ For information about how to use this connector within Airbyte, see [the User Do ## Local development #### Building via Gradle + From the Airbyte repository root, run: + ``` ./gradlew :airbyte-integrations:connectors:destination-redis:build ``` #### Create credentials + **If you are a community contributor**, generate the necessary credentials and place them in `secrets/config.json` conforming to the spec file in `src/main/resources/spec.json`. Note that the `secrets` directory is git-ignored by default, so there is no danger of accidentally checking in sensitive information. @@ -20,16 +23,20 @@ Note that the `secrets` directory is git-ignored by default, so there is no dang ### Locally running the connector docker image #### Build + Build the connector image via Gradle: ``` ./gradlew :airbyte-integrations:connectors:destination-redis:buildConnectorImage ``` + Once built, the docker image name and tag on your host will be `airbyte/destination-redis:dev`. the Dockerfile. #### Run + Then run any of the connector commands as follows: + ``` docker run --rm airbyte/destination-redis:dev spec docker run --rm -v $(pwd)/secrets:/secrets airbyte/destination-redis:dev check --config /secrets/config.json @@ -38,22 +45,29 @@ docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integrat ``` ## Testing + We use `JUnit` for Java tests. 
### Unit and Integration Tests + Place unit tests under `src/test/io/airbyte/integrations/destinations/redis`. #### Acceptance Tests + Airbyte has a standard test suite that all destination connectors must pass. Implement the `TODO`s in `src/test-integration/java/io/airbyte/integrations/destinations/redisDestinationAcceptanceTest.java`. ### Using gradle to run tests + All commands should be run from airbyte project root. To run unit tests: + ``` ./gradlew :airbyte-integrations:connectors:destination-redis:unitTest ``` + To run acceptance and custom integration tests: + ``` ./gradlew :airbyte-integrations:connectors:destination-redis:integrationTest ``` @@ -61,7 +75,9 @@ To run acceptance and custom integration tests: ## Dependency Management ### Publishing a new version of the connector + You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? + 1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=destination-redis test` 2. Bump the connector version in `metadata.yaml`: increment the `dockerImageTag` value. Please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors). 3. Make sure the `metadata.yaml` content is up to date. @@ -69,4 +85,3 @@ You've checked out the repo, implemented a million dollar feature, and you're re 5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention). 6. Pat yourself on the back for being an awesome contributor. 7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. - diff --git a/airbyte-integrations/connectors/destination-redis/bootstrap.md b/airbyte-integrations/connectors/destination-redis/bootstrap.md index bcc91121e0f..b0b28ef0200 100644 --- a/airbyte-integrations/connectors/destination-redis/bootstrap.md +++ b/airbyte-integrations/connectors/destination-redis/bootstrap.md @@ -1,18 +1,17 @@ # Redis Destination -Redis is an open source (BSD licensed), in-memory data structure store, used as a database, cache, pub/sub and message broker. -Redis provides data structures such as strings, hashes, lists, sets, sorted sets with range queries, bitmaps, hyperloglogs, geospatial indexes, and streams. +Redis is an open source (BSD licensed), in-memory data structure store, used as a database, cache, pub/sub and message broker. +Redis provides data structures such as strings, hashes, lists, sets, sorted sets with range queries, bitmaps, hyperloglogs, geospatial indexes, and streams. Redis has built-in replication, Lua scripting, LRU eviction, transactions, and different levels of on-disk persistence. To achieve top performance, Redis works with an in-memory dataset. Depending on your use case, you can persist your data either by periodically dumping the dataset to disk or by appending each command to a disk-based log. You can also disable persistence if you just need a feature-rich, networked, in-memory cache. [Read more about Redis](https://redis.io/) - This connector maps an incoming Airbyte namespace and stream to a different key in the Redis data structure. The connector supports the `append` sync mode by adding keys to an existing keyset and `overwrite` by deleting the existing ones and replacing them with the new ones. 
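To make the key mapping concrete, here is an illustrative sketch in Python using the `redis` client. It only demonstrates the namespace/stream-to-hash idea and the two sync modes; it is not the connector's code, and the key layout is an assumption (the actual implementation, described next, is written in Java on top of Jedis):

```python
# Illustration only: namespace/stream -> Redis hash, with append vs. overwrite.
# The key format below is assumed for the example, not the connector's exact scheme.
import json

import redis

client = redis.Redis(host="localhost", port=6379)


def key_for(namespace: str, stream: str) -> str:
    return f"{namespace}:{stream}"


def overwrite_stream(namespace: str, stream: str) -> None:
    # "overwrite" sync mode: drop the existing keyset before new records arrive.
    client.delete(key_for(namespace, stream))


def append_record(namespace: str, stream: str, record_id: str, record: dict) -> None:
    # "append" sync mode: add the record to the stream's hash.
    client.hset(key_for(namespace, stream), record_id, json.dumps(record))
```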
The implementation uses the [Jedis](https://github.com/redis/jedis) java client to access the Redis cache. [RedisCache](./src/main/java/io/airbyte/integrations/destination/redis/RedisCache.java) is the main entrypoint for defining operations that can be performed against Redis. The interface allows you to implement any Redis supported data type for storing data based on your needs. -At the moment there is only one implementation [RedisHCache](./src/main/java/io/airbyte/integrations/destination/redis/RedisHCache.java) which stores the incoming messages in a Hash structure. Internally it uses a Jedis instance retrieved from the +At the moment there is only one implementation [RedisHCache](./src/main/java/io/airbyte/integrations/destination/redis/RedisHCache.java) which stores the incoming messages in a Hash structure. Internally it uses a Jedis instance retrieved from the [RedisPoolManager](./src/main/java/io/airbyte/integrations/destination/redis/RedisPoolManager.java). Retrieve records from the Redis cache are mapped to [RedisRecord](./src/main/java/io/airbyte/integrations/destination/redis/RedisRecord.java) The [RedisMessageConsumer](./src/main/java/io/airbyte/integrations/destination/redis/RedisMessageConsumer.java) @@ -22,4 +21,4 @@ class contains the logic for handling airbyte messages and storing them in Redis See the [RedisHCache](./src/main/java/io/airbyte/integrations/destination/redis/RedisHCache.java) class for an example on how to use the Jedis client for accessing the Redis cache. -If you want to learn more, read the [Jedis docs](https://github.com/redis/jedis/wiki) \ No newline at end of file +If you want to learn more, read the [Jedis docs](https://github.com/redis/jedis/wiki) diff --git a/airbyte-integrations/connectors/destination-redshift/README.md b/airbyte-integrations/connectors/destination-redshift/README.md index ecea733473a..15eb4f96e20 100644 --- a/airbyte-integrations/connectors/destination-redshift/README.md +++ b/airbyte-integrations/connectors/destination-redshift/README.md @@ -22,5 +22,5 @@ Consult the integration test area for Redshift. The actual secrets for integration tests can be found in Google Cloud Secrets Manager. Search on redshift for the labels: -- SECRET_DESTINATION-REDSHIFT__CREDS - used for Standard tests. (__config.json__) -- SECRET_DESTINATION-REDSHIFT_STAGING__CREDS - used for S3 Staging tests. (__config_staging.json__) +- SECRET_DESTINATION-REDSHIFT**CREDS - used for Standard tests. (**config.json\_\_) +- SECRET_DESTINATION-REDSHIFT_STAGING**CREDS - used for S3 Staging tests. (**config_staging.json\_\_) diff --git a/airbyte-integrations/connectors/destination-s3-glue/README.md b/airbyte-integrations/connectors/destination-s3-glue/README.md index 7ac2f084fc6..4a401c65254 100644 --- a/airbyte-integrations/connectors/destination-s3-glue/README.md +++ b/airbyte-integrations/connectors/destination-s3-glue/README.md @@ -6,12 +6,15 @@ For information about how to use this connector within Airbyte, see [the User Do ## Local development #### Building via Gradle + From the Airbyte repository root, run: + ``` ./gradlew :airbyte-integrations:connectors:destination-s3-glue:build ``` #### Create credentials + **If you are a community contributor**, generate the necessary credentials and place them in `secrets/config.json` conforming to the spec file in `src/main/resources/spec.json`. Note that the `secrets` directory is git-ignored by default, so there is no danger of accidentally checking in sensitive information. 
@@ -20,16 +23,20 @@ Note that the `secrets` directory is git-ignored by default, so there is no dang ### Locally running the connector docker image #### Build + Build the connector image via Gradle: ``` ./gradlew :airbyte-integrations:connectors:destination-s3-glue:buildConnectorImage ``` + Once built, the docker image name and tag on your host will be `airbyte/destination-s3-glue:dev`. the Dockerfile. #### Run + Then run any of the connector commands as follows: + ``` docker run --rm airbyte/destination-s3-glue:dev spec docker run --rm -v $(pwd)/secrets:/secrets airbyte/destination-s3-glue:dev check --config /secrets/config.json @@ -38,22 +45,29 @@ docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integrat ``` ## Testing + We use `JUnit` for Java tests. ### Unit and Integration Tests + Place unit tests under `src/test/io/airbyte/integrations/destinations/s3_glue`. #### Acceptance Tests + Airbyte has a standard test suite that all destination connectors must pass. Implement the `TODO`s in `src/test-integration/java/io/airbyte/integrations/destinations/s3_glueDestinationAcceptanceTest.java`. ### Using gradle to run tests + All commands should be run from airbyte project root. To run unit tests: + ``` ./gradlew :airbyte-integrations:connectors:destination-s3-glue:unitTest ``` + To run acceptance and custom integration tests: + ``` ./gradlew :airbyte-integrations:connectors:destination-s3-glue:integrationTest ``` @@ -61,7 +75,9 @@ To run acceptance and custom integration tests: ## Dependency Management ### Publishing a new version of the connector + You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? + 1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=destination-s3-glue test` 2. Bump the connector version in `metadata.yaml`: increment the `dockerImageTag` value. Please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors). 3. Make sure the `metadata.yaml` content is up to date. @@ -69,4 +85,3 @@ You've checked out the repo, implemented a million dollar feature, and you're re 5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention). 6. Pat yourself on the back for being an awesome contributor. 7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. - diff --git a/airbyte-integrations/connectors/destination-s3/README.md b/airbyte-integrations/connectors/destination-s3/README.md index 967b83d8348..05b4c8d7975 100644 --- a/airbyte-integrations/connectors/destination-s3/README.md +++ b/airbyte-integrations/connectors/destination-s3/README.md @@ -19,6 +19,7 @@ As a community contributor, you will need access to AWS to run the integration t - Rename the directory from `sample_secrets` to `secrets`. ## Add New Output Format + - Add a new enum in `S3Format`. - Modify `spec.json` to specify the configuration of this new format. - Update `S3FormatConfigs` to be able to construct a config for this new format. 
diff --git a/airbyte-integrations/connectors/destination-sftp-json/README.md b/airbyte-integrations/connectors/destination-sftp-json/README.md index a584dd8a99b..34f47f2a33d 100644 --- a/airbyte-integrations/connectors/destination-sftp-json/README.md +++ b/airbyte-integrations/connectors/destination-sftp-json/README.md @@ -8,22 +8,27 @@ For information about how to use this connector within Airbyte, see [the documen ## Local development ### Prerequisites + **To iterate on this connector, make sure to complete this prerequisites section.** #### Minimum Python version required `= 3.7.0` #### Build & Activate Virtual Environment and install dependencies + From this connector directory, create a virtual environment: + ``` python -m venv .venv ``` This will generate a virtualenv for this module in `.venv/`. Make sure this venv is active in your development environment of choice. To activate it from the terminal, run: + ``` source .venv/bin/activate pip install -r requirements.txt ``` + If you are in an IDE, follow your IDE's instructions to activate the virtualenv. Note that while we are installing dependencies from `requirements.txt`, you should only edit `setup.py` for your dependencies. `requirements.txt` is @@ -32,6 +37,7 @@ If this is mumbo jumbo to you, don't worry about it, just put your deps in `setu should work as you expect. #### Create credentials + **If you are a community contributor**, follow the instructions in the [documentation](https://docs.airbyte.io/integrations/destinations/sftp-json) to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `destination_sftp_json/spec.json` file. Note that the `secrets` directory is gitignored by default, so there is no danger of accidentally checking in sensitive information. @@ -41,6 +47,7 @@ See `integration_tests/sample_config.json` for a sample config file. and place them into `secrets/config.json`. ### Locally running the connector + ``` python main.py spec python main.py check --config secrets/config.json @@ -50,9 +57,10 @@ python main.py read --config secrets/config.json --catalog integration_tests/con ### Locally running the connector docker image - #### Build + **Via [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md) (recommended):** + ```bash airbyte-ci connectors --name=destination-sftp-json build ``` @@ -60,12 +68,15 @@ airbyte-ci connectors --name=destination-sftp-json build An image will be built with the tag `airbyte/destination-sftp-json:dev`. **Via `docker build`:** + ```bash docker build -t airbyte/destination-sftp-json:dev . ``` #### Run + Then run any of the connector commands as follows: + ``` docker run --rm airbyte/destination-sftp-json:dev spec docker run --rm -v $(pwd)/secrets:/secrets airbyte/destination-sftp-json:dev check --config /secrets/config.json @@ -74,23 +85,30 @@ cat messages.jsonl | docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integr ``` ## Testing + You can run our full test suite locally using [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md): + ```bash airbyte-ci connectors --name=destination-sftp-json test ``` ### Customizing acceptance Tests + Customize `acceptance-test-config.yml` file to configure tests. See [Connector Acceptance Tests](https://docs.airbyte.com/connector-development/testing-connectors/connector-acceptance-tests-reference) for more information. 
If your connector requires to create or destroy resources for use during acceptance tests create fixtures for it and place them inside integration_tests/acceptance.py. ## Dependency Management + All of your dependencies should go in `setup.py`, NOT `requirements.txt`. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development. We split dependencies between two groups, dependencies that are: -* required for your connector to work need to go to `MAIN_REQUIREMENTS` list. -* required for the testing need to go to `TEST_REQUIREMENTS` list + +- required for your connector to work need to go to `MAIN_REQUIREMENTS` list. +- required for the testing need to go to `TEST_REQUIREMENTS` list ### Publishing a new version of the connector + You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? + 1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=destination-sftp-json test` 2. Bump the connector version in `metadata.yaml`: increment the `dockerImageTag` value. Please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors). 3. Make sure the `metadata.yaml` content is up to date. @@ -98,4 +116,3 @@ You've checked out the repo, implemented a million dollar feature, and you're re 5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention). 6. Pat yourself on the back for being an awesome contributor. 7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. - diff --git a/airbyte-integrations/connectors/destination-snowflake/README.md b/airbyte-integrations/connectors/destination-snowflake/README.md index 79de239617d..dbfac0e343d 100644 --- a/airbyte-integrations/connectors/destination-snowflake/README.md +++ b/airbyte-integrations/connectors/destination-snowflake/README.md @@ -95,7 +95,9 @@ DROP WAREHOUSE IF EXISTS INTEGRATION_TEST_WAREHOUSE_DESTINATION; ``` ### Setup for various error-case users: + Log in as the `INTEGRATION_TEST_USER_DESTINATION` user, and run this: + ```sql drop schema if exists INTEGRATION_TEST_DESTINATION.TEXT_SCHEMA; create schema INTEGRATION_TEST_DESTINATION.TEXT_SCHEMA; diff --git a/airbyte-integrations/connectors/destination-sqlite/README.md b/airbyte-integrations/connectors/destination-sqlite/README.md index 18e9e61a6ca..510426ad1b5 100644 --- a/airbyte-integrations/connectors/destination-sqlite/README.md +++ b/airbyte-integrations/connectors/destination-sqlite/README.md @@ -6,22 +6,27 @@ For information about how to use this connector within Airbyte, see [the documen ## Local development ### Prerequisites + **To iterate on this connector, make sure to complete this prerequisites section.** #### Minimum Python version required `= 3.7.0` #### Build & Activate Virtual Environment and install dependencies + From this connector directory, create a virtual environment: + ``` python -m venv .venv ``` This will generate a virtualenv for this module in `.venv/`. Make sure this venv is active in your development environment of choice. To activate it from the terminal, run: + ``` source .venv/bin/activate pip install -r requirements.txt ``` + If you are in an IDE, follow your IDE's instructions to activate the virtualenv. 
Note that while we are installing dependencies from `requirements.txt`, you should only edit `setup.py` for your dependencies. `requirements.txt` is @@ -30,6 +35,7 @@ If this is mumbo jumbo to you, don't worry about it, just put your deps in `setu should work as you expect. #### Create credentials + **If you are a community contributor**, follow the instructions in the [documentation](https://docs.airbyte.io/integrations/destinations/sqlite) to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `destination_sqlite/spec.json` file. Note that the `secrets` directory is gitignored by default, so there is no danger of accidentally checking in sensitive information. @@ -39,6 +45,7 @@ See `integration_tests/sample_config.json` for a sample config file. and place them into `secrets/config.json`. ### Locally running the connector + ``` python main.py spec python main.py check --config secrets/config.json @@ -48,9 +55,10 @@ python main.py read --config secrets/config.json --catalog integration_tests/con ### Locally running the connector docker image - #### Build + **Via [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md) (recommended):** + ```bash airbyte-ci connectors --name=destination-sqlite build ``` @@ -58,12 +66,15 @@ airbyte-ci connectors --name=destination-sqlite build An image will be built with the tag `airbyte/destination-sqlite:dev`. **Via `docker build`:** + ```bash docker build -t airbyte/destination-sqlite:dev . ``` #### Run + Then run any of the connector commands as follows: + ``` docker run --rm airbyte/destination-sqlite:dev spec docker run --rm -v $(pwd)/secrets:/secrets airbyte/destination-sqlite:dev check --config /secrets/config.json @@ -72,23 +83,30 @@ cat messages.jsonl | docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integr ``` ## Testing + You can run our full test suite locally using [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md): + ```bash airbyte-ci connectors --name=destination-sqlite test ``` ### Customizing acceptance Tests + Customize `acceptance-test-config.yml` file to configure tests. See [Connector Acceptance Tests](https://docs.airbyte.com/connector-development/testing-connectors/connector-acceptance-tests-reference) for more information. If your connector requires to create or destroy resources for use during acceptance tests create fixtures for it and place them inside integration_tests/acceptance.py. ## Dependency Management + All of your dependencies should go in `setup.py`, NOT `requirements.txt`. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development. We split dependencies between two groups, dependencies that are: -* required for your connector to work need to go to `MAIN_REQUIREMENTS` list. -* required for the testing need to go to `TEST_REQUIREMENTS` list + +- required for your connector to work need to go to `MAIN_REQUIREMENTS` list. +- required for the testing need to go to `TEST_REQUIREMENTS` list ### Publishing a new version of the connector + You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? + 1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=destination-sqlite test` 2. Bump the connector version in `metadata.yaml`: increment the `dockerImageTag` value. 
Please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors). 3. Make sure the `metadata.yaml` content is up to date. @@ -96,4 +114,3 @@ You've checked out the repo, implemented a million dollar feature, and you're re 5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention). 6. Pat yourself on the back for being an awesome contributor. 7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. - diff --git a/airbyte-integrations/connectors/destination-starburst-galaxy/README.md b/airbyte-integrations/connectors/destination-starburst-galaxy/README.md index 2125775075a..1f9133c8ff2 100644 --- a/airbyte-integrations/connectors/destination-starburst-galaxy/README.md +++ b/airbyte-integrations/connectors/destination-starburst-galaxy/README.md @@ -8,6 +8,7 @@ For information about how to use this connector within Airbyte, see [the user do #### Build with Gradle From the Airbyte repository root, run: + ``` ./gradlew :airbyte-integrations:connectors:destination-starburst-galaxy:build ``` @@ -24,15 +25,18 @@ If you are an Airbyte core member, you must follow the [instructions](https://do #### Build Build the connector image with Gradle: + ``` ./gradlew :airbyte-integrations:connectors:destination-starburst-galaxy:buildConnectorImage ``` + When building with Gradle, the Docker image name and tag, respectively, are the values of the `io.airbyte.name` and `io.airbyte.version` labels in the Dockerfile. #### Run -Following example commands are Starburst Galaxy-specific version of the [Airbyte protocol commands](https://docs.airbyte.com/understanding-airbyte/airbyte-protocol): +Following example commands are Starburst Galaxy-specific version of the [Airbyte protocol commands](https://docs.airbyte.com/understanding-airbyte/airbyte-protocol): + ``` docker run --rm airbyte/destination-starburst-galaxy:dev spec docker run --rm -v $(pwd)/secrets:/secrets airbyte/destination-starburst-galaxy:dev check --config /secrets/config.json @@ -41,13 +45,16 @@ docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integrat ### Run tests with Gradle -All commands should be run from airbyte project root. +All commands should be run from airbyte project root. To run unit tests: + ``` ./gradlew :airbyte-integrations:connectors:destination-starburst-galaxy:unitTest ``` + To run acceptance and custom integration tests: + ``` ./gradlew :airbyte-integrations:connectors:destination-starburst-galaxy:integrationTest ``` diff --git a/airbyte-integrations/connectors/destination-teradata/README.md b/airbyte-integrations/connectors/destination-teradata/README.md index 3bcb00e7972..f4fbbc9d023 100644 --- a/airbyte-integrations/connectors/destination-teradata/README.md +++ b/airbyte-integrations/connectors/destination-teradata/README.md @@ -6,12 +6,15 @@ For information about how to use this connector within Airbyte, see [the User Do ## Local development #### Building via Gradle + From the Airbyte repository root, run: + ``` ./gradlew :airbyte-integrations:connectors:destination-teradata:build ``` #### Create credentials + **If you are a community contributor**, generate the necessary credentials and place them in `secrets/config.json` conforming to the spec file in `src/main/resources/spec.json`. 
Note that the `secrets` directory is git-ignored by default, so there is no danger of accidentally checking in sensitive information. @@ -20,16 +23,20 @@ Note that the `secrets` directory is git-ignored by default, so there is no dang ### Locally running the connector docker image #### Build + Build the connector image via Gradle: ``` ./gradlew :airbyte-integrations:connectors:destination-teradata:buildConnectorImage ``` + Once built, the docker image name and tag on your host will be `airbyte/destination-teradata:dev`. the Dockerfile. #### Run + Then run any of the connector commands as follows: + ``` docker run --rm airbyte/destination-teradata:dev spec docker run --rm -v $(pwd)/secrets:/secrets airbyte/destination-teradata:dev check --config /secrets/config.json @@ -38,22 +45,29 @@ docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integrat ``` ## Testing + We use `JUnit` for Java tests. ### Unit and Integration Tests + Place unit tests under `src/test/io/airbyte/integrations/destinations/teradata`. #### Acceptance Tests + Airbyte has a standard test suite that all destination connectors must pass. Implement the `TODO`s in `src/test-integration/java/io/airbyte/integrations/destinations/teradataDestinationAcceptanceTest.java`. ### Using gradle to run tests + All commands should be run from airbyte project root. To run unit tests: + ``` ./gradlew :airbyte-integrations:connectors:destination-teradata:unitTest ``` + To run acceptance and custom integration tests: + ``` ./gradlew :airbyte-integrations:connectors:destination-teradata:integrationTest ``` @@ -61,7 +75,9 @@ To run acceptance and custom integration tests: ## Dependency Management ### Publishing a new version of the connector + You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? + 1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=destination-teradata test` 2. Bump the connector version in `metadata.yaml`: increment the `dockerImageTag` value. Please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors). 3. Make sure the `metadata.yaml` content is up to date. @@ -69,4 +85,3 @@ You've checked out the repo, implemented a million dollar feature, and you're re 5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention). 6. Pat yourself on the back for being an awesome contributor. 7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. 
- diff --git a/airbyte-integrations/connectors/destination-timeplus/README.md b/airbyte-integrations/connectors/destination-timeplus/README.md index 6ba14518f63..9b1379593bf 100755 --- a/airbyte-integrations/connectors/destination-timeplus/README.md +++ b/airbyte-integrations/connectors/destination-timeplus/README.md @@ -52,9 +52,10 @@ cat integration_tests/messages.jsonl | python main.py write --config secrets/con ### Locally running the connector docker image - #### Build + **Via [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md) (recommended):** + ```bash airbyte-ci connectors --name=destination-timeplus build ``` @@ -62,6 +63,7 @@ airbyte-ci connectors --name=destination-timeplus build An image will be built with the tag `airbyte/destination-timeplus:dev`. **Via `docker build`:** + ```bash docker build -t airbyte/destination-timeplus:dev . ``` @@ -77,14 +79,16 @@ docker run --rm -v $(pwd)/secrets:/secrets airbyte/destination-timeplus:dev chec cat messages.jsonl | docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integration_tests airbyte/destination-timeplus:dev write --config /secrets/config.json --catalog /integration_tests/configured_catalog.json ``` - ## Testing + You can run our full test suite locally using [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md): + ```bash airbyte-ci connectors --name=destination-timeplus test ``` ### Customizing acceptance Tests + Customize `acceptance-test-config.yml` file to configure tests. See [Connector Acceptance Tests](https://docs.airbyte.com/connector-development/testing-connectors/connector-acceptance-tests-reference) for more information. If your connector requires to create or destroy resources for use during acceptance tests create fixtures for it and place them inside integration_tests/acceptance.py. @@ -97,7 +101,9 @@ We split dependencies between two groups, dependencies that are: - required for the testing need to go to `TEST_REQUIREMENTS` list ### Publishing a new version of the connector + You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? + 1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=destination-timeplus test` 2. Bump the connector version in `metadata.yaml`: increment the `dockerImageTag` value. Please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors). 3. Make sure the `metadata.yaml` content is up to date. @@ -105,4 +111,3 @@ You've checked out the repo, implemented a million dollar feature, and you're re 5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention). 6. Pat yourself on the back for being an awesome contributor. 7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. 
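Several of these READMEs, the Timeplus one included, refer to the `MAIN_REQUIREMENTS` and `TEST_REQUIREMENTS` lists without showing where they live. A minimal sketch of the `setup.py` layout they describe, with placeholder package and connector names:

```python
# setup.py -- minimal sketch of the layout the READMEs describe; names are placeholders.
from setuptools import find_packages, setup

MAIN_REQUIREMENTS = [
    "airbyte-cdk",  # runtime dependencies for the connector go here
]

TEST_REQUIREMENTS = [
    "pytest~=6.2",  # test-only dependencies go here
]

setup(
    name="destination_example",  # placeholder connector package name
    packages=find_packages(),
    install_requires=MAIN_REQUIREMENTS,
    extras_require={"tests": TEST_REQUIREMENTS},
)
```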
- diff --git a/airbyte-integrations/connectors/destination-typesense/README.md b/airbyte-integrations/connectors/destination-typesense/README.md index a1b61228a32..58e1672239b 100644 --- a/airbyte-integrations/connectors/destination-typesense/README.md +++ b/airbyte-integrations/connectors/destination-typesense/README.md @@ -6,22 +6,27 @@ For information about how to use this connector within Airbyte, see [the documen ## Local development ### Prerequisites + **To iterate on this connector, make sure to complete this prerequisites section.** #### Minimum Python version required `= 3.7.0` #### Build & Activate Virtual Environment and install dependencies + From this connector directory, create a virtual environment: + ``` python -m venv .venv ``` This will generate a virtualenv for this module in `.venv/`. Make sure this venv is active in your development environment of choice. To activate it from the terminal, run: + ``` source .venv/bin/activate pip install -r requirements.txt ``` + If you are in an IDE, follow your IDE's instructions to activate the virtualenv. Note that while we are installing dependencies from `requirements.txt`, you should only edit `setup.py` for your dependencies. `requirements.txt` is @@ -30,6 +35,7 @@ If this is mumbo jumbo to you, don't worry about it, just put your deps in `setu should work as you expect. #### Create credentials + **If you are a community contributor**, follow the instructions in the [documentation](https://docs.airbyte.io/integrations/destinations/typesense) to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `destination_typesense/spec.json` file. Note that the `secrets` directory is gitignored by default, so there is no danger of accidentally checking in sensitive information. @@ -39,6 +45,7 @@ See `integration_tests/sample_config.json` for a sample config file. and place them into `secrets/config.json`. ### Locally running the connector + ``` python main.py spec python main.py check --config secrets/config.json @@ -48,9 +55,10 @@ python main.py read --config secrets/config.json --catalog integration_tests/con ### Locally running the connector docker image - #### Build + **Via [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md) (recommended):** + ```bash airbyte-ci connectors --name=destination-typesense build ``` @@ -58,12 +66,15 @@ airbyte-ci connectors --name=destination-typesense build An image will be built with the tag `airbyte/destination-typesense:dev`. **Via `docker build`:** + ```bash docker build -t airbyte/destination-typesense:dev . ``` #### Run + Then run any of the connector commands as follows: + ``` docker run --rm airbyte/destination-typesense:dev spec docker run --rm -v $(pwd)/secrets:/secrets airbyte/destination-typesense:dev check --config /secrets/config.json @@ -72,23 +83,30 @@ cat messages.jsonl | docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integr ``` ## Testing + You can run our full test suite locally using [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md): + ```bash airbyte-ci connectors --name=destination-typesense test ``` ### Customizing acceptance Tests + Customize `acceptance-test-config.yml` file to configure tests. See [Connector Acceptance Tests](https://docs.airbyte.com/connector-development/testing-connectors/connector-acceptance-tests-reference) for more information. 
If your connector requires to create or destroy resources for use during acceptance tests create fixtures for it and place them inside integration_tests/acceptance.py. ## Dependency Management + All of your dependencies should go in `setup.py`, NOT `requirements.txt`. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development. We split dependencies between two groups, dependencies that are: -* required for your connector to work need to go to `MAIN_REQUIREMENTS` list. -* required for the testing need to go to `TEST_REQUIREMENTS` list + +- required for your connector to work need to go to `MAIN_REQUIREMENTS` list. +- required for the testing need to go to `TEST_REQUIREMENTS` list ### Publishing a new version of the connector + You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? + 1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=destination-typesense test` 2. Bump the connector version in `metadata.yaml`: increment the `dockerImageTag` value. Please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors). 3. Make sure the `metadata.yaml` content is up to date. @@ -96,4 +114,3 @@ You've checked out the repo, implemented a million dollar feature, and you're re 5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention). 6. Pat yourself on the back for being an awesome contributor. 7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. - diff --git a/airbyte-integrations/connectors/destination-vectara/README.md b/airbyte-integrations/connectors/destination-vectara/README.md index 13fd46d9fd0..41982286ff9 100644 --- a/airbyte-integrations/connectors/destination-vectara/README.md +++ b/airbyte-integrations/connectors/destination-vectara/README.md @@ -6,17 +6,21 @@ For information about how to use this connector within Airbyte, see [the documen ## Local development ### Prerequisites + **To iterate on this connector, make sure to complete this prerequisites section.** #### Minimum Python version required `= 3.9` ### Installing the connector + From this connector directory, run: + ```bash poetry install --with dev ``` #### Create credentials + **If you are a community contributor**, follow the instructions in the [documentation](https://docs.airbyte.com/integrations/destinations/vectara) to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `destination_vectara/spec.json` file. Note that the `secrets` directory is gitignored by default, so there is no danger of accidentally checking in sensitive information. @@ -25,19 +29,18 @@ See `integration_tests/sample_config.json` for a sample config file. **If you are an Airbyte core member**, copy the credentials in Lastpass under the secret name `destination vectara test creds` and place them into `secrets/config.json`. 
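Before running `check`, it can help to confirm that `secrets/config.json` actually matches the connector spec. A small sketch of such a check, assuming (as is the usual Airbyte spec layout) that the JSON schema sits under the spec file's `connectionSpecification` key:

```python
# Sketch: validate secrets/config.json against destination_vectara/spec.json.
# Assumes the spec keeps its JSON schema under "connectionSpecification".
import json

from jsonschema import validate

with open("destination_vectara/spec.json") as f:
    spec = json.load(f)

with open("secrets/config.json") as f:
    config = json.load(f)

validate(instance=config, schema=spec["connectionSpecification"])
print("secrets/config.json conforms to the connector spec")
```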
- ### Locally running the connector + ``` python main.py spec python main.py check --config secrets/config.json python main.py write --config secrets/config.json --catalog integration_tests/configured_catalog.json ``` - ### Locally running the connector docker image - #### Use `airbyte-ci` to build your connector + The Airbyte way of building this connector is to use our `airbyte-ci` tool. You can follow install instructions [here](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md#L1). Then running the following command will build your connector: @@ -45,15 +48,18 @@ Then running the following command will build your connector: ```bash airbyte-ci connectors --name=destination-vectara build ``` + Once the command is done, you will find your connector image in your local docker registry: `airbyte/destination-vectara:dev`. ##### Customizing our build process + When contributing on our connector you might need to customize the build process to add a system dependency or set an env var. You can customize our build process by adding a `build_customization.py` module to your connector. This module should contain a `pre_connector_install` and `post_connector_install` async function that will mutate the base image and the connector container respectively. It will be imported at runtime by our build process and the functions will be called if they exist. Here is an example of a `build_customization.py` module: + ```python from __future__ import annotations @@ -73,6 +79,7 @@ async def post_connector_install(connector_container: Container) -> Container: ``` #### Build your own connector image + This connector is built using our dynamic built process in `airbyte-ci`. The base image used to build it is defined within the metadata.yaml file under the `connectorBuildOptions`. The build logic is defined using [Dagger](https://dagger.io/) [here](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/pipelines/builds/python_connectors.py). @@ -81,6 +88,7 @@ It does not rely on a Dockerfile. If you would like to patch our connector and build your own a simple approach would be to: 1. Create your own Dockerfile based on the latest version of the connector image. + ```Dockerfile FROM airbyte/destination-vectara:latest @@ -91,9 +99,11 @@ RUN pip install ./airbyte/integration_code # ENV AIRBYTE_ENTRYPOINT "python /airbyte/integration_code/main.py" # ENTRYPOINT ["python", "/airbyte/integration_code/main.py"] ``` + Please use this as an example. This is not optimized. 2. Build your image: + ```bash docker build -t airbyte/destination-vectara:dev . 
# Running the spec command against your patched connector @@ -101,7 +111,9 @@ docker run airbyte/destination-vectara:dev spec ``` #### Run + Then run any of the connector commands as follows: + ``` docker run --rm airbyte/destination-vectara:dev spec docker run --rm -v $(pwd)/secrets:/secrets airbyte/destination-vectara:dev check --config /secrets/config.json @@ -110,39 +122,50 @@ cat messages.jsonl | docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integr ``` ## Testing + You can run our full test suite locally using [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md): + ```bash airbyte-ci connectors --name=destination-vectara test ``` ### Unit Tests + To run unit tests locally, from the connector directory run: + ``` poetry run pytest -s unit_tests ``` ### Integration Tests + There are two types of integration tests: Acceptance Tests (Airbyte's test suite for all destination connectors) and custom integration tests (which are specific to this connector). + #### Custom Integration tests + Place custom tests inside `integration_tests/` folder, then, from the connector root, run + ``` poetry run pytest -s integration_tests ``` #### Acceptance Tests -Coming soon: - +Coming soon: ## Dependency Management + All of your dependencies should go in `pyproject.toml`. We split dependencies between two groups, dependencies that are: -* required for your connector to work need to go to `[tool.poetry.dependencies]` list. -* required for the testing need to go to `[tool.poetry.group.dev.dependencies]` list + +- required for your connector to work need to go to `[tool.poetry.dependencies]` list. +- required for the testing need to go to `[tool.poetry.group.dev.dependencies]` list ### Publishing a new version of the connector + You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? + 1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=destination-vectara test` 2. Bump the connector version in `metadata.yaml`: increment the `dockerImageTag` value. Please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors). 3. Make sure the `metadata.yaml` content is up to date. @@ -150,4 +173,3 @@ You've checked out the repo, implemented a million dollar feature, and you're re 5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention). 6. Pat yourself on the back for being an awesome contributor. 7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. 
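The custom-integration-test instructions above stop at the pytest invocation. A minimal sketch of what a test under `integration_tests/` could look like; the module path, class name, and assertion are assumptions about this connector's layout, not its actual tests:

```python
# integration_tests/test_check_smoke.py -- illustrative sketch only.
import json
import logging

from airbyte_cdk.models import Status
from destination_vectara.destination import DestinationVectara  # assumed import path


def test_check_succeeds_with_local_config():
    # Reuses the secrets/config.json described earlier in this README.
    with open("secrets/config.json") as f:
        config = json.load(f)

    status = DestinationVectara().check(logging.getLogger("airbyte"), config)
    assert status.status == Status.SUCCEEDED
```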
- diff --git a/airbyte-integrations/connectors/destination-weaviate/README.md b/airbyte-integrations/connectors/destination-weaviate/README.md index 24aaea31bce..caac39d07db 100644 --- a/airbyte-integrations/connectors/destination-weaviate/README.md +++ b/airbyte-integrations/connectors/destination-weaviate/README.md @@ -6,17 +6,21 @@ For information about how to use this connector within Airbyte, see [the documen ## Local development ### Prerequisites + **To iterate on this connector, make sure to complete this prerequisites section.** #### Minimum Python version required `= 3.7.0` ### Installing the connector + From this connector directory, run: + ```bash poetry install --with dev ``` #### Create credentials + **If you are a community contributor**, follow the instructions in the [documentation](https://docs.airbyte.io/integrations/destinations/weaviate) to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `destination_weaviate/spec.json` file. Note that the `secrets` directory is gitignored by default, so there is no danger of accidentally checking in sensitive information. @@ -26,6 +30,7 @@ See `integration_tests/sample_config.json` for a sample config file. and place them into `secrets/config.json`. ### Locally running the connector + ``` python main.py spec python main.py check --config secrets/config.json @@ -34,8 +39,8 @@ python main.py write --config secrets/config.json --catalog integration_tests/co ### Locally running the connector docker image - #### Use `airbyte-ci` to build your connector + The Airbyte way of building this connector is to use our `airbyte-ci` tool. You can follow install instructions [here](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md#L1). Then running the following command will build your connector: @@ -43,15 +48,18 @@ Then running the following command will build your connector: ```bash airbyte-ci connectors --name=destination-weaviate build ``` + Once the command is done, you will find your connector image in your local docker registry: `airbyte/destination-weaviate:dev`. ##### Customizing our build process + When contributing on our connector you might need to customize the build process to add a system dependency or set an env var. You can customize our build process by adding a `build_customization.py` module to your connector. This module should contain a `pre_connector_install` and `post_connector_install` async function that will mutate the base image and the connector container respectively. It will be imported at runtime by our build process and the functions will be called if they exist. Here is an example of a `build_customization.py` module: + ```python from __future__ import annotations @@ -71,6 +79,7 @@ async def post_connector_install(connector_container: Container) -> Container: ``` #### Build your own connector image + This connector is built using our dynamic built process in `airbyte-ci`. The base image used to build it is defined within the metadata.yaml file under the `connectorBuildOptions`. The build logic is defined using [Dagger](https://dagger.io/) [here](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/pipelines/builds/python_connectors.py). @@ -79,6 +88,7 @@ It does not rely on a Dockerfile. If you would like to patch our connector and build your own a simple approach would be to: 1. Create your own Dockerfile based on the latest version of the connector image. 
+ ```Dockerfile FROM airbyte/destination-weaviate:latest @@ -89,16 +99,21 @@ RUN pip install ./airbyte/integration_code # ENV AIRBYTE_ENTRYPOINT "python /airbyte/integration_code/main.py" # ENTRYPOINT ["python", "/airbyte/integration_code/main.py"] ``` + Please use this as an example. This is not optimized. 2. Build your image: + ```bash docker build -t airbyte/destination-weaviate:dev . # Running the spec command against your patched connector docker run airbyte/destination-weaviate:dev spec ``` + #### Run + Then run any of the connector commands as follows: + ``` docker run --rm airbyte/destination-weaviate:dev spec docker run --rm -v $(pwd)/secrets:/secrets airbyte/destination-weaviate:dev check --config /secrets/config.json @@ -107,35 +122,46 @@ cat messages.jsonl | docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integr ``` ## Testing + You can run our full test suite locally using [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md): + ```bash airbyte-ci connectors --name=destination-weaviate test ``` ### Unit Tests + To run unit tests locally, from the connector directory run: + ``` poetry run pytest -s unit_tests ``` ### Integration Tests + To run integration tests locally, make sure you create a secrets/config.json as explained above, and then run: + ``` poetry run pytest -s integration_tests -``` +``` ### Customizing acceptance Tests + Customize `acceptance-test-config.yml` file to configure tests. See [Connector Acceptance Tests](https://docs.airbyte.com/connector-development/testing-connectors/connector-acceptance-tests-reference) for more information. If your connector requires to create or destroy resources for use during acceptance tests create fixtures for it and place them inside integration_tests/acceptance.py. ## Dependency Management + All of your dependencies should go in `setup.py`, NOT `requirements.txt`. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development. We split dependencies between two groups, dependencies that are: -* required for your connector to work need to go to `MAIN_REQUIREMENTS` list. -* required for the testing need to go to `TEST_REQUIREMENTS` list + +- required for your connector to work need to go to `MAIN_REQUIREMENTS` list. +- required for the testing need to go to `TEST_REQUIREMENTS` list ### Publishing a new version of the connector + You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? + 1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=destination-weaviate test` 2. Bump the connector version in `metadata.yaml`: increment the `dockerImageTag` value. Please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors). 3. Make sure the `metadata.yaml` content is up to date. @@ -143,4 +169,3 @@ You've checked out the repo, implemented a million dollar feature, and you're re 5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention). 6. Pat yourself on the back for being an awesome contributor. 7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. 
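For readers skimming the build-customization note in the weaviate README above: the hook module it describes has roughly the shape sketched below. The specific Dagger calls (`with_exec`, `with_env_variable`) and the package being installed are illustrative assumptions, not a required recipe.

```python
# build_customization.py (sketch)
from __future__ import annotations

from typing import TYPE_CHECKING

if TYPE_CHECKING:
    from dagger import Container


async def pre_connector_install(base_image_container: Container) -> Container:
    # Hypothetical: add a system package to the base image before the connector code is installed.
    return base_image_container.with_exec(
        ["sh", "-c", "apt-get update && apt-get install -y libpq-dev"]
    )


async def post_connector_install(connector_container: Container) -> Container:
    # Hypothetical: set an environment variable on the finished connector container.
    return connector_container.with_env_variable("MY_CUSTOM_FLAG", "true")
```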
- diff --git a/airbyte-integrations/connectors/destination-xata/README.md b/airbyte-integrations/connectors/destination-xata/README.md index e6153ac20ba..1d0c8d29980 100644 --- a/airbyte-integrations/connectors/destination-xata/README.md +++ b/airbyte-integrations/connectors/destination-xata/README.md @@ -6,22 +6,27 @@ For information about how to use this connector within Airbyte, see [the documen ## Local development ### Prerequisites + **To iterate on this connector, make sure to complete this prerequisites section.** #### Minimum Python version required `= 3.7.0` #### Build & Activate Virtual Environment and install dependencies + From this connector directory, create a virtual environment: + ``` python -m venv .venv ``` This will generate a virtualenv for this module in `.venv/`. Make sure this venv is active in your development environment of choice. To activate it from the terminal, run: + ``` source .venv/bin/activate pip install -r requirements.txt ``` + If you are in an IDE, follow your IDE's instructions to activate the virtualenv. Note that while we are installing dependencies from `requirements.txt`, you should only edit `setup.py` for your dependencies. `requirements.txt` is @@ -30,6 +35,7 @@ If this is mumbo jumbo to you, don't worry about it, just put your deps in `setu should work as you expect. #### Create credentials + **If you are a community contributor**, follow the instructions in the [documentation](https://docs.airbyte.com/integrations/destinations/xata) to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `destination_xata/spec.json` file. Note that the `secrets` directory is gitignored by default, so there is no danger of accidentally checking in sensitive information. @@ -39,6 +45,7 @@ See `integration_tests/sample_config.json` for a sample config file. and place them into `secrets/config.json`. ### Locally running the connector + ``` python main.py spec python main.py check --config secrets/config.json @@ -48,9 +55,10 @@ python main.py read --config secrets/config.json --catalog integration_tests/con ### Locally running the connector docker image - #### Build + **Via [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md) (recommended):** + ```bash airbyte-ci connectors --name=destination-xata build ``` @@ -58,12 +66,15 @@ airbyte-ci connectors --name=destination-xata build An image will be built with the tag `airbyte/destination-xata:dev`. **Via `docker build`:** + ```bash docker build -t airbyte/destination-xata:dev . ``` #### Run + Then run any of the connector commands as follows: + ``` docker run --rm airbyte/destination-xata:dev spec docker run --rm -v $(pwd)/secrets:/secrets airbyte/destination-xata:dev check --config /secrets/config.json @@ -72,23 +83,30 @@ cat messages.jsonl | docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integr ``` ## Testing + You can run our full test suite locally using [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md): + ```bash airbyte-ci connectors --name=destination-xata test ``` ### Customizing acceptance Tests + Customize `acceptance-test-config.yml` file to configure tests. See [Connector Acceptance Tests](https://docs.airbyte.com/connector-development/testing-connectors/connector-acceptance-tests-reference) for more information. 
If your connector requires to create or destroy resources for use during acceptance tests create fixtures for it and place them inside integration_tests/acceptance.py. ## Dependency Management + All of your dependencies should go in `setup.py`, NOT `requirements.txt`. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development. We split dependencies between two groups, dependencies that are: -* required for your connector to work need to go to `MAIN_REQUIREMENTS` list. -* required for the testing need to go to `TEST_REQUIREMENTS` list + +- required for your connector to work need to go to `MAIN_REQUIREMENTS` list. +- required for the testing need to go to `TEST_REQUIREMENTS` list ### Publishing a new version of the connector + You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? + 1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=destination-xata test` 2. Bump the connector version in `metadata.yaml`: increment the `dockerImageTag` value. Please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors). 3. Make sure the `metadata.yaml` content is up to date. @@ -96,4 +114,3 @@ You've checked out the repo, implemented a million dollar feature, and you're re 5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention). 6. Pat yourself on the back for being an awesome contributor. 7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. - diff --git a/airbyte-integrations/connectors/destination-yellowbrick/README.md b/airbyte-integrations/connectors/destination-yellowbrick/README.md index c583ba33a83..fcea49e0168 100644 --- a/airbyte-integrations/connectors/destination-yellowbrick/README.md +++ b/airbyte-integrations/connectors/destination-yellowbrick/README.md @@ -6,12 +6,15 @@ For information about how to use this connector within Airbyte, see [the User Do ## Local development #### Building via Gradle + From the Airbyte repository root, run: + ``` ./gradlew :airbyte-integrations:connectors:destination-yellowbrick:build ``` #### Create credentials + **If you are a community contributor**, generate the necessary credentials and place them in `secrets/config.json` conforming to the spec file in `src/main/resources/spec.json`. Note that the `secrets` directory is git-ignored by default, so there is no danger of accidentally checking in sensitive information. @@ -20,15 +23,20 @@ Note that the `secrets` directory is git-ignored by default, so there is no dang ### Locally running the connector docker image #### Build + Build the connector image via Gradle: + ``` ./gradlew :airbyte-integrations:connectors:destination-yellowbrick:airbyteDocker ``` + When building via Gradle, the docker image name and tag, respectively, are the values of the `io.airbyte.name` and `io.airbyte.version` `LABEL`s in the Dockerfile. 
#### Run + Then run any of the connector commands as follows: + ``` docker run --rm airbyte/destination-yellowbrick:dev spec docker run --rm -v $(pwd)/secrets:/secrets airbyte/destination-yellowbrick:dev check --config /secrets/config.json @@ -37,22 +45,29 @@ docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integrat ``` ## Testing + We use `JUnit` for Java tests. ### Unit and Integration Tests + Place unit tests under `src/test/io/airbyte/integrations/destinations/yellowbrick`. #### Acceptance Tests + Airbyte has a standard test suite that all destination connectors must pass. Implement the `TODO`s in `src/test-integration/java/io/airbyte/integrations/destinations/yellowbrickDestinationAcceptanceTest.java`. ### Using gradle to run tests + All commands should be run from airbyte project root. To run unit tests: + ``` ./gradlew :airbyte-integrations:connectors:destination-yellowbrick:unitTest ``` + To run acceptance and custom integration tests: + ``` ./gradlew :airbyte-integrations:connectors:destination-yellowbrick:integrationTest ``` @@ -60,7 +75,9 @@ To run acceptance and custom integration tests: ## Dependency Management ### Publishing a new version of the connector + You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? + 1. Make sure your changes are passing unit and integration tests. 1. Bump the connector version in `Dockerfile` -- just increment the value of the `LABEL io.airbyte.version` appropriately (we use [SemVer](https://semver.org/)). 1. Create a Pull Request. diff --git a/airbyte-integrations/connectors/source-activecampaign/README.md b/airbyte-integrations/connectors/source-activecampaign/README.md index 51f5f7aadf2..6673655e23d 100644 --- a/airbyte-integrations/connectors/source-activecampaign/README.md +++ b/airbyte-integrations/connectors/source-activecampaign/README.md @@ -6,6 +6,7 @@ For information about how to use this connector within Airbyte, see [the documen ## Local development #### Create credentials + **If you are a community contributor**, follow the instructions in the [documentation](https://docs.airbyte.io/integrations/sources/activecampaign) to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `source_activecampaign/spec.yaml` file. Note that any directory named `secrets` is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. @@ -16,9 +17,10 @@ and place them into `secrets/config.json`. ### Locally running the connector docker image - #### Build + **Via [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md) (recommended):** + ```bash airbyte-ci connectors --name=source-activecampaign build ``` @@ -26,12 +28,15 @@ airbyte-ci connectors --name=source-activecampaign build An image will be built with the tag `airbyte/source-activecampaign:dev`. **Via `docker build`:** + ```bash docker build -t airbyte/source-activecampaign:dev . 
``` #### Run + Then run any of the connector commands as follows: + ``` docker run --rm airbyte/source-activecampaign:dev spec docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-activecampaign:dev check --config /secrets/config.json @@ -40,23 +45,30 @@ docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integrat ``` ## Testing + You can run our full test suite locally using [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md): + ```bash airbyte-ci connectors --name=source-activecampaign test ``` ### Customizing acceptance Tests + Customize `acceptance-test-config.yml` file to configure tests. See [Connector Acceptance Tests](https://docs.airbyte.com/connector-development/testing-connectors/connector-acceptance-tests-reference) for more information. If your connector requires to create or destroy resources for use during acceptance tests create fixtures for it and place them inside integration_tests/acceptance.py. ## Dependency Management + All of your dependencies should go in `setup.py`, NOT `requirements.txt`. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development. We split dependencies between two groups, dependencies that are: -* required for your connector to work need to go to `MAIN_REQUIREMENTS` list. -* required for the testing need to go to `TEST_REQUIREMENTS` list + +- required for your connector to work need to go to `MAIN_REQUIREMENTS` list. +- required for the testing need to go to `TEST_REQUIREMENTS` list ### Publishing a new version of the connector + You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? + 1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-activecampaign test` 2. Bump the connector version in `metadata.yaml`: increment the `dockerImageTag` value. Please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors). 3. Make sure the `metadata.yaml` content is up to date. @@ -64,4 +76,3 @@ You've checked out the repo, implemented a million dollar feature, and you're re 5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention). 6. Pat yourself on the back for being an awesome contributor. 7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. - diff --git a/airbyte-integrations/connectors/source-adjust/README.md b/airbyte-integrations/connectors/source-adjust/README.md index c624a57f43c..4d293b02953 100644 --- a/airbyte-integrations/connectors/source-adjust/README.md +++ b/airbyte-integrations/connectors/source-adjust/README.md @@ -6,23 +6,28 @@ For information about how to use this connector within Airbyte, see [the documen ## Local development ### Prerequisites + **To iterate on this connector, make sure to complete this prerequisites section.** #### Minimum Python version required `= 3.9.0` #### Build & Activate Virtual Environment and install dependencies + From this connector directory, create a virtual environment: + ``` python -m venv .venv ``` This will generate a virtualenv for this module in `.venv/`. Make sure this venv is active in your development environment of choice. 
To activate it from the terminal, run: + ``` source .venv/bin/activate pip install -r requirements.txt pip install '.[tests]' ``` + If you are in an IDE, follow your IDE's instructions to activate the virtualenv. Note that while we are installing dependencies from `requirements.txt`, you should only edit `setup.py` for your dependencies. `requirements.txt` is @@ -31,6 +36,7 @@ If this is mumbo jumbo to you, don't worry about it, just put your deps in `setu should work as you expect. #### Create credentials + **If you are a community contributor**, follow the instructions in the [documentation](https://docs.airbyte.io/integrations/sources/adjust) to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `source_adjust/spec.yaml` file. Note that any directory named `secrets` is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. @@ -40,6 +46,7 @@ See `integration_tests/sample_config.json` for a sample config file. and place them into `secrets/config.json`. ### Locally running the connector + ``` python main.py spec python main.py check --config secrets/config.json @@ -49,9 +56,10 @@ python main.py read --config secrets/config.json --catalog integration_tests/con ### Locally running the connector docker image - #### Build + **Via [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md) (recommended):** + ```bash airbyte-ci connectors --name source-adjust build ``` @@ -59,12 +67,15 @@ airbyte-ci connectors --name source-adjust build An image will be built with the tag `airbyte/source-adjust:dev`. **Via `docker build`:** + ```bash docker build -t airbyte/source-adjust:dev . ``` #### Run + Then run any of the connector commands as follows: + ``` docker run --rm airbyte/source-adjust:dev spec docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-adjust:dev check --config /secrets/config.json @@ -73,17 +84,22 @@ docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integrat ``` ## Testing + You can run our full test suite locally using [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md): + ```bash airbyte-ci connectors --name=source-adjust test ``` ### Customizing acceptance Tests + Customize `acceptance-test-config.yml` file to configure tests. See [Connector Acceptance Tests](https://docs.airbyte.com/connector-development/testing-connectors/connector-acceptance-tests-reference) for more information. If your connector requires to create or destroy resources for use during acceptance tests create fixtures for it and place them inside integration_tests/acceptance.py. ## Dependency Management + All of your dependencies should go in `setup.py`, NOT `requirements.txt`. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development. We split dependencies between two groups, dependencies that are: -* required for your connector to work need to go to `MAIN_REQUIREMENTS` list. -* required for the testing need to go to `TEST_REQUIREMENTS` list + +- required for your connector to work need to go to `MAIN_REQUIREMENTS` list. 
+- required for the testing need to go to `TEST_REQUIREMENTS` list diff --git a/airbyte-integrations/connectors/source-aha/README.md b/airbyte-integrations/connectors/source-aha/README.md index aa43d70e16d..4df129fe116 100644 --- a/airbyte-integrations/connectors/source-aha/README.md +++ b/airbyte-integrations/connectors/source-aha/README.md @@ -6,6 +6,7 @@ For information about how to use this connector within Airbyte, see [the documen ## Local development #### Create credentials + **If you are a community contributor**, follow the instructions in the [documentation](https://docs.airbyte.io/integrations/sources/aha) to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `source_aha/spec.yaml` file. Note that any directory named `secrets` is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. @@ -16,9 +17,10 @@ and place them into `secrets/config.json`. ### Locally running the connector docker image - #### Build + **Via [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md) (recommended):** + ```bash airbyte-ci connectors --name=source-aha build ``` @@ -26,12 +28,15 @@ airbyte-ci connectors --name=source-aha build An image will be built with the tag `airbyte/source-aha:dev`. **Via `docker build`:** + ```bash docker build -t airbyte/source-aha:dev . ``` #### Run + Then run any of the connector commands as follows: + ``` docker run --rm airbyte/source-aha:dev spec docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-aha:dev check --config /secrets/config.json @@ -40,23 +45,30 @@ docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integrat ``` ## Testing + You can run our full test suite locally using [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md): + ```bash airbyte-ci connectors --name=source-aha test ``` ### Customizing acceptance Tests + Customize `acceptance-test-config.yml` file to configure tests. See [Connector Acceptance Tests](https://docs.airbyte.com/connector-development/testing-connectors/connector-acceptance-tests-reference) for more information. If your connector requires to create or destroy resources for use during acceptance tests create fixtures for it and place them inside integration_tests/acceptance.py. ## Dependency Management + All of your dependencies should go in `setup.py`, NOT `requirements.txt`. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development. We split dependencies between two groups, dependencies that are: -* required for your connector to work need to go to `MAIN_REQUIREMENTS` list. -* required for the testing need to go to `TEST_REQUIREMENTS` list + +- required for your connector to work need to go to `MAIN_REQUIREMENTS` list. +- required for the testing need to go to `TEST_REQUIREMENTS` list ### Publishing a new version of the connector + You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? + 1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-aha test` 2. Bump the connector version in `metadata.yaml`: increment the `dockerImageTag` value. Please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors). 3. 
Make sure the `metadata.yaml` content is up to date. @@ -64,4 +76,3 @@ You've checked out the repo, implemented a million dollar feature, and you're re 5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention). 6. Pat yourself on the back for being an awesome contributor. 7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. - diff --git a/airbyte-integrations/connectors/source-aircall/README.md b/airbyte-integrations/connectors/source-aircall/README.md index 750124c2a5a..889154ea3d8 100644 --- a/airbyte-integrations/connectors/source-aircall/README.md +++ b/airbyte-integrations/connectors/source-aircall/README.md @@ -6,6 +6,7 @@ For information about how to use this connector within Airbyte, see [the documen ## Local development #### Create credentials + **If you are a community contributor**, follow the instructions in the [documentation](https://docs.airbyte.com/integrations/sources/aircall) to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `source_aircall/spec.yaml` file. Note that any directory named `secrets` is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. @@ -16,9 +17,10 @@ and place them into `secrets/config.json`. ### Locally running the connector docker image - #### Build + **Via [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md) (recommended):** + ```bash airbyte-ci connectors --name=source-aircall build ``` @@ -26,12 +28,15 @@ airbyte-ci connectors --name=source-aircall build An image will be built with the tag `airbyte/source-aircall:dev`. **Via `docker build`:** + ```bash docker build -t airbyte/source-aircall:dev . ``` #### Run + Then run any of the connector commands as follows: + ``` docker run --rm airbyte/source-aircall:dev spec docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-aircall:dev check --config /secrets/config.json @@ -40,23 +45,30 @@ docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integrat ``` ## Testing + You can run our full test suite locally using [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md): + ```bash airbyte-ci connectors --name=source-aircall test ``` ### Customizing acceptance Tests + Customize `acceptance-test-config.yml` file to configure tests. See [Connector Acceptance Tests](https://docs.airbyte.com/connector-development/testing-connectors/connector-acceptance-tests-reference) for more information. If your connector requires to create or destroy resources for use during acceptance tests create fixtures for it and place them inside integration_tests/acceptance.py. ## Dependency Management + All of your dependencies should go in `setup.py`, NOT `requirements.txt`. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development. We split dependencies between two groups, dependencies that are: -* required for your connector to work need to go to `MAIN_REQUIREMENTS` list. -* required for the testing need to go to `TEST_REQUIREMENTS` list + +- required for your connector to work need to go to `MAIN_REQUIREMENTS` list. 
+- required for the testing need to go to `TEST_REQUIREMENTS` list ### Publishing a new version of the connector + You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? + 1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-aircall test` 2. Bump the connector version in `metadata.yaml`: increment the `dockerImageTag` value. Please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors). 3. Make sure the `metadata.yaml` content is up to date. @@ -64,4 +76,3 @@ You've checked out the repo, implemented a million dollar feature, and you're re 5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention). 6. Pat yourself on the back for being an awesome contributor. 7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. - diff --git a/airbyte-integrations/connectors/source-airtable/README.md b/airbyte-integrations/connectors/source-airtable/README.md index 3424957010f..9c8026f3a42 100644 --- a/airbyte-integrations/connectors/source-airtable/README.md +++ b/airbyte-integrations/connectors/source-airtable/README.md @@ -1,31 +1,32 @@ # Airtable source connector - This is the repository for the Airtable source connector, written in Python. For information about how to use this connector within Airbyte, see [the documentation](https://docs.airbyte.com/integrations/sources/airtable). ## Local development ### Prerequisites -* Python (~=3.9) -* Poetry (~=1.7) - installation instructions [here](https://python-poetry.org/docs/#installation) +- Python (~=3.9) +- Poetry (~=1.7) - installation instructions [here](https://python-poetry.org/docs/#installation) ### Installing the connector + From this connector directory, run: + ```bash poetry install --with dev ``` - ### Create credentials + **If you are a community contributor**, follow the instructions in the [documentation](https://docs.airbyte.com/integrations/sources/airtable) to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `source_airtable/spec.yaml` file. Note that any directory named `secrets` is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. See `sample_files/sample_config.json` for a sample config file. - ### Locally running the connector + ``` poetry run source-airtable spec poetry run source-airtable check --config secrets/config.json @@ -34,23 +35,28 @@ poetry run source-airtable read --config secrets/config.json --catalog sample_fi ``` ### Running unit tests + To run unit tests locally, from the connector directory run: + ``` poetry run pytest unit_tests ``` ### Building the docker image + 1. Install [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md) 2. Run the following command to build the docker image: + ```bash airbyte-ci connectors --name=source-airtable build ``` An image will be available on your host with the tag `airbyte/source-airtable:dev`. 
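The `poetry run source-airtable …` commands above go through the connector's console-script entrypoint. In the conventional Python connector layout that entrypoint is roughly the following; the `source_airtable.source` module path and `SourceAirtable` class name are assumptions based on that convention.

```python
# source_airtable/run.py (sketch of the conventional entrypoint)
import sys

from airbyte_cdk.entrypoint import launch
from source_airtable.source import SourceAirtable  # assumed module path


def run() -> None:
    # The CDK dispatches spec/check/discover/read based on the CLI arguments.
    launch(SourceAirtable(), sys.argv[1:])
```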
- ### Running as a docker container + Then run any of the connector commands as follows: + ``` docker run --rm airbyte/source-airtable:dev spec docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-airtable:dev check --config /secrets/config.json @@ -59,18 +65,23 @@ docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integrat ``` ### Running our CI test suite + You can run our full test suite locally using [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md): + ```bash airbyte-ci connectors --name=source-airtable test ``` ### Customizing acceptance Tests + Customize `acceptance-test-config.yml` file to configure acceptance tests. See [Connector Acceptance Tests](https://docs.airbyte.com/connector-development/testing-connectors/connector-acceptance-tests-reference) for more information. If your connector requires to create or destroy resources for use during acceptance tests create fixtures for it and place them inside integration_tests/acceptance.py. ### Dependency Management -All of your dependencies should be managed via Poetry. + +All of your dependencies should be managed via Poetry. To add a new dependency, run: + ```bash poetry add ``` @@ -78,14 +89,16 @@ poetry add Please commit the changes to `pyproject.toml` and `poetry.lock` files. ## Publishing a new version of the connector + You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? + 1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-airtable test` -2. Bump the connector version (please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors)): - - bump the `dockerImageTag` value in in `metadata.yaml` - - bump the `version` value in `pyproject.toml` +2. Bump the connector version (please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors)): + - bump the `dockerImageTag` value in in `metadata.yaml` + - bump the `version` value in `pyproject.toml` 3. Make sure the `metadata.yaml` content is up to date. 4. Make sure the connector documentation and its changelog is up to date (`docs/integrations/sources/airtable.md`). 5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention). 6. Pat yourself on the back for being an awesome contributor. 7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. -8. Once your PR is merged, the new version of the connector will be automatically published to Docker Hub and our connector registry. \ No newline at end of file +8. Once your PR is merged, the new version of the connector will be automatically published to Docker Hub and our connector registry. 
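Many of the READMEs in this patch say to place setup/teardown fixtures in `integration_tests/acceptance.py`. A minimal sketch of that file is shown below; the fixture body is a placeholder for whatever resources your acceptance tests actually need.

```python
# integration_tests/acceptance.py (sketch)
import pytest

pytest_plugins = ("connector_acceptance_test.plugin",)


@pytest.fixture(scope="session", autouse=True)
def connector_setup():
    # Create any external resources the acceptance tests depend on here...
    yield
    # ...and clean them up here once the test session finishes.
```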
diff --git a/airbyte-integrations/connectors/source-alpha-vantage/README.md b/airbyte-integrations/connectors/source-alpha-vantage/README.md index 6fd81b7208f..86168407648 100644 --- a/airbyte-integrations/connectors/source-alpha-vantage/README.md +++ b/airbyte-integrations/connectors/source-alpha-vantage/README.md @@ -6,6 +6,7 @@ For information about how to use this connector within Airbyte, see [the documen ## Local development #### Create credentials + **If you are a community contributor**, follow the instructions in the [documentation](https://docs.airbyte.io/integrations/sources/alpha-vantage) to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `source_alpha_vantage/spec.yaml` file. Note that any directory named `secrets` is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. @@ -16,9 +17,10 @@ and place them into `secrets/config.json`. ### Locally running the connector docker image - #### Build + **Via [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md) (recommended):** + ```bash airbyte-ci connectors --name=source-alpha-vantage build ``` @@ -26,12 +28,15 @@ airbyte-ci connectors --name=source-alpha-vantage build An image will be built with the tag `airbyte/source-alpha-vantage:dev`. **Via `docker build`:** + ```bash docker build -t airbyte/source-alpha-vantage:dev . ``` #### Run + Then run any of the connector commands as follows: + ``` docker run --rm airbyte/source-alpha-vantage:dev spec docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-alpha-vantage:dev check --config /secrets/config.json @@ -40,23 +45,30 @@ docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integrat ``` ## Testing + You can run our full test suite locally using [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md): + ```bash airbyte-ci connectors --name=source-alpha-vantage test ``` ### Customizing acceptance Tests + Customize `acceptance-test-config.yml` file to configure tests. See [Connector Acceptance Tests](https://docs.airbyte.com/connector-development/testing-connectors/connector-acceptance-tests-reference) for more information. If your connector requires to create or destroy resources for use during acceptance tests create fixtures for it and place them inside integration_tests/acceptance.py. ## Dependency Management + All of your dependencies should go in `setup.py`, NOT `requirements.txt`. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development. We split dependencies between two groups, dependencies that are: -* required for your connector to work need to go to `MAIN_REQUIREMENTS` list. -* required for the testing need to go to `TEST_REQUIREMENTS` list + +- required for your connector to work need to go to `MAIN_REQUIREMENTS` list. +- required for the testing need to go to `TEST_REQUIREMENTS` list ### Publishing a new version of the connector + You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? + 1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-alpha-vantage test` 2. Bump the connector version in `metadata.yaml`: increment the `dockerImageTag` value. 
Please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors). 3. Make sure the `metadata.yaml` content is up to date. @@ -64,4 +76,3 @@ You've checked out the repo, implemented a million dollar feature, and you're re 5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention). 6. Pat yourself on the back for being an awesome contributor. 7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. - diff --git a/airbyte-integrations/connectors/source-amazon-ads/README.md b/airbyte-integrations/connectors/source-amazon-ads/README.md index d94cf866336..b7f237c9de7 100644 --- a/airbyte-integrations/connectors/source-amazon-ads/README.md +++ b/airbyte-integrations/connectors/source-amazon-ads/README.md @@ -1,31 +1,32 @@ # Amazon-Ads source connector - This is the repository for the Amazon-Ads source connector, written in Python. For information about how to use this connector within Airbyte, see [the documentation](https://docs.airbyte.com/integrations/sources/amazon-ads). ## Local development ### Prerequisites -* Python (~=3.9) -* Poetry (~=1.7) - installation instructions [here](https://python-poetry.org/docs/#installation) +- Python (~=3.9) +- Poetry (~=1.7) - installation instructions [here](https://python-poetry.org/docs/#installation) ### Installing the connector + From this connector directory, run: + ```bash poetry install --with dev ``` - ### Create credentials + **If you are a community contributor**, follow the instructions in the [documentation](https://docs.airbyte.com/integrations/sources/amazon-ads) to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `source_amazon_ads/spec.yaml` file. Note that any directory named `secrets` is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. See `sample_files/sample_config.json` for a sample config file. - ### Locally running the connector + ``` poetry run source-amazon-ads spec poetry run source-amazon-ads check --config secrets/config.json @@ -34,23 +35,28 @@ poetry run source-amazon-ads read --config secrets/config.json --catalog sample_ ``` ### Running unit tests + To run unit tests locally, from the connector directory run: + ``` poetry run pytest unit_tests ``` ### Building the docker image + 1. Install [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md) 2. Run the following command to build the docker image: + ```bash airbyte-ci connectors --name=source-amazon-ads build ``` An image will be available on your host with the tag `airbyte/source-amazon-ads:dev`. 
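As an example of the unit tests run with `poetry run pytest unit_tests` above, a small offline test might assert that the connector's spec is well formed. The `SourceAmazonAds` import and the no-argument constructor are assumptions based on the conventional source layout, not the connector's documented API.

```python
import logging

from source_amazon_ads.source import SourceAmazonAds  # assumed module path


def test_spec_declares_properties():
    # spec() loads the bundled spec file; no credentials or network access are needed.
    spec = SourceAmazonAds().spec(logging.getLogger("airbyte"))
    assert spec.connectionSpecification["properties"]
```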
- ### Running as a docker container + Then run any of the connector commands as follows: + ``` docker run --rm airbyte/source-amazon-ads:dev spec docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-amazon-ads:dev check --config /secrets/config.json @@ -59,18 +65,23 @@ docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integrat ``` ### Running our CI test suite + You can run our full test suite locally using [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md): + ```bash airbyte-ci connectors --name=source-amazon-ads test ``` ### Customizing acceptance Tests + Customize `acceptance-test-config.yml` file to configure acceptance tests. See [Connector Acceptance Tests](https://docs.airbyte.com/connector-development/testing-connectors/connector-acceptance-tests-reference) for more information. If your connector requires to create or destroy resources for use during acceptance tests create fixtures for it and place them inside integration_tests/acceptance.py. ### Dependency Management -All of your dependencies should be managed via Poetry. + +All of your dependencies should be managed via Poetry. To add a new dependency, run: + ```bash poetry add ``` @@ -78,14 +89,16 @@ poetry add Please commit the changes to `pyproject.toml` and `poetry.lock` files. ## Publishing a new version of the connector + You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? + 1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-amazon-ads test` -2. Bump the connector version (please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors)): - - bump the `dockerImageTag` value in in `metadata.yaml` - - bump the `version` value in `pyproject.toml` +2. Bump the connector version (please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors)): + - bump the `dockerImageTag` value in in `metadata.yaml` + - bump the `version` value in `pyproject.toml` 3. Make sure the `metadata.yaml` content is up to date. 4. Make sure the connector documentation and its changelog is up to date (`docs/integrations/sources/amazon-ads.md`). 5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention). 6. Pat yourself on the back for being an awesome contributor. 7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. -8. Once your PR is merged, the new version of the connector will be automatically published to Docker Hub and our connector registry. \ No newline at end of file +8. Once your PR is merged, the new version of the connector will be automatically published to Docker Hub and our connector registry. diff --git a/airbyte-integrations/connectors/source-amazon-seller-partner/README.md b/airbyte-integrations/connectors/source-amazon-seller-partner/README.md index 178a3bbca31..28269305671 100644 --- a/airbyte-integrations/connectors/source-amazon-seller-partner/README.md +++ b/airbyte-integrations/connectors/source-amazon-seller-partner/README.md @@ -1,31 +1,32 @@ # Amazon Seller Partner Source - This is the repository for the Amazon Seller-Partner source connector, written in Python. 
For information about how to use this connector within Airbyte, see [the documentation](https://docs.airbyte.io/integrations/sources/amazon-seller-partner). ## Local development ### Prerequisites -* Python (~=3.9) -* Poetry (~=1.7) - installation instructions [here](https://python-poetry.org/docs/#installation) +- Python (~=3.9) +- Poetry (~=1.7) - installation instructions [here](https://python-poetry.org/docs/#installation) ### Installing the connector + From this connector directory, run: + ```bash poetry install --with dev ``` - ### Create credentials + **If you are a community contributor**, follow the instructions in the [documentation](https://docs.airbyte.com/integrations/sources/amazon-seller-partner) to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `source_amazon_seller_partner/spec.json` file. Note that any directory named `secrets` is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. See `integration_tests/sample_config.json` for a sample config file. - ### Locally running the connector + ``` poetry run source-amazon-seller-partner spec poetry run source-amazon-seller-partner check --config secrets/config.json @@ -34,23 +35,28 @@ poetry run source-amazon-seller-partner read --config secrets/config.json --cata ``` ### Running unit tests + To run unit tests locally, from the connector directory run: + ``` poetry run pytest unit_tests ``` ### Building the docker image + 1. Install [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md) 2. Run the following command to build the docker image: + ```bash airbyte-ci connectors --name=source-amazon-seller-partner build ``` An image will be available on your host with the tag `airbyte/source-amazon-seller-partner:dev`. - ### Running as a docker container + Then run any of the connector commands as follows: + ``` docker run --rm airbyte/source-amazon-seller-partner:dev spec docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-amazon-seller-partner:dev check --config /secrets/config.json @@ -59,18 +65,23 @@ docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integrat ``` ### Running our CI test suite + You can run our full test suite locally using [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md): + ```bash airbyte-ci connectors --name=source-amazon-seller-partner test ``` ### Customizing acceptance Tests + Customize `acceptance-test-config.yml` file to configure acceptance tests. See [Connector Acceptance Tests](https://docs.airbyte.com/connector-development/testing-connectors/connector-acceptance-tests-reference) for more information. If your connector requires to create or destroy resources for use during acceptance tests create fixtures for it and place them inside integration_tests/acceptance.py. ### Dependency Management -All of your dependencies should be managed via Poetry. + +All of your dependencies should be managed via Poetry. To add a new dependency, run: + ```bash poetry add ``` @@ -78,14 +89,16 @@ poetry add Please commit the changes to `pyproject.toml` and `poetry.lock` files. ## Publishing a new version of the connector + You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? + 1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-amazon-seller-partner test` -2. 
Bump the connector version (please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors)): - - bump the `dockerImageTag` value in in `metadata.yaml` - - bump the `version` value in `pyproject.toml` +2. Bump the connector version (please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors)): + - bump the `dockerImageTag` value in in `metadata.yaml` + - bump the `version` value in `pyproject.toml` 3. Make sure the `metadata.yaml` content is up to date. 4. Make sure the connector documentation and its changelog is up to date (`docs/integrations/sources/amazon-seller-partner.md`). 5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention). 6. Pat yourself on the back for being an awesome contributor. 7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. -8. Once your PR is merged, the new version of the connector will be automatically published to Docker Hub and our connector registry. \ No newline at end of file +8. Once your PR is merged, the new version of the connector will be automatically published to Docker Hub and our connector registry. diff --git a/airbyte-integrations/connectors/source-amazon-sqs/README.md b/airbyte-integrations/connectors/source-amazon-sqs/README.md index 007a1acdf02..79af68c65a7 100644 --- a/airbyte-integrations/connectors/source-amazon-sqs/README.md +++ b/airbyte-integrations/connectors/source-amazon-sqs/README.md @@ -6,22 +6,27 @@ For information about how to use this connector within Airbyte, see [the documen ## Local development ### Prerequisites + **To iterate on this connector, make sure to complete this prerequisites section.** #### Minimum Python version required `= 3.7.0` #### Build & Activate Virtual Environment and install dependencies + From this connector directory, create a virtual environment: + ``` python -m venv .venv ``` This will generate a virtualenv for this module in `.venv/`. Make sure this venv is active in your development environment of choice. To activate it from the terminal, run: + ``` source .venv/bin/activate pip install -r requirements.txt ``` + If you are in an IDE, follow your IDE's instructions to activate the virtualenv. Note that while we are installing dependencies from `requirements.txt`, you should only edit `setup.py` for your dependencies. `requirements.txt` is @@ -30,6 +35,7 @@ If this is mumbo jumbo to you, don't worry about it, just put your deps in `setu should work as you expect. #### Create credentials + **If you are a community contributor**, follow the instructions in the [documentation](https://docs.airbyte.io/integrations/sources/amazon-sqs) to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `source_amazon_sqs/spec.json` file. Note that the `secrets` directory is gitignored by default, so there is no danger of accidentally checking in sensitive information. @@ -39,6 +45,7 @@ See `integration_tests/sample_config.json` for a sample config file. and place them into `secrets/config.json`. 
### Locally running the connector + ``` python main.py spec python main.py check --config secrets/config.json @@ -48,9 +55,10 @@ python main.py read --config secrets/config.json --catalog integration_tests/con ### Locally running the connector docker image - #### Build + **Via [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md) (recommended):** + ```bash airbyte-ci connectors --name=source-amazon-sqs build ``` @@ -58,12 +66,15 @@ airbyte-ci connectors --name=source-amazon-sqs build An image will be built with the tag `airbyte/source-amazon-sqs:dev`. **Via `docker build`:** + ```bash docker build -t airbyte/source-amazon-sqs:dev . ``` #### Run + Then run any of the connector commands as follows: + ``` docker run --rm airbyte/source-amazon-sqs:dev spec docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-amazon-sqs:dev check --config /secrets/config.json @@ -72,23 +83,30 @@ docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integrat ``` ## Testing + You can run our full test suite locally using [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md): + ```bash airbyte-ci connectors --name=source-amazon-sqs test ``` ### Customizing acceptance Tests + Customize `acceptance-test-config.yml` file to configure tests. See [Connector Acceptance Tests](https://docs.airbyte.com/connector-development/testing-connectors/connector-acceptance-tests-reference) for more information. If your connector requires to create or destroy resources for use during acceptance tests create fixtures for it and place them inside integration_tests/acceptance.py. ## Dependency Management + All of your dependencies should go in `setup.py`, NOT `requirements.txt`. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development. We split dependencies between two groups, dependencies that are: -* required for your connector to work need to go to `MAIN_REQUIREMENTS` list. -* required for the testing need to go to `TEST_REQUIREMENTS` list + +- required for your connector to work need to go to `MAIN_REQUIREMENTS` list. +- required for the testing need to go to `TEST_REQUIREMENTS` list ### Publishing a new version of the connector + You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? + 1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-amazon-sqs test` 2. Bump the connector version in `metadata.yaml`: increment the `dockerImageTag` value. Please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors). 3. Make sure the `metadata.yaml` content is up to date. @@ -96,4 +114,3 @@ You've checked out the repo, implemented a million dollar feature, and you're re 5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention). 6. Pat yourself on the back for being an awesome contributor. 7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. 
- diff --git a/airbyte-integrations/connectors/source-amazon-sqs/bootstrap.md b/airbyte-integrations/connectors/source-amazon-sqs/bootstrap.md index 42d2210a63d..f7ec7d1bf30 100644 --- a/airbyte-integrations/connectors/source-amazon-sqs/bootstrap.md +++ b/airbyte-integrations/connectors/source-amazon-sqs/bootstrap.md @@ -1,11 +1,14 @@ # Amazon SQS Source ## What + This is a connector for consuming messages from an [Amazon SQS Queue](https://docs.aws.amazon.com/AWSSimpleQueueService/latest/SQSDeveloperGuide/welcome.html) ## How + ### Polling -It uses [long polling](https://docs.aws.amazon.com/AWSSimpleQueueService/latest/SQSDeveloperGuide/sqs-short-and-long-polling.html) to consume in batches + +It uses [long polling](https://docs.aws.amazon.com/AWSSimpleQueueService/latest/SQSDeveloperGuide/sqs-short-and-long-polling.html) to consume in batches of up to 10 at a time (10 is the maximum defined by the AWS API). The batch size is configurable between 1 and 10 (a size of 0 would use short-polling, this is not allowed). @@ -13,14 +16,17 @@ The batch size is configurable between 1 and 10 (a size of 0 would use short-pol Using larger batches reduces the amount of connections thus increasing performance. ### Deletes -Optionally, it can delete messages after reading - the delete_message() call is made __after__ yielding the message to the generator. -This means that messages aren't deleted unless read by a Destination - however, there is still potential that this could result in -missed messages if the Destination fails __after__ taking the message, but before commiting to to its own downstream. + +Optionally, it can delete messages after reading - the delete_message() call is made **after** yielding the message to the generator. +This means that messages aren't deleted unless read by a Destination - however, there is still potential that this could result in +missed messages if the Destination fails **after** taking the message, but before commiting to to its own downstream. ### Credentials + Requires an AWS IAM Access Key ID and Secret Key. This could be improved to add support for configured AWS profiles, env vars etc. ### Output -Although messages are consumed in batches, they are output from the Source as individual messages. \ No newline at end of file + +Although messages are consumed in batches, they are output from the Source as individual messages. diff --git a/airbyte-integrations/connectors/source-amplitude/README.md b/airbyte-integrations/connectors/source-amplitude/README.md index 6d9f9f816a6..246d7aef8ce 100644 --- a/airbyte-integrations/connectors/source-amplitude/README.md +++ b/airbyte-integrations/connectors/source-amplitude/README.md @@ -1,31 +1,32 @@ # Amplitude source connector - This is the repository for the Amplitude source connector, written in Python. For information about how to use this connector within Airbyte, see [the documentation](https://docs.airbyte.com/integrations/sources/amplitude). 
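The source-amazon-sqs bootstrap notes above describe batched long polling and delete-after-yield semantics. A minimal boto3 sketch of that flow is below; the queue URL, batch size, and 20-second wait are placeholder values, not the connector's actual implementation.

```python
import boto3


def read_messages(queue_url: str, batch_size: int = 10, delete_after_read: bool = False):
    # batch_size must be between 1 and 10; WaitTimeSeconds > 0 enables long polling.
    sqs = boto3.client("sqs")
    response = sqs.receive_message(
        QueueUrl=queue_url,
        MaxNumberOfMessages=batch_size,
        WaitTimeSeconds=20,
    )
    for message in response.get("Messages", []):
        # Yield first so the message is only removed after a consumer has taken it...
        yield message["Body"]
        if delete_after_read:
            # ...then delete, accepting the risk described in the bootstrap: a downstream
            # failure after this point can still lose the message.
            sqs.delete_message(QueueUrl=queue_url, ReceiptHandle=message["ReceiptHandle"])
```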
## Local development ### Prerequisites -* Python (~=3.9) -* Poetry (~=1.7) - installation instructions [here](https://python-poetry.org/docs/#installation) +- Python (~=3.9) +- Poetry (~=1.7) - installation instructions [here](https://python-poetry.org/docs/#installation) ### Installing the connector + From this connector directory, run: + ```bash poetry install --with dev ``` - ### Create credentials + **If you are a community contributor**, follow the instructions in the [documentation](https://docs.airbyte.com/integrations/sources/amplitude) to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `source_amplitude/spec.yaml` file. Note that any directory named `secrets` is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. See `sample_files/sample_config.json` for a sample config file. - ### Locally running the connector + ``` poetry run source-amplitude spec poetry run source-amplitude check --config secrets/config.json @@ -34,23 +35,28 @@ poetry run source-amplitude read --config secrets/config.json --catalog sample_f ``` ### Running unit tests + To run unit tests locally, from the connector directory run: + ``` poetry run pytest unit_tests ``` ### Building the docker image + 1. Install [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md) 2. Run the following command to build the docker image: + ```bash airbyte-ci connectors --name=source-amplitude build ``` An image will be available on your host with the tag `airbyte/source-amplitude:dev`. - ### Running as a docker container + Then run any of the connector commands as follows: + ``` docker run --rm airbyte/source-amplitude:dev spec docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-amplitude:dev check --config /secrets/config.json @@ -59,18 +65,23 @@ docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integrat ``` ### Running our CI test suite + You can run our full test suite locally using [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md): + ```bash airbyte-ci connectors --name=source-amplitude test ``` ### Customizing acceptance Tests + Customize `acceptance-test-config.yml` file to configure acceptance tests. See [Connector Acceptance Tests](https://docs.airbyte.com/connector-development/testing-connectors/connector-acceptance-tests-reference) for more information. If your connector requires to create or destroy resources for use during acceptance tests create fixtures for it and place them inside integration_tests/acceptance.py. ### Dependency Management -All of your dependencies should be managed via Poetry. + +All of your dependencies should be managed via Poetry. To add a new dependency, run: + ```bash poetry add ``` @@ -78,14 +89,16 @@ poetry add Please commit the changes to `pyproject.toml` and `poetry.lock` files. ## Publishing a new version of the connector + You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? + 1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-amplitude test` -2. 
Bump the connector version (please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors)): - - bump the `dockerImageTag` value in in `metadata.yaml` - - bump the `version` value in `pyproject.toml` +2. Bump the connector version (please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors)): + - bump the `dockerImageTag` value in `metadata.yaml` + - bump the `version` value in `pyproject.toml` 3. Make sure the `metadata.yaml` content is up to date. 4. Make sure the connector documentation and its changelog is up to date (`docs/integrations/sources/amplitude.md`). 5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention). 6. Pat yourself on the back for being an awesome contributor. 7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. -8. Once your PR is merged, the new version of the connector will be automatically published to Docker Hub and our connector registry. \ No newline at end of file +8. Once your PR is merged, the new version of the connector will be automatically published to Docker Hub and our connector registry. diff --git a/airbyte-integrations/connectors/source-apify-dataset/README.md b/airbyte-integrations/connectors/source-apify-dataset/README.md index 36ec48bdb49..93c9824a0a1 100644 --- a/airbyte-integrations/connectors/source-apify-dataset/README.md +++ b/airbyte-integrations/connectors/source-apify-dataset/README.md @@ -1,31 +1,32 @@ # Apify-Dataset source connector - This is the repository for the Apify-Dataset source connector, written in Python. For information about how to use this connector within Airbyte, see [the documentation](https://docs.airbyte.com/integrations/sources/apify-dataset). ## Local development ### Prerequisites -* Python (~=3.9) -* Poetry (~=1.7) - installation instructions [here](https://python-poetry.org/docs/#installation) +- Python (~=3.9) +- Poetry (~=1.7) - installation instructions [here](https://python-poetry.org/docs/#installation) ### Installing the connector + From this connector directory, run: + ```bash poetry install --with dev ``` - ### Create credentials + **If you are a community contributor**, follow the instructions in the [documentation](https://docs.airbyte.com/integrations/sources/apify-dataset) to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `source_apify_dataset/spec.yaml` file. Note that any directory named `secrets` is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. See `sample_files/sample_config.json` for a sample config file. - ### Locally running the connector + ``` poetry run source-apify-dataset spec poetry run source-apify-dataset check --config secrets/config.json @@ -34,23 +35,28 @@ poetry run source-apify-dataset read --config secrets/config.json --catalog samp ``` ### Running unit tests + To run unit tests locally, from the connector directory run: + ``` poetry run pytest unit_tests ``` ### Building the docker image + 1. Install [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md) 2.
Run the following command to build the docker image: + ```bash airbyte-ci connectors --name=source-apify-dataset build ``` An image will be available on your host with the tag `airbyte/source-apify-dataset:dev`. - ### Running as a docker container + Then run any of the connector commands as follows: + ``` docker run --rm airbyte/source-apify-dataset:dev spec docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-apify-dataset:dev check --config /secrets/config.json @@ -59,18 +65,23 @@ docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integrat ``` ### Running our CI test suite + You can run our full test suite locally using [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md): + ```bash airbyte-ci connectors --name=source-apify-dataset test ``` ### Customizing acceptance Tests + Customize `acceptance-test-config.yml` file to configure acceptance tests. See [Connector Acceptance Tests](https://docs.airbyte.com/connector-development/testing-connectors/connector-acceptance-tests-reference) for more information. If your connector requires to create or destroy resources for use during acceptance tests create fixtures for it and place them inside integration_tests/acceptance.py. ### Dependency Management -All of your dependencies should be managed via Poetry. + +All of your dependencies should be managed via Poetry. To add a new dependency, run: + ```bash poetry add ``` @@ -78,14 +89,16 @@ poetry add Please commit the changes to `pyproject.toml` and `poetry.lock` files. ## Publishing a new version of the connector + You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? + 1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-apify-dataset test` -2. Bump the connector version (please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors)): - - bump the `dockerImageTag` value in in `metadata.yaml` - - bump the `version` value in `pyproject.toml` +2. Bump the connector version (please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors)): + - bump the `dockerImageTag` value in `metadata.yaml` + - bump the `version` value in `pyproject.toml` 3. Make sure the `metadata.yaml` content is up to date. 4. Make sure the connector documentation and its changelog is up to date (`docs/integrations/sources/apify-dataset.md`). 5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention). 6. Pat yourself on the back for being an awesome contributor. 7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. -8. Once your PR is merged, the new version of the connector will be automatically published to Docker Hub and our connector registry. \ No newline at end of file +8. Once your PR is merged, the new version of the connector will be automatically published to Docker Hub and our connector registry.
diff --git a/airbyte-integrations/connectors/source-appfollow/README.md b/airbyte-integrations/connectors/source-appfollow/README.md index 31306ce4b03..b44ac2709d7 100644 --- a/airbyte-integrations/connectors/source-appfollow/README.md +++ b/airbyte-integrations/connectors/source-appfollow/README.md @@ -6,6 +6,7 @@ For information about how to use this connector within Airbyte, see [the documen ## Local development #### Create credentials + **If you are a community contributor**, follow the instructions in the [documentation](https://docs.airbyte.com/integrations/sources/appfollow) to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `source_appfollow/spec.yaml` file. Note that any directory named `secrets` is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. @@ -16,9 +17,10 @@ and place them into `secrets/config.json`. ### Locally running the connector docker image - #### Build + **Via [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md) (recommended):** + ```bash airbyte-ci connectors --name=source-appfollow build ``` @@ -26,12 +28,15 @@ airbyte-ci connectors --name=source-appfollow build An image will be built with the tag `airbyte/source-appfollow:dev`. **Via `docker build`:** + ```bash docker build -t airbyte/source-appfollow:dev . ``` #### Run + Then run any of the connector commands as follows: + ``` docker run --rm airbyte/source-appfollow:dev spec docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-appfollow:dev check --config /secrets/config.json @@ -40,23 +45,30 @@ docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integrat ``` ## Testing + You can run our full test suite locally using [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md): + ```bash airbyte-ci connectors --name=source-appfollow test ``` ### Customizing acceptance Tests + Customize `acceptance-test-config.yml` file to configure tests. See [Connector Acceptance Tests](https://docs.airbyte.com/connector-development/testing-connectors/connector-acceptance-tests-reference) for more information. If your connector requires to create or destroy resources for use during acceptance tests create fixtures for it and place them inside integration_tests/acceptance.py. ## Dependency Management + All of your dependencies should go in `setup.py`, NOT `requirements.txt`. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development. We split dependencies between two groups, dependencies that are: -* required for your connector to work need to go to `MAIN_REQUIREMENTS` list. -* required for the testing need to go to `TEST_REQUIREMENTS` list + +- required for your connector to work need to go to `MAIN_REQUIREMENTS` list. +- required for the testing need to go to `TEST_REQUIREMENTS` list ### Publishing a new version of the connector + You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? + 1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-appfollow test` 2. Bump the connector version in `metadata.yaml`: increment the `dockerImageTag` value. Please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors). 3. 
Make sure the `metadata.yaml` content is up to date. @@ -64,4 +76,3 @@ You've checked out the repo, implemented a million dollar feature, and you're re 5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention). 6. Pat yourself on the back for being an awesome contributor. 7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. - diff --git a/airbyte-integrations/connectors/source-apple-search-ads/README.md b/airbyte-integrations/connectors/source-apple-search-ads/README.md index 1c8b95f9aba..778d05cb013 100644 --- a/airbyte-integrations/connectors/source-apple-search-ads/README.md +++ b/airbyte-integrations/connectors/source-apple-search-ads/README.md @@ -6,6 +6,7 @@ For information about how to use this connector within Airbyte, see [the documen ## Local development #### Create credentials + **If you are a community contributor**, follow the instructions in the [documentation](https://docs.airbyte.io/integrations/sources/apple-search-ads) to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `source_apple_search_ads/spec.yaml` file. Note that any directory named `secrets` is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. @@ -16,9 +17,10 @@ and place them into `secrets/config.json`. ### Locally running the connector docker image - #### Build + **Via [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md) (recommended):** + ```bash airbyte-ci connectors --name=source-apple-search-ads build ``` @@ -26,12 +28,15 @@ airbyte-ci connectors --name=source-apple-search-ads build An image will be built with the tag `airbyte/source-apple-search-ads:dev`. **Via `docker build`:** + ```bash docker build -t airbyte/source-apple-search-ads:dev . ``` #### Run + Then run any of the connector commands as follows: + ``` docker run --rm airbyte/source-apple-search-ads:dev spec docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-apple-search-ads:dev check --config /secrets/config.json @@ -40,23 +45,30 @@ docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integrat ``` ## Testing + You can run our full test suite locally using [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md): + ```bash airbyte-ci connectors --name=source-apple-search-ads test ``` ### Customizing acceptance Tests + Customize `acceptance-test-config.yml` file to configure tests. See [Connector Acceptance Tests](https://docs.airbyte.com/connector-development/testing-connectors/connector-acceptance-tests-reference) for more information. If your connector requires to create or destroy resources for use during acceptance tests create fixtures for it and place them inside integration_tests/acceptance.py. ## Dependency Management + All of your dependencies should go in `setup.py`, NOT `requirements.txt`. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development. We split dependencies between two groups, dependencies that are: -* required for your connector to work need to go to `MAIN_REQUIREMENTS` list. -* required for the testing need to go to `TEST_REQUIREMENTS` list + +- required for your connector to work need to go to `MAIN_REQUIREMENTS` list. 
+- required for the testing need to go to `TEST_REQUIREMENTS` list ### Publishing a new version of the connector + You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? + 1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-apple-search-ads test` 2. Bump the connector version in `metadata.yaml`: increment the `dockerImageTag` value. Please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors). 3. Make sure the `metadata.yaml` content is up to date. @@ -64,4 +76,3 @@ You've checked out the repo, implemented a million dollar feature, and you're re 5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention). 6. Pat yourself on the back for being an awesome contributor. 7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. - diff --git a/airbyte-integrations/connectors/source-apple-search-ads/bootstrap.md b/airbyte-integrations/connectors/source-apple-search-ads/bootstrap.md index 83a87921d30..1b605794ebe 100644 --- a/airbyte-integrations/connectors/source-apple-search-ads/bootstrap.md +++ b/airbyte-integrations/connectors/source-apple-search-ads/bootstrap.md @@ -1,26 +1,23 @@ - ## Base streams Apple Search Ads is a REST based API. Connector is implemented with the [Airbyte Low-Code CDK](https://docs.airbyte.com/connector-development/config-based/low-code-cdk-overview/) Connector has base streams including attributes about entities in the API (e.g: what campaigns, which adgroups, etc…), and all of them support full refresh only: -* [Campaigns](https://developer.apple.com/documentation/apple_search_ads/get_all_campaigns) -* [AdGroups](https://developer.apple.com/documentation/apple_search_ads/get_all_ad_groups) -* [Keywords](https://developer.apple.com/documentation/apple_search_ads/get_all_targeting_keywords_in_an_ad_group) +- [Campaigns](https://developer.apple.com/documentation/apple_search_ads/get_all_campaigns) +- [AdGroups](https://developer.apple.com/documentation/apple_search_ads/get_all_ad_groups) +- [Keywords](https://developer.apple.com/documentation/apple_search_ads/get_all_targeting_keywords_in_an_ad_group) ## Report streams Connector also has report streams including statistics about entities (e.g: how many spending on a campaign, how many clicks on a keyword, etc...) which support incremental sync. -* [Campaign-Level Report](https://developer.apple.com/documentation/apple_search_ads/get_campaign-level_reports) -* [Ad Group-Level Report](https://developer.apple.com/documentation/apple_search_ads/get__ad_group-level_reports) -* [Keyword-Level Report](https://developer.apple.com/documentation/apple_search_ads/get_keyword-level_reports) - +- [Campaign-Level Report](https://developer.apple.com/documentation/apple_search_ads/get_campaign-level_reports) +- [Ad Group-Level Report](https://developer.apple.com/documentation/apple_search_ads/get__ad_group-level_reports) +- [Keyword-Level Report](https://developer.apple.com/documentation/apple_search_ads/get_keyword-level_reports) Connector uses `start_date` config for initial reports sync and current date as an end date if this one is not explicitly set. 
At the moment, report streams are only set to the `DAILY` granularity (e.g: `campaigns_report_daily`, `adgroups_report_daily`, `keywords_report_daily`). - See [this](https://docs.airbyte.io/integrations/sources/apple-search-ads) link for the nuances about the connector. diff --git a/airbyte-integrations/connectors/source-appsflyer/README.md b/airbyte-integrations/connectors/source-appsflyer/README.md index 6acca4cd2e3..f6bb966066e 100644 --- a/airbyte-integrations/connectors/source-appsflyer/README.md +++ b/airbyte-integrations/connectors/source-appsflyer/README.md @@ -6,22 +6,27 @@ For information about how to use this connector within Airbyte, see [the documen ## Local development ### Prerequisites + **To iterate on this connector, make sure to complete this prerequisites section.** #### Minimum Python version required `= 3.7.0` #### Build & Activate Virtual Environment and install dependencies + From this connector directory, create a virtual environment: + ``` python -m venv .venv ``` This will generate a virtualenv for this module in `.venv/`. Make sure this venv is active in your development environment of choice. To activate it from the terminal, run: + ``` source .venv/bin/activate pip install -r requirements.txt ``` + If you are in an IDE, follow your IDE's instructions to activate the virtualenv. Note that while we are installing dependencies from `requirements.txt`, you should only edit `setup.py` for your dependencies. `requirements.txt` is @@ -30,6 +35,7 @@ If this is mumbo jumbo to you, don't worry about it, just put your deps in `setu should work as you expect. #### Create credentials + **If you are a community contributor**, follow the instructions in the [documentation](https://docs.airbyte.io/integrations/sources/appsflyer) to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `source_appsflyer/spec.json` file. Note that any directory named `secrets` is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. @@ -39,6 +45,7 @@ See `integration_tests/sample_config.json` for a sample config file. and place them into `secrets/config.json`. ### Locally running the connector + ``` python main.py spec python main.py check --config secrets/config.json @@ -48,9 +55,10 @@ python main.py read --config secrets/config.json --catalog integration_tests/con ### Locally running the connector docker image - #### Build + **Via [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md) (recommended):** + ```bash airbyte-ci connectors --name=source-appsflyer build ``` @@ -58,12 +66,15 @@ airbyte-ci connectors --name=source-appsflyer build An image will be built with the tag `airbyte/source-appsflyer:dev`. **Via `docker build`:** + ```bash docker build -t airbyte/source-appsflyer:dev . ``` #### Run + Then run any of the connector commands as follows: + ``` docker run --rm airbyte/source-appsflyer:dev spec docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-appsflyer:dev check --config /secrets/config.json @@ -72,23 +83,30 @@ docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integrat ``` ## Testing + You can run our full test suite locally using [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md): + ```bash airbyte-ci connectors --name=source-appsflyer test ``` ### Customizing acceptance Tests + Customize `acceptance-test-config.yml` file to configure tests. 
See [Connector Acceptance Tests](https://docs.airbyte.com/connector-development/testing-connectors/connector-acceptance-tests-reference) for more information. If your connector requires to create or destroy resources for use during acceptance tests create fixtures for it and place them inside integration_tests/acceptance.py. ## Dependency Management + All of your dependencies should go in `setup.py`, NOT `requirements.txt`. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development. We split dependencies between two groups, dependencies that are: -* required for your connector to work need to go to `MAIN_REQUIREMENTS` list. -* required for the testing need to go to `TEST_REQUIREMENTS` list + +- required for your connector to work need to go to `MAIN_REQUIREMENTS` list. +- required for the testing need to go to `TEST_REQUIREMENTS` list ### Publishing a new version of the connector + You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? + 1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-appsflyer test` 2. Bump the connector version in `metadata.yaml`: increment the `dockerImageTag` value. Please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors). 3. Make sure the `metadata.yaml` content is up to date. @@ -96,4 +114,3 @@ You've checked out the repo, implemented a million dollar feature, and you're re 5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention). 6. Pat yourself on the back for being an awesome contributor. 7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. - diff --git a/airbyte-integrations/connectors/source-asana/README.md b/airbyte-integrations/connectors/source-asana/README.md index 84a96fb4dbd..f07a5b67706 100644 --- a/airbyte-integrations/connectors/source-asana/README.md +++ b/airbyte-integrations/connectors/source-asana/README.md @@ -6,22 +6,27 @@ For information about how to use this connector within Airbyte, see [the documen ## Local development ### Prerequisites + **To iterate on this connector, make sure to complete this prerequisites section.** #### Minimum Python version required `= 3.7.0` #### Build & Activate Virtual Environment and install dependencies + From this connector directory, create a virtual environment: + ``` python -m venv .venv ``` This will generate a virtualenv for this module in `.venv/`. Make sure this venv is active in your development environment of choice. To activate it from the terminal, run: + ``` source .venv/bin/activate pip install -r requirements.txt ``` + If you are in an IDE, follow your IDE's instructions to activate the virtualenv. Note that while we are installing dependencies from `requirements.txt`, you should only edit `setup.py` for your dependencies. `requirements.txt` is @@ -30,6 +35,7 @@ If this is mumbo jumbo to you, don't worry about it, just put your deps in `setu should work as you expect. #### Create credentials + **If you are a community contributor**, follow the instructions in the [documentation](https://docs.airbyte.io/integrations/sources/asana) to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `source_asana/spec.json` file. 
Note that any directory named `secrets` is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. @@ -39,6 +45,7 @@ See `integration_tests/sample_config.json` for a sample config file. and place them into `secrets/config.json`. ### Locally running the connector + ``` python main.py spec python main.py check --config secrets/config.json @@ -48,9 +55,10 @@ python main.py read --config secrets/config.json --catalog integration_tests/con ### Locally running the connector docker image - #### Build + **Via [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md) (recommended):** + ```bash airbyte-ci connectors --name=source-asana build ``` @@ -58,12 +66,15 @@ airbyte-ci connectors --name=source-asana build An image will be built with the tag `airbyte/source-asana:dev`. **Via `docker build`:** + ```bash docker build -t airbyte/source-asana:dev . ``` #### Run + Then run any of the connector commands as follows: + ``` docker run --rm airbyte/source-asana:dev spec docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-asana:dev check --config /secrets/config.json @@ -72,23 +83,30 @@ docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integrat ``` ## Testing + You can run our full test suite locally using [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md): + ```bash airbyte-ci connectors --name=source-asana test ``` ### Customizing acceptance Tests + Customize `acceptance-test-config.yml` file to configure tests. See [Connector Acceptance Tests](https://docs.airbyte.com/connector-development/testing-connectors/connector-acceptance-tests-reference) for more information. If your connector requires to create or destroy resources for use during acceptance tests create fixtures for it and place them inside integration_tests/acceptance.py. ## Dependency Management + All of your dependencies should go in `setup.py`, NOT `requirements.txt`. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development. We split dependencies between two groups, dependencies that are: -* required for your connector to work need to go to `MAIN_REQUIREMENTS` list. -* required for the testing need to go to `TEST_REQUIREMENTS` list + +- required for your connector to work need to go to `MAIN_REQUIREMENTS` list. +- required for the testing need to go to `TEST_REQUIREMENTS` list ### Publishing a new version of the connector + You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? + 1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-asana test` 2. Bump the connector version in `metadata.yaml`: increment the `dockerImageTag` value. Please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors). 3. Make sure the `metadata.yaml` content is up to date. @@ -96,4 +114,3 @@ You've checked out the repo, implemented a million dollar feature, and you're re 5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention). 6. Pat yourself on the back for being an awesome contributor. 7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. 
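Several of the READMEs above point to `integration_tests/acceptance.py` as the place to create and destroy resources needed by the acceptance tests. A minimal sketch of such a fixture, assuming pytest (the setup and teardown bodies are placeholders, not taken from any connector):

```python
# integration_tests/acceptance.py - illustrative sketch only.
import pytest


@pytest.fixture(scope="session", autouse=True)
def connector_setup():
    # Create any external resources the acceptance tests rely on here,
    # e.g. seed a few records in the source account (placeholder).
    yield
    # Tear those resources down once the acceptance test session ends.
```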
- diff --git a/airbyte-integrations/connectors/source-ashby/README.md b/airbyte-integrations/connectors/source-ashby/README.md index d19ec1c25c3..dfe13493ecc 100644 --- a/airbyte-integrations/connectors/source-ashby/README.md +++ b/airbyte-integrations/connectors/source-ashby/README.md @@ -6,6 +6,7 @@ For information about how to use this connector within Airbyte, see [the documen ## Local development #### Create credentials + **If you are a community contributor**, follow the instructions in the [documentation](https://docs.airbyte.io/integrations/sources/ashby) to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `source_ashby/spec.yaml` file. Note that any directory named `secrets` is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. @@ -16,9 +17,10 @@ and place them into `secrets/config.json`. ### Locally running the connector docker image - #### Build + **Via [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md) (recommended):** + ```bash airbyte-ci connectors --name=source-ashby build ``` @@ -26,12 +28,15 @@ airbyte-ci connectors --name=source-ashby build An image will be built with the tag `airbyte/source-ashby:dev`. **Via `docker build`:** + ```bash docker build -t airbyte/source-ashby:dev . ``` #### Run + Then run any of the connector commands as follows: + ``` docker run --rm airbyte/source-ashby:dev spec docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-ashby:dev check --config /secrets/config.json @@ -40,23 +45,30 @@ docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integrat ``` ## Testing + You can run our full test suite locally using [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md): + ```bash airbyte-ci connectors --name=source-ashby test ``` ### Customizing acceptance Tests + Customize `acceptance-test-config.yml` file to configure tests. See [Connector Acceptance Tests](https://docs.airbyte.com/connector-development/testing-connectors/connector-acceptance-tests-reference) for more information. If your connector requires to create or destroy resources for use during acceptance tests create fixtures for it and place them inside integration_tests/acceptance.py. ## Dependency Management + All of your dependencies should go in `setup.py`, NOT `requirements.txt`. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development. We split dependencies between two groups, dependencies that are: -* required for your connector to work need to go to `MAIN_REQUIREMENTS` list. -* required for the testing need to go to `TEST_REQUIREMENTS` list + +- required for your connector to work need to go to `MAIN_REQUIREMENTS` list. +- required for the testing need to go to `TEST_REQUIREMENTS` list ### Publishing a new version of the connector + You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? + 1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-ashby test` 2. Bump the connector version in `metadata.yaml`: increment the `dockerImageTag` value. Please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors). 3. Make sure the `metadata.yaml` content is up to date. 
@@ -64,4 +76,3 @@ You've checked out the repo, implemented a million dollar feature, and you're re 5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention). 6. Pat yourself on the back for being an awesome contributor. 7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. - diff --git a/airbyte-integrations/connectors/source-auth0/README.md b/airbyte-integrations/connectors/source-auth0/README.md index 843b66367cb..bb98494fd62 100644 --- a/airbyte-integrations/connectors/source-auth0/README.md +++ b/airbyte-integrations/connectors/source-auth0/README.md @@ -1,31 +1,32 @@ # Auth0 source connector - This is the repository for the Auth0 source connector, written in Python. For information about how to use this connector within Airbyte, see [the documentation](https://docs.airbyte.com/integrations/sources/auth0). ## Local development ### Prerequisites -* Python (~=3.9) -* Poetry (~=1.7) - installation instructions [here](https://python-poetry.org/docs/#installation) +- Python (~=3.9) +- Poetry (~=1.7) - installation instructions [here](https://python-poetry.org/docs/#installation) ### Installing the connector + From this connector directory, run: + ```bash poetry install --with dev ``` - ### Create credentials + **If you are a community contributor**, follow the instructions in the [documentation](https://docs.airbyte.com/integrations/sources/auth0) to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `source_auth0/spec.yaml` file. Note that any directory named `secrets` is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. See `sample_files/sample_config.json` for a sample config file. - ### Locally running the connector + ``` poetry run source-auth0 spec poetry run source-auth0 check --config secrets/config.json @@ -34,23 +35,28 @@ poetry run source-auth0 read --config secrets/config.json --catalog sample_files ``` ### Running unit tests + To run unit tests locally, from the connector directory run: + ``` poetry run pytest unit_tests ``` ### Building the docker image + 1. Install [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md) 2. Run the following command to build the docker image: + ```bash airbyte-ci connectors --name=source-auth0 build ``` An image will be available on your host with the tag `airbyte/source-auth0:dev`. - ### Running as a docker container + Then run any of the connector commands as follows: + ``` docker run --rm airbyte/source-auth0:dev spec docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-auth0:dev check --config /secrets/config.json @@ -59,18 +65,23 @@ docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integrat ``` ### Running our CI test suite + You can run our full test suite locally using [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md): + ```bash airbyte-ci connectors --name=source-auth0 test ``` ### Customizing acceptance Tests + Customize `acceptance-test-config.yml` file to configure acceptance tests. See [Connector Acceptance Tests](https://docs.airbyte.com/connector-development/testing-connectors/connector-acceptance-tests-reference) for more information. 
If your connector requires to create or destroy resources for use during acceptance tests create fixtures for it and place them inside integration_tests/acceptance.py. ### Dependency Management -All of your dependencies should be managed via Poetry. + +All of your dependencies should be managed via Poetry. To add a new dependency, run: + ```bash poetry add ``` @@ -78,14 +89,16 @@ poetry add Please commit the changes to `pyproject.toml` and `poetry.lock` files. ## Publishing a new version of the connector + You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? + 1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-auth0 test` -2. Bump the connector version (please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors)): - - bump the `dockerImageTag` value in in `metadata.yaml` - - bump the `version` value in `pyproject.toml` +2. Bump the connector version (please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors)): + - bump the `dockerImageTag` value in `metadata.yaml` + - bump the `version` value in `pyproject.toml` 3. Make sure the `metadata.yaml` content is up to date. 4. Make sure the connector documentation and its changelog is up to date (`docs/integrations/sources/auth0.md`). 5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention). 6. Pat yourself on the back for being an awesome contributor. 7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. -8. Once your PR is merged, the new version of the connector will be automatically published to Docker Hub and our connector registry. \ No newline at end of file +8. Once your PR is merged, the new version of the connector will be automatically published to Docker Hub and our connector registry. diff --git a/airbyte-integrations/connectors/source-avni/README.md b/airbyte-integrations/connectors/source-avni/README.md index 075de8d4779..524f520c1a5 100644 --- a/airbyte-integrations/connectors/source-avni/README.md +++ b/airbyte-integrations/connectors/source-avni/README.md @@ -6,14 +6,17 @@ For information about how to use this connector within Airbyte, see [the documen ## Local development #### Building via Gradle + You can also build the connector in Gradle. This is typically used in CI and not needed for your development workflow. To build using Gradle, from the Airbyte repository root, run: + ``` ./gradlew :airbyte-integrations:connectors:source-avni:build ``` #### Create credentials + **If you are a community contributor**, follow the instructions in the [documentation](https://docs.airbyte.com/integrations/sources/avni) to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `source_avni/spec.yaml` file. Note that any directory named `secrets` is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. @@ -25,56 +28,73 @@ and place them into `secrets/config.json`. ### Locally running the connector docker image #### Build + First, make sure you build the latest Docker image: + ``` docker build .
-t airbyte/source-avni:dev ``` You can also build the connector image via Gradle: + ``` ./gradlew :airbyte-integrations:connectors:source-avni:airbyteDocker ``` + When building via Gradle, the docker image name and tag, respectively, are the values of the `io.airbyte.name` and `io.airbyte.version` `LABEL`s in the Dockerfile. #### Run + Then run any of the connector commands as follows: + ``` docker run --rm airbyte/source-avni:dev spec docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-avni:dev check --config /secrets/config.json docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-avni:dev discover --config /secrets/config.json docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integration_tests airbyte/source-avni:dev read --config /secrets/config.json --catalog /integration_tests/configured_catalog.json ``` + ## Testing #### Acceptance Tests + Customize `acceptance-test-config.yml` file to configure tests. See [Connector Acceptance Tests](https://docs.airbyte.com/connector-development/testing-connectors/connector-acceptance-tests-reference) for more information. If your connector requires to create or destroy resources for use during acceptance tests create fixtures for it and place them inside integration_tests/acceptance.py. To run your integration tests with Docker, run: + ``` ./acceptance-test-docker.sh ``` ### Using gradle to run tests + All commands should be run from airbyte project root. To run unit tests: + ``` ./gradlew :airbyte-integrations:connectors:source-avni:unitTest ``` + To run acceptance and custom integration tests: + ``` ./gradlew :airbyte-integrations:connectors:source-avni:integrationTest ``` ## Dependency Management + All of your dependencies should go in `setup.py`, NOT `requirements.txt`. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development. We split dependencies between two groups, dependencies that are: -* required for your connector to work need to go to `MAIN_REQUIREMENTS` list. -* required for the testing need to go to `TEST_REQUIREMENTS` list + +- required for your connector to work need to go to `MAIN_REQUIREMENTS` list. +- required for the testing need to go to `TEST_REQUIREMENTS` list ### Publishing a new version of the connector + You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? + 1. Make sure your changes are passing unit and integration tests. 1. Bump the connector version in `Dockerfile` -- just increment the value of the `LABEL io.airbyte.version` appropriately (we use [SemVer](https://semver.org/)). 1. Create a Pull Request. diff --git a/airbyte-integrations/connectors/source-aws-cloudtrail/README.md b/airbyte-integrations/connectors/source-aws-cloudtrail/README.md index 7d47e723769..7b595768591 100644 --- a/airbyte-integrations/connectors/source-aws-cloudtrail/README.md +++ b/airbyte-integrations/connectors/source-aws-cloudtrail/README.md @@ -6,22 +6,27 @@ For information about how to use this connector within Airbyte, see [the documen ## Local development ### Prerequisites + **To iterate on this connector, make sure to complete this prerequisites section.** #### Minimum Python version required `= 3.7.0` #### Build & Activate Virtual Environment and install dependencies + From this connector directory, create a virtual environment: + ``` python -m venv .venv ``` This will generate a virtualenv for this module in `.venv/`. 
Make sure this venv is active in your development environment of choice. To activate it from the terminal, run: + ``` source .venv/bin/activate pip install -r requirements.txt ``` + If you are in an IDE, follow your IDE's instructions to activate the virtualenv. Note that while we are installing dependencies from `requirements.txt`, you should only edit `setup.py` for your dependencies. `requirements.txt` is @@ -30,6 +35,7 @@ If this is mumbo jumbo to you, don't worry about it, just put your deps in `setu should work as you expect. #### Create credentials + **If you are a community contributor**, follow the instructions in the [documentation](https://docs.airbyte.io/integrations/sources/aws-cloudtrail) to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `source_aws_cloudtrail/spec.json` file. Note that any directory named `secrets` is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. @@ -39,6 +45,7 @@ See `integration_tests/sample_config.json` for a sample config file. and place them into `secrets/config.json`. ### Locally running the connector + ``` python main.py spec python main.py check --config secrets/config.json @@ -48,10 +55,8 @@ python main.py read --config secrets/config.json --catalog integration_tests/con ### Locally running the connector docker image - - - #### Use `airbyte-ci` to build your connector + The Airbyte way of building this connector is to use our `airbyte-ci` tool. You can follow install instructions [here](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md#L1). Then running the following command will build your connector: @@ -59,15 +64,18 @@ Then running the following command will build your connector: ```bash airbyte-ci connectors --name source-aws-cloudtrail build ``` + Once the command is done, you will find your connector image in your local docker registry: `airbyte/source-aws-cloudtrail:dev`. ##### Customizing our build process + When contributing on our connector you might need to customize the build process to add a system dependency or set an env var. You can customize our build process by adding a `build_customization.py` module to your connector. This module should contain a `pre_connector_install` and `post_connector_install` async function that will mutate the base image and the connector container respectively. It will be imported at runtime by our build process and the functions will be called if they exist. Here is an example of a `build_customization.py` module: + ```python from __future__ import annotations @@ -87,6 +95,7 @@ async def post_connector_install(connector_container: Container) -> Container: ``` #### Build your own connector image + This connector is built using our dynamic built process in `airbyte-ci`. The base image used to build it is defined within the metadata.yaml file under the `connectorBuildOptions`. The build logic is defined using [Dagger](https://dagger.io/) [here](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/pipelines/builds/python_connectors.py). @@ -95,6 +104,7 @@ It does not rely on a Dockerfile. If you would like to patch our connector and build your own a simple approach would be to: 1. Create your own Dockerfile based on the latest version of the connector image. 
+ ```Dockerfile FROM airbyte/source-aws-cloudtrail:latest @@ -105,16 +115,21 @@ RUN pip install ./airbyte/integration_code # ENV AIRBYTE_ENTRYPOINT "python /airbyte/integration_code/main.py" # ENTRYPOINT ["python", "/airbyte/integration_code/main.py"] ``` + Please use this as an example. This is not optimized. 2. Build your image: + ```bash docker build -t airbyte/source-aws-cloudtrail:dev . # Running the spec command against your patched connector docker run airbyte/source-aws-cloudtrail:dev spec ``` + #### Run + Then run any of the connector commands as follows: + ``` docker run --rm airbyte/source-aws-cloudtrail:dev spec docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-aws-cloudtrail:dev check --config /secrets/config.json @@ -123,23 +138,30 @@ docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integrat ``` ## Testing + You can run our full test suite locally using [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md): + ```bash airbyte-ci connectors --name=source-aws-cloudtrail test ``` ### Customizing acceptance Tests + Customize `acceptance-test-config.yml` file to configure tests. See [Connector Acceptance Tests](https://docs.airbyte.com/connector-development/testing-connectors/connector-acceptance-tests-reference) for more information. If your connector requires to create or destroy resources for use during acceptance tests create fixtures for it and place them inside integration_tests/acceptance.py. ## Dependency Management + All of your dependencies should go in `setup.py`, NOT `requirements.txt`. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development. We split dependencies between two groups, dependencies that are: -* required for your connector to work need to go to `MAIN_REQUIREMENTS` list. -* required for the testing need to go to `TEST_REQUIREMENTS` list + +- required for your connector to work need to go to `MAIN_REQUIREMENTS` list. +- required for the testing need to go to `TEST_REQUIREMENTS` list ### Publishing a new version of the connector + You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? + 1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-aws-cloudtrail test` 2. Bump the connector version in `metadata.yaml`: increment the `dockerImageTag` value. Please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors). 3. Make sure the `metadata.yaml` content is up to date. diff --git a/airbyte-integrations/connectors/source-azure-blob-storage/README.md b/airbyte-integrations/connectors/source-azure-blob-storage/README.md index ac99c3b6e0b..4c5cd12283f 100644 --- a/airbyte-integrations/connectors/source-azure-blob-storage/README.md +++ b/airbyte-integrations/connectors/source-azure-blob-storage/README.md @@ -1,15 +1,14 @@ # Azure-Blob-Storage source connector - This is the repository for the Azure-Blob-Storage source connector, written in Python. For information about how to use this connector within Airbyte, see [the documentation](https://docs.airbyte.com/integrations/sources/azure-blob-storage). 
## Local development ### Prerequisites -* Python (~=3.9) -* Poetry (~=1.7) - installation instructions [here](https://python-poetry.org/docs/#installation) +- Python (~=3.9) +- Poetry (~=1.7) - installation instructions [here](https://python-poetry.org/docs/#installation) ### Generate new oauth token @@ -17,35 +16,37 @@ Tenant id should be provided by user, reason: https://learn.microsoft.com/en-us/answers/questions/1531138/which-tenant-id-do-i-have-to-use-to-get-tokens-and 1. GET https://login.microsoftonline.com//oauth2/v2.0/authorize - ?response_type=code - &client_id= - &scope=offline_access https://storage.azure.com/.default - &redirect_uri=http://localhost:8000/auth_flow - &response_mode=query - &state=1234 + ?response_type=code + &client_id= + &scope=offline_access https://storage.azure.com/.default + &redirect_uri=http://localhost:8000/auth_flow + &response_mode=query + &state=1234 2. POST https://login.microsoftonline.com//oauth2/v2.0/token -client_id: -code: -redirect_uri:http://localhost:8000/auth_flow -grant_type:authorization_code -client_secret: + client_id: + code: + redirect_uri:http://localhost:8000/auth_flow + grant_type:authorization_code + client_secret: ### Installing the connector + From this connector directory, run: + ```bash poetry install --with dev ``` - ### Create credentials + **If you are a community contributor**, follow the instructions in the [documentation](https://docs.airbyte.com/integrations/sources/azure-blob-storage) to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `source_azure_blob_storage/spec.yaml` file. Note that any directory named `secrets` is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. See `sample_files/sample_config.json` for a sample config file. - ### Locally running the connector + ``` poetry run source-azure-blob-storage spec poetry run source-azure-blob-storage check --config secrets/config.json @@ -54,23 +55,28 @@ poetry run source-azure-blob-storage read --config secrets/config.json --catalog ``` ### Running unit tests + To run unit tests locally, from the connector directory run: + ``` poetry run pytest unit_tests ``` ### Building the docker image + 1. Install [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md) 2. Run the following command to build the docker image: + ```bash airbyte-ci connectors --name=source-azure-blob-storage build ``` An image will be available on your host with the tag `airbyte/source-azure-blob-storage:dev`. - ### Running as a docker container + Then run any of the connector commands as follows: + ``` docker run --rm airbyte/source-azure-blob-storage:dev spec docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-azure-blob-storage:dev check --config /secrets/config.json @@ -79,18 +85,23 @@ docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integrat ``` ### Running our CI test suite + You can run our full test suite locally using [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md): + ```bash airbyte-ci connectors --name=source-azure-blob-storage test ``` ### Customizing acceptance Tests + Customize `acceptance-test-config.yml` file to configure acceptance tests. See [Connector Acceptance Tests](https://docs.airbyte.com/connector-development/testing-connectors/connector-acceptance-tests-reference) for more information. 
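Step 2 of the Azure-Blob-Storage OAuth flow above (exchanging the authorization code for a token) can be exercised with a short script. This is a hedged sketch assuming the `requests` library; the tenant id, client credentials, and authorization code are placeholders obtained from step 1:

```python
# Illustrative sketch of the token request described above. Replace the
# placeholder values with the tenant id, client credentials, and the code
# returned by the authorize request in step 1.
import requests

tenant_id = "<tenant_id>"
token_url = f"https://login.microsoftonline.com/{tenant_id}/oauth2/v2.0/token"

response = requests.post(
    token_url,
    data={
        "client_id": "<client_id>",
        "code": "<authorization_code_from_step_1>",
        "redirect_uri": "http://localhost:8000/auth_flow",
        "grant_type": "authorization_code",
        "client_secret": "<client_secret>",
    },
)
response.raise_for_status()
print(response.json())  # includes the access token (and a refresh token when offline_access was requested)
```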
If your connector requires to create or destroy resources for use during acceptance tests create fixtures for it and place them inside integration_tests/acceptance.py. ### Dependency Management -All of your dependencies should be managed via Poetry. + +All of your dependencies should be managed via Poetry. To add a new dependency, run: + ```bash poetry add ``` @@ -98,14 +109,16 @@ poetry add Please commit the changes to `pyproject.toml` and `poetry.lock` files. ## Publishing a new version of the connector + You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? + 1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-azure-blob-storage test` -2. Bump the connector version (please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors)): - - bump the `dockerImageTag` value in in `metadata.yaml` - - bump the `version` value in `pyproject.toml` +2. Bump the connector version (please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors)): + - bump the `dockerImageTag` value in `metadata.yaml` + - bump the `version` value in `pyproject.toml` 3. Make sure the `metadata.yaml` content is up to date. 4. Make sure the connector documentation and its changelog is up to date (`docs/integrations/sources/azure-blob-storage.md`). 5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention). 6. Pat yourself on the back for being an awesome contributor. 7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. -8. Once your PR is merged, the new version of the connector will be automatically published to Docker Hub and our connector registry. \ No newline at end of file +8. Once your PR is merged, the new version of the connector will be automatically published to Docker Hub and our connector registry. diff --git a/airbyte-integrations/connectors/source-azure-table/README.md b/airbyte-integrations/connectors/source-azure-table/README.md index 8fb2ae68d6b..7b9adf11354 100644 --- a/airbyte-integrations/connectors/source-azure-table/README.md +++ b/airbyte-integrations/connectors/source-azure-table/README.md @@ -6,22 +6,27 @@ For information about how to use this connector within Airbyte, see [the documen ## Local development ### Prerequisites + **To iterate on this connector, make sure to complete this prerequisites section.** #### Minimum Python version required `= 3.7.0` #### Build & Activate Virtual Environment and install dependencies + From this connector directory, create a virtual environment: + ``` python -m venv .venv ``` This will generate a virtualenv for this module in `.venv/`. Make sure this venv is active in your development environment of choice. To activate it from the terminal, run: + ``` source .venv/bin/activate pip install -r requirements.txt ``` + If you are in an IDE, follow your IDE's instructions to activate the virtualenv. Note that while we are installing dependencies from `requirements.txt`, you should only edit `setup.py` for your dependencies. `requirements.txt` is @@ -30,6 +35,7 @@ If this is mumbo jumbo to you, don't worry about it, just put your deps in `setu should work as you expect.
#### Create credentials + **If you are a community contributor**, follow the instructions in the [documentation](https://docs.airbyte.io/integrations/sources/azure-table) to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `source_azure_table/spec.json` file. Note that the `secrets` directory is gitignored by default, so there is no danger of accidentally checking in sensitive information. @@ -39,18 +45,20 @@ See `integration_tests/sample_config.json` for a sample config file. and place them into `secrets/config.json`. ### Locally running the connector + ``` python main.py spec python main.py check --config secrets/config.json python main.py discover --config secrets/config.json -python main.py read --config secrets/config.json --catalog integration_tests/configured_catalog.json --state integration_tests/state.json +python main.py read --config secrets/config.json --catalog integration_tests/configured_catalog.json --state integration_tests/state.json ``` ### Locally running the connector docker image - #### Build + **Via [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md) (recommended):** + ```bash airbyte-ci connectors --name=source-azure-table build ``` @@ -58,12 +66,15 @@ airbyte-ci connectors --name=source-azure-table build An image will be built with the tag `airbyte/source-azure-table:dev`. **Via `docker build`:** + ```bash docker build -t airbyte/source-azure-table:dev . ``` #### Run + Then run any of the connector commands as follows: + ``` docker run --rm airbyte/source-azure-table:dev spec docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-azure-table:dev check --config /secrets/config.json @@ -72,23 +83,30 @@ docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integrat ``` ## Testing + You can run our full test suite locally using [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md): + ```bash airbyte-ci connectors --name=source-azure-table test ``` ### Customizing acceptance Tests + Customize `acceptance-test-config.yml` file to configure tests. See [Connector Acceptance Tests](https://docs.airbyte.com/connector-development/testing-connectors/connector-acceptance-tests-reference) for more information. If your connector requires to create or destroy resources for use during acceptance tests create fixtures for it and place them inside integration_tests/acceptance.py. ## Dependency Management + All of your dependencies should go in `setup.py`, NOT `requirements.txt`. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development. We split dependencies between two groups, dependencies that are: -* required for your connector to work need to go to `MAIN_REQUIREMENTS` list. -* required for the testing need to go to `TEST_REQUIREMENTS` list + +- required for your connector to work need to go to `MAIN_REQUIREMENTS` list. +- required for the testing need to go to `TEST_REQUIREMENTS` list ### Publishing a new version of the connector + You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? + 1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-azure-table test` 2. Bump the connector version in `metadata.yaml`: increment the `dockerImageTag` value. 
Please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors). 3. Make sure the `metadata.yaml` content is up to date. @@ -96,4 +114,3 @@ You've checked out the repo, implemented a million dollar feature, and you're re 5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention). 6. Pat yourself on the back for being an awesome contributor. 7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. - diff --git a/airbyte-integrations/connectors/source-babelforce/README.md b/airbyte-integrations/connectors/source-babelforce/README.md index 7ae9fd8b12d..f33e6ebcb0a 100644 --- a/airbyte-integrations/connectors/source-babelforce/README.md +++ b/airbyte-integrations/connectors/source-babelforce/README.md @@ -6,6 +6,7 @@ For information about how to use this connector within Airbyte, see [the documen ## Local development #### Create credentials + **If you are a community contributor**, follow the instructions in the [documentation](https://docs.airbyte.com/integrations/sources/babelforce) to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `source_babelforce/spec.yaml` file. Note that any directory named `secrets` is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. @@ -16,9 +17,10 @@ and place them into `secrets/config.json`. ### Locally running the connector docker image - #### Build + **Via [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md) (recommended):** + ```bash airbyte-ci connectors --name=source-babelforce build ``` @@ -26,12 +28,15 @@ airbyte-ci connectors --name=source-babelforce build An image will be built with the tag `airbyte/source-babelforce:dev`. **Via `docker build`:** + ```bash docker build -t airbyte/source-babelforce:dev . ``` #### Run + Then run any of the connector commands as follows: + ``` docker run --rm airbyte/source-babelforce:dev spec docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-babelforce:dev check --config /secrets/config.json @@ -40,23 +45,30 @@ docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integrat ``` ## Testing + You can run our full test suite locally using [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md): + ```bash airbyte-ci connectors --name=source-babelforce test ``` ### Customizing acceptance Tests + Customize `acceptance-test-config.yml` file to configure tests. See [Connector Acceptance Tests](https://docs.airbyte.com/connector-development/testing-connectors/connector-acceptance-tests-reference) for more information. If your connector requires to create or destroy resources for use during acceptance tests create fixtures for it and place them inside integration_tests/acceptance.py. ## Dependency Management + All of your dependencies should go in `setup.py`, NOT `requirements.txt`. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development. We split dependencies between two groups, dependencies that are: -* required for your connector to work need to go to `MAIN_REQUIREMENTS` list. 
-* required for the testing need to go to `TEST_REQUIREMENTS` list + +- required for your connector to work need to go to `MAIN_REQUIREMENTS` list. +- required for the testing need to go to `TEST_REQUIREMENTS` list ### Publishing a new version of the connector + You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? + 1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-babelforce test` 2. Bump the connector version in `metadata.yaml`: increment the `dockerImageTag` value. Please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors). 3. Make sure the `metadata.yaml` content is up to date. @@ -64,4 +76,3 @@ You've checked out the repo, implemented a million dollar feature, and you're re 5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention). 6. Pat yourself on the back for being an awesome contributor. 7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. - diff --git a/airbyte-integrations/connectors/source-bamboo-hr/README.md b/airbyte-integrations/connectors/source-bamboo-hr/README.md index 88838636298..6bd34064cca 100644 --- a/airbyte-integrations/connectors/source-bamboo-hr/README.md +++ b/airbyte-integrations/connectors/source-bamboo-hr/README.md @@ -1,31 +1,32 @@ # Bamboo-Hr source connector - This is the repository for the Bamboo-Hr source connector, written in Python. For information about how to use this connector within Airbyte, see [the documentation](https://docs.airbyte.com/integrations/sources/bamboo-hr). ## Local development ### Prerequisites -* Python (~=3.9) -* Poetry (~=1.7) - installation instructions [here](https://python-poetry.org/docs/#installation) +- Python (~=3.9) +- Poetry (~=1.7) - installation instructions [here](https://python-poetry.org/docs/#installation) ### Installing the connector + From this connector directory, run: + ```bash poetry install --with dev ``` - ### Create credentials + **If you are a community contributor**, follow the instructions in the [documentation](https://docs.airbyte.com/integrations/sources/bamboo-hr) to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `source_bamboo_hr/spec.yaml` file. Note that any directory named `secrets` is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. See `sample_files/sample_config.json` for a sample config file. - ### Locally running the connector + ``` poetry run source-bamboo-hr spec poetry run source-bamboo-hr check --config secrets/config.json @@ -34,23 +35,28 @@ poetry run source-bamboo-hr read --config secrets/config.json --catalog sample_f ``` ### Running unit tests + To run unit tests locally, from the connector directory run: + ``` poetry run pytest unit_tests ``` ### Building the docker image + 1. Install [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md) 2. Run the following command to build the docker image: + ```bash airbyte-ci connectors --name=source-bamboo-hr build ``` An image will be available on your host with the tag `airbyte/source-bamboo-hr:dev`. 
- ### Running as a docker container + Then run any of the connector commands as follows: + ``` docker run --rm airbyte/source-bamboo-hr:dev spec docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-bamboo-hr:dev check --config /secrets/config.json @@ -59,18 +65,23 @@ docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integrat ``` ### Running our CI test suite + You can run our full test suite locally using [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md): + ```bash airbyte-ci connectors --name=source-bamboo-hr test ``` ### Customizing acceptance Tests + Customize `acceptance-test-config.yml` file to configure acceptance tests. See [Connector Acceptance Tests](https://docs.airbyte.com/connector-development/testing-connectors/connector-acceptance-tests-reference) for more information. If your connector requires to create or destroy resources for use during acceptance tests create fixtures for it and place them inside integration_tests/acceptance.py. ### Dependency Management -All of your dependencies should be managed via Poetry. + +All of your dependencies should be managed via Poetry. To add a new dependency, run: + ```bash poetry add ``` @@ -78,14 +89,16 @@ poetry add Please commit the changes to `pyproject.toml` and `poetry.lock` files. ## Publishing a new version of the connector + You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? + 1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-bamboo-hr test` -2. Bump the connector version (please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors)): - - bump the `dockerImageTag` value in in `metadata.yaml` - - bump the `version` value in `pyproject.toml` +2. Bump the connector version (please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors)): + - bump the `dockerImageTag` value in in `metadata.yaml` + - bump the `version` value in `pyproject.toml` 3. Make sure the `metadata.yaml` content is up to date. 4. Make sure the connector documentation and its changelog is up to date (`docs/integrations/sources/bamboo-hr.md`). 5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention). 6. Pat yourself on the back for being an awesome contributor. 7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. -8. Once your PR is merged, the new version of the connector will be automatically published to Docker Hub and our connector registry. \ No newline at end of file +8. Once your PR is merged, the new version of the connector will be automatically published to Docker Hub and our connector registry. 
diff --git a/airbyte-integrations/connectors/source-bigcommerce/README.md b/airbyte-integrations/connectors/source-bigcommerce/README.md index 8ab2beb4e49..6c3c0f694cb 100644 --- a/airbyte-integrations/connectors/source-bigcommerce/README.md +++ b/airbyte-integrations/connectors/source-bigcommerce/README.md @@ -6,6 +6,7 @@ For information about how to use this connector within Airbyte, see [the documen ## Local development #### Create credentials + **If you are a community contributor**, follow the instructions in the [documentation](https://docs.airbyte.com/integrations/sources/bigcommerce) to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `source_bigcommerce/spec.yaml` file. Note that any directory named `secrets` is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. @@ -16,9 +17,10 @@ and place them into `secrets/config.json`. ### Locally running the connector docker image - #### Build + **Via [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md) (recommended):** + ```bash airbyte-ci connectors --name=source-bigcommerce build ``` @@ -26,12 +28,15 @@ airbyte-ci connectors --name=source-bigcommerce build An image will be built with the tag `airbyte/source-bigcommerce:dev`. **Via `docker build`:** + ```bash docker build -t airbyte/source-bigcommerce:dev . ``` #### Run + Then run any of the connector commands as follows: + ``` docker run --rm airbyte/source-bigcommerce:dev spec docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-bigcommerce:dev check --config /secrets/config.json @@ -40,23 +45,30 @@ docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integrat ``` ## Testing + You can run our full test suite locally using [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md): + ```bash airbyte-ci connectors --name=source-bigcommerce test ``` ### Customizing acceptance Tests + Customize `acceptance-test-config.yml` file to configure tests. See [Connector Acceptance Tests](https://docs.airbyte.com/connector-development/testing-connectors/connector-acceptance-tests-reference) for more information. If your connector requires to create or destroy resources for use during acceptance tests create fixtures for it and place them inside integration_tests/acceptance.py. ## Dependency Management + All of your dependencies should go in `setup.py`, NOT `requirements.txt`. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development. We split dependencies between two groups, dependencies that are: -* required for your connector to work need to go to `MAIN_REQUIREMENTS` list. -* required for the testing need to go to `TEST_REQUIREMENTS` list + +- required for your connector to work need to go to `MAIN_REQUIREMENTS` list. +- required for the testing need to go to `TEST_REQUIREMENTS` list ### Publishing a new version of the connector + You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? + 1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-bigcommerce test` 2. Bump the connector version in `metadata.yaml`: increment the `dockerImageTag` value. 
Please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors). 3. Make sure the `metadata.yaml` content is up to date. @@ -64,4 +76,3 @@ You've checked out the repo, implemented a million dollar feature, and you're re 5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention). 6. Pat yourself on the back for being an awesome contributor. 7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. - diff --git a/airbyte-integrations/connectors/source-bigquery/integration_tests/README.md b/airbyte-integrations/connectors/source-bigquery/integration_tests/README.md index 96aa5492669..9bf604a7f6c 100644 --- a/airbyte-integrations/connectors/source-bigquery/integration_tests/README.md +++ b/airbyte-integrations/connectors/source-bigquery/integration_tests/README.md @@ -1,3 +1,4 @@ # Seeding the dataset + You can find the SQL scripts in this folder if you need to create or fix the SAT dataset. For more instructions and information about valid scripts, please check this [doc](https://docs.google.com/document/d/1k5TvxaNhKdr44aJIHWWtLk14Tzd2gbNX-J8YNoTj8u0/edit#heading=h.ls9oiedt9wyy). diff --git a/airbyte-integrations/connectors/source-bing-ads/README.md b/airbyte-integrations/connectors/source-bing-ads/README.md index d8e88f8da26..f300f3394f9 100644 --- a/airbyte-integrations/connectors/source-bing-ads/README.md +++ b/airbyte-integrations/connectors/source-bing-ads/README.md @@ -1,31 +1,32 @@ # Bing-Ads source connector - This is the repository for the Bing-Ads source connector, written in Python. For information about how to use this connector within Airbyte, see [the documentation](https://docs.airbyte.com/integrations/sources/bing-ads). ## Local development ### Prerequisites -* Python (~=3.9) -* Poetry (~=1.7) - installation instructions [here](https://python-poetry.org/docs/#installation) +- Python (~=3.9) +- Poetry (~=1.7) - installation instructions [here](https://python-poetry.org/docs/#installation) ### Installing the connector + From this connector directory, run: + ```bash poetry install --with dev ``` - ### Create credentials + **If you are a community contributor**, follow the instructions in the [documentation](https://docs.airbyte.com/integrations/sources/bing-ads) to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `source_bing_ads/spec.yaml` file. Note that any directory named `secrets` is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. See `sample_files/sample_config.json` for a sample config file. - ### Locally running the connector + ``` poetry run source-bing-ads spec poetry run source-bing-ads check --config secrets/config.json @@ -34,23 +35,28 @@ poetry run source-bing-ads read --config secrets/config.json --catalog sample_fi ``` ### Running unit tests + To run unit tests locally, from the connector directory run: + ``` poetry run pytest unit_tests ``` ### Building the docker image + 1. Install [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md) 2. Run the following command to build the docker image: + ```bash airbyte-ci connectors --name=source-bing-ads build ``` An image will be available on your host with the tag `airbyte/source-bing-ads:dev`. 
- ### Running as a docker container + Then run any of the connector commands as follows: + ``` docker run --rm airbyte/source-bing-ads:dev spec docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-bing-ads:dev check --config /secrets/config.json @@ -59,18 +65,23 @@ docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integrat ``` ### Running our CI test suite + You can run our full test suite locally using [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md): + ```bash airbyte-ci connectors --name=source-bing-ads test ``` ### Customizing acceptance Tests + Customize `acceptance-test-config.yml` file to configure acceptance tests. See [Connector Acceptance Tests](https://docs.airbyte.com/connector-development/testing-connectors/connector-acceptance-tests-reference) for more information. If your connector requires to create or destroy resources for use during acceptance tests create fixtures for it and place them inside integration_tests/acceptance.py. ### Dependency Management -All of your dependencies should be managed via Poetry. + +All of your dependencies should be managed via Poetry. To add a new dependency, run: + ```bash poetry add ``` @@ -78,14 +89,16 @@ poetry add Please commit the changes to `pyproject.toml` and `poetry.lock` files. ## Publishing a new version of the connector + You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? + 1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-bing-ads test` -2. Bump the connector version (please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors)): - - bump the `dockerImageTag` value in in `metadata.yaml` - - bump the `version` value in `pyproject.toml` +2. Bump the connector version (please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors)): + - bump the `dockerImageTag` value in in `metadata.yaml` + - bump the `version` value in `pyproject.toml` 3. Make sure the `metadata.yaml` content is up to date. 4. Make sure the connector documentation and its changelog is up to date (`docs/integrations/sources/bing-ads.md`). 5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention). 6. Pat yourself on the back for being an awesome contributor. 7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. -8. Once your PR is merged, the new version of the connector will be automatically published to Docker Hub and our connector registry. \ No newline at end of file +8. Once your PR is merged, the new version of the connector will be automatically published to Docker Hub and our connector registry. diff --git a/airbyte-integrations/connectors/source-bing-ads/bootstrap.md b/airbyte-integrations/connectors/source-bing-ads/bootstrap.md index aa87f973f8f..efc8060ad0e 100644 --- a/airbyte-integrations/connectors/source-bing-ads/bootstrap.md +++ b/airbyte-integrations/connectors/source-bing-ads/bootstrap.md @@ -1,31 +1,30 @@ - ## Core streams Bing Ads is a SOAP based API. 
Connector is implemented with [SDK](https://github.com/BingAds/BingAds-Python-SDK) library Connector has such core streams, and all of them support full refresh only: -* [Account](https://docs.microsoft.com/en-us/advertising/customer-management-service/advertiseraccount?view=bingads-13) -* [Campaign](https://docs.microsoft.com/en-us/advertising/campaign-management-service/campaign?view=bingads-13) -* [AdGroup](https://docs.microsoft.com/en-us/advertising/campaign-management-service/getadgroupsbycampaignid?view=bingads-13) -* [Ad](https://docs.microsoft.com/en-us/advertising/campaign-management-service/getadsbyadgroupid?view=bingads-13) +- [Account](https://docs.microsoft.com/en-us/advertising/customer-management-service/advertiseraccount?view=bingads-13) +- [Campaign](https://docs.microsoft.com/en-us/advertising/campaign-management-service/campaign?view=bingads-13) +- [AdGroup](https://docs.microsoft.com/en-us/advertising/campaign-management-service/getadgroupsbycampaignid?view=bingads-13) +- [Ad](https://docs.microsoft.com/en-us/advertising/campaign-management-service/getadsbyadgroupid?view=bingads-13) ## Report streams Connector also has report streams, which support incremental sync. -* [AccountPerformanceReport](https://docs.microsoft.com/en-us/advertising/reporting-service/accountperformancereportrequest?view=bingads-13) -* [AdPerformanceReport](https://docs.microsoft.com/en-us/advertising/reporting-service/adperformancereportrequest?view=bingads-13) -* [AdGroupPerformanceReport](https://docs.microsoft.com/en-us/advertising/reporting-service/adgroupperformancereportrequest?view=bingads-13) -* [CampaignPerformanceReport](https://docs.microsoft.com/en-us/advertising/reporting-service/campaignperformancereportrequest?view=bingads-13) -* [BudgetSummaryReport](https://docs.microsoft.com/en-us/advertising/reporting-service/budgetsummaryreportrequest?view=bingads-13) -* [KeywordPerformanceReport](https://docs.microsoft.com/en-us/advertising/reporting-service/keywordperformancereportrequest?view=bingads-13) +- [AccountPerformanceReport](https://docs.microsoft.com/en-us/advertising/reporting-service/accountperformancereportrequest?view=bingads-13) +- [AdPerformanceReport](https://docs.microsoft.com/en-us/advertising/reporting-service/adperformancereportrequest?view=bingads-13) +- [AdGroupPerformanceReport](https://docs.microsoft.com/en-us/advertising/reporting-service/adgroupperformancereportrequest?view=bingads-13) +- [CampaignPerformanceReport](https://docs.microsoft.com/en-us/advertising/reporting-service/campaignperformancereportrequest?view=bingads-13) +- [BudgetSummaryReport](https://docs.microsoft.com/en-us/advertising/reporting-service/budgetsummaryreportrequest?view=bingads-13) +- [KeywordPerformanceReport](https://docs.microsoft.com/en-us/advertising/reporting-service/keywordperformancereportrequest?view=bingads-13) To be able to pull report data you need to generate 2 separate requests. -* [First](https://docs.microsoft.com/en-us/advertising/reporting-service/submitgeneratereport?view=bingads-13) - to request appropriate report +- [First](https://docs.microsoft.com/en-us/advertising/reporting-service/submitgeneratereport?view=bingads-13) - to request appropriate report -* [Second](https://docs.microsoft.com/en-us/advertising/reporting-service/pollgeneratereport?view=bingads-13) - to poll acatual data. Report download timeout is 5 min +- [Second](https://docs.microsoft.com/en-us/advertising/reporting-service/pollgeneratereport?view=bingads-13) - to poll acatual data. 
Report download timeout is 5 min Initially all fields in report streams have string values, connector uses `reports.REPORT_FIELD_TYPES` collection to transform values to numerical fields if possible @@ -33,7 +32,7 @@ Connector uses `reports_start_date` config for initial reports sync and current Connector has `hourly_reports`, `daily_reports`, `weekly_reports`, `monthly_reports` report streams. For example `account_performance_report_daily`, `ad_group_performance_report_weekly`. All these reports streams will be generated on execute. -If `lookback_window` is set to a non-null value, initial reports sync will start at `reports_start_date - lookback_window`. Following reports sync will start at `cursor_value - lookback_window`. +If `lookback_window` is set to a non-null value, initial reports sync will start at `reports_start_date - lookback_window`. Following reports sync will start at `cursor_value - lookback_window`. ## Request caching @@ -41,8 +40,8 @@ Based on [library](https://vcrpy.readthedocs.io/en/latest/) Connector uses caching for these streams: -* Account -* Campaign -* AdGroup +- Account +- Campaign +- AdGroup See [this](https://docs.airbyte.io/integrations/sources/bing-ads) link for the nuances about the connector. diff --git a/airbyte-integrations/connectors/source-braintree/README.md b/airbyte-integrations/connectors/source-braintree/README.md index 5314c18967d..0d84b4a276d 100644 --- a/airbyte-integrations/connectors/source-braintree/README.md +++ b/airbyte-integrations/connectors/source-braintree/README.md @@ -6,22 +6,27 @@ For information about how to use this connector within Airbyte, see [the documen ## Local development ### Prerequisites + **To iterate on this connector, make sure to complete this prerequisites section.** #### Minimum Python version required `= 3.7.0` #### Build & Activate Virtual Environment and install dependencies + From this connector directory, create a virtual environment: + ``` python -m venv .venv ``` This will generate a virtualenv for this module in `.venv/`. Make sure this venv is active in your development environment of choice. To activate it from the terminal, run: + ``` source .venv/bin/activate pip install -r requirements.txt ``` + If you are in an IDE, follow your IDE's instructions to activate the virtualenv. Note that while we are installing dependencies from `requirements.txt`, you should only edit `setup.py` for your dependencies. `requirements.txt` is @@ -30,6 +35,7 @@ If this is mumbo jumbo to you, don't worry about it, just put your deps in `setu should work as you expect. #### Create credentials + **If you are a community contributor**, follow the instructions in the [documentation](https://docs.airbyte.io/integrations/sources/braintree) to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `source_braintree/spec.json` file. Note that the `secrets` directory is gitignored by default, so there is no danger of accidentally checking in sensitive information. @@ -39,6 +45,7 @@ See `integration_tests/sample_config.json` for a sample config file. and place them into `secrets/config.json`. 
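If you want to sanity-check the file before invoking the connector, a quick sketch like the one below can help; the key names are purely illustrative, and the authoritative schema is `source_braintree/spec.json`:

```python
# check_config.py (illustrative only) -- verifies that secrets/config.json parses and
# contains the top-level keys you expect; adjust the key set to match spec.json.
import json
from pathlib import Path

config = json.loads(Path("secrets/config.json").read_text())
expected_keys = {"merchant_id", "public_key", "private_key"}  # hypothetical keys
missing = expected_keys - config.keys()
print("config OK" if not missing else f"missing keys: {sorted(missing)}")
```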
### Locally running the connector + ``` python main.py spec python main.py check --config secrets/config.json @@ -48,9 +55,10 @@ python main.py read --config secrets/config.json --catalog integration_tests/con ### Locally running the connector docker image - #### Build + **Via [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md) (recommended):** + ```bash airbyte-ci connectors --name=source-braintree build ``` @@ -58,12 +66,15 @@ airbyte-ci connectors --name=source-braintree build An image will be built with the tag `airbyte/source-braintree:dev`. **Via `docker build`:** + ```bash docker build -t airbyte/source-braintree:dev . ``` #### Run + Then run any of the connector commands as follows: + ``` docker run --rm airbyte/source-braintree:dev spec docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-braintree:dev check --config /secrets/config.json @@ -72,23 +83,30 @@ docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integrat ``` ## Testing + You can run our full test suite locally using [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md): + ```bash airbyte-ci connectors --name=source-braintree test ``` ### Customizing acceptance Tests + Customize `acceptance-test-config.yml` file to configure tests. See [Connector Acceptance Tests](https://docs.airbyte.com/connector-development/testing-connectors/connector-acceptance-tests-reference) for more information. If your connector requires to create or destroy resources for use during acceptance tests create fixtures for it and place them inside integration_tests/acceptance.py. ## Dependency Management + All of your dependencies should go in `setup.py`, NOT `requirements.txt`. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development. We split dependencies between two groups, dependencies that are: -* required for your connector to work need to go to `MAIN_REQUIREMENTS` list. -* required for the testing need to go to `TEST_REQUIREMENTS` list + +- required for your connector to work need to go to `MAIN_REQUIREMENTS` list. +- required for the testing need to go to `TEST_REQUIREMENTS` list ### Publishing a new version of the connector + You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? + 1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-braintree test` 2. Bump the connector version in `metadata.yaml`: increment the `dockerImageTag` value. Please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors). 3. Make sure the `metadata.yaml` content is up to date. @@ -96,4 +114,3 @@ You've checked out the repo, implemented a million dollar feature, and you're re 5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention). 6. Pat yourself on the back for being an awesome contributor. 7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. 
- diff --git a/airbyte-integrations/connectors/source-braze/README.md b/airbyte-integrations/connectors/source-braze/README.md index b8010776cb7..10d2a87683c 100644 --- a/airbyte-integrations/connectors/source-braze/README.md +++ b/airbyte-integrations/connectors/source-braze/README.md @@ -6,6 +6,7 @@ For information about how to use this connector within Airbyte, see [the documen ## Local development #### Create credentials + **If you are a community contributor**, follow the instructions in the [documentation](https://docs.airbyte.com/integrations/sources/braze) to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `source_braze/spec.yaml` file. Note that any directory named `secrets` is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. @@ -16,9 +17,10 @@ and place them into `secrets/config.json`. ### Locally running the connector docker image - #### Build + **Via [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md) (recommended):** + ```bash airbyte-ci connectors --name=source-braze build ``` @@ -26,12 +28,15 @@ airbyte-ci connectors --name=source-braze build An image will be built with the tag `airbyte/source-braze:dev`. **Via `docker build`:** + ```bash docker build -t airbyte/source-braze:dev . ``` #### Run + Then run any of the connector commands as follows: + ``` docker run --rm airbyte/source-braze:dev spec docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-braze:dev check --config /secrets/config.json @@ -40,23 +45,30 @@ docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integrat ``` ## Testing + You can run our full test suite locally using [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md): + ```bash airbyte-ci connectors --name=source-braze test ``` ### Customizing acceptance Tests + Customize `acceptance-test-config.yml` file to configure tests. See [Connector Acceptance Tests](https://docs.airbyte.com/connector-development/testing-connectors/connector-acceptance-tests-reference) for more information. If your connector requires to create or destroy resources for use during acceptance tests create fixtures for it and place them inside integration_tests/acceptance.py. ## Dependency Management + All of your dependencies should go in `setup.py`, NOT `requirements.txt`. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development. We split dependencies between two groups, dependencies that are: -* required for your connector to work need to go to `MAIN_REQUIREMENTS` list. -* required for the testing need to go to `TEST_REQUIREMENTS` list + +- required for your connector to work need to go to `MAIN_REQUIREMENTS` list. +- required for the testing need to go to `TEST_REQUIREMENTS` list ### Publishing a new version of the connector + You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? + 1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-braze test` 2. Bump the connector version in `metadata.yaml`: increment the `dockerImageTag` value. Please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors). 3. Make sure the `metadata.yaml` content is up to date. 
@@ -64,4 +76,3 @@ You've checked out the repo, implemented a million dollar feature, and you're re 5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention). 6. Pat yourself on the back for being an awesome contributor. 7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. - diff --git a/airbyte-integrations/connectors/source-breezometer/README.md b/airbyte-integrations/connectors/source-breezometer/README.md index 7da049dfa8d..6940fb5ea2e 100644 --- a/airbyte-integrations/connectors/source-breezometer/README.md +++ b/airbyte-integrations/connectors/source-breezometer/README.md @@ -6,6 +6,7 @@ For information about how to use this connector within Airbyte, see [the documen ## Local development #### Create credentials + **If you are a community contributor**, follow the instructions in the [documentation](https://docs.airbyte.io/integrations/sources/breezometer) to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `source_breezometer/spec.yaml` file. Note that any directory named `secrets` is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. @@ -16,9 +17,10 @@ and place them into `secrets/config.json`. ### Locally running the connector docker image - #### Build + **Via [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md) (recommended):** + ```bash airbyte-ci connectors --name=source-breezometer build ``` @@ -26,12 +28,15 @@ airbyte-ci connectors --name=source-breezometer build An image will be built with the tag `airbyte/source-breezometer:dev`. **Via `docker build`:** + ```bash docker build -t airbyte/source-breezometer:dev . ``` #### Run + Then run any of the connector commands as follows: + ``` docker run --rm airbyte/source-breezometer:dev spec docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-breezometer:dev check --config /secrets/config.json @@ -40,23 +45,30 @@ docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integrat ``` ## Testing + You can run our full test suite locally using [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md): + ```bash airbyte-ci connectors --name=source-breezometer test ``` ### Customizing acceptance Tests + Customize `acceptance-test-config.yml` file to configure tests. See [Connector Acceptance Tests](https://docs.airbyte.com/connector-development/testing-connectors/connector-acceptance-tests-reference) for more information. If your connector requires to create or destroy resources for use during acceptance tests create fixtures for it and place them inside integration_tests/acceptance.py. ## Dependency Management + All of your dependencies should go in `setup.py`, NOT `requirements.txt`. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development. We split dependencies between two groups, dependencies that are: -* required for your connector to work need to go to `MAIN_REQUIREMENTS` list. -* required for the testing need to go to `TEST_REQUIREMENTS` list + +- required for your connector to work need to go to `MAIN_REQUIREMENTS` list. 
+- required for the testing need to go to `TEST_REQUIREMENTS` list ### Publishing a new version of the connector + You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? + 1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-breezometer test` 2. Bump the connector version in `metadata.yaml`: increment the `dockerImageTag` value. Please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors). 3. Make sure the `metadata.yaml` content is up to date. @@ -64,4 +76,3 @@ You've checked out the repo, implemented a million dollar feature, and you're re 5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention). 6. Pat yourself on the back for being an awesome contributor. 7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. - diff --git a/airbyte-integrations/connectors/source-callrail/README.md b/airbyte-integrations/connectors/source-callrail/README.md index 199429bab10..21244029eb9 100644 --- a/airbyte-integrations/connectors/source-callrail/README.md +++ b/airbyte-integrations/connectors/source-callrail/README.md @@ -6,6 +6,7 @@ For information about how to use this connector within Airbyte, see [the documen ## Local development #### Create credentials + **If you are a community contributor**, follow the instructions in the [documentation](https://docs.airbyte.io/integrations/sources/callrail) to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `source_callrail/spec.yaml` file. Note that any directory named `secrets` is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. @@ -16,9 +17,10 @@ and place them into `secrets/config.json`. ### Locally running the connector docker image - #### Build + **Via [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md) (recommended):** + ```bash airbyte-ci connectors --name=source-callrail build ``` @@ -26,12 +28,15 @@ airbyte-ci connectors --name=source-callrail build An image will be built with the tag `airbyte/source-callrail:dev`. **Via `docker build`:** + ```bash docker build -t airbyte/source-callrail:dev . ``` #### Run + Then run any of the connector commands as follows: + ``` docker run --rm airbyte/source-callrail:dev spec docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-callrail:dev check --config /secrets/config.json @@ -40,23 +45,30 @@ docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integrat ``` ## Testing + You can run our full test suite locally using [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md): + ```bash airbyte-ci connectors --name=source-callrail test ``` ### Customizing acceptance Tests + Customize `acceptance-test-config.yml` file to configure tests. See [Connector Acceptance Tests](https://docs.airbyte.com/connector-development/testing-connectors/connector-acceptance-tests-reference) for more information. If your connector requires to create or destroy resources for use during acceptance tests create fixtures for it and place them inside integration_tests/acceptance.py. 
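As an example, a minimal sketch of such a fixture in `integration_tests/acceptance.py` might look like the following; the setup and teardown steps are placeholders for whatever resources your connector actually needs:

```python
# integration_tests/acceptance.py (sketch)
import pytest


@pytest.fixture(scope="session", autouse=True)
def connector_setup():
    """Create any external resources the acceptance tests need, then clean them up."""
    # ... create test resources here (placeholder) ...
    yield
    # ... tear the resources down here (placeholder) ...
```

Because the fixture is session-scoped and `autouse=True`, it runs once before the acceptance test suite starts and its teardown runs once after the suite finishes.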
## Dependency Management + All of your dependencies should go in `setup.py`, NOT `requirements.txt`. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development. We split dependencies between two groups, dependencies that are: -* required for your connector to work need to go to `MAIN_REQUIREMENTS` list. -* required for the testing need to go to `TEST_REQUIREMENTS` list + +- required for your connector to work need to go to `MAIN_REQUIREMENTS` list. +- required for the testing need to go to `TEST_REQUIREMENTS` list ### Publishing a new version of the connector + You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? + 1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-callrail test` 2. Bump the connector version in `metadata.yaml`: increment the `dockerImageTag` value. Please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors). 3. Make sure the `metadata.yaml` content is up to date. @@ -64,4 +76,3 @@ You've checked out the repo, implemented a million dollar feature, and you're re 5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention). 6. Pat yourself on the back for being an awesome contributor. 7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. - diff --git a/airbyte-integrations/connectors/source-captain-data/README.md b/airbyte-integrations/connectors/source-captain-data/README.md index 4ac93829acb..c9f5c0d6b8c 100644 --- a/airbyte-integrations/connectors/source-captain-data/README.md +++ b/airbyte-integrations/connectors/source-captain-data/README.md @@ -6,6 +6,7 @@ For information about how to use this connector within Airbyte, see [the documen ## Local development #### Create credentials + **If you are a community contributor**, follow the instructions in the [documentation](https://docs.airbyte.com/integrations/sources/captain-data) to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `source_captain_data/spec.yaml` file. Note that any directory named `secrets` is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. @@ -16,9 +17,10 @@ and place them into `secrets/config.json`. ### Locally running the connector docker image - #### Build + **Via [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md) (recommended):** + ```bash airbyte-ci connectors --name=source-captain-data build ``` @@ -26,12 +28,15 @@ airbyte-ci connectors --name=source-captain-data build An image will be built with the tag `airbyte/source-captain-data:dev`. **Via `docker build`:** + ```bash docker build -t airbyte/source-captain-data:dev . 
``` #### Run + Then run any of the connector commands as follows: + ``` docker run --rm airbyte/source-captain-data:dev spec docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-captain-data:dev check --config /secrets/config.json @@ -40,23 +45,30 @@ docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integrat ``` ## Testing + You can run our full test suite locally using [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md): + ```bash airbyte-ci connectors --name=source-captain-data test ``` ### Customizing acceptance Tests + Customize `acceptance-test-config.yml` file to configure tests. See [Connector Acceptance Tests](https://docs.airbyte.com/connector-development/testing-connectors/connector-acceptance-tests-reference) for more information. If your connector requires to create or destroy resources for use during acceptance tests create fixtures for it and place them inside integration_tests/acceptance.py. ## Dependency Management + All of your dependencies should go in `setup.py`, NOT `requirements.txt`. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development. We split dependencies between two groups, dependencies that are: -* required for your connector to work need to go to `MAIN_REQUIREMENTS` list. -* required for the testing need to go to `TEST_REQUIREMENTS` list + +- required for your connector to work need to go to `MAIN_REQUIREMENTS` list. +- required for the testing need to go to `TEST_REQUIREMENTS` list ### Publishing a new version of the connector + You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? + 1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-captain-data test` 2. Bump the connector version in `metadata.yaml`: increment the `dockerImageTag` value. Please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors). 3. Make sure the `metadata.yaml` content is up to date. @@ -64,4 +76,3 @@ You've checked out the repo, implemented a million dollar feature, and you're re 5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention). 6. Pat yourself on the back for being an awesome contributor. 7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. - diff --git a/airbyte-integrations/connectors/source-cart/BOOTSTRAP.md b/airbyte-integrations/connectors/source-cart/BOOTSTRAP.md index 67db894242b..85569882d8b 100644 --- a/airbyte-integrations/connectors/source-cart/BOOTSTRAP.md +++ b/airbyte-integrations/connectors/source-cart/BOOTSTRAP.md @@ -2,12 +2,13 @@ Cart.com is a straightforward CRUD REST API. Connector is implemented with [Airbyte CDK](https://docs.airbyte.io/connector-development/cdk-python). -It consists of some REST resources like shopping_cart, users, products, etc… each of which have a list endpoint with a timestamp filter that can be used to perform incremental syncs. +It consists of some REST resources like shopping_cart, users, products, etc… each of which have a list endpoint with a timestamp filter that can be used to perform incremental syncs. -Auth uses a pre-created API token which can be created in the UI. 
-Pagination uses a cursor pagination strategy. -Rate limiting is just a standard exponential backoff when you see a 429 HTTP status code. +Auth uses a pre-created API token which can be created in the UI. +Pagination uses a cursor pagination strategy. +Rate limiting is just a standard exponential backoff when you see a 429 HTTP status code. See the links below for information about specific streams and some nuances about the connector: + - [information about streams](https://docs.google.com/spreadsheets/d/1s-MAwI5d3eBlBOD8II_sZM7pw5FmZtAJsx1KJjVRFNU/edit#gid=1796337932) (`Cart.com` tab) - [nuances about the connector](https://docs.airbyte.io/integrations/sources/cart) diff --git a/airbyte-integrations/connectors/source-cart/README.md b/airbyte-integrations/connectors/source-cart/README.md index c18dede39bd..3e847d94ced 100644 --- a/airbyte-integrations/connectors/source-cart/README.md +++ b/airbyte-integrations/connectors/source-cart/README.md @@ -1,31 +1,32 @@ # Cart source connector - This is the repository for the Cart source connector, written in Python. For information about how to use this connector within Airbyte, see [the documentation](https://docs.airbyte.com/integrations/sources/cart). ## Local development ### Prerequisites -* Python (~=3.9) -* Poetry (~=1.7) - installation instructions [here](https://python-poetry.org/docs/#installation) +- Python (~=3.9) +- Poetry (~=1.7) - installation instructions [here](https://python-poetry.org/docs/#installation) ### Installing the connector + From this connector directory, run: + ```bash poetry install --with dev ``` - ### Create credentials + **If you are a community contributor**, follow the instructions in the [documentation](https://docs.airbyte.com/integrations/sources/cart) to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `source_cart/spec.yaml` file. Note that any directory named `secrets` is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. See `sample_files/sample_config.json` for a sample config file. - ### Locally running the connector + ``` poetry run source-cart spec poetry run source-cart check --config secrets/config.json @@ -34,23 +35,28 @@ poetry run source-cart read --config secrets/config.json --catalog sample_files/ ``` ### Running unit tests + To run unit tests locally, from the connector directory run: + ``` poetry run pytest unit_tests ``` ### Building the docker image + 1. Install [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md) 2. Run the following command to build the docker image: + ```bash airbyte-ci connectors --name=source-cart build ``` An image will be available on your host with the tag `airbyte/source-cart:dev`. - ### Running as a docker container + Then run any of the connector commands as follows: + ``` docker run --rm airbyte/source-cart:dev spec docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-cart:dev check --config /secrets/config.json @@ -59,18 +65,23 @@ docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integrat ``` ### Running our CI test suite + You can run our full test suite locally using [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md): + ```bash airbyte-ci connectors --name=source-cart test ``` ### Customizing acceptance Tests + Customize `acceptance-test-config.yml` file to configure acceptance tests. 
See [Connector Acceptance Tests](https://docs.airbyte.com/connector-development/testing-connectors/connector-acceptance-tests-reference) for more information. If your connector requires to create or destroy resources for use during acceptance tests create fixtures for it and place them inside integration_tests/acceptance.py. ### Dependency Management -All of your dependencies should be managed via Poetry. + +All of your dependencies should be managed via Poetry. To add a new dependency, run: + ```bash poetry add ``` @@ -78,14 +89,16 @@ poetry add Please commit the changes to `pyproject.toml` and `poetry.lock` files. ## Publishing a new version of the connector + You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? + 1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-cart test` -2. Bump the connector version (please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors)): - - bump the `dockerImageTag` value in in `metadata.yaml` - - bump the `version` value in `pyproject.toml` +2. Bump the connector version (please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors)): + - bump the `dockerImageTag` value in in `metadata.yaml` + - bump the `version` value in `pyproject.toml` 3. Make sure the `metadata.yaml` content is up to date. 4. Make sure the connector documentation and its changelog is up to date (`docs/integrations/sources/cart.md`). 5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention). 6. Pat yourself on the back for being an awesome contributor. 7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. -8. Once your PR is merged, the new version of the connector will be automatically published to Docker Hub and our connector registry. \ No newline at end of file +8. Once your PR is merged, the new version of the connector will be automatically published to Docker Hub and our connector registry. diff --git a/airbyte-integrations/connectors/source-chargebee/README.md b/airbyte-integrations/connectors/source-chargebee/README.md index 84399c6eb84..8b699aea5b8 100644 --- a/airbyte-integrations/connectors/source-chargebee/README.md +++ b/airbyte-integrations/connectors/source-chargebee/README.md @@ -1,31 +1,32 @@ # Chargebee source connector - This is the repository for the Chargebee source connector, written in Python. For information about how to use this connector within Airbyte, see [the documentation](https://docs.airbyte.com/integrations/sources/chargebee). ## Local development ### Prerequisites -* Python (~=3.9) -* Poetry (~=1.7) - installation instructions [here](https://python-poetry.org/docs/#installation) +- Python (~=3.9) +- Poetry (~=1.7) - installation instructions [here](https://python-poetry.org/docs/#installation) ### Installing the connector + From this connector directory, run: + ```bash poetry install --with dev ``` - ### Create credentials + **If you are a community contributor**, follow the instructions in the [documentation](https://docs.airbyte.com/integrations/sources/chargebee) to generate the necessary credentials. 
Then create a file `secrets/config.json` conforming to the `source_chargebee/spec.yaml` file. Note that any directory named `secrets` is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. See `sample_files/sample_config.json` for a sample config file. - ### Locally running the connector + ``` poetry run source-chargebee spec poetry run source-chargebee check --config secrets/config.json @@ -34,23 +35,28 @@ poetry run source-chargebee read --config secrets/config.json --catalog integrat ``` ### Running unit tests + To run unit tests locally, from the connector directory run: + ``` poetry run pytest unit_tests ``` ### Building the docker image + 1. Install [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md) 2. Run the following command to build the docker image: + ```bash airbyte-ci connectors --name=source-chargebee build ``` An image will be available on your host with the tag `airbyte/source-chargebee:dev`. - ### Running as a docker container + Then run any of the connector commands as follows: + ``` docker run --rm airbyte/source-chargebee:dev spec docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-chargebee:dev check --config /secrets/config.json @@ -59,18 +65,23 @@ docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integrat ``` ### Running our CI test suite + You can run our full test suite locally using [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md): + ```bash airbyte-ci connectors --name=source-chargebee test ``` ### Customizing acceptance Tests + Customize `acceptance-test-config.yml` file to configure acceptance tests. See [Connector Acceptance Tests](https://docs.airbyte.com/connector-development/testing-connectors/connector-acceptance-tests-reference) for more information. If your connector requires to create or destroy resources for use during acceptance tests create fixtures for it and place them inside integration_tests/acceptance.py. ### Dependency Management -All of your dependencies should be managed via Poetry. + +All of your dependencies should be managed via Poetry. To add a new dependency, run: + ```bash poetry add ``` @@ -78,14 +89,16 @@ poetry add Please commit the changes to `pyproject.toml` and `poetry.lock` files. ## Publishing a new version of the connector + You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? + 1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-chargebee test` -2. Bump the connector version (please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors)): - - bump the `dockerImageTag` value in in `metadata.yaml` - - bump the `version` value in `pyproject.toml` +2. Bump the connector version (please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors)): + - bump the `dockerImageTag` value in in `metadata.yaml` + - bump the `version` value in `pyproject.toml` 3. Make sure the `metadata.yaml` content is up to date. 4. Make sure the connector documentation and its changelog is up to date (`docs/integrations/sources/chargebee.md`). 5. 
Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention). 6. Pat yourself on the back for being an awesome contributor. 7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. -8. Once your PR is merged, the new version of the connector will be automatically published to Docker Hub and our connector registry. \ No newline at end of file +8. Once your PR is merged, the new version of the connector will be automatically published to Docker Hub and our connector registry. diff --git a/airbyte-integrations/connectors/source-chargify/README.md b/airbyte-integrations/connectors/source-chargify/README.md index 5d4ab397f1e..a6f49562709 100644 --- a/airbyte-integrations/connectors/source-chargify/README.md +++ b/airbyte-integrations/connectors/source-chargify/README.md @@ -6,6 +6,7 @@ For information about how to use this connector within Airbyte, see [the documen ## Local development #### Create credentials + **If you are a community contributor**, follow the instructions in the [documentation](https://docs.airbyte.com/integrations/sources/chargify) to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `source_chargify/spec.yaml` file. Note that any directory named `secrets` is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. @@ -16,9 +17,10 @@ and place them into `secrets/config.json`. ### Locally running the connector docker image - #### Build + **Via [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md) (recommended):** + ```bash airbyte-ci connectors --name=source-chargify build ``` @@ -26,12 +28,15 @@ airbyte-ci connectors --name=source-chargify build An image will be built with the tag `airbyte/source-chargify:dev`. **Via `docker build`:** + ```bash docker build -t airbyte/source-chargify:dev . ``` #### Run + Then run any of the connector commands as follows: + ``` docker run --rm airbyte/source-chargify:dev spec docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-chargify:dev check --config /secrets/config.json @@ -40,23 +45,30 @@ docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integrat ``` ## Testing + You can run our full test suite locally using [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md): + ```bash airbyte-ci connectors --name=source-chargify test ``` ### Customizing acceptance Tests + Customize `acceptance-test-config.yml` file to configure tests. See [Connector Acceptance Tests](https://docs.airbyte.com/connector-development/testing-connectors/connector-acceptance-tests-reference) for more information. If your connector requires to create or destroy resources for use during acceptance tests create fixtures for it and place them inside integration_tests/acceptance.py. ## Dependency Management + All of your dependencies should go in `setup.py`, NOT `requirements.txt`. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development. We split dependencies between two groups, dependencies that are: -* required for your connector to work need to go to `MAIN_REQUIREMENTS` list. -* required for the testing need to go to `TEST_REQUIREMENTS` list + +- required for your connector to work need to go to `MAIN_REQUIREMENTS` list. 
+- required for the testing need to go to `TEST_REQUIREMENTS` list ### Publishing a new version of the connector + You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? + 1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-chargify test` 2. Bump the connector version in `metadata.yaml`: increment the `dockerImageTag` value. Please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors). 3. Make sure the `metadata.yaml` content is up to date. @@ -64,4 +76,3 @@ You've checked out the repo, implemented a million dollar feature, and you're re 5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention). 6. Pat yourself on the back for being an awesome contributor. 7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. - diff --git a/airbyte-integrations/connectors/source-chartmogul/README.md b/airbyte-integrations/connectors/source-chartmogul/README.md index 5e1706e3c81..52c8f502f2e 100644 --- a/airbyte-integrations/connectors/source-chartmogul/README.md +++ b/airbyte-integrations/connectors/source-chartmogul/README.md @@ -6,23 +6,28 @@ For information about how to use this connector within Airbyte, see [the documen ## Local development ### Prerequisites + **To iterate on this connector, make sure to complete this prerequisites section.** #### Minimum Python version required `= 3.7.0` #### Build & Activate Virtual Environment and install dependencies + From this connector directory, create a virtual environment: + ``` python3 -m venv .venv ``` This will generate a virtualenv for this module in `.venv/`. Make sure this venv is active in your development environment of choice. To activate it from the terminal, run: + ``` source .venv/bin/activate pip install -r requirements.txt pip install '.[tests]' ``` + If you are in an IDE, follow your IDE's instructions to activate the virtualenv. Note that while we are installing dependencies from `requirements.txt`, you should only edit `setup.py` for your dependencies. `requirements.txt` is @@ -31,6 +36,7 @@ If this is mumbo jumbo to you, don't worry about it, just put your deps in `setu should work as you expect. #### Create credentials + **If you are a community contributor**, follow the instructions in the [documentation](https://docs.airbyte.io/integrations/sources/chartmogul) to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `source_chartmogul/spec.json` file. Note that any directory named `secrets` is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. @@ -40,6 +46,7 @@ See `integration_tests/sample_config.json` for a sample config file. and place them into `secrets/config.json`. 
### Locally running the connector + ``` python main.py spec python main.py check --config secrets/config.json @@ -49,9 +56,10 @@ python main.py read --config secrets/config.json --catalog integration_tests/con ### Locally running the connector docker image - #### Build + **Via [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md) (recommended):** + ```bash airbyte-ci connectors --name=source-chartmogul build ``` @@ -59,12 +67,15 @@ airbyte-ci connectors --name=source-chartmogul build An image will be built with the tag `airbyte/source-chartmogul:dev`. **Via `docker build`:** + ```bash docker build -t airbyte/source-chartmogul:dev . ``` #### Run + Then run any of the connector commands as follows: + ``` docker run --rm airbyte/source-chartmogul:dev spec docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-chartmogul:dev check --config /secrets/config.json @@ -73,23 +84,30 @@ docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integrat ``` ## Testing + You can run our full test suite locally using [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md): + ```bash airbyte-ci connectors --name=source-chartmogul test ``` ### Customizing acceptance Tests + Customize `acceptance-test-config.yml` file to configure tests. See [Connector Acceptance Tests](https://docs.airbyte.com/connector-development/testing-connectors/connector-acceptance-tests-reference) for more information. If your connector requires to create or destroy resources for use during acceptance tests create fixtures for it and place them inside integration_tests/acceptance.py. ## Dependency Management + All of your dependencies should go in `setup.py`, NOT `requirements.txt`. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development. We split dependencies between two groups, dependencies that are: -* required for your connector to work need to go to `MAIN_REQUIREMENTS` list. -* required for the testing need to go to `TEST_REQUIREMENTS` list + +- required for your connector to work need to go to `MAIN_REQUIREMENTS` list. +- required for the testing need to go to `TEST_REQUIREMENTS` list ### Publishing a new version of the connector + You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? + 1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-chartmogul test` 2. Bump the connector version in `metadata.yaml`: increment the `dockerImageTag` value. Please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors). 3. Make sure the `metadata.yaml` content is up to date. @@ -97,4 +115,3 @@ You've checked out the repo, implemented a million dollar feature, and you're re 5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention). 6. Pat yourself on the back for being an awesome contributor. 7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. 
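The `python main.py spec|check|discover|read` commands shown above are wired up by a small entrypoint script that hands the CLI arguments to the Airbyte CDK. A minimal sketch of what such a `main.py` typically looks like (the `SourceChartmogul` import and class name are assumed for illustration):

```python
# main.py -- illustrative entrypoint sketch, not copied from the actual connector.
import sys

from airbyte_cdk.entrypoint import launch
from source_chartmogul import SourceChartmogul  # assumed package and class name

if __name__ == "__main__":
    # `launch` parses the spec/check/discover/read arguments and calls the
    # matching method on the source.
    source = SourceChartmogul()
    launch(source, sys.argv[1:])
```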
- diff --git a/airbyte-integrations/connectors/source-chartmogul/bootstrap.md b/airbyte-integrations/connectors/source-chartmogul/bootstrap.md index c7dfde476b2..d47b7ef4222 100644 --- a/airbyte-integrations/connectors/source-chartmogul/bootstrap.md +++ b/airbyte-integrations/connectors/source-chartmogul/bootstrap.md @@ -1,22 +1,28 @@ # Chartmogul + Chartmogul is an online subscription analytics platform. It retrieves data from payment processors (e.g. Stripe) and makes sense out of it. ## Streams The connector currently implements the following full refresh streams: -* [Customers](https://dev.chartmogul.com/reference/list-customers) -* [CustomerCount] (https://dev.chartmogul.com/reference/retrieve-customer-count) -* [Activities](https://dev.chartmogul.com/reference/list-activities) + +- [Customers](https://dev.chartmogul.com/reference/list-customers) +- [CustomerCount](https://dev.chartmogul.com/reference/retrieve-customer-count) +- [Activities](https://dev.chartmogul.com/reference/list-activities) The `start_date` config is used for retrieving `Activities`. The `Customers` stream does not use this config. Even if it was possible to filter by `start_date`, it would cause issues when modeling data. That is because activities after `start_date` can be triggered by customers who were created way before that. ### Incremental streams + Incremental streams were not implemented due to the following reasons: -* `Customers` API endpoint does not provide filtering by creation/update date. -* `Activities` API does provide pagination based on last entries UUID, however it is not stable, since it is possible to for activity to disappear retrospectively. + +- The `Customers` API endpoint does not provide filtering by creation/update date. +- The `Activities` API does provide pagination based on the last entry's UUID; however, it is not stable, since it is possible for an activity to disappear retrospectively. ### Next steps + It is theoretically possible to make the `Activities` stream incremental. One would need to keep track of both the UUID and `created_at`, and read the stream until `datetime.now()`. A dynamic end date would be necessary since activities can also have a future date. Since data can be changed retrospectively, a `lookback window` would also be necessary to catch all the changes. ### Rate limits -The API rate limit is at 40 requests/second. Read [Rate Limits](https://dev.chartmogul.com/docs/rate-limits) for more informations. \ No newline at end of file + +The API rate limit is 40 requests/second. Read [Rate Limits](https://dev.chartmogul.com/docs/rate-limits) for more information. diff --git a/airbyte-integrations/connectors/source-clickhouse-strict-encrypt/BOOTSTRAP.md b/airbyte-integrations/connectors/source-clickhouse-strict-encrypt/BOOTSTRAP.md index 28b9e03fdd1..b2de5553780 100644 --- a/airbyte-integrations/connectors/source-clickhouse-strict-encrypt/BOOTSTRAP.md +++ b/airbyte-integrations/connectors/source-clickhouse-strict-encrypt/BOOTSTRAP.md @@ -3,7 +3,8 @@ ClickHouse is an open-source column-oriented DBMS for online analytical processing. ClickHouse was developed by the Russian IT company Yandex for the Yandex.Metrica web analytics service. There are roughly two kinds of queries allowed in Clickhouse: + 1. API based (not supported by airbyte) 2. JDBC based (used by airbyte). For more details please follow this [link](https://clickhouse.com/docs/en/interfaces/jdbc/).
-Also make sure to read [the documentation](https://clickhouse.com/docs/en/) in its entirety to have a strong understanding of this important aspect of the product. +Also make sure to read [the documentation](https://clickhouse.com/docs/en/) in its entirety to have a strong understanding of this important aspect of the product. diff --git a/airbyte-integrations/connectors/source-clickhouse-strict-encrypt/ReadMe.md b/airbyte-integrations/connectors/source-clickhouse-strict-encrypt/ReadMe.md index 26ba470a99f..30b09e246b5 100644 --- a/airbyte-integrations/connectors/source-clickhouse-strict-encrypt/ReadMe.md +++ b/airbyte-integrations/connectors/source-clickhouse-strict-encrypt/ReadMe.md @@ -5,5 +5,6 @@ In order to test the Clickhouse destination, you need to have the up and running This connector inherits the Clickhouse source, but supports SSL connections only. # Integration tests + For the SSL test, a custom image is used. To push it, run this command under the tools\integration-tests-ssl dir: -*docker build -t your_user/clickhouse-with-ssl:dev -f Clickhouse.Dockerfile .* +_docker build -t your_user/clickhouse-with-ssl:dev -f Clickhouse.Dockerfile ._ diff --git a/airbyte-integrations/connectors/source-clickhouse/BOOTSTRAP.md b/airbyte-integrations/connectors/source-clickhouse/BOOTSTRAP.md index 28b9e03fdd1..b2de5553780 100644 --- a/airbyte-integrations/connectors/source-clickhouse/BOOTSTRAP.md +++ b/airbyte-integrations/connectors/source-clickhouse/BOOTSTRAP.md @@ -3,7 +3,8 @@ ClickHouse is an open-source column-oriented DBMS for online analytical processing. ClickHouse was developed by the Russian IT company Yandex for the Yandex.Metrica web analytics service. There are roughly two kinds of queries allowed in Clickhouse: + 1. API based (not supported by airbyte) 2. JDBC based (used by airbyte). For more details please follow this [link](https://clickhouse.com/docs/en/interfaces/jdbc/). -Also make sure to read [the documentation](https://clickhouse.com/docs/en/) in its entirety to have a strong understanding of this important aspect of the product. +Also make sure to read [the documentation](https://clickhouse.com/docs/en/) in its entirety to have a strong understanding of this important aspect of the product. diff --git a/airbyte-integrations/connectors/source-clickhouse/ReadMe.md b/airbyte-integrations/connectors/source-clickhouse/ReadMe.md index c0e976415f2..55c60cd2cc0 100644 --- a/airbyte-integrations/connectors/source-clickhouse/ReadMe.md +++ b/airbyte-integrations/connectors/source-clickhouse/ReadMe.md @@ -1,3 +1,4 @@ # Integration tests + For the SSL test, a custom image is used. To push it, run this command under the tools\integration-tests-ssl dir: -*docker build -t your_user/clickhouse-with-ssl:dev -f Clickhouse.Dockerfile .* +_docker build -t your_user/clickhouse-with-ssl:dev -f Clickhouse.Dockerfile ._ diff --git a/airbyte-integrations/connectors/source-clickup-api/README.md b/airbyte-integrations/connectors/source-clickup-api/README.md index 155ef6c59a2..4f01e530810 100644 --- a/airbyte-integrations/connectors/source-clickup-api/README.md +++ b/airbyte-integrations/connectors/source-clickup-api/README.md @@ -6,6 +6,7 @@ For information about how to use this connector within Airbyte, see [the documen ## Local development #### Create credentials + **If you are a community contributor**, follow the instructions in the [documentation](https://docs.airbyte.io/integrations/sources/clickup-api) to generate the necessary credentials.
Then create a file `secrets/config.json` conforming to the `source_clickup_api/spec.yaml` file. Note that any directory named `secrets` is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. @@ -16,9 +17,10 @@ and place them into `secrets/config.json`. ### Locally running the connector docker image - #### Build + **Via [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md) (recommended):** + ```bash airbyte-ci connectors --name=source-clickup-api build ``` @@ -26,12 +28,15 @@ airbyte-ci connectors --name=source-clickup-api build An image will be built with the tag `airbyte/source-clickup-api:dev`. **Via `docker build`:** + ```bash docker build -t airbyte/source-clickup-api:dev . ``` #### Run + Then run any of the connector commands as follows: + ``` docker run --rm airbyte/source-clickup-api:dev spec docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-clickup-api:dev check --config /secrets/config.json @@ -40,23 +45,30 @@ docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integrat ``` ## Testing + You can run our full test suite locally using [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md): + ```bash airbyte-ci connectors --name=source-clickup-api test ``` ### Customizing acceptance Tests + Customize `acceptance-test-config.yml` file to configure tests. See [Connector Acceptance Tests](https://docs.airbyte.com/connector-development/testing-connectors/connector-acceptance-tests-reference) for more information. If your connector requires to create or destroy resources for use during acceptance tests create fixtures for it and place them inside integration_tests/acceptance.py. ## Dependency Management + All of your dependencies should go in `setup.py`, NOT `requirements.txt`. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development. We split dependencies between two groups, dependencies that are: -* required for your connector to work need to go to `MAIN_REQUIREMENTS` list. -* required for the testing need to go to `TEST_REQUIREMENTS` list + +- required for your connector to work need to go to `MAIN_REQUIREMENTS` list. +- required for the testing need to go to `TEST_REQUIREMENTS` list ### Publishing a new version of the connector + You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? + 1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-clickup-api test` 2. Bump the connector version in `metadata.yaml`: increment the `dockerImageTag` value. Please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors). 3. Make sure the `metadata.yaml` content is up to date. @@ -64,4 +76,3 @@ You've checked out the repo, implemented a million dollar feature, and you're re 5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention). 6. Pat yourself on the back for being an awesome contributor. 7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. 
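As a concrete illustration of the `setup.py` dependency split described above, a typical layout looks roughly like this (the package names and pins are placeholders, not this connector's actual requirements):

```python
# setup.py -- illustrative sketch of the MAIN_REQUIREMENTS / TEST_REQUIREMENTS split.
from setuptools import find_packages, setup

# Runtime dependencies: what the connector needs to sync data.
MAIN_REQUIREMENTS = [
    "airbyte-cdk",
    "requests",
]

# Test-only dependencies: installed via `pip install '.[tests]'`.
TEST_REQUIREMENTS = [
    "pytest",
    "requests-mock",
]

setup(
    name="source_example",  # placeholder package name
    description="Source implementation for an example API.",
    packages=find_packages(),
    install_requires=MAIN_REQUIREMENTS,
    extras_require={"tests": TEST_REQUIREMENTS},
)
```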
- diff --git a/airbyte-integrations/connectors/source-clockify/README.md b/airbyte-integrations/connectors/source-clockify/README.md index 79a28c0af1d..990499b33fc 100644 --- a/airbyte-integrations/connectors/source-clockify/README.md +++ b/airbyte-integrations/connectors/source-clockify/README.md @@ -1,31 +1,32 @@ # Clockify source connector - This is the repository for the Clockify source connector, written in Python. For information about how to use this connector within Airbyte, see [the documentation](https://docs.airbyte.com/integrations/sources/clockify). ## Local development ### Prerequisites -* Python (~=3.9) -* Poetry (~=1.7) - installation instructions [here](https://python-poetry.org/docs/#installation) +- Python (~=3.9) +- Poetry (~=1.7) - installation instructions [here](https://python-poetry.org/docs/#installation) ### Installing the connector + From this connector directory, run: + ```bash poetry install --with dev ``` - ### Create credentials + **If you are a community contributor**, follow the instructions in the [documentation](https://docs.airbyte.com/integrations/sources/clockify) to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `source_clockify/spec.yaml` file. Note that any directory named `secrets` is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. See `sample_files/sample_config.json` for a sample config file. - ### Locally running the connector + ``` poetry run source-clockify spec poetry run source-clockify check --config secrets/config.json @@ -34,23 +35,28 @@ poetry run source-clockify read --config secrets/config.json --catalog sample_fi ``` ### Running unit tests + To run unit tests locally, from the connector directory run: + ``` poetry run pytest unit_tests ``` ### Building the docker image + 1. Install [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md) 2. Run the following command to build the docker image: + ```bash airbyte-ci connectors --name=source-clockify build ``` An image will be available on your host with the tag `airbyte/source-clockify:dev`. - ### Running as a docker container + Then run any of the connector commands as follows: + ``` docker run --rm airbyte/source-clockify:dev spec docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-clockify:dev check --config /secrets/config.json @@ -59,18 +65,23 @@ docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integrat ``` ### Running our CI test suite + You can run our full test suite locally using [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md): + ```bash airbyte-ci connectors --name=source-clockify test ``` ### Customizing acceptance Tests + Customize `acceptance-test-config.yml` file to configure acceptance tests. See [Connector Acceptance Tests](https://docs.airbyte.com/connector-development/testing-connectors/connector-acceptance-tests-reference) for more information. If your connector requires to create or destroy resources for use during acceptance tests create fixtures for it and place them inside integration_tests/acceptance.py. ### Dependency Management -All of your dependencies should be managed via Poetry. + +All of your dependencies should be managed via Poetry. To add a new dependency, run: + ```bash poetry add ``` @@ -78,14 +89,16 @@ poetry add Please commit the changes to `pyproject.toml` and `poetry.lock` files. 
## Publishing a new version of the connector + You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? + 1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-clockify test` -2. Bump the connector version (please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors)): - - bump the `dockerImageTag` value in in `metadata.yaml` - - bump the `version` value in `pyproject.toml` +2. Bump the connector version (please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors)): + - bump the `dockerImageTag` value in in `metadata.yaml` + - bump the `version` value in `pyproject.toml` 3. Make sure the `metadata.yaml` content is up to date. 4. Make sure the connector documentation and its changelog is up to date (`docs/integrations/sources/clockify.md`). 5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention). 6. Pat yourself on the back for being an awesome contributor. 7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. -8. Once your PR is merged, the new version of the connector will be automatically published to Docker Hub and our connector registry. \ No newline at end of file +8. Once your PR is merged, the new version of the connector will be automatically published to Docker Hub and our connector registry. diff --git a/airbyte-integrations/connectors/source-close-com/README.md b/airbyte-integrations/connectors/source-close-com/README.md index 8dbd979e6b6..a60e2992a75 100644 --- a/airbyte-integrations/connectors/source-close-com/README.md +++ b/airbyte-integrations/connectors/source-close-com/README.md @@ -6,22 +6,27 @@ For information about how to use this connector within Airbyte, see [the documen ## Local development ### Prerequisites + **To iterate on this connector, make sure to complete this prerequisites section.** #### Minimum Python version required `= 3.7.0` #### Build & Activate Virtual Environment and install dependencies + From this connector directory, create a virtual environment: + ``` python -m venv .venv ``` This will generate a virtualenv for this module in `.venv/`. Make sure this venv is active in your development environment of choice. To activate it from the terminal, run: + ``` source .venv/bin/activate pip install -r requirements.txt ``` + If you are in an IDE, follow your IDE's instructions to activate the virtualenv. Note that while we are installing dependencies from `requirements.txt`, you should only edit `setup.py` for your dependencies. `requirements.txt` is @@ -30,6 +35,7 @@ If this is mumbo jumbo to you, don't worry about it, just put your deps in `setu should work as you expect. #### Create credentials + **If you are a community contributor**, follow the instructions in the [documentation](https://docs.airbyte.io/integrations/sources/close-com) to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `source_close_com/spec.json` file. Note that the `secrets` directory is gitignored by default, so there is no danger of accidentally checking in sensitive information. 
@@ -39,6 +45,7 @@ See `integration_tests/sample_config.json` for a sample config file. and place them into `secrets/config.json`. ### Locally running the connector + ``` python main.py spec python main.py check --config secrets/config.json @@ -48,9 +55,10 @@ python main.py read --config secrets/config.json --catalog integration_tests/con ### Locally running the connector docker image - #### Build + **Via [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md) (recommended):** + ```bash airbyte-ci connectors --name=source-close-com build ``` @@ -58,12 +66,15 @@ airbyte-ci connectors --name=source-close-com build An image will be built with the tag `airbyte/source-close-com:dev`. **Via `docker build`:** + ```bash docker build -t airbyte/source-close-com:dev . ``` #### Run + Then run any of the connector commands as follows: + ``` docker run --rm airbyte/source-close-com:dev spec docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-close-com:dev check --config /secrets/config.json @@ -72,23 +83,30 @@ docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integrat ``` ## Testing + You can run our full test suite locally using [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md): + ```bash airbyte-ci connectors --name=source-close-com test ``` ### Customizing acceptance Tests + Customize `acceptance-test-config.yml` file to configure tests. See [Connector Acceptance Tests](https://docs.airbyte.com/connector-development/testing-connectors/connector-acceptance-tests-reference) for more information. If your connector requires to create or destroy resources for use during acceptance tests create fixtures for it and place them inside integration_tests/acceptance.py. ## Dependency Management + All of your dependencies should go in `setup.py`, NOT `requirements.txt`. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development. We split dependencies between two groups, dependencies that are: -* required for your connector to work need to go to `MAIN_REQUIREMENTS` list. -* required for the testing need to go to `TEST_REQUIREMENTS` list + +- required for your connector to work need to go to `MAIN_REQUIREMENTS` list. +- required for the testing need to go to `TEST_REQUIREMENTS` list ### Publishing a new version of the connector + You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? + 1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-close-com test` 2. Bump the connector version in `metadata.yaml`: increment the `dockerImageTag` value. Please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors). 3. Make sure the `metadata.yaml` content is up to date. @@ -96,4 +114,3 @@ You've checked out the repo, implemented a million dollar feature, and you're re 5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention). 6. Pat yourself on the back for being an awesome contributor. 7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. 
- diff --git a/airbyte-integrations/connectors/source-close-com/bootstrap.md b/airbyte-integrations/connectors/source-close-com/bootstrap.md index 9314a0f2bba..429ad5fff7e 100644 --- a/airbyte-integrations/connectors/source-close-com/bootstrap.md +++ b/airbyte-integrations/connectors/source-close-com/bootstrap.md @@ -4,9 +4,9 @@ The Close.com API allows users to retrieve information about leads, contacts, activities etc. **API** doc available [here](https://developer.close.com/). -Auth uses a pre-created API token which can be created in the UI. +Auth uses a pre-created API token which can be created in the UI. -In one case, `_skip` and `_limit` params are used for pagination. +In one case, `_skip` and `_limit` params are used for pagination. Some streams have a `_limit` param (the `number_of_items_per_page` variable in code) due to the maximum Close.com limit on data per request. In the other case, the `cursor_next` field from the response is used for pagination via the `_cursor` param. @@ -14,9 +14,9 @@ Rate limiting is just a standard exponential backoff when you see a 429 HTTP sta Some streams support incremental sync. Incremental sync is available when the API endpoint supports one of these query params: `date_created` or `date_updated`. -There are not `state_checkpoint_interval` for *activities* and *events* due to impossibility ordering data ascending. +There is no `state_checkpoint_interval` for _activities_ and _events_ because the data cannot be ordered in ascending order. -Also, Close.com source has general stream classes for *activities*, *tasks*, *custom fields*, *connected accounts*, and *bulk actions*. +Also, the Close.com source has general stream classes for _activities_, _tasks_, _custom fields_, _connected accounts_, and _bulk actions_. This is because each of these streams has a different schema. See [this](https://docs.airbyte.io/integrations/sources/close-com) link for the nuances of the connector. diff --git a/airbyte-integrations/connectors/source-coda/README.md b/airbyte-integrations/connectors/source-coda/README.md index 848b2fc3d76..af6bdcbf457 100644 --- a/airbyte-integrations/connectors/source-coda/README.md +++ b/airbyte-integrations/connectors/source-coda/README.md @@ -1,31 +1,32 @@ # Coda source connector - This is the repository for the Coda source connector, written in Python. For information about how to use this connector within Airbyte, see [the documentation](https://docs.airbyte.com/integrations/sources/coda). ## Local development ### Prerequisites -* Python (~=3.9) -* Poetry (~=1.7) - installation instructions [here](https://python-poetry.org/docs/#installation) +- Python (~=3.9) +- Poetry (~=1.7) - installation instructions [here](https://python-poetry.org/docs/#installation) ### Installing the connector + From this connector directory, run: + ```bash poetry install --with dev ``` - ### Create credentials + **If you are a community contributor**, follow the instructions in the [documentation](https://docs.airbyte.com/integrations/sources/coda) to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `source_coda/spec.yaml` file. Note that any directory named `secrets` is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. See `sample_files/sample_config.json` for a sample config file.
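To make the skip/limit and `has_more` pagination notes from the Close.com bootstrap above more concrete, here is a rough sketch of how that pattern can be expressed with the Airbyte CDK's `HttpStream` hooks. The class, endpoint, and response fields are assumptions for illustration, not the connector's actual implementation:

```python
# Illustrative skip/limit pagination sketch for an Airbyte CDK HttpStream.
from typing import Any, Iterable, Mapping, MutableMapping, Optional

import requests
from airbyte_cdk.sources.streams.http import HttpStream


class LeadsSketch(HttpStream):
    url_base = "https://api.close.com/api/v1/"  # assumed base URL
    primary_key = "id"
    number_of_items_per_page = 100  # mirrors the `_limit` cap mentioned above

    def __init__(self, **kwargs):
        super().__init__(**kwargs)
        self._offset = 0  # running `_skip` offset across pages

    def path(self, **kwargs) -> str:
        return "lead/"

    def next_page_token(self, response: requests.Response) -> Optional[Mapping[str, Any]]:
        # Stop paging once the API reports there is no more data.
        if not response.json().get("has_more"):
            return None
        self._offset += self.number_of_items_per_page
        return {"_skip": self._offset}

    def request_params(self, stream_state, stream_slice=None, next_page_token=None) -> MutableMapping[str, Any]:
        params = {"_limit": self.number_of_items_per_page}
        if next_page_token:
            params.update(next_page_token)
        return params

    def parse_response(self, response: requests.Response, **kwargs) -> Iterable[Mapping]:
        yield from response.json().get("data", [])
```

The 429 handling mentioned above needs no extra code in this sketch: by default the CDK's HTTP streams retry rate-limited requests with exponential backoff.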
- ### Locally running the connector + ``` poetry run source-coda spec poetry run source-coda check --config secrets/config.json @@ -34,23 +35,28 @@ poetry run source-coda read --config secrets/config.json --catalog sample_files/ ``` ### Running unit tests + To run unit tests locally, from the connector directory run: + ``` poetry run pytest unit_tests ``` ### Building the docker image + 1. Install [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md) 2. Run the following command to build the docker image: + ```bash airbyte-ci connectors --name=source-coda build ``` An image will be available on your host with the tag `airbyte/source-coda:dev`. - ### Running as a docker container + Then run any of the connector commands as follows: + ``` docker run --rm airbyte/source-coda:dev spec docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-coda:dev check --config /secrets/config.json @@ -59,18 +65,23 @@ docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integrat ``` ### Running our CI test suite + You can run our full test suite locally using [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md): + ```bash airbyte-ci connectors --name=source-coda test ``` ### Customizing acceptance Tests + Customize `acceptance-test-config.yml` file to configure acceptance tests. See [Connector Acceptance Tests](https://docs.airbyte.com/connector-development/testing-connectors/connector-acceptance-tests-reference) for more information. If your connector requires to create or destroy resources for use during acceptance tests create fixtures for it and place them inside integration_tests/acceptance.py. ### Dependency Management -All of your dependencies should be managed via Poetry. + +All of your dependencies should be managed via Poetry. To add a new dependency, run: + ```bash poetry add ``` @@ -78,14 +89,16 @@ poetry add Please commit the changes to `pyproject.toml` and `poetry.lock` files. ## Publishing a new version of the connector + You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? + 1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-coda test` -2. Bump the connector version (please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors)): - - bump the `dockerImageTag` value in in `metadata.yaml` - - bump the `version` value in `pyproject.toml` +2. Bump the connector version (please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors)): + - bump the `dockerImageTag` value in in `metadata.yaml` + - bump the `version` value in `pyproject.toml` 3. Make sure the `metadata.yaml` content is up to date. 4. Make sure the connector documentation and its changelog is up to date (`docs/integrations/sources/coda.md`). 5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention). 6. Pat yourself on the back for being an awesome contributor. 7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. -8. 
Once your PR is merged, the new version of the connector will be automatically published to Docker Hub and our connector registry. \ No newline at end of file +8. Once your PR is merged, the new version of the connector will be automatically published to Docker Hub and our connector registry. diff --git a/airbyte-integrations/connectors/source-coin-api/README.md b/airbyte-integrations/connectors/source-coin-api/README.md index 9be80651b88..74a3b118399 100644 --- a/airbyte-integrations/connectors/source-coin-api/README.md +++ b/airbyte-integrations/connectors/source-coin-api/README.md @@ -1,31 +1,32 @@ # Coin-Api source connector - This is the repository for the Coin-Api source connector, written in Python. For information about how to use this connector within Airbyte, see [the documentation](https://docs.airbyte.com/integrations/sources/coin-api). ## Local development ### Prerequisites -* Python (~=3.9) -* Poetry (~=1.7) - installation instructions [here](https://python-poetry.org/docs/#installation) +- Python (~=3.9) +- Poetry (~=1.7) - installation instructions [here](https://python-poetry.org/docs/#installation) ### Installing the connector + From this connector directory, run: + ```bash poetry install --with dev ``` - ### Create credentials + **If you are a community contributor**, follow the instructions in the [documentation](https://docs.airbyte.com/integrations/sources/coin-api) to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `source_coin_api/spec.yaml` file. Note that any directory named `secrets` is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. See `sample_files/sample_config.json` for a sample config file. - ### Locally running the connector + ``` poetry run source-coin-api spec poetry run source-coin-api check --config secrets/config.json @@ -34,23 +35,28 @@ poetry run source-coin-api read --config secrets/config.json --catalog sample_fi ``` ### Running unit tests + To run unit tests locally, from the connector directory run: + ``` poetry run pytest unit_tests ``` ### Building the docker image + 1. Install [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md) 2. Run the following command to build the docker image: + ```bash airbyte-ci connectors --name=source-coin-api build ``` An image will be available on your host with the tag `airbyte/source-coin-api:dev`. - ### Running as a docker container + Then run any of the connector commands as follows: + ``` docker run --rm airbyte/source-coin-api:dev spec docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-coin-api:dev check --config /secrets/config.json @@ -59,18 +65,23 @@ docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integrat ``` ### Running our CI test suite + You can run our full test suite locally using [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md): + ```bash airbyte-ci connectors --name=source-coin-api test ``` ### Customizing acceptance Tests + Customize `acceptance-test-config.yml` file to configure acceptance tests. See [Connector Acceptance Tests](https://docs.airbyte.com/connector-development/testing-connectors/connector-acceptance-tests-reference) for more information. If your connector requires to create or destroy resources for use during acceptance tests create fixtures for it and place them inside integration_tests/acceptance.py. 
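The `integration_tests/acceptance.py` file mentioned above is just a small pytest hook. A minimal sketch of its usual shape, with a placeholder showing where resource setup and teardown would go (the resource itself is hypothetical):

```python
# integration_tests/acceptance.py -- sketch of the acceptance-test hook.
import pytest

# Registers the connector acceptance test suite as a pytest plugin.
pytest_plugins = ("connector_acceptance_test.plugin",)


@pytest.fixture(scope="session", autouse=True)
def connector_setup():
    """Create any external resources the acceptance tests need, then clean them up."""
    # e.g. create a temporary sandbox record via the API here (illustrative only)
    yield
    # ...and delete it here once the test session finishes.
```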
### Dependency Management -All of your dependencies should be managed via Poetry. + +All of your dependencies should be managed via Poetry. To add a new dependency, run: + ```bash poetry add ``` @@ -78,14 +89,16 @@ poetry add Please commit the changes to `pyproject.toml` and `poetry.lock` files. ## Publishing a new version of the connector + You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? + 1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-coin-api test` -2. Bump the connector version (please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors)): - - bump the `dockerImageTag` value in in `metadata.yaml` - - bump the `version` value in `pyproject.toml` +2. Bump the connector version (please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors)): + - bump the `dockerImageTag` value in in `metadata.yaml` + - bump the `version` value in `pyproject.toml` 3. Make sure the `metadata.yaml` content is up to date. 4. Make sure the connector documentation and its changelog is up to date (`docs/integrations/sources/coin-api.md`). 5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention). 6. Pat yourself on the back for being an awesome contributor. 7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. -8. Once your PR is merged, the new version of the connector will be automatically published to Docker Hub and our connector registry. \ No newline at end of file +8. Once your PR is merged, the new version of the connector will be automatically published to Docker Hub and our connector registry. diff --git a/airbyte-integrations/connectors/source-coingecko-coins/README.md b/airbyte-integrations/connectors/source-coingecko-coins/README.md index c35987386b9..42a4c38ffc8 100644 --- a/airbyte-integrations/connectors/source-coingecko-coins/README.md +++ b/airbyte-integrations/connectors/source-coingecko-coins/README.md @@ -6,6 +6,7 @@ For information about how to use this connector within Airbyte, see [the documen ## Local development #### Create credentials + **If you are a community contributor**, follow the instructions in the [documentation](https://docs.airbyte.io/integrations/sources/coingecko-coins) to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `source_coingecko_coins/spec.yaml` file. Note that any directory named `secrets` is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. @@ -16,9 +17,10 @@ and place them into `secrets/config.json`. ### Locally running the connector docker image - #### Build + **Via [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md) (recommended):** + ```bash airbyte-ci connectors --name=source-coingecko-coins build ``` @@ -26,12 +28,15 @@ airbyte-ci connectors --name=source-coingecko-coins build An image will be built with the tag `airbyte/source-coingecko-coins:dev`. **Via `docker build`:** + ```bash docker build -t airbyte/source-coingecko-coins:dev . 
``` #### Run + Then run any of the connector commands as follows: + ``` docker run --rm airbyte/source-coingecko-coins:dev spec docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-coingecko-coins:dev check --config /secrets/config.json @@ -40,23 +45,30 @@ docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integrat ``` ## Testing + You can run our full test suite locally using [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md): + ```bash airbyte-ci connectors --name=source-coingecko-coins test ``` ### Customizing acceptance Tests + Customize `acceptance-test-config.yml` file to configure tests. See [Connector Acceptance Tests](https://docs.airbyte.com/connector-development/testing-connectors/connector-acceptance-tests-reference) for more information. If your connector requires to create or destroy resources for use during acceptance tests create fixtures for it and place them inside integration_tests/acceptance.py. ## Dependency Management + All of your dependencies should go in `setup.py`, NOT `requirements.txt`. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development. We split dependencies between two groups, dependencies that are: -* required for your connector to work need to go to `MAIN_REQUIREMENTS` list. -* required for the testing need to go to `TEST_REQUIREMENTS` list + +- required for your connector to work need to go to `MAIN_REQUIREMENTS` list. +- required for the testing need to go to `TEST_REQUIREMENTS` list ### Publishing a new version of the connector + You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? + 1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-coingecko-coins test` 2. Bump the connector version in `metadata.yaml`: increment the `dockerImageTag` value. Please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors). 3. Make sure the `metadata.yaml` content is up to date. @@ -64,4 +76,3 @@ You've checked out the repo, implemented a million dollar feature, and you're re 5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention). 6. Pat yourself on the back for being an awesome contributor. 7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. - diff --git a/airbyte-integrations/connectors/source-coinmarketcap/README.md b/airbyte-integrations/connectors/source-coinmarketcap/README.md index 85a5701114c..5f0c71b3276 100644 --- a/airbyte-integrations/connectors/source-coinmarketcap/README.md +++ b/airbyte-integrations/connectors/source-coinmarketcap/README.md @@ -6,6 +6,7 @@ For information about how to use this connector within Airbyte, see [the documen ## Local development #### Create credentials + **If you are a community contributor**, follow the instructions in the [documentation](https://docs.airbyte.io/integrations/sources/coinmarketcap) to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `source_coinmarketcap/spec.yaml` file. Note that any directory named `secrets` is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. 
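Once `secrets/config.json` exists, the `check` command passes its parsed contents to the source's `check_connection` method. A hedged sketch of how a source typically validates that config (the class name, config key, and endpoint below are assumptions, not this connector's real code):

```python
# Illustrative sketch of how `check --config secrets/config.json` is consumed.
from typing import Any, List, Mapping, Tuple

import requests
from airbyte_cdk.sources import AbstractSource
from airbyte_cdk.sources.streams import Stream


class SourceExample(AbstractSource):  # hypothetical source class
    def check_connection(self, logger, config: Mapping[str, Any]) -> Tuple[bool, Any]:
        try:
            # `api_key` is an assumed config field; real connectors read the
            # fields declared in their spec.yaml / spec.json.
            response = requests.get(
                "https://api.example.com/v1/ping",
                headers={"Authorization": f"Bearer {config['api_key']}"},
                timeout=30,
            )
            response.raise_for_status()
            return True, None
        except Exception as error:
            return False, f"Unable to connect with the provided config: {error}"

    def streams(self, config: Mapping[str, Any]) -> List[Stream]:
        return []  # a real connector returns its stream instances here
```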
@@ -16,9 +17,10 @@ and place them into `secrets/config.json`. ### Locally running the connector docker image - #### Build + **Via [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md) (recommended):** + ```bash airbyte-ci connectors --name=source-coinmarketcap build ``` @@ -26,12 +28,15 @@ airbyte-ci connectors --name=source-coinmarketcap build An image will be built with the tag `airbyte/source-coinmarketcap:dev`. **Via `docker build`:** + ```bash docker build -t airbyte/source-coinmarketcap:dev . ``` #### Run + Then run any of the connector commands as follows: + ``` docker run --rm airbyte/source-coinmarketcap:dev spec docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-coinmarketcap:dev check --config /secrets/config.json @@ -40,23 +45,30 @@ docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integrat ``` ## Testing + You can run our full test suite locally using [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md): + ```bash airbyte-ci connectors --name=source-coinmarketcap test ``` ### Customizing acceptance Tests + Customize `acceptance-test-config.yml` file to configure tests. See [Connector Acceptance Tests](https://docs.airbyte.com/connector-development/testing-connectors/connector-acceptance-tests-reference) for more information. If your connector requires to create or destroy resources for use during acceptance tests create fixtures for it and place them inside integration_tests/acceptance.py. ## Dependency Management + All of your dependencies should go in `setup.py`, NOT `requirements.txt`. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development. We split dependencies between two groups, dependencies that are: -* required for your connector to work need to go to `MAIN_REQUIREMENTS` list. -* required for the testing need to go to `TEST_REQUIREMENTS` list + +- required for your connector to work need to go to `MAIN_REQUIREMENTS` list. +- required for the testing need to go to `TEST_REQUIREMENTS` list ### Publishing a new version of the connector + You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? + 1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-coinmarketcap test` 2. Bump the connector version in `metadata.yaml`: increment the `dockerImageTag` value. Please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors). 3. Make sure the `metadata.yaml` content is up to date. @@ -64,4 +76,3 @@ You've checked out the repo, implemented a million dollar feature, and you're re 5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention). 6. Pat yourself on the back for being an awesome contributor. 7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. 
- diff --git a/airbyte-integrations/connectors/source-commcare/README.md b/airbyte-integrations/connectors/source-commcare/README.md index af931d5d8e5..86ffe4cf0f1 100644 --- a/airbyte-integrations/connectors/source-commcare/README.md +++ b/airbyte-integrations/connectors/source-commcare/README.md @@ -6,23 +6,28 @@ For information about how to use this connector within Airbyte, see [the documen ## Local development ### Prerequisites + **To iterate on this connector, make sure to complete this prerequisites section.** #### Minimum Python version required `= 3.9.0` #### Build & Activate Virtual Environment and install dependencies + From this connector directory, create a virtual environment: + ``` python -m venv .venv ``` This will generate a virtualenv for this module in `.venv/`. Make sure this venv is active in your development environment of choice. To activate it from the terminal, run: + ``` source .venv/bin/activate pip install -r requirements.txt pip install '.[tests]' ``` + If you are in an IDE, follow your IDE's instructions to activate the virtualenv. Note that while we are installing dependencies from `requirements.txt`, you should only edit `setup.py` for your dependencies. `requirements.txt` is @@ -31,6 +36,7 @@ If this is mumbo jumbo to you, don't worry about it, just put your deps in `setu should work as you expect. #### Create credentials + **If you are a community contributor**, follow the instructions in the [documentation](https://docs.airbyte.io/integrations/sources/commcare) to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `source_commcare/spec.yaml` file. Note that any directory named `secrets` is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. @@ -40,6 +46,7 @@ See `integration_tests/sample_config.json` for a sample config file. and place them into `secrets/config.json`. ### Locally running the connector + ``` python main.py spec python main.py check --config secrets/config.json @@ -49,9 +56,10 @@ python main.py read --config secrets/config.json --catalog integration_tests/con ### Locally running the connector docker image - #### Build + **Via [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md) (recommended):** + ```bash airbyte-ci connectors --name=source-commcare build ``` @@ -59,12 +67,15 @@ airbyte-ci connectors --name=source-commcare build An image will be built with the tag `airbyte/source-commcare:dev`. **Via `docker build`:** + ```bash docker build -t airbyte/source-commcare:dev . ``` #### Run + Then run any of the connector commands as follows: + ``` docker run --rm airbyte/source-commcare:dev spec docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-commcare:dev check --config /secrets/config.json @@ -73,23 +84,30 @@ docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integrat ``` ## Testing + You can run our full test suite locally using [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md): + ```bash airbyte-ci connectors --name=source-commcare test ``` ### Customizing acceptance Tests + Customize `acceptance-test-config.yml` file to configure tests. See [Connector Acceptance Tests](https://docs.airbyte.com/connector-development/testing-connectors/connector-acceptance-tests-reference) for more information. 
If your connector requires to create or destroy resources for use during acceptance tests create fixtures for it and place them inside integration_tests/acceptance.py. ## Dependency Management + All of your dependencies should go in `setup.py`, NOT `requirements.txt`. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development. We split dependencies between two groups, dependencies that are: -* required for your connector to work need to go to `MAIN_REQUIREMENTS` list. -* required for the testing need to go to `TEST_REQUIREMENTS` list + +- required for your connector to work need to go to `MAIN_REQUIREMENTS` list. +- required for the testing need to go to `TEST_REQUIREMENTS` list ### Publishing a new version of the connector + You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? + 1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-commcare test` 2. Bump the connector version in `metadata.yaml`: increment the `dockerImageTag` value. Please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors). 3. Make sure the `metadata.yaml` content is up to date. @@ -97,4 +115,3 @@ You've checked out the repo, implemented a million dollar feature, and you're re 5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention). 6. Pat yourself on the back for being an awesome contributor. 7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. - diff --git a/airbyte-integrations/connectors/source-commcare/source_commcare/schemas/TODO.md b/airbyte-integrations/connectors/source-commcare/source_commcare/schemas/TODO.md index cf1efadb3c9..0037aeb60d8 100644 --- a/airbyte-integrations/connectors/source-commcare/source_commcare/schemas/TODO.md +++ b/airbyte-integrations/connectors/source-commcare/source_commcare/schemas/TODO.md @@ -1,20 +1,25 @@ # TODO: Define your stream schemas -Your connector must describe the schema of each stream it can output using [JSONSchema](https://json-schema.org). -The simplest way to do this is to describe the schema of your streams using one `.json` file per stream. You can also dynamically generate the schema of your stream in code, or you can combine both approaches: start with a `.json` file and dynamically add properties to it. - +Your connector must describe the schema of each stream it can output using [JSONSchema](https://json-schema.org). + +The simplest way to do this is to describe the schema of your streams using one `.json` file per stream. You can also dynamically generate the schema of your stream in code, or you can combine both approaches: start with a `.json` file and dynamically add properties to it. + The schema of a stream is the return value of `Stream.get_json_schema`. - + ## Static schemas + By default, `Stream.get_json_schema` reads a `.json` file in the `schemas/` directory whose name is equal to the value of the `Stream.name` property. In turn `Stream.name` by default returns the name of the class in snake case. Therefore, if you have a class `class EmployeeBenefits(HttpStream)` the default behavior will look for a file called `schemas/employee_benefits.json`. You can override any of these behaviors as you need. 
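To make the naming convention above concrete: with the default behavior, the snake-cased class name selects the schema file, so a hypothetical stream like the one below would load `schemas/employee_benefits.json` (the endpoint and fields are invented for illustration):

```python
# Illustrative only: shows how a stream class name maps to its schema file.
from airbyte_cdk.sources.streams.http import HttpStream


class EmployeeBenefits(HttpStream):
    # Stream.name defaults to the snake-cased class name ("employee_benefits"),
    # so get_json_schema() looks for schemas/employee_benefits.json.
    # Overriding `name = "benefits"` would make it load schemas/benefits.json instead.
    url_base = "https://api.example.com/"  # assumed endpoint
    primary_key = "id"

    def path(self, **kwargs) -> str:
        return "employee_benefits"

    def next_page_token(self, response):
        return None  # single page, for the sake of the example

    def parse_response(self, response, **kwargs):
        yield from response.json()
```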
Important note: any objects referenced via `$ref` should be placed in the `shared/` directory in their own `.json` files. - + ## Dynamic schemas + If you'd rather define your schema in code, override `Stream.get_json_schema` in your stream class to return a `dict` describing the schema using [JSONSchema](https://json-schema.org). -## Dynamically modifying static schemas -Override `Stream.get_json_schema` to run the default behavior, edit the returned value, then return the edited value: +## Dynamically modifying static schemas + +Override `Stream.get_json_schema` to run the default behavior, edit the returned value, then return the edited value: + ``` def get_json_schema(self): schema = super().get_json_schema() @@ -22,4 +27,4 @@ def get_json_schema(self): return schema ``` -Delete this file once you're done. Or don't. Up to you :) +Delete this file once you're done. Or don't. Up to you :) diff --git a/airbyte-integrations/connectors/source-commercetools/README.md b/airbyte-integrations/connectors/source-commercetools/README.md index 20b53f7ec0d..4b8a833509d 100644 --- a/airbyte-integrations/connectors/source-commercetools/README.md +++ b/airbyte-integrations/connectors/source-commercetools/README.md @@ -6,6 +6,7 @@ For information about how to use this connector within Airbyte, see [the documen ## Local development #### Create credentials + **If you are a community contributor**, follow the instructions in the [documentation](https://docs.airbyte.com/integrations/sources/commercetools) to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `source_commercetools/spec.yaml` file. Note that any directory named `secrets` is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. @@ -16,9 +17,10 @@ and place them into `secrets/config.json`. ### Locally running the connector docker image - #### Build + **Via [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md) (recommended):** + ```bash airbyte-ci connectors --name=source-commercetools build ``` @@ -26,12 +28,15 @@ airbyte-ci connectors --name=source-commercetools build An image will be built with the tag `airbyte/source-commercetools:dev`. **Via `docker build`:** + ```bash docker build -t airbyte/source-commercetools:dev . ``` #### Run + Then run any of the connector commands as follows: + ``` docker run --rm airbyte/source-commercetools:dev spec docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-commercetools:dev check --config /secrets/config.json @@ -40,23 +45,30 @@ docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integrat ``` ## Testing + You can run our full test suite locally using [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md): + ```bash airbyte-ci connectors --name=source-commercetools test ``` ### Customizing acceptance Tests + Customize `acceptance-test-config.yml` file to configure tests. See [Connector Acceptance Tests](https://docs.airbyte.com/connector-development/testing-connectors/connector-acceptance-tests-reference) for more information. If your connector requires to create or destroy resources for use during acceptance tests create fixtures for it and place them inside integration_tests/acceptance.py. ## Dependency Management + All of your dependencies should go in `setup.py`, NOT `requirements.txt`. 
The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development. We split dependencies between two groups, dependencies that are: -* required for your connector to work need to go to `MAIN_REQUIREMENTS` list. -* required for the testing need to go to `TEST_REQUIREMENTS` list + +- required for your connector to work need to go to `MAIN_REQUIREMENTS` list. +- required for the testing need to go to `TEST_REQUIREMENTS` list ### Publishing a new version of the connector + You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? + 1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-commercetools test` 2. Bump the connector version in `metadata.yaml`: increment the `dockerImageTag` value. Please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors). 3. Make sure the `metadata.yaml` content is up to date. @@ -64,4 +76,3 @@ You've checked out the repo, implemented a million dollar feature, and you're re 5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention). 6. Pat yourself on the back for being an awesome contributor. 7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. - diff --git a/airbyte-integrations/connectors/source-configcat/README.md b/airbyte-integrations/connectors/source-configcat/README.md index 9de06ba31ce..e1132adae0a 100644 --- a/airbyte-integrations/connectors/source-configcat/README.md +++ b/airbyte-integrations/connectors/source-configcat/README.md @@ -6,6 +6,7 @@ For information about how to use this connector within Airbyte, see [the documen ## Local development #### Create credentials + **If you are a community contributor**, follow the instructions in the [documentation](https://docs.airbyte.io/integrations/sources/configcat) to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `source_configcat/spec.yaml` file. Note that any directory named `secrets` is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. @@ -16,9 +17,10 @@ and place them into `secrets/config.json`. ### Locally running the connector docker image - #### Build + **Via [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md) (recommended):** + ```bash airbyte-ci connectors --name=source-configcat build ``` @@ -26,12 +28,15 @@ airbyte-ci connectors --name=source-configcat build An image will be built with the tag `airbyte/source-configcat:dev`. **Via `docker build`:** + ```bash docker build -t airbyte/source-configcat:dev . 
``` #### Run + Then run any of the connector commands as follows: + ``` docker run --rm airbyte/source-configcat:dev spec docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-configcat:dev check --config /secrets/config.json @@ -40,23 +45,30 @@ docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integrat ``` ## Testing + You can run our full test suite locally using [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md): + ```bash airbyte-ci connectors --name=source-configcat test ``` ### Customizing acceptance Tests + Customize `acceptance-test-config.yml` file to configure tests. See [Connector Acceptance Tests](https://docs.airbyte.com/connector-development/testing-connectors/connector-acceptance-tests-reference) for more information. If your connector requires to create or destroy resources for use during acceptance tests create fixtures for it and place them inside integration_tests/acceptance.py. ## Dependency Management + All of your dependencies should go in `setup.py`, NOT `requirements.txt`. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development. We split dependencies between two groups, dependencies that are: -* required for your connector to work need to go to `MAIN_REQUIREMENTS` list. -* required for the testing need to go to `TEST_REQUIREMENTS` list + +- required for your connector to work need to go to `MAIN_REQUIREMENTS` list. +- required for the testing need to go to `TEST_REQUIREMENTS` list ### Publishing a new version of the connector + You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? + 1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-configcat test` 2. Bump the connector version in `metadata.yaml`: increment the `dockerImageTag` value. Please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors). 3. Make sure the `metadata.yaml` content is up to date. @@ -64,4 +76,3 @@ You've checked out the repo, implemented a million dollar feature, and you're re 5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention). 6. Pat yourself on the back for being an awesome contributor. 7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. - diff --git a/airbyte-integrations/connectors/source-confluence/README.md b/airbyte-integrations/connectors/source-confluence/README.md index 0546e4376f3..f2ec95bf7c9 100644 --- a/airbyte-integrations/connectors/source-confluence/README.md +++ b/airbyte-integrations/connectors/source-confluence/README.md @@ -1,31 +1,32 @@ # Confluence source connector - This is the repository for the Confluence source connector, written in Python. For information about how to use this connector within Airbyte, see [the documentation](https://docs.airbyte.com/integrations/sources/confluence). 
## Local development ### Prerequisites -* Python (~=3.9) -* Poetry (~=1.7) - installation instructions [here](https://python-poetry.org/docs/#installation) +- Python (~=3.9) +- Poetry (~=1.7) - installation instructions [here](https://python-poetry.org/docs/#installation) ### Installing the connector + From this connector directory, run: + ```bash poetry install --with dev ``` - ### Create credentials + **If you are a community contributor**, follow the instructions in the [documentation](https://docs.airbyte.com/integrations/sources/confluence) to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `source_confluence/spec.yaml` file. Note that any directory named `secrets` is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. See `sample_files/sample_config.json` for a sample config file. - ### Locally running the connector + ``` poetry run source-confluence spec poetry run source-confluence check --config secrets/config.json @@ -34,23 +35,28 @@ poetry run source-confluence read --config secrets/config.json --catalog sample_ ``` ### Running unit tests + To run unit tests locally, from the connector directory run: + ``` poetry run pytest unit_tests ``` ### Building the docker image + 1. Install [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md) 2. Run the following command to build the docker image: + ```bash airbyte-ci connectors --name=source-confluence build ``` An image will be available on your host with the tag `airbyte/source-confluence:dev`. - ### Running as a docker container + Then run any of the connector commands as follows: + ``` docker run --rm airbyte/source-confluence:dev spec docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-confluence:dev check --config /secrets/config.json @@ -59,18 +65,23 @@ docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integrat ``` ### Running our CI test suite + You can run our full test suite locally using [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md): + ```bash airbyte-ci connectors --name=source-confluence test ``` ### Customizing acceptance Tests + Customize `acceptance-test-config.yml` file to configure acceptance tests. See [Connector Acceptance Tests](https://docs.airbyte.com/connector-development/testing-connectors/connector-acceptance-tests-reference) for more information. If your connector requires to create or destroy resources for use during acceptance tests create fixtures for it and place them inside integration_tests/acceptance.py. ### Dependency Management -All of your dependencies should be managed via Poetry. + +All of your dependencies should be managed via Poetry. To add a new dependency, run: + ```bash poetry add ``` @@ -78,14 +89,16 @@ poetry add Please commit the changes to `pyproject.toml` and `poetry.lock` files. ## Publishing a new version of the connector + You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? + 1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-confluence test` -2. 
Bump the connector version (please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors)): - - bump the `dockerImageTag` value in in `metadata.yaml` - - bump the `version` value in `pyproject.toml` +2. Bump the connector version (please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors)): + - bump the `dockerImageTag` value in in `metadata.yaml` + - bump the `version` value in `pyproject.toml` 3. Make sure the `metadata.yaml` content is up to date. 4. Make sure the connector documentation and its changelog is up to date (`docs/integrations/sources/confluence.md`). 5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention). 6. Pat yourself on the back for being an awesome contributor. 7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. -8. Once your PR is merged, the new version of the connector will be automatically published to Docker Hub and our connector registry. \ No newline at end of file +8. Once your PR is merged, the new version of the connector will be automatically published to Docker Hub and our connector registry. diff --git a/airbyte-integrations/connectors/source-convertkit/README.md b/airbyte-integrations/connectors/source-convertkit/README.md index d9e9cf2ac88..cb9a1cfc01e 100644 --- a/airbyte-integrations/connectors/source-convertkit/README.md +++ b/airbyte-integrations/connectors/source-convertkit/README.md @@ -6,6 +6,7 @@ For information about how to use this connector within Airbyte, see [the documen ## Local development #### Create credentials + **If you are a community contributor**, follow the instructions in the [documentation](https://docs.airbyte.io/integrations/sources/convertkit) to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `source_convertkit/spec.yaml` file. Note that any directory named `secrets` is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. @@ -16,9 +17,10 @@ and place them into `secrets/config.json`. ### Locally running the connector docker image - #### Build + **Via [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md) (recommended):** + ```bash airbyte-ci connectors --name=source-convertkit build ``` @@ -26,12 +28,15 @@ airbyte-ci connectors --name=source-convertkit build An image will be built with the tag `airbyte/source-convertkit:dev`. **Via `docker build`:** + ```bash docker build -t airbyte/source-convertkit:dev . ``` #### Run + Then run any of the connector commands as follows: + ``` docker run --rm airbyte/source-convertkit:dev spec docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-convertkit:dev check --config /secrets/config.json @@ -40,23 +45,30 @@ docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integrat ``` ## Testing + You can run our full test suite locally using [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md): + ```bash airbyte-ci connectors --name=source-convertkit test ``` ### Customizing acceptance Tests + Customize `acceptance-test-config.yml` file to configure tests. 
See [Connector Acceptance Tests](https://docs.airbyte.com/connector-development/testing-connectors/connector-acceptance-tests-reference) for more information. If your connector requires to create or destroy resources for use during acceptance tests create fixtures for it and place them inside integration_tests/acceptance.py. ## Dependency Management + All of your dependencies should go in `setup.py`, NOT `requirements.txt`. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development. We split dependencies between two groups, dependencies that are: -* required for your connector to work need to go to `MAIN_REQUIREMENTS` list. -* required for the testing need to go to `TEST_REQUIREMENTS` list + +- required for your connector to work need to go to `MAIN_REQUIREMENTS` list. +- required for the testing need to go to `TEST_REQUIREMENTS` list ### Publishing a new version of the connector + You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? + 1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-convertkit test` 2. Bump the connector version in `metadata.yaml`: increment the `dockerImageTag` value. Please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors). 3. Make sure the `metadata.yaml` content is up to date. @@ -64,4 +76,3 @@ You've checked out the repo, implemented a million dollar feature, and you're re 5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention). 6. Pat yourself on the back for being an awesome contributor. 7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. - diff --git a/airbyte-integrations/connectors/source-convex/README.md b/airbyte-integrations/connectors/source-convex/README.md index 50d5c8f1977..84fe147cce5 100644 --- a/airbyte-integrations/connectors/source-convex/README.md +++ b/airbyte-integrations/connectors/source-convex/README.md @@ -6,23 +6,28 @@ For information about how to use this connector within Airbyte, see [the documen ## Local development ### Prerequisites + **To iterate on this connector, make sure to complete this prerequisites section.** #### Minimum Python version required `= 3.9.0` #### Build & Activate Virtual Environment and install dependencies + From this connector directory, create a virtual environment: + ``` python -m venv .venv ``` This will generate a virtualenv for this module in `.venv/`. Make sure this venv is active in your development environment of choice. To activate it from the terminal, run: + ``` source .venv/bin/activate pip install -r requirements.txt pip install '.[tests]' ``` + If you are in an IDE, follow your IDE's instructions to activate the virtualenv. Note that while we are installing dependencies from `requirements.txt`, you should only edit `setup.py` for your dependencies. `requirements.txt` is @@ -31,6 +36,7 @@ If this is mumbo jumbo to you, don't worry about it, just put your deps in `setu should work as you expect. #### Create credentials + **If you are a community contributor**, follow the instructions in the [documentation](https://docs.airbyte.io/integrations/sources/convex) to generate the necessary credentials. 
Then create a file `secrets/config.json` conforming to the `source_convex/spec.yaml` file. Note that any directory named `secrets` is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. @@ -40,6 +46,7 @@ See `integration_tests/sample_config.json` for a sample config file. and place them into `secrets/config.json`. ### Locally running the connector + ``` python main.py spec python main.py check --config secrets/config.json @@ -49,9 +56,10 @@ python main.py read --config secrets/config.json --catalog integration_tests/con ### Locally running the connector docker image - #### Build + **Via [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md) (recommended):** + ```bash airbyte-ci connectors --name=source-convex build ``` @@ -59,12 +67,15 @@ airbyte-ci connectors --name=source-convex build An image will be built with the tag `airbyte/source-convex:dev`. **Via `docker build`:** + ```bash docker build -t airbyte/source-convex:dev . ``` #### Run + Then run any of the connector commands as follows: + ``` docker run --rm airbyte/source-convex:dev spec docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-convex:dev check --config /secrets/config.json @@ -73,23 +84,30 @@ docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integrat ``` ## Testing + You can run our full test suite locally using [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md): + ```bash airbyte-ci connectors --name=source-convex test ``` ### Customizing acceptance Tests + Customize `acceptance-test-config.yml` file to configure tests. See [Connector Acceptance Tests](https://docs.airbyte.com/connector-development/testing-connectors/connector-acceptance-tests-reference) for more information. If your connector requires to create or destroy resources for use during acceptance tests create fixtures for it and place them inside integration_tests/acceptance.py. ## Dependency Management + All of your dependencies should go in `setup.py`, NOT `requirements.txt`. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development. We split dependencies between two groups, dependencies that are: -* required for your connector to work need to go to `MAIN_REQUIREMENTS` list. -* required for the testing need to go to `TEST_REQUIREMENTS` list + +- required for your connector to work need to go to `MAIN_REQUIREMENTS` list. +- required for the testing need to go to `TEST_REQUIREMENTS` list ### Publishing a new version of the connector + You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? + 1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-convex test` 2. Bump the connector version in `metadata.yaml`: increment the `dockerImageTag` value. Please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors). 3. Make sure the `metadata.yaml` content is up to date. @@ -97,4 +115,3 @@ You've checked out the repo, implemented a million dollar feature, and you're re 5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention). 6. Pat yourself on the back for being an awesome contributor. 7. 
Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. - diff --git a/airbyte-integrations/connectors/source-copper/README.md b/airbyte-integrations/connectors/source-copper/README.md index 2fb6d154486..da98444f091 100644 --- a/airbyte-integrations/connectors/source-copper/README.md +++ b/airbyte-integrations/connectors/source-copper/README.md @@ -1,31 +1,32 @@ # Copper source connector - This is the repository for the Copper source connector, written in Python. For information about how to use this connector within Airbyte, see [the documentation](https://docs.airbyte.com/integrations/sources/copper). ## Local development ### Prerequisites -* Python (~=3.9) -* Poetry (~=1.7) - installation instructions [here](https://python-poetry.org/docs/#installation) +- Python (~=3.9) +- Poetry (~=1.7) - installation instructions [here](https://python-poetry.org/docs/#installation) ### Installing the connector + From this connector directory, run: + ```bash poetry install --with dev ``` - ### Create credentials + **If you are a community contributor**, follow the instructions in the [documentation](https://docs.airbyte.com/integrations/sources/copper) to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `source_copper/spec.yaml` file. Note that any directory named `secrets` is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. See `sample_files/sample_config.json` for a sample config file. - ### Locally running the connector + ``` poetry run source-copper spec poetry run source-copper check --config secrets/config.json @@ -34,23 +35,28 @@ poetry run source-copper read --config secrets/config.json --catalog sample_file ``` ### Running unit tests + To run unit tests locally, from the connector directory run: + ``` poetry run pytest unit_tests ``` ### Building the docker image + 1. Install [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md) 2. Run the following command to build the docker image: + ```bash airbyte-ci connectors --name=source-copper build ``` An image will be available on your host with the tag `airbyte/source-copper:dev`. - ### Running as a docker container + Then run any of the connector commands as follows: + ``` docker run --rm airbyte/source-copper:dev spec docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-copper:dev check --config /secrets/config.json @@ -59,18 +65,23 @@ docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integrat ``` ### Running our CI test suite + You can run our full test suite locally using [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md): + ```bash airbyte-ci connectors --name=source-copper test ``` ### Customizing acceptance Tests + Customize `acceptance-test-config.yml` file to configure acceptance tests. See [Connector Acceptance Tests](https://docs.airbyte.com/connector-development/testing-connectors/connector-acceptance-tests-reference) for more information. If your connector requires to create or destroy resources for use during acceptance tests create fixtures for it and place them inside integration_tests/acceptance.py. ### Dependency Management -All of your dependencies should be managed via Poetry. + +All of your dependencies should be managed via Poetry. 
To add a new dependency, run: + ```bash poetry add ``` @@ -78,14 +89,16 @@ poetry add Please commit the changes to `pyproject.toml` and `poetry.lock` files. ## Publishing a new version of the connector + You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? + 1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-copper test` -2. Bump the connector version (please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors)): - - bump the `dockerImageTag` value in in `metadata.yaml` - - bump the `version` value in `pyproject.toml` +2. Bump the connector version (please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors)): + - bump the `dockerImageTag` value in in `metadata.yaml` + - bump the `version` value in `pyproject.toml` 3. Make sure the `metadata.yaml` content is up to date. 4. Make sure the connector documentation and its changelog is up to date (`docs/integrations/sources/copper.md`). 5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention). 6. Pat yourself on the back for being an awesome contributor. 7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. -8. Once your PR is merged, the new version of the connector will be automatically published to Docker Hub and our connector registry. \ No newline at end of file +8. Once your PR is merged, the new version of the connector will be automatically published to Docker Hub and our connector registry. diff --git a/airbyte-integrations/connectors/source-customer-io/README.md b/airbyte-integrations/connectors/source-customer-io/README.md index 0f0790855f0..2994836d17a 100644 --- a/airbyte-integrations/connectors/source-customer-io/README.md +++ b/airbyte-integrations/connectors/source-customer-io/README.md @@ -6,6 +6,7 @@ For information about how to use this connector within Airbyte, see [the documen ## Local development #### Create credentials + **If you are a community contributor**, follow the instructions in the [documentation](https://docs.airbyte.com/integrations/sources/customer-io) to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `source_customer_io/spec.yaml` file. Note that any directory named `secrets` is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. @@ -16,9 +17,10 @@ and place them into `secrets/config.json`. ### Locally running the connector docker image - #### Build + **Via [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md) (recommended):** + ```bash airbyte-ci connectors --name=source-customer-io build ``` @@ -26,12 +28,15 @@ airbyte-ci connectors --name=source-customer-io build An image will be built with the tag `airbyte/source-customer-io:dev`. **Via `docker build`:** + ```bash docker build -t airbyte/source-customer-io:dev . 
``` #### Run + Then run any of the connector commands as follows: + ``` docker run --rm airbyte/source-customer-io:dev spec docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-customer-io:dev check --config /secrets/config.json @@ -40,23 +45,30 @@ docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integrat ``` ## Testing + You can run our full test suite locally using [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md): + ```bash airbyte-ci connectors --name=source-customer-io test ``` ### Customizing acceptance Tests + Customize `acceptance-test-config.yml` file to configure tests. See [Connector Acceptance Tests](https://docs.airbyte.com/connector-development/testing-connectors/connector-acceptance-tests-reference) for more information. If your connector requires to create or destroy resources for use during acceptance tests create fixtures for it and place them inside integration_tests/acceptance.py. ## Dependency Management + All of your dependencies should go in `setup.py`, NOT `requirements.txt`. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development. We split dependencies between two groups, dependencies that are: -* required for your connector to work need to go to `MAIN_REQUIREMENTS` list. -* required for the testing need to go to `TEST_REQUIREMENTS` list + +- required for your connector to work need to go to `MAIN_REQUIREMENTS` list. +- required for the testing need to go to `TEST_REQUIREMENTS` list ### Publishing a new version of the connector + You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? + 1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-customer-io test` 2. Bump the connector version in `metadata.yaml`: increment the `dockerImageTag` value. Please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors). 3. Make sure the `metadata.yaml` content is up to date. @@ -64,4 +76,3 @@ You've checked out the repo, implemented a million dollar feature, and you're re 5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention). 6. Pat yourself on the back for being an awesome contributor. 7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. - diff --git a/airbyte-integrations/connectors/source-datadog/README.md b/airbyte-integrations/connectors/source-datadog/README.md index 1cad0882c4f..224b773f636 100644 --- a/airbyte-integrations/connectors/source-datadog/README.md +++ b/airbyte-integrations/connectors/source-datadog/README.md @@ -6,6 +6,7 @@ For information about how to use this connector within Airbyte, see [the documen ## Local development #### Create credentials + **If you are a community contributor**, follow the instructions in the [documentation](https://docs.airbyte.com/integrations/sources/datadog) to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `source_datadog/spec.yaml` file. Note that any directory named `secrets` is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. @@ -16,9 +17,10 @@ and place them into `secrets/config.json`. 
### Locally running the connector docker image - #### Build + **Via [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md) (recommended):** + ```bash airbyte-ci connectors --name=source-datadog build ``` @@ -26,12 +28,15 @@ airbyte-ci connectors --name=source-datadog build An image will be built with the tag `airbyte/source-datadog:dev`. **Via `docker build`:** + ```bash docker build -t airbyte/source-datadog:dev . ``` #### Run + Then run any of the connector commands as follows: + ``` docker run --rm airbyte/source-datadog:dev spec docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-datadog:dev check --config /secrets/config.json @@ -40,23 +45,30 @@ docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integrat ``` ## Testing + You can run our full test suite locally using [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md): + ```bash airbyte-ci connectors --name=source-datadog test ``` ### Customizing acceptance Tests + Customize `acceptance-test-config.yml` file to configure tests. See [Connector Acceptance Tests](https://docs.airbyte.com/connector-development/testing-connectors/connector-acceptance-tests-reference) for more information. If your connector requires to create or destroy resources for use during acceptance tests create fixtures for it and place them inside integration_tests/acceptance.py. ## Dependency Management + All of your dependencies should go in `setup.py`, NOT `requirements.txt`. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development. We split dependencies between two groups, dependencies that are: -* required for your connector to work need to go to `MAIN_REQUIREMENTS` list. -* required for the testing need to go to `TEST_REQUIREMENTS` list + +- required for your connector to work need to go to `MAIN_REQUIREMENTS` list. +- required for the testing need to go to `TEST_REQUIREMENTS` list ### Publishing a new version of the connector + You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? + 1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-datadog test` 2. Bump the connector version in `metadata.yaml`: increment the `dockerImageTag` value. Please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors). 3. Make sure the `metadata.yaml` content is up to date. @@ -64,4 +76,3 @@ You've checked out the repo, implemented a million dollar feature, and you're re 5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention). 6. Pat yourself on the back for being an awesome contributor. 7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. - diff --git a/airbyte-integrations/connectors/source-datascope/BOOTSTRAP.md b/airbyte-integrations/connectors/source-datascope/BOOTSTRAP.md index 68a89e9425a..46a36c3bae2 100644 --- a/airbyte-integrations/connectors/source-datascope/BOOTSTRAP.md +++ b/airbyte-integrations/connectors/source-datascope/BOOTSTRAP.md @@ -1,10 +1,12 @@ -# DataScope +# DataScope + DataScope is a mobile solution that helps you collect data offline, manage field teams, and share business insights. 
Use the intuitive Form Builder to create your forms, and then analyze the data you've collected via powerful and personalized dashboards. The streams implemented allows you to pull data from the following DataScope objects: -- Locations -- Answers + +- Locations +- Answers - Lists - Notifications -For more information about the DataScope API, see the [DataScope API documentation](https://dscope.github.io/docs/). \ No newline at end of file +For more information about the DataScope API, see the [DataScope API documentation](https://dscope.github.io/docs/). diff --git a/airbyte-integrations/connectors/source-datascope/README.md b/airbyte-integrations/connectors/source-datascope/README.md index 226989dc0f6..ff706a00ec4 100644 --- a/airbyte-integrations/connectors/source-datascope/README.md +++ b/airbyte-integrations/connectors/source-datascope/README.md @@ -6,6 +6,7 @@ For information about how to use this connector within Airbyte, see [the documen ## Local development #### Create credentials + **If you are a community contributor**, follow the instructions in the [documentation](https://docs.airbyte.io/integrations/sources/datascope) to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `source_datascope/spec.yaml` file. Note that any directory named `secrets` is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. @@ -16,9 +17,10 @@ and place them into `secrets/config.json`. ### Locally running the connector docker image - #### Build + **Via [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md) (recommended):** + ```bash airbyte-ci connectors --name=source-datascope build ``` @@ -26,12 +28,15 @@ airbyte-ci connectors --name=source-datascope build An image will be built with the tag `airbyte/source-datascope:dev`. **Via `docker build`:** + ```bash docker build -t airbyte/source-datascope:dev . ``` #### Run + Then run any of the connector commands as follows: + ``` docker run --rm airbyte/source-datascope:dev spec docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-datascope:dev check --config /secrets/config.json @@ -40,23 +45,30 @@ docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integrat ``` ## Testing + You can run our full test suite locally using [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md): + ```bash airbyte-ci connectors --name=source-datascope test ``` ### Customizing acceptance Tests + Customize `acceptance-test-config.yml` file to configure tests. See [Connector Acceptance Tests](https://docs.airbyte.com/connector-development/testing-connectors/connector-acceptance-tests-reference) for more information. If your connector requires to create or destroy resources for use during acceptance tests create fixtures for it and place them inside integration_tests/acceptance.py. ## Dependency Management + All of your dependencies should go in `setup.py`, NOT `requirements.txt`. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development. We split dependencies between two groups, dependencies that are: -* required for your connector to work need to go to `MAIN_REQUIREMENTS` list. -* required for the testing need to go to `TEST_REQUIREMENTS` list + +- required for your connector to work need to go to `MAIN_REQUIREMENTS` list. 
+- required for the testing need to go to `TEST_REQUIREMENTS` list ### Publishing a new version of the connector + You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? + 1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-datascope test` 2. Bump the connector version in `metadata.yaml`: increment the `dockerImageTag` value. Please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors). 3. Make sure the `metadata.yaml` content is up to date. @@ -64,4 +76,3 @@ You've checked out the repo, implemented a million dollar feature, and you're re 5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention). 6. Pat yourself on the back for being an awesome contributor. 7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. - diff --git a/airbyte-integrations/connectors/source-db2/CHANGELOG.md b/airbyte-integrations/connectors/source-db2/CHANGELOG.md index 2482683b688..d833db383b4 100644 --- a/airbyte-integrations/connectors/source-db2/CHANGELOG.md +++ b/airbyte-integrations/connectors/source-db2/CHANGELOG.md @@ -1,4 +1,5 @@ # Changelog ## 0.1.0 + Initial Release. diff --git a/airbyte-integrations/connectors/source-db2/README.md b/airbyte-integrations/connectors/source-db2/README.md index d4606b29c32..5142fdcc4c9 100644 --- a/airbyte-integrations/connectors/source-db2/README.md +++ b/airbyte-integrations/connectors/source-db2/README.md @@ -1,10 +1,11 @@ # IBM DB2 Source ## Documentation -* [User Documentation](https://docs.airbyte.io/integrations/sources/db2) +- [User Documentation](https://docs.airbyte.io/integrations/sources/db2) ## Integration tests + For acceptance tests run `./gradlew :airbyte-integrations:connectors:db2:integrationTest` diff --git a/airbyte-integrations/connectors/source-declarative-manifest/README.md b/airbyte-integrations/connectors/source-declarative-manifest/README.md index f7cfc6b502b..c0b1466d7c3 100644 --- a/airbyte-integrations/connectors/source-declarative-manifest/README.md +++ b/airbyte-integrations/connectors/source-declarative-manifest/README.md @@ -11,25 +11,27 @@ an interface to the low-code CDK and as such, should not be modified without a c ## Local development ### Prerequisites -* Python (~=3.9) -* Poetry (~=1.7) - installation instructions [here](https://python-poetry.org/docs/#installation) +- Python (~=3.9) +- Poetry (~=1.7) - installation instructions [here](https://python-poetry.org/docs/#installation) ### Installing the connector + From this connector directory, run: + ```bash poetry install ``` - ### Create credentials + The credentials for source-declarative-manifest are a little different. Your `config` will need to contain the injected declarative manifest, as indicated in the `spec`. It will also need to contain the fields that the spec coming out of the manifest requires. An example is available in `integration_tests/pokeapi_config.json`. To use this example in the following instructions, copy this file to `secrets/config.json`. 
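For orientation, here is a rough, non-authoritative sketch of how such a config could be assembled. The `__injected_declarative_manifest` key name, the `manifest.yaml` path, and the `api_key` field are assumptions for illustration only; `integration_tests/pokeapi_config.json` remains the authoritative example.

```
# Hedged sketch: build a config that bundles a low-code manifest plus the
# fields that manifest's spec requires. Assumes PyYAML is available.
import json

import yaml

with open("manifest.yaml") as f:  # hypothetical path to a low-code manifest
    manifest = yaml.safe_load(f)

config = {
    "__injected_declarative_manifest": manifest,  # assumed injection key
    "api_key": "...",  # placeholder for whatever the manifest's spec requires
}

with open("secrets/config.json", "w") as f:
    json.dump(config, f, indent=2)
```
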
- ### Locally running the connector + ``` poetry run source-declarative-manifest spec poetry run source-declarative-manifest check --config secrets/config.json @@ -38,23 +40,28 @@ poetry run source-declarative-manifest read --config secrets/config.json --catal ``` ### Running unit tests + To run unit tests locally, from the connector directory run: + ``` poetry run pytest unit_tests ``` ### Building the docker image + 1. Install [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md) 2. Run the following command to build the docker image: + ```bash airbyte-ci connectors --name=source-declarative-manifest build ``` An image will be available on your host with the tag `airbyte/source-declarative-manifest:dev`. - ### Running as a docker container + Then run any of the connector commands as follows: + ``` docker run --rm airbyte/source-declarative-manifest:dev spec docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-declarative-manifest:dev check --config /secrets/config.json @@ -63,22 +70,25 @@ docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integrat ``` ### Running our CI test suite + You can run our full test suite locally using [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md): + ```bash airbyte-ci connectors --name=source-declarative-manifest test ``` + This source does not currently pass the full test suite. - ### Dependency Management + The manifest declarative source is built to be an interface to the low-code CDK source. This means that this source should not have any production dependencies other than the Airbyte Python CDK. If for some reason you feel that a new dependency is needed, you likely want to add it to the CDK instead. It is expected that a given version of the source-declarative-manifest connector corresponds to the same version in its CDK dependency. - ## Publishing a new version of the connector + New versions of this connector should only be published (automatically) via the manual Airbyte CDK release process. If you want to make a change to this connector that is not a result of a CDK change and a corresponding -CDK dependency bump, please reach out to the Connector Extensibility team for guidance. \ No newline at end of file +CDK dependency bump, please reach out to the Connector Extensibility team for guidance. diff --git a/airbyte-integrations/connectors/source-delighted/README.md b/airbyte-integrations/connectors/source-delighted/README.md index 30071549b3d..d5f1d803fcc 100644 --- a/airbyte-integrations/connectors/source-delighted/README.md +++ b/airbyte-integrations/connectors/source-delighted/README.md @@ -1,31 +1,32 @@ # Delighted source connector - This is the repository for the Delighted source connector, written in Python. For information about how to use this connector within Airbyte, see [the documentation](https://docs.airbyte.com/integrations/sources/delighted). 
## Local development ### Prerequisites -* Python (~=3.9) -* Poetry (~=1.7) - installation instructions [here](https://python-poetry.org/docs/#installation) +- Python (~=3.9) +- Poetry (~=1.7) - installation instructions [here](https://python-poetry.org/docs/#installation) ### Installing the connector + From this connector directory, run: + ```bash poetry install --with dev ``` - ### Create credentials + **If you are a community contributor**, follow the instructions in the [documentation](https://docs.airbyte.com/integrations/sources/delighted) to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `source_delighted/spec.yaml` file. Note that any directory named `secrets` is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. See `sample_files/sample_config.json` for a sample config file. - ### Locally running the connector + ``` poetry run source-delighted spec poetry run source-delighted check --config secrets/config.json @@ -34,23 +35,28 @@ poetry run source-delighted read --config secrets/config.json --catalog sample_f ``` ### Running unit tests + To run unit tests locally, from the connector directory run: + ``` poetry run pytest unit_tests ``` ### Building the docker image + 1. Install [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md) 2. Run the following command to build the docker image: + ```bash airbyte-ci connectors --name=source-delighted build ``` An image will be available on your host with the tag `airbyte/source-delighted:dev`. - ### Running as a docker container + Then run any of the connector commands as follows: + ``` docker run --rm airbyte/source-delighted:dev spec docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-delighted:dev check --config /secrets/config.json @@ -59,18 +65,23 @@ docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integrat ``` ### Running our CI test suite + You can run our full test suite locally using [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md): + ```bash airbyte-ci connectors --name=source-delighted test ``` ### Customizing acceptance Tests + Customize `acceptance-test-config.yml` file to configure acceptance tests. See [Connector Acceptance Tests](https://docs.airbyte.com/connector-development/testing-connectors/connector-acceptance-tests-reference) for more information. If your connector requires to create or destroy resources for use during acceptance tests create fixtures for it and place them inside integration_tests/acceptance.py. ### Dependency Management -All of your dependencies should be managed via Poetry. + +All of your dependencies should be managed via Poetry. To add a new dependency, run: + ```bash poetry add ``` @@ -78,14 +89,16 @@ poetry add Please commit the changes to `pyproject.toml` and `poetry.lock` files. ## Publishing a new version of the connector + You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? + 1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-delighted test` -2. 
Bump the connector version (please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors)): - - bump the `dockerImageTag` value in in `metadata.yaml` - - bump the `version` value in `pyproject.toml` +2. Bump the connector version (please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors)): + - bump the `dockerImageTag` value in in `metadata.yaml` + - bump the `version` value in `pyproject.toml` 3. Make sure the `metadata.yaml` content is up to date. 4. Make sure the connector documentation and its changelog is up to date (`docs/integrations/sources/delighted.md`). 5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention). 6. Pat yourself on the back for being an awesome contributor. 7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. -8. Once your PR is merged, the new version of the connector will be automatically published to Docker Hub and our connector registry. \ No newline at end of file +8. Once your PR is merged, the new version of the connector will be automatically published to Docker Hub and our connector registry. diff --git a/airbyte-integrations/connectors/source-dixa/README.md b/airbyte-integrations/connectors/source-dixa/README.md index a1ad889ba93..00603e902b0 100644 --- a/airbyte-integrations/connectors/source-dixa/README.md +++ b/airbyte-integrations/connectors/source-dixa/README.md @@ -6,6 +6,7 @@ For information about how to use this connector within Airbyte, see [the documen ## Local development #### Create credentials + **If you are a community contributor**, follow the instructions in the [documentation](https://docs.airbyte.com/integrations/sources/dixa) to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `source_dixa/spec.yaml` file. Note that any directory named `secrets` is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. @@ -16,9 +17,10 @@ and place them into `secrets/config.json`. ### Locally running the connector docker image - #### Build + **Via [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md) (recommended):** + ```bash airbyte-ci connectors --name=source-dixa build ``` @@ -26,12 +28,15 @@ airbyte-ci connectors --name=source-dixa build An image will be built with the tag `airbyte/source-dixa:dev`. **Via `docker build`:** + ```bash docker build -t airbyte/source-dixa:dev . ``` #### Run + Then run any of the connector commands as follows: + ``` docker run --rm airbyte/source-dixa:dev spec docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-dixa:dev check --config /secrets/config.json @@ -40,23 +45,30 @@ docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integrat ``` ## Testing + You can run our full test suite locally using [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md): + ```bash airbyte-ci connectors --name=source-dixa test ``` ### Customizing acceptance Tests + Customize `acceptance-test-config.yml` file to configure tests. 
See [Connector Acceptance Tests](https://docs.airbyte.com/connector-development/testing-connectors/connector-acceptance-tests-reference) for more information. If your connector requires to create or destroy resources for use during acceptance tests create fixtures for it and place them inside integration_tests/acceptance.py. ## Dependency Management + All of your dependencies should go in `setup.py`, NOT `requirements.txt`. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development. We split dependencies between two groups, dependencies that are: -* required for your connector to work need to go to `MAIN_REQUIREMENTS` list. -* required for the testing need to go to `TEST_REQUIREMENTS` list + +- required for your connector to work need to go to `MAIN_REQUIREMENTS` list. +- required for the testing need to go to `TEST_REQUIREMENTS` list ### Publishing a new version of the connector + You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? + 1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-dixa test` 2. Bump the connector version in `metadata.yaml`: increment the `dockerImageTag` value. Please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors). 3. Make sure the `metadata.yaml` content is up to date. @@ -64,4 +76,3 @@ You've checked out the repo, implemented a million dollar feature, and you're re 5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention). 6. Pat yourself on the back for being an awesome contributor. 7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. - diff --git a/airbyte-integrations/connectors/source-dockerhub/README.md b/airbyte-integrations/connectors/source-dockerhub/README.md index a96b26c8aa6..27e86a9803b 100644 --- a/airbyte-integrations/connectors/source-dockerhub/README.md +++ b/airbyte-integrations/connectors/source-dockerhub/README.md @@ -1,31 +1,32 @@ # Dockerhub source connector - This is the repository for the Dockerhub source connector, written in Python. For information about how to use this connector within Airbyte, see [the documentation](https://docs.airbyte.com/integrations/sources/dockerhub). ## Local development ### Prerequisites -* Python (~=3.9) -* Poetry (~=1.7) - installation instructions [here](https://python-poetry.org/docs/#installation) +- Python (~=3.9) +- Poetry (~=1.7) - installation instructions [here](https://python-poetry.org/docs/#installation) ### Installing the connector + From this connector directory, run: + ```bash poetry install --with dev ``` - ### Create credentials + **If you are a community contributor**, follow the instructions in the [documentation](https://docs.airbyte.com/integrations/sources/dockerhub) to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `source_dockerhub/spec.yaml` file. Note that any directory named `secrets` is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. See `sample_files/sample_config.json` for a sample config file. 
- ### Locally running the connector + ``` poetry run source-dockerhub spec poetry run source-dockerhub check --config secrets/config.json @@ -34,23 +35,28 @@ poetry run source-dockerhub read --config secrets/config.json --catalog sample_f ``` ### Running unit tests + To run unit tests locally, from the connector directory run: + ``` poetry run pytest unit_tests ``` ### Building the docker image + 1. Install [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md) 2. Run the following command to build the docker image: + ```bash airbyte-ci connectors --name=source-dockerhub build ``` An image will be available on your host with the tag `airbyte/source-dockerhub:dev`. - ### Running as a docker container + Then run any of the connector commands as follows: + ``` docker run --rm airbyte/source-dockerhub:dev spec docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-dockerhub:dev check --config /secrets/config.json @@ -59,18 +65,23 @@ docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integrat ``` ### Running our CI test suite + You can run our full test suite locally using [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md): + ```bash airbyte-ci connectors --name=source-dockerhub test ``` ### Customizing acceptance Tests + Customize `acceptance-test-config.yml` file to configure acceptance tests. See [Connector Acceptance Tests](https://docs.airbyte.com/connector-development/testing-connectors/connector-acceptance-tests-reference) for more information. If your connector requires to create or destroy resources for use during acceptance tests create fixtures for it and place them inside integration_tests/acceptance.py. ### Dependency Management -All of your dependencies should be managed via Poetry. + +All of your dependencies should be managed via Poetry. To add a new dependency, run: + ```bash poetry add ``` @@ -78,14 +89,16 @@ poetry add Please commit the changes to `pyproject.toml` and `poetry.lock` files. ## Publishing a new version of the connector + You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? + 1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-dockerhub test` -2. Bump the connector version (please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors)): - - bump the `dockerImageTag` value in in `metadata.yaml` - - bump the `version` value in `pyproject.toml` +2. Bump the connector version (please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors)): + - bump the `dockerImageTag` value in in `metadata.yaml` + - bump the `version` value in `pyproject.toml` 3. Make sure the `metadata.yaml` content is up to date. 4. Make sure the connector documentation and its changelog is up to date (`docs/integrations/sources/dockerhub.md`). 5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention). 6. Pat yourself on the back for being an awesome contributor. 7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. -8. 
Once your PR is merged, the new version of the connector will be automatically published to Docker Hub and our connector registry. \ No newline at end of file +8. Once your PR is merged, the new version of the connector will be automatically published to Docker Hub and our connector registry. diff --git a/airbyte-integrations/connectors/source-dockerhub/bootstrap.md b/airbyte-integrations/connectors/source-dockerhub/bootstrap.md index 0c0f4fdec9b..71b972c4da7 100644 --- a/airbyte-integrations/connectors/source-dockerhub/bootstrap.md +++ b/airbyte-integrations/connectors/source-dockerhub/bootstrap.md @@ -11,4 +11,4 @@ If you are reading this in the future and need to expand this source connector t - Original notes: https://github.com/airbytehq/airbyte/issues/12773#issuecomment-1126785570 - Auth docs: https://docs.docker.com/registry/spec/auth/jwt/ - Might also want to use OAuth2: https://docs.docker.com/registry/spec/auth/oauth/ -- Scope docs: https://docs.docker.com/registry/spec/auth/scope/ \ No newline at end of file +- Scope docs: https://docs.docker.com/registry/spec/auth/scope/ diff --git a/airbyte-integrations/connectors/source-dremio/README.md b/airbyte-integrations/connectors/source-dremio/README.md index 4e6fe409067..63e5a3774ab 100644 --- a/airbyte-integrations/connectors/source-dremio/README.md +++ b/airbyte-integrations/connectors/source-dremio/README.md @@ -6,6 +6,7 @@ For information about how to use this connector within Airbyte, see [the documen ## Local development #### Create credentials + **If you are a community contributor**, follow the instructions in the [documentation](https://docs.airbyte.io/integrations/sources/dremio) to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `source_dremio/spec.yaml` file. Note that any directory named `secrets` is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. @@ -16,9 +17,10 @@ and place them into `secrets/config.json`. ### Locally running the connector docker image - #### Build + **Via [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md) (recommended):** + ```bash airbyte-ci connectors --name=source-dremio build ``` @@ -26,12 +28,15 @@ airbyte-ci connectors --name=source-dremio build An image will be built with the tag `airbyte/source-dremio:dev`. **Via `docker build`:** + ```bash docker build -t airbyte/source-dremio:dev . ``` #### Run + Then run any of the connector commands as follows: + ``` docker run --rm airbyte/source-dremio:dev spec docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-dremio:dev check --config /secrets/config.json @@ -40,23 +45,30 @@ docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integrat ``` ## Testing + You can run our full test suite locally using [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md): + ```bash airbyte-ci connectors --name=source-dremio test ``` ### Customizing acceptance Tests + Customize `acceptance-test-config.yml` file to configure tests. See [Connector Acceptance Tests](https://docs.airbyte.com/connector-development/testing-connectors/connector-acceptance-tests-reference) for more information. If your connector requires to create or destroy resources for use during acceptance tests create fixtures for it and place them inside integration_tests/acceptance.py. 
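For readers who have not written one before, here is a minimal sketch of what `integration_tests/acceptance.py` can look like. The fixture body is a placeholder (the setup and teardown helpers are hypothetical), and the plugin name assumes the current connector-acceptance-test package; adapt both to your connector.

```python
# integration_tests/acceptance.py - minimal illustrative sketch.
import pytest

# Assumption: the acceptance-test suite is exposed as this pytest plugin.
pytest_plugins = ("connector_acceptance_test.plugin",)


@pytest.fixture(scope="session", autouse=True)
def connector_setup():
    """Create resources the acceptance tests need, then tear them down."""
    # create_test_resources()  # hypothetical helper: provision test data
    yield
    # delete_test_resources()  # hypothetical helper: clean everything up
```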
## Dependency Management + All of your dependencies should go in `setup.py`, NOT `requirements.txt`. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development. We split dependencies between two groups, dependencies that are: -* required for your connector to work need to go to `MAIN_REQUIREMENTS` list. -* required for the testing need to go to `TEST_REQUIREMENTS` list + +- required for your connector to work need to go to `MAIN_REQUIREMENTS` list. +- required for the testing need to go to `TEST_REQUIREMENTS` list ### Publishing a new version of the connector + You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? + 1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-dremio test` 2. Bump the connector version in `metadata.yaml`: increment the `dockerImageTag` value. Please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors). 3. Make sure the `metadata.yaml` content is up to date. @@ -64,4 +76,3 @@ You've checked out the repo, implemented a million dollar feature, and you're re 5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention). 6. Pat yourself on the back for being an awesome contributor. 7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. - diff --git a/airbyte-integrations/connectors/source-drift/README.md b/airbyte-integrations/connectors/source-drift/README.md index 53a39fa050a..48cb161c2c6 100644 --- a/airbyte-integrations/connectors/source-drift/README.md +++ b/airbyte-integrations/connectors/source-drift/README.md @@ -1,31 +1,32 @@ # Drift source connector - This is the repository for the Drift source connector, written in Python. For information about how to use this connector within Airbyte, see [the documentation](https://docs.airbyte.com/integrations/sources/drift). ## Local development ### Prerequisites -* Python (~=3.9) -* Poetry (~=1.7) - installation instructions [here](https://python-poetry.org/docs/#installation) +- Python (~=3.9) +- Poetry (~=1.7) - installation instructions [here](https://python-poetry.org/docs/#installation) ### Installing the connector + From this connector directory, run: + ```bash poetry install --with dev ``` - ### Create credentials + **If you are a community contributor**, follow the instructions in the [documentation](https://docs.airbyte.com/integrations/sources/drift) to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `source_drift/spec.yaml` file. Note that any directory named `secrets` is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. See `sample_files/sample_config.json` for a sample config file. - ### Locally running the connector + ``` poetry run source-drift spec poetry run source-drift check --config secrets/config.json @@ -34,23 +35,28 @@ poetry run source-drift read --config secrets/config.json --catalog sample_files ``` ### Running unit tests + To run unit tests locally, from the connector directory run: + ``` poetry run pytest unit_tests ``` ### Building the docker image + 1. 
Install [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md) 2. Run the following command to build the docker image: + ```bash airbyte-ci connectors --name=source-drift build ``` An image will be available on your host with the tag `airbyte/source-drift:dev`. - ### Running as a docker container + Then run any of the connector commands as follows: + ``` docker run --rm airbyte/source-drift:dev spec docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-drift:dev check --config /secrets/config.json @@ -59,18 +65,23 @@ docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integrat ``` ### Running our CI test suite + You can run our full test suite locally using [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md): + ```bash airbyte-ci connectors --name=source-drift test ``` ### Customizing acceptance Tests + Customize `acceptance-test-config.yml` file to configure acceptance tests. See [Connector Acceptance Tests](https://docs.airbyte.com/connector-development/testing-connectors/connector-acceptance-tests-reference) for more information. If your connector requires to create or destroy resources for use during acceptance tests create fixtures for it and place them inside integration_tests/acceptance.py. ### Dependency Management -All of your dependencies should be managed via Poetry. + +All of your dependencies should be managed via Poetry. To add a new dependency, run: + ```bash poetry add ``` @@ -78,14 +89,16 @@ poetry add Please commit the changes to `pyproject.toml` and `poetry.lock` files. ## Publishing a new version of the connector + You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? + 1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-drift test` -2. Bump the connector version (please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors)): - - bump the `dockerImageTag` value in in `metadata.yaml` - - bump the `version` value in `pyproject.toml` +2. Bump the connector version (please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors)): + - bump the `dockerImageTag` value in in `metadata.yaml` + - bump the `version` value in `pyproject.toml` 3. Make sure the `metadata.yaml` content is up to date. 4. Make sure the connector documentation and its changelog is up to date (`docs/integrations/sources/drift.md`). 5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention). 6. Pat yourself on the back for being an awesome contributor. 7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. -8. Once your PR is merged, the new version of the connector will be automatically published to Docker Hub and our connector registry. \ No newline at end of file +8. Once your PR is merged, the new version of the connector will be automatically published to Docker Hub and our connector registry. 
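Since step 2 asks you to bump the version in two places, a small helper like the sketch below can confirm that `metadata.yaml` and `pyproject.toml` agree before you open the PR. It assumes the standard metadata layout (`data.dockerImageTag`) and that PyYAML is installed; the script name is made up.

```python
# check_version_sync.py - illustrative helper, run from the connector directory.
import re
from pathlib import Path

import yaml

metadata = yaml.safe_load(Path("metadata.yaml").read_text())
docker_tag = metadata["data"]["dockerImageTag"]  # assumes the usual metadata layout

match = re.search(r'^version\s*=\s*"([^"]+)"', Path("pyproject.toml").read_text(), re.MULTILINE)
poetry_version = match.group(1) if match else None

if docker_tag != poetry_version:
    raise SystemExit(f"metadata.yaml says {docker_tag} but pyproject.toml says {poetry_version}")
print(f"Both files agree on version {docker_tag}")
```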
diff --git a/airbyte-integrations/connectors/source-dynamodb/README.md b/airbyte-integrations/connectors/source-dynamodb/README.md index 9923e01a6d0..9b08058f63f 100644 --- a/airbyte-integrations/connectors/source-dynamodb/README.md +++ b/airbyte-integrations/connectors/source-dynamodb/README.md @@ -6,12 +6,15 @@ For information about how to use this connector within Airbyte, see [the User Do ## Local development #### Building via Gradle + From the Airbyte repository root, run: + ``` ./gradlew :airbyte-integrations:connectors:source-dynamodb:build ``` #### Create credentials + **If you are a community contributor**, generate the necessary credentials and place them in `secrets/config.json` conforming to the spec file in `src/main/resources/spec.json`. Note that the `secrets` directory is git-ignored by default, so there is no danger of accidentally checking in sensitive information. @@ -20,16 +23,20 @@ Note that the `secrets` directory is git-ignored by default, so there is no dang ### Locally running the connector docker image #### Build + Build the connector image via Gradle: ``` ./gradlew :airbyte-integrations:connectors:source-dynamodb:buildConnectorImage ``` + Once built, the docker image name and tag on your host will be `airbyte/source-dynamodb:dev`. the Dockerfile. #### Run + Then run any of the connector commands as follows: + ``` docker run --rm airbyte/source-dynamodb:dev spec docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-dynamodb:dev check --config /secrets/config.json @@ -38,23 +45,30 @@ docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integrat ``` ## Testing + We use `JUnit` for Java tests. ### Unit and Integration Tests + Place unit tests under `src/test/...` -Place integration tests in `src/test-integration/...` +Place integration tests in `src/test-integration/...` #### Acceptance Tests + Airbyte has a standard test suite that all source connectors must pass. Implement the `TODO`s in `src/test-integration/java/io/airbyte/integrations/sources/dynamodbSourceAcceptanceTest.java`. ### Using gradle to run tests + All commands should be run from airbyte project root. To run unit tests: + ``` ./gradlew :airbyte-integrations:connectors:source-dynamodb:unitTest ``` + To run acceptance and custom integration tests: + ``` ./gradlew :airbyte-integrations:connectors:source-dynamodb:integrationTest ``` @@ -62,7 +76,9 @@ To run acceptance and custom integration tests: ## Dependency Management ### Publishing a new version of the connector + You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? + 1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-dynamodb test` 2. Bump the connector version in `metadata.yaml`: increment the `dockerImageTag` value. Please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors). 3. Make sure the `metadata.yaml` content is up to date. @@ -70,4 +86,3 @@ You've checked out the repo, implemented a million dollar feature, and you're re 5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention). 6. Pat yourself on the back for being an awesome contributor. 7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. 
- diff --git a/airbyte-integrations/connectors/source-e2e-test-cloud/README.md b/airbyte-integrations/connectors/source-e2e-test-cloud/README.md index bf16bd7df3a..2dab02485c0 100644 --- a/airbyte-integrations/connectors/source-e2e-test-cloud/README.md +++ b/airbyte-integrations/connectors/source-e2e-test-cloud/README.md @@ -5,27 +5,34 @@ This is the Cloud variant of the [E2E Test Source](https://docs.airbyte.io/integ ## Local development #### Building via Gradle + From the Airbyte repository root, run: + ``` ./gradlew :airbyte-integrations:connectors:source-e2e-test-cloud:build ``` #### Create credentials -No credential is needed for this connector. + +No credential is needed for this connector. ### Locally running the connector docker image #### Build + Build the connector image via Gradle: ``` ./gradlew :airbyte-integrations:connectors:source-e2e-test-cloud:buildConnectorImage ``` + Once built, the docker image name and tag on your host will be `airbyte/source-e2e-test-cloud:dev`. the Dockerfile. #### Run + Then run any of the connector commands as follows: + ``` docker run --rm airbyte/source-e2e-test-cloud:dev spec docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-e2e-test-cloud:dev check --config /secrets/config.json @@ -34,25 +41,33 @@ docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integrat ``` #### Cloud variant + The cloud version of this connector only allows the `CONTINUOUS FEED` mode. When this mode is changed, please make sure that the cloud variant is updated and published accordingly as well. ## Testing + We use `JUnit` for Java tests. ### Unit and Integration Tests + Place unit tests under `src/test/io/airbyte/integrations/sources/e2e-test`. #### Acceptance Tests + Airbyte has a standard test suite that all destination connectors must pass. See example(s) in `src/test-integration/java/io/airbyte/integrations/sources/e2e-test/`. ### Using gradle to run tests + All commands should be run from airbyte project root. To run unit tests: + ``` ./gradlew :airbyte-integrations:connectors:sources-e2e-test:unitTest ``` + To run acceptance and custom integration tests: + ``` ./gradlew :airbyte-integrations:connectors:sources-e2e-test:integrationTest ``` @@ -60,7 +75,9 @@ To run acceptance and custom integration tests: ## Dependency Management ### Publishing a new version of the connector + You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? + 1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-e2e-test-cloud test` 2. Bump the connector version in `metadata.yaml`: increment the `dockerImageTag` value. Please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors). 3. Make sure the `metadata.yaml` content is up to date. @@ -68,4 +85,3 @@ You've checked out the repo, implemented a million dollar feature, and you're re 5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention). 6. Pat yourself on the back for being an awesome contributor. 7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. 
- diff --git a/airbyte-integrations/connectors/source-e2e-test/README.md b/airbyte-integrations/connectors/source-e2e-test/README.md index a75d586bbcf..c84f4df97bc 100644 --- a/airbyte-integrations/connectors/source-e2e-test/README.md +++ b/airbyte-integrations/connectors/source-e2e-test/README.md @@ -3,36 +3,45 @@ This is the repository for the mock source connector in Java. For information about how to use this connector within Airbyte, see [the User Documentation](https://docs.airbyte.io/integrations/sources/e2e-test) ## Mock Json record generation + The [airbytehq/jsongenerator](https://github.com/airbytehq/jsongenerator) is used to generate random Json records based on the specified Json schema. This library is forked from [jimblackler/jsongenerator](https://github.com/jimblackler/jsongenerator) authored by [Jim Blackler](https://github.com/jimblackler) and licensed under Apache 2.0. Although this library seems to be the best one available for Json generation in Java, it has two downsides. - - It relies on JavaScript inside Java (through `org.mozilla:rhino-engine`), and fetches remote JavaScript snippets (in the [PatternReverser](https://github.com/jimblackler/jsongenerator/blob/master/src/main/java/net/jimblackler/jsongenerator/PatternReverser.java)). - - It does not allow customization of individual field. The generated Json object can be seemingly garbled. We may use libraries such as [java-faker](https://github.com/DiUS/java-faker) in the future to argument it. + +- It relies on JavaScript inside Java (through `org.mozilla:rhino-engine`), and fetches remote JavaScript snippets (in the [PatternReverser](https://github.com/jimblackler/jsongenerator/blob/master/src/main/java/net/jimblackler/jsongenerator/PatternReverser.java)). +- It does not allow customization of individual field. The generated Json object can be seemingly garbled. We may use libraries such as [java-faker](https://github.com/DiUS/java-faker) in the future to argument it. ## Local development #### Building via Gradle + From the Airbyte repository root, run: + ``` ./gradlew :airbyte-integrations:connectors:source-e2e-test:build ``` #### Create credentials -No credential is needed for this connector. + +No credential is needed for this connector. ### Locally running the connector docker image #### Build + Build the connector image via Gradle: ``` ./gradlew :airbyte-integrations:connectors:source-e2e-test:buildConnectorImage ``` + Once built, the docker image name and tag on your host will be `airbyte/source-e2e-test:dev`. the Dockerfile. #### Run + Then run any of the connector commands as follows: + ``` docker run --rm airbyte/source-e2e-test:dev spec docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-e2e-test:dev check --config /secrets/config.json @@ -41,25 +50,33 @@ docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integrat ``` #### Cloud variant + The cloud version of this connector only allows the `CONTINUOUS FEED` mode. When this mode is changed, please make sure that the cloud variant is updated and published accordingly as well. ## Testing + We use `JUnit` for Java tests. ### Unit and Integration Tests + Place unit tests under `src/test/io/airbyte/integrations/sources/e2e-test`. #### Acceptance Tests + Airbyte has a standard test suite that all destination connectors must pass. See example(s) in `src/test-integration/java/io/airbyte/integrations/sources/e2e-test/`. ### Using gradle to run tests + All commands should be run from airbyte project root. 
To run unit tests: + ``` ./gradlew :airbyte-integrations:connectors:sources-e2e-test:unitTest ``` + To run acceptance and custom integration tests: + ``` ./gradlew :airbyte-integrations:connectors:sources-e2e-test:integrationTest ``` @@ -67,7 +84,9 @@ To run acceptance and custom integration tests: ## Dependency Management ### Publishing a new version of the connector + You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? + 1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-e2e-test test` 2. Bump the connector version in `metadata.yaml`: increment the `dockerImageTag` value. Please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors). 3. Make sure the `metadata.yaml` content is up to date. @@ -75,4 +94,3 @@ You've checked out the repo, implemented a million dollar feature, and you're re 5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention). 6. Pat yourself on the back for being an awesome contributor. 7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. - diff --git a/airbyte-integrations/connectors/source-elasticsearch/README.md b/airbyte-integrations/connectors/source-elasticsearch/README.md index 5b12fbcb1a6..44239861f70 100644 --- a/airbyte-integrations/connectors/source-elasticsearch/README.md +++ b/airbyte-integrations/connectors/source-elasticsearch/README.md @@ -6,29 +6,37 @@ For information about how to use this connector within Airbyte, see [the documen ## Local development #### Building via Gradle + From the Airbyte repository root, run: + ``` ./gradlew :airbyte-integrations:connectors:source-elasticsearch:build ``` #### Create credentials + Credentials can be provided in three ways: + 1. Basic -2. +2. ### Locally running the connector docker image #### Build + Build the connector image via Gradle: ``` ./gradlew :airbyte-integrations:connectors:source-elasticsearch:buildConnectorImage ``` + Once built, the docker image name and tag on your host will be `airbyte/source-elasticsearch:dev`. the Dockerfile. #### Run + Then run any of the connector commands as follows: + ``` docker run --rm airbyte/source-elasticsearch:dev spec docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-elasticsearch:dev check --config /secrets/config.json @@ -37,25 +45,33 @@ docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integrat ``` #### Sync Mode Support + Current version of this connector only allows the `FULL REFRESH` mode. ## Testing + We use `JUnit` for Java tests. ### Unit and Integration Tests + Place unit tests under `src/test/io/airbyte/integrations/sources/elasticsearch-test`. #### Acceptance Tests + Airbyte has a standard test suite that all destination connectors must pass. See example(s) in `src/test-integration/java/io/airbyte/integrations/sources/elasticsearch/`. ### Using gradle to run tests + All commands should be run from airbyte project root. 
To run unit tests: + ``` ./gradlew :airbyte-integrations:connectors:source-elasticsearch:unitTest ``` + To run acceptance and custom integration tests: + ``` ./gradlew :airbyte-integrations:connectors:source-elasticsearch:integrationTest ``` @@ -63,7 +79,9 @@ To run acceptance and custom integration tests: ## Dependency Management ### Publishing a new version of the connector + You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? + 1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-elasticsearch test` 2. Bump the connector version in `metadata.yaml`: increment the `dockerImageTag` value. Please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors). 3. Make sure the `metadata.yaml` content is up to date. @@ -71,4 +89,3 @@ You've checked out the repo, implemented a million dollar feature, and you're re 5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention). 6. Pat yourself on the back for being an awesome contributor. 7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. - diff --git a/airbyte-integrations/connectors/source-emailoctopus/BOOTSTRAP.md b/airbyte-integrations/connectors/source-emailoctopus/BOOTSTRAP.md index 235e73bf8af..60ca77cf2f2 100644 --- a/airbyte-integrations/connectors/source-emailoctopus/BOOTSTRAP.md +++ b/airbyte-integrations/connectors/source-emailoctopus/BOOTSTRAP.md @@ -3,6 +3,7 @@ EmailOctopus is an email marketing tool. Link to API [here](https://emailoctopus.com/api-documentation). ## How to get an API key + - [Sign up for EmailOctopus](https://emailoctopus.com/account/sign-up). I recall there is a verification process that involves speaking with support staff. - Pricing is volume-based, so a sandbox account should be free: see [Pricing](https://emailoctopus.com/pricing). -- Once signed in, generate an API key from the [API documentation page](https://emailoctopus.com/api-documentation). \ No newline at end of file +- Once signed in, generate an API key from the [API documentation page](https://emailoctopus.com/api-documentation). diff --git a/airbyte-integrations/connectors/source-emailoctopus/README.md b/airbyte-integrations/connectors/source-emailoctopus/README.md index 9594545c144..f1a14fb0118 100644 --- a/airbyte-integrations/connectors/source-emailoctopus/README.md +++ b/airbyte-integrations/connectors/source-emailoctopus/README.md @@ -1,31 +1,32 @@ # Emailoctopus source connector - This is the repository for the Emailoctopus source connector, written in Python. For information about how to use this connector within Airbyte, see [the documentation](https://docs.airbyte.com/integrations/sources/emailoctopus). 
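Once you have a key from the steps above, a quick sanity check against the EmailOctopus REST API can save a round trip through the connector. The snippet below is only a sketch: it assumes the v1.6 endpoint described on the EmailOctopus API documentation page and the `requests` library; substitute your own key.

```python
# verify_emailoctopus_key.py - illustrative sketch, not part of the connector.
import requests

API_KEY = "your-emailoctopus-api-key"  # placeholder

# Assumption: lists can be fetched from the v1.6 endpoint with an api_key parameter.
resp = requests.get(
    "https://emailoctopus.com/api/1.6/lists",
    params={"api_key": API_KEY, "limit": 1},
    timeout=30,
)
resp.raise_for_status()
print("API key accepted; sample response:", resp.json())
```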
## Local development ### Prerequisites -* Python (~=3.9) -* Poetry (~=1.7) - installation instructions [here](https://python-poetry.org/docs/#installation) +- Python (~=3.9) +- Poetry (~=1.7) - installation instructions [here](https://python-poetry.org/docs/#installation) ### Installing the connector + From this connector directory, run: + ```bash poetry install --with dev ``` - ### Create credentials + **If you are a community contributor**, follow the instructions in the [documentation](https://docs.airbyte.com/integrations/sources/emailoctopus) to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `source_emailoctopus/spec.yaml` file. Note that any directory named `secrets` is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. See `sample_files/sample_config.json` for a sample config file. - ### Locally running the connector + ``` poetry run source-emailoctopus spec poetry run source-emailoctopus check --config secrets/config.json @@ -34,23 +35,28 @@ poetry run source-emailoctopus read --config secrets/config.json --catalog sampl ``` ### Running unit tests + To run unit tests locally, from the connector directory run: + ``` poetry run pytest unit_tests ``` ### Building the docker image + 1. Install [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md) 2. Run the following command to build the docker image: + ```bash airbyte-ci connectors --name=source-emailoctopus build ``` An image will be available on your host with the tag `airbyte/source-emailoctopus:dev`. - ### Running as a docker container + Then run any of the connector commands as follows: + ``` docker run --rm airbyte/source-emailoctopus:dev spec docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-emailoctopus:dev check --config /secrets/config.json @@ -59,18 +65,23 @@ docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integrat ``` ### Running our CI test suite + You can run our full test suite locally using [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md): + ```bash airbyte-ci connectors --name=source-emailoctopus test ``` ### Customizing acceptance Tests + Customize `acceptance-test-config.yml` file to configure acceptance tests. See [Connector Acceptance Tests](https://docs.airbyte.com/connector-development/testing-connectors/connector-acceptance-tests-reference) for more information. If your connector requires to create or destroy resources for use during acceptance tests create fixtures for it and place them inside integration_tests/acceptance.py. ### Dependency Management -All of your dependencies should be managed via Poetry. + +All of your dependencies should be managed via Poetry. To add a new dependency, run: + ```bash poetry add ``` @@ -78,14 +89,16 @@ poetry add Please commit the changes to `pyproject.toml` and `poetry.lock` files. ## Publishing a new version of the connector + You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? + 1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-emailoctopus test` -2. 
Bump the connector version (please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors)): - - bump the `dockerImageTag` value in in `metadata.yaml` - - bump the `version` value in `pyproject.toml` +2. Bump the connector version (please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors)): + - bump the `dockerImageTag` value in in `metadata.yaml` + - bump the `version` value in `pyproject.toml` 3. Make sure the `metadata.yaml` content is up to date. 4. Make sure the connector documentation and its changelog is up to date (`docs/integrations/sources/emailoctopus.md`). 5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention). 6. Pat yourself on the back for being an awesome contributor. 7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. -8. Once your PR is merged, the new version of the connector will be automatically published to Docker Hub and our connector registry. \ No newline at end of file +8. Once your PR is merged, the new version of the connector will be automatically published to Docker Hub and our connector registry. diff --git a/airbyte-integrations/connectors/source-everhour/README.md b/airbyte-integrations/connectors/source-everhour/README.md index c33ef8dcf44..ba6fa59ae8f 100644 --- a/airbyte-integrations/connectors/source-everhour/README.md +++ b/airbyte-integrations/connectors/source-everhour/README.md @@ -6,6 +6,7 @@ For information about how to use this connector within Airbyte, see [the documen ## Local development #### Create credentials + **If you are a community contributor**, follow the instructions in the [documentation](https://docs.airbyte.io/integrations/sources/everhour) to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `source_everhour/spec.yaml` file. Note that any directory named `secrets` is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. @@ -16,9 +17,10 @@ and place them into `secrets/config.json`. ### Locally running the connector docker image - #### Build + **Via [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md) (recommended):** + ```bash airbyte-ci connectors --name=source-everhour build ``` @@ -26,12 +28,15 @@ airbyte-ci connectors --name=source-everhour build An image will be built with the tag `airbyte/source-everhour:dev`. **Via `docker build`:** + ```bash docker build -t airbyte/source-everhour:dev . ``` #### Run + Then run any of the connector commands as follows: + ``` docker run --rm airbyte/source-everhour:dev spec docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-everhour:dev check --config /secrets/config.json @@ -40,23 +45,30 @@ docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integrat ``` ## Testing + You can run our full test suite locally using [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md): + ```bash airbyte-ci connectors --name=source-everhour test ``` ### Customizing acceptance Tests + Customize `acceptance-test-config.yml` file to configure tests. 
See [Connector Acceptance Tests](https://docs.airbyte.com/connector-development/testing-connectors/connector-acceptance-tests-reference) for more information. If your connector requires to create or destroy resources for use during acceptance tests create fixtures for it and place them inside integration_tests/acceptance.py. ## Dependency Management + All of your dependencies should go in `setup.py`, NOT `requirements.txt`. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development. We split dependencies between two groups, dependencies that are: -* required for your connector to work need to go to `MAIN_REQUIREMENTS` list. -* required for the testing need to go to `TEST_REQUIREMENTS` list + +- required for your connector to work need to go to `MAIN_REQUIREMENTS` list. +- required for the testing need to go to `TEST_REQUIREMENTS` list ### Publishing a new version of the connector + You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? + 1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-everhour test` 2. Bump the connector version in `metadata.yaml`: increment the `dockerImageTag` value. Please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors). 3. Make sure the `metadata.yaml` content is up to date. @@ -64,4 +76,3 @@ You've checked out the repo, implemented a million dollar feature, and you're re 5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention). 6. Pat yourself on the back for being an awesome contributor. 7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. - diff --git a/airbyte-integrations/connectors/source-exchange-rates/README.md b/airbyte-integrations/connectors/source-exchange-rates/README.md index 522c6992526..5a582d2ae61 100644 --- a/airbyte-integrations/connectors/source-exchange-rates/README.md +++ b/airbyte-integrations/connectors/source-exchange-rates/README.md @@ -6,6 +6,7 @@ For information about how to use this connector within Airbyte, see [the documen ## Local development #### Create credentials + **If you are a community contributor**, follow the instructions in the [documentation](https://docs.airbyte.com/integrations/sources/exchange-rates) to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `source_exchange_rates/spec.yaml` file. Note that any directory named `secrets` is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. @@ -16,9 +17,10 @@ and place them into `secrets/config.json`. ### Locally running the connector docker image - #### Build + **Via [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md) (recommended):** + ```bash airbyte-ci connectors --name=source-exchange-rates build ``` @@ -26,12 +28,15 @@ airbyte-ci connectors --name=source-exchange-rates build An image will be built with the tag `airbyte/source-exchange-rates:dev`. **Via `docker build`:** + ```bash docker build -t airbyte/source-exchange-rates:dev . 
``` #### Run + Then run any of the connector commands as follows: + ``` docker run --rm airbyte/source-exchange-rates:dev spec docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-exchange-rates:dev check --config /secrets/config.json @@ -40,23 +45,30 @@ docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integrat ``` ## Testing + You can run our full test suite locally using [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md): + ```bash airbyte-ci connectors --name=source-exchange-rates test ``` ### Customizing acceptance Tests + Customize `acceptance-test-config.yml` file to configure tests. See [Connector Acceptance Tests](https://docs.airbyte.com/connector-development/testing-connectors/connector-acceptance-tests-reference) for more information. If your connector requires to create or destroy resources for use during acceptance tests create fixtures for it and place them inside integration_tests/acceptance.py. ## Dependency Management + All of your dependencies should go in `setup.py`, NOT `requirements.txt`. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development. We split dependencies between two groups, dependencies that are: -* required for your connector to work need to go to `MAIN_REQUIREMENTS` list. -* required for the testing need to go to `TEST_REQUIREMENTS` list + +- required for your connector to work need to go to `MAIN_REQUIREMENTS` list. +- required for the testing need to go to `TEST_REQUIREMENTS` list ### Publishing a new version of the connector + You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? + 1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-exchange-rates test` 2. Bump the connector version in `metadata.yaml`: increment the `dockerImageTag` value. Please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors). 3. Make sure the `metadata.yaml` content is up to date. @@ -64,4 +76,3 @@ You've checked out the repo, implemented a million dollar feature, and you're re 5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention). 6. Pat yourself on the back for being an awesome contributor. 7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. - diff --git a/airbyte-integrations/connectors/source-facebook-marketing/BOOTSTRAP.md b/airbyte-integrations/connectors/source-facebook-marketing/BOOTSTRAP.md index f45c605f1f8..81aecba3496 100644 --- a/airbyte-integrations/connectors/source-facebook-marketing/BOOTSTRAP.md +++ b/airbyte-integrations/connectors/source-facebook-marketing/BOOTSTRAP.md @@ -1,18 +1,21 @@ # Facebook Marketing The Facebook Marketing API allows a developer to retrieve information about a user’s marketing endeavors on the Facebook platform. Some example use cases: + - Retrieve the performance of the ad campaigns in the user’s account - Retrieve all ad campaigns that a user has run in the past There are roughly two kinds of queries we’d be interested in making to Facebook Marketing API: + 1. Obtain attributes about entities in the API e.g: what campaigns did we run, what ads, etc… 2. 
Obtain statistics about ad campaigns e.g: how many people saw them, how many people bought products as a result, etc... This is the most common use case for the API, known as [insights](https://developers.facebook.com/docs/marketing-api/insights). In general when querying the FB API for insights there are a few things to keep in mind: + - You can input [parameters](https://developers.facebook.com/docs/marketing-api/insights/parameters) to control which response you get e.g: you can get statistics at the level of an ad, ad group, campaign, or ad account - An important parameter you can configure is [fields](https://developers.facebook.com/docs/marketing-api/insights/fields), which controls which information is included in the response. For example, if you include “campaign.title” as a field, you will receive the title of that campaign in the response. When fields is not specified, many endpoints return a minimal set of fields. -- Data can be segmented using [breakdowns](https://developers.facebook.com/docs/marketing-api/insights/breakdowns) i.e: you can either get the number of impressions for a campaign as a single number or you can get it broken down by device, gender, or country of the person viewing the advertisement. Make sure to read the provided link about breakdowns in its entirety to understand +- Data can be segmented using [breakdowns](https://developers.facebook.com/docs/marketing-api/insights/breakdowns) i.e: you can either get the number of impressions for a campaign as a single number or you can get it broken down by device, gender, or country of the person viewing the advertisement. Make sure to read the provided link about breakdowns in its entirety to understand -Also make sure to read [this overview of insights](https://developers.facebook.com/docs/marketing-api/insights) in its entirety to have a strong understanding of this important aspect of the API. +Also make sure to read [this overview of insights](https://developers.facebook.com/docs/marketing-api/insights) in its entirety to have a strong understanding of this important aspect of the API. See [this](https://docs.airbyte.io/integrations/sources/facebook-marketing) link for the nuances about the connector. diff --git a/airbyte-integrations/connectors/source-facebook-marketing/README.md b/airbyte-integrations/connectors/source-facebook-marketing/README.md index 1d2f74775df..23976b99682 100644 --- a/airbyte-integrations/connectors/source-facebook-marketing/README.md +++ b/airbyte-integrations/connectors/source-facebook-marketing/README.md @@ -1,31 +1,32 @@ # Facebook-Marketing source connector - This is the repository for the Facebook-Marketing source connector, written in Python. For information about how to use this connector within Airbyte, see [the documentation](https://docs.airbyte.com/integrations/sources/facebook-marketing). ## Local development ### Prerequisites -* Python (~=3.9) -* Poetry (~=1.7) - installation instructions [here](https://python-poetry.org/docs/#installation) +- Python (~=3.9) +- Poetry (~=1.7) - installation instructions [here](https://python-poetry.org/docs/#installation) ### Installing the connector + From this connector directory, run: + ```bash poetry install --with dev ``` - ### Create credentials + **If you are a community contributor**, follow the instructions in the [documentation](https://docs.airbyte.com/integrations/sources/facebook-marketing) to generate the necessary credentials. 
Then create a file `secrets/config.json` conforming to the `source_facebook_marketing/spec.yaml` file. Note that any directory named `secrets` is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. See `sample_files/sample_config.json` for a sample config file. - ### Locally running the connector + ``` poetry run source-facebook-marketing spec poetry run source-facebook-marketing check --config secrets/config.json @@ -34,23 +35,28 @@ poetry run source-facebook-marketing read --config secrets/config.json --catalog ``` ### Running unit tests + To run unit tests locally, from the connector directory run: + ``` poetry run pytest unit_tests ``` ### Building the docker image + 1. Install [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md) 2. Run the following command to build the docker image: + ```bash airbyte-ci connectors --name=source-facebook-marketing build ``` An image will be available on your host with the tag `airbyte/source-facebook-marketing:dev`. - ### Running as a docker container + Then run any of the connector commands as follows: + ``` docker run --rm airbyte/source-facebook-marketing:dev spec docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-facebook-marketing:dev check --config /secrets/config.json @@ -59,18 +65,23 @@ docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integrat ``` ### Running our CI test suite + You can run our full test suite locally using [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md): + ```bash airbyte-ci connectors --name=source-facebook-marketing test ``` ### Customizing acceptance Tests + Customize `acceptance-test-config.yml` file to configure acceptance tests. See [Connector Acceptance Tests](https://docs.airbyte.com/connector-development/testing-connectors/connector-acceptance-tests-reference) for more information. If your connector requires to create or destroy resources for use during acceptance tests create fixtures for it and place them inside integration_tests/acceptance.py. ### Dependency Management -All of your dependencies should be managed via Poetry. + +All of your dependencies should be managed via Poetry. To add a new dependency, run: + ```bash poetry add ``` @@ -78,14 +89,16 @@ poetry add Please commit the changes to `pyproject.toml` and `poetry.lock` files. ## Publishing a new version of the connector + You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? + 1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-facebook-marketing test` -2. Bump the connector version (please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors)): - - bump the `dockerImageTag` value in in `metadata.yaml` - - bump the `version` value in `pyproject.toml` +2. Bump the connector version (please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors)): + - bump the `dockerImageTag` value in in `metadata.yaml` + - bump the `version` value in `pyproject.toml` 3. Make sure the `metadata.yaml` content is up to date. 4. 
Make sure the connector documentation and its changelog is up to date (`docs/integrations/sources/facebook-marketing.md`). 5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention). 6. Pat yourself on the back for being an awesome contributor. 7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. -8. Once your PR is merged, the new version of the connector will be automatically published to Docker Hub and our connector registry. \ No newline at end of file +8. Once your PR is merged, the new version of the connector will be automatically published to Docker Hub and our connector registry. diff --git a/airbyte-integrations/connectors/source-facebook-marketing/source_facebook_marketing/README.md b/airbyte-integrations/connectors/source-facebook-marketing/source_facebook_marketing/README.md index 3f903e93eb1..c527b3dec77 100644 --- a/airbyte-integrations/connectors/source-facebook-marketing/source_facebook_marketing/README.md +++ b/airbyte-integrations/connectors/source-facebook-marketing/source_facebook_marketing/README.md @@ -4,6 +4,7 @@ - source.py - mainly check and discovery logic - spec.py - connector's specification - streams/ - everything related to streams, usually it is a module, but we have too much for one file + - base_streams.py - all general logic should go there, you define class of streams as general as possible - streams.py - concrete classes, one for each stream, here should be only declarative logic and small overrides - base_insights_streams.py - piece of general logic for big subclass of streams - insight streams @@ -17,24 +18,29 @@ ## API FB Marketing API provides three ways to interact: + - single request - batch request - async request -FB provides a `facebook_business` library, which is an auto generated code from their API spec. +FB provides a `facebook_business` library, which is auto-generated code from their API spec. We use it because it provides: + - nice error handling - batch requests helpers - auto serialize/de-serialize responses to FB objects - transparently iterates over paginated responses ## Single request + It is the most common way to request something. We use the two-step strategy to read most of the data: + 1. first request to get list of IDs (filtered by cursor if supported) 2. loop over list of ids and request details for each ID, this step sometimes uses batch requests -## Batch request +## Batch request + is a batch of requests serialized in the body of a single request. The response of such a request will be a list of responses for each individual request (body, headers, etc). The FB lib uses an interface with callbacks; the batch object will call the corresponding (success or failure) callback for each type of response. @@ -43,21 +49,25 @@ FB API limit number of requests in a single batch to 50. **Important note**: - Batch object doesn’t perform pagination of individual responses, + Batch object doesn’t perform pagination of individual responses, so you may lose data if the response has pagination. ## Async Request + FB recommends using Async Requests when common requests begin to time out. An Async Request is a 3-step process: + - create async request - check its status (in a loop) - fetch response when status is done ### Combination with batch + Unfortunately all attempts to create multiple async requests in a single batch failed - `ObjectParser` from the FB lib doesn’t know how to parse the `AdReportRun` response.
Instead, we use a batch to check the status of multiple async jobs at once (respecting the batch limit of 50) ### Insights + We use Async Requests to read Insights; the FB API for this is called `AdReportRun`. Insights are reports based on ad performance; you can think of them as an SQL query: @@ -74,10 +84,10 @@ select from AdAccount(me) where start_date = …. and end_ FB will perform calculations on its backend with varying complexity depending on the fields we ask for; the heaviest fields are unique metrics: `unique_clicks`, `unique_actions`, etc. Additionally, Insights has fields that show stats from the last N days, the so-called attribution window; it can be `1d`, `7d`, or `28d`, and by default we use all of them. -According to FB docs insights data can be changed up to 28 days after it has being published. +According to FB docs, insights data can be changed up to 28 days after it has been published. That's why we re-read the last 28 days each time we sync an insight stream. -When amount of data and computation is too big for FB servers to handle the jobs start to failing. Throttle and call rate metrics don’t reflect this problem and can’t be used to monitor. +When the amount of data and computation is too big for FB servers to handle, the jobs start failing. Throttle and call rate metrics don’t reflect this problem and can’t be used to monitor it. Instead, we use the following technique. Taking into account that we group by ad, we can safely change our from table to a smaller dataset/edge_object (campaign, adset, ad). Empirically we figured out that account-level insights contain data for all campaigns from the last 28 days and, very rarely, campaigns that haven’t even started yet. @@ -92,7 +102,8 @@ create async job for account level insight for the day A get list of campaigns for last 28 day create async job for each campaign and day A ``` + If a campaign-level async job fails a second time, we split it by `AdSets` or `Ads`. -Reports from users show that sometimes async job can stuck for very long time (hours+), +Reports from users show that sometimes an async job can get stuck for a very long time (hours+), and because FB doesn’t provide any canceling API, after 1 hour of waiting we start another job. diff --git a/airbyte-integrations/connectors/source-facebook-pages/README.md b/airbyte-integrations/connectors/source-facebook-pages/README.md index 9349965b5e1..e551e2c1d1c 100644 --- a/airbyte-integrations/connectors/source-facebook-pages/README.md +++ b/airbyte-integrations/connectors/source-facebook-pages/README.md @@ -6,6 +6,7 @@ For information about how to use this connector within Airbyte, see [the documen ## Local development #### Create credentials + **If you are a community contributor**, follow the instructions in the [documentation](https://docs.airbyte.io/integrations/sources/facebook-pages) to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `source_facebook_pages/spec.yaml` file. Note that any directory named `secrets` is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. @@ -16,10 +17,8 @@ and place them into `secrets/config.json`. ### Locally running the connector docker image - - - #### Use `airbyte-ci` to build your connector + The Airbyte way of building this connector is to use our `airbyte-ci` tool. You can follow install instructions [here](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md#L1). Then running the following command will build your connector:
Then running the following command will build your connector: @@ -27,15 +26,18 @@ Then running the following command will build your connector: ```bash airbyte-ci connectors --name source-facebook-pages build ``` + Once the command is done, you will find your connector image in your local docker registry: `airbyte/source-facebook-pages:dev`. ##### Customizing our build process + When contributing on our connector you might need to customize the build process to add a system dependency or set an env var. You can customize our build process by adding a `build_customization.py` module to your connector. This module should contain a `pre_connector_install` and `post_connector_install` async function that will mutate the base image and the connector container respectively. It will be imported at runtime by our build process and the functions will be called if they exist. Here is an example of a `build_customization.py` module: + ```python from __future__ import annotations @@ -55,6 +57,7 @@ async def post_connector_install(connector_container: Container) -> Container: ``` #### Build your own connector image + This connector is built using our dynamic built process in `airbyte-ci`. The base image used to build it is defined within the metadata.yaml file under the `connectorBuildOptions`. The build logic is defined using [Dagger](https://dagger.io/) [here](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/pipelines/builds/python_connectors.py). @@ -63,6 +66,7 @@ It does not rely on a Dockerfile. If you would like to patch our connector and build your own a simple approach would be to: 1. Create your own Dockerfile based on the latest version of the connector image. + ```Dockerfile FROM airbyte/source-facebook-pages:latest @@ -73,16 +77,21 @@ RUN pip install ./airbyte/integration_code # ENV AIRBYTE_ENTRYPOINT "python /airbyte/integration_code/main.py" # ENTRYPOINT ["python", "/airbyte/integration_code/main.py"] ``` + Please use this as an example. This is not optimized. 2. Build your image: + ```bash docker build -t airbyte/source-facebook-pages:dev . # Running the spec command against your patched connector docker run airbyte/source-facebook-pages:dev spec ``` + #### Run + Then run any of the connector commands as follows: + ``` docker run --rm airbyte/source-facebook-pages:dev spec docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-facebook-pages:dev check --config /secrets/config.json @@ -91,23 +100,30 @@ docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integrat ``` ## Testing + You can run our full test suite locally using [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md): + ```bash airbyte-ci connectors --name=source-facebook-pages test ``` ### Customizing acceptance Tests + Customize `acceptance-test-config.yml` file to configure tests. See [Connector Acceptance Tests](https://docs.airbyte.com/connector-development/testing-connectors/connector-acceptance-tests-reference) for more information. If your connector requires to create or destroy resources for use during acceptance tests create fixtures for it and place them inside integration_tests/acceptance.py. ## Dependency Management + All of your dependencies should go in `setup.py`, NOT `requirements.txt`. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development. 
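For reference, a complete `build_customization.py` following the hook contract described in the build-customization section above might look like the sketch below; the environment-variable tweaks are purely illustrative and not required:

```python
# build_customization.py — illustrative sketch of the optional build hooks.
from __future__ import annotations

from typing import TYPE_CHECKING

if TYPE_CHECKING:
    from dagger import Container


async def pre_connector_install(base_image_container: Container) -> Container:
    # Mutate the base image before the connector is installed, e.g. set an env var
    # or install a system dependency.
    return base_image_container.with_env_variable("EXAMPLE_BUILD_FLAG", "1")


async def post_connector_install(connector_container: Container) -> Container:
    # Mutate the final connector container, e.g. tweak runtime configuration.
    return connector_container.with_env_variable("EXAMPLE_RUNTIME_FLAG", "1")
```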
We split dependencies between two groups, dependencies that are: -* required for your connector to work need to go to `MAIN_REQUIREMENTS` list. -* required for the testing need to go to `TEST_REQUIREMENTS` list + +- required for your connector to work need to go to `MAIN_REQUIREMENTS` list. +- required for the testing need to go to `TEST_REQUIREMENTS` list ### Publishing a new version of the connector + You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? + 1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-facebook-pages test` 2. Bump the connector version in `metadata.yaml`: increment the `dockerImageTag` value. Please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors). 3. Make sure the `metadata.yaml` content is up to date. diff --git a/airbyte-integrations/connectors/source-faker/README.md b/airbyte-integrations/connectors/source-faker/README.md index e42d204e2dd..082aeb47b12 100644 --- a/airbyte-integrations/connectors/source-faker/README.md +++ b/airbyte-integrations/connectors/source-faker/README.md @@ -1,31 +1,32 @@ # Faker source connector - This is the repository for the Faker source connector, written in Python. For information about how to use this connector within Airbyte, see [the documentation](https://docs.airbyte.com/integrations/sources/faker). ## Local development ### Prerequisites -* Python (~=3.9) -* Poetry (~=1.7) - installation instructions [here](https://python-poetry.org/docs/#installation) +- Python (~=3.9) +- Poetry (~=1.7) - installation instructions [here](https://python-poetry.org/docs/#installation) ### Installing the connector + From this connector directory, run: + ```bash poetry install --with dev ``` - ### Create credentials + **If you are a community contributor**, follow the instructions in the [documentation](https://docs.airbyte.com/integrations/sources/faker) to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `source_faker/spec.yaml` file. Note that any directory named `secrets` is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. See `sample_files/sample_config.json` for a sample config file. - ### Locally running the connector + ``` poetry run source-faker spec poetry run source-faker check --config secrets/config.json @@ -34,23 +35,28 @@ poetry run source-faker read --config secrets/config.json --catalog sample_files ``` ### Running unit tests + To run unit tests locally, from the connector directory run: + ``` poetry run pytest unit_tests ``` ### Building the docker image + 1. Install [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md) 2. Run the following command to build the docker image: + ```bash airbyte-ci connectors --name=source-faker build ``` An image will be available on your host with the tag `airbyte/source-faker:dev`. 
- ### Running as a docker container + Then run any of the connector commands as follows: + ``` docker run --rm airbyte/source-faker:dev spec docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-faker:dev check --config /secrets/config.json @@ -59,18 +65,23 @@ docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integrat ``` ### Running our CI test suite + You can run our full test suite locally using [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md): + ```bash airbyte-ci connectors --name=source-faker test ``` ### Customizing acceptance Tests + Customize `acceptance-test-config.yml` file to configure acceptance tests. See [Connector Acceptance Tests](https://docs.airbyte.com/connector-development/testing-connectors/connector-acceptance-tests-reference) for more information. If your connector requires to create or destroy resources for use during acceptance tests create fixtures for it and place them inside integration_tests/acceptance.py. ### Dependency Management -All of your dependencies should be managed via Poetry. + +All of your dependencies should be managed via Poetry. To add a new dependency, run: + ```bash poetry add ``` @@ -78,14 +89,16 @@ poetry add Please commit the changes to `pyproject.toml` and `poetry.lock` files. ## Publishing a new version of the connector + You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? + 1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-faker test` -2. Bump the connector version (please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors)): - - bump the `dockerImageTag` value in in `metadata.yaml` - - bump the `version` value in `pyproject.toml` +2. Bump the connector version (please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors)): + - bump the `dockerImageTag` value in in `metadata.yaml` + - bump the `version` value in `pyproject.toml` 3. Make sure the `metadata.yaml` content is up to date. 4. Make sure the connector documentation and its changelog is up to date (`docs/integrations/sources/faker.md`). 5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention). 6. Pat yourself on the back for being an awesome contributor. 7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. -8. Once your PR is merged, the new version of the connector will be automatically published to Docker Hub and our connector registry. \ No newline at end of file +8. Once your PR is merged, the new version of the connector will be automatically published to Docker Hub and our connector registry. diff --git a/airbyte-integrations/connectors/source-fastbill/README.md b/airbyte-integrations/connectors/source-fastbill/README.md index db4cb41ebb0..3b40ddf780b 100644 --- a/airbyte-integrations/connectors/source-fastbill/README.md +++ b/airbyte-integrations/connectors/source-fastbill/README.md @@ -1,31 +1,32 @@ # Fastbill source connector - This is the repository for the Fastbill source connector, written in Python. 
For information about how to use this connector within Airbyte, see [the documentation](https://docs.airbyte.com/integrations/sources/fastbill). ## Local development ### Prerequisites -* Python (~=3.9) -* Poetry (~=1.7) - installation instructions [here](https://python-poetry.org/docs/#installation) +- Python (~=3.9) +- Poetry (~=1.7) - installation instructions [here](https://python-poetry.org/docs/#installation) ### Installing the connector + From this connector directory, run: + ```bash poetry install --with dev ``` - ### Create credentials + **If you are a community contributor**, follow the instructions in the [documentation](https://docs.airbyte.com/integrations/sources/fastbill) to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `source_fastbill/spec.yaml` file. Note that any directory named `secrets` is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. See `sample_files/sample_config.json` for a sample config file. - ### Locally running the connector + ``` poetry run source-fastbill spec poetry run source-fastbill check --config secrets/config.json @@ -34,23 +35,28 @@ poetry run source-fastbill read --config secrets/config.json --catalog sample_fi ``` ### Running unit tests + To run unit tests locally, from the connector directory run: + ``` poetry run pytest unit_tests ``` ### Building the docker image + 1. Install [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md) 2. Run the following command to build the docker image: + ```bash airbyte-ci connectors --name=source-fastbill build ``` An image will be available on your host with the tag `airbyte/source-fastbill:dev`. - ### Running as a docker container + Then run any of the connector commands as follows: + ``` docker run --rm airbyte/source-fastbill:dev spec docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-fastbill:dev check --config /secrets/config.json @@ -59,18 +65,23 @@ docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integrat ``` ### Running our CI test suite + You can run our full test suite locally using [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md): + ```bash airbyte-ci connectors --name=source-fastbill test ``` ### Customizing acceptance Tests + Customize `acceptance-test-config.yml` file to configure acceptance tests. See [Connector Acceptance Tests](https://docs.airbyte.com/connector-development/testing-connectors/connector-acceptance-tests-reference) for more information. If your connector requires to create or destroy resources for use during acceptance tests create fixtures for it and place them inside integration_tests/acceptance.py. ### Dependency Management -All of your dependencies should be managed via Poetry. + +All of your dependencies should be managed via Poetry. To add a new dependency, run: + ```bash poetry add ``` @@ -78,14 +89,16 @@ poetry add Please commit the changes to `pyproject.toml` and `poetry.lock` files. ## Publishing a new version of the connector + You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? + 1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-fastbill test` -2. 
Bump the connector version (please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors)): - - bump the `dockerImageTag` value in in `metadata.yaml` - - bump the `version` value in `pyproject.toml` +2. Bump the connector version (please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors)): + - bump the `dockerImageTag` value in in `metadata.yaml` + - bump the `version` value in `pyproject.toml` 3. Make sure the `metadata.yaml` content is up to date. 4. Make sure the connector documentation and its changelog is up to date (`docs/integrations/sources/fastbill.md`). 5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention). 6. Pat yourself on the back for being an awesome contributor. 7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. -8. Once your PR is merged, the new version of the connector will be automatically published to Docker Hub and our connector registry. \ No newline at end of file +8. Once your PR is merged, the new version of the connector will be automatically published to Docker Hub and our connector registry. diff --git a/airbyte-integrations/connectors/source-fauna/README.md b/airbyte-integrations/connectors/source-fauna/README.md index 5086b513087..7a94853cb0a 100644 --- a/airbyte-integrations/connectors/source-fauna/README.md +++ b/airbyte-integrations/connectors/source-fauna/README.md @@ -9,21 +9,25 @@ it. ## Running locally First, start a local fauna container: + ``` docker run --rm --name faunadb -p 8443:8443 fauna/faunadb ``` In another terminal, cd into the connector directory: + ``` cd airbyte-integrations/connectors/source-fauna ``` Once started the container is up, setup the database: + ``` fauna eval "$(cat examples/setup_database.fql)" --domain localhost --port 8443 --scheme http --secret secret ``` Finally, run the connector: + ``` python main.py spec python main.py check --config examples/config_localhost.json @@ -40,6 +44,7 @@ python main.py read --config examples/config_localhost.json --catalog examples/c ## Running the intergration tests First, cd into the connector directory: + ``` cd airbyte-integrations/connectors/source-fauna ``` @@ -47,16 +52,17 @@ cd airbyte-integrations/connectors/source-fauna The integration tests require a secret config.json. Ping me on slack to get this file. Once you have this file, put it in `secrets/config.json`. A sample of this file can be found at `examples/secret_config.json`. Once the file is created, build the connector: + ``` docker build . -t airbyte/source-fauna:dev ``` Now, run the integration tests: + ``` python -m pytest -p integration_tests.acceptance ``` - # Fauna Source This is the repository for the Fauna source connector, written in Python. @@ -65,22 +71,27 @@ For information about how to use this connector within Airbyte, see [the documen ## Local development ### Prerequisites + **To iterate on this connector, make sure to complete this prerequisites section.** #### Minimum Python version required `= 3.9.0` #### Build & Activate Virtual Environment and install dependencies + From this connector directory, create a virtual environment: + ``` python -m venv .venv ``` This will generate a virtualenv for this module in `.venv/`. 
Make sure this venv is active in your development environment of choice. To activate it from the terminal, run: + ``` source .venv/bin/activate pip install -r requirements.txt ``` + If you are in an IDE, follow your IDE's instructions to activate the virtualenv. Note that while we are installing dependencies from `requirements.txt`, you should only edit `setup.py` for your dependencies. `requirements.txt` is @@ -89,6 +100,7 @@ If this is mumbo jumbo to you, don't worry about it, just put your deps in `setu should work as you expect. #### Create credentials + **If you are a community contributor**, follow the instructions in the [documentation](https://docs.airbyte.io/integrations/sources/fauna) to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `source_fauna/spec.yaml` file. Note that the `secrets` directory is gitignored by default, so there is no danger of accidentally checking in sensitive information. @@ -98,6 +110,7 @@ See `examples/secret_config.json` for a sample config file. and place them into `secrets/config.json`. ### Locally running the connector + ``` python main.py spec python main.py check --config secrets/config.json @@ -107,9 +120,10 @@ python main.py read --config secrets/config.json --catalog integration_tests/con ### Locally running the connector docker image - #### Build + **Via [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md) (recommended):** + ```bash airbyte-ci connectors --name=source-fauna build ``` @@ -117,12 +131,15 @@ airbyte-ci connectors --name=source-fauna build An image will be built with the tag `airbyte/source-fauna:dev`. **Via `docker build`:** + ```bash docker build -t airbyte/source-fauna:dev . ``` #### Run + Then run any of the connector commands as follows: + ``` docker run --rm airbyte/source-fauna:dev spec docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-fauna:dev check --config /secrets/config.json @@ -131,23 +148,30 @@ docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integrat ``` ## Testing + You can run our full test suite locally using [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md): + ```bash airbyte-ci connectors --name=source-fauna test ``` ### Customizing acceptance Tests + Customize `acceptance-test-config.yml` file to configure tests. See [Connector Acceptance Tests](https://docs.airbyte.com/connector-development/testing-connectors/connector-acceptance-tests-reference) for more information. If your connector requires to create or destroy resources for use during acceptance tests create fixtures for it and place them inside integration_tests/acceptance.py. ## Dependency Management + All of your dependencies should go in `setup.py`, NOT `requirements.txt`. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development. We split dependencies between two groups, dependencies that are: -* required for your connector to work need to go to `MAIN_REQUIREMENTS` list. -* required for the testing need to go to `TEST_REQUIREMENTS` list + +- required for your connector to work need to go to `MAIN_REQUIREMENTS` list. +- required for the testing need to go to `TEST_REQUIREMENTS` list ### Publishing a new version of the connector + You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? + 1. 
Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-fauna test` 2. Bump the connector version in `metadata.yaml`: increment the `dockerImageTag` value. Please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors). 3. Make sure the `metadata.yaml` content is up to date. @@ -155,4 +179,3 @@ You've checked out the repo, implemented a million dollar feature, and you're re 5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention). 6. Pat yourself on the back for being an awesome contributor. 7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. - diff --git a/airbyte-integrations/connectors/source-fauna/bootstrap.md b/airbyte-integrations/connectors/source-fauna/bootstrap.md index 50e11fe32be..0e86f032f8c 100644 --- a/airbyte-integrations/connectors/source-fauna/bootstrap.md +++ b/airbyte-integrations/connectors/source-fauna/bootstrap.md @@ -53,4 +53,3 @@ that users know the document has been deleted. Docs: [Events](https://docs.fauna.com/fauna/current/api/fql/functions/events?lang=python). - diff --git a/airbyte-integrations/connectors/source-fauna/overview.md b/airbyte-integrations/connectors/source-fauna/overview.md index 08d2b392e69..2737e3f2194 100644 --- a/airbyte-integrations/connectors/source-fauna/overview.md +++ b/airbyte-integrations/connectors/source-fauna/overview.md @@ -94,8 +94,7 @@ mode, you can easily query for documents that are present at a certain time. Fauna documents have a lot of extra types. These types need to be converted into the Airbyte JSON format. Below is an exhaustive list of how all fauna documents are converted. - -| Fauna Type | Format | Note | +| Fauna Type | Format | Note | | ------------- | ------------------------------------------------------------------- | -------------------------------------------------- | | Document Ref | `{ id: "id", "collection": "collection-name", "type": "document" }` | | | Other Ref | `{ id: "id", "type": "ref-type" }` | This includes collection refs, database refs, etc. | diff --git a/airbyte-integrations/connectors/source-file/README.md b/airbyte-integrations/connectors/source-file/README.md index c313bbb3a0f..ba699471d02 100644 --- a/airbyte-integrations/connectors/source-file/README.md +++ b/airbyte-integrations/connectors/source-file/README.md @@ -1,31 +1,32 @@ # File source connector - This is the repository for the File source connector, written in Python. For information about how to use this connector within Airbyte, see [the documentation](https://docs.airbyte.com/integrations/sources/file). ## Local development ### Prerequisites -* Python (~=3.9) -* Poetry (~=1.7) - installation instructions [here](https://python-poetry.org/docs/#installation) +- Python (~=3.9) +- Poetry (~=1.7) - installation instructions [here](https://python-poetry.org/docs/#installation) ### Installing the connector + From this connector directory, run: + ```bash poetry install --with dev ``` - ### Create credentials + **If you are a community contributor**, follow the instructions in the [documentation](https://docs.airbyte.com/integrations/sources/file) to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `source_file/spec.yaml` file. 
Note that any directory named `secrets` is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. See `sample_files/sample_config.json` for a sample config file. - ### Locally running the connector + ``` poetry run source-file spec poetry run source-file check --config secrets/config.json @@ -34,23 +35,28 @@ poetry run source-file read --config secrets/config.json --catalog sample_files/ ``` ### Running unit tests + To run unit tests locally, from the connector directory run: + ``` poetry run pytest unit_tests ``` ### Building the docker image + 1. Install [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md) 2. Run the following command to build the docker image: + ```bash airbyte-ci connectors --name=source-file build ``` An image will be available on your host with the tag `airbyte/source-file:dev`. - ### Running as a docker container + Then run any of the connector commands as follows: + ``` docker run --rm airbyte/source-file:dev spec docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-file:dev check --config /secrets/config.json @@ -59,18 +65,23 @@ docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integrat ``` ### Running our CI test suite + You can run our full test suite locally using [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md): + ```bash airbyte-ci connectors --name=source-file test ``` ### Customizing acceptance Tests + Customize `acceptance-test-config.yml` file to configure acceptance tests. See [Connector Acceptance Tests](https://docs.airbyte.com/connector-development/testing-connectors/connector-acceptance-tests-reference) for more information. If your connector requires to create or destroy resources for use during acceptance tests create fixtures for it and place them inside integration_tests/acceptance.py. ### Dependency Management -All of your dependencies should be managed via Poetry. + +All of your dependencies should be managed via Poetry. To add a new dependency, run: + ```bash poetry add ``` @@ -78,14 +89,16 @@ poetry add Please commit the changes to `pyproject.toml` and `poetry.lock` files. ## Publishing a new version of the connector + You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? + 1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-file test` -2. Bump the connector version (please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors)): - - bump the `dockerImageTag` value in in `metadata.yaml` - - bump the `version` value in `pyproject.toml` +2. Bump the connector version (please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors)): + - bump the `dockerImageTag` value in in `metadata.yaml` + - bump the `version` value in `pyproject.toml` 3. Make sure the `metadata.yaml` content is up to date. 4. Make sure the connector documentation and its changelog is up to date (`docs/integrations/sources/file.md`). 5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention). 6. 
Pat yourself on the back for being an awesome contributor. 7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. -8. Once your PR is merged, the new version of the connector will be automatically published to Docker Hub and our connector registry. \ No newline at end of file +8. Once your PR is merged, the new version of the connector will be automatically published to Docker Hub and our connector registry. diff --git a/airbyte-integrations/connectors/source-firebase-realtime-database/README.md b/airbyte-integrations/connectors/source-firebase-realtime-database/README.md index f4be44977da..b7f17b454d0 100644 --- a/airbyte-integrations/connectors/source-firebase-realtime-database/README.md +++ b/airbyte-integrations/connectors/source-firebase-realtime-database/README.md @@ -6,22 +6,27 @@ For information about how to use this connector within Airbyte, see [the documen ## Local development ### Prerequisites + **To iterate on this connector, make sure to complete this prerequisites section.** #### Minimum Python version required `= 3.9.0` #### Build & Activate Virtual Environment and install dependencies + From this connector directory, create a virtual environment: + ``` python -m venv .venv ``` This will generate a virtualenv for this module in `.venv/`. Make sure this venv is active in your development environment of choice. To activate it from the terminal, run: + ``` source .venv/bin/activate pip install -r requirements.txt ``` + If you are in an IDE, follow your IDE's instructions to activate the virtualenv. Note that while we are installing dependencies from `requirements.txt`, you should only edit `setup.py` for your dependencies. `requirements.txt` is @@ -30,6 +35,7 @@ If this is mumbo jumbo to you, don't worry about it, just put your deps in `setu should work as you expect. #### Create credentials + **If you are a community contributor**, follow the instructions in the [documentation](https://docs.airbyte.io/integrations/sources/firebase-realtime-database) to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `source_firebase_realtime_database/spec.yaml` file. Note that the `secrets` directory is gitignored by default, so there is no danger of accidentally checking in sensitive information. @@ -39,6 +45,7 @@ See `integration_tests/sample_config.json` for a sample config file. and place them into `secrets/config.json`. ### Locally running the connector + ``` python main.py spec python main.py check --config secrets/config.json @@ -48,9 +55,10 @@ python main.py read --config secrets/config.json --catalog integration_tests/con ### Locally running the connector docker image - #### Build + **Via [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md) (recommended):** + ```bash airbyte-ci connectors --name=source-firebase-realtime-database build ``` @@ -58,12 +66,15 @@ airbyte-ci connectors --name=source-firebase-realtime-database build An image will be built with the tag `airbyte/source-firebase-realtime-database:dev`. **Via `docker build`:** + ```bash docker build -t airbyte/source-firebase-realtime-database:dev . 
``` #### Run + Then run any of the connector commands as follows: + ``` docker run --rm airbyte/source-firebase-realtime-database:dev spec docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-firebase-realtime-database:dev check --config /secrets/config.json @@ -72,23 +83,30 @@ docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integrat ``` ## Testing + You can run our full test suite locally using [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md): + ```bash airbyte-ci connectors --name=source-firebase-realtime-database test ``` ### Customizing acceptance Tests + Customize `acceptance-test-config.yml` file to configure tests. See [Connector Acceptance Tests](https://docs.airbyte.com/connector-development/testing-connectors/connector-acceptance-tests-reference) for more information. If your connector requires to create or destroy resources for use during acceptance tests create fixtures for it and place them inside integration_tests/acceptance.py. ## Dependency Management + All of your dependencies should go in `setup.py`, NOT `requirements.txt`. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development. We split dependencies between two groups, dependencies that are: -* required for your connector to work need to go to `MAIN_REQUIREMENTS` list. -* required for the testing need to go to `TEST_REQUIREMENTS` list + +- required for your connector to work need to go to `MAIN_REQUIREMENTS` list. +- required for the testing need to go to `TEST_REQUIREMENTS` list ### Publishing a new version of the connector + You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? + 1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-firebase-realtime-database test` 2. Bump the connector version in `metadata.yaml`: increment the `dockerImageTag` value. Please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors). 3. Make sure the `metadata.yaml` content is up to date. @@ -96,4 +114,3 @@ You've checked out the repo, implemented a million dollar feature, and you're re 5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention). 6. Pat yourself on the back for being an awesome contributor. 7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. - diff --git a/airbyte-integrations/connectors/source-firebase-realtime-database/bootstrap.md b/airbyte-integrations/connectors/source-firebase-realtime-database/bootstrap.md index b2298d84069..71ac603e48a 100644 --- a/airbyte-integrations/connectors/source-firebase-realtime-database/bootstrap.md +++ b/airbyte-integrations/connectors/source-firebase-realtime-database/bootstrap.md @@ -1,23 +1,24 @@ ## Firebase Realtime Database database structure and API specification + Firebase Realtime Database’s database is a JSON tree. The database is specified by URL “https://{database-name}.firebaseio.com/”. 
If we have data in the database "https://my-database.firebaseio.com/" as below, ```json { - "my-data": { - "dinosaurs": { - "lambeosaurus": { - "height": 2.1, - "length": 12.5, - "weight": 5000 - }, - "stegosaurus": { - "height": 4, - "length": 9, - "weight": 2500 - } - } + "my-data": { + "dinosaurs": { + "lambeosaurus": { + "height": 2.1, + "length": 12.5, + "weight": 5000 + }, + "stegosaurus": { + "height": 4, + "length": 9, + "weight": 2500 + } } + } } ``` @@ -26,16 +27,16 @@ Then it returns data as follows, ```json { - "lambeosaurus": { - "height": 2.1, - "length": 12.5, - "weight": 5000 - }, - "stegosaurus": { - "height": 4, - "length": 9, - "weight": 2500 - } + "lambeosaurus": { + "height": 2.1, + "length": 12.5, + "weight": 5000 + }, + "stegosaurus": { + "height": 4, + "length": 9, + "weight": 2500 + } } ``` @@ -50,7 +51,9 @@ For example, in the above case, it emits records like below. The connector sync only one stream specified by the path user configured. In the above case, if user set database_name="my-database" and path="my-data/dinosaurs", the stream is "dinosaurs" only. ## Authentication + This connector authenticates with a Google Cloud's service-account with the "Firebase Realtime Database Viewer" roles, which grants permissions to read from Firebase Realtime Database. ## Source Acceptance Test specification + We register the test data in the database before executing the source acceptance test. The test data to be registered is `integration_tests/records.json`. We delete all records after test execution. Data registration and deletion are performed via REST API using curl, but since OAuth2 authentication is performed using a Google Cloud's service-account, an access token is obtained using the gcloud command. Therefore, these processes are executed on the `cloudsdktool/google-cloud-cli` container. diff --git a/airbyte-integrations/connectors/source-firebolt/README.md b/airbyte-integrations/connectors/source-firebolt/README.md index 6a517492a9e..2f338f540b5 100644 --- a/airbyte-integrations/connectors/source-firebolt/README.md +++ b/airbyte-integrations/connectors/source-firebolt/README.md @@ -6,22 +6,27 @@ For information about how to use this connector within Airbyte, see [the documen ## Local development ### Prerequisites + **To iterate on this connector, make sure to complete this prerequisites section.** #### Minimum Python version required `= 3.7.0` #### Build & Activate Virtual Environment and install dependencies + From this connector directory, create a virtual environment: + ``` python -m venv .venv ``` This will generate a virtualenv for this module in `.venv/`. Make sure this venv is active in your development environment of choice. To activate it from the terminal, run: + ``` source .venv/bin/activate pip install -r requirements.txt ``` + If you are in an IDE, follow your IDE's instructions to activate the virtualenv. Note that while we are installing dependencies from `requirements.txt`, you should only edit `setup.py` for your dependencies. `requirements.txt` is @@ -30,6 +35,7 @@ If this is mumbo jumbo to you, don't worry about it, just put your deps in `setu should work as you expect. #### Create credentials + **If you are a community contributor**, follow the instructions in the [documentation](https://docs.airbyte.io/integrations/sources/firebolt) to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `source_firebolt/spec.json` file. 
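To illustrate the REST access pattern described in the Firebase Realtime Database bootstrap notes above, here is a rough sketch; the database URL, path, and record shape are illustrative, and the real connector derives them from its configuration and schema:

```python
# Sketch only: read one path from a Firebase Realtime Database over REST with an OAuth2 token.
import requests

database_url = "https://my-database.firebaseio.com"   # from the connector config
path = "my-data/dinosaurs"                            # the configured path (one stream)
access_token = "<oauth2-token-from-service-account>"  # obtained with google-auth in practice

resp = requests.get(f"{database_url}/{path}.json", params={"access_token": access_token})
resp.raise_for_status()

# The response maps each child key to its value; the connector emits one record per child.
for key, value in resp.json().items():
    print({"key": key, "value": value})
```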
Note that the `secrets` directory is gitignored by default, so there is no danger of accidentally checking in sensitive information. @@ -39,6 +45,7 @@ See `integration_tests/sample_config.json` for a sample config file. and place them into `secrets/config.json`. ### Locally running the connector + ``` python main.py spec python main.py check --config secrets/config.json @@ -48,9 +55,10 @@ python main.py read --config secrets/config.json --catalog integration_tests/con ### Locally running the connector docker image - #### Build + **Via [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md) (recommended):** + ```bash airbyte-ci connectors --name=source-firebolt build ``` @@ -58,12 +66,15 @@ airbyte-ci connectors --name=source-firebolt build An image will be built with the tag `airbyte/source-firebolt:dev`. **Via `docker build`:** + ```bash docker build -t airbyte/source-firebolt:dev . ``` #### Run + Then run any of the connector commands as follows: + ``` docker run --rm airbyte/source-firebolt:dev spec docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-firebolt:dev check --config /secrets/config.json @@ -72,23 +83,30 @@ docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integrat ``` ## Testing + You can run our full test suite locally using [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md): + ```bash airbyte-ci connectors --name=source-firebolt test ``` ### Customizing acceptance Tests + Customize `acceptance-test-config.yml` file to configure tests. See [Connector Acceptance Tests](https://docs.airbyte.com/connector-development/testing-connectors/connector-acceptance-tests-reference) for more information. If your connector requires to create or destroy resources for use during acceptance tests create fixtures for it and place them inside integration_tests/acceptance.py. ## Dependency Management + All of your dependencies should go in `setup.py`, NOT `requirements.txt`. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development. We split dependencies between two groups, dependencies that are: -* required for your connector to work need to go to `MAIN_REQUIREMENTS` list. -* required for the testing need to go to `TEST_REQUIREMENTS` list + +- required for your connector to work need to go to `MAIN_REQUIREMENTS` list. +- required for the testing need to go to `TEST_REQUIREMENTS` list ### Publishing a new version of the connector + You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? + 1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-firebolt test` 2. Bump the connector version in `metadata.yaml`: increment the `dockerImageTag` value. Please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors). 3. Make sure the `metadata.yaml` content is up to date. @@ -96,4 +114,3 @@ You've checked out the repo, implemented a million dollar feature, and you're re 5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention). 6. Pat yourself on the back for being an awesome contributor. 7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. 
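The `integration_tests/acceptance.py` hook mentioned in the Testing section above typically follows the skeleton below. This is a sketch; the plugin name and fixture body depend on how the connector's acceptance tests are wired up:

```python
# integration_tests/acceptance.py — illustrative skeleton for connector-specific test fixtures.
import pytest

pytest_plugins = ("connector_acceptance_test.plugin",)


@pytest.fixture(scope="session", autouse=True)
def connector_setup():
    """Create any external resources the acceptance tests need, then clean them up afterwards."""
    # setup: e.g. create a test database or seed records (connector-specific)
    yield
    # teardown: e.g. delete the resources created above
```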
- diff --git a/airbyte-integrations/connectors/source-firebolt/bootstrap.md b/airbyte-integrations/connectors/source-firebolt/bootstrap.md index 3635bc17e1e..89027b1feea 100644 --- a/airbyte-integrations/connectors/source-firebolt/bootstrap.md +++ b/airbyte-integrations/connectors/source-firebolt/bootstrap.md @@ -2,7 +2,7 @@ ## Overview -Firebolt is a cloud data warehouse purpose-built to provide sub-second analytics performance on massive, terabyte-scale data sets. +Firebolt is a cloud data warehouse purpose-built to provide sub-second analytics performance on massive, terabyte-scale data sets. Firebolt has two main concepts: Databases, which denote the storage of data and Engines, which describe the compute layer on top of a Database. @@ -15,7 +15,7 @@ This connector uses [firebolt-sdk](https://pypi.org/project/firebolt-sdk/), whic ## Notes -* External tables are not available as a source for performance reasons. -* Only Full reads are supported for now. -* Integration/Acceptance testing requires the user to have a running engine. Spinning up an engine can take a while so this ensures a faster iteration on the connector. -* Pagination is not available at the moment so large enough data sets might cause out of memory errors \ No newline at end of file +- External tables are not available as a source for performance reasons. +- Only Full reads are supported for now. +- Integration/Acceptance testing requires the user to have a running engine. Spinning up an engine can take a while so this ensures a faster iteration on the connector. +- Pagination is not available at the moment so large enough data sets might cause out of memory errors diff --git a/airbyte-integrations/connectors/source-flexport/README.md b/airbyte-integrations/connectors/source-flexport/README.md index 1ebd5343400..fc800ca8268 100644 --- a/airbyte-integrations/connectors/source-flexport/README.md +++ b/airbyte-integrations/connectors/source-flexport/README.md @@ -6,6 +6,7 @@ For information about how to use this connector within Airbyte, see [the documen ## Local development #### Create credentials + **If you are a community contributor**, follow the instructions in the [documentation](https://docs.airbyte.com/integrations/sources/flexport) to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `source_flexport/spec.yaml` file. Note that any directory named `secrets` is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. @@ -16,9 +17,10 @@ and place them into `secrets/config.json`. ### Locally running the connector docker image - #### Build + **Via [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md) (recommended):** + ```bash airbyte-ci connectors --name=source-flexport build ``` @@ -26,12 +28,15 @@ airbyte-ci connectors --name=source-flexport build An image will be built with the tag `airbyte/source-flexport:dev`. **Via `docker build`:** + ```bash docker build -t airbyte/source-flexport:dev . 
``` #### Run + Then run any of the connector commands as follows: + ``` docker run --rm airbyte/source-flexport:dev spec docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-flexport:dev check --config /secrets/config.json @@ -40,23 +45,30 @@ docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integrat ``` ## Testing + You can run our full test suite locally using [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md): + ```bash airbyte-ci connectors --name=source-flexport test ``` ### Customizing acceptance Tests + Customize `acceptance-test-config.yml` file to configure tests. See [Connector Acceptance Tests](https://docs.airbyte.com/connector-development/testing-connectors/connector-acceptance-tests-reference) for more information. If your connector requires to create or destroy resources for use during acceptance tests create fixtures for it and place them inside integration_tests/acceptance.py. ## Dependency Management + All of your dependencies should go in `setup.py`, NOT `requirements.txt`. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development. We split dependencies between two groups, dependencies that are: -* required for your connector to work need to go to `MAIN_REQUIREMENTS` list. -* required for the testing need to go to `TEST_REQUIREMENTS` list + +- required for your connector to work need to go to `MAIN_REQUIREMENTS` list. +- required for the testing need to go to `TEST_REQUIREMENTS` list ### Publishing a new version of the connector + You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? + 1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-flexport test` 2. Bump the connector version in `metadata.yaml`: increment the `dockerImageTag` value. Please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors). 3. Make sure the `metadata.yaml` content is up to date. @@ -64,4 +76,3 @@ You've checked out the repo, implemented a million dollar feature, and you're re 5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention). 6. Pat yourself on the back for being an awesome contributor. 7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. - diff --git a/airbyte-integrations/connectors/source-flexport/bootstrap.md b/airbyte-integrations/connectors/source-flexport/bootstrap.md index 8c4fbffc79e..148f5e83daf 100644 --- a/airbyte-integrations/connectors/source-flexport/bootstrap.md +++ b/airbyte-integrations/connectors/source-flexport/bootstrap.md @@ -4,10 +4,10 @@ Flexport is a straightforward CRUD REST [API](https://developers.flexport.com/s/ API documentation is either outdated or incomplete. The issues are following: -1) Some resources that get embedded by default are not documented at all. However, since the schema of all resources follows the same pattern, their schema can be easily deduced too. -2) The documentation doesn't specify which properties are nullable - trial and error is the only way to learn that. -3) Some properties' type is ambiguous, i.e., `create` action specifies a property as required while `read` returns a nullable value. 
-4) The type of some properties is mislabeled, e.g., `integer` instead of an actual `string` type. +1. Some resources that get embedded by default are not documented at all. However, since the schema of all resources follows the same pattern, their schema can be easily deduced too. +2. The documentation doesn't specify which properties are nullable - trial and error is the only way to learn that. +3. Some properties' type is ambiguous, i.e., `create` action specifies a property as required while `read` returns a nullable value. +4. The type of some properties is mislabeled, e.g., `integer` instead of an actual `string` type. Authentication uses a pre-created API token which can be [created in the UI](https://apidocs.flexport.com/reference/authentication). diff --git a/airbyte-integrations/connectors/source-freshcaller/README.md b/airbyte-integrations/connectors/source-freshcaller/README.md index f50f7789a1c..133d98974a8 100644 --- a/airbyte-integrations/connectors/source-freshcaller/README.md +++ b/airbyte-integrations/connectors/source-freshcaller/README.md @@ -6,23 +6,28 @@ For information about how to use this connector within Airbyte, see [the documen ## Local development ### Prerequisites + **To iterate on this connector, make sure to complete this prerequisites section.** #### Minimum Python version required `= 3.7.0` #### Build & Activate Virtual Environment and install dependencies + From this connector directory, create a virtual environment: + ``` python -m venv .venv ``` This will generate a virtualenv for this module in `.venv/`. Make sure this venv is active in your development environment of choice. To activate it from the terminal, run: + ``` source .venv/bin/activate pip install -r requirements.txt pip install '.[tests]' ``` + If you are in an IDE, follow your IDE's instructions to activate the virtualenv. Note that while we are installing dependencies from `requirements.txt`, you should only edit `setup.py` for your dependencies. `requirements.txt` is @@ -31,6 +36,7 @@ If this is mumbo jumbo to you, don't worry about it, just put your deps in `setu should work as you expect. #### Create credentials + **If you are a community contributor**, follow the instructions in the [documentation](https://docs.airbyte.io/integrations/sources/freshcaller) to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `source_freshcaller/spec.json` file. Note that any directory named `secrets` is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. @@ -40,6 +46,7 @@ See `integration_tests/sample_config.json` for a sample config file. and place them into `secrets/config.json`. ### Locally running the connector + ``` python main.py spec python main.py check --config secrets/config.json @@ -49,9 +56,10 @@ python main.py read --config secrets/config.json --catalog integration_tests/con ### Locally running the connector docker image - #### Build + **Via [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md) (recommended):** + ```bash airbyte-ci connectors --name=source-freshcaller build ``` @@ -59,12 +67,15 @@ airbyte-ci connectors --name=source-freshcaller build An image will be built with the tag `airbyte/source-freshcaller:dev`. **Via `docker build`:** + ```bash docker build -t airbyte/source-freshcaller:dev . 
``` #### Run + Then run any of the connector commands as follows: + ``` docker run --rm airbyte/source-freshcaller:dev spec docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-freshcaller:dev check --config /secrets/config.json @@ -73,23 +84,30 @@ docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integrat ``` ## Testing + You can run our full test suite locally using [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md): + ```bash airbyte-ci connectors --name=source-freshcaller test ``` ### Customizing acceptance Tests + Customize `acceptance-test-config.yml` file to configure tests. See [Connector Acceptance Tests](https://docs.airbyte.com/connector-development/testing-connectors/connector-acceptance-tests-reference) for more information. If your connector requires to create or destroy resources for use during acceptance tests create fixtures for it and place them inside integration_tests/acceptance.py. ## Dependency Management + All of your dependencies should go in `setup.py`, NOT `requirements.txt`. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development. We split dependencies between two groups, dependencies that are: -* required for your connector to work need to go to `MAIN_REQUIREMENTS` list. -* required for the testing need to go to `TEST_REQUIREMENTS` list + +- required for your connector to work need to go to `MAIN_REQUIREMENTS` list. +- required for the testing need to go to `TEST_REQUIREMENTS` list ### Publishing a new version of the connector + You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? + 1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-freshcaller test` 2. Bump the connector version in `metadata.yaml`: increment the `dockerImageTag` value. Please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors). 3. Make sure the `metadata.yaml` content is up to date. diff --git a/airbyte-integrations/connectors/source-freshsales/README.md b/airbyte-integrations/connectors/source-freshsales/README.md index c4f2450e08e..a7e8beeee88 100644 --- a/airbyte-integrations/connectors/source-freshsales/README.md +++ b/airbyte-integrations/connectors/source-freshsales/README.md @@ -6,23 +6,28 @@ For information about how to use this connector within Airbyte, see [the documen ## Local development ### Prerequisites + **To iterate on this connector, make sure to complete this prerequisites section.** #### Minimum Python version required `= 3.7.0` #### Build & Activate Virtual Environment and install dependencies + From this connector directory, create a virtual environment: + ``` python -m venv .venv ``` This will generate a virtualenv for this module in `.venv/`. Make sure this venv is active in your development environment of choice. To activate it from the terminal, run: + ``` source .venv/bin/activate pip install -r requirements.txt pip install '.[tests]' ``` + If you are in an IDE, follow your IDE's instructions to activate the virtualenv. Note that while we are installing dependencies from `requirements.txt`, you should only edit `setup.py` for your dependencies. `requirements.txt` is @@ -31,6 +36,7 @@ If this is mumbo jumbo to you, don't worry about it, just put your deps in `setu should work as you expect. 
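For connectors on this older template, the `MAIN_REQUIREMENTS` / `TEST_REQUIREMENTS` split mentioned under Dependency Management usually lives in a `setup.py` along these lines; this is only a sketch, and the package name and version pins are illustrative:

```python
# setup.py — illustrative sketch of the dependency split described in this README.
from setuptools import find_packages, setup

MAIN_REQUIREMENTS = [
    "airbyte-cdk",   # runtime dependencies the connector needs to work
    "requests",
]

TEST_REQUIREMENTS = [
    "pytest~=6.2",   # dependencies only needed for testing
    "requests-mock",
]

setup(
    name="source_example",  # hypothetical package name
    packages=find_packages(exclude=("unit_tests", "integration_tests")),
    install_requires=MAIN_REQUIREMENTS,
    extras_require={"tests": TEST_REQUIREMENTS},
)
```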
#### Create credentials + **If you are a community contributor**, follow the instructions in the [documentation](https://docs.airbyte.io/integrations/sources/freshsales) to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `source_freshsales/spec.json` file. Note that any directory named `secrets` is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. @@ -40,6 +46,7 @@ See `integration_tests/sample_config.json` for a sample config file. and place them into `secrets/config.json`. ### Locally running the connector + ``` python main.py spec python main.py check --config secrets/config.json @@ -49,9 +56,10 @@ python main.py read --config secrets/config.json --catalog integration_tests/con ### Locally running the connector docker image - #### Build + **Via [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md) (recommended):** + ```bash airbyte-ci connectors --name=source-freshsales build ``` @@ -59,12 +67,15 @@ airbyte-ci connectors --name=source-freshsales build An image will be built with the tag `airbyte/source-freshsales:dev`. **Via `docker build`:** + ```bash docker build -t airbyte/source-freshsales:dev . ``` #### Run + Then run any of the connector commands as follows: + ``` docker run --rm airbyte/source-freshsales:dev spec docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-freshsales:dev check --config /secrets/config.json @@ -73,23 +84,30 @@ docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integrat ``` ## Testing + You can run our full test suite locally using [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md): + ```bash airbyte-ci connectors --name=source-freshsales test ``` ### Customizing acceptance Tests + Customize `acceptance-test-config.yml` file to configure tests. See [Connector Acceptance Tests](https://docs.airbyte.com/connector-development/testing-connectors/connector-acceptance-tests-reference) for more information. If your connector requires to create or destroy resources for use during acceptance tests create fixtures for it and place them inside integration_tests/acceptance.py. ## Dependency Management + All of your dependencies should go in `setup.py`, NOT `requirements.txt`. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development. We split dependencies between two groups, dependencies that are: -* required for your connector to work need to go to `MAIN_REQUIREMENTS` list. -* required for the testing need to go to `TEST_REQUIREMENTS` list + +- required for your connector to work need to go to `MAIN_REQUIREMENTS` list. +- required for the testing need to go to `TEST_REQUIREMENTS` list ### Publishing a new version of the connector + You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? + 1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-freshsales test` 2. Bump the connector version in `metadata.yaml`: increment the `dockerImageTag` value. Please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors). 3. Make sure the `metadata.yaml` content is up to date. 
diff --git a/airbyte-integrations/connectors/source-freshservice/README.md b/airbyte-integrations/connectors/source-freshservice/README.md index fb9083f71f7..a350a4d9d9b 100644 --- a/airbyte-integrations/connectors/source-freshservice/README.md +++ b/airbyte-integrations/connectors/source-freshservice/README.md @@ -1,31 +1,32 @@ # Freshservice source connector - This is the repository for the Freshservice source connector, written in Python. For information about how to use this connector within Airbyte, see [the documentation](https://docs.airbyte.com/integrations/sources/freshservice). ## Local development ### Prerequisites -* Python (~=3.9) -* Poetry (~=1.7) - installation instructions [here](https://python-poetry.org/docs/#installation) +- Python (~=3.9) +- Poetry (~=1.7) - installation instructions [here](https://python-poetry.org/docs/#installation) ### Installing the connector + From this connector directory, run: + ```bash poetry install --with dev ``` - ### Create credentials + **If you are a community contributor**, follow the instructions in the [documentation](https://docs.airbyte.com/integrations/sources/freshservice) to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `source_freshservice/spec.yaml` file. Note that any directory named `secrets` is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. See `sample_files/sample_config.json` for a sample config file. - ### Locally running the connector + ``` poetry run source-freshservice spec poetry run source-freshservice check --config secrets/config.json @@ -34,23 +35,28 @@ poetry run source-freshservice read --config secrets/config.json --catalog sampl ``` ### Running unit tests + To run unit tests locally, from the connector directory run: + ``` poetry run pytest unit_tests ``` ### Building the docker image + 1. Install [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md) 2. Run the following command to build the docker image: + ```bash airbyte-ci connectors --name=source-freshservice build ``` An image will be available on your host with the tag `airbyte/source-freshservice:dev`. - ### Running as a docker container + Then run any of the connector commands as follows: + ``` docker run --rm airbyte/source-freshservice:dev spec docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-freshservice:dev check --config /secrets/config.json @@ -59,18 +65,23 @@ docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integrat ``` ### Running our CI test suite + You can run our full test suite locally using [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md): + ```bash airbyte-ci connectors --name=source-freshservice test ``` ### Customizing acceptance Tests + Customize `acceptance-test-config.yml` file to configure acceptance tests. See [Connector Acceptance Tests](https://docs.airbyte.com/connector-development/testing-connectors/connector-acceptance-tests-reference) for more information. If your connector requires to create or destroy resources for use during acceptance tests create fixtures for it and place them inside integration_tests/acceptance.py. ### Dependency Management -All of your dependencies should be managed via Poetry. + +All of your dependencies should be managed via Poetry. 
To add a new dependency, run: + ```bash poetry add ``` @@ -78,14 +89,16 @@ poetry add Please commit the changes to `pyproject.toml` and `poetry.lock` files. ## Publishing a new version of the connector + You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? + 1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-freshservice test` -2. Bump the connector version (please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors)): - - bump the `dockerImageTag` value in in `metadata.yaml` - - bump the `version` value in `pyproject.toml` +2. Bump the connector version (please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors)): + - bump the `dockerImageTag` value in in `metadata.yaml` + - bump the `version` value in `pyproject.toml` 3. Make sure the `metadata.yaml` content is up to date. 4. Make sure the connector documentation and its changelog is up to date (`docs/integrations/sources/freshservice.md`). 5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention). 6. Pat yourself on the back for being an awesome contributor. 7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. -8. Once your PR is merged, the new version of the connector will be automatically published to Docker Hub and our connector registry. \ No newline at end of file +8. Once your PR is merged, the new version of the connector will be automatically published to Docker Hub and our connector registry. diff --git a/airbyte-integrations/connectors/source-fullstory/README.md b/airbyte-integrations/connectors/source-fullstory/README.md index 8780501576a..2bc631aefa4 100644 --- a/airbyte-integrations/connectors/source-fullstory/README.md +++ b/airbyte-integrations/connectors/source-fullstory/README.md @@ -6,6 +6,7 @@ For information about how to use this connector within Airbyte, see [the documen ## Local development #### Create credentials + **If you are a community contributor**, follow the instructions in the [documentation](https://docs.airbyte.com/integrations/sources/fullstory) to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `source_fullstory/spec.yaml` file. Note that any directory named `secrets` is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. @@ -16,9 +17,10 @@ and place them into `secrets/config.json`. ### Locally running the connector docker image - #### Build + **Via [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md) (recommended):** + ```bash airbyte-ci connectors --name=source-fullstory build ``` @@ -26,12 +28,15 @@ airbyte-ci connectors --name=source-fullstory build An image will be built with the tag `airbyte/source-fullstory:dev`. **Via `docker build`:** + ```bash docker build -t airbyte/source-fullstory:dev . 
``` #### Run + Then run any of the connector commands as follows: + ``` docker run --rm airbyte/source-fullstory:dev spec docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-fullstory:dev check --config /secrets/config.json @@ -40,23 +45,30 @@ docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integrat ``` ## Testing + You can run our full test suite locally using [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md): + ```bash airbyte-ci connectors --name=source-fullstory test ``` ### Customizing acceptance Tests + Customize `acceptance-test-config.yml` file to configure tests. See [Connector Acceptance Tests](https://docs.airbyte.com/connector-development/testing-connectors/connector-acceptance-tests-reference) for more information. If your connector requires to create or destroy resources for use during acceptance tests create fixtures for it and place them inside integration_tests/acceptance.py. ## Dependency Management + All of your dependencies should go in `setup.py`, NOT `requirements.txt`. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development. We split dependencies between two groups, dependencies that are: -* required for your connector to work need to go to `MAIN_REQUIREMENTS` list. -* required for the testing need to go to `TEST_REQUIREMENTS` list + +- required for your connector to work need to go to `MAIN_REQUIREMENTS` list. +- required for the testing need to go to `TEST_REQUIREMENTS` list ### Publishing a new version of the connector + You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? + 1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-fullstory test` 2. Bump the connector version in `metadata.yaml`: increment the `dockerImageTag` value. Please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors). 3. Make sure the `metadata.yaml` content is up to date. @@ -64,4 +76,3 @@ You've checked out the repo, implemented a million dollar feature, and you're re 5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention). 6. Pat yourself on the back for being an awesome contributor. 7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. - diff --git a/airbyte-integrations/connectors/source-gainsight-px/README.md b/airbyte-integrations/connectors/source-gainsight-px/README.md index 5504ac34382..d5930670922 100644 --- a/airbyte-integrations/connectors/source-gainsight-px/README.md +++ b/airbyte-integrations/connectors/source-gainsight-px/README.md @@ -6,6 +6,7 @@ For information about how to use this connector within Airbyte, see [the documen ## Local development #### Create credentials + **If you are a community contributor**, follow the instructions in the [documentation](https://docs.airbyte.com/integrations/sources/gainsight-px) to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `source_gainsight_px/spec.yaml` file. Note that any directory named `secrets` is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. 
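Purely as an illustration of the expected shape (the key below is a placeholder; the real keys are defined by `source_gainsight_px/spec.yaml`), creating the config file could look like:

```bash
mkdir -p secrets
cat > secrets/config.json <<'EOF'
{
  "api_key": "<your API key>"
}
EOF
```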
@@ -16,9 +17,10 @@ and place them into `secrets/config.json`. ### Locally running the connector docker image - #### Build + **Via [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md) (recommended):** + ```bash airbyte-ci connectors --name=source-gainsight-px build ``` @@ -26,12 +28,15 @@ airbyte-ci connectors --name=source-gainsight-px build An image will be built with the tag `airbyte/source-gainsight-px:dev`. **Via `docker build`:** + ```bash docker build -t airbyte/source-gainsight-px:dev . ``` #### Run + Then run any of the connector commands as follows: + ``` docker run --rm airbyte/source-gainsight-px:dev spec docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-gainsight-px:dev check --config /secrets/config.json @@ -40,23 +45,30 @@ docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integrat ``` ## Testing + You can run our full test suite locally using [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md): + ```bash airbyte-ci connectors --name=source-gainsight-px test ``` ### Customizing acceptance Tests + Customize `acceptance-test-config.yml` file to configure tests. See [Connector Acceptance Tests](https://docs.airbyte.com/connector-development/testing-connectors/connector-acceptance-tests-reference) for more information. If your connector requires to create or destroy resources for use during acceptance tests create fixtures for it and place them inside integration_tests/acceptance.py. ## Dependency Management + All of your dependencies should go in `setup.py`, NOT `requirements.txt`. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development. We split dependencies between two groups, dependencies that are: -* required for your connector to work need to go to `MAIN_REQUIREMENTS` list. -* required for the testing need to go to `TEST_REQUIREMENTS` list + +- required for your connector to work need to go to `MAIN_REQUIREMENTS` list. +- required for the testing need to go to `TEST_REQUIREMENTS` list ### Publishing a new version of the connector + You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? + 1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-gainsight-px test` 2. Bump the connector version in `metadata.yaml`: increment the `dockerImageTag` value. Please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors). 3. Make sure the `metadata.yaml` content is up to date. @@ -64,4 +76,3 @@ You've checked out the repo, implemented a million dollar feature, and you're re 5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention). 6. Pat yourself on the back for being an awesome contributor. 7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. 
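For the acceptance-test fixtures mentioned above, `integration_tests/acceptance.py` typically follows the standard template; a minimal sketch, assuming the `connector_acceptance_test` pytest plugin that ships with the acceptance-test tooling, is:

```python
# integration_tests/acceptance.py -- minimal sketch; add real setup/teardown only if the connector needs it
import pytest

pytest_plugins = ("connector_acceptance_test.plugin",)


@pytest.fixture(scope="session", autouse=True)
def connector_setup():
    """Create any external resources the acceptance tests require, then clean them up afterwards."""
    # set up test resources here, if needed
    yield
    # tear down test resources here
```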
- diff --git a/airbyte-integrations/connectors/source-gcs/README.md b/airbyte-integrations/connectors/source-gcs/README.md index 119172d7c35..7f2cee475fc 100644 --- a/airbyte-integrations/connectors/source-gcs/README.md +++ b/airbyte-integrations/connectors/source-gcs/README.md @@ -1,31 +1,32 @@ # Gcs source connector - This is the repository for the Gcs source connector, written in Python. For information about how to use this connector within Airbyte, see [the documentation](https://docs.airbyte.com/integrations/sources/gcs). ## Local development ### Prerequisites -* Python (~=3.9) -* Poetry (~=1.7) - installation instructions [here](https://python-poetry.org/docs/#installation) +- Python (~=3.9) +- Poetry (~=1.7) - installation instructions [here](https://python-poetry.org/docs/#installation) ### Installing the connector + From this connector directory, run: + ```bash poetry install --with dev ``` - ### Create credentials + **If you are a community contributor**, follow the instructions in the [documentation](https://docs.airbyte.com/integrations/sources/gcs) to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `source_gcs/spec.yaml` file. Note that any directory named `secrets` is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. See `sample_files/sample_config.json` for a sample config file. - ### Locally running the connector + ``` poetry run source-gcs spec poetry run source-gcs check --config secrets/config.json @@ -34,23 +35,28 @@ poetry run source-gcs read --config secrets/config.json --catalog sample_files/c ``` ### Running unit tests + To run unit tests locally, from the connector directory run: + ``` poetry run pytest unit_tests ``` ### Building the docker image + 1. Install [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md) 2. Run the following command to build the docker image: + ```bash airbyte-ci connectors --name=source-gcs build ``` An image will be available on your host with the tag `airbyte/source-gcs:dev`. - ### Running as a docker container + Then run any of the connector commands as follows: + ``` docker run --rm airbyte/source-gcs:dev spec docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-gcs:dev check --config /secrets/config.json @@ -59,18 +65,23 @@ docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integrat ``` ### Running our CI test suite + You can run our full test suite locally using [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md): + ```bash airbyte-ci connectors --name=source-gcs test ``` ### Customizing acceptance Tests + Customize `acceptance-test-config.yml` file to configure acceptance tests. See [Connector Acceptance Tests](https://docs.airbyte.com/connector-development/testing-connectors/connector-acceptance-tests-reference) for more information. If your connector requires to create or destroy resources for use during acceptance tests create fixtures for it and place them inside integration_tests/acceptance.py. ### Dependency Management -All of your dependencies should be managed via Poetry. + +All of your dependencies should be managed via Poetry. To add a new dependency, run: + ```bash poetry add ``` @@ -78,14 +89,16 @@ poetry add Please commit the changes to `pyproject.toml` and `poetry.lock` files. 
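As an illustration (the package names below are placeholders, not actual dependencies of this connector), adding a runtime dependency and a dev-only dependency looks like:

```bash
# add a runtime dependency (placeholder package name)
poetry add requests

# add a test-only dependency to the dev group (placeholder package name)
poetry add --group dev pytest-mock

# both commands update pyproject.toml and poetry.lock; commit those changes
git add pyproject.toml poetry.lock
```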
## Publishing a new version of the connector + You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? + 1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-gcs test` -2. Bump the connector version (please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors)): - - bump the `dockerImageTag` value in in `metadata.yaml` - - bump the `version` value in `pyproject.toml` +2. Bump the connector version (please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors)): + - bump the `dockerImageTag` value in in `metadata.yaml` + - bump the `version` value in `pyproject.toml` 3. Make sure the `metadata.yaml` content is up to date. 4. Make sure the connector documentation and its changelog is up to date (`docs/integrations/sources/gcs.md`). 5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention). 6. Pat yourself on the back for being an awesome contributor. 7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. -8. Once your PR is merged, the new version of the connector will be automatically published to Docker Hub and our connector registry. \ No newline at end of file +8. Once your PR is merged, the new version of the connector will be automatically published to Docker Hub and our connector registry. diff --git a/airbyte-integrations/connectors/source-genesys/README.md b/airbyte-integrations/connectors/source-genesys/README.md index 6bed9a552ae..6f6f93e36d0 100644 --- a/airbyte-integrations/connectors/source-genesys/README.md +++ b/airbyte-integrations/connectors/source-genesys/README.md @@ -4,24 +4,30 @@ This is the repository for the Genesys source connector, written in Python. For information about how to use this connector within Airbyte, see [the documentation](https://docs.airbyte.io/integrations/sources/genesys). We are using `OAuth2` as this is the only supported authentication method. + ## Local development ### Prerequisites + #### Minimum Python version required `= 3.9.0` #### Build & Activate Virtual Environment and install dependencies + From this connector directory, create a virtual environment: + ``` python -m venv .venv ``` This will generate a virtualenv for this module in `.venv/`. Make sure this venv is active in your development environment of choice. To activate it from the terminal, run: + ``` source .venv/bin/activate pip install -r requirements.txt pip install '.[tests]' ``` + If you are in an IDE, follow your IDE's instructions to activate the virtualenv. Note that while we are installing dependencies from `requirements.txt`, you should only edit `setup.py` for your dependencies. `requirements.txt` is @@ -30,6 +36,7 @@ If this is mumbo jumbo to you, don't worry about it, just put your deps in `setu should work as you expect. #### Create credentials + **If you are a community contributor**, follow the instructions in the [documentation](https://docs.airbyte.io/integrations/sources/genesys) to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `source_genesys/spec.yaml` file. 
Note that any directory named `secrets` is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. @@ -39,6 +46,7 @@ See `integration_tests/sample_config.json` for a sample config file. and place them into `secrets/config.json`. ### Locally running the connector + ``` python main.py spec python main.py check --config secrets/config.json @@ -48,9 +56,10 @@ python main.py read --config secrets/config.json --catalog integration_tests/con ### Locally running the connector docker image - #### Build + **Via [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md) (recommended):** + ```bash airbyte-ci connectors --name=source-genesys build ``` @@ -58,12 +67,15 @@ airbyte-ci connectors --name=source-genesys build An image will be built with the tag `airbyte/source-genesys:dev`. **Via `docker build`:** + ```bash docker build -t airbyte/source-genesys:dev . ``` #### Run + Then run any of the connector commands as follows: + ``` docker run --rm airbyte/source-genesys:dev spec docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-genesys:dev check --config /secrets/config.json @@ -72,23 +84,30 @@ docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integrat ``` ## Testing + You can run our full test suite locally using [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md): + ```bash airbyte-ci connectors --name=source-genesys test ``` ### Customizing acceptance Tests + Customize `acceptance-test-config.yml` file to configure tests. See [Connector Acceptance Tests](https://docs.airbyte.com/connector-development/testing-connectors/connector-acceptance-tests-reference) for more information. If your connector requires to create or destroy resources for use during acceptance tests create fixtures for it and place them inside integration_tests/acceptance.py. ## Dependency Management + All of your dependencies should go in `setup.py`, NOT `requirements.txt`. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development. We split dependencies between two groups, dependencies that are: -* required for your connector to work need to go to `MAIN_REQUIREMENTS` list. -* required for the testing need to go to `TEST_REQUIREMENTS` list + +- required for your connector to work need to go to `MAIN_REQUIREMENTS` list. +- required for the testing need to go to `TEST_REQUIREMENTS` list ### Publishing a new version of the connector + You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? + 1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-genesys test` 2. Bump the connector version in `metadata.yaml`: increment the `dockerImageTag` value. Please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors). 3. Make sure the `metadata.yaml` content is up to date. @@ -96,4 +115,3 @@ You've checked out the repo, implemented a million dollar feature, and you're re 5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention). 6. Pat yourself on the back for being an awesome contributor. 7. 
Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. - diff --git a/airbyte-integrations/connectors/source-getlago/README.md b/airbyte-integrations/connectors/source-getlago/README.md index 79ef2999b3c..80d9c6dccec 100644 --- a/airbyte-integrations/connectors/source-getlago/README.md +++ b/airbyte-integrations/connectors/source-getlago/README.md @@ -6,6 +6,7 @@ For information about how to use this connector within Airbyte, see [the documen ## Local development #### Create credentials + **If you are a community contributor**, follow the instructions in the [documentation](https://docs.airbyte.io/integrations/sources/getlago) to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `source_getlago/spec.yaml` file. Note that any directory named `secrets` is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. @@ -16,9 +17,10 @@ and place them into `secrets/config.json`. ### Locally running the connector docker image - #### Build + **Via [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md) (recommended):** + ```bash airbyte-ci connectors --name=source-getlago build ``` @@ -26,12 +28,15 @@ airbyte-ci connectors --name=source-getlago build An image will be built with the tag `airbyte/source-getlago:dev`. **Via `docker build`:** + ```bash docker build -t airbyte/source-getlago:dev . ``` #### Run + Then run any of the connector commands as follows: + ``` docker run --rm airbyte/source-getlago:dev spec docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-getlago:dev check --config /secrets/config.json @@ -40,23 +45,30 @@ docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integrat ``` ## Testing + You can run our full test suite locally using [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md): + ```bash airbyte-ci connectors --name=source-getlago test ``` ### Customizing acceptance Tests + Customize `acceptance-test-config.yml` file to configure tests. See [Connector Acceptance Tests](https://docs.airbyte.com/connector-development/testing-connectors/connector-acceptance-tests-reference) for more information. If your connector requires to create or destroy resources for use during acceptance tests create fixtures for it and place them inside integration_tests/acceptance.py. ## Dependency Management + All of your dependencies should go in `setup.py`, NOT `requirements.txt`. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development. We split dependencies between two groups, dependencies that are: -* required for your connector to work need to go to `MAIN_REQUIREMENTS` list. -* required for the testing need to go to `TEST_REQUIREMENTS` list + +- required for your connector to work need to go to `MAIN_REQUIREMENTS` list. +- required for the testing need to go to `TEST_REQUIREMENTS` list ### Publishing a new version of the connector + You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? + 1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-getlago test` 2. Bump the connector version in `metadata.yaml`: increment the `dockerImageTag` value. 
Please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors). 3. Make sure the `metadata.yaml` content is up to date. @@ -64,4 +76,3 @@ You've checked out the repo, implemented a million dollar feature, and you're re 5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention). 6. Pat yourself on the back for being an awesome contributor. 7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. - diff --git a/airbyte-integrations/connectors/source-github/README.md b/airbyte-integrations/connectors/source-github/README.md index 1a60fa93802..9b1715e97dd 100644 --- a/airbyte-integrations/connectors/source-github/README.md +++ b/airbyte-integrations/connectors/source-github/README.md @@ -1,31 +1,32 @@ # Github source connector - This is the repository for the Github source connector, written in Python. For information about how to use this connector within Airbyte, see [the documentation](https://docs.airbyte.com/integrations/sources/github). ## Local development ### Prerequisites -* Python (~=3.9) -* Poetry (~=1.7) - installation instructions [here](https://python-poetry.org/docs/#installation) +- Python (~=3.9) +- Poetry (~=1.7) - installation instructions [here](https://python-poetry.org/docs/#installation) ### Installing the connector + From this connector directory, run: + ```bash poetry install --with dev ``` - ### Create credentials + **If you are a community contributor**, follow the instructions in the [documentation](https://docs.airbyte.com/integrations/sources/github) to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `source_github/spec.yaml` file. Note that any directory named `secrets` is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. See `sample_files/sample_config.json` for a sample config file. - ### Locally running the connector + ``` poetry run source-github spec poetry run source-github check --config secrets/config.json @@ -34,23 +35,28 @@ poetry run source-github read --config secrets/config.json --catalog sample_file ``` ### Running unit tests + To run unit tests locally, from the connector directory run: + ``` poetry run pytest unit_tests ``` ### Building the docker image + 1. Install [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md) 2. Run the following command to build the docker image: + ```bash airbyte-ci connectors --name=source-github build ``` An image will be available on your host with the tag `airbyte/source-github:dev`. - ### Running as a docker container + Then run any of the connector commands as follows: + ``` docker run --rm airbyte/source-github:dev spec docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-github:dev check --config /secrets/config.json @@ -59,18 +65,23 @@ docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integrat ``` ### Running our CI test suite + You can run our full test suite locally using [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md): + ```bash airbyte-ci connectors --name=source-github test ``` ### Customizing acceptance Tests + Customize `acceptance-test-config.yml` file to configure acceptance tests. 
See [Connector Acceptance Tests](https://docs.airbyte.com/connector-development/testing-connectors/connector-acceptance-tests-reference) for more information. If your connector requires to create or destroy resources for use during acceptance tests create fixtures for it and place them inside integration_tests/acceptance.py. ### Dependency Management -All of your dependencies should be managed via Poetry. + +All of your dependencies should be managed via Poetry. To add a new dependency, run: + ```bash poetry add ``` @@ -78,14 +89,16 @@ poetry add Please commit the changes to `pyproject.toml` and `poetry.lock` files. ## Publishing a new version of the connector + You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? + 1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-github test` -2. Bump the connector version (please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors)): - - bump the `dockerImageTag` value in in `metadata.yaml` - - bump the `version` value in `pyproject.toml` +2. Bump the connector version (please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors)): + - bump the `dockerImageTag` value in in `metadata.yaml` + - bump the `version` value in `pyproject.toml` 3. Make sure the `metadata.yaml` content is up to date. 4. Make sure the connector documentation and its changelog is up to date (`docs/integrations/sources/github.md`). 5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention). 6. Pat yourself on the back for being an awesome contributor. 7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. -8. Once your PR is merged, the new version of the connector will be automatically published to Docker Hub and our connector registry. \ No newline at end of file +8. Once your PR is merged, the new version of the connector will be automatically published to Docker Hub and our connector registry. diff --git a/airbyte-integrations/connectors/source-github/fixtures/README.md b/airbyte-integrations/connectors/source-github/fixtures/README.md index 4f0f2bba7f1..bbbf51d6ae1 100644 --- a/airbyte-integrations/connectors/source-github/fixtures/README.md +++ b/airbyte-integrations/connectors/source-github/fixtures/README.md @@ -1,18 +1,25 @@ # Create Template GitHub Repository ## Pre requirements + ### 1. Create a repository on www.github.com + ### 2. Create an api key https://github.com/settings/tokens (select all checkboxes, with all checkboxes script will have all privileges and will not fail) --- + ### 1. Copy github-filler to another directory without any initialized repository + ### 2. Then just run and enter credentials + ./run.sh --- ## After all the steps, you will have a GitHub repository with data that covers almost all GitHub streams (in Airbyte connectors), but you will need to add some data manually. + 1. Collaborators (invite collaborators) 2. Asignees (asignee issues to collaborators) 3. Teams (create a teams inside organization) + ## All of this data can be generated through the GitHub site. 
diff --git a/airbyte-integrations/connectors/source-gitlab/README.md b/airbyte-integrations/connectors/source-gitlab/README.md index acb7de8147d..5c6036578f8 100644 --- a/airbyte-integrations/connectors/source-gitlab/README.md +++ b/airbyte-integrations/connectors/source-gitlab/README.md @@ -1,31 +1,32 @@ # Gitlab source connector - This is the repository for the Gitlab source connector, written in Python. For information about how to use this connector within Airbyte, see [the documentation](https://docs.airbyte.com/integrations/sources/gitlab). ## Local development ### Prerequisites -* Python (~=3.9) -* Poetry (~=1.7) - installation instructions [here](https://python-poetry.org/docs/#installation) +- Python (~=3.9) +- Poetry (~=1.7) - installation instructions [here](https://python-poetry.org/docs/#installation) ### Installing the connector + From this connector directory, run: + ```bash poetry install --with dev ``` - ### Create credentials + **If you are a community contributor**, follow the instructions in the [documentation](https://docs.airbyte.com/integrations/sources/gitlab) to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `source_gitlab/spec.yaml` file. Note that any directory named `secrets` is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. See `sample_files/sample_config.json` for a sample config file. - ### Locally running the connector + ``` poetry run source-gitlab spec poetry run source-gitlab check --config secrets/config.json @@ -34,23 +35,28 @@ poetry run source-gitlab read --config secrets/config.json --catalog integration ``` ### Running unit tests + To run unit tests locally, from the connector directory run: + ``` poetry run pytest unit_tests ``` ### Building the docker image + 1. Install [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md) 2. Run the following command to build the docker image: + ```bash airbyte-ci connectors --name=source-gitlab build ``` An image will be available on your host with the tag `airbyte/source-gitlab:dev`. - ### Running as a docker container + Then run any of the connector commands as follows: + ``` docker run --rm airbyte/source-gitlab:dev spec docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-gitlab:dev check --config /secrets/config.json @@ -59,18 +65,23 @@ docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integrat ``` ### Running our CI test suite + You can run our full test suite locally using [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md): + ```bash airbyte-ci connectors --name=source-gitlab test ``` ### Customizing acceptance Tests + Customize `acceptance-test-config.yml` file to configure acceptance tests. See [Connector Acceptance Tests](https://docs.airbyte.com/connector-development/testing-connectors/connector-acceptance-tests-reference) for more information. If your connector requires to create or destroy resources for use during acceptance tests create fixtures for it and place them inside integration_tests/acceptance.py. ### Dependency Management -All of your dependencies should be managed via Poetry. + +All of your dependencies should be managed via Poetry. To add a new dependency, run: + ```bash poetry add ``` @@ -78,14 +89,16 @@ poetry add Please commit the changes to `pyproject.toml` and `poetry.lock` files. 
## Publishing a new version of the connector + You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? + 1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-gitlab test` -2. Bump the connector version (please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors)): - - bump the `dockerImageTag` value in in `metadata.yaml` - - bump the `version` value in `pyproject.toml` +2. Bump the connector version (please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors)): + - bump the `dockerImageTag` value in in `metadata.yaml` + - bump the `version` value in `pyproject.toml` 3. Make sure the `metadata.yaml` content is up to date. 4. Make sure the connector documentation and its changelog is up to date (`docs/integrations/sources/gitlab.md`). 5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention). 6. Pat yourself on the back for being an awesome contributor. 7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. -8. Once your PR is merged, the new version of the connector will be automatically published to Docker Hub and our connector registry. \ No newline at end of file +8. Once your PR is merged, the new version of the connector will be automatically published to Docker Hub and our connector registry. diff --git a/airbyte-integrations/connectors/source-glassfrog/README.md b/airbyte-integrations/connectors/source-glassfrog/README.md index 9687862a28f..b9b4e6ff438 100644 --- a/airbyte-integrations/connectors/source-glassfrog/README.md +++ b/airbyte-integrations/connectors/source-glassfrog/README.md @@ -1,31 +1,32 @@ # Glassfrog source connector - This is the repository for the Glassfrog source connector, written in Python. For information about how to use this connector within Airbyte, see [the documentation](https://docs.airbyte.com/integrations/sources/glassfrog). ## Local development ### Prerequisites -* Python (~=3.9) -* Poetry (~=1.7) - installation instructions [here](https://python-poetry.org/docs/#installation) +- Python (~=3.9) +- Poetry (~=1.7) - installation instructions [here](https://python-poetry.org/docs/#installation) ### Installing the connector + From this connector directory, run: + ```bash poetry install --with dev ``` - ### Create credentials + **If you are a community contributor**, follow the instructions in the [documentation](https://docs.airbyte.com/integrations/sources/glassfrog) to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `source_glassfrog/spec.yaml` file. Note that any directory named `secrets` is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. See `sample_files/sample_config.json` for a sample config file. 
- ### Locally running the connector + ``` poetry run source-glassfrog spec poetry run source-glassfrog check --config secrets/config.json @@ -34,23 +35,28 @@ poetry run source-glassfrog read --config secrets/config.json --catalog sample_f ``` ### Running unit tests + To run unit tests locally, from the connector directory run: + ``` poetry run pytest unit_tests ``` ### Building the docker image + 1. Install [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md) 2. Run the following command to build the docker image: + ```bash airbyte-ci connectors --name=source-glassfrog build ``` An image will be available on your host with the tag `airbyte/source-glassfrog:dev`. - ### Running as a docker container + Then run any of the connector commands as follows: + ``` docker run --rm airbyte/source-glassfrog:dev spec docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-glassfrog:dev check --config /secrets/config.json @@ -59,18 +65,23 @@ docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integrat ``` ### Running our CI test suite + You can run our full test suite locally using [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md): + ```bash airbyte-ci connectors --name=source-glassfrog test ``` ### Customizing acceptance Tests + Customize `acceptance-test-config.yml` file to configure acceptance tests. See [Connector Acceptance Tests](https://docs.airbyte.com/connector-development/testing-connectors/connector-acceptance-tests-reference) for more information. If your connector requires to create or destroy resources for use during acceptance tests create fixtures for it and place them inside integration_tests/acceptance.py. ### Dependency Management -All of your dependencies should be managed via Poetry. + +All of your dependencies should be managed via Poetry. To add a new dependency, run: + ```bash poetry add ``` @@ -78,14 +89,16 @@ poetry add Please commit the changes to `pyproject.toml` and `poetry.lock` files. ## Publishing a new version of the connector + You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? + 1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-glassfrog test` -2. Bump the connector version (please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors)): - - bump the `dockerImageTag` value in in `metadata.yaml` - - bump the `version` value in `pyproject.toml` +2. Bump the connector version (please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors)): + - bump the `dockerImageTag` value in in `metadata.yaml` + - bump the `version` value in `pyproject.toml` 3. Make sure the `metadata.yaml` content is up to date. 4. Make sure the connector documentation and its changelog is up to date (`docs/integrations/sources/glassfrog.md`). 5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention). 6. Pat yourself on the back for being an awesome contributor. 7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. -8. 
Once your PR is merged, the new version of the connector will be automatically published to Docker Hub and our connector registry. \ No newline at end of file +8. Once your PR is merged, the new version of the connector will be automatically published to Docker Hub and our connector registry. diff --git a/airbyte-integrations/connectors/source-gnews/README.md b/airbyte-integrations/connectors/source-gnews/README.md index 69852a8d6d5..8bec0927560 100644 --- a/airbyte-integrations/connectors/source-gnews/README.md +++ b/airbyte-integrations/connectors/source-gnews/README.md @@ -6,6 +6,7 @@ For information about how to use this connector within Airbyte, see [the documen ## Local development #### Create credentials + **If you are a community contributor**, follow the instructions in the [documentation](https://docs.airbyte.io/integrations/sources/gnews) to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `source_gnews/spec.yaml` file. Note that any directory named `secrets` is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. @@ -16,9 +17,10 @@ and place them into `secrets/config.json`. ### Locally running the connector docker image - #### Build + **Via [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md) (recommended):** + ```bash airbyte-ci connectors --name=source-gnews build ``` @@ -26,12 +28,15 @@ airbyte-ci connectors --name=source-gnews build An image will be built with the tag `airbyte/source-gnews:dev`. **Via `docker build`:** + ```bash docker build -t airbyte/source-gnews:dev . ``` #### Run + Then run any of the connector commands as follows: + ``` docker run --rm airbyte/source-gnews:dev spec docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-gnews:dev check --config /secrets/config.json @@ -40,23 +45,30 @@ docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integrat ``` ## Testing + You can run our full test suite locally using [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md): + ```bash airbyte-ci connectors --name=source-gnews test ``` ### Customizing acceptance Tests + Customize `acceptance-test-config.yml` file to configure tests. See [Connector Acceptance Tests](https://docs.airbyte.com/connector-development/testing-connectors/connector-acceptance-tests-reference) for more information. If your connector requires to create or destroy resources for use during acceptance tests create fixtures for it and place them inside integration_tests/acceptance.py. ## Dependency Management + All of your dependencies should go in `setup.py`, NOT `requirements.txt`. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development. We split dependencies between two groups, dependencies that are: -* required for your connector to work need to go to `MAIN_REQUIREMENTS` list. -* required for the testing need to go to `TEST_REQUIREMENTS` list + +- required for your connector to work need to go to `MAIN_REQUIREMENTS` list. +- required for the testing need to go to `TEST_REQUIREMENTS` list ### Publishing a new version of the connector + You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? + 1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-gnews test` 2. 
Bump the connector version in `metadata.yaml`: increment the `dockerImageTag` value. Please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors). 3. Make sure the `metadata.yaml` content is up to date. @@ -64,4 +76,3 @@ You've checked out the repo, implemented a million dollar feature, and you're re 5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention). 6. Pat yourself on the back for being an awesome contributor. 7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. - diff --git a/airbyte-integrations/connectors/source-gocardless/README.md b/airbyte-integrations/connectors/source-gocardless/README.md index 3698a91fb18..a953c197cfe 100644 --- a/airbyte-integrations/connectors/source-gocardless/README.md +++ b/airbyte-integrations/connectors/source-gocardless/README.md @@ -6,6 +6,7 @@ For information about how to use this connector within Airbyte, see [the documen ## Local development #### Create credentials + **If you are a community contributor**, follow the instructions in the [documentation](https://docs.airbyte.io/integrations/sources/gocardless) to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `source_gocardless/spec.yaml` file. Note that any directory named `secrets` is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. @@ -16,9 +17,10 @@ and place them into `secrets/config.json`. ### Locally running the connector docker image - #### Build + **Via [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md) (recommended):** + ```bash airbyte-ci connectors --name=source-gocardless build ``` @@ -26,12 +28,15 @@ airbyte-ci connectors --name=source-gocardless build An image will be built with the tag `airbyte/source-gocardless:dev`. **Via `docker build`:** + ```bash docker build -t airbyte/source-gocardless:dev . ``` #### Run + Then run any of the connector commands as follows: + ``` docker run --rm airbyte/source-gocardless:dev spec docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-gocardless:dev check --config /secrets/config.json @@ -40,23 +45,30 @@ docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integrat ``` ## Testing + You can run our full test suite locally using [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md): + ```bash airbyte-ci connectors --name=source-gocardless test ``` ### Customizing acceptance Tests + Customize `acceptance-test-config.yml` file to configure tests. See [Connector Acceptance Tests](https://docs.airbyte.com/connector-development/testing-connectors/connector-acceptance-tests-reference) for more information. If your connector requires to create or destroy resources for use during acceptance tests create fixtures for it and place them inside integration_tests/acceptance.py. ## Dependency Management + All of your dependencies should go in `setup.py`, NOT `requirements.txt`. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development. We split dependencies between two groups, dependencies that are: -* required for your connector to work need to go to `MAIN_REQUIREMENTS` list. 
-* required for the testing need to go to `TEST_REQUIREMENTS` list + +- required for your connector to work need to go to `MAIN_REQUIREMENTS` list. +- required for the testing need to go to `TEST_REQUIREMENTS` list ### Publishing a new version of the connector + You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? + 1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-gocardless test` 2. Bump the connector version in `metadata.yaml`: increment the `dockerImageTag` value. Please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors). 3. Make sure the `metadata.yaml` content is up to date. @@ -64,4 +76,3 @@ You've checked out the repo, implemented a million dollar feature, and you're re 5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention). 6. Pat yourself on the back for being an awesome contributor. 7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. - diff --git a/airbyte-integrations/connectors/source-gong/README.md b/airbyte-integrations/connectors/source-gong/README.md index 009cb7b6ce7..c1419ad1593 100644 --- a/airbyte-integrations/connectors/source-gong/README.md +++ b/airbyte-integrations/connectors/source-gong/README.md @@ -1,31 +1,32 @@ # Gong source connector - This is the repository for the Gong source connector, written in Python. For information about how to use this connector within Airbyte, see [the documentation](https://docs.airbyte.com/integrations/sources/gong). ## Local development ### Prerequisites -* Python (~=3.9) -* Poetry (~=1.7) - installation instructions [here](https://python-poetry.org/docs/#installation) +- Python (~=3.9) +- Poetry (~=1.7) - installation instructions [here](https://python-poetry.org/docs/#installation) ### Installing the connector + From this connector directory, run: + ```bash poetry install --with dev ``` - ### Create credentials + **If you are a community contributor**, follow the instructions in the [documentation](https://docs.airbyte.com/integrations/sources/gong) to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `source_gong/spec.yaml` file. Note that any directory named `secrets` is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. See `sample_files/sample_config.json` for a sample config file. - ### Locally running the connector + ``` poetry run source-gong spec poetry run source-gong check --config secrets/config.json @@ -34,23 +35,28 @@ poetry run source-gong read --config secrets/config.json --catalog sample_files/ ``` ### Running unit tests + To run unit tests locally, from the connector directory run: + ``` poetry run pytest unit_tests ``` ### Building the docker image + 1. Install [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md) 2. Run the following command to build the docker image: + ```bash airbyte-ci connectors --name=source-gong build ``` An image will be available on your host with the tag `airbyte/source-gong:dev`. 
- ### Running as a docker container + Then run any of the connector commands as follows: + ``` docker run --rm airbyte/source-gong:dev spec docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-gong:dev check --config /secrets/config.json @@ -59,18 +65,23 @@ docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integrat ``` ### Running our CI test suite + You can run our full test suite locally using [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md): + ```bash airbyte-ci connectors --name=source-gong test ``` ### Customizing acceptance Tests + Customize `acceptance-test-config.yml` file to configure acceptance tests. See [Connector Acceptance Tests](https://docs.airbyte.com/connector-development/testing-connectors/connector-acceptance-tests-reference) for more information. If your connector requires to create or destroy resources for use during acceptance tests create fixtures for it and place them inside integration_tests/acceptance.py. ### Dependency Management -All of your dependencies should be managed via Poetry. + +All of your dependencies should be managed via Poetry. To add a new dependency, run: + ```bash poetry add ``` @@ -78,14 +89,16 @@ poetry add Please commit the changes to `pyproject.toml` and `poetry.lock` files. ## Publishing a new version of the connector + You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? + 1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-gong test` -2. Bump the connector version (please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors)): - - bump the `dockerImageTag` value in in `metadata.yaml` - - bump the `version` value in `pyproject.toml` +2. Bump the connector version (please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors)): + - bump the `dockerImageTag` value in in `metadata.yaml` + - bump the `version` value in `pyproject.toml` 3. Make sure the `metadata.yaml` content is up to date. 4. Make sure the connector documentation and its changelog is up to date (`docs/integrations/sources/gong.md`). 5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention). 6. Pat yourself on the back for being an awesome contributor. 7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. -8. Once your PR is merged, the new version of the connector will be automatically published to Docker Hub and our connector registry. \ No newline at end of file +8. Once your PR is merged, the new version of the connector will be automatically published to Docker Hub and our connector registry. diff --git a/airbyte-integrations/connectors/source-google-ads/BOOTSTRAP.md b/airbyte-integrations/connectors/source-google-ads/BOOTSTRAP.md index 4092d3c7075..6e1b1c86cb8 100644 --- a/airbyte-integrations/connectors/source-google-ads/BOOTSTRAP.md +++ b/airbyte-integrations/connectors/source-google-ads/BOOTSTRAP.md @@ -1,25 +1,26 @@ # Google Ads -Link to API Docs is [here](https://developers.google.com/google-ads/api/docs/start). +Link to API Docs is [here](https://developers.google.com/google-ads/api/docs/start). 
The GAds API is basically a SQL interface on top of the Google Ads API resources. The reference for the SQL language (called GAQL) can be found [here](https://developers.google.com/google-ads/api/docs/query/overview). The resources are listed [here](https://developers.google.com/google-ads/api/reference/rpc/v8/overview). -When querying data, there are three categories of information that can be fetched: +When querying data, there are three categories of information that can be fetched: -- **Attributes**: These are properties of the various entities in the API e.g: the title or ID of an ad campaign. +- **Attributes**: These are properties of the various entities in the API e.g: the title or ID of an ad campaign. - **Metrics**: metrics are statistics related to entities in the API. For example, the number of impressions for an ad or an ad campaign. All available metrics can be found [here](https://developers.google.com/google-ads/api/fields/v15/metrics). -- **Segments**: These are ways to partition metrics returned in the query by particular attributes. For example, one could query for the number of impressions (views of an ad) by running SELECT -metrics.impressions FROM campaigns which would return the number of impressions for each campaign e.g: 10k impressions. Or you could query for impressions segmented by device type e.g; SELECT -metrics.impressions, segments.device FROM campaigns which would return the number of impressions broken down by device type e.g: 3k iOS and 7k Android. When summing the result across all segments, -the sum should be the same (approximately) as when requesting the whole query without segments. This is a useful feature for granular data analysis as an advertiser may for example want to know if -their ad is successful with a particular kind of person over the other. See more about segmentation [here](https://developers.google.com/google-ads/api/docs/concepts/retrieving-objects). +- **Segments**: These are ways to partition metrics returned in the query by particular attributes. For example, one could query for the number of impressions (views of an ad) by running SELECT + metrics.impressions FROM campaigns which would return the number of impressions for each campaign e.g: 10k impressions. Or you could query for impressions segmented by device type e.g; SELECT + metrics.impressions, segments.device FROM campaigns which would return the number of impressions broken down by device type e.g: 3k iOS and 7k Android. When summing the result across all segments, + the sum should be the same (approximately) as when requesting the whole query without segments. This is a useful feature for granular data analysis as an advertiser may for example want to know if + their ad is successful with a particular kind of person over the other. See more about segmentation [here](https://developers.google.com/google-ads/api/docs/concepts/retrieving-objects). -If you want to get a representation of the raw resources in the API e.g: just know what are all the ads or campaigns in your Google account, you would query only for attributes e.g. SELECT campaign.title FROM campaigns. +If you want to get a representation of the raw resources in the API e.g: just know what are all the ads or campaigns in your Google account, you would query only for attributes e.g. SELECT campaign.title FROM campaigns. But if you wanted to get reports about the data (a common use case is impression data for an ad campaign) then you would query for metrics, potentially with segmentation. 
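As a rough illustration of the three query categories described above, GAQL queries might look like the sketches below (the resource and field names are assumptions drawn from the prose and the public GAQL reference, not taken verbatim from this connector; verify them against the API version you target):

```
SELECT campaign.id, campaign.name FROM campaign
SELECT metrics.impressions FROM campaign
SELECT metrics.impressions, segments.device FROM campaign
```

The first returns attributes only (a plain listing of campaigns), the second returns a metric per campaign, and the third returns the same metric segmented by device.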
See the links below for information about specific streams and some nuances about the connector: + - [information about streams](https://docs.google.com/spreadsheets/d/1s-MAwI5d3eBlBOD8II_sZM7pw5FmZtAJsx1KJjVRFNU/edit#gid=1796337932) (`Google Ads` tab) -- [nuances about the connector](https://docs.airbyte.io/integrations/sources/google-ads) \ No newline at end of file +- [nuances about the connector](https://docs.airbyte.io/integrations/sources/google-ads) diff --git a/airbyte-integrations/connectors/source-google-ads/README.md b/airbyte-integrations/connectors/source-google-ads/README.md index 57f91437a43..f178facd15b 100644 --- a/airbyte-integrations/connectors/source-google-ads/README.md +++ b/airbyte-integrations/connectors/source-google-ads/README.md @@ -1,31 +1,32 @@ # Google-Ads source connector - This is the repository for the Google-Ads source connector, written in Python. For information about how to use this connector within Airbyte, see [the documentation](https://docs.airbyte.com/integrations/sources/google-ads). ## Local development ### Prerequisites -* Python (~=3.9) -* Poetry (~=1.7) - installation instructions [here](https://python-poetry.org/docs/#installation) +- Python (~=3.9) +- Poetry (~=1.7) - installation instructions [here](https://python-poetry.org/docs/#installation) ### Installing the connector + From this connector directory, run: + ```bash poetry install --with dev ``` - ### Create credentials + **If you are a community contributor**, follow the instructions in the [documentation](https://docs.airbyte.com/integrations/sources/google-ads) to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `source_google_ads/spec.yaml` file. Note that any directory named `secrets` is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. See `sample_files/sample_config.json` for a sample config file. - ### Locally running the connector + ``` poetry run source-google-ads spec poetry run source-google-ads check --config secrets/config.json @@ -34,23 +35,28 @@ poetry run source-google-ads read --config secrets/config.json --catalog integra ``` ### Running unit tests + To run unit tests locally, from the connector directory run: + ``` poetry run pytest unit_tests ``` ### Building the docker image + 1. Install [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md) 2. Run the following command to build the docker image: + ```bash airbyte-ci connectors --name=source-google-ads build ``` An image will be available on your host with the tag `airbyte/source-google-ads:dev`. - ### Running as a docker container + Then run any of the connector commands as follows: + ``` docker run --rm airbyte/source-google-ads:dev spec docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-google-ads:dev check --config /secrets/config.json @@ -59,18 +65,23 @@ docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integrat ``` ### Running our CI test suite + You can run our full test suite locally using [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md): + ```bash airbyte-ci connectors --name=source-google-ads test ``` ### Customizing acceptance Tests + Customize `acceptance-test-config.yml` file to configure acceptance tests. 
See [Connector Acceptance Tests](https://docs.airbyte.com/connector-development/testing-connectors/connector-acceptance-tests-reference) for more information. If your connector requires to create or destroy resources for use during acceptance tests create fixtures for it and place them inside integration_tests/acceptance.py. ### Dependency Management -All of your dependencies should be managed via Poetry. + +All of your dependencies should be managed via Poetry. To add a new dependency, run: + ```bash poetry add ``` @@ -78,14 +89,16 @@ poetry add Please commit the changes to `pyproject.toml` and `poetry.lock` files. ## Publishing a new version of the connector + You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? + 1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-google-ads test` -2. Bump the connector version (please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors)): - - bump the `dockerImageTag` value in in `metadata.yaml` - - bump the `version` value in `pyproject.toml` +2. Bump the connector version (please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors)): + - bump the `dockerImageTag` value in in `metadata.yaml` + - bump the `version` value in `pyproject.toml` 3. Make sure the `metadata.yaml` content is up to date. 4. Make sure the connector documentation and its changelog is up to date (`docs/integrations/sources/google-ads.md`). 5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention). 6. Pat yourself on the back for being an awesome contributor. 7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. -8. Once your PR is merged, the new version of the connector will be automatically published to Docker Hub and our connector registry. \ No newline at end of file +8. Once your PR is merged, the new version of the connector will be automatically published to Docker Hub and our connector registry. diff --git a/airbyte-integrations/connectors/source-google-analytics-data-api/README.md b/airbyte-integrations/connectors/source-google-analytics-data-api/README.md index 85ddc7e2f33..c0964d617bc 100644 --- a/airbyte-integrations/connectors/source-google-analytics-data-api/README.md +++ b/airbyte-integrations/connectors/source-google-analytics-data-api/README.md @@ -1,31 +1,32 @@ # Google-Analytics-Data-Api source connector - This is the repository for the Google-Analytics-Data-Api source connector, written in Python. For information about how to use this connector within Airbyte, see [the documentation](https://docs.airbyte.com/integrations/sources/google-analytics-data-api). 
## Local development ### Prerequisites -* Python (~=3.9) -* Poetry (~=1.7) - installation instructions [here](https://python-poetry.org/docs/#installation) +- Python (~=3.9) +- Poetry (~=1.7) - installation instructions [here](https://python-poetry.org/docs/#installation) ### Installing the connector + From this connector directory, run: + ```bash poetry install --with dev ``` - ### Create credentials + **If you are a community contributor**, follow the instructions in the [documentation](https://docs.airbyte.com/integrations/sources/google-analytics-data-api) to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `source_google_analytics_data_api/spec.yaml` file. Note that any directory named `secrets` is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. See `sample_files/sample_config.json` for a sample config file. - ### Locally running the connector + ``` poetry run source-google-analytics-data-api spec poetry run source-google-analytics-data-api check --config secrets/config.json @@ -34,23 +35,28 @@ poetry run source-google-analytics-data-api read --config secrets/config.json -- ``` ### Running unit tests + To run unit tests locally, from the connector directory run: + ``` poetry run pytest unit_tests ``` ### Building the docker image + 1. Install [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md) 2. Run the following command to build the docker image: + ```bash airbyte-ci connectors --name=source-google-analytics-data-api build ``` An image will be available on your host with the tag `airbyte/source-google-analytics-data-api:dev`. - ### Running as a docker container + Then run any of the connector commands as follows: + ``` docker run --rm airbyte/source-google-analytics-data-api:dev spec docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-google-analytics-data-api:dev check --config /secrets/config.json @@ -59,18 +65,23 @@ docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integrat ``` ### Running our CI test suite + You can run our full test suite locally using [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md): + ```bash airbyte-ci connectors --name=source-google-analytics-data-api test ``` ### Customizing acceptance Tests + Customize `acceptance-test-config.yml` file to configure acceptance tests. See [Connector Acceptance Tests](https://docs.airbyte.com/connector-development/testing-connectors/connector-acceptance-tests-reference) for more information. If your connector requires to create or destroy resources for use during acceptance tests create fixtures for it and place them inside integration_tests/acceptance.py. ### Dependency Management -All of your dependencies should be managed via Poetry. + +All of your dependencies should be managed via Poetry. To add a new dependency, run: + ```bash poetry add ``` @@ -78,14 +89,16 @@ poetry add Please commit the changes to `pyproject.toml` and `poetry.lock` files. ## Publishing a new version of the connector + You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? + 1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-google-analytics-data-api test` -2. 
Bump the connector version (please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors)): - - bump the `dockerImageTag` value in in `metadata.yaml` - - bump the `version` value in `pyproject.toml` +2. Bump the connector version (please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors)): + - bump the `dockerImageTag` value in in `metadata.yaml` + - bump the `version` value in `pyproject.toml` 3. Make sure the `metadata.yaml` content is up to date. 4. Make sure the connector documentation and its changelog is up to date (`docs/integrations/sources/google-analytics-data-api.md`). 5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention). 6. Pat yourself on the back for being an awesome contributor. 7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. -8. Once your PR is merged, the new version of the connector will be automatically published to Docker Hub and our connector registry. \ No newline at end of file +8. Once your PR is merged, the new version of the connector will be automatically published to Docker Hub and our connector registry. diff --git a/airbyte-integrations/connectors/source-google-analytics-v4-service-account-only/README.md b/airbyte-integrations/connectors/source-google-analytics-v4-service-account-only/README.md index 2c931f8d643..0f6492a4a77 100644 --- a/airbyte-integrations/connectors/source-google-analytics-v4-service-account-only/README.md +++ b/airbyte-integrations/connectors/source-google-analytics-v4-service-account-only/README.md @@ -6,22 +6,27 @@ For information about how to use this connector within Airbyte, see [the documen ## Local development ### Prerequisites + **To iterate on this connector, make sure to complete this prerequisites section.** #### Minimum Python version required `= 3.7.0` #### Build & Activate Virtual Environment and install dependencies + From this connector directory, create a virtual environment: + ``` python -m venv .venv ``` This will generate a virtualenv for this module in `.venv/`. Make sure this venv is active in your development environment of choice. To activate it from the terminal, run: + ``` source .venv/bin/activate pip install -r requirements.txt ``` + If you are in an IDE, follow your IDE's instructions to activate the virtualenv. Note that while we are installing dependencies from `requirements.txt`, you should only edit `setup.py` for your dependencies. `requirements.txt` is @@ -30,6 +35,7 @@ If this is mumbo jumbo to you, don't worry about it, just put your deps in `setu should work as you expect. #### Create credentials + **If you are a community contributor**, follow the instructions in the [documentation](https://docs.airbyte.io/integrations/sources/google-analytics-v4) to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `source_google_analytics_v4/spec.json` file. Note that any directory named `secrets` is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. @@ -39,6 +45,7 @@ See `integration_tests/sample_config.json` for a sample config file. and place them into `secrets/config.json`. 
### Locally running the connector + ``` python main.py spec python main.py check --config secrets/config.json @@ -48,9 +55,8 @@ python main.py read --config secrets/config.json --catalog integration_tests/con ### Locally running the connector docker image - - #### Use `airbyte-ci` to build your connector + The Airbyte way of building this connector is to use our `airbyte-ci` tool. You can follow install instructions [here](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md#L1). Then running the following command will build your connector: @@ -58,15 +64,18 @@ Then running the following command will build your connector: ```bash airbyte-ci connectors --name=source-google-analytics-v4-service-account-only build ``` + Once the command is done, you will find your connector image in your local docker registry: `airbyte/source-google-analytics-v4-service-account-only:dev`. ##### Customizing our build process + When contributing on our connector you might need to customize the build process to add a system dependency or set an env var. You can customize our build process by adding a `build_customization.py` module to your connector. This module should contain a `pre_connector_install` and `post_connector_install` async function that will mutate the base image and the connector container respectively. It will be imported at runtime by our build process and the functions will be called if they exist. Here is an example of a `build_customization.py` module: + ```python from __future__ import annotations @@ -86,6 +95,7 @@ async def post_connector_install(connector_container: Container) -> Container: ``` #### Build your own connector image + This connector is built using our dynamic built process in `airbyte-ci`. The base image used to build it is defined within the metadata.yaml file under the `connectorBuildOptions`. The build logic is defined using [Dagger](https://dagger.io/) [here](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/pipelines/builds/python_connectors.py). @@ -94,6 +104,7 @@ It does not rely on a Dockerfile. If you would like to patch our connector and build your own a simple approach would be to: 1. Create your own Dockerfile based on the latest version of the connector image. + ```Dockerfile FROM airbyte/source-google-analytics-v4-service-account-only:latest @@ -104,16 +115,21 @@ RUN pip install ./airbyte/integration_code # ENV AIRBYTE_ENTRYPOINT "python /airbyte/integration_code/main.py" # ENTRYPOINT ["python", "/airbyte/integration_code/main.py"] ``` + Please use this as an example. This is not optimized. 2. Build your image: + ```bash docker build -t airbyte/source-google-analytics-v4-service-account-only:dev . 
# Running the spec command against your patched connector docker run airbyte/source-google-analytics-v4-service-account-only:dev spec ``` + #### Run + Then run any of the connector commands as follows: + ``` docker run --rm airbyte/source-google-analytics-v4-service-account-only:dev spec docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-google-analytics-v4-service-account-only:dev check --config /secrets/config.json @@ -122,23 +138,30 @@ docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integrat ``` ## Testing + You can run our full test suite locally using [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md): + ```bash airbyte-ci connectors --name=source-google-analytics-v4 test ``` ### Customizing acceptance Tests + Customize `acceptance-test-config.yml` file to configure tests. See [Connector Acceptance Tests](https://docs.airbyte.com/connector-development/testing-connectors/connector-acceptance-tests-reference) for more information. If your connector requires to create or destroy resources for use during acceptance tests create fixtures for it and place them inside integration_tests/acceptance.py. ## Dependency Management + All of your dependencies should go in `setup.py`, NOT `requirements.txt`. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development. We split dependencies between two groups, dependencies that are: -* required for your connector to work need to go to `MAIN_REQUIREMENTS` list. -* required for the testing need to go to `TEST_REQUIREMENTS` list + +- required for your connector to work need to go to `MAIN_REQUIREMENTS` list. +- required for the testing need to go to `TEST_REQUIREMENTS` list ### Publishing a new version of the connector + You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? + 1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-google-analytics-v4-service-account-only test` 2. Bump the connector version in `metadata.yaml`: increment the `dockerImageTag` value. Please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors). 3. Make sure the `metadata.yaml` content is up to date. @@ -146,4 +169,3 @@ You've checked out the repo, implemented a million dollar feature, and you're re 5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention). 6. Pat yourself on the back for being an awesome contributor. 7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. - diff --git a/airbyte-integrations/connectors/source-google-analytics-v4/README.md b/airbyte-integrations/connectors/source-google-analytics-v4/README.md index 4a399dcb19d..414fb0a12e0 100644 --- a/airbyte-integrations/connectors/source-google-analytics-v4/README.md +++ b/airbyte-integrations/connectors/source-google-analytics-v4/README.md @@ -1,31 +1,32 @@ # Google-Analytics-V4 source connector - This is the repository for the Google-Analytics-V4 source connector, written in Python. For information about how to use this connector within Airbyte, see [the documentation](https://docs.airbyte.com/integrations/sources/google-analytics-v4). 
## Local development ### Prerequisites -* Python (~=3.9) -* Poetry (~=1.7) - installation instructions [here](https://python-poetry.org/docs/#installation) +- Python (~=3.9) +- Poetry (~=1.7) - installation instructions [here](https://python-poetry.org/docs/#installation) ### Installing the connector + From this connector directory, run: + ```bash poetry install --with dev ``` - ### Create credentials + **If you are a community contributor**, follow the instructions in the [documentation](https://docs.airbyte.com/integrations/sources/google-analytics-v4) to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `source_google_analytics_v4/spec.yaml` file. Note that any directory named `secrets` is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. See `sample_files/sample_config.json` for a sample config file. - ### Locally running the connector + ``` poetry run source-google-analytics-v4 spec poetry run source-google-analytics-v4 check --config secrets/config.json @@ -34,23 +35,28 @@ poetry run source-google-analytics-v4 read --config secrets/config.json --catalo ``` ### Running unit tests + To run unit tests locally, from the connector directory run: + ``` poetry run pytest unit_tests ``` ### Building the docker image + 1. Install [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md) 2. Run the following command to build the docker image: + ```bash airbyte-ci connectors --name=source-google-analytics-v4 build ``` An image will be available on your host with the tag `airbyte/source-google-analytics-v4:dev`. - ### Running as a docker container + Then run any of the connector commands as follows: + ``` docker run --rm airbyte/source-google-analytics-v4:dev spec docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-google-analytics-v4:dev check --config /secrets/config.json @@ -59,18 +65,23 @@ docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integrat ``` ### Running our CI test suite + You can run our full test suite locally using [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md): + ```bash airbyte-ci connectors --name=source-google-analytics-v4 test ``` ### Customizing acceptance Tests + Customize `acceptance-test-config.yml` file to configure acceptance tests. See [Connector Acceptance Tests](https://docs.airbyte.com/connector-development/testing-connectors/connector-acceptance-tests-reference) for more information. If your connector requires to create or destroy resources for use during acceptance tests create fixtures for it and place them inside integration_tests/acceptance.py. ### Dependency Management -All of your dependencies should be managed via Poetry. + +All of your dependencies should be managed via Poetry. To add a new dependency, run: + ```bash poetry add ``` @@ -78,14 +89,16 @@ poetry add Please commit the changes to `pyproject.toml` and `poetry.lock` files. ## Publishing a new version of the connector + You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? + 1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-google-analytics-v4 test` -2. 
Bump the connector version (please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors)): - - bump the `dockerImageTag` value in in `metadata.yaml` - - bump the `version` value in `pyproject.toml` +2. Bump the connector version (please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors)): + - bump the `dockerImageTag` value in in `metadata.yaml` + - bump the `version` value in `pyproject.toml` 3. Make sure the `metadata.yaml` content is up to date. 4. Make sure the connector documentation and its changelog is up to date (`docs/integrations/sources/google-analytics-v4.md`). 5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention). 6. Pat yourself on the back for being an awesome contributor. 7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. -8. Once your PR is merged, the new version of the connector will be automatically published to Docker Hub and our connector registry. \ No newline at end of file +8. Once your PR is merged, the new version of the connector will be automatically published to Docker Hub and our connector registry. diff --git a/airbyte-integrations/connectors/source-google-directory/README.md b/airbyte-integrations/connectors/source-google-directory/README.md index 103cf550af2..f2ca125e4a6 100644 --- a/airbyte-integrations/connectors/source-google-directory/README.md +++ b/airbyte-integrations/connectors/source-google-directory/README.md @@ -6,23 +6,28 @@ For information about how to use this connector within Airbyte, see [the documen ## Local development ### Prerequisites + **To iterate on this connector, make sure to complete this prerequisites section.** #### Minimum Python version required `= 3.7.0` #### Build & Activate Virtual Environment and install dependencies + From this connector directory, create a virtual environment: + ``` python -m venv .venv ``` This will generate a virtualenv for this module in `.venv/`. Make sure this venv is active in your development environment of choice. To activate it from the terminal, run: + ``` source .venv/bin/activate pip install -r requirements.txt pip install '.[tests]' ``` + If you are in an IDE, follow your IDE's instructions to activate the virtualenv. Note that while we are installing dependencies from `requirements.txt`, you should only edit `setup.py` for your dependencies. `requirements.txt` is @@ -31,6 +36,7 @@ If this is mumbo jumbo to you, don't worry about it, just put your deps in `setu should work as you expect. #### Create credentials + **If you are a community contributor**, follow the instructions in the [documentation](https://docs.airbyte.io/integrations/sources/freshsales) to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `source_freshsales/spec.json` file. Note that any directory named `secrets` is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. @@ -40,6 +46,7 @@ See `integration_tests/sample_config.json` for a sample config file. and place them into `secrets/config.json`. 
### Locally running the connector + ``` python main.py spec python main.py check --config secrets/config.json @@ -49,9 +56,10 @@ python main.py read --config secrets/config.json --catalog integration_tests/con ### Locally running the connector docker image - #### Build + **Via [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md) (recommended):** + ```bash airbyte-ci connectors --name source-freshsales build ``` @@ -59,12 +67,15 @@ airbyte-ci connectors --name source-freshsales build An image will be built with the tag `airbyte/source-freshsales:dev`. **Via `docker build`:** + ```bash docker build -t airbyte/source-freshsales:dev . ``` #### Run + Then run any of the connector commands as follows: + ``` docker run --rm airbyte/source-freshsales:dev spec docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-freshsales:dev check --config /secrets/config.json @@ -73,23 +84,30 @@ docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integrat ``` ## Testing + You can run our full test suite locally using [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md): + ```bash airbyte-ci connectors --name=source-google-directory test ``` ### Customizing acceptance Tests + Customize `acceptance-test-config.yml` file to configure tests. See [Connector Acceptance Tests](https://docs.airbyte.com/connector-development/testing-connectors/connector-acceptance-tests-reference) for more information. If your connector requires to create or destroy resources for use during acceptance tests create fixtures for it and place them inside integration_tests/acceptance.py. ## Dependency Management + All of your dependencies should go in `setup.py`, NOT `requirements.txt`. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development. We split dependencies between two groups, dependencies that are: -* required for your connector to work need to go to `MAIN_REQUIREMENTS` list. -* required for the testing need to go to `TEST_REQUIREMENTS` list + +- required for your connector to work need to go to `MAIN_REQUIREMENTS` list. +- required for the testing need to go to `TEST_REQUIREMENTS` list ### Publishing a new version of the connector + You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? + 1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-google-directory test` 2. Bump the connector version in `metadata.yaml`: increment the `dockerImageTag` value. Please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors). 3. Make sure the `metadata.yaml` content is up to date. @@ -97,4 +115,3 @@ You've checked out the repo, implemented a million dollar feature, and you're re 5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention). 6. Pat yourself on the back for being an awesome contributor. 7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. 
- diff --git a/airbyte-integrations/connectors/source-google-drive/README.md b/airbyte-integrations/connectors/source-google-drive/README.md index c93c464c643..28de6a5501f 100644 --- a/airbyte-integrations/connectors/source-google-drive/README.md +++ b/airbyte-integrations/connectors/source-google-drive/README.md @@ -1,31 +1,32 @@ # Google Drive source connector - This is the repository for the Google Drive source connector, written in Python. For information about how to use this connector within Airbyte, see [the documentation](https://docs.airbyte.com/integrations/sources/google-drive). ## Local development ### Prerequisites -* Python (~=3.9) -* Poetry (~=1.7) - installation instructions [here](https://python-poetry.org/docs/#installation) +- Python (~=3.9) +- Poetry (~=1.7) - installation instructions [here](https://python-poetry.org/docs/#installation) ### Installing the connector + From this connector directory, run: + ```bash poetry install --with dev ``` - ### Create credentials + **If you are a community contributor**, follow the instructions in the [documentation](https://docs.airbyte.com/integrations/sources/google-drive) to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `source_google_drive/spec.yaml` file. Note that any directory named `secrets` is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. See `sample_files/sample_config.json` for a sample config file. - ### Locally running the connector + ``` poetry run source-google-drive spec poetry run source-google-drive check --config secrets/config.json @@ -34,23 +35,28 @@ poetry run source-google-drive read --config secrets/config.json --catalog sampl ``` ### Running unit tests + To run unit tests locally, from the connector directory run: + ``` poetry run pytest unit_tests ``` ### Building the docker image + 1. Install [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md) 2. Run the following command to build the docker image: + ```bash airbyte-ci connectors --name=source-google-drive build ``` An image will be available on your host with the tag `airbyte/source-google-drive:dev`. - ### Running as a docker container + Then run any of the connector commands as follows: + ``` docker run --rm airbyte/source-google-drive:dev spec docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-google-drive:dev check --config /secrets/config.json @@ -59,18 +65,23 @@ docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integrat ``` ### Running our CI test suite + You can run our full test suite locally using [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md): + ```bash airbyte-ci connectors --name=source-google-drive test ``` ### Customizing acceptance Tests + Customize `acceptance-test-config.yml` file to configure acceptance tests. See [Connector Acceptance Tests](https://docs.airbyte.com/connector-development/testing-connectors/connector-acceptance-tests-reference) for more information. If your connector requires to create or destroy resources for use during acceptance tests create fixtures for it and place them inside integration_tests/acceptance.py. ### Dependency Management -All of your dependencies should be managed via Poetry. + +All of your dependencies should be managed via Poetry. 
To add a new dependency, run: + ```bash poetry add ``` @@ -78,11 +89,13 @@ poetry add Please commit the changes to `pyproject.toml` and `poetry.lock` files. ## Publishing a new version of the connector + You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? + 1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-google-drive test` -2. Bump the connector version (please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors)): - - bump the `dockerImageTag` value in in `metadata.yaml` - - bump the `version` value in `pyproject.toml` +2. Bump the connector version (please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors)): + - bump the `dockerImageTag` value in in `metadata.yaml` + - bump the `version` value in `pyproject.toml` 3. Make sure the `metadata.yaml` content is up to date. 4. Make sure the connector documentation and its changelog is up to date (`docs/integrations/sources/google-drive.md`). 5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention). diff --git a/airbyte-integrations/connectors/source-google-pagespeed-insights/README.md b/airbyte-integrations/connectors/source-google-pagespeed-insights/README.md index 39b85dd9cce..8eedfa717d6 100644 --- a/airbyte-integrations/connectors/source-google-pagespeed-insights/README.md +++ b/airbyte-integrations/connectors/source-google-pagespeed-insights/README.md @@ -1,31 +1,32 @@ # Google-Pagespeed-Insights source connector - This is the repository for the Google-Pagespeed-Insights source connector, written in Python. For information about how to use this connector within Airbyte, see [the documentation](https://docs.airbyte.com/integrations/sources/google-pagespeed-insights). ## Local development ### Prerequisites -* Python (~=3.9) -* Poetry (~=1.7) - installation instructions [here](https://python-poetry.org/docs/#installation) +- Python (~=3.9) +- Poetry (~=1.7) - installation instructions [here](https://python-poetry.org/docs/#installation) ### Installing the connector + From this connector directory, run: + ```bash poetry install --with dev ``` - ### Create credentials + **If you are a community contributor**, follow the instructions in the [documentation](https://docs.airbyte.com/integrations/sources/google-pagespeed-insights) to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `source_google_pagespeed_insights/spec.yaml` file. Note that any directory named `secrets` is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. See `sample_files/sample_config.json` for a sample config file. - ### Locally running the connector + ``` poetry run source-google-pagespeed-insights spec poetry run source-google-pagespeed-insights check --config secrets/config.json @@ -34,23 +35,28 @@ poetry run source-google-pagespeed-insights read --config secrets/config.json -- ``` ### Running unit tests + To run unit tests locally, from the connector directory run: + ``` poetry run pytest unit_tests ``` ### Building the docker image + 1. 
Install [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md) 2. Run the following command to build the docker image: + ```bash airbyte-ci connectors --name=source-google-pagespeed-insights build ``` An image will be available on your host with the tag `airbyte/source-google-pagespeed-insights:dev`. - ### Running as a docker container + Then run any of the connector commands as follows: + ``` docker run --rm airbyte/source-google-pagespeed-insights:dev spec docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-google-pagespeed-insights:dev check --config /secrets/config.json @@ -59,18 +65,23 @@ docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integrat ``` ### Running our CI test suite + You can run our full test suite locally using [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md): + ```bash airbyte-ci connectors --name=source-google-pagespeed-insights test ``` ### Customizing acceptance Tests + Customize `acceptance-test-config.yml` file to configure acceptance tests. See [Connector Acceptance Tests](https://docs.airbyte.com/connector-development/testing-connectors/connector-acceptance-tests-reference) for more information. If your connector requires to create or destroy resources for use during acceptance tests create fixtures for it and place them inside integration_tests/acceptance.py. ### Dependency Management -All of your dependencies should be managed via Poetry. + +All of your dependencies should be managed via Poetry. To add a new dependency, run: + ```bash poetry add ``` @@ -78,14 +89,16 @@ poetry add Please commit the changes to `pyproject.toml` and `poetry.lock` files. ## Publishing a new version of the connector + You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? + 1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-google-pagespeed-insights test` -2. Bump the connector version (please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors)): - - bump the `dockerImageTag` value in in `metadata.yaml` - - bump the `version` value in `pyproject.toml` +2. Bump the connector version (please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors)): + - bump the `dockerImageTag` value in in `metadata.yaml` + - bump the `version` value in `pyproject.toml` 3. Make sure the `metadata.yaml` content is up to date. 4. Make sure the connector documentation and its changelog is up to date (`docs/integrations/sources/google-pagespeed-insights.md`). 5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention). 6. Pat yourself on the back for being an awesome contributor. 7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. -8. Once your PR is merged, the new version of the connector will be automatically published to Docker Hub and our connector registry. \ No newline at end of file +8. Once your PR is merged, the new version of the connector will be automatically published to Docker Hub and our connector registry. 
diff --git a/airbyte-integrations/connectors/source-google-search-console/BOOTSTRAP.md b/airbyte-integrations/connectors/source-google-search-console/BOOTSTRAP.md index 5462f295c28..1215f2f57d8 100644 --- a/airbyte-integrations/connectors/source-google-search-console/BOOTSTRAP.md +++ b/airbyte-integrations/connectors/source-google-search-console/BOOTSTRAP.md @@ -1,17 +1,17 @@ # Google Search Console -From [the docs](https://support.google.com/webmasters/answer/9128668?hl=en): +From [the docs](https://support.google.com/webmasters/answer/9128668?hl=en): -Google Search Console is a free service offered by Google that helps you monitor, maintain, and troubleshoot your site's presence in Google Search results. +Google Search Console is a free service offered by Google that helps you monitor, maintain, and troubleshoot your site's presence in Google Search results. Search Console offers tools and reports for the following actions: -* Confirm that Google can find and crawl your site. -* Fix indexing problems and request re-indexing of new or updated content. -* View Google Search traffic data for your site: how often your site appears in Google Search, which search queries show your site, how often searchers click through for those queries, and more. -* Receive alerts when Google encounters indexing, spam, or other issues on your site. -* Show you which sites link to your website. -* Troubleshoot issues for AMP, mobile usability, and other Search features. +- Confirm that Google can find and crawl your site. +- Fix indexing problems and request re-indexing of new or updated content. +- View Google Search traffic data for your site: how often your site appears in Google Search, which search queries show your site, how often searchers click through for those queries, and more. +- Receive alerts when Google encounters indexing, spam, or other issues on your site. +- Show you which sites link to your website. +- Troubleshoot issues for AMP, mobile usability, and other Search features. The API docs: https://developers.google.com/webmaster-tools/search-console-api-original/v3/parameters. @@ -21,26 +21,28 @@ The API docs: https://developers.google.com/webmaster-tools/search-console-api-o 2. [Sitemaps](https://developers.google.com/webmaster-tools/search-console-api-original/v3/sitemaps) – Full refresh 3. [Analytics](https://developers.google.com/webmaster-tools/search-console-api-original/v3/searchanalytics) – Full refresh, Incremental -There are multiple streams in the `Analytics` endpoint. -We have them because if we want to get all the data from the GSC (using the SearchAnalyticsAllFields stream), -we have to deal with a large dataset. +There are multiple streams in the `Analytics` endpoint. +We have them because if we want to get all the data from the GSC (using the SearchAnalyticsAllFields stream), +we have to deal with a large dataset. -In order to reduce the amount of data, and to retrieve a specific dataset (for example, to get country specific data) -we can use SearchAnalyticsByCountry. +In order to reduce the amount of data, and to retrieve a specific dataset (for example, to get country specific data) +we can use SearchAnalyticsByCountry. So each of the SearchAnalytics streams groups data by certain dimensions like date, country, page, etc. There are: - 1. SearchAnalyticsByDate - 2. SearchAnalyticsByCountry - 3. SearchAnalyticsByPage - 4. SearchAnalyticsByQuery - 5. SearchAnalyticsAllFields + +1. SearchAnalyticsByDate +2. SearchAnalyticsByCountry +3. SearchAnalyticsByPage +4. 
SearchAnalyticsByQuery +5. SearchAnalyticsAllFields ## Authorization There are 2 types of authorization `User Account` and `Service Account`. -To chose one we use an authorization field with the `oneOf` parameter in the `spec.json` file. +To chose one we use an authorization field with the `oneOf` parameter in the `spec.json` file. See the links below for information about specific streams and some nuances about the connector: + - [information about streams](https://docs.google.com/spreadsheets/d/1s-MAwI5d3eBlBOD8II_sZM7pw5FmZtAJsx1KJjVRFNU/edit#gid=1796337932) (`Google Search Console` tab) - [nuances about the connector](https://docs.airbyte.io/integrations/sources/google-search-console) diff --git a/airbyte-integrations/connectors/source-google-search-console/README.md b/airbyte-integrations/connectors/source-google-search-console/README.md index 6ed565336b9..dcaea569418 100755 --- a/airbyte-integrations/connectors/source-google-search-console/README.md +++ b/airbyte-integrations/connectors/source-google-search-console/README.md @@ -1,31 +1,32 @@ # Google-Search-Console source connector - This is the repository for the Google-Search-Console source connector, written in Python. For information about how to use this connector within Airbyte, see [the documentation](https://docs.airbyte.com/integrations/sources/google-search-console). ## Local development ### Prerequisites -* Python (~=3.9) -* Poetry (~=1.7) - installation instructions [here](https://python-poetry.org/docs/#installation) +- Python (~=3.9) +- Poetry (~=1.7) - installation instructions [here](https://python-poetry.org/docs/#installation) ### Installing the connector + From this connector directory, run: + ```bash poetry install --with dev ``` - ### Create credentials + **If you are a community contributor**, follow the instructions in the [documentation](https://docs.airbyte.com/integrations/sources/google-search-console) to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `source_google_search_console/spec.yaml` file. Note that any directory named `secrets` is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. See `sample_files/sample_config.json` for a sample config file. - ### Locally running the connector + ``` poetry run source-google-search-console spec poetry run source-google-search-console check --config secrets/config.json @@ -34,23 +35,28 @@ poetry run source-google-search-console read --config secrets/config.json --cata ``` ### Running unit tests + To run unit tests locally, from the connector directory run: + ``` poetry run pytest unit_tests ``` ### Building the docker image + 1. Install [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md) 2. Run the following command to build the docker image: + ```bash airbyte-ci connectors --name=source-google-search-console build ``` An image will be available on your host with the tag `airbyte/source-google-search-console:dev`. 
- ### Running as a docker container + Then run any of the connector commands as follows: + ``` docker run --rm airbyte/source-google-search-console:dev spec docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-google-search-console:dev check --config /secrets/config.json @@ -59,18 +65,23 @@ docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integrat ``` ### Running our CI test suite + You can run our full test suite locally using [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md): + ```bash airbyte-ci connectors --name=source-google-search-console test ``` ### Customizing acceptance Tests + Customize `acceptance-test-config.yml` file to configure acceptance tests. See [Connector Acceptance Tests](https://docs.airbyte.com/connector-development/testing-connectors/connector-acceptance-tests-reference) for more information. If your connector requires to create or destroy resources for use during acceptance tests create fixtures for it and place them inside integration_tests/acceptance.py. ### Dependency Management -All of your dependencies should be managed via Poetry. + +All of your dependencies should be managed via Poetry. To add a new dependency, run: + ```bash poetry add ``` @@ -78,14 +89,16 @@ poetry add Please commit the changes to `pyproject.toml` and `poetry.lock` files. ## Publishing a new version of the connector + You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? + 1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-google-search-console test` -2. Bump the connector version (please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors)): - - bump the `dockerImageTag` value in in `metadata.yaml` - - bump the `version` value in `pyproject.toml` +2. Bump the connector version (please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors)): + - bump the `dockerImageTag` value in in `metadata.yaml` + - bump the `version` value in `pyproject.toml` 3. Make sure the `metadata.yaml` content is up to date. 4. Make sure the connector documentation and its changelog is up to date (`docs/integrations/sources/google-search-console.md`). 5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention). 6. Pat yourself on the back for being an awesome contributor. 7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. -8. Once your PR is merged, the new version of the connector will be automatically published to Docker Hub and our connector registry. \ No newline at end of file +8. Once your PR is merged, the new version of the connector will be automatically published to Docker Hub and our connector registry. diff --git a/airbyte-integrations/connectors/source-google-search-console/credentials/README.md b/airbyte-integrations/connectors/source-google-search-console/credentials/README.md index a7f999e226f..8d56355a167 100644 --- a/airbyte-integrations/connectors/source-google-search-console/credentials/README.md +++ b/airbyte-integrations/connectors/source-google-search-console/credentials/README.md @@ -6,4 +6,3 @@ 2. 
Fill the file `credentials.json` with your personal credentials from step 1. 3. Run the `./get_credentials.sh` script and follow the instructions. 4. Copy the `refresh_token` from the console. - diff --git a/airbyte-integrations/connectors/source-google-sheets/README.md b/airbyte-integrations/connectors/source-google-sheets/README.md index 5ee60ccc388..901c964107a 100644 --- a/airbyte-integrations/connectors/source-google-sheets/README.md +++ b/airbyte-integrations/connectors/source-google-sheets/README.md @@ -1,31 +1,32 @@ # Google-Sheets source connector - This is the repository for the Google-Sheets source connector, written in Python. For information about how to use this connector within Airbyte, see [the documentation](https://docs.airbyte.com/integrations/sources/google-sheets). ## Local development ### Prerequisites -* Python (~=3.9) -* Poetry (~=1.7) - installation instructions [here](https://python-poetry.org/docs/#installation) +- Python (~=3.9) +- Poetry (~=1.7) - installation instructions [here](https://python-poetry.org/docs/#installation) ### Installing the connector + From this connector directory, run: + ```bash poetry install --with dev ``` - ### Create credentials + **If you are a community contributor**, follow the instructions in the [documentation](https://docs.airbyte.com/integrations/sources/google-sheets) to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `source_google_sheets/spec.yaml` file. Note that any directory named `secrets` is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. See `sample_files/sample_config.json` for a sample config file. - ### Locally running the connector + ``` poetry run source-google-sheets spec poetry run source-google-sheets check --config secrets/config.json @@ -34,23 +35,28 @@ poetry run source-google-sheets read --config secrets/config.json --catalog samp ``` ### Running unit tests + To run unit tests locally, from the connector directory run: + ``` poetry run pytest unit_tests ``` ### Building the docker image + 1. Install [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md) 2. Run the following command to build the docker image: + ```bash airbyte-ci connectors --name=source-google-sheets build ``` An image will be available on your host with the tag `airbyte/source-google-sheets:dev`. - ### Running as a docker container + Then run any of the connector commands as follows: + ``` docker run --rm airbyte/source-google-sheets:dev spec docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-google-sheets:dev check --config /secrets/config.json @@ -59,18 +65,23 @@ docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integrat ``` ### Running our CI test suite + You can run our full test suite locally using [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md): + ```bash airbyte-ci connectors --name=source-google-sheets test ``` ### Customizing acceptance Tests + Customize `acceptance-test-config.yml` file to configure acceptance tests. See [Connector Acceptance Tests](https://docs.airbyte.com/connector-development/testing-connectors/connector-acceptance-tests-reference) for more information. If your connector requires to create or destroy resources for use during acceptance tests create fixtures for it and place them inside integration_tests/acceptance.py. 
### Dependency Management -All of your dependencies should be managed via Poetry. + +All of your dependencies should be managed via Poetry. To add a new dependency, run: + ```bash poetry add ``` @@ -78,14 +89,16 @@ poetry add Please commit the changes to `pyproject.toml` and `poetry.lock` files. ## Publishing a new version of the connector + You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? + 1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-google-sheets test` -2. Bump the connector version (please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors)): - - bump the `dockerImageTag` value in in `metadata.yaml` - - bump the `version` value in `pyproject.toml` +2. Bump the connector version (please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors)): + - bump the `dockerImageTag` value in in `metadata.yaml` + - bump the `version` value in `pyproject.toml` 3. Make sure the `metadata.yaml` content is up to date. 4. Make sure the connector documentation and its changelog is up to date (`docs/integrations/sources/google-sheets.md`). 5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention). 6. Pat yourself on the back for being an awesome contributor. 7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. -8. Once your PR is merged, the new version of the connector will be automatically published to Docker Hub and our connector registry. \ No newline at end of file +8. Once your PR is merged, the new version of the connector will be automatically published to Docker Hub and our connector registry. diff --git a/airbyte-integrations/connectors/source-google-webfonts/README.md b/airbyte-integrations/connectors/source-google-webfonts/README.md index 5300dad1475..2b625402023 100644 --- a/airbyte-integrations/connectors/source-google-webfonts/README.md +++ b/airbyte-integrations/connectors/source-google-webfonts/README.md @@ -1,31 +1,32 @@ # Google-Webfonts source connector - This is the repository for the Google-Webfonts source connector, written in Python. For information about how to use this connector within Airbyte, see [the documentation](https://docs.airbyte.com/integrations/sources/google-webfonts). ## Local development ### Prerequisites -* Python (~=3.9) -* Poetry (~=1.7) - installation instructions [here](https://python-poetry.org/docs/#installation) +- Python (~=3.9) +- Poetry (~=1.7) - installation instructions [here](https://python-poetry.org/docs/#installation) ### Installing the connector + From this connector directory, run: + ```bash poetry install --with dev ``` - ### Create credentials + **If you are a community contributor**, follow the instructions in the [documentation](https://docs.airbyte.com/integrations/sources/google-webfonts) to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `source_google_webfonts/spec.yaml` file. Note that any directory named `secrets` is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. See `sample_files/sample_config.json` for a sample config file. 
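As a rough illustration, a `secrets/config.json` for this source could look like the following; the exact fields are defined by `source_google_webfonts/spec.yaml`, so treat this shape as an assumption and add any optional parameters the spec allows:

```json
{
  "api_key": "<your Google Webfonts API key>"
}
```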
- ### Locally running the connector + ``` poetry run source-google-webfonts spec poetry run source-google-webfonts check --config secrets/config.json @@ -34,23 +35,28 @@ poetry run source-google-webfonts read --config secrets/config.json --catalog sa ``` ### Running unit tests + To run unit tests locally, from the connector directory run: + ``` poetry run pytest unit_tests ``` ### Building the docker image + 1. Install [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md) 2. Run the following command to build the docker image: + ```bash airbyte-ci connectors --name=source-google-webfonts build ``` An image will be available on your host with the tag `airbyte/source-google-webfonts:dev`. - ### Running as a docker container + Then run any of the connector commands as follows: + ``` docker run --rm airbyte/source-google-webfonts:dev spec docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-google-webfonts:dev check --config /secrets/config.json @@ -59,18 +65,23 @@ docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integrat ``` ### Running our CI test suite + You can run our full test suite locally using [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md): + ```bash airbyte-ci connectors --name=source-google-webfonts test ``` ### Customizing acceptance Tests + Customize `acceptance-test-config.yml` file to configure acceptance tests. See [Connector Acceptance Tests](https://docs.airbyte.com/connector-development/testing-connectors/connector-acceptance-tests-reference) for more information. If your connector requires to create or destroy resources for use during acceptance tests create fixtures for it and place them inside integration_tests/acceptance.py. ### Dependency Management -All of your dependencies should be managed via Poetry. + +All of your dependencies should be managed via Poetry. To add a new dependency, run: + ```bash poetry add ``` @@ -78,14 +89,16 @@ poetry add Please commit the changes to `pyproject.toml` and `poetry.lock` files. ## Publishing a new version of the connector + You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? + 1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-google-webfonts test` -2. Bump the connector version (please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors)): - - bump the `dockerImageTag` value in in `metadata.yaml` - - bump the `version` value in `pyproject.toml` +2. Bump the connector version (please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors)): + - bump the `dockerImageTag` value in in `metadata.yaml` + - bump the `version` value in `pyproject.toml` 3. Make sure the `metadata.yaml` content is up to date. 4. Make sure the connector documentation and its changelog is up to date (`docs/integrations/sources/google-webfonts.md`). 5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention). 6. Pat yourself on the back for being an awesome contributor. 7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. -8. 
Once your PR is merged, the new version of the connector will be automatically published to Docker Hub and our connector registry. \ No newline at end of file +8. Once your PR is merged, the new version of the connector will be automatically published to Docker Hub and our connector registry. diff --git a/airbyte-integrations/connectors/source-google-webfonts/bootstrap.md b/airbyte-integrations/connectors/source-google-webfonts/bootstrap.md index 10cfa3b880b..cd3daf7e7ae 100644 --- a/airbyte-integrations/connectors/source-google-webfonts/bootstrap.md +++ b/airbyte-integrations/connectors/source-google-webfonts/bootstrap.md @@ -1,7 +1,7 @@ # Google-webfonts The connector uses the v1 API documented here: https://developers.google.com/fonts/docs/developer_api . It is -straightforward HTTP REST API with API authentication. +straightforward HTTP REST API with API authentication. ## API key @@ -32,7 +32,7 @@ Just pass the generated API key and optional parameters for establishing the con 1. Navigate to the Airbyte Open Source dashboard. 2. Set the name for your source. 3. Enter your `api_key`. -5. Enter your config params if needed. (Optional) -6. Click **Set up source**. +4. Enter your config params if needed. (Optional) +5. Click **Set up source**. - * We use only GET methods, towards the webfonts endpoints which is straightforward \ No newline at end of file +- We use only GET methods, towards the webfonts endpoints which is straightforward diff --git a/airbyte-integrations/connectors/source-greenhouse/README.md b/airbyte-integrations/connectors/source-greenhouse/README.md index 5061a5ae3f3..2f361144073 100644 --- a/airbyte-integrations/connectors/source-greenhouse/README.md +++ b/airbyte-integrations/connectors/source-greenhouse/README.md @@ -1,31 +1,32 @@ # Greenhouse source connector - This is the repository for the Greenhouse source connector, written in Python. For information about how to use this connector within Airbyte, see [the documentation](https://docs.airbyte.com/integrations/sources/greenhouse). ## Local development ### Prerequisites -* Python (~=3.9) -* Poetry (~=1.7) - installation instructions [here](https://python-poetry.org/docs/#installation) +- Python (~=3.9) +- Poetry (~=1.7) - installation instructions [here](https://python-poetry.org/docs/#installation) ### Installing the connector + From this connector directory, run: + ```bash poetry install --with dev ``` - ### Create credentials + **If you are a community contributor**, follow the instructions in the [documentation](https://docs.airbyte.com/integrations/sources/greenhouse) to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `source_greenhouse/spec.yaml` file. Note that any directory named `secrets` is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. See `sample_files/sample_config.json` for a sample config file. - ### Locally running the connector + ``` poetry run source-greenhouse spec poetry run source-greenhouse check --config secrets/config.json @@ -34,23 +35,28 @@ poetry run source-greenhouse read --config secrets/config.json --catalog sample_ ``` ### Running unit tests + To run unit tests locally, from the connector directory run: + ``` poetry run pytest unit_tests ``` ### Building the docker image + 1. Install [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md) 2. 
Run the following command to build the docker image: + ```bash airbyte-ci connectors --name=source-greenhouse build ``` An image will be available on your host with the tag `airbyte/source-greenhouse:dev`. - ### Running as a docker container + Then run any of the connector commands as follows: + ``` docker run --rm airbyte/source-greenhouse:dev spec docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-greenhouse:dev check --config /secrets/config.json @@ -59,18 +65,23 @@ docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integrat ``` ### Running our CI test suite + You can run our full test suite locally using [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md): + ```bash airbyte-ci connectors --name=source-greenhouse test ``` ### Customizing acceptance Tests + Customize `acceptance-test-config.yml` file to configure acceptance tests. See [Connector Acceptance Tests](https://docs.airbyte.com/connector-development/testing-connectors/connector-acceptance-tests-reference) for more information. If your connector requires to create or destroy resources for use during acceptance tests create fixtures for it and place them inside integration_tests/acceptance.py. ### Dependency Management -All of your dependencies should be managed via Poetry. + +All of your dependencies should be managed via Poetry. To add a new dependency, run: + ```bash poetry add ``` @@ -78,14 +89,16 @@ poetry add Please commit the changes to `pyproject.toml` and `poetry.lock` files. ## Publishing a new version of the connector + You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? + 1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-greenhouse test` -2. Bump the connector version (please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors)): - - bump the `dockerImageTag` value in in `metadata.yaml` - - bump the `version` value in `pyproject.toml` +2. Bump the connector version (please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors)): + - bump the `dockerImageTag` value in in `metadata.yaml` + - bump the `version` value in `pyproject.toml` 3. Make sure the `metadata.yaml` content is up to date. 4. Make sure the connector documentation and its changelog is up to date (`docs/integrations/sources/greenhouse.md`). 5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention). 6. Pat yourself on the back for being an awesome contributor. 7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. -8. Once your PR is merged, the new version of the connector will be automatically published to Docker Hub and our connector registry. \ No newline at end of file +8. Once your PR is merged, the new version of the connector will be automatically published to Docker Hub and our connector registry. 
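For step 2 above, the two version fields typically look like this (the version numbers are placeholders; pick the next value according to semver and keep the two in sync):

```yaml
# metadata.yaml (illustrative)
data:
  dockerImageTag: 1.2.3
```

```toml
# pyproject.toml (illustrative)
[tool.poetry]
version = "1.2.3"
```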
diff --git a/airbyte-integrations/connectors/source-gridly/README.md b/airbyte-integrations/connectors/source-gridly/README.md index f0693194776..c46ad5f929e 100644 --- a/airbyte-integrations/connectors/source-gridly/README.md +++ b/airbyte-integrations/connectors/source-gridly/README.md @@ -6,23 +6,28 @@ For information about how to use this connector within Airbyte, see [the documen ## Local development ### Prerequisites + **To iterate on this connector, make sure to complete this prerequisites section.** #### Minimum Python version required `= 3.9.0` #### Build & Activate Virtual Environment and install dependencies + From this connector directory, create a virtual environment: + ``` python -m venv .venv ``` This will generate a virtualenv for this module in `.venv/`. Make sure this venv is active in your development environment of choice. To activate it from the terminal, run: + ``` source .venv/bin/activate pip install -r requirements.txt pip install '.[tests]' ``` + If you are in an IDE, follow your IDE's instructions to activate the virtualenv. Note that while we are installing dependencies from `requirements.txt`, you should only edit `setup.py` for your dependencies. `requirements.txt` is @@ -31,6 +36,7 @@ If this is mumbo jumbo to you, don't worry about it, just put your deps in `setu should work as you expect. #### Create credentials + **If you are a community contributor**, follow the instructions in the [documentation](https://docs.airbyte.io/integrations/sources/gridly) to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `source_gridly/spec.yaml` file. Note that any directory named `secrets` is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. @@ -40,6 +46,7 @@ See `integration_tests/sample_config.json` for a sample config file. and place them into `secrets/config.json`. ### Locally running the connector + ``` python main.py spec python main.py check --config secrets/config.json @@ -49,9 +56,10 @@ python main.py read --config secrets/config.json --catalog integration_tests/con ### Locally running the connector docker image - #### Build + **Via [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md) (recommended):** + ```bash airbyte-ci connectors --name=source-gridly build ``` @@ -59,12 +67,15 @@ airbyte-ci connectors --name=source-gridly build An image will be built with the tag `airbyte/source-gridly:dev`. **Via `docker build`:** + ```bash docker build -t airbyte/source-gridly:dev . ``` #### Run + Then run any of the connector commands as follows: + ``` docker run --rm airbyte/source-gridly:dev spec docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-gridly:dev check --config /secrets/config.json @@ -73,23 +84,30 @@ docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integrat ``` ## Testing + You can run our full test suite locally using [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md): + ```bash airbyte-ci connectors --name=source-gridly test ``` ### Customizing acceptance Tests + Customize `acceptance-test-config.yml` file to configure tests. See [Connector Acceptance Tests](https://docs.airbyte.com/connector-development/testing-connectors/connector-acceptance-tests-reference) for more information. 
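As a rough starting point, a minimal `acceptance-test-config.yml` might look like the sketch below; the field names follow the acceptance-test reference linked above, but the paths and test selection are illustrative, so consult that reference for the full schema:

```yaml
connector_image: airbyte/source-gridly:dev
acceptance_tests:
  spec:
    tests:
      - spec_path: "source_gridly/spec.yaml"
  connection:
    tests:
      - config_path: "secrets/config.json"
        status: "succeed"
  discovery:
    tests:
      - config_path: "secrets/config.json"
  basic_read:
    tests:
      - config_path: "secrets/config.json"
        configured_catalog_path: "integration_tests/configured_catalog.json"
```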
If your connector requires to create or destroy resources for use during acceptance tests create fixtures for it and place them inside integration_tests/acceptance.py. ## Dependency Management + All of your dependencies should go in `setup.py`, NOT `requirements.txt`. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development. We split dependencies between two groups, dependencies that are: -* required for your connector to work need to go to `MAIN_REQUIREMENTS` list. -* required for the testing need to go to `TEST_REQUIREMENTS` list + +- required for your connector to work need to go to `MAIN_REQUIREMENTS` list. +- required for the testing need to go to `TEST_REQUIREMENTS` list ### Publishing a new version of the connector + You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? + 1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-gridly test` 2. Bump the connector version in `metadata.yaml`: increment the `dockerImageTag` value. Please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors). 3. Make sure the `metadata.yaml` content is up to date. @@ -97,4 +115,3 @@ You've checked out the repo, implemented a million dollar feature, and you're re 5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention). 6. Pat yourself on the back for being an awesome contributor. 7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. - diff --git a/airbyte-integrations/connectors/source-gutendex/README.md b/airbyte-integrations/connectors/source-gutendex/README.md index 3423fa3c754..518e35fd63b 100644 --- a/airbyte-integrations/connectors/source-gutendex/README.md +++ b/airbyte-integrations/connectors/source-gutendex/README.md @@ -6,6 +6,7 @@ For information about how to use this connector within Airbyte, see [the documen ## Local development #### Create credentials + **If you are a community contributor**, follow the instructions in the [documentation](https://docs.airbyte.io/integrations/sources/gutendex) to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `source_gutendex/spec.yaml` file. Note that any directory named `secrets` is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. @@ -16,9 +17,10 @@ and place them into `secrets/config.json`. ### Locally running the connector docker image - #### Build + **Via [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md) (recommended):** + ```bash airbyte-ci connectors --name=source-gutendex build ``` @@ -26,12 +28,15 @@ airbyte-ci connectors --name=source-gutendex build An image will be built with the tag `airbyte/source-gutendex:dev`. **Via `docker build`:** + ```bash docker build -t airbyte/source-gutendex:dev . 
``` #### Run + Then run any of the connector commands as follows: + ``` docker run --rm airbyte/source-gutendex:dev spec docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-gutendex:dev check --config /secrets/config.json @@ -40,23 +45,30 @@ docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integrat ``` ## Testing + You can run our full test suite locally using [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md): + ```bash airbyte-ci connectors --name=source-gutendex test ``` ### Customizing acceptance Tests + Customize `acceptance-test-config.yml` file to configure tests. See [Connector Acceptance Tests](https://docs.airbyte.com/connector-development/testing-connectors/connector-acceptance-tests-reference) for more information. If your connector requires to create or destroy resources for use during acceptance tests create fixtures for it and place them inside integration_tests/acceptance.py. ## Dependency Management + All of your dependencies should go in `setup.py`, NOT `requirements.txt`. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development. We split dependencies between two groups, dependencies that are: -* required for your connector to work need to go to `MAIN_REQUIREMENTS` list. -* required for the testing need to go to `TEST_REQUIREMENTS` list + +- required for your connector to work need to go to `MAIN_REQUIREMENTS` list. +- required for the testing need to go to `TEST_REQUIREMENTS` list ### Publishing a new version of the connector + You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? + 1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-gutendex test` 2. Bump the connector version in `metadata.yaml`: increment the `dockerImageTag` value. Please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors). 3. Make sure the `metadata.yaml` content is up to date. @@ -64,4 +76,3 @@ You've checked out the repo, implemented a million dollar feature, and you're re 5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention). 6. Pat yourself on the back for being an awesome contributor. 7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. - diff --git a/airbyte-integrations/connectors/source-gutendex/bootstrap.md b/airbyte-integrations/connectors/source-gutendex/bootstrap.md index 961a8e20b7a..7c07f54448a 100644 --- a/airbyte-integrations/connectors/source-gutendex/bootstrap.md +++ b/airbyte-integrations/connectors/source-gutendex/bootstrap.md @@ -48,4 +48,4 @@ No published rate limit. No authentication. -See [this](https://docs.airbyte.io/integrations/sources/gutendex) link for the connector docs. \ No newline at end of file +See [this](https://docs.airbyte.io/integrations/sources/gutendex) link for the connector docs. 
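For the connectors above that still manage dependencies through `setup.py` (rather than Poetry), the `MAIN_REQUIREMENTS` / `TEST_REQUIREMENTS` split described in their Dependency Management sections roughly looks like this; package names and version pins are illustrative, not taken from any specific connector:

```python
# setup.py — sketch of the dependency split; adapt names and pins to your connector
from setuptools import find_packages, setup

MAIN_REQUIREMENTS = ["airbyte-cdk"]
TEST_REQUIREMENTS = ["pytest~=6.2", "connector-acceptance-test"]

setup(
    name="source_example",
    packages=find_packages(),
    install_requires=MAIN_REQUIREMENTS,
    extras_require={"tests": TEST_REQUIREMENTS},
)
```

The `tests` extra is what makes `pip install '.[tests]'` pull in the test-only dependencies mentioned earlier.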
diff --git a/airbyte-integrations/connectors/source-gutendex/source_gutendex/schemas/TODO.md b/airbyte-integrations/connectors/source-gutendex/source_gutendex/schemas/TODO.md index 0e1dfe18bb8..b040faf128f 100644 --- a/airbyte-integrations/connectors/source-gutendex/source_gutendex/schemas/TODO.md +++ b/airbyte-integrations/connectors/source-gutendex/source_gutendex/schemas/TODO.md @@ -1,16 +1,19 @@ # TODO: Define your stream schemas -Your connector must describe the schema of each stream it can output using [JSONSchema](https://json-schema.org). + +Your connector must describe the schema of each stream it can output using [JSONSchema](https://json-schema.org). You can describe the schema of your streams using one `.json` file per stream. - + ## Static schemas + From the `gutendex.yaml` configuration file, you read the `.json` files in the `schemas/` directory. You can refer to a schema in your configuration file using the `schema_loader` component's `file_path` field. For example: + ``` schema_loader: type: JsonSchema file_path: "./source_gutendex/schemas/customers.json" ``` + Every stream specified in the configuration file should have a corresponding `.json` schema file. Delete this file once you're done. Or don't. Up to you :) - diff --git a/airbyte-integrations/connectors/source-harness/README.md b/airbyte-integrations/connectors/source-harness/README.md index 2956defc0dd..98e74c65d76 100644 --- a/airbyte-integrations/connectors/source-harness/README.md +++ b/airbyte-integrations/connectors/source-harness/README.md @@ -6,6 +6,7 @@ For information about how to use this connector within Airbyte, see [the documen ## Local development #### Create credentials + **If you are a community contributor**, follow the instructions in the [documentation](https://docs.airbyte.com/integrations/sources/harness) to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `source_harness/spec.yaml` file. Note that any directory named `secrets` is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. @@ -16,9 +17,10 @@ and place them into `secrets/config.json`. ### Locally running the connector docker image - #### Build + **Via [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md) (recommended):** + ```bash airbyte-ci connectors --name=source-harness build ``` @@ -26,12 +28,15 @@ airbyte-ci connectors --name=source-harness build An image will be built with the tag `airbyte/source-harness:dev`. **Via `docker build`:** + ```bash docker build -t airbyte/source-harness:dev . ``` #### Run + Then run any of the connector commands as follows: + ``` docker run --rm airbyte/source-harness:dev spec docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-harness:dev check --config /secrets/config.json @@ -40,23 +45,30 @@ docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integrat ``` ## Testing + You can run our full test suite locally using [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md): + ```bash airbyte-ci connectors --name=source-harness test ``` ### Customizing acceptance Tests + Customize `acceptance-test-config.yml` file to configure tests. See [Connector Acceptance Tests](https://docs.airbyte.com/connector-development/testing-connectors/connector-acceptance-tests-reference) for more information. 
If your connector requires to create or destroy resources for use during acceptance tests create fixtures for it and place them inside integration_tests/acceptance.py. ## Dependency Management + All of your dependencies should go in `setup.py`, NOT `requirements.txt`. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development. We split dependencies between two groups, dependencies that are: -* required for your connector to work need to go to `MAIN_REQUIREMENTS` list. -* required for the testing need to go to `TEST_REQUIREMENTS` list + +- required for your connector to work need to go to `MAIN_REQUIREMENTS` list. +- required for the testing need to go to `TEST_REQUIREMENTS` list ### Publishing a new version of the connector + You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? + 1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-harness test` 2. Bump the connector version in `metadata.yaml`: increment the `dockerImageTag` value. Please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors). 3. Make sure the `metadata.yaml` content is up to date. @@ -64,4 +76,3 @@ You've checked out the repo, implemented a million dollar feature, and you're re 5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention). 6. Pat yourself on the back for being an awesome contributor. 7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. - diff --git a/airbyte-integrations/connectors/source-harvest/README.md b/airbyte-integrations/connectors/source-harvest/README.md index ed2dcbaaa64..6c989b65589 100644 --- a/airbyte-integrations/connectors/source-harvest/README.md +++ b/airbyte-integrations/connectors/source-harvest/README.md @@ -1,31 +1,32 @@ # Harvest source connector - This is the repository for the Harvest source connector, written in Python. For information about how to use this connector within Airbyte, see [the documentation](https://docs.airbyte.com/integrations/sources/harvest). ## Local development ### Prerequisites -* Python (~=3.9) -* Poetry (~=1.7) - installation instructions [here](https://python-poetry.org/docs/#installation) +- Python (~=3.9) +- Poetry (~=1.7) - installation instructions [here](https://python-poetry.org/docs/#installation) ### Installing the connector + From this connector directory, run: + ```bash poetry install --with dev ``` - ### Create credentials + **If you are a community contributor**, follow the instructions in the [documentation](https://docs.airbyte.com/integrations/sources/harvest) to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `source_harvest/spec.yaml` file. Note that any directory named `secrets` is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. See `sample_files/sample_config.json` for a sample config file. 
- ### Locally running the connector + ``` poetry run source-harvest spec poetry run source-harvest check --config secrets/config.json @@ -34,23 +35,28 @@ poetry run source-harvest read --config secrets/config.json --catalog integratio ``` ### Running unit tests + To run unit tests locally, from the connector directory run: + ``` poetry run pytest unit_tests ``` ### Building the docker image + 1. Install [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md) 2. Run the following command to build the docker image: + ```bash airbyte-ci connectors --name=source-harvest build ``` An image will be available on your host with the tag `airbyte/source-harvest:dev`. - ### Running as a docker container + Then run any of the connector commands as follows: + ``` docker run --rm airbyte/source-harvest:dev spec docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-harvest:dev check --config /secrets/config.json @@ -59,18 +65,23 @@ docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integrat ``` ### Running our CI test suite + You can run our full test suite locally using [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md): + ```bash airbyte-ci connectors --name=source-harvest test ``` ### Customizing acceptance Tests + Customize `acceptance-test-config.yml` file to configure acceptance tests. See [Connector Acceptance Tests](https://docs.airbyte.com/connector-development/testing-connectors/connector-acceptance-tests-reference) for more information. If your connector requires to create or destroy resources for use during acceptance tests create fixtures for it and place them inside integration_tests/acceptance.py. ### Dependency Management -All of your dependencies should be managed via Poetry. + +All of your dependencies should be managed via Poetry. To add a new dependency, run: + ```bash poetry add ``` @@ -78,14 +89,16 @@ poetry add Please commit the changes to `pyproject.toml` and `poetry.lock` files. ## Publishing a new version of the connector + You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? + 1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-harvest test` -2. Bump the connector version (please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors)): - - bump the `dockerImageTag` value in in `metadata.yaml` - - bump the `version` value in `pyproject.toml` +2. Bump the connector version (please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors)): + - bump the `dockerImageTag` value in in `metadata.yaml` + - bump the `version` value in `pyproject.toml` 3. Make sure the `metadata.yaml` content is up to date. 4. Make sure the connector documentation and its changelog is up to date (`docs/integrations/sources/harvest.md`). 5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention). 6. Pat yourself on the back for being an awesome contributor. 7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. -8. 
Once your PR is merged, the new version of the connector will be automatically published to Docker Hub and our connector registry. \ No newline at end of file +8. Once your PR is merged, the new version of the connector will be automatically published to Docker Hub and our connector registry. diff --git a/airbyte-integrations/connectors/source-hellobaton/README.md b/airbyte-integrations/connectors/source-hellobaton/README.md index c56dd42ea65..496e94054f7 100644 --- a/airbyte-integrations/connectors/source-hellobaton/README.md +++ b/airbyte-integrations/connectors/source-hellobaton/README.md @@ -6,6 +6,7 @@ For information about how to use this connector within Airbyte, see [the documen ## Local development #### Create credentials + **If you are a community contributor**, follow the instructions in the [documentation](https://docs.airbyte.com/integrations/sources/hellobaton) to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `source_hellobaton/spec.yaml` file. Note that any directory named `secrets` is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. @@ -16,9 +17,10 @@ and place them into `secrets/config.json`. ### Locally running the connector docker image - #### Build + **Via [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md) (recommended):** + ```bash airbyte-ci connectors --name=source-hellobaton build ``` @@ -26,12 +28,15 @@ airbyte-ci connectors --name=source-hellobaton build An image will be built with the tag `airbyte/source-hellobaton:dev`. **Via `docker build`:** + ```bash docker build -t airbyte/source-hellobaton:dev . ``` #### Run + Then run any of the connector commands as follows: + ``` docker run --rm airbyte/source-hellobaton:dev spec docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-hellobaton:dev check --config /secrets/config.json @@ -40,23 +45,30 @@ docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integrat ``` ## Testing + You can run our full test suite locally using [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md): + ```bash airbyte-ci connectors --name=source-hellobaton test ``` ### Customizing acceptance Tests + Customize `acceptance-test-config.yml` file to configure tests. See [Connector Acceptance Tests](https://docs.airbyte.com/connector-development/testing-connectors/connector-acceptance-tests-reference) for more information. If your connector requires to create or destroy resources for use during acceptance tests create fixtures for it and place them inside integration_tests/acceptance.py. ## Dependency Management + All of your dependencies should go in `setup.py`, NOT `requirements.txt`. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development. We split dependencies between two groups, dependencies that are: -* required for your connector to work need to go to `MAIN_REQUIREMENTS` list. -* required for the testing need to go to `TEST_REQUIREMENTS` list + +- required for your connector to work need to go to `MAIN_REQUIREMENTS` list. +- required for the testing need to go to `TEST_REQUIREMENTS` list ### Publishing a new version of the connector + You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? + 1. 
Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-hellobaton test` 2. Bump the connector version in `metadata.yaml`: increment the `dockerImageTag` value. Please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors). 3. Make sure the `metadata.yaml` content is up to date. @@ -64,4 +76,3 @@ You've checked out the repo, implemented a million dollar feature, and you're re 5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention). 6. Pat yourself on the back for being an awesome contributor. 7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. - diff --git a/airbyte-integrations/connectors/source-hubplanner/README.md b/airbyte-integrations/connectors/source-hubplanner/README.md index e7c245255f8..2d7bab8b801 100644 --- a/airbyte-integrations/connectors/source-hubplanner/README.md +++ b/airbyte-integrations/connectors/source-hubplanner/README.md @@ -6,6 +6,7 @@ For information about how to use this connector within Airbyte, see [the documen ## Local development #### Create credentials + **If you are a community contributor**, follow the instructions in the [documentation](https://docs.airbyte.com/integrations/sources/hubplanner) to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `source_hubplanner/spec.yaml` file. Note that any directory named `secrets` is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. @@ -16,9 +17,10 @@ and place them into `secrets/config.json`. ### Locally running the connector docker image - #### Build + **Via [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md) (recommended):** + ```bash airbyte-ci connectors --name=source-hubplanner build ``` @@ -26,12 +28,15 @@ airbyte-ci connectors --name=source-hubplanner build An image will be built with the tag `airbyte/source-hubplanner:dev`. **Via `docker build`:** + ```bash docker build -t airbyte/source-hubplanner:dev . ``` #### Run + Then run any of the connector commands as follows: + ``` docker run --rm airbyte/source-hubplanner:dev spec docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-hubplanner:dev check --config /secrets/config.json @@ -40,23 +45,30 @@ docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integrat ``` ## Testing + You can run our full test suite locally using [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md): + ```bash airbyte-ci connectors --name=source-hubplanner test ``` ### Customizing acceptance Tests + Customize `acceptance-test-config.yml` file to configure tests. See [Connector Acceptance Tests](https://docs.airbyte.com/connector-development/testing-connectors/connector-acceptance-tests-reference) for more information. If your connector requires to create or destroy resources for use during acceptance tests create fixtures for it and place them inside integration_tests/acceptance.py. ## Dependency Management + All of your dependencies should go in `setup.py`, NOT `requirements.txt`. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development. 
We split dependencies between two groups, dependencies that are: -* required for your connector to work need to go to `MAIN_REQUIREMENTS` list. -* required for the testing need to go to `TEST_REQUIREMENTS` list + +- required for your connector to work need to go to `MAIN_REQUIREMENTS` list. +- required for the testing need to go to `TEST_REQUIREMENTS` list ### Publishing a new version of the connector + You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? + 1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-hubplanner test` 2. Bump the connector version in `metadata.yaml`: increment the `dockerImageTag` value. Please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors). 3. Make sure the `metadata.yaml` content is up to date. @@ -64,4 +76,3 @@ You've checked out the repo, implemented a million dollar feature, and you're re 5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention). 6. Pat yourself on the back for being an awesome contributor. 7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. - diff --git a/airbyte-integrations/connectors/source-hubspot/README.md b/airbyte-integrations/connectors/source-hubspot/README.md index b2d544eab1d..a84300762cb 100644 --- a/airbyte-integrations/connectors/source-hubspot/README.md +++ b/airbyte-integrations/connectors/source-hubspot/README.md @@ -1,31 +1,32 @@ # Hubspot source connector - This is the repository for the Hubspot source connector, written in Python. For information about how to use this connector within Airbyte, see [the documentation](https://docs.airbyte.com/integrations/sources/hubspot). ## Local development ### Prerequisites -* Python (~=3.9) -* Poetry (~=1.7) - installation instructions [here](https://python-poetry.org/docs/#installation) +- Python (~=3.9) +- Poetry (~=1.7) - installation instructions [here](https://python-poetry.org/docs/#installation) ### Installing the connector + From this connector directory, run: + ```bash poetry install --with dev ``` - ### Create credentials + **If you are a community contributor**, follow the instructions in the [documentation](https://docs.airbyte.com/integrations/sources/hubspot) to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `source_hubspot/spec.yaml` file. Note that any directory named `secrets` is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. See `sample_files/sample_config.json` for a sample config file. - ### Locally running the connector + ``` poetry run source-hubspot spec poetry run source-hubspot check --config secrets/config.json @@ -34,23 +35,28 @@ poetry run source-hubspot read --config secrets/config.json --catalog sample_fil ``` ### Running unit tests + To run unit tests locally, from the connector directory run: + ``` poetry run pytest unit_tests ``` ### Building the docker image + 1. Install [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md) 2. 
Run the following command to build the docker image: + ```bash airbyte-ci connectors --name=source-hubspot build ``` An image will be available on your host with the tag `airbyte/source-hubspot:dev`. - ### Running as a docker container + Then run any of the connector commands as follows: + ``` docker run --rm airbyte/source-hubspot:dev spec docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-hubspot:dev check --config /secrets/config.json @@ -59,18 +65,23 @@ docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integrat ``` ### Running our CI test suite + You can run our full test suite locally using [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md): + ```bash airbyte-ci connectors --name=source-hubspot test ``` ### Customizing acceptance Tests + Customize `acceptance-test-config.yml` file to configure acceptance tests. See [Connector Acceptance Tests](https://docs.airbyte.com/connector-development/testing-connectors/connector-acceptance-tests-reference) for more information. If your connector requires to create or destroy resources for use during acceptance tests create fixtures for it and place them inside integration_tests/acceptance.py. ### Dependency Management -All of your dependencies should be managed via Poetry. + +All of your dependencies should be managed via Poetry. To add a new dependency, run: + ```bash poetry add ``` @@ -78,14 +89,16 @@ poetry add Please commit the changes to `pyproject.toml` and `poetry.lock` files. ## Publishing a new version of the connector + You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? + 1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-hubspot test` -2. Bump the connector version (please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors)): - - bump the `dockerImageTag` value in in `metadata.yaml` - - bump the `version` value in `pyproject.toml` +2. Bump the connector version (please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors)): + - bump the `dockerImageTag` value in in `metadata.yaml` + - bump the `version` value in `pyproject.toml` 3. Make sure the `metadata.yaml` content is up to date. 4. Make sure the connector documentation and its changelog is up to date (`docs/integrations/sources/hubspot.md`). 5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention). 6. Pat yourself on the back for being an awesome contributor. 7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. -8. Once your PR is merged, the new version of the connector will be automatically published to Docker Hub and our connector registry. \ No newline at end of file +8. Once your PR is merged, the new version of the connector will be automatically published to Docker Hub and our connector registry. 
diff --git a/airbyte-integrations/connectors/source-insightly/README.md b/airbyte-integrations/connectors/source-insightly/README.md index e9e6fe29fc2..c8f161c1d99 100644 --- a/airbyte-integrations/connectors/source-insightly/README.md +++ b/airbyte-integrations/connectors/source-insightly/README.md @@ -1,31 +1,32 @@ # Insightly source connector - This is the repository for the Insightly source connector, written in Python. For information about how to use this connector within Airbyte, see [the documentation](https://docs.airbyte.com/integrations/sources/insightly). ## Local development ### Prerequisites -* Python (~=3.9) -* Poetry (~=1.7) - installation instructions [here](https://python-poetry.org/docs/#installation) +- Python (~=3.9) +- Poetry (~=1.7) - installation instructions [here](https://python-poetry.org/docs/#installation) ### Installing the connector + From this connector directory, run: + ```bash poetry install --with dev ``` - ### Create credentials + **If you are a community contributor**, follow the instructions in the [documentation](https://docs.airbyte.com/integrations/sources/insightly) to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `source_insightly/spec.yaml` file. Note that any directory named `secrets` is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. See `sample_files/sample_config.json` for a sample config file. - ### Locally running the connector + ``` poetry run source-insightly spec poetry run source-insightly check --config secrets/config.json @@ -34,23 +35,28 @@ poetry run source-insightly read --config secrets/config.json --catalog sample_f ``` ### Running unit tests + To run unit tests locally, from the connector directory run: + ``` poetry run pytest unit_tests ``` ### Building the docker image + 1. Install [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md) 2. Run the following command to build the docker image: + ```bash airbyte-ci connectors --name=source-insightly build ``` An image will be available on your host with the tag `airbyte/source-insightly:dev`. - ### Running as a docker container + Then run any of the connector commands as follows: + ``` docker run --rm airbyte/source-insightly:dev spec docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-insightly:dev check --config /secrets/config.json @@ -59,18 +65,23 @@ docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integrat ``` ### Running our CI test suite + You can run our full test suite locally using [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md): + ```bash airbyte-ci connectors --name=source-insightly test ``` ### Customizing acceptance Tests + Customize `acceptance-test-config.yml` file to configure acceptance tests. See [Connector Acceptance Tests](https://docs.airbyte.com/connector-development/testing-connectors/connector-acceptance-tests-reference) for more information. If your connector requires to create or destroy resources for use during acceptance tests create fixtures for it and place them inside integration_tests/acceptance.py. ### Dependency Management -All of your dependencies should be managed via Poetry. + +All of your dependencies should be managed via Poetry. To add a new dependency, run: + ```bash poetry add ``` @@ -78,14 +89,16 @@ poetry add Please commit the changes to `pyproject.toml` and `poetry.lock` files. 
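For example (the package names below are illustrative):

```bash
# add a runtime dependency
poetry add requests

# add a dev-only dependency
poetry add --group dev pytest-mock
```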
## Publishing a new version of the connector + You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? + 1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-insightly test` -2. Bump the connector version (please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors)): - - bump the `dockerImageTag` value in in `metadata.yaml` - - bump the `version` value in `pyproject.toml` +2. Bump the connector version (please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors)): + - bump the `dockerImageTag` value in in `metadata.yaml` + - bump the `version` value in `pyproject.toml` 3. Make sure the `metadata.yaml` content is up to date. 4. Make sure the connector documentation and its changelog is up to date (`docs/integrations/sources/insightly.md`). 5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention). 6. Pat yourself on the back for being an awesome contributor. 7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. -8. Once your PR is merged, the new version of the connector will be automatically published to Docker Hub and our connector registry. \ No newline at end of file +8. Once your PR is merged, the new version of the connector will be automatically published to Docker Hub and our connector registry. diff --git a/airbyte-integrations/connectors/source-instagram/README.md b/airbyte-integrations/connectors/source-instagram/README.md index 6d7485e922a..42d6f18703d 100644 --- a/airbyte-integrations/connectors/source-instagram/README.md +++ b/airbyte-integrations/connectors/source-instagram/README.md @@ -1,31 +1,32 @@ # Instagram source connector - This is the repository for the Instagram source connector, written in Python. For information about how to use this connector within Airbyte, see [the documentation](https://docs.airbyte.com/integrations/sources/instagram). ## Local development ### Prerequisites -* Python (~=3.9) -* Poetry (~=1.7) - installation instructions [here](https://python-poetry.org/docs/#installation) +- Python (~=3.9) +- Poetry (~=1.7) - installation instructions [here](https://python-poetry.org/docs/#installation) ### Installing the connector + From this connector directory, run: + ```bash poetry install --with dev ``` - ### Create credentials + **If you are a community contributor**, follow the instructions in the [documentation](https://docs.airbyte.com/integrations/sources/instagram) to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `source_instagram/spec.yaml` file. Note that any directory named `secrets` is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. See `sample_files/sample_config.json` for a sample config file. 
- ### Locally running the connector + ``` poetry run source-instagram spec poetry run source-instagram check --config secrets/config.json @@ -34,23 +35,28 @@ poetry run source-instagram read --config secrets/config.json --catalog sample_f ``` ### Running unit tests + To run unit tests locally, from the connector directory run: + ``` poetry run pytest unit_tests ``` ### Building the docker image + 1. Install [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md) 2. Run the following command to build the docker image: + ```bash airbyte-ci connectors --name=source-instagram build ``` An image will be available on your host with the tag `airbyte/source-instagram:dev`. - ### Running as a docker container + Then run any of the connector commands as follows: + ``` docker run --rm airbyte/source-instagram:dev spec docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-instagram:dev check --config /secrets/config.json @@ -59,18 +65,23 @@ docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integrat ``` ### Running our CI test suite + You can run our full test suite locally using [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md): + ```bash airbyte-ci connectors --name=source-instagram test ``` ### Customizing acceptance Tests + Customize `acceptance-test-config.yml` file to configure acceptance tests. See [Connector Acceptance Tests](https://docs.airbyte.com/connector-development/testing-connectors/connector-acceptance-tests-reference) for more information. If your connector requires to create or destroy resources for use during acceptance tests create fixtures for it and place them inside integration_tests/acceptance.py. ### Dependency Management -All of your dependencies should be managed via Poetry. + +All of your dependencies should be managed via Poetry. To add a new dependency, run: + ```bash poetry add ``` @@ -78,14 +89,16 @@ poetry add Please commit the changes to `pyproject.toml` and `poetry.lock` files. ## Publishing a new version of the connector + You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? + 1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-instagram test` -2. Bump the connector version (please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors)): - - bump the `dockerImageTag` value in in `metadata.yaml` - - bump the `version` value in `pyproject.toml` +2. Bump the connector version (please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors)): + - bump the `dockerImageTag` value in in `metadata.yaml` + - bump the `version` value in `pyproject.toml` 3. Make sure the `metadata.yaml` content is up to date. 4. Make sure the connector documentation and its changelog is up to date (`docs/integrations/sources/instagram.md`). 5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention). 6. Pat yourself on the back for being an awesome contributor. 7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. -8. 
Once your PR is merged, the new version of the connector will be automatically published to Docker Hub and our connector registry. \ No newline at end of file +8. Once your PR is merged, the new version of the connector will be automatically published to Docker Hub and our connector registry. diff --git a/airbyte-integrations/connectors/source-instatus/README.md b/airbyte-integrations/connectors/source-instatus/README.md index 3eaac6c4cbf..b03d1a94cb9 100644 --- a/airbyte-integrations/connectors/source-instatus/README.md +++ b/airbyte-integrations/connectors/source-instatus/README.md @@ -6,6 +6,7 @@ For information about how to use this connector within Airbyte, see [the documen ## Local development #### Create credentials + **If you are a community contributor**, follow the instructions in the [documentation](https://docs.airbyte.com/integrations/sources/instatus) to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `source_instatus/spec.yaml` file. Note that any directory named `secrets` is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. @@ -16,9 +17,10 @@ and place them into `secrets/config.json`. ### Locally running the connector docker image - #### Build + **Via [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md) (recommended):** + ```bash airbyte-ci connectors --name=source-instatus build ``` @@ -26,12 +28,15 @@ airbyte-ci connectors --name=source-instatus build An image will be built with the tag `airbyte/source-instatus:dev`. **Via `docker build`:** + ```bash docker build -t airbyte/source-instatus:dev . ``` #### Run + Then run any of the connector commands as follows: + ``` docker run --rm airbyte/source-instatus:dev spec docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-instatus:dev check --config /secrets/config.json @@ -40,23 +45,30 @@ docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integrat ``` ## Testing + You can run our full test suite locally using [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md): + ```bash airbyte-ci connectors --name=source-instatus test ``` ### Customizing acceptance Tests + Customize `acceptance-test-config.yml` file to configure tests. See [Connector Acceptance Tests](https://docs.airbyte.com/connector-development/testing-connectors/connector-acceptance-tests-reference) for more information. If your connector requires to create or destroy resources for use during acceptance tests create fixtures for it and place them inside integration_tests/acceptance.py. ## Dependency Management + All of your dependencies should go in `setup.py`, NOT `requirements.txt`. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development. We split dependencies between two groups, dependencies that are: -* required for your connector to work need to go to `MAIN_REQUIREMENTS` list. -* required for the testing need to go to `TEST_REQUIREMENTS` list + +- required for your connector to work need to go to `MAIN_REQUIREMENTS` list. +- required for the testing need to go to `TEST_REQUIREMENTS` list ### Publishing a new version of the connector + You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? + 1. 
Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-instatus test` 2. Bump the connector version in `metadata.yaml`: increment the `dockerImageTag` value. Please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors). 3. Make sure the `metadata.yaml` content is up to date. @@ -64,4 +76,3 @@ You've checked out the repo, implemented a million dollar feature, and you're re 5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention). 6. Pat yourself on the back for being an awesome contributor. 7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. - diff --git a/airbyte-integrations/connectors/source-intercom/README.md b/airbyte-integrations/connectors/source-intercom/README.md index 931a8c75dde..258f2dd41d7 100644 --- a/airbyte-integrations/connectors/source-intercom/README.md +++ b/airbyte-integrations/connectors/source-intercom/README.md @@ -1,31 +1,32 @@ # Intercom source connector - This is the repository for the Intercom source connector, written in Python. For information about how to use this connector within Airbyte, see [the documentation](https://docs.airbyte.com/integrations/sources/intercom). ## Local development ### Prerequisites -* Python (~=3.9) -* Poetry (~=1.7) - installation instructions [here](https://python-poetry.org/docs/#installation) +- Python (~=3.9) +- Poetry (~=1.7) - installation instructions [here](https://python-poetry.org/docs/#installation) ### Installing the connector + From this connector directory, run: + ```bash poetry install --with dev ``` - ### Create credentials + **If you are a community contributor**, follow the instructions in the [documentation](https://docs.airbyte.com/integrations/sources/intercom) to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `source_intercom/spec.yaml` file. Note that any directory named `secrets` is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. See `sample_files/sample_config.json` for a sample config file. - ### Locally running the connector + ``` poetry run source-intercom spec poetry run source-intercom check --config secrets/config.json @@ -34,23 +35,28 @@ poetry run source-intercom read --config secrets/config.json --catalog integrati ``` ### Running unit tests + To run unit tests locally, from the connector directory run: + ``` poetry run pytest unit_tests ``` ### Building the docker image + 1. Install [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md) 2. Run the following command to build the docker image: + ```bash airbyte-ci connectors --name=source-intercom build ``` An image will be available on your host with the tag `airbyte/source-intercom:dev`. 
- ### Running as a docker container + Then run any of the connector commands as follows: + ``` docker run --rm airbyte/source-intercom:dev spec docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-intercom:dev check --config /secrets/config.json @@ -59,18 +65,23 @@ docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integrat ``` ### Running our CI test suite + You can run our full test suite locally using [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md): + ```bash airbyte-ci connectors --name=source-intercom test ``` ### Customizing acceptance Tests + Customize `acceptance-test-config.yml` file to configure acceptance tests. See [Connector Acceptance Tests](https://docs.airbyte.com/connector-development/testing-connectors/connector-acceptance-tests-reference) for more information. If your connector requires to create or destroy resources for use during acceptance tests create fixtures for it and place them inside integration_tests/acceptance.py. ### Dependency Management -All of your dependencies should be managed via Poetry. + +All of your dependencies should be managed via Poetry. To add a new dependency, run: + ```bash poetry add ``` @@ -78,14 +89,16 @@ poetry add Please commit the changes to `pyproject.toml` and `poetry.lock` files. ## Publishing a new version of the connector + You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? + 1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-intercom test` -2. Bump the connector version (please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors)): - - bump the `dockerImageTag` value in in `metadata.yaml` - - bump the `version` value in `pyproject.toml` +2. Bump the connector version (please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors)): + - bump the `dockerImageTag` value in in `metadata.yaml` + - bump the `version` value in `pyproject.toml` 3. Make sure the `metadata.yaml` content is up to date. 4. Make sure the connector documentation and its changelog is up to date (`docs/integrations/sources/intercom.md`). 5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention). 6. Pat yourself on the back for being an awesome contributor. 7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. -8. Once your PR is merged, the new version of the connector will be automatically published to Docker Hub and our connector registry. \ No newline at end of file +8. Once your PR is merged, the new version of the connector will be automatically published to Docker Hub and our connector registry. 
diff --git a/airbyte-integrations/connectors/source-intruder/README.md b/airbyte-integrations/connectors/source-intruder/README.md index 94b8b878d4e..295fa0256ac 100644 --- a/airbyte-integrations/connectors/source-intruder/README.md +++ b/airbyte-integrations/connectors/source-intruder/README.md @@ -6,6 +6,7 @@ For information about how to use this connector within Airbyte, see [the documen ## Local development #### Create credentials + **If you are a community contributor**, follow the instructions in the [documentation](https://docs.airbyte.io/integrations/sources/intruder) to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `source_intruder/spec.yaml` file. Note that any directory named `secrets` is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. @@ -16,9 +17,10 @@ and place them into `secrets/config.json`. ### Locally running the connector docker image - #### Build + **Via [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md) (recommended):** + ```bash airbyte-ci connectors --name=source-intruder build ``` @@ -26,12 +28,15 @@ airbyte-ci connectors --name=source-intruder build An image will be built with the tag `airbyte/source-intruder:dev`. **Via `docker build`:** + ```bash docker build -t airbyte/source-intruder:dev . ``` #### Run + Then run any of the connector commands as follows: + ``` docker run --rm airbyte/source-intruder:dev spec docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-intruder:dev check --config /secrets/config.json @@ -40,23 +45,30 @@ docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integrat ``` ## Testing + You can run our full test suite locally using [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md): + ```bash airbyte-ci connectors --name=source-intruder test ``` ### Customizing acceptance Tests + Customize `acceptance-test-config.yml` file to configure tests. See [Connector Acceptance Tests](https://docs.airbyte.com/connector-development/testing-connectors/connector-acceptance-tests-reference) for more information. If your connector requires to create or destroy resources for use during acceptance tests create fixtures for it and place them inside integration_tests/acceptance.py. ## Dependency Management + All of your dependencies should go in `setup.py`, NOT `requirements.txt`. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development. We split dependencies between two groups, dependencies that are: -* required for your connector to work need to go to `MAIN_REQUIREMENTS` list. -* required for the testing need to go to `TEST_REQUIREMENTS` list + +- required for your connector to work need to go to `MAIN_REQUIREMENTS` list. +- required for the testing need to go to `TEST_REQUIREMENTS` list ### Publishing a new version of the connector + You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? + 1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-intruder test` 2. Bump the connector version in `metadata.yaml`: increment the `dockerImageTag` value. Please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors). 3. 
Make sure the `metadata.yaml` content is up to date. @@ -64,4 +76,3 @@ You've checked out the repo, implemented a million dollar feature, and you're re 5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention). 6. Pat yourself on the back for being an awesome contributor. 7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. - diff --git a/airbyte-integrations/connectors/source-ip2whois/README.md b/airbyte-integrations/connectors/source-ip2whois/README.md index e6f9a2e1253..7abc37bd6f9 100644 --- a/airbyte-integrations/connectors/source-ip2whois/README.md +++ b/airbyte-integrations/connectors/source-ip2whois/README.md @@ -1,31 +1,32 @@ # Ip2Whois source connector - This is the repository for the Ip2Whois source connector, written in Python. For information about how to use this connector within Airbyte, see [the documentation](https://docs.airbyte.com/integrations/sources/ip2whois). ## Local development ### Prerequisites -* Python (~=3.9) -* Poetry (~=1.7) - installation instructions [here](https://python-poetry.org/docs/#installation) +- Python (~=3.9) +- Poetry (~=1.7) - installation instructions [here](https://python-poetry.org/docs/#installation) ### Installing the connector + From this connector directory, run: + ```bash poetry install --with dev ``` - ### Create credentials + **If you are a community contributor**, follow the instructions in the [documentation](https://docs.airbyte.com/integrations/sources/ip2whois) to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `source_ip2whois/spec.yaml` file. Note that any directory named `secrets` is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. See `sample_files/sample_config.json` for a sample config file. - ### Locally running the connector + ``` poetry run source-ip2whois spec poetry run source-ip2whois check --config secrets/config.json @@ -34,23 +35,28 @@ poetry run source-ip2whois read --config secrets/config.json --catalog sample_fi ``` ### Running unit tests + To run unit tests locally, from the connector directory run: + ``` poetry run pytest unit_tests ``` ### Building the docker image + 1. Install [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md) 2. Run the following command to build the docker image: + ```bash airbyte-ci connectors --name=source-ip2whois build ``` An image will be available on your host with the tag `airbyte/source-ip2whois:dev`. - ### Running as a docker container + Then run any of the connector commands as follows: + ``` docker run --rm airbyte/source-ip2whois:dev spec docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-ip2whois:dev check --config /secrets/config.json @@ -59,18 +65,23 @@ docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integrat ``` ### Running our CI test suite + You can run our full test suite locally using [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md): + ```bash airbyte-ci connectors --name=source-ip2whois test ``` ### Customizing acceptance Tests + Customize `acceptance-test-config.yml` file to configure acceptance tests. See [Connector Acceptance Tests](https://docs.airbyte.com/connector-development/testing-connectors/connector-acceptance-tests-reference) for more information. 
If your connector requires to create or destroy resources for use during acceptance tests create fixtures for it and place them inside integration_tests/acceptance.py. ### Dependency Management -All of your dependencies should be managed via Poetry. + +All of your dependencies should be managed via Poetry. To add a new dependency, run: + ```bash poetry add ``` @@ -78,14 +89,16 @@ poetry add Please commit the changes to `pyproject.toml` and `poetry.lock` files. ## Publishing a new version of the connector + You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? + 1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-ip2whois test` -2. Bump the connector version (please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors)): - - bump the `dockerImageTag` value in in `metadata.yaml` - - bump the `version` value in `pyproject.toml` +2. Bump the connector version (please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors)): + - bump the `dockerImageTag` value in in `metadata.yaml` + - bump the `version` value in `pyproject.toml` 3. Make sure the `metadata.yaml` content is up to date. 4. Make sure the connector documentation and its changelog is up to date (`docs/integrations/sources/ip2whois.md`). 5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention). 6. Pat yourself on the back for being an awesome contributor. 7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. -8. Once your PR is merged, the new version of the connector will be automatically published to Docker Hub and our connector registry. \ No newline at end of file +8. Once your PR is merged, the new version of the connector will be automatically published to Docker Hub and our connector registry. diff --git a/airbyte-integrations/connectors/source-iterable/README.md b/airbyte-integrations/connectors/source-iterable/README.md index db00a882386..9d738aeedab 100644 --- a/airbyte-integrations/connectors/source-iterable/README.md +++ b/airbyte-integrations/connectors/source-iterable/README.md @@ -1,31 +1,32 @@ # Iterable source connector - This is the repository for the Iterable source connector, written in Python. For information about how to use this connector within Airbyte, see [the documentation](https://docs.airbyte.com/integrations/sources/iterable). ## Local development ### Prerequisites -* Python (~=3.9) -* Poetry (~=1.7) - installation instructions [here](https://python-poetry.org/docs/#installation) +- Python (~=3.9) +- Poetry (~=1.7) - installation instructions [here](https://python-poetry.org/docs/#installation) ### Installing the connector + From this connector directory, run: + ```bash poetry install --with dev ``` - ### Create credentials + **If you are a community contributor**, follow the instructions in the [documentation](https://docs.airbyte.com/integrations/sources/iterable) to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `source_iterable/spec.yaml` file. 
Note that any directory named `secrets` is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. See `sample_files/sample_config.json` for a sample config file. - ### Locally running the connector + ``` poetry run source-iterable spec poetry run source-iterable check --config secrets/config.json @@ -34,23 +35,28 @@ poetry run source-iterable read --config secrets/config.json --catalog sample_fi ``` ### Running unit tests + To run unit tests locally, from the connector directory run: + ``` poetry run pytest unit_tests ``` ### Building the docker image + 1. Install [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md) 2. Run the following command to build the docker image: + ```bash airbyte-ci connectors --name=source-iterable build ``` An image will be available on your host with the tag `airbyte/source-iterable:dev`. - ### Running as a docker container + Then run any of the connector commands as follows: + ``` docker run --rm airbyte/source-iterable:dev spec docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-iterable:dev check --config /secrets/config.json @@ -59,18 +65,23 @@ docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integrat ``` ### Running our CI test suite + You can run our full test suite locally using [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md): + ```bash airbyte-ci connectors --name=source-iterable test ``` ### Customizing acceptance Tests + Customize `acceptance-test-config.yml` file to configure acceptance tests. See [Connector Acceptance Tests](https://docs.airbyte.com/connector-development/testing-connectors/connector-acceptance-tests-reference) for more information. If your connector requires to create or destroy resources for use during acceptance tests create fixtures for it and place them inside integration_tests/acceptance.py. ### Dependency Management -All of your dependencies should be managed via Poetry. + +All of your dependencies should be managed via Poetry. To add a new dependency, run: + ```bash poetry add ``` @@ -78,14 +89,16 @@ poetry add Please commit the changes to `pyproject.toml` and `poetry.lock` files. ## Publishing a new version of the connector + You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? + 1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-iterable test` -2. Bump the connector version (please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors)): - - bump the `dockerImageTag` value in in `metadata.yaml` - - bump the `version` value in `pyproject.toml` +2. Bump the connector version (please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors)): + - bump the `dockerImageTag` value in in `metadata.yaml` + - bump the `version` value in `pyproject.toml` 3. Make sure the `metadata.yaml` content is up to date. 4. Make sure the connector documentation and its changelog is up to date (`docs/integrations/sources/iterable.md`). 5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention). 6. 
Pat yourself on the back for being an awesome contributor. 7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. -8. Once your PR is merged, the new version of the connector will be automatically published to Docker Hub and our connector registry. \ No newline at end of file +8. Once your PR is merged, the new version of the connector will be automatically published to Docker Hub and our connector registry. diff --git a/airbyte-integrations/connectors/source-jira/README.md b/airbyte-integrations/connectors/source-jira/README.md index a9c7ce7bf48..ef5c0dd5e91 100644 --- a/airbyte-integrations/connectors/source-jira/README.md +++ b/airbyte-integrations/connectors/source-jira/README.md @@ -1,31 +1,32 @@ # Jira source connector - This is the repository for the Jira source connector, written in Python. For information about how to use this connector within Airbyte, see [the documentation](https://docs.airbyte.com/integrations/sources/jira). ## Local development ### Prerequisites -* Python (~=3.9) -* Poetry (~=1.7) - installation instructions [here](https://python-poetry.org/docs/#installation) +- Python (~=3.9) +- Poetry (~=1.7) - installation instructions [here](https://python-poetry.org/docs/#installation) ### Installing the connector + From this connector directory, run: + ```bash poetry install --with dev ``` - ### Create credentials + **If you are a community contributor**, follow the instructions in the [documentation](https://docs.airbyte.com/integrations/sources/jira) to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `source_jira/spec.yaml` file. Note that any directory named `secrets` is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. See `sample_files/sample_config.json` for a sample config file. - ### Locally running the connector + ``` poetry run source-jira spec poetry run source-jira check --config secrets/config.json @@ -34,23 +35,28 @@ poetry run source-jira read --config secrets/config.json --catalog sample_files/ ``` ### Running unit tests + To run unit tests locally, from the connector directory run: + ``` poetry run pytest unit_tests ``` ### Building the docker image + 1. Install [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md) 2. Run the following command to build the docker image: + ```bash airbyte-ci connectors --name=source-jira build ``` An image will be available on your host with the tag `airbyte/source-jira:dev`. - ### Running as a docker container + Then run any of the connector commands as follows: + ``` docker run --rm airbyte/source-jira:dev spec docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-jira:dev check --config /secrets/config.json @@ -59,18 +65,23 @@ docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integrat ``` ### Running our CI test suite + You can run our full test suite locally using [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md): + ```bash airbyte-ci connectors --name=source-jira test ``` ### Customizing acceptance Tests + Customize `acceptance-test-config.yml` file to configure acceptance tests. See [Connector Acceptance Tests](https://docs.airbyte.com/connector-development/testing-connectors/connector-acceptance-tests-reference) for more information. 
If your connector requires to create or destroy resources for use during acceptance tests create fixtures for it and place them inside integration_tests/acceptance.py. ### Dependency Management -All of your dependencies should be managed via Poetry. + +All of your dependencies should be managed via Poetry. To add a new dependency, run: + ```bash poetry add ``` @@ -78,14 +89,16 @@ poetry add Please commit the changes to `pyproject.toml` and `poetry.lock` files. ## Publishing a new version of the connector + You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? + 1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-jira test` -2. Bump the connector version (please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors)): - - bump the `dockerImageTag` value in in `metadata.yaml` - - bump the `version` value in `pyproject.toml` +2. Bump the connector version (please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors)): + - bump the `dockerImageTag` value in in `metadata.yaml` + - bump the `version` value in `pyproject.toml` 3. Make sure the `metadata.yaml` content is up to date. 4. Make sure the connector documentation and its changelog is up to date (`docs/integrations/sources/jira.md`). 5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention). 6. Pat yourself on the back for being an awesome contributor. 7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. -8. Once your PR is merged, the new version of the connector will be automatically published to Docker Hub and our connector registry. \ No newline at end of file +8. Once your PR is merged, the new version of the connector will be automatically published to Docker Hub and our connector registry. diff --git a/airbyte-integrations/connectors/source-k6-cloud/README.md b/airbyte-integrations/connectors/source-k6-cloud/README.md index 37aae40d63c..5e7f37f873e 100644 --- a/airbyte-integrations/connectors/source-k6-cloud/README.md +++ b/airbyte-integrations/connectors/source-k6-cloud/README.md @@ -1,31 +1,32 @@ # K6-Cloud source connector - This is the repository for the K6-Cloud source connector, written in Python. For information about how to use this connector within Airbyte, see [the documentation](https://docs.airbyte.com/integrations/sources/k6-cloud). ## Local development ### Prerequisites -* Python (~=3.9) -* Poetry (~=1.7) - installation instructions [here](https://python-poetry.org/docs/#installation) +- Python (~=3.9) +- Poetry (~=1.7) - installation instructions [here](https://python-poetry.org/docs/#installation) ### Installing the connector + From this connector directory, run: + ```bash poetry install --with dev ``` - ### Create credentials + **If you are a community contributor**, follow the instructions in the [documentation](https://docs.airbyte.com/integrations/sources/k6-cloud) to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `source_k6_cloud/spec.yaml` file. 
Note that any directory named `secrets` is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. See `sample_files/sample_config.json` for a sample config file. - ### Locally running the connector + ``` poetry run source-k6-cloud spec poetry run source-k6-cloud check --config secrets/config.json @@ -34,23 +35,28 @@ poetry run source-k6-cloud read --config secrets/config.json --catalog sample_fi ``` ### Running unit tests + To run unit tests locally, from the connector directory run: + ``` poetry run pytest unit_tests ``` ### Building the docker image + 1. Install [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md) 2. Run the following command to build the docker image: + ```bash airbyte-ci connectors --name=source-k6-cloud build ``` An image will be available on your host with the tag `airbyte/source-k6-cloud:dev`. - ### Running as a docker container + Then run any of the connector commands as follows: + ``` docker run --rm airbyte/source-k6-cloud:dev spec docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-k6-cloud:dev check --config /secrets/config.json @@ -59,18 +65,23 @@ docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integrat ``` ### Running our CI test suite + You can run our full test suite locally using [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md): + ```bash airbyte-ci connectors --name=source-k6-cloud test ``` ### Customizing acceptance Tests + Customize `acceptance-test-config.yml` file to configure acceptance tests. See [Connector Acceptance Tests](https://docs.airbyte.com/connector-development/testing-connectors/connector-acceptance-tests-reference) for more information. If your connector requires to create or destroy resources for use during acceptance tests create fixtures for it and place them inside integration_tests/acceptance.py. ### Dependency Management -All of your dependencies should be managed via Poetry. + +All of your dependencies should be managed via Poetry. To add a new dependency, run: + ```bash poetry add ``` @@ -78,14 +89,16 @@ poetry add Please commit the changes to `pyproject.toml` and `poetry.lock` files. ## Publishing a new version of the connector + You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? + 1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-k6-cloud test` -2. Bump the connector version (please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors)): - - bump the `dockerImageTag` value in in `metadata.yaml` - - bump the `version` value in `pyproject.toml` +2. Bump the connector version (please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors)): + - bump the `dockerImageTag` value in in `metadata.yaml` + - bump the `version` value in `pyproject.toml` 3. Make sure the `metadata.yaml` content is up to date. 4. Make sure the connector documentation and its changelog is up to date (`docs/integrations/sources/k6-cloud.md`). 5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention). 6. 
Pat yourself on the back for being an awesome contributor. 7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. -8. Once your PR is merged, the new version of the connector will be automatically published to Docker Hub and our connector registry. \ No newline at end of file +8. Once your PR is merged, the new version of the connector will be automatically published to Docker Hub and our connector registry. diff --git a/airbyte-integrations/connectors/source-kafka/README.md b/airbyte-integrations/connectors/source-kafka/README.md index 342b3758e16..5c043247bc8 100644 --- a/airbyte-integrations/connectors/source-kafka/README.md +++ b/airbyte-integrations/connectors/source-kafka/README.md @@ -1,4 +1,4 @@ -# Kafka Source +# Kafka Source This is the repository for the Kafka source connector. For information about how to use this connector within Airbyte, see [the documentation](https://docs.airbyte.io/integrations/sources/kafka). @@ -6,12 +6,15 @@ For information about how to use this connector within Airbyte, see [the documen ## Local development #### Building via Gradle + From the Airbyte repository root, run: + ``` ./gradlew :airbyte-integrations:connectors:source-kafka:build ``` #### Create credentials + **If you are a community contributor**, generate the necessary credentials and place them in `secrets/config.json` conforming to the spec file in `src/main/resources/spec.json`. Note that the `secrets` directory is git-ignored by default, so there is no danger of accidentally checking in sensitive information. @@ -20,16 +23,20 @@ Note that the `secrets` directory is git-ignored by default, so there is no dang ### Locally running the connector docker image #### Build + Build the connector image via Gradle: ``` ./gradlew :airbyte-integrations:connectors:source-kafka:buildConnectorImage ``` + Once built, the docker image name and tag on your host will be `airbyte/source-kafka:dev`. the Dockerfile. #### Run + Then run any of the connector commands as follows: + ``` docker run --rm airbyte/source-kafka:dev spec docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-kafka:dev check --config /secrets/config.json @@ -38,16 +45,21 @@ docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integrat ``` ## Testing + We use `JUnit` for Java tests. ### Unit and Integration Tests + Place unit tests under `src/test/io/airbyte/integrations/source/kafka`. #### Acceptance Tests + Airbyte has a standard test suite that all source connectors must pass. ### Using gradle to run tests + All commands should be run from airbyte project root. To run acceptance and custom integration tests: + ``` ./gradlew :airbyte-integrations:connectors:source-kafka:integrationTest ``` @@ -55,7 +67,9 @@ All commands should be run from airbyte project root. To run acceptance and cust ## Dependency Management ### Publishing a new version of the connector + You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? + 1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-kafka test` 2. Bump the connector version in `metadata.yaml`: increment the `dockerImageTag` value. Please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors). 3. Make sure the `metadata.yaml` content is up to date. 
@@ -63,4 +77,3 @@ You've checked out the repo, implemented a million dollar feature, and you're re 5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention). 6. Pat yourself on the back for being an awesome contributor. 7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. - diff --git a/airbyte-integrations/connectors/source-klarna/README.md b/airbyte-integrations/connectors/source-klarna/README.md index 7f303b280a0..c6284c23b37 100644 --- a/airbyte-integrations/connectors/source-klarna/README.md +++ b/airbyte-integrations/connectors/source-klarna/README.md @@ -1,31 +1,32 @@ # Klarna source connector - This is the repository for the Klarna source connector, written in Python. For information about how to use this connector within Airbyte, see [the documentation](https://docs.airbyte.com/integrations/sources/klarna). ## Local development ### Prerequisites -* Python (~=3.9) -* Poetry (~=1.7) - installation instructions [here](https://python-poetry.org/docs/#installation) +- Python (~=3.9) +- Poetry (~=1.7) - installation instructions [here](https://python-poetry.org/docs/#installation) ### Installing the connector + From this connector directory, run: + ```bash poetry install --with dev ``` - ### Create credentials + **If you are a community contributor**, follow the instructions in the [documentation](https://docs.airbyte.com/integrations/sources/klarna) to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `source_klarna/spec.yaml` file. Note that any directory named `secrets` is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. See `sample_files/sample_config.json` for a sample config file. - ### Locally running the connector + ``` poetry run source-klarna spec poetry run source-klarna check --config secrets/config.json @@ -34,23 +35,28 @@ poetry run source-klarna read --config secrets/config.json --catalog sample_file ``` ### Running unit tests + To run unit tests locally, from the connector directory run: + ``` poetry run pytest unit_tests ``` ### Building the docker image + 1. Install [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md) 2. Run the following command to build the docker image: + ```bash airbyte-ci connectors --name=source-klarna build ``` An image will be available on your host with the tag `airbyte/source-klarna:dev`. - ### Running as a docker container + Then run any of the connector commands as follows: + ``` docker run --rm airbyte/source-klarna:dev spec docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-klarna:dev check --config /secrets/config.json @@ -59,18 +65,23 @@ docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integrat ``` ### Running our CI test suite + You can run our full test suite locally using [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md): + ```bash airbyte-ci connectors --name=source-klarna test ``` ### Customizing acceptance Tests + Customize `acceptance-test-config.yml` file to configure acceptance tests. See [Connector Acceptance Tests](https://docs.airbyte.com/connector-development/testing-connectors/connector-acceptance-tests-reference) for more information. 
If your connector requires to create or destroy resources for use during acceptance tests create fixtures for it and place them inside integration_tests/acceptance.py. ### Dependency Management -All of your dependencies should be managed via Poetry. + +All of your dependencies should be managed via Poetry. To add a new dependency, run: + ```bash poetry add ``` @@ -78,14 +89,16 @@ poetry add Please commit the changes to `pyproject.toml` and `poetry.lock` files. ## Publishing a new version of the connector + You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? + 1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-klarna test` -2. Bump the connector version (please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors)): - - bump the `dockerImageTag` value in in `metadata.yaml` - - bump the `version` value in `pyproject.toml` +2. Bump the connector version (please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors)): + - bump the `dockerImageTag` value in in `metadata.yaml` + - bump the `version` value in `pyproject.toml` 3. Make sure the `metadata.yaml` content is up to date. 4. Make sure the connector documentation and its changelog is up to date (`docs/integrations/sources/klarna.md`). 5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention). 6. Pat yourself on the back for being an awesome contributor. 7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. -8. Once your PR is merged, the new version of the connector will be automatically published to Docker Hub and our connector registry. \ No newline at end of file +8. Once your PR is merged, the new version of the connector will be automatically published to Docker Hub and our connector registry. diff --git a/airbyte-integrations/connectors/source-klaus-api/README.md b/airbyte-integrations/connectors/source-klaus-api/README.md index 34a602108bc..00593dca6bd 100644 --- a/airbyte-integrations/connectors/source-klaus-api/README.md +++ b/airbyte-integrations/connectors/source-klaus-api/README.md @@ -6,23 +6,28 @@ For information about how to use this connector within Airbyte, see [the documen ## Local development ### Prerequisites + **To iterate on this connector, make sure to complete this prerequisites section.** #### Minimum Python version required `= 3.9.0` #### Build & Activate Virtual Environment and install dependencies + From this connector directory, create a virtual environment: + ``` python -m venv .venv ``` This will generate a virtualenv for this module in `.venv/`. Make sure this venv is active in your development environment of choice. To activate it from the terminal, run: + ``` source .venv/bin/activate pip install -r requirements.txt pip install '.[tests]' ``` + If you are in an IDE, follow your IDE's instructions to activate the virtualenv. Note that while we are installing dependencies from `requirements.txt`, you should only edit `setup.py` for your dependencies. `requirements.txt` is @@ -31,14 +36,17 @@ If this is mumbo jumbo to you, don't worry about it, just put your deps in `setu should work as you expect. 
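As a rough illustration of where those dependencies end up, a connector's `setup.py` generally looks something like the sketch below. The package names, version pins, and the `source_example` name are placeholders, not this connector's actual configuration.

```python
# Hypothetical sketch of how dependencies are declared in setup.py;
# the real names and pins live in this connector's own setup.py.
from setuptools import find_packages, setup

MAIN_REQUIREMENTS = ["airbyte-cdk"]                        # needed at runtime
TEST_REQUIREMENTS = ["pytest~=6.2", "requests-mock~=1.9"]  # needed only for tests

setup(
    name="source_example",   # placeholder package name
    version="0.1.0",         # placeholder version
    packages=find_packages(),
    install_requires=MAIN_REQUIREMENTS,
    extras_require={"tests": TEST_REQUIREMENTS},
)
```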
#### Building via Gradle + You can also build the connector in Gradle. This is typically used in CI and not needed for your development workflow. To build using Gradle, from the Airbyte repository root, run: + ``` ./gradlew :airbyte-integrations:connectors:source-klaus-api:build ``` #### Create credentials + **If you are a community contributor**, follow the instructions in the [documentation](https://docs.airbyte.com/integrations/sources/klaus-api) to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `source_klaus_api/spec.yaml` file. Note that any directory named `secrets` is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. @@ -48,6 +56,7 @@ See `integration_tests/sample_config.json` for a sample config file. and place them into `secrets/config.json`. ### Locally running the connector + ``` python main.py spec python main.py check --config secrets/config.json @@ -58,79 +67,107 @@ python main.py read --config secrets/config.json --catalog integration_tests/con ### Locally running the connector docker image #### Build + First, make sure you build the latest Docker image: + ``` docker build . -t airbyte/source-klaus-api:dev ``` If you want to build the Docker image with the CDK on your local machine (rather than the most recent package published to pypi), from the airbyte base directory run: + ```bash CONNECTOR_TAG= CONNECTOR_NAME= sh airbyte-integrations/scripts/build-connector-image-with-local-cdk.sh ``` - You can also build the connector image via Gradle: + ``` ./gradlew :airbyte-integrations:connectors:source-klaus-api:airbyteDocker ``` + When building via Gradle, the docker image name and tag, respectively, are the values of the `io.airbyte.name` and `io.airbyte.version` `LABEL`s in the Dockerfile. #### Run + Then run any of the connector commands as follows: + ``` docker run --rm airbyte/source-klaus-api:dev spec docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-klaus-api:dev check --config /secrets/config.json docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-klaus-api:dev discover --config /secrets/config.json docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integration_tests airbyte/source-klaus-api:dev read --config /secrets/config.json --catalog /integration_tests/configured_catalog.json ``` + ## Testing + Make sure to familiarize yourself with [pytest test discovery](https://docs.pytest.org/en/latest/goodpractices.html#test-discovery) to know how your test files and methods should be named. First install test dependencies into your virtual environment: + ``` pip install .[tests] ``` + ### Unit Tests + To run unit tests locally, from the connector directory run: + ``` python -m pytest unit_tests ``` ### Integration Tests + There are two types of integration tests: Acceptance Tests (Airbyte's test suite for all source connectors) and custom integration tests (which are specific to this connector). + #### Custom Integration tests + Place custom tests inside `integration_tests/` folder, then, from the connector root, run + ``` python -m pytest integration_tests ``` + #### Acceptance Tests + Customize `acceptance-test-config.yml` file to configure tests. See [Connector Acceptance Tests](https://docs.airbyte.com/connector-development/testing-connectors/connector-acceptance-tests-reference) for more information. 
If your connector requires to create or destroy resources for use during acceptance tests create fixtures for it and place them inside integration_tests/acceptance.py. To run your integration tests with acceptance tests, from the connector root, run + ``` python -m pytest integration_tests -p integration_tests.acceptance ``` + To run your integration tests with docker ### Using gradle to run tests + All commands should be run from airbyte project root. To run unit tests: + ``` ./gradlew :airbyte-integrations:connectors:source-klaus-api:unitTest ``` + To run acceptance and custom integration tests: + ``` ./gradlew :airbyte-integrations:connectors:source-klaus-api:integrationTest ``` ## Dependency Management + All of your dependencies should go in `setup.py`, NOT `requirements.txt`. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development. We split dependencies between two groups, dependencies that are: -* required for your connector to work need to go to `MAIN_REQUIREMENTS` list. -* required for the testing need to go to `TEST_REQUIREMENTS` list + +- required for your connector to work need to go to `MAIN_REQUIREMENTS` list. +- required for the testing need to go to `TEST_REQUIREMENTS` list ### Publishing a new version of the connector + You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? + 1. Make sure your changes are passing unit and integration tests. 1. Bump the connector version in `Dockerfile` -- just increment the value of the `LABEL io.airbyte.version` appropriately (we use [SemVer](https://semver.org/)). 1. Create a Pull Request. diff --git a/airbyte-integrations/connectors/source-klaviyo/README.md b/airbyte-integrations/connectors/source-klaviyo/README.md index 76b9e4d8d6e..1a876c45272 100644 --- a/airbyte-integrations/connectors/source-klaviyo/README.md +++ b/airbyte-integrations/connectors/source-klaviyo/README.md @@ -1,31 +1,32 @@ # Klaviyo source connector - This is the repository for the Klaviyo source connector, written in Python. For information about how to use this connector within Airbyte, see [the documentation](https://docs.airbyte.com/integrations/sources/klaviyo). ## Local development ### Prerequisites -* Python (~=3.9) -* Poetry (~=1.7) - installation instructions [here](https://python-poetry.org/docs/#installation) +- Python (~=3.9) +- Poetry (~=1.7) - installation instructions [here](https://python-poetry.org/docs/#installation) ### Installing the connector + From this connector directory, run: + ```bash poetry install --with dev ``` - ### Create credentials + **If you are a community contributor**, follow the instructions in the [documentation](https://docs.airbyte.com/integrations/sources/klaviyo) to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `source_klaviyo/spec.yaml` file. Note that any directory named `secrets` is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. See `sample_files/sample_config.json` for a sample config file. 
- ### Locally running the connector + ``` poetry run source-klaviyo spec poetry run source-klaviyo check --config secrets/config.json @@ -34,23 +35,28 @@ poetry run source-klaviyo read --config secrets/config.json --catalog integratio ``` ### Running unit tests + To run unit tests locally, from the connector directory run: + ``` poetry run pytest unit_tests ``` ### Building the docker image + 1. Install [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md) 2. Run the following command to build the docker image: + ```bash airbyte-ci connectors --name=source-klaviyo build ``` An image will be available on your host with the tag `airbyte/source-klaviyo:dev`. - ### Running as a docker container + Then run any of the connector commands as follows: + ``` docker run --rm airbyte/source-klaviyo:dev spec docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-klaviyo:dev check --config /secrets/config.json @@ -59,18 +65,23 @@ docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integrat ``` ### Running our CI test suite + You can run our full test suite locally using [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md): + ```bash airbyte-ci connectors --name=source-klaviyo test ``` ### Customizing acceptance Tests + Customize `acceptance-test-config.yml` file to configure acceptance tests. See [Connector Acceptance Tests](https://docs.airbyte.com/connector-development/testing-connectors/connector-acceptance-tests-reference) for more information. If your connector requires to create or destroy resources for use during acceptance tests create fixtures for it and place them inside integration_tests/acceptance.py. ### Dependency Management -All of your dependencies should be managed via Poetry. + +All of your dependencies should be managed via Poetry. To add a new dependency, run: + ```bash poetry add ``` @@ -78,14 +89,16 @@ poetry add Please commit the changes to `pyproject.toml` and `poetry.lock` files. ## Publishing a new version of the connector + You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? + 1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-klaviyo test` -2. Bump the connector version (please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors)): - - bump the `dockerImageTag` value in in `metadata.yaml` - - bump the `version` value in `pyproject.toml` +2. Bump the connector version (please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors)): + - bump the `dockerImageTag` value in in `metadata.yaml` + - bump the `version` value in `pyproject.toml` 3. Make sure the `metadata.yaml` content is up to date. 4. Make sure the connector documentation and its changelog is up to date (`docs/integrations/sources/klaviyo.md`). 5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention). 6. Pat yourself on the back for being an awesome contributor. 7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. -8. 
Once your PR is merged, the new version of the connector will be automatically published to Docker Hub and our connector registry. \ No newline at end of file +8. Once your PR is merged, the new version of the connector will be automatically published to Docker Hub and our connector registry. diff --git a/airbyte-integrations/connectors/source-kyriba/README.md b/airbyte-integrations/connectors/source-kyriba/README.md index cecc4e07395..1d0ce2d4725 100644 --- a/airbyte-integrations/connectors/source-kyriba/README.md +++ b/airbyte-integrations/connectors/source-kyriba/README.md @@ -1,31 +1,32 @@ # Kyriba source connector - This is the repository for the Kyriba source connector, written in Python. For information about how to use this connector within Airbyte, see [the documentation](https://docs.airbyte.com/integrations/sources/kyriba). ## Local development ### Prerequisites -* Python (~=3.9) -* Poetry (~=1.7) - installation instructions [here](https://python-poetry.org/docs/#installation) +- Python (~=3.9) +- Poetry (~=1.7) - installation instructions [here](https://python-poetry.org/docs/#installation) ### Installing the connector + From this connector directory, run: + ```bash poetry install --with dev ``` - ### Create credentials + **If you are a community contributor**, follow the instructions in the [documentation](https://docs.airbyte.com/integrations/sources/kyriba) to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `source_kyriba/spec.yaml` file. Note that any directory named `secrets` is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. See `sample_files/sample_config.json` for a sample config file. - ### Locally running the connector + ``` poetry run source-kyriba spec poetry run source-kyriba check --config secrets/config.json @@ -34,23 +35,28 @@ poetry run source-kyriba read --config secrets/config.json --catalog sample_file ``` ### Running unit tests + To run unit tests locally, from the connector directory run: + ``` poetry run pytest unit_tests ``` ### Building the docker image + 1. Install [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md) 2. Run the following command to build the docker image: + ```bash airbyte-ci connectors --name=source-kyriba build ``` An image will be available on your host with the tag `airbyte/source-kyriba:dev`. - ### Running as a docker container + Then run any of the connector commands as follows: + ``` docker run --rm airbyte/source-kyriba:dev spec docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-kyriba:dev check --config /secrets/config.json @@ -59,18 +65,23 @@ docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integrat ``` ### Running our CI test suite + You can run our full test suite locally using [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md): + ```bash airbyte-ci connectors --name=source-kyriba test ``` ### Customizing acceptance Tests + Customize `acceptance-test-config.yml` file to configure acceptance tests. See [Connector Acceptance Tests](https://docs.airbyte.com/connector-development/testing-connectors/connector-acceptance-tests-reference) for more information. If your connector requires to create or destroy resources for use during acceptance tests create fixtures for it and place them inside integration_tests/acceptance.py. 
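For reference, a minimal `integration_tests/acceptance.py` typically just registers the acceptance-test plugin and exposes a session-scoped fixture where any external setup and teardown can live. The sketch below is illustrative; adapt the fixture body to whatever resources your connector actually needs.

```python
# Illustrative sketch of integration_tests/acceptance.py; adjust to your connector's needs.
import pytest

pytest_plugins = ("connector_acceptance_test.plugin",)


@pytest.fixture(scope="session", autouse=True)
def connector_setup():
    """Create any resources the acceptance tests need, then clean them up afterwards."""
    # e.g. seed a sandbox account or test records here (hypothetical setup step)
    yield
    # e.g. delete the seeded resources here (hypothetical teardown step)
```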
### Dependency Management -All of your dependencies should be managed via Poetry. + +All of your dependencies should be managed via Poetry. To add a new dependency, run: + ```bash poetry add ``` @@ -78,14 +89,16 @@ poetry add Please commit the changes to `pyproject.toml` and `poetry.lock` files. ## Publishing a new version of the connector + You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? + 1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-kyriba test` -2. Bump the connector version (please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors)): - - bump the `dockerImageTag` value in in `metadata.yaml` - - bump the `version` value in `pyproject.toml` +2. Bump the connector version (please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors)): + - bump the `dockerImageTag` value in in `metadata.yaml` + - bump the `version` value in `pyproject.toml` 3. Make sure the `metadata.yaml` content is up to date. 4. Make sure the connector documentation and its changelog is up to date (`docs/integrations/sources/kyriba.md`). 5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention). 6. Pat yourself on the back for being an awesome contributor. 7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. -8. Once your PR is merged, the new version of the connector will be automatically published to Docker Hub and our connector registry. \ No newline at end of file +8. Once your PR is merged, the new version of the connector will be automatically published to Docker Hub and our connector registry. diff --git a/airbyte-integrations/connectors/source-kyve/README.md b/airbyte-integrations/connectors/source-kyve/README.md index 481e5466ffd..f0a226ccf22 100644 --- a/airbyte-integrations/connectors/source-kyve/README.md +++ b/airbyte-integrations/connectors/source-kyve/README.md @@ -9,21 +9,22 @@ For information about how to set up an end-to-end pipeline with this connector, ## Source configuration setup -1. In order to create an ELT pipeline with KYVE source you should specify the **`Pool-ID`** of the [KYVE storage pool](https://app.kyve.network/#/pools) from which you want to retrieve data. +1. In order to create an ELT pipeline with KYVE source you should specify the **`Pool-ID`** of the [KYVE storage pool](https://app.kyve.network/#/pools) from which you want to retrieve data. 2. You can specify a specific **`Bundle-Start-ID`** in case you want to narrow the records that will be retrieved from the pool. You can find the valid bundles in the KYVE app (e.g. [Cosmos Hub pool](https://app.kyve.network/#/pools/0/bundles)). 3. In order to extract the validated data from KYVE, you can specify the endpoint which will be requested **`KYVE-API URL Base`**. By default, the official KYVE **`mainnet`** endpoint will be used, providing the data of [these pools](https://app.kyve.network/#/pools). - ***Note:*** - KYVE Network consists of three individual networks: *Korellia* is the `devnet` used for development purposes, *Kaon* is the `testnet` used for testing purposes, and **`mainnet`** is the official network. 
Although through Kaon and Korellia validated data can be used for development purposes, it is recommended to only trust the data validated on Mainnet. + **_Note:_** + KYVE Network consists of three individual networks: _Korellia_ is the `devnet` used for development purposes, _Kaon_ is the `testnet` used for testing purposes, and **`mainnet`** is the official network. Although through Kaon and Korellia validated data can be used for development purposes, it is recommended to only trust the data validated on Mainnet. ## Multiple pools + You can fetch with one source configuration more than one pool simultaneously. You just need to specify the **`Pool-IDs`** and the **`Bundle-Start-IDs`** for the KYVE storage pool you want to archive separated with comma. ## Changelog | Version | Date | Subject | -| :------ |:---------|:-----------------------------------------------------| +| :------ | :------- | :--------------------------------------------------- | | 0.1.0 | 25-05-23 | Initial release of KYVE source connector | -| 0.2.0 | 10-11-23 | Update KYVE source to support to Mainnet and Testnet | \ No newline at end of file +| 0.2.0 | 10-11-23 | Update KYVE source to support to Mainnet and Testnet | diff --git a/airbyte-integrations/connectors/source-launchdarkly/README.md b/airbyte-integrations/connectors/source-launchdarkly/README.md index 3fc1a43a35a..18e91816d5f 100644 --- a/airbyte-integrations/connectors/source-launchdarkly/README.md +++ b/airbyte-integrations/connectors/source-launchdarkly/README.md @@ -6,6 +6,7 @@ For information about how to use this connector within Airbyte, see [the documen ## Local development #### Create credentials + **If you are a community contributor**, follow the instructions in the [documentation](https://docs.airbyte.io/integrations/sources/launchdarkly) to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `source_launchdarkly/spec.yaml` file. Note that any directory named `secrets` is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. @@ -16,9 +17,10 @@ and place them into `secrets/config.json`. ### Locally running the connector docker image - #### Build + **Via [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md) (recommended):** + ```bash airbyte-ci connectors --name=source-launchdarkly build ``` @@ -26,12 +28,15 @@ airbyte-ci connectors --name=source-launchdarkly build An image will be built with the tag `airbyte/source-launchdarkly:dev`. **Via `docker build`:** + ```bash docker build -t airbyte/source-launchdarkly:dev . ``` #### Run + Then run any of the connector commands as follows: + ``` docker run --rm airbyte/source-launchdarkly:dev spec docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-launchdarkly:dev check --config /secrets/config.json @@ -40,23 +45,30 @@ docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integrat ``` ## Testing + You can run our full test suite locally using [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md): + ```bash airbyte-ci connectors --name=source-launchdarkly test ``` ### Customizing acceptance Tests + Customize `acceptance-test-config.yml` file to configure tests. See [Connector Acceptance Tests](https://docs.airbyte.com/connector-development/testing-connectors/connector-acceptance-tests-reference) for more information. 
If your connector requires to create or destroy resources for use during acceptance tests create fixtures for it and place them inside integration_tests/acceptance.py. ## Dependency Management + All of your dependencies should go in `setup.py`, NOT `requirements.txt`. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development. We split dependencies between two groups, dependencies that are: -* required for your connector to work need to go to `MAIN_REQUIREMENTS` list. -* required for the testing need to go to `TEST_REQUIREMENTS` list + +- required for your connector to work need to go to `MAIN_REQUIREMENTS` list. +- required for the testing need to go to `TEST_REQUIREMENTS` list ### Publishing a new version of the connector + You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? + 1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-launchdarkly test` 2. Bump the connector version in `metadata.yaml`: increment the `dockerImageTag` value. Please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors). 3. Make sure the `metadata.yaml` content is up to date. @@ -64,4 +76,3 @@ You've checked out the repo, implemented a million dollar feature, and you're re 5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention). 6. Pat yourself on the back for being an awesome contributor. 7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. - diff --git a/airbyte-integrations/connectors/source-lemlist/README.md b/airbyte-integrations/connectors/source-lemlist/README.md index 049a94cddb9..4b8cc22f6d0 100644 --- a/airbyte-integrations/connectors/source-lemlist/README.md +++ b/airbyte-integrations/connectors/source-lemlist/README.md @@ -6,6 +6,7 @@ For information about how to use this connector within Airbyte, see [the documen ## Local development #### Create credentials + **If you are a community contributor**, follow the instructions in the [documentation](https://docs.airbyte.com/integrations/sources/lemlist) to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `source_lemlist/spec.yaml` file. Note that any directory named `secrets` is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. @@ -16,9 +17,10 @@ and place them into `secrets/config.json`. ### Locally running the connector docker image - #### Build + **Via [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md) (recommended):** + ```bash airbyte-ci connectors --name=source-lemlist build ``` @@ -26,12 +28,15 @@ airbyte-ci connectors --name=source-lemlist build An image will be built with the tag `airbyte/source-lemlist:dev`. **Via `docker build`:** + ```bash docker build -t airbyte/source-lemlist:dev . 
``` #### Run + Then run any of the connector commands as follows: + ``` docker run --rm airbyte/source-lemlist:dev spec docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-lemlist:dev check --config /secrets/config.json @@ -40,23 +45,30 @@ docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integrat ``` ## Testing + You can run our full test suite locally using [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md): + ```bash airbyte-ci connectors --name=source-lemlist test ``` ### Customizing acceptance Tests + Customize `acceptance-test-config.yml` file to configure tests. See [Connector Acceptance Tests](https://docs.airbyte.com/connector-development/testing-connectors/connector-acceptance-tests-reference) for more information. If your connector requires to create or destroy resources for use during acceptance tests create fixtures for it and place them inside integration_tests/acceptance.py. ## Dependency Management + All of your dependencies should go in `setup.py`, NOT `requirements.txt`. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development. We split dependencies between two groups, dependencies that are: -* required for your connector to work need to go to `MAIN_REQUIREMENTS` list. -* required for the testing need to go to `TEST_REQUIREMENTS` list + +- required for your connector to work need to go to `MAIN_REQUIREMENTS` list. +- required for the testing need to go to `TEST_REQUIREMENTS` list ### Publishing a new version of the connector + You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? + 1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-lemlist test` 2. Bump the connector version in `metadata.yaml`: increment the `dockerImageTag` value. Please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors). 3. Make sure the `metadata.yaml` content is up to date. @@ -64,4 +76,3 @@ You've checked out the repo, implemented a million dollar feature, and you're re 5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention). 6. Pat yourself on the back for being an awesome contributor. 7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. - diff --git a/airbyte-integrations/connectors/source-lever-hiring/README.md b/airbyte-integrations/connectors/source-lever-hiring/README.md index 168feee4a8c..db375ae1a26 100644 --- a/airbyte-integrations/connectors/source-lever-hiring/README.md +++ b/airbyte-integrations/connectors/source-lever-hiring/README.md @@ -5,22 +5,27 @@ This is the repository for the Lever Hiring source connector, written in Python. ## Local development ### Prerequisites + **To iterate on this connector, make sure to complete this prerequisites section.** #### Minimum Python version required `= 3.7.0` #### Build & Activate Virtual Environment and install dependencies + From this connector directory, create a virtual environment: + ``` python -m venv .venv ``` This will generate a virtualenv for this module in `.venv/`. Make sure this venv is active in your development environment of choice. 
To activate it from the terminal, run: + ``` source .venv/bin/activate pip install -r requirements.txt ``` + If you are in an IDE, follow your IDE's instructions to activate the virtualenv. Note that while we are installing dependencies from `requirements.txt`, you should only edit `setup.py` for your dependencies. `requirements.txt` is @@ -29,6 +34,7 @@ If this is mumbo jumbo to you, don't worry about it, just put your deps in `setu should work as you expect. #### Create credentials + **If you are a community contributor**, get the necessary credentials. Then create a file `secrets/config.json` conforming to the `source_lever_hiring/spec.json` file. Note that any directory named `secrets` is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. See `integration_tests/sample_config.json` for a sample config file. @@ -37,6 +43,7 @@ See `integration_tests/sample_config.json` for a sample config file. and place them into `secrets/config.json`. ### Locally running the connector + ``` python main.py spec python main.py check --config secrets/config.json @@ -46,9 +53,10 @@ python main.py read --config secrets/config.json --catalog integration_tests/con ### Locally running the connector docker image - #### Build + **Via [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md) (recommended):** + ```bash airbyte-ci connectors --name=source-lever-hiring build ``` @@ -56,12 +64,15 @@ airbyte-ci connectors --name=source-lever-hiring build An image will be built with the tag `airbyte/source-lever-hiring:dev`. **Via `docker build`:** + ```bash docker build -t airbyte/source-lever-hiring:dev . ``` #### Run + Then run any of the connector commands as follows: + ``` docker run --rm airbyte/source-lever-hiring:dev spec docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-lever-hiring:dev check --config /secrets/config.json @@ -70,23 +81,30 @@ docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integrat ``` ## Testing + You can run our full test suite locally using [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md): + ```bash airbyte-ci connectors --name=source-lever-hiring test ``` ### Customizing acceptance Tests + Customize `acceptance-test-config.yml` file to configure tests. See [Connector Acceptance Tests](https://docs.airbyte.com/connector-development/testing-connectors/connector-acceptance-tests-reference) for more information. If your connector requires to create or destroy resources for use during acceptance tests create fixtures for it and place them inside integration_tests/acceptance.py. ## Dependency Management + All of your dependencies should go in `setup.py`, NOT `requirements.txt`. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development. We split dependencies between two groups, dependencies that are: -* required for your connector to work need to go to `MAIN_REQUIREMENTS` list. -* required for the testing need to go to `TEST_REQUIREMENTS` list + +- required for your connector to work need to go to `MAIN_REQUIREMENTS` list. +- required for the testing need to go to `TEST_REQUIREMENTS` list ### Publishing a new version of the connector + You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? + 1. 
Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-lever-hiring test` 2. Bump the connector version in `metadata.yaml`: increment the `dockerImageTag` value. Please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors). 3. Make sure the `metadata.yaml` content is up to date. @@ -94,4 +112,3 @@ You've checked out the repo, implemented a million dollar feature, and you're re 5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention). 6. Pat yourself on the back for being an awesome contributor. 7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. - diff --git a/airbyte-integrations/connectors/source-linkedin-ads/README.md b/airbyte-integrations/connectors/source-linkedin-ads/README.md index 6d6a5d6b6b9..a1494dc793a 100644 --- a/airbyte-integrations/connectors/source-linkedin-ads/README.md +++ b/airbyte-integrations/connectors/source-linkedin-ads/README.md @@ -1,31 +1,32 @@ # Linkedin-Ads source connector - This is the repository for the Linkedin-Ads source connector, written in Python. For information about how to use this connector within Airbyte, see [the documentation](https://docs.airbyte.com/integrations/sources/linkedin-ads). ## Local development ### Prerequisites -* Python (~=3.9) -* Poetry (~=1.7) - installation instructions [here](https://python-poetry.org/docs/#installation) +- Python (~=3.9) +- Poetry (~=1.7) - installation instructions [here](https://python-poetry.org/docs/#installation) ### Installing the connector + From this connector directory, run: + ```bash poetry install --with dev ``` - ### Create credentials + **If you are a community contributor**, follow the instructions in the [documentation](https://docs.airbyte.com/integrations/sources/linkedin-ads) to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `source_linkedin_ads/spec.yaml` file. Note that any directory named `secrets` is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. See `sample_files/sample_config.json` for a sample config file. - ### Locally running the connector + ``` poetry run source-linkedin-ads spec poetry run source-linkedin-ads check --config secrets/config.json @@ -34,23 +35,28 @@ poetry run source-linkedin-ads read --config secrets/config.json --catalog sampl ``` ### Running unit tests + To run unit tests locally, from the connector directory run: + ``` poetry run pytest unit_tests ``` ### Building the docker image + 1. Install [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md) 2. Run the following command to build the docker image: + ```bash airbyte-ci connectors --name=source-linkedin-ads build ``` An image will be available on your host with the tag `airbyte/source-linkedin-ads:dev`. 
- ### Running as a docker container + Then run any of the connector commands as follows: + ``` docker run --rm airbyte/source-linkedin-ads:dev spec docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-linkedin-ads:dev check --config /secrets/config.json @@ -59,18 +65,23 @@ docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integrat ``` ### Running our CI test suite + You can run our full test suite locally using [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md): + ```bash airbyte-ci connectors --name=source-linkedin-ads test ``` ### Customizing acceptance Tests + Customize `acceptance-test-config.yml` file to configure acceptance tests. See [Connector Acceptance Tests](https://docs.airbyte.com/connector-development/testing-connectors/connector-acceptance-tests-reference) for more information. If your connector requires to create or destroy resources for use during acceptance tests create fixtures for it and place them inside integration_tests/acceptance.py. ### Dependency Management -All of your dependencies should be managed via Poetry. + +All of your dependencies should be managed via Poetry. To add a new dependency, run: + ```bash poetry add ``` @@ -78,14 +89,16 @@ poetry add Please commit the changes to `pyproject.toml` and `poetry.lock` files. ## Publishing a new version of the connector + You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? + 1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-linkedin-ads test` -2. Bump the connector version (please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors)): - - bump the `dockerImageTag` value in in `metadata.yaml` - - bump the `version` value in `pyproject.toml` +2. Bump the connector version (please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors)): + - bump the `dockerImageTag` value in in `metadata.yaml` + - bump the `version` value in `pyproject.toml` 3. Make sure the `metadata.yaml` content is up to date. 4. Make sure the connector documentation and its changelog is up to date (`docs/integrations/sources/linkedin-ads.md`). 5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention). 6. Pat yourself on the back for being an awesome contributor. 7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. -8. Once your PR is merged, the new version of the connector will be automatically published to Docker Hub and our connector registry. \ No newline at end of file +8. Once your PR is merged, the new version of the connector will be automatically published to Docker Hub and our connector registry. 
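Editor's note on the acceptance-test step these READMEs keep repeating: each connector guide says to put resource setup/teardown fixtures in `integration_tests/acceptance.py` but never shows what that file looks like. The sketch below follows the pattern used across Airbyte Python connectors (the `connector_acceptance_test.plugin` plugin string and a session-scoped `connector_setup` fixture); the fixture body is a placeholder, and anything you create or destroy inside it is connector-specific.

```python
# integration_tests/acceptance.py - minimal sketch, following the common Airbyte
# connector template. The plugin line wires the connector into the acceptance
# test suite; the autouse session fixture is the hook where per-connector
# resources would be created before the tests and cleaned up afterwards.
import pytest

pytest_plugins = ("connector_acceptance_test.plugin",)


@pytest.fixture(scope="session", autouse=True)
def connector_setup():
    # Create any external resources the acceptance tests need here.
    yield
    # Tear them down here once the test session finishes.
```

With this file in place, `airbyte-ci connectors --name=<connector> test` picks the fixture up automatically; no extra registration is needed.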
diff --git a/airbyte-integrations/connectors/source-linkedin-pages/README.md b/airbyte-integrations/connectors/source-linkedin-pages/README.md index aa8934c3e36..f77ae9d5db8 100644 --- a/airbyte-integrations/connectors/source-linkedin-pages/README.md +++ b/airbyte-integrations/connectors/source-linkedin-pages/README.md @@ -6,23 +6,28 @@ For information about how to use this connector within Airbyte, see [the documen ## Local development ### Prerequisites + **To iterate on this connector, make sure to complete this prerequisites section.** #### Minimum Python version required `= 3.7.0` #### Build & Activate Virtual Environment and install dependencies + From this connector directory, create a virtual environment: + ``` python -m venv .venv ``` This will generate a virtualenv for this module in `.venv/`. Make sure this venv is active in your development environment of choice. To activate it from the terminal, run: + ``` source .venv/bin/activate pip install -r requirements.txt pip install '.[tests]' ``` + If you are in an IDE, follow your IDE's instructions to activate the virtualenv. Note that while we are installing dependencies from `requirements.txt`, you should only edit `setup.py` for your dependencies. `requirements.txt` is @@ -31,6 +36,7 @@ If this is mumbo jumbo to you, don't worry about it, just put your deps in `setu should work as you expect. #### Create credentials + **If you are a community contributor**, follow the instructions in the [documentation](https://docs.airbyte.io/integrations/sources/linkedin-pages) to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `source_linkedin_pages/spec.json` file. Note that any directory named `secrets` is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. @@ -40,6 +46,7 @@ See `integration_tests/sample_config.json` for a sample config file. and place them into `secrets/config.json`. ### Locally running the connector + ``` python main.py spec python main.py check --config secrets/config.json @@ -49,9 +56,10 @@ python main.py read --config secrets/config.json --catalog integration_tests/con ### Locally running the connector docker image - #### Build + **Via [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md) (recommended):** + ```bash airbyte-ci connectors --name=source-linkedin-pages build ``` @@ -59,12 +67,15 @@ airbyte-ci connectors --name=source-linkedin-pages build An image will be built with the tag `airbyte/source-linkedin-pages:dev`. **Via `docker build`:** + ```bash docker build -t airbyte/source-linkedin-pages:dev . ``` #### Run + Then run any of the connector commands as follows: + ``` docker run --rm airbyte/source-linkedin-pages:dev spec docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-linkedin-pages:dev check --config /secrets/config.json @@ -73,23 +84,30 @@ docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integrat ``` ## Testing + You can run our full test suite locally using [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md): + ```bash airbyte-ci connectors --name=source-linkedin-pages test ``` ### Customizing acceptance Tests + Customize `acceptance-test-config.yml` file to configure tests. See [Connector Acceptance Tests](https://docs.airbyte.com/connector-development/testing-connectors/connector-acceptance-tests-reference) for more information. 
If your connector requires to create or destroy resources for use during acceptance tests create fixtures for it and place them inside integration_tests/acceptance.py. ## Dependency Management + All of your dependencies should go in `setup.py`, NOT `requirements.txt`. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development. We split dependencies between two groups, dependencies that are: -* required for your connector to work need to go to `MAIN_REQUIREMENTS` list. -* required for the testing need to go to `TEST_REQUIREMENTS` list + +- required for your connector to work need to go to `MAIN_REQUIREMENTS` list. +- required for the testing need to go to `TEST_REQUIREMENTS` list ### Publishing a new version of the connector + You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? + 1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-linkedin-pages test` 2. Bump the connector version in `metadata.yaml`: increment the `dockerImageTag` value. Please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors). 3. Make sure the `metadata.yaml` content is up to date. @@ -97,4 +115,3 @@ You've checked out the repo, implemented a million dollar feature, and you're re 5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention). 6. Pat yourself on the back for being an awesome contributor. 7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. - diff --git a/airbyte-integrations/connectors/source-linkedin-pages/bootstrap.md b/airbyte-integrations/connectors/source-linkedin-pages/bootstrap.md index 90e5cbd5eb5..d7368a1fe47 100644 --- a/airbyte-integrations/connectors/source-linkedin-pages/bootstrap.md +++ b/airbyte-integrations/connectors/source-linkedin-pages/bootstrap.md @@ -4,4 +4,4 @@ You must have a LinkedIn Developers' App created in order to request access to t The app also must be verified by an admin of the LinkedIn organization your app is created for. Once the app is "verified" and granted access to the Marketing Developer Platform API, you can use their easy-peasy OAuth Token Tools to generate access tokens **and** refresh tokens. -You can access the `client id` and `client secret` in the **Auth** tab of the app dashboard to round out all of the authorization needs you may have. \ No newline at end of file +You can access the `client id` and `client secret` in the **Auth** tab of the app dashboard to round out all of the authorization needs you may have. 
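Editor's note on the LinkedIn Pages bootstrap above: it mentions generating access tokens and refresh tokens and reading the `client id` / `client secret` from the app's **Auth** tab, but not how a refresh is performed. The rough sketch below assumes LinkedIn's standard OAuth 2.0 refresh-token grant against `https://www.linkedin.com/oauth/v2/accessToken`, as described in LinkedIn's public OAuth documentation; the endpoint, parameter names, and the `refresh_access_token` helper are not taken from this connector's code.

```python
# Hypothetical helper: exchange a LinkedIn refresh token for a new access token
# using the OAuth 2.0 refresh_token grant (form-encoded POST to the token endpoint).
import requests


def refresh_access_token(client_id: str, client_secret: str, refresh_token: str) -> str:
    response = requests.post(
        "https://www.linkedin.com/oauth/v2/accessToken",
        data={
            "grant_type": "refresh_token",
            "refresh_token": refresh_token,
            "client_id": client_id,
            "client_secret": client_secret,
        },
    )
    response.raise_for_status()
    return response.json()["access_token"]
```

The returned token is what would go into the `secrets/config.json` the README asks for; refresh tokens themselves eventually expire, so the OAuth Token Tools mentioned above are still needed to mint new ones.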
diff --git a/airbyte-integrations/connectors/source-linnworks/README.md b/airbyte-integrations/connectors/source-linnworks/README.md index eae9b5f359b..93115680f27 100644 --- a/airbyte-integrations/connectors/source-linnworks/README.md +++ b/airbyte-integrations/connectors/source-linnworks/README.md @@ -7,8 +7,8 @@ For information about how to use this connector within Airbyte, see [the documen ### Prerequisites -* Python (~=3.9) -* Poetry (~=1.7) - installation instructions [here](https://python-poetry.org/docs/#installation) +- Python (~=3.9) +- Poetry (~=1.7) - installation instructions [here](https://python-poetry.org/docs/#installation) ### Installing the connector diff --git a/airbyte-integrations/connectors/source-lokalise/README.md b/airbyte-integrations/connectors/source-lokalise/README.md index fbeee5a12a5..97f560ced3f 100644 --- a/airbyte-integrations/connectors/source-lokalise/README.md +++ b/airbyte-integrations/connectors/source-lokalise/README.md @@ -6,6 +6,7 @@ For information about how to use this connector within Airbyte, see [the documen ## Local development #### Create credentials + **If you are a community contributor**, follow the instructions in the [documentation](https://docs.airbyte.io/integrations/sources/lokalise) to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `source_lokalise/spec.yaml` file. Note that any directory named `secrets` is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. @@ -16,9 +17,10 @@ and place them into `secrets/config.json`. ### Locally running the connector docker image - #### Build + **Via [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md) (recommended):** + ```bash airbyte-ci connectors --name=source-lokalise build ``` @@ -26,12 +28,15 @@ airbyte-ci connectors --name=source-lokalise build An image will be built with the tag `airbyte/source-lokalise:dev`. **Via `docker build`:** + ```bash docker build -t airbyte/source-lokalise:dev . ``` #### Run + Then run any of the connector commands as follows: + ``` docker run --rm airbyte/source-lokalise:dev spec docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-lokalise:dev check --config /secrets/config.json @@ -40,23 +45,30 @@ docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integrat ``` ## Testing + You can run our full test suite locally using [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md): + ```bash airbyte-ci connectors --name=source-lokalise test ``` ### Customizing acceptance Tests + Customize `acceptance-test-config.yml` file to configure tests. See [Connector Acceptance Tests](https://docs.airbyte.com/connector-development/testing-connectors/connector-acceptance-tests-reference) for more information. If your connector requires to create or destroy resources for use during acceptance tests create fixtures for it and place them inside integration_tests/acceptance.py. ## Dependency Management + All of your dependencies should go in `setup.py`, NOT `requirements.txt`. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development. We split dependencies between two groups, dependencies that are: -* required for your connector to work need to go to `MAIN_REQUIREMENTS` list. 
-* required for the testing need to go to `TEST_REQUIREMENTS` list + +- required for your connector to work need to go to `MAIN_REQUIREMENTS` list. +- required for the testing need to go to `TEST_REQUIREMENTS` list ### Publishing a new version of the connector + You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? + 1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-lokalise test` 2. Bump the connector version in `metadata.yaml`: increment the `dockerImageTag` value. Please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors). 3. Make sure the `metadata.yaml` content is up to date. @@ -64,4 +76,3 @@ You've checked out the repo, implemented a million dollar feature, and you're re 5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention). 6. Pat yourself on the back for being an awesome contributor. 7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. - diff --git a/airbyte-integrations/connectors/source-looker/README.md b/airbyte-integrations/connectors/source-looker/README.md index d1b02c9ad5f..e62de563da6 100644 --- a/airbyte-integrations/connectors/source-looker/README.md +++ b/airbyte-integrations/connectors/source-looker/README.md @@ -1,27 +1,32 @@ -# Looker Source +# Looker Source -This is the repository for the Looker source connector, written in Python. +This is the repository for the Looker source connector, written in Python. For information about how to use this connector within Airbyte, see [the documentation](https://docs.airbyte.io/integrations/sources/looker). ## Local development ### Prerequisites + **To iterate on this connector, make sure to complete this prerequisites section.** #### Minimum Python version required `= 3.7.0` #### Build & Activate Virtual Environment and install dependencies + From this connector directory, create a virtual environment: + ``` python -m venv .venv ``` This will generate a virtualenv for this module in `.venv/`. Make sure this venv is active in your development environment of choice. To activate it from the terminal, run: + ``` source .venv/bin/activate pip install -r requirements.txt ``` + If you are in an IDE, follow your IDE's instructions to activate the virtualenv. Note that while we are installing dependencies from `requirements.txt`, you should only edit `setup.py` for your dependencies. `requirements.txt` is @@ -30,6 +35,7 @@ If this is mumbo jumbo to you, don't worry about it, just put your deps in `setu should work as you expect. #### Create credentials + **If you are a community contributor**, follow the instructions in the [documentation](https://docs.airbyte.io/integrations/sources/looker) to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `source_looker/spec.json` file. Note that the `secrets` directory is gitignored by default, so there is no danger of accidentally checking in sensitive information. @@ -38,8 +44,8 @@ See `integration_tests/sample_config.json` for a sample config file. **If you are an Airbyte core member**, copy the credentials in Lastpass under the secret name `source looker test creds` and place them into `secrets/config.json`. 
- ### Locally running the connector + ``` python main.py spec python main.py check --config secrets/config.json @@ -49,9 +55,10 @@ python main.py read --config secrets/config.json --catalog integration_tests/con ### Locally running the connector docker image - #### Build + **Via [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md) (recommended):** + ```bash airbyte-ci connectors --name=source-looker build ``` @@ -59,12 +66,15 @@ airbyte-ci connectors --name=source-looker build An image will be built with the tag `airbyte/source-looker:dev`. **Via `docker build`:** + ```bash docker build -t airbyte/source-looker:dev . ``` #### Run + Then run any of the connector commands as follows: + ``` docker run --rm airbyte/source-looker:dev spec docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-looker:dev check --config /secrets/config.json @@ -72,22 +82,27 @@ docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-looker:dev discover -- docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integration_tests airbyte/source-looker:dev read --config /secrets/config.json --catalog /integration_tests/configured_catalog.json ``` - ## Testing + You can run our full test suite locally using [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md): + ```bash airbyte-ci connectors --name=source-looker test ``` ### Customizing acceptance Tests + Customize `acceptance-test-config.yml` file to configure tests. See [Connector Acceptance Tests](https://docs.airbyte.com/connector-development/testing-connectors/connector-acceptance-tests-reference) for more information. If your connector requires to create or destroy resources for use during acceptance tests create fixtures for it and place them inside integration_tests/acceptance.py. ## Dependency Management + All of your dependencies should go in `setup.py`, NOT `requirements.txt`. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development. ### Publishing a new version of the connector + You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? + 1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-looker test` 2. Bump the connector version in `metadata.yaml`: increment the `dockerImageTag` value. Please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors). 3. Make sure the `metadata.yaml` content is up to date. @@ -95,4 +110,3 @@ You've checked out the repo, implemented a million dollar feature, and you're re 5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention). 6. Pat yourself on the back for being an awesome contributor. 7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. - diff --git a/airbyte-integrations/connectors/source-mailchimp/README.md b/airbyte-integrations/connectors/source-mailchimp/README.md index 2e6d772187b..0c4d824d982 100644 --- a/airbyte-integrations/connectors/source-mailchimp/README.md +++ b/airbyte-integrations/connectors/source-mailchimp/README.md @@ -1,31 +1,32 @@ # Mailchimp source connector - This is the repository for the Mailchimp source connector, written in Python. 
For information about how to use this connector within Airbyte, see [the documentation](https://docs.airbyte.com/integrations/sources/mailchimp). ## Local development ### Prerequisites -* Python (~=3.9) -* Poetry (~=1.7) - installation instructions [here](https://python-poetry.org/docs/#installation) +- Python (~=3.9) +- Poetry (~=1.7) - installation instructions [here](https://python-poetry.org/docs/#installation) ### Installing the connector + From this connector directory, run: + ```bash poetry install --with dev ``` - ### Create credentials + **If you are a community contributor**, follow the instructions in the [documentation](https://docs.airbyte.com/integrations/sources/mailchimp) to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `source_mailchimp/spec.yaml` file. Note that any directory named `secrets` is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. See `sample_files/sample_config.json` for a sample config file. - ### Locally running the connector + ``` poetry run source-mailchimp spec poetry run source-mailchimp check --config secrets/config.json @@ -34,23 +35,28 @@ poetry run source-mailchimp read --config secrets/config.json --catalog sample_f ``` ### Running unit tests + To run unit tests locally, from the connector directory run: + ``` poetry run pytest unit_tests ``` ### Building the docker image + 1. Install [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md) 2. Run the following command to build the docker image: + ```bash airbyte-ci connectors --name=source-mailchimp build ``` An image will be available on your host with the tag `airbyte/source-mailchimp:dev`. - ### Running as a docker container + Then run any of the connector commands as follows: + ``` docker run --rm airbyte/source-mailchimp:dev spec docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-mailchimp:dev check --config /secrets/config.json @@ -59,18 +65,23 @@ docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integrat ``` ### Running our CI test suite + You can run our full test suite locally using [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md): + ```bash airbyte-ci connectors --name=source-mailchimp test ``` ### Customizing acceptance Tests + Customize `acceptance-test-config.yml` file to configure acceptance tests. See [Connector Acceptance Tests](https://docs.airbyte.com/connector-development/testing-connectors/connector-acceptance-tests-reference) for more information. If your connector requires to create or destroy resources for use during acceptance tests create fixtures for it and place them inside integration_tests/acceptance.py. ### Dependency Management -All of your dependencies should be managed via Poetry. + +All of your dependencies should be managed via Poetry. To add a new dependency, run: + ```bash poetry add ``` @@ -78,14 +89,16 @@ poetry add Please commit the changes to `pyproject.toml` and `poetry.lock` files. ## Publishing a new version of the connector + You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? + 1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-mailchimp test` -2. 
Bump the connector version (please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors)): - - bump the `dockerImageTag` value in in `metadata.yaml` - - bump the `version` value in `pyproject.toml` +2. Bump the connector version (please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors)): + - bump the `dockerImageTag` value in in `metadata.yaml` + - bump the `version` value in `pyproject.toml` 3. Make sure the `metadata.yaml` content is up to date. 4. Make sure the connector documentation and its changelog is up to date (`docs/integrations/sources/mailchimp.md`). 5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention). 6. Pat yourself on the back for being an awesome contributor. 7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. -8. Once your PR is merged, the new version of the connector will be automatically published to Docker Hub and our connector registry. \ No newline at end of file +8. Once your PR is merged, the new version of the connector will be automatically published to Docker Hub and our connector registry. diff --git a/airbyte-integrations/connectors/source-mailerlite/README.md b/airbyte-integrations/connectors/source-mailerlite/README.md index f7ea62d35eb..e2438b7abbe 100644 --- a/airbyte-integrations/connectors/source-mailerlite/README.md +++ b/airbyte-integrations/connectors/source-mailerlite/README.md @@ -6,6 +6,7 @@ For information about how to use this connector within Airbyte, see [the documen ## Local development #### Create credentials + **If you are a community contributor**, follow the instructions in the [documentation](https://docs.airbyte.io/integrations/sources/mailerlite) to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `source_mailerlite/spec.yaml` file. Note that any directory named `secrets` is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. @@ -16,9 +17,10 @@ and place them into `secrets/config.json`. ### Locally running the connector docker image - #### Build + **Via [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md) (recommended):** + ```bash airbyte-ci connectors --name=source-mailerlite build ``` @@ -26,12 +28,15 @@ airbyte-ci connectors --name=source-mailerlite build An image will be built with the tag `airbyte/source-mailerlite:dev`. **Via `docker build`:** + ```bash docker build -t airbyte/source-mailerlite:dev . ``` #### Run + Then run any of the connector commands as follows: + ``` docker run --rm airbyte/source-mailerlite:dev spec docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-mailerlite:dev check --config /secrets/config.json @@ -40,23 +45,30 @@ docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integrat ``` ## Testing + You can run our full test suite locally using [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md): + ```bash airbyte-ci connectors --name=source-mailerlite test ``` ### Customizing acceptance Tests + Customize `acceptance-test-config.yml` file to configure tests. 
See [Connector Acceptance Tests](https://docs.airbyte.com/connector-development/testing-connectors/connector-acceptance-tests-reference) for more information. If your connector requires to create or destroy resources for use during acceptance tests create fixtures for it and place them inside integration_tests/acceptance.py. ## Dependency Management + All of your dependencies should go in `setup.py`, NOT `requirements.txt`. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development. We split dependencies between two groups, dependencies that are: -* required for your connector to work need to go to `MAIN_REQUIREMENTS` list. -* required for the testing need to go to `TEST_REQUIREMENTS` list + +- required for your connector to work need to go to `MAIN_REQUIREMENTS` list. +- required for the testing need to go to `TEST_REQUIREMENTS` list ### Publishing a new version of the connector + You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? + 1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-mailerlite test` 2. Bump the connector version in `metadata.yaml`: increment the `dockerImageTag` value. Please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors). 3. Make sure the `metadata.yaml` content is up to date. @@ -64,4 +76,3 @@ You've checked out the repo, implemented a million dollar feature, and you're re 5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention). 6. Pat yourself on the back for being an awesome contributor. 7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. - diff --git a/airbyte-integrations/connectors/source-mailersend/README.md b/airbyte-integrations/connectors/source-mailersend/README.md index a9fbcee8c34..77fa9a72692 100644 --- a/airbyte-integrations/connectors/source-mailersend/README.md +++ b/airbyte-integrations/connectors/source-mailersend/README.md @@ -6,6 +6,7 @@ For information about how to use this connector within Airbyte, see [the documen ## Local development #### Create credentials + **If you are a community contributor**, follow the instructions in the [documentation](https://docs.airbyte.io/integrations/sources/mailersend) to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `source_mailersend/spec.yaml` file. Note that any directory named `secrets` is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. @@ -16,9 +17,10 @@ and place them into `secrets/config.json`. ### Locally running the connector docker image - #### Build + **Via [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md) (recommended):** + ```bash airbyte-ci connectors --name=source-mailersend build ``` @@ -26,12 +28,15 @@ airbyte-ci connectors --name=source-mailersend build An image will be built with the tag `airbyte/source-mailersend:dev`. **Via `docker build`:** + ```bash docker build -t airbyte/source-mailersend:dev . 
``` #### Run + Then run any of the connector commands as follows: + ``` docker run --rm airbyte/source-mailersend:dev spec docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-mailersend:dev check --config /secrets/config.json @@ -40,23 +45,30 @@ docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integrat ``` ## Testing + You can run our full test suite locally using [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md): + ```bash airbyte-ci connectors --name=source-mailersend test ``` ### Customizing acceptance Tests + Customize `acceptance-test-config.yml` file to configure tests. See [Connector Acceptance Tests](https://docs.airbyte.com/connector-development/testing-connectors/connector-acceptance-tests-reference) for more information. If your connector requires to create or destroy resources for use during acceptance tests create fixtures for it and place them inside integration_tests/acceptance.py. ## Dependency Management + All of your dependencies should go in `setup.py`, NOT `requirements.txt`. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development. We split dependencies between two groups, dependencies that are: -* required for your connector to work need to go to `MAIN_REQUIREMENTS` list. -* required for the testing need to go to `TEST_REQUIREMENTS` list + +- required for your connector to work need to go to `MAIN_REQUIREMENTS` list. +- required for the testing need to go to `TEST_REQUIREMENTS` list ### Publishing a new version of the connector + You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? + 1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-mailersend test` 2. Bump the connector version in `metadata.yaml`: increment the `dockerImageTag` value. Please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors). 3. Make sure the `metadata.yaml` content is up to date. @@ -64,4 +76,3 @@ You've checked out the repo, implemented a million dollar feature, and you're re 5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention). 6. Pat yourself on the back for being an awesome contributor. 7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. - diff --git a/airbyte-integrations/connectors/source-mailgun/README.md b/airbyte-integrations/connectors/source-mailgun/README.md index d92d3c97844..cc763da7633 100644 --- a/airbyte-integrations/connectors/source-mailgun/README.md +++ b/airbyte-integrations/connectors/source-mailgun/README.md @@ -1,31 +1,32 @@ # Mailgun source connector - This is the repository for the Mailgun source connector, written in Python. For information about how to use this connector within Airbyte, see [the documentation](https://docs.airbyte.com/integrations/sources/mailgun). 
## Local development ### Prerequisites -* Python (~=3.9) -* Poetry (~=1.7) - installation instructions [here](https://python-poetry.org/docs/#installation) +- Python (~=3.9) +- Poetry (~=1.7) - installation instructions [here](https://python-poetry.org/docs/#installation) ### Installing the connector + From this connector directory, run: + ```bash poetry install --with dev ``` - ### Create credentials + **If you are a community contributor**, follow the instructions in the [documentation](https://docs.airbyte.com/integrations/sources/mailgun) to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `source_mailgun/spec.yaml` file. Note that any directory named `secrets` is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. See `sample_files/sample_config.json` for a sample config file. - ### Locally running the connector + ``` poetry run source-mailgun spec poetry run source-mailgun check --config secrets/config.json @@ -34,23 +35,28 @@ poetry run source-mailgun read --config secrets/config.json --catalog sample_fil ``` ### Running unit tests + To run unit tests locally, from the connector directory run: + ``` poetry run pytest unit_tests ``` ### Building the docker image + 1. Install [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md) 2. Run the following command to build the docker image: + ```bash airbyte-ci connectors --name=source-mailgun build ``` An image will be available on your host with the tag `airbyte/source-mailgun:dev`. - ### Running as a docker container + Then run any of the connector commands as follows: + ``` docker run --rm airbyte/source-mailgun:dev spec docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-mailgun:dev check --config /secrets/config.json @@ -59,18 +65,23 @@ docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integrat ``` ### Running our CI test suite + You can run our full test suite locally using [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md): + ```bash airbyte-ci connectors --name=source-mailgun test ``` ### Customizing acceptance Tests + Customize `acceptance-test-config.yml` file to configure acceptance tests. See [Connector Acceptance Tests](https://docs.airbyte.com/connector-development/testing-connectors/connector-acceptance-tests-reference) for more information. If your connector requires to create or destroy resources for use during acceptance tests create fixtures for it and place them inside integration_tests/acceptance.py. ### Dependency Management -All of your dependencies should be managed via Poetry. + +All of your dependencies should be managed via Poetry. To add a new dependency, run: + ```bash poetry add ``` @@ -78,14 +89,16 @@ poetry add Please commit the changes to `pyproject.toml` and `poetry.lock` files. ## Publishing a new version of the connector + You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? + 1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-mailgun test` -2. 
Bump the connector version (please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors)): - - bump the `dockerImageTag` value in in `metadata.yaml` - - bump the `version` value in `pyproject.toml` +2. Bump the connector version (please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors)): + - bump the `dockerImageTag` value in in `metadata.yaml` + - bump the `version` value in `pyproject.toml` 3. Make sure the `metadata.yaml` content is up to date. 4. Make sure the connector documentation and its changelog is up to date (`docs/integrations/sources/mailgun.md`). 5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention). 6. Pat yourself on the back for being an awesome contributor. 7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. -8. Once your PR is merged, the new version of the connector will be automatically published to Docker Hub and our connector registry. \ No newline at end of file +8. Once your PR is merged, the new version of the connector will be automatically published to Docker Hub and our connector registry. diff --git a/airbyte-integrations/connectors/source-mailjet-mail/README.md b/airbyte-integrations/connectors/source-mailjet-mail/README.md index 4bba003682d..6163b55c1af 100644 --- a/airbyte-integrations/connectors/source-mailjet-mail/README.md +++ b/airbyte-integrations/connectors/source-mailjet-mail/README.md @@ -6,6 +6,7 @@ For information about how to use this connector within Airbyte, see [the documen ## Local development #### Create credentials + **If you are a community contributor**, follow the instructions in the [documentation](https://docs.airbyte.io/integrations/sources/mailjet-mail) to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `source_mailjet_mail/spec.yaml` file. Note that any directory named `secrets` is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. @@ -16,9 +17,10 @@ and place them into `secrets/config.json`. ### Locally running the connector docker image - #### Build + **Via [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md) (recommended):** + ```bash airbyte-ci connectors --name=source-mailjet-mail build ``` @@ -26,12 +28,15 @@ airbyte-ci connectors --name=source-mailjet-mail build An image will be built with the tag `airbyte/source-mailjet-mail:dev`. **Via `docker build`:** + ```bash docker build -t airbyte/source-mailjet-mail:dev . ``` #### Run + Then run any of the connector commands as follows: + ``` docker run --rm airbyte/source-mailjet-mail:dev spec docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-mailjet-mail:dev check --config /secrets/config.json @@ -40,23 +45,30 @@ docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integrat ``` ## Testing + You can run our full test suite locally using [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md): + ```bash airbyte-ci connectors --name=source-mailjet-mail test ``` ### Customizing acceptance Tests + Customize `acceptance-test-config.yml` file to configure tests. 
See [Connector Acceptance Tests](https://docs.airbyte.com/connector-development/testing-connectors/connector-acceptance-tests-reference) for more information. If your connector requires to create or destroy resources for use during acceptance tests create fixtures for it and place them inside integration_tests/acceptance.py. ## Dependency Management + All of your dependencies should go in `setup.py`, NOT `requirements.txt`. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development. We split dependencies between two groups, dependencies that are: -* required for your connector to work need to go to `MAIN_REQUIREMENTS` list. -* required for the testing need to go to `TEST_REQUIREMENTS` list + +- required for your connector to work need to go to `MAIN_REQUIREMENTS` list. +- required for the testing need to go to `TEST_REQUIREMENTS` list ### Publishing a new version of the connector + You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? + 1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-mailjet-mail test` 2. Bump the connector version in `metadata.yaml`: increment the `dockerImageTag` value. Please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors). 3. Make sure the `metadata.yaml` content is up to date. @@ -64,4 +76,3 @@ You've checked out the repo, implemented a million dollar feature, and you're re 5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention). 6. Pat yourself on the back for being an awesome contributor. 7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. - diff --git a/airbyte-integrations/connectors/source-mailjet-sms/README.md b/airbyte-integrations/connectors/source-mailjet-sms/README.md index 6dc431e80e8..42975b4b0ef 100644 --- a/airbyte-integrations/connectors/source-mailjet-sms/README.md +++ b/airbyte-integrations/connectors/source-mailjet-sms/README.md @@ -1,31 +1,32 @@ # Mailjet-Sms source connector - This is the repository for the Mailjet-Sms source connector, written in Python. For information about how to use this connector within Airbyte, see [the documentation](https://docs.airbyte.com/integrations/sources/mailjet-sms). ## Local development ### Prerequisites -* Python (~=3.9) -* Poetry (~=1.7) - installation instructions [here](https://python-poetry.org/docs/#installation) +- Python (~=3.9) +- Poetry (~=1.7) - installation instructions [here](https://python-poetry.org/docs/#installation) ### Installing the connector + From this connector directory, run: + ```bash poetry install --with dev ``` - ### Create credentials + **If you are a community contributor**, follow the instructions in the [documentation](https://docs.airbyte.com/integrations/sources/mailjet-sms) to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `source_mailjet_sms/spec.yaml` file. Note that any directory named `secrets` is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. See `sample_files/sample_config.json` for a sample config file. 
- ### Locally running the connector + ``` poetry run source-mailjet-sms spec poetry run source-mailjet-sms check --config secrets/config.json @@ -34,23 +35,28 @@ poetry run source-mailjet-sms read --config secrets/config.json --catalog sample ``` ### Running unit tests + To run unit tests locally, from the connector directory run: + ``` poetry run pytest unit_tests ``` ### Building the docker image + 1. Install [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md) 2. Run the following command to build the docker image: + ```bash airbyte-ci connectors --name=source-mailjet-sms build ``` An image will be available on your host with the tag `airbyte/source-mailjet-sms:dev`. - ### Running as a docker container + Then run any of the connector commands as follows: + ``` docker run --rm airbyte/source-mailjet-sms:dev spec docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-mailjet-sms:dev check --config /secrets/config.json @@ -59,18 +65,23 @@ docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integrat ``` ### Running our CI test suite + You can run our full test suite locally using [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md): + ```bash airbyte-ci connectors --name=source-mailjet-sms test ``` ### Customizing acceptance Tests + Customize `acceptance-test-config.yml` file to configure acceptance tests. See [Connector Acceptance Tests](https://docs.airbyte.com/connector-development/testing-connectors/connector-acceptance-tests-reference) for more information. If your connector requires to create or destroy resources for use during acceptance tests create fixtures for it and place them inside integration_tests/acceptance.py. ### Dependency Management -All of your dependencies should be managed via Poetry. + +All of your dependencies should be managed via Poetry. To add a new dependency, run: + ```bash poetry add ``` @@ -78,14 +89,16 @@ poetry add Please commit the changes to `pyproject.toml` and `poetry.lock` files. ## Publishing a new version of the connector + You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? + 1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-mailjet-sms test` -2. Bump the connector version (please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors)): - - bump the `dockerImageTag` value in in `metadata.yaml` - - bump the `version` value in `pyproject.toml` +2. Bump the connector version (please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors)): + - bump the `dockerImageTag` value in in `metadata.yaml` + - bump the `version` value in `pyproject.toml` 3. Make sure the `metadata.yaml` content is up to date. 4. Make sure the connector documentation and its changelog is up to date (`docs/integrations/sources/mailjet-sms.md`). 5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention). 6. Pat yourself on the back for being an awesome contributor. 7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. -8. 
Once your PR is merged, the new version of the connector will be automatically published to Docker Hub and our connector registry. \ No newline at end of file +8. Once your PR is merged, the new version of the connector will be automatically published to Docker Hub and our connector registry. diff --git a/airbyte-integrations/connectors/source-marketo/README.md b/airbyte-integrations/connectors/source-marketo/README.md index 6dc59472c6e..3b6c7e7be05 100644 --- a/airbyte-integrations/connectors/source-marketo/README.md +++ b/airbyte-integrations/connectors/source-marketo/README.md @@ -1,31 +1,32 @@ # Marketo source connector - This is the repository for the Marketo source connector, written in Python. For information about how to use this connector within Airbyte, see [the documentation](https://docs.airbyte.com/integrations/sources/marketo). ## Local development ### Prerequisites -* Python (~=3.9) -* Poetry (~=1.7) - installation instructions [here](https://python-poetry.org/docs/#installation) +- Python (~=3.9) +- Poetry (~=1.7) - installation instructions [here](https://python-poetry.org/docs/#installation) ### Installing the connector + From this connector directory, run: + ```bash poetry install --with dev ``` - ### Create credentials + **If you are a community contributor**, follow the instructions in the [documentation](https://docs.airbyte.com/integrations/sources/marketo) to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `source_marketo/spec.yaml` file. Note that any directory named `secrets` is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. See `sample_files/sample_config.json` for a sample config file. - ### Locally running the connector + ``` poetry run source-marketo spec poetry run source-marketo check --config secrets/config.json @@ -34,23 +35,28 @@ poetry run source-marketo read --config secrets/config.json --catalog integratio ``` ### Running unit tests + To run unit tests locally, from the connector directory run: + ``` poetry run pytest unit_tests ``` ### Building the docker image + 1. Install [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md) 2. Run the following command to build the docker image: + ```bash airbyte-ci connectors --name=source-marketo build ``` An image will be available on your host with the tag `airbyte/source-marketo:dev`. - ### Running as a docker container + Then run any of the connector commands as follows: + ``` docker run --rm airbyte/source-marketo:dev spec docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-marketo:dev check --config /secrets/config.json @@ -59,18 +65,23 @@ docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integrat ``` ### Running our CI test suite + You can run our full test suite locally using [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md): + ```bash airbyte-ci connectors --name=source-marketo test ``` ### Customizing acceptance Tests + Customize `acceptance-test-config.yml` file to configure acceptance tests. See [Connector Acceptance Tests](https://docs.airbyte.com/connector-development/testing-connectors/connector-acceptance-tests-reference) for more information. If your connector requires to create or destroy resources for use during acceptance tests create fixtures for it and place them inside integration_tests/acceptance.py. 
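A typical `integration_tests/acceptance.py` is just a pytest plugin declaration plus a session-scoped fixture; a sketch of the usual shape is below (adapt the setup and teardown to whatever resources your tests actually need):

```python
# integration_tests/acceptance.py (typical shape)
import pytest

# Load the Connector Acceptance Test suite as a pytest plugin.
pytest_plugins = ("connector_acceptance_test.plugin",)


@pytest.fixture(scope="session", autouse=True)
def connector_setup():
    """Create any external resources the acceptance tests need, then clean them up."""
    # ... set up test resources here ...
    yield
    # ... tear down test resources here ...
```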
### Dependency Management -All of your dependencies should be managed via Poetry. + +All of your dependencies should be managed via Poetry. To add a new dependency, run: + ```bash poetry add ``` @@ -78,14 +89,16 @@ poetry add Please commit the changes to `pyproject.toml` and `poetry.lock` files. ## Publishing a new version of the connector + You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? + 1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-marketo test` -2. Bump the connector version (please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors)): - - bump the `dockerImageTag` value in in `metadata.yaml` - - bump the `version` value in `pyproject.toml` +2. Bump the connector version (please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors)): + - bump the `dockerImageTag` value in in `metadata.yaml` + - bump the `version` value in `pyproject.toml` 3. Make sure the `metadata.yaml` content is up to date. 4. Make sure the connector documentation and its changelog is up to date (`docs/integrations/sources/marketo.md`). 5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention). 6. Pat yourself on the back for being an awesome contributor. 7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. -8. Once your PR is merged, the new version of the connector will be automatically published to Docker Hub and our connector registry. \ No newline at end of file +8. Once your PR is merged, the new version of the connector will be automatically published to Docker Hub and our connector registry. diff --git a/airbyte-integrations/connectors/source-marketo/bootstrap.md b/airbyte-integrations/connectors/source-marketo/bootstrap.md index 89285211093..2e3567e0f4c 100644 --- a/airbyte-integrations/connectors/source-marketo/bootstrap.md +++ b/airbyte-integrations/connectors/source-marketo/bootstrap.md @@ -2,27 +2,27 @@ Marketo is a REST based API. Connector is implemented with [Airbyte CDK](https://docs.airbyte.io/connector-development/cdk-python). -Connector has such core streams, and all of them except Activity_types support full refresh and incremental sync: -* [Activity\_types](https://developers.marketo.com/rest-api/endpoint-reference/lead-database-endpoint-reference/#!/Activities/getAllActivityTypesUsingGET). -* [Campaigns](https://developers.marketo.com/rest-api/endpoint-reference/lead-database-endpoint-reference/#!/Campaigns/getCampaignsUsingGET). -* [Lists](https://developers.marketo.com/rest-api/endpoint-reference/lead-database-endpoint-reference/#!/Static_Lists/getListByIdUsingGET). -* [Programs](https://developers.marketo.com/rest-api/endpoint-reference/asset-endpoint-reference/#!/Programs/browseProgramsUsingGET). +Connector has such core streams, and all of them except Activity_types support full refresh and incremental sync: +- [Activity_types](https://developers.marketo.com/rest-api/endpoint-reference/lead-database-endpoint-reference/#!/Activities/getAllActivityTypesUsingGET). 
+- [Campaigns](https://developers.marketo.com/rest-api/endpoint-reference/lead-database-endpoint-reference/#!/Campaigns/getCampaignsUsingGET). +- [Lists](https://developers.marketo.com/rest-api/endpoint-reference/lead-database-endpoint-reference/#!/Static_Lists/getListByIdUsingGET). +- [Programs](https://developers.marketo.com/rest-api/endpoint-reference/asset-endpoint-reference/#!/Programs/browseProgramsUsingGET). ## Bulk export streams Connector also has bulk export streams, which support incremental sync. -* [Activities\_X](https://developers.marketo.com/rest-api/endpoint-reference/lead-database-endpoint-reference/#!/Activities/getLeadActivitiesUsingGET). -* [Leads](https://developers.marketo.com/rest-api/endpoint-reference/lead-database-endpoint-reference/#!/Leads/getLeadByIdUsingGET). +- [Activities_X](https://developers.marketo.com/rest-api/endpoint-reference/lead-database-endpoint-reference/#!/Activities/getLeadActivitiesUsingGET). +- [Leads](https://developers.marketo.com/rest-api/endpoint-reference/lead-database-endpoint-reference/#!/Leads/getLeadByIdUsingGET). To be able to pull export data you need to generate 3 separate requests. See [Marketo docs](https://developers.marketo.com/rest-api/bulk-extract/bulk-lead-extract/). -* [First](https://developers.marketo.com/rest-api/endpoint-reference/lead-database-endpoint-reference/#/Bulk_Export_Leads/createExportLeadsUsingPOST) - to create a job +- [First](https://developers.marketo.com/rest-api/endpoint-reference/lead-database-endpoint-reference/#/Bulk_Export_Leads/createExportLeadsUsingPOST) - to create a job -* [Second](https://developers.marketo.com/rest-api/endpoint-reference/lead-database-endpoint-reference/#/Bulk_Export_Leads/enqueueExportLeadsUsingPOST) - to enqueue job +- [Second](https://developers.marketo.com/rest-api/endpoint-reference/lead-database-endpoint-reference/#/Bulk_Export_Leads/enqueueExportLeadsUsingPOST) - to enqueue job -* [Third](https://developers.marketo.com/rest-api/endpoint-reference/lead-database-endpoint-reference/#!/Bulk_Export_Leads/getExportLeadsFileUsingGET) - to poll export data +- [Third](https://developers.marketo.com/rest-api/endpoint-reference/lead-database-endpoint-reference/#!/Bulk_Export_Leads/getExportLeadsFileUsingGET) - to poll export data For get status of extracting see [Status](https://developers.marketo.com/rest-api/endpoint-reference/lead-database-endpoint-reference/#!/Bulk_Export_Leads/getExportLeadsStatusUsingGET) - the status is only updated once every 60 seconds. Job timeout - 180 min. diff --git a/airbyte-integrations/connectors/source-merge/README.md b/airbyte-integrations/connectors/source-merge/README.md index 24901415147..61b853c3853 100644 --- a/airbyte-integrations/connectors/source-merge/README.md +++ b/airbyte-integrations/connectors/source-merge/README.md @@ -6,6 +6,7 @@ For information about how to use this connector within Airbyte, see [the documen ## Local development #### Create credentials + **If you are a community contributor**, follow the instructions in the [documentation](https://docs.airbyte.com/integrations/sources/merge) to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `source_merge/spec.yaml` file. Note that any directory named `secrets` is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. @@ -16,9 +17,10 @@ and place them into `secrets/config.json`. 
### Locally running the connector docker image - #### Build + **Via [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md) (recommended):** + ```bash airbyte-ci connectors --name=source-merge build ``` @@ -26,12 +28,15 @@ airbyte-ci connectors --name=source-merge build An image will be built with the tag `airbyte/source-merge:dev`. **Via `docker build`:** + ```bash docker build -t airbyte/source-merge:dev . ``` #### Run + Then run any of the connector commands as follows: + ``` docker run --rm airbyte/source-merge:dev spec docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-merge:dev check --config /secrets/config.json @@ -40,23 +45,30 @@ docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integrat ``` ## Testing + You can run our full test suite locally using [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md): + ```bash airbyte-ci connectors --name=source-merge test ``` ### Customizing acceptance Tests + Customize `acceptance-test-config.yml` file to configure tests. See [Connector Acceptance Tests](https://docs.airbyte.com/connector-development/testing-connectors/connector-acceptance-tests-reference) for more information. If your connector requires to create or destroy resources for use during acceptance tests create fixtures for it and place them inside integration_tests/acceptance.py. ## Dependency Management + All of your dependencies should go in `setup.py`, NOT `requirements.txt`. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development. We split dependencies between two groups, dependencies that are: -* required for your connector to work need to go to `MAIN_REQUIREMENTS` list. -* required for the testing need to go to `TEST_REQUIREMENTS` list + +- required for your connector to work need to go to `MAIN_REQUIREMENTS` list. +- required for the testing need to go to `TEST_REQUIREMENTS` list ### Publishing a new version of the connector + You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? + 1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-merge test` 2. Bump the connector version in `metadata.yaml`: increment the `dockerImageTag` value. Please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors). 3. Make sure the `metadata.yaml` content is up to date. @@ -64,4 +76,3 @@ You've checked out the repo, implemented a million dollar feature, and you're re 5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention). 6. Pat yourself on the back for being an awesome contributor. 7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. - diff --git a/airbyte-integrations/connectors/source-metabase/README.md b/airbyte-integrations/connectors/source-metabase/README.md index 86bf75320d0..2060f723a6e 100644 --- a/airbyte-integrations/connectors/source-metabase/README.md +++ b/airbyte-integrations/connectors/source-metabase/README.md @@ -1,31 +1,32 @@ # Metabase source connector - This is the repository for the Metabase source connector, written in Python. 
For information about how to use this connector within Airbyte, see [the documentation](https://docs.airbyte.com/integrations/sources/metabase). ## Local development ### Prerequisites -* Python (~=3.7) -* Poetry (~=1.7) - installation instructions [here](https://python-poetry.org/docs/#installation) +- Python (~=3.7) +- Poetry (~=1.7) - installation instructions [here](https://python-poetry.org/docs/#installation) ### Installing the connector + From this connector directory, run: + ```bash poetry install --with dev ``` - ### Create credentials + **If you are a community contributor**, follow the instructions in the [documentation](https://docs.airbyte.com/integrations/sources/metabase) to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `source_metabase/spec.yaml` file. Note that any directory named `secrets` is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. See `sample_files/config.json` for a sample config file. - ### Locally running the connector + ``` poetry run source-metabase spec poetry run source-metabase check --config secrets/config.json @@ -34,23 +35,28 @@ poetry run source-metabase read --config secrets/config.json --catalog sample_fi ``` ### Running unit tests + To run unit tests locally, from the connector directory run: + ``` poetry run pytest unit_tests ``` ### Building the docker image + 1. Install [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md) 2. Run the following command to build the docker image: + ```bash airbyte-ci connectors --name=source-metabase build ``` An image will be available on your host with the tag `airbyte/source-metabase:dev`. - ### Running as a docker container + Then run any of the connector commands as follows: + ``` docker run --rm airbyte/source-metabase:dev spec docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-metabase:dev check --config /secrets/config.json @@ -59,18 +65,23 @@ docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integrat ``` ### Running our CI test suite + You can run our full test suite locally using [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md): + ```bash airbyte-ci connectors --name=source-metabase test ``` ### Customizing acceptance Tests + Customize `acceptance-test-config.yml` file to configure acceptance tests. See [Connector Acceptance Tests](https://docs.airbyte.com/connector-development/testing-connectors/connector-acceptance-tests-reference) for more information. If your connector requires to create or destroy resources for use during acceptance tests create fixtures for it and place them inside integration_tests/acceptance.py. ### Dependency Management + All of your dependencies should be managed via Poetry. To add a new dependency, run: + ```bash poetry add ``` @@ -78,14 +89,16 @@ poetry add Please commit the changes to `pyproject.toml` and `poetry.lock` files. ## Publishing a new version of the connector + You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? + 1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-metabase test` 2. 
Bump the connector version (please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors)): - - bump the `dockerImageTag` value in in `metadata.yaml` - - bump the `version` value in `pyproject.toml` + - bump the `dockerImageTag` value in in `metadata.yaml` + - bump the `version` value in `pyproject.toml` 3. Make sure the `metadata.yaml` content is up to date. 4. Make sure the connector documentation and its changelog is up to date (`docs/integrations/sources/metabase.md`). 5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention). 6. Pat yourself on the back for being an awesome contributor. 7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. -8. Once your PR is merged, the new version of the connector will be automatically published to Docker Hub and our connector registry. \ No newline at end of file +8. Once your PR is merged, the new version of the connector will be automatically published to Docker Hub and our connector registry. diff --git a/airbyte-integrations/connectors/source-metabase/bootstrap.md b/airbyte-integrations/connectors/source-metabase/bootstrap.md index fb9ebdced69..4a03ca4f1d9 100644 --- a/airbyte-integrations/connectors/source-metabase/bootstrap.md +++ b/airbyte-integrations/connectors/source-metabase/bootstrap.md @@ -7,6 +7,7 @@ It also offers embeddable charts and interactive dashboards, GUI and SQL editors that queries data from major data warehouses and databases with auditing and data sandboxing features, and more. Just like Airbyte, it offers the options for deployment: + - self-hosted through their Open-Source or licensed (paid) versions which unlock more features. - cloud managed by Metabase for their paying customers. @@ -27,9 +28,9 @@ Because of this, the connector configuration needs to be supplied with the sessi edit its own configuration with the new value everytime it runs. A consequence of this limitation is that the configuration of the connector will have to be updated when the credential token expires -(every 14 days). Unless, the airbyte-server is able to refresh this token and persist the value of the new token. +(every 14 days). Unless, the airbyte-server is able to refresh this token and persist the value of the new token. -If the connector is supplied with only username and password, a session_token will be generated everytime an +If the connector is supplied with only username and password, a session_token will be generated everytime an authenticated query is running, which might trigger security alerts on the user's account. All the API from metabase don't seem to support incremental sync modes as they don't expose cursor field values or pagination. @@ -38,4 +39,3 @@ So all streams only support full refresh sync modes for the moment. 
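For reference, a session token can also be generated outside the connector with a single call to Metabase's standard `/api/session` endpoint; a short sketch follows, where the host, username and password are placeholders:

```python
# Sketch: fetch a Metabase session token manually (all values are placeholders).
import requests


def get_session_token(host: str, username: str, password: str) -> str:
    response = requests.post(
        f"{host}/api/session",
        json={"username": username, "password": password},
    )
    response.raise_for_status()
    # Metabase returns the session token in the "id" field; by default it expires after 14 days.
    return response.json()["id"]
```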
## API Reference The Metabase reference documents: [Metabase API documentation](https://www.metabase.com/docs/latest/api-documentation.html) - diff --git a/airbyte-integrations/connectors/source-microsoft-dataverse/README.md b/airbyte-integrations/connectors/source-microsoft-dataverse/README.md index 26a4fcff9b3..e51083521a0 100644 --- a/airbyte-integrations/connectors/source-microsoft-dataverse/README.md +++ b/airbyte-integrations/connectors/source-microsoft-dataverse/README.md @@ -6,23 +6,28 @@ For information about how to use this connector within Airbyte, see [the documen ## Local development ### Prerequisites + **To iterate on this connector, make sure to complete this prerequisites section.** #### Minimum Python version required `= 3.9.0` #### Build & Activate Virtual Environment and install dependencies + From this connector directory, create a virtual environment: + ``` python -m venv .venv ``` This will generate a virtualenv for this module in `.venv/`. Make sure this venv is active in your development environment of choice. To activate it from the terminal, run: + ``` source .venv/bin/activate pip install -r requirements.txt pip install '.[tests]' ``` + If you are in an IDE, follow your IDE's instructions to activate the virtualenv. Note that while we are installing dependencies from `requirements.txt`, you should only edit `setup.py` for your dependencies. `requirements.txt` is @@ -31,6 +36,7 @@ If this is mumbo jumbo to you, don't worry about it, just put your deps in `setu should work as you expect. #### Create credentials + **If you are a community contributor**, follow the instructions in the [documentation](https://docs.airbyte.io/integrations/sources/microsoft-dataverse) to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `source_microsoft_dataverse/spec.yaml` file. Note that any directory named `secrets` is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. @@ -40,6 +46,7 @@ See `integration_tests/sample_config.json` for a sample config file. and place them into `secrets/config.json`. ### Locally running the connector + ``` python main.py spec python main.py check --config secrets/config.json @@ -49,9 +56,10 @@ python main.py read --config secrets/config.json --catalog integration_tests/con ### Locally running the connector docker image - #### Build + **Via [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md) (recommended):** + ```bash airbyte-ci connectors --name=source-microsoft-dataverse build ``` @@ -59,12 +67,15 @@ airbyte-ci connectors --name=source-microsoft-dataverse build An image will be built with the tag `airbyte/source-microsoft-dataverse:dev`. **Via `docker build`:** + ```bash docker build -t airbyte/source-microsoft-dataverse:dev . 
``` #### Run + Then run any of the connector commands as follows: + ``` docker run --rm airbyte/source-microsoft-dataverse:dev spec docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-microsoft-dataverse:dev check --config /secrets/config.json @@ -73,23 +84,30 @@ docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integrat ``` ## Testing + You can run our full test suite locally using [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md): + ```bash airbyte-ci connectors --name=source-microsoft-dataverse test ``` ### Customizing acceptance Tests + Customize `acceptance-test-config.yml` file to configure tests. See [Connector Acceptance Tests](https://docs.airbyte.com/connector-development/testing-connectors/connector-acceptance-tests-reference) for more information. If your connector requires to create or destroy resources for use during acceptance tests create fixtures for it and place them inside integration_tests/acceptance.py. ## Dependency Management + All of your dependencies should go in `setup.py`, NOT `requirements.txt`. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development. We split dependencies between two groups, dependencies that are: -* required for your connector to work need to go to `MAIN_REQUIREMENTS` list. -* required for the testing need to go to `TEST_REQUIREMENTS` list + +- required for your connector to work need to go to `MAIN_REQUIREMENTS` list. +- required for the testing need to go to `TEST_REQUIREMENTS` list ### Publishing a new version of the connector + You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? + 1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-microsoft-dataverse test` 2. Bump the connector version in `metadata.yaml`: increment the `dockerImageTag` value. Please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors). 3. Make sure the `metadata.yaml` content is up to date. @@ -97,4 +115,3 @@ You've checked out the repo, implemented a million dollar feature, and you're re 5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention). 6. Pat yourself on the back for being an awesome contributor. 7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. - diff --git a/airbyte-integrations/connectors/source-microsoft-onedrive/README.md b/airbyte-integrations/connectors/source-microsoft-onedrive/README.md index 90cdae8aafc..6d69f141e10 100644 --- a/airbyte-integrations/connectors/source-microsoft-onedrive/README.md +++ b/airbyte-integrations/connectors/source-microsoft-onedrive/README.md @@ -1,31 +1,32 @@ # Microsoft OneDrive source connector - This is the repository for the Microsoft OneDrive source connector, written in Python. For information about how to use this connector within Airbyte, see [the documentation](https://docs.airbyte.com/integrations/sources/microsoft-onedrive). 
## Local development ### Prerequisites -* Python (~=3.9) -* Poetry (~=1.7) - installation instructions [here](https://python-poetry.org/docs/#installation) +- Python (~=3.9) +- Poetry (~=1.7) - installation instructions [here](https://python-poetry.org/docs/#installation) ### Installing the connector + From this connector directory, run: + ```bash poetry install --with dev ``` - ### Create credentials + **If you are a community contributor**, follow the instructions in the [documentation](https://docs.airbyte.com/integrations/sources/microsoft-onedrive) to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `source_microsoft_onedrive/spec.yaml` file. Note that any directory named `secrets` is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. See `sample_files/sample_config.json` for a sample config file. - ### Locally running the connector + ``` poetry run source-microsoft-onedrive spec poetry run source-microsoft-onedrive check --config secrets/config.json @@ -34,23 +35,28 @@ poetry run source-microsoft-onedrive read --config secrets/config.json --catalog ``` ### Running unit tests + To run unit tests locally, from the connector directory run: + ``` poetry run pytest unit_tests ``` ### Building the docker image + 1. Install [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md) 2. Run the following command to build the docker image: + ```bash airbyte-ci connectors --name=source-microsoft-onedrive build ``` An image will be available on your host with the tag `airbyte/source-microsoft-onedrive:dev`. - ### Running as a docker container + Then run any of the connector commands as follows: + ``` docker run --rm airbyte/source-microsoft-onedrive:dev spec docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-microsoft-onedrive:dev check --config /secrets/config.json @@ -59,18 +65,23 @@ docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integrat ``` ### Running our CI test suite + You can run our full test suite locally using [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md): + ```bash airbyte-ci connectors --name=source-microsoft-onedrive test ``` ### Customizing acceptance Tests + Customize `acceptance-test-config.yml` file to configure acceptance tests. See [Connector Acceptance Tests](https://docs.airbyte.com/connector-development/testing-connectors/connector-acceptance-tests-reference) for more information. If your connector requires to create or destroy resources for use during acceptance tests create fixtures for it and place them inside integration_tests/acceptance.py. ### Dependency Management -All of your dependencies should be managed via Poetry. + +All of your dependencies should be managed via Poetry. To add a new dependency, run: + ```bash poetry add ``` @@ -78,14 +89,16 @@ poetry add Please commit the changes to `pyproject.toml` and `poetry.lock` files. ## Publishing a new version of the connector + You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? + 1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-microsoft-onedrive test` -2. 
Bump the connector version (please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors)): - - bump the `dockerImageTag` value in in `metadata.yaml` - - bump the `version` value in `pyproject.toml` +2. Bump the connector version (please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors)): + - bump the `dockerImageTag` value in in `metadata.yaml` + - bump the `version` value in `pyproject.toml` 3. Make sure the `metadata.yaml` content is up to date. 4. Make sure the connector documentation and its changelog is up to date (`docs/integrations/sources/microsoft-onedrive.md`). 5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention). 6. Pat yourself on the back for being an awesome contributor. 7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. -8. Once your PR is merged, the new version of the connector will be automatically published to Docker Hub and our connector registry. \ No newline at end of file +8. Once your PR is merged, the new version of the connector will be automatically published to Docker Hub and our connector registry. diff --git a/airbyte-integrations/connectors/source-microsoft-sharepoint/README.md b/airbyte-integrations/connectors/source-microsoft-sharepoint/README.md index 5ff95071916..7deee881e28 100644 --- a/airbyte-integrations/connectors/source-microsoft-sharepoint/README.md +++ b/airbyte-integrations/connectors/source-microsoft-sharepoint/README.md @@ -1,31 +1,32 @@ # Microsoft SharePoint source connector - This is the repository for the Microsoft SharePoint source connector, written in Python. For information about how to use this connector within Airbyte, see [the documentation](https://docs.airbyte.com/integrations/sources/microsoft-sharepoint). ## Local development ### Prerequisites -* Python (~=3.9) -* Poetry (~=1.7) - installation instructions [here](https://python-poetry.org/docs/#installation) +- Python (~=3.9) +- Poetry (~=1.7) - installation instructions [here](https://python-poetry.org/docs/#installation) ### Installing the connector + From this connector directory, run: + ```bash poetry install --with dev ``` - ### Create credentials + **If you are a community contributor**, follow the instructions in the [documentation](https://docs.airbyte.com/integrations/sources/microsoft-sharepoint) to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `source_microsoft_sharepoint/spec.yaml` file. Note that any directory named `secrets` is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. See `sample_files/sample_config.json` for a sample config file. - ### Locally running the connector + ``` poetry run source-microsoft-sharepoint spec poetry run source-microsoft-sharepoint check --config secrets/config.json @@ -34,23 +35,28 @@ poetry run source-microsoft-sharepoint read --config secrets/config.json --catal ``` ### Running unit tests + To run unit tests locally, from the connector directory run: + ``` poetry run pytest unit_tests ``` ### Building the docker image + 1. Install [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md) 2. 
Run the following command to build the docker image: + ```bash airbyte-ci connectors --name=source-microsoft-sharepoint build ``` An image will be available on your host with the tag `airbyte/source-microsoft-sharepoint:dev`. - ### Running as a docker container + Then run any of the connector commands as follows: + ``` docker run --rm airbyte/source-microsoft-sharepoint:dev spec docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-microsoft-sharepoint:dev check --config /secrets/config.json @@ -59,18 +65,23 @@ docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integrat ``` ### Running our CI test suite + You can run our full test suite locally using [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md): + ```bash airbyte-ci connectors --name=source-microsoft-sharepoint test ``` ### Customizing acceptance Tests + Customize `acceptance-test-config.yml` file to configure acceptance tests. See [Connector Acceptance Tests](https://docs.airbyte.com/connector-development/testing-connectors/connector-acceptance-tests-reference) for more information. If your connector requires to create or destroy resources for use during acceptance tests create fixtures for it and place them inside integration_tests/acceptance.py. ### Dependency Management -All of your dependencies should be managed via Poetry. + +All of your dependencies should be managed via Poetry. To add a new dependency, run: + ```bash poetry add ``` @@ -78,14 +89,16 @@ poetry add Please commit the changes to `pyproject.toml` and `poetry.lock` files. ## Publishing a new version of the connector + You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? + 1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-microsoft-sharepoint test` -2. Bump the connector version (please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors)): - - bump the `dockerImageTag` value in in `metadata.yaml` - - bump the `version` value in `pyproject.toml` +2. Bump the connector version (please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors)): + - bump the `dockerImageTag` value in in `metadata.yaml` + - bump the `version` value in `pyproject.toml` 3. Make sure the `metadata.yaml` content is up to date. 4. Make sure the connector documentation and its changelog is up to date (`docs/integrations/sources/microsoft-sharepoint.md`). 5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention). 6. Pat yourself on the back for being an awesome contributor. 7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. -8. Once your PR is merged, the new version of the connector will be automatically published to Docker Hub and our connector registry. \ No newline at end of file +8. Once your PR is merged, the new version of the connector will be automatically published to Docker Hub and our connector registry. 
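As a complement to the `poetry run pytest unit_tests` step above: these unit tests are ordinary pytest tests against the connector's stream classes. A self-contained sketch of that style is below; the `EntitiesStream` defined inline is purely illustrative and is not part of this connector's code.

```python
# unit_tests/test_streams.py (illustrative sketch; EntitiesStream is a made-up example stream)
from unittest.mock import MagicMock

import requests
from airbyte_cdk.sources.streams.http import HttpStream


class EntitiesStream(HttpStream):
    url_base = "https://example.com/api/"
    primary_key = "id"

    def path(self, **kwargs) -> str:
        return "entities"

    def next_page_token(self, response: requests.Response):
        return None  # single-page example

    def parse_response(self, response: requests.Response, **kwargs):
        yield from response.json().get("value", [])


def test_path_and_pagination():
    stream = EntitiesStream()
    assert stream.path() == "entities"
    assert stream.next_page_token(response=MagicMock()) is None


def test_parse_response_yields_records():
    stream = EntitiesStream()
    response = MagicMock()
    response.json.return_value = {"value": [{"id": 1}, {"id": 2}]}
    assert list(stream.parse_response(response)) == [{"id": 1}, {"id": 2}]
```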
diff --git a/airbyte-integrations/connectors/source-microsoft-teams/README.md b/airbyte-integrations/connectors/source-microsoft-teams/README.md index 54da8169d26..05ad1fdea93 100644 --- a/airbyte-integrations/connectors/source-microsoft-teams/README.md +++ b/airbyte-integrations/connectors/source-microsoft-teams/README.md @@ -7,19 +7,17 @@ For information about how to use this connector within Airbyte, see [the documen ### Prerequisites -* Python (`^3.9`) -* Poetry (`^1.7`) - installation instructions [here](https://python-poetry.org/docs/#installation) - - +- Python (`^3.9`) +- Poetry (`^1.7`) - installation instructions [here](https://python-poetry.org/docs/#installation) ### Installing the connector From this connector directory, run: + ```bash poetry install --with dev ``` - ### Create credentials **If you are a community contributor**, follow the instructions in the [documentation](https://docs.airbyte.com/integrations/sources/microsoft-teams) @@ -27,7 +25,6 @@ to generate the necessary credentials. Then create a file `secrets/config.json` Note that any directory named `secrets` is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. See `sample_files/sample_config.json` for a sample config file. - ### Locally running the connector ``` @@ -49,16 +46,17 @@ poetry run pytest tests 1. Install [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md) 2. Run the following command to build the docker image: + ```bash airbyte-ci connectors --name=source-microsoft-teams build ``` An image will be available on your host with the tag `airbyte/source-microsoft-teams:dev`. - ### Running as a docker container Then run any of the connector commands as follows: + ``` docker run --rm airbyte/source-microsoft-teams:dev spec docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-microsoft-teams:dev check --config /secrets/config.json @@ -69,6 +67,7 @@ docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integrat ### Running our CI test suite You can run our full test suite locally using [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md): + ```bash airbyte-ci connectors --name=source-microsoft-teams test ``` @@ -80,8 +79,9 @@ If your connector requires to create or destroy resources for use during accepta ### Dependency Management -All of your dependencies should be managed via Poetry. +All of your dependencies should be managed via Poetry. To add a new dependency, run: + ```bash poetry add ``` @@ -91,13 +91,14 @@ Please commit the changes to `pyproject.toml` and `poetry.lock` files. ## Publishing a new version of the connector You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? + 1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-microsoft-teams test` -2. Bump the connector version (please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors)): - - bump the `dockerImageTag` value in in `metadata.yaml` - - bump the `version` value in `pyproject.toml` +2. 
Bump the connector version (please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors)): + - bump the `dockerImageTag` value in in `metadata.yaml` + - bump the `version` value in `pyproject.toml` 3. Make sure the `metadata.yaml` content is up to date. 4. Make sure the connector documentation and its changelog is up to date (`docs/integrations/sources/microsoft-teams.md`). 5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention). 6. Pat yourself on the back for being an awesome contributor. 7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. -8. Once your PR is merged, the new version of the connector will be automatically published to Docker Hub and our connector registry. \ No newline at end of file +8. Once your PR is merged, the new version of the connector will be automatically published to Docker Hub and our connector registry. diff --git a/airbyte-integrations/connectors/source-mixpanel/README.md b/airbyte-integrations/connectors/source-mixpanel/README.md index 98574c7aaf0..2725bb71959 100644 --- a/airbyte-integrations/connectors/source-mixpanel/README.md +++ b/airbyte-integrations/connectors/source-mixpanel/README.md @@ -1,31 +1,32 @@ # Mixpanel source connector - This is the repository for the Mixpanel source connector, written in Python. For information about how to use this connector within Airbyte, see [the documentation](https://docs.airbyte.com/integrations/sources/mixpanel). ## Local development ### Prerequisites -* Python (~=3.9) -* Poetry (~=1.7) - installation instructions [here](https://python-poetry.org/docs/#installation) +- Python (~=3.9) +- Poetry (~=1.7) - installation instructions [here](https://python-poetry.org/docs/#installation) ### Installing the connector + From this connector directory, run: + ```bash poetry install --with dev ``` - ### Create credentials + **If you are a community contributor**, follow the instructions in the [documentation](https://docs.airbyte.com/integrations/sources/mixpanel) to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `source_mixpanel/spec.yaml` file. Note that any directory named `secrets` is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. See `sample_files/sample_config.json` for a sample config file. - ### Locally running the connector + ``` poetry run source-mixpanel spec poetry run source-mixpanel check --config secrets/config.json @@ -34,23 +35,28 @@ poetry run source-mixpanel read --config secrets/config.json --catalog integrati ``` ### Running unit tests + To run unit tests locally, from the connector directory run: + ``` poetry run pytest unit_tests ``` ### Building the docker image + 1. Install [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md) 2. Run the following command to build the docker image: + ```bash airbyte-ci connectors --name=source-mixpanel build ``` An image will be available on your host with the tag `airbyte/source-mixpanel:dev`. 
- ### Running as a docker container + Then run any of the connector commands as follows: + ``` docker run --rm airbyte/source-mixpanel:dev spec docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-mixpanel:dev check --config /secrets/config.json @@ -59,18 +65,23 @@ docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integrat ``` ### Running our CI test suite + You can run our full test suite locally using [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md): + ```bash airbyte-ci connectors --name=source-mixpanel test ``` ### Customizing acceptance Tests + Customize `acceptance-test-config.yml` file to configure acceptance tests. See [Connector Acceptance Tests](https://docs.airbyte.com/connector-development/testing-connectors/connector-acceptance-tests-reference) for more information. If your connector requires to create or destroy resources for use during acceptance tests create fixtures for it and place them inside integration_tests/acceptance.py. ### Dependency Management -All of your dependencies should be managed via Poetry. + +All of your dependencies should be managed via Poetry. To add a new dependency, run: + ```bash poetry add ``` @@ -78,14 +89,16 @@ poetry add Please commit the changes to `pyproject.toml` and `poetry.lock` files. ## Publishing a new version of the connector + You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? + 1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-mixpanel test` -2. Bump the connector version (please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors)): - - bump the `dockerImageTag` value in in `metadata.yaml` - - bump the `version` value in `pyproject.toml` +2. Bump the connector version (please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors)): + - bump the `dockerImageTag` value in in `metadata.yaml` + - bump the `version` value in `pyproject.toml` 3. Make sure the `metadata.yaml` content is up to date. 4. Make sure the connector documentation and its changelog is up to date (`docs/integrations/sources/mixpanel.md`). 5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention). 6. Pat yourself on the back for being an awesome contributor. 7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. -8. Once your PR is merged, the new version of the connector will be automatically published to Docker Hub and our connector registry. \ No newline at end of file +8. Once your PR is merged, the new version of the connector will be automatically published to Docker Hub and our connector registry. diff --git a/airbyte-integrations/connectors/source-monday/README.md b/airbyte-integrations/connectors/source-monday/README.md index 39551615dec..d9975b3950b 100644 --- a/airbyte-integrations/connectors/source-monday/README.md +++ b/airbyte-integrations/connectors/source-monday/README.md @@ -1,31 +1,32 @@ # Monday source connector - This is the repository for the Monday source connector, written in Python. 
For information about how to use this connector within Airbyte, see [the documentation](https://docs.airbyte.com/integrations/sources/monday). ## Local development ### Prerequisites -* Python (~=3.9) -* Poetry (~=1.7) - installation instructions [here](https://python-poetry.org/docs/#installation) +- Python (~=3.9) +- Poetry (~=1.7) - installation instructions [here](https://python-poetry.org/docs/#installation) ### Installing the connector + From this connector directory, run: + ```bash poetry install --with dev ``` - ### Create credentials + **If you are a community contributor**, follow the instructions in the [documentation](https://docs.airbyte.com/integrations/sources/monday) to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `source_monday/spec.yaml` file. Note that any directory named `secrets` is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. See `sample_files/sample_config.json` for a sample config file. - ### Locally running the connector + ``` poetry run source-monday spec poetry run source-monday check --config secrets/config.json @@ -34,23 +35,28 @@ poetry run source-monday read --config secrets/config.json --catalog sample_file ``` ### Running unit tests + To run unit tests locally, from the connector directory run: + ``` poetry run pytest unit_tests ``` ### Building the docker image + 1. Install [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md) 2. Run the following command to build the docker image: + ```bash airbyte-ci connectors --name=source-monday build ``` An image will be available on your host with the tag `airbyte/source-monday:dev`. - ### Running as a docker container + Then run any of the connector commands as follows: + ``` docker run --rm airbyte/source-monday:dev spec docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-monday:dev check --config /secrets/config.json @@ -59,18 +65,23 @@ docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integrat ``` ### Running our CI test suite + You can run our full test suite locally using [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md): + ```bash airbyte-ci connectors --name=source-monday test ``` ### Customizing acceptance Tests + Customize `acceptance-test-config.yml` file to configure acceptance tests. See [Connector Acceptance Tests](https://docs.airbyte.com/connector-development/testing-connectors/connector-acceptance-tests-reference) for more information. If your connector requires to create or destroy resources for use during acceptance tests create fixtures for it and place them inside integration_tests/acceptance.py. ### Dependency Management -All of your dependencies should be managed via Poetry. + +All of your dependencies should be managed via Poetry. To add a new dependency, run: + ```bash poetry add ``` @@ -78,14 +89,16 @@ poetry add Please commit the changes to `pyproject.toml` and `poetry.lock` files. ## Publishing a new version of the connector + You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? + 1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-monday test` -2. 
Bump the connector version (please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors)): - - bump the `dockerImageTag` value in in `metadata.yaml` - - bump the `version` value in `pyproject.toml` +2. Bump the connector version (please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors)): + - bump the `dockerImageTag` value in in `metadata.yaml` + - bump the `version` value in `pyproject.toml` 3. Make sure the `metadata.yaml` content is up to date. 4. Make sure the connector documentation and its changelog is up to date (`docs/integrations/sources/monday.md`). 5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention). 6. Pat yourself on the back for being an awesome contributor. 7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. -8. Once your PR is merged, the new version of the connector will be automatically published to Docker Hub and our connector registry. \ No newline at end of file +8. Once your PR is merged, the new version of the connector will be automatically published to Docker Hub and our connector registry. diff --git a/airbyte-integrations/connectors/source-mongodb-v2/README.md b/airbyte-integrations/connectors/source-mongodb-v2/README.md index 0b648be99cd..3affffb37d1 100644 --- a/airbyte-integrations/connectors/source-mongodb-v2/README.md +++ b/airbyte-integrations/connectors/source-mongodb-v2/README.md @@ -1,13 +1,16 @@ # MongoDb Source ## Documentation + This is the repository for the MongoDb source connector in Java. For information about how to use this connector within Airbyte, see [User Documentation](https://docs.airbyte.io/integrations/sources/mongodb-v2) ## Local development #### Building via Gradle + From the Airbyte repository root, run: + ``` ./gradlew :airbyte-integrations:connectors:source-mongodb-v2:build ``` @@ -15,15 +18,18 @@ From the Airbyte repository root, run: ### Locally running the connector docker image #### Build + Build the connector image via Gradle: ``` ./gradlew :airbyte-integrations:connectors:source-mongodb-v2:buildConnectorImage ``` + Once built, the docker image name and tag on your host will be `airbyte/source-mongodb-v2:dev`. the Dockerfile. ## Testing + We use `JUnit` for Java tests. ### Test Configuration @@ -37,15 +43,15 @@ As a community contributor, you will need to have an Atlas cluster to test Mongo 1. Create `secrets/credentials.json` file 1. Insert below json to the file with your configuration - ``` - { - "cluster_type": "ATLAS_REPLICA_SET" - "database": "database_name", - "username": "username", - "password": "password", - "connection_string": "mongodb+srv://cluster0.abcd1.mongodb.net/", - "auth_source": "auth_database", - } + ``` + { + "cluster_type": "ATLAS_REPLICA_SET" + "database": "database_name", + "username": "username", + "password": "password", + "connection_string": "mongodb+srv://cluster0.abcd1.mongodb.net/", + "auth_source": "auth_database", + } ``` where `installation_type` is one of `ATLAS_REPLICA_SET` or `SELF_HOSTED_REPLICA_SET` depending on the location of the target cluster. @@ -54,9 +60,10 @@ As a community contributor, you will need to have an Atlas cluster to test Mongo 1. Access the `MONGODB_TEST_CREDS` secret on LastPass 1. 
Create a file with the contents at `secrets/credentials.json` - #### Acceptance Tests + To run acceptance and custom integration tests: + ``` ./gradlew :airbyte-integrations:connectors:source-mongodb-v2:integrationTest ``` diff --git a/airbyte-integrations/connectors/source-mssql/README.md b/airbyte-integrations/connectors/source-mssql/README.md index f98f780d9d1..78a636b36e0 100644 --- a/airbyte-integrations/connectors/source-mssql/README.md +++ b/airbyte-integrations/connectors/source-mssql/README.md @@ -3,11 +3,13 @@ ## Performance Test To run performance tests in commandline: + ```shell ./gradlew :airbyte-integrations:connectors:source-mssql:performanceTest [--cpulimit=cpulimit/] [--memorylimit=memorylimit/] ``` In pull request: + ```shell /test-performance connector=connectors/source-mssql [--cpulimit=cpulimit/] [--memorylimit=memorylimit/] ``` @@ -18,7 +20,7 @@ In pull request: ### Use MsSQL script to populate the benchmark database -In order to create a database with a certain number of tables, and a certain number of records in each of them, +In order to create a database with a certain number of tables, and a certain number of records in each of them, you need to follow a few simple steps. 1. Create a new database. @@ -28,4 +30,4 @@ you need to follow a few simple steps. cd airbyte-integrations/connectors/source-mssql sqlcmd -S Serverinstance -E -i src/test-performance/sql/create_mssql_benchmarks.sql ``` -4. After the script finishes its work, you will receive the number of tables specified in the script, with names starting with **test_0** and ending with **test_(the number of tables minus 1)**. +4. After the script finishes its work, you will receive the number of tables specified in the script, with names starting with **test_0** and ending with **test\_(the number of tables minus 1)**. diff --git a/airbyte-integrations/connectors/source-my-hours/README.md b/airbyte-integrations/connectors/source-my-hours/README.md index 5353a0bdad0..cd418429976 100644 --- a/airbyte-integrations/connectors/source-my-hours/README.md +++ b/airbyte-integrations/connectors/source-my-hours/README.md @@ -7,19 +7,17 @@ For information about how to use this connector within Airbyte, see [the documen ### Prerequisites -* Python (`^3.9`) -* Poetry (`^1.7`) - installation instructions [here](https://python-poetry.org/docs/#installation) - - +- Python (`^3.9`) +- Poetry (`^1.7`) - installation instructions [here](https://python-poetry.org/docs/#installation) ### Installing the connector From this connector directory, run: + ```bash poetry install --with dev ``` - ### Create credentials **If you are a community contributor**, follow the instructions in the [documentation](https://docs.airbyte.com/integrations/sources/my-hours) @@ -27,7 +25,6 @@ to generate the necessary credentials. Then create a file `secrets/config.json` Note that any directory named `secrets` is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. See `sample_files/sample_config.json` for a sample config file. - ### Locally running the connector ``` @@ -49,16 +46,17 @@ poetry run pytest tests 1. Install [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md) 2. Run the following command to build the docker image: + ```bash airbyte-ci connectors --name=source-my-hours build ``` An image will be available on your host with the tag `airbyte/source-my-hours:dev`. 
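Before running the connector commands below, it can be worth confirming that the build actually produced the `dev` tag. This check uses only the standard Docker CLI, nothing Airbyte-specific:

```bash
# Confirm the dev image produced by airbyte-ci is present locally
docker images airbyte/source-my-hours:dev

# Optional: inspect the entrypoint before running the connector commands against it
docker inspect --format '{{.Config.Entrypoint}}' airbyte/source-my-hours:dev
```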
- ### Running as a docker container Then run any of the connector commands as follows: + ``` docker run --rm airbyte/source-my-hours:dev spec docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-my-hours:dev check --config /secrets/config.json @@ -69,6 +67,7 @@ docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integrat ### Running our CI test suite You can run our full test suite locally using [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md): + ```bash airbyte-ci connectors --name=source-my-hours test ``` @@ -80,8 +79,9 @@ If your connector requires to create or destroy resources for use during accepta ### Dependency Management -All of your dependencies should be managed via Poetry. +All of your dependencies should be managed via Poetry. To add a new dependency, run: + ```bash poetry add ``` @@ -91,13 +91,14 @@ Please commit the changes to `pyproject.toml` and `poetry.lock` files. ## Publishing a new version of the connector You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? + 1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-my-hours test` -2. Bump the connector version (please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors)): - - bump the `dockerImageTag` value in in `metadata.yaml` - - bump the `version` value in `pyproject.toml` +2. Bump the connector version (please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors)): + - bump the `dockerImageTag` value in in `metadata.yaml` + - bump the `version` value in `pyproject.toml` 3. Make sure the `metadata.yaml` content is up to date. 4. Make sure the connector documentation and its changelog is up to date (`docs/integrations/sources/my-hours.md`). 5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention). 6. Pat yourself on the back for being an awesome contributor. 7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. -8. Once your PR is merged, the new version of the connector will be automatically published to Docker Hub and our connector registry. \ No newline at end of file +8. Once your PR is merged, the new version of the connector will be automatically published to Docker Hub and our connector registry. diff --git a/airbyte-integrations/connectors/source-my-hours/bootstrap.md b/airbyte-integrations/connectors/source-my-hours/bootstrap.md index b77cb9cbd1a..25b2d189324 100644 --- a/airbyte-integrations/connectors/source-my-hours/bootstrap.md +++ b/airbyte-integrations/connectors/source-my-hours/bootstrap.md @@ -2,11 +2,11 @@ This connector has the following streams, and all of them support full refresh only. 
-* [Time Logs](https://documenter.getpostman.com/view/8879268/TVmV4YYU#a023832e-c39d-4cff-a639-d673fb8846c1) -* [Clients](https://documenter.getpostman.com/view/8879268/TVmV4YYU#79916508-c2ba-4ed4-9d97-bbb769687c11) -* [Projects](https://documenter.getpostman.com/view/8879268/TVmV4YYU#64fa3d61-a785-4727-bd33-f549b987c7b2) -* [Tags](https://documenter.getpostman.com/view/8879268/TVmV4YYU#a7ef468e-120b-40de-ad52-79e9d485f688) -* [Users](https://documenter.getpostman.com/view/8879268/TVmV4YYU#da5fa9cc-f337-4888-bf18-21e68a07ee3d) +- [Time Logs](https://documenter.getpostman.com/view/8879268/TVmV4YYU#a023832e-c39d-4cff-a639-d673fb8846c1) +- [Clients](https://documenter.getpostman.com/view/8879268/TVmV4YYU#79916508-c2ba-4ed4-9d97-bbb769687c11) +- [Projects](https://documenter.getpostman.com/view/8879268/TVmV4YYU#64fa3d61-a785-4727-bd33-f549b987c7b2) +- [Tags](https://documenter.getpostman.com/view/8879268/TVmV4YYU#a7ef468e-120b-40de-ad52-79e9d485f688) +- [Users](https://documenter.getpostman.com/view/8879268/TVmV4YYU#da5fa9cc-f337-4888-bf18-21e68a07ee3d) ## Authentication diff --git a/airbyte-integrations/connectors/source-mysql/README.md b/airbyte-integrations/connectors/source-mysql/README.md index 945aff0c9c9..0685ff81fd9 100644 --- a/airbyte-integrations/connectors/source-mysql/README.md +++ b/airbyte-integrations/connectors/source-mysql/README.md @@ -1,13 +1,16 @@ # MySQL Source ## Documentation + This is the repository for the MySQL only source connector in Java. For information about how to use this connector within Airbyte, see [User Documentation](https://docs.airbyte.io/integrations/sources/mysql) ## Local development #### Building via Gradle + From the Airbyte repository root, run: + ``` ./gradlew :airbyte-integrations:connectors:source-mysql:build ``` @@ -15,19 +18,24 @@ From the Airbyte repository root, run: ### Locally running the connector docker image #### Build + Build the connector image via Gradle: ``` ./gradlew :airbyte-integrations:connectors:source-mysql:buildConnectorImage ``` + Once built, the docker image name and tag on your host will be `airbyte/source-mysql:dev`. the Dockerfile. ## Testing + We use `JUnit` for Java tests. ### Acceptance Tests + To run acceptance and custom integration tests: + ``` ./gradlew :airbyte-integrations:connectors:source-mysql:integrationTest ``` @@ -35,11 +43,13 @@ To run acceptance and custom integration tests: ### Performance Tests To run performance tests in commandline: + ```shell ./gradlew :airbyte-integrations:connectors:source-mysql:performanceTest [--cpulimit=cpulimit/] [--memorylimit=memorylimit/] ``` In pull request: + ```shell /test-performance connector=connectors/source-mysql [--cpulimit=cpulimit/] [--memorylimit=memorylimit/] ``` @@ -59,5 +69,5 @@ you need to follow a few simple steps. ```shell cd airbyte-integrations/connectors/source-mysql mysql -h hostname -u user database < src/test-performance/sql/create_mysql_benchmarks.sql - ``` -4. After the script finishes its work, you will receive the number of tables specified in the script, with names starting with **test_0** and ending with **test_(the number of tables minus 1)**. + ``` +4. After the script finishes its work, you will receive the number of tables specified in the script, with names starting with **test_0** and ending with **test\_(the number of tables minus 1)**. 
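To sanity-check step 4 above, a quick option (plain `mysql` client, same connection details as in step 3; illustrative, not part of the benchmark script) is to list the generated tables:

```bash
# List the benchmark tables created by create_mysql_benchmarks.sql
mysql -h hostname -u user -e "SHOW TABLES LIKE 'test\_%';" database
```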
diff --git a/airbyte-integrations/connectors/source-n8n/README.md b/airbyte-integrations/connectors/source-n8n/README.md index 414f367c91d..ce86e6a5631 100644 --- a/airbyte-integrations/connectors/source-n8n/README.md +++ b/airbyte-integrations/connectors/source-n8n/README.md @@ -6,6 +6,7 @@ For information about how to use this connector within Airbyte, see [the documen ## Local development #### Create credentials + **If you are a community contributor**, follow the instructions in the [documentation](https://docs.airbyte.io/integrations/sources/n8n) to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `source_n8n/spec.yaml` file. Note that any directory named `secrets` is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. @@ -16,9 +17,10 @@ and place them into `secrets/config.json`. ### Locally running the connector docker image - #### Build + **Via [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md) (recommended):** + ```bash airbyte-ci connectors --name=source-n8n build ``` @@ -26,12 +28,15 @@ airbyte-ci connectors --name=source-n8n build An image will be built with the tag `airbyte/source-n8n:dev`. **Via `docker build`:** + ```bash docker build -t airbyte/source-n8n:dev . ``` #### Run + Then run any of the connector commands as follows: + ``` docker run --rm airbyte/source-n8n:dev spec docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-n8n:dev check --config /secrets/config.json @@ -40,23 +45,30 @@ docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integrat ``` ## Testing + You can run our full test suite locally using [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md): + ```bash airbyte-ci connectors --name=source-n8n test ``` ### Customizing acceptance Tests + Customize `acceptance-test-config.yml` file to configure tests. See [Connector Acceptance Tests](https://docs.airbyte.com/connector-development/testing-connectors/connector-acceptance-tests-reference) for more information. If your connector requires to create or destroy resources for use during acceptance tests create fixtures for it and place them inside integration_tests/acceptance.py. ## Dependency Management + All of your dependencies should go in `setup.py`, NOT `requirements.txt`. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development. We split dependencies between two groups, dependencies that are: -* required for your connector to work need to go to `MAIN_REQUIREMENTS` list. -* required for the testing need to go to `TEST_REQUIREMENTS` list + +- required for your connector to work need to go to `MAIN_REQUIREMENTS` list. +- required for the testing need to go to `TEST_REQUIREMENTS` list ### Publishing a new version of the connector + You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? + 1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-n8n test` 2. Bump the connector version in `metadata.yaml`: increment the `dockerImageTag` value. Please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors). 3. Make sure the `metadata.yaml` content is up to date. 
@@ -64,4 +76,3 @@ You've checked out the repo, implemented a million dollar feature, and you're re 5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention). 6. Pat yourself on the back for being an awesome contributor. 7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. - diff --git a/airbyte-integrations/connectors/source-nasa/README.md b/airbyte-integrations/connectors/source-nasa/README.md index ec9a6ae245e..290026a1d3a 100644 --- a/airbyte-integrations/connectors/source-nasa/README.md +++ b/airbyte-integrations/connectors/source-nasa/README.md @@ -6,6 +6,7 @@ For information about how to use this connector within Airbyte, see [the documen ## Local development #### Create credentials + **If you are a community contributor**, follow the instructions in the [documentation](https://docs.airbyte.com/integrations/sources/nasa) to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `source_nasa/spec.yaml` file. Note that any directory named `secrets` is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. @@ -16,9 +17,10 @@ and place them into `secrets/config.json`. ### Locally running the connector docker image - #### Build + **Via [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md) (recommended):** + ```bash airbyte-ci connectors --name=source-nasa build ``` @@ -26,12 +28,15 @@ airbyte-ci connectors --name=source-nasa build An image will be built with the tag `airbyte/source-nasa:dev`. **Via `docker build`:** + ```bash docker build -t airbyte/source-nasa:dev . ``` #### Run + Then run any of the connector commands as follows: + ``` docker run --rm airbyte/source-nasa:dev spec docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-nasa:dev check --config /secrets/config.json @@ -40,23 +45,30 @@ docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integrat ``` ## Testing + You can run our full test suite locally using [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md): + ```bash airbyte-ci connectors --name=source-nasa test ``` ### Customizing acceptance Tests + Customize `acceptance-test-config.yml` file to configure tests. See [Connector Acceptance Tests](https://docs.airbyte.com/connector-development/testing-connectors/connector-acceptance-tests-reference) for more information. If your connector requires to create or destroy resources for use during acceptance tests create fixtures for it and place them inside integration_tests/acceptance.py. ## Dependency Management + All of your dependencies should go in `setup.py`, NOT `requirements.txt`. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development. We split dependencies between two groups, dependencies that are: -* required for your connector to work need to go to `MAIN_REQUIREMENTS` list. -* required for the testing need to go to `TEST_REQUIREMENTS` list + +- required for your connector to work need to go to `MAIN_REQUIREMENTS` list. +- required for the testing need to go to `TEST_REQUIREMENTS` list ### Publishing a new version of the connector + You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? + 1. 
Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-nasa test` 2. Bump the connector version in `metadata.yaml`: increment the `dockerImageTag` value. Please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors). 3. Make sure the `metadata.yaml` content is up to date. @@ -64,4 +76,3 @@ You've checked out the repo, implemented a million dollar feature, and you're re 5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention). 6. Pat yourself on the back for being an awesome contributor. 7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. - diff --git a/airbyte-integrations/connectors/source-netsuite/README.md b/airbyte-integrations/connectors/source-netsuite/README.md index 8b14d70cbf2..dcc2761b6ce 100644 --- a/airbyte-integrations/connectors/source-netsuite/README.md +++ b/airbyte-integrations/connectors/source-netsuite/README.md @@ -6,23 +6,28 @@ For information about how to use this connector within Airbyte, see [the documen ## Local development ### Prerequisites + **To iterate on this connector, make sure to complete this prerequisites section.** #### Minimum Python version required `= 3.9.0` #### Build & Activate Virtual Environment and install dependencies + From this connector directory, create a virtual environment: + ``` python -m venv .venv ``` This will generate a virtualenv for this module in `.venv/`. Make sure this venv is active in your development environment of choice. To activate it from the terminal, run: + ``` source .venv/bin/activate pip install -r requirements.txt pip install '.[tests]' ``` + If you are in an IDE, follow your IDE's instructions to activate the virtualenv. Note that while we are installing dependencies from `requirements.txt`, you should only edit `setup.py` for your dependencies. `requirements.txt` is @@ -31,6 +36,7 @@ If this is mumbo jumbo to you, don't worry about it, just put your deps in `setu should work as you expect. #### Create credentials + **If you are a community contributor**, follow the instructions in the [documentation](https://docs.airbyte.io/integrations/sources/netsuite) to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `source_netsuite_soap/spec.yaml` file. Note that any directory named `secrets` is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. @@ -40,6 +46,7 @@ See `integration_tests/sample_config.json` for a sample config file. and place them into `secrets/config.json`. ### Locally running the connector + ``` python main.py spec python main.py check --config secrets/config.json @@ -49,9 +56,10 @@ python main.py read --config secrets/config.json --catalog integration_tests/con ### Locally running the connector docker image - #### Build + **Via [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md) (recommended):** + ```bash airbyte-ci connectors --name=source-netsuite build ``` @@ -59,12 +67,15 @@ airbyte-ci connectors --name=source-netsuite build An image will be built with the tag `airbyte/source-netsuite:dev`. **Via `docker build`:** + ```bash docker build -t airbyte/source-netsuite:dev . 
``` #### Run + Then run any of the connector commands as follows: + ``` docker run --rm airbyte/source-netsuite:dev spec docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-netsuite:dev check --config /secrets/config.json @@ -73,23 +84,30 @@ docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integrat ``` ## Testing + You can run our full test suite locally using [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md): + ```bash airbyte-ci connectors --name=source-netsuite test ``` ### Customizing acceptance Tests + Customize `acceptance-test-config.yml` file to configure tests. See [Connector Acceptance Tests](https://docs.airbyte.com/connector-development/testing-connectors/connector-acceptance-tests-reference) for more information. If your connector requires to create or destroy resources for use during acceptance tests create fixtures for it and place them inside integration_tests/acceptance.py. ## Dependency Management + All of your dependencies should go in `setup.py`, NOT `requirements.txt`. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development. We split dependencies between two groups, dependencies that are: -* required for your connector to work need to go to `MAIN_REQUIREMENTS` list. -* required for the testing need to go to `TEST_REQUIREMENTS` list + +- required for your connector to work need to go to `MAIN_REQUIREMENTS` list. +- required for the testing need to go to `TEST_REQUIREMENTS` list ### Publishing a new version of the connector + You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? + 1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-netsuite test` 2. Bump the connector version in `metadata.yaml`: increment the `dockerImageTag` value. Please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors). 3. Make sure the `metadata.yaml` content is up to date. @@ -97,4 +115,3 @@ You've checked out the repo, implemented a million dollar feature, and you're re 5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention). 6. Pat yourself on the back for being an awesome contributor. 7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. - diff --git a/airbyte-integrations/connectors/source-news-api/README.md b/airbyte-integrations/connectors/source-news-api/README.md index 0408e1fadd7..c11394659ce 100644 --- a/airbyte-integrations/connectors/source-news-api/README.md +++ b/airbyte-integrations/connectors/source-news-api/README.md @@ -6,6 +6,7 @@ For information about how to use this connector within Airbyte, see [the documen ## Local development #### Create credentials + **If you are a community contributor**, follow the instructions in the [documentation](https://docs.airbyte.io/integrations/sources/news-api) to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `source_news_api/spec.yaml` file. Note that any directory named `secrets` is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. @@ -16,9 +17,10 @@ and place them into `secrets/config.json`. 
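If you want to double-check that the `secrets` directory really is ignored before committing anything, `git check-ignore` will print the rule that covers the file (standard git; shown here for the news-api connector as an example path):

```bash
# Prints the ignore rule matching the secrets file; exits non-zero if it is NOT ignored
git check-ignore -v airbyte-integrations/connectors/source-news-api/secrets/config.json
```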
### Locally running the connector docker image - #### Build + **Via [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md) (recommended):** + ```bash airbyte-ci connectors --name=source-news-api build ``` @@ -26,12 +28,15 @@ airbyte-ci connectors --name=source-news-api build An image will be built with the tag `airbyte/source-news-api:dev`. **Via `docker build`:** + ```bash docker build -t airbyte/source-news-api:dev . ``` #### Run + Then run any of the connector commands as follows: + ``` docker run --rm airbyte/source-news-api:dev spec docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-news-api:dev check --config /secrets/config.json @@ -40,23 +45,30 @@ docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integrat ``` ## Testing + You can run our full test suite locally using [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md): + ```bash airbyte-ci connectors --name=source-news-api test ``` ### Customizing acceptance Tests + Customize `acceptance-test-config.yml` file to configure tests. See [Connector Acceptance Tests](https://docs.airbyte.com/connector-development/testing-connectors/connector-acceptance-tests-reference) for more information. If your connector requires to create or destroy resources for use during acceptance tests create fixtures for it and place them inside integration_tests/acceptance.py. ## Dependency Management + All of your dependencies should go in `setup.py`, NOT `requirements.txt`. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development. We split dependencies between two groups, dependencies that are: -* required for your connector to work need to go to `MAIN_REQUIREMENTS` list. -* required for the testing need to go to `TEST_REQUIREMENTS` list + +- required for your connector to work need to go to `MAIN_REQUIREMENTS` list. +- required for the testing need to go to `TEST_REQUIREMENTS` list ### Publishing a new version of the connector + You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? + 1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-news-api test` 2. Bump the connector version in `metadata.yaml`: increment the `dockerImageTag` value. Please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors). 3. Make sure the `metadata.yaml` content is up to date. @@ -64,4 +76,3 @@ You've checked out the repo, implemented a million dollar feature, and you're re 5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention). 6. Pat yourself on the back for being an awesome contributor. 7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. - diff --git a/airbyte-integrations/connectors/source-newsdata/README.md b/airbyte-integrations/connectors/source-newsdata/README.md index 78ba3f90d9a..cfef44818f8 100644 --- a/airbyte-integrations/connectors/source-newsdata/README.md +++ b/airbyte-integrations/connectors/source-newsdata/README.md @@ -1,31 +1,32 @@ # Newsdata source connector - This is the repository for the Newsdata source connector, written in Python. 
For information about how to use this connector within Airbyte, see [the documentation](https://docs.airbyte.com/integrations/sources/newsdata). ## Local development ### Prerequisites -* Python (~=3.9) -* Poetry (~=1.7) - installation instructions [here](https://python-poetry.org/docs/#installation) +- Python (~=3.9) +- Poetry (~=1.7) - installation instructions [here](https://python-poetry.org/docs/#installation) ### Installing the connector + From this connector directory, run: + ```bash poetry install --with dev ``` - ### Create credentials + **If you are a community contributor**, follow the instructions in the [documentation](https://docs.airbyte.com/integrations/sources/newsdata) to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `source_newsdata/spec.yaml` file. Note that any directory named `secrets` is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. See `sample_files/sample_config.json` for a sample config file. - ### Locally running the connector + ``` poetry run source-newsdata spec poetry run source-newsdata check --config secrets/config.json @@ -34,23 +35,28 @@ poetry run source-newsdata read --config secrets/config.json --catalog sample_fi ``` ### Running unit tests + To run unit tests locally, from the connector directory run: + ``` poetry run pytest unit_tests ``` ### Building the docker image + 1. Install [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md) 2. Run the following command to build the docker image: + ```bash airbyte-ci connectors --name=source-newsdata build ``` An image will be available on your host with the tag `airbyte/source-newsdata:dev`. - ### Running as a docker container + Then run any of the connector commands as follows: + ``` docker run --rm airbyte/source-newsdata:dev spec docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-newsdata:dev check --config /secrets/config.json @@ -59,18 +65,23 @@ docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integrat ``` ### Running our CI test suite + You can run our full test suite locally using [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md): + ```bash airbyte-ci connectors --name=source-newsdata test ``` ### Customizing acceptance Tests + Customize `acceptance-test-config.yml` file to configure acceptance tests. See [Connector Acceptance Tests](https://docs.airbyte.com/connector-development/testing-connectors/connector-acceptance-tests-reference) for more information. If your connector requires to create or destroy resources for use during acceptance tests create fixtures for it and place them inside integration_tests/acceptance.py. ### Dependency Management -All of your dependencies should be managed via Poetry. + +All of your dependencies should be managed via Poetry. To add a new dependency, run: + ```bash poetry add ``` @@ -78,14 +89,16 @@ poetry add Please commit the changes to `pyproject.toml` and `poetry.lock` files. ## Publishing a new version of the connector + You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? + 1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-newsdata test` -2. 
Bump the connector version (please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors)): - - bump the `dockerImageTag` value in in `metadata.yaml` - - bump the `version` value in `pyproject.toml` +2. Bump the connector version (please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors)): + - bump the `dockerImageTag` value in in `metadata.yaml` + - bump the `version` value in `pyproject.toml` 3. Make sure the `metadata.yaml` content is up to date. 4. Make sure the connector documentation and its changelog is up to date (`docs/integrations/sources/newsdata.md`). 5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention). 6. Pat yourself on the back for being an awesome contributor. 7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. -8. Once your PR is merged, the new version of the connector will be automatically published to Docker Hub and our connector registry. \ No newline at end of file +8. Once your PR is merged, the new version of the connector will be automatically published to Docker Hub and our connector registry. diff --git a/airbyte-integrations/connectors/source-notion/README.md b/airbyte-integrations/connectors/source-notion/README.md index ef004889412..2fc969db460 100644 --- a/airbyte-integrations/connectors/source-notion/README.md +++ b/airbyte-integrations/connectors/source-notion/README.md @@ -1,31 +1,32 @@ # Notion source connector - This is the repository for the Notion source connector, written in Python. For information about how to use this connector within Airbyte, see [the documentation](https://docs.airbyte.com/integrations/sources/notion). ## Local development ### Prerequisites -* Python (~=3.9) -* Poetry (~=1.7) - installation instructions [here](https://python-poetry.org/docs/#installation) +- Python (~=3.9) +- Poetry (~=1.7) - installation instructions [here](https://python-poetry.org/docs/#installation) ### Installing the connector + From this connector directory, run: + ```bash poetry install --with dev ``` - ### Create credentials + **If you are a community contributor**, follow the instructions in the [documentation](https://docs.airbyte.com/integrations/sources/notion) to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `source_notion/spec.yaml` file. Note that any directory named `secrets` is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. See `sample_files/sample_config.json` for a sample config file. - ### Locally running the connector + ``` poetry run source-notion spec poetry run source-notion check --config secrets/config.json @@ -34,23 +35,28 @@ poetry run source-notion read --config secrets/config.json --catalog integration ``` ### Running unit tests + To run unit tests locally, from the connector directory run: + ``` poetry run pytest unit_tests ``` ### Building the docker image + 1. Install [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md) 2. 
Run the following command to build the docker image: + ```bash airbyte-ci connectors --name=source-notion build ``` An image will be available on your host with the tag `airbyte/source-notion:dev`. - ### Running as a docker container + Then run any of the connector commands as follows: + ``` docker run --rm airbyte/source-notion:dev spec docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-notion:dev check --config /secrets/config.json @@ -59,18 +65,23 @@ docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integrat ``` ### Running our CI test suite + You can run our full test suite locally using [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md): + ```bash airbyte-ci connectors --name=source-notion test ``` ### Customizing acceptance Tests + Customize `acceptance-test-config.yml` file to configure acceptance tests. See [Connector Acceptance Tests](https://docs.airbyte.com/connector-development/testing-connectors/connector-acceptance-tests-reference) for more information. If your connector requires to create or destroy resources for use during acceptance tests create fixtures for it and place them inside integration_tests/acceptance.py. ### Dependency Management -All of your dependencies should be managed via Poetry. + +All of your dependencies should be managed via Poetry. To add a new dependency, run: + ```bash poetry add ``` @@ -78,14 +89,16 @@ poetry add Please commit the changes to `pyproject.toml` and `poetry.lock` files. ## Publishing a new version of the connector + You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? + 1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-notion test` -2. Bump the connector version (please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors)): - - bump the `dockerImageTag` value in in `metadata.yaml` - - bump the `version` value in `pyproject.toml` +2. Bump the connector version (please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors)): + - bump the `dockerImageTag` value in in `metadata.yaml` + - bump the `version` value in `pyproject.toml` 3. Make sure the `metadata.yaml` content is up to date. 4. Make sure the connector documentation and its changelog is up to date (`docs/integrations/sources/notion.md`). 5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention). 6. Pat yourself on the back for being an awesome contributor. 7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. -8. Once your PR is merged, the new version of the connector will be automatically published to Docker Hub and our connector registry. \ No newline at end of file +8. Once your PR is merged, the new version of the connector will be automatically published to Docker Hub and our connector registry. 
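For step 2 of that list, one way to keep the two version fields from drifting apart is to let Poetry bump `pyproject.toml` and then mirror the same value into `metadata.yaml` by hand. A sketch, assuming the standard Poetry CLI:

```bash
# Bump the patch version in pyproject.toml (use "minor" or "major" as appropriate)
poetry version patch

# Print the new version so it can be copied into dockerImageTag in metadata.yaml
poetry version --short
```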
diff --git a/airbyte-integrations/connectors/source-notion/bootstrap.md b/airbyte-integrations/connectors/source-notion/bootstrap.md index 6d492039f87..b91bfe7af6a 100644 --- a/airbyte-integrations/connectors/source-notion/bootstrap.md +++ b/airbyte-integrations/connectors/source-notion/bootstrap.md @@ -29,4 +29,3 @@ Notion API consists of three endpoints which can be extracted data from: ## API Reference The API reference documents: [https://developers.notion.com/reference/intro](https://developers.notion.com/reference) - diff --git a/airbyte-integrations/connectors/source-nytimes/README.md b/airbyte-integrations/connectors/source-nytimes/README.md index d6b385246b7..e8a60982a54 100644 --- a/airbyte-integrations/connectors/source-nytimes/README.md +++ b/airbyte-integrations/connectors/source-nytimes/README.md @@ -1,31 +1,32 @@ # Nytimes source connector - This is the repository for the Nytimes source connector, written in Python. For information about how to use this connector within Airbyte, see [the documentation](https://docs.airbyte.com/integrations/sources/nytimes). ## Local development ### Prerequisites -* Python (~=3.9) -* Poetry (~=1.7) - installation instructions [here](https://python-poetry.org/docs/#installation) +- Python (~=3.9) +- Poetry (~=1.7) - installation instructions [here](https://python-poetry.org/docs/#installation) ### Installing the connector + From this connector directory, run: + ```bash poetry install --with dev ``` - ### Create credentials + **If you are a community contributor**, follow the instructions in the [documentation](https://docs.airbyte.com/integrations/sources/nytimes) to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `source_nytimes/spec.yaml` file. Note that any directory named `secrets` is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. See `sample_files/sample_config.json` for a sample config file. - ### Locally running the connector + ``` poetry run source-nytimes spec poetry run source-nytimes check --config secrets/config.json @@ -34,23 +35,28 @@ poetry run source-nytimes read --config secrets/config.json --catalog sample_fil ``` ### Running unit tests + To run unit tests locally, from the connector directory run: + ``` poetry run pytest unit_tests ``` ### Building the docker image + 1. Install [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md) 2. Run the following command to build the docker image: + ```bash airbyte-ci connectors --name=source-nytimes build ``` An image will be available on your host with the tag `airbyte/source-nytimes:dev`. - ### Running as a docker container + Then run any of the connector commands as follows: + ``` docker run --rm airbyte/source-nytimes:dev spec docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-nytimes:dev check --config /secrets/config.json @@ -59,18 +65,23 @@ docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integrat ``` ### Running our CI test suite + You can run our full test suite locally using [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md): + ```bash airbyte-ci connectors --name=source-nytimes test ``` ### Customizing acceptance Tests + Customize `acceptance-test-config.yml` file to configure acceptance tests. 
See [Connector Acceptance Tests](https://docs.airbyte.com/connector-development/testing-connectors/connector-acceptance-tests-reference) for more information. If your connector requires to create or destroy resources for use during acceptance tests create fixtures for it and place them inside integration_tests/acceptance.py. ### Dependency Management -All of your dependencies should be managed via Poetry. + +All of your dependencies should be managed via Poetry. To add a new dependency, run: + ```bash poetry add ``` @@ -78,14 +89,16 @@ poetry add Please commit the changes to `pyproject.toml` and `poetry.lock` files. ## Publishing a new version of the connector + You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? + 1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-nytimes test` -2. Bump the connector version (please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors)): - - bump the `dockerImageTag` value in in `metadata.yaml` - - bump the `version` value in `pyproject.toml` +2. Bump the connector version (please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors)): + - bump the `dockerImageTag` value in in `metadata.yaml` + - bump the `version` value in `pyproject.toml` 3. Make sure the `metadata.yaml` content is up to date. 4. Make sure the connector documentation and its changelog is up to date (`docs/integrations/sources/nytimes.md`). 5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention). 6. Pat yourself on the back for being an awesome contributor. 7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. -8. Once your PR is merged, the new version of the connector will be automatically published to Docker Hub and our connector registry. \ No newline at end of file +8. Once your PR is merged, the new version of the connector will be automatically published to Docker Hub and our connector registry. diff --git a/airbyte-integrations/connectors/source-okta/README.md b/airbyte-integrations/connectors/source-okta/README.md index 611b3c1f5f8..63b74be94d4 100644 --- a/airbyte-integrations/connectors/source-okta/README.md +++ b/airbyte-integrations/connectors/source-okta/README.md @@ -53,9 +53,10 @@ python main.py read --config secrets/config.json --catalog integration_tests/con ### Locally running the connector docker image - #### Build + **Via [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md) (recommended):** + ```bash airbyte-ci connectors --name=source-okta build ``` @@ -63,6 +64,7 @@ airbyte-ci connectors --name=source-okta build An image will be built with the tag `airbyte/source-okta:dev`. **Via `docker build`:** + ```bash docker build -t airbyte/source-okta:dev . 
``` @@ -78,14 +80,16 @@ docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-okta:dev discover --co docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integration_tests airbyte/source-okta:dev read --config /secrets/config.json --catalog /integration_tests/configured_catalog.json ``` - ## Testing + You can run our full test suite locally using [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md): + ```bash airbyte-ci connectors --name=source-okta test ``` ### Customizing acceptance Tests + Customize `acceptance-test-config.yml` file to configure tests. See [Connector Acceptance Tests](https://docs.airbyte.com/connector-development/testing-connectors/connector-acceptance-tests-reference) for more information. If your connector requires to create or destroy resources for use during acceptance tests create fixtures for it and place them inside integration_tests/acceptance.py. @@ -94,11 +98,13 @@ If your connector requires to create or destroy resources for use during accepta All of your dependencies should go in `setup.py`, NOT `requirements.txt`. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development. We split dependencies between two groups, dependencies that are: -* required for your connector to work need to go to `MAIN_REQUIREMENTS` list. -* required for the testing need to go to `TEST_REQUIREMENTS` list +- required for your connector to work need to go to `MAIN_REQUIREMENTS` list. +- required for the testing need to go to `TEST_REQUIREMENTS` list ### Publishing a new version of the connector + You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? + 1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-okta test` 2. Bump the connector version in `metadata.yaml`: increment the `dockerImageTag` value. Please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors). 3. Make sure the `metadata.yaml` content is up to date. @@ -106,4 +112,3 @@ You've checked out the repo, implemented a million dollar feature, and you're re 5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention). 6. Pat yourself on the back for being an awesome contributor. 7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. - diff --git a/airbyte-integrations/connectors/source-omnisend/README.md b/airbyte-integrations/connectors/source-omnisend/README.md index 4bfe23ff047..46acaff54ad 100644 --- a/airbyte-integrations/connectors/source-omnisend/README.md +++ b/airbyte-integrations/connectors/source-omnisend/README.md @@ -6,6 +6,7 @@ For information about how to use this connector within Airbyte, see [the documen ## Local development #### Create credentials + **If you are a community contributor**, follow the instructions in the [documentation](https://docs.airbyte.io/integrations/sources/omnisend) to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `source_omnisend/spec.yaml` file. Note that any directory named `secrets` is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. 
@@ -16,9 +17,10 @@ and place them into `secrets/config.json`. ### Locally running the connector docker image - #### Build + **Via [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md) (recommended):** + ```bash airbyte-ci connectors --name=source-omnisend build ``` @@ -26,12 +28,15 @@ airbyte-ci connectors --name=source-omnisend build An image will be built with the tag `airbyte/source-omnisend:dev`. **Via `docker build`:** + ```bash docker build -t airbyte/source-omnisend:dev . ``` #### Run + Then run any of the connector commands as follows: + ``` docker run --rm airbyte/source-omnisend:dev spec docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-omnisend:dev check --config /secrets/config.json @@ -40,23 +45,30 @@ docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integrat ``` ## Testing + You can run our full test suite locally using [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md): + ```bash airbyte-ci connectors --name=source-omnisend test ``` ### Customizing acceptance Tests + Customize `acceptance-test-config.yml` file to configure tests. See [Connector Acceptance Tests](https://docs.airbyte.com/connector-development/testing-connectors/connector-acceptance-tests-reference) for more information. If your connector requires to create or destroy resources for use during acceptance tests create fixtures for it and place them inside integration_tests/acceptance.py. ## Dependency Management + All of your dependencies should go in `setup.py`, NOT `requirements.txt`. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development. We split dependencies between two groups, dependencies that are: -* required for your connector to work need to go to `MAIN_REQUIREMENTS` list. -* required for the testing need to go to `TEST_REQUIREMENTS` list + +- required for your connector to work need to go to `MAIN_REQUIREMENTS` list. +- required for the testing need to go to `TEST_REQUIREMENTS` list ### Publishing a new version of the connector + You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? + 1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-omnisend test` 2. Bump the connector version in `metadata.yaml`: increment the `dockerImageTag` value. Please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors). 3. Make sure the `metadata.yaml` content is up to date. @@ -64,4 +76,3 @@ You've checked out the repo, implemented a million dollar feature, and you're re 5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention). 6. Pat yourself on the back for being an awesome contributor. 7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. 
- diff --git a/airbyte-integrations/connectors/source-onesignal/README.md b/airbyte-integrations/connectors/source-onesignal/README.md index 86c23f85b8a..bdde18876b2 100644 --- a/airbyte-integrations/connectors/source-onesignal/README.md +++ b/airbyte-integrations/connectors/source-onesignal/README.md @@ -6,6 +6,7 @@ For information about how to use this connector within Airbyte, see [the documen ## Local development #### Create credentials + **If you are a community contributor**, follow the instructions in the [documentation](https://docs.airbyte.com/integrations/sources/onesignal) to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `source_onesignal/spec.yaml` file. Note that any directory named `secrets` is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. @@ -16,9 +17,10 @@ and place them into `secrets/config.json`. ### Locally running the connector docker image - #### Build + **Via [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md) (recommended):** + ```bash airbyte-ci connectors --name=source-onesignal build ``` @@ -26,12 +28,15 @@ airbyte-ci connectors --name=source-onesignal build An image will be built with the tag `airbyte/source-onesignal:dev`. **Via `docker build`:** + ```bash docker build -t airbyte/source-onesignal:dev . ``` #### Run + Then run any of the connector commands as follows: + ``` docker run --rm airbyte/source-onesignal:dev spec docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-onesignal:dev check --config /secrets/config.json @@ -40,23 +45,30 @@ docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integrat ``` ## Testing + You can run our full test suite locally using [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md): + ```bash airbyte-ci connectors --name=source-onesignal test ``` ### Customizing acceptance Tests + Customize `acceptance-test-config.yml` file to configure tests. See [Connector Acceptance Tests](https://docs.airbyte.com/connector-development/testing-connectors/connector-acceptance-tests-reference) for more information. If your connector requires to create or destroy resources for use during acceptance tests create fixtures for it and place them inside integration_tests/acceptance.py. ## Dependency Management + All of your dependencies should go in `setup.py`, NOT `requirements.txt`. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development. We split dependencies between two groups, dependencies that are: -* required for your connector to work need to go to `MAIN_REQUIREMENTS` list. -* required for the testing need to go to `TEST_REQUIREMENTS` list + +- required for your connector to work need to go to `MAIN_REQUIREMENTS` list. +- required for the testing need to go to `TEST_REQUIREMENTS` list ### Publishing a new version of the connector + You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? + 1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-onesignal test` 2. Bump the connector version in `metadata.yaml`: increment the `dockerImageTag` value. Please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors). 3. 
Make sure the `metadata.yaml` content is up to date. @@ -64,4 +76,3 @@ You've checked out the repo, implemented a million dollar feature, and you're re 5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention). 6. Pat yourself on the back for being an awesome contributor. 7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. - diff --git a/airbyte-integrations/connectors/source-open-exchange-rates/README.md b/airbyte-integrations/connectors/source-open-exchange-rates/README.md index 212223d3d56..aed82370260 100644 --- a/airbyte-integrations/connectors/source-open-exchange-rates/README.md +++ b/airbyte-integrations/connectors/source-open-exchange-rates/README.md @@ -1,31 +1,32 @@ # Open-Exchange-Rates source connector - This is the repository for the Open-Exchange-Rates source connector, written in Python. For information about how to use this connector within Airbyte, see [the documentation](https://docs.airbyte.com/integrations/sources/open-exchange-rates). ## Local development ### Prerequisites -* Python (~=3.9) -* Poetry (~=1.7) - installation instructions [here](https://python-poetry.org/docs/#installation) +- Python (~=3.9) +- Poetry (~=1.7) - installation instructions [here](https://python-poetry.org/docs/#installation) ### Installing the connector + From this connector directory, run: + ```bash poetry install --with dev ``` - ### Create credentials + **If you are a community contributor**, follow the instructions in the [documentation](https://docs.airbyte.com/integrations/sources/open-exchange-rates) to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `source_open_exchange_rates/spec.yaml` file. Note that any directory named `secrets` is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. See `sample_files/sample_config.json` for a sample config file. - ### Locally running the connector + ``` poetry run source-open-exchange-rates spec poetry run source-open-exchange-rates check --config secrets/config.json @@ -34,23 +35,28 @@ poetry run source-open-exchange-rates read --config secrets/config.json --catalo ``` ### Running unit tests + To run unit tests locally, from the connector directory run: + ``` poetry run pytest unit_tests ``` ### Building the docker image + 1. Install [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md) 2. Run the following command to build the docker image: + ```bash airbyte-ci connectors --name=source-open-exchange-rates build ``` An image will be available on your host with the tag `airbyte/source-open-exchange-rates:dev`. 
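As a companion to the unit-test command above, a minimal test might look like the sketch below. It assumes the source class follows the repository's usual naming (`SourceOpenExchangeRates`) and relies on the Airbyte CDK's `spec()` helper, so treat it as an illustration rather than an existing test.

```python
# Hypothetical unit test for unit_tests/, runnable with `poetry run pytest unit_tests`.
# Assumes the standard class name SourceOpenExchangeRates; adjust to the real module.
import logging

from source_open_exchange_rates.source import SourceOpenExchangeRates


def test_spec_loads_connection_specification():
    spec = SourceOpenExchangeRates().spec(logging.getLogger("airbyte"))
    assert spec.connectionSpecification  # spec.yaml should define the config schema
```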
- ### Running as a docker container + Then run any of the connector commands as follows: + ``` docker run --rm airbyte/source-open-exchange-rates:dev spec docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-open-exchange-rates:dev check --config /secrets/config.json @@ -59,18 +65,23 @@ docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integrat ``` ### Running our CI test suite + You can run our full test suite locally using [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md): + ```bash airbyte-ci connectors --name=source-open-exchange-rates test ``` ### Customizing acceptance Tests + Customize `acceptance-test-config.yml` file to configure acceptance tests. See [Connector Acceptance Tests](https://docs.airbyte.com/connector-development/testing-connectors/connector-acceptance-tests-reference) for more information. If your connector requires to create or destroy resources for use during acceptance tests create fixtures for it and place them inside integration_tests/acceptance.py. ### Dependency Management -All of your dependencies should be managed via Poetry. + +All of your dependencies should be managed via Poetry. To add a new dependency, run: + ```bash poetry add ``` @@ -78,14 +89,16 @@ poetry add Please commit the changes to `pyproject.toml` and `poetry.lock` files. ## Publishing a new version of the connector + You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? + 1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-open-exchange-rates test` -2. Bump the connector version (please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors)): - - bump the `dockerImageTag` value in in `metadata.yaml` - - bump the `version` value in `pyproject.toml` +2. Bump the connector version (please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors)): + - bump the `dockerImageTag` value in in `metadata.yaml` + - bump the `version` value in `pyproject.toml` 3. Make sure the `metadata.yaml` content is up to date. 4. Make sure the connector documentation and its changelog is up to date (`docs/integrations/sources/open-exchange-rates.md`). 5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention). 6. Pat yourself on the back for being an awesome contributor. 7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. -8. Once your PR is merged, the new version of the connector will be automatically published to Docker Hub and our connector registry. \ No newline at end of file +8. Once your PR is merged, the new version of the connector will be automatically published to Docker Hub and our connector registry. diff --git a/airbyte-integrations/connectors/source-openweather/README.md b/airbyte-integrations/connectors/source-openweather/README.md index 1d912dd9d1d..36e9013b933 100644 --- a/airbyte-integrations/connectors/source-openweather/README.md +++ b/airbyte-integrations/connectors/source-openweather/README.md @@ -1,31 +1,32 @@ # Openweather source connector - This is the repository for the Openweather source connector, written in Python. 
For information about how to use this connector within Airbyte, see [the documentation](https://docs.airbyte.com/integrations/sources/openweather). ## Local development ### Prerequisites -* Python (~=3.9) -* Poetry (~=1.7) - installation instructions [here](https://python-poetry.org/docs/#installation) +- Python (~=3.9) +- Poetry (~=1.7) - installation instructions [here](https://python-poetry.org/docs/#installation) ### Installing the connector + From this connector directory, run: + ```bash poetry install --with dev ``` - ### Create credentials + **If you are a community contributor**, follow the instructions in the [documentation](https://docs.airbyte.com/integrations/sources/openweather) to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `source_openweather/spec.yaml` file. Note that any directory named `secrets` is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. See `sample_files/sample_config.json` for a sample config file. - ### Locally running the connector + ``` poetry run source-openweather spec poetry run source-openweather check --config secrets/config.json @@ -34,23 +35,28 @@ poetry run source-openweather read --config secrets/config.json --catalog sample ``` ### Running unit tests + To run unit tests locally, from the connector directory run: + ``` poetry run pytest unit_tests ``` ### Building the docker image + 1. Install [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md) 2. Run the following command to build the docker image: + ```bash airbyte-ci connectors --name=source-openweather build ``` An image will be available on your host with the tag `airbyte/source-openweather:dev`. - ### Running as a docker container + Then run any of the connector commands as follows: + ``` docker run --rm airbyte/source-openweather:dev spec docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-openweather:dev check --config /secrets/config.json @@ -59,18 +65,23 @@ docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integrat ``` ### Running our CI test suite + You can run our full test suite locally using [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md): + ```bash airbyte-ci connectors --name=source-openweather test ``` ### Customizing acceptance Tests + Customize `acceptance-test-config.yml` file to configure acceptance tests. See [Connector Acceptance Tests](https://docs.airbyte.com/connector-development/testing-connectors/connector-acceptance-tests-reference) for more information. If your connector requires to create or destroy resources for use during acceptance tests create fixtures for it and place them inside integration_tests/acceptance.py. ### Dependency Management -All of your dependencies should be managed via Poetry. + +All of your dependencies should be managed via Poetry. To add a new dependency, run: + ```bash poetry add ``` @@ -78,14 +89,16 @@ poetry add Please commit the changes to `pyproject.toml` and `poetry.lock` files. ## Publishing a new version of the connector + You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? + 1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-openweather test` -2. 
Bump the connector version (please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors)): - - bump the `dockerImageTag` value in in `metadata.yaml` - - bump the `version` value in `pyproject.toml` +2. Bump the connector version (please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors)): + - bump the `dockerImageTag` value in in `metadata.yaml` + - bump the `version` value in `pyproject.toml` 3. Make sure the `metadata.yaml` content is up to date. 4. Make sure the connector documentation and its changelog is up to date (`docs/integrations/sources/openweather.md`). 5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention). 6. Pat yourself on the back for being an awesome contributor. 7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. -8. Once your PR is merged, the new version of the connector will be automatically published to Docker Hub and our connector registry. \ No newline at end of file +8. Once your PR is merged, the new version of the connector will be automatically published to Docker Hub and our connector registry. diff --git a/airbyte-integrations/connectors/source-opsgenie/README.md b/airbyte-integrations/connectors/source-opsgenie/README.md index b17409ce3b4..3b3b1e8a1f3 100644 --- a/airbyte-integrations/connectors/source-opsgenie/README.md +++ b/airbyte-integrations/connectors/source-opsgenie/README.md @@ -1,31 +1,32 @@ # Opsgenie source connector - This is the repository for the Opsgenie source connector, written in Python. For information about how to use this connector within Airbyte, see [the documentation](https://docs.airbyte.com/integrations/sources/opsgenie). ## Local development ### Prerequisites -* Python (~=3.9) -* Poetry (~=1.7) - installation instructions [here](https://python-poetry.org/docs/#installation) +- Python (~=3.9) +- Poetry (~=1.7) - installation instructions [here](https://python-poetry.org/docs/#installation) ### Installing the connector + From this connector directory, run: + ```bash poetry install --with dev ``` - ### Create credentials + **If you are a community contributor**, follow the instructions in the [documentation](https://docs.airbyte.com/integrations/sources/opsgenie) to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `source_opsgenie/spec.yaml` file. Note that any directory named `secrets` is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. See `sample_files/sample_config.json` for a sample config file. - ### Locally running the connector + ``` poetry run source-opsgenie spec poetry run source-opsgenie check --config secrets/config.json @@ -34,23 +35,28 @@ poetry run source-opsgenie read --config secrets/config.json --catalog sample_fi ``` ### Running unit tests + To run unit tests locally, from the connector directory run: + ``` poetry run pytest unit_tests ``` ### Building the docker image + 1. Install [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md) 2. 
Run the following command to build the docker image: + ```bash airbyte-ci connectors --name=source-opsgenie build ``` An image will be available on your host with the tag `airbyte/source-opsgenie:dev`. - ### Running as a docker container + Then run any of the connector commands as follows: + ``` docker run --rm airbyte/source-opsgenie:dev spec docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-opsgenie:dev check --config /secrets/config.json @@ -59,18 +65,23 @@ docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integrat ``` ### Running our CI test suite + You can run our full test suite locally using [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md): + ```bash airbyte-ci connectors --name=source-opsgenie test ``` ### Customizing acceptance Tests + Customize `acceptance-test-config.yml` file to configure acceptance tests. See [Connector Acceptance Tests](https://docs.airbyte.com/connector-development/testing-connectors/connector-acceptance-tests-reference) for more information. If your connector requires to create or destroy resources for use during acceptance tests create fixtures for it and place them inside integration_tests/acceptance.py. ### Dependency Management -All of your dependencies should be managed via Poetry. + +All of your dependencies should be managed via Poetry. To add a new dependency, run: + ```bash poetry add ``` @@ -78,14 +89,16 @@ poetry add Please commit the changes to `pyproject.toml` and `poetry.lock` files. ## Publishing a new version of the connector + You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? + 1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-opsgenie test` -2. Bump the connector version (please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors)): - - bump the `dockerImageTag` value in in `metadata.yaml` - - bump the `version` value in `pyproject.toml` +2. Bump the connector version (please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors)): + - bump the `dockerImageTag` value in in `metadata.yaml` + - bump the `version` value in `pyproject.toml` 3. Make sure the `metadata.yaml` content is up to date. 4. Make sure the connector documentation and its changelog is up to date (`docs/integrations/sources/opsgenie.md`). 5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention). 6. Pat yourself on the back for being an awesome contributor. 7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. -8. Once your PR is merged, the new version of the connector will be automatically published to Docker Hub and our connector registry. \ No newline at end of file +8. Once your PR is merged, the new version of the connector will be automatically published to Docker Hub and our connector registry. 
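The acceptance-test note above points at `integration_tests/acceptance.py`; in practice that file is usually just a pytest plugin registration plus a session-scoped fixture, roughly like this sketch (the resource setup shown is hypothetical).

```python
# Sketch of integration_tests/acceptance.py: register the acceptance-test plugin
# and expose a session-wide fixture that creates/destroys external resources.
import pytest

pytest_plugins = ("connector_acceptance_test.plugin",)


@pytest.fixture(scope="session", autouse=True)
def connector_setup():
    # Create any resources the acceptance tests rely on (hypothetical step).
    yield
    # Tear the resources down once the whole suite has finished.
```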
diff --git a/airbyte-integrations/connectors/source-oracle/BOOTSTRAP.md b/airbyte-integrations/connectors/source-oracle/BOOTSTRAP.md index c65fc81a521..695f1437600 100644 --- a/airbyte-integrations/connectors/source-oracle/BOOTSTRAP.md +++ b/airbyte-integrations/connectors/source-oracle/BOOTSTRAP.md @@ -1,10 +1,12 @@ # Oracle Source + The Oracle source connector allows syncing the data from the Oracle DB. The current source connector supports Oracle 11g or above. -The connector uses *ojdbc8* driver underneath to establish the connection. The Oracle source does not alter the schema present in your database. +The connector uses _ojdbc8_ driver underneath to establish the connection. The Oracle source does not alter the schema present in your database. ### Important details + Connector works with `useFetchSizeWithLongColumn=true` property, which required to select the data from `LONG` or `LONG RAW` type columns. Oracle recommends avoiding LONG and LONG RAW columns. Use LOB instead. They are included in Oracle only for legacy reasons. THIS IS A THIN ONLY PROPERTY. IT SHOULD NOT BE USED WITH ANY OTHER DRIVERS. -See [this](https://docs.airbyte.io/integrations/sources/oracle) link for the nuances about the connector. \ No newline at end of file +See [this](https://docs.airbyte.io/integrations/sources/oracle) link for the nuances about the connector. diff --git a/airbyte-integrations/connectors/source-oracle/README.md b/airbyte-integrations/connectors/source-oracle/README.md index 32ce4e9c798..33e1290d947 100644 --- a/airbyte-integrations/connectors/source-oracle/README.md +++ b/airbyte-integrations/connectors/source-oracle/README.md @@ -1,13 +1,16 @@ # Oracle Source ## Documentation + This is the repository for the Oracle only source connector in Java. For information about how to use this connector within Airbyte, see [User Documentation](https://docs.airbyte.io/integrations/sources/oracle) ## Local development #### Building via Gradle + From the Airbyte repository root, run: + ``` ./gradlew :airbyte-integrations:connectors:source-oracle:build ``` @@ -15,20 +18,26 @@ From the Airbyte repository root, run: ### Locally running the connector docker image #### Build + Build the connector image via Gradle: ``` ./gradlew :airbyte-integrations:connectors:source-oracle:buildConnectorImage ``` + Once built, the docker image name and tag on your host will be `airbyte/source-oracle:dev`. the Dockerfile. ## Testing + We use `JUnit` for Java tests. ### Test Configuration + #### Acceptance Tests + To run acceptance and custom integration tests: + ``` ./gradlew :airbyte-integrations:connectors:source-oracle:integrationTest ``` diff --git a/airbyte-integrations/connectors/source-orb/README.md b/airbyte-integrations/connectors/source-orb/README.md index cd3b9c13af7..813cab85b9e 100644 --- a/airbyte-integrations/connectors/source-orb/README.md +++ b/airbyte-integrations/connectors/source-orb/README.md @@ -6,23 +6,28 @@ For information about how to use this connector within Airbyte, see [the documen ## Local development ### Prerequisites + **To iterate on this connector, make sure to complete this prerequisites section.** #### Minimum Python version required `= 3.7.0` #### Build & Activate Virtual Environment and install dependencies + From this connector directory, create a virtual environment: + ``` python -m venv .venv ``` This will generate a virtualenv for this module in `.venv/`. Make sure this venv is active in your development environment of choice. 
To activate it from the terminal, run: + ``` source .venv/bin/activate pip install -r requirements.txt pip install '.[tests]' ``` + If you are in an IDE, follow your IDE's instructions to activate the virtualenv. Note that while we are installing dependencies from `requirements.txt`, you should only edit `setup.py` for your dependencies. `requirements.txt` is @@ -31,6 +36,7 @@ If this is mumbo jumbo to you, don't worry about it, just put your deps in `setu should work as you expect. #### Create credentials + **If you are a community contributor**, follow the instructions in the [documentation](https://docs.airbyte.io/integrations/sources/orb) to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `source_orb/spec.json` file. Note that any directory named `secrets` is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. @@ -40,6 +46,7 @@ See `integration_tests/sample_config.json` for a sample config file. and place them into `secrets/config.json`. ### Locally running the connector + ``` python main.py spec python main.py check --config secrets/config.json @@ -49,9 +56,10 @@ python main.py read --config secrets/config.json --catalog integration_tests/con ### Locally running the connector docker image - #### Build + **Via [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md) (recommended):** + ```bash airbyte-ci connectors --name=source-orb build ``` @@ -59,12 +67,15 @@ airbyte-ci connectors --name=source-orb build An image will be built with the tag `airbyte/source-orb:dev`. **Via `docker build`:** + ```bash docker build -t airbyte/source-orb:dev . ``` #### Run + Then run any of the connector commands as follows: + ``` docker run --rm airbyte/source-orb:dev spec docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-orb:dev check --config /secrets/config.json @@ -73,23 +84,30 @@ docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integrat ``` ## Testing + You can run our full test suite locally using [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md): + ```bash airbyte-ci connectors --name=source-orb test ``` ### Customizing acceptance Tests + Customize `acceptance-test-config.yml` file to configure tests. See [Connector Acceptance Tests](https://docs.airbyte.com/connector-development/testing-connectors/connector-acceptance-tests-reference) for more information. If your connector requires to create or destroy resources for use during acceptance tests create fixtures for it and place them inside integration_tests/acceptance.py. ## Dependency Management + All of your dependencies should go in `setup.py`, NOT `requirements.txt`. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development. We split dependencies between two groups, dependencies that are: -* required for your connector to work need to go to `MAIN_REQUIREMENTS` list. -* required for the testing need to go to `TEST_REQUIREMENTS` list + +- required for your connector to work need to go to `MAIN_REQUIREMENTS` list. +- required for the testing need to go to `TEST_REQUIREMENTS` list ### Publishing a new version of the connector + You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? + 1. 
Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-orb test` 2. Bump the connector version in `metadata.yaml`: increment the `dockerImageTag` value. Please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors). 3. Make sure the `metadata.yaml` content is up to date. @@ -97,4 +115,3 @@ You've checked out the repo, implemented a million dollar feature, and you're re 5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention). 6. Pat yourself on the back for being an awesome contributor. 7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. - diff --git a/airbyte-integrations/connectors/source-orb/bootstrap.md b/airbyte-integrations/connectors/source-orb/bootstrap.md index a7164a42fb7..84b5a9bf260 100644 --- a/airbyte-integrations/connectors/source-orb/bootstrap.md +++ b/airbyte-integrations/connectors/source-orb/bootstrap.md @@ -2,11 +2,11 @@ Orb is a REST API. Connector has the following streams, and all of them support incremental refresh. -* [Subscriptions]( https://docs.withorb.com/reference/list-subscriptions) -* [Plans](https://docs.withorb.com/reference/list-plans) -* [Customers](https://docs.withorb.com/reference/list-customers) -* [Credits Ledger Entries](https://docs.withorb.com/reference/view-credits-ledger) -* [Invoices](https://docs.withorb.com/docs/orb-docs/api-reference/schemas/invoice) +- [Subscriptions](https://docs.withorb.com/reference/list-subscriptions) +- [Plans](https://docs.withorb.com/reference/list-plans) +- [Customers](https://docs.withorb.com/reference/list-customers) +- [Credits Ledger Entries](https://docs.withorb.com/reference/view-credits-ledger) +- [Invoices](https://docs.withorb.com/docs/orb-docs/api-reference/schemas/invoice) Note that the Credits Ledger Entries must read all Customers for an incremental sync, but will only incrementally return new ledger entries for each customer. @@ -18,12 +18,12 @@ Orb's API uses cursor-based pagination, which is documented [here](https://docs. ## Enriching Credit Ledger entries -The connector configuration includes two properties: `numeric_event_properties_keys` and `string_event_properties_keys`. +The connector configuration includes two properties: `numeric_event_properties_keys` and `string_event_properties_keys`. -When a ledger entry has an `event_id` attached to it (e.g. an automated decrement), the connector will make a follow-up request to enrich those entries with event properties corresponding to the keys provided. The connector assumes (and generates schema) that property values corresponding to the keys listed in `numeric_event_properties_keys` are numeric, and the property values corresponding to the keys listed in `string_event_properties_keys` are string typed. +When a ledger entry has an `event_id` attached to it (e.g. an automated decrement), the connector will make a follow-up request to enrich those entries with event properties corresponding to the keys provided. The connector assumes (and generates schema) that property values corresponding to the keys listed in `numeric_event_properties_keys` are numeric, and the property values corresponding to the keys listed in `string_event_properties_keys` are string typed. 
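A minimal sketch of that typing assumption, not the connector's actual schema-generation code, could look like this:

```python
# Illustration of the typing assumption above: configured numeric keys become
# number-typed schema properties, configured string keys become string-typed.
def event_property_schema(numeric_keys, string_keys):
    properties = {key: {"type": ["null", "number"]} for key in numeric_keys}
    properties.update({key: {"type": ["null", "string"]} for key in string_keys})
    return {"type": "object", "properties": properties}


# e.g. event_property_schema(["credits_consumed"], ["region"]) marks
# "credits_consumed" as numeric and "region" as a string in the generated schema.
```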
## Authentication This connector authenticates against the Orb API with an API key that can be issued via the Orb Admin Console. -Please reach out to the Orb team at [team@withorb.com](mailto:team@withorb.com) to request an Orb Account and API Key. \ No newline at end of file +Please reach out to the Orb team at [team@withorb.com](mailto:team@withorb.com) to request an Orb Account and API Key. diff --git a/airbyte-integrations/connectors/source-orbit/README.md b/airbyte-integrations/connectors/source-orbit/README.md index eb8ec8058fa..bd7cdd17bb3 100644 --- a/airbyte-integrations/connectors/source-orbit/README.md +++ b/airbyte-integrations/connectors/source-orbit/README.md @@ -1,31 +1,32 @@ # Orbit source connector - This is the repository for the Orbit source connector, written in Python. For information about how to use this connector within Airbyte, see [the documentation](https://docs.airbyte.com/integrations/sources/orbit). ## Local development ### Prerequisites -* Python (~=3.9) -* Poetry (~=1.7) - installation instructions [here](https://python-poetry.org/docs/#installation) +- Python (~=3.9) +- Poetry (~=1.7) - installation instructions [here](https://python-poetry.org/docs/#installation) ### Installing the connector + From this connector directory, run: + ```bash poetry install --with dev ``` - ### Create credentials + **If you are a community contributor**, follow the instructions in the [documentation](https://docs.airbyte.com/integrations/sources/orbit) to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `source_orbit/spec.yaml` file. Note that any directory named `secrets` is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. See `sample_files/sample_config.json` for a sample config file. - ### Locally running the connector + ``` poetry run source-orbit spec poetry run source-orbit check --config secrets/config.json @@ -34,23 +35,28 @@ poetry run source-orbit read --config secrets/config.json --catalog sample_files ``` ### Running unit tests + To run unit tests locally, from the connector directory run: + ``` poetry run pytest unit_tests ``` ### Building the docker image + 1. Install [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md) 2. Run the following command to build the docker image: + ```bash airbyte-ci connectors --name=source-orbit build ``` An image will be available on your host with the tag `airbyte/source-orbit:dev`. - ### Running as a docker container + Then run any of the connector commands as follows: + ``` docker run --rm airbyte/source-orbit:dev spec docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-orbit:dev check --config /secrets/config.json @@ -59,18 +65,23 @@ docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integrat ``` ### Running our CI test suite + You can run our full test suite locally using [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md): + ```bash airbyte-ci connectors --name=source-orbit test ``` ### Customizing acceptance Tests + Customize `acceptance-test-config.yml` file to configure acceptance tests. See [Connector Acceptance Tests](https://docs.airbyte.com/connector-development/testing-connectors/connector-acceptance-tests-reference) for more information. 
If your connector requires to create or destroy resources for use during acceptance tests create fixtures for it and place them inside integration_tests/acceptance.py. ### Dependency Management -All of your dependencies should be managed via Poetry. + +All of your dependencies should be managed via Poetry. To add a new dependency, run: + ```bash poetry add ``` @@ -78,14 +89,16 @@ poetry add Please commit the changes to `pyproject.toml` and `poetry.lock` files. ## Publishing a new version of the connector + You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? + 1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-orbit test` -2. Bump the connector version (please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors)): - - bump the `dockerImageTag` value in in `metadata.yaml` - - bump the `version` value in `pyproject.toml` +2. Bump the connector version (please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors)): + - bump the `dockerImageTag` value in in `metadata.yaml` + - bump the `version` value in `pyproject.toml` 3. Make sure the `metadata.yaml` content is up to date. 4. Make sure the connector documentation and its changelog is up to date (`docs/integrations/sources/orbit.md`). 5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention). 6. Pat yourself on the back for being an awesome contributor. 7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. -8. Once your PR is merged, the new version of the connector will be automatically published to Docker Hub and our connector registry. \ No newline at end of file +8. Once your PR is merged, the new version of the connector will be automatically published to Docker Hub and our connector registry. diff --git a/airbyte-integrations/connectors/source-oura/README.md b/airbyte-integrations/connectors/source-oura/README.md index 33f60603f2e..3685a65b360 100644 --- a/airbyte-integrations/connectors/source-oura/README.md +++ b/airbyte-integrations/connectors/source-oura/README.md @@ -6,6 +6,7 @@ For information about how to use this connector within Airbyte, see [the documen ## Local development #### Create credentials + **If you are a community contributor**, follow the instructions in the [documentation](https://docs.airbyte.io/integrations/sources/oura) to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `source_oura/spec.yaml` file. Note that any directory named `secrets` is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. @@ -16,9 +17,10 @@ and place them into `secrets/config.json`. ### Locally running the connector docker image - #### Build + **Via [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md) (recommended):** + ```bash airbyte-ci connectors --name=source-oura build ``` @@ -26,12 +28,15 @@ airbyte-ci connectors --name=source-oura build An image will be built with the tag `airbyte/source-oura:dev`. **Via `docker build`:** + ```bash docker build -t airbyte/source-oura:dev . 
``` #### Run + Then run any of the connector commands as follows: + ``` docker run --rm airbyte/source-oura:dev spec docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-oura:dev check --config /secrets/config.json @@ -40,23 +45,30 @@ docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integrat ``` ## Testing + You can run our full test suite locally using [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md): + ```bash airbyte-ci connectors --name=source-oura test ``` ### Customizing acceptance Tests + Customize `acceptance-test-config.yml` file to configure tests. See [Connector Acceptance Tests](https://docs.airbyte.com/connector-development/testing-connectors/connector-acceptance-tests-reference) for more information. If your connector requires to create or destroy resources for use during acceptance tests create fixtures for it and place them inside integration_tests/acceptance.py. ## Dependency Management + All of your dependencies should go in `setup.py`, NOT `requirements.txt`. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development. We split dependencies between two groups, dependencies that are: -* required for your connector to work need to go to `MAIN_REQUIREMENTS` list. -* required for the testing need to go to `TEST_REQUIREMENTS` list + +- required for your connector to work need to go to `MAIN_REQUIREMENTS` list. +- required for the testing need to go to `TEST_REQUIREMENTS` list ### Publishing a new version of the connector + You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? + 1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-oura test` 2. Bump the connector version in `metadata.yaml`: increment the `dockerImageTag` value. Please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors). 3. Make sure the `metadata.yaml` content is up to date. @@ -64,4 +76,3 @@ You've checked out the repo, implemented a million dollar feature, and you're re 5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention). 6. Pat yourself on the back for being an awesome contributor. 7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. - diff --git a/airbyte-integrations/connectors/source-outbrain-amplify/README.md b/airbyte-integrations/connectors/source-outbrain-amplify/README.md index 8a0bf0a9eb5..17298f81d7c 100644 --- a/airbyte-integrations/connectors/source-outbrain-amplify/README.md +++ b/airbyte-integrations/connectors/source-outbrain-amplify/README.md @@ -6,23 +6,28 @@ For information about how to use this connector within Airbyte, see [the documen ## Local development ### Prerequisites + **To iterate on this connector, make sure to complete this prerequisites section.** #### Minimum Python version required `= 3.9.0` #### Build & Activate Virtual Environment and install dependencies + From this connector directory, create a virtual environment: + ``` python -m venv .venv ``` This will generate a virtualenv for this module in `.venv/`. Make sure this venv is active in your development environment of choice. 
To activate it from the terminal, run: + ``` source .venv/bin/activate pip install -r requirements.txt pip install '.[tests]' ``` + If you are in an IDE, follow your IDE's instructions to activate the virtualenv. Note that while we are installing dependencies from `requirements.txt`, you should only edit `setup.py` for your dependencies. `requirements.txt` is @@ -31,6 +36,7 @@ If this is mumbo jumbo to you, don't worry about it, just put your deps in `setu should work as you expect. #### Create credentials + **If you are a community contributor**, follow the instructions in the [documentation](https://docs.airbyte.com/integrations/sources/outbrain-amplify) to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `source_outbrain_amplify/spec.yaml` file. Note that any directory named `secrets` is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. @@ -40,6 +46,7 @@ See `integration_tests/sample_config.json` for a sample config file. and place them into `secrets/config.json`. ### Locally running the connector + ``` python main.py spec python main.py check --config secrets/config.json @@ -49,9 +56,10 @@ python main.py read --config secrets/config.json --catalog integration_tests/con ### Locally running the connector docker image - #### Build + **Via [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md) (recommended):** + ```bash airbyte-ci connectors --name=source-outbrain-amplify build ``` @@ -59,12 +67,15 @@ airbyte-ci connectors --name=source-outbrain-amplify build An image will be built with the tag `airbyte/source-outbrain-amplify:dev`. **Via `docker build`:** + ```bash docker build -t airbyte/source-outbrain-amplify:dev . ``` #### Run + Then run any of the connector commands as follows: + ``` docker run --rm airbyte/source-outbrain-amplify:dev spec docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-outbrain-amplify:dev check --config /secrets/config.json @@ -73,23 +84,30 @@ docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integrat ``` ## Testing + You can run our full test suite locally using [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md): + ```bash airbyte-ci connectors --name=source-outbrain-amplify test ``` ### Customizing acceptance Tests + Customize `acceptance-test-config.yml` file to configure tests. See [Connector Acceptance Tests](https://docs.airbyte.com/connector-development/testing-connectors/connector-acceptance-tests-reference) for more information. If your connector requires to create or destroy resources for use during acceptance tests create fixtures for it and place them inside integration_tests/acceptance.py. ## Dependency Management + All of your dependencies should go in `setup.py`, NOT `requirements.txt`. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development. We split dependencies between two groups, dependencies that are: -* required for your connector to work need to go to `MAIN_REQUIREMENTS` list. -* required for the testing need to go to `TEST_REQUIREMENTS` list + +- required for your connector to work need to go to `MAIN_REQUIREMENTS` list. 
+- required for the testing need to go to `TEST_REQUIREMENTS` list ### Publishing a new version of the connector + You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? + 1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-outbrain-amplify test` 2. Bump the connector version in `metadata.yaml`: increment the `dockerImageTag` value. Please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors). 3. Make sure the `metadata.yaml` content is up to date. @@ -97,4 +115,3 @@ You've checked out the repo, implemented a million dollar feature, and you're re 5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention). 6. Pat yourself on the back for being an awesome contributor. 7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. - diff --git a/airbyte-integrations/connectors/source-outbrain-amplify/bootstrap.md b/airbyte-integrations/connectors/source-outbrain-amplify/bootstrap.md index 7bb8b128a21..cc830afd605 100644 --- a/airbyte-integrations/connectors/source-outbrain-amplify/bootstrap.md +++ b/airbyte-integrations/connectors/source-outbrain-amplify/bootstrap.md @@ -1,24 +1,27 @@ -The (Outbrain Amplify Source is [a REST based API](https://www.outbrain.com//). +The (Outbrain Amplify Source is [a REST based API](https://www.outbrain.com//). Connector is implemented with [Airbyte CDK](https://docs.airbyte.io/connector-development/cdk-python). ## Outbrain-Amplify api stream + Outbrain Amplify is a content discovery and advertising platform that helps businesses and publishers promote their content to a wider audience. Customers can use Outbrain Amplify to promote their content across a range of premium publishers, including some of the biggest names in media. They can create custom campaigns, set specific targeting criteria, and monitor the performance of their campaigns in real-time. The platform also offers a range of tools and features to help customers optimize their campaigns and improve their ROI. Offers a powerful way for businesses and publishers to reach new audiences and drive more traffic to their content. With its advanced targeting capabilities and robust reporting tools, the platform can help customers achieve their marketing goals and grow their businesses. + ## Endpoints -* marketers stream --> Non-Non-Incremental -* campaigns by marketers stream. --> Non-Non-Incremental -* campaigns geo location stream. --> Non-Incremental -* promoted links for campaigns stream. --> Non-Incremental -* promoted links sequence for campaigns stream. --> Non-Incremental -* budgets for marketers stream. --> Non-Incremental -* performance report campaigns by marketers stream. --> Non-Incremental -* performance report periodic by marketers stream. --> Non-Incremental -* performance report periodic by marketers campaign stream. --> Non-Incremental -* performance report periodic content by promoted links campaign stream. --> Non-Incremental -* performance report marketers by publisher stream. --> Non-Incremental -* performance report publishers by campaigns stream. --> Non-Incremental -* performance report marketers by platforms stream. --> Non-Incremental -* performance report marketers campaigns by platforms stream. 
--> Non-Incremental -* performance report marketers by geo performance stream. --> Non-Incremental -* performance report marketers campaigns by geo stream. --> Non-Incremental -* performance report marketers by Interest stream. --> Non-Incremental \ No newline at end of file + +- marketers stream --> Non-Non-Incremental +- campaigns by marketers stream. --> Non-Non-Incremental +- campaigns geo location stream. --> Non-Incremental +- promoted links for campaigns stream. --> Non-Incremental +- promoted links sequence for campaigns stream. --> Non-Incremental +- budgets for marketers stream. --> Non-Incremental +- performance report campaigns by marketers stream. --> Non-Incremental +- performance report periodic by marketers stream. --> Non-Incremental +- performance report periodic by marketers campaign stream. --> Non-Incremental +- performance report periodic content by promoted links campaign stream. --> Non-Incremental +- performance report marketers by publisher stream. --> Non-Incremental +- performance report publishers by campaigns stream. --> Non-Incremental +- performance report marketers by platforms stream. --> Non-Incremental +- performance report marketers campaigns by platforms stream. --> Non-Incremental +- performance report marketers by geo performance stream. --> Non-Incremental +- performance report marketers campaigns by geo stream. --> Non-Incremental +- performance report marketers by Interest stream. --> Non-Incremental diff --git a/airbyte-integrations/connectors/source-outreach/README.md b/airbyte-integrations/connectors/source-outreach/README.md index 74361385589..d3f5f28f2ff 100644 --- a/airbyte-integrations/connectors/source-outreach/README.md +++ b/airbyte-integrations/connectors/source-outreach/README.md @@ -1,31 +1,32 @@ # Outreach source connector - This is the repository for the Outreach source connector, written in Python. For information about how to use this connector within Airbyte, see [the documentation](https://docs.airbyte.com/integrations/sources/outreach). ## Local development ### Prerequisites -* Python (~=3.9) -* Poetry (~=1.7) - installation instructions [here](https://python-poetry.org/docs/#installation) +- Python (~=3.9) +- Poetry (~=1.7) - installation instructions [here](https://python-poetry.org/docs/#installation) ### Installing the connector + From this connector directory, run: + ```bash poetry install --with dev ``` - ### Create credentials + **If you are a community contributor**, follow the instructions in the [documentation](https://docs.airbyte.com/integrations/sources/outreach) to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `source_outreach/spec.yaml` file. Note that any directory named `secrets` is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. See `sample_files/sample_config.json` for a sample config file. - ### Locally running the connector + ``` poetry run source-outreach spec poetry run source-outreach check --config secrets/config.json @@ -34,23 +35,28 @@ poetry run source-outreach read --config secrets/config.json --catalog sample_fi ``` ### Running unit tests + To run unit tests locally, from the connector directory run: + ``` poetry run pytest unit_tests ``` ### Building the docker image + 1. Install [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md) 2. 
Run the following command to build the docker image: + ```bash airbyte-ci connectors --name=source-outreach build ``` An image will be available on your host with the tag `airbyte/source-outreach:dev`. - ### Running as a docker container + Then run any of the connector commands as follows: + ``` docker run --rm airbyte/source-outreach:dev spec docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-outreach:dev check --config /secrets/config.json @@ -59,18 +65,23 @@ docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integrat ``` ### Running our CI test suite + You can run our full test suite locally using [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md): + ```bash airbyte-ci connectors --name=source-outreach test ``` ### Customizing acceptance Tests + Customize `acceptance-test-config.yml` file to configure acceptance tests. See [Connector Acceptance Tests](https://docs.airbyte.com/connector-development/testing-connectors/connector-acceptance-tests-reference) for more information. If your connector requires to create or destroy resources for use during acceptance tests create fixtures for it and place them inside integration_tests/acceptance.py. ### Dependency Management -All of your dependencies should be managed via Poetry. + +All of your dependencies should be managed via Poetry. To add a new dependency, run: + ```bash poetry add ``` @@ -78,14 +89,16 @@ poetry add Please commit the changes to `pyproject.toml` and `poetry.lock` files. ## Publishing a new version of the connector + You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? + 1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-outreach test` -2. Bump the connector version (please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors)): - - bump the `dockerImageTag` value in in `metadata.yaml` - - bump the `version` value in `pyproject.toml` +2. Bump the connector version (please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors)): + - bump the `dockerImageTag` value in in `metadata.yaml` + - bump the `version` value in `pyproject.toml` 3. Make sure the `metadata.yaml` content is up to date. 4. Make sure the connector documentation and its changelog is up to date (`docs/integrations/sources/outreach.md`). 5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention). 6. Pat yourself on the back for being an awesome contributor. 7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. -8. Once your PR is merged, the new version of the connector will be automatically published to Docker Hub and our connector registry. \ No newline at end of file +8. Once your PR is merged, the new version of the connector will be automatically published to Docker Hub and our connector registry. 
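Step 2 of the publishing checklist above bumps the version in two places; a throwaway check like the one below can confirm that `metadata.yaml` and `pyproject.toml` ended up in sync. It is a hypothetical helper, not part of the repo's tooling, and assumes PyYAML is installed and Python 3.11+ for `tomllib`.

```python
# Hypothetical one-off check that metadata.yaml's dockerImageTag matches the
# poetry version in pyproject.toml after a version bump.
import tomllib  # Python 3.11+

import yaml  # requires PyYAML

with open("metadata.yaml") as f:
    image_tag = yaml.safe_load(f)["data"]["dockerImageTag"]
with open("pyproject.toml", "rb") as f:
    package_version = tomllib.load(f)["tool"]["poetry"]["version"]

assert image_tag == package_version, f"dockerImageTag {image_tag} != version {package_version}"
print(f"versions are in sync: {image_tag}")
```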
diff --git a/airbyte-integrations/connectors/source-pagerduty/README.md b/airbyte-integrations/connectors/source-pagerduty/README.md index aea16529a84..f8389afe069 100644 --- a/airbyte-integrations/connectors/source-pagerduty/README.md +++ b/airbyte-integrations/connectors/source-pagerduty/README.md @@ -6,6 +6,7 @@ For information about how to use this connector within Airbyte, see [the documen ## Local development #### Create credentials + **If you are a community contributor**, follow the instructions in the [documentation](https://docs.airbyte.com/integrations/sources/pagerduty) to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `source_pagerduty/spec.yaml` file. Note that any directory named `secrets` is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. @@ -16,9 +17,10 @@ and place them into `secrets/config.json`. ### Locally running the connector docker image - #### Build + **Via [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md) (recommended):** + ```bash airbyte-ci connectors --name=source-pagerduty build ``` @@ -26,12 +28,15 @@ airbyte-ci connectors --name=source-pagerduty build An image will be built with the tag `airbyte/source-pagerduty:dev`. **Via `docker build`:** + ```bash docker build -t airbyte/source-pagerduty:dev . ``` #### Run + Then run any of the connector commands as follows: + ``` docker run --rm airbyte/source-pagerduty:dev spec docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-pagerduty:dev check --config /secrets/config.json @@ -40,23 +45,30 @@ docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integrat ``` ## Testing + You can run our full test suite locally using [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md): + ```bash airbyte-ci connectors --name=source-pagerduty test ``` ### Customizing acceptance Tests + Customize `acceptance-test-config.yml` file to configure tests. See [Connector Acceptance Tests](https://docs.airbyte.com/connector-development/testing-connectors/connector-acceptance-tests-reference) for more information. If your connector requires to create or destroy resources for use during acceptance tests create fixtures for it and place them inside integration_tests/acceptance.py. ## Dependency Management + All of your dependencies should go in `setup.py`, NOT `requirements.txt`. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development. We split dependencies between two groups, dependencies that are: -* required for your connector to work need to go to `MAIN_REQUIREMENTS` list. -* required for the testing need to go to `TEST_REQUIREMENTS` list + +- required for your connector to work need to go to `MAIN_REQUIREMENTS` list. +- required for the testing need to go to `TEST_REQUIREMENTS` list ### Publishing a new version of the connector + You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? + 1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-pagerduty test` 2. Bump the connector version in `metadata.yaml`: increment the `dockerImageTag` value. Please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors). 3. 
Make sure the `metadata.yaml` content is up to date. @@ -64,4 +76,3 @@ You've checked out the repo, implemented a million dollar feature, and you're re 5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention). 6. Pat yourself on the back for being an awesome contributor. 7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. - diff --git a/airbyte-integrations/connectors/source-pardot/README.md b/airbyte-integrations/connectors/source-pardot/README.md index dcd89f3ddf5..ebcf261ee94 100644 --- a/airbyte-integrations/connectors/source-pardot/README.md +++ b/airbyte-integrations/connectors/source-pardot/README.md @@ -6,23 +6,28 @@ For information about how to use this connector within Airbyte, see [the documen ## Local development ### Prerequisites + **To iterate on this connector, make sure to complete this prerequisites section.** #### Minimum Python version required `= 3.7.0` #### Build & Activate Virtual Environment and install dependencies + From this connector directory, create a virtual environment: + ``` python -m venv .venv ``` This will generate a virtualenv for this module in `.venv/`. Make sure this venv is active in your development environment of choice. To activate it from the terminal, run: + ``` source .venv/bin/activate pip install -r requirements.txt pip install '.[tests]' ``` + If you are in an IDE, follow your IDE's instructions to activate the virtualenv. Note that while we are installing dependencies from `requirements.txt`, you should only edit `setup.py` for your dependencies. `requirements.txt` is @@ -31,6 +36,7 @@ If this is mumbo jumbo to you, don't worry about it, just put your deps in `setu should work as you expect. #### Create credentials + **If you are a community contributor**, follow the instructions in the [documentation](https://docs.airbyte.io/integrations/sources/pardot) to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `source_pardot/spec.json` file. Note that any directory named `secrets` is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. @@ -40,6 +46,7 @@ See `integration_tests/sample_config.json` for a sample config file. and place them into `secrets/config.json`. ### Locally running the connector + ``` python main.py spec python main.py check --config secrets/config.json @@ -49,9 +56,10 @@ python main.py read --config secrets/config.json --catalog integration_tests/con ### Locally running the connector docker image - #### Build + **Via [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md) (recommended):** + ```bash airbyte-ci connectors --name=source-pardot build ``` @@ -59,12 +67,15 @@ airbyte-ci connectors --name=source-pardot build An image will be built with the tag `airbyte/source-pardot:dev`. **Via `docker build`:** + ```bash docker build -t airbyte/source-pardot:dev . 
``` #### Run + Then run any of the connector commands as follows: + ``` docker run --rm airbyte/source-pardot:dev spec docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-pardot:dev check --config /secrets/config.json @@ -73,23 +84,30 @@ docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integrat ``` ## Testing + You can run our full test suite locally using [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md): + ```bash airbyte-ci connectors --name=source-pardot test ``` ### Customizing acceptance Tests + Customize `acceptance-test-config.yml` file to configure tests. See [Connector Acceptance Tests](https://docs.airbyte.com/connector-development/testing-connectors/connector-acceptance-tests-reference) for more information. If your connector requires to create or destroy resources for use during acceptance tests create fixtures for it and place them inside integration_tests/acceptance.py. ## Dependency Management + All of your dependencies should go in `setup.py`, NOT `requirements.txt`. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development. We split dependencies between two groups, dependencies that are: -* required for your connector to work need to go to `MAIN_REQUIREMENTS` list. -* required for the testing need to go to `TEST_REQUIREMENTS` list + +- required for your connector to work need to go to `MAIN_REQUIREMENTS` list. +- required for the testing need to go to `TEST_REQUIREMENTS` list ### Publishing a new version of the connector + You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? + 1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-pardot test` 2. Bump the connector version in `metadata.yaml`: increment the `dockerImageTag` value. Please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors). 3. Make sure the `metadata.yaml` content is up to date. @@ -97,4 +115,3 @@ You've checked out the repo, implemented a million dollar feature, and you're re 5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention). 6. Pat yourself on the back for being an awesome contributor. 7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. - diff --git a/airbyte-integrations/connectors/source-partnerstack/README.md b/airbyte-integrations/connectors/source-partnerstack/README.md index e805cbcf54f..7a8846f12ec 100644 --- a/airbyte-integrations/connectors/source-partnerstack/README.md +++ b/airbyte-integrations/connectors/source-partnerstack/README.md @@ -6,6 +6,7 @@ For information about how to use this connector within Airbyte, see [the documen ## Local development #### Create credentials + **If you are a community contributor**, follow the instructions in the [documentation](https://docs.airbyte.io/integrations/sources/partnerstack) to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `source_partnerstack/spec.yaml` file. Note that any directory named `secrets` is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. @@ -16,9 +17,10 @@ and place them into `secrets/config.json`. 
### Locally running the connector docker image - #### Build + **Via [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md) (recommended):** + ```bash airbyte-ci connectors --name=source-partnerstack build ``` @@ -26,12 +28,15 @@ airbyte-ci connectors --name=source-partnerstack build An image will be built with the tag `airbyte/source-partnerstack:dev`. **Via `docker build`:** + ```bash docker build -t airbyte/source-partnerstack:dev . ``` #### Run + Then run any of the connector commands as follows: + ``` docker run --rm airbyte/source-partnerstack:dev spec docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-partnerstack:dev check --config /secrets/config.json @@ -40,23 +45,30 @@ docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integrat ``` ## Testing + You can run our full test suite locally using [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md): + ```bash airbyte-ci connectors --name=source-partnerstack test ``` ### Customizing acceptance Tests + Customize `acceptance-test-config.yml` file to configure tests. See [Connector Acceptance Tests](https://docs.airbyte.com/connector-development/testing-connectors/connector-acceptance-tests-reference) for more information. If your connector requires to create or destroy resources for use during acceptance tests create fixtures for it and place them inside integration_tests/acceptance.py. ## Dependency Management + All of your dependencies should go in `setup.py`, NOT `requirements.txt`. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development. We split dependencies between two groups, dependencies that are: -* required for your connector to work need to go to `MAIN_REQUIREMENTS` list. -* required for the testing need to go to `TEST_REQUIREMENTS` list + +- required for your connector to work need to go to `MAIN_REQUIREMENTS` list. +- required for the testing need to go to `TEST_REQUIREMENTS` list ### Publishing a new version of the connector + You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? + 1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-partnerstack test` 2. Bump the connector version in `metadata.yaml`: increment the `dockerImageTag` value. Please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors). 3. Make sure the `metadata.yaml` content is up to date. @@ -64,4 +76,3 @@ You've checked out the repo, implemented a million dollar feature, and you're re 5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention). 6. Pat yourself on the back for being an awesome contributor. 7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. 
- diff --git a/airbyte-integrations/connectors/source-paypal-transaction/CHANGELOG.md b/airbyte-integrations/connectors/source-paypal-transaction/CHANGELOG.md index 7cddffd5f10..5f99b971afe 100644 --- a/airbyte-integrations/connectors/source-paypal-transaction/CHANGELOG.md +++ b/airbyte-integrations/connectors/source-paypal-transaction/CHANGELOG.md @@ -1,13 +1,17 @@ # Changelog ## 0.1.0 + Source implementation with support of Transactions and Balances streams ## 1.0.0 + Mark Client ID and Client Secret as required files ## 2.1.0 + Migration to Low code ## 2.3.0 -Adding New Streams - Payments, Disputes, Invoices, Product Catalog \ No newline at end of file + +Adding New Streams - Payments, Disputes, Invoices, Product Catalog diff --git a/airbyte-integrations/connectors/source-paypal-transaction/README.md b/airbyte-integrations/connectors/source-paypal-transaction/README.md index 20bfbffa3cc..004af3ec4e8 100644 --- a/airbyte-integrations/connectors/source-paypal-transaction/README.md +++ b/airbyte-integrations/connectors/source-paypal-transaction/README.md @@ -5,19 +5,20 @@ For information about how to use this connector within Airbyte, see [the documen ## Local development - #### Prerequisites - * Python (~=3.9) - * Poetry (~=1.7) - installation instructions [here](https://python-poetry.org/docs/#installation) - * Paypal Client ID and Client Secret - * If you are going to use the data generator scripts you need to setup yourPaypal Sandbox and a Buyer user in your sandbox, to simulate the data. YOu cna get that information in the [Apps & Credentials page](https://developer.paypal.com/dashboard/applications/live). - * Buyer Username - * Buyer Password - * Payer ID (Account ID) + +- Python (~=3.9) +- Poetry (~=1.7) - installation instructions [here](https://python-poetry.org/docs/#installation) +- PayPal Client ID and Client Secret +- If you are going to use the data generator scripts, you need to set up your PayPal Sandbox and a Buyer user in your sandbox to simulate the data. You can get that information in the [Apps & Credentials page](https://developer.paypal.com/dashboard/applications/live). + - Buyer Username + - Buyer Password + - Payer ID (Account ID) ### Installing the connector From this connector directory, run: + ```bash poetry install --with dev ``` @@ -29,9 +30,8 @@ to generate the necessary credentials. Then create a file `secrets/config.json` Note that any directory named `secrets` is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. See `sample_files/sample_config.json` for a sample config file. -* You must have created your credentials under the `secrets/` folder -* For the read command, you can create separate catalogs to test the streams individually. All catalogs are under the folder `integration_tests`. Select the one you want to test with the read command. - +- You must have created your credentials under the `secrets/` folder +- For the read command, you can create separate catalogs to test the streams individually. All catalogs are under the folder `integration_tests`. Select the one you want to test with the read command. ### Locally running the connector @@ -44,6 +44,7 @@ poetry run source-paypal-transaction read --config secrets/config.json --catalog ``` ### Running unit tests + To run unit tests locally, from the connector directory run: ``` @@ -55,20 +56,23 @@ poetry run pytest unit_tests 1. 
Install [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md) 2. Run the following command to build the docker image: - ### Installing the connector + From this connector directory, run: + ```bash poetry install --with dev ``` ##### Customizing our build process + When contributing on our connector you might need to customize the build process to add a system dependency or set an env var. You can customize our build process by adding a `build_customization.py` module to your connector. This module should contain a `pre_connector_install` and `post_connector_install` async function that will mutate the base image and the connector container respectively. It will be imported at runtime by our build process and the functions will be called if they exist. Here is an example of a `build_customization.py` module: + ```python from __future__ import annotations @@ -82,10 +86,10 @@ if TYPE_CHECKING: An image will be available on your host with the tag `airbyte/source-paypal-transaction:dev`. - - ### Running as a docker container + Then run any of the connector commands as follows: + ``` docker run --rm airbyte/source-paypal-transaction:dev spec docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-paypal-transaction:dev check --config /secrets/config.json @@ -93,7 +97,6 @@ docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-paypal-transaction:dev docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integration_tests airbyte/source-paypal-transaction:dev read --config /secrets/config.json --catalog /integration_tests/configured_catalog.json ``` - ### Running our CI test suite You can run our full test suite locally using [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md): @@ -109,10 +112,10 @@ airbyte-ci connectors --name source-paypal-transaction --use-local-secrets test ``` ### Customizing acceptance Tests + Customize `acceptance-test-config.yml` file to configure acceptance tests. See [Connector Acceptance Tests](https://docs.airbyte.com/connector-development/testing-connectors/connector-acceptance-tests-reference) for more information. If your connector requires to create or destroy resources for use during acceptance tests create fixtures for it and place them inside integration_tests/acceptance.py. - ## Running Unit tests locally To run unit tests locally, form the root `source_paypal_transaction` directory run: @@ -128,86 +131,96 @@ Some endpoints will require special permissions on the sandbox to update and cha In the `bin` folder you will find several data generator scripts: -* **disputes_generator.py:** - * Update dispute: Uses the _PATCH_ method of the `https://api-m.paypal.com/v1/customer/disputes/{dispute_id}` endpoint. You need the ID and create a payload to pass it as an argument. See more information [here](https://developer.paypal.com/docs/api/customer-disputes/v1/#disputes_patch). +- **disputes_generator.py:** - ```bash - python disputes_generator.py update DISPUTE_ID ''[{"op": "replace", "path": "/reason", "value": "The new reason"}]' - ``` - - * Update Evidence status: Uses the _POST_ method of the `https://api-m.paypal.com/v1/customer/disputes/{dispute_id}/require-evidence` endpoint. You need the ID and select an option to pass it as an argument. 
See more information [here](https://developer.paypal.com/docs/api/customer-disputes/v1/#disputes_require-evidence) - ```bash - python update_dispute.py require-evidence DISPUTE_ID SELLER_EVIDENCE - ``` + - Update dispute: Uses the _PATCH_ method of the `https://api-m.paypal.com/v1/customer/disputes/{dispute_id}` endpoint. You need the ID and create a payload to pass it as an argument. See more information [here](https://developer.paypal.com/docs/api/customer-disputes/v1/#disputes_patch). -* **invoices.py:** - * Create draft invoice: Uses the _POST_ method of the `https://api-m.sandbox.paypal.com/v2/invoicing/invoices` endpoint. It will automatically generate an invoice (no need to pass any parameters). See more information [here](https://developer.paypal.com/docs/api/invoicing/v2/#invoices_create). + ```bash + python disputes_generator.py update DISPUTE_ID ''[{"op": "replace", "path": "/reason", "value": "The new reason"}]' + ``` - ```bash - python invoices.py create_draft - ``` - - * Send a Draft Invoice: Uses the _POST_ method of the `https://api-m.sandbox.paypal.com/v2/invoicing/invoices/{invoice_id}/send` endpoint. You need the Invoice ID, a subject and a note (just to have something to update) and an email as an argument. See more information [here](https://developer.paypal.com/docs/api/invoicing/v2/#invoices_send) - ```bash - python invoices.py send_draft --invoice_id "INV2-XXXX-XXXX-XXXX-XXXX" --subject "Your Invoice Subject" --note "Your custom note" --additional_recipients example@email.com - ``` + - Update Evidence status: Uses the _POST_ method of the `https://api-m.paypal.com/v1/customer/disputes/{dispute_id}/require-evidence` endpoint. You need the ID and select an option to pass it as an argument. See more information [here](https://developer.paypal.com/docs/api/customer-disputes/v1/#disputes_require-evidence) -* **payments_generator.py:** - * Partially update payment: Uses the _PATCH_ method of the `https://api-m.paypal.com/v1/payments/payment/{payment_id}` endpoint. You need the payment ID and a payload with new values. (no need to pass any parameters). See more information [here](https://developer.paypal.com/docs/api/invoicing/v2/#invoices_create). + ```bash + python update_dispute.py require-evidence DISPUTE_ID SELLER_EVIDENCE + ``` - ```bash - python script_name.py update PAYMENT_ID '[{"op": "replace", "path": "/transactions/0/amount", "value": {"total": "50.00", "currency": "USD"}}]' - ``` - -* **paypal_transaction_generator.py:** - Make sure you have the `buyer_username`, `buyer_password` and `payer_id` in your config file. You can get the sample configuratin in the `sample_config.json`. +- **invoices.py:** - * Generate transactions: This uses Selenium, so you will be prompted to your account to simulate the complete transaction flow. You can add a number at the end of the command to do more than one transaction. By default the script runs 3 transactions. + - Create draft invoice: Uses the _POST_ method of the `https://api-m.sandbox.paypal.com/v2/invoicing/invoices` endpoint. It will automatically generate an invoice (no need to pass any parameters). See more information [here](https://developer.paypal.com/docs/api/invoicing/v2/#invoices_create). 
- **NOTE: Be midnfu of the number of transactions, as it will be interacting with your machine, and you may not be able to use it while creating the transactions** + ```bash + python invoices.py create_draft + ``` - ```bash - python paypal_transaction_generator.py [NUMBER_OF_DESIRED_TRANSACTIONS] - ``` + - Send a Draft Invoice: Uses the _POST_ method of the `https://api-m.sandbox.paypal.com/v2/invoicing/invoices/{invoice_id}/send` endpoint. You need the Invoice ID, a subject and a note (just to have something to update) and an email as an argument. See more information [here](https://developer.paypal.com/docs/api/invoicing/v2/#invoices_send) -* **product_catalog.py:** - * Create a product: Uses the _POST_ method of the `https://api-m.sandbox.paypal.com/v1/catalogs/products` endpoint. You need to add the description and the category in the command line. For the proper category see more information [here](https://developer.paypal.com/docs/api/catalog-products/v1/#products_create). + ```bash + python invoices.py send_draft --invoice_id "INV2-XXXX-XXXX-XXXX-XXXX" --subject "Your Invoice Subject" --note "Your custom note" --additional_recipients example@email.com + ``` - ```bash - python product_catalog.py --action create --description "YOUR DESCRIPTION" --category PAYPAL_CATEGORY - ``` - - * Update a product: Uses the _PATCH_ method of the `https://developer.paypal.com/docs/api/catalog-products/v1/#products_patch` endpoint. You need the product ID, a description and the Category as an argument. See more information [here](https://developer.paypal.com/docs/api/catalog-products/v1/#products_patch) - ```bash - python product_catalog.py --action update --product_id PRODUCT_ID --update_payload '[{"op": "replace", "path": "/description", "value": "My Update. Does it changes it?"}]' - ``` +- **payments_generator.py:** + + - Partially update payment: Uses the _PATCH_ method of the `https://api-m.paypal.com/v1/payments/payment/{payment_id}` endpoint. You need the payment ID and a payload with new values. (no need to pass any parameters). See more information [here](https://developer.paypal.com/docs/api/invoicing/v2/#invoices_create). + + ```bash + python script_name.py update PAYMENT_ID '[{"op": "replace", "path": "/transactions/0/amount", "value": {"total": "50.00", "currency": "USD"}}]' + ``` + +- **paypal_transaction_generator.py:** + Make sure you have the `buyer_username`, `buyer_password` and `payer_id` in your config file. You can get the sample configuratin in the `sample_config.json`. + + - Generate transactions: This uses Selenium, so you will be prompted to your account to simulate the complete transaction flow. You can add a number at the end of the command to do more than one transaction. By default the script runs 3 transactions. + + **NOTE: Be midnfu of the number of transactions, as it will be interacting with your machine, and you may not be able to use it while creating the transactions** + + ```bash + python paypal_transaction_generator.py [NUMBER_OF_DESIRED_TRANSACTIONS] + ``` + +- **product_catalog.py:** + + - Create a product: Uses the _POST_ method of the `https://api-m.sandbox.paypal.com/v1/catalogs/products` endpoint. You need to add the description and the category in the command line. For the proper category see more information [here](https://developer.paypal.com/docs/api/catalog-products/v1/#products_create). 
+ + ```bash + python product_catalog.py --action create --description "YOUR DESCRIPTION" --category PAYPAL_CATEGORY + ``` + + - Update a product: Uses the _PATCH_ method of the `https://developer.paypal.com/docs/api/catalog-products/v1/#products_patch` endpoint. You need the product ID, a description and the Category as an argument. See more information [here](https://developer.paypal.com/docs/api/catalog-products/v1/#products_patch) + + ```bash + python product_catalog.py --action update --product_id PRODUCT_ID --update_payload '[{"op": "replace", "path": "/description", "value": "My Update. Does it changes it?"}]' + ``` ## Dependency Management + All of your dependencies should go in `setup.py`, NOT `requirements.txt`. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development. We split dependencies between two groups, dependencies that are: -* required for your connector to work need to go to `MAIN_REQUIREMENTS` list. -* required for the testing need to go to `TEST_REQUIREMENTS` list +- required for your connector to work need to go to `MAIN_REQUIREMENTS` list. +- required for the testing need to go to `TEST_REQUIREMENTS` list -All of your dependencies should be managed via Poetry. +All of your dependencies should be managed via Poetry. To add a new dependency, run: + ```bash poetry add ``` - Please commit the changes to `pyproject.toml` and `poetry.lock` files. ## Publishing a new version of the connector + You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? + 1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-paypal-transaction test` -2. Bump the connector version (please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors)): - - bump the `dockerImageTag` value in in `metadata.yaml` - - bump the `version` value in `pyproject.toml` +2. Bump the connector version (please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors)): + - bump the `dockerImageTag` value in `metadata.yaml` + - bump the `version` value in `pyproject.toml` 3. Make sure the `metadata.yaml` content is up to date. 4. Make sure the connector documentation and its changelog is up to date (`docs/integrations/sources/paypal-transaction.md`). 5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention). 6. Pat yourself on the back for being an awesome contributor. 7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. -8. Once your PR is merged, the new version of the connector will be automatically published to Docker Hub and our connector registry. \ No newline at end of file +8. Once your PR is merged, the new version of the connector will be automatically published to Docker Hub and our connector registry. 
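The `build_customization.py` hook described in the build section of this README is a plain Python module with two async functions. The sketch below is an illustration only; it assumes the `dagger` `Container` type used by the `airbyte-ci` build pipeline, and the environment variable names are made up:

```python
from __future__ import annotations

from typing import TYPE_CHECKING

if TYPE_CHECKING:
    from dagger import Container


async def pre_connector_install(base_image_container: Container) -> Container:
    # Mutate the base image before the connector code is installed (env vars, system packages, ...).
    return base_image_container.with_env_variable("MY_BUILD_TIME_VAR", "value")


async def post_connector_install(connector_container: Container) -> Container:
    # Mutate the final connector container after the connector code is installed.
    return connector_container.with_env_variable("MY_RUNTIME_VAR", "value")
```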
diff --git a/airbyte-integrations/connectors/source-paystack/BOOTSTRAP.md b/airbyte-integrations/connectors/source-paystack/BOOTSTRAP.md index 94fffb8f725..b2db6cee3ed 100644 --- a/airbyte-integrations/connectors/source-paystack/BOOTSTRAP.md +++ b/airbyte-integrations/connectors/source-paystack/BOOTSTRAP.md @@ -3,6 +3,7 @@ paystack.com is a Payment Gateway and its REST API is similar to Stripe's. This Paystack API connector is implemented with [Airbyte CDK](https://docs.airbyte.io/connector-development/cdk-python). The Paystack API has resources including (not exhaustive) + - Customers - Transactions - Payments and payment attempts - Subscriptions - Recurring payments @@ -13,9 +14,11 @@ The Paystack API has resources including (not exhaustive) The Paystack API can be used to charge customers, and to perform CRUD operations on any of the above resources. For Airbyte only the "R" - read operations are needed, however Paystack currently supports a single secret key which can do all CRUD operations. ## Notes & Quirks + - Pagination uses the query parameters "page" (starting at 1) and "perPage". - The standard cursor field is "createdAt" on all responses, except the "Invoices" stream which uses "created_at". It's likely the interface for this resource is either outdated or failed to be backward compatible (some other resources have both fields and some have only "createdAt"). ## Useful links below + - [Paystack connector documentation](https://docs.airbyte.io/integrations/sources/paystack) - Information about specific streams and some nuances about the connector -- [Paystack dashboard](https://dashboard.paystack.com/#/settings/developer) - To grab your API token \ No newline at end of file +- [Paystack dashboard](https://dashboard.paystack.com/#/settings/developer) - To grab your API token diff --git a/airbyte-integrations/connectors/source-paystack/README.md b/airbyte-integrations/connectors/source-paystack/README.md index 5717838e916..9c970516503 100644 --- a/airbyte-integrations/connectors/source-paystack/README.md +++ b/airbyte-integrations/connectors/source-paystack/README.md @@ -6,23 +6,28 @@ For information about how to use this connector within Airbyte, see [the documen ## Local development ### Prerequisites + **To iterate on this connector, make sure to complete this prerequisites section.** #### Minimum Python version required `= 3.7.0` #### Build & Activate Virtual Environment and install dependencies + From this connector directory, create a virtual environment: + ``` python -m venv .venv ``` This will generate a virtualenv for this module in `.venv/`. Make sure this venv is active in your development environment of choice. To activate it from the terminal, run: + ``` source .venv/bin/activate pip install -r requirements.txt pip install '.[tests]' ``` + If you are in an IDE, follow your IDE's instructions to activate the virtualenv. Note that while we are installing dependencies from `requirements.txt`, you should only edit `setup.py` for your dependencies. `requirements.txt` is @@ -31,6 +36,7 @@ If this is mumbo jumbo to you, don't worry about it, just put your deps in `setu should work as you expect. #### Create credentials + **If you are a community contributor**, follow the instructions in the [documentation](https://docs.airbyte.io/integrations/sources/paystack) to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `source_paystack/spec.json` file. 
Note that any directory named `secrets` is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. @@ -40,6 +46,7 @@ See `integration_tests/sample_config.json` for a sample config file. and place them into `secrets/config.json`. ### Locally running the connector + ``` python main.py spec python main.py check --config secrets/config.json @@ -49,9 +56,10 @@ python main.py read --config secrets/config.json --catalog integration_tests/con ### Locally running the connector docker image - #### Build + **Via [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md) (recommended):** + ```bash airbyte-ci connectors --name=source-paystack build ``` @@ -59,12 +67,15 @@ airbyte-ci connectors --name=source-paystack build An image will be built with the tag `airbyte/source-paystack:dev`. **Via `docker build`:** + ```bash docker build -t airbyte/source-paystack:dev . ``` #### Run + Then run any of the connector commands as follows: + ``` docker run --rm airbyte/source-paystack:dev spec docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-paystack:dev check --config /secrets/config.json @@ -73,23 +84,30 @@ docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integrat ``` ## Testing + You can run our full test suite locally using [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md): + ```bash airbyte-ci connectors --name=source-paystack test ``` ### Customizing acceptance Tests + Customize `acceptance-test-config.yml` file to configure tests. See [Connector Acceptance Tests](https://docs.airbyte.com/connector-development/testing-connectors/connector-acceptance-tests-reference) for more information. If your connector requires to create or destroy resources for use during acceptance tests create fixtures for it and place them inside integration_tests/acceptance.py. ## Dependency Management + All of your dependencies should go in `setup.py`, NOT `requirements.txt`. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development. We split dependencies between two groups, dependencies that are: -* required for your connector to work need to go to `MAIN_REQUIREMENTS` list. -* required for the testing need to go to `TEST_REQUIREMENTS` list + +- required for your connector to work need to go to `MAIN_REQUIREMENTS` list. +- required for the testing need to go to `TEST_REQUIREMENTS` list ### Publishing a new version of the connector + You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? + 1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-paystack test` 2. Bump the connector version in `metadata.yaml`: increment the `dockerImageTag` value. Please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors). 3. Make sure the `metadata.yaml` content is up to date. @@ -97,4 +115,3 @@ You've checked out the repo, implemented a million dollar feature, and you're re 5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention). 6. Pat yourself on the back for being an awesome contributor. 7. 
Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. - diff --git a/airbyte-integrations/connectors/source-pendo/README.md b/airbyte-integrations/connectors/source-pendo/README.md index 020e7a5fc47..83be08c86a0 100644 --- a/airbyte-integrations/connectors/source-pendo/README.md +++ b/airbyte-integrations/connectors/source-pendo/README.md @@ -1,31 +1,32 @@ # Pendo source connector - This is the repository for the Pendo source connector, written in Python. For information about how to use this connector within Airbyte, see [the documentation](https://docs.airbyte.com/integrations/sources/pendo). ## Local development ### Prerequisites -* Python (~=3.9) -* Poetry (~=1.7) - installation instructions [here](https://python-poetry.org/docs/#installation) +- Python (~=3.9) +- Poetry (~=1.7) - installation instructions [here](https://python-poetry.org/docs/#installation) ### Installing the connector + From this connector directory, run: + ```bash poetry install --with dev ``` - ### Create credentials + **If you are a community contributor**, follow the instructions in the [documentation](https://docs.airbyte.com/integrations/sources/pendo) to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `source_pendo/spec.yaml` file. Note that any directory named `secrets` is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. See `sample_files/sample_config.json` for a sample config file. - ### Locally running the connector + ``` poetry run source-pendo spec poetry run source-pendo check --config secrets/config.json @@ -34,23 +35,28 @@ poetry run source-pendo read --config secrets/config.json --catalog sample_files ``` ### Running unit tests + To run unit tests locally, from the connector directory run: + ``` poetry run pytest unit_tests ``` ### Building the docker image + 1. Install [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md) 2. Run the following command to build the docker image: + ```bash airbyte-ci connectors --name=source-pendo build ``` An image will be available on your host with the tag `airbyte/source-pendo:dev`. - ### Running as a docker container + Then run any of the connector commands as follows: + ``` docker run --rm airbyte/source-pendo:dev spec docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-pendo:dev check --config /secrets/config.json @@ -59,18 +65,23 @@ docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integrat ``` ### Running our CI test suite + You can run our full test suite locally using [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md): + ```bash airbyte-ci connectors --name=source-pendo test ``` ### Customizing acceptance Tests + Customize `acceptance-test-config.yml` file to configure acceptance tests. See [Connector Acceptance Tests](https://docs.airbyte.com/connector-development/testing-connectors/connector-acceptance-tests-reference) for more information. If your connector requires to create or destroy resources for use during acceptance tests create fixtures for it and place them inside integration_tests/acceptance.py. ### Dependency Management -All of your dependencies should be managed via Poetry. + +All of your dependencies should be managed via Poetry. 
To add a new dependency, run: + ```bash poetry add ``` @@ -78,14 +89,16 @@ poetry add Please commit the changes to `pyproject.toml` and `poetry.lock` files. ## Publishing a new version of the connector + You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? + 1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-pendo test` -2. Bump the connector version (please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors)): - - bump the `dockerImageTag` value in in `metadata.yaml` - - bump the `version` value in `pyproject.toml` +2. Bump the connector version (please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors)): + - bump the `dockerImageTag` value in `metadata.yaml` + - bump the `version` value in `pyproject.toml` 3. Make sure the `metadata.yaml` content is up to date. 4. Make sure the connector documentation and its changelog is up to date (`docs/integrations/sources/pendo.md`). 5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention). 6. Pat yourself on the back for being an awesome contributor. 7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. -8. Once your PR is merged, the new version of the connector will be automatically published to Docker Hub and our connector registry. \ No newline at end of file +8. Once your PR is merged, the new version of the connector will be automatically published to Docker Hub and our connector registry. diff --git a/airbyte-integrations/connectors/source-persistiq/README.md b/airbyte-integrations/connectors/source-persistiq/README.md index 0a4bbfb8c9e..1d7dd2da161 100644 --- a/airbyte-integrations/connectors/source-persistiq/README.md +++ b/airbyte-integrations/connectors/source-persistiq/README.md @@ -6,6 +6,7 @@ For information about how to use this connector within Airbyte, see [the documen ## Local development #### Create credentials + **If you are a community contributor**, follow the instructions in the [documentation](https://docs.airbyte.com/integrations/sources/persistiq) to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `source_persistiq/spec.yaml` file. Note that any directory named `secrets` is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. @@ -16,9 +17,10 @@ and place them into `secrets/config.json`. ### Locally running the connector docker image - #### Build + **Via [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md) (recommended):** + ```bash airbyte-ci connectors --name=source-persistiq build ``` @@ -26,12 +28,15 @@ airbyte-ci connectors --name=source-persistiq build An image will be built with the tag `airbyte/source-persistiq:dev`. **Via `docker build`:** + ```bash docker build -t airbyte/source-persistiq:dev . 
``` #### Run + Then run any of the connector commands as follows: + ``` docker run --rm airbyte/source-persistiq:dev spec docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-persistiq:dev check --config /secrets/config.json @@ -40,23 +45,30 @@ docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integrat ``` ## Testing + You can run our full test suite locally using [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md): + ```bash airbyte-ci connectors --name=source-persistiq test ``` ### Customizing acceptance Tests + Customize `acceptance-test-config.yml` file to configure tests. See [Connector Acceptance Tests](https://docs.airbyte.com/connector-development/testing-connectors/connector-acceptance-tests-reference) for more information. If your connector requires to create or destroy resources for use during acceptance tests create fixtures for it and place them inside integration_tests/acceptance.py. ## Dependency Management + All of your dependencies should go in `setup.py`, NOT `requirements.txt`. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development. We split dependencies between two groups, dependencies that are: -* required for your connector to work need to go to `MAIN_REQUIREMENTS` list. -* required for the testing need to go to `TEST_REQUIREMENTS` list + +- required for your connector to work need to go to `MAIN_REQUIREMENTS` list. +- required for the testing need to go to `TEST_REQUIREMENTS` list ### Publishing a new version of the connector + You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? + 1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-persistiq test` 2. Bump the connector version in `metadata.yaml`: increment the `dockerImageTag` value. Please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors). 3. Make sure the `metadata.yaml` content is up to date. @@ -64,4 +76,3 @@ You've checked out the repo, implemented a million dollar feature, and you're re 5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention). 6. Pat yourself on the back for being an awesome contributor. 7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. - diff --git a/airbyte-integrations/connectors/source-pexels-api/README.md b/airbyte-integrations/connectors/source-pexels-api/README.md index bb20c4f5d53..2542ffa9659 100644 --- a/airbyte-integrations/connectors/source-pexels-api/README.md +++ b/airbyte-integrations/connectors/source-pexels-api/README.md @@ -6,22 +6,27 @@ For information about how to use this connector within Airbyte, see [the documen ## Local development ### Prerequisites + **To iterate on this connector, make sure to complete this prerequisites section.** #### Minimum Python version required `= 3.9.0` #### Build & Activate Virtual Environment and install dependencies + From this connector directory, create a virtual environment: + ``` python -m venv .venv ``` This will generate a virtualenv for this module in `.venv/`. Make sure this venv is active in your development environment of choice. 
To activate it from the terminal, run: + ``` source .venv/bin/activate pip install -r requirements.txt ``` + If you are in an IDE, follow your IDE's instructions to activate the virtualenv. Note that while we are installing dependencies from `requirements.txt`, you should only edit `setup.py` for your dependencies. `requirements.txt` is @@ -30,6 +35,7 @@ If this is mumbo jumbo to you, don't worry about it, just put your deps in `setu should work as you expect. #### Create credentials + **If you are a community contributor**, follow the instructions in the [documentation](https://docs.airbyte.io/integrations/sources/pexels-api) to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `source_pexels_api/spec.yaml` file. Note that any directory named `secrets` is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. @@ -40,9 +46,10 @@ and place them into `secrets/config.json`. ### Locally running the connector docker image - #### Build + **Via [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md) (recommended):** + ```bash airbyte-ci connectors --name=source-pexels-api build ``` @@ -50,12 +57,15 @@ airbyte-ci connectors --name=source-pexels-api build An image will be built with the tag `airbyte/source-pexels-api:dev`. **Via `docker build`:** + ```bash docker build -t airbyte/source-pexels-api:dev . ``` #### Run + Then run any of the connector commands as follows: + ``` docker run --rm airbyte/source-pexels-api:dev spec docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-pexels-api:dev check --config /secrets/config.json @@ -64,23 +74,30 @@ docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integrat ``` ## Testing + You can run our full test suite locally using [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md): + ```bash airbyte-ci connectors --name=source-pexels-api test ``` ### Customizing acceptance Tests + Customize `acceptance-test-config.yml` file to configure tests. See [Connector Acceptance Tests](https://docs.airbyte.com/connector-development/testing-connectors/connector-acceptance-tests-reference) for more information. If your connector requires to create or destroy resources for use during acceptance tests create fixtures for it and place them inside integration_tests/acceptance.py. ## Dependency Management + All of your dependencies should go in `setup.py`, NOT `requirements.txt`. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development. We split dependencies between two groups, dependencies that are: -* required for your connector to work need to go to `MAIN_REQUIREMENTS` list. -* required for the testing need to go to `TEST_REQUIREMENTS` list + +- required for your connector to work need to go to `MAIN_REQUIREMENTS` list. +- required for the testing need to go to `TEST_REQUIREMENTS` list ### Publishing a new version of the connector + You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? + 1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-pexels-api test` 2. Bump the connector version in `metadata.yaml`: increment the `dockerImageTag` value. 
Please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors). 3. Make sure the `metadata.yaml` content is up to date. @@ -88,4 +105,3 @@ You've checked out the repo, implemented a million dollar feature, and you're re 5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention). 6. Pat yourself on the back for being an awesome contributor. 7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. - diff --git a/airbyte-integrations/connectors/source-pexels-api/bootstrap.md b/airbyte-integrations/connectors/source-pexels-api/bootstrap.md index ef9f1c3f6da..4ae88ce5591 100644 --- a/airbyte-integrations/connectors/source-pexels-api/bootstrap.md +++ b/airbyte-integrations/connectors/source-pexels-api/bootstrap.md @@ -1,12 +1,12 @@ # Pexels-API The connector uses the v1 API documented here: https://www.pexels.com/api/documentation . It is -straightforward HTTP REST API with API based authentication. +a straightforward HTTP REST API with API-key-based authentication. ## API key An API key is mandatory for this connector to work. It can be generated with a free account at https://www.pexels.com/api/new/. -Just pass the generated API key and optional parameters for establishing the connection. +Just pass the generated API key and optional parameters for establishing the connection. ## Implementation details @@ -17,11 +17,11 @@ Just pass the generated API key and optional parameters for establishing the con - Generate an API key (Example: 12345) - Params (If specific info is needed) - Available params - - query: Ocean, Tigers, Pears, etc. Default is people - - orientation: landscape, portrait or square. Default is landscape - - size: large, medium, small. Default is large - - color: red, orange, yellow, green, turquoise, blue, violet, pink, brown, black, gray, white or any hexidecimal color code. - - locale: en-US, pt-BR, es-ES, ca-ES, de-DE, it-IT, fr-FR, sv-SE, id-ID, pl-PL, ja-JP, zh-TW, zh-CN, ko-KR, th-TH, nl-NL, hu-HU, vi-VN,
    cs-CZ, da-DK, fi-FI, uk-UA, el-GR, ro-RO, nb-NO, sk-SK, tr-TR, ru-RU. Default is en-US + - query: Ocean, Tigers, Pears, etc. Default is people + - orientation: landscape, portrait or square. Default is landscape + - size: large, medium, small. Default is large + - color: red, orange, yellow, green, turquoise, blue, violet, pink, brown, black, gray, white or any hexadecimal color code. + - locale: en-US, pt-BR, es-ES, ca-ES, de-DE, it-IT, fr-FR, sv-SE, id-ID, pl-PL, ja-JP, zh-TW, zh-CN, ko-KR, th-TH, nl-NL, hu-HU, vi-VN,
    cs-CZ, da-DK, fi-FI, uk-UA, el-GR, ro-RO, nb-NO, sk-SK, tr-TR, ru-RU. Default is en-US ## Step 2: Generate schema for the endpoint @@ -34,7 +34,7 @@ Just pass the generated API key and optional parameters for establishing the con 1. Navigate to the Airbyte Open Source dashboard. 2. Set the name for your source. 3. Enter your `api_key`. -5. Enter your config params if needed. (Optional) -6. Click **Set up source**. +4. Enter your config params if needed. (Optional) +5. Click **Set up source**. - * We use only GET methods, towards the API endpoints which is straightforward \ No newline at end of file +- We use only GET methods, towards the API endpoints which is straightforward diff --git a/airbyte-integrations/connectors/source-pinterest/README.md b/airbyte-integrations/connectors/source-pinterest/README.md index 71c73a2027e..cf49110cb77 100644 --- a/airbyte-integrations/connectors/source-pinterest/README.md +++ b/airbyte-integrations/connectors/source-pinterest/README.md @@ -1,31 +1,32 @@ # Pinterest source connector - This is the repository for the Pinterest source connector, written in Python. For information about how to use this connector within Airbyte, see [the documentation](https://docs.airbyte.com/integrations/sources/pinterest). ## Local development ### Prerequisites -* Python (~=3.9) -* Poetry (~=1.7) - installation instructions [here](https://python-poetry.org/docs/#installation) +- Python (~=3.9) +- Poetry (~=1.7) - installation instructions [here](https://python-poetry.org/docs/#installation) ### Installing the connector + From this connector directory, run: + ```bash poetry install --with dev ``` - ### Create credentials + **If you are a community contributor**, follow the instructions in the [documentation](https://docs.airbyte.com/integrations/sources/pinterest) to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `source_pinterest/spec.yaml` file. Note that any directory named `secrets` is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. See `sample_files/sample_config.json` for a sample config file. - ### Locally running the connector + ``` poetry run source-pinterest spec poetry run source-pinterest check --config secrets/config.json @@ -34,23 +35,28 @@ poetry run source-pinterest read --config secrets/config.json --catalog sample_f ``` ### Running unit tests + To run unit tests locally, from the connector directory run: + ``` poetry run pytest unit_tests ``` ### Building the docker image + 1. Install [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md) 2. Run the following command to build the docker image: + ```bash airbyte-ci connectors --name=source-pinterest build ``` An image will be available on your host with the tag `airbyte/source-pinterest:dev`. 
- ### Running as a docker container + Then run any of the connector commands as follows: + ``` docker run --rm airbyte/source-pinterest:dev spec docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-pinterest:dev check --config /secrets/config.json @@ -59,18 +65,23 @@ docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integrat ``` ### Running our CI test suite + You can run our full test suite locally using [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md): + ```bash airbyte-ci connectors --name=source-pinterest test ``` ### Customizing acceptance Tests + Customize `acceptance-test-config.yml` file to configure acceptance tests. See [Connector Acceptance Tests](https://docs.airbyte.com/connector-development/testing-connectors/connector-acceptance-tests-reference) for more information. If your connector requires to create or destroy resources for use during acceptance tests create fixtures for it and place them inside integration_tests/acceptance.py. ### Dependency Management -All of your dependencies should be managed via Poetry. + +All of your dependencies should be managed via Poetry. To add a new dependency, run: + ```bash poetry add ``` @@ -78,14 +89,16 @@ poetry add Please commit the changes to `pyproject.toml` and `poetry.lock` files. ## Publishing a new version of the connector + You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? + 1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-pinterest test` -2. Bump the connector version (please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors)): - - bump the `dockerImageTag` value in in `metadata.yaml` - - bump the `version` value in `pyproject.toml` +2. Bump the connector version (please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors)): + - bump the `dockerImageTag` value in `metadata.yaml` + - bump the `version` value in `pyproject.toml` 3. Make sure the `metadata.yaml` content is up to date. 4. Make sure the connector documentation and its changelog is up to date (`docs/integrations/sources/pinterest.md`). 5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention). 6. Pat yourself on the back for being an awesome contributor. 7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. -8. Once your PR is merged, the new version of the connector will be automatically published to Docker Hub and our connector registry. \ No newline at end of file +8. Once your PR is merged, the new version of the connector will be automatically published to Docker Hub and our connector registry. diff --git a/airbyte-integrations/connectors/source-pinterest/bootstrap.md b/airbyte-integrations/connectors/source-pinterest/bootstrap.md index b639dcd0209..7bc335da1e0 100644 --- a/airbyte-integrations/connectors/source-pinterest/bootstrap.md +++ b/airbyte-integrations/connectors/source-pinterest/bootstrap.md @@ -2,22 +2,21 @@ Pinterest is a REST based API. Connector is implemented with [Airbyte CDK](https://docs.airbyte.io/connector-development/cdk-python). 
-Connector has such core streams: - -* [Account analytics](https://developers.pinterest.com/docs/api/v5/#operation/user_account/analytics) \(Incremental\) -* [Boards](https://developers.pinterest.com/docs/api/v5/#operation/boards/list) \(Full table\) - * [Board sections](https://developers.pinterest.com/docs/api/v5/#operation/board_sections/list) \(Full table\) - * [Pins on board section](https://developers.pinterest.com/docs/api/v5/#operation/board_sections/list_pins) \(Full table\) - * [Pins on board](https://developers.pinterest.com/docs/api/v5/#operation/boards/list_pins) \(Full table\) -* [Ad accounts](https://developers.pinterest.com/docs/api/v5/#operation/ad_accounts/list) \(Full table\) - * [Ad account analytics](https://developers.pinterest.com/docs/api/v5/#operation/ad_account/analytics) \(Incremental\) - * [Campaigns](https://developers.pinterest.com/docs/api/v5/#operation/campaigns/list) \(Incremental\) - * [Campaign analytics](https://developers.pinterest.com/docs/api/v5/#operation/campaigns/list) \(Incremental\) - * [Ad groups](https://developers.pinterest.com/docs/api/v5/#operation/ad_groups/list) \(Incremental\) - * [Ad group analytics](https://developers.pinterest.com/docs/api/v5/#operation/ad_groups/analytics) \(Incremental\) - * [Ads](https://developers.pinterest.com/docs/api/v5/#operation/ads/list) \(Incremental\) - * [Ad analytics](https://developers.pinterest.com/docs/api/v5/#operation/ads/analytics) \(Incremental\) +The connector has the following core streams: +- [Account analytics](https://developers.pinterest.com/docs/api/v5/#operation/user_account/analytics) \(Incremental\) +- [Boards](https://developers.pinterest.com/docs/api/v5/#operation/boards/list) \(Full table\) + - [Board sections](https://developers.pinterest.com/docs/api/v5/#operation/board_sections/list) \(Full table\) + - [Pins on board section](https://developers.pinterest.com/docs/api/v5/#operation/board_sections/list_pins) \(Full table\) + - [Pins on board](https://developers.pinterest.com/docs/api/v5/#operation/boards/list_pins) \(Full table\) +- [Ad accounts](https://developers.pinterest.com/docs/api/v5/#operation/ad_accounts/list) \(Full table\) + - [Ad account analytics](https://developers.pinterest.com/docs/api/v5/#operation/ad_account/analytics) \(Incremental\) + - [Campaigns](https://developers.pinterest.com/docs/api/v5/#operation/campaigns/list) \(Incremental\) + - [Campaign analytics](https://developers.pinterest.com/docs/api/v5/#operation/campaigns/list) \(Incremental\) + - [Ad groups](https://developers.pinterest.com/docs/api/v5/#operation/ad_groups/list) \(Incremental\) + - [Ad group analytics](https://developers.pinterest.com/docs/api/v5/#operation/ad_groups/analytics) \(Incremental\) + - [Ads](https://developers.pinterest.com/docs/api/v5/#operation/ads/list) \(Incremental\) + - [Ad analytics](https://developers.pinterest.com/docs/api/v5/#operation/ads/analytics) \(Incremental\) The connector uses the `start_date` config as the starting point for the initial reports sync and the current date as the end date. 
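To make the `start_date`-to-current-date behaviour concrete, here is a small, self-contained sketch (not the connector's actual implementation) of how an incremental analytics stream can slice that range into report windows:

```python
from datetime import date, timedelta
from typing import Iterator, Tuple


def report_windows(start_date: date, window_days: int = 30) -> Iterator[Tuple[date, date]]:
    """Yield (window_start, window_end) pairs covering start_date up to today."""
    end = date.today()
    cursor = start_date
    while cursor <= end:
        window_end = min(cursor + timedelta(days=window_days - 1), end)
        yield cursor, window_end
        cursor = window_end + timedelta(days=1)


if __name__ == "__main__":
    # Example: 90-day report windows starting from an arbitrary start_date.
    for window_start, window_end in report_windows(date(2024, 1, 1), window_days=90):
        print(window_start, window_end)
```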
diff --git a/airbyte-integrations/connectors/source-pipedrive/README.md b/airbyte-integrations/connectors/source-pipedrive/README.md index 7fbcac238e0..a2fedd088f8 100644 --- a/airbyte-integrations/connectors/source-pipedrive/README.md +++ b/airbyte-integrations/connectors/source-pipedrive/README.md @@ -6,6 +6,7 @@ For information about how to use this connector within Airbyte, see [the documen ## Local development #### Create credentials + **If you are a community contributor**, follow the instructions in the [documentation](https://docs.airbyte.com/integrations/sources/pipedrive) to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `source_pipedrive/spec.yaml` file. Note that any directory named `secrets` is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. @@ -16,9 +17,10 @@ and place them into `secrets/config.json`. ### Locally running the connector docker image - #### Build + **Via [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md) (recommended):** + ```bash airbyte-ci connectors --name=source-pipedrive build ``` @@ -26,12 +28,15 @@ airbyte-ci connectors --name=source-pipedrive build An image will be built with the tag `airbyte/source-pipedrive:dev`. **Via `docker build`:** + ```bash docker build -t airbyte/source-pipedrive:dev . ``` #### Run + Then run any of the connector commands as follows: + ``` docker run --rm airbyte/source-pipedrive:dev spec docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-pipedrive:dev check --config /secrets/config.json @@ -40,23 +45,30 @@ docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integrat ``` ## Testing + You can run our full test suite locally using [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md): + ```bash airbyte-ci connectors --name=source-pipedrive test ``` ### Customizing acceptance Tests + Customize `acceptance-test-config.yml` file to configure tests. See [Connector Acceptance Tests](https://docs.airbyte.com/connector-development/testing-connectors/connector-acceptance-tests-reference) for more information. If your connector requires to create or destroy resources for use during acceptance tests create fixtures for it and place them inside integration_tests/acceptance.py. ## Dependency Management + All of your dependencies should go in `setup.py`, NOT `requirements.txt`. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development. We split dependencies between two groups, dependencies that are: -* required for your connector to work need to go to `MAIN_REQUIREMENTS` list. -* required for the testing need to go to `TEST_REQUIREMENTS` list + +- required for your connector to work need to go to `MAIN_REQUIREMENTS` list. +- required for the testing need to go to `TEST_REQUIREMENTS` list ### Publishing a new version of the connector + You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? + 1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-pipedrive test` 2. Bump the connector version in `metadata.yaml`: increment the `dockerImageTag` value. Please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors). 3. 
Make sure the `metadata.yaml` content is up to date. @@ -64,4 +76,3 @@ You've checked out the repo, implemented a million dollar feature, and you're re 5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention). 6. Pat yourself on the back for being an awesome contributor. 7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. - diff --git a/airbyte-integrations/connectors/source-pivotal-tracker/README.md b/airbyte-integrations/connectors/source-pivotal-tracker/README.md index bf5fa41cbc3..0a5fa3db561 100644 --- a/airbyte-integrations/connectors/source-pivotal-tracker/README.md +++ b/airbyte-integrations/connectors/source-pivotal-tracker/README.md @@ -6,23 +6,28 @@ For information about how to use this connector within Airbyte, see [the documen ## Local development ### Prerequisites + **To iterate on this connector, make sure to complete this prerequisites section.** #### Minimum Python version required `= 3.7.0` #### Build & Activate Virtual Environment and install dependencies + From this connector directory, create a virtual environment: + ``` python -m venv .venv ``` This will generate a virtualenv for this module in `.venv/`. Make sure this venv is active in your development environment of choice. To activate it from the terminal, run: + ``` source .venv/bin/activate pip install -r requirements.txt pip install '.[tests]' ``` + If you are in an IDE, follow your IDE's instructions to activate the virtualenv. Note that while we are installing dependencies from `requirements.txt`, you should only edit `setup.py` for your dependencies. `requirements.txt` is @@ -31,6 +36,7 @@ If this is mumbo jumbo to you, don't worry about it, just put your deps in `setu should work as you expect. #### Create credentials + **If you are a community contributor**, follow the instructions in the [documentation](https://docs.airbyte.io/integrations/sources/pivotal-tracker) to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `source_pivotal_tracker/spec.json` file. Note that any directory named `secrets` is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. @@ -40,6 +46,7 @@ See `integration_tests/sample_config.json` for a sample config file. and place them into `secrets/config.json`. ### Locally running the connector + ``` python main.py spec python main.py check --config secrets/config.json @@ -49,9 +56,10 @@ python main.py read --config secrets/config.json --catalog integration_tests/con ### Locally running the connector docker image - #### Build + **Via [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md) (recommended):** + ```bash airbyte-ci connectors --name=source-pivotal-tracker build ``` @@ -59,12 +67,15 @@ airbyte-ci connectors --name=source-pivotal-tracker build An image will be built with the tag `airbyte/source-pivotal-tracker:dev`. **Via `docker build`:** + ```bash docker build -t airbyte/source-pivotal-tracker:dev . 
``` #### Run + Then run any of the connector commands as follows: + ``` docker run --rm airbyte/source-pivotal-tracker:dev spec docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-pivotal-tracker:dev check --config /secrets/config.json @@ -73,23 +84,30 @@ docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integrat ``` ## Testing + You can run our full test suite locally using [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md): + ```bash airbyte-ci connectors --name=source-pivotal-tracker test ``` ### Customizing acceptance Tests + Customize `acceptance-test-config.yml` file to configure tests. See [Connector Acceptance Tests](https://docs.airbyte.com/connector-development/testing-connectors/connector-acceptance-tests-reference) for more information. If your connector requires to create or destroy resources for use during acceptance tests create fixtures for it and place them inside integration_tests/acceptance.py. ## Dependency Management + All of your dependencies should go in `setup.py`, NOT `requirements.txt`. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development. We split dependencies between two groups, dependencies that are: -* required for your connector to work need to go to `MAIN_REQUIREMENTS` list. -* required for the testing need to go to `TEST_REQUIREMENTS` list + +- required for your connector to work need to go to `MAIN_REQUIREMENTS` list. +- required for the testing need to go to `TEST_REQUIREMENTS` list ### Publishing a new version of the connector + You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? + 1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-pivotal-tracker test` 2. Bump the connector version in `metadata.yaml`: increment the `dockerImageTag` value. Please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors). 3. Make sure the `metadata.yaml` content is up to date. @@ -97,4 +115,3 @@ You've checked out the repo, implemented a million dollar feature, and you're re 5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention). 6. Pat yourself on the back for being an awesome contributor. 7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. - diff --git a/airbyte-integrations/connectors/source-plaid/README.md b/airbyte-integrations/connectors/source-plaid/README.md index 43550443354..0f78a5508ba 100644 --- a/airbyte-integrations/connectors/source-plaid/README.md +++ b/airbyte-integrations/connectors/source-plaid/README.md @@ -6,22 +6,27 @@ For information about how to use this connector within Airbyte, see [the documen ## Local development ### Prerequisites + **To iterate on this connector, make sure to complete this prerequisites section.** #### Minimum Python version required `= 3.7.0` #### Build & Activate Virtual Environment and install dependencies + From this connector directory, create a virtual environment: + ``` python -m venv .venv ``` This will generate a virtualenv for this module in `.venv/`. Make sure this venv is active in your development environment of choice. 
To activate it from the terminal, run: + ``` source .venv/bin/activate pip install -r requirements.txt ``` + If you are in an IDE, follow your IDE's instructions to activate the virtualenv. Note that while we are installing dependencies from `requirements.txt`, you should only edit `setup.py` for your dependencies. `requirements.txt` is @@ -30,6 +35,7 @@ If this is mumbo jumbo to you, don't worry about it, just put your deps in `setu should work as you expect. #### Create credentials + **If you are a community contributor**, follow the instructions in the [documentation](https://docs.airbyte.io/integrations/sources/plaid) to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `source_plaid/spec.json` file. Note that the `secrets` directory is gitignored by default, so there is no danger of accidentally checking in sensitive information. @@ -39,6 +45,7 @@ See `integration_tests/sample_config.json` for a sample config file. and place them into `secrets/config.json`. ### Locally running the connector + ``` python main.py spec python main.py check --config secrets/config.json @@ -48,9 +55,10 @@ python main.py read --config secrets/config.json --catalog integration_tests/con ### Locally running the connector docker image - #### Build + **Via [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md) (recommended):** + ```bash airbyte-ci connectors --name=source-plaid build ``` @@ -58,12 +66,15 @@ airbyte-ci connectors --name=source-plaid build An image will be built with the tag `airbyte/source-plaid:dev`. **Via `docker build`:** + ```bash docker build -t airbyte/source-plaid:dev . ``` #### Run + Then run any of the connector commands as follows: + ``` docker run --rm airbyte/source-plaid:dev spec docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-plaid:dev check --config /secrets/config.json @@ -72,23 +83,30 @@ docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integrat ``` ## Testing + You can run our full test suite locally using [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md): + ```bash airbyte-ci connectors --name=source-plaid test ``` ### Customizing acceptance Tests + Customize `acceptance-test-config.yml` file to configure tests. See [Connector Acceptance Tests](https://docs.airbyte.com/connector-development/testing-connectors/connector-acceptance-tests-reference) for more information. If your connector requires to create or destroy resources for use during acceptance tests create fixtures for it and place them inside integration_tests/acceptance.py. ## Dependency Management + All of your dependencies should go in `setup.py`, NOT `requirements.txt`. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development. We split dependencies between two groups, dependencies that are: -* required for your connector to work need to go to `MAIN_REQUIREMENTS` list. -* required for the testing need to go to `TEST_REQUIREMENTS` list + +- required for your connector to work need to go to `MAIN_REQUIREMENTS` list. +- required for the testing need to go to `TEST_REQUIREMENTS` list ### Publishing a new version of the connector + You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? + 1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-plaid test` 2. 
Bump the connector version in `metadata.yaml`: increment the `dockerImageTag` value. Please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors). 3. Make sure the `metadata.yaml` content is up to date. @@ -96,4 +114,3 @@ You've checked out the repo, implemented a million dollar feature, and you're re 5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention). 6. Pat yourself on the back for being an awesome contributor. 7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. - diff --git a/airbyte-integrations/connectors/source-plausible/BOOTSTRAP.md b/airbyte-integrations/connectors/source-plausible/BOOTSTRAP.md index bf25537a119..b10a51b74c2 100644 --- a/airbyte-integrations/connectors/source-plausible/BOOTSTRAP.md +++ b/airbyte-integrations/connectors/source-plausible/BOOTSTRAP.md @@ -3,6 +3,7 @@ Plausible is a privacy-first, subscription-only website analytics service. Link to their stats API is [here](https://plausible.io/docs/stats-api). ## How to get an API key + - [Sign up for Plausible](https://plausible.io/register). There is a 30-day free trial but beyond that it is a paid subscription. - [Add a website](https://plausible.io/docs/plausible-script). - Generate an API key from the [Settings page](https://plausible.io/settings). diff --git a/airbyte-integrations/connectors/source-plausible/README.md b/airbyte-integrations/connectors/source-plausible/README.md index e3bca2ee96c..4daf136d7ed 100644 --- a/airbyte-integrations/connectors/source-plausible/README.md +++ b/airbyte-integrations/connectors/source-plausible/README.md @@ -6,6 +6,7 @@ For information about how to use this connector within Airbyte, see [the documen ## Local development #### Create credentials + **If you are a community contributor**, follow the instructions in the [documentation](https://docs.airbyte.io/integrations/sources/plausible) to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `source_plausible/spec.yaml` file. Note that any directory named `secrets` is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. @@ -16,9 +17,10 @@ and place them into `secrets/config.json`. ### Locally running the connector docker image - #### Build + **Via [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md) (recommended):** + ```bash airbyte-ci connectors --name=source-plausible build ``` @@ -26,12 +28,15 @@ airbyte-ci connectors --name=source-plausible build An image will be built with the tag `airbyte/source-plausible:dev`. **Via `docker build`:** + ```bash docker build -t airbyte/source-plausible:dev . 
``` #### Run + Then run any of the connector commands as follows: + ``` docker run --rm airbyte/source-plausible:dev spec docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-plausible:dev check --config /secrets/config.json @@ -40,23 +45,30 @@ docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integrat ``` ## Testing + You can run our full test suite locally using [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md): + ```bash airbyte-ci connectors --name=source-plausible test ``` ### Customizing acceptance Tests + Customize `acceptance-test-config.yml` file to configure tests. See [Connector Acceptance Tests](https://docs.airbyte.com/connector-development/testing-connectors/connector-acceptance-tests-reference) for more information. If your connector requires to create or destroy resources for use during acceptance tests create fixtures for it and place them inside integration_tests/acceptance.py. ## Dependency Management + All of your dependencies should go in `setup.py`, NOT `requirements.txt`. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development. We split dependencies between two groups, dependencies that are: -* required for your connector to work need to go to `MAIN_REQUIREMENTS` list. -* required for the testing need to go to `TEST_REQUIREMENTS` list + +- required for your connector to work need to go to `MAIN_REQUIREMENTS` list. +- required for the testing need to go to `TEST_REQUIREMENTS` list ### Publishing a new version of the connector + You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? + 1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-plausible test` 2. Bump the connector version in `metadata.yaml`: increment the `dockerImageTag` value. Please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors). 3. Make sure the `metadata.yaml` content is up to date. @@ -64,4 +76,3 @@ You've checked out the repo, implemented a million dollar feature, and you're re 5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention). 6. Pat yourself on the back for being an awesome contributor. 7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. - diff --git a/airbyte-integrations/connectors/source-pocket/README.md b/airbyte-integrations/connectors/source-pocket/README.md index 767afb93e18..3f6139c52f0 100644 --- a/airbyte-integrations/connectors/source-pocket/README.md +++ b/airbyte-integrations/connectors/source-pocket/README.md @@ -1,31 +1,32 @@ # Pocket source connector - This is the repository for the Pocket source connector, written in Python. For information about how to use this connector within Airbyte, see [the documentation](https://docs.airbyte.com/integrations/sources/pocket). 
## Local development ### Prerequisites -* Python (~=3.9) -* Poetry (~=1.7) - installation instructions [here](https://python-poetry.org/docs/#installation) +- Python (~=3.9) +- Poetry (~=1.7) - installation instructions [here](https://python-poetry.org/docs/#installation) ### Installing the connector + From this connector directory, run: + ```bash poetry install --with dev ``` - ### Create credentials + **If you are a community contributor**, follow the instructions in the [documentation](https://docs.airbyte.com/integrations/sources/pocket) to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `source_pocket/spec.yaml` file. Note that any directory named `secrets` is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. See `sample_files/sample_config.json` for a sample config file. - ### Locally running the connector + ``` poetry run source-pocket spec poetry run source-pocket check --config secrets/config.json @@ -34,23 +35,28 @@ poetry run source-pocket read --config secrets/config.json --catalog sample_file ``` ### Running unit tests + To run unit tests locally, from the connector directory run: + ``` poetry run pytest unit_tests ``` ### Building the docker image + 1. Install [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md) 2. Run the following command to build the docker image: + ```bash airbyte-ci connectors --name=source-pocket build ``` An image will be available on your host with the tag `airbyte/source-pocket:dev`. - ### Running as a docker container + Then run any of the connector commands as follows: + ``` docker run --rm airbyte/source-pocket:dev spec docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-pocket:dev check --config /secrets/config.json @@ -59,18 +65,23 @@ docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integrat ``` ### Running our CI test suite + You can run our full test suite locally using [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md): + ```bash airbyte-ci connectors --name=source-pocket test ``` ### Customizing acceptance Tests + Customize `acceptance-test-config.yml` file to configure acceptance tests. See [Connector Acceptance Tests](https://docs.airbyte.com/connector-development/testing-connectors/connector-acceptance-tests-reference) for more information. If your connector requires to create or destroy resources for use during acceptance tests create fixtures for it and place them inside integration_tests/acceptance.py. ### Dependency Management -All of your dependencies should be managed via Poetry. + +All of your dependencies should be managed via Poetry. To add a new dependency, run: + ```bash poetry add ``` @@ -78,14 +89,16 @@ poetry add Please commit the changes to `pyproject.toml` and `poetry.lock` files. ## Publishing a new version of the connector + You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? + 1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-pocket test` -2. Bump the connector version (please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors)): - - bump the `dockerImageTag` value in in `metadata.yaml` - - bump the `version` value in `pyproject.toml` +2. 
Bump the connector version (please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors)): + - bump the `dockerImageTag` value in in `metadata.yaml` + - bump the `version` value in `pyproject.toml` 3. Make sure the `metadata.yaml` content is up to date. 4. Make sure the connector documentation and its changelog is up to date (`docs/integrations/sources/pocket.md`). 5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention). 6. Pat yourself on the back for being an awesome contributor. 7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. -8. Once your PR is merged, the new version of the connector will be automatically published to Docker Hub and our connector registry. \ No newline at end of file +8. Once your PR is merged, the new version of the connector will be automatically published to Docker Hub and our connector registry. diff --git a/airbyte-integrations/connectors/source-pocket/bootstrap.md b/airbyte-integrations/connectors/source-pocket/bootstrap.md index a817be1dc7a..4c1a8610c46 100644 --- a/airbyte-integrations/connectors/source-pocket/bootstrap.md +++ b/airbyte-integrations/connectors/source-pocket/bootstrap.md @@ -16,4 +16,4 @@ In order to use the /v3/get endpoint, your consumer key must have the "Retrieve" ## Secret generation -In order to generate both needed secrets to authenticate (consumer key and access token), you can follow the steps described in [https://docs.airbyte.com/integrations/sources/pocket](https://docs.airbyte.com/integrations/sources/pocket) \ No newline at end of file +In order to generate both needed secrets to authenticate (consumer key and access token), you can follow the steps described in [https://docs.airbyte.com/integrations/sources/pocket](https://docs.airbyte.com/integrations/sources/pocket) diff --git a/airbyte-integrations/connectors/source-pokeapi/README.md b/airbyte-integrations/connectors/source-pokeapi/README.md index 0cd90facb88..c372c238ad3 100644 --- a/airbyte-integrations/connectors/source-pokeapi/README.md +++ b/airbyte-integrations/connectors/source-pokeapi/README.md @@ -6,6 +6,7 @@ For information about how to use this connector within Airbyte, see [the documen ## Local development #### Create credentials + **If you are a community contributor**, follow the instructions in the [documentation](https://docs.airbyte.com/integrations/sources/pokeapi) to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `source_pokeapi/spec.yaml` file. Note that any directory named `secrets` is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. @@ -16,9 +17,10 @@ and place them into `secrets/config.json`. ### Locally running the connector docker image - #### Build + **Via [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md) (recommended):** + ```bash airbyte-ci connectors --name=source-pokeapi build ``` @@ -26,12 +28,15 @@ airbyte-ci connectors --name=source-pokeapi build An image will be built with the tag `airbyte/source-pokeapi:dev`. **Via `docker build`:** + ```bash docker build -t airbyte/source-pokeapi:dev . 
``` #### Run + Then run any of the connector commands as follows: + ``` docker run --rm airbyte/source-pokeapi:dev spec docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-pokeapi:dev check --config /secrets/config.json @@ -40,23 +45,30 @@ docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integrat ``` ## Testing + You can run our full test suite locally using [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md): + ```bash airbyte-ci connectors --name=source-pokeapi test ``` ### Customizing acceptance Tests + Customize `acceptance-test-config.yml` file to configure tests. See [Connector Acceptance Tests](https://docs.airbyte.com/connector-development/testing-connectors/connector-acceptance-tests-reference) for more information. If your connector requires to create or destroy resources for use during acceptance tests create fixtures for it and place them inside integration_tests/acceptance.py. ## Dependency Management + All of your dependencies should go in `setup.py`, NOT `requirements.txt`. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development. We split dependencies between two groups, dependencies that are: -* required for your connector to work need to go to `MAIN_REQUIREMENTS` list. -* required for the testing need to go to `TEST_REQUIREMENTS` list + +- required for your connector to work need to go to `MAIN_REQUIREMENTS` list. +- required for the testing need to go to `TEST_REQUIREMENTS` list ### Publishing a new version of the connector + You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? + 1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-pokeapi test` 2. Bump the connector version in `metadata.yaml`: increment the `dockerImageTag` value. Please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors). 3. Make sure the `metadata.yaml` content is up to date. @@ -64,4 +76,3 @@ You've checked out the repo, implemented a million dollar feature, and you're re 5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention). 6. Pat yourself on the back for being an awesome contributor. 7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. - diff --git a/airbyte-integrations/connectors/source-polygon-stock-api/README.md b/airbyte-integrations/connectors/source-polygon-stock-api/README.md index 966f3fc4a50..dfc5209d8e6 100644 --- a/airbyte-integrations/connectors/source-polygon-stock-api/README.md +++ b/airbyte-integrations/connectors/source-polygon-stock-api/README.md @@ -1,31 +1,32 @@ # Polygon-Stock-Api source connector - This is the repository for the Polygon-Stock-Api source connector, written in Python. For information about how to use this connector within Airbyte, see [the documentation](https://docs.airbyte.com/integrations/sources/polygon-stock-api). 
## Local development ### Prerequisites -* Python (~=3.9) -* Poetry (~=1.7) - installation instructions [here](https://python-poetry.org/docs/#installation) +- Python (~=3.9) +- Poetry (~=1.7) - installation instructions [here](https://python-poetry.org/docs/#installation) ### Installing the connector + From this connector directory, run: + ```bash poetry install --with dev ``` - ### Create credentials + **If you are a community contributor**, follow the instructions in the [documentation](https://docs.airbyte.com/integrations/sources/polygon-stock-api) to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `source_polygon_stock_api/spec.yaml` file. Note that any directory named `secrets` is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. See `sample_files/sample_config.json` for a sample config file. - ### Locally running the connector + ``` poetry run source-polygon-stock-api spec poetry run source-polygon-stock-api check --config secrets/config.json @@ -34,23 +35,28 @@ poetry run source-polygon-stock-api read --config secrets/config.json --catalog ``` ### Running unit tests + To run unit tests locally, from the connector directory run: + ``` poetry run pytest unit_tests ``` ### Building the docker image + 1. Install [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md) 2. Run the following command to build the docker image: + ```bash airbyte-ci connectors --name=source-polygon-stock-api build ``` An image will be available on your host with the tag `airbyte/source-polygon-stock-api:dev`. - ### Running as a docker container + Then run any of the connector commands as follows: + ``` docker run --rm airbyte/source-polygon-stock-api:dev spec docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-polygon-stock-api:dev check --config /secrets/config.json @@ -59,18 +65,23 @@ docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integrat ``` ### Running our CI test suite + You can run our full test suite locally using [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md): + ```bash airbyte-ci connectors --name=source-polygon-stock-api test ``` ### Customizing acceptance Tests + Customize `acceptance-test-config.yml` file to configure acceptance tests. See [Connector Acceptance Tests](https://docs.airbyte.com/connector-development/testing-connectors/connector-acceptance-tests-reference) for more information. If your connector requires to create or destroy resources for use during acceptance tests create fixtures for it and place them inside integration_tests/acceptance.py. ### Dependency Management -All of your dependencies should be managed via Poetry. + +All of your dependencies should be managed via Poetry. To add a new dependency, run: + ```bash poetry add ``` @@ -78,14 +89,16 @@ poetry add Please commit the changes to `pyproject.toml` and `poetry.lock` files. ## Publishing a new version of the connector + You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? + 1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-polygon-stock-api test` -2. 
Bump the connector version (please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors)): - - bump the `dockerImageTag` value in in `metadata.yaml` - - bump the `version` value in `pyproject.toml` +2. Bump the connector version (please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors)): + - bump the `dockerImageTag` value in in `metadata.yaml` + - bump the `version` value in `pyproject.toml` 3. Make sure the `metadata.yaml` content is up to date. 4. Make sure the connector documentation and its changelog is up to date (`docs/integrations/sources/polygon-stock-api.md`). 5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention). 6. Pat yourself on the back for being an awesome contributor. 7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. -8. Once your PR is merged, the new version of the connector will be automatically published to Docker Hub and our connector registry. \ No newline at end of file +8. Once your PR is merged, the new version of the connector will be automatically published to Docker Hub and our connector registry. diff --git a/airbyte-integrations/connectors/source-postgres/README.md b/airbyte-integrations/connectors/source-postgres/README.md index 0c6c726d6b9..5e8175d6ede 100644 --- a/airbyte-integrations/connectors/source-postgres/README.md +++ b/airbyte-integrations/connectors/source-postgres/README.md @@ -3,11 +3,13 @@ ## Performance Test To run performance tests in commandline: + ```shell ./gradlew :airbyte-integrations:connectors:source-postgres:performanceTest [--cpulimit=cpulimit/] [--memorylimit=memorylimit/] ``` In pull request: + ```shell /test-performance connector=connectors/source-postgres [--cpulimit=cpulimit/] [--memorylimit=memorylimit/] ``` @@ -18,7 +20,7 @@ In pull request: ### Use Postgres script to populate the benchmark database -In order to create a database with a certain number of tables, and a certain number of records in each of them, +In order to create a database with a certain number of tables, and a certain number of records in each of them, you need to follow a few simple steps. 1. Create a new database. @@ -30,4 +32,4 @@ you need to follow a few simple steps. psql -h -d -U -p -a -q -f src/test-performance/sql/2-create-insert-rows-to-table-procedure.sql psql -h -d -U -p -a -q -f src/test-performance/sql/3-run-script.sql ``` -4. After the script finishes, you will receive the number of tables specified in the script, with names starting with **test_0** and ending with **test_(the number of tables minus 1)**. +4. After the script finishes, you will receive the number of tables specified in the script, with names starting with **test_0** and ending with **test\_(the number of tables minus 1)**. diff --git a/airbyte-integrations/connectors/source-postgres/integration_tests/README.md b/airbyte-integrations/connectors/source-postgres/integration_tests/README.md index 45e74b238d3..e41730dd349 100644 --- a/airbyte-integrations/connectors/source-postgres/integration_tests/README.md +++ b/airbyte-integrations/connectors/source-postgres/integration_tests/README.md @@ -1,5 +1,6 @@ This directory contains files used to run Connector Acceptance Tests. 
-* `abnormal_state.json` describes a connector state with a non-existing cursor value. -* `expected_records.txt` lists all the records expected as the output of the basic read operation. -* `incremental_configured_catalog.json` is a configured catalog used as an input of the `incremental` test. -* `seed.sql` is the query we manually ran on a test postgres instance to seed it with test data and enable CDC. \ No newline at end of file + +- `abnormal_state.json` describes a connector state with a non-existing cursor value. +- `expected_records.txt` lists all the records expected as the output of the basic read operation. +- `incremental_configured_catalog.json` is a configured catalog used as an input of the `incremental` test. +- `seed.sql` is the query we manually ran on a test postgres instance to seed it with test data and enable CDC. diff --git a/airbyte-integrations/connectors/source-posthog/README.md b/airbyte-integrations/connectors/source-posthog/README.md index f4871371a74..9ce5082a306 100644 --- a/airbyte-integrations/connectors/source-posthog/README.md +++ b/airbyte-integrations/connectors/source-posthog/README.md @@ -6,20 +6,25 @@ For information about how to use this connector within Airbyte, see [the documen ## Local development ### Prerequisites + **To iterate on this connector, make sure to complete this prerequisites section.** #### Build & Activate Virtual Environment and install dependencies + From this connector directory, create a virtual environment: + ``` python -m venv .venv ``` This will generate a virtualenv for this module in `.venv/`. Make sure this venv is active in your development environment of choice. To activate it from the terminal, run: + ``` source .venv/bin/activate pip install -r requirements.txt ``` + If you are in an IDE, follow your IDE's instructions to activate the virtualenv. Note that while we are installing dependencies from `requirements.txt`, you should only edit `setup.py` for your dependencies. `requirements.txt` is @@ -28,6 +33,7 @@ If this is mumbo jumbo to you, don't worry about it, just put your deps in `setu should work as you expect. #### Create credentials + **If you are a community contributor**, follow the instructions in the [documentation](https://docs.airbyte.io/integrations/sources/posthog) to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `source_posthog/spec.json` file. Note that any directory named `secrets` is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. @@ -37,6 +43,7 @@ See `sample_files/sample_config.json` for a sample config file. and place them into `secrets/config.json`. ### Locally running the connector + ``` python main.py spec python main.py check --config secrets/config.json @@ -46,9 +53,10 @@ python main.py read --config secrets/config.json --catalog sample_files/configur ### Locally running the connector docker image - #### Build + **Via [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md) (recommended):** + ```bash airbyte-ci connectors --name=source-posthog build ``` @@ -56,12 +64,15 @@ airbyte-ci connectors --name=source-posthog build An image will be built with the tag `airbyte/source-posthog:dev`. **Via `docker build`:** + ```bash docker build -t airbyte/source-posthog:dev . 
``` #### Run + Then run any of the connector commands as follows: + ``` docker run --rm airbyte/source-posthog:dev spec docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-posthog:dev check --config /secrets/config.json @@ -70,23 +81,30 @@ docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/sample_files:/sample_files ``` ## Testing + You can run our full test suite locally using [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md): + ```bash airbyte-ci connectors --name=source-posthog test ``` ### Customizing acceptance Tests + Customize `acceptance-test-config.yml` file to configure tests. See [Connector Acceptance Tests](https://docs.airbyte.com/connector-development/testing-connectors/connector-acceptance-tests-reference) for more information. If your connector requires to create or destroy resources for use during acceptance tests create fixtures for it and place them inside integration_tests/acceptance.py. ## Dependency Management + All of your dependencies should go in `setup.py`, NOT `requirements.txt`. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development. We split dependencies between two groups, dependencies that are: -* required for your connector to work need to go to `MAIN_REQUIREMENTS` list. -* required for the testing need to go to `TEST_REQUIREMENTS` list + +- required for your connector to work need to go to `MAIN_REQUIREMENTS` list. +- required for the testing need to go to `TEST_REQUIREMENTS` list ### Publishing a new version of the connector + You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? + 1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-posthog test` 2. Bump the connector version in `metadata.yaml`: increment the `dockerImageTag` value. Please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors). 3. Make sure the `metadata.yaml` content is up to date. @@ -94,4 +112,3 @@ You've checked out the repo, implemented a million dollar feature, and you're re 5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention). 6. Pat yourself on the back for being an awesome contributor. 7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. - diff --git a/airbyte-integrations/connectors/source-postmarkapp/README.md b/airbyte-integrations/connectors/source-postmarkapp/README.md index c8bd4f25e1a..8e3cc593ad8 100644 --- a/airbyte-integrations/connectors/source-postmarkapp/README.md +++ b/airbyte-integrations/connectors/source-postmarkapp/README.md @@ -1,31 +1,32 @@ # Postmarkapp source connector - This is the repository for the Postmarkapp source connector, written in Python. For information about how to use this connector within Airbyte, see [the documentation](https://docs.airbyte.com/integrations/sources/postmarkapp). 
## Local development ### Prerequisites -* Python (~=3.9) -* Poetry (~=1.7) - installation instructions [here](https://python-poetry.org/docs/#installation) +- Python (~=3.9) +- Poetry (~=1.7) - installation instructions [here](https://python-poetry.org/docs/#installation) ### Installing the connector + From this connector directory, run: + ```bash poetry install --with dev ``` - ### Create credentials + **If you are a community contributor**, follow the instructions in the [documentation](https://docs.airbyte.com/integrations/sources/postmarkapp) to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `source_postmarkapp/spec.yaml` file. Note that any directory named `secrets` is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. See `sample_files/sample_config.json` for a sample config file. - ### Locally running the connector + ``` poetry run source-postmarkapp spec poetry run source-postmarkapp check --config secrets/config.json @@ -34,23 +35,28 @@ poetry run source-postmarkapp read --config secrets/config.json --catalog sample ``` ### Running unit tests + To run unit tests locally, from the connector directory run: + ``` poetry run pytest unit_tests ``` ### Building the docker image + 1. Install [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md) 2. Run the following command to build the docker image: + ```bash airbyte-ci connectors --name=source-postmarkapp build ``` An image will be available on your host with the tag `airbyte/source-postmarkapp:dev`. - ### Running as a docker container + Then run any of the connector commands as follows: + ``` docker run --rm airbyte/source-postmarkapp:dev spec docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-postmarkapp:dev check --config /secrets/config.json @@ -59,18 +65,23 @@ docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integrat ``` ### Running our CI test suite + You can run our full test suite locally using [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md): + ```bash airbyte-ci connectors --name=source-postmarkapp test ``` ### Customizing acceptance Tests + Customize `acceptance-test-config.yml` file to configure acceptance tests. See [Connector Acceptance Tests](https://docs.airbyte.com/connector-development/testing-connectors/connector-acceptance-tests-reference) for more information. If your connector requires to create or destroy resources for use during acceptance tests create fixtures for it and place them inside integration_tests/acceptance.py. ### Dependency Management -All of your dependencies should be managed via Poetry. + +All of your dependencies should be managed via Poetry. To add a new dependency, run: + ```bash poetry add ``` @@ -78,14 +89,16 @@ poetry add Please commit the changes to `pyproject.toml` and `poetry.lock` files. ## Publishing a new version of the connector + You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? + 1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-postmarkapp test` -2. 
Bump the connector version (please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors)): - - bump the `dockerImageTag` value in in `metadata.yaml` - - bump the `version` value in `pyproject.toml` +2. Bump the connector version (please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors)): + - bump the `dockerImageTag` value in in `metadata.yaml` + - bump the `version` value in `pyproject.toml` 3. Make sure the `metadata.yaml` content is up to date. 4. Make sure the connector documentation and its changelog is up to date (`docs/integrations/sources/postmarkapp.md`). 5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention). 6. Pat yourself on the back for being an awesome contributor. 7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. -8. Once your PR is merged, the new version of the connector will be automatically published to Docker Hub and our connector registry. \ No newline at end of file +8. Once your PR is merged, the new version of the connector will be automatically published to Docker Hub and our connector registry. diff --git a/airbyte-integrations/connectors/source-prestashop/README.md b/airbyte-integrations/connectors/source-prestashop/README.md index e65d710ee70..1e5a6bd9559 100644 --- a/airbyte-integrations/connectors/source-prestashop/README.md +++ b/airbyte-integrations/connectors/source-prestashop/README.md @@ -1,31 +1,32 @@ # Prestashop source connector - This is the repository for the Prestashop source connector, written in Python. For information about how to use this connector within Airbyte, see [the documentation](https://docs.airbyte.com/integrations/sources/prestashop). ## Local development ### Prerequisites -* Python (~=3.9) -* Poetry (~=1.7) - installation instructions [here](https://python-poetry.org/docs/#installation) +- Python (~=3.9) +- Poetry (~=1.7) - installation instructions [here](https://python-poetry.org/docs/#installation) ### Installing the connector + From this connector directory, run: + ```bash poetry install --with dev ``` - ### Create credentials + **If you are a community contributor**, follow the instructions in the [documentation](https://docs.airbyte.com/integrations/sources/prestashop) to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `source_prestashop/spec.yaml` file. Note that any directory named `secrets` is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. See `sample_files/sample_config.json` for a sample config file. - ### Locally running the connector + ``` poetry run source-prestashop spec poetry run source-prestashop check --config secrets/config.json @@ -34,23 +35,28 @@ poetry run source-prestashop read --config secrets/config.json --catalog sample_ ``` ### Running unit tests + To run unit tests locally, from the connector directory run: + ``` poetry run pytest unit_tests ``` ### Building the docker image + 1. Install [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md) 2. 
Run the following command to build the docker image: + ```bash airbyte-ci connectors --name=source-prestashop build ``` An image will be available on your host with the tag `airbyte/source-prestashop:dev`. - ### Running as a docker container + Then run any of the connector commands as follows: + ``` docker run --rm airbyte/source-prestashop:dev spec docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-prestashop:dev check --config /secrets/config.json @@ -59,18 +65,23 @@ docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integrat ``` ### Running our CI test suite + You can run our full test suite locally using [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md): + ```bash airbyte-ci connectors --name=source-prestashop test ``` ### Customizing acceptance Tests + Customize `acceptance-test-config.yml` file to configure acceptance tests. See [Connector Acceptance Tests](https://docs.airbyte.com/connector-development/testing-connectors/connector-acceptance-tests-reference) for more information. If your connector requires to create or destroy resources for use during acceptance tests create fixtures for it and place them inside integration_tests/acceptance.py. ### Dependency Management -All of your dependencies should be managed via Poetry. + +All of your dependencies should be managed via Poetry. To add a new dependency, run: + ```bash poetry add ``` @@ -78,14 +89,16 @@ poetry add Please commit the changes to `pyproject.toml` and `poetry.lock` files. ## Publishing a new version of the connector + You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? + 1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-prestashop test` -2. Bump the connector version (please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors)): - - bump the `dockerImageTag` value in in `metadata.yaml` - - bump the `version` value in `pyproject.toml` +2. Bump the connector version (please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors)): + - bump the `dockerImageTag` value in in `metadata.yaml` + - bump the `version` value in `pyproject.toml` 3. Make sure the `metadata.yaml` content is up to date. 4. Make sure the connector documentation and its changelog is up to date (`docs/integrations/sources/prestashop.md`). 5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention). 6. Pat yourself on the back for being an awesome contributor. 7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. -8. Once your PR is merged, the new version of the connector will be automatically published to Docker Hub and our connector registry. \ No newline at end of file +8. Once your PR is merged, the new version of the connector will be automatically published to Docker Hub and our connector registry. 
diff --git a/airbyte-integrations/connectors/source-primetric/README.md b/airbyte-integrations/connectors/source-primetric/README.md index e5954ff6185..bfcab53e8cf 100644 --- a/airbyte-integrations/connectors/source-primetric/README.md +++ b/airbyte-integrations/connectors/source-primetric/README.md @@ -7,19 +7,17 @@ For information about how to use this connector within Airbyte, see [the documen ### Prerequisites -* Python (`^3.9`) -* Poetry (`^1.7`) - installation instructions [here](https://python-poetry.org/docs/#installation) - - +- Python (`^3.9`) +- Poetry (`^1.7`) - installation instructions [here](https://python-poetry.org/docs/#installation) ### Installing the connector From this connector directory, run: + ```bash poetry install --with dev ``` - ### Create credentials **If you are a community contributor**, follow the instructions in the [documentation](https://docs.airbyte.com/integrations/sources/primetric) @@ -27,7 +25,6 @@ to generate the necessary credentials. Then create a file `secrets/config.json` Note that any directory named `secrets` is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. See `integration_tests/sample_config.json` for a sample config file. - ### Locally running the connector ``` @@ -49,16 +46,17 @@ poetry run pytest tests 1. Install [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md) 2. Run the following command to build the docker image: + ```bash airbyte-ci connectors --name=source-primetric build ``` An image will be available on your host with the tag `airbyte/source-primetric:dev`. - ### Running as a docker container Then run any of the connector commands as follows: + ``` docker run --rm airbyte/source-primetric:dev spec docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-primetric:dev check --config /secrets/config.json @@ -69,6 +67,7 @@ docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integrat ### Running our CI test suite You can run our full test suite locally using [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md): + ```bash airbyte-ci connectors --name=source-primetric test ``` @@ -80,8 +79,9 @@ If your connector requires to create or destroy resources for use during accepta ### Dependency Management -All of your dependencies should be managed via Poetry. +All of your dependencies should be managed via Poetry. To add a new dependency, run: + ```bash poetry add ``` @@ -91,13 +91,14 @@ Please commit the changes to `pyproject.toml` and `poetry.lock` files. ## Publishing a new version of the connector You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? + 1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-primetric test` -2. Bump the connector version (please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors)): - - bump the `dockerImageTag` value in in `metadata.yaml` - - bump the `version` value in `pyproject.toml` +2. 
Bump the connector version (please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors)): + - bump the `dockerImageTag` value in in `metadata.yaml` + - bump the `version` value in `pyproject.toml` 3. Make sure the `metadata.yaml` content is up to date. 4. Make sure the connector documentation and its changelog is up to date (`docs/integrations/sources/primetric.md`). 5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention). 6. Pat yourself on the back for being an awesome contributor. 7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. -8. Once your PR is merged, the new version of the connector will be automatically published to Docker Hub and our connector registry. \ No newline at end of file +8. Once your PR is merged, the new version of the connector will be automatically published to Docker Hub and our connector registry. diff --git a/airbyte-integrations/connectors/source-public-apis/README.md b/airbyte-integrations/connectors/source-public-apis/README.md index 19e543a7e8c..ad768bdb6df 100644 --- a/airbyte-integrations/connectors/source-public-apis/README.md +++ b/airbyte-integrations/connectors/source-public-apis/README.md @@ -6,6 +6,7 @@ For information about how to use this connector within Airbyte, see [the documen ## Local development #### Create credentials + **If you are a community contributor**, follow the instructions in the [documentation](https://docs.airbyte.com/integrations/sources/public-apis) to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `source_public_apis/spec.yaml` file. Note that any directory named `secrets` is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. @@ -16,9 +17,10 @@ and place them into `secrets/config.json`. ### Locally running the connector docker image - #### Build + **Via [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md) (recommended):** + ```bash airbyte-ci connectors --name=source-public-apis build ``` @@ -26,12 +28,15 @@ airbyte-ci connectors --name=source-public-apis build An image will be built with the tag `airbyte/source-public-apis:dev`. **Via `docker build`:** + ```bash docker build -t airbyte/source-public-apis:dev . ``` #### Run + Then run any of the connector commands as follows: + ``` docker run --rm airbyte/source-public-apis:dev spec docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-public-apis:dev check --config /secrets/config.json @@ -40,23 +45,30 @@ docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integrat ``` ## Testing + You can run our full test suite locally using [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md): + ```bash airbyte-ci connectors --name=source-public-apis test ``` ### Customizing acceptance Tests + Customize `acceptance-test-config.yml` file to configure tests. See [Connector Acceptance Tests](https://docs.airbyte.com/connector-development/testing-connectors/connector-acceptance-tests-reference) for more information. If your connector requires to create or destroy resources for use during acceptance tests create fixtures for it and place them inside integration_tests/acceptance.py. 
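If your acceptance tests do need resources created or destroyed, a session-scoped fixture in `integration_tests/acceptance.py` is the usual place for it. The sketch below follows the common connector template; the plugin name and fixture body are assumptions and may differ per connector.

```python
# integration_tests/acceptance.py - sketch based on the common template;
# adjust the fixture body to create/destroy whatever your tests need.
import pytest

pytest_plugins = ("connector_acceptance_test.plugin",)


@pytest.fixture(scope="session", autouse=True)
def connector_setup():
    """Create external resources needed by acceptance tests, then clean up."""
    # set up test resources here (e.g. seed a sandbox account)
    yield
    # tear down test resources here
```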
## Dependency Management + All of your dependencies should go in `setup.py`, NOT `requirements.txt`. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development. We split dependencies between two groups, dependencies that are: -* required for your connector to work need to go to `MAIN_REQUIREMENTS` list. -* required for the testing need to go to `TEST_REQUIREMENTS` list + +- required for your connector to work need to go to `MAIN_REQUIREMENTS` list. +- required for the testing need to go to `TEST_REQUIREMENTS` list ### Publishing a new version of the connector + You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? + 1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-public-apis test` 2. Bump the connector version in `metadata.yaml`: increment the `dockerImageTag` value. Please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors). 3. Make sure the `metadata.yaml` content is up to date. @@ -64,4 +76,3 @@ You've checked out the repo, implemented a million dollar feature, and you're re 5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention). 6. Pat yourself on the back for being an awesome contributor. 7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. - diff --git a/airbyte-integrations/connectors/source-punk-api/README.md b/airbyte-integrations/connectors/source-punk-api/README.md index 9f142149b0e..fd091300e2b 100644 --- a/airbyte-integrations/connectors/source-punk-api/README.md +++ b/airbyte-integrations/connectors/source-punk-api/README.md @@ -6,29 +6,36 @@ For information about how to use this connector within Airbyte, see [the documen ## Local development ### Prerequisites + **To iterate on this connector, make sure to complete this prerequisites section.** #### Minimum Python version required `= 3.9.0` #### Build & Activate Virtual Environment and install dependencies + From this connector directory, create a virtual environment: + ``` python -m venv .venv ``` This will generate a virtualenv for this module in `.venv/`. Make sure this venv is active in your development environment of choice. To activate it from the terminal, run: + ``` source .venv/bin/activate pip install -r requirements.txt ``` + If you are in an IDE, follow your IDE's instructions to activate the virtualenv. Note that while we are installing dependencies from `requirements.txt`, you should only edit `setup.py` for your dependencies. `requirements.txt` is used for editable installs (`pip install -e`) to pull in Python dependencies from the monorepo and will call `setup.py`. If this is mumbo jumbo to you, don't worry about it, just put your deps in `setup.py` but install using `pip install -r requirements.txt` and everything should work as you expect. + #### Create credentials + **If you are a community contributor**, follow the instructions in the [documentation](https://docs.airbyte.io/integrations/sources/punk-api) to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `source_punk_api/spec.yaml` file. 
Note that any directory named `secrets` is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. @@ -39,9 +46,10 @@ and place them into `secrets/config.json`. ### Locally running the connector docker image - #### Build + **Via [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md) (recommended):** + ```bash airbyte-ci connectors --name=source-punk-api build ``` @@ -49,12 +57,15 @@ airbyte-ci connectors --name=source-punk-api build An image will be built with the tag `airbyte/source-punk-api:dev`. **Via `docker build`:** + ```bash docker build -t airbyte/source-punk-api:dev . ``` #### Run + Then run any of the connector commands as follows: + ``` docker run --rm airbyte/source-punk-api:dev spec docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-punk-api:dev check --config /secrets/config.json @@ -63,23 +74,30 @@ docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integrat ``` ## Testing + You can run our full test suite locally using [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md): + ```bash airbyte-ci connectors --name=source-punk-api test ``` ### Customizing acceptance Tests + Customize `acceptance-test-config.yml` file to configure tests. See [Connector Acceptance Tests](https://docs.airbyte.com/connector-development/testing-connectors/connector-acceptance-tests-reference) for more information. If your connector requires to create or destroy resources for use during acceptance tests create fixtures for it and place them inside integration_tests/acceptance.py. ## Dependency Management + All of your dependencies should go in `setup.py`, NOT `requirements.txt`. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development. We split dependencies between two groups, dependencies that are: -* required for your connector to work need to go to `MAIN_REQUIREMENTS` list. -* required for the testing need to go to `TEST_REQUIREMENTS` list + +- required for your connector to work need to go to `MAIN_REQUIREMENTS` list. +- required for the testing need to go to `TEST_REQUIREMENTS` list ### Publishing a new version of the connector + You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? + 1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-punk-api test` 2. Bump the connector version in `metadata.yaml`: increment the `dockerImageTag` value. Please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors). 3. Make sure the `metadata.yaml` content is up to date. @@ -87,4 +105,3 @@ You've checked out the repo, implemented a million dollar feature, and you're re 5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention). 6. Pat yourself on the back for being an awesome contributor. 7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. 
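For reference, the `MAIN_REQUIREMENTS`/`TEST_REQUIREMENTS` split described under Dependency Management above normally lives in the connector's `setup.py`. The sketch below only illustrates the shape; the listed packages are placeholders rather than this connector's actual dependencies:

```
from setuptools import find_packages, setup

MAIN_REQUIREMENTS = [
    "airbyte-cdk",  # runtime dependencies the connector needs to work
    "requests",
]

TEST_REQUIREMENTS = [
    "pytest",  # dependencies needed only to run the test suites
    "requests-mock",
]

setup(
    name="source_punk_api",
    packages=find_packages(exclude=("tests",)),
    install_requires=MAIN_REQUIREMENTS,
    extras_require={"tests": TEST_REQUIREMENTS},
)
```

With this layout, `pip install -r requirements.txt` pulls in `MAIN_REQUIREMENTS` through the editable install, while the `tests` extra adds the test-only packages.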
- diff --git a/airbyte-integrations/connectors/source-punk-api/bootstrap.md b/airbyte-integrations/connectors/source-punk-api/bootstrap.md index 505cba3d16d..8be19358878 100644 --- a/airbyte-integrations/connectors/source-punk-api/bootstrap.md +++ b/airbyte-integrations/connectors/source-punk-api/bootstrap.md @@ -1,7 +1,7 @@ # Punk-API The connector uses the v2 API documented here: https://punkapi.com/documentation/v2 . It is -straightforward HTTP REST API with API authentication. +straightforward HTTP REST API with API authentication. ## API key @@ -27,8 +27,8 @@ Just pass the dummy API key and optional parameter for establishing the connecti 1. Navigate to the Airbyte Open Source dashboard. 2. Set the name for your source. -4. Enter your dummy `api_key`. -5. Enter the params configuration if needed: ID (Optional) -6. Click **Set up source**. +3. Enter your dummy `api_key`. +4. Enter the params configuration if needed: ID (Optional) +5. Click **Set up source**. - * We use only GET methods, towards the beers endpoints which is straightforward \ No newline at end of file +- We use only GET methods, towards the beers endpoints which is straightforward diff --git a/airbyte-integrations/connectors/source-pypi/README.md b/airbyte-integrations/connectors/source-pypi/README.md index 8dad6f52118..50e7f4cdb3a 100644 --- a/airbyte-integrations/connectors/source-pypi/README.md +++ b/airbyte-integrations/connectors/source-pypi/README.md @@ -1,31 +1,32 @@ # Pypi source connector - This is the repository for the Pypi source connector, written in Python. For information about how to use this connector within Airbyte, see [the documentation](https://docs.airbyte.com/integrations/sources/pypi). ## Local development ### Prerequisites -* Python (~=3.9) -* Poetry (~=1.7) - installation instructions [here](https://python-poetry.org/docs/#installation) +- Python (~=3.9) +- Poetry (~=1.7) - installation instructions [here](https://python-poetry.org/docs/#installation) ### Installing the connector + From this connector directory, run: + ```bash poetry install --with dev ``` - ### Create credentials + **If you are a community contributor**, follow the instructions in the [documentation](https://docs.airbyte.com/integrations/sources/pypi) to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `source_pypi/spec.yaml` file. Note that any directory named `secrets` is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. See `sample_files/sample_config.json` for a sample config file. - ### Locally running the connector + ``` poetry run source-pypi spec poetry run source-pypi check --config secrets/config.json @@ -34,23 +35,28 @@ poetry run source-pypi read --config secrets/config.json --catalog sample_files/ ``` ### Running unit tests + To run unit tests locally, from the connector directory run: + ``` poetry run pytest unit_tests ``` ### Building the docker image + 1. Install [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md) 2. Run the following command to build the docker image: + ```bash airbyte-ci connectors --name=source-pypi build ``` An image will be available on your host with the tag `airbyte/source-pypi:dev`. 
- ### Running as a docker container + Then run any of the connector commands as follows: + ``` docker run --rm airbyte/source-pypi:dev spec docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-pypi:dev check --config /secrets/config.json @@ -59,18 +65,23 @@ docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integrat ``` ### Running our CI test suite + You can run our full test suite locally using [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md): + ```bash airbyte-ci connectors --name=source-pypi test ``` ### Customizing acceptance Tests + Customize `acceptance-test-config.yml` file to configure acceptance tests. See [Connector Acceptance Tests](https://docs.airbyte.com/connector-development/testing-connectors/connector-acceptance-tests-reference) for more information. If your connector requires to create or destroy resources for use during acceptance tests create fixtures for it and place them inside integration_tests/acceptance.py. ### Dependency Management -All of your dependencies should be managed via Poetry. + +All of your dependencies should be managed via Poetry. To add a new dependency, run: + ```bash poetry add ``` @@ -78,14 +89,16 @@ poetry add Please commit the changes to `pyproject.toml` and `poetry.lock` files. ## Publishing a new version of the connector + You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? + 1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-pypi test` -2. Bump the connector version (please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors)): - - bump the `dockerImageTag` value in in `metadata.yaml` - - bump the `version` value in `pyproject.toml` +2. Bump the connector version (please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors)): + - bump the `dockerImageTag` value in in `metadata.yaml` + - bump the `version` value in `pyproject.toml` 3. Make sure the `metadata.yaml` content is up to date. 4. Make sure the connector documentation and its changelog is up to date (`docs/integrations/sources/pypi.md`). 5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention). 6. Pat yourself on the back for being an awesome contributor. 7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. -8. Once your PR is merged, the new version of the connector will be automatically published to Docker Hub and our connector registry. \ No newline at end of file +8. Once your PR is merged, the new version of the connector will be automatically published to Docker Hub and our connector registry. 
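The `poetry run pytest unit_tests` step above picks up plain pytest modules from the connector's `unit_tests/` folder. As a rough sketch only (the `source_pypi.source` import path and the CDK `spec()` call are assumptions, not checked against this connector), such a test might look like:

```
import logging

from source_pypi.source import SourcePypi  # assumed module layout


def test_spec_exposes_connection_specification():
    source = SourcePypi()
    spec = source.spec(logging.getLogger("airbyte"))
    # Every connector spec should expose a JSON-schema connection specification.
    assert spec.connectionSpecification
```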
diff --git a/airbyte-integrations/connectors/source-python-http-tutorial/README.md b/airbyte-integrations/connectors/source-python-http-tutorial/README.md index 94435e30ab9..d7acc16f25d 100644 --- a/airbyte-integrations/connectors/source-python-http-tutorial/README.md +++ b/airbyte-integrations/connectors/source-python-http-tutorial/README.md @@ -6,22 +6,27 @@ For information about how to use this connector within Airbyte, see [the documen ## Local development ### Prerequisites + **To iterate on this connector, make sure to complete this prerequisites section.** #### Minimum Python version required `= 3.7.0` #### Build & Activate Virtual Environment and install dependencies + From this connector directory, create a virtual environment: + ``` python -m venv .venv ``` This will generate a virtualenv for this module in `.venv/`. Make sure this venv is active in your development environment of choice. To activate it from the terminal, run: + ``` source .venv/bin/activate pip install -r requirements.txt ``` + If you are in an IDE, follow your IDE's instructions to activate the virtualenv. Note that while we are installing dependencies from `requirements.txt`, you should only edit `setup.py` for your dependencies. `requirements.txt` is @@ -30,14 +35,17 @@ If this is mumbo jumbo to you, don't worry about it, just put your deps in `setu should work as you expect. #### Building via Gradle + You can also build the connector in Gradle. This is typically used in CI and not needed for your development workflow. To build using Gradle, from the Airbyte repository root, run: + ``` ./gradlew :airbyte-integrations:connectors:source-python-http-tutorial:build ``` #### Create credentials + **If you are a community contributor**, follow the instructions in the [documentation](https://docs.airbyte.io/integrations/sources/python-http-tutorial) to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `source_python_http_tutorial/spec.json` file. Note that any directory named `secrets` is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. @@ -46,8 +54,8 @@ See `sample_files/sample_config.json` for a sample config file. **If you are an Airbyte core member**, copy the credentials in Lastpass under the secret name `source python-http-tutorial test creds` and place them into `secrets/config.json`. - ### Locally running the connector + ``` python main.py spec python main.py check --config sample_files/config.json @@ -56,7 +64,9 @@ python main.py read --config sample_files/config.json --catalog sample_files/con ``` ### Unit Tests + To run unit tests locally, from the connector directory run: + ``` python -m pytest unit_tests ``` @@ -64,7 +74,9 @@ python -m pytest unit_tests ### Locally running the connector docker image #### Build + **Via [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md) (recommended):** + ```bash airbyte-ci connectors --name=source-python-http-tutorial build ``` @@ -72,13 +84,15 @@ airbyte-ci connectors --name=source-python-http-tutorial build An image will be built with the tag `airbyte/source-python-http-tutorial:dev`. **Via `docker build`:** + ```bash docker build -t airbyte/source-python-http-tutorial:dev . 
``` - #### Run + Then run any of the connector commands as follows: + ``` docker run --rm airbyte/source-python-http-tutorial:dev spec docker run --rm -v $(pwd)/sample_files:/sample_files airbyte/source-python-http-tutorial:dev check --config /sample_files/config.json @@ -87,19 +101,24 @@ docker run --rm -v $(pwd)/sample_files:/sample_files -v $(pwd)/sample_files:/sam ``` ## Testing + You can run our full test suite locally using [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md): + ```bash airbyte-ci connectors --name=source-python-http-tutorial test ``` ### Customizing acceptance Tests + Customize `acceptance-test-config.yml` file to configure tests. See [Connector Acceptance Tests](https://docs.airbyte.com/connector-development/testing-connectors/connector-acceptance-tests-reference) for more information. If your connector requires to create or destroy resources for use during acceptance tests create fixtures for it and place them inside integration_tests/acceptance.py. ## Dependency Management + All of your dependencies should go in `setup.py`, NOT `requirements.txt`. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development. ### Publishing a new version of the connector + 1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-python-http-tutorial test` 2. Bump the connector version in `metadata.yaml`: increment the `dockerImageTag` value. Please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors). 3. Make sure the `metadata.yaml` content is up to date. @@ -107,4 +126,3 @@ All of your dependencies should go in `setup.py`, NOT `requirements.txt`. The re 5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention). 6. Pat yourself on the back for being an awesome contributor. 7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. - diff --git a/airbyte-integrations/connectors/source-python-http-tutorial/source_python_http_tutorial/schemas/TODO.md b/airbyte-integrations/connectors/source-python-http-tutorial/source_python_http_tutorial/schemas/TODO.md index cf1efadb3c9..0037aeb60d8 100644 --- a/airbyte-integrations/connectors/source-python-http-tutorial/source_python_http_tutorial/schemas/TODO.md +++ b/airbyte-integrations/connectors/source-python-http-tutorial/source_python_http_tutorial/schemas/TODO.md @@ -1,20 +1,25 @@ # TODO: Define your stream schemas -Your connector must describe the schema of each stream it can output using [JSONSchema](https://json-schema.org). -The simplest way to do this is to describe the schema of your streams using one `.json` file per stream. You can also dynamically generate the schema of your stream in code, or you can combine both approaches: start with a `.json` file and dynamically add properties to it. - +Your connector must describe the schema of each stream it can output using [JSONSchema](https://json-schema.org). + +The simplest way to do this is to describe the schema of your streams using one `.json` file per stream. You can also dynamically generate the schema of your stream in code, or you can combine both approaches: start with a `.json` file and dynamically add properties to it. 
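For instance, the in-code approach boils down to overriding the stream's `get_json_schema` method (introduced just below) to return the JSONSchema as a plain `dict`; the fields here are only an illustration:

```
def get_json_schema(self):
    # Return the stream's JSONSchema directly instead of reading a schemas/*.json file.
    return {
        "$schema": "http://json-schema.org/draft-07/schema#",
        "type": "object",
        "properties": {
            "id": {"type": "string"},
            "name": {"type": ["null", "string"]},
        },
    }
```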
+ The schema of a stream is the return value of `Stream.get_json_schema`. - + ## Static schemas + By default, `Stream.get_json_schema` reads a `.json` file in the `schemas/` directory whose name is equal to the value of the `Stream.name` property. In turn `Stream.name` by default returns the name of the class in snake case. Therefore, if you have a class `class EmployeeBenefits(HttpStream)` the default behavior will look for a file called `schemas/employee_benefits.json`. You can override any of these behaviors as you need. Important note: any objects referenced via `$ref` should be placed in the `shared/` directory in their own `.json` files. - + ## Dynamic schemas + If you'd rather define your schema in code, override `Stream.get_json_schema` in your stream class to return a `dict` describing the schema using [JSONSchema](https://json-schema.org). -## Dynamically modifying static schemas -Override `Stream.get_json_schema` to run the default behavior, edit the returned value, then return the edited value: +## Dynamically modifying static schemas + +Override `Stream.get_json_schema` to run the default behavior, edit the returned value, then return the edited value: + ``` def get_json_schema(self): schema = super().get_json_schema() @@ -22,4 +27,4 @@ def get_json_schema(self): return schema ``` -Delete this file once you're done. Or don't. Up to you :) +Delete this file once you're done. Or don't. Up to you :) diff --git a/airbyte-integrations/connectors/source-qonto/README.md b/airbyte-integrations/connectors/source-qonto/README.md index 10fb3bd5400..eed7df75654 100644 --- a/airbyte-integrations/connectors/source-qonto/README.md +++ b/airbyte-integrations/connectors/source-qonto/README.md @@ -6,22 +6,27 @@ For information about how to use this connector within Airbyte, see [the documen ## Local development ### Prerequisites + **To iterate on this connector, make sure to complete this prerequisites section.** #### Minimum Python version required `= 3.7.0` #### Build & Activate Virtual Environment and install dependencies + From this connector directory, create a virtual environment: + ``` python -m venv .venv ``` This will generate a virtualenv for this module in `.venv/`. Make sure this venv is active in your development environment of choice. To activate it from the terminal, run: + ``` source .venv/bin/activate pip install -r requirements.txt ``` + If you are in an IDE, follow your IDE's instructions to activate the virtualenv. Note that while we are installing dependencies from `requirements.txt`, you should only edit `setup.py` for your dependencies. `requirements.txt` is @@ -30,6 +35,7 @@ If this is mumbo jumbo to you, don't worry about it, just put your deps in `setu should work as you expect. #### Create credentials + **If you are a community contributor**, follow the instructions in the [documentation](https://docs.airbyte.io/integrations/sources/metabase) to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `source_metabase/spec.yaml` file. Note that any directory named `secrets` is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. @@ -38,8 +44,8 @@ See `sample_files/sample_config.json` for a sample config file. **If you are an Airbyte core member**, copy the credentials in Lastpass under the secret name `source metabase test creds` and place them into `secrets/config.json`. 
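The `python main.py ...` commands in the next section go through the connector's `main.py` entry point, which is typically just a thin launcher along the lines of the sketch below (the `SourceQonto` package and class names are assumed from the standard layout):

```
import sys

from airbyte_cdk.entrypoint import launch
from source_qonto import SourceQonto  # assumed package/class name

if __name__ == "__main__":
    source = SourceQonto()
    launch(source, sys.argv[1:])
```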
- ### Locally running the connector + ``` python main.py spec python main.py check --config secrets/config.json @@ -48,24 +54,31 @@ python main.py read --config secrets/config.json --catalog sample_files/configur ``` ### Unit Tests + To run unit tests locally, from the connector directory run: + ``` python -m pytest unit_tests ``` #### Acceptance Tests + Customize `acceptance-test-config.yml` file to configure tests. See [Connector Acceptance Tests](https://docs.airbyte.io/connector-development/testing-connectors/connector-acceptance-tests-reference) for more information. If your connector requires to create or destroy resources for use during acceptance tests create fixtures for it and place them inside integration_tests/acceptance.py. To run your integration tests with acceptance tests, from the connector root, run + ``` python -m pytest integration_tests -p integration_tests.acceptance ``` ## Dependency Management + All of your dependencies should go in `setup.py`, NOT `requirements.txt`. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development. ### Publishing a new version of the connector + You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? + 1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-qonto test` 2. Bump the connector version in `metadata.yaml`: increment the `dockerImageTag` value. Please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors). 3. Make sure the `metadata.yaml` content is up to date. @@ -73,4 +86,3 @@ You've checked out the repo, implemented a million dollar feature, and you're re 5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention). 6. Pat yourself on the back for being an awesome contributor. 7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. - diff --git a/airbyte-integrations/connectors/source-qualaroo/README.md b/airbyte-integrations/connectors/source-qualaroo/README.md index 2c8fdc2325e..ca798f18c07 100644 --- a/airbyte-integrations/connectors/source-qualaroo/README.md +++ b/airbyte-integrations/connectors/source-qualaroo/README.md @@ -6,6 +6,7 @@ For information about how to use this connector within Airbyte, see [the documen ## Local development #### Create credentials + **If you are a community contributor**, follow the instructions in the [documentation](https://docs.airbyte.com/integrations/sources/qualaroo) to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `source_qualaroo/spec.yaml` file. Note that any directory named `secrets` is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. @@ -16,9 +17,10 @@ and place them into `secrets/config.json`. ### Locally running the connector docker image - #### Build + **Via [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md) (recommended):** + ```bash airbyte-ci connectors --name=source-qualaroo build ``` @@ -26,12 +28,15 @@ airbyte-ci connectors --name=source-qualaroo build An image will be built with the tag `airbyte/source-qualaroo:dev`. 
**Via `docker build`:** + ```bash docker build -t airbyte/source-qualaroo:dev . ``` #### Run + Then run any of the connector commands as follows: + ``` docker run --rm airbyte/source-qualaroo:dev spec docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-qualaroo:dev check --config /secrets/config.json @@ -40,23 +45,30 @@ docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integrat ``` ## Testing + You can run our full test suite locally using [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md): + ```bash airbyte-ci connectors --name=source-qualaroo test ``` ### Customizing acceptance Tests + Customize `acceptance-test-config.yml` file to configure tests. See [Connector Acceptance Tests](https://docs.airbyte.com/connector-development/testing-connectors/connector-acceptance-tests-reference) for more information. If your connector requires to create or destroy resources for use during acceptance tests create fixtures for it and place them inside integration_tests/acceptance.py. ## Dependency Management + All of your dependencies should go in `setup.py`, NOT `requirements.txt`. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development. We split dependencies between two groups, dependencies that are: -* required for your connector to work need to go to `MAIN_REQUIREMENTS` list. -* required for the testing need to go to `TEST_REQUIREMENTS` list + +- required for your connector to work need to go to `MAIN_REQUIREMENTS` list. +- required for the testing need to go to `TEST_REQUIREMENTS` list ### Publishing a new version of the connector + You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? + 1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-qualaroo test` 2. Bump the connector version in `metadata.yaml`: increment the `dockerImageTag` value. Please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors). 3. Make sure the `metadata.yaml` content is up to date. @@ -64,4 +76,3 @@ You've checked out the repo, implemented a million dollar feature, and you're re 5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention). 6. Pat yourself on the back for being an awesome contributor. 7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. - diff --git a/airbyte-integrations/connectors/source-quickbooks/README.md b/airbyte-integrations/connectors/source-quickbooks/README.md index bf8d8b6eb75..82ff0195b89 100644 --- a/airbyte-integrations/connectors/source-quickbooks/README.md +++ b/airbyte-integrations/connectors/source-quickbooks/README.md @@ -6,6 +6,7 @@ For information about how to use this connector within Airbyte, see [the documen ## Local development #### Create credentials + **If you are a community contributor**, follow the instructions in the [documentation](https://docs.airbyte.com/integrations/sources/quickbooks) to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `source_quickbooks/spec.yaml` file. 
Note that any directory named `secrets` is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. @@ -16,9 +17,10 @@ and place them into `secrets/config.json`. ### Locally running the connector docker image - #### Build + **Via [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md) (recommended):** + ```bash airbyte-ci connectors --name=source-quickbooks build ``` @@ -26,12 +28,15 @@ airbyte-ci connectors --name=source-quickbooks build An image will be built with the tag `airbyte/source-quickbooks:dev`. **Via `docker build`:** + ```bash docker build -t airbyte/source-quickbooks:dev . ``` #### Run + Then run any of the connector commands as follows: + ``` docker run --rm airbyte/source-quickbooks:dev spec docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-quickbooks:dev check --config /secrets/config.json @@ -40,23 +45,30 @@ docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integrat ``` ## Testing + You can run our full test suite locally using [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md): + ```bash airbyte-ci connectors --name=source-quickbooks test ``` ### Customizing acceptance Tests + Customize `acceptance-test-config.yml` file to configure tests. See [Connector Acceptance Tests](https://docs.airbyte.com/connector-development/testing-connectors/connector-acceptance-tests-reference) for more information. If your connector requires to create or destroy resources for use during acceptance tests create fixtures for it and place them inside integration_tests/acceptance.py. ## Dependency Management + All of your dependencies should go in `setup.py`, NOT `requirements.txt`. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development. We split dependencies between two groups, dependencies that are: -* required for your connector to work need to go to `MAIN_REQUIREMENTS` list. -* required for the testing need to go to `TEST_REQUIREMENTS` list + +- required for your connector to work need to go to `MAIN_REQUIREMENTS` list. +- required for the testing need to go to `TEST_REQUIREMENTS` list ### Publishing a new version of the connector + You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? + 1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-quickbooks test` 2. Bump the connector version in `metadata.yaml`: increment the `dockerImageTag` value. Please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors). 3. Make sure the `metadata.yaml` content is up to date. @@ -64,4 +76,3 @@ You've checked out the repo, implemented a million dollar feature, and you're re 5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention). 6. Pat yourself on the back for being an awesome contributor. 7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. 
- diff --git a/airbyte-integrations/connectors/source-railz/README.md b/airbyte-integrations/connectors/source-railz/README.md index 4e8976d2d64..ebdda0ec77d 100644 --- a/airbyte-integrations/connectors/source-railz/README.md +++ b/airbyte-integrations/connectors/source-railz/README.md @@ -6,6 +6,7 @@ For information about how to use this connector within Airbyte, see [the documen ## Local development #### Create credentials + **If you are a community contributor**, follow the instructions in the [documentation](https://docs.airbyte.com/integrations/sources/railz) to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `source_railz/spec.yaml` file. Note that any directory named `secrets` is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. @@ -16,9 +17,10 @@ and place them into `secrets/config.json`. ### Locally running the connector docker image - #### Build + **Via [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md) (recommended):** + ```bash airbyte-ci connectors --name=source-railz build ``` @@ -26,12 +28,15 @@ airbyte-ci connectors --name=source-railz build An image will be built with the tag `airbyte/source-railz:dev`. **Via `docker build`:** + ```bash docker build -t airbyte/source-railz:dev . ``` #### Run + Then run any of the connector commands as follows: + ``` docker run --rm airbyte/source-railz:dev spec docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-railz:dev check --config /secrets/config.json @@ -40,23 +45,30 @@ docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integrat ``` ## Testing + You can run our full test suite locally using [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md): + ```bash airbyte-ci connectors --name=source-railz test ``` ### Customizing acceptance Tests + Customize `acceptance-test-config.yml` file to configure tests. See [Connector Acceptance Tests](https://docs.airbyte.com/connector-development/testing-connectors/connector-acceptance-tests-reference) for more information. If your connector requires to create or destroy resources for use during acceptance tests create fixtures for it and place them inside integration_tests/acceptance.py. ## Dependency Management + All of your dependencies should go in `setup.py`, NOT `requirements.txt`. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development. We split dependencies between two groups, dependencies that are: -* required for your connector to work need to go to `MAIN_REQUIREMENTS` list. -* required for the testing need to go to `TEST_REQUIREMENTS` list + +- required for your connector to work need to go to `MAIN_REQUIREMENTS` list. +- required for the testing need to go to `TEST_REQUIREMENTS` list ### Publishing a new version of the connector + You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? + 1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-railz test` 2. Bump the connector version in `metadata.yaml`: increment the `dockerImageTag` value. Please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors). 3. Make sure the `metadata.yaml` content is up to date. 
@@ -64,4 +76,3 @@ You've checked out the repo, implemented a million dollar feature, and you're re 5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention). 6. Pat yourself on the back for being an awesome contributor. 7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. - diff --git a/airbyte-integrations/connectors/source-rd-station-marketing/README.md b/airbyte-integrations/connectors/source-rd-station-marketing/README.md index 3523942fef0..3c78e0bc75e 100644 --- a/airbyte-integrations/connectors/source-rd-station-marketing/README.md +++ b/airbyte-integrations/connectors/source-rd-station-marketing/README.md @@ -6,23 +6,28 @@ For information about how to use this connector within Airbyte, see [the documen ## Local development ### Prerequisites + **To iterate on this connector, make sure to complete this prerequisites section.** #### Minimum Python version required `= 3.9.0` #### Build & Activate Virtual Environment and install dependencies + From this connector directory, create a virtual environment: + ``` python3 -m venv .venv ``` This will generate a virtualenv for this module in `.venv/`. Make sure this venv is active in your development environment of choice. To activate it from the terminal, run: + ``` source .venv/bin/activate pip install -r requirements.txt pip install '.[tests]' ``` + If you are in an IDE, follow your IDE's instructions to activate the virtualenv. Note that while we are installing dependencies from `requirements.txt`, you should only edit `setup.py` for your dependencies. `requirements.txt` is @@ -31,6 +36,7 @@ If this is mumbo jumbo to you, don't worry about it, just put your deps in `setu should work as you expect. #### Create credentials + **If you are a community contributor**, follow the instructions in the [documentation](https://docs.airbyte.io/integrations/sources/rd-station-marketing) to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `source_rd_station_marketing/spec.yaml` file. Note that any directory named `secrets` is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. @@ -40,6 +46,7 @@ See `integration_tests/sample_config.json` for a sample config file. and place them into `secrets/config.json`. ### Locally running the connector + ``` python main.py spec python main.py check --config secrets/config.json @@ -49,9 +56,10 @@ python main.py read --config secrets/config.json --catalog integration_tests/cat ### Locally running the connector docker image - #### Build + **Via [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md) (recommended):** + ```bash airbyte-ci connectors --name=source-rd-station-marketing build ``` @@ -59,12 +67,15 @@ airbyte-ci connectors --name=source-rd-station-marketing build An image will be built with the tag `airbyte/source-rd-station-marketing:dev`. **Via `docker build`:** + ```bash docker build -t airbyte/source-rd-station-marketing:dev . 
``` #### Run + Then run any of the connector commands as follows: + ``` docker run --rm airbyte/source-rd-station-marketing:dev spec docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-rd-station-marketing:dev check --config /secrets/config.json @@ -73,23 +84,30 @@ docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integrat ``` ## Testing + You can run our full test suite locally using [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md): + ```bash airbyte-ci connectors --name=source-rd-station-marketing test ``` ### Customizing acceptance Tests + Customize `acceptance-test-config.yml` file to configure tests. See [Connector Acceptance Tests](https://docs.airbyte.com/connector-development/testing-connectors/connector-acceptance-tests-reference) for more information. If your connector requires to create or destroy resources for use during acceptance tests create fixtures for it and place them inside integration_tests/acceptance.py. ## Dependency Management + All of your dependencies should go in `setup.py`, NOT `requirements.txt`. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development. We split dependencies between two groups, dependencies that are: -* required for your connector to work need to go to `MAIN_REQUIREMENTS` list. -* required for the testing need to go to `TEST_REQUIREMENTS` list + +- required for your connector to work need to go to `MAIN_REQUIREMENTS` list. +- required for the testing need to go to `TEST_REQUIREMENTS` list ### Publishing a new version of the connector + You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? + 1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-rd-station-marketing test` 2. Bump the connector version in `metadata.yaml`: increment the `dockerImageTag` value. Please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors). 3. Make sure the `metadata.yaml` content is up to date. @@ -97,4 +115,3 @@ You've checked out the repo, implemented a million dollar feature, and you're re 5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention). 6. Pat yourself on the back for being an awesome contributor. 7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. - diff --git a/airbyte-integrations/connectors/source-recharge/README.md b/airbyte-integrations/connectors/source-recharge/README.md index c09b9cde6f4..d3a870acb71 100644 --- a/airbyte-integrations/connectors/source-recharge/README.md +++ b/airbyte-integrations/connectors/source-recharge/README.md @@ -1,31 +1,32 @@ # Recharge source connector - This is the repository for the Recharge source connector, written in Python. For information about how to use this connector within Airbyte, see [the documentation](https://docs.airbyte.com/integrations/sources/recharge). 
## Local development ### Prerequisites -* Python (~=3.9) -* Poetry (~=1.7) - installation instructions [here](https://python-poetry.org/docs/#installation) +- Python (~=3.9) +- Poetry (~=1.7) - installation instructions [here](https://python-poetry.org/docs/#installation) ### Installing the connector + From this connector directory, run: + ```bash poetry install --with dev ``` - ### Create credentials + **If you are a community contributor**, follow the instructions in the [documentation](https://docs.airbyte.com/integrations/sources/recharge) to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `source_recharge/spec.yaml` file. Note that any directory named `secrets` is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. See `sample_files/sample_config.json` for a sample config file. - ### Locally running the connector + ``` poetry run source-recharge spec poetry run source-recharge check --config secrets/config.json @@ -34,23 +35,28 @@ poetry run source-recharge read --config secrets/config.json --catalog integrati ``` ### Running unit tests + To run unit tests locally, from the connector directory run: + ``` poetry run pytest unit_tests ``` ### Building the docker image + 1. Install [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md) 2. Run the following command to build the docker image: + ```bash airbyte-ci connectors --name=source-recharge build ``` An image will be available on your host with the tag `airbyte/source-recharge:dev`. - ### Running as a docker container + Then run any of the connector commands as follows: + ``` docker run --rm airbyte/source-recharge:dev spec docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-recharge:dev check --config /secrets/config.json @@ -59,18 +65,23 @@ docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integrat ``` ### Running our CI test suite + You can run our full test suite locally using [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md): + ```bash airbyte-ci connectors --name=source-recharge test ``` ### Customizing acceptance Tests + Customize `acceptance-test-config.yml` file to configure acceptance tests. See [Connector Acceptance Tests](https://docs.airbyte.com/connector-development/testing-connectors/connector-acceptance-tests-reference) for more information. If your connector requires to create or destroy resources for use during acceptance tests create fixtures for it and place them inside integration_tests/acceptance.py. ### Dependency Management -All of your dependencies should be managed via Poetry. + +All of your dependencies should be managed via Poetry. To add a new dependency, run: + ```bash poetry add ``` @@ -78,14 +89,16 @@ poetry add Please commit the changes to `pyproject.toml` and `poetry.lock` files. ## Publishing a new version of the connector + You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? + 1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-recharge test` -2. 
Bump the connector version (please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors)): - - bump the `dockerImageTag` value in in `metadata.yaml` - - bump the `version` value in `pyproject.toml` +2. Bump the connector version (please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors)): + - bump the `dockerImageTag` value in in `metadata.yaml` + - bump the `version` value in `pyproject.toml` 3. Make sure the `metadata.yaml` content is up to date. 4. Make sure the connector documentation and its changelog is up to date (`docs/integrations/sources/recharge.md`). 5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention). 6. Pat yourself on the back for being an awesome contributor. 7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. -8. Once your PR is merged, the new version of the connector will be automatically published to Docker Hub and our connector registry. \ No newline at end of file +8. Once your PR is merged, the new version of the connector will be automatically published to Docker Hub and our connector registry. diff --git a/airbyte-integrations/connectors/source-recreation/BOOTSTRAP.md b/airbyte-integrations/connectors/source-recreation/BOOTSTRAP.md index c3c9069c541..64b788479f5 100644 --- a/airbyte-integrations/connectors/source-recreation/BOOTSTRAP.md +++ b/airbyte-integrations/connectors/source-recreation/BOOTSTRAP.md @@ -1,11 +1,12 @@ # Recreation.gov -The Recreation Information Database (RIDB) provides data resources to citizens, -offering a single point of access to information about recreational opportunities nationwide. -The RIDB represents an authoritative source of information and services for millions of visitors to federal lands, -historic sites, museums, and other attractions/resources. -This initiative integrates multiple Federal channels and -sources about recreation opportunities into a one-stop, + +The Recreation Information Database (RIDB) provides data resources to citizens, +offering a single point of access to information about recreational opportunities nationwide. +The RIDB represents an authoritative source of information and services for millions of visitors to federal lands, +historic sites, museums, and other attractions/resources. +This initiative integrates multiple Federal channels and +sources about recreation opportunities into a one-stop, searchable database of recreational areas nationwide [[ridb.recreation.gov](https://ridb.recreation.gov/docs)]. With this Airbyte connector, you can retrieve data from the [Recreation API](https://ridb.recreation.gov/landing) and -sync it to your data warehouse. \ No newline at end of file +sync it to your data warehouse. diff --git a/airbyte-integrations/connectors/source-recreation/README.md b/airbyte-integrations/connectors/source-recreation/README.md index 0abe3e9aaa3..acd3b58c6ad 100644 --- a/airbyte-integrations/connectors/source-recreation/README.md +++ b/airbyte-integrations/connectors/source-recreation/README.md @@ -1,31 +1,32 @@ # Recreation source connector - This is the repository for the Recreation source connector, written in Python. 
For information about how to use this connector within Airbyte, see [the documentation](https://docs.airbyte.com/integrations/sources/recreation). ## Local development ### Prerequisites -* Python (~=3.9) -* Poetry (~=1.7) - installation instructions [here](https://python-poetry.org/docs/#installation) +- Python (~=3.9) +- Poetry (~=1.7) - installation instructions [here](https://python-poetry.org/docs/#installation) ### Installing the connector + From this connector directory, run: + ```bash poetry install --with dev ``` - ### Create credentials + **If you are a community contributor**, follow the instructions in the [documentation](https://docs.airbyte.com/integrations/sources/recreation) to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `source_recreation/spec.yaml` file. Note that any directory named `secrets` is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. See `sample_files/sample_config.json` for a sample config file. - ### Locally running the connector + ``` poetry run source-recreation spec poetry run source-recreation check --config secrets/config.json @@ -34,23 +35,28 @@ poetry run source-recreation read --config secrets/config.json --catalog sample_ ``` ### Running unit tests + To run unit tests locally, from the connector directory run: + ``` poetry run pytest unit_tests ``` ### Building the docker image + 1. Install [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md) 2. Run the following command to build the docker image: + ```bash airbyte-ci connectors --name=source-recreation build ``` An image will be available on your host with the tag `airbyte/source-recreation:dev`. - ### Running as a docker container + Then run any of the connector commands as follows: + ``` docker run --rm airbyte/source-recreation:dev spec docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-recreation:dev check --config /secrets/config.json @@ -59,18 +65,23 @@ docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integrat ``` ### Running our CI test suite + You can run our full test suite locally using [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md): + ```bash airbyte-ci connectors --name=source-recreation test ``` ### Customizing acceptance Tests + Customize `acceptance-test-config.yml` file to configure acceptance tests. See [Connector Acceptance Tests](https://docs.airbyte.com/connector-development/testing-connectors/connector-acceptance-tests-reference) for more information. If your connector requires to create or destroy resources for use during acceptance tests create fixtures for it and place them inside integration_tests/acceptance.py. ### Dependency Management -All of your dependencies should be managed via Poetry. + +All of your dependencies should be managed via Poetry. To add a new dependency, run: + ```bash poetry add ``` @@ -78,14 +89,16 @@ poetry add Please commit the changes to `pyproject.toml` and `poetry.lock` files. ## Publishing a new version of the connector + You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? + 1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-recreation test` -2. 
Bump the connector version (please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors)): - - bump the `dockerImageTag` value in in `metadata.yaml` - - bump the `version` value in `pyproject.toml` +2. Bump the connector version (please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors)): + - bump the `dockerImageTag` value in in `metadata.yaml` + - bump the `version` value in `pyproject.toml` 3. Make sure the `metadata.yaml` content is up to date. 4. Make sure the connector documentation and its changelog is up to date (`docs/integrations/sources/recreation.md`). 5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention). 6. Pat yourself on the back for being an awesome contributor. 7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. -8. Once your PR is merged, the new version of the connector will be automatically published to Docker Hub and our connector registry. \ No newline at end of file +8. Once your PR is merged, the new version of the connector will be automatically published to Docker Hub and our connector registry. diff --git a/airbyte-integrations/connectors/source-recruitee/README.md b/airbyte-integrations/connectors/source-recruitee/README.md index 9bdca249298..5b4b9d69b99 100644 --- a/airbyte-integrations/connectors/source-recruitee/README.md +++ b/airbyte-integrations/connectors/source-recruitee/README.md @@ -6,6 +6,7 @@ For information about how to use this connector within Airbyte, see [the documen ## Local development #### Create credentials + **If you are a community contributor**, follow the instructions in the [documentation](https://docs.airbyte.io/integrations/sources/recruitee) to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `source_recruitee/spec.yaml` file. Note that any directory named `secrets` is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. @@ -16,9 +17,10 @@ and place them into `secrets/config.json`. ### Locally running the connector docker image - #### Build + **Via [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md) (recommended):** + ```bash airbyte-ci connectors --name=source-recruitee build ``` @@ -26,12 +28,15 @@ airbyte-ci connectors --name=source-recruitee build An image will be built with the tag `airbyte/source-recruitee:dev`. **Via `docker build`:** + ```bash docker build -t airbyte/source-recruitee:dev . ``` #### Run + Then run any of the connector commands as follows: + ``` docker run --rm airbyte/source-recruitee:dev spec docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-recruitee:dev check --config /secrets/config.json @@ -40,23 +45,30 @@ docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integrat ``` ## Testing + You can run our full test suite locally using [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md): + ```bash airbyte-ci connectors --name=source-recruitee test ``` ### Customizing acceptance Tests + Customize `acceptance-test-config.yml` file to configure tests. 
See [Connector Acceptance Tests](https://docs.airbyte.com/connector-development/testing-connectors/connector-acceptance-tests-reference) for more information. If your connector requires to create or destroy resources for use during acceptance tests create fixtures for it and place them inside integration_tests/acceptance.py. ## Dependency Management + All of your dependencies should go in `setup.py`, NOT `requirements.txt`. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development. We split dependencies between two groups, dependencies that are: -* required for your connector to work need to go to `MAIN_REQUIREMENTS` list. -* required for the testing need to go to `TEST_REQUIREMENTS` list + +- required for your connector to work need to go to `MAIN_REQUIREMENTS` list. +- required for the testing need to go to `TEST_REQUIREMENTS` list ### Publishing a new version of the connector + You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? + 1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-recruitee test` 2. Bump the connector version in `metadata.yaml`: increment the `dockerImageTag` value. Please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors). 3. Make sure the `metadata.yaml` content is up to date. @@ -64,4 +76,3 @@ You've checked out the repo, implemented a million dollar feature, and you're re 5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention). 6. Pat yourself on the back for being an awesome contributor. 7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. - diff --git a/airbyte-integrations/connectors/source-recurly/README.md b/airbyte-integrations/connectors/source-recurly/README.md index 936201b1a14..4afdb6c30d4 100644 --- a/airbyte-integrations/connectors/source-recurly/README.md +++ b/airbyte-integrations/connectors/source-recurly/README.md @@ -7,8 +7,8 @@ For information about how to use this connector within Airbyte, see [the documen ### Prerequisites -* Python (~=3.9) -* Poetry (~=1.7) - installation instructions [here](https://python-poetry.org/docs/#installation) +- Python (~=3.9) +- Poetry (~=1.7) - installation instructions [here](https://python-poetry.org/docs/#installation) ### Installing the connector diff --git a/airbyte-integrations/connectors/source-redshift/integration_tests/README.md b/airbyte-integrations/connectors/source-redshift/integration_tests/README.md index 96aa5492669..9bf604a7f6c 100644 --- a/airbyte-integrations/connectors/source-redshift/integration_tests/README.md +++ b/airbyte-integrations/connectors/source-redshift/integration_tests/README.md @@ -1,3 +1,4 @@ # Seeding the dataset + You can find the SQL scripts in this folder if you need to create or fix the SAT dataset. For more instructions and information about valid scripts, please check this [doc](https://docs.google.com/document/d/1k5TvxaNhKdr44aJIHWWtLk14Tzd2gbNX-J8YNoTj8u0/edit#heading=h.ls9oiedt9wyy). 
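The `setup.py`-based connector READMEs above split dependencies between a `MAIN_REQUIREMENTS` and a `TEST_REQUIREMENTS` list but never show the file itself. A minimal sketch of what that split typically looks like; the package names are hypothetical stand-ins, not the real dependency list of any connector in this patch:

```
# Illustrative setup.py sketch only: the package names below are hypothetical
# examples, not the actual dependencies of any connector touched by this patch.
from setuptools import find_packages, setup

MAIN_REQUIREMENTS = [
    "airbyte-cdk",  # needed at runtime for the connector to work
    "requests",     # hypothetical runtime dependency
]

TEST_REQUIREMENTS = [
    "pytest",                     # needed only for testing
    "connector-acceptance-test",  # hypothetical test-only dependency
]

setup(
    name="source_example",  # hypothetical connector package name
    packages=find_packages(),
    install_requires=MAIN_REQUIREMENTS,
    extras_require={"tests": TEST_REQUIREMENTS},
)
```

Under this layout, the `pip install '.[tests]'` step mentioned in the older-style READMEs is what pulls in the `TEST_REQUIREMENTS` extras.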
diff --git a/airbyte-integrations/connectors/source-reply-io/README.md b/airbyte-integrations/connectors/source-reply-io/README.md index a1cc013d47f..9c960956f5d 100644 --- a/airbyte-integrations/connectors/source-reply-io/README.md +++ b/airbyte-integrations/connectors/source-reply-io/README.md @@ -6,6 +6,7 @@ For information about how to use this connector within Airbyte, see [the documen ## Local development #### Create credentials + **If you are a community contributor**, follow the instructions in the [documentation](https://docs.airbyte.io/integrations/sources/reply-io) to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `source_reply_io/spec.yaml` file. Note that any directory named `secrets` is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. @@ -16,9 +17,10 @@ and place them into `secrets/config.json`. ### Locally running the connector docker image - #### Build + **Via [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md) (recommended):** + ```bash airbyte-ci connectors --name=source-reply-io build ``` @@ -26,12 +28,15 @@ airbyte-ci connectors --name=source-reply-io build An image will be built with the tag `airbyte/source-reply-io:dev`. **Via `docker build`:** + ```bash docker build -t airbyte/source-reply-io:dev . ``` #### Run + Then run any of the connector commands as follows: + ``` docker run --rm airbyte/source-reply-io:dev spec docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-reply-io:dev check --config /secrets/config.json @@ -40,23 +45,30 @@ docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integrat ``` ## Testing + You can run our full test suite locally using [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md): + ```bash airbyte-ci connectors --name=source-reply-io test ``` ### Customizing acceptance Tests + Customize `acceptance-test-config.yml` file to configure tests. See [Connector Acceptance Tests](https://docs.airbyte.com/connector-development/testing-connectors/connector-acceptance-tests-reference) for more information. If your connector requires to create or destroy resources for use during acceptance tests create fixtures for it and place them inside integration_tests/acceptance.py. ## Dependency Management + All of your dependencies should go in `setup.py`, NOT `requirements.txt`. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development. We split dependencies between two groups, dependencies that are: -* required for your connector to work need to go to `MAIN_REQUIREMENTS` list. -* required for the testing need to go to `TEST_REQUIREMENTS` list + +- required for your connector to work need to go to `MAIN_REQUIREMENTS` list. +- required for the testing need to go to `TEST_REQUIREMENTS` list ### Publishing a new version of the connector + You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? + 1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-reply-io test` 2. Bump the connector version in `metadata.yaml`: increment the `dockerImageTag` value. Please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors). 3. 
Make sure the `metadata.yaml` content is up to date. @@ -64,4 +76,3 @@ You've checked out the repo, implemented a million dollar feature, and you're re 5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention). 6. Pat yourself on the back for being an awesome contributor. 7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. - diff --git a/airbyte-integrations/connectors/source-retently/README.md b/airbyte-integrations/connectors/source-retently/README.md index d6f9c23b306..f43e8d7db84 100644 --- a/airbyte-integrations/connectors/source-retently/README.md +++ b/airbyte-integrations/connectors/source-retently/README.md @@ -1,31 +1,32 @@ # Retently source connector - This is the repository for the Retently source connector, written in Python. For information about how to use this connector within Airbyte, see [the documentation](https://docs.airbyte.com/integrations/sources/retently). ## Local development ### Prerequisites -* Python (~=3.9) -* Poetry (~=1.7) - installation instructions [here](https://python-poetry.org/docs/#installation) +- Python (~=3.9) +- Poetry (~=1.7) - installation instructions [here](https://python-poetry.org/docs/#installation) ### Installing the connector + From this connector directory, run: + ```bash poetry install --with dev ``` - ### Create credentials + **If you are a community contributor**, follow the instructions in the [documentation](https://docs.airbyte.com/integrations/sources/retently) to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `source_retently/spec.yaml` file. Note that any directory named `secrets` is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. See `sample_files/sample_config.json` for a sample config file. - ### Locally running the connector + ``` poetry run source-retently spec poetry run source-retently check --config secrets/config.json @@ -34,23 +35,28 @@ poetry run source-retently read --config secrets/config.json --catalog sample_fi ``` ### Running unit tests + To run unit tests locally, from the connector directory run: + ``` poetry run pytest unit_tests ``` ### Building the docker image + 1. Install [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md) 2. Run the following command to build the docker image: + ```bash airbyte-ci connectors --name=source-retently build ``` An image will be available on your host with the tag `airbyte/source-retently:dev`. - ### Running as a docker container + Then run any of the connector commands as follows: + ``` docker run --rm airbyte/source-retently:dev spec docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-retently:dev check --config /secrets/config.json @@ -59,18 +65,23 @@ docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integrat ``` ### Running our CI test suite + You can run our full test suite locally using [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md): + ```bash airbyte-ci connectors --name=source-retently test ``` ### Customizing acceptance Tests + Customize `acceptance-test-config.yml` file to configure acceptance tests. See [Connector Acceptance Tests](https://docs.airbyte.com/connector-development/testing-connectors/connector-acceptance-tests-reference) for more information. 
If your connector requires to create or destroy resources for use during acceptance tests create fixtures for it and place them inside integration_tests/acceptance.py. ### Dependency Management -All of your dependencies should be managed via Poetry. + +All of your dependencies should be managed via Poetry. To add a new dependency, run: + ```bash poetry add ``` @@ -78,14 +89,16 @@ poetry add Please commit the changes to `pyproject.toml` and `poetry.lock` files. ## Publishing a new version of the connector + You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? + 1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-retently test` -2. Bump the connector version (please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors)): - - bump the `dockerImageTag` value in in `metadata.yaml` - - bump the `version` value in `pyproject.toml` +2. Bump the connector version (please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors)): + - bump the `dockerImageTag` value in in `metadata.yaml` + - bump the `version` value in `pyproject.toml` 3. Make sure the `metadata.yaml` content is up to date. 4. Make sure the connector documentation and its changelog is up to date (`docs/integrations/sources/retently.md`). 5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention). 6. Pat yourself on the back for being an awesome contributor. 7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. -8. Once your PR is merged, the new version of the connector will be automatically published to Docker Hub and our connector registry. \ No newline at end of file +8. Once your PR is merged, the new version of the connector will be automatically published to Docker Hub and our connector registry. diff --git a/airbyte-integrations/connectors/source-ringcentral/README.md b/airbyte-integrations/connectors/source-ringcentral/README.md index e42e5e059dd..40f07e78d19 100644 --- a/airbyte-integrations/connectors/source-ringcentral/README.md +++ b/airbyte-integrations/connectors/source-ringcentral/README.md @@ -6,6 +6,7 @@ For information about how to use this connector within Airbyte, see [the documen ## Local development #### Create credentials + **If you are a community contributor**, follow the instructions in the [documentation](https://docs.airbyte.com/integrations/sources/ringcentral) to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `source_ringcentral/spec.yaml` file. Note that any directory named `secrets` is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. @@ -16,9 +17,10 @@ and place them into `secrets/config.json`. ### Locally running the connector docker image - #### Build + **Via [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md) (recommended):** + ```bash airbyte-ci connectors --name=source-ringcentral build ``` @@ -26,12 +28,15 @@ airbyte-ci connectors --name=source-ringcentral build An image will be built with the tag `airbyte/source-ringcentral:dev`. 
**Via `docker build`:** + ```bash docker build -t airbyte/source-ringcentral:dev . ``` #### Run + Then run any of the connector commands as follows: + ``` docker run --rm airbyte/source-ringcentral:dev spec docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-ringcentral:dev check --config /secrets/config.json @@ -40,23 +45,30 @@ docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integrat ``` ## Testing + You can run our full test suite locally using [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md): + ```bash airbyte-ci connectors --name=source-ringcentral test ``` ### Customizing acceptance Tests + Customize `acceptance-test-config.yml` file to configure tests. See [Connector Acceptance Tests](https://docs.airbyte.com/connector-development/testing-connectors/connector-acceptance-tests-reference) for more information. If your connector requires to create or destroy resources for use during acceptance tests create fixtures for it and place them inside integration_tests/acceptance.py. ## Dependency Management + All of your dependencies should go in `setup.py`, NOT `requirements.txt`. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development. We split dependencies between two groups, dependencies that are: -* required for your connector to work need to go to `MAIN_REQUIREMENTS` list. -* required for the testing need to go to `TEST_REQUIREMENTS` list + +- required for your connector to work need to go to `MAIN_REQUIREMENTS` list. +- required for the testing need to go to `TEST_REQUIREMENTS` list ### Publishing a new version of the connector + You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? + 1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-ringcentral test` 2. Bump the connector version in `metadata.yaml`: increment the `dockerImageTag` value. Please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors). 3. Make sure the `metadata.yaml` content is up to date. @@ -64,4 +76,3 @@ You've checked out the repo, implemented a million dollar feature, and you're re 5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention). 6. Pat yourself on the back for being an awesome contributor. 7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. - diff --git a/airbyte-integrations/connectors/source-rki-covid/README.md b/airbyte-integrations/connectors/source-rki-covid/README.md index 4e23b0ba850..f30de3e7c35 100644 --- a/airbyte-integrations/connectors/source-rki-covid/README.md +++ b/airbyte-integrations/connectors/source-rki-covid/README.md @@ -4,7 +4,9 @@ This is the repository for the RkI (Robert Koch-Institut - von Marlon Lückert) For information about how to use this connector within Airbyte, see [the documentation](https://docs.airbyte.io/integrations/sources/rki-covid). ## Local development + ### Developed Streams (Endpoints) + ``` Germany: 1. 
/germany @@ -26,23 +28,28 @@ Germany: ``` ### Prerequisites + **To iterate on this connector, make sure to complete this prerequisites section.** #### Minimum Python version required `= 3.7.0` #### Build & Activate Virtual Environment and install dependencies + From this connector directory, create a virtual environment: + ``` python -m venv .venv ``` This will generate a virtualenv for this module in `.venv/`. Make sure this venv is active in your development environment of choice. To activate it from the terminal, run: + ``` source .venv/bin/activate pip install -r requirements.txt pip install '.[tests]' ``` + If you are in an IDE, follow your IDE's instructions to activate the virtualenv. Note that while we are installing dependencies from `requirements.txt`, you should only edit `setup.py` for your dependencies. `requirements.txt` is @@ -51,6 +58,7 @@ If this is mumbo jumbo to you, don't worry about it, just put your deps in `setu should work as you expect. #### Create credentials + **If you are a community contributor**, follow the instructions in the [documentation](https://docs.airbyte.io/integrations/sources/rki-covid) to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `source_rki_covid/spec.json` file. Note that any directory named `secrets` is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. @@ -60,6 +68,7 @@ See `integration_tests/sample_config.json` for a sample config file. and place them into `secrets/config.json`. ### Locally running the connector + ``` python main.py spec python main.py check --config secrets/config.json @@ -69,9 +78,10 @@ python main.py read --config secrets/config.json --catalog integration_tests/con ### Locally running the connector docker image - #### Build + **Via [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md) (recommended):** + ```bash airbyte-ci connectors --name=source-rki-covid build ``` @@ -79,12 +89,15 @@ airbyte-ci connectors --name=source-rki-covid build An image will be built with the tag `airbyte/source-rki-covid:dev`. **Via `docker build`:** + ```bash docker build -t airbyte/source-rki-covid:dev . ``` #### Run + Then run any of the connector commands as follows: + ``` docker run --rm airbyte/source-rki-covid:dev spec docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-rki-covid:dev check --config /secrets/config.json @@ -93,23 +106,30 @@ docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integrat ``` ## Testing + You can run our full test suite locally using [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md): + ```bash airbyte-ci connectors --name=source-rki-covid test ``` ### Customizing acceptance Tests + Customize `acceptance-test-config.yml` file to configure tests. See [Connector Acceptance Tests](https://docs.airbyte.com/connector-development/testing-connectors/connector-acceptance-tests-reference) for more information. If your connector requires to create or destroy resources for use during acceptance tests create fixtures for it and place them inside integration_tests/acceptance.py. ## Dependency Management + All of your dependencies should go in `setup.py`, NOT `requirements.txt`. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development. 
We split dependencies between two groups, dependencies that are: -* required for your connector to work need to go to `MAIN_REQUIREMENTS` list. -* required for the testing need to go to `TEST_REQUIREMENTS` list + +- required for your connector to work need to go to `MAIN_REQUIREMENTS` list. +- required for the testing need to go to `TEST_REQUIREMENTS` list ### Publishing a new version of the connector + You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? + 1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-rki-covid test` 2. Bump the connector version in `metadata.yaml`: increment the `dockerImageTag` value. Please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors). 3. Make sure the `metadata.yaml` content is up to date. @@ -117,4 +137,3 @@ You've checked out the repo, implemented a million dollar feature, and you're re 5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention). 6. Pat yourself on the back for being an awesome contributor. 7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. - diff --git a/airbyte-integrations/connectors/source-rki-covid/bootstrap.md b/airbyte-integrations/connectors/source-rki-covid/bootstrap.md index de01a936889..82a9c35f09f 100644 --- a/airbyte-integrations/connectors/source-rki-covid/bootstrap.md +++ b/airbyte-integrations/connectors/source-rki-covid/bootstrap.md @@ -1,33 +1,37 @@ -The (Robert Koch-Institut - von Marlon Lückert) Covid-19 is [a REST based API](https://api.corona-zahlen.org/). +The (Robert Koch-Institut - von Marlon Lückert) Covid-19 is [a REST based API](https://api.corona-zahlen.org/). Connector is implemented with [Airbyte CDK](https://docs.airbyte.io/connector-development/cdk-python). ## Cases In Germany Covid api stream + The basic entry stream is 'germany'. All other streams are extended version of base stream and passing parameters also result in sliced data. -For production, every developer application can view multiple streams. +For production, every developer application can view multiple streams. 
## Endpoints -* [Provides covid cases and other information in Germany.](https://api.corona-zahlen.org/germany) \(Non-Incremental\ Entry-Stream) -* [Provides covid cases and other information in Germany, group by age.](https://api.corona-zahlen.org/germany/age-groups) \(Non-Incremental\) -* [Provides cases in Germany based on days.](https://api.corona-zahlen.org/germany/germany/history/cases/:days) \(Incremental\) -* [Provides incidence rate of covid in Germany based on days.](https://api.corona-zahlen.org/germany/germany/history/incidence/:days) \(Incremental\) -* [Provides death rate in Germany over days](https://api.corona-zahlen.org/germany/germany/history/deaths/:days) \(Incremental\) -* [Provides recovery rate in Germany over days.](https://api.corona-zahlen.org/germany/germany/history/recovered/:days) \(Incremental\) -* [Provides frozen incidence in Germany over days.](https://api.corona-zahlen.org/germany/germany/history/frozen-incidence/:days) \(Incremental\) -* [Provides hospitalization rate in Germany over days.](https://api.corona-zahlen.org/germany/germany/history/hospitalization/:days) \(Incremental\) + +- [Provides covid cases and other information in Germany.](https://api.corona-zahlen.org/germany) \(Non-Incremental\ Entry-Stream) +- [Provides covid cases and other information in Germany, group by age.](https://api.corona-zahlen.org/germany/age-groups) \(Non-Incremental\) +- [Provides cases in Germany based on days.](https://api.corona-zahlen.org/germany/germany/history/cases/:days) \(Incremental\) +- [Provides incidence rate of covid in Germany based on days.](https://api.corona-zahlen.org/germany/germany/history/incidence/:days) \(Incremental\) +- [Provides death rate in Germany over days](https://api.corona-zahlen.org/germany/germany/history/deaths/:days) \(Incremental\) +- [Provides recovery rate in Germany over days.](https://api.corona-zahlen.org/germany/germany/history/recovered/:days) \(Incremental\) +- [Provides frozen incidence in Germany over days.](https://api.corona-zahlen.org/germany/germany/history/frozen-incidence/:days) \(Incremental\) +- [Provides hospitalization rate in Germany over days.](https://api.corona-zahlen.org/germany/germany/history/hospitalization/:days) \(Incremental\) ## Cases In States Of Germany Covid api stream + The basic entry stream is 'GermanyStates'. All other streams are extended version of base stream and passing parameters also result in sliced data. -For production, every developer application can view multiple streams. +For production, every developer application can view multiple streams. 
## Endpoints -* [Provides covid cases and other information in Germany.](https://api.corona-zahlen.org/state) \(Non-Incremental\ Entry-Stream) -* [Provides covid cases and other information in Germany, group by age.](https://api.corona-zahlen.org/states/age-groupss) \(Non-Incremental\) -* [Provides cases in Germany based on days.](https://api.corona-zahlen.org/germany/states/history/cases/:days) \(Non-Incremental\) -* [Provides incidence rate of covid in Germany based on days.](https://api.corona-zahlen.org/germany/states/history/incidence/:days) \(Non-Incremental\) -* [Provides death rate in Germany over days](https://api.corona-zahlen.org/germany/states/history/deaths/:days) \(Non-Incremental\) -* [Provides recovery rate in Germany over days.](https://api.corona-zahlen.org/germany/states/history/recovered/:days) \(Non-Incremental\) -* [Provides frozen incidence in Germany over days.](https://api.corona-zahlen.org/germany/states/history/frozen-incidence/:days) \(Non-Incremental\) -* [Provides hospitalization rate in Germany over days.](https://api.corona-zahlen.org/germany/states/history/hospitalization/:days) \(Non-Incremental\) + +- [Provides covid cases and other information in Germany.](https://api.corona-zahlen.org/state) \(Non-Incremental\ Entry-Stream) +- [Provides covid cases and other information in Germany, group by age.](https://api.corona-zahlen.org/states/age-groupss) \(Non-Incremental\) +- [Provides cases in Germany based on days.](https://api.corona-zahlen.org/germany/states/history/cases/:days) \(Non-Incremental\) +- [Provides incidence rate of covid in Germany based on days.](https://api.corona-zahlen.org/germany/states/history/incidence/:days) \(Non-Incremental\) +- [Provides death rate in Germany over days](https://api.corona-zahlen.org/germany/states/history/deaths/:days) \(Non-Incremental\) +- [Provides recovery rate in Germany over days.](https://api.corona-zahlen.org/germany/states/history/recovered/:days) \(Non-Incremental\) +- [Provides frozen incidence in Germany over days.](https://api.corona-zahlen.org/germany/states/history/frozen-incidence/:days) \(Non-Incremental\) +- [Provides hospitalization rate in Germany over days.](https://api.corona-zahlen.org/germany/states/history/hospitalization/:days) \(Non-Incremental\) Incremental streams have required parameter start-date. Without passing start-date as parameter full-refresh occurs. -As cursor field this connector uses "date". \ No newline at end of file +As cursor field this connector uses "date". diff --git a/airbyte-integrations/connectors/source-rki-covid/source_rki_covid/schemas/TODO.md b/airbyte-integrations/connectors/source-rki-covid/source_rki_covid/schemas/TODO.md index cf1efadb3c9..0037aeb60d8 100644 --- a/airbyte-integrations/connectors/source-rki-covid/source_rki_covid/schemas/TODO.md +++ b/airbyte-integrations/connectors/source-rki-covid/source_rki_covid/schemas/TODO.md @@ -1,20 +1,25 @@ # TODO: Define your stream schemas -Your connector must describe the schema of each stream it can output using [JSONSchema](https://json-schema.org). -The simplest way to do this is to describe the schema of your streams using one `.json` file per stream. You can also dynamically generate the schema of your stream in code, or you can combine both approaches: start with a `.json` file and dynamically add properties to it. - +Your connector must describe the schema of each stream it can output using [JSONSchema](https://json-schema.org). 
+ +The simplest way to do this is to describe the schema of your streams using one `.json` file per stream. You can also dynamically generate the schema of your stream in code, or you can combine both approaches: start with a `.json` file and dynamically add properties to it. + The schema of a stream is the return value of `Stream.get_json_schema`. - + ## Static schemas + By default, `Stream.get_json_schema` reads a `.json` file in the `schemas/` directory whose name is equal to the value of the `Stream.name` property. In turn `Stream.name` by default returns the name of the class in snake case. Therefore, if you have a class `class EmployeeBenefits(HttpStream)` the default behavior will look for a file called `schemas/employee_benefits.json`. You can override any of these behaviors as you need. Important note: any objects referenced via `$ref` should be placed in the `shared/` directory in their own `.json` files. - + ## Dynamic schemas + If you'd rather define your schema in code, override `Stream.get_json_schema` in your stream class to return a `dict` describing the schema using [JSONSchema](https://json-schema.org). -## Dynamically modifying static schemas -Override `Stream.get_json_schema` to run the default behavior, edit the returned value, then return the edited value: +## Dynamically modifying static schemas + +Override `Stream.get_json_schema` to run the default behavior, edit the returned value, then return the edited value: + ``` def get_json_schema(self): schema = super().get_json_schema() @@ -22,4 +27,4 @@ def get_json_schema(self): return schema ``` -Delete this file once you're done. Or don't. Up to you :) +Delete this file once you're done. Or don't. Up to you :) diff --git a/airbyte-integrations/connectors/source-rocket-chat/README.md b/airbyte-integrations/connectors/source-rocket-chat/README.md index ed7f76f3a78..afae6b8a5af 100644 --- a/airbyte-integrations/connectors/source-rocket-chat/README.md +++ b/airbyte-integrations/connectors/source-rocket-chat/README.md @@ -6,6 +6,7 @@ For information about how to use this connector within Airbyte, see [the documen ## Local development #### Create credentials + **If you are a community contributor**, follow the instructions in the [documentation](https://docs.airbyte.io/integrations/sources/rocket-chat) to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `source_rocket_chat/spec.yaml` file. Note that any directory named `secrets` is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. @@ -16,9 +17,10 @@ and place them into `secrets/config.json`. ### Locally running the connector docker image - #### Build + **Via [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md) (recommended):** + ```bash airbyte-ci connectors --name=source-rocket-chat build ``` @@ -26,12 +28,15 @@ airbyte-ci connectors --name=source-rocket-chat build An image will be built with the tag `airbyte/source-rocket-chat:dev`. **Via `docker build`:** + ```bash docker build -t airbyte/source-rocket-chat:dev . 
``` #### Run + Then run any of the connector commands as follows: + ``` docker run --rm airbyte/source-rocket-chat:dev spec docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-rocket-chat:dev check --config /secrets/config.json @@ -40,23 +45,30 @@ docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integrat ``` ## Testing + You can run our full test suite locally using [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md): + ```bash airbyte-ci connectors --name=source-rocket-chat test ``` ### Customizing acceptance Tests + Customize `acceptance-test-config.yml` file to configure tests. See [Connector Acceptance Tests](https://docs.airbyte.com/connector-development/testing-connectors/connector-acceptance-tests-reference) for more information. If your connector requires to create or destroy resources for use during acceptance tests create fixtures for it and place them inside integration_tests/acceptance.py. ## Dependency Management + All of your dependencies should go in `setup.py`, NOT `requirements.txt`. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development. We split dependencies between two groups, dependencies that are: -* required for your connector to work need to go to `MAIN_REQUIREMENTS` list. -* required for the testing need to go to `TEST_REQUIREMENTS` list + +- required for your connector to work need to go to `MAIN_REQUIREMENTS` list. +- required for the testing need to go to `TEST_REQUIREMENTS` list ### Publishing a new version of the connector + You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? + 1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-rocket-chat test` 2. Bump the connector version in `metadata.yaml`: increment the `dockerImageTag` value. Please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors). 3. Make sure the `metadata.yaml` content is up to date. @@ -64,4 +76,3 @@ You've checked out the repo, implemented a million dollar feature, and you're re 5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention). 6. Pat yourself on the back for being an awesome contributor. 7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. 
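Several of the READMEs above say that fixtures for creating or destroying test resources belong in `integration_tests/acceptance.py`, but none shows one. A minimal sketch, assuming a hypothetical source whose test data must be provisioned before the acceptance tests run; the helper functions are placeholders, not real connector code:

```
# Hypothetical integration_tests/acceptance.py sketch. The create/delete helpers
# are placeholders for whatever provisioning a given source actually needs.
import pytest

# Hooks up the Connector Acceptance Test suite, as the acceptance-test docs describe.
pytest_plugins = ("connector_acceptance_test.plugin",)


@pytest.fixture(scope="session", autouse=True)
def connector_setup():
    """Create the resources the acceptance tests expect, then tear them down."""
    created_ids = create_test_records()  # placeholder: provision test data in the source
    yield
    delete_test_records(created_ids)     # placeholder: clean the test data back up


def create_test_records():
    # Placeholder implementation so the sketch is self-contained.
    return []


def delete_test_records(created_ids):
    # Placeholder implementation so the sketch is self-contained.
    pass
```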
- diff --git a/airbyte-integrations/connectors/source-rocket-chat/rocket-chat.md b/airbyte-integrations/connectors/source-rocket-chat/rocket-chat.md index 220997e6756..a41cc7883b2 100644 --- a/airbyte-integrations/connectors/source-rocket-chat/rocket-chat.md +++ b/airbyte-integrations/connectors/source-rocket-chat/rocket-chat.md @@ -6,19 +6,19 @@ This source can sync data from the [Rocket.chat API](https://developer.rocket.ch ## This Source Supports the Following Streams -* teams -* rooms -* channels -* roles -* subscriptions -* users +- teams +- rooms +- channels +- roles +- subscriptions +- users ### Features | Feature | Supported?\(Yes/No\) | Notes | -| :--* | :--* | :--* | -| Full Refresh Sync | Yes | | -| Incremental Sync | No | | +| :--- | :--- | :--- | +| Full Refresh Sync | Yes | | +| Incremental Sync | No | | ### Performance considerations @@ -36,6 +36,6 @@ You need to setup a personal access token within the Rocket.chat workspace, see ## Changelog -| Version | Date | Pull Request | Subject | -| :-----* | :--------* | :-------------------------------------------------------* | :----------------------------------------* | -| 0.1.0 | 2022-10-29 | [#18635](https://github.com/airbytehq/airbyte/pull/18635) | 🎉 New Source: Rocket.chat API [low-code CDK] | \ No newline at end of file +| Version | Date | Pull Request | Subject | +| :------ | :--------- | :--------------------------------------------------------- | :------------------------------------------- | +| 0.1.0 | 2022-10-29 | [#18635](https://github.com/airbytehq/airbyte/pull/18635) | 🎉 New Source: Rocket.chat API [low-code CDK] | diff --git a/airbyte-integrations/connectors/source-rss/README.md b/airbyte-integrations/connectors/source-rss/README.md index 9590d4472ad..a572dee6a67 100644 --- a/airbyte-integrations/connectors/source-rss/README.md +++ b/airbyte-integrations/connectors/source-rss/README.md @@ -7,19 +7,17 @@ For information about how to use this connector within Airbyte, see [the documen ### Prerequisites -* Python (`^3.9`) -* Poetry (`^1.7`) - installation instructions [here](https://python-poetry.org/docs/#installation) - - +- Python (`^3.9`) +- Poetry (`^1.7`) - installation instructions [here](https://python-poetry.org/docs/#installation) ### Installing the connector From this connector directory, run: + ```bash poetry install --with dev ``` - ### Create credentials **If you are a community contributor**, follow the instructions in the [documentation](https://docs.airbyte.com/integrations/sources/rss) @@ -27,7 +25,6 @@ to generate the necessary credentials. Then create a file `secrets/config.json` Note that any directory named `secrets` is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. See `integration_tests/sample_config.json` for a sample config file. - ### Locally running the connector ``` @@ -49,16 +46,17 @@ poetry run pytest tests 1. Install [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md) 2. Run the following command to build the docker image: + ```bash airbyte-ci connectors --name=source-rss build ``` An image will be available on your host with the tag `airbyte/source-rss:dev`. 
- ### Running as a docker container Then run any of the connector commands as follows: + ``` docker run --rm airbyte/source-rss:dev spec docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-rss:dev check --config /secrets/config.json @@ -69,6 +67,7 @@ docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integrat ### Running our CI test suite You can run our full test suite locally using [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md): + ```bash airbyte-ci connectors --name=source-rss test ``` @@ -80,8 +79,9 @@ If your connector requires to create or destroy resources for use during accepta ### Dependency Management -All of your dependencies should be managed via Poetry. +All of your dependencies should be managed via Poetry. To add a new dependency, run: + ```bash poetry add ``` @@ -91,10 +91,11 @@ Please commit the changes to `pyproject.toml` and `poetry.lock` files. ## Publishing a new version of the connector You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? + 1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-rss test` -2. Bump the connector version (please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors)): - - bump the `dockerImageTag` value in in `metadata.yaml` - - bump the `version` value in `pyproject.toml` +2. Bump the connector version (please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors)): + - bump the `dockerImageTag` value in in `metadata.yaml` + - bump the `version` value in `pyproject.toml` 3. Make sure the `metadata.yaml` content is up to date. 4. Make sure the connector documentation and its changelog is up to date (`docs/integrations/sources/rss.md`). 5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention). diff --git a/airbyte-integrations/connectors/source-s3/README.md b/airbyte-integrations/connectors/source-s3/README.md index 71cb2aa21b8..6b9ff77961a 100644 --- a/airbyte-integrations/connectors/source-s3/README.md +++ b/airbyte-integrations/connectors/source-s3/README.md @@ -1,31 +1,32 @@ # S3 source connector - This is the repository for the S3 source connector, written in Python. For information about how to use this connector within Airbyte, see [the documentation](https://docs.airbyte.com/integrations/sources/s3). ## Local development ### Prerequisites -* Python (~=3.9) -* Poetry (~=1.7) - installation instructions [here](https://python-poetry.org/docs/#installation) +- Python (~=3.9) +- Poetry (~=1.7) - installation instructions [here](https://python-poetry.org/docs/#installation) ### Installing the connector + From this connector directory, run: + ```bash poetry install --with dev ``` - ### Create credentials + **If you are a community contributor**, follow the instructions in the [documentation](https://docs.airbyte.com/integrations/sources/s3) to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `source_s3/spec.yaml` file. 
Note that any directory named `secrets` is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. See `sample_files/sample_config.json` for a sample config file. - ### Locally running the connector + ``` poetry run source-s3 spec poetry run source-s3 check --config secrets/config.json @@ -34,23 +35,28 @@ poetry run source-s3 read --config secrets/config.json --catalog sample_files/co ``` ### Running unit tests + To run unit tests locally, from the connector directory run: + ``` poetry run pytest unit_tests ``` ### Building the docker image + 1. Install [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md) 2. Run the following command to build the docker image: + ```bash airbyte-ci connectors --name=source-s3 build ``` An image will be available on your host with the tag `airbyte/source-s3:dev`. - ### Running as a docker container + Then run any of the connector commands as follows: + ``` docker run --rm airbyte/source-s3:dev spec docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-s3:dev check --config /secrets/config.json @@ -59,18 +65,23 @@ docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integrat ``` ### Running our CI test suite + You can run our full test suite locally using [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md): + ```bash airbyte-ci connectors --name=source-s3 test ``` ### Customizing acceptance Tests + Customize `acceptance-test-config.yml` file to configure acceptance tests. See [Connector Acceptance Tests](https://docs.airbyte.com/connector-development/testing-connectors/connector-acceptance-tests-reference) for more information. If your connector requires to create or destroy resources for use during acceptance tests create fixtures for it and place them inside integration_tests/acceptance.py. ### Dependency Management -All of your dependencies should be managed via Poetry. + +All of your dependencies should be managed via Poetry. To add a new dependency, run: + ```bash poetry add ``` @@ -78,14 +89,16 @@ poetry add Please commit the changes to `pyproject.toml` and `poetry.lock` files. ## Publishing a new version of the connector + You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? + 1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-s3 test` -2. Bump the connector version (please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors)): - - bump the `dockerImageTag` value in in `metadata.yaml` - - bump the `version` value in `pyproject.toml` +2. Bump the connector version (please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors)): + - bump the `dockerImageTag` value in in `metadata.yaml` + - bump the `version` value in `pyproject.toml` 3. Make sure the `metadata.yaml` content is up to date. 4. Make sure the connector documentation and its changelog is up to date (`docs/integrations/sources/s3.md`). 5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention). 6. Pat yourself on the back for being an awesome contributor. 7. 
Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. -8. Once your PR is merged, the new version of the connector will be automatically published to Docker Hub and our connector registry. \ No newline at end of file +8. Once your PR is merged, the new version of the connector will be automatically published to Docker Hub and our connector registry. diff --git a/airbyte-integrations/connectors/source-salesforce/BOOTSTRAP.md b/airbyte-integrations/connectors/source-salesforce/BOOTSTRAP.md index 943fb5c4e4f..abbc232b88c 100644 --- a/airbyte-integrations/connectors/source-salesforce/BOOTSTRAP.md +++ b/airbyte-integrations/connectors/source-salesforce/BOOTSTRAP.md @@ -1,32 +1,36 @@ -The Salesforce API can be used to pull any objects that live in the user’s SF instance. -There are two types of objects: +The Salesforce API can be used to pull any objects that live in the user’s SF instance. +There are two types of objects: - * **Standard**: Those are the same across all SF instances and have a static schema - * **Custom**: These are specific to each user’s instance. A user creates a custom object type by creating it in the UI. - Think of each custom object like a SQL table with a pre-defined schema. The schema of the object can be discovered through the - [Describe](https://developer.salesforce.com/docs/atlas.en-us.api_rest.meta/api_rest/resources_sobject_describe.htm) endpoint on the API. - Then when pulling those objects via API one expect them to conform to the schema declared by the endpoint. +- **Standard**: Those are the same across all SF instances and have a static schema +- **Custom**: These are specific to each user’s instance. A user creates a custom object type by creating it in the UI. + Think of each custom object like a SQL table with a pre-defined schema. The schema of the object can be discovered through the + [Describe](https://developer.salesforce.com/docs/atlas.en-us.api_rest.meta/api_rest/resources_sobject_describe.htm) endpoint on the API. + Then when pulling those objects via API one expect them to conform to the schema declared by the endpoint. -To query an object, one must use [SOQL](https://developer.salesforce.com/docs/atlas.en-us.api_rest.meta/api_rest/dome_query.htm), Salesforce’s proprietary SQL language. +To query an object, one must use [SOQL](https://developer.salesforce.com/docs/atlas.en-us.api_rest.meta/api_rest/dome_query.htm), Salesforce’s proprietary SQL language. An example might be `SELECT * FROM WHERE SystemModstamp > 2122-01-18T21:18:20.000Z`. -Because the `Salesforce` connector pulls all objects from `Salesforce` dynamically, then all streams are dynamically generated accordingly. -And at the stage of creating a schema for each stream, we understand whether the stream is dynamic or not (if the stream has one of the -following fields: `SystemModstamp`, `LastModifiedDate`, `CreatedDate`, `LoginTime`, then it is dynamic). -Based on this data, for streams that have information about record updates - we filter by `updated at`, and for streams that have information +Because the `Salesforce` connector pulls all objects from `Salesforce` dynamically, then all streams are dynamically generated accordingly. +And at the stage of creating a schema for each stream, we understand whether the stream is dynamic or not (if the stream has one of the +following fields: `SystemModstamp`, `LastModifiedDate`, `CreatedDate`, `LoginTime`, then it is dynamic). 
+Based on this data, for streams that have information about record updates - we filter by `updated at`, and for streams that have information only about the date of creation of the record (as in the case of streams that have only the `CreatedDate` field) - we filter by `created at`. And we assign the Cursor as follows: + ``` @property def cursor_field(self) -> str: return self.replication_key ``` + `replication_key` is one of the following values: `SystemModstamp`, `LastModifiedDate`, `CreatedDate`, `LoginTime`. In addition there are two types of APIs exposed by Salesforce: - * **[REST API](https://developer.salesforce.com/docs/atlas.en-us.api_rest.meta/api_rest/dome_queryall.htm)**: completely synchronous - * **[BULK API](https://developer.salesforce.com/docs/atlas.en-us.api_asynch.meta/api_asynch/queries.htm)**: has larger rate limit allowance (150k objects per day on the standard plan) but is asynchronous and therefore follows a request-poll-wait pattern. - + +- **[REST API](https://developer.salesforce.com/docs/atlas.en-us.api_rest.meta/api_rest/dome_queryall.htm)**: completely synchronous +- **[BULK API](https://developer.salesforce.com/docs/atlas.en-us.api_asynch.meta/api_asynch/queries.htm)**: has larger rate limit allowance (150k objects per day on the standard plan) but is asynchronous and therefore follows a request-poll-wait pattern. + See the links below for information about specific streams and some nuances about the connector: + - [information about streams](https://docs.google.com/spreadsheets/d/1s-MAwI5d3eBlBOD8II_sZM7pw5FmZtAJsx1KJjVRFNU/edit#gid=1796337932) (`Salesforce` tab) - [nuances about the connector](https://docs.airbyte.io/integrations/sources/salesforce) diff --git a/airbyte-integrations/connectors/source-salesforce/README.md b/airbyte-integrations/connectors/source-salesforce/README.md index 3c68cf4b526..cfe401b8d55 100644 --- a/airbyte-integrations/connectors/source-salesforce/README.md +++ b/airbyte-integrations/connectors/source-salesforce/README.md @@ -1,31 +1,32 @@ # Salesforce source connector - This is the repository for the Salesforce source connector, written in Python. For information about how to use this connector within Airbyte, see [the documentation](https://docs.airbyte.com/integrations/sources/salesforce). ## Local development ### Prerequisites -* Python (~=3.9) -* Poetry (~=1.7) - installation instructions [here](https://python-poetry.org/docs/#installation) +- Python (~=3.9) +- Poetry (~=1.7) - installation instructions [here](https://python-poetry.org/docs/#installation) ### Installing the connector + From this connector directory, run: + ```bash poetry install --with dev ``` - ### Create credentials + **If you are a community contributor**, follow the instructions in the [documentation](https://docs.airbyte.com/integrations/sources/salesforce) to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `source_salesforce/spec.yaml` file. Note that any directory named `secrets` is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. See `sample_files/sample_config.json` for a sample config file. 
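The Salesforce BOOTSTRAP.md above explains that a stream is treated as incremental only when it exposes one of `SystemModstamp`, `LastModifiedDate`, `CreatedDate`, or `LoginTime`, and that `cursor_field` simply returns the chosen `replication_key`. A minimal sketch of that selection logic; the class and attribute names are illustrative, not the connector's actual implementation:

```
# Illustrative sketch of the cursor-selection idea described in BOOTSTRAP.md above.
# `IllustrativeSalesforceStream` and `fields` are hypothetical, not real connector code.
from typing import List, Optional


class IllustrativeSalesforceStream:
    # Order matters: prefer fields that track record updates over creation-only fields.
    CURSOR_CANDIDATES = ["SystemModstamp", "LastModifiedDate", "CreatedDate", "LoginTime"]

    def __init__(self, fields: List[str]):
        self.fields = fields

    @property
    def replication_key(self) -> Optional[str]:
        # A stream is "dynamic" (incremental) only if it exposes one of the candidate fields.
        for candidate in self.CURSOR_CANDIDATES:
            if candidate in self.fields:
                return candidate
        return None

    @property
    def cursor_field(self) -> str:
        # Mirrors the snippet in BOOTSTRAP.md: the cursor is simply the replication key
        # (assumes the stream was already determined to be dynamic).
        return self.replication_key


# Example: a stream exposing SystemModstamp would sync incrementally on that field.
# IllustrativeSalesforceStream(["Id", "Name", "SystemModstamp"]).cursor_field == "SystemModstamp"
```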
- ### Locally running the connector + ``` poetry run source-salesforce spec poetry run source-salesforce check --config secrets/config.json @@ -34,23 +35,28 @@ poetry run source-salesforce read --config secrets/config.json --catalog sample_ ``` ### Running unit tests + To run unit tests locally, from the connector directory run: + ``` poetry run pytest unit_tests ``` ### Building the docker image + 1. Install [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md) 2. Run the following command to build the docker image: + ```bash airbyte-ci connectors --name=source-salesforce build ``` An image will be available on your host with the tag `airbyte/source-salesforce:dev`. - ### Running as a docker container + Then run any of the connector commands as follows: + ``` docker run --rm airbyte/source-salesforce:dev spec docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-salesforce:dev check --config /secrets/config.json @@ -59,18 +65,23 @@ docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integrat ``` ### Running our CI test suite + You can run our full test suite locally using [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md): + ```bash airbyte-ci connectors --name=source-salesforce test ``` ### Customizing acceptance Tests + Customize `acceptance-test-config.yml` file to configure acceptance tests. See [Connector Acceptance Tests](https://docs.airbyte.com/connector-development/testing-connectors/connector-acceptance-tests-reference) for more information. If your connector requires to create or destroy resources for use during acceptance tests create fixtures for it and place them inside integration_tests/acceptance.py. ### Dependency Management -All of your dependencies should be managed via Poetry. + +All of your dependencies should be managed via Poetry. To add a new dependency, run: + ```bash poetry add ``` @@ -78,14 +89,16 @@ poetry add Please commit the changes to `pyproject.toml` and `poetry.lock` files. ## Publishing a new version of the connector + You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? + 1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-salesforce test` -2. Bump the connector version (please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors)): - - bump the `dockerImageTag` value in in `metadata.yaml` - - bump the `version` value in `pyproject.toml` +2. Bump the connector version (please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors)): + - bump the `dockerImageTag` value in in `metadata.yaml` + - bump the `version` value in `pyproject.toml` 3. Make sure the `metadata.yaml` content is up to date. 4. Make sure the connector documentation and its changelog is up to date (`docs/integrations/sources/salesforce.md`). 5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention). 6. Pat yourself on the back for being an awesome contributor. 7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. -8. 
Once your PR is merged, the new version of the connector will be automatically published to Docker Hub and our connector registry. \ No newline at end of file +8. Once your PR is merged, the new version of the connector will be automatically published to Docker Hub and our connector registry. diff --git a/airbyte-integrations/connectors/source-salesloft/README.md b/airbyte-integrations/connectors/source-salesloft/README.md index 844a841d9c0..9d1e73ab7ca 100644 --- a/airbyte-integrations/connectors/source-salesloft/README.md +++ b/airbyte-integrations/connectors/source-salesloft/README.md @@ -6,22 +6,27 @@ For information about how to use this connector within Airbyte, see [the documen ## Local development ### Prerequisites + **To iterate on this connector, make sure to complete this prerequisites section.** #### Minimum Python version required `= 3.7.0` #### Build & Activate Virtual Environment and install dependencies + From this connector directory, create a virtual environment: + ``` python -m venv .venv ``` This will generate a virtualenv for this module in `.venv/`. Make sure this venv is active in your development environment of choice. To activate it from the terminal, run: + ``` source .venv/bin/activate pip install -r requirements.txt ``` + If you are in an IDE, follow your IDE's instructions to activate the virtualenv. Note that while we are installing dependencies from `requirements.txt`, you should only edit `setup.py` for your dependencies. `requirements.txt` is @@ -30,6 +35,7 @@ If this is mumbo jumbo to you, don't worry about it, just put your deps in `setu should work as you expect. #### Create credentials + **If you are a community contributor**, follow the instructions in the [documentation](https://docs.airbyte.io/integrations/sources/salesloft) to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `source_salesloft/spec.json` file. Note that any directory named `secrets` is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. @@ -39,6 +45,7 @@ See `integration_tests/sample_config.json` for a sample config file. and place them into `secrets/config.json`. ### Locally running the connector + ``` python main.py spec python main.py check --config secrets/config.json @@ -48,9 +55,10 @@ python main.py read --config secrets/config.json --catalog integration_tests/con ### Locally running the connector docker image - #### Build + **Via [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md) (recommended):** + ```bash airbyte-ci connectors --name=source-salesloft build ``` @@ -58,12 +66,15 @@ airbyte-ci connectors --name=source-salesloft build An image will be built with the tag `airbyte/source-salesloft:dev`. **Via `docker build`:** + ```bash docker build -t airbyte/source-salesloft:dev . 
``` #### Run + Then run any of the connector commands as follows: + ``` docker run --rm airbyte/source-salesloft:dev spec docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-salesloft:dev check --config /secrets/config.json @@ -72,23 +83,30 @@ docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integrat ``` ## Testing + You can run our full test suite locally using [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md): + ```bash airbyte-ci connectors --name=source-salesloft test ``` ### Customizing acceptance Tests + Customize `acceptance-test-config.yml` file to configure tests. See [Connector Acceptance Tests](https://docs.airbyte.com/connector-development/testing-connectors/connector-acceptance-tests-reference) for more information. If your connector requires to create or destroy resources for use during acceptance tests create fixtures for it and place them inside integration_tests/acceptance.py. ## Dependency Management + All of your dependencies should go in `setup.py`, NOT `requirements.txt`. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development. We split dependencies between two groups, dependencies that are: -* required for your connector to work need to go to `MAIN_REQUIREMENTS` list. -* required for the testing need to go to `TEST_REQUIREMENTS` list + +- required for your connector to work need to go to `MAIN_REQUIREMENTS` list. +- required for the testing need to go to `TEST_REQUIREMENTS` list ### Publishing a new version of the connector + You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? + 1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-salesloft test` 2. Bump the connector version in `metadata.yaml`: increment the `dockerImageTag` value. Please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors). 3. Make sure the `metadata.yaml` content is up to date. @@ -96,4 +114,3 @@ You've checked out the repo, implemented a million dollar feature, and you're re 5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention). 6. Pat yourself on the back for being an awesome contributor. 7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. - diff --git a/airbyte-integrations/connectors/source-sap-fieldglass/README.md b/airbyte-integrations/connectors/source-sap-fieldglass/README.md index c503551e7f9..cfc00439aa3 100644 --- a/airbyte-integrations/connectors/source-sap-fieldglass/README.md +++ b/airbyte-integrations/connectors/source-sap-fieldglass/README.md @@ -6,6 +6,7 @@ For information about how to use this connector within Airbyte, see [the documen ## Local development #### Create credentials + **If you are a community contributor**, follow the instructions in the [documentation](https://docs.airbyte.io/integrations/sources/sap-fieldglass) to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `source_sap_fieldglass/spec.yaml` file. Note that any directory named `secrets` is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. 
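As a quick sanity check on the `secrets/config.json` described above, here is a minimal sketch that simply confirms the file exists and parses as JSON before you run the connector; the path is the conventional one relative to the connector directory, and the printed keys are whatever your own config happens to contain.

```python
import json
from pathlib import Path

# Conventional location relative to the connector directory; adjust if yours differs.
config_path = Path("secrets/config.json")

with config_path.open() as f:
    config = json.load(f)  # raises if the file is missing or is not valid JSON

# List the top-level keys you provided, without printing any secret values.
print(sorted(config.keys()))
```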
@@ -16,9 +17,10 @@ and place them into `secrets/config.json`. ### Locally running the connector docker image - #### Build + **Via [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md) (recommended):** + ```bash airbyte-ci connectors --name=source-sap-fieldglass build ``` @@ -26,12 +28,15 @@ airbyte-ci connectors --name=source-sap-fieldglass build An image will be built with the tag `airbyte/source-sap-fieldglass:dev`. **Via `docker build`:** + ```bash docker build -t airbyte/source-sap-fieldglass:dev . ``` #### Run + Then run any of the connector commands as follows: + ``` docker run --rm airbyte/source-sap-fieldglass:dev spec docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-sap-fieldglass:dev check --config /secrets/config.json @@ -40,23 +45,30 @@ docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integrat ``` ## Testing + You can run our full test suite locally using [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md): + ```bash airbyte-ci connectors --name=source-sap-fieldglass test ``` ### Customizing acceptance Tests + Customize `acceptance-test-config.yml` file to configure tests. See [Connector Acceptance Tests](https://docs.airbyte.com/connector-development/testing-connectors/connector-acceptance-tests-reference) for more information. If your connector requires to create or destroy resources for use during acceptance tests create fixtures for it and place them inside integration_tests/acceptance.py. ## Dependency Management + All of your dependencies should go in `setup.py`, NOT `requirements.txt`. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development. We split dependencies between two groups, dependencies that are: -* required for your connector to work need to go to `MAIN_REQUIREMENTS` list. -* required for the testing need to go to `TEST_REQUIREMENTS` list + +- required for your connector to work need to go to `MAIN_REQUIREMENTS` list. +- required for the testing need to go to `TEST_REQUIREMENTS` list ### Publishing a new version of the connector + You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? + 1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-sap-fieldglass test` 2. Bump the connector version in `metadata.yaml`: increment the `dockerImageTag` value. Please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors). 3. Make sure the `metadata.yaml` content is up to date. @@ -64,4 +76,3 @@ You've checked out the repo, implemented a million dollar feature, and you're re 5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention). 6. Pat yourself on the back for being an awesome contributor. 7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. 
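To make the `MAIN_REQUIREMENTS`/`TEST_REQUIREMENTS` split described in the Dependency Management section above concrete, here is a minimal sketch of the corresponding `setup.py`; the package name and the listed dependencies are placeholders, not the connector's actual requirements.

```python
from setuptools import find_packages, setup

# Runtime dependencies the connector needs to work (placeholders).
MAIN_REQUIREMENTS = [
    "airbyte-cdk",
    "requests",
]

# Dependencies only needed to run the test suites (placeholders).
TEST_REQUIREMENTS = [
    "pytest",
    "requests-mock",
]

setup(
    name="source_example",  # placeholder package name
    packages=find_packages(),
    install_requires=MAIN_REQUIREMENTS,
    extras_require={"tests": TEST_REQUIREMENTS},
)
```

Packages in `MAIN_REQUIREMENTS` are installed with the connector itself, while `TEST_REQUIREMENTS` are only pulled in when the test extra is requested.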
- diff --git a/airbyte-integrations/connectors/source-scaffold-java-jdbc/README.md b/airbyte-integrations/connectors/source-scaffold-java-jdbc/README.md index 31ae071f64b..640d35d740d 100644 --- a/airbyte-integrations/connectors/source-scaffold-java-jdbc/README.md +++ b/airbyte-integrations/connectors/source-scaffold-java-jdbc/README.md @@ -6,12 +6,15 @@ For information about how to use this connector within Airbyte, see [the User Do ## Local development #### Building via Gradle + From the Airbyte repository root, run: + ``` ./gradlew :airbyte-integrations:connectors:source-scaffold-java-jdbc:build ``` #### Create credentials + **If you are a community contributor**, generate the necessary credentials and place them in `secrets/config.json` conforming to the spec file in `src/main/resources/spec.json`. Note that the `secrets` directory is git-ignored by default, so there is no danger of accidentally checking in sensitive information. @@ -20,7 +23,9 @@ Note that the `secrets` directory is git-ignored by default, so there is no dang ### Locally running the connector docker image #### Build + Build the connector image via Gradle: + ``` ./gradlew :airbyte-integrations:connectors:source-scaffold-java-jdbc:buildConnectorImage ``` @@ -28,7 +33,9 @@ Build the connector image via Gradle: Once built, the docker image name and tag will be `airbyte/source-scaffold-java-jdbc:dev`. #### Run + Then run any of the connector commands as follows: + ``` docker run --rm airbyte/source-scaffold-java-jdbc:dev spec docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-scaffold-java-jdbc:dev check --config /secrets/config.json @@ -37,23 +44,30 @@ docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integrat ``` ## Testing + We use `JUnit` for Java tests. ### Unit and Integration Tests + Place unit tests under `src/test/...` -Place integration tests in `src/test-integration/...` +Place integration tests in `src/test-integration/...` #### Acceptance Tests + Airbyte has a standard test suite that all source connectors must pass. Implement the `TODO`s in `src/test-integration/java/io/airbyte/integrations/sources/scaffold_java_jdbcSourceAcceptanceTest.java`. ### Using gradle to run tests + All commands should be run from airbyte project root. To run unit tests: + ``` ./gradlew :airbyte-integrations:connectors:source-scaffold-java-jdbc:unitTest ``` + To run acceptance and custom integration tests: + ``` ./gradlew :airbyte-integrations:connectors:source-scaffold-java-jdbc:integrationTest ``` @@ -61,7 +75,9 @@ To run acceptance and custom integration tests: ## Dependency Management ### Publishing a new version of the connector + You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? + 1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-scaffold-java-jdbc test` 2. Bump the connector version in `metadata.yaml`: increment the `dockerImageTag` value. Please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors). 3. Make sure the `metadata.yaml` content is up to date. @@ -69,4 +85,3 @@ You've checked out the repo, implemented a million dollar feature, and you're re 5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention). 6. 
Pat yourself on the back for being an awesome contributor. 7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. - diff --git a/airbyte-integrations/connectors/source-scaffold-source-http/README.md b/airbyte-integrations/connectors/source-scaffold-source-http/README.md index 2a71d428732..3de5eda08c9 100644 --- a/airbyte-integrations/connectors/source-scaffold-source-http/README.md +++ b/airbyte-integrations/connectors/source-scaffold-source-http/README.md @@ -7,19 +7,17 @@ For information about how to use this connector within Airbyte, see [the documen ### Prerequisites -* Python (`^3.9`) -* Poetry (`^1.7`) - installation instructions [here](https://python-poetry.org/docs/#installation) - - +- Python (`^3.9`) +- Poetry (`^1.7`) - installation instructions [here](https://python-poetry.org/docs/#installation) ### Installing the connector From this connector directory, run: + ```bash poetry install --with dev ``` - ### Create credentials **If you are a community contributor**, follow the instructions in the [documentation](https://docs.airbyte.com/integrations/sources/scaffold-source-http) @@ -27,7 +25,6 @@ to generate the necessary credentials. Then create a file `secrets/config.json` Note that any directory named `secrets` is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. See `sample_files/sample_config.json` for a sample config file. - ### Locally running the connector ``` @@ -49,16 +46,17 @@ poetry run pytest tests 1. Install [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md) 2. Run the following command to build the docker image: + ```bash airbyte-ci connectors --name=source-scaffold-source-http build ``` An image will be available on your host with the tag `airbyte/source-scaffold-source-http:dev`. - ### Running as a docker container Then run any of the connector commands as follows: + ``` docker run --rm airbyte/source-scaffold-source-http:dev spec docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-scaffold-source-http:dev check --config /secrets/config.json @@ -81,7 +79,7 @@ If your connector requires to create or destroy resources for use during accepta ### Dependency Management -All of your dependencies should be managed via Poetry. +All of your dependencies should be managed via Poetry. To add a new dependency, run: ```bash @@ -93,13 +91,14 @@ Please commit the changes to `pyproject.toml` and `poetry.lock` files. ## Publishing a new version of the connector You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? + 1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-scaffold-source-http test` -2. Bump the connector version (please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors)): - - bump the `dockerImageTag` value in in `metadata.yaml` - - bump the `version` value in `pyproject.toml` +2. Bump the connector version (please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors)): + - bump the `dockerImageTag` value in in `metadata.yaml` + - bump the `version` value in `pyproject.toml` 3. Make sure the `metadata.yaml` content is up to date. 4. 
Make sure the connector documentation and its changelog is up to date (`docs/integrations/sources/scaffold-source-http.md`). 5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention). 6. Pat yourself on the back for being an awesome contributor. 7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. -8. Once your PR is merged, the new version of the connector will be automatically published to Docker Hub and our connector registry. \ No newline at end of file +8. Once your PR is merged, the new version of the connector will be automatically published to Docker Hub and our connector registry. diff --git a/airbyte-integrations/connectors/source-scaffold-source-http/src/source_scaffold_source_http/schemas/TODO.md b/airbyte-integrations/connectors/source-scaffold-source-http/src/source_scaffold_source_http/schemas/TODO.md index cf1efadb3c9..0037aeb60d8 100644 --- a/airbyte-integrations/connectors/source-scaffold-source-http/src/source_scaffold_source_http/schemas/TODO.md +++ b/airbyte-integrations/connectors/source-scaffold-source-http/src/source_scaffold_source_http/schemas/TODO.md @@ -1,20 +1,25 @@ # TODO: Define your stream schemas -Your connector must describe the schema of each stream it can output using [JSONSchema](https://json-schema.org). -The simplest way to do this is to describe the schema of your streams using one `.json` file per stream. You can also dynamically generate the schema of your stream in code, or you can combine both approaches: start with a `.json` file and dynamically add properties to it. - +Your connector must describe the schema of each stream it can output using [JSONSchema](https://json-schema.org). + +The simplest way to do this is to describe the schema of your streams using one `.json` file per stream. You can also dynamically generate the schema of your stream in code, or you can combine both approaches: start with a `.json` file and dynamically add properties to it. + The schema of a stream is the return value of `Stream.get_json_schema`. - + ## Static schemas + By default, `Stream.get_json_schema` reads a `.json` file in the `schemas/` directory whose name is equal to the value of the `Stream.name` property. In turn `Stream.name` by default returns the name of the class in snake case. Therefore, if you have a class `class EmployeeBenefits(HttpStream)` the default behavior will look for a file called `schemas/employee_benefits.json`. You can override any of these behaviors as you need. Important note: any objects referenced via `$ref` should be placed in the `shared/` directory in their own `.json` files. - + ## Dynamic schemas + If you'd rather define your schema in code, override `Stream.get_json_schema` in your stream class to return a `dict` describing the schema using [JSONSchema](https://json-schema.org). -## Dynamically modifying static schemas -Override `Stream.get_json_schema` to run the default behavior, edit the returned value, then return the edited value: +## Dynamically modifying static schemas + +Override `Stream.get_json_schema` to run the default behavior, edit the returned value, then return the edited value: + ``` def get_json_schema(self): schema = super().get_json_schema() @@ -22,4 +27,4 @@ def get_json_schema(self): return schema ``` -Delete this file once you're done. Or don't. Up to you :) +Delete this file once you're done. Or don't. 
Up to you :) diff --git a/airbyte-integrations/connectors/source-scaffold-source-python/README.md b/airbyte-integrations/connectors/source-scaffold-source-python/README.md index 1f11ec7cdaa..da910390e45 100644 --- a/airbyte-integrations/connectors/source-scaffold-source-python/README.md +++ b/airbyte-integrations/connectors/source-scaffold-source-python/README.md @@ -7,19 +7,17 @@ For information about how to use this connector within Airbyte, see [the documen ### Prerequisites -* Python (`^3.9`) -* Poetry (`^1.7`) - installation instructions [here](https://python-poetry.org/docs/#installation) - - +- Python (`^3.9`) +- Poetry (`^1.7`) - installation instructions [here](https://python-poetry.org/docs/#installation) ### Installing the connector From this connector directory, run: + ```bash poetry install --with dev ``` - ### Create credentials **If you are a community contributor**, follow the instructions in the [documentation](https://docs.airbyte.com/integrations/sources/scaffold-source-python) @@ -27,7 +25,6 @@ to generate the necessary credentials. Then create a file `secrets/config.json` Note that any directory named `secrets` is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. See `sample_files/sample_config.json` for a sample config file. - ### Locally running the connector ``` @@ -49,16 +46,17 @@ poetry run pytest tests 1. Install [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md) 2. Run the following command to build the docker image: + ```bash airbyte-ci connectors --name=source-scaffold-source-python build ``` An image will be available on your host with the tag `airbyte/source-scaffold-source-python:dev`. - ### Running as a docker container Then run any of the connector commands as follows: + ``` docker run --rm airbyte/source-scaffold-source-python:dev spec docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-scaffold-source-python:dev check --config /secrets/config.json @@ -67,7 +65,9 @@ docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integrat ``` ### Running our CI test suite + You can run our full test suite locally using [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md): + ```bash airbyte-ci connectors --name=source-scaffold-source-python test ``` @@ -79,8 +79,9 @@ If your connector requires to create or destroy resources for use during accepta ### Dependency Management -All of your dependencies should be managed via Poetry. +All of your dependencies should be managed via Poetry. To add a new dependency, run: + ```bash poetry add ``` @@ -90,13 +91,14 @@ Please commit the changes to `pyproject.toml` and `poetry.lock` files. ## Publishing a new version of the connector You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? + 1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-scaffold-source-python test` -2. Bump the connector version (please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors)): - - bump the `dockerImageTag` value in in `metadata.yaml` - - bump the `version` value in `pyproject.toml` +2. 
Bump the connector version (please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors)): + - bump the `dockerImageTag` value in in `metadata.yaml` + - bump the `version` value in `pyproject.toml` 3. Make sure the `metadata.yaml` content is up to date. 4. Make sure the connector documentation and its changelog is up to date (`docs/integrations/sources/scaffold-source-python.md`). 5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention). 6. Pat yourself on the back for being an awesome contributor. 7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. -8. Once your PR is merged, the new version of the connector will be automatically published to Docker Hub and our connector registry. \ No newline at end of file +8. Once your PR is merged, the new version of the connector will be automatically published to Docker Hub and our connector registry. diff --git a/airbyte-integrations/connectors/source-secoda/README.md b/airbyte-integrations/connectors/source-secoda/README.md index 3c42e6b401a..a8a2b34984d 100644 --- a/airbyte-integrations/connectors/source-secoda/README.md +++ b/airbyte-integrations/connectors/source-secoda/README.md @@ -6,6 +6,7 @@ For information about how to use this connector within Airbyte, see [the documen ## Local development #### Create credentials + **If you are a community contributor**, follow the instructions in the [documentation](https://docs.airbyte.io/integrations/sources/secoda) to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `source_secoda/spec.yaml` file. Note that any directory named `secrets` is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. @@ -16,9 +17,10 @@ and place them into `secrets/config.json`. ### Locally running the connector docker image - #### Build + **Via [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md) (recommended):** + ```bash airbyte-ci connectors --name=source-secoda build ``` @@ -26,12 +28,15 @@ airbyte-ci connectors --name=source-secoda build An image will be built with the tag `airbyte/source-secoda:dev`. **Via `docker build`:** + ```bash docker build -t airbyte/source-secoda:dev . ``` #### Run + Then run any of the connector commands as follows: + ``` docker run --rm airbyte/source-secoda:dev spec docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-secoda:dev check --config /secrets/config.json @@ -40,23 +45,30 @@ docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integrat ``` ## Testing + You can run our full test suite locally using [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md): + ```bash airbyte-ci connectors --name=source-secoda test ``` ### Customizing acceptance Tests + Customize `acceptance-test-config.yml` file to configure tests. See [Connector Acceptance Tests](https://docs.airbyte.com/connector-development/testing-connectors/connector-acceptance-tests-reference) for more information. If your connector requires to create or destroy resources for use during acceptance tests create fixtures for it and place them inside integration_tests/acceptance.py. 
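For connectors that do need to create or destroy resources around acceptance tests, a minimal sketch of what `integration_tests/acceptance.py` can look like is shown below; the plugin name follows the pattern used by generated Python connectors (verify against an existing connector's file), and the setup/teardown bodies are placeholders for whatever your connector actually provisions.

```python
# integration_tests/acceptance.py — sketch of a session-wide setup/teardown fixture.
import pytest

pytest_plugins = ("connector_acceptance_test.plugin",)


@pytest.fixture(scope="session", autouse=True)
def connector_setup():
    # Create any external resources the acceptance tests depend on here.
    yield
    # Tear those resources down here once the test session finishes.
```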
## Dependency Management + All of your dependencies should go in `setup.py`, NOT `requirements.txt`. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development. We split dependencies between two groups, dependencies that are: -* required for your connector to work need to go to `MAIN_REQUIREMENTS` list. -* required for the testing need to go to `TEST_REQUIREMENTS` list + +- required for your connector to work need to go to `MAIN_REQUIREMENTS` list. +- required for the testing need to go to `TEST_REQUIREMENTS` list ### Publishing a new version of the connector + You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? + 1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-secoda test` 2. Bump the connector version in `metadata.yaml`: increment the `dockerImageTag` value. Please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors). 3. Make sure the `metadata.yaml` content is up to date. @@ -64,4 +76,3 @@ You've checked out the repo, implemented a million dollar feature, and you're re 5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention). 6. Pat yourself on the back for being an awesome contributor. 7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. - diff --git a/airbyte-integrations/connectors/source-sendgrid/README.md b/airbyte-integrations/connectors/source-sendgrid/README.md index 5745cb704aa..9393901ea35 100644 --- a/airbyte-integrations/connectors/source-sendgrid/README.md +++ b/airbyte-integrations/connectors/source-sendgrid/README.md @@ -1,31 +1,32 @@ # Sendgrid source connector - This is the repository for the Sendgrid source connector, written in Python. For information about how to use this connector within Airbyte, see [the documentation](https://docs.airbyte.com/integrations/sources/sendgrid). ## Local development ### Prerequisites -* Python (~=3.9) -* Poetry (~=1.7) - installation instructions [here](https://python-poetry.org/docs/#installation) +- Python (~=3.9) +- Poetry (~=1.7) - installation instructions [here](https://python-poetry.org/docs/#installation) ### Installing the connector + From this connector directory, run: + ```bash poetry install --with dev ``` - ### Create credentials + **If you are a community contributor**, follow the instructions in the [documentation](https://docs.airbyte.com/integrations/sources/sendgrid) to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `source_sendgrid/spec.yaml` file. Note that any directory named `secrets` is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. See `sample_files/sample_config.json` for a sample config file. - ### Locally running the connector + ``` poetry run source-sendgrid spec poetry run source-sendgrid check --config secrets/config.json @@ -34,23 +35,28 @@ poetry run source-sendgrid read --config secrets/config.json --catalog integrati ``` ### Running unit tests + To run unit tests locally, from the connector directory run: + ``` poetry run pytest unit_tests ``` ### Building the docker image + 1. 
Install [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md) 2. Run the following command to build the docker image: + ```bash airbyte-ci connectors --name=source-sendgrid build ``` An image will be available on your host with the tag `airbyte/source-sendgrid:dev`. - ### Running as a docker container + Then run any of the connector commands as follows: + ``` docker run --rm airbyte/source-sendgrid:dev spec docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-sendgrid:dev check --config /secrets/config.json @@ -59,18 +65,23 @@ docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integrat ``` ### Running our CI test suite + You can run our full test suite locally using [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md): + ```bash airbyte-ci connectors --name=source-sendgrid test ``` ### Customizing acceptance Tests + Customize `acceptance-test-config.yml` file to configure acceptance tests. See [Connector Acceptance Tests](https://docs.airbyte.com/connector-development/testing-connectors/connector-acceptance-tests-reference) for more information. If your connector requires to create or destroy resources for use during acceptance tests create fixtures for it and place them inside integration_tests/acceptance.py. ### Dependency Management -All of your dependencies should be managed via Poetry. + +All of your dependencies should be managed via Poetry. To add a new dependency, run: + ```bash poetry add ``` @@ -78,14 +89,16 @@ poetry add Please commit the changes to `pyproject.toml` and `poetry.lock` files. ## Publishing a new version of the connector + You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? + 1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-sendgrid test` -2. Bump the connector version (please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors)): - - bump the `dockerImageTag` value in in `metadata.yaml` - - bump the `version` value in `pyproject.toml` +2. Bump the connector version (please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors)): + - bump the `dockerImageTag` value in in `metadata.yaml` + - bump the `version` value in `pyproject.toml` 3. Make sure the `metadata.yaml` content is up to date. 4. Make sure the connector documentation and its changelog is up to date (`docs/integrations/sources/sendgrid.md`). 5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention). 6. Pat yourself on the back for being an awesome contributor. 7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. -8. Once your PR is merged, the new version of the connector will be automatically published to Docker Hub and our connector registry. \ No newline at end of file +8. Once your PR is merged, the new version of the connector will be automatically published to Docker Hub and our connector registry. 
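As a reminder of what the `poetry run pytest unit_tests` step in the Sendgrid README above collects, here is a deliberately small placeholder test; the endpoint is hypothetical, and it assumes `requests-mock` is available in the dev dependencies, as is common for Airbyte Python connectors. Real connector tests exercise the source and stream classes instead.

```python
# unit_tests/test_example.py — illustrative placeholder only.
import requests


def test_stubbed_api_call(requests_mock):
    # Hypothetical endpoint; real tests stub the API the connector talks to.
    requests_mock.get("https://api.example.com/items", json={"items": [1, 2, 3]})
    response = requests.get("https://api.example.com/items")
    assert response.json()["items"] == [1, 2, 3]
```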
diff --git a/airbyte-integrations/connectors/source-sendinblue/README.md b/airbyte-integrations/connectors/source-sendinblue/README.md index 36a751299ab..b4c526bc461 100644 --- a/airbyte-integrations/connectors/source-sendinblue/README.md +++ b/airbyte-integrations/connectors/source-sendinblue/README.md @@ -6,6 +6,7 @@ For information about how to use this connector within Airbyte, see [the documen ## Local development #### Create credentials + **If you are a community contributor**, follow the instructions in the [documentation](https://docs.airbyte.io/integrations/sources/sendinblue) to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `source_sendinblue/spec.yaml` file. Note that any directory named `secrets` is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. @@ -16,9 +17,10 @@ and place them into `secrets/config.json`. ### Locally running the connector docker image - #### Build + **Via [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md) (recommended):** + ```bash airbyte-ci connectors --name=source-sendinblue build ``` @@ -26,12 +28,15 @@ airbyte-ci connectors --name=source-sendinblue build An image will be built with the tag `airbyte/source-sendinblue:dev`. **Via `docker build`:** + ```bash docker build -t airbyte/source-sendinblue:dev . ``` #### Run + Then run any of the connector commands as follows: + ``` docker run --rm airbyte/source-sendinblue:dev spec docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-sendinblue:dev check --config /secrets/config.json @@ -40,23 +45,30 @@ docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integrat ``` ## Testing + You can run our full test suite locally using [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md): + ```bash airbyte-ci connectors --name=source-sendinblue test ``` ### Customizing acceptance Tests + Customize `acceptance-test-config.yml` file to configure tests. See [Connector Acceptance Tests](https://docs.airbyte.com/connector-development/testing-connectors/connector-acceptance-tests-reference) for more information. If your connector requires to create or destroy resources for use during acceptance tests create fixtures for it and place them inside integration_tests/acceptance.py. ## Dependency Management + All of your dependencies should go in `setup.py`, NOT `requirements.txt`. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development. We split dependencies between two groups, dependencies that are: -* required for your connector to work need to go to `MAIN_REQUIREMENTS` list. -* required for the testing need to go to `TEST_REQUIREMENTS` list + +- required for your connector to work need to go to `MAIN_REQUIREMENTS` list. +- required for the testing need to go to `TEST_REQUIREMENTS` list ### Publishing a new version of the connector + You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? + 1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-sendinblue test` 2. Bump the connector version in `metadata.yaml`: increment the `dockerImageTag` value. Please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors). 3. 
Make sure the `metadata.yaml` content is up to date. @@ -64,4 +76,3 @@ You've checked out the repo, implemented a million dollar feature, and you're re 5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention). 6. Pat yourself on the back for being an awesome contributor. 7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. - diff --git a/airbyte-integrations/connectors/source-senseforce/README.md b/airbyte-integrations/connectors/source-senseforce/README.md index e3ab68570d8..ae35c859450 100644 --- a/airbyte-integrations/connectors/source-senseforce/README.md +++ b/airbyte-integrations/connectors/source-senseforce/README.md @@ -6,6 +6,7 @@ For information about how to use this connector within Airbyte, see [the documen ## Local development #### Create credentials + **If you are a community contributor**, follow the instructions in the [documentation](https://docs.airbyte.io/integrations/sources/senseforce) to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `source_senseforce/spec.yaml` file. Note that any directory named `secrets` is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. @@ -16,9 +17,10 @@ and place them into `secrets/config.json`. ### Locally running the connector docker image - #### Build + **Via [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md) (recommended):** + ```bash airbyte-ci connectors --name=source-senseforce build ``` @@ -26,12 +28,15 @@ airbyte-ci connectors --name=source-senseforce build An image will be built with the tag `airbyte/source-senseforce:dev`. **Via `docker build`:** + ```bash docker build -t airbyte/source-senseforce:dev . ``` #### Run + Then run any of the connector commands as follows: + ``` docker run --rm airbyte/source-senseforce:dev spec docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-senseforce:dev check --config /secrets/config.json @@ -40,23 +45,30 @@ docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integrat ``` ## Testing + You can run our full test suite locally using [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md): + ```bash airbyte-ci connectors --name=source-senseforce test ``` ### Customizing acceptance Tests + Customize `acceptance-test-config.yml` file to configure tests. See [Connector Acceptance Tests](https://docs.airbyte.com/connector-development/testing-connectors/connector-acceptance-tests-reference) for more information. If your connector requires to create or destroy resources for use during acceptance tests create fixtures for it and place them inside integration_tests/acceptance.py. ## Dependency Management + All of your dependencies should go in `setup.py`, NOT `requirements.txt`. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development. We split dependencies between two groups, dependencies that are: -* required for your connector to work need to go to `MAIN_REQUIREMENTS` list. -* required for the testing need to go to `TEST_REQUIREMENTS` list + +- required for your connector to work need to go to `MAIN_REQUIREMENTS` list. 
+- required for the testing need to go to `TEST_REQUIREMENTS` list ### Publishing a new version of the connector + You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? + 1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-senseforce test` 2. Bump the connector version in `metadata.yaml`: increment the `dockerImageTag` value. Please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors). 3. Make sure the `metadata.yaml` content is up to date. @@ -64,4 +76,3 @@ You've checked out the repo, implemented a million dollar feature, and you're re 5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention). 6. Pat yourself on the back for being an awesome contributor. 7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. - diff --git a/airbyte-integrations/connectors/source-sentry/README.md b/airbyte-integrations/connectors/source-sentry/README.md index a5651e8ee8e..478d5f6a316 100644 --- a/airbyte-integrations/connectors/source-sentry/README.md +++ b/airbyte-integrations/connectors/source-sentry/README.md @@ -1,31 +1,32 @@ # Sentry source connector - This is the repository for the Sentry source connector, written in Python. For information about how to use this connector within Airbyte, see [the documentation](https://docs.airbyte.com/integrations/sources/sentry). ## Local development ### Prerequisites -* Python (~=3.9) -* Poetry (~=1.7) - installation instructions [here](https://python-poetry.org/docs/#installation) +- Python (~=3.9) +- Poetry (~=1.7) - installation instructions [here](https://python-poetry.org/docs/#installation) ### Installing the connector + From this connector directory, run: + ```bash poetry install --with dev ``` - ### Create credentials + **If you are a community contributor**, follow the instructions in the [documentation](https://docs.airbyte.com/integrations/sources/sentry) to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `source_sentry/spec.yaml` file. Note that any directory named `secrets` is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. See `sample_files/sample_config.json` for a sample config file. - ### Locally running the connector + ``` poetry run source-sentry spec poetry run source-sentry check --config secrets/config.json @@ -34,23 +35,28 @@ poetry run source-sentry read --config secrets/config.json --catalog integration ``` ### Running unit tests + To run unit tests locally, from the connector directory run: + ``` poetry run pytest unit_tests ``` ### Building the docker image + 1. Install [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md) 2. Run the following command to build the docker image: + ```bash airbyte-ci connectors --name=source-sentry build ``` An image will be available on your host with the tag `airbyte/source-sentry:dev`. 
- ### Running as a docker container + Then run any of the connector commands as follows: + ``` docker run --rm airbyte/source-sentry:dev spec docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-sentry:dev check --config /secrets/config.json @@ -59,18 +65,23 @@ docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integrat ``` ### Running our CI test suite + You can run our full test suite locally using [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md): + ```bash airbyte-ci connectors --name=source-sentry test ``` ### Customizing acceptance Tests + Customize `acceptance-test-config.yml` file to configure acceptance tests. See [Connector Acceptance Tests](https://docs.airbyte.com/connector-development/testing-connectors/connector-acceptance-tests-reference) for more information. If your connector requires to create or destroy resources for use during acceptance tests create fixtures for it and place them inside integration_tests/acceptance.py. ### Dependency Management -All of your dependencies should be managed via Poetry. + +All of your dependencies should be managed via Poetry. To add a new dependency, run: + ```bash poetry add ``` @@ -78,14 +89,16 @@ poetry add Please commit the changes to `pyproject.toml` and `poetry.lock` files. ## Publishing a new version of the connector + You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? + 1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-sentry test` -2. Bump the connector version (please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors)): - - bump the `dockerImageTag` value in in `metadata.yaml` - - bump the `version` value in `pyproject.toml` +2. Bump the connector version (please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors)): + - bump the `dockerImageTag` value in in `metadata.yaml` + - bump the `version` value in `pyproject.toml` 3. Make sure the `metadata.yaml` content is up to date. 4. Make sure the connector documentation and its changelog is up to date (`docs/integrations/sources/sentry.md`). 5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention). 6. Pat yourself on the back for being an awesome contributor. 7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. -8. Once your PR is merged, the new version of the connector will be automatically published to Docker Hub and our connector registry. \ No newline at end of file +8. Once your PR is merged, the new version of the connector will be automatically published to Docker Hub and our connector registry. diff --git a/airbyte-integrations/connectors/source-sentry/bootstrap.md b/airbyte-integrations/connectors/source-sentry/bootstrap.md index 6b5bafe7b58..3115d8ac7a8 100644 --- a/airbyte-integrations/connectors/source-sentry/bootstrap.md +++ b/airbyte-integrations/connectors/source-sentry/bootstrap.md @@ -2,10 +2,10 @@ Sentry is a REST API. Connector has the following streams, and all of them support full refresh and incremental. 
-* [Events](https://docs.sentry.io/api/events/list-a-projects-events/) -* [Issues](https://docs.sentry.io/api/events/list-a-projects-issues/) -* [Projects](https://docs.sentry.io/api/projects/list-your-projects/) -* [Releases](https://docs.sentry.io/api/releases/list-an-organizations-releases/) +- [Events](https://docs.sentry.io/api/events/list-a-projects-events/) +- [Issues](https://docs.sentry.io/api/events/list-a-projects-issues/) +- [Projects](https://docs.sentry.io/api/projects/list-your-projects/) +- [Releases](https://docs.sentry.io/api/releases/list-an-organizations-releases/) And a [ProjectDetail](https://docs.sentry.io/api/projects/retrieve-a-project/) stream is also implemented just for connection checking. @@ -13,6 +13,6 @@ And a [ProjectDetail](https://docs.sentry.io/api/projects/retrieve-a-project/) s Sentry API offers three types of [authentication methods](https://docs.sentry.io/api/auth/). -* Auth Token - The most common authentication method in Sentry. Connector only supports this method. -* DSN Authentication - Only some API endpoints support this method. Not supported by this connector. -* API Keys - Keys are passed using HTTP Basic auth, and a legacy means of authenticating. They will still be supported but are disabled for new accounts. Not supported by this connector. \ No newline at end of file +- Auth Token - The most common authentication method in Sentry. Connector only supports this method. +- DSN Authentication - Only some API endpoints support this method. Not supported by this connector. +- API Keys - Keys are passed using HTTP Basic auth, and a legacy means of authenticating. They will still be supported but are disabled for new accounts. Not supported by this connector. diff --git a/airbyte-integrations/connectors/source-serpstat/README.md b/airbyte-integrations/connectors/source-serpstat/README.md index 74a160fffcf..18fc6071339 100644 --- a/airbyte-integrations/connectors/source-serpstat/README.md +++ b/airbyte-integrations/connectors/source-serpstat/README.md @@ -6,6 +6,7 @@ For information about how to use this connector within Airbyte, see [the documen ## Local development #### Create credentials + **If you are a community contributor**, follow the instructions in the [documentation](https://docs.airbyte.com/integrations/sources/serpstat) to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `source_serpstat/spec.yaml` file. Note that any directory named `secrets` is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. @@ -16,9 +17,10 @@ and place them into `secrets/config.json`. ### Locally running the connector docker image - #### Build + **Via [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md) (recommended):** + ```bash airbyte-ci connectors --name=source-serpstat build ``` @@ -26,12 +28,15 @@ airbyte-ci connectors --name=source-serpstat build An image will be built with the tag `airbyte/source-serpstat:dev`. **Via `docker build`:** + ```bash docker build -t airbyte/source-serpstat:dev . 
``` #### Run + Then run any of the connector commands as follows: + ``` docker run --rm airbyte/source-serpstat:dev spec docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-serpstat:dev check --config /secrets/config.json @@ -40,23 +45,30 @@ docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integrat ``` ## Testing + You can run our full test suite locally using [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md): + ```bash airbyte-ci connectors --name=source-serpstat test ``` ### Customizing acceptance Tests + Customize `acceptance-test-config.yml` file to configure tests. See [Connector Acceptance Tests](https://docs.airbyte.com/connector-development/testing-connectors/connector-acceptance-tests-reference) for more information. If your connector requires to create or destroy resources for use during acceptance tests create fixtures for it and place them inside integration_tests/acceptance.py. ## Dependency Management + All of your dependencies should go in `setup.py`, NOT `requirements.txt`. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development. We split dependencies between two groups, dependencies that are: -* required for your connector to work need to go to `MAIN_REQUIREMENTS` list. -* required for the testing need to go to `TEST_REQUIREMENTS` list + +- required for your connector to work need to go to `MAIN_REQUIREMENTS` list. +- required for the testing need to go to `TEST_REQUIREMENTS` list ### Publishing a new version of the connector + You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? + 1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-serpstat test` 2. Bump the connector version in `metadata.yaml`: increment the `dockerImageTag` value. Please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors). 3. Make sure the `metadata.yaml` content is up to date. @@ -64,4 +76,3 @@ You've checked out the repo, implemented a million dollar feature, and you're re 5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention). 6. Pat yourself on the back for being an awesome contributor. 7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. - diff --git a/airbyte-integrations/connectors/source-sftp-bulk/README.md b/airbyte-integrations/connectors/source-sftp-bulk/README.md index e5f94c665b7..d9490e2d3c3 100644 --- a/airbyte-integrations/connectors/source-sftp-bulk/README.md +++ b/airbyte-integrations/connectors/source-sftp-bulk/README.md @@ -1,31 +1,32 @@ # Sftp-Bulk source connector - This is the repository for the Sftp-Bulk source connector, written in Python. For information about how to use this connector within Airbyte, see [the documentation](https://docs.airbyte.com/integrations/sources/sftp-bulk). 
## Local development ### Prerequisites -* Python (~=3.9) -* Poetry (~=1.7) - installation instructions [here](https://python-poetry.org/docs/#installation) +- Python (~=3.9) +- Poetry (~=1.7) - installation instructions [here](https://python-poetry.org/docs/#installation) ### Installing the connector + From this connector directory, run: + ```bash poetry install --with dev ``` - ### Create credentials + **If you are a community contributor**, follow the instructions in the [documentation](https://docs.airbyte.com/integrations/sources/sftp-bulk) to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `source_sftp_bulk/spec.yaml` file. Note that any directory named `secrets` is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. See `sample_files/sample_config.json` for a sample config file. - ### Locally running the connector + ``` poetry run source-sftp-bulk spec poetry run source-sftp-bulk check --config secrets/config.json @@ -34,23 +35,28 @@ poetry run source-sftp-bulk read --config secrets/config.json --catalog sample_f ``` ### Running unit tests + To run unit tests locally, from the connector directory run: + ``` poetry run pytest unit_tests ``` ### Building the docker image + 1. Install [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md) 2. Run the following command to build the docker image: + ```bash airbyte-ci connectors --name=source-sftp-bulk build ``` An image will be available on your host with the tag `airbyte/source-sftp-bulk:dev`. - ### Running as a docker container + Then run any of the connector commands as follows: + ``` docker run --rm airbyte/source-sftp-bulk:dev spec docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-sftp-bulk:dev check --config /secrets/config.json @@ -59,18 +65,23 @@ docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integrat ``` ### Running our CI test suite + You can run our full test suite locally using [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md): + ```bash airbyte-ci connectors --name=source-sftp-bulk test ``` ### Customizing acceptance Tests + Customize `acceptance-test-config.yml` file to configure acceptance tests. See [Connector Acceptance Tests](https://docs.airbyte.com/connector-development/testing-connectors/connector-acceptance-tests-reference) for more information. If your connector requires to create or destroy resources for use during acceptance tests create fixtures for it and place them inside integration_tests/acceptance.py. ### Dependency Management -All of your dependencies should be managed via Poetry. + +All of your dependencies should be managed via Poetry. To add a new dependency, run: + ```bash poetry add ``` @@ -78,14 +89,16 @@ poetry add Please commit the changes to `pyproject.toml` and `poetry.lock` files. ## Publishing a new version of the connector + You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? + 1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-sftp-bulk test` -2. 
Bump the connector version (please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors)): - - bump the `dockerImageTag` value in in `metadata.yaml` - - bump the `version` value in `pyproject.toml` +2. Bump the connector version (please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors)): + - bump the `dockerImageTag` value in in `metadata.yaml` + - bump the `version` value in `pyproject.toml` 3. Make sure the `metadata.yaml` content is up to date. 4. Make sure the connector documentation and its changelog is up to date (`docs/integrations/sources/sftp-bulk.md`). 5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention). 6. Pat yourself on the back for being an awesome contributor. 7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. -8. Once your PR is merged, the new version of the connector will be automatically published to Docker Hub and our connector registry. \ No newline at end of file +8. Once your PR is merged, the new version of the connector will be automatically published to Docker Hub and our connector registry. diff --git a/airbyte-integrations/connectors/source-sftp/README.md b/airbyte-integrations/connectors/source-sftp/README.md index 7991b543e3c..432dfa85d5c 100644 --- a/airbyte-integrations/connectors/source-sftp/README.md +++ b/airbyte-integrations/connectors/source-sftp/README.md @@ -6,12 +6,15 @@ For information about how to use this connector within Airbyte, see [the User Do ## Local development #### Building via Gradle + From the Airbyte repository root, run: + ``` ./gradlew :airbyte-integrations:connectors:source-sftp:build ``` #### Create credentials + **If you are a community contributor**, generate the necessary credentials and place them in `secrets/config.json` conforming to the spec file in `src/main/resources/spec.json`. Note that the `secrets` directory is git-ignored by default, so there is no danger of accidentally checking in sensitive information. @@ -20,16 +23,20 @@ Note that the `secrets` directory is git-ignored by default, so there is no dang ### Locally running the connector docker image #### Build + Build the connector image via Gradle: ``` ./gradlew :airbyte-integrations:connectors:source-sftp:buildConnectorImage ``` + Once built, the docker image name and tag on your host will be `airbyte/source-sftp:dev`. the Dockerfile. #### Run + Then run any of the connector commands as follows: + ``` docker run --rm airbyte/source-sftp:dev spec docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-sftp:dev check --config /secrets/config.json @@ -38,22 +45,29 @@ docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integrat ``` ## Testing + We use `JUnit` for Java tests. ### Unit and Integration Tests + Place unit tests under `src/test/io/airbyte/integrations/source/sftp`. #### Acceptance Tests + Airbyte has a standard test suite that all source connectors must pass. Implement the `TODO`s in `src/test-integration/java/io/airbyte/integrations/source/sftpSourceAcceptanceTest.java`. ### Using gradle to run tests + All commands should be run from airbyte project root. 
To run unit tests: + ``` ./gradlew :airbyte-integrations:connectors:source-sftp:unitTest ``` + To run acceptance and custom integration tests: + ``` ./gradlew :airbyte-integrations:connectors:source-sftp:integrationTest ``` @@ -61,7 +75,9 @@ To run acceptance and custom integration tests: ## Dependency Management ### Publishing a new version of the connector + You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? + 1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-sftp test` 2. Bump the connector version in `metadata.yaml`: increment the `dockerImageTag` value. Please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors). 3. Make sure the `metadata.yaml` content is up to date. @@ -69,4 +85,3 @@ You've checked out the repo, implemented a million dollar feature, and you're re 5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention). 6. Pat yourself on the back for being an awesome contributor. 7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. - diff --git a/airbyte-integrations/connectors/source-shopify/README.md b/airbyte-integrations/connectors/source-shopify/README.md index 2162414e361..cb8b43d03b3 100644 --- a/airbyte-integrations/connectors/source-shopify/README.md +++ b/airbyte-integrations/connectors/source-shopify/README.md @@ -1,31 +1,32 @@ # Shopify source connector - This is the repository for the Shopify source connector, written in Python. For information about how to use this connector within Airbyte, see [the documentation](https://docs.airbyte.com/integrations/sources/shopify). ## Local development ### Prerequisites -* Python (~=3.9) -* Poetry (~=1.7) - installation instructions [here](https://python-poetry.org/docs/#installation) +- Python (~=3.9) +- Poetry (~=1.7) - installation instructions [here](https://python-poetry.org/docs/#installation) ### Installing the connector + From this connector directory, run: + ```bash poetry install --with dev ``` - ### Create credentials + **If you are a community contributor**, follow the instructions in the [documentation](https://docs.airbyte.com/integrations/sources/shopify) to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `source_shopify/spec.yaml` file. Note that any directory named `secrets` is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. See `sample_files/sample_config.json` for a sample config file. - ### Locally running the connector + ``` poetry run source-shopify spec poetry run source-shopify check --config secrets/config.json @@ -34,23 +35,28 @@ poetry run source-shopify read --config secrets/config.json --catalog integratio ``` ### Running unit tests + To run unit tests locally, from the connector directory run: + ``` poetry run pytest unit_tests ``` ### Building the docker image + 1. Install [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md) 2. Run the following command to build the docker image: + ```bash airbyte-ci connectors --name=source-shopify build ``` An image will be available on your host with the tag `airbyte/source-shopify:dev`. 
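If you want to double-check that the build step produced the expected image before moving on (an optional sanity check, not part of the steps above), you can list it with standard Docker tooling:

```bash
# Optional: confirm the dev image from the build step exists locally
docker images airbyte/source-shopify --format "{{.Repository}}:{{.Tag}}"
```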
- ### Running as a docker container + Then run any of the connector commands as follows: + ``` docker run --rm airbyte/source-shopify:dev spec docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-shopify:dev check --config /secrets/config.json @@ -59,18 +65,23 @@ docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integrat ``` ### Running our CI test suite + You can run our full test suite locally using [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md): + ```bash airbyte-ci connectors --name=source-shopify test ``` ### Customizing acceptance Tests + Customize `acceptance-test-config.yml` file to configure acceptance tests. See [Connector Acceptance Tests](https://docs.airbyte.com/connector-development/testing-connectors/connector-acceptance-tests-reference) for more information. If your connector requires to create or destroy resources for use during acceptance tests create fixtures for it and place them inside integration_tests/acceptance.py. ### Dependency Management -All of your dependencies should be managed via Poetry. + +All of your dependencies should be managed via Poetry. To add a new dependency, run: + ```bash poetry add ``` @@ -78,14 +89,16 @@ poetry add Please commit the changes to `pyproject.toml` and `poetry.lock` files. ## Publishing a new version of the connector + You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? + 1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-shopify test` -2. Bump the connector version (please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors)): - - bump the `dockerImageTag` value in in `metadata.yaml` - - bump the `version` value in `pyproject.toml` +2. Bump the connector version (please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors)): + - bump the `dockerImageTag` value in in `metadata.yaml` + - bump the `version` value in `pyproject.toml` 3. Make sure the `metadata.yaml` content is up to date. 4. Make sure the connector documentation and its changelog is up to date (`docs/integrations/sources/shopify.md`). 5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention). 6. Pat yourself on the back for being an awesome contributor. 7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. -8. Once your PR is merged, the new version of the connector will be automatically published to Docker Hub and our connector registry. \ No newline at end of file +8. Once your PR is merged, the new version of the connector will be automatically published to Docker Hub and our connector registry. 
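For the dependency-management step described above, a concrete `poetry add` invocation looks like the following sketch; the package names are placeholders, not packages this connector actually requires:

```bash
# Example only: substitute the packages your change really needs
poetry add requests                 # runtime dependency, recorded in pyproject.toml
poetry add --group dev pytest-mock  # test-only dependency, added to the dev group
```

Both commands update `pyproject.toml` and `poetry.lock`, which is why those two files are committed together.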
diff --git a/airbyte-integrations/connectors/source-shortio/README.md b/airbyte-integrations/connectors/source-shortio/README.md index ec115936fa3..be896900b4c 100644 --- a/airbyte-integrations/connectors/source-shortio/README.md +++ b/airbyte-integrations/connectors/source-shortio/README.md @@ -6,6 +6,7 @@ For information about how to use this connector within Airbyte, see [the documen ## Local development #### Create credentials + **If you are a community contributor**, follow the instructions in the [documentation](https://docs.airbyte.com/integrations/sources/shortio) to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `source_shortio/spec.yaml` file. Note that any directory named `secrets` is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. @@ -16,9 +17,10 @@ and place them into `secrets/config.json`. ### Locally running the connector docker image - #### Build + **Via [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md) (recommended):** + ```bash airbyte-ci connectors --name=source-shortio build ``` @@ -26,12 +28,15 @@ airbyte-ci connectors --name=source-shortio build An image will be built with the tag `airbyte/source-shortio:dev`. **Via `docker build`:** + ```bash docker build -t airbyte/source-shortio:dev . ``` #### Run + Then run any of the connector commands as follows: + ``` docker run --rm airbyte/source-shortio:dev spec docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-shortio:dev check --config /secrets/config.json @@ -40,23 +45,30 @@ docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integrat ``` ## Testing + You can run our full test suite locally using [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md): + ```bash airbyte-ci connectors --name=source-shortio test ``` ### Customizing acceptance Tests + Customize `acceptance-test-config.yml` file to configure tests. See [Connector Acceptance Tests](https://docs.airbyte.com/connector-development/testing-connectors/connector-acceptance-tests-reference) for more information. If your connector requires to create or destroy resources for use during acceptance tests create fixtures for it and place them inside integration_tests/acceptance.py. ## Dependency Management + All of your dependencies should go in `setup.py`, NOT `requirements.txt`. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development. We split dependencies between two groups, dependencies that are: -* required for your connector to work need to go to `MAIN_REQUIREMENTS` list. -* required for the testing need to go to `TEST_REQUIREMENTS` list + +- required for your connector to work need to go to `MAIN_REQUIREMENTS` list. +- required for the testing need to go to `TEST_REQUIREMENTS` list ### Publishing a new version of the connector + You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? + 1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-shortio test` 2. Bump the connector version in `metadata.yaml`: increment the `dockerImageTag` value. Please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors). 3. 
Make sure the `metadata.yaml` content is up to date. @@ -64,4 +76,3 @@ You've checked out the repo, implemented a million dollar feature, and you're re 5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention). 6. Pat yourself on the back for being an awesome contributor. 7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. - diff --git a/airbyte-integrations/connectors/source-slack/README.md b/airbyte-integrations/connectors/source-slack/README.md index 9f21fa72ec4..306164e79a9 100644 --- a/airbyte-integrations/connectors/source-slack/README.md +++ b/airbyte-integrations/connectors/source-slack/README.md @@ -1,31 +1,32 @@ # Slack source connector - This is the repository for the Slack source connector, written in Python. For information about how to use this connector within Airbyte, see [the documentation](https://docs.airbyte.com/integrations/sources/slack). ## Local development ### Prerequisites -* Python (~=3.9) -* Poetry (~=1.7) - installation instructions [here](https://python-poetry.org/docs/#installation) +- Python (~=3.9) +- Poetry (~=1.7) - installation instructions [here](https://python-poetry.org/docs/#installation) ### Installing the connector + From this connector directory, run: + ```bash poetry install --with dev ``` - ### Create credentials + **If you are a community contributor**, follow the instructions in the [documentation](https://docs.airbyte.com/integrations/sources/slack) to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `source_slack/spec.yaml` file. Note that any directory named `secrets` is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. See `sample_files/sample_config.json` for a sample config file. - ### Locally running the connector + ``` poetry run source-slack spec poetry run source-slack check --config secrets/config.json @@ -34,23 +35,28 @@ poetry run source-slack read --config secrets/config.json --catalog sample_files ``` ### Running unit tests + To run unit tests locally, from the connector directory run: + ``` poetry run pytest unit_tests ``` ### Building the docker image + 1. Install [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md) 2. Run the following command to build the docker image: + ```bash airbyte-ci connectors --name=source-slack build ``` An image will be available on your host with the tag `airbyte/source-slack:dev`. - ### Running as a docker container + Then run any of the connector commands as follows: + ``` docker run --rm airbyte/source-slack:dev spec docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-slack:dev check --config /secrets/config.json @@ -59,18 +65,23 @@ docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integrat ``` ### Running our CI test suite + You can run our full test suite locally using [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md): + ```bash airbyte-ci connectors --name=source-slack test ``` ### Customizing acceptance Tests + Customize `acceptance-test-config.yml` file to configure acceptance tests. See [Connector Acceptance Tests](https://docs.airbyte.com/connector-development/testing-connectors/connector-acceptance-tests-reference) for more information. 
If your connector requires to create or destroy resources for use during acceptance tests create fixtures for it and place them inside integration_tests/acceptance.py. ### Dependency Management -All of your dependencies should be managed via Poetry. + +All of your dependencies should be managed via Poetry. To add a new dependency, run: + ```bash poetry add ``` @@ -78,14 +89,16 @@ poetry add Please commit the changes to `pyproject.toml` and `poetry.lock` files. ## Publishing a new version of the connector + You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? + 1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-slack test` -2. Bump the connector version (please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors)): - - bump the `dockerImageTag` value in in `metadata.yaml` - - bump the `version` value in `pyproject.toml` +2. Bump the connector version (please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors)): + - bump the `dockerImageTag` value in in `metadata.yaml` + - bump the `version` value in `pyproject.toml` 3. Make sure the `metadata.yaml` content is up to date. 4. Make sure the connector documentation and its changelog is up to date (`docs/integrations/sources/slack.md`). 5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention). 6. Pat yourself on the back for being an awesome contributor. 7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. -8. Once your PR is merged, the new version of the connector will be automatically published to Docker Hub and our connector registry. \ No newline at end of file +8. Once your PR is merged, the new version of the connector will be automatically published to Docker Hub and our connector registry. diff --git a/airbyte-integrations/connectors/source-smaily/README.md b/airbyte-integrations/connectors/source-smaily/README.md index 8e0e61140e4..231a636ea8b 100644 --- a/airbyte-integrations/connectors/source-smaily/README.md +++ b/airbyte-integrations/connectors/source-smaily/README.md @@ -6,6 +6,7 @@ For information about how to use this connector within Airbyte, see [the documen ## Local development #### Create credentials + **If you are a community contributor**, follow the instructions in the [documentation](https://docs.airbyte.io/integrations/sources/smaily) to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `source_smaily/spec.yaml` file. Note that any directory named `secrets` is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. @@ -16,9 +17,10 @@ and place them into `secrets/config.json`. ### Locally running the connector docker image - #### Build + **Via [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md) (recommended):** + ```bash airbyte-ci connectors --name=source-smaily build ``` @@ -26,12 +28,15 @@ airbyte-ci connectors --name=source-smaily build An image will be built with the tag `airbyte/source-smaily:dev`. 
**Via `docker build`:** + ```bash docker build -t airbyte/source-smaily:dev . ``` #### Run + Then run any of the connector commands as follows: + ``` docker run --rm airbyte/source-smaily:dev spec docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-smaily:dev check --config /secrets/config.json @@ -40,23 +45,30 @@ docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integrat ``` ## Testing + You can run our full test suite locally using [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md): + ```bash airbyte-ci connectors --name=source-smaily test ``` ### Customizing acceptance Tests + Customize `acceptance-test-config.yml` file to configure tests. See [Connector Acceptance Tests](https://docs.airbyte.com/connector-development/testing-connectors/connector-acceptance-tests-reference) for more information. If your connector requires to create or destroy resources for use during acceptance tests create fixtures for it and place them inside integration_tests/acceptance.py. ## Dependency Management + All of your dependencies should go in `setup.py`, NOT `requirements.txt`. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development. We split dependencies between two groups, dependencies that are: -* required for your connector to work need to go to `MAIN_REQUIREMENTS` list. -* required for the testing need to go to `TEST_REQUIREMENTS` list + +- required for your connector to work need to go to `MAIN_REQUIREMENTS` list. +- required for the testing need to go to `TEST_REQUIREMENTS` list ### Publishing a new version of the connector + You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? + 1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-smaily test` 2. Bump the connector version in `metadata.yaml`: increment the `dockerImageTag` value. Please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors). 3. Make sure the `metadata.yaml` content is up to date. @@ -64,4 +76,3 @@ You've checked out the repo, implemented a million dollar feature, and you're re 5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention). 6. Pat yourself on the back for being an awesome contributor. 7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. - diff --git a/airbyte-integrations/connectors/source-smartengage/README.md b/airbyte-integrations/connectors/source-smartengage/README.md index e1990a71e04..087489c517b 100644 --- a/airbyte-integrations/connectors/source-smartengage/README.md +++ b/airbyte-integrations/connectors/source-smartengage/README.md @@ -1,31 +1,32 @@ # Smartengage source connector - This is the repository for the Smartengage source connector, written in Python. For information about how to use this connector within Airbyte, see [the documentation](https://docs.airbyte.com/integrations/sources/smartengage). 
## Local development ### Prerequisites -* Python (~=3.9) -* Poetry (~=1.7) - installation instructions [here](https://python-poetry.org/docs/#installation) +- Python (~=3.9) +- Poetry (~=1.7) - installation instructions [here](https://python-poetry.org/docs/#installation) ### Installing the connector + From this connector directory, run: + ```bash poetry install --with dev ``` - ### Create credentials + **If you are a community contributor**, follow the instructions in the [documentation](https://docs.airbyte.com/integrations/sources/smartengage) to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `source_smartengage/spec.yaml` file. Note that any directory named `secrets` is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. See `sample_files/sample_config.json` for a sample config file. - ### Locally running the connector + ``` poetry run source-smartengage spec poetry run source-smartengage check --config secrets/config.json @@ -34,23 +35,28 @@ poetry run source-smartengage read --config secrets/config.json --catalog sample ``` ### Running unit tests + To run unit tests locally, from the connector directory run: + ``` poetry run pytest unit_tests ``` ### Building the docker image + 1. Install [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md) 2. Run the following command to build the docker image: + ```bash airbyte-ci connectors --name=source-smartengage build ``` An image will be available on your host with the tag `airbyte/source-smartengage:dev`. - ### Running as a docker container + Then run any of the connector commands as follows: + ``` docker run --rm airbyte/source-smartengage:dev spec docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-smartengage:dev check --config /secrets/config.json @@ -59,18 +65,23 @@ docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integrat ``` ### Running our CI test suite + You can run our full test suite locally using [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md): + ```bash airbyte-ci connectors --name=source-smartengage test ``` ### Customizing acceptance Tests + Customize `acceptance-test-config.yml` file to configure acceptance tests. See [Connector Acceptance Tests](https://docs.airbyte.com/connector-development/testing-connectors/connector-acceptance-tests-reference) for more information. If your connector requires to create or destroy resources for use during acceptance tests create fixtures for it and place them inside integration_tests/acceptance.py. ### Dependency Management -All of your dependencies should be managed via Poetry. + +All of your dependencies should be managed via Poetry. To add a new dependency, run: + ```bash poetry add ``` @@ -78,14 +89,16 @@ poetry add Please commit the changes to `pyproject.toml` and `poetry.lock` files. ## Publishing a new version of the connector + You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? + 1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-smartengage test` -2. 
Bump the connector version (please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors)): - - bump the `dockerImageTag` value in in `metadata.yaml` - - bump the `version` value in `pyproject.toml` +2. Bump the connector version (please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors)): + - bump the `dockerImageTag` value in in `metadata.yaml` + - bump the `version` value in `pyproject.toml` 3. Make sure the `metadata.yaml` content is up to date. 4. Make sure the connector documentation and its changelog is up to date (`docs/integrations/sources/smartengage.md`). 5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention). 6. Pat yourself on the back for being an awesome contributor. 7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. -8. Once your PR is merged, the new version of the connector will be automatically published to Docker Hub and our connector registry. \ No newline at end of file +8. Once your PR is merged, the new version of the connector will be automatically published to Docker Hub and our connector registry. diff --git a/airbyte-integrations/connectors/source-smartsheets/README.md b/airbyte-integrations/connectors/source-smartsheets/README.md index 3938470a843..2ccc60c608b 100644 --- a/airbyte-integrations/connectors/source-smartsheets/README.md +++ b/airbyte-integrations/connectors/source-smartsheets/README.md @@ -6,6 +6,7 @@ For information about how to use this connector within Airbyte, see [the documen ## Local development #### Create credentials + **If you are a community contributor**, follow the instructions in the [documentation](https://docs.airbyte.com/integrations/sources/customer-io) to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `source_customer_io/spec.yaml` file. Note that any directory named `secrets` is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. @@ -16,9 +17,10 @@ and place them into `secrets/config.json`. ### Locally running the connector docker image - #### Build + **Via [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md) (recommended):** + ```bash airbyte-ci connectors --name source-customer-io build ``` @@ -26,12 +28,15 @@ airbyte-ci connectors --name source-customer-io build An image will be built with the tag `airbyte/source-customer-io:dev`. **Via `docker build`:** + ```bash docker build -t airbyte/source-customer-io:dev . ``` #### Run + Then run any of the connector commands as follows: + ``` docker run --rm airbyte/source-customer-io:dev spec docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-customer-io:dev check --config /secrets/config.json @@ -40,23 +45,30 @@ docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integrat ``` ## Testing + You can run our full test suite locally using [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md): + ```bash airbyte-ci connectors --name=source-smartsheets test ``` ### Customizing acceptance Tests + Customize `acceptance-test-config.yml` file to configure tests. 
See [Connector Acceptance Tests](https://docs.airbyte.com/connector-development/testing-connectors/connector-acceptance-tests-reference) for more information. If your connector requires to create or destroy resources for use during acceptance tests create fixtures for it and place them inside integration_tests/acceptance.py. ## Dependency Management + All of your dependencies should go in `setup.py`, NOT `requirements.txt`. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development. We split dependencies between two groups, dependencies that are: -* required for your connector to work need to go to `MAIN_REQUIREMENTS` list. -* required for the testing need to go to `TEST_REQUIREMENTS` list + +- required for your connector to work need to go to `MAIN_REQUIREMENTS` list. +- required for the testing need to go to `TEST_REQUIREMENTS` list ### Publishing a new version of the connector + You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? + 1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-smartsheets test` 2. Bump the connector version in `metadata.yaml`: increment the `dockerImageTag` value. Please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors). 3. Make sure the `metadata.yaml` content is up to date. @@ -64,4 +76,3 @@ You've checked out the repo, implemented a million dollar feature, and you're re 5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention). 6. Pat yourself on the back for being an awesome contributor. 7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. - diff --git a/airbyte-integrations/connectors/source-snapchat-marketing/README.md b/airbyte-integrations/connectors/source-snapchat-marketing/README.md index ac8a4af9ff1..399e6f2744d 100644 --- a/airbyte-integrations/connectors/source-snapchat-marketing/README.md +++ b/airbyte-integrations/connectors/source-snapchat-marketing/README.md @@ -1,31 +1,32 @@ # Snapchat-Marketing source connector - This is the repository for the Snapchat-Marketing source connector, written in Python. For information about how to use this connector within Airbyte, see [the documentation](https://docs.airbyte.com/integrations/sources/snapchat-marketing). ## Local development ### Prerequisites -* Python (~=3.9) -* Poetry (~=1.7) - installation instructions [here](https://python-poetry.org/docs/#installation) +- Python (~=3.9) +- Poetry (~=1.7) - installation instructions [here](https://python-poetry.org/docs/#installation) ### Installing the connector + From this connector directory, run: + ```bash poetry install --with dev ``` - ### Create credentials + **If you are a community contributor**, follow the instructions in the [documentation](https://docs.airbyte.com/integrations/sources/snapchat-marketing) to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `source_snapchat_marketing/spec.yaml` file. Note that any directory named `secrets` is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. See `sample_files/sample_config.json` for a sample config file. 
- ### Locally running the connector + ``` poetry run source-snapchat-marketing spec poetry run source-snapchat-marketing check --config secrets/config.json @@ -34,23 +35,28 @@ poetry run source-snapchat-marketing read --config secrets/config.json --catalog ``` ### Running unit tests + To run unit tests locally, from the connector directory run: + ``` poetry run pytest unit_tests ``` ### Building the docker image + 1. Install [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md) 2. Run the following command to build the docker image: + ```bash airbyte-ci connectors --name=source-snapchat-marketing build ``` An image will be available on your host with the tag `airbyte/source-snapchat-marketing:dev`. - ### Running as a docker container + Then run any of the connector commands as follows: + ``` docker run --rm airbyte/source-snapchat-marketing:dev spec docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-snapchat-marketing:dev check --config /secrets/config.json @@ -59,18 +65,23 @@ docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integrat ``` ### Running our CI test suite + You can run our full test suite locally using [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md): + ```bash airbyte-ci connectors --name=source-snapchat-marketing test ``` ### Customizing acceptance Tests + Customize `acceptance-test-config.yml` file to configure acceptance tests. See [Connector Acceptance Tests](https://docs.airbyte.com/connector-development/testing-connectors/connector-acceptance-tests-reference) for more information. If your connector requires to create or destroy resources for use during acceptance tests create fixtures for it and place them inside integration_tests/acceptance.py. ### Dependency Management -All of your dependencies should be managed via Poetry. + +All of your dependencies should be managed via Poetry. To add a new dependency, run: + ```bash poetry add ``` @@ -78,14 +89,16 @@ poetry add Please commit the changes to `pyproject.toml` and `poetry.lock` files. ## Publishing a new version of the connector + You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? + 1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-snapchat-marketing test` -2. Bump the connector version (please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors)): - - bump the `dockerImageTag` value in in `metadata.yaml` - - bump the `version` value in `pyproject.toml` +2. Bump the connector version (please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors)): + - bump the `dockerImageTag` value in in `metadata.yaml` + - bump the `version` value in `pyproject.toml` 3. Make sure the `metadata.yaml` content is up to date. 4. Make sure the connector documentation and its changelog is up to date (`docs/integrations/sources/snapchat-marketing.md`). 5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention). 6. Pat yourself on the back for being an awesome contributor. 7. 
Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. -8. Once your PR is merged, the new version of the connector will be automatically published to Docker Hub and our connector registry. \ No newline at end of file +8. Once your PR is merged, the new version of the connector will be automatically published to Docker Hub and our connector registry. diff --git a/airbyte-integrations/connectors/source-snowflake/CHANGELOG.md b/airbyte-integrations/connectors/source-snowflake/CHANGELOG.md index 2482683b688..d833db383b4 100644 --- a/airbyte-integrations/connectors/source-snowflake/CHANGELOG.md +++ b/airbyte-integrations/connectors/source-snowflake/CHANGELOG.md @@ -1,4 +1,5 @@ # Changelog ## 0.1.0 + Initial Release. diff --git a/airbyte-integrations/connectors/source-snowflake/README.md b/airbyte-integrations/connectors/source-snowflake/README.md index 91769504a0e..8e9c7b26ea4 100644 --- a/airbyte-integrations/connectors/source-snowflake/README.md +++ b/airbyte-integrations/connectors/source-snowflake/README.md @@ -1,11 +1,14 @@ # Snowflake Source ## Documentation -* [User Documentation](https://docs.airbyte.io/integrations/sources/snowflake) + +- [User Documentation](https://docs.airbyte.io/integrations/sources/snowflake) ## Community Contributor + 1. Look at the integration documentation to see how to create a warehouse/database/schema/user/role for Airbyte to sync into. 1. Create a file at `secrets/config.json` with the following format: + ``` { "host": "ACCOUNT.REGION.PROVIDER.snowflakecomputing.com", @@ -20,7 +23,9 @@ } } ``` + 3. Create a file at `secrets/config_auth.json` with the following format: + ``` { "host": "ACCOUNT.REGION.PROVIDER.snowflakecomputing.com", @@ -36,7 +41,10 @@ } } ``` + ## For Airbyte employees + To be able to run integration tests locally: + 1. Put the contents of the `Source snowflake test creds (secrets/config.json)` secret on Lastpass into `secrets/config.json`. 1. Put the contents of the `SECRET_SOURCE-SNOWFLAKE_OAUTH__CREDS (secrets/config_auth.json)` secret on Lastpass into `secrets/config_auth.json`. diff --git a/airbyte-integrations/connectors/source-snowflake/integration_tests/README.md b/airbyte-integrations/connectors/source-snowflake/integration_tests/README.md index 96aa5492669..9bf604a7f6c 100644 --- a/airbyte-integrations/connectors/source-snowflake/integration_tests/README.md +++ b/airbyte-integrations/connectors/source-snowflake/integration_tests/README.md @@ -1,3 +1,4 @@ # Seeding the dataset + You can find the SQL scripts in this folder if you need to create or fix the SAT dataset. For more instructions and information about valid scripts, please check this [doc](https://docs.google.com/document/d/1k5TvxaNhKdr44aJIHWWtLk14Tzd2gbNX-J8YNoTj8u0/edit#heading=h.ls9oiedt9wyy). diff --git a/airbyte-integrations/connectors/source-sonar-cloud/README.md b/airbyte-integrations/connectors/source-sonar-cloud/README.md index e1a81748f74..d0dbb8ad890 100644 --- a/airbyte-integrations/connectors/source-sonar-cloud/README.md +++ b/airbyte-integrations/connectors/source-sonar-cloud/README.md @@ -1,31 +1,32 @@ # Sonar-Cloud source connector - This is the repository for the Sonar-Cloud source connector, written in Python. For information about how to use this connector within Airbyte, see [the documentation](https://docs.airbyte.com/integrations/sources/sonar-cloud). 
## Local development ### Prerequisites -* Python (~=3.9) -* Poetry (~=1.7) - installation instructions [here](https://python-poetry.org/docs/#installation) +- Python (~=3.9) +- Poetry (~=1.7) - installation instructions [here](https://python-poetry.org/docs/#installation) ### Installing the connector + From this connector directory, run: + ```bash poetry install --with dev ``` - ### Create credentials + **If you are a community contributor**, follow the instructions in the [documentation](https://docs.airbyte.com/integrations/sources/sonar-cloud) to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `source_sonar_cloud/spec.yaml` file. Note that any directory named `secrets` is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. See `sample_files/sample_config.json` for a sample config file. - ### Locally running the connector + ``` poetry run source-sonar-cloud spec poetry run source-sonar-cloud check --config secrets/config.json @@ -34,23 +35,28 @@ poetry run source-sonar-cloud read --config secrets/config.json --catalog sample ``` ### Running unit tests + To run unit tests locally, from the connector directory run: + ``` poetry run pytest unit_tests ``` ### Building the docker image + 1. Install [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md) 2. Run the following command to build the docker image: + ```bash airbyte-ci connectors --name=source-sonar-cloud build ``` An image will be available on your host with the tag `airbyte/source-sonar-cloud:dev`. - ### Running as a docker container + Then run any of the connector commands as follows: + ``` docker run --rm airbyte/source-sonar-cloud:dev spec docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-sonar-cloud:dev check --config /secrets/config.json @@ -59,18 +65,23 @@ docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integrat ``` ### Running our CI test suite + You can run our full test suite locally using [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md): + ```bash airbyte-ci connectors --name=source-sonar-cloud test ``` ### Customizing acceptance Tests + Customize `acceptance-test-config.yml` file to configure acceptance tests. See [Connector Acceptance Tests](https://docs.airbyte.com/connector-development/testing-connectors/connector-acceptance-tests-reference) for more information. If your connector requires to create or destroy resources for use during acceptance tests create fixtures for it and place them inside integration_tests/acceptance.py. ### Dependency Management -All of your dependencies should be managed via Poetry. + +All of your dependencies should be managed via Poetry. To add a new dependency, run: + ```bash poetry add ``` @@ -78,14 +89,16 @@ poetry add Please commit the changes to `pyproject.toml` and `poetry.lock` files. ## Publishing a new version of the connector + You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? + 1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-sonar-cloud test` -2. 
Bump the connector version (please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors)): - - bump the `dockerImageTag` value in in `metadata.yaml` - - bump the `version` value in `pyproject.toml` +2. Bump the connector version (please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors)): + - bump the `dockerImageTag` value in in `metadata.yaml` + - bump the `version` value in `pyproject.toml` 3. Make sure the `metadata.yaml` content is up to date. 4. Make sure the connector documentation and its changelog is up to date (`docs/integrations/sources/sonar-cloud.md`). 5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention). 6. Pat yourself on the back for being an awesome contributor. 7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. -8. Once your PR is merged, the new version of the connector will be automatically published to Docker Hub and our connector registry. \ No newline at end of file +8. Once your PR is merged, the new version of the connector will be automatically published to Docker Hub and our connector registry. diff --git a/airbyte-integrations/connectors/source-spacex-api/README.md b/airbyte-integrations/connectors/source-spacex-api/README.md index 913ec4c891c..5923a9a934f 100644 --- a/airbyte-integrations/connectors/source-spacex-api/README.md +++ b/airbyte-integrations/connectors/source-spacex-api/README.md @@ -6,22 +6,27 @@ For information about how to use this connector within Airbyte, see [the documen ## Local development ### Prerequisites + **To iterate on this connector, make sure to complete this prerequisites section.** #### Minimum Python version required `= 3.9.0` #### Build & Activate Virtual Environment and install dependencies + From this connector directory, create a virtual environment: + ``` python -m venv .venv ``` This will generate a virtualenv for this module in `.venv/`. Make sure this venv is active in your development environment of choice. To activate it from the terminal, run: + ``` source .venv/bin/activate pip install -r requirements.txt ``` + If you are in an IDE, follow your IDE's instructions to activate the virtualenv. Note that while we are installing dependencies from `requirements.txt`, you should only edit `setup.py` for your dependencies. `requirements.txt` is @@ -30,6 +35,7 @@ If this is mumbo jumbo to you, don't worry about it, just put your deps in `setu should work as you expect. #### Create credentials + **If you are a community contributor**, follow the instructions in the [documentation](https://docs.airbyte.io/integrations/sources/spacex-api) to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `source_spacex_api/spec.yaml` file. Note that any directory named `secrets` is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. @@ -40,9 +46,10 @@ and place them into `secrets/config.json`. 
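A minimal sketch of creating that file from the command line is shown below; the field name is illustrative only, and the authoritative list of fields is the `source_spacex_api/spec.yaml` file referenced above:

```bash
# Sketch only: field names must match source_spacex_api/spec.yaml
mkdir -p secrets
cat > secrets/config.json <<'EOF'
{
  "api_key": "any-dummy-value"
}
EOF
```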
### Locally running the connector docker image - #### Build + **Via [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md) (recommended):** + ```bash airbyte-ci connectors --name=source-spacex-api build ``` @@ -50,12 +57,15 @@ airbyte-ci connectors --name=source-spacex-api build An image will be built with the tag `airbyte/source-spacex-api:dev`. **Via `docker build`:** + ```bash docker build -t airbyte/source-spacex-api:dev . ``` #### Run + Then run any of the connector commands as follows: + ``` docker run --rm airbyte/source-spacex-api:dev spec docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-spacex-api:dev check --config /secrets/config.json @@ -64,23 +74,30 @@ docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integrat ``` ## Testing + You can run our full test suite locally using [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md): + ```bash airbyte-ci connectors --name=source-spacex-api test ``` ### Customizing acceptance Tests + Customize `acceptance-test-config.yml` file to configure tests. See [Connector Acceptance Tests](https://docs.airbyte.com/connector-development/testing-connectors/connector-acceptance-tests-reference) for more information. If your connector requires to create or destroy resources for use during acceptance tests create fixtures for it and place them inside integration_tests/acceptance.py. ## Dependency Management + All of your dependencies should go in `setup.py`, NOT `requirements.txt`. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development. We split dependencies between two groups, dependencies that are: -* required for your connector to work need to go to `MAIN_REQUIREMENTS` list. -* required for the testing need to go to `TEST_REQUIREMENTS` list + +- required for your connector to work need to go to `MAIN_REQUIREMENTS` list. +- required for the testing need to go to `TEST_REQUIREMENTS` list ### Publishing a new version of the connector + You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? + 1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-spacex-api test` 2. Bump the connector version in `metadata.yaml`: increment the `dockerImageTag` value. Please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors). 3. Make sure the `metadata.yaml` content is up to date. @@ -88,4 +105,3 @@ You've checked out the repo, implemented a million dollar feature, and you're re 5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention). 6. Pat yourself on the back for being an awesome contributor. 7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. - diff --git a/airbyte-integrations/connectors/source-spacex-api/bootstrap.md b/airbyte-integrations/connectors/source-spacex-api/bootstrap.md index a0fd4f888d1..5ef36317832 100644 --- a/airbyte-integrations/connectors/source-spacex-api/bootstrap.md +++ b/airbyte-integrations/connectors/source-spacex-api/bootstrap.md @@ -1,7 +1,7 @@ # SpaceX-API The connector uses the v4 API documented here: https://github.com/r-spacex/SpaceX-API . 
It is -straightforward HTTP REST API with no authentication. +straightforward HTTP REST API with no authentication. ## Dummy API key @@ -28,8 +28,7 @@ Just pass any dummy api key for establishing the connection. Example:123 1. Navigate to the Airbyte Open Source dashboard. 2. Set the name for your source. 3. Enter your `api_key`. -5. Enter your `id` if needed. (Optional) -6. Click **Set up source**. - - * We use only GET methods, all endpoints are straightforward. We emit what we receive as HTTP response. +4. Enter your `id` if needed. (Optional) +5. Click **Set up source**. +- We use only GET methods, all endpoints are straightforward. We emit what we receive as HTTP response. diff --git a/airbyte-integrations/connectors/source-square/README.md b/airbyte-integrations/connectors/source-square/README.md index 4ac61d9ac6d..a54fd27b9f7 100644 --- a/airbyte-integrations/connectors/source-square/README.md +++ b/airbyte-integrations/connectors/source-square/README.md @@ -1,31 +1,32 @@ # Square source connector - This is the repository for the Square source connector, written in Python. For information about how to use this connector within Airbyte, see [the documentation](https://docs.airbyte.com/integrations/sources/square). ## Local development ### Prerequisites -* Python (~=3.9) -* Poetry (~=1.7) - installation instructions [here](https://python-poetry.org/docs/#installation) +- Python (~=3.9) +- Poetry (~=1.7) - installation instructions [here](https://python-poetry.org/docs/#installation) ### Installing the connector + From this connector directory, run: + ```bash poetry install --with dev ``` - ### Create credentials + **If you are a community contributor**, follow the instructions in the [documentation](https://docs.airbyte.com/integrations/sources/square) to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `source_square/spec.yaml` file. Note that any directory named `secrets` is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. See `sample_files/sample_config.json` for a sample config file. - ### Locally running the connector + ``` poetry run source-square spec poetry run source-square check --config secrets/config.json @@ -34,23 +35,28 @@ poetry run source-square read --config secrets/config.json --catalog sample_file ``` ### Running unit tests + To run unit tests locally, from the connector directory run: + ``` poetry run pytest unit_tests ``` ### Building the docker image + 1. Install [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md) 2. Run the following command to build the docker image: + ```bash airbyte-ci connectors --name=source-square build ``` An image will be available on your host with the tag `airbyte/source-square:dev`. 
- ### Running as a docker container + Then run any of the connector commands as follows: + ``` docker run --rm airbyte/source-square:dev spec docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-square:dev check --config /secrets/config.json @@ -59,18 +65,23 @@ docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integrat ``` ### Running our CI test suite + You can run our full test suite locally using [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md): + ```bash airbyte-ci connectors --name=source-square test ``` ### Customizing acceptance Tests + Customize `acceptance-test-config.yml` file to configure acceptance tests. See [Connector Acceptance Tests](https://docs.airbyte.com/connector-development/testing-connectors/connector-acceptance-tests-reference) for more information. If your connector requires to create or destroy resources for use during acceptance tests create fixtures for it and place them inside integration_tests/acceptance.py. ### Dependency Management -All of your dependencies should be managed via Poetry. + +All of your dependencies should be managed via Poetry. To add a new dependency, run: + ```bash poetry add ``` @@ -78,14 +89,16 @@ poetry add Please commit the changes to `pyproject.toml` and `poetry.lock` files. ## Publishing a new version of the connector + You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? + 1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-square test` -2. Bump the connector version (please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors)): - - bump the `dockerImageTag` value in in `metadata.yaml` - - bump the `version` value in `pyproject.toml` +2. Bump the connector version (please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors)): + - bump the `dockerImageTag` value in in `metadata.yaml` + - bump the `version` value in `pyproject.toml` 3. Make sure the `metadata.yaml` content is up to date. 4. Make sure the connector documentation and its changelog is up to date (`docs/integrations/sources/square.md`). 5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention). 6. Pat yourself on the back for being an awesome contributor. 7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. -8. Once your PR is merged, the new version of the connector will be automatically published to Docker Hub and our connector registry. \ No newline at end of file +8. Once your PR is merged, the new version of the connector will be automatically published to Docker Hub and our connector registry. 
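When iterating on the unit-test step described above, it is often faster to run a narrower selection than the whole suite; the test selector below is a placeholder, not a real test name from this connector:

```bash
# Example only: -k filters tests by name, -x stops at the first failure
poetry run pytest unit_tests -k "test_source" -x
```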
diff --git a/airbyte-integrations/connectors/source-square/source_square/schemas/TODO.md b/airbyte-integrations/connectors/source-square/source_square/schemas/TODO.md index 327ddcb2644..3bd4a64deb4 100644 --- a/airbyte-integrations/connectors/source-square/source_square/schemas/TODO.md +++ b/airbyte-integrations/connectors/source-square/source_square/schemas/TODO.md @@ -1,16 +1,19 @@ # TODO: Define your stream schemas -Your connector must describe the schema of each stream it can output using [JSONSchema](https://json-schema.org). + +Your connector must describe the schema of each stream it can output using [JSONSchema](https://json-schema.org). You can describe the schema of your streams using one `.json` file per stream. - + ## Static schemas + From the `square.yaml` configuration file, you read the `.json` files in the `schemas/` directory. You can refer to a schema in your configuration file using the `schema_loader` component's `file_path` field. For example: + ``` schema_loader: type: JsonSchema file_path: "./source_square/schemas/customers.json" ``` + Every stream specified in the configuration file should have a corresponding `.json` schema file. Delete this file once you're done. Or don't. Up to you :) - diff --git a/airbyte-integrations/connectors/source-statuspage/README.md b/airbyte-integrations/connectors/source-statuspage/README.md index 7a25aad4341..2223dc56824 100644 --- a/airbyte-integrations/connectors/source-statuspage/README.md +++ b/airbyte-integrations/connectors/source-statuspage/README.md @@ -6,6 +6,7 @@ For information about how to use this connector within Airbyte, see [the documen ## Local development #### Create credentials + **If you are a community contributor**, follow the instructions in the [documentation](https://docs.airbyte.io/integrations/sources/statuspage) to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `source_statuspage/spec.yaml` file. Note that any directory named `secrets` is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. @@ -16,9 +17,10 @@ and place them into `secrets/config.json`. ### Locally running the connector docker image - #### Build + **Via [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md) (recommended):** + ```bash airbyte-ci connectors --name=source-statuspage build ``` @@ -26,12 +28,15 @@ airbyte-ci connectors --name=source-statuspage build An image will be built with the tag `airbyte/source-statuspage:dev`. **Via `docker build`:** + ```bash docker build -t airbyte/source-statuspage:dev . ``` #### Run + Then run any of the connector commands as follows: + ``` docker run --rm airbyte/source-statuspage:dev spec docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-statuspage:dev check --config /secrets/config.json @@ -40,23 +45,30 @@ docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integrat ``` ## Testing + You can run our full test suite locally using [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md): + ```bash airbyte-ci connectors --name=source-statuspage test ``` ### Customizing acceptance Tests + Customize `acceptance-test-config.yml` file to configure tests. See [Connector Acceptance Tests](https://docs.airbyte.com/connector-development/testing-connectors/connector-acceptance-tests-reference) for more information. 
If your connector requires to create or destroy resources for use during acceptance tests create fixtures for it and place them inside integration_tests/acceptance.py. ## Dependency Management + All of your dependencies should go in `setup.py`, NOT `requirements.txt`. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development. We split dependencies between two groups, dependencies that are: -* required for your connector to work need to go to `MAIN_REQUIREMENTS` list. -* required for the testing need to go to `TEST_REQUIREMENTS` list + +- required for your connector to work need to go to `MAIN_REQUIREMENTS` list. +- required for the testing need to go to `TEST_REQUIREMENTS` list ### Publishing a new version of the connector + You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? + 1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-statuspage test` 2. Bump the connector version in `metadata.yaml`: increment the `dockerImageTag` value. Please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors). 3. Make sure the `metadata.yaml` content is up to date. @@ -64,4 +76,3 @@ You've checked out the repo, implemented a million dollar feature, and you're re 5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention). 6. Pat yourself on the back for being an awesome contributor. 7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. - diff --git a/airbyte-integrations/connectors/source-strava/README.md b/airbyte-integrations/connectors/source-strava/README.md index 01ace8e2f00..29d206ac0ae 100644 --- a/airbyte-integrations/connectors/source-strava/README.md +++ b/airbyte-integrations/connectors/source-strava/README.md @@ -1,31 +1,32 @@ # Strava source connector - This is the repository for the Strava source connector, written in Python. For information about how to use this connector within Airbyte, see [the documentation](https://docs.airbyte.com/integrations/sources/strava). ## Local development ### Prerequisites -* Python (~=3.9) -* Poetry (~=1.7) - installation instructions [here](https://python-poetry.org/docs/#installation) +- Python (~=3.9) +- Poetry (~=1.7) - installation instructions [here](https://python-poetry.org/docs/#installation) ### Installing the connector + From this connector directory, run: + ```bash poetry install --with dev ``` - ### Create credentials + **If you are a community contributor**, follow the instructions in the [documentation](https://docs.airbyte.com/integrations/sources/strava) to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `source_strava/spec.yaml` file. Note that any directory named `secrets` is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. See `sample_files/sample_config.json` for a sample config file. 
- ### Locally running the connector + ``` poetry run source-strava spec poetry run source-strava check --config secrets/config.json @@ -34,23 +35,28 @@ poetry run source-strava read --config secrets/config.json --catalog sample_file ``` ### Running unit tests + To run unit tests locally, from the connector directory run: + ``` poetry run pytest unit_tests ``` ### Building the docker image + 1. Install [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md) 2. Run the following command to build the docker image: + ```bash airbyte-ci connectors --name=source-strava build ``` An image will be available on your host with the tag `airbyte/source-strava:dev`. - ### Running as a docker container + Then run any of the connector commands as follows: + ``` docker run --rm airbyte/source-strava:dev spec docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-strava:dev check --config /secrets/config.json @@ -59,18 +65,23 @@ docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integrat ``` ### Running our CI test suite + You can run our full test suite locally using [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md): + ```bash airbyte-ci connectors --name=source-strava test ``` ### Customizing acceptance Tests + Customize `acceptance-test-config.yml` file to configure acceptance tests. See [Connector Acceptance Tests](https://docs.airbyte.com/connector-development/testing-connectors/connector-acceptance-tests-reference) for more information. If your connector requires to create or destroy resources for use during acceptance tests create fixtures for it and place them inside integration_tests/acceptance.py. ### Dependency Management -All of your dependencies should be managed via Poetry. + +All of your dependencies should be managed via Poetry. To add a new dependency, run: + ```bash poetry add ``` @@ -78,14 +89,16 @@ poetry add Please commit the changes to `pyproject.toml` and `poetry.lock` files. ## Publishing a new version of the connector + You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? + 1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-strava test` -2. Bump the connector version (please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors)): - - bump the `dockerImageTag` value in in `metadata.yaml` - - bump the `version` value in `pyproject.toml` +2. Bump the connector version (please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors)): + - bump the `dockerImageTag` value in in `metadata.yaml` + - bump the `version` value in `pyproject.toml` 3. Make sure the `metadata.yaml` content is up to date. 4. Make sure the connector documentation and its changelog is up to date (`docs/integrations/sources/strava.md`). 5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention). 6. Pat yourself on the back for being an awesome contributor. 7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. -8. 
Once your PR is merged, the new version of the connector will be automatically published to Docker Hub and our connector registry. \ No newline at end of file +8. Once your PR is merged, the new version of the connector will be automatically published to Docker Hub and our connector registry. diff --git a/airbyte-integrations/connectors/source-strava/bootstrap.md b/airbyte-integrations/connectors/source-strava/bootstrap.md index d269a396dc7..5eb6f1ce2e0 100644 --- a/airbyte-integrations/connectors/source-strava/bootstrap.md +++ b/airbyte-integrations/connectors/source-strava/bootstrap.md @@ -3,21 +3,23 @@ Strava is a REST based API. Connector is implemented with [Airbyte CDK](https://docs.airbyte.io/connector-development/cdk-python). Connector supports the following two streams: -* [Athlete Stats](https://developers.strava.com/docs/reference/#api-Athletes-getStats) - * Returns a set of stats specific to the specified `athlete_id` config input -* [Activities](https://developers.strava.com/docs/reference/#api-Activities-getLoggedInAthleteActivities) \(Incremental\) - * Returns activities of the athlete whose refresh token it belongs to - * Stream will start with activities that happen after the `started_at` config input - * Stream will keep on attempting to read the next page of query until the API returns an empty list + +- [Athlete Stats](https://developers.strava.com/docs/reference/#api-Athletes-getStats) + - Returns a set of stats specific to the specified `athlete_id` config input +- [Activities](https://developers.strava.com/docs/reference/#api-Activities-getLoggedInAthleteActivities) \(Incremental\) + - Returns activities of the athlete whose refresh token it belongs to + - Stream will start with activities that happen after the `started_at` config input + - Stream will keep on attempting to read the next page of query until the API returns an empty list Rate Limiting: -* Strava API has limitations to 100 requests every 15 minutes, 1000 daily + +- Strava API has limitations to 100 requests every 15 minutes, 1000 daily Authentication and Permissions: -* Streams utilize [Oauth](https://developers.strava.com/docs/authentication/#oauthoverview) for authorization -* The [Activities](https://developers.strava.com/docs/reference/#api-Activities-getLoggedInAthleteActivities) stream relies on the refresh token containing the `activity:read_all` scope -* List of scopes can be found [here](https://developers.strava.com/docs/authentication/#detailsaboutrequestingaccess) - * Scope of `activity:read` should work as well, but will not include private activities or privacy zone data +- Streams utilize [Oauth](https://developers.strava.com/docs/authentication/#oauthoverview) for authorization +- The [Activities](https://developers.strava.com/docs/reference/#api-Activities-getLoggedInAthleteActivities) stream relies on the refresh token containing the `activity:read_all` scope +- List of scopes can be found [here](https://developers.strava.com/docs/authentication/#detailsaboutrequestingaccess) + - Scope of `activity:read` should work as well, but will not include private activities or privacy zone data See [this](https://docs.airbyte.io/integrations/sources/strava) link for the nuances about the connector. 
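The Activities paging rule described in the strava bootstrap above ("keep reading the next page until the API returns an empty list") can be sketched as a small standalone loop. This is an illustration against the public Strava REST endpoint, not the connector's actual CDK implementation; the parameter names follow Strava's API docs and the token handling is assumed.

```python
# Illustrative paging loop for the Strava /athlete/activities endpoint:
# request pages after `started_at` until an empty page signals the end.
import requests


def fetch_activities(access_token: str, started_at: int, per_page: int = 200):
    """Yield activities newer than `started_at` (epoch seconds), one page at a time."""
    url = "https://www.strava.com/api/v3/athlete/activities"
    headers = {"Authorization": f"Bearer {access_token}"}
    page = 1
    while True:
        response = requests.get(
            url,
            headers=headers,
            params={"after": started_at, "page": page, "per_page": per_page},
            timeout=30,
        )
        response.raise_for_status()
        activities = response.json()
        if not activities:  # an empty list means the last page has been read
            return
        yield from activities
        page += 1
```

Keep the 100-requests-per-15-minutes and 1,000-per-day rate limits from the bootstrap in mind when choosing `per_page` for large backfills.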
diff --git a/airbyte-integrations/connectors/source-stripe/README.md b/airbyte-integrations/connectors/source-stripe/README.md index 8b8e5526ae2..3cb15f778da 100644 --- a/airbyte-integrations/connectors/source-stripe/README.md +++ b/airbyte-integrations/connectors/source-stripe/README.md @@ -1,31 +1,32 @@ # Stripe source connector - This is the repository for the Stripe source connector, written in Python. For information about how to use this connector within Airbyte, see [the documentation](https://docs.airbyte.com/integrations/sources/stripe). ## Local development ### Prerequisites -* Python (~=3.9) -* Poetry (~=1.7) - installation instructions [here](https://python-poetry.org/docs/#installation) +- Python (~=3.9) +- Poetry (~=1.7) - installation instructions [here](https://python-poetry.org/docs/#installation) ### Installing the connector + From this connector directory, run: + ```bash poetry install --with dev ``` - ### Create credentials + **If you are a community contributor**, follow the instructions in the [documentation](https://docs.airbyte.com/integrations/sources/stripe) to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `source_stripe/spec.yaml` file. Note that any directory named `secrets` is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. See `sample_files/sample_config.json` for a sample config file. - ### Locally running the connector + ``` poetry run source-stripe spec poetry run source-stripe check --config secrets/config.json @@ -34,23 +35,28 @@ poetry run source-stripe read --config secrets/config.json --catalog sample_file ``` ### Running unit tests + To run unit tests locally, from the connector directory run: + ``` poetry run pytest unit_tests ``` ### Building the docker image + 1. Install [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md) 2. Run the following command to build the docker image: + ```bash airbyte-ci connectors --name=source-stripe build ``` An image will be available on your host with the tag `airbyte/source-stripe:dev`. - ### Running as a docker container + Then run any of the connector commands as follows: + ``` docker run --rm airbyte/source-stripe:dev spec docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-stripe:dev check --config /secrets/config.json @@ -59,18 +65,23 @@ docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integrat ``` ### Running our CI test suite + You can run our full test suite locally using [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md): + ```bash airbyte-ci connectors --name=source-stripe test ``` ### Customizing acceptance Tests + Customize `acceptance-test-config.yml` file to configure acceptance tests. See [Connector Acceptance Tests](https://docs.airbyte.com/connector-development/testing-connectors/connector-acceptance-tests-reference) for more information. If your connector requires to create or destroy resources for use during acceptance tests create fixtures for it and place them inside integration_tests/acceptance.py. ### Dependency Management -All of your dependencies should be managed via Poetry. + +All of your dependencies should be managed via Poetry. To add a new dependency, run: + ```bash poetry add ``` @@ -78,14 +89,16 @@ poetry add Please commit the changes to `pyproject.toml` and `poetry.lock` files. 
## Publishing a new version of the connector + You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? + 1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-stripe test` -2. Bump the connector version (please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors)): - - bump the `dockerImageTag` value in in `metadata.yaml` - - bump the `version` value in `pyproject.toml` +2. Bump the connector version (please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors)): + - bump the `dockerImageTag` value in in `metadata.yaml` + - bump the `version` value in `pyproject.toml` 3. Make sure the `metadata.yaml` content is up to date. 4. Make sure the connector documentation and its changelog is up to date (`docs/integrations/sources/stripe.md`). 5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention). 6. Pat yourself on the back for being an awesome contributor. 7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. -8. Once your PR is merged, the new version of the connector will be automatically published to Docker Hub and our connector registry. \ No newline at end of file +8. Once your PR is merged, the new version of the connector will be automatically published to Docker Hub and our connector registry. diff --git a/airbyte-integrations/connectors/source-survey-sparrow/README.md b/airbyte-integrations/connectors/source-survey-sparrow/README.md index 0fe7eb35e34..f9b0f5c329d 100644 --- a/airbyte-integrations/connectors/source-survey-sparrow/README.md +++ b/airbyte-integrations/connectors/source-survey-sparrow/README.md @@ -1,31 +1,32 @@ # Survey-Sparrow source connector - This is the repository for the Survey-Sparrow source connector, written in Python. For information about how to use this connector within Airbyte, see [the documentation](https://docs.airbyte.com/integrations/sources/survey-sparrow). ## Local development ### Prerequisites -* Python (~=3.9) -* Poetry (~=1.7) - installation instructions [here](https://python-poetry.org/docs/#installation) +- Python (~=3.9) +- Poetry (~=1.7) - installation instructions [here](https://python-poetry.org/docs/#installation) ### Installing the connector + From this connector directory, run: + ```bash poetry install --with dev ``` - ### Create credentials + **If you are a community contributor**, follow the instructions in the [documentation](https://docs.airbyte.com/integrations/sources/survey-sparrow) to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `source_survey_sparrow/spec.yaml` file. Note that any directory named `secrets` is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. See `sample_files/sample_config.json` for a sample config file. 
- ### Locally running the connector + ``` poetry run source-survey-sparrow spec poetry run source-survey-sparrow check --config secrets/config.json @@ -34,23 +35,28 @@ poetry run source-survey-sparrow read --config secrets/config.json --catalog sam ``` ### Running unit tests + To run unit tests locally, from the connector directory run: + ``` poetry run pytest unit_tests ``` ### Building the docker image + 1. Install [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md) 2. Run the following command to build the docker image: + ```bash airbyte-ci connectors --name=source-survey-sparrow build ``` An image will be available on your host with the tag `airbyte/source-survey-sparrow:dev`. - ### Running as a docker container + Then run any of the connector commands as follows: + ``` docker run --rm airbyte/source-survey-sparrow:dev spec docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-survey-sparrow:dev check --config /secrets/config.json @@ -59,18 +65,23 @@ docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integrat ``` ### Running our CI test suite + You can run our full test suite locally using [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md): + ```bash airbyte-ci connectors --name=source-survey-sparrow test ``` ### Customizing acceptance Tests + Customize `acceptance-test-config.yml` file to configure acceptance tests. See [Connector Acceptance Tests](https://docs.airbyte.com/connector-development/testing-connectors/connector-acceptance-tests-reference) for more information. If your connector requires to create or destroy resources for use during acceptance tests create fixtures for it and place them inside integration_tests/acceptance.py. ### Dependency Management -All of your dependencies should be managed via Poetry. + +All of your dependencies should be managed via Poetry. To add a new dependency, run: + ```bash poetry add ``` @@ -78,14 +89,16 @@ poetry add Please commit the changes to `pyproject.toml` and `poetry.lock` files. ## Publishing a new version of the connector + You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? + 1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-survey-sparrow test` -2. Bump the connector version (please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors)): - - bump the `dockerImageTag` value in in `metadata.yaml` - - bump the `version` value in `pyproject.toml` +2. Bump the connector version (please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors)): + - bump the `dockerImageTag` value in in `metadata.yaml` + - bump the `version` value in `pyproject.toml` 3. Make sure the `metadata.yaml` content is up to date. 4. Make sure the connector documentation and its changelog is up to date (`docs/integrations/sources/survey-sparrow.md`). 5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention). 6. Pat yourself on the back for being an awesome contributor. 7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. -8. 
Once your PR is merged, the new version of the connector will be automatically published to Docker Hub and our connector registry. \ No newline at end of file +8. Once your PR is merged, the new version of the connector will be automatically published to Docker Hub and our connector registry. diff --git a/airbyte-integrations/connectors/source-surveycto/README.md b/airbyte-integrations/connectors/source-surveycto/README.md index e5879b99a77..12009790e1e 100644 --- a/airbyte-integrations/connectors/source-surveycto/README.md +++ b/airbyte-integrations/connectors/source-surveycto/README.md @@ -4,12 +4,16 @@ This is the repository for the Surveycto source connector, written in Python. For information about how to use this connector within Airbyte, see [the documentation](https://docs.airbyte.io/integrations/sources/surveycto). ## Documentation + 1. The generator boilderplate is generated by this command + ``` -cd airbyte-integrations/connector-templates/generator +cd airbyte-integrations/connector-templates/generator ./generate.sh ``` + 2. Create a dev environment + ``` cd ../../connectors/source-surveycto python3 -m venv .venv # Create a virtual environment in the .venv directory @@ -20,23 +24,28 @@ pip install -r requirements.txt ## Local development ### Prerequisites + **To iterate on this connector, make sure to complete this prerequisites section.** #### Minimum Python version required `= 3.9.0` #### Build & Activate Virtual Environment and install dependencies + From this connector directory, create a virtual environment: + ``` python -m venv .venv ``` This will generate a virtualenv for this module in `.venv/`. Make sure this venv is active in your development environment of choice. To activate it from the terminal, run: + ``` source .venv/bin/activate pip install -r requirements.txt pip install '.[tests]' ``` + If you are in an IDE, follow your IDE's instructions to activate the virtualenv. Note that while we are installing dependencies from `requirements.txt`, you should only edit `setup.py` for your dependencies. `requirements.txt` is @@ -45,6 +54,7 @@ If this is mumbo jumbo to you, don't worry about it, just put your deps in `setu should work as you expect. #### Create credentials + **If you are a community contributor**, follow the instructions in the [documentation](https://docs.airbyte.io/integrations/sources/surveycto) to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `source_surveycto/spec.yaml` file. Note that any directory named `secrets` is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. @@ -54,6 +64,7 @@ See `integration_tests/sample_config.json` for a sample config file. and place them into `secrets/config.json`. ### Locally running the connector + ``` python main.py spec python main.py check --config secrets/config.json @@ -63,9 +74,10 @@ python main.py read --config secrets/config.json --catalog integration_tests/con ### Locally running the connector docker image - #### Build + **Via [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md) (recommended):** + ```bash airbyte-ci connectors --name=source-surveycto build ``` @@ -73,12 +85,15 @@ airbyte-ci connectors --name=source-surveycto build An image will be built with the tag `airbyte/source-surveycto:dev`. **Via `docker build`:** + ```bash docker build -t airbyte/source-surveycto:dev . 
``` #### Run + Then run any of the connector commands as follows: + ``` docker run --rm airbyte/source-surveycto:dev spec docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-surveycto:dev check --config /secrets/config.json @@ -87,23 +102,30 @@ docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integrat ``` ## Testing + You can run our full test suite locally using [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md): + ```bash airbyte-ci connectors --name=source-surveycto test ``` ### Customizing acceptance Tests + Customize `acceptance-test-config.yml` file to configure tests. See [Connector Acceptance Tests](https://docs.airbyte.com/connector-development/testing-connectors/connector-acceptance-tests-reference) for more information. If your connector requires to create or destroy resources for use during acceptance tests create fixtures for it and place them inside integration_tests/acceptance.py. ## Dependency Management + All of your dependencies should go in `setup.py`, NOT `requirements.txt`. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development. We split dependencies between two groups, dependencies that are: -* required for your connector to work need to go to `MAIN_REQUIREMENTS` list. -* required for the testing need to go to `TEST_REQUIREMENTS` list + +- required for your connector to work need to go to `MAIN_REQUIREMENTS` list. +- required for the testing need to go to `TEST_REQUIREMENTS` list ### Publishing a new version of the connector + You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? + 1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-surveycto test` 2. Bump the connector version in `metadata.yaml`: increment the `dockerImageTag` value. Please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors). 3. Make sure the `metadata.yaml` content is up to date. @@ -111,4 +133,3 @@ You've checked out the repo, implemented a million dollar feature, and you're re 5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention). 6. Pat yourself on the back for being an awesome contributor. 7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. - diff --git a/airbyte-integrations/connectors/source-surveymonkey/README.md b/airbyte-integrations/connectors/source-surveymonkey/README.md index f7b91bd3d6d..27cc66f1ed5 100644 --- a/airbyte-integrations/connectors/source-surveymonkey/README.md +++ b/airbyte-integrations/connectors/source-surveymonkey/README.md @@ -1,31 +1,32 @@ # Surveymonkey source connector - This is the repository for the Surveymonkey source connector, written in Python. For information about how to use this connector within Airbyte, see [the documentation](https://docs.airbyte.com/integrations/sources/surveymonkey). 
## Local development ### Prerequisites -* Python (~=3.9) -* Poetry (~=1.7) - installation instructions [here](https://python-poetry.org/docs/#installation) +- Python (~=3.9) +- Poetry (~=1.7) - installation instructions [here](https://python-poetry.org/docs/#installation) ### Installing the connector + From this connector directory, run: + ```bash poetry install --with dev ``` - ### Create credentials + **If you are a community contributor**, follow the instructions in the [documentation](https://docs.airbyte.com/integrations/sources/surveymonkey) to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `source_surveymonkey/spec.yaml` file. Note that any directory named `secrets` is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. See `sample_files/sample_config.json` for a sample config file. - ### Locally running the connector + ``` poetry run source-surveymonkey spec poetry run source-surveymonkey check --config secrets/config.json @@ -34,23 +35,28 @@ poetry run source-surveymonkey read --config secrets/config.json --catalog integ ``` ### Running unit tests + To run unit tests locally, from the connector directory run: + ``` poetry run pytest unit_tests ``` ### Building the docker image + 1. Install [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md) 2. Run the following command to build the docker image: + ```bash airbyte-ci connectors --name=source-surveymonkey build ``` An image will be available on your host with the tag `airbyte/source-surveymonkey:dev`. - ### Running as a docker container + Then run any of the connector commands as follows: + ``` docker run --rm airbyte/source-surveymonkey:dev spec docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-surveymonkey:dev check --config /secrets/config.json @@ -59,18 +65,23 @@ docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integrat ``` ### Running our CI test suite + You can run our full test suite locally using [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md): + ```bash airbyte-ci connectors --name=source-surveymonkey test ``` ### Customizing acceptance Tests + Customize `acceptance-test-config.yml` file to configure acceptance tests. See [Connector Acceptance Tests](https://docs.airbyte.com/connector-development/testing-connectors/connector-acceptance-tests-reference) for more information. If your connector requires to create or destroy resources for use during acceptance tests create fixtures for it and place them inside integration_tests/acceptance.py. ### Dependency Management -All of your dependencies should be managed via Poetry. + +All of your dependencies should be managed via Poetry. To add a new dependency, run: + ```bash poetry add ``` @@ -78,14 +89,16 @@ poetry add Please commit the changes to `pyproject.toml` and `poetry.lock` files. ## Publishing a new version of the connector + You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? + 1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-surveymonkey test` -2. 
Bump the connector version (please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors)): - - bump the `dockerImageTag` value in in `metadata.yaml` - - bump the `version` value in `pyproject.toml` +2. Bump the connector version (please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors)): + - bump the `dockerImageTag` value in in `metadata.yaml` + - bump the `version` value in `pyproject.toml` 3. Make sure the `metadata.yaml` content is up to date. 4. Make sure the connector documentation and its changelog is up to date (`docs/integrations/sources/surveymonkey.md`). 5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention). 6. Pat yourself on the back for being an awesome contributor. 7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. -8. Once your PR is merged, the new version of the connector will be automatically published to Docker Hub and our connector registry. \ No newline at end of file +8. Once your PR is merged, the new version of the connector will be automatically published to Docker Hub and our connector registry. diff --git a/airbyte-integrations/connectors/source-tempo/README.md b/airbyte-integrations/connectors/source-tempo/README.md index 7ae7456e2be..3f963fd2702 100644 --- a/airbyte-integrations/connectors/source-tempo/README.md +++ b/airbyte-integrations/connectors/source-tempo/README.md @@ -6,6 +6,7 @@ For information about how to use this connector within Airbyte, see [the documen ## Local development #### Create credentials + **If you are a community contributor**, follow the instructions in the [documentation](https://docs.airbyte.com/integrations/sources/fullstory) to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `source_fullstory/spec.yaml` file. Note that any directory named `secrets` is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. @@ -16,9 +17,10 @@ and place them into `secrets/config.json`. ### Locally running the connector docker image - #### Build + **Via [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md) (recommended):** + ```bash airbyte-ci connectors --name source-fullstory build ``` @@ -26,12 +28,15 @@ airbyte-ci connectors --name source-fullstory build An image will be built with the tag `airbyte/source-fullstory:dev`. **Via `docker build`:** + ```bash docker build -t airbyte/source-fullstory:dev . ``` #### Run + Then run any of the connector commands as follows: + ``` docker run --rm airbyte/source-fullstory:dev spec docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-fullstory:dev check --config /secrets/config.json @@ -40,23 +45,30 @@ docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integrat ``` ## Testing + You can run our full test suite locally using [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md): + ```bash airbyte-ci connectors --name=source-tempo test ``` ### Customizing acceptance Tests + Customize `acceptance-test-config.yml` file to configure tests. 
See [Connector Acceptance Tests](https://docs.airbyte.com/connector-development/testing-connectors/connector-acceptance-tests-reference) for more information. If your connector requires to create or destroy resources for use during acceptance tests create fixtures for it and place them inside integration_tests/acceptance.py. ## Dependency Management + All of your dependencies should go in `setup.py`, NOT `requirements.txt`. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development. We split dependencies between two groups, dependencies that are: -* required for your connector to work need to go to `MAIN_REQUIREMENTS` list. -* required for the testing need to go to `TEST_REQUIREMENTS` list + +- required for your connector to work need to go to `MAIN_REQUIREMENTS` list. +- required for the testing need to go to `TEST_REQUIREMENTS` list ### Publishing a new version of the connector + You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? + 1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-tempo test` 2. Bump the connector version in `metadata.yaml`: increment the `dockerImageTag` value. Please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors). 3. Make sure the `metadata.yaml` content is up to date. @@ -64,4 +76,3 @@ You've checked out the repo, implemented a million dollar feature, and you're re 5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention). 6. Pat yourself on the back for being an awesome contributor. 7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. - diff --git a/airbyte-integrations/connectors/source-teradata/README.md b/airbyte-integrations/connectors/source-teradata/README.md index c80424c6b39..f8e5e4edd10 100644 --- a/airbyte-integrations/connectors/source-teradata/README.md +++ b/airbyte-integrations/connectors/source-teradata/README.md @@ -6,12 +6,15 @@ For information about how to use this connector within Airbyte, see [the User Do ## Local development #### Building via Gradle + From the Airbyte repository root, run: + ``` ./gradlew :airbyte-integrations:connectors:source-teradata:build ``` #### Create credentials + **If you are a community contributor**, generate the necessary credentials and place them in `secrets/config.json` conforming to the spec file in `src/main/resources/spec.json`. Note that the `secrets` directory is git-ignored by default, so there is no danger of accidentally checking in sensitive information. @@ -20,16 +23,20 @@ Note that the `secrets` directory is git-ignored by default, so there is no dang ### Locally running the connector docker image #### Build + Build the connector image via Gradle: ``` ./gradlew :airbyte-integrations:connectors:source-teradata:buildConnectorImage ``` + Once built, the docker image name and tag on your host will be `airbyte/source-teradata:dev`. the Dockerfile. 
#### Run + Then run any of the connector commands as follows: + ``` docker run --rm airbyte/source-teradata:dev spec docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-teradata:dev check --config /secrets/config.json @@ -38,23 +45,30 @@ docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integrat ``` ## Testing + We use `JUnit` for Java tests. ### Unit and Integration Tests + Place unit tests under `src/test/...` -Place integration tests in `src/test-integration/...` +Place integration tests in `src/test-integration/...` #### Acceptance Tests + Airbyte has a standard test suite that all source connectors must pass. Implement the `TODO`s in `src/test-integration/java/io/airbyte/integrations/sources/TeradataSourceAcceptanceTest.java`. ### Using gradle to run tests + All commands should be run from airbyte project root. To run unit tests: + ``` ./gradlew :airbyte-integrations:connectors:source-teradata:unitTest ``` + To run acceptance and custom integration tests: + ``` ./gradlew :airbyte-integrations:connectors:source-teradata:integrationTest ``` @@ -62,7 +76,9 @@ To run acceptance and custom integration tests: ## Dependency Management ### Publishing a new version of the connector + You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? + 1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-teradata test` 2. Bump the connector version in `metadata.yaml`: increment the `dockerImageTag` value. Please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors). 3. Make sure the `metadata.yaml` content is up to date. @@ -70,4 +86,3 @@ You've checked out the repo, implemented a million dollar feature, and you're re 5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention). 6. Pat yourself on the back for being an awesome contributor. 7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. - diff --git a/airbyte-integrations/connectors/source-the-guardian-api/README.md b/airbyte-integrations/connectors/source-the-guardian-api/README.md index 0d36dec3168..efe08e0c177 100644 --- a/airbyte-integrations/connectors/source-the-guardian-api/README.md +++ b/airbyte-integrations/connectors/source-the-guardian-api/README.md @@ -6,6 +6,7 @@ For information about how to use this connector within Airbyte, see [the documen ## Local development #### Create credentials + **If you are a community contributor**, follow the instructions in the [documentation](https://docs.airbyte.io/integrations/sources/the-guardian-api) to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `source_the_guardian_api/spec.yaml` file. Note that any directory named `secrets` is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. @@ -16,9 +17,10 @@ and place them into `secrets/config.json`. 
### Locally running the connector docker image - #### Build + **Via [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md) (recommended):** + ```bash airbyte-ci connectors --name=source-the-guardian-api build ``` @@ -26,12 +28,15 @@ airbyte-ci connectors --name=source-the-guardian-api build An image will be built with the tag `airbyte/source-the-guardian-api:dev`. **Via `docker build`:** + ```bash docker build -t airbyte/source-the-guardian-api:dev . ``` #### Run + Then run any of the connector commands as follows: + ``` docker run --rm airbyte/source-the-guardian-api:dev spec docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-the-guardian-api:dev check --config /secrets/config.json @@ -40,23 +45,30 @@ docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integrat ``` ## Testing + You can run our full test suite locally using [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md): + ```bash airbyte-ci connectors --name=source-the-guardian-api test ``` ### Customizing acceptance Tests + Customize `acceptance-test-config.yml` file to configure tests. See [Connector Acceptance Tests](https://docs.airbyte.com/connector-development/testing-connectors/connector-acceptance-tests-reference) for more information. If your connector requires to create or destroy resources for use during acceptance tests create fixtures for it and place them inside integration_tests/acceptance.py. ## Dependency Management + All of your dependencies should go in `setup.py`, NOT `requirements.txt`. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development. We split dependencies between two groups, dependencies that are: -* required for your connector to work need to go to `MAIN_REQUIREMENTS` list. -* required for the testing need to go to `TEST_REQUIREMENTS` list + +- required for your connector to work need to go to `MAIN_REQUIREMENTS` list. +- required for the testing need to go to `TEST_REQUIREMENTS` list ### Publishing a new version of the connector + You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? + 1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-the-guardian-api test` 2. Bump the connector version in `metadata.yaml`: increment the `dockerImageTag` value. Please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors). 3. Make sure the `metadata.yaml` content is up to date. @@ -64,4 +76,3 @@ You've checked out the repo, implemented a million dollar feature, and you're re 5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention). 6. Pat yourself on the back for being an awesome contributor. 7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. 
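For connectors like source-the-guardian-api that still manage dependencies through `setup.py`, the two lists mentioned above usually take the shape sketched below. Package names and version pins are placeholders, not the connector's real dependency set.

```python
# Hypothetical excerpt of a connector setup.py: runtime deps go in
# MAIN_REQUIREMENTS, test-only deps in TEST_REQUIREMENTS (exposed via extras).
from setuptools import find_packages, setup

MAIN_REQUIREMENTS = [
    "airbyte-cdk",  # placeholder: whatever the connector needs at runtime
]

TEST_REQUIREMENTS = [
    "pytest~=6.2",       # placeholder test-only dependencies
    "pytest-mock~=3.6",
]

setup(
    name="source_the_guardian_api",
    description="Source implementation for The Guardian API (illustrative).",
    packages=find_packages(),
    install_requires=MAIN_REQUIREMENTS,
    extras_require={"tests": TEST_REQUIREMENTS},
)
```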
- diff --git a/airbyte-integrations/connectors/source-tidb/README.md b/airbyte-integrations/connectors/source-tidb/README.md index c8d82b58d91..98a460031e7 100755 --- a/airbyte-integrations/connectors/source-tidb/README.md +++ b/airbyte-integrations/connectors/source-tidb/README.md @@ -6,12 +6,15 @@ For information about how to use this connector within Airbyte, see [the User Do ## Local development #### Building via Gradle + From the Airbyte repository root, run: + ``` ./gradlew :airbyte-integrations:connectors:source-tidb:build ``` #### Create credentials + **If you are a community contributor**, generate the necessary credentials and place them in `secrets/config.json` conforming to the spec file in `src/main/resources/spec.json`. Note that the `secrets` directory is git-ignored by default, so there is no danger of accidentally checking in sensitive information. @@ -20,16 +23,20 @@ Note that the `secrets` directory is git-ignored by default, so there is no dang ### Locally running the connector docker image #### Build + Build the connector image via Gradle: ``` ./gradlew :airbyte-integrations:connectors:source-tidb:buildConnectorImage ``` + Once built, the docker image name and tag on your host will be `airbyte/source-tidb:dev`. the Dockerfile. #### Run + Then run any of the connector commands as follows: + ``` docker run --rm airbyte/source-tidb:dev spec docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-tidb:dev check --config /secrets/config.json @@ -38,23 +45,30 @@ docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integrat ``` ## Testing + We use `JUnit` for Java tests. ### Unit and Integration Tests + Place unit tests under `src/test/...` -Place integration tests in `src/test-integration/...` +Place integration tests in `src/test-integration/...` #### Acceptance Tests + Airbyte has a standard test suite that all source connectors must pass. Implement the `TODO`s in `src/test-integration/java/io/airbyte/integrations/sources/TiDBSourceAcceptanceTest.java`. ### Using gradle to run tests + All commands should be run from airbyte project root. To run unit tests: + ``` ./gradlew :airbyte-integrations:connectors:source-tidb:unitTest ``` + To run acceptance and custom integration tests: + ``` ./gradlew :airbyte-integrations:connectors:source-tidb:integrationTest ``` @@ -62,7 +76,9 @@ To run acceptance and custom integration tests: ## Dependency Management ### Publishing a new version of the connector + You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? + 1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-tidb test` 2. Bump the connector version in `metadata.yaml`: increment the `dockerImageTag` value. Please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors). 3. Make sure the `metadata.yaml` content is up to date. @@ -70,4 +86,3 @@ You've checked out the repo, implemented a million dollar feature, and you're re 5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention). 6. Pat yourself on the back for being an awesome contributor. 7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. 
- diff --git a/airbyte-integrations/connectors/source-tiktok-marketing/README.md b/airbyte-integrations/connectors/source-tiktok-marketing/README.md index fadf0bc2de0..9aaf80daced 100644 --- a/airbyte-integrations/connectors/source-tiktok-marketing/README.md +++ b/airbyte-integrations/connectors/source-tiktok-marketing/README.md @@ -1,31 +1,32 @@ # Tiktok-Marketing source connector - This is the repository for the Tiktok-Marketing source connector, written in Python. For information about how to use this connector within Airbyte, see [the documentation](https://docs.airbyte.com/integrations/sources/tiktok-marketing). ## Local development ### Prerequisites -* Python (~=3.9) -* Poetry (~=1.7) - installation instructions [here](https://python-poetry.org/docs/#installation) +- Python (~=3.9) +- Poetry (~=1.7) - installation instructions [here](https://python-poetry.org/docs/#installation) ### Installing the connector + From this connector directory, run: + ```bash poetry install --with dev ``` - ### Create credentials + **If you are a community contributor**, follow the instructions in the [documentation](https://docs.airbyte.com/integrations/sources/tiktok-marketing) to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `source_tiktok_marketing/spec.yaml` file. Note that any directory named `secrets` is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. See `sample_files/sample_config.json` for a sample config file. - ### Locally running the connector + ``` poetry run source-tiktok-marketing spec poetry run source-tiktok-marketing check --config secrets/config.json @@ -34,23 +35,28 @@ poetry run source-tiktok-marketing read --config secrets/config.json --catalog s ``` ### Running unit tests + To run unit tests locally, from the connector directory run: + ``` poetry run pytest unit_tests ``` ### Building the docker image + 1. Install [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md) 2. Run the following command to build the docker image: + ```bash airbyte-ci connectors --name=source-tiktok-marketing build ``` An image will be available on your host with the tag `airbyte/source-tiktok-marketing:dev`. - ### Running as a docker container + Then run any of the connector commands as follows: + ``` docker run --rm airbyte/source-tiktok-marketing:dev spec docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-tiktok-marketing:dev check --config /secrets/config.json @@ -59,18 +65,23 @@ docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integrat ``` ### Running our CI test suite + You can run our full test suite locally using [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md): + ```bash airbyte-ci connectors --name=source-tiktok-marketing test ``` ### Customizing acceptance Tests + Customize `acceptance-test-config.yml` file to configure acceptance tests. See [Connector Acceptance Tests](https://docs.airbyte.com/connector-development/testing-connectors/connector-acceptance-tests-reference) for more information. If your connector requires to create or destroy resources for use during acceptance tests create fixtures for it and place them inside integration_tests/acceptance.py. ### Dependency Management -All of your dependencies should be managed via Poetry. + +All of your dependencies should be managed via Poetry. 
To add a new dependency, run: + ```bash poetry add ``` @@ -78,14 +89,16 @@ poetry add Please commit the changes to `pyproject.toml` and `poetry.lock` files. ## Publishing a new version of the connector + You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? + 1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-tiktok-marketing test` -2. Bump the connector version (please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors)): - - bump the `dockerImageTag` value in in `metadata.yaml` - - bump the `version` value in `pyproject.toml` +2. Bump the connector version (please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors)): + - bump the `dockerImageTag` value in in `metadata.yaml` + - bump the `version` value in `pyproject.toml` 3. Make sure the `metadata.yaml` content is up to date. 4. Make sure the connector documentation and its changelog is up to date (`docs/integrations/sources/tiktok-marketing.md`). 5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention). 6. Pat yourself on the back for being an awesome contributor. 7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. -8. Once your PR is merged, the new version of the connector will be automatically published to Docker Hub and our connector registry. \ No newline at end of file +8. Once your PR is merged, the new version of the connector will be automatically published to Docker Hub and our connector registry. diff --git a/airbyte-integrations/connectors/source-tiktok-marketing/bootstrap.md b/airbyte-integrations/connectors/source-tiktok-marketing/bootstrap.md index 6b54ddf6909..a1415514b69 100644 --- a/airbyte-integrations/connectors/source-tiktok-marketing/bootstrap.md +++ b/airbyte-integrations/connectors/source-tiktok-marketing/bootstrap.md @@ -2,67 +2,69 @@ The Business Marketing API is [a REST based API](https://business-api.tiktok.com This service also provides a [sandbox](https://business-api.tiktok.com/marketing_api/docs?rid=88iodtuzdt7&id=1701890920013825) environment for testing with some limitations. ## Core Advertiser stream + The basic entity is 'advertiser'. All other streams use this required parameter for data loading. This works slightly differently between sandbox and production environments. For production, every developer application can have multiple advertisers. [This endpoint](https://business-api.tiktok.com/marketing_api/docs?id=1708503202263042) gets a list of advertiser accounts that authorized an app, providing us functionality to obtain the associated advertisers. However, this endpoint is inaccessible for sandbox because a sandbox can have only one advertiser object and its ID is known in advance. 
## Other streams -* [Campaigns](https://business-api.tiktok.com/marketing_api/docs?id=1708582970809346) \(Incremental\) -* [Ad Groups](https://business-api.tiktok.com/marketing_api/docs?id=1708503489590273)\(Incremental\) -* [Ads](https://business-api.tiktok.com/marketing_api/docs?id=1708572923161602)\(Incremental\) + +- [Campaigns](https://business-api.tiktok.com/marketing_api/docs?id=1708582970809346) \(Incremental\) +- [Ad Groups](https://business-api.tiktok.com/marketing_api/docs?id=1708503489590273)\(Incremental\) +- [Ads](https://business-api.tiktok.com/marketing_api/docs?id=1708572923161602)\(Incremental\) Dependent streams have required parameter advertiser_id. As cursor field this connector uses "modify_time" values. But endpoints don't provide any mechanism for correct data filtering and sorting thus for incremental sync this connector tries to load all data and to validate a cursor field value on own side. - - `stream` method has granularity condition depend on that report streams supports by different connector version: + - For all version: - basic streams list: - * ad_groups - * ads - * campaigns - * advertisers + basic streams list: + - ad_groups + - ads + - campaigns + - advertisers - for < 0.1.13 - expose report streams initialized with 'report_granularity' argument, like: - Example: + Example: + - AdsReports(report_granularity='DAILY') - AdsReports(report_granularity='LIFETIME') - streams list: - * advertisers_reports - * advertisers_audience_reports - * campaigns_audience_reports_by_country - * ad_group_audience_reports - * ads_audience_reports - * ad_groups_reports - * ads_reports - * campaigns_reports + streams list: + - advertisers_reports + - advertisers_audience_reports + - campaigns_audience_reports_by_country + - ad_group_audience_reports + - ads_audience_reports + - ad_groups_reports + - ads_reports + - campaigns_reports -- for >= 0.1.13 - expose report streams in format: _, like: - Example: +- for >= 0.1.13 - expose report streams in format: *, like: + Example: - AdsReportsDaily(Daily, AdsReports) - AdsReportsLifetime(Lifetime, AdsReports) - streams: - * campaigns_audience_reports_daily - * campaigns_audience_reports_by_country_daily - * campaigns_audience_reports_by_platform_daily - * campaigns_reports_daily - * advertisers_audience_reports_daily - * advertisers_audience_reports_by_country_daily - * advertisers_audience_reports_by_platform_daily - * advertisers_reports_daily - * ad_group_audience_reports_daily - * ad_group_audience_reports_by_country_daily - * ad_group_audience_reports_by_platform_daily - * ads_reports_lifetime - * advertiser_ids - * campaigns_reports_lifetime - * advertisers_audience_reports_lifetime - * ad_groups_reports_lifetime - * ad_groups_reports_daily - * advertisers_reports_lifetime - * ads_reports_daily - * ads_audience_reports_daily - * ads_audience_reports_by_country_daily - * ads_audience_reports_by_platform_daily - * ads_reports_hourly - * ad_groups_reports_hourly - * advertisers_reports_hourly - * campaigns_reports_hourly + streams: + - campaigns_audience_reports_daily + - campaigns_audience_reports_by_country_daily + - campaigns_audience_reports_by_platform_daily + - campaigns_reports_daily + - advertisers_audience_reports_daily + - advertisers_audience_reports_by_country_daily + - advertisers_audience_reports_by_platform_daily + - advertisers_reports_daily + - ad_group_audience_reports_daily + - ad_group_audience_reports_by_country_daily + - ad_group_audience_reports_by_platform_daily + - ads_reports_lifetime + - advertiser_ids 
+ - campaigns_reports_lifetime + - advertisers_audience_reports_lifetime + - ad_groups_reports_lifetime + - ad_groups_reports_daily + - advertisers_reports_lifetime + - ads_reports_daily + - ads_audience_reports_daily + - ads_audience_reports_by_country_daily + - ads_audience_reports_by_platform_daily + - ads_reports_hourly + - ad_groups_reports_hourly + - advertisers_reports_hourly + - campaigns_reports_hourly diff --git a/airbyte-integrations/connectors/source-timely/README.md b/airbyte-integrations/connectors/source-timely/README.md index 6082bbbaf14..fad12573f29 100644 --- a/airbyte-integrations/connectors/source-timely/README.md +++ b/airbyte-integrations/connectors/source-timely/README.md @@ -1,31 +1,32 @@ # Timely source connector - This is the repository for the Timely source connector, written in Python. For information about how to use this connector within Airbyte, see [the documentation](https://docs.airbyte.com/integrations/sources/timely). ## Local development ### Prerequisites -* Python (~=3.9) -* Poetry (~=1.7) - installation instructions [here](https://python-poetry.org/docs/#installation) +- Python (~=3.9) +- Poetry (~=1.7) - installation instructions [here](https://python-poetry.org/docs/#installation) ### Installing the connector + From this connector directory, run: + ```bash poetry install --with dev ``` - ### Create credentials + **If you are a community contributor**, follow the instructions in the [documentation](https://docs.airbyte.com/integrations/sources/timely) to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `source_timely/spec.yaml` file. Note that any directory named `secrets` is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. See `sample_files/sample_config.json` for a sample config file. - ### Locally running the connector + ``` poetry run source-timely spec poetry run source-timely check --config secrets/config.json @@ -34,23 +35,28 @@ poetry run source-timely read --config secrets/config.json --catalog sample_file ``` ### Running unit tests + To run unit tests locally, from the connector directory run: + ``` poetry run pytest unit_tests ``` ### Building the docker image + 1. Install [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md) 2. Run the following command to build the docker image: + ```bash airbyte-ci connectors --name=source-timely build ``` An image will be available on your host with the tag `airbyte/source-timely:dev`. - ### Running as a docker container + Then run any of the connector commands as follows: + ``` docker run --rm airbyte/source-timely:dev spec docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-timely:dev check --config /secrets/config.json @@ -59,18 +65,23 @@ docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integrat ``` ### Running our CI test suite + You can run our full test suite locally using [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md): + ```bash airbyte-ci connectors --name=source-timely test ``` ### Customizing acceptance Tests + Customize `acceptance-test-config.yml` file to configure acceptance tests. See [Connector Acceptance Tests](https://docs.airbyte.com/connector-development/testing-connectors/connector-acceptance-tests-reference) for more information. 
If your connector requires to create or destroy resources for use during acceptance tests create fixtures for it and place them inside integration_tests/acceptance.py. ### Dependency Management -All of your dependencies should be managed via Poetry. + +All of your dependencies should be managed via Poetry. To add a new dependency, run: + ```bash poetry add ``` @@ -78,14 +89,16 @@ poetry add Please commit the changes to `pyproject.toml` and `poetry.lock` files. ## Publishing a new version of the connector + You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? + 1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-timely test` -2. Bump the connector version (please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors)): - - bump the `dockerImageTag` value in in `metadata.yaml` - - bump the `version` value in `pyproject.toml` +2. Bump the connector version (please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors)): + - bump the `dockerImageTag` value in in `metadata.yaml` + - bump the `version` value in `pyproject.toml` 3. Make sure the `metadata.yaml` content is up to date. 4. Make sure the connector documentation and its changelog is up to date (`docs/integrations/sources/timely.md`). 5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention). 6. Pat yourself on the back for being an awesome contributor. 7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. -8. Once your PR is merged, the new version of the connector will be automatically published to Docker Hub and our connector registry. \ No newline at end of file +8. Once your PR is merged, the new version of the connector will be automatically published to Docker Hub and our connector registry. diff --git a/airbyte-integrations/connectors/source-tmdb/README.md b/airbyte-integrations/connectors/source-tmdb/README.md index a4ed1e3439f..3a1f0838866 100644 --- a/airbyte-integrations/connectors/source-tmdb/README.md +++ b/airbyte-integrations/connectors/source-tmdb/README.md @@ -6,23 +6,28 @@ For information about how to use this connector within Airbyte, see [the documen ## Local development ### Prerequisites + **To iterate on this connector, make sure to complete this prerequisites section.** #### Minimum Python version required `= 3.9.0` #### Build & Activate Virtual Environment and install dependencies + From this connector directory, create a virtual environment: + ``` python -m venv .venv ``` This will generate a virtualenv for this module in `.venv/`. Make sure this venv is active in your development environment of choice. To activate it from the terminal, run: + ``` source .venv/bin/activate pip install -r requirements.txt ``` + If you are in an IDE, follow your IDE's instructions to activate the virtualenv. Note that while we are installing dependencies from `requirements.txt`, you should only edit `setup.py` for your dependencies. `requirements.txt` is @@ -31,6 +36,7 @@ If this is mumbo jumbo to you, don't worry about it, just put your deps in `setu should work as you expect. 
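In practice that workflow is: add the dependency to `setup.py`, then reinstall so the active virtualenv picks it up. A minimal sketch, assuming `requirements.txt` installs this connector in editable mode (the usual monorepo setup):

```bash
# after adding the dependency to MAIN_REQUIREMENTS (or TEST_REQUIREMENTS) in setup.py,
# reinstall inside the virtualenv so the new package becomes available
source .venv/bin/activate
pip install -r requirements.txt
```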
#### Create credentials + **If you are a community contributor**, follow the instructions in the [documentation](https://docs.airbyte.io/integrations/sources/tmdb) to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `source_tmdb/spec.yaml` file. Note that any directory named `secrets` is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. @@ -41,9 +47,10 @@ and place them into `secrets/config.json`. ### Locally running the connector docker image - #### Build + **Via [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md) (recommended):** + ```bash airbyte-ci connectors --name=source-tmdb build ``` @@ -51,12 +58,15 @@ airbyte-ci connectors --name=source-tmdb build An image will be built with the tag `airbyte/source-tmdb:dev`. **Via `docker build`:** + ```bash docker build -t airbyte/source-tmdb:dev . ``` #### Run + Then run any of the connector commands as follows: + ``` docker run --rm airbyte/source-tmdb:dev spec docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-tmdb:dev check --config /secrets/config.json @@ -65,23 +75,30 @@ docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integrat ``` ## Testing + You can run our full test suite locally using [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md): + ```bash airbyte-ci connectors --name=source-tmdb test ``` ### Customizing acceptance Tests + Customize `acceptance-test-config.yml` file to configure tests. See [Connector Acceptance Tests](https://docs.airbyte.com/connector-development/testing-connectors/connector-acceptance-tests-reference) for more information. If your connector requires to create or destroy resources for use during acceptance tests create fixtures for it and place them inside integration_tests/acceptance.py. ## Dependency Management + All of your dependencies should go in `setup.py`, NOT `requirements.txt`. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development. We split dependencies between two groups, dependencies that are: -* required for your connector to work need to go to `MAIN_REQUIREMENTS` list. -* required for the testing need to go to `TEST_REQUIREMENTS` list + +- required for your connector to work need to go to `MAIN_REQUIREMENTS` list. +- required for the testing need to go to `TEST_REQUIREMENTS` list ### Publishing a new version of the connector + You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? + 1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-tmdb test` 2. Bump the connector version in `metadata.yaml`: increment the `dockerImageTag` value. Please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors). 3. Make sure the `metadata.yaml` content is up to date. @@ -89,4 +106,3 @@ You've checked out the repo, implemented a million dollar feature, and you're re 5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention). 6. Pat yourself on the back for being an awesome contributor. 7. 
Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. - diff --git a/airbyte-integrations/connectors/source-tmdb/bootstrap.md b/airbyte-integrations/connectors/source-tmdb/bootstrap.md index c88370a8643..1ba840c3fb6 100644 --- a/airbyte-integrations/connectors/source-tmdb/bootstrap.md +++ b/airbyte-integrations/connectors/source-tmdb/bootstrap.md @@ -1,7 +1,7 @@ # TMDb The connector uses the v3 API documented here: https://developers.themoviedb.org/3/getting-started/introduction. It is -straightforward HTTP REST API with API Authentication. +straightforward HTTP REST API with API Authentication. ## API key @@ -14,7 +14,7 @@ Api key is mandate for this connector to work. It could be generated using a fre ### Step 1: Set up TMDb connection - Have an API key by generating personal API key (Example: 12345) -- A movie ID, or query could be configured in config.json (Not Mandate, Default movie _id would be 550 and query would be marvel) +- A movie ID, or query could be configured in config.json (Not Mandate, Default movie \_id would be 550 and query would be marvel) - See sample_config.json for more details ## Step 2: Generate schema for the endpoint @@ -28,8 +28,7 @@ Api key is mandate for this connector to work. It could be generated using a fre 1. Navigate to the Airbyte Open Source dashboard. 2. Set the name for your source. 3. Enter your `api_key`. -5. Enter params `movie_id, query, language` (if needed). -6. Click **Set up source**. - - * We use only GET methods, all streams are straightforward. +4. Enter params `movie_id, query, language` (if needed). +5. Click **Set up source**. +- We use only GET methods, all streams are straightforward. diff --git a/airbyte-integrations/connectors/source-todoist/README.md b/airbyte-integrations/connectors/source-todoist/README.md index 3d752ac2ae4..40a7f76174b 100644 --- a/airbyte-integrations/connectors/source-todoist/README.md +++ b/airbyte-integrations/connectors/source-todoist/README.md @@ -1,31 +1,32 @@ # Todoist source connector - This is the repository for the Todoist source connector, written in Python. For information about how to use this connector within Airbyte, see [the documentation](https://docs.airbyte.com/integrations/sources/todoist). ## Local development ### Prerequisites -* Python (~=3.9) -* Poetry (~=1.7) - installation instructions [here](https://python-poetry.org/docs/#installation) +- Python (~=3.9) +- Poetry (~=1.7) - installation instructions [here](https://python-poetry.org/docs/#installation) ### Installing the connector + From this connector directory, run: + ```bash poetry install --with dev ``` - ### Create credentials + **If you are a community contributor**, follow the instructions in the [documentation](https://docs.airbyte.com/integrations/sources/todoist) to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `source_todoist/spec.yaml` file. Note that any directory named `secrets` is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. See `sample_files/sample_config.json` for a sample config file. 
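One way to bootstrap that file, sketched under the assumption that the sample's field names match `source_todoist/spec.yaml`:

```bash
# copy the sample config and fill in real credentials by hand;
# any secrets/ directory is gitignored across the repo, so nothing sensitive is committed
mkdir -p secrets
cp sample_files/sample_config.json secrets/config.json
```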
- ### Locally running the connector + ``` poetry run source-todoist spec poetry run source-todoist check --config secrets/config.json @@ -34,23 +35,28 @@ poetry run source-todoist read --config secrets/config.json --catalog sample_fil ``` ### Running unit tests + To run unit tests locally, from the connector directory run: + ``` poetry run pytest unit_tests ``` ### Building the docker image + 1. Install [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md) 2. Run the following command to build the docker image: + ```bash airbyte-ci connectors --name=source-todoist build ``` An image will be available on your host with the tag `airbyte/source-todoist:dev`. - ### Running as a docker container + Then run any of the connector commands as follows: + ``` docker run --rm airbyte/source-todoist:dev spec docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-todoist:dev check --config /secrets/config.json @@ -59,18 +65,23 @@ docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integrat ``` ### Running our CI test suite + You can run our full test suite locally using [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md): + ```bash airbyte-ci connectors --name=source-todoist test ``` ### Customizing acceptance Tests + Customize `acceptance-test-config.yml` file to configure acceptance tests. See [Connector Acceptance Tests](https://docs.airbyte.com/connector-development/testing-connectors/connector-acceptance-tests-reference) for more information. If your connector requires to create or destroy resources for use during acceptance tests create fixtures for it and place them inside integration_tests/acceptance.py. ### Dependency Management -All of your dependencies should be managed via Poetry. + +All of your dependencies should be managed via Poetry. To add a new dependency, run: + ```bash poetry add ``` @@ -78,14 +89,16 @@ poetry add Please commit the changes to `pyproject.toml` and `poetry.lock` files. ## Publishing a new version of the connector + You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? + 1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-todoist test` -2. Bump the connector version (please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors)): - - bump the `dockerImageTag` value in in `metadata.yaml` - - bump the `version` value in `pyproject.toml` +2. Bump the connector version (please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors)): + - bump the `dockerImageTag` value in in `metadata.yaml` + - bump the `version` value in `pyproject.toml` 3. Make sure the `metadata.yaml` content is up to date. 4. Make sure the connector documentation and its changelog is up to date (`docs/integrations/sources/todoist.md`). 5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention). 6. Pat yourself on the back for being an awesome contributor. 7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. -8. 
Once your PR is merged, the new version of the connector will be automatically published to Docker Hub and our connector registry. \ No newline at end of file +8. Once your PR is merged, the new version of the connector will be automatically published to Docker Hub and our connector registry. diff --git a/airbyte-integrations/connectors/source-toggl/README.md b/airbyte-integrations/connectors/source-toggl/README.md index d196b009f6d..29e00f21822 100644 --- a/airbyte-integrations/connectors/source-toggl/README.md +++ b/airbyte-integrations/connectors/source-toggl/README.md @@ -6,6 +6,7 @@ For information about how to use this connector within Airbyte, see [the documen ## Local development #### Create credentials + **If you are a community contributor**, follow the instructions in the [documentation](https://docs.airbyte.io/integrations/sources/toggl) to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `source_toggl/spec.yaml` file. Note that any directory named `secrets` is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. @@ -16,9 +17,10 @@ and place them into `secrets/config.json`. ### Locally running the connector docker image - #### Build + **Via [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md) (recommended):** + ```bash airbyte-ci connectors --name=source-toggl build ``` @@ -26,12 +28,15 @@ airbyte-ci connectors --name=source-toggl build An image will be built with the tag `airbyte/source-toggl:dev`. **Via `docker build`:** + ```bash docker build -t airbyte/source-toggl:dev . ``` #### Run + Then run any of the connector commands as follows: + ``` docker run --rm airbyte/source-toggl:dev spec docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-toggl:dev check --config /secrets/config.json @@ -40,23 +45,30 @@ docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integrat ``` ## Testing + You can run our full test suite locally using [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md): + ```bash airbyte-ci connectors --name=source-toggl test ``` ### Customizing acceptance Tests + Customize `acceptance-test-config.yml` file to configure tests. See [Connector Acceptance Tests](https://docs.airbyte.com/connector-development/testing-connectors/connector-acceptance-tests-reference) for more information. If your connector requires to create or destroy resources for use during acceptance tests create fixtures for it and place them inside integration_tests/acceptance.py. ## Dependency Management + All of your dependencies should go in `setup.py`, NOT `requirements.txt`. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development. We split dependencies between two groups, dependencies that are: -* required for your connector to work need to go to `MAIN_REQUIREMENTS` list. -* required for the testing need to go to `TEST_REQUIREMENTS` list + +- required for your connector to work need to go to `MAIN_REQUIREMENTS` list. +- required for the testing need to go to `TEST_REQUIREMENTS` list ### Publishing a new version of the connector + You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? + 1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-toggl test` 2. 
Bump the connector version in `metadata.yaml`: increment the `dockerImageTag` value. Please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors). 3. Make sure the `metadata.yaml` content is up to date. @@ -64,4 +76,3 @@ You've checked out the repo, implemented a million dollar feature, and you're re 5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention). 6. Pat yourself on the back for being an awesome contributor. 7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. - diff --git a/airbyte-integrations/connectors/source-tplcentral/README.md b/airbyte-integrations/connectors/source-tplcentral/README.md index ba7a0aa252b..6e6db8194ce 100644 --- a/airbyte-integrations/connectors/source-tplcentral/README.md +++ b/airbyte-integrations/connectors/source-tplcentral/README.md @@ -6,23 +6,28 @@ For information about how to use this connector within Airbyte, see [the documen ## Local development ### Prerequisites + **To iterate on this connector, make sure to complete this prerequisites section.** #### Minimum Python version required `= 3.7.0` #### Build & Activate Virtual Environment and install dependencies + From this connector directory, create a virtual environment: + ``` python -m venv .venv ``` This will generate a virtualenv for this module in `.venv/`. Make sure this venv is active in your development environment of choice. To activate it from the terminal, run: + ``` source .venv/bin/activate pip install -r requirements.txt pip install '.[tests]' ``` + If you are in an IDE, follow your IDE's instructions to activate the virtualenv. Note that while we are installing dependencies from `requirements.txt`, you should only edit `setup.py` for your dependencies. `requirements.txt` is @@ -31,6 +36,7 @@ If this is mumbo jumbo to you, don't worry about it, just put your deps in `setu should work as you expect. #### Create credentials + **If you are a community contributor**, follow the instructions in the [documentation](https://docs.airbyte.io/integrations/sources/tplcentral) to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `source_tplcentral/spec.json` file. Note that any directory named `secrets` is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. @@ -40,6 +46,7 @@ See `integration_tests/sample_config.json` for a sample config file. and place them into `secrets/config.json`. ### Locally running the connector + ``` python main.py spec python main.py check --config secrets/config.json @@ -49,9 +56,10 @@ python main.py read --config secrets/config.json --catalog integration_tests/con ### Locally running the connector docker image - #### Build + **Via [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md) (recommended):** + ```bash airbyte-ci connectors --name=source-tplcentral build ``` @@ -59,12 +67,15 @@ airbyte-ci connectors --name=source-tplcentral build An image will be built with the tag `airbyte/source-tplcentral:dev`. **Via `docker build`:** + ```bash docker build -t airbyte/source-tplcentral:dev . 
``` #### Run + Then run any of the connector commands as follows: + ``` docker run --rm airbyte/source-tplcentral:dev spec docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-tplcentral:dev check --config /secrets/config.json @@ -73,23 +84,30 @@ docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integrat ``` ## Testing + You can run our full test suite locally using [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md): + ```bash airbyte-ci connectors --name=source-tplcentral test ``` ### Customizing acceptance Tests + Customize `acceptance-test-config.yml` file to configure tests. See [Connector Acceptance Tests](https://docs.airbyte.com/connector-development/testing-connectors/connector-acceptance-tests-reference) for more information. If your connector requires to create or destroy resources for use during acceptance tests create fixtures for it and place them inside integration_tests/acceptance.py. ## Dependency Management + All of your dependencies should go in `setup.py`, NOT `requirements.txt`. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development. We split dependencies between two groups, dependencies that are: -* required for your connector to work need to go to `MAIN_REQUIREMENTS` list. -* required for the testing need to go to `TEST_REQUIREMENTS` list + +- required for your connector to work need to go to `MAIN_REQUIREMENTS` list. +- required for the testing need to go to `TEST_REQUIREMENTS` list ### Publishing a new version of the connector + You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? + 1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-tplcentral test` 2. Bump the connector version in `metadata.yaml`: increment the `dockerImageTag` value. Please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors). 3. Make sure the `metadata.yaml` content is up to date. @@ -97,4 +115,3 @@ You've checked out the repo, implemented a million dollar feature, and you're re 5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention). 6. Pat yourself on the back for being an awesome contributor. 7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. - diff --git a/airbyte-integrations/connectors/source-trello/README.md b/airbyte-integrations/connectors/source-trello/README.md index debe2e6038f..c15a7b3eae0 100644 --- a/airbyte-integrations/connectors/source-trello/README.md +++ b/airbyte-integrations/connectors/source-trello/README.md @@ -6,6 +6,7 @@ For information about how to use this connector within Airbyte, see [the documen ## Local development #### Create credentials + **If you are a community contributor**, follow the instructions in the [documentation](https://docs.airbyte.com/integrations/sources/trello) to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `source_trello/spec.yaml` file. Note that any directory named `secrets` is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. @@ -16,9 +17,10 @@ and place them into `secrets/config.json`. 
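Once the credentials are in place, a quick way to validate them is the connector's `check` command run from the dev image built in the next section (see the Build steps below):

```bash
# expects airbyte/source-trello:dev to exist locally; prints a connection status message
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-trello:dev check --config /secrets/config.json
```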
### Locally running the connector docker image - #### Build + **Via [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md) (recommended):** + ```bash airbyte-ci connectors --name=source-trello build ``` @@ -26,12 +28,15 @@ airbyte-ci connectors --name=source-trello build An image will be built with the tag `airbyte/source-trello:dev`. **Via `docker build`:** + ```bash docker build -t airbyte/source-trello:dev . ``` #### Run + Then run any of the connector commands as follows: + ``` docker run --rm airbyte/source-trello:dev spec docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-trello:dev check --config /secrets/config.json @@ -40,23 +45,30 @@ docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integrat ``` ## Testing + You can run our full test suite locally using [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md): + ```bash airbyte-ci connectors --name=source-trello test ``` ### Customizing acceptance Tests + Customize `acceptance-test-config.yml` file to configure tests. See [Connector Acceptance Tests](https://docs.airbyte.com/connector-development/testing-connectors/connector-acceptance-tests-reference) for more information. If your connector requires to create or destroy resources for use during acceptance tests create fixtures for it and place them inside integration_tests/acceptance.py. ## Dependency Management + All of your dependencies should go in `setup.py`, NOT `requirements.txt`. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development. We split dependencies between two groups, dependencies that are: -* required for your connector to work need to go to `MAIN_REQUIREMENTS` list. -* required for the testing need to go to `TEST_REQUIREMENTS` list + +- required for your connector to work need to go to `MAIN_REQUIREMENTS` list. +- required for the testing need to go to `TEST_REQUIREMENTS` list ### Publishing a new version of the connector + You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? + 1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-trello test` 2. Bump the connector version in `metadata.yaml`: increment the `dockerImageTag` value. Please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors). 3. Make sure the `metadata.yaml` content is up to date. @@ -64,4 +76,3 @@ You've checked out the repo, implemented a million dollar feature, and you're re 5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention). 6. Pat yourself on the back for being an awesome contributor. 7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. 
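Step 2 above is easy to miss; a quick sanity check before opening the PR (plain `grep`, run from the connector directory):

```bash
# confirm the bumped dockerImageTag actually landed in metadata.yaml
grep dockerImageTag metadata.yaml
```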
- diff --git a/airbyte-integrations/connectors/source-trustpilot/README.md b/airbyte-integrations/connectors/source-trustpilot/README.md index 7745d8a7f18..18f92ab7a91 100644 --- a/airbyte-integrations/connectors/source-trustpilot/README.md +++ b/airbyte-integrations/connectors/source-trustpilot/README.md @@ -6,23 +6,28 @@ For information about how to use this connector within Airbyte, see [the documen ## Local development ### Prerequisites + **To iterate on this connector, make sure to complete this prerequisites section.** #### Minimum Python version required `= 3.9.0` #### Build & Activate Virtual Environment and install dependencies + From this connector directory, create a virtual environment: + ``` python -m venv .venv ``` This will generate a virtualenv for this module in `.venv/`. Make sure this venv is active in your development environment of choice. To activate it from the terminal, run: + ``` source .venv/bin/activate pip install -r requirements.txt pip install '.[tests]' ``` + If you are in an IDE, follow your IDE's instructions to activate the virtualenv. Note that while we are installing dependencies from `requirements.txt`, you should only edit `setup.py` for your dependencies. `requirements.txt` is @@ -31,6 +36,7 @@ If this is mumbo jumbo to you, don't worry about it, just put your deps in `setu should work as you expect. #### Create credentials + **If you are a community contributor**, follow the instructions in the [documentation](https://docs.airbyte.com/integrations/sources/trustpilot) to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `source_trustpilot/spec.yaml` file. Note that any directory named `secrets` is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. @@ -40,6 +46,7 @@ See `integration_tests/sample_config.json` for a sample config file. and place them into `secrets/config.json`. ### Locally running the connector + ``` python main.py spec python main.py check --config secrets/config.json @@ -49,9 +56,10 @@ python main.py read --config secrets/config.json --catalog integration_tests/con ### Locally running the connector docker image - #### Build + **Via [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md) (recommended):** + ```bash airbyte-ci connectors --name=source-trustpilot build ``` @@ -59,12 +67,15 @@ airbyte-ci connectors --name=source-trustpilot build An image will be built with the tag `airbyte/source-trustpilot:dev`. **Via `docker build`:** + ```bash docker build -t airbyte/source-trustpilot:dev . ``` #### Run + Then run any of the connector commands as follows: + ``` docker run --rm airbyte/source-trustpilot:dev spec docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-trustpilot:dev check --config /secrets/config.json @@ -73,23 +84,30 @@ docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integrat ``` ## Testing + You can run our full test suite locally using [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md): + ```bash airbyte-ci connectors --name=source-trustpilot test ``` ### Customizing acceptance Tests + Customize `acceptance-test-config.yml` file to configure tests. See [Connector Acceptance Tests](https://docs.airbyte.com/connector-development/testing-connectors/connector-acceptance-tests-reference) for more information. 
If your connector requires to create or destroy resources for use during acceptance tests create fixtures for it and place them inside integration_tests/acceptance.py. ## Dependency Management + All of your dependencies should go in `setup.py`, NOT `requirements.txt`. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development. We split dependencies between two groups, dependencies that are: -* required for your connector to work need to go to `MAIN_REQUIREMENTS` list. -* required for the testing need to go to `TEST_REQUIREMENTS` list + +- required for your connector to work need to go to `MAIN_REQUIREMENTS` list. +- required for the testing need to go to `TEST_REQUIREMENTS` list ### Publishing a new version of the connector + You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? + 1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-trustpilot test` 2. Bump the connector version in `metadata.yaml`: increment the `dockerImageTag` value. Please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors). 3. Make sure the `metadata.yaml` content is up to date. @@ -97,4 +115,3 @@ You've checked out the repo, implemented a million dollar feature, and you're re 5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention). 6. Pat yourself on the back for being an awesome contributor. 7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. - diff --git a/airbyte-integrations/connectors/source-tvmaze-schedule/README.md b/airbyte-integrations/connectors/source-tvmaze-schedule/README.md index fdf98ecdf7d..bd2bf03a18b 100644 --- a/airbyte-integrations/connectors/source-tvmaze-schedule/README.md +++ b/airbyte-integrations/connectors/source-tvmaze-schedule/README.md @@ -6,6 +6,7 @@ For information about how to use this connector within Airbyte, see [the documen ## Local development #### Create credentials + **If you are a community contributor**, follow the instructions in the [documentation](https://docs.airbyte.io/integrations/sources/tvmaze-schedule) to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `source_tvmaze_schedule/spec.yaml` file. Note that any directory named `secrets` is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. @@ -16,9 +17,10 @@ and place them into `secrets/config.json`. ### Locally running the connector docker image - #### Build + **Via [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md) (recommended):** + ```bash airbyte-ci connectors --name=source-tvmaze-schedule build ``` @@ -26,12 +28,15 @@ airbyte-ci connectors --name=source-tvmaze-schedule build An image will be built with the tag `airbyte/source-tvmaze-schedule:dev`. **Via `docker build`:** + ```bash docker build -t airbyte/source-tvmaze-schedule:dev . 
``` #### Run + Then run any of the connector commands as follows: + ``` docker run --rm airbyte/source-tvmaze-schedule:dev spec docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-tvmaze-schedule:dev check --config /secrets/config.json @@ -40,23 +45,30 @@ docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integrat ``` ## Testing + You can run our full test suite locally using [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md): + ```bash airbyte-ci connectors --name=source-tvmaze-schedule test ``` ### Customizing acceptance Tests + Customize `acceptance-test-config.yml` file to configure tests. See [Connector Acceptance Tests](https://docs.airbyte.com/connector-development/testing-connectors/connector-acceptance-tests-reference) for more information. If your connector requires to create or destroy resources for use during acceptance tests create fixtures for it and place them inside integration_tests/acceptance.py. ## Dependency Management + All of your dependencies should go in `setup.py`, NOT `requirements.txt`. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development. We split dependencies between two groups, dependencies that are: -* required for your connector to work need to go to `MAIN_REQUIREMENTS` list. -* required for the testing need to go to `TEST_REQUIREMENTS` list + +- required for your connector to work need to go to `MAIN_REQUIREMENTS` list. +- required for the testing need to go to `TEST_REQUIREMENTS` list ### Publishing a new version of the connector + You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? + 1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-tvmaze-schedule test` 2. Bump the connector version in `metadata.yaml`: increment the `dockerImageTag` value. Please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors). 3. Make sure the `metadata.yaml` content is up to date. @@ -64,4 +76,3 @@ You've checked out the repo, implemented a million dollar feature, and you're re 5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention). 6. Pat yourself on the back for being an awesome contributor. 7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. - diff --git a/airbyte-integrations/connectors/source-twilio-taskrouter/README.md b/airbyte-integrations/connectors/source-twilio-taskrouter/README.md index ad6620b3693..0633ed803ec 100644 --- a/airbyte-integrations/connectors/source-twilio-taskrouter/README.md +++ b/airbyte-integrations/connectors/source-twilio-taskrouter/README.md @@ -1,31 +1,32 @@ # Twilio-Taskrouter source connector - This is the repository for the Twilio-Taskrouter source connector, written in Python. For information about how to use this connector within Airbyte, see [the documentation](https://docs.airbyte.com/integrations/sources/twilio-taskrouter). 
## Local development ### Prerequisites -* Python (~=3.9) -* Poetry (~=1.7) - installation instructions [here](https://python-poetry.org/docs/#installation) +- Python (~=3.9) +- Poetry (~=1.7) - installation instructions [here](https://python-poetry.org/docs/#installation) ### Installing the connector + From this connector directory, run: + ```bash poetry install --with dev ``` - ### Create credentials + **If you are a community contributor**, follow the instructions in the [documentation](https://docs.airbyte.com/integrations/sources/twilio-taskrouter) to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `source_twilio_taskrouter/spec.yaml` file. Note that any directory named `secrets` is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. See `sample_files/sample_config.json` for a sample config file. - ### Locally running the connector + ``` poetry run source-twilio-taskrouter spec poetry run source-twilio-taskrouter check --config secrets/config.json @@ -34,23 +35,28 @@ poetry run source-twilio-taskrouter read --config secrets/config.json --catalog ``` ### Running unit tests + To run unit tests locally, from the connector directory run: + ``` poetry run pytest unit_tests ``` ### Building the docker image + 1. Install [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md) 2. Run the following command to build the docker image: + ```bash airbyte-ci connectors --name=source-twilio-taskrouter build ``` An image will be available on your host with the tag `airbyte/source-twilio-taskrouter:dev`. - ### Running as a docker container + Then run any of the connector commands as follows: + ``` docker run --rm airbyte/source-twilio-taskrouter:dev spec docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-twilio-taskrouter:dev check --config /secrets/config.json @@ -59,18 +65,23 @@ docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integrat ``` ### Running our CI test suite + You can run our full test suite locally using [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md): + ```bash airbyte-ci connectors --name=source-twilio-taskrouter test ``` ### Customizing acceptance Tests + Customize `acceptance-test-config.yml` file to configure acceptance tests. See [Connector Acceptance Tests](https://docs.airbyte.com/connector-development/testing-connectors/connector-acceptance-tests-reference) for more information. If your connector requires to create or destroy resources for use during acceptance tests create fixtures for it and place them inside integration_tests/acceptance.py. ### Dependency Management -All of your dependencies should be managed via Poetry. + +All of your dependencies should be managed via Poetry. To add a new dependency, run: + ```bash poetry add ``` @@ -78,14 +89,16 @@ poetry add Please commit the changes to `pyproject.toml` and `poetry.lock` files. ## Publishing a new version of the connector + You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? + 1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-twilio-taskrouter test` -2. 
Bump the connector version (please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors)): - - bump the `dockerImageTag` value in in `metadata.yaml` - - bump the `version` value in `pyproject.toml` +2. Bump the connector version (please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors)): + - bump the `dockerImageTag` value in in `metadata.yaml` + - bump the `version` value in `pyproject.toml` 3. Make sure the `metadata.yaml` content is up to date. 4. Make sure the connector documentation and its changelog is up to date (`docs/integrations/sources/twilio-taskrouter.md`). 5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention). 6. Pat yourself on the back for being an awesome contributor. 7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. -8. Once your PR is merged, the new version of the connector will be automatically published to Docker Hub and our connector registry. \ No newline at end of file +8. Once your PR is merged, the new version of the connector will be automatically published to Docker Hub and our connector registry. diff --git a/airbyte-integrations/connectors/source-twilio/README.md b/airbyte-integrations/connectors/source-twilio/README.md index b4d9f466f18..938ae7498bc 100644 --- a/airbyte-integrations/connectors/source-twilio/README.md +++ b/airbyte-integrations/connectors/source-twilio/README.md @@ -1,31 +1,32 @@ # Twilio source connector - This is the repository for the Twilio source connector, written in Python. For information about how to use this connector within Airbyte, see [the documentation](https://docs.airbyte.com/integrations/sources/twilio). ## Local development ### Prerequisites -* Python (~=3.9) -* Poetry (~=1.7) - installation instructions [here](https://python-poetry.org/docs/#installation) +- Python (~=3.9) +- Poetry (~=1.7) - installation instructions [here](https://python-poetry.org/docs/#installation) ### Installing the connector + From this connector directory, run: + ```bash poetry install --with dev ``` - ### Create credentials + **If you are a community contributor**, follow the instructions in the [documentation](https://docs.airbyte.com/integrations/sources/twilio) to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `source_twilio/spec.yaml` file. Note that any directory named `secrets` is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. See `sample_files/sample_config.json` for a sample config file. - ### Locally running the connector + ``` poetry run source-twilio spec poetry run source-twilio check --config secrets/config.json @@ -34,23 +35,28 @@ poetry run source-twilio read --config secrets/config.json --catalog integration ``` ### Running unit tests + To run unit tests locally, from the connector directory run: + ``` poetry run pytest unit_tests ``` ### Building the docker image + 1. Install [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md) 2. 
Run the following command to build the docker image: + ```bash airbyte-ci connectors --name=source-twilio build ``` An image will be available on your host with the tag `airbyte/source-twilio:dev`. - ### Running as a docker container + Then run any of the connector commands as follows: + ``` docker run --rm airbyte/source-twilio:dev spec docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-twilio:dev check --config /secrets/config.json @@ -59,18 +65,23 @@ docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integrat ``` ### Running our CI test suite + You can run our full test suite locally using [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md): + ```bash airbyte-ci connectors --name=source-twilio test ``` ### Customizing acceptance Tests + Customize `acceptance-test-config.yml` file to configure acceptance tests. See [Connector Acceptance Tests](https://docs.airbyte.com/connector-development/testing-connectors/connector-acceptance-tests-reference) for more information. If your connector requires to create or destroy resources for use during acceptance tests create fixtures for it and place them inside integration_tests/acceptance.py. ### Dependency Management -All of your dependencies should be managed via Poetry. + +All of your dependencies should be managed via Poetry. To add a new dependency, run: + ```bash poetry add ``` @@ -78,14 +89,16 @@ poetry add Please commit the changes to `pyproject.toml` and `poetry.lock` files. ## Publishing a new version of the connector + You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? + 1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-twilio test` -2. Bump the connector version (please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors)): - - bump the `dockerImageTag` value in in `metadata.yaml` - - bump the `version` value in `pyproject.toml` +2. Bump the connector version (please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors)): + - bump the `dockerImageTag` value in in `metadata.yaml` + - bump the `version` value in `pyproject.toml` 3. Make sure the `metadata.yaml` content is up to date. 4. Make sure the connector documentation and its changelog is up to date (`docs/integrations/sources/twilio.md`). 5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention). 6. Pat yourself on the back for being an awesome contributor. 7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. -8. Once your PR is merged, the new version of the connector will be automatically published to Docker Hub and our connector registry. \ No newline at end of file +8. Once your PR is merged, the new version of the connector will be automatically published to Docker Hub and our connector registry. 
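Taken together, a typical local iteration loop for this connector is roughly the sequence below; every command is one already listed above, just in order:

```bash
poetry install --with dev                          # install the connector plus dev dependencies
poetry run pytest unit_tests                       # fast feedback from unit tests
airbyte-ci connectors --name=source-twilio test    # full CI suite, including acceptance tests
airbyte-ci connectors --name=source-twilio build   # produces airbyte/source-twilio:dev
docker run --rm airbyte/source-twilio:dev spec     # smoke-test the built image
```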
diff --git a/airbyte-integrations/connectors/source-twitter/README.md b/airbyte-integrations/connectors/source-twitter/README.md index 4dffdf65820..bd7b6830738 100644 --- a/airbyte-integrations/connectors/source-twitter/README.md +++ b/airbyte-integrations/connectors/source-twitter/README.md @@ -6,6 +6,7 @@ For information about how to use this connector within Airbyte, see [the documen ## Local development #### Create credentials + **If you are a community contributor**, follow the instructions in the [documentation](https://docs.airbyte.io/integrations/sources/twitter) to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `source_twitter/spec.yaml` file. Note that any directory named `secrets` is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. @@ -16,9 +17,10 @@ and place them into `secrets/config.json`. ### Locally running the connector docker image - #### Build + **Via [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md) (recommended):** + ```bash airbyte-ci connectors --name=source-twitter build ``` @@ -26,12 +28,15 @@ airbyte-ci connectors --name=source-twitter build An image will be built with the tag `airbyte/source-twitter:dev`. **Via `docker build`:** + ```bash docker build -t airbyte/source-twitter:dev . ``` #### Run + Then run any of the connector commands as follows: + ``` docker run --rm airbyte/source-twitter:dev spec docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-twitter:dev check --config /secrets/config.json @@ -40,23 +45,30 @@ docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integrat ``` ## Testing + You can run our full test suite locally using [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md): + ```bash airbyte-ci connectors --name=source-twitter test ``` ### Customizing acceptance Tests + Customize `acceptance-test-config.yml` file to configure tests. See [Connector Acceptance Tests](https://docs.airbyte.com/connector-development/testing-connectors/connector-acceptance-tests-reference) for more information. If your connector requires to create or destroy resources for use during acceptance tests create fixtures for it and place them inside integration_tests/acceptance.py. ## Dependency Management + All of your dependencies should go in `setup.py`, NOT `requirements.txt`. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development. We split dependencies between two groups, dependencies that are: -* required for your connector to work need to go to `MAIN_REQUIREMENTS` list. -* required for the testing need to go to `TEST_REQUIREMENTS` list + +- required for your connector to work need to go to `MAIN_REQUIREMENTS` list. +- required for the testing need to go to `TEST_REQUIREMENTS` list ### Publishing a new version of the connector + You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? + 1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-twitter test` 2. Bump the connector version in `metadata.yaml`: increment the `dockerImageTag` value. Please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors). 3. 
Make sure the `metadata.yaml` content is up to date. @@ -64,4 +76,3 @@ You've checked out the repo, implemented a million dollar feature, and you're re 5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention). 6. Pat yourself on the back for being an awesome contributor. 7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. - diff --git a/airbyte-integrations/connectors/source-tyntec-sms/README.md b/airbyte-integrations/connectors/source-tyntec-sms/README.md index fe952c25d8c..ebf126e06ab 100644 --- a/airbyte-integrations/connectors/source-tyntec-sms/README.md +++ b/airbyte-integrations/connectors/source-tyntec-sms/README.md @@ -6,6 +6,7 @@ For information about how to use this connector within Airbyte, see [the documen ## Local development #### Create credentials + **If you are a community contributor**, follow the instructions in the [documentation](https://docs.airbyte.io/integrations/sources/tyntec-sms) to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `source_tyntec_sms/spec.yaml` file. Note that any directory named `secrets` is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. @@ -16,9 +17,10 @@ and place them into `secrets/config.json`. ### Locally running the connector docker image - #### Build + **Via [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md) (recommended):** + ```bash airbyte-ci connectors --name=source-tyntec-sms build ``` @@ -26,12 +28,15 @@ airbyte-ci connectors --name=source-tyntec-sms build An image will be built with the tag `airbyte/source-tyntec-sms:dev`. **Via `docker build`:** + ```bash docker build -t airbyte/source-tyntec-sms:dev . ``` #### Run + Then run any of the connector commands as follows: + ``` docker run --rm airbyte/source-tyntec-sms:dev spec docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-tyntec-sms:dev check --config /secrets/config.json @@ -40,23 +45,30 @@ docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integrat ``` ## Testing + You can run our full test suite locally using [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md): + ```bash airbyte-ci connectors --name=source-tyntec-sms test ``` ### Customizing acceptance Tests + Customize `acceptance-test-config.yml` file to configure tests. See [Connector Acceptance Tests](https://docs.airbyte.com/connector-development/testing-connectors/connector-acceptance-tests-reference) for more information. If your connector requires to create or destroy resources for use during acceptance tests create fixtures for it and place them inside integration_tests/acceptance.py. ## Dependency Management + All of your dependencies should go in `setup.py`, NOT `requirements.txt`. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development. We split dependencies between two groups, dependencies that are: -* required for your connector to work need to go to `MAIN_REQUIREMENTS` list. -* required for the testing need to go to `TEST_REQUIREMENTS` list + +- required for your connector to work need to go to `MAIN_REQUIREMENTS` list. 
+- required for the testing need to go to `TEST_REQUIREMENTS` list ### Publishing a new version of the connector + You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? + 1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-tyntec-sms test` 2. Bump the connector version in `metadata.yaml`: increment the `dockerImageTag` value. Please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors). 3. Make sure the `metadata.yaml` content is up to date. @@ -64,4 +76,3 @@ You've checked out the repo, implemented a million dollar feature, and you're re 5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention). 6. Pat yourself on the back for being an awesome contributor. 7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. - diff --git a/airbyte-integrations/connectors/source-typeform/README.md b/airbyte-integrations/connectors/source-typeform/README.md index 0157f45bbbd..d5441b1467d 100644 --- a/airbyte-integrations/connectors/source-typeform/README.md +++ b/airbyte-integrations/connectors/source-typeform/README.md @@ -1,31 +1,32 @@ # Typeform source connector - This is the repository for the Typeform source connector, written in Python. For information about how to use this connector within Airbyte, see [the documentation](https://docs.airbyte.com/integrations/sources/typeform). ## Local development ### Prerequisites -* Python (~=3.9) -* Poetry (~=1.7) - installation instructions [here](https://python-poetry.org/docs/#installation) +- Python (~=3.9) +- Poetry (~=1.7) - installation instructions [here](https://python-poetry.org/docs/#installation) ### Installing the connector + From this connector directory, run: + ```bash poetry install --with dev ``` - ### Create credentials + **If you are a community contributor**, follow the instructions in the [documentation](https://docs.airbyte.com/integrations/sources/typeform) to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `source_typeform/spec.yaml` file. Note that any directory named `secrets` is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. See `sample_files/sample_config.json` for a sample config file. - ### Locally running the connector + ``` poetry run source-typeform spec poetry run source-typeform check --config secrets/config.json @@ -34,23 +35,28 @@ poetry run source-typeform read --config secrets/config.json --catalog sample_fi ``` ### Running unit tests + To run unit tests locally, from the connector directory run: + ``` poetry run pytest unit_tests ``` ### Building the docker image + 1. Install [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md) 2. Run the following command to build the docker image: + ```bash airbyte-ci connectors --name=source-typeform build ``` An image will be available on your host with the tag `airbyte/source-typeform:dev`. 
- ### Running as a docker container + Then run any of the connector commands as follows: + ``` docker run --rm airbyte/source-typeform:dev spec docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-typeform:dev check --config /secrets/config.json @@ -59,18 +65,23 @@ docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integrat ``` ### Running our CI test suite + You can run our full test suite locally using [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md): + ```bash airbyte-ci connectors --name=source-typeform test ``` ### Customizing acceptance Tests + Customize `acceptance-test-config.yml` file to configure acceptance tests. See [Connector Acceptance Tests](https://docs.airbyte.com/connector-development/testing-connectors/connector-acceptance-tests-reference) for more information. If your connector requires to create or destroy resources for use during acceptance tests create fixtures for it and place them inside integration_tests/acceptance.py. ### Dependency Management -All of your dependencies should be managed via Poetry. + +All of your dependencies should be managed via Poetry. To add a new dependency, run: + ```bash poetry add ``` @@ -78,14 +89,16 @@ poetry add Please commit the changes to `pyproject.toml` and `poetry.lock` files. ## Publishing a new version of the connector + You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? + 1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-typeform test` -2. Bump the connector version (please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors)): - - bump the `dockerImageTag` value in in `metadata.yaml` - - bump the `version` value in `pyproject.toml` +2. Bump the connector version (please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors)): + - bump the `dockerImageTag` value in in `metadata.yaml` + - bump the `version` value in `pyproject.toml` 3. Make sure the `metadata.yaml` content is up to date. 4. Make sure the connector documentation and its changelog is up to date (`docs/integrations/sources/typeform.md`). 5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention). 6. Pat yourself on the back for being an awesome contributor. 7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. -8. Once your PR is merged, the new version of the connector will be automatically published to Docker Hub and our connector registry. \ No newline at end of file +8. Once your PR is merged, the new version of the connector will be automatically published to Docker Hub and our connector registry. 
diff --git a/airbyte-integrations/connectors/source-unleash/README.md b/airbyte-integrations/connectors/source-unleash/README.md index aefd0e9eac5..447e3e6aa1f 100644 --- a/airbyte-integrations/connectors/source-unleash/README.md +++ b/airbyte-integrations/connectors/source-unleash/README.md @@ -6,6 +6,7 @@ For information about how to use this connector within Airbyte, see [the documen ## Local development #### Create credentials + **If you are a community contributor**, follow the instructions in the [documentation](https://docs.airbyte.io/integrations/sources/unleash) to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `source_unleash/spec.yaml` file. Note that any directory named `secrets` is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. @@ -16,9 +17,10 @@ and place them into `secrets/config.json`. ### Locally running the connector docker image - #### Build + **Via [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md) (recommended):** + ```bash airbyte-ci connectors --name=source-unleash build ``` @@ -26,12 +28,15 @@ airbyte-ci connectors --name=source-unleash build An image will be built with the tag `airbyte/source-unleash:dev`. **Via `docker build`:** + ```bash docker build -t airbyte/source-unleash:dev . ``` #### Run + Then run any of the connector commands as follows: + ``` docker run --rm airbyte/source-unleash:dev spec docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-unleash:dev check --config /secrets/config.json @@ -40,23 +45,30 @@ docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integrat ``` ## Testing + You can run our full test suite locally using [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md): + ```bash airbyte-ci connectors --name=source-unleash test ``` ### Customizing acceptance Tests + Customize `acceptance-test-config.yml` file to configure tests. See [Connector Acceptance Tests](https://docs.airbyte.com/connector-development/testing-connectors/connector-acceptance-tests-reference) for more information. If your connector requires to create or destroy resources for use during acceptance tests create fixtures for it and place them inside integration_tests/acceptance.py. ## Dependency Management + All of your dependencies should go in `setup.py`, NOT `requirements.txt`. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development. We split dependencies between two groups, dependencies that are: -* required for your connector to work need to go to `MAIN_REQUIREMENTS` list. -* required for the testing need to go to `TEST_REQUIREMENTS` list + +- required for your connector to work need to go to `MAIN_REQUIREMENTS` list. +- required for the testing need to go to `TEST_REQUIREMENTS` list ### Publishing a new version of the connector + You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? + 1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-unleash test` 2. Bump the connector version in `metadata.yaml`: increment the `dockerImageTag` value. Please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors). 3. 
Make sure the `metadata.yaml` content is up to date. @@ -64,4 +76,3 @@ You've checked out the repo, implemented a million dollar feature, and you're re 5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention). 6. Pat yourself on the back for being an awesome contributor. 7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. - diff --git a/airbyte-integrations/connectors/source-us-census/README.md b/airbyte-integrations/connectors/source-us-census/README.md index ad9c105dbf3..94630a500cb 100644 --- a/airbyte-integrations/connectors/source-us-census/README.md +++ b/airbyte-integrations/connectors/source-us-census/README.md @@ -6,22 +6,27 @@ For information about how to use this connector within Airbyte, see [the documen ## Local development ### Prerequisites + **To iterate on this connector, make sure to complete this prerequisites section.** #### Minimum Python version required `= 3.7.0` #### Build & Activate Virtual Environment and install dependencies + From this connector directory, create a virtual environment: + ``` python -m venv .venv ``` This will generate a virtualenv for this module in `.venv/`. Make sure this venv is active in your development environment of choice. To activate it from the terminal, run: + ``` source .venv/bin/activate pip install -r requirements.txt ``` + If you are in an IDE, follow your IDE's instructions to activate the virtualenv. Note that while we are installing dependencies from `requirements.txt`, you should only edit `setup.py` for your dependencies. `requirements.txt` is @@ -30,6 +35,7 @@ If this is mumbo jumbo to you, don't worry about it, just put your deps in `setu should work as you expect. #### Create credentials + **If you are a community contributor**, follow the instructions in the [documentation](https://docs.airbyte.io/integrations/sources/us-census) to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `source_us_census/spec.json` file. Note that any directory named `secrets` is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. @@ -39,6 +45,7 @@ See `integration_tests/sample_config.json` for a sample config file. and place them into `secrets/config.json`. ### Locally running the connector + ``` python main.py spec python main.py check --config secrets/config.json @@ -48,9 +55,10 @@ python main.py read --config secrets/config.json --catalog integration_tests/con ### Locally running the connector docker image - #### Build + **Via [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md) (recommended):** + ```bash airbyte-ci connectors --name=source-us-census build ``` @@ -58,12 +66,15 @@ airbyte-ci connectors --name=source-us-census build An image will be built with the tag `airbyte/source-us-census:dev`. **Via `docker build`:** + ```bash docker build -t airbyte/source-us-census:dev . 
``` #### Run + Then run any of the connector commands as follows: + ``` docker run --rm airbyte/source-us-census:dev spec docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-us-census:dev check --config /secrets/config.json @@ -72,23 +83,30 @@ docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integrat ``` ## Testing + You can run our full test suite locally using [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md): + ```bash airbyte-ci connectors --name=source-us-census test ``` ### Customizing acceptance Tests + Customize `acceptance-test-config.yml` file to configure tests. See [Connector Acceptance Tests](https://docs.airbyte.com/connector-development/testing-connectors/connector-acceptance-tests-reference) for more information. If your connector requires to create or destroy resources for use during acceptance tests create fixtures for it and place them inside integration_tests/acceptance.py. ## Dependency Management + All of your dependencies should go in `setup.py`, NOT `requirements.txt`. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development. We split dependencies between two groups, dependencies that are: -* required for your connector to work need to go to `MAIN_REQUIREMENTS` list. -* required for the testing need to go to `TEST_REQUIREMENTS` list + +- required for your connector to work need to go to `MAIN_REQUIREMENTS` list. +- required for the testing need to go to `TEST_REQUIREMENTS` list ### Publishing a new version of the connector + You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? + 1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-us-census test` 2. Bump the connector version in `metadata.yaml`: increment the `dockerImageTag` value. Please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors). 3. Make sure the `metadata.yaml` content is up to date. @@ -96,4 +114,3 @@ You've checked out the repo, implemented a million dollar feature, and you're re 5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention). 6. Pat yourself on the back for being an awesome contributor. 7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. - diff --git a/airbyte-integrations/connectors/source-vantage/README.md b/airbyte-integrations/connectors/source-vantage/README.md index fc037e05b32..6b873f83653 100644 --- a/airbyte-integrations/connectors/source-vantage/README.md +++ b/airbyte-integrations/connectors/source-vantage/README.md @@ -6,6 +6,7 @@ For information about how to use this connector within Airbyte, see [the documen ## Local development #### Create credentials + **If you are a community contributor**, follow the instructions in the [documentation](https://docs.airbyte.io/integrations/sources/vantage) to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `source_vantage/spec.yaml` file. Note that any directory named `secrets` is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. @@ -16,9 +17,10 @@ and place them into `secrets/config.json`. 
### Locally running the connector docker image - #### Build + **Via [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md) (recommended):** + ```bash airbyte-ci connectors --name=source-vantage build ``` @@ -26,12 +28,15 @@ airbyte-ci connectors --name=source-vantage build An image will be built with the tag `airbyte/source-vantage:dev`. **Via `docker build`:** + ```bash docker build -t airbyte/source-vantage:dev . ``` #### Run + Then run any of the connector commands as follows: + ``` docker run --rm airbyte/source-vantage:dev spec docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-vantage:dev check --config /secrets/config.json @@ -40,23 +45,30 @@ docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integrat ``` ## Testing + You can run our full test suite locally using [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md): + ```bash airbyte-ci connectors --name=source-vantage test ``` ### Customizing acceptance Tests + Customize `acceptance-test-config.yml` file to configure tests. See [Connector Acceptance Tests](https://docs.airbyte.com/connector-development/testing-connectors/connector-acceptance-tests-reference) for more information. If your connector requires to create or destroy resources for use during acceptance tests create fixtures for it and place them inside integration_tests/acceptance.py. ## Dependency Management + All of your dependencies should go in `setup.py`, NOT `requirements.txt`. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development. We split dependencies between two groups, dependencies that are: -* required for your connector to work need to go to `MAIN_REQUIREMENTS` list. -* required for the testing need to go to `TEST_REQUIREMENTS` list + +- required for your connector to work need to go to `MAIN_REQUIREMENTS` list. +- required for the testing need to go to `TEST_REQUIREMENTS` list ### Publishing a new version of the connector + You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? + 1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-vantage test` 2. Bump the connector version in `metadata.yaml`: increment the `dockerImageTag` value. Please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors). 3. Make sure the `metadata.yaml` content is up to date. @@ -64,4 +76,3 @@ You've checked out the repo, implemented a million dollar feature, and you're re 5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention). 6. Pat yourself on the back for being an awesome contributor. 7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. - diff --git a/airbyte-integrations/connectors/source-visma-economic/README.md b/airbyte-integrations/connectors/source-visma-economic/README.md index a5a5601ec26..4889869b2ea 100644 --- a/airbyte-integrations/connectors/source-visma-economic/README.md +++ b/airbyte-integrations/connectors/source-visma-economic/README.md @@ -1,31 +1,32 @@ # Visma-Economic source connector - This is the repository for the Visma-Economic source connector, written in Python. 
For information about how to use this connector within Airbyte, see [the documentation](https://docs.airbyte.com/integrations/sources/visma-economic). ## Local development ### Prerequisites -* Python (~=3.9) -* Poetry (~=1.7) - installation instructions [here](https://python-poetry.org/docs/#installation) +- Python (~=3.9) +- Poetry (~=1.7) - installation instructions [here](https://python-poetry.org/docs/#installation) ### Installing the connector + From this connector directory, run: + ```bash poetry install --with dev ``` - ### Create credentials + **If you are a community contributor**, follow the instructions in the [documentation](https://docs.airbyte.com/integrations/sources/visma-economic) to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `source_visma_economic/spec.yaml` file. Note that any directory named `secrets` is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. See `sample_files/sample_config.json` for a sample config file. - ### Locally running the connector + ``` poetry run source-visma-economic spec poetry run source-visma-economic check --config secrets/config.json @@ -34,23 +35,28 @@ poetry run source-visma-economic read --config secrets/config.json --catalog sam ``` ### Running unit tests + To run unit tests locally, from the connector directory run: + ``` poetry run pytest unit_tests ``` ### Building the docker image + 1. Install [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md) 2. Run the following command to build the docker image: + ```bash airbyte-ci connectors --name=source-visma-economic build ``` An image will be available on your host with the tag `airbyte/source-visma-economic:dev`. - ### Running as a docker container + Then run any of the connector commands as follows: + ``` docker run --rm airbyte/source-visma-economic:dev spec docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-visma-economic:dev check --config /secrets/config.json @@ -59,18 +65,23 @@ docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integrat ``` ### Running our CI test suite + You can run our full test suite locally using [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md): + ```bash airbyte-ci connectors --name=source-visma-economic test ``` ### Customizing acceptance Tests + Customize `acceptance-test-config.yml` file to configure acceptance tests. See [Connector Acceptance Tests](https://docs.airbyte.com/connector-development/testing-connectors/connector-acceptance-tests-reference) for more information. If your connector requires to create or destroy resources for use during acceptance tests create fixtures for it and place them inside integration_tests/acceptance.py. ### Dependency Management -All of your dependencies should be managed via Poetry. + +All of your dependencies should be managed via Poetry. To add a new dependency, run: + ```bash poetry add ``` @@ -78,14 +89,16 @@ poetry add Please commit the changes to `pyproject.toml` and `poetry.lock` files. ## Publishing a new version of the connector + You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? + 1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-visma-economic test` -2. 
Bump the connector version (please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors)): - - bump the `dockerImageTag` value in in `metadata.yaml` - - bump the `version` value in `pyproject.toml` +2. Bump the connector version (please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors)): + - bump the `dockerImageTag` value in in `metadata.yaml` + - bump the `version` value in `pyproject.toml` 3. Make sure the `metadata.yaml` content is up to date. 4. Make sure the connector documentation and its changelog is up to date (`docs/integrations/sources/visma-economic.md`). 5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention). 6. Pat yourself on the back for being an awesome contributor. 7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. -8. Once your PR is merged, the new version of the connector will be automatically published to Docker Hub and our connector registry. \ No newline at end of file +8. Once your PR is merged, the new version of the connector will be automatically published to Docker Hub and our connector registry. diff --git a/airbyte-integrations/connectors/source-vitally/README.md b/airbyte-integrations/connectors/source-vitally/README.md index db65f39f5b6..9439701ea88 100644 --- a/airbyte-integrations/connectors/source-vitally/README.md +++ b/airbyte-integrations/connectors/source-vitally/README.md @@ -6,6 +6,7 @@ For information about how to use this connector within Airbyte, see [the documen ## Local development #### Create credentials + **If you are a community contributor**, follow the instructions in the [documentation](https://docs.airbyte.io/integrations/sources/vitally) to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `source_vitally/spec.yaml` file. Note that any directory named `secrets` is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. @@ -16,9 +17,10 @@ and place them into `secrets/config.json`. ### Locally running the connector docker image - #### Build + **Via [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md) (recommended):** + ```bash airbyte-ci connectors --name=source-vitally build ``` @@ -26,12 +28,15 @@ airbyte-ci connectors --name=source-vitally build An image will be built with the tag `airbyte/source-vitally:dev`. **Via `docker build`:** + ```bash docker build -t airbyte/source-vitally:dev . ``` #### Run + Then run any of the connector commands as follows: + ``` docker run --rm airbyte/source-vitally:dev spec docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-vitally:dev check --config /secrets/config.json @@ -40,23 +45,30 @@ docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integrat ``` ## Testing + You can run our full test suite locally using [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md): + ```bash airbyte-ci connectors --name=source-vitally test ``` ### Customizing acceptance Tests + Customize `acceptance-test-config.yml` file to configure tests. 
See [Connector Acceptance Tests](https://docs.airbyte.com/connector-development/testing-connectors/connector-acceptance-tests-reference) for more information. If your connector requires to create or destroy resources for use during acceptance tests create fixtures for it and place them inside integration_tests/acceptance.py. ## Dependency Management + All of your dependencies should go in `setup.py`, NOT `requirements.txt`. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development. We split dependencies between two groups, dependencies that are: -* required for your connector to work need to go to `MAIN_REQUIREMENTS` list. -* required for the testing need to go to `TEST_REQUIREMENTS` list + +- required for your connector to work need to go to `MAIN_REQUIREMENTS` list. +- required for the testing need to go to `TEST_REQUIREMENTS` list ### Publishing a new version of the connector + You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? + 1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-vitally test` 2. Bump the connector version in `metadata.yaml`: increment the `dockerImageTag` value. Please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors). 3. Make sure the `metadata.yaml` content is up to date. @@ -64,4 +76,3 @@ You've checked out the repo, implemented a million dollar feature, and you're re 5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention). 6. Pat yourself on the back for being an awesome contributor. 7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. - diff --git a/airbyte-integrations/connectors/source-waiteraid/README.md b/airbyte-integrations/connectors/source-waiteraid/README.md index c7d29eb67e1..7a5c36a98d3 100644 --- a/airbyte-integrations/connectors/source-waiteraid/README.md +++ b/airbyte-integrations/connectors/source-waiteraid/README.md @@ -6,23 +6,28 @@ For information about how to use this connector within Airbyte, see [the documen ## Local development ### Prerequisites + **To iterate on this connector, make sure to complete this prerequisites section.** #### Minimum Python version required `= 3.9.0` #### Build & Activate Virtual Environment and install dependencies + From this connector directory, create a virtual environment: + ``` python -m venv .venv ``` This will generate a virtualenv for this module in `.venv/`. Make sure this venv is active in your development environment of choice. To activate it from the terminal, run: + ``` source .venv/bin/activate pip install -r requirements.txt pip install '.[tests]' ``` + If you are in an IDE, follow your IDE's instructions to activate the virtualenv. Note that while we are installing dependencies from `requirements.txt`, you should only edit `setup.py` for your dependencies. `requirements.txt` is @@ -31,6 +36,7 @@ If this is mumbo jumbo to you, don't worry about it, just put your deps in `setu should work as you expect. #### Create credentials + **If you are a community contributor**, follow the instructions in the [documentation](https://docs.airbyte.io/integrations/sources/waiteraid) to generate the necessary credentials. 
Then create a file `secrets/config.json` conforming to the `source_waiteraid/spec.yaml` file. Note that any directory named `secrets` is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. @@ -38,7 +44,9 @@ See `integration_tests/sample_config.json` for a sample config file. **If you are an Airbyte core member**, copy the credentials in Lastpass under the secret name `source waiteraid test creds` and place them into `secrets/config.json`. + ### Locally running the connector + ``` python main.py spec python main.py check --config secrets/config.json @@ -48,9 +56,10 @@ python main.py read --config secrets/config.json --catalog integration_tests/con ### Locally running the connector docker image - #### Build + **Via [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md) (recommended):** + ```bash airbyte-ci connectors --name=source-waiteraid build ``` @@ -58,12 +67,15 @@ airbyte-ci connectors --name=source-waiteraid build An image will be built with the tag `airbyte/source-waiteraid:dev`. **Via `docker build`:** + ```bash docker build -t airbyte/source-waiteraid:dev . ``` #### Run + Then run any of the connector commands as follows: + ``` docker run --rm airbyte/source-waiteraid:dev spec docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-waiteraid:dev check --config /secrets/config.json @@ -72,23 +84,30 @@ docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integrat ``` ## Testing + You can run our full test suite locally using [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md): + ```bash airbyte-ci connectors --name=source-waiteraid test ``` ### Customizing acceptance Tests + Customize `acceptance-test-config.yml` file to configure tests. See [Connector Acceptance Tests](https://docs.airbyte.com/connector-development/testing-connectors/connector-acceptance-tests-reference) for more information. If your connector requires to create or destroy resources for use during acceptance tests create fixtures for it and place them inside integration_tests/acceptance.py. ## Dependency Management + All of your dependencies should go in `setup.py`, NOT `requirements.txt`. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development. We split dependencies between two groups, dependencies that are: -* required for your connector to work need to go to `MAIN_REQUIREMENTS` list. -* required for the testing need to go to `TEST_REQUIREMENTS` list + +- required for your connector to work need to go to `MAIN_REQUIREMENTS` list. +- required for the testing need to go to `TEST_REQUIREMENTS` list ### Publishing a new version of the connector + You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? + 1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-waiteraid test` 2. Bump the connector version in `metadata.yaml`: increment the `dockerImageTag` value. Please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors). 3. Make sure the `metadata.yaml` content is up to date. @@ -96,4 +115,3 @@ You've checked out the repo, implemented a million dollar feature, and you're re 5. 
Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention). 6. Pat yourself on the back for being an awesome contributor. 7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. - diff --git a/airbyte-integrations/connectors/source-waiteraid/bootstrap.md b/airbyte-integrations/connectors/source-waiteraid/bootstrap.md index a92233214a4..55d4feae60b 100644 --- a/airbyte-integrations/connectors/source-waiteraid/bootstrap.md +++ b/airbyte-integrations/connectors/source-waiteraid/bootstrap.md @@ -2,10 +2,11 @@ Waiteraid is a REST API. Connector has the following streams, and all of them support full refresh only. -* [Bookings](https://app.waiteraid.com/api-docs/index.html#api_get_bookings) +- [Bookings](https://app.waiteraid.com/api-docs/index.html#api_get_bookings) ## Authentication + Waiteraid API offers two types of [authentication methods](https://app.waiteraid.com/api-docs/index.html#auth_call). -* API Keys - Keys are passed using HTTP Basic auth. -* Username and Password - Not supported by this connector. +- API Keys - Keys are passed using HTTP Basic auth. +- Username and Password - Not supported by this connector. diff --git a/airbyte-integrations/connectors/source-weatherstack/README.md b/airbyte-integrations/connectors/source-weatherstack/README.md index 531fb57a99e..726cb0639a6 100644 --- a/airbyte-integrations/connectors/source-weatherstack/README.md +++ b/airbyte-integrations/connectors/source-weatherstack/README.md @@ -6,23 +6,28 @@ For information about how to use this connector within Airbyte, see [the documen ## Local development ### Prerequisites + **To iterate on this connector, make sure to complete this prerequisites section.** #### Minimum Python version required `= 3.9.0` #### Build & Activate Virtual Environment and install dependencies + From this connector directory, create a virtual environment: + ``` python -m venv .venv ``` This will generate a virtualenv for this module in `.venv/`. Make sure this venv is active in your development environment of choice. To activate it from the terminal, run: + ``` source .venv/bin/activate pip install -r requirements.txt pip install '.[tests]' ``` + If you are in an IDE, follow your IDE's instructions to activate the virtualenv. Note that while we are installing dependencies from `requirements.txt`, you should only edit `setup.py` for your dependencies. `requirements.txt` is @@ -31,6 +36,7 @@ If this is mumbo jumbo to you, don't worry about it, just put your deps in `setu should work as you expect. #### Create credentials + **If you are a community contributor**, follow the instructions in the [documentation](https://docs.airbyte.io/integrations/sources/weatherstack) to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `source_weatherstack/spec.yaml` file. Note that any directory named `secrets` is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. @@ -40,6 +46,7 @@ See `integration_tests/sample_config.json` for a sample config file. and place them into `secrets/config.json`. 
### Locally running the connector + ``` python main.py spec python main.py check --config secrets/config.json @@ -49,9 +56,10 @@ python main.py read --config secrets/config.json --catalog integration_tests/con ### Locally running the connector docker image - #### Build + **Via [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md) (recommended):** + ```bash airbyte-ci connectors --name=source-weatherstack build ``` @@ -59,12 +67,15 @@ airbyte-ci connectors --name=source-weatherstack build An image will be built with the tag `airbyte/source-weatherstack:dev`. **Via `docker build`:** + ```bash docker build -t airbyte/source-weatherstack:dev . ``` #### Run + Then run any of the connector commands as follows: + ``` docker run --rm airbyte/source-weatherstack:dev spec docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-weatherstack:dev check --config /secrets/config.json @@ -73,23 +84,30 @@ docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integrat ``` ## Testing + You can run our full test suite locally using [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md): + ```bash airbyte-ci connectors --name=source-weatherstack test ``` ### Customizing acceptance Tests + Customize `acceptance-test-config.yml` file to configure tests. See [Connector Acceptance Tests](https://docs.airbyte.com/connector-development/testing-connectors/connector-acceptance-tests-reference) for more information. If your connector requires to create or destroy resources for use during acceptance tests create fixtures for it and place them inside integration_tests/acceptance.py. ## Dependency Management + All of your dependencies should go in `setup.py`, NOT `requirements.txt`. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development. We split dependencies between two groups, dependencies that are: -* required for your connector to work need to go to `MAIN_REQUIREMENTS` list. -* required for the testing need to go to `TEST_REQUIREMENTS` list + +- required for your connector to work need to go to `MAIN_REQUIREMENTS` list. +- required for the testing need to go to `TEST_REQUIREMENTS` list ### Publishing a new version of the connector + You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? + 1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-weatherstack test` 2. Bump the connector version in `metadata.yaml`: increment the `dockerImageTag` value. Please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors). 3. Make sure the `metadata.yaml` content is up to date. @@ -97,4 +115,3 @@ You've checked out the repo, implemented a million dollar feature, and you're re 5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention). 6. Pat yourself on the back for being an awesome contributor. 7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. 
- diff --git a/airbyte-integrations/connectors/source-webflow/README.md b/airbyte-integrations/connectors/source-webflow/README.md index 807cec095b4..97d414383db 100644 --- a/airbyte-integrations/connectors/source-webflow/README.md +++ b/airbyte-integrations/connectors/source-webflow/README.md @@ -10,23 +10,28 @@ A detailed tutorial has been written about this implementation. See: [Build a co ## Local development ### Prerequisites + - Webflow v1 API Key #### Minimum Python version required `= 3.9.11` #### Build & Activate Virtual Environment and install dependencies + From this connector directory, create a virtual environment: + ``` python -m venv .venv ``` This will generate a virtualenv for this module in `.venv/`. Make sure this venv is active in your development environment of choice. To activate it from the terminal, run: + ``` source .venv/bin/activate pip install -r requirements.txt pip install '.[tests]' ``` + If you are in an IDE, follow your IDE's instructions to activate the virtualenv. Note that while we are installing dependencies from `requirements.txt`, you should only edit `setup.py` for your dependencies. `requirements.txt` is @@ -35,6 +40,7 @@ If this is mumbo jumbo to you, don't worry about it, just put your deps in `setu should work as you expect. #### Create credentials + **If you are a community contributor**, follow the instructions in the [documentation](https://docs.airbyte.io/integrations/sources/webflow) to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `source_webflow/spec.yaml` file. Note that any directory named `secrets` is git-ignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. @@ -46,6 +52,7 @@ For more information about creating Webflow credentials, see [the documentation] and place them into `secrets/config.json`. ### Locally running the connector + ``` python main.py spec python main.py check --config secrets/config.json @@ -55,9 +62,10 @@ python main.py read --config secrets/config.json --catalog integration_tests/con ### Locally running the connector docker image - #### Build + **Via [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md) (recommended):** + ```bash airbyte-ci connectors --name=source-webflow build ``` @@ -65,12 +73,15 @@ airbyte-ci connectors --name=source-webflow build An image will be built with the tag `airbyte/source-webflow:dev`. **Via `docker build`:** + ```bash docker build -t airbyte/source-webflow:dev . ``` #### Run + Then run any of the connector commands as follows: + ``` docker run --rm airbyte/source-webflow:dev spec docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-webflow:dev check --config /secrets/config.json @@ -79,23 +90,30 @@ docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integrat ``` ## Testing + You can run our full test suite locally using [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md): + ```bash airbyte-ci connectors --name=source-webflow test ``` ### Customizing acceptance Tests + Customize `acceptance-test-config.yml` file to configure tests. See [Connector Acceptance Tests](https://docs.airbyte.com/connector-development/testing-connectors/connector-acceptance-tests-reference) for more information. 
If your connector requires to create or destroy resources for use during acceptance tests create fixtures for it and place them inside integration_tests/acceptance.py. ## Dependency Management + All of your dependencies should go in `setup.py`, NOT `requirements.txt`. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development. We split dependencies between two groups, dependencies that are: -* required for your connector to work need to go to `MAIN_REQUIREMENTS` list. -* required for the testing need to go to `TEST_REQUIREMENTS` list + +- required for your connector to work need to go to `MAIN_REQUIREMENTS` list. +- required for the testing need to go to `TEST_REQUIREMENTS` list ### Publishing a new version of the connector + You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? + 1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-webflow test` 2. Bump the connector version in `metadata.yaml`: increment the `dockerImageTag` value. Please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors). 3. Make sure the `metadata.yaml` content is up to date. @@ -103,4 +121,3 @@ You've checked out the repo, implemented a million dollar feature, and you're re 5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention). 6. Pat yourself on the back for being an awesome contributor. 7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. - diff --git a/airbyte-integrations/connectors/source-whisky-hunter/README.md b/airbyte-integrations/connectors/source-whisky-hunter/README.md index 48a699198d2..6a96e5d8d9f 100644 --- a/airbyte-integrations/connectors/source-whisky-hunter/README.md +++ b/airbyte-integrations/connectors/source-whisky-hunter/README.md @@ -6,6 +6,7 @@ For information about how to use this connector within Airbyte, see [the documen ## Local development #### Create credentials + **If you are a community contributor**, follow the instructions in the [documentation](https://docs.airbyte.io/integrations/sources/whisky-hunter) to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `source_whisky_hunter/spec.yaml` file. Note that any directory named `secrets` is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. @@ -16,9 +17,10 @@ and place them into `secrets/config.json`. ### Locally running the connector docker image - #### Build + **Via [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md) (recommended):** + ```bash airbyte-ci connectors --name=source-whisky-hunter build ``` @@ -26,12 +28,15 @@ airbyte-ci connectors --name=source-whisky-hunter build An image will be built with the tag `airbyte/source-whisky-hunter:dev`. **Via `docker build`:** + ```bash docker build -t airbyte/source-whisky-hunter:dev . 
``` #### Run + Then run any of the connector commands as follows: + ``` docker run --rm airbyte/source-whisky-hunter:dev spec docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-whisky-hunter:dev check --config /secrets/config.json @@ -40,23 +45,30 @@ docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integrat ``` ## Testing + You can run our full test suite locally using [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md): + ```bash airbyte-ci connectors --name=source-whisky-hunter test ``` ### Customizing acceptance Tests + Customize `acceptance-test-config.yml` file to configure tests. See [Connector Acceptance Tests](https://docs.airbyte.com/connector-development/testing-connectors/connector-acceptance-tests-reference) for more information. If your connector requires to create or destroy resources for use during acceptance tests create fixtures for it and place them inside integration_tests/acceptance.py. ## Dependency Management + All of your dependencies should go in `setup.py`, NOT `requirements.txt`. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development. We split dependencies between two groups, dependencies that are: -* required for your connector to work need to go to `MAIN_REQUIREMENTS` list. -* required for the testing need to go to `TEST_REQUIREMENTS` list + +- required for your connector to work need to go to `MAIN_REQUIREMENTS` list. +- required for the testing need to go to `TEST_REQUIREMENTS` list ### Publishing a new version of the connector + You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? + 1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-whisky-hunter test` 2. Bump the connector version in `metadata.yaml`: increment the `dockerImageTag` value. Please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors). 3. Make sure the `metadata.yaml` content is up to date. @@ -64,4 +76,3 @@ You've checked out the repo, implemented a million dollar feature, and you're re 5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention). 6. Pat yourself on the back for being an awesome contributor. 7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. - diff --git a/airbyte-integrations/connectors/source-whisky-hunter/bootstrap.md b/airbyte-integrations/connectors/source-whisky-hunter/bootstrap.md index 472c1c2709e..f5c3758564b 100644 --- a/airbyte-integrations/connectors/source-whisky-hunter/bootstrap.md +++ b/airbyte-integrations/connectors/source-whisky-hunter/bootstrap.md @@ -3,18 +3,20 @@ [Whisky Hunter](https://whiskyhunter.net/api/) is an API. Connector is implemented with the [Airbyte Low-Code CDK](https://docs.airbyte.com/connector-development/config-based/low-code-cdk-overview). Connector supports the following three streams: -* `auctions_data` - * Provides stats about specific auctions. -* `auctions_info` - * Provides information and metadata about recurring and one-off auctions. -* `distilleries_info` - * Provides information about distilleries. + +- `auctions_data` + - Provides stats about specific auctions. 
+- `auctions_info` + - Provides information and metadata about recurring and one-off auctions. +- `distilleries_info` + - Provides information about distilleries. Rate Limiting: -* No published rate limit. + +- No published rate limit. Authentication and Permissions: -* No authentication. +- No authentication. See [this](https://docs.airbyte.io/integrations/sources/whisky-hunter) link for the connector docs. diff --git a/airbyte-integrations/connectors/source-wikipedia-pageviews/README.md b/airbyte-integrations/connectors/source-wikipedia-pageviews/README.md index 6ffa64ddd6e..eae97dbc1ed 100755 --- a/airbyte-integrations/connectors/source-wikipedia-pageviews/README.md +++ b/airbyte-integrations/connectors/source-wikipedia-pageviews/README.md @@ -6,6 +6,7 @@ For information about how to use this connector within Airbyte, see [the documen ## Local development #### Create credentials + **If you are a community contributor**, follow the instructions in the [documentation](https://docs.airbyte.io/integrations/sources/wikipedia-pageviews) to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `source_wikipedia_pageviews/spec.yaml` file. Note that any directory named `secrets` is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. @@ -16,9 +17,10 @@ and place them into `secrets/config.json`. ### Locally running the connector docker image - #### Build + **Via [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md) (recommended):** + ```bash airbyte-ci connectors --name=source-wikipedia-pageviews build ``` @@ -26,12 +28,15 @@ airbyte-ci connectors --name=source-wikipedia-pageviews build An image will be built with the tag `airbyte/source-wikipedia-pageviews:dev`. **Via `docker build`:** + ```bash docker build -t airbyte/source-wikipedia-pageviews:dev . ``` #### Run + Then run any of the connector commands as follows: + ``` docker run --rm airbyte/source-wikipedia-pageviews:dev spec docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-wikipedia-pageviews:dev check --config /secrets/config.json @@ -40,23 +45,30 @@ docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integrat ``` ## Testing + You can run our full test suite locally using [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md): + ```bash airbyte-ci connectors --name=source-wikipedia-pageviews test ``` ### Customizing acceptance Tests + Customize `acceptance-test-config.yml` file to configure tests. See [Connector Acceptance Tests](https://docs.airbyte.com/connector-development/testing-connectors/connector-acceptance-tests-reference) for more information. If your connector requires to create or destroy resources for use during acceptance tests create fixtures for it and place them inside integration_tests/acceptance.py. ## Dependency Management + All of your dependencies should go in `setup.py`, NOT `requirements.txt`. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development. We split dependencies between two groups, dependencies that are: -* required for your connector to work need to go to `MAIN_REQUIREMENTS` list. -* required for the testing need to go to `TEST_REQUIREMENTS` list + +- required for your connector to work need to go to `MAIN_REQUIREMENTS` list. 
+- required for the testing need to go to `TEST_REQUIREMENTS` list ### Publishing a new version of the connector + You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? + 1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-wikipedia-pageviews test` 2. Bump the connector version in `metadata.yaml`: increment the `dockerImageTag` value. Please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors). 3. Make sure the `metadata.yaml` content is up to date. @@ -64,4 +76,3 @@ You've checked out the repo, implemented a million dollar feature, and you're re 5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention). 6. Pat yourself on the back for being an awesome contributor. 7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. - diff --git a/airbyte-integrations/connectors/source-woocommerce/README.md b/airbyte-integrations/connectors/source-woocommerce/README.md index 153cbfd90c5..b69eb610d45 100644 --- a/airbyte-integrations/connectors/source-woocommerce/README.md +++ b/airbyte-integrations/connectors/source-woocommerce/README.md @@ -6,6 +6,7 @@ For information about how to use this connector within Airbyte, see [the documen ## Local development #### Create credentials + **If you are a community contributor**, follow the instructions in the [documentation](https://docs.airbyte.io/integrations/sources/woocommerce) to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `source_woocommerce/spec.yaml` file. Note that any directory named `secrets` is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. @@ -16,9 +17,10 @@ and place them into `secrets/config.json`. ### Locally running the connector docker image - #### Build + **Via [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md) (recommended):** + ```bash airbyte-ci connectors --name=source-woocommerce build ``` @@ -26,12 +28,15 @@ airbyte-ci connectors --name=source-woocommerce build An image will be built with the tag `airbyte/source-woocommerce:dev`. **Via `docker build`:** + ```bash docker build -t airbyte/source-woocommerce:dev . ``` #### Run + Then run any of the connector commands as follows: + ``` docker run --rm airbyte/source-woocommerce:dev spec docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-woocommerce:dev check --config /secrets/config.json @@ -40,23 +45,30 @@ docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integrat ``` ## Testing + You can run our full test suite locally using [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md): + ```bash airbyte-ci connectors --name=source-woocommerce test ``` ### Customizing acceptance Tests + Customize `acceptance-test-config.yml` file to configure tests. See [Connector Acceptance Tests](https://docs.airbyte.com/connector-development/testing-connectors/connector-acceptance-tests-reference) for more information. If your connector requires to create or destroy resources for use during acceptance tests create fixtures for it and place them inside integration_tests/acceptance.py. 
## Dependency Management + All of your dependencies should go in `setup.py`, NOT `requirements.txt`. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development. We split dependencies between two groups, dependencies that are: -* required for your connector to work need to go to `MAIN_REQUIREMENTS` list. -* required for the testing need to go to `TEST_REQUIREMENTS` list + +- required for your connector to work need to go to `MAIN_REQUIREMENTS` list. +- required for the testing need to go to `TEST_REQUIREMENTS` list ### Publishing a new version of the connector + You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? + 1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-woocommerce test` 2. Bump the connector version in `metadata.yaml`: increment the `dockerImageTag` value. Please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors). 3. Make sure the `metadata.yaml` content is up to date. @@ -64,4 +76,3 @@ You've checked out the repo, implemented a million dollar feature, and you're re 5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention). 6. Pat yourself on the back for being an awesome contributor. 7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. - diff --git a/airbyte-integrations/connectors/source-workable/README.md b/airbyte-integrations/connectors/source-workable/README.md index 183ef638217..b7d2af7eabd 100644 --- a/airbyte-integrations/connectors/source-workable/README.md +++ b/airbyte-integrations/connectors/source-workable/README.md @@ -6,6 +6,7 @@ For information about how to use this connector within Airbyte, see [the documen ## Local development #### Create credentials + **If you are a community contributor**, follow the instructions in the [documentation](https://docs.airbyte.io/integrations/sources/workable) to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `source_workable/spec.yaml` file. Note that any directory named `secrets` is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. @@ -16,9 +17,10 @@ and place them into `secrets/config.json`. ### Locally running the connector docker image - #### Build + **Via [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md) (recommended):** + ```bash airbyte-ci connectors --name=source-workable build ``` @@ -26,12 +28,15 @@ airbyte-ci connectors --name=source-workable build An image will be built with the tag `airbyte/source-workable:dev`. **Via `docker build`:** + ```bash docker build -t airbyte/source-workable:dev . 
``` #### Run + Then run any of the connector commands as follows: + ``` docker run --rm airbyte/source-workable:dev spec docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-workable:dev check --config /secrets/config.json @@ -40,23 +45,30 @@ docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integrat ``` ## Testing + You can run our full test suite locally using [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md): + ```bash airbyte-ci connectors --name=source-workable test ``` ### Customizing acceptance Tests + Customize `acceptance-test-config.yml` file to configure tests. See [Connector Acceptance Tests](https://docs.airbyte.com/connector-development/testing-connectors/connector-acceptance-tests-reference) for more information. If your connector requires to create or destroy resources for use during acceptance tests create fixtures for it and place them inside integration_tests/acceptance.py. ## Dependency Management + All of your dependencies should go in `setup.py`, NOT `requirements.txt`. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development. We split dependencies between two groups, dependencies that are: -* required for your connector to work need to go to `MAIN_REQUIREMENTS` list. -* required for the testing need to go to `TEST_REQUIREMENTS` list + +- required for your connector to work need to go to `MAIN_REQUIREMENTS` list. +- required for the testing need to go to `TEST_REQUIREMENTS` list ### Publishing a new version of the connector + You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? + 1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-workable test` 2. Bump the connector version in `metadata.yaml`: increment the `dockerImageTag` value. Please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors). 3. Make sure the `metadata.yaml` content is up to date. @@ -64,4 +76,3 @@ You've checked out the repo, implemented a million dollar feature, and you're re 5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention). 6. Pat yourself on the back for being an awesome contributor. 7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. - diff --git a/airbyte-integrations/connectors/source-workramp/README.md b/airbyte-integrations/connectors/source-workramp/README.md index 3382c28a060..77e2224222f 100644 --- a/airbyte-integrations/connectors/source-workramp/README.md +++ b/airbyte-integrations/connectors/source-workramp/README.md @@ -6,6 +6,7 @@ For information about how to use this connector within Airbyte, see [the documen ## Local development #### Create credentials + **If you are a community contributor**, follow the instructions in the [documentation](https://docs.airbyte.io/integrations/sources/workramp) to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `source_workramp/spec.yaml` file. Note that any directory named `secrets` is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. @@ -16,9 +17,10 @@ and place them into `secrets/config.json`. 
### Locally running the connector docker image - #### Build + **Via [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md) (recommended):** + ```bash airbyte-ci connectors --name=source-workramp build ``` @@ -26,12 +28,15 @@ airbyte-ci connectors --name=source-workramp build An image will be built with the tag `airbyte/source-workramp:dev`. **Via `docker build`:** + ```bash docker build -t airbyte/source-workramp:dev . ``` #### Run + Then run any of the connector commands as follows: + ``` docker run --rm airbyte/source-workramp:dev spec docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-workramp:dev check --config /secrets/config.json @@ -40,23 +45,30 @@ docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integrat ``` ## Testing + You can run our full test suite locally using [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md): + ```bash airbyte-ci connectors --name=source-workramp test ``` ### Customizing acceptance Tests + Customize `acceptance-test-config.yml` file to configure tests. See [Connector Acceptance Tests](https://docs.airbyte.com/connector-development/testing-connectors/connector-acceptance-tests-reference) for more information. If your connector requires to create or destroy resources for use during acceptance tests create fixtures for it and place them inside integration_tests/acceptance.py. ## Dependency Management + All of your dependencies should go in `setup.py`, NOT `requirements.txt`. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development. We split dependencies between two groups, dependencies that are: -* required for your connector to work need to go to `MAIN_REQUIREMENTS` list. -* required for the testing need to go to `TEST_REQUIREMENTS` list + +- required for your connector to work need to go to `MAIN_REQUIREMENTS` list. +- required for the testing need to go to `TEST_REQUIREMENTS` list ### Publishing a new version of the connector + You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? + 1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-workramp test` 2. Bump the connector version in `metadata.yaml`: increment the `dockerImageTag` value. Please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors). 3. Make sure the `metadata.yaml` content is up to date. @@ -64,4 +76,3 @@ You've checked out the repo, implemented a million dollar feature, and you're re 5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention). 6. Pat yourself on the back for being an awesome contributor. 7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. 
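For the `setup.py` split described under Dependency Management above, a minimal sketch of how the two lists are typically wired into setuptools is shown below; the package name and the pinned dependencies are placeholders rather than values from any specific connector.

```python
# setup.py — sketch of the MAIN_REQUIREMENTS / TEST_REQUIREMENTS split.
# Runtime dependencies go to install_requires; test-only dependencies are exposed
# through the "tests" extra, which is what `pip install '.[tests]'` pulls in.
from setuptools import find_packages, setup

MAIN_REQUIREMENTS = [
    "airbyte-cdk",  # placeholder: pin the CDK version your connector actually needs
]

TEST_REQUIREMENTS = [
    "pytest",
    "requests-mock",  # placeholder: whatever your unit tests rely on
]

setup(
    name="source_example",  # placeholder package name
    packages=find_packages(),
    install_requires=MAIN_REQUIREMENTS,
    extras_require={"tests": TEST_REQUIREMENTS},
)
```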
- diff --git a/airbyte-integrations/connectors/source-wrike/README.md b/airbyte-integrations/connectors/source-wrike/README.md index 25b2da5fd3c..de7963c05cc 100644 --- a/airbyte-integrations/connectors/source-wrike/README.md +++ b/airbyte-integrations/connectors/source-wrike/README.md @@ -6,6 +6,7 @@ For information about how to use this connector within Airbyte, see [the documen ## Local development #### Create credentials + **If you are a community contributor**, follow the instructions in the [documentation](https://docs.airbyte.com/integrations/sources/wrike) to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `source_wrike/spec.yaml` file. Note that any directory named `secrets` is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. @@ -16,9 +17,10 @@ and place them into `secrets/config.json`. ### Locally running the connector docker image - #### Build + **Via [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md) (recommended):** + ```bash airbyte-ci connectors --name=source-wrike build ``` @@ -26,12 +28,15 @@ airbyte-ci connectors --name=source-wrike build An image will be built with the tag `airbyte/source-wrike:dev`. **Via `docker build`:** + ```bash docker build -t airbyte/source-wrike:dev . ``` #### Run + Then run any of the connector commands as follows: + ``` docker run --rm airbyte/source-wrike:dev spec docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-wrike:dev check --config /secrets/config.json @@ -40,23 +45,30 @@ docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integrat ``` ## Testing + You can run our full test suite locally using [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md): + ```bash airbyte-ci connectors --name=source-wrike test ``` ### Customizing acceptance Tests + Customize `acceptance-test-config.yml` file to configure tests. See [Connector Acceptance Tests](https://docs.airbyte.com/connector-development/testing-connectors/connector-acceptance-tests-reference) for more information. If your connector requires to create or destroy resources for use during acceptance tests create fixtures for it and place them inside integration_tests/acceptance.py. ## Dependency Management + All of your dependencies should go in `setup.py`, NOT `requirements.txt`. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development. We split dependencies between two groups, dependencies that are: -* required for your connector to work need to go to `MAIN_REQUIREMENTS` list. -* required for the testing need to go to `TEST_REQUIREMENTS` list + +- required for your connector to work need to go to `MAIN_REQUIREMENTS` list. +- required for the testing need to go to `TEST_REQUIREMENTS` list ### Publishing a new version of the connector + You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? + 1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-wrike test` 2. Bump the connector version in `metadata.yaml`: increment the `dockerImageTag` value. Please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors). 3. Make sure the `metadata.yaml` content is up to date. 
@@ -64,4 +76,3 @@ You've checked out the repo, implemented a million dollar feature, and you're re 5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention). 6. Pat yourself on the back for being an awesome contributor. 7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. - diff --git a/airbyte-integrations/connectors/source-xero/README.md b/airbyte-integrations/connectors/source-xero/README.md index 5fb499eaa44..24824bd5544 100644 --- a/airbyte-integrations/connectors/source-xero/README.md +++ b/airbyte-integrations/connectors/source-xero/README.md @@ -6,23 +6,28 @@ For information about how to use this connector within Airbyte, see [the documen ## Local development ### Prerequisites + **To iterate on this connector, make sure to complete this prerequisites section.** #### Minimum Python version required `= 3.9.0` #### Build & Activate Virtual Environment and install dependencies + From this connector directory, create a virtual environment: + ``` python -m venv .venv ``` This will generate a virtualenv for this module in `.venv/`. Make sure this venv is active in your development environment of choice. To activate it from the terminal, run: + ``` source .venv/bin/activate pip install -r requirements.txt pip install '.[tests]' ``` + If you are in an IDE, follow your IDE's instructions to activate the virtualenv. Note that while we are installing dependencies from `requirements.txt`, you should only edit `setup.py` for your dependencies. `requirements.txt` is @@ -31,6 +36,7 @@ If this is mumbo jumbo to you, don't worry about it, just put your deps in `setu should work as you expect. #### Create credentials + **If you are a community contributor**, follow the instructions in the [documentation](https://docs.airbyte.io/integrations/sources/xero) to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `source_xero/spec.yaml` file. Note that any directory named `secrets` is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. @@ -40,6 +46,7 @@ See `integration_tests/sample_config.json` for a sample config file. and place them into `secrets/config.json`. ### Locally running the connector + ``` python main.py spec python main.py check --config secrets/config.json @@ -49,9 +56,10 @@ python main.py read --config secrets/config.json --catalog integration_tests/con ### Locally running the connector docker image - #### Build + **Via [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md) (recommended):** + ```bash airbyte-ci connectors --name=source-xero build ``` @@ -59,12 +67,15 @@ airbyte-ci connectors --name=source-xero build An image will be built with the tag `airbyte/source-xero:dev`. **Via `docker build`:** + ```bash docker build -t airbyte/source-xero:dev . 
``` #### Run + Then run any of the connector commands as follows: + ``` docker run --rm airbyte/source-xero:dev spec docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-xero:dev check --config /secrets/config.json @@ -73,23 +84,30 @@ docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integrat ``` ## Testing + You can run our full test suite locally using [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md): + ```bash airbyte-ci connectors --name=source-xero test ``` ### Customizing acceptance Tests + Customize `acceptance-test-config.yml` file to configure tests. See [Connector Acceptance Tests](https://docs.airbyte.com/connector-development/testing-connectors/connector-acceptance-tests-reference) for more information. If your connector requires to create or destroy resources for use during acceptance tests create fixtures for it and place them inside integration_tests/acceptance.py. ## Dependency Management + All of your dependencies should go in `setup.py`, NOT `requirements.txt`. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development. We split dependencies between two groups, dependencies that are: -* required for your connector to work need to go to `MAIN_REQUIREMENTS` list. -* required for the testing need to go to `TEST_REQUIREMENTS` list + +- required for your connector to work need to go to `MAIN_REQUIREMENTS` list. +- required for the testing need to go to `TEST_REQUIREMENTS` list ### Publishing a new version of the connector + You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? + 1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-xero test` 2. Bump the connector version in `metadata.yaml`: increment the `dockerImageTag` value. Please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors). 3. Make sure the `metadata.yaml` content is up to date. @@ -97,4 +115,3 @@ You've checked out the repo, implemented a million dollar feature, and you're re 5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention). 6. Pat yourself on the back for being an awesome contributor. 7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. - diff --git a/airbyte-integrations/connectors/source-xkcd/README.md b/airbyte-integrations/connectors/source-xkcd/README.md index 19c7ae82139..cadc1108c1d 100644 --- a/airbyte-integrations/connectors/source-xkcd/README.md +++ b/airbyte-integrations/connectors/source-xkcd/README.md @@ -6,23 +6,28 @@ For information about how to use this connector within Airbyte, see [the documen ## Local development ### Prerequisites + **To iterate on this connector, make sure to complete this prerequisites section.** #### Minimum Python version required `= 3.9.0` #### Build & Activate Virtual Environment and install dependencies + From this connector directory, create a virtual environment: + ``` python3 -m venv .venv ``` This will generate a virtualenv for this module in `.venv/`. Make sure this venv is active in your development environment of choice. 
To activate it from the terminal, run: + ``` source .venv/bin/activate pip install -r requirements.txt pip install '.[tests]' ``` + If you are in an IDE, follow your IDE's instructions to activate the virtualenv. Note that while we are installing dependencies from `requirements.txt`, you should only edit `setup.py` for your dependencies. `requirements.txt` is @@ -31,6 +36,7 @@ If this is mumbo jumbo to you, don't worry about it, just put your deps in `setu should work as you expect. #### Create credentials + **If you are a community contributor**, follow the instructions in the [documentation](https://docs.airbyte.io/integrations/sources/xkcd) to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `source_xkcd/spec.yaml` file. Note that any directory named `secrets` is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. @@ -40,6 +46,7 @@ See `integration_tests/sample_config.json` for a sample config file. and place them into `secrets/config.json`. ### Locally running the connector + ``` python main.py spec python main.py check --config secrets/config.json @@ -49,9 +56,10 @@ python main.py read --config secrets/config.json --catalog integration_tests/con ### Locally running the connector docker image - #### Build + **Via [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md) (recommended):** + ```bash airbyte-ci connectors --name=source-xkcd build ``` @@ -59,12 +67,15 @@ airbyte-ci connectors --name=source-xkcd build An image will be built with the tag `airbyte/source-xkcd:dev`. **Via `docker build`:** + ```bash docker build -t airbyte/source-xkcd:dev . ``` #### Run + Then run any of the connector commands as follows: + ``` docker run --rm airbyte/source-xkcd:dev spec docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-xkcd:dev check --config /secrets/config.json @@ -73,23 +84,30 @@ docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integrat ``` ## Testing + You can run our full test suite locally using [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md): + ```bash airbyte-ci connectors --name=source-xkcd test ``` ### Customizing acceptance Tests + Customize `acceptance-test-config.yml` file to configure tests. See [Connector Acceptance Tests](https://docs.airbyte.com/connector-development/testing-connectors/connector-acceptance-tests-reference) for more information. If your connector requires to create or destroy resources for use during acceptance tests create fixtures for it and place them inside integration_tests/acceptance.py. ## Dependency Management + All of your dependencies should go in `setup.py`, NOT `requirements.txt`. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development. We split dependencies between two groups, dependencies that are: -* required for your connector to work need to go to `MAIN_REQUIREMENTS` list. -* required for the testing need to go to `TEST_REQUIREMENTS` list + +- required for your connector to work need to go to `MAIN_REQUIREMENTS` list. +- required for the testing need to go to `TEST_REQUIREMENTS` list ### Publishing a new version of the connector + You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? + 1. 
Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-xkcd test` 2. Bump the connector version in `metadata.yaml`: increment the `dockerImageTag` value. Please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors). 3. Make sure the `metadata.yaml` content is up to date. @@ -97,4 +115,3 @@ You've checked out the repo, implemented a million dollar feature, and you're re 5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention). 6. Pat yourself on the back for being an awesome contributor. 7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. - diff --git a/airbyte-integrations/connectors/source-xkcd/bootstrap.md b/airbyte-integrations/connectors/source-xkcd/bootstrap.md index 89e30b2b46e..3bf6166b053 100644 --- a/airbyte-integrations/connectors/source-xkcd/bootstrap.md +++ b/airbyte-integrations/connectors/source-xkcd/bootstrap.md @@ -11,7 +11,7 @@ xkcd API has only one endpoint that responds with the comic metadata. ## Quick Notes - This is an open API, which means no credentials are necessary to access this data. -- This API doesn't accept query strings or POST params. The only way to iterate over the comics is through different paths, passing the comic number (https://xkcd.com/{comic_num}/json.html). +- This API doesn't accept query strings or POST params. The only way to iterate over the comics is through different paths, passing the comic number (https://xkcd.com/{comic_num}/json.html). ## API Reference diff --git a/airbyte-integrations/connectors/source-yahoo-finance-price/README.md b/airbyte-integrations/connectors/source-yahoo-finance-price/README.md index a194b44c200..027b60ce472 100644 --- a/airbyte-integrations/connectors/source-yahoo-finance-price/README.md +++ b/airbyte-integrations/connectors/source-yahoo-finance-price/README.md @@ -1,31 +1,32 @@ # Yahoo-Finance-Price source connector - This is the repository for the Yahoo-Finance-Price source connector, written in Python. For information about how to use this connector within Airbyte, see [the documentation](https://docs.airbyte.com/integrations/sources/yahoo-finance-price). ## Local development ### Prerequisites -* Python (~=3.9) -* Poetry (~=1.7) - installation instructions [here](https://python-poetry.org/docs/#installation) +- Python (~=3.9) +- Poetry (~=1.7) - installation instructions [here](https://python-poetry.org/docs/#installation) ### Installing the connector + From this connector directory, run: + ```bash poetry install --with dev ``` - ### Create credentials + **If you are a community contributor**, follow the instructions in the [documentation](https://docs.airbyte.com/integrations/sources/yahoo-finance-price) to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `source_yahoo_finance_price/spec.yaml` file. Note that any directory named `secrets` is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. See `sample_files/sample_config.json` for a sample config file. 
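If you want a quick sanity check that your `secrets/config.json` actually conforms to the connector spec before running anything, a small standalone sketch is below. It assumes the usual Airbyte spec layout (a `connectionSpecification` JSON Schema inside `spec.yaml`) and that `PyYAML` and `jsonschema` are available in your environment; neither is required by the workflow above.

```python
# Sketch: validate secrets/config.json against the connector's spec.yaml.
import json
from pathlib import Path

import yaml  # PyYAML (assumed installed)
from jsonschema import validate  # jsonschema package (assumed installed)

spec = yaml.safe_load(Path("source_yahoo_finance_price/spec.yaml").read_text())
config = json.loads(Path("secrets/config.json").read_text())

# connectionSpecification is the JSON Schema the config file must satisfy.
validate(instance=config, schema=spec["connectionSpecification"])
print("secrets/config.json conforms to spec.yaml")
```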
- ### Locally running the connector + ``` poetry run source-yahoo-finance-price spec poetry run source-yahoo-finance-price check --config secrets/config.json @@ -34,23 +35,28 @@ poetry run source-yahoo-finance-price read --config secrets/config.json --catalo ``` ### Running unit tests + To run unit tests locally, from the connector directory run: + ``` poetry run pytest unit_tests ``` ### Building the docker image + 1. Install [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md) 2. Run the following command to build the docker image: + ```bash airbyte-ci connectors --name=source-yahoo-finance-price build ``` An image will be available on your host with the tag `airbyte/source-yahoo-finance-price:dev`. - ### Running as a docker container + Then run any of the connector commands as follows: + ``` docker run --rm airbyte/source-yahoo-finance-price:dev spec docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-yahoo-finance-price:dev check --config /secrets/config.json @@ -59,18 +65,23 @@ docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integrat ``` ### Running our CI test suite + You can run our full test suite locally using [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md): + ```bash airbyte-ci connectors --name=source-yahoo-finance-price test ``` ### Customizing acceptance Tests + Customize `acceptance-test-config.yml` file to configure acceptance tests. See [Connector Acceptance Tests](https://docs.airbyte.com/connector-development/testing-connectors/connector-acceptance-tests-reference) for more information. If your connector requires to create or destroy resources for use during acceptance tests create fixtures for it and place them inside integration_tests/acceptance.py. ### Dependency Management -All of your dependencies should be managed via Poetry. + +All of your dependencies should be managed via Poetry. To add a new dependency, run: + ```bash poetry add ``` @@ -78,14 +89,16 @@ poetry add Please commit the changes to `pyproject.toml` and `poetry.lock` files. ## Publishing a new version of the connector + You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? + 1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-yahoo-finance-price test` -2. Bump the connector version (please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors)): - - bump the `dockerImageTag` value in in `metadata.yaml` - - bump the `version` value in `pyproject.toml` +2. Bump the connector version (please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors)): + - bump the `dockerImageTag` value in in `metadata.yaml` + - bump the `version` value in `pyproject.toml` 3. Make sure the `metadata.yaml` content is up to date. 4. Make sure the connector documentation and its changelog is up to date (`docs/integrations/sources/yahoo-finance-price.md`). 5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention). 6. Pat yourself on the back for being an awesome contributor. 7. 
Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. -8. Once your PR is merged, the new version of the connector will be automatically published to Docker Hub and our connector registry. \ No newline at end of file +8. Once your PR is merged, the new version of the connector will be automatically published to Docker Hub and our connector registry. diff --git a/airbyte-integrations/connectors/source-yandex-metrica/README.md b/airbyte-integrations/connectors/source-yandex-metrica/README.md index e9c882e3590..eac547ac4e1 100644 --- a/airbyte-integrations/connectors/source-yandex-metrica/README.md +++ b/airbyte-integrations/connectors/source-yandex-metrica/README.md @@ -1,31 +1,32 @@ # Yandex-Metrica source connector - This is the repository for the Yandex-Metrica source connector, written in Python. For information about how to use this connector within Airbyte, see [the documentation](https://docs.airbyte.com/integrations/sources/yandex-metrica). ## Local development ### Prerequisites -* Python (~=3.9) -* Poetry (~=1.7) - installation instructions [here](https://python-poetry.org/docs/#installation) +- Python (~=3.9) +- Poetry (~=1.7) - installation instructions [here](https://python-poetry.org/docs/#installation) ### Installing the connector + From this connector directory, run: + ```bash poetry install --with dev ``` - ### Create credentials + **If you are a community contributor**, follow the instructions in the [documentation](https://docs.airbyte.com/integrations/sources/yandex-metrica) to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `source_yandex_metrica/spec.yaml` file. Note that any directory named `secrets` is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. See `sample_files/sample_config.json` for a sample config file. - ### Locally running the connector + ``` poetry run source-yandex-metrica spec poetry run source-yandex-metrica check --config secrets/config.json @@ -34,23 +35,28 @@ poetry run source-yandex-metrica read --config secrets/config.json --catalog sam ``` ### Running unit tests + To run unit tests locally, from the connector directory run: + ``` poetry run pytest unit_tests ``` ### Building the docker image + 1. Install [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md) 2. Run the following command to build the docker image: + ```bash airbyte-ci connectors --name=source-yandex-metrica build ``` An image will be available on your host with the tag `airbyte/source-yandex-metrica:dev`. - ### Running as a docker container + Then run any of the connector commands as follows: + ``` docker run --rm airbyte/source-yandex-metrica:dev spec docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-yandex-metrica:dev check --config /secrets/config.json @@ -59,18 +65,23 @@ docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integrat ``` ### Running our CI test suite + You can run our full test suite locally using [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md): + ```bash airbyte-ci connectors --name=source-yandex-metrica test ``` ### Customizing acceptance Tests + Customize `acceptance-test-config.yml` file to configure acceptance tests. 
See [Connector Acceptance Tests](https://docs.airbyte.com/connector-development/testing-connectors/connector-acceptance-tests-reference) for more information. If your connector requires to create or destroy resources for use during acceptance tests create fixtures for it and place them inside integration_tests/acceptance.py. ### Dependency Management -All of your dependencies should be managed via Poetry. + +All of your dependencies should be managed via Poetry. To add a new dependency, run: + ```bash poetry add ``` @@ -78,14 +89,16 @@ poetry add Please commit the changes to `pyproject.toml` and `poetry.lock` files. ## Publishing a new version of the connector + You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? + 1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-yandex-metrica test` -2. Bump the connector version (please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors)): - - bump the `dockerImageTag` value in in `metadata.yaml` - - bump the `version` value in `pyproject.toml` +2. Bump the connector version (please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors)): + - bump the `dockerImageTag` value in in `metadata.yaml` + - bump the `version` value in `pyproject.toml` 3. Make sure the `metadata.yaml` content is up to date. 4. Make sure the connector documentation and its changelog is up to date (`docs/integrations/sources/yandex-metrica.md`). 5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention). 6. Pat yourself on the back for being an awesome contributor. 7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. -8. Once your PR is merged, the new version of the connector will be automatically published to Docker Hub and our connector registry. \ No newline at end of file +8. Once your PR is merged, the new version of the connector will be automatically published to Docker Hub and our connector registry. diff --git a/airbyte-integrations/connectors/source-yotpo/README.md b/airbyte-integrations/connectors/source-yotpo/README.md index a0454fb315c..1ad2649859a 100644 --- a/airbyte-integrations/connectors/source-yotpo/README.md +++ b/airbyte-integrations/connectors/source-yotpo/README.md @@ -6,6 +6,7 @@ For information about how to use this connector within Airbyte, see [the documen ## Local development #### Create credentials + **If you are a community contributor**, follow the instructions in the [documentation](https://docs.airbyte.com/integrations/sources/yotpo) to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `source_yotpo/spec.yaml` file. Note that any directory named `secrets` is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. @@ -16,9 +17,10 @@ and place them into `secrets/config.json`. 
### Locally running the connector docker image - #### Build + **Via [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md) (recommended):** + ```bash airbyte-ci connectors --name=source-yotpo build ``` @@ -26,12 +28,15 @@ airbyte-ci connectors --name=source-yotpo build An image will be built with the tag `airbyte/source-yotpo:dev`. **Via `docker build`:** + ```bash docker build -t airbyte/source-yotpo:dev . ``` #### Run + Then run any of the connector commands as follows: + ``` docker run --rm airbyte/source-yotpo:dev spec docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-yotpo:dev check --config /secrets/config.json @@ -40,23 +45,30 @@ docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integrat ``` ## Testing + You can run our full test suite locally using [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md): + ```bash airbyte-ci connectors --name=source-yotpo test ``` ### Customizing acceptance Tests + Customize `acceptance-test-config.yml` file to configure tests. See [Connector Acceptance Tests](https://docs.airbyte.com/connector-development/testing-connectors/connector-acceptance-tests-reference) for more information. If your connector requires to create or destroy resources for use during acceptance tests create fixtures for it and place them inside integration_tests/acceptance.py. ## Dependency Management + All of your dependencies should go in `setup.py`, NOT `requirements.txt`. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development. We split dependencies between two groups, dependencies that are: -* required for your connector to work need to go to `MAIN_REQUIREMENTS` list. -* required for the testing need to go to `TEST_REQUIREMENTS` list + +- required for your connector to work need to go to `MAIN_REQUIREMENTS` list. +- required for the testing need to go to `TEST_REQUIREMENTS` list ### Publishing a new version of the connector + You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? + 1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-yotpo test` 2. Bump the connector version in `metadata.yaml`: increment the `dockerImageTag` value. Please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors). 3. Make sure the `metadata.yaml` content is up to date. @@ -64,4 +76,3 @@ You've checked out the repo, implemented a million dollar feature, and you're re 5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention). 6. Pat yourself on the back for being an awesome contributor. 7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. - diff --git a/airbyte-integrations/connectors/source-younium/README.md b/airbyte-integrations/connectors/source-younium/README.md index 4d8e3d90d62..ceafc620c9d 100644 --- a/airbyte-integrations/connectors/source-younium/README.md +++ b/airbyte-integrations/connectors/source-younium/README.md @@ -1,31 +1,32 @@ # Younium source connector - This is the repository for the Younium source connector, written in Python. 
For information about how to use this connector within Airbyte, see [the documentation](https://docs.airbyte.com/integrations/sources/younium). ## Local development ### Prerequisites -* Python (~=3.9) -* Poetry (~=1.7) - installation instructions [here](https://python-poetry.org/docs/#installation) +- Python (~=3.9) +- Poetry (~=1.7) - installation instructions [here](https://python-poetry.org/docs/#installation) ### Installing the connector + From this connector directory, run: + ```bash poetry install --with dev ``` - ### Create credentials + **If you are a community contributor**, follow the instructions in the [documentation](https://docs.airbyte.com/integrations/sources/younium) to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `source_younium/spec.yaml` file. Note that any directory named `secrets` is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. See `sample_files/sample_config.json` for a sample config file. - ### Locally running the connector + ``` poetry run source-younium spec poetry run source-younium check --config secrets/config.json @@ -34,23 +35,28 @@ poetry run source-younium read --config secrets/config.json --catalog sample_fil ``` ### Running unit tests + To run unit tests locally, from the connector directory run: + ``` poetry run pytest unit_tests ``` ### Building the docker image + 1. Install [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md) 2. Run the following command to build the docker image: + ```bash airbyte-ci connectors --name=source-younium build ``` An image will be available on your host with the tag `airbyte/source-younium:dev`. - ### Running as a docker container + Then run any of the connector commands as follows: + ``` docker run --rm airbyte/source-younium:dev spec docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-younium:dev check --config /secrets/config.json @@ -59,18 +65,23 @@ docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integrat ``` ### Running our CI test suite + You can run our full test suite locally using [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md): + ```bash airbyte-ci connectors --name=source-younium test ``` ### Customizing acceptance Tests + Customize `acceptance-test-config.yml` file to configure acceptance tests. See [Connector Acceptance Tests](https://docs.airbyte.com/connector-development/testing-connectors/connector-acceptance-tests-reference) for more information. If your connector requires to create or destroy resources for use during acceptance tests create fixtures for it and place them inside integration_tests/acceptance.py. ### Dependency Management -All of your dependencies should be managed via Poetry. + +All of your dependencies should be managed via Poetry. To add a new dependency, run: + ```bash poetry add ``` @@ -78,14 +89,16 @@ poetry add Please commit the changes to `pyproject.toml` and `poetry.lock` files. ## Publishing a new version of the connector + You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? + 1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-younium test` -2. 
Bump the connector version (please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors)): - - bump the `dockerImageTag` value in in `metadata.yaml` - - bump the `version` value in `pyproject.toml` +2. Bump the connector version (please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors)): + - bump the `dockerImageTag` value in in `metadata.yaml` + - bump the `version` value in `pyproject.toml` 3. Make sure the `metadata.yaml` content is up to date. 4. Make sure the connector documentation and its changelog is up to date (`docs/integrations/sources/younium.md`). 5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention). 6. Pat yourself on the back for being an awesome contributor. 7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. -8. Once your PR is merged, the new version of the connector will be automatically published to Docker Hub and our connector registry. \ No newline at end of file +8. Once your PR is merged, the new version of the connector will be automatically published to Docker Hub and our connector registry. diff --git a/airbyte-integrations/connectors/source-youtube-analytics/README.md b/airbyte-integrations/connectors/source-youtube-analytics/README.md index 57c9e04b195..af196d27086 100644 --- a/airbyte-integrations/connectors/source-youtube-analytics/README.md +++ b/airbyte-integrations/connectors/source-youtube-analytics/README.md @@ -6,23 +6,28 @@ For information about how to use this connector within Airbyte, see [the documen ## Local development ### Prerequisites + **To iterate on this connector, make sure to complete this prerequisites section.** #### Minimum Python version required `= 3.7.0` #### Build & Activate Virtual Environment and install dependencies + From this connector directory, create a virtual environment: + ``` python -m venv .venv ``` This will generate a virtualenv for this module in `.venv/`. Make sure this venv is active in your development environment of choice. To activate it from the terminal, run: + ``` source .venv/bin/activate pip install -r requirements.txt pip install '.[tests]' ``` + If you are in an IDE, follow your IDE's instructions to activate the virtualenv. Note that while we are installing dependencies from `requirements.txt`, you should only edit `setup.py` for your dependencies. `requirements.txt` is @@ -31,6 +36,7 @@ If this is mumbo jumbo to you, don't worry about it, just put your deps in `setu should work as you expect. #### Create credentials + **If you are a community contributor**, follow the instructions in the [documentation](https://docs.airbyte.io/integrations/sources/youtube-analytics) to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `source_youtube_analytics/spec.json` file. Note that any directory named `secrets` is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. @@ -40,6 +46,7 @@ See `integration_tests/sample_config.json` for a sample config file. and place them into `secrets/config.json`. 
### Locally running the connector + ``` python main.py spec python main.py check --config secrets/config.json @@ -49,9 +56,10 @@ python main.py read --config secrets/config.json --catalog integration_tests/con ### Locally running the connector docker image - #### Build + **Via [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md) (recommended):** + ```bash airbyte-ci connectors --name=source-youtube-analytics build ``` @@ -59,12 +67,15 @@ airbyte-ci connectors --name=source-youtube-analytics build An image will be built with the tag `airbyte/source-youtube-analytics:dev`. **Via `docker build`:** + ```bash docker build -t airbyte/source-youtube-analytics:dev . ``` #### Run + Then run any of the connector commands as follows: + ``` docker run --rm airbyte/source-youtube-analytics:dev spec docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-youtube-analytics:dev check --config /secrets/config.json @@ -73,23 +84,30 @@ docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integrat ``` ## Testing + You can run our full test suite locally using [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md): + ```bash airbyte-ci connectors --name=source-youtube-analytics test ``` ### Customizing acceptance Tests + Customize `acceptance-test-config.yml` file to configure tests. See [Connector Acceptance Tests](https://docs.airbyte.com/connector-development/testing-connectors/connector-acceptance-tests-reference) for more information. If your connector requires to create or destroy resources for use during acceptance tests create fixtures for it and place them inside integration_tests/acceptance.py. ## Dependency Management + All of your dependencies should go in `setup.py`, NOT `requirements.txt`. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development. We split dependencies between two groups, dependencies that are: -* required for your connector to work need to go to `MAIN_REQUIREMENTS` list. -* required for the testing need to go to `TEST_REQUIREMENTS` list + +- required for your connector to work need to go to `MAIN_REQUIREMENTS` list. +- required for the testing need to go to `TEST_REQUIREMENTS` list ### Publishing a new version of the connector + You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? + 1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-youtube-analytics test` 2. Bump the connector version in `metadata.yaml`: increment the `dockerImageTag` value. Please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors). 3. Make sure the `metadata.yaml` content is up to date. @@ -97,4 +115,3 @@ You've checked out the repo, implemented a million dollar feature, and you're re 5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention). 6. Pat yourself on the back for being an awesome contributor. 7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. 
- diff --git a/airbyte-integrations/connectors/source-zapier-supported-storage/README.md b/airbyte-integrations/connectors/source-zapier-supported-storage/README.md index 64ee66535c5..b640ce86ce6 100644 --- a/airbyte-integrations/connectors/source-zapier-supported-storage/README.md +++ b/airbyte-integrations/connectors/source-zapier-supported-storage/README.md @@ -1,31 +1,32 @@ # Zapier-Supported-Storage source connector - This is the repository for the Zapier-Supported-Storage source connector, written in Python. For information about how to use this connector within Airbyte, see [the documentation](https://docs.airbyte.com/integrations/sources/zapier-supported-storage). ## Local development ### Prerequisites -* Python (~=3.9) -* Poetry (~=1.7) - installation instructions [here](https://python-poetry.org/docs/#installation) +- Python (~=3.9) +- Poetry (~=1.7) - installation instructions [here](https://python-poetry.org/docs/#installation) ### Installing the connector + From this connector directory, run: + ```bash poetry install --with dev ``` - ### Create credentials + **If you are a community contributor**, follow the instructions in the [documentation](https://docs.airbyte.com/integrations/sources/zapier-supported-storage) to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `source_zapier_supported_storage/spec.yaml` file. Note that any directory named `secrets` is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. See `sample_files/sample_config.json` for a sample config file. - ### Locally running the connector + ``` poetry run source-zapier-supported-storage spec poetry run source-zapier-supported-storage check --config secrets/config.json @@ -34,23 +35,28 @@ poetry run source-zapier-supported-storage read --config secrets/config.json --c ``` ### Running unit tests + To run unit tests locally, from the connector directory run: + ``` poetry run pytest unit_tests ``` ### Building the docker image + 1. Install [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md) 2. Run the following command to build the docker image: + ```bash airbyte-ci connectors --name=source-zapier-supported-storage build ``` An image will be available on your host with the tag `airbyte/source-zapier-supported-storage:dev`. - ### Running as a docker container + Then run any of the connector commands as follows: + ``` docker run --rm airbyte/source-zapier-supported-storage:dev spec docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-zapier-supported-storage:dev check --config /secrets/config.json @@ -59,18 +65,23 @@ docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integrat ``` ### Running our CI test suite + You can run our full test suite locally using [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md): + ```bash airbyte-ci connectors --name=source-zapier-supported-storage test ``` ### Customizing acceptance Tests + Customize `acceptance-test-config.yml` file to configure acceptance tests. See [Connector Acceptance Tests](https://docs.airbyte.com/connector-development/testing-connectors/connector-acceptance-tests-reference) for more information. If your connector requires to create or destroy resources for use during acceptance tests create fixtures for it and place them inside integration_tests/acceptance.py. 
### Dependency Management -All of your dependencies should be managed via Poetry. + +All of your dependencies should be managed via Poetry. To add a new dependency, run: + ```bash poetry add ``` @@ -78,14 +89,16 @@ poetry add Please commit the changes to `pyproject.toml` and `poetry.lock` files. ## Publishing a new version of the connector + You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? + 1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-zapier-supported-storage test` -2. Bump the connector version (please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors)): - - bump the `dockerImageTag` value in in `metadata.yaml` - - bump the `version` value in `pyproject.toml` +2. Bump the connector version (please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors)): + - bump the `dockerImageTag` value in in `metadata.yaml` + - bump the `version` value in `pyproject.toml` 3. Make sure the `metadata.yaml` content is up to date. 4. Make sure the connector documentation and its changelog is up to date (`docs/integrations/sources/zapier-supported-storage.md`). 5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention). 6. Pat yourself on the back for being an awesome contributor. 7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. -8. Once your PR is merged, the new version of the connector will be automatically published to Docker Hub and our connector registry. \ No newline at end of file +8. Once your PR is merged, the new version of the connector will be automatically published to Docker Hub and our connector registry. diff --git a/airbyte-integrations/connectors/source-zendesk-chat/README.md b/airbyte-integrations/connectors/source-zendesk-chat/README.md index 411735aa8b1..e3f4a639e95 100644 --- a/airbyte-integrations/connectors/source-zendesk-chat/README.md +++ b/airbyte-integrations/connectors/source-zendesk-chat/README.md @@ -1,31 +1,32 @@ # Zendesk-Chat source connector - This is the repository for the Zendesk-Chat source connector, written in Python. For information about how to use this connector within Airbyte, see [the documentation](https://docs.airbyte.com/integrations/sources/zendesk-chat). ## Local development ### Prerequisites -* Python (~=3.9) -* Poetry (~=1.7) - installation instructions [here](https://python-poetry.org/docs/#installation) +- Python (~=3.9) +- Poetry (~=1.7) - installation instructions [here](https://python-poetry.org/docs/#installation) ### Installing the connector + From this connector directory, run: + ```bash poetry install --with dev ``` - ### Create credentials + **If you are a community contributor**, follow the instructions in the [documentation](https://docs.airbyte.com/integrations/sources/zendesk-chat) to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `source_zendesk_chat/spec.yaml` file. Note that any directory named `secrets` is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. See `sample_files/sample_config.json` for a sample config file. 
- ### Locally running the connector + ``` poetry run source-zendesk-chat spec poetry run source-zendesk-chat check --config secrets/config.json @@ -34,23 +35,28 @@ poetry run source-zendesk-chat read --config secrets/config.json --catalog integ ``` ### Running unit tests + To run unit tests locally, from the connector directory run: + ``` poetry run pytest unit_tests ``` ### Building the docker image + 1. Install [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md) 2. Run the following command to build the docker image: + ```bash airbyte-ci connectors --name=source-zendesk-chat build ``` An image will be available on your host with the tag `airbyte/source-zendesk-chat:dev`. - ### Running as a docker container + Then run any of the connector commands as follows: + ``` docker run --rm airbyte/source-zendesk-chat:dev spec docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-zendesk-chat:dev check --config /secrets/config.json @@ -59,18 +65,23 @@ docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integrat ``` ### Running our CI test suite + You can run our full test suite locally using [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md): + ```bash airbyte-ci connectors --name=source-zendesk-chat test ``` ### Customizing acceptance Tests + Customize `acceptance-test-config.yml` file to configure acceptance tests. See [Connector Acceptance Tests](https://docs.airbyte.com/connector-development/testing-connectors/connector-acceptance-tests-reference) for more information. If your connector requires to create or destroy resources for use during acceptance tests create fixtures for it and place them inside integration_tests/acceptance.py. ### Dependency Management -All of your dependencies should be managed via Poetry. + +All of your dependencies should be managed via Poetry. To add a new dependency, run: + ```bash poetry add ``` @@ -78,14 +89,16 @@ poetry add Please commit the changes to `pyproject.toml` and `poetry.lock` files. ## Publishing a new version of the connector + You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? + 1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-zendesk-chat test` -2. Bump the connector version (please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors)): - - bump the `dockerImageTag` value in in `metadata.yaml` - - bump the `version` value in `pyproject.toml` +2. Bump the connector version (please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors)): + - bump the `dockerImageTag` value in in `metadata.yaml` + - bump the `version` value in `pyproject.toml` 3. Make sure the `metadata.yaml` content is up to date. 4. Make sure the connector documentation and its changelog is up to date (`docs/integrations/sources/zendesk-chat.md`). 5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention). 6. Pat yourself on the back for being an awesome contributor. 7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. -8. 
Once your PR is merged, the new version of the connector will be automatically published to Docker Hub and our connector registry. \ No newline at end of file +8. Once your PR is merged, the new version of the connector will be automatically published to Docker Hub and our connector registry. diff --git a/airbyte-integrations/connectors/source-zendesk-sell/README.md b/airbyte-integrations/connectors/source-zendesk-sell/README.md index 5b6fabc3b1a..a4d97c24988 100644 --- a/airbyte-integrations/connectors/source-zendesk-sell/README.md +++ b/airbyte-integrations/connectors/source-zendesk-sell/README.md @@ -6,6 +6,7 @@ For information about how to use this connector within Airbyte, see [the documen ## Local development #### Create credentials + **If you are a community contributor**, follow the instructions in the [documentation](https://docs.airbyte.com/integrations/sources/zendesk-sell) to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `source_zendesk_sell/spec.yaml` file. Note that any directory named `secrets` is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. @@ -16,9 +17,10 @@ and place them into `secrets/config.json`. ### Locally running the connector docker image - #### Build + **Via [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md) (recommended):** + ```bash airbyte-ci connectors --name=source-zendesk-sell build ``` @@ -26,12 +28,15 @@ airbyte-ci connectors --name=source-zendesk-sell build An image will be built with the tag `airbyte/source-zendesk-sell:dev`. **Via `docker build`:** + ```bash docker build -t airbyte/source-zendesk-sell:dev . ``` #### Run + Then run any of the connector commands as follows: + ``` docker run --rm airbyte/source-zendesk-sell:dev spec docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-zendesk-sell:dev check --config /secrets/config.json @@ -40,23 +45,30 @@ docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integrat ``` ## Testing + You can run our full test suite locally using [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md): + ```bash airbyte-ci connectors --name=source-zendesk-sell test ``` ### Customizing acceptance Tests + Customize `acceptance-test-config.yml` file to configure tests. See [Connector Acceptance Tests](https://docs.airbyte.com/connector-development/testing-connectors/connector-acceptance-tests-reference) for more information. If your connector requires to create or destroy resources for use during acceptance tests create fixtures for it and place them inside integration_tests/acceptance.py. ## Dependency Management + All of your dependencies should go in `setup.py`, NOT `requirements.txt`. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development. We split dependencies between two groups, dependencies that are: -* required for your connector to work need to go to `MAIN_REQUIREMENTS` list. -* required for the testing need to go to `TEST_REQUIREMENTS` list + +- required for your connector to work need to go to `MAIN_REQUIREMENTS` list. +- required for the testing need to go to `TEST_REQUIREMENTS` list ### Publishing a new version of the connector + You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? + 1. 
Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-zendesk-sell test` 2. Bump the connector version in `metadata.yaml`: increment the `dockerImageTag` value. Please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors). 3. Make sure the `metadata.yaml` content is up to date. @@ -64,4 +76,3 @@ You've checked out the repo, implemented a million dollar feature, and you're re 5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention). 6. Pat yourself on the back for being an awesome contributor. 7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. - diff --git a/airbyte-integrations/connectors/source-zendesk-sunshine/README.md b/airbyte-integrations/connectors/source-zendesk-sunshine/README.md index 2ec28fffe52..3317f4ae0a6 100644 --- a/airbyte-integrations/connectors/source-zendesk-sunshine/README.md +++ b/airbyte-integrations/connectors/source-zendesk-sunshine/README.md @@ -1,31 +1,32 @@ # Zendesk-Sunshine source connector - This is the repository for the Zendesk-Sunshine source connector, written in Python. For information about how to use this connector within Airbyte, see [the documentation](https://docs.airbyte.com/integrations/sources/zendesk-sunshine). ## Local development ### Prerequisites -* Python (~=3.9) -* Poetry (~=1.7) - installation instructions [here](https://python-poetry.org/docs/#installation) +- Python (~=3.9) +- Poetry (~=1.7) - installation instructions [here](https://python-poetry.org/docs/#installation) ### Installing the connector + From this connector directory, run: + ```bash poetry install --with dev ``` - ### Create credentials + **If you are a community contributor**, follow the instructions in the [documentation](https://docs.airbyte.com/integrations/sources/zendesk-sunshine) to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `source_zendesk_sunshine/spec.yaml` file. Note that any directory named `secrets` is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. See `sample_files/sample_config.json` for a sample config file. - ### Locally running the connector + ``` poetry run source-zendesk-sunshine spec poetry run source-zendesk-sunshine check --config secrets/config.json @@ -34,23 +35,28 @@ poetry run source-zendesk-sunshine read --config secrets/config.json --catalog s ``` ### Running unit tests + To run unit tests locally, from the connector directory run: + ``` poetry run pytest unit_tests ``` ### Building the docker image + 1. Install [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md) 2. Run the following command to build the docker image: + ```bash airbyte-ci connectors --name=source-zendesk-sunshine build ``` An image will be available on your host with the tag `airbyte/source-zendesk-sunshine:dev`. 
- ### Running as a docker container + Then run any of the connector commands as follows: + ``` docker run --rm airbyte/source-zendesk-sunshine:dev spec docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-zendesk-sunshine:dev check --config /secrets/config.json @@ -59,18 +65,23 @@ docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integrat ``` ### Running our CI test suite + You can run our full test suite locally using [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md): + ```bash airbyte-ci connectors --name=source-zendesk-sunshine test ``` ### Customizing acceptance Tests + Customize `acceptance-test-config.yml` file to configure acceptance tests. See [Connector Acceptance Tests](https://docs.airbyte.com/connector-development/testing-connectors/connector-acceptance-tests-reference) for more information. If your connector requires to create or destroy resources for use during acceptance tests create fixtures for it and place them inside integration_tests/acceptance.py. ### Dependency Management -All of your dependencies should be managed via Poetry. + +All of your dependencies should be managed via Poetry. To add a new dependency, run: + ```bash poetry add ``` @@ -78,14 +89,16 @@ poetry add Please commit the changes to `pyproject.toml` and `poetry.lock` files. ## Publishing a new version of the connector + You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? + 1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-zendesk-sunshine test` -2. Bump the connector version (please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors)): - - bump the `dockerImageTag` value in in `metadata.yaml` - - bump the `version` value in `pyproject.toml` +2. Bump the connector version (please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors)): + - bump the `dockerImageTag` value in in `metadata.yaml` + - bump the `version` value in `pyproject.toml` 3. Make sure the `metadata.yaml` content is up to date. 4. Make sure the connector documentation and its changelog is up to date (`docs/integrations/sources/zendesk-sunshine.md`). 5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention). 6. Pat yourself on the back for being an awesome contributor. 7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. -8. Once your PR is merged, the new version of the connector will be automatically published to Docker Hub and our connector registry. \ No newline at end of file +8. Once your PR is merged, the new version of the connector will be automatically published to Docker Hub and our connector registry. 
diff --git a/airbyte-integrations/connectors/source-zendesk-support/README.md b/airbyte-integrations/connectors/source-zendesk-support/README.md index 79e724300a3..06e5627b069 100644 --- a/airbyte-integrations/connectors/source-zendesk-support/README.md +++ b/airbyte-integrations/connectors/source-zendesk-support/README.md @@ -1,31 +1,32 @@ # Zendesk-Support source connector - This is the repository for the Zendesk-Support source connector, written in Python. For information about how to use this connector within Airbyte, see [the documentation](https://docs.airbyte.com/integrations/sources/zendesk-support). ## Local development ### Prerequisites -* Python (~=3.9) -* Poetry (~=1.7) - installation instructions [here](https://python-poetry.org/docs/#installation) +- Python (~=3.9) +- Poetry (~=1.7) - installation instructions [here](https://python-poetry.org/docs/#installation) ### Installing the connector + From this connector directory, run: + ```bash poetry install --with dev ``` - ### Create credentials + **If you are a community contributor**, follow the instructions in the [documentation](https://docs.airbyte.com/integrations/sources/zendesk-support) to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `source_zendesk_support/spec.yaml` file. Note that any directory named `secrets` is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. See `sample_files/sample_config.json` for a sample config file. - ### Locally running the connector + ``` poetry run source-zendesk-support spec poetry run source-zendesk-support check --config secrets/config.json @@ -34,23 +35,28 @@ poetry run source-zendesk-support read --config secrets/config.json --catalog in ``` ### Running unit tests + To run unit tests locally, from the connector directory run: + ``` poetry run pytest unit_tests ``` ### Building the docker image + 1. Install [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md) 2. Run the following command to build the docker image: + ```bash airbyte-ci connectors --name=source-zendesk-support build ``` An image will be available on your host with the tag `airbyte/source-zendesk-support:dev`. - ### Running as a docker container + Then run any of the connector commands as follows: + ``` docker run --rm airbyte/source-zendesk-support:dev spec docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-zendesk-support:dev check --config /secrets/config.json @@ -59,18 +65,23 @@ docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integrat ``` ### Running our CI test suite + You can run our full test suite locally using [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md): + ```bash airbyte-ci connectors --name=source-zendesk-support test ``` ### Customizing acceptance Tests + Customize `acceptance-test-config.yml` file to configure acceptance tests. See [Connector Acceptance Tests](https://docs.airbyte.com/connector-development/testing-connectors/connector-acceptance-tests-reference) for more information. If your connector requires to create or destroy resources for use during acceptance tests create fixtures for it and place them inside integration_tests/acceptance.py. ### Dependency Management -All of your dependencies should be managed via Poetry. + +All of your dependencies should be managed via Poetry. 
To add a new dependency, run: + ```bash poetry add ``` @@ -78,14 +89,16 @@ poetry add Please commit the changes to `pyproject.toml` and `poetry.lock` files. ## Publishing a new version of the connector + You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? + 1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-zendesk-support test` -2. Bump the connector version (please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors)): - - bump the `dockerImageTag` value in in `metadata.yaml` - - bump the `version` value in `pyproject.toml` +2. Bump the connector version (please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors)): + - bump the `dockerImageTag` value in in `metadata.yaml` + - bump the `version` value in `pyproject.toml` 3. Make sure the `metadata.yaml` content is up to date. 4. Make sure the connector documentation and its changelog is up to date (`docs/integrations/sources/zendesk-support.md`). 5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention). 6. Pat yourself on the back for being an awesome contributor. 7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. -8. Once your PR is merged, the new version of the connector will be automatically published to Docker Hub and our connector registry. \ No newline at end of file +8. Once your PR is merged, the new version of the connector will be automatically published to Docker Hub and our connector registry. diff --git a/airbyte-integrations/connectors/source-zendesk-talk/README.md b/airbyte-integrations/connectors/source-zendesk-talk/README.md index 3b9ae361dd5..df4b0ebdf75 100644 --- a/airbyte-integrations/connectors/source-zendesk-talk/README.md +++ b/airbyte-integrations/connectors/source-zendesk-talk/README.md @@ -1,31 +1,32 @@ # Zendesk-Talk source connector - This is the repository for the Zendesk-Talk source connector, written in Python. For information about how to use this connector within Airbyte, see [the documentation](https://docs.airbyte.com/integrations/sources/zendesk-talk). ## Local development ### Prerequisites -* Python (~=3.9) -* Poetry (~=1.7) - installation instructions [here](https://python-poetry.org/docs/#installation) +- Python (~=3.9) +- Poetry (~=1.7) - installation instructions [here](https://python-poetry.org/docs/#installation) ### Installing the connector + From this connector directory, run: + ```bash poetry install --with dev ``` - ### Create credentials + **If you are a community contributor**, follow the instructions in the [documentation](https://docs.airbyte.com/integrations/sources/zendesk-talk) to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `source_zendesk_talk/spec.yaml` file. Note that any directory named `secrets` is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. See `sample_files/sample_config.json` for a sample config file. 
- ### Locally running the connector + ``` poetry run source-zendesk-talk spec poetry run source-zendesk-talk check --config secrets/config.json @@ -34,23 +35,28 @@ poetry run source-zendesk-talk read --config secrets/config.json --catalog integ ``` ### Running unit tests + To run unit tests locally, from the connector directory run: + ``` poetry run pytest unit_tests ``` ### Building the docker image + 1. Install [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md) 2. Run the following command to build the docker image: + ```bash airbyte-ci connectors --name=source-zendesk-talk build ``` An image will be available on your host with the tag `airbyte/source-zendesk-talk:dev`. - ### Running as a docker container + Then run any of the connector commands as follows: + ``` docker run --rm airbyte/source-zendesk-talk:dev spec docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-zendesk-talk:dev check --config /secrets/config.json @@ -59,18 +65,23 @@ docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integrat ``` ### Running our CI test suite + You can run our full test suite locally using [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md): + ```bash airbyte-ci connectors --name=source-zendesk-talk test ``` ### Customizing acceptance Tests + Customize `acceptance-test-config.yml` file to configure acceptance tests. See [Connector Acceptance Tests](https://docs.airbyte.com/connector-development/testing-connectors/connector-acceptance-tests-reference) for more information. If your connector requires to create or destroy resources for use during acceptance tests create fixtures for it and place them inside integration_tests/acceptance.py. ### Dependency Management -All of your dependencies should be managed via Poetry. + +All of your dependencies should be managed via Poetry. To add a new dependency, run: + ```bash poetry add ``` @@ -78,14 +89,16 @@ poetry add Please commit the changes to `pyproject.toml` and `poetry.lock` files. ## Publishing a new version of the connector + You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? + 1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-zendesk-talk test` -2. Bump the connector version (please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors)): - - bump the `dockerImageTag` value in in `metadata.yaml` - - bump the `version` value in `pyproject.toml` +2. Bump the connector version (please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors)): + - bump the `dockerImageTag` value in in `metadata.yaml` + - bump the `version` value in `pyproject.toml` 3. Make sure the `metadata.yaml` content is up to date. 4. Make sure the connector documentation and its changelog is up to date (`docs/integrations/sources/zendesk-talk.md`). 5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention). 6. Pat yourself on the back for being an awesome contributor. 7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. -8. 
Once your PR is merged, the new version of the connector will be automatically published to Docker Hub and our connector registry. \ No newline at end of file +8. Once your PR is merged, the new version of the connector will be automatically published to Docker Hub and our connector registry. diff --git a/airbyte-integrations/connectors/source-zenefits/README.md b/airbyte-integrations/connectors/source-zenefits/README.md index eb4c41b5b5c..9dd2ae44af4 100644 --- a/airbyte-integrations/connectors/source-zenefits/README.md +++ b/airbyte-integrations/connectors/source-zenefits/README.md @@ -1,31 +1,32 @@ # Zenefits source connector - This is the repository for the Zenefits source connector, written in Python. For information about how to use this connector within Airbyte, see [the documentation](https://docs.airbyte.com/integrations/sources/zenefits). ## Local development ### Prerequisites -* Python (~=3.9) -* Poetry (~=1.7) - installation instructions [here](https://python-poetry.org/docs/#installation) +- Python (~=3.9) +- Poetry (~=1.7) - installation instructions [here](https://python-poetry.org/docs/#installation) ### Installing the connector + From this connector directory, run: + ```bash poetry install --with dev ``` - ### Create credentials + **If you are a community contributor**, follow the instructions in the [documentation](https://docs.airbyte.com/integrations/sources/zenefits) to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `source_zenefits/spec.yaml` file. Note that any directory named `secrets` is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. See `sample_files/sample_config.json` for a sample config file. - ### Locally running the connector + ``` poetry run source-zenefits spec poetry run source-zenefits check --config secrets/config.json @@ -34,23 +35,28 @@ poetry run source-zenefits read --config secrets/config.json --catalog sample_fi ``` ### Running unit tests + To run unit tests locally, from the connector directory run: + ``` poetry run pytest unit_tests ``` ### Building the docker image + 1. Install [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md) 2. Run the following command to build the docker image: + ```bash airbyte-ci connectors --name=source-zenefits build ``` An image will be available on your host with the tag `airbyte/source-zenefits:dev`. - ### Running as a docker container + Then run any of the connector commands as follows: + ``` docker run --rm airbyte/source-zenefits:dev spec docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-zenefits:dev check --config /secrets/config.json @@ -59,18 +65,23 @@ docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integrat ``` ### Running our CI test suite + You can run our full test suite locally using [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md): + ```bash airbyte-ci connectors --name=source-zenefits test ``` ### Customizing acceptance Tests + Customize `acceptance-test-config.yml` file to configure acceptance tests. See [Connector Acceptance Tests](https://docs.airbyte.com/connector-development/testing-connectors/connector-acceptance-tests-reference) for more information. If your connector requires to create or destroy resources for use during acceptance tests create fixtures for it and place them inside integration_tests/acceptance.py. 
### Dependency Management -All of your dependencies should be managed via Poetry. + +All of your dependencies should be managed via Poetry. To add a new dependency, run: + ```bash poetry add ``` @@ -78,14 +89,16 @@ poetry add Please commit the changes to `pyproject.toml` and `poetry.lock` files. ## Publishing a new version of the connector + You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? + 1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-zenefits test` -2. Bump the connector version (please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors)): - - bump the `dockerImageTag` value in in `metadata.yaml` - - bump the `version` value in `pyproject.toml` +2. Bump the connector version (please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors)): + - bump the `dockerImageTag` value in in `metadata.yaml` + - bump the `version` value in `pyproject.toml` 3. Make sure the `metadata.yaml` content is up to date. 4. Make sure the connector documentation and its changelog is up to date (`docs/integrations/sources/zenefits.md`). 5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention). 6. Pat yourself on the back for being an awesome contributor. 7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. -8. Once your PR is merged, the new version of the connector will be automatically published to Docker Hub and our connector registry. \ No newline at end of file +8. Once your PR is merged, the new version of the connector will be automatically published to Docker Hub and our connector registry. diff --git a/airbyte-integrations/connectors/source-zenloop/README.md b/airbyte-integrations/connectors/source-zenloop/README.md index 13a27f64b3d..ddfa9ff559a 100644 --- a/airbyte-integrations/connectors/source-zenloop/README.md +++ b/airbyte-integrations/connectors/source-zenloop/README.md @@ -1,31 +1,32 @@ # Zenloop source connector - This is the repository for the Zenloop source connector, written in Python. For information about how to use this connector within Airbyte, see [the documentation](https://docs.airbyte.com/integrations/sources/zenloop). ## Local development ### Prerequisites -* Python (~=3.9) -* Poetry (~=1.7) - installation instructions [here](https://python-poetry.org/docs/#installation) +- Python (~=3.9) +- Poetry (~=1.7) - installation instructions [here](https://python-poetry.org/docs/#installation) ### Installing the connector + From this connector directory, run: + ```bash poetry install --with dev ``` - ### Create credentials + **If you are a community contributor**, follow the instructions in the [documentation](https://docs.airbyte.com/integrations/sources/zenloop) to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `source_zenloop/spec.yaml` file. Note that any directory named `secrets` is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. See `sample_files/sample_config.json` for a sample config file. 
- ### Locally running the connector + ``` poetry run source-zenloop spec poetry run source-zenloop check --config secrets/config.json @@ -34,23 +35,28 @@ poetry run source-zenloop read --config secrets/config.json --catalog sample_fil ``` ### Running unit tests + To run unit tests locally, from the connector directory run: + ``` poetry run pytest unit_tests ``` ### Building the docker image + 1. Install [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md) 2. Run the following command to build the docker image: + ```bash airbyte-ci connectors --name=source-zenloop build ``` An image will be available on your host with the tag `airbyte/source-zenloop:dev`. - ### Running as a docker container + Then run any of the connector commands as follows: + ``` docker run --rm airbyte/source-zenloop:dev spec docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-zenloop:dev check --config /secrets/config.json @@ -59,18 +65,23 @@ docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integrat ``` ### Running our CI test suite + You can run our full test suite locally using [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md): + ```bash airbyte-ci connectors --name=source-zenloop test ``` ### Customizing acceptance Tests + Customize `acceptance-test-config.yml` file to configure acceptance tests. See [Connector Acceptance Tests](https://docs.airbyte.com/connector-development/testing-connectors/connector-acceptance-tests-reference) for more information. If your connector requires to create or destroy resources for use during acceptance tests create fixtures for it and place them inside integration_tests/acceptance.py. ### Dependency Management -All of your dependencies should be managed via Poetry. + +All of your dependencies should be managed via Poetry. To add a new dependency, run: + ```bash poetry add ``` @@ -78,14 +89,16 @@ poetry add Please commit the changes to `pyproject.toml` and `poetry.lock` files. ## Publishing a new version of the connector + You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? + 1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-zenloop test` -2. Bump the connector version (please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors)): - - bump the `dockerImageTag` value in in `metadata.yaml` - - bump the `version` value in `pyproject.toml` +2. Bump the connector version (please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors)): + - bump the `dockerImageTag` value in in `metadata.yaml` + - bump the `version` value in `pyproject.toml` 3. Make sure the `metadata.yaml` content is up to date. 4. Make sure the connector documentation and its changelog is up to date (`docs/integrations/sources/zenloop.md`). 5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention). 6. Pat yourself on the back for being an awesome contributor. 7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. -8. 
Once your PR is merged, the new version of the connector will be automatically published to Docker Hub and our connector registry. \ No newline at end of file +8. Once your PR is merged, the new version of the connector will be automatically published to Docker Hub and our connector registry. diff --git a/airbyte-integrations/connectors/source-zoho-crm/README.md b/airbyte-integrations/connectors/source-zoho-crm/README.md index 2a63cefbbc4..0d17705a432 100644 --- a/airbyte-integrations/connectors/source-zoho-crm/README.md +++ b/airbyte-integrations/connectors/source-zoho-crm/README.md @@ -6,23 +6,28 @@ For information about how to use this connector within Airbyte, see [the documen ## Local development ### Prerequisites + **To iterate on this connector, make sure to complete this prerequisites section.** #### Minimum Python version required `= 3.7.0` #### Build & Activate Virtual Environment and install dependencies + From this connector directory, create a virtual environment: + ``` python -m venv .venv ``` This will generate a virtualenv for this module in `.venv/`. Make sure this venv is active in your development environment of choice. To activate it from the terminal, run: + ``` source .venv/bin/activate pip install -r requirements.txt pip install '.[tests]' ``` + If you are in an IDE, follow your IDE's instructions to activate the virtualenv. Note that while we are installing dependencies from `requirements.txt`, you should only edit `setup.py` for your dependencies. `requirements.txt` is @@ -31,6 +36,7 @@ If this is mumbo jumbo to you, don't worry about it, just put your deps in `setu should work as you expect. #### Create credentials + **If you are a community contributor**, follow the instructions in the [documentation](https://docs.airbyte.io/integrations/sources/zoho-crm) to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `source_zoho_crm/spec.json` file. Note that any directory named `secrets` is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. @@ -40,6 +46,7 @@ See `integration_tests/sample_config.json` for a sample config file. and place them into `secrets/config.json`. ### Locally running the connector + ``` python main.py spec python main.py check --config secrets/config.json @@ -49,9 +56,10 @@ python main.py read --config secrets/config.json --catalog integration_tests/con ### Locally running the connector docker image - #### Build + **Via [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md) (recommended):** + ```bash airbyte-ci connectors --name=source-zoho-crm build ``` @@ -59,12 +67,15 @@ airbyte-ci connectors --name=source-zoho-crm build An image will be built with the tag `airbyte/source-zoho-crm:dev`. **Via `docker build`:** + ```bash docker build -t airbyte/source-zoho-crm:dev . 
``` #### Run + Then run any of the connector commands as follows: + ``` docker run --rm airbyte/source-zoho-crm:dev spec docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-zoho-crm:dev check --config /secrets/config.json @@ -73,23 +84,30 @@ docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integrat ``` ## Testing + You can run our full test suite locally using [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md): + ```bash airbyte-ci connectors --name=source-zoho-crm test ``` ### Customizing acceptance Tests + Customize `acceptance-test-config.yml` file to configure tests. See [Connector Acceptance Tests](https://docs.airbyte.com/connector-development/testing-connectors/connector-acceptance-tests-reference) for more information. If your connector requires to create or destroy resources for use during acceptance tests create fixtures for it and place them inside integration_tests/acceptance.py. ## Dependency Management + All of your dependencies should go in `setup.py`, NOT `requirements.txt`. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development. We split dependencies between two groups, dependencies that are: -* required for your connector to work need to go to `MAIN_REQUIREMENTS` list. -* required for the testing need to go to `TEST_REQUIREMENTS` list + +- required for your connector to work need to go to `MAIN_REQUIREMENTS` list. +- required for the testing need to go to `TEST_REQUIREMENTS` list ### Publishing a new version of the connector + You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? + 1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-zoho-crm test` 2. Bump the connector version in `metadata.yaml`: increment the `dockerImageTag` value. Please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors). 3. Make sure the `metadata.yaml` content is up to date. @@ -97,4 +115,3 @@ You've checked out the repo, implemented a million dollar feature, and you're re 5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention). 6. Pat yourself on the back for being an awesome contributor. 7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. - diff --git a/airbyte-integrations/connectors/source-zoom/README.md b/airbyte-integrations/connectors/source-zoom/README.md index 3b188883c65..4b79a68fd3d 100644 --- a/airbyte-integrations/connectors/source-zoom/README.md +++ b/airbyte-integrations/connectors/source-zoom/README.md @@ -6,6 +6,7 @@ For information about how to use this connector within Airbyte, see [the documen ## Local development #### Create credentials + **If you are a community contributor**, follow the instructions in the [documentation](https://docs.airbyte.io/integrations/sources/zoom) to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `source_survey_sparrow/spec.yaml` file. Note that any directory named `secrets` is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. @@ -16,10 +17,8 @@ and place them into `secrets/config.json`. 
### Locally running the connector docker image - - - #### Use `airbyte-ci` to build your connector + The Airbyte way of building this connector is to use our `airbyte-ci` tool. You can follow install instructions [here](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md#L1). Then running the following command will build your connector: @@ -27,15 +26,18 @@ Then running the following command will build your connector: ```bash airbyte-ci connectors --name source-zoom build ``` + Once the command is done, you will find your connector image in your local docker registry: `airbyte/source-zoom:dev`. ##### Customizing our build process + When contributing on our connector you might need to customize the build process to add a system dependency or set an env var. You can customize our build process by adding a `build_customization.py` module to your connector. This module should contain a `pre_connector_install` and `post_connector_install` async function that will mutate the base image and the connector container respectively. It will be imported at runtime by our build process and the functions will be called if they exist. Here is an example of a `build_customization.py` module: + ```python from __future__ import annotations @@ -55,6 +57,7 @@ async def post_connector_install(connector_container: Container) -> Container: ``` #### Build your own connector image + This connector is built using our dynamic built process in `airbyte-ci`. The base image used to build it is defined within the metadata.yaml file under the `connectorBuildOptions`. The build logic is defined using [Dagger](https://dagger.io/) [here](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/pipelines/builds/python_connectors.py). @@ -63,6 +66,7 @@ It does not rely on a Dockerfile. If you would like to patch our connector and build your own a simple approach would be to: 1. Create your own Dockerfile based on the latest version of the connector image. + ```Dockerfile FROM airbyte/source-zoom:latest @@ -73,16 +77,21 @@ RUN pip install ./airbyte/integration_code # ENV AIRBYTE_ENTRYPOINT "python /airbyte/integration_code/main.py" # ENTRYPOINT ["python", "/airbyte/integration_code/main.py"] ``` + Please use this as an example. This is not optimized. 2. Build your image: + ```bash docker build -t airbyte/source-zoom:dev . # Running the spec command against your patched connector docker run airbyte/source-zoom:dev spec ``` + #### Run + Then run any of the connector commands as follows: + ``` docker run --rm airbyte/source-zoom:dev spec docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-zoom:dev check --config /secrets/config.json @@ -91,23 +100,30 @@ docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integrat ``` ## Testing + You can run our full test suite locally using [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md): + ```bash airbyte-ci connectors --name=source-zoom test ``` ### Customizing acceptance Tests + Customize `acceptance-test-config.yml` file to configure tests. See [Connector Acceptance Tests](https://docs.airbyte.com/connector-development/testing-connectors/connector-acceptance-tests-reference) for more information. If your connector requires to create or destroy resources for use during acceptance tests create fixtures for it and place them inside integration_tests/acceptance.py. 
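If the acceptance tests do need resources created and destroyed around the run, `integration_tests/acceptance.py` is the place for that fixture. The sketch below shows the general pattern; `create_test_meeting` and `delete_test_meeting` are hypothetical stand-ins for whatever setup your connector actually requires, and the module assumes the `connector_acceptance_test` package is installed.

```python
# integration_tests/acceptance.py — sketch of provisioning test data around the acceptance suite.
import pytest

pytest_plugins = ("connector_acceptance_test.plugin",)


def create_test_meeting() -> str:
    """Hypothetical stub: seed a record (e.g. via the Zoom API) that the streams can read."""
    return "test-meeting-id"


def delete_test_meeting(meeting_id: str) -> None:
    """Hypothetical stub: remove the seeded record once the suite has finished."""


@pytest.fixture(scope="session", autouse=True)
def connector_setup():
    meeting_id = create_test_meeting()
    yield
    delete_test_meeting(meeting_id)
```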
## Dependency Management + All of your dependencies should go in `setup.py`, NOT `requirements.txt`. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development. We split dependencies between two groups, dependencies that are: -* required for your connector to work need to go to `MAIN_REQUIREMENTS` list. -* required for the testing need to go to `TEST_REQUIREMENTS` list + +- required for your connector to work need to go to `MAIN_REQUIREMENTS` list. +- required for the testing need to go to `TEST_REQUIREMENTS` list ### Publishing a new version of the connector + You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? + 1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-zoom test` 2. Bump the connector version in `metadata.yaml`: increment the `dockerImageTag` value. Please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors). 3. Make sure the `metadata.yaml` content is up to date. diff --git a/docs/access-management/sso-providers/azure-entra-id.md b/docs/access-management/sso-providers/azure-entra-id.md index dda68496985..5b06c4e29d8 100644 --- a/docs/access-management/sso-providers/azure-entra-id.md +++ b/docs/access-management/sso-providers/azure-entra-id.md @@ -42,7 +42,7 @@ Hit **Register** to create the application. To create Client credentials for Airbyte to talk to your application head to **Certificates & Secrets** on the detail screen of your application and select the **Client secrets** tab. -Click **New client secret**, specify any Description you want and any expiry date you want. +Click **New client secret**, specify any Description you want and any expiry date you want. :::tip We recommend to chose an expiry date of at least 12 months. You'll need to pass in the new client secret every time the old one expires to continue being able to log in via Entra ID. @@ -54,9 +54,9 @@ Copy the **Value** (the Client Secret itself) immediately after creation. You wo You'll need to pass your Airbyte contact the following information of the created application. -* **Client Secret**: as copied above -* **Application (client) ID**: You'll find this in the **Essentials** section on the **Overview** page of the application you created -* **OpenID Connect metadata document**: You'll find this in the **Endpoints** panel, that you can open from the top bar on the **Overview** page +- **Client Secret**: as copied above +- **Application (client) ID**: You'll find this in the **Essentials** section on the **Overview** page of the application you created +- **OpenID Connect metadata document**: You'll find this in the **Endpoints** panel, that you can open from the top bar on the **Overview** page Once we've received this information from you, We'll setup SSO for you and let you know once it's ready to be used. @@ -84,6 +84,7 @@ Hit **Register** to create the application. ### Create client credentials To create client credentials for Airbyte to interface with your application, head to **Certificates & Secrets** on the detail screen of your application and select the **Client secrets** tab. Then: + 1. Click **New client secret**, and enter the expiry date of your choosing. You'll need to pass in the new client secret every time the old one expires to continue being able to log in via Entra ID. 2. 
Copy the **Value** (the client secret itself) immediately after creation. You won't be able to view this later on. @@ -93,7 +94,6 @@ Depending on the default "Admin consent require' value for your organization you Admin Consent Option - ### Setup information needed Once your Microsoft Entra ID app is set up, you're ready to deploy Airbyte Self-Managed Enterprise with SSO. Take note of the following configuration values, as you will need them to configure Airbyte to use your new Okta SSO app integration: @@ -107,5 +107,3 @@ Use this information to configure the auth details of your `airbyte.yml` for you - - diff --git a/docs/access-management/sso-providers/okta.md b/docs/access-management/sso-providers/okta.md index 241c385cdf0..9d16edfe2a9 100644 --- a/docs/access-management/sso-providers/okta.md +++ b/docs/access-management/sso-providers/okta.md @@ -63,6 +63,7 @@ On the following screen you'll need to configure all parameters for your Okta ap * Your **Okta domain** (it's not specific to this application, see [Find your Okta domain](https://developer.okta.com/docs/guides/find-your-domain/main/)) * **Client ID** * **Client Secret** + Create the application with the following parameters: @@ -104,5 +105,6 @@ On the following screen you'll need to configure all parameters for your Okta ap * Client Secret Visit the [implementation guide](/enterprise-setup/implementation-guide.md) for instructions on how to deploy Airbyte Enterprise using `kubernetes`, `kubectl` and `helm`. + diff --git a/docs/access-management/sso.md b/docs/access-management/sso.md index 065c7ed74e5..b9b8574ea29 100644 --- a/docs/access-management/sso.md +++ b/docs/access-management/sso.md @@ -35,4 +35,3 @@ import DocCardList from '@theme/DocCardList'; Accessing your self hosted Airbyte will automatically forward you to your IdP's login page (e.g. Okta login page). Log into your work account and you’ll be forwarded back to your Airbyte and be logged in. - diff --git a/docs/api-documentation.md b/docs/api-documentation.md index 53cf2c7845a..9b1fa1c8264 100644 --- a/docs/api-documentation.md +++ b/docs/api-documentation.md @@ -6,10 +6,10 @@ products: all Airbyte has two sets of APIs which are intended for different uses. The table below outlines their descriptions, use cases, availability and status. -| | **Airbyte API** | **Configuration API** | -|------------------------|-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------| -| **Description** | Airbyte API is a reliable, easy-to-use interface for programmatically controlling the Airbyte platform. With full support from the Airbyte team. | The Config API is an internal Airbyte API that is designed for communications between different Airbyte components. 
| -| **Use Cases** | Enables users to control Airbyte programmatically and use with Orchestration tools (ex: Airflow)

    Exists for Airbyte users to write applications against.

    Enables [Powered by Airbyte](https://airbyte.com/embed-airbyte-connectors-with-api) | Enables Airbyte Engineering team to configure Airbyte | -| **Intended users** | Airbyte OSS, Cloud & Self-Hosted Enterprise | Airbyte Engineering Team | -| **Status** | Available to all Airbyte users (OSS, Cloud, Self-Hosted Enterprise). Learn more on our [blog](https://airbyte.com/blog/airbyte-api).

    Full support from the Airbyte team. | Airbyte does NOT have active commitments to support this API long-term. Users utilize the Config API, at their own risk.

    This API is utilized internally by the Airbyte Engineering team and may be modified in the future if the need arises.

    Modifications by the Airbyte Engineering team could create breaking changes and OSS users would need to update their code to catch up to any backwards incompatible changes in the API. | -| **Documentation** | [Available here](https://api.airbyte.com) | [Available here](https://airbyte-public-api-docs.s3.us-east-2.amazonaws.com/rapidoc-api-docs.html) +| | **Airbyte API** | **Configuration API** | +| ------------------ | ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------- | ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------- | +| **Description** | Airbyte API is a reliable, easy-to-use interface for programmatically controlling the Airbyte platform. With full support from the Airbyte team. | The Config API is an internal Airbyte API that is designed for communications between different Airbyte components. | +| **Use Cases** | Enables users to control Airbyte programmatically and use with Orchestration tools (ex: Airflow)

    Exists for Airbyte users to write applications against.

    Enables [Powered by Airbyte](https://airbyte.com/embed-airbyte-connectors-with-api) | Enables Airbyte Engineering team to configure Airbyte | +| **Intended users** | Airbyte OSS, Cloud & Self-Hosted Enterprise | Airbyte Engineering Team | +| **Status** | Available to all Airbyte users (OSS, Cloud, Self-Hosted Enterprise). Learn more on our [blog](https://airbyte.com/blog/airbyte-api).

    Full support from the Airbyte team. | Airbyte does NOT have active commitments to support this API long-term. Users utilize the Config API, at their own risk.

    This API is utilized internally by the Airbyte Engineering team and may be modified in the future if the need arises.

    Modifications by the Airbyte Engineering team could create breaking changes and OSS users would need to update their code to catch up to any backwards incompatible changes in the API. | +| **Documentation** | [Available here](https://api.airbyte.com) | [Available here](https://airbyte-public-api-docs.s3.us-east-2.amazonaws.com/rapidoc-api-docs.html) | diff --git a/docs/cloud/managing-airbyte-cloud/configuring-connections.md b/docs/cloud/managing-airbyte-cloud/configuring-connections.md index 94b213e439f..7dfb664503d 100644 --- a/docs/cloud/managing-airbyte-cloud/configuring-connections.md +++ b/docs/cloud/managing-airbyte-cloud/configuring-connections.md @@ -8,11 +8,11 @@ A connection links a source to a destination and defines how your data will sync ## Configure Connection Settings -Configuring the connection settings allows you to manage various aspects of the sync, such as how often data syncs and where data is written. +Configuring the connection settings allows you to manage various aspects of the sync, such as how often data syncs and where data is written. To configure these settings: -1. In the Airbyte UI, click **Connections** and then click the connection you want to change. +1. In the Airbyte UI, click **Connections** and then click the connection you want to change. 2. Click the **Settings** tab. @@ -26,14 +26,14 @@ These settings apply to all streams in the connection. You can configure the following settings: -| Setting | Description | -|--------------------------------------|-------------------------------------------------------------------------------------| -| Connection Name | A custom name for your connection | -| [Schedule Type](/using-airbyte/core-concepts/sync-schedules.md) | How often data syncs (can be scheduled, cron, API-triggered or manual) | -| [Destination Namespace](/using-airbyte/core-concepts/namespaces.md) | Where the replicated data is written to in the destination | -| Destination Stream Prefix | A prefix added to each table name in the destination | -| [Detect and propagate schema changes](/cloud/managing-airbyte-cloud/manage-schema-changes.md) | How Airbyte handles schema changes in the source | -| [Connection Data Residency](/cloud/managing-airbyte-cloud/manage-data-residency.md) | Where data will be processed (Cloud only) | +| Setting | Description | +| --------------------------------------------------------------------------------------------- | ---------------------------------------------------------------------- | +| Connection Name | A custom name for your connection | +| [Schedule Type](/using-airbyte/core-concepts/sync-schedules.md) | How often data syncs (can be scheduled, cron, API-triggered or manual) | +| [Destination Namespace](/using-airbyte/core-concepts/namespaces.md) | Where the replicated data is written to in the destination | +| Destination Stream Prefix | A prefix added to each table name in the destination | +| [Detect and propagate schema changes](/cloud/managing-airbyte-cloud/manage-schema-changes.md) | How Airbyte handles schema changes in the source | +| [Connection Data Residency](/cloud/managing-airbyte-cloud/manage-data-residency.md) | Where data will be processed (Cloud only) | ## Modify Streams @@ -61,9 +61,9 @@ Source-defined cursors and primary keys are selected automatically and cannot be :::info -* You can only deselect top-level fields. You cannot deselect nested fields. 
-* The Airbyte platform may read all data from the source (depending on the source), but it will only write data to the destination from fields you selected. Deselecting fields will not prevent the Airbyte platform from reading them. -* When you refresh the schema, newly added fields will be selected by default, even if you have previously deselected fields in that stream. +- You can only deselect top-level fields. You cannot deselect nested fields. +- The Airbyte platform may read all data from the source (depending on the source), but it will only write data to the destination from fields you selected. Deselecting fields will not prevent the Airbyte platform from reading them. +- When you refresh the schema, newly added fields will be selected by default, even if you have previously deselected fields in that stream. ::: diff --git a/docs/cloud/managing-airbyte-cloud/dbt-cloud-integration.md b/docs/cloud/managing-airbyte-cloud/dbt-cloud-integration.md index f5822a2d28e..4293c346337 100644 --- a/docs/cloud/managing-airbyte-cloud/dbt-cloud-integration.md +++ b/docs/cloud/managing-airbyte-cloud/dbt-cloud-integration.md @@ -2,9 +2,9 @@ products: cloud --- -# Use the dbt Cloud integration +# Use the dbt Cloud integration -By using the dbt Cloud integration, you can create and run dbt transformations during syncs in Airbyte Cloud. This allows you to transform raw data into a format that is suitable for analysis and reporting, including cleaning and enriching the data. +By using the dbt Cloud integration, you can create and run dbt transformations during syncs in Airbyte Cloud. This allows you to transform raw data into a format that is suitable for analysis and reporting, including cleaning and enriching the data. :::note @@ -14,13 +14,13 @@ Normalizing data may cause an increase in your destination's compute cost. This ## Step 1: Generate a service token -Generate a [service token](https://docs.getdbt.com/docs/dbt-cloud-apis/service-tokens#generating-service-account-tokens) for your dbt Cloud transformation. +Generate a [service token](https://docs.getdbt.com/docs/dbt-cloud-apis/service-tokens#generating-service-account-tokens) for your dbt Cloud transformation. :::note -* To use the dbt Cloud integration, you must use a paid version of dbt Cloud. -* The service token must have Member, Job Admin, or Account Admin permissions. - +- To use the dbt Cloud integration, you must use a paid version of dbt Cloud. +- The service token must have Member, Job Admin, or Account Admin permissions. + ::: ## Step 2: Set up the dbt Cloud integration in Airbyte Cloud @@ -37,12 +37,12 @@ To set up the dbt Cloud integration in Airbyte Cloud: 5. Go to the **Transformation** tab and click **+ Add transformation**. -6. Select the transformation from the dropdown and click **Save changes**. The transformation will run during the subsequent syncs until you remove it. +6. Select the transformation from the dropdown and click **Save changes**. The transformation will run during the subsequent syncs until you remove it. :::note You can have multiple transformations per connection. - + ::: -8. To remove a transformation, click **X** on the transformation and click **Save changes**. +8. To remove a transformation, click **X** on the transformation and click **Save changes**. 
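Because a token with insufficient permissions often only surfaces as an error later, when the transformation fails to trigger, it can be worth verifying the service token before entering it into Airbyte. The following is a minimal sketch against the dbt Cloud Administrative API; the `cloud.getdbt.com` host and the `DBT_CLOUD_SERVICE_TOKEN` environment variable are assumptions for illustration, not values from this guide.

```python
# Sketch: sanity-check a dbt Cloud service token before adding it to Airbyte.
# Assumes the default cloud.getdbt.com host and a token exported as
# DBT_CLOUD_SERVICE_TOKEN; adjust both for your own account or region.
import os

import requests

token = os.environ["DBT_CLOUD_SERVICE_TOKEN"]
resp = requests.get(
    "https://cloud.getdbt.com/api/v2/accounts/",
    headers={"Authorization": f"Token {token}"},
    timeout=30,
)
resp.raise_for_status()  # a 401/403 here usually means a bad or under-privileged token
accounts = resp.json().get("data", [])
print(f"Token accepted; it can see {len(accounts)} account(s).")
```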
diff --git a/docs/cloud/managing-airbyte-cloud/manage-airbyte-cloud-notifications.md b/docs/cloud/managing-airbyte-cloud/manage-airbyte-cloud-notifications.md index 09d5dd4389b..4ad1a9b8bfe 100644 --- a/docs/cloud/managing-airbyte-cloud/manage-airbyte-cloud-notifications.md +++ b/docs/cloud/managing-airbyte-cloud/manage-airbyte-cloud-notifications.md @@ -4,43 +4,43 @@ products: all # Manage notifications -This page provides guidance on how to manage notifications for Airbyte, allowing you to stay up-to-date on the activities in your workspace. +This page provides guidance on how to manage notifications for Airbyte, allowing you to stay up-to-date on the activities in your workspace. ## Notification Event Types -| Type of Notification | Description | -|------------------------|---------------------------------------------------------------------------------------------------------------------| -| **Failed Syncs** | A sync from any of your connections fails. Note that if sync runs frequently or if there are many syncs in the workspace these types of events can be noisy | -| **Successful Syncs** | A sync from any of your connections succeeds. Note that if sync runs frequently or if there are many syncs in the workspace these types of events can be noisy -| **Automated Connection Updates** | A connection is updated automatically (ex. a source schema is automatically updated) | -| **Connection Updates Requiring Action** | A connection update requires you to take action (ex. a breaking schema change is detected) | -| **Warning - Repeated Failures** | A connection will be disabled soon due to repeated failures. It has failed 50 times consecutively or there were only failed jobs in the past 7 days | -| **Sync Disabled - Repeated Failures** | A connection was automatically disabled due to repeated failures. It will be disabled when it has failed 100 times consecutively or has been failing for 14 days in a row | -| **Warning - Upgrade Required** (Cloud only) | A new connector version is available and requires manual upgrade | -| **Sync Disabled - Upgrade Required** (Cloud only) | One or more connections were automatically disabled due to a connector upgrade deadline passing +| Type of Notification | Description | +| ------------------------------------------------- | ------------------------------------------------------------------------------------------------------------------------------------------------------------------------- | +| **Failed Syncs** | A sync from any of your connections fails. Note that if sync runs frequently or if there are many syncs in the workspace these types of events can be noisy | +| **Successful Syncs** | A sync from any of your connections succeeds. Note that if sync runs frequently or if there are many syncs in the workspace these types of events can be noisy | +| **Automated Connection Updates** | A connection is updated automatically (ex. a source schema is automatically updated) | +| **Connection Updates Requiring Action** | A connection update requires you to take action (ex. a breaking schema change is detected) | +| **Warning - Repeated Failures** | A connection will be disabled soon due to repeated failures. It has failed 50 times consecutively or there were only failed jobs in the past 7 days | +| **Sync Disabled - Repeated Failures** | A connection was automatically disabled due to repeated failures. 
It will be disabled when it has failed 100 times consecutively or has been failing for 14 days in a row | +| **Warning - Upgrade Required** (Cloud only) | A new connector version is available and requires manual upgrade | +| **Sync Disabled - Upgrade Required** (Cloud only) | One or more connections were automatically disabled due to a connector upgrade deadline passing | ### Enabling schema update notifications -To be notified of any source schema changes, make sure you have enabled `Automatic Connection Updates` and `Connection Updates Requiring Action` notifications. If these are off, even if you turned on schema update notifications in a connection's settings, Airbyte will *NOT* send out any notifications related to these types of events. +To be notified of any source schema changes, make sure you have enabled `Automatic Connection Updates` and `Connection Updates Requiring Action` notifications. If these are off, even if you turned on schema update notifications in a connection's settings, Airbyte will _NOT_ send out any notifications related to these types of events. To edit this setting, click **Connections** and select the connection you want to receive notifications for. Click the **Settings** tab on the Connection page. In the **Advanced Settings**, toggle **Schema update notifications**. - ## Configure Email Notification Settings To set up email notifications, click **Settings** and navigate to **Workspace** > **Notifications**. -Toggle which messages you'd like to receive from Airbyte. All email notifications will be sent by default to the creator of the workspace. +Toggle which messages you'd like to receive from Airbyte. All email notifications will be sent by default to the creator of the workspace. -![](./assets/notifications-email.png) +![](./assets/notifications-email.png) :::note -All email notifications except for Successful Syncs are enabled by default. +All email notifications except for Successful Syncs are enabled by default. ::: ### Modify the email recipient + To change the recipient, edit and save the **notification email recipient**. If you would like to send email notifications to more than one recipient, you can enter an email distribution list (ie Google Group) as the recipient. ## Configure Slack Notification settings @@ -49,35 +49,34 @@ If you're more of a visual learner, head over to [this video](https://www.youtub ### Create a Slack app -1. To set up Slack notifications, navigate to https://api.slack.com/apps/. Select `Create an App`. +1. To set up Slack notifications, navigate to https://api.slack.com/apps/. Select `Create an App`. -![](./assets/notification-slack-create-app.png) +![](./assets/notification-slack-create-app.png) -2. Select `From Scratch`. Enter your App Name (e.g. Airbyte Sync Notifications) and pick your desired Slack workspace. +2. Select `From Scratch`. Enter your App Name (e.g. Airbyte Sync Notifications) and pick your desired Slack workspace. -3. **Enable Incoming Webhooks**: in the left sidebar, click on `Incoming Webhooks`. Click the slider button in the top right to turn the feature on. Then click `Add New Webhook to Workspace`. +3. **Enable Incoming Webhooks**: in the left sidebar, click on `Incoming Webhooks`. Click the slider button in the top right to turn the feature on. Then click `Add New Webhook to Workspace`. -![](./assets/notification-slack-add-webhook.png) +![](./assets/notification-slack-add-webhook.png) 4. 
Select the channel that you want to receive Airbyte notifications in (ideally a dedicated one), and click `Allow` to give it permissions to access the channel. You should see the bot show up in the selected channel now. You will see an active webhook right above the `Add New Webhook to Workspace` button. -![](./assets/notification-slack-webhook-url-success.png) +![](./assets/notification-slack-webhook-url-success.png) 5. Click `Copy.` to copy the link to your clipboard, which you will need to enter into Airbyte. Your Webhook URL should look similar to this: - ``` - https://hooks.slack.com/services/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PM1 - ``` +``` +https://hooks.slack.com/services/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PM1 +``` ### Enable the Slack notification in Airbyte -1. Click **Settings** and navigate to **Notifications**. On this page, you can toggle each slider decide whether you want notifications on each notification type. Paste the copied webhook URL to `Webhook URL`. +1. Click **Settings** and navigate to **Notifications**. On this page, you can toggle each slider decide whether you want notifications on each notification type. Paste the copied webhook URL to `Webhook URL`. -3. **Test it out**: you can click `Test` to send a test message to the channel. Or, just run a sync now and try it out! For a successful sync, you should receive a notification that looks like this: +2. **Test it out**: you can click `Test` to send a test message to the channel. Or, just run a sync now and try it out! For a successful sync, you should receive a notification that looks like this: ![](./assets/notification-slack-success.png) - -4. Click **Save changes** to ensure you continue to receive alerts about your Airbyte syncs. \ No newline at end of file +4. Click **Save changes** to ensure you continue to receive alerts about your Airbyte syncs. diff --git a/docs/cloud/managing-airbyte-cloud/manage-connection-state.md b/docs/cloud/managing-airbyte-cloud/manage-connection-state.md index a745288fbe8..bd08c010724 100644 --- a/docs/cloud/managing-airbyte-cloud/manage-connection-state.md +++ b/docs/cloud/managing-airbyte-cloud/manage-connection-state.md @@ -4,16 +4,17 @@ products: all # Modifying connection state -The connection state provides additional information about incremental syncs. It includes the most recent values for the global or stream-level cursors, which can aid in debugging or determining which data will be included in the next sync. +The connection state provides additional information about incremental syncs. It includes the most recent values for the global or stream-level cursors, which can aid in debugging or determining which data will be included in the next sync. To review the connection state: + 1. In the Airbyte UI, click **Connections** and then click the connection you want to display. 2. Click the **Settings** tab on the Connection page. -3. Click the **Advanced** dropdown arrow. +3. Click the **Advanced** dropdown arrow. - **Connection State** displays. + **Connection State** displays. Editing the connection state allows the sync to start from any date in the past. If the state is edited, Airbyte will start syncing incrementally from the new date. This is helpful if you do not want to fully resync your data. To edit the connection state: @@ -25,4 +26,4 @@ Updates to connection state should be handled with extreme care. Updates may bre 2. Confirm changes by clicking "Update state". Discard any changes by clikcing "Revert changes". -3. 
Confirm the changes to the connection state update. \ No newline at end of file +3. Confirm the changes to the connection state update. diff --git a/docs/cloud/managing-airbyte-cloud/manage-credits.md b/docs/cloud/managing-airbyte-cloud/manage-credits.md index 67518ead415..df05bba2218 100644 --- a/docs/cloud/managing-airbyte-cloud/manage-credits.md +++ b/docs/cloud/managing-airbyte-cloud/manage-credits.md @@ -4,39 +4,40 @@ products: cloud # Manage credits -Airbyte [credits](https://airbyte.com/pricing) are used to pay for Airbyte resources when you run a sync. You can purchase credits on Airbyte Cloud to keep your data flowing without interruption. +Airbyte [credits](https://airbyte.com/pricing) are used to pay for Airbyte resources when you run a sync. You can purchase credits on Airbyte Cloud to keep your data flowing without interruption. ## Buy credits -1. To purchase credits directly through the UI, click **Billing** in the left-hand sidebar. The billing page displays the available credits, total credit usage, and the credit usage per connection. +1. To purchase credits directly through the UI, click **Billing** in the left-hand sidebar. The billing page displays the available credits, total credit usage, and the credit usage per connection. - :::tip + :::tip - If you are unsure of how many credits you need, use our [Cost Estimator](https://www.airbyte.com/pricing) or click **Talk to Sales** to find the right amount for your team. + If you are unsure of how many credits you need, use our [Cost Estimator](https://www.airbyte.com/pricing) or click **Talk to Sales** to find the right amount for your team. - ::: + ::: 2. Click **Buy credits**. Enter the quantity of credits you intend to purchase and adjust the **credit quantity** accordingly. When you're ready, click **Checkout**. - :::note + :::note - Purchase limits: - * Minimum: 20 credits - * Maximum: 6,000 credits + Purchase limits: - ::: + - Minimum: 20 credits + - Maximum: 6,000 credits - To buy more credits or discuss a custom plan, reach out to [Sales](https://airbyte.com/talk-to-sales). + ::: -5. You'll be renavigated to a Stripe payment page. If this is your first time purchasing, you'll be asked for payment details. After you enter your billing address, sales tax (if applicable) is calculated and added to the total. + To buy more credits or discuss a custom plan, reach out to [Sales](https://airbyte.com/talk-to-sales). -6. Click **Pay** to process your payment. A receipt for your purchase is automatically sent to your email. +3. You'll be renavigated to a Stripe payment page. If this is your first time purchasing, you'll be asked for payment details. After you enter your billing address, sales tax (if applicable) is calculated and added to the total. - :::note +4. Click **Pay** to process your payment. A receipt for your purchase is automatically sent to your email. - Credits expire after one year if they are not used. + :::note - ::: + Credits expire after one year if they are not used. + + ::: ## Automatic reload of credits @@ -51,11 +52,12 @@ To enroll, [email us](mailto:billing@airbyte.io) with: As an example, if the recharge threshold is 10 credits and recharge balance is 30 credits, anytime your credit balance dips below 10 credits, Airbyte will automatically add enough credits to bring the balance back to 30 credits by charging the difference between your credit balance and 30 credits. To take a real example, if: + 1. The credit balance reached 3 credits. 2. 
27 credits are automatically charged to the card on file and added to the balance. 3. The ending credit balance is 30 credits. -Note that the difference between the recharge credit amount and recharge threshold must be at least 20 as our minimum purchase is 20 credits. +Note that the difference between the recharge credit amount and recharge threshold must be at least 20 as our minimum purchase is 20 credits. If you are enrolled and want to change your limits or cancel your enrollment, [email us](mailto:billing@airbyte.io). diff --git a/docs/cloud/managing-airbyte-cloud/manage-data-residency.md b/docs/cloud/managing-airbyte-cloud/manage-data-residency.md index ec76c2cb334..bbec07165ed 100644 --- a/docs/cloud/managing-airbyte-cloud/manage-data-residency.md +++ b/docs/cloud/managing-airbyte-cloud/manage-data-residency.md @@ -8,31 +8,32 @@ In Airbyte Cloud, you can set the default data residency for your workspace and ## Choose your workspace default data residency -Setting a default data residency allows you to choose where your data is processed. Set the default data residency **before** creating a new source or connection so that subsequent workflows that rely on the default data residency, such as fetching the schema or testing the source or destination, can process data in the correct region. +Setting a default data residency allows you to choose where your data is processed. Set the default data residency **before** creating a new source or connection so that subsequent workflows that rely on the default data residency, such as fetching the schema or testing the source or destination, can process data in the correct region. -:::note +:::note While the data is processed in a data plane of the chosen residency, the cursor and primary key data is stored in the US control plane. If you have data that cannot be stored in the US, do not use it as a cursor or primary key. ::: -When you set the default data residency, it applies your preference to new connections only. If you do not adjust the default data residency, the [Airbyte Default](configuring-connections.md) region is used (United States). If you want to change the data residency for an individual connection, you can do so in its [connection settings](configuring-connections.md). +When you set the default data residency, it applies your preference to new connections only. If you do not adjust the default data residency, the [Airbyte Default](configuring-connections.md) region is used (United States). If you want to change the data residency for an individual connection, you can do so in its [connection settings](configuring-connections.md). To choose your default data residency, click **Settings** in the Airbyte UI. Navigate to **Workspace** > **Data Residency**. Use the dropdown to choose the location for your default data residency and save your changes. -:::info +:::info -Depending on your network configuration, you may need to add [IP addresses](/operating-airbyte/security.md#network-security-1) to your allowlist. +Depending on your network configuration, you may need to add [IP addresses](/operating-airbyte/security.md#network-security-1) to your allowlist. ::: ## Choose the data residency for a connection + You can additionally choose the data residency for your connection in the connection settings. You can choose the data residency when creating a new connection, or you can set the default data residency for your workspace so that it applies for any new connections moving forward. 
To choose a custom data residency for your connection, click **Connections** in the Airbyte UI and then select the connection that you want to configure. Navigate to the **Settings** tab, open the **Advanced Settings**, and select the **Data residency** for the connection. -:::note +:::note -Changes to data residency will not affect any sync in progress. +Changes to data residency will not affect any sync in progress. ::: diff --git a/docs/cloud/managing-airbyte-cloud/manage-schema-changes.md b/docs/cloud/managing-airbyte-cloud/manage-schema-changes.md index f5714c0a77c..6bc5b188ed0 100644 --- a/docs/cloud/managing-airbyte-cloud/manage-schema-changes.md +++ b/docs/cloud/managing-airbyte-cloud/manage-schema-changes.md @@ -7,42 +7,46 @@ products: all You can specify for each connection how Airbyte should handle any change of schema in the source. This process helps ensure accurate and efficient data syncs, minimizing errors and saving you time and effort in managing your data pipelines. ## Types of Schema Changes -When propagation is enabled, your data in the destination will automatically shift to bring in the new changes. -| Type of Schema Change | Propagation Behavior | -|---------------------|---------------------------------------------------------------------------------------------------------------------| -| New Column | The new colummn will be created in the destination. Values for the column will be filled in for the updated rows. If you are missing values for rows not updated, a backfill can be done by completing a full resync or through the `Backfill new or renamed columns` option (see below) -| Removal of column | The old column will be removed from the destination. -| New stream | The first sync will create the new stream in the destination and fill all data in as if it is an initial sync. | -| Removal of stream | The stream will stop updating, and any existing data in the destination will remain. | -| Column data type changes | The data in the destination will remain the same. For those syncing on a Destinations V2 destination, any new or updated rows with incompatible data types will result in a row error in the destination tables and show an error in the `airbyte_meta` field. You will need to refresh the schema and do a full resync to ensure the data types are consistent. +When propagation is enabled, your data in the destination will automatically shift to bring in the new changes. + +| Type of Schema Change | Propagation Behavior | +| ------------------------ | ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------- | +| New Column | The new colummn will be created in the destination. Values for the column will be filled in for the updated rows. If you are missing values for rows not updated, a backfill can be done by completing a full resync or through the `Backfill new or renamed columns` option (see below) | +| Removal of column | The old column will be removed from the destination. | +| New stream | The first sync will create the new stream in the destination and fill all data in as if it is an initial sync. | +| Removal of stream | The stream will stop updating, and any existing data in the destination will remain. 
| +| Column data type changes | The data in the destination will remain the same. For those syncing on a Destinations V2 destination, any new or updated rows with incompatible data types will result in a row error in the destination tables and show an error in the `airbyte_meta` field. You will need to refresh the schema and do a full resync to ensure the data types are consistent. | ## Detect and Propagate Schema Changes -Based on your configured settings for **Detect and propagate schema changes**, Airbyte will automatically sync those changes or ignore them: -| Setting | Description | -|---------------------|---------------------------------------------------------------------------------------------------------------------| -| Propagate all changes (streams and fields) | All new streams and column changes from the source will automatically be propagated and reflected in the destination. This includes stream changes (additions or deletions), column changes (additions or deletions) and data type changes -| Propagate column changes only | Only column changes will be propagated. New or removed streams are ignored. -| Detect changes and manually approve | Schema changes will be detected, but not propagated. Syncs will continue running with the schema you've set up. To propagate the detected schema changes, you will need to approve the changes manually | -| Detect changes and pause connection | Connections will be automatically disabled as soon as any schema changes are detected | +Based on your configured settings for **Detect and propagate schema changes**, Airbyte will automatically sync those changes or ignore them: -Airbyte currently checks for any changes in your source schema immediately before syncing, at most once every 24 hours. This means that your schema may not always be propagated before your sync. +| Setting | Description | +| ------------------------------------------ | ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------ | +| Propagate all changes (streams and fields) | All new streams and column changes from the source will automatically be propagated and reflected in the destination. This includes stream changes (additions or deletions), column changes (additions or deletions) and data type changes | +| Propagate column changes only | Only column changes will be propagated. New or removed streams are ignored. | +| Detect changes and manually approve | Schema changes will be detected, but not propagated. Syncs will continue running with the schema you've set up. To propagate the detected schema changes, you will need to approve the changes manually | +| Detect changes and pause connection | Connections will be automatically disabled as soon as any schema changes are detected | + +Airbyte currently checks for any changes in your source schema immediately before syncing, at most once every 24 hours. This means that your schema may not always be propagated before your sync. :::tip Ensure you receive schema notifications for your connection by enabling notifications in the connection's settings. ::: In all cases, if a breaking schema change is detected, the connection will be paused immediately for manual review to prevent future syncs from failing. 
Breaking schema changes occur when: -* An existing primary key is removed from the source -* An existing cursor is removed from the source + +- An existing primary key is removed from the source +- An existing cursor is removed from the source To re-enable the streams, ensure the correct **Primary Key** and **Cursor** are selected for each stream and save the connection. You will be prompted to clear the affected streams so that Airbyte can ensure future syncs are successful. ### Backfill new or renamed columns -To further automate the propagation of schema changes, Airbyte also offers the option to backfill new or renamed columns as a part of the sync. This means that anytime a new column is detected through the auto-propagation of schema changes, Airbyte will sync the entire stream again so that all values in the new columns will be completely filled, even if the row was not updated. If this option is not enabled, only rows that are updated as a part of the regular sync will be populated with a value. -This feature will only perform the backfill when `Detect and propagate schema changes` is set to `Propagate all changes` or `Propagate columns changes only` and Airbyte detects the schema change as a part of a sync. Refreshing the schema manually and applying schema changes will not allow the backfill to occur. +To further automate the propagation of schema changes, Airbyte also offers the option to backfill new or renamed columns as a part of the sync. This means that anytime a new column is detected through the auto-propagation of schema changes, Airbyte will sync the entire stream again so that all values in the new columns will be completely filled, even if the row was not updated. If this option is not enabled, only rows that are updated as a part of the regular sync will be populated with a value. + +This feature will only perform the backfill when `Detect and propagate schema changes` is set to `Propagate all changes` or `Propagate columns changes only` and Airbyte detects the schema change as a part of a sync. Refreshing the schema manually and applying schema changes will not allow the backfill to occur. :::tip Enabling automatic backfills may incur increased destination costs from refreshing the entire stream. @@ -54,11 +58,11 @@ For Cloud users, any stream that contains a new or renamed column will not be bi If the connection is set to **Detect any changes and manually approve** schema changes, Airbyte continues syncing according to your last saved schema. You need to manually approve any detected schema changes for the schema in the destination to change. -1. In the Airbyte UI, click **Connections**. Select a connection and navigate to the **Schema** tab. If schema changes are detected, you'll see a blue "i" icon next to the Replication ab. +1. In the Airbyte UI, click **Connections**. Select a connection and navigate to the **Schema** tab. If schema changes are detected, you'll see a blue "i" icon next to the Replication ab. 2. Click **Review changes**. -3. The **Refreshed source schema** dialog displays the changes detected. +3. The **Refreshed source schema** dialog displays the changes detected. 4. Review the changes and click **OK** to close the dialog. 
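Taken together, the propagation settings and breaking-change rules above reduce to a small decision rule: breaking changes always pause the connection for review, and the configured setting decides which remaining changes are applied automatically. The sketch below only illustrates that rule as described on this page; it is not Airbyte's implementation.

```python
# Illustration of the propagation rules described on this page -- not
# Airbyte's implementation. A detected change is applied, held for review,
# ignored, or pauses the connection.
BREAKING = {"primary_key_removed", "cursor_removed"}
COLUMN_CHANGES = {"new_column", "removed_column", "column_type_changed"}


def handle_schema_change(change: str, setting: str) -> str:
    if change in BREAKING:
        return "pause connection for manual review"
    if setting == "propagate_all":
        return "propagate automatically"
    if setting == "propagate_columns_only":
        # stream additions and removals are ignored under this setting
        return "propagate automatically" if change in COLUMN_CHANGES else "ignore"
    if setting == "detect_and_approve":
        return "detect only; wait for manual approval"
    if setting == "detect_and_pause":
        return "disable the connection"
    raise ValueError(f"unknown setting: {setting}")


print(handle_schema_change("new_column", "propagate_columns_only"))  # propagate automatically
```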
@@ -74,18 +78,19 @@ Breaking changes can also occur when a new major version of the connector is rel A major version upgrade will include a breaking change if any of these apply: -| Type of Change | Description | -|------------------|---------------------------------------------------------------------------------------------------------------------| -| Connector Spec Change | The configuration has been changed and syncs will fail until users reconfigure or re-authenticate. | -| Schema Change | The type of property previously present within a record has changed and a refresh of the source schema is required. -| Stream or Property Removal | Data that was previously being synced is no longer going to be synced | -| Destination Format / Normalization Change | The way the destination writes the final data or how Airbyte cleans that data is changing in a way that requires a full refresh | -| State Changes | The format of the source’s state has changed, and the full dataset will need to be re-synced | +| Type of Change | Description | +| ----------------------------------------- | ------------------------------------------------------------------------------------------------------------------------------- | +| Connector Spec Change | The configuration has been changed and syncs will fail until users reconfigure or re-authenticate. | +| Schema Change | The type of property previously present within a record has changed and a refresh of the source schema is required. | +| Stream or Property Removal | Data that was previously being synced is no longer going to be synced | +| Destination Format / Normalization Change | The way the destination writes the final data or how Airbyte cleans that data is changing in a way that requires a full refresh | +| State Changes | The format of the source’s state has changed, and the full dataset will need to be re-synced | To review and fix breaking schema changes: + 1. In the Airbyte UI, click **Connections** and select the connection with breaking changes. -2. Review the description of what has changed in the new version. The breaking change will require you to upgrade your source or destination to a new version by a specific cutoff date. +2. Review the description of what has changed in the new version. The breaking change will require you to upgrade your source or destination to a new version by a specific cutoff date. 3. Update the source or destination to the new version to continue syncing. Follow the connector-specific migration guide to ensure your connections continue syncing successfully. @@ -93,8 +98,8 @@ To review and fix breaking schema changes: In addition to Airbyte's automatic schema change detection, you can manually refresh the source schema to stay up to date with changes in your schema. To manually refresh the source schema: - 1. In the Airbyte UI, click **Connections** and then click the connection you want to refresh. Click the **Schema** tab. +1. In the Airbyte UI, click **Connections** and then click the connection you want to refresh. Click the **Schema** tab. - 2. In the **Select streams** table, click **Refresh source schema** to fetch the schema of your data source. +2. In the **Select streams** table, click **Refresh source schema** to fetch the schema of your data source. - 3. If there are changes to the schema, you can review them in the **Refreshed source schema** dialog. \ No newline at end of file +3. If there are changes to the schema, you can review them in the **Refreshed source schema** dialog. 
diff --git a/docs/cloud/managing-airbyte-cloud/review-connection-status.md b/docs/cloud/managing-airbyte-cloud/review-connection-status.md index c93a94d3bb1..3128ceb5295 100644 --- a/docs/cloud/managing-airbyte-cloud/review-connection-status.md +++ b/docs/cloud/managing-airbyte-cloud/review-connection-status.md @@ -3,48 +3,50 @@ products: all --- # Review the connection status + The connection status displays information about the connection and of each stream being synced. Reviewing this summary allows you to assess the connection's current status and understand when the next sync will be run. - + ![Connection Status](./assets/connection-status-page.png) To review the connection status: -1. In the Airbyte UI, click **Connections**. -2. Click a connection in the list to view its status. +1. In the Airbyte UI, click **Connections**. -| Status | Description | -|------------------|---------------------------------------------------------------------------------------------------------------------| -| On time | The connection is operating within the expected timeframe expectations set by the replication frequency | -| On track | The connection is slightly delayed but is expected to catch up before the next sync. | -| Delayed | The connection has not loaded data within the scheduled replication frequency. For example, if the replication frequency is 1 hour, the connection has not loaded data for more than 1 hour | -| Error | The connection has not loaded data in more than two times the scheduled replication frequency. For example, if the replication frequency is 1 hour, the connection has not loaded data for more than 2 hours | -| Action Required | A breaking change related to the source or destination requires attention to resolve | -| In Progress | The connection is currently extracting or loading data | -| Disabled | The connection has been disabled and is not scheduled to run | -| Pending | The connection has not been run yet, so no status exists | - -If the most recent sync failed, you'll see the error message that will help diagnose if the failure is due to a source or destination configuration error. [Reach out](/community/getting-support.md) to us if you need any help to ensure you data continues syncing. +2. Click a connection in the list to view its status. + +| Status | Description | +| --------------- | ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------ | +| On time | The connection is operating within the expected timeframe expectations set by the replication frequency | +| On track | The connection is slightly delayed but is expected to catch up before the next sync. | +| Delayed | The connection has not loaded data within the scheduled replication frequency. For example, if the replication frequency is 1 hour, the connection has not loaded data for more than 1 hour | +| Error | The connection has not loaded data in more than two times the scheduled replication frequency. 
For example, if the replication frequency is 1 hour, the connection has not loaded data for more than 2 hours | +| Action Required | A breaking change related to the source or destination requires attention to resolve | +| In Progress | The connection is currently extracting or loading data | +| Disabled | The connection has been disabled and is not scheduled to run | +| Pending | The connection has not been run yet, so no status exists | + +If the most recent sync failed, you'll see the error message that will help diagnose if the failure is due to a source or destination configuration error. [Reach out](/community/getting-support.md) to us if you need any help to ensure you data continues syncing. :::info If a sync starts to fail, it will automatically be disabled after 100 consecutive failures or 14 consecutive days of failure. ::: -If a new major version of the connector has been released, you will also see a banner on this page indicating the cutoff date for the version. Airbyte recommends upgrading before the cutoff date to ensure your data continues syncing. If you do not upgrade before the cutoff date, Airbyte will automatically disable your connection. +If a new major version of the connector has been released, you will also see a banner on this page indicating the cutoff date for the version. Airbyte recommends upgrading before the cutoff date to ensure your data continues syncing. If you do not upgrade before the cutoff date, Airbyte will automatically disable your connection. Learn more about version upgrades in our [resolving breaking change documentation](/cloud/managing-airbyte-cloud/manage-schema-changes#resolving-breaking-changes). ## Review the stream status + The stream status allows you to monitor each stream's latest status. The stream will be highlighted with a grey pending bar to indicate the sync is actively extracting or loading data. -| Status | Description | -|------------------|---------------------------------------------------------------------------------------------------------------------| -| On time | The stream is operating within the expected timeframe expectations set by the replication frequency | -| Error | The most recent sync for this stream failed -| Pending | The stream has not been synced yet, so not status exists | +| Status | Description | +| ------- | --------------------------------------------------------------------------------------------------- | +| On time | The stream is operating within the expected timeframe expectations set by the replication frequency | +| Error | The most recent sync for this stream failed | +| Pending | The stream has not been synced yet, so not status exists | Each stream shows the last record loaded to the destination. Toggle the header to display the exact datetime the last record was loaded. -You can [reset](/operator-guides/reset.md) an individual stream without resetting all streams in a connection by clicking the three grey dots next to any stream. +You can [reset](/operator-guides/reset.md) an individual stream without resetting all streams in a connection by clicking the three grey dots next to any stream. You can also navigate directly to the stream's configuration by click the three grey dots next to any stream and selecting "Open details" to be redirected to the stream configuration. 
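As the status table suggests, the delay-based statuses come down to comparing how long it has been since data was loaded against the replication frequency. The snippet below is only an illustration of that comparison for scheduled connections (it leaves out the "On track" case, which depends on whether the connection is expected to catch up before the next sync); it is not how Airbyte computes status internally.

```python
# Illustration of the delay-based statuses described above, for scheduled
# connections only -- not Airbyte's internal status logic.
from datetime import timedelta


def delay_status(since_last_load: timedelta, frequency: timedelta) -> str:
    if since_last_load <= frequency:
        return "On time"
    if since_last_load <= 2 * frequency:
        return "Delayed"  # no data loaded within one scheduled frequency
    return "Error"  # no data loaded within two times the scheduled frequency


print(delay_status(timedelta(hours=3), timedelta(hours=1)))  # Error
```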
- diff --git a/docs/cloud/managing-airbyte-cloud/review-sync-history.md b/docs/cloud/managing-airbyte-cloud/review-sync-history.md index dae49ab3c7a..655dd668d81 100644 --- a/docs/cloud/managing-airbyte-cloud/review-sync-history.md +++ b/docs/cloud/managing-airbyte-cloud/review-sync-history.md @@ -4,36 +4,35 @@ products: all # Review the sync history -The job history displays information about synced data, such as the amount of data moved, the number of records read and committed, and the total sync time. Reviewing this summary can help you monitor the sync performance and identify any potential issues. +The job history displays information about synced data, such as the amount of data moved, the number of records read and committed, and the total sync time. Reviewing this summary can help you monitor the sync performance and identify any potential issues. ![Job History](./assets/connection-job-history.png) -To review the sync history, click a connection in the list to view its sync history. Sync History displays the sync status or [reset](/operator-guides/reset.md) status. The sync status is defined as: +To review the sync history, click a connection in the list to view its sync history. Sync History displays the sync status or [reset](/operator-guides/reset.md) status. The sync status is defined as: + +| Status | Description | +| ------------------- | ----------------------------------------------------------------- | +| Succeeded | 100% of the data has been extracted and loaded to the destination | +| Partially Succeeded | A subset of the data has been loaded to the destination | +| Failed | None of the data has been loaded to the destination | +| Cancelled | The sync was cancelled manually before finishing | +| Running | The sync is currently running | -| Status | Description | -|---------------------|---------------------------------------------------------------------------------------------------------------------| -| Succeeded | 100% of the data has been extracted and loaded to the destination | -| Partially Succeeded | A subset of the data has been loaded to the destination -| Failed | None of the data has been loaded to the destination | -| Cancelled | The sync was cancelled manually before finishing | -| Running | The sync is currently running | - ## Sync summary -Each sync shows the time the sync was initiated and additional metadata. This information can help in understanding sync performance over time. +Each sync shows the time the sync was initiated and additional metadata. This information can help in understanding sync performance over time. -| Data | Description | -|------------------------------------------|--------------------------------------------------------------------------------------| -| x GB (also measured in KB, MB) | Amount of data moved during the sync | -| x extracted records | Number of records read from the source during the sync | -| x loaded records | Number of records the destination confirmed it received. | -| xh xm xs | Total time (hours, minutes, seconds) for the sync to complete | +| Data | Description | +| ------------------------------ | ------------------------------------------------------------- | +| x GB (also measured in KB, MB) | Amount of data moved during the sync | +| x extracted records | Number of records read from the source during the sync | +| x loaded records | Number of records the destination confirmed it received. 
| +| xh xm xs | Total time (hours, minutes, seconds) for the sync to complete | - -:::note +:::note In the event of a failure, Airbyte will make several attempts to sync your data before waiting for the next sync to retry. The latest rules can be read about [here](../../understanding-airbyte/jobs.md#retry-rules). ::: -On this page, you can also view the complete logs and find any relevant errors, find a link to the job to share with Support, or download a copy of the logs locally. \ No newline at end of file +On this page, you can also view the complete logs and find any relevant errors, find a link to the job to share with Support, or download a copy of the logs locally. diff --git a/docs/community/code-of-conduct.md b/docs/community/code-of-conduct.md index 4cb81d4468f..cf90bed1519 100644 --- a/docs/community/code-of-conduct.md +++ b/docs/community/code-of-conduct.md @@ -12,19 +12,19 @@ In the interest of fostering an open and welcoming environment, we as contributo Examples of behavior that contributes to creating a positive environment include: -* Using welcoming and inclusive language -* Being respectful of differing viewpoints and experiences -* Gracefully accepting constructive criticism -* Focusing on what is best for the community -* Showing empathy towards other community members +- Using welcoming and inclusive language +- Being respectful of differing viewpoints and experiences +- Gracefully accepting constructive criticism +- Focusing on what is best for the community +- Showing empathy towards other community members Examples of unacceptable behavior by participants include: -* The use of sexualized language or imagery and unwelcome sexual attention or advances -* Trolling, insulting/derogatory comments, and personal or political attacks -* Public or private harassment -* Publishing others’ private information, such as a physical or electronic address, without explicit permission -* Other conduct which could reasonably be considered inappropriate in a professional setting +- The use of sexualized language or imagery and unwelcome sexual attention or advances +- Trolling, insulting/derogatory comments, and personal or political attacks +- Public or private harassment +- Publishing others’ private information, such as a physical or electronic address, without explicit permission +- Other conduct which could reasonably be considered inappropriate in a professional setting ## Our Responsibilities @@ -53,7 +53,7 @@ Airbyte's Slack community is growing incredibly fast. We're home to over 1500 da ### Rule 1: Be respectful. Our desire is for everyone to have a positive, fulfilling experience in Airbyte Slack, and we sincerely appreciate your help in making this happen. -All of the guidelines we provide below are important, but there’s a reason respect is the first rule. We take it seriously, and while the occasional breach of etiquette around Slack is forgivable, we cannot condone disrespectful behavior. +All of the guidelines we provide below are important, but there’s a reason respect is the first rule. We take it seriously, and while the occasional breach of etiquette around Slack is forgivable, we cannot condone disrespectful behavior. ### Rule 2: Use the most relevant channels. @@ -61,7 +61,7 @@ We deliberately use topic-specific Slack channels so members of the community ca ### Rule 3: Don’t double-post. -Please be considerate of our community members’ time. 
We know your question is important, but please keep in mind that Airbyte Slack is not a customer service platform but a community of volunteers who will help you as they are able around their own work schedule. You have access to all the history, so it’s easy to check if your question has already been asked. +Please be considerate of our community members’ time. We know your question is important, but please keep in mind that Airbyte Slack is not a customer service platform but a community of volunteers who will help you as they are able around their own work schedule. You have access to all the history, so it’s easy to check if your question has already been asked. ### Rule 4: Check question for clarity and thoughtfulness. @@ -69,23 +69,22 @@ Airbyte Slack is a community of volunteers. Our members enjoy helping others; th ### Rule 5: Keep it public. -This is a public forum; please do not contact individual members of this community without their express permission, regardless of whether you are trying to recruit someone, sell a product, or solicit help. +This is a public forum; please do not contact individual members of this community without their express permission, regardless of whether you are trying to recruit someone, sell a product, or solicit help. ### Rule 6: No soliciting! The purpose of the Airbyte Slack community is to provide a forum for data practitioners to discuss their work and share their ideas and learnings. It is not intended as a place to generate leads for vendors or recruiters, and may not be used as such. -If you’re a vendor, you may advertise your product in #shameless-plugs. Advertising your product anywhere else is strictly against the rules. +If you’re a vendor, you may advertise your product in #shameless-plugs. Advertising your product anywhere else is strictly against the rules. ### Rule 7: Don't spam tags, or use @here or @channel. -Using the @here and @channel keywords in a post will not help, as they are disabled in Slack for everyone excluding admins. Nonetheless, if you use them we will remind you with a link to this rule, to help you better understand the way Airbyte Slack operates. +Using the @here and @channel keywords in a post will not help, as they are disabled in Slack for everyone excluding admins. Nonetheless, if you use them we will remind you with a link to this rule, to help you better understand the way Airbyte Slack operates. -Do not tag specific individuals for help on your questions. If someone chooses to respond to your question, they will do so. You will find that our community of volunteers is generally very responsive and amazingly helpful! +Do not tag specific individuals for help on your questions. If someone chooses to respond to your question, they will do so. You will find that our community of volunteers is generally very responsive and amazingly helpful! ### Rule 8: Use threads for discussion. -The simplest way to keep conversations on track in Slack is to use threads. The Airbyte Slack community relies heavily on threads, and if you break from this convention, rest assured one of our community members will respectfully inform you quickly! +The simplest way to keep conversations on track in Slack is to use threads. The Airbyte Slack community relies heavily on threads, and if you break from this convention, rest assured one of our community members will respectfully inform you quickly! 
_If you see a message or receive a direct message that violates any of these rules, please contact an Airbyte team member and we will take the appropriate moderation action immediately. We have zero tolerance for intentional rule-breaking and hate speech._ - diff --git a/docs/community/getting-support.md b/docs/community/getting-support.md index 339bd08399c..0a38bed89f5 100644 --- a/docs/community/getting-support.md +++ b/docs/community/getting-support.md @@ -22,12 +22,11 @@ If you require personalized support, reach out to our sales team to inquire abou We are driving our community support from our [forum](https://github.com/airbytehq/airbyte/discussions) on GitHub. - ## Airbyte Cloud Support If you have questions about connector setup, error resolution, or want to report a bug, Airbyte Support is available to assist you. We recommend checking [our documentation](https://docs.airbyte.com/) and searching our [Help Center](https://support.airbyte.com/hc/en-us) before opening a support ticket. -If you couldn't find the information you need in our docs or Help Center, open a ticket within the Airbyte Cloud platform by selecting the "Support" icon in the lower left navigation bar. Alternatively, you can submit a ticket through our [Help Center](https://support.airbyte.com/hc/en-us) by completing an Airbyte Cloud Support Request. Our team is online and availible to assist from 7AM - 7PM Eastern. +If you couldn't find the information you need in our docs or Help Center, open a ticket within the Airbyte Cloud platform by selecting the "Support" icon in the lower left navigation bar. Alternatively, you can submit a ticket through our [Help Center](https://support.airbyte.com/hc/en-us) by completing an Airbyte Cloud Support Request. Our team is online and availible to assist from 7AM - 7PM Eastern. **If you're unsure about the supported connectors, refer to our [Connector Support Levels](https://docs.airbyte.com/project-overview/product-support-levels/) & [Connector Catalog](https://docs.airbyte.com/integrations/).** @@ -37,7 +36,7 @@ If you don't see a connector you need, you can submit a [connector request](http To stay updated on Airbyte's future plans, take a look at [our roadmap](https://github.com/orgs/airbytehq/projects/37/views/1). -Please be sure to sign up for Airbyte with your company email address, as we do not support personal accounts. +Please be sure to sign up for Airbyte with your company email address, as we do not support personal accounts. ## Airbyte Enterprise (self-hosted) Support @@ -45,27 +44,28 @@ If you're running Airbyte Open Source with Airbyte Enterprise or have an OSS sup Before opening a support ticket, we recommend consulting [our documentation](https://docs.airbyte.com/) and searching our [Help Center](https://support.airbyte.com/hc/en-us). If your question remains unanswered, please submit a ticket through our Help Center. We suggest creating an [Airbyte Help Center account](https://airbyte1416.zendesk.com/auth/v2/login/signin?return_to=https%3A%2F%2Fsupport.airbyte.com%2Fhc%2Fen-us&theme=hc&locale=en-us&brand_id=15365055240347&auth_origin=15365055240347%2Ctrue%2Ctrue) to access your organization's support requests. Our team is online and availible to assist from 7AM - 7PM Eastern. -**Connector support is based on certification status of the connector.** Please see our [Connector Support Levels](https://docs.airbyte.com/project-overview/product-support-levels) if you have any questions on support provided for one of your connectors. 
+**Connector support is based on certification status of the connector.** Please see our [Connector Support Levels](https://docs.airbyte.com/project-overview/product-support-levels) if you have any questions on support provided for one of your connectors. Submitting a Pull Request for review? -* Be sure to follow our [contribution guidelines](https://docs.airbyte.com/contributing-to-airbyte/) laid out here on our doc. Highlights include: - * PRs should be limited to a single change-set -* Submit the PR as a PR Request through the Help Center Open Source Enterprise Support Request form -* If you are submitting a Platform PR we accept Platform PRs in the areas below: - * Helm - * Environment variable configurations - * Bug Fixes - * Security version bumps - * **If outside these areas, please open up an issue to help the team understand the need and if we are able to consider a PR** +- Be sure to follow our [contribution guidelines](https://docs.airbyte.com/contributing-to-airbyte/) laid out here on our doc. Highlights include: + - PRs should be limited to a single change-set +- Submit the PR as a PR Request through the Help Center Open Source Enterprise Support Request form +- If you are submitting a Platform PR we accept Platform PRs in the areas below: + - Helm + - Environment variable configurations + - Bug Fixes + - Security version bumps + - **If outside these areas, please open up an issue to help the team understand the need and if we are able to consider a PR** Submitting a PR does not guarantee its merge. The Airbyte support team will conduct an initial review, and if the PR aligns with Airbyte's roadmap, it will be prioritized based on team capacities and priorities. Although we strive to offer our utmost assistance, there are certain requests that we are unable to support. Currently, we do not provide assistance for these particular items: -* Question/troubleshooting assistance with forked versions of Airbyte -* Configuring using Octavia CLI -* Creating and configuring custom transformation using dbt -* Curating unique documentation and training materials -* Configuring Airbyte to meet security requirements -If you think you will need assistance when upgrading, we recommend upgrading during our support hours, Monday-Friday 7AM - 7PM ET so we can assist if support is needed. If you upgrade outside of support hours, please submit a ticket and we will assist when we are back online. +- Question/troubleshooting assistance with forked versions of Airbyte +- Configuring using Octavia CLI +- Creating and configuring custom transformation using dbt +- Curating unique documentation and training materials +- Configuring Airbyte to meet security requirements + +If you think you will need assistance when upgrading, we recommend upgrading during our support hours, Monday-Friday 7AM - 7PM ET so we can assist if support is needed. If you upgrade outside of support hours, please submit a ticket and we will assist when we are back online. diff --git a/docs/connector-development/README.md b/docs/connector-development/README.md index 6b0dc8b5c37..71e4a60beb6 100644 --- a/docs/connector-development/README.md +++ b/docs/connector-development/README.md @@ -46,6 +46,7 @@ The Airbyte community also maintains some CDKs: Before building a new connector, review [Airbyte's data protocol specification](../understanding-airbyte/airbyte-protocol.md). 
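For Python connectors, the "fail fast" and "fail actionably" principles usually come together in the check-connection step. The sketch below is a rough example of returning an actionable message from the Python CDK's `check_connection` hook; the endpoint, config keys, and error wording are placeholders rather than part of any real connector.

```python
# Rough sketch of an actionable check_connection for a hypothetical API source
# built on the Python CDK. The endpoint and config keys are placeholders.
from typing import Any, List, Mapping, Optional, Tuple

import requests
from airbyte_cdk.sources import AbstractSource
from airbyte_cdk.sources.streams import Stream


class SourceExampleApi(AbstractSource):
    def check_connection(self, logger, config: Mapping[str, Any]) -> Tuple[bool, Optional[Any]]:
        try:
            resp = requests.get(
                "https://api.example.com/v1/me",
                headers={"Authorization": f"Bearer {config['api_token']}"},
                timeout=30,
            )
        except requests.ConnectionError:
            # Actionable: point at the likely cause instead of a generic "can't connect".
            return False, "Unable to reach api.example.com. Verify network access and firewall rules."
        if resp.status_code in (401, 403):
            return False, "The provided API token was rejected. Check the token and its read permissions."
        resp.raise_for_status()
        return True, None

    def streams(self, config: Mapping[str, Any]) -> List[Stream]:
        return []  # stream definitions omitted from this sketch
```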
::: + ## Adding a new connector The easiest way to make and start using a connector in your workspace is by using the diff --git a/docs/connector-development/best-practices.md b/docs/connector-development/best-practices.md index b6b3be72a18..3356daea412 100644 --- a/docs/connector-development/best-practices.md +++ b/docs/connector-development/best-practices.md @@ -5,16 +5,16 @@ In order to guarantee the highest quality for connectors, we've compiled the fol ## Principles of developing connectors 1. **Reliability + usability > more features.** It is better to support 1 feature that works reliably and has a great UX than 2 that are unreliable or hard to use. One solid connector is better than 2 finicky ones. -2. **Fail fast.** A user should not be able to configure something that will not work. +2. **Fail fast.** A user should not be able to configure something that will not work. 3. **Fail actionably.** If a failure is actionable by the user, clearly let them know what they can do. Otherwise, make it very easy for them to give us necessary debugging information \(logs etc.\) From these principles we extrapolate the following goals for connectors, in descending priority order: -1. **Correct user input should result in a successful sync.** If there is an issue, it should be extremely easy for the user to see and report. -2. **Issues arising from bad user input should print an actionable error message.** "Invalid credentials" is not an actionable message. "Please verify your username/password is correct" is better. -3. **Wherever possible, a connector should support incremental sync.** This prevents excessive load on the underlying data source. _\*\*_ -4. **When running a sync, a connector should communicate its status frequently to provide clear feedback that it is working.** Output a log message at least every 5 minutes. -5. **A connector should allow reading or writing as many entities as is feasible.** Supporting syncing all entities from an API is preferred to only supporting a small subset which would satisfy narrow use cases. Similarly, a database should support as many data types as is feasible. +1. **Correct user input should result in a successful sync.** If there is an issue, it should be extremely easy for the user to see and report. +2. **Issues arising from bad user input should print an actionable error message.** "Invalid credentials" is not an actionable message. "Please verify your username/password is correct" is better. +3. **Wherever possible, a connector should support incremental sync.** This prevents excessive load on the underlying data source. _\*\*_ +4. **When running a sync, a connector should communicate its status frequently to provide clear feedback that it is working.** Output a log message at least every 5 minutes. +5. **A connector should allow reading or writing as many entities as is feasible.** Supporting syncing all entities from an API is preferred to only supporting a small subset which would satisfy narrow use cases. Similarly, a database should support as many data types as is feasible. Note that in the above list, the _least_ important is the number of features it has \(e.g: whether an API connector supports all entities in the API\). The most important thing is that for its declared features, it is reliable and usable. The only exception are “minimum viability” features e.g: for some sources, it’s not feasible to pull data without incremental due to rate limiting issues. In this case, those are considered usability issues. 
@@ -26,24 +26,24 @@ When reviewing connectors, we'll use the following "checklist" to verify whether **As much as possible, prove functionality via testing**. This means slightly different things depending on the type of connector: -* **All connectors** must test all the sync modes they support during integration tests -* **Database connectors** should test that they can replicate **all** supported data types in both `read` and `discover` operations -* **API connectors** should validate records that every stream outputs data - * If this causes rate limiting problems, there should be a periodic CI build which tests this on a less frequent cadence to avoid rate limiting +- **All connectors** must test all the sync modes they support during integration tests +- **Database connectors** should test that they can replicate **all** supported data types in both `read` and `discover` operations +- **API connectors** should validate records that every stream outputs data + - If this causes rate limiting problems, there should be a periodic CI build which tests this on a less frequent cadence to avoid rate limiting **Thoroughly test edge cases.** While Airbyte provides a [Standard Test Suite](testing-connectors/connector-acceptance-tests-reference.md) that all connectors must pass, it's not possible for the standard test suite to cover all edge cases. When in doubt about whether the standard tests provide sufficient evidence of functionality, write a custom test case for your connector. ### Check Connection -* **Verify permissions upfront**. The "check connection" operation should verify any necessary permissions upfront e.g: the provided API token has read access to the API entities. - * In some cases it's not possible to verify permissions without knowing which streams the user wants to replicate. For example, a provided API token only needs read access to the "Employees" entity if the user wants to replicate the "Employees" stream. In this case, the CheckConnection operation should verify the minimum needed requirements \(e.g: the API token exists\), and the "read" or "write" operation should verify all needed permissions based on the provided catalog, failing if a required permission is not granted. -* **Provide actionable feedback for incorrect input.** - * Examples of non actionable error messages - * "Can't connect". The only recourse this gives the user is to guess whether they need to dig through logs or guess which field of their input configuration is incorrect. - * Examples of actionable error messages - * "Your username/password combination is incorrect" - * "Unable to reach Database host: please verify that there are no firewall rules preventing Airbyte from connecting to the database" - * etc... +- **Verify permissions upfront**. The "check connection" operation should verify any necessary permissions upfront e.g: the provided API token has read access to the API entities. + - In some cases it's not possible to verify permissions without knowing which streams the user wants to replicate. For example, a provided API token only needs read access to the "Employees" entity if the user wants to replicate the "Employees" stream. In this case, the CheckConnection operation should verify the minimum needed requirements \(e.g: the API token exists\), and the "read" or "write" operation should verify all needed permissions based on the provided catalog, failing if a required permission is not granted. 
+- **Provide actionable feedback for incorrect input.** + - Examples of non actionable error messages + - "Can't connect". The only recourse this gives the user is to guess whether they need to dig through logs or guess which field of their input configuration is incorrect. + - Examples of actionable error messages + - "Your username/password combination is incorrect" + - "Unable to reach Database host: please verify that there are no firewall rules preventing Airbyte from connecting to the database" + - etc... ### Rate Limiting diff --git a/docs/connector-development/cdk-python/README.md b/docs/connector-development/cdk-python/README.md index 3830da20a13..4f3d2e9385d 100644 --- a/docs/connector-development/cdk-python/README.md +++ b/docs/connector-development/cdk-python/README.md @@ -33,7 +33,7 @@ offers helpers specific for creating Airbyte source connectors for: This document is a general introduction to the CDK. Readers should have basic familiarity with the [Airbyte Specification](https://docs.airbyte.com/understanding-airbyte/airbyte-protocol/) before proceeding. -If you have any issues with troubleshooting or want to learn more about the CDK from the Airbyte team, head to +If you have any issues with troubleshooting or want to learn more about the CDK from the Airbyte team, head to [the Connector Development section of our Airbyte Forum](https://github.com/airbytehq/airbyte/discussions) to inquire further! diff --git a/docs/connector-development/cdk-python/basic-concepts.md b/docs/connector-development/cdk-python/basic-concepts.md index e446ccb42c2..a1da6316e21 100644 --- a/docs/connector-development/cdk-python/basic-concepts.md +++ b/docs/connector-development/cdk-python/basic-concepts.md @@ -52,8 +52,8 @@ As the code examples show, the `AbstractSource` delegates to the set of `Stream` A summary of what we've covered so far on how to use the Airbyte CDK: -* A concrete implementation of the `AbstractSource` object is required. -* This involves, +- A concrete implementation of the `AbstractSource` object is required. +- This involves, 1. implementing the `check_connection`function. 2. Creating the appropriate `Stream` classes and returning them in the `streams` function. 3. placing the above mentioned `spec.yaml` file in the right place. @@ -61,4 +61,3 @@ A summary of what we've covered so far on how to use the Airbyte CDK: ## HTTP Streams We've covered how the `AbstractSource` works with the `Stream` interface in order to fulfill the Airbyte Specification. Although developers are welcome to implement their own object, the CDK saves developers the hassle of doing so in the case of HTTP APIs with the [`HTTPStream`](http-streams.md) object. - diff --git a/docs/connector-development/cdk-python/full-refresh-stream.md b/docs/connector-development/cdk-python/full-refresh-stream.md index c5bf971e990..ded56425d61 100644 --- a/docs/connector-development/cdk-python/full-refresh-stream.md +++ b/docs/connector-development/cdk-python/full-refresh-stream.md @@ -45,4 +45,4 @@ We highly recommend implementing Incremental when feasible. See the [incremental Another alternative to Incremental and Full Refresh streams is [resumable full refresh](resumable-full-refresh-stream.md). This is a stream that uses API endpoints that cannot reliably retrieve data in an incremental fashion. However, it can offer improved resilience -against errors by checkpointing the stream's page number or cursor. \ No newline at end of file +against errors by checkpointing the stream's page number or cursor. 
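Pulling together the check-connection guidance and the `AbstractSource` summary above, the following is a minimal, hypothetical sketch of a source with the two required pieces: a `check_connection` that fails fast with an actionable message, and a `streams` method returning a simple full-refresh `HttpStream`. The `SourceExample`/`Employees` names, the `api.example.com` endpoint, and the `api_key` config field are illustrative placeholders, not part of the documentation above.

```python
from typing import Any, Iterable, List, Mapping, Optional, Tuple

import requests
from airbyte_cdk.sources import AbstractSource
from airbyte_cdk.sources.streams import Stream
from airbyte_cdk.sources.streams.http import HttpStream


class Employees(HttpStream):
    """Hypothetical full-refresh stream reading a single endpoint."""

    url_base = "https://api.example.com/"  # placeholder API
    primary_key = "id"

    def path(self, **kwargs) -> str:
        return "employees"

    def next_page_token(self, response: requests.Response) -> Optional[Mapping[str, Any]]:
        return None  # single page; see the pagination docs for real strategies

    def parse_response(self, response: requests.Response, **kwargs) -> Iterable[Mapping[str, Any]]:
        yield from response.json().get("data", [])


class SourceExample(AbstractSource):
    """Hypothetical source wiring the two required pieces together."""

    def check_connection(self, logger, config: Mapping[str, Any]) -> Tuple[bool, Any]:
        # Fail fast with an actionable message instead of a generic "can't connect".
        response = requests.get(
            f"{Employees.url_base}employees",
            headers={"Authorization": f"Bearer {config['api_key']}"},
        )
        if response.status_code in (401, 403):
            return False, "Invalid API key: please verify the key entered in the connector configuration."
        return response.ok, None if response.ok else f"Unexpected response: HTTP {response.status_code}"

    def streams(self, config: Mapping[str, Any]) -> List[Stream]:
        return [Employees()]
```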
diff --git a/docs/connector-development/cdk-python/http-streams.md b/docs/connector-development/cdk-python/http-streams.md index ac4af4efe63..fc7d8283117 100644 --- a/docs/connector-development/cdk-python/http-streams.md +++ b/docs/connector-development/cdk-python/http-streams.md @@ -2,10 +2,10 @@ The CDK offers base classes that greatly simplify writing HTTP API-based connectors. Some of the most useful features include helper functionality for: -* Authentication \(basic auth, Oauth2, or any custom auth method\) -* Pagination -* Handling rate limiting with static or dynamic backoff timing -* Caching +- Authentication \(basic auth, Oauth2, or any custom auth method\) +- Pagination +- Handling rate limiting with static or dynamic backoff timing +- Caching All these features have sane off-the-shelf defaults but are completely customizable depending on your use case. They can also be combined with other stream features described in the [full refresh streams](full-refresh-stream.md) and [incremental streams](incremental-stream.md) sections. @@ -35,7 +35,7 @@ Using either authenticator is as simple as passing the created authenticator int ## Pagination -Most APIs, when facing a large call, tend to return the results in pages. The CDK accommodates paging via the `next_page_token` function. This function is meant to extract the next page "token" from the latest response. The contents of a "token" are completely up to the developer: it can be an ID, a page number, a partial URL etc.. The CDK will continue making requests as long as the `next_page_token` continues returning non-`None` results. This can then be used in the `request_params` and other methods in `HttpStream` to page through API responses. Here is an [example](https://github.com/airbytehq/airbyte/blob/master/airbyte-integrations/connectors/source-stripe/source_stripe/streams.py#L34) from the Stripe API. +Most APIs, when facing a large call, tend to return the results in pages. The CDK accommodates paging via the `next_page_token` function. This function is meant to extract the next page "token" from the latest response. The contents of a "token" are completely up to the developer: it can be an ID, a page number, a partial URL etc.. The CDK will continue making requests as long as the `next_page_token` continues returning non-`None` results. This can then be used in the `request_params` and other methods in `HttpStream` to page through API responses. Here is an [example](https://github.com/airbytehq/airbyte/blob/master/airbyte-integrations/connectors/source-stripe/source_stripe/streams.py#L34) from the Stripe API. ## Rate Limiting @@ -50,7 +50,8 @@ Note that Airbyte will always attempt to make as many requests as possible and o When implementing [stream slicing](incremental-stream.md#streamstream_slices) in an `HTTPStream` each Slice is equivalent to a HTTP request; the stream will make one request per element returned by the `stream_slices` function. The current slice being read is passed into every other method in `HttpStream` e.g: `request_params`, `request_headers`, `path`, etc.. to be injected into a request. This allows you to dynamically determine the output of the `request_params`, `path`, and other functions to read the input slice and return the appropriate value. ## Nested Streams & Caching -It's possible to cache data from a stream onto a temporary file on disk. + +It's possible to cache data from a stream onto a temporary file on disk. 
This is especially useful when dealing with streams that depend on the results of another stream e.g: `/employees/{id}/details`. In this case, we can use caching to write the data of the parent stream to a file to use this data when the child stream synchronizes, rather than performing a full HTTP request again. @@ -61,10 +62,12 @@ Caching can be enabled by overriding the `use_cache` property of the `HttpStream The caching mechanism is related to parent streams. For child streams, there is an `HttpSubStream` class inheriting from `HttpStream` and overriding the `stream_slices` method that returns a generator of all parent entries. To use caching in the parent/child relationship, perform the following steps: + 1. Turn on parent stream caching by overriding the `use_cache` property. 2. Inherit child stream class from `HttpSubStream` class. #### Example + ```python class Employees(HttpStream): ... diff --git a/docs/connector-development/cdk-python/incremental-stream.md b/docs/connector-development/cdk-python/incremental-stream.md index f24c33c7e68..e482f3aa411 100644 --- a/docs/connector-development/cdk-python/incremental-stream.md +++ b/docs/connector-development/cdk-python/incremental-stream.md @@ -4,10 +4,10 @@ An incremental Stream is a stream which reads data incrementally. That is, it on Several new pieces are essential to understand how incrementality works with the CDK: -* `AirbyteStateMessage` -* cursor fields -* `IncrementalMixin` -* `Stream.get_updated_state` (deprecated) +- `AirbyteStateMessage` +- cursor fields +- `IncrementalMixin` +- `Stream.get_updated_state` (deprecated) as well as a few other optional concepts. @@ -28,7 +28,7 @@ In the context of the CDK, setting the `Stream.cursor_field` property to any tru This class mixin adds property `state` with abstract setter and getter. The `state` attribute helps the CDK figure out the current state of sync at any moment (in contrast to deprecated `Stream.get_updated_state` method). The setter typically deserialize state saved by CDK and initialize internal state of the stream. -The getter should serialize internal state of the stream. +The getter should serialize internal state of the stream. ```python @property @@ -42,6 +42,7 @@ def state(self, value: Mapping[str, Any]): The actual logic of updating state during reading is implemented somewhere else, usually as part of `read_records` method, right after the latest record returned that matches the new state. Therefore, the state represents the latest checkpoint successfully achieved, and all next records should match the next state after that one. + ```python def read_records(self, ...): ... @@ -56,6 +57,7 @@ def read_records(self, ...): ``` ### `Stream.get_updated_state` + (deprecated since 1.48.0, see `IncrementalMixin`) This function helps the stream keep track of the latest state by inspecting every record output by the stream \(as returned by the `Stream.read_records` method\) and comparing it against the most recent state object. This allows sync to resume from where the previous sync last stopped, regardless of success or failure. This function typically compares the state object's and the latest record's cursor field, picking the latest one. 
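As a concrete, deliberately simplified illustration of the `IncrementalMixin` pattern described above, the sketch below serializes a single cursor through the `state` property and advances it inside `read_records`, right after each matching record is emitted. The stream name, the `updated_at` cursor field, and the `_fetch_from_api` helper are assumptions made for the example.

```python
from typing import Any, Iterable, Mapping, Optional

from airbyte_cdk.sources.streams import IncrementalMixin, Stream


class Employees(Stream, IncrementalMixin):
    """Hypothetical incremental stream keeping its cursor in self._cursor_value."""

    primary_key = "id"
    cursor_field = "updated_at"

    def __init__(self):
        super().__init__()
        self._cursor_value: Optional[str] = None

    @property
    def state(self) -> Mapping[str, Any]:
        # Serialize the internal cursor into the state object the platform persists.
        return {self.cursor_field: self._cursor_value}

    @state.setter
    def state(self, value: Mapping[str, Any]):
        # Deserialize the saved state back into the internal cursor.
        self._cursor_value = value.get(self.cursor_field)

    def read_records(self, *args, **kwargs) -> Iterable[Mapping[str, Any]]:
        for record in self._fetch_from_api(since=self._cursor_value):
            yield record
            # Advance the cursor right after the latest matching record is emitted
            # (assumes ISO-8601 strings, which compare correctly as text).
            self._cursor_value = max(self._cursor_value or "", record[self.cursor_field])

    def _fetch_from_api(self, since: Optional[str]) -> Iterable[Mapping[str, Any]]:
        # Placeholder for the actual API call; returns records with an "updated_at" field.
        return []
```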
@@ -76,7 +78,7 @@ While this is very simple, **it requires that records are output in ascending or Interval based checkpointing can be implemented by setting the `Stream.state_checkpoint_interval` property e.g: ```text -class MyAmazingStream(Stream): +class MyAmazingStream(Stream): # Save the state every 100 records state_checkpoint_interval = 100 ``` @@ -97,7 +99,6 @@ For a more in-depth description of stream slicing, see the [Stream Slices guide] In summary, an incremental stream requires: -* the `cursor_field` property -* to be inherited from `IncrementalMixin` and state methods implemented -* Optionally, the `stream_slices` function - +- the `cursor_field` property +- to be inherited from `IncrementalMixin` and state methods implemented +- Optionally, the `stream_slices` function diff --git a/docs/connector-development/cdk-python/python-concepts.md b/docs/connector-development/cdk-python/python-concepts.md index 0b97f2ae3c4..29f280615db 100644 --- a/docs/connector-development/cdk-python/python-concepts.md +++ b/docs/connector-development/cdk-python/python-concepts.md @@ -56,4 +56,3 @@ class Pilot(Employee): Generators are basically iterators over arbitrary source data. They are handy because their syntax is extremely concise and feel just like any other list or collection when working with them in code. If you see `yield` anywhere in the code -- that's a generator at work. - diff --git a/docs/connector-development/cdk-python/resumable-full-refresh-stream.md b/docs/connector-development/cdk-python/resumable-full-refresh-stream.md index aa2d20c5cd6..3a0de22b25f 100644 --- a/docs/connector-development/cdk-python/resumable-full-refresh-stream.md +++ b/docs/connector-development/cdk-python/resumable-full-refresh-stream.md @@ -19,7 +19,7 @@ values used to checkpoint state in between resumable full refresh sync attempts ## Criteria for Resumable Full Refresh -:::warning +:::warning Resumable full refresh in the Python CDK does not currently support substreams. This work is currently in progress. ::: @@ -42,7 +42,7 @@ is retried. This class mixin adds property `state` with abstract setter and getter. The `state` attribute helps the CDK figure out the current state of sync at any moment. The setter typically deserializes state saved by CDK and initialize internal state of the stream. -The getter should serialize internal state of the stream. +The getter should serialize internal state of the stream. ```python @property @@ -88,5 +88,5 @@ in between sync attempts, but deleted at the beginning of new sync jobs. In summary, a resumable full refresh stream requires: -* to be inherited from `StateMixin` and state methods implemented -* implementing `Stream.read_records()` to get the Stream's current state, request a single page of records, and update the Stream's state with the next page to fetch or `{}`. +- to be inherited from `StateMixin` and state methods implemented +- implementing `Stream.read_records()` to get the Stream's current state, request a single page of records, and update the Stream's state with the next page to fetch or `{}`. 
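To make the resumable full refresh summary concrete, here is a rough sketch of a stream whose `read_records` reads exactly one page per invocation, using the checkpointed state to decide where to resume and storing the next page to fetch (or `{}` once the sync is complete). The `Surveys` name and `_fetch_page` helper are placeholders, and the explicit `state` property pair stands in for the state mixin mentioned above.

```python
from typing import Any, Iterable, Mapping, Optional, Tuple

from airbyte_cdk.sources.streams import Stream


class Surveys(Stream):
    """Hypothetical resumable-full-refresh stream that checkpoints its page token."""

    primary_key = "id"

    def __init__(self):
        super().__init__()
        self._state: Mapping[str, Any] = {}

    # In the real CDK these accessors come from the state mixin described above.
    @property
    def state(self) -> Mapping[str, Any]:
        return self._state

    @state.setter
    def state(self, value: Mapping[str, Any]):
        self._state = value

    def read_records(self, *args, **kwargs) -> Iterable[Mapping[str, Any]]:
        # Request exactly one page, starting from the checkpointed token (if any).
        page_token = self.state.get("next_page")
        records, next_page = self._fetch_page(page_token)
        yield from records
        # Checkpoint the next page to fetch, or {} when the sync is complete.
        self.state = {"next_page": next_page} if next_page else {}

    def _fetch_page(self, page_token: Optional[str]) -> Tuple[Iterable[Mapping[str, Any]], Optional[str]]:
        # Placeholder for a single paginated API request.
        return [], None
```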
diff --git a/docs/connector-development/cdk-python/schemas.md b/docs/connector-development/cdk-python/schemas.md index 5be7ac6f262..3056944f7fe 100644 --- a/docs/connector-development/cdk-python/schemas.md +++ b/docs/connector-development/cdk-python/schemas.md @@ -16,7 +16,7 @@ Important note: any objects referenced via `$ref` should be placed in the `share If you are implementing a connector to pull data from an API which publishes an [OpenAPI/Swagger spec](https://swagger.io/specification/), you can use a tool we've provided for generating JSON schemas from the OpenAPI definition file. Detailed information can be found [here](https://github.com/airbytehq/airbyte/tree/master/tools/openapi2jsonschema/). -### Generating schemas using the output of your connector's read command +### Generating schemas using the output of your connector's read command We also provide a tool for generating schemas using a connector's `read` command output. Detailed information can be found [here](https://github.com/airbytehq/airbyte/tree/master/tools/schema_generator/). @@ -43,7 +43,7 @@ def get_json_schema(self): It is important to ensure output data conforms to the declared json schema. This is because the destination receiving this data to load into tables may strictly enforce schema \(e.g. when data is stored in a SQL database, you can't put CHAR type into INTEGER column\). In the case of changes to API output \(which is almost guaranteed to happen over time\) or a minor mistake in jsonschema definition, data syncs could thus break because of mismatched datatype schemas. -To remain robust in operation, the CDK provides a transformation ability to perform automatic object mutation to align with desired schema before outputting to the destination. All streams inherited from airbyte_cdk.sources.streams.core.Stream class have this transform configuration available. It is \_disabled_ by default and can be configured per stream within a source connector. +To remain robust in operation, the CDK provides a transformation ability to perform automatic object mutation to align with desired schema before outputting to the destination. All streams inherited from the `airbyte_cdk.sources.streams.core.Stream` class have this transform configuration available. It is _disabled_ by default and can be configured per stream within a source connector. ### Default type transformation @@ -81,7 +81,7 @@ And objects inside array of referenced by $ref attribute. If the value cannot be cast \(e.g. string "asdf" cannot be casted to integer\), the field would retain its original value. Schema type transformation support any jsonschema types, nested objects/arrays and reference types. Types described as array of more than one type \(except "null"\), types under oneOf/anyOf keyword wont be transformed. -_Note:_ This transformation is done by the source, not the stream itself. I.e. if you have overriden "read\_records" method in your stream it wont affect object transformation. All transformation are done in-place by modifing output object before passing it to "get\_updated\_state" method, so "get\_updated\_state" would receive the transformed object. +_Note:_ This transformation is done by the source, not the stream itself. I.e. if you have overridden the "read_records" method in your stream, it won't affect object transformation. All transformations are done in place by modifying the output object before passing it to the "get_updated_state" method, so "get_updated_state" would receive the transformed object.
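For reference, opting a stream into the default transformation is typically a one-line class attribute. The sketch below assumes the `TypeTransformer` and `TransformConfig` utilities shipped with the Python CDK under `airbyte_cdk.sources.utils.transform`; the stream itself and its records are placeholders.

```python
from typing import Any, Iterable, Mapping

from airbyte_cdk.sources.streams import Stream
from airbyte_cdk.sources.utils.transform import TransformConfig, TypeTransformer


class Employees(Stream):
    """Hypothetical stream that opts into automatic schema-based type normalization."""

    primary_key = "id"

    # Enables the default transformation: records are cast to the types declared in
    # the stream's JSON schema (e.g. "1" -> 1 for integer fields) before being emitted.
    transformer = TypeTransformer(TransformConfig.DefaultSchemaNormalization)

    def read_records(self, *args, **kwargs) -> Iterable[Mapping[str, Any]]:
        # Placeholder: records yielded here are normalized against get_json_schema()
        # by the source before they are passed downstream.
        return []
```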
### Custom schema type transformation @@ -99,13 +99,13 @@ class MyStream(Stream): return transformed_value ``` -Where original\_value is initial field value and field\_schema is part of jsonschema describing field type. For schema +Where original_value is initial field value and field_schema is part of jsonschema describing field type. For schema ```javascript {"type": "object", "properties": {"value": {"type": "string", "format": "date-time"}}} ``` -field\_schema variable would be equal to +field_schema variable would be equal to ```javascript {"type": "string", "format": "date-time"} @@ -145,7 +145,7 @@ class MyStream(Stream): Transforming each object on the fly would add some time for each object processing. This time is depends on object/schema complexity and hardware configuration. -There are some performance benchmarks we've done with ads\_insights facebook schema \(it is complex schema with objects nested inside arrays ob object and a lot of references\) and example object. Here is the average transform time per single object, seconds: +There are some performance benchmarks we've done with ads_insights facebook schema \(it is complex schema with objects nested inside arrays ob object and a lot of references\) and example object. Here is the average transform time per single object, seconds: ```text regular transform: @@ -162,4 +162,3 @@ just traverse/validate through json schema and object fields: ``` On my PC \(AMD Ryzen 7 5800X\) it took 0.8 milliseconds per object. As you can see most time \(~ 75%\) is taken by jsonschema traverse/validation routine and very little \(less than 10 %\) by actual converting. Processing time can be reduced by skipping jsonschema type checking but it would be no warnings about possible object jsonschema inconsistency. - diff --git a/docs/connector-development/cdk-python/stream-slices.md b/docs/connector-development/cdk-python/stream-slices.md index 70b511923c3..0c111b04398 100644 --- a/docs/connector-development/cdk-python/stream-slices.md +++ b/docs/connector-development/cdk-python/stream-slices.md @@ -25,4 +25,3 @@ Slack is a chat platform for businesses. Collectively, a company can easily post This is a great usecase for stream slicing. The `messages` stream, which outputs one record per chat message, can slice records by time e.g: hourly. It implements this by specifying the beginning and end timestamp of each hour that it wants to pull data from. Then after all the records in a given hour \(i.e: slice\) have been read, the connector outputs a STATE message to indicate that state should be saved. This way, if the connector ever fails during a sync \(for example if the API goes down\) then at most, it will reread only one hour's worth of messages. See the implementation of the Slack connector [here](https://github.com/airbytehq/airbyte/blob/master/airbyte-integrations/connectors/source-slack/source_slack/source.py). - diff --git a/docs/connector-development/config-based/advanced-topics.md b/docs/connector-development/config-based/advanced-topics.md index cd9b70f4549..86a9b18cf92 100644 --- a/docs/connector-development/config-based/advanced-topics.md +++ b/docs/connector-development/config-based/advanced-topics.md @@ -57,9 +57,9 @@ This can be used to avoid repetitions. Schema: ```yaml - "$parameters": - type: object - additionalProperties: true +"$parameters": + type: object + additionalProperties: true ``` Example: @@ -308,9 +308,9 @@ When you receive this error, you can address this by defining the missing field 1. 
Given the connection config and an optional stream state, the `PartitionRouter` computes the partitions that should be routed to read data. 2. Iterate over all the partitions defined by the stream's partition router. 3. For each partition, - 1. Submit a request to the partner API as defined by the requester - 2. Select the records from the response - 3. Repeat for as long as the paginator points to a next page + 1. Submit a request to the partner API as defined by the requester + 2. Select the records from the response + 3. Repeat for as long as the paginator points to a next page [connector-flow](./assets/connector-flow.png) diff --git a/docs/connector-development/config-based/low-code-cdk-overview.md b/docs/connector-development/config-based/low-code-cdk-overview.md index c22efdc16cb..f93bbc675f4 100644 --- a/docs/connector-development/config-based/low-code-cdk-overview.md +++ b/docs/connector-development/config-based/low-code-cdk-overview.md @@ -156,4 +156,5 @@ For examples of production-ready config-based connectors, refer to: - [Sentry](https://github.com/airbytehq/airbyte/blob/master/airbyte-integrations/connectors/source-sentry/source_sentry/manifest.yaml) ## Reference + The full schema definition for the YAML file can be found [here](https://raw.githubusercontent.com/airbytehq/airbyte/master/airbyte-cdk/python/airbyte_cdk/sources/declarative/declarative_component_schema.yaml). diff --git a/docs/connector-development/config-based/tutorial/0-getting-started.md b/docs/connector-development/config-based/tutorial/0-getting-started.md index 5a264a66c4a..7e037c9979f 100644 --- a/docs/connector-development/config-based/tutorial/0-getting-started.md +++ b/docs/connector-development/config-based/tutorial/0-getting-started.md @@ -48,4 +48,4 @@ This can be done by signing up for the Free tier plan on [Exchange Rates Data AP ## Next Steps -Next, we'll [create a Source using the connector generator.](1-create-source.md) \ No newline at end of file +Next, we'll [create a Source using the connector generator.](1-create-source.md) diff --git a/docs/connector-development/config-based/tutorial/1-create-source.md b/docs/connector-development/config-based/tutorial/1-create-source.md index 568c5bcc1fa..905aa3a8791 100644 --- a/docs/connector-development/config-based/tutorial/1-create-source.md +++ b/docs/connector-development/config-based/tutorial/1-create-source.md @@ -1,4 +1,4 @@ -# Step 1: Generate the source connector project locally +# Step 1: Generate the source connector project locally Let's start by cloning the Airbyte repository: @@ -30,4 +30,4 @@ Next, [we'll install dependencies required to run the connector](2-install-depen ## More readings -- [Connector generator](https://github.com/airbytehq/airbyte/blob/master/airbyte-integrations/connector-templates/generator/README.md) \ No newline at end of file +- [Connector generator](https://github.com/airbytehq/airbyte/blob/master/airbyte-integrations/connector-templates/generator/README.md) diff --git a/docs/connector-development/config-based/tutorial/2-install-dependencies.md b/docs/connector-development/config-based/tutorial/2-install-dependencies.md index 55520557fc3..23afd8b8c58 100644 --- a/docs/connector-development/config-based/tutorial/2-install-dependencies.md +++ b/docs/connector-development/config-based/tutorial/2-install-dependencies.md @@ -1,6 +1,5 @@ # Step 2: Install dependencies - ```bash cd ../../connectors/source-exchange-rates-tutorial poetry install @@ -35,4 +34,4 @@ Next, we'll [connect to the API 
source](3-connecting-to-the-API-source.md) - [Basic Concepts](https://docs.airbyte.com/connector-development/cdk-python/basic-concepts) - [Defining Stream Schemas](https://docs.airbyte.com/connector-development/cdk-python/schemas) -- The module's generated `README.md` contains more details on the supported commands. \ No newline at end of file +- The module's generated `README.md` contains more details on the supported commands. diff --git a/docs/connector-development/config-based/tutorial/3-connecting-to-the-API-source.md b/docs/connector-development/config-based/tutorial/3-connecting-to-the-API-source.md index 752ccee58ef..adab88e68b9 100644 --- a/docs/connector-development/config-based/tutorial/3-connecting-to-the-API-source.md +++ b/docs/connector-development/config-based/tutorial/3-connecting-to-the-API-source.md @@ -21,7 +21,7 @@ Let's populate the specification (`spec`) and the configuration (`secrets/config 1. We'll add these properties to the `spec` block in the `source-exchange-rates-tutorial/source_exchange_rates_tutorial/manifest.yaml` ```yaml -spec: +spec: documentation_url: https://docs.airbyte.com/integrations/sources/exchangeratesapi connection_specification: $schema: http://json-schema.org/draft-07/schema# @@ -75,12 +75,12 @@ definitions: 2. Then, let's rename the stream from `customers` to `rates`, update the primary key to `date`, and set the path to "/exchangerates_data/latest" as per the API's documentation. This path is specific to the stream, so we'll set it within the `rates_stream` definition ```yaml - rates_stream: - $ref: "#/definitions/base_stream" - $parameters: - name: "rates" - primary_key: "date" - path: "/exchangerates_data/latest" +rates_stream: + $ref: "#/definitions/base_stream" + $parameters: + name: "rates" + primary_key: "date" + path: "/exchangerates_data/latest" ``` We'll also update the reference in the `streams` block @@ -136,7 +136,7 @@ version: "0.1.0" definitions: selector: extractor: - field_path: [ ] + field_path: [] requester: url_base: "https://api.apilayer.com" http_method: "GET" @@ -169,7 +169,7 @@ streams: check: stream_names: - "rates" -spec: +spec: documentation_url: https://docs.airbyte.com/integrations/sources/exchangeratesapi connection_specification: $schema: http://json-schema.org/draft-07/schema# diff --git a/docs/connector-development/config-based/tutorial/4-reading-data.md b/docs/connector-development/config-based/tutorial/4-reading-data.md index d1f69b71163..a7deaedbf3f 100644 --- a/docs/connector-development/config-based/tutorial/4-reading-data.md +++ b/docs/connector-development/config-based/tutorial/4-reading-data.md @@ -10,9 +10,7 @@ Let's first add the stream to the configured catalog in `source-exchange-rates-t "stream": { "name": "rates", "json_schema": {}, - "supported_sync_modes": [ - "full_refresh" - ] + "supported_sync_modes": ["full_refresh"] }, "sync_mode": "full_refresh", "destination_sync_mode": "overwrite" diff --git a/docs/connector-development/config-based/tutorial/5-incremental-reads.md b/docs/connector-development/config-based/tutorial/5-incremental-reads.md index 9cf2aac0c86..ec11512dc61 100644 --- a/docs/connector-development/config-based/tutorial/5-incremental-reads.md +++ b/docs/connector-development/config-based/tutorial/5-incremental-reads.md @@ -10,7 +10,7 @@ We'll now add a `start_date` property to the connector. 
First we'll update the spec block in `source_exchange_rates_tutorial/manifest.yaml` ```yaml -spec: +spec: documentation_url: https://docs.airbyte.com/integrations/sources/exchangeratesapi connection_specification: $schema: http://json-schema.org/draft-07/schema# @@ -81,6 +81,7 @@ poetry run source-exchange-rates-tutorial read --config secrets/config.json --ca By reading the output record, you should see that we read historical data instead of the latest exchange rate. For example: + > "historical": true, "base": "USD", "date": "2022-07-18" The connector will now always read data for the start date, which is not exactly what we want. @@ -156,7 +157,7 @@ version: "0.1.0" definitions: selector: extractor: - field_path: [ ] + field_path: [] requester: url_base: "https://api.apilayer.com" http_method: "GET" @@ -202,7 +203,7 @@ streams: check: stream_names: - "rates" -spec: +spec: documentation_url: https://docs.airbyte.com/integrations/sources/exchangeratesapi connection_specification: $schema: http://json-schema.org/draft-07/schema# @@ -261,10 +262,7 @@ This can be achieved by updating the catalog to run in incremental mode (`integr "stream": { "name": "rates", "json_schema": {}, - "supported_sync_modes": [ - "full_refresh", - "incremental" - ] + "supported_sync_modes": ["full_refresh", "incremental"] }, "sync_mode": "incremental", "destination_sync_mode": "overwrite" diff --git a/docs/connector-development/config-based/tutorial/6-testing.md b/docs/connector-development/config-based/tutorial/6-testing.md index 7effad89c30..6eee821827f 100644 --- a/docs/connector-development/config-based/tutorial/6-testing.md +++ b/docs/connector-development/config-based/tutorial/6-testing.md @@ -45,4 +45,4 @@ Next, we'll add the connector to the [Airbyte platform](https://docs.airbyte.com - [Contribution guide](../../../contributing-to-airbyte/README.md) - [Greenhouse source](https://github.com/airbytehq/airbyte/tree/master/airbyte-integrations/connectors/source-greenhouse) - [Sendgrid source](https://github.com/airbytehq/airbyte/tree/master/airbyte-integrations/connectors/source-sendgrid) -- [Sentry source](https://github.com/airbytehq/airbyte/tree/master/airbyte-integrations/connectors/source-sentry) \ No newline at end of file +- [Sentry source](https://github.com/airbytehq/airbyte/tree/master/airbyte-integrations/connectors/source-sentry) diff --git a/docs/connector-development/config-based/understanding-the-yaml-file/authentication.md b/docs/connector-development/config-based/understanding-the-yaml-file/authentication.md index b0b0f9f3b45..b5cc4b13b1a 100644 --- a/docs/connector-development/config-based/understanding-the-yaml-file/authentication.md +++ b/docs/connector-development/config-based/understanding-the-yaml-file/authentication.md @@ -5,14 +5,14 @@ The `Authenticator` defines how to configure outgoing HTTP requests to authentic Schema: ```yaml - Authenticator: - type: object - description: "Authenticator type" - anyOf: - - "$ref": "#/definitions/OAuth" - - "$ref": "#/definitions/ApiKeyAuthenticator" - - "$ref": "#/definitions/BearerAuthenticator" - - "$ref": "#/definitions/BasicHttpAuthenticator" +Authenticator: + type: object + description: "Authenticator type" + anyOf: + - "$ref": "#/definitions/OAuth" + - "$ref": "#/definitions/ApiKeyAuthenticator" + - "$ref": "#/definitions/BearerAuthenticator" + - "$ref": "#/definitions/BasicHttpAuthenticator" ``` ## Authenticators @@ -25,19 +25,19 @@ The following definition will set the header "Authorization" with a value "Beare Schema: ```yaml - 
ApiKeyAuthenticator: - type: object - additionalProperties: true - required: - - header - - api_token - properties: - "$parameters": - "$ref": "#/definitions/$parameters" - header: - type: string - api_token: - type: string +ApiKeyAuthenticator: + type: object + additionalProperties: true + required: + - header + - api_token + properties: + "$parameters": + "$ref": "#/definitions/$parameters" + header: + type: string + api_token: + type: string ``` Example: @@ -57,16 +57,16 @@ The following definition will set the header "Authorization" with a value "Beare Schema: ```yaml - BearerAuthenticator: - type: object - additionalProperties: true - required: - - api_token - properties: - "$parameters": - "$ref": "#/definitions/$parameters" - api_token: - type: string +BearerAuthenticator: + type: object + additionalProperties: true + required: + - api_token + properties: + "$parameters": + "$ref": "#/definitions/$parameters" + api_token: + type: string ``` Example: @@ -87,18 +87,18 @@ The following definition will set the header "Authorization" with a value `Basic Schema: ```yaml - BasicHttpAuthenticator: - type: object - additionalProperties: true - required: - - username - properties: - "$parameters": - "$ref": "#/definitions/$parameters" - username: - type: string - password: - type: string +BasicHttpAuthenticator: + type: object + additionalProperties: true + required: + - username + properties: + "$parameters": + "$ref": "#/definitions/$parameters" + username: + type: string + password: + type: string ``` Example: @@ -138,45 +138,45 @@ OAuth authentication is supported through the `OAuthAuthenticator`, which requir Schema: ```yaml - OAuth: - type: object - additionalProperties: true - required: - - token_refresh_endpoint - - client_id - - client_secret - - refresh_token - - access_token_name - - expires_in_name - properties: - "$parameters": - "$ref": "#/definitions/$parameters" - token_refresh_endpoint: +OAuth: + type: object + additionalProperties: true + required: + - token_refresh_endpoint + - client_id + - client_secret + - refresh_token + - access_token_name + - expires_in_name + properties: + "$parameters": + "$ref": "#/definitions/$parameters" + token_refresh_endpoint: + type: string + client_id: + type: string + client_secret: + type: string + refresh_token: + type: string + scopes: + type: array + items: type: string - client_id: - type: string - client_secret: - type: string - refresh_token: - type: string - scopes: - type: array - items: - type: string - default: [ ] - token_expiry_date: - type: string - access_token_name: - type: string - default: "access_token" - expires_in_name: - type: string - default: "expires_in" - refresh_request_body: - type: object - grant_type: - type: string - default: "refresh_token" + default: [] + token_expiry_date: + type: string + access_token_name: + type: string + default: "access_token" + expires_in_name: + type: string + default: "expires_in" + refresh_request_body: + type: object + grant_type: + type: string + default: "refresh_token" ``` Example: @@ -195,6 +195,7 @@ authenticator: JSON Web Token (JWT) authentication is supported through the `JwtAuthenticator`. 
Schema + ```yaml JwtAuthenticator: title: JWT Authenticator @@ -323,23 +324,23 @@ Example: ```yaml authenticator: - type: JwtAuthenticator - secret_key: "{{ config['secret_key'] }}" - base64_encode_secret_key: True - algorithm: RS256 - token_duration: 3600 - header_prefix: Bearer - jwt_headers: - kid: "{{ config['kid'] }}" - cty: "JWT" - additional_jwt_headers: - test: "{{ config['test']}}" - jwt_payload: - iss: "{{ config['iss'] }}" - sub: "sub value" - aud: "aud value" - additional_jwt_payload: - test: "test custom payload" + type: JwtAuthenticator + secret_key: "{{ config['secret_key'] }}" + base64_encode_secret_key: True + algorithm: RS256 + token_duration: 3600 + header_prefix: Bearer + jwt_headers: + kid: "{{ config['kid'] }}" + cty: "JWT" + additional_jwt_headers: + test: "{{ config['test']}}" + jwt_payload: + iss: "{{ config['iss'] }}" + sub: "sub value" + aud: "aud value" + additional_jwt_payload: + test: "test custom payload" ``` ## More readings diff --git a/docs/connector-development/config-based/understanding-the-yaml-file/error-handling.md b/docs/connector-development/config-based/understanding-the-yaml-file/error-handling.md index 4b3ef60b01a..746f6773514 100644 --- a/docs/connector-development/config-based/understanding-the-yaml-file/error-handling.md +++ b/docs/connector-development/config-based/understanding-the-yaml-file/error-handling.md @@ -8,12 +8,12 @@ Other behaviors can be configured through the `Requester`'s `error_handler` fiel Schema: ```yaml - ErrorHandler: - type: object - description: "Error handler" - anyOf: - - "$ref": "#/definitions/DefaultErrorHandler" - - "$ref": "#/definitions/CompositeErrorHandler" +ErrorHandler: + type: object + description: "Error handler" + anyOf: + - "$ref": "#/definitions/DefaultErrorHandler" + - "$ref": "#/definitions/CompositeErrorHandler" ``` ## Default error handler @@ -21,26 +21,26 @@ Schema: Schema: ```yaml - DefaultErrorHandler: - type: object - required: - - max_retries - additionalProperties: true - properties: - "$parameters": - "$ref": "#/definitions/$parameters" - response_filters: - type: array - items: - "$ref": "#/definitions/HttpResponseFilter" - max_retries: - type: integer - default: 5 - backoff_strategies: - type: array - items: - "$ref": "#/definitions/BackoffStrategy" - default: [ ] +DefaultErrorHandler: + type: object + required: + - max_retries + additionalProperties: true + properties: + "$parameters": + "$ref": "#/definitions/$parameters" + response_filters: + type: array + items: + "$ref": "#/definitions/HttpResponseFilter" + max_retries: + type: integer + default: 5 + backoff_strategies: + type: array + items: + "$ref": "#/definitions/BackoffStrategy" + default: [] ``` ## Defining errors @@ -53,32 +53,32 @@ For instance, this example will configure the handler to also retry responses wi Schema: ```yaml - HttpResponseFilter: - type: object - required: - - action - additionalProperties: true - properties: - "$parameters": - "$ref": "#/definitions/$parameters" - action: - "$ref": "#/definitions/ResponseAction" - http_codes: - type: array - items: - type: integer - default: [ ] - error_message_contains: - type: string - predicate: - type: string - ResponseAction: - type: string - enum: - - SUCCESS - - FAIL - - IGNORE - - RETRY +HttpResponseFilter: + type: object + required: + - action + additionalProperties: true + properties: + "$parameters": + "$ref": "#/definitions/$parameters" + action: + "$ref": "#/definitions/ResponseAction" + http_codes: + type: array + items: + type: integer + default: [] + 
error_message_contains: + type: string + predicate: + type: string +ResponseAction: + type: string + enum: + - SUCCESS + - FAIL + - IGNORE + - RETRY ``` Example: @@ -154,13 +154,13 @@ The error handler supports a few backoff strategies, which are described in the Schema: ```yaml - BackoffStrategy: - type: object - anyOf: - - "$ref": "#/definitions/ExponentialBackoffStrategy" - - "$ref": "#/definitions/ConstantBackoffStrategy" - - "$ref": "#/definitions/WaitTimeFromHeader" - - "$ref": "#/definitions/WaitUntilTimeFromHeader" +BackoffStrategy: + type: object + anyOf: + - "$ref": "#/definitions/ExponentialBackoffStrategy" + - "$ref": "#/definitions/ConstantBackoffStrategy" + - "$ref": "#/definitions/WaitTimeFromHeader" + - "$ref": "#/definitions/WaitUntilTimeFromHeader" ``` ### Exponential backoff @@ -170,15 +170,15 @@ This is the default backoff strategy. The requester will backoff with an exponen Schema: ```yaml - ExponentialBackoffStrategy: - type: object - additionalProperties: true - properties: - "$parameters": - "$ref": "#/definitions/$parameters" - factor: - type: integer - default: 5 +ExponentialBackoffStrategy: + type: object + additionalProperties: true + properties: + "$parameters": + "$ref": "#/definitions/$parameters" + factor: + type: integer + default: 5 ``` ### Constant Backoff @@ -188,16 +188,16 @@ When using the `ConstantBackoffStrategy` strategy, the requester will backoff wi Schema: ```yaml - ConstantBackoffStrategy: - type: object - additionalProperties: true - required: - - backoff_time_in_seconds - properties: - "$parameters": - "$ref": "#/definitions/$parameters" - backoff_time_in_seconds: - type: number +ConstantBackoffStrategy: + type: object + additionalProperties: true + required: + - backoff_time_in_seconds + properties: + "$parameters": + "$ref": "#/definitions/$parameters" + backoff_time_in_seconds: + type: number ``` ### Wait time defined in header @@ -208,18 +208,18 @@ In this example, the requester will backoff by the response's "wait_time" header Schema: ```yaml - WaitTimeFromHeader: - type: object - additionalProperties: true - required: - - header - properties: - "$parameters": - "$ref": "#/definitions/$parameters" - header: - type: string - regex: - type: string +WaitTimeFromHeader: + type: object + additionalProperties: true + required: + - header + properties: + "$parameters": + "$ref": "#/definitions/$parameters" + header: + type: string + regex: + type: string ``` Example: @@ -257,20 +257,20 @@ In this example, the requester will wait until the time specified in the "wait_u Schema: ```yaml - WaitUntilTimeFromHeader: - type: object - additionalProperties: true - required: - - header - properties: - "$parameters": - "$ref": "#/definitions/$parameters" - header: - type: string - regex: - type: string - min_wait: - type: number +WaitUntilTimeFromHeader: + type: object + additionalProperties: true + required: + - header + properties: + "$parameters": + "$ref": "#/definitions/$parameters" + header: + type: string + regex: + type: string + min_wait: + type: number ``` Example: @@ -315,17 +315,17 @@ In this example, a constant backoff of 5 seconds, will be applied if the respons Schema: ```yaml - CompositeErrorHandler: - type: object - required: - - error_handlers - additionalProperties: - "$parameters": - "$ref": "#/definitions/$parameters" - error_handlers: - type: array - items: - "$ref": "#/definitions/ErrorHandler" +CompositeErrorHandler: + type: object + required: + - error_handlers + additionalProperties: + "$parameters": + "$ref": 
"#/definitions/$parameters" + error_handlers: + type: array + items: + "$ref": "#/definitions/ErrorHandler" ``` Example: diff --git a/docs/connector-development/config-based/understanding-the-yaml-file/incremental-syncs.md b/docs/connector-development/config-based/understanding-the-yaml-file/incremental-syncs.md index 16f0439d8b7..7616f3822fc 100644 --- a/docs/connector-development/config-based/understanding-the-yaml-file/incremental-syncs.md +++ b/docs/connector-development/config-based/understanding-the-yaml-file/incremental-syncs.md @@ -10,10 +10,10 @@ When a stream is read incrementally, a state message will be output by the conne ## DatetimeBasedCursor -The `DatetimeBasedCursor` is used to read records from the underlying data source (e.g: an API) according to a specified datetime range. This time range is partitioned into time windows according to the `step`. For example, if you have `start_time=2022-01-01T00:00:00`, `end_time=2022-01-05T00:00:00` and `step=P1D`, the following partitions will be created: +The `DatetimeBasedCursor` is used to read records from the underlying data source (e.g: an API) according to a specified datetime range. This time range is partitioned into time windows according to the `step`. For example, if you have `start_time=2022-01-01T00:00:00`, `end_time=2022-01-05T00:00:00` and `step=P1D`, the following partitions will be created: | Start | End | -|---------------------|---------------------| +| ------------------- | ------------------- | | 2022-01-01T00:00:00 | 2022-01-01T23:59:59 | | 2022-01-02T00:00:00 | 2022-01-02T23:59:59 | | 2022-01-03T00:00:00 | 2022-01-03T23:59:59 | @@ -27,83 +27,83 @@ Upon a successful sync, the final stream state will be the datetime of the last Schema: ```yaml - DatetimeBasedCursor: - description: Cursor to provide incremental capabilities over datetime - type: object - required: - - type - - cursor_field - - end_datetime - - datetime_format - - cursor_granularity - - start_datetime - - step - properties: - type: - type: string - enum: [DatetimeBasedCursor] - cursor_field: - description: The location of the value on a record that will be used as a bookmark during sync - type: string - datetime_format: - description: The format of the datetime - type: string - cursor_granularity: - description: Smallest increment the datetime_format has (ISO 8601 duration) that is used to ensure the start of a slice does not overlap with the end of the previous one - type: string - end_datetime: - description: The datetime that determines the last record that should be synced - anyOf: - - type: string - - "$ref": "#/definitions/MinMaxDatetime" - start_datetime: - description: The datetime that determines the earliest record that should be synced - anyOf: - - type: string - - "$ref": "#/definitions/MinMaxDatetime" - step: - description: The size of the time window (ISO8601 duration) - type: string - end_time_option: - description: Request option for end time - "$ref": "#/definitions/RequestOption" - lookback_window: - description: How many days before start_datetime to read data for (ISO8601 duration) - type: string - start_time_option: - description: Request option for start time - "$ref": "#/definitions/RequestOption" - partition_field_end: - description: Partition start time field - type: string - partition_field_start: - description: Partition end time field - type: string - $parameters: - type: object - additionalProperties: true - MinMaxDatetime: - description: Compares the provided date against optional minimum or maximum times. 
The max_datetime serves as the ceiling and will be returned when datetime exceeds it. The min_datetime serves as the floor - type: object - required: - - type - - datetime - properties: - type: - type: string - enum: [MinMaxDatetime] - datetime: - type: string - datetime_format: - type: string - default: "" - max_datetime: - type: string - min_datetime: - type: string - $parameters: - type: object - additionalProperties: true +DatetimeBasedCursor: + description: Cursor to provide incremental capabilities over datetime + type: object + required: + - type + - cursor_field + - end_datetime + - datetime_format + - cursor_granularity + - start_datetime + - step + properties: + type: + type: string + enum: [DatetimeBasedCursor] + cursor_field: + description: The location of the value on a record that will be used as a bookmark during sync + type: string + datetime_format: + description: The format of the datetime + type: string + cursor_granularity: + description: Smallest increment the datetime_format has (ISO 8601 duration) that is used to ensure the start of a slice does not overlap with the end of the previous one + type: string + end_datetime: + description: The datetime that determines the last record that should be synced + anyOf: + - type: string + - "$ref": "#/definitions/MinMaxDatetime" + start_datetime: + description: The datetime that determines the earliest record that should be synced + anyOf: + - type: string + - "$ref": "#/definitions/MinMaxDatetime" + step: + description: The size of the time window (ISO8601 duration) + type: string + end_time_option: + description: Request option for end time + "$ref": "#/definitions/RequestOption" + lookback_window: + description: How many days before start_datetime to read data for (ISO8601 duration) + type: string + start_time_option: + description: Request option for start time + "$ref": "#/definitions/RequestOption" + partition_field_end: + description: Partition start time field + type: string + partition_field_start: + description: Partition end time field + type: string + $parameters: + type: object + additionalProperties: true +MinMaxDatetime: + description: Compares the provided date against optional minimum or maximum times. The max_datetime serves as the ceiling and will be returned when datetime exceeds it. The min_datetime serves as the floor + type: object + required: + - type + - datetime + properties: + type: + type: string + enum: [MinMaxDatetime] + datetime: + type: string + datetime_format: + type: string + default: "" + max_datetime: + type: string + min_datetime: + type: string + $parameters: + type: object + additionalProperties: true ``` Example: diff --git a/docs/connector-development/config-based/understanding-the-yaml-file/pagination.md b/docs/connector-development/config-based/understanding-the-yaml-file/pagination.md index b47e57416b1..620494a071d 100644 --- a/docs/connector-development/config-based/understanding-the-yaml-file/pagination.md +++ b/docs/connector-development/config-based/understanding-the-yaml-file/pagination.md @@ -9,14 +9,14 @@ Conversely, pages don't have semantic value. 
More pages simply means that more r Schema: ```yaml - Paginator: - type: object - anyOf: - - "$ref": "#/definitions/DefaultPaginator" - - "$ref": "#/definitions/NoPagination" - NoPagination: - type: object - additionalProperties: true +Paginator: + type: object + anyOf: + - "$ref": "#/definitions/DefaultPaginator" + - "$ref": "#/definitions/NoPagination" +NoPagination: + type: object + additionalProperties: true ``` ## Default paginator @@ -30,25 +30,25 @@ The default paginator is defined by Schema: ```yaml - DefaultPaginator: - type: object - additionalProperties: true - required: - - page_token_option - - pagination_strategy - properties: - "$parameters": - "$ref": "#/definitions/$parameters" - page_size: - type: integer - page_size_option: - "$ref": "#/definitions/RequestOption" - page_token_option: - anyOf: - - "$ref": "#/definitions/RequestOption" - - "$ref": "#/definitions/RequestPath" - pagination_strategy: - "$ref": "#/definitions/PaginationStrategy" +DefaultPaginator: + type: object + additionalProperties: true + required: + - page_token_option + - pagination_strategy + properties: + "$parameters": + "$ref": "#/definitions/$parameters" + page_size: + type: integer + page_size_option: + "$ref": "#/definitions/RequestOption" + page_token_option: + anyOf: + - "$ref": "#/definitions/RequestOption" + - "$ref": "#/definitions/RequestPath" + pagination_strategy: + "$ref": "#/definitions/PaginationStrategy" ``` 3 pagination strategies are supported @@ -62,12 +62,12 @@ Schema: Schema: ```yaml - PaginationStrategy: - type: object - anyOf: - - "$ref": "#/definitions/CursorPagination" - - "$ref": "#/definitions/OffsetIncrement" - - "$ref": "#/definitions/PageIncrement" +PaginationStrategy: + type: object + anyOf: + - "$ref": "#/definitions/CursorPagination" + - "$ref": "#/definitions/OffsetIncrement" + - "$ref": "#/definitions/PageIncrement" ``` ### Page increment @@ -77,16 +77,16 @@ When using the `PageIncrement` strategy, the page number will be set as part of Schema: ```yaml - PageIncrement: - type: object - additionalProperties: true - required: - - page_size - properties: - "$parameters": - "$ref": "#/definitions/$parameters" - page_size: - type: integer +PageIncrement: + type: object + additionalProperties: true + required: + - page_size + properties: + "$parameters": + "$ref": "#/definitions/$parameters" + page_size: + type: integer ``` The following paginator example will fetch 5 records per page, and specify the page number as a request_parameter: @@ -123,16 +123,16 @@ When using the `OffsetIncrement` strategy, the number of records read will be se Schema: ```yaml - OffsetIncrement: - type: object - additionalProperties: true - required: - - page_size - properties: - "$parameters": - "$ref": "#/definitions/$parameters" - page_size: - type: integer +OffsetIncrement: + type: object + additionalProperties: true + required: + - page_size + properties: + "$parameters": + "$ref": "#/definitions/$parameters" + page_size: + type: integer ``` The following paginator example will fetch 5 records per page, and specify the offset as a request_parameter: @@ -172,20 +172,20 @@ This cursor value can be used to request the next page of record. 
Schema: ```yaml - CursorPagination: - type: object - additionalProperties: true - required: - - cursor_value - properties: - "$parameters": - "$ref": "#/definitions/$parameters" - cursor_value: - type: string - stop_condition: - type: string - page_size: - type: integer +CursorPagination: + type: object + additionalProperties: true + required: + - cursor_value + properties: + "$parameters": + "$ref": "#/definitions/$parameters" + cursor_value: + type: string + stop_condition: + type: string + page_size: + type: integer ``` #### Cursor paginator in request parameters diff --git a/docs/connector-development/config-based/understanding-the-yaml-file/partition-router.md b/docs/connector-development/config-based/understanding-the-yaml-file/partition-router.md index dd29a2cb28e..062dd1a0e54 100644 --- a/docs/connector-development/config-based/understanding-the-yaml-file/partition-router.md +++ b/docs/connector-development/config-based/understanding-the-yaml-file/partition-router.md @@ -9,18 +9,18 @@ The most common use case for the `PartitionRouter` component is the retrieval of Schema: ```yaml - partition_router: - default: [] - anyOf: - - "$ref": "#/definitions/CustomPartitionRouter" - - "$ref": "#/definitions/ListPartitionRouter" - - "$ref": "#/definitions/SubstreamPartitionRouter" - - type: array - items: - anyOf: - - "$ref": "#/definitions/CustomPartitionRouter" - - "$ref": "#/definitions/ListPartitionRouter" - - "$ref": "#/definitions/SubstreamPartitionRouter" +partition_router: + default: [] + anyOf: + - "$ref": "#/definitions/CustomPartitionRouter" + - "$ref": "#/definitions/ListPartitionRouter" + - "$ref": "#/definitions/SubstreamPartitionRouter" + - type: array + items: + anyOf: + - "$ref": "#/definitions/CustomPartitionRouter" + - "$ref": "#/definitions/ListPartitionRouter" + - "$ref": "#/definitions/SubstreamPartitionRouter" ``` Notice that you can specify one or more `PartitionRouter`s on a Retriever. When multiple are defined, the result will be Cartesian product of all partitions and a request cycle will be performed for each permutation. @@ -36,30 +36,30 @@ Notice that you can specify one or more `PartitionRouter`s on a Retriever. When Schema: ```yaml - ListPartitionRouter: - description: Partition router that is used to retrieve records that have been partitioned according to a list of values - type: object - required: - - type - - cursor_field - - slice_values - properties: - type: - type: string - enum: [ListPartitionRouter] - cursor_field: - type: string - partition_values: - anyOf: - - type: string - - type: array - items: - type: string - request_option: - "$ref": "#/definitions/RequestOption" - $parameters: - type: object - additionalProperties: true +ListPartitionRouter: + description: Partition router that is used to retrieve records that have been partitioned according to a list of values + type: object + required: + - type + - cursor_field + - slice_values + properties: + type: + type: string + enum: [ListPartitionRouter] + cursor_field: + type: string + partition_values: + anyOf: + - type: string + - type: array + items: + type: string + request_option: + "$ref": "#/definitions/RequestOption" + $parameters: + type: object + additionalProperties: true ``` As an example, this partition router will iterate over the 2 repositories ("airbyte" and "airbyte-secret") and will set a request_parameter on outgoing HTTP requests. 
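A configuration expressing that could look roughly like this (a sketch following the schema above; note that the schema shown here names the list `partition_values`, while some CDK versions call it `values`):

```yaml
partition_router:
  type: ListPartitionRouter
  # one partition (and therefore one request cycle) per repository
  partition_values:
    - "airbyte"
    - "airbyte-secret"
  # name under which the current value is exposed, e.g. {{ stream_slice['repository'] }}
  cursor_field: "repository"
  # send the current value as the `repository` query parameter on each request
  request_option:
    type: RequestOption
    inject_into: "request_parameter"
    field_name: "repository"
```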
@@ -95,23 +95,23 @@ Substreams are implemented by defining their partition router as a `SubstreamPar Schema: ```yaml - SubstreamPartitionRouter: - description: Partition router that is used to retrieve records that have been partitioned according to records from the specified parent streams - type: object - required: - - type - - parent_stream_configs - properties: - type: - type: string - enum: [SubstreamPartitionRouter] - parent_stream_configs: - type: array - items: - "$ref": "#/definitions/ParentStreamConfig" - $parameters: - type: object - additionalProperties: true +SubstreamPartitionRouter: + description: Partition router that is used to retrieve records that have been partitioned according to records from the specified parent streams + type: object + required: + - type + - parent_stream_configs + properties: + type: + type: string + enum: [SubstreamPartitionRouter] + parent_stream_configs: + type: array + items: + "$ref": "#/definitions/ParentStreamConfig" + $parameters: + type: object + additionalProperties: true ``` Example: diff --git a/docs/connector-development/config-based/understanding-the-yaml-file/record-selector.md b/docs/connector-development/config-based/understanding-the-yaml-file/record-selector.md index 3fe76778631..c44bdccc09d 100644 --- a/docs/connector-development/config-based/understanding-the-yaml-file/record-selector.md +++ b/docs/connector-development/config-based/understanding-the-yaml-file/record-selector.md @@ -4,21 +4,21 @@ The record selector is responsible for translating an HTTP response into a list Schema: ```yaml - HttpSelector: - type: object - anyOf: - - "$ref": "#/definitions/RecordSelector" - RecordSelector: - type: object - required: - - extractor - properties: - "$parameters": - "$ref": "#/definitions/$parameters" - extractor: - "$ref": "#/definitions/RecordExtractor" - record_filter: - "$ref": "#/definitions/RecordFilter" +HttpSelector: + type: object + anyOf: + - "$ref": "#/definitions/RecordSelector" +RecordSelector: + type: object + required: + - extractor + properties: + "$parameters": + "$ref": "#/definitions/$parameters" + extractor: + "$ref": "#/definitions/RecordExtractor" + record_filter: + "$ref": "#/definitions/RecordFilter" ``` The current record extraction implementation uses [dpath](https://pypi.org/project/dpath/) to select records from the json-decoded HTTP response. @@ -26,18 +26,18 @@ For nested structures `*` can be used to iterate over array elements. Schema: ```yaml - DpathExtractor: - type: object - additionalProperties: true - required: - - field_path - properties: - "$parameters": - "$ref": "#/definitions/$parameters" - field_path: - type: array - items: - type: string +DpathExtractor: + type: object + additionalProperties: true + required: + - field_path + properties: + "$parameters": + "$ref": "#/definitions/$parameters" + field_path: + type: array + items: + type: string ``` ## Common recipes: @@ -51,7 +51,7 @@ If the root of the response is an array containing the records, the records can ```yaml selector: extractor: - field_path: [ ] + field_path: [] ``` If the root of the response is a json object representing a single record, the record can be extracted and wrapped in an array. 
@@ -68,7 +68,7 @@ and a selector ```yaml selector: extractor: - field_path: [ ] + field_path: [] ``` The selected records will be @@ -97,7 +97,7 @@ and a selector ```yaml selector: extractor: - field_path: [ "data" ] + field_path: ["data"] ``` The selected records will be @@ -137,7 +137,7 @@ and a selector ```yaml selector: extractor: - field_path: [ "data", "records" ] + field_path: ["data", "records"] ``` The selected records will be @@ -158,7 +158,6 @@ The selected records will be Given a response body of the form ```json - { "data": [ { @@ -173,7 +172,6 @@ Given a response body of the form } ] } - ``` and a selector @@ -181,7 +179,7 @@ and a selector ```yaml selector: extractor: - field_path: [ "data", "*", "record" ] + field_path: ["data", "*", "record"] ``` The selected records will be @@ -207,7 +205,7 @@ In this example, all records with a `created_at` field greater than the stream s ```yaml selector: extractor: - field_path: [ ] + field_path: [] record_filter: condition: "{{ record['created_at'] < stream_slice['start_time'] }}" ``` @@ -219,11 +217,11 @@ Fields can be added or removed from records by adding `Transformation`s to a str Schema: ```yaml - RecordTransformation: - type: object - anyOf: - - "$ref": "#/definitions/AddFields" - - "$ref": "#/definitions/RemoveFields" +RecordTransformation: + type: object + anyOf: + - "$ref": "#/definitions/AddFields" + - "$ref": "#/definitions/RemoveFields" ``` ### Adding fields @@ -234,35 +232,35 @@ This example adds a top-level field "field1" with a value "static_value" Schema: ```yaml - AddFields: - type: object - required: - - fields - additionalProperties: true - properties: - "$parameters": - "$ref": "#/definitions/$parameters" - fields: - type: array - items: - "$ref": "#/definitions/AddedFieldDefinition" - AddedFieldDefinition: - type: object - required: - - path - - value - additionalProperties: true - properties: - "$parameters": - "$ref": "#/definitions/$parameters" - path: - "$ref": "#/definitions/FieldPointer" - value: - type: string - FieldPointer: - type: array - items: +AddFields: + type: object + required: + - fields + additionalProperties: true + properties: + "$parameters": + "$ref": "#/definitions/$parameters" + fields: + type: array + items: + "$ref": "#/definitions/AddedFieldDefinition" +AddedFieldDefinition: + type: object + required: + - path + - value + additionalProperties: true + properties: + "$parameters": + "$ref": "#/definitions/$parameters" + path: + "$ref": "#/definitions/FieldPointer" + value: type: string +FieldPointer: + type: array + items: + type: string ``` Example: @@ -335,26 +333,25 @@ Fields can be removed from records with the `RemoveFields` transformation. 
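For example, a stream could drop a nested field from every record with a transformation along these lines (a sketch; the pointer `["path", "to", "field1"]` is purely illustrative):

```yaml
transformations:
  - type: RemoveFields
    # each pointer is a list of keys leading to the field that should be removed
    field_pointers:
      - ["path", "to", "field1"]
```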
Schema: ```yaml - RemoveFields: - type: object - required: - - field_pointers - additionalProperties: true - properties: - "$parameters": - "$ref": "#/definitions/$parameters" - field_pointers: - type: array - items: - "$ref": "#/definitions/FieldPointer" - +RemoveFields: + type: object + required: + - field_pointers + additionalProperties: true + properties: + "$parameters": + "$ref": "#/definitions/$parameters" + field_pointers: + type: array + items: + "$ref": "#/definitions/FieldPointer" ``` Given a record of the following shape: ``` { - "path": + "path": { "to": { @@ -383,7 +380,7 @@ resulting in the following record: ``` { - "path": + "path": { "to": { diff --git a/docs/connector-development/config-based/understanding-the-yaml-file/reference.md b/docs/connector-development/config-based/understanding-the-yaml-file/reference.md index 85c60bc9b8d..d33e322444d 100644 --- a/docs/connector-development/config-based/understanding-the-yaml-file/reference.md +++ b/docs/connector-development/config-based/understanding-the-yaml-file/reference.md @@ -2,7 +2,6 @@ import ManifestYamlDefinitions from '@site/src/components/ManifestYamlDefinition import schema from "../../../../airbyte-cdk/python/airbyte_cdk/sources/declarative/declarative_component_schema.yaml"; - # YAML Reference This page lists all components, interpolation variables and interpolation macros that can be used when defining a low code YAML file. @@ -12,49 +11,49 @@ For the technical JSON schema definition that low code manifests are validated a export const toc = [ - { - "value": "Components:", - "id": "components", - "level": 2 - }, - { - value: "DeclarativeSource", - id: "/definitions/DeclarativeSource", - level: 3 - }, - ...Object.keys(schema.definitions).map((id) => ({ - value: id, - id: `/definitions/${id}`, - level: 3 - })), - { - "value": "Interpolation variables:", - "id": "variables", - "level": 2 - }, - ...schema.interpolation.variables.map((def) => ({ - value: def.title, - id: `/variables/${def.title}`, - level: 3 - })), - { - "value": "Interpolation macros:", - "id": "macros", - "level": 2 - }, - ...schema.interpolation.macros.map((def) => ({ - value: def.title, - id: `/macros/${def.title}`, - level: 3 - })), - { - "value": "Interpolation filters:", - "id": "filters", - "level": 2 - }, - ...schema.interpolation.filters.map((def) => ({ - value: def.title, - id: `/filters/${def.title}`, - level: 3 - })) -]; \ No newline at end of file +{ +"value": "Components:", +"id": "components", +"level": 2 +}, +{ +value: "DeclarativeSource", +id: "/definitions/DeclarativeSource", +level: 3 +}, +...Object.keys(schema.definitions).map((id) => ({ +value: id, +id: `/definitions/${id}`, +level: 3 +})), +{ +"value": "Interpolation variables:", +"id": "variables", +"level": 2 +}, +...schema.interpolation.variables.map((def) => ({ +value: def.title, +id: `/variables/${def.title}`, +level: 3 +})), +{ +"value": "Interpolation macros:", +"id": "macros", +"level": 2 +}, +...schema.interpolation.macros.map((def) => ({ +value: def.title, +id: `/macros/${def.title}`, +level: 3 +})), +{ +"value": "Interpolation filters:", +"id": "filters", +"level": 2 +}, +...schema.interpolation.filters.map((def) => ({ +value: def.title, +id: `/filters/${def.title}`, +level: 3 +})) +]; diff --git a/docs/connector-development/config-based/understanding-the-yaml-file/request-options.md b/docs/connector-development/config-based/understanding-the-yaml-file/request-options.md index 14af7cda2a4..2af009db949 100644 --- 
a/docs/connector-development/config-based/understanding-the-yaml-file/request-options.md +++ b/docs/connector-development/config-based/understanding-the-yaml-file/request-options.md @@ -13,24 +13,24 @@ The options can be configured as key value pairs: Schema: ```yaml - RequestOptionsProvider: - type: object - anyOf: - - "$ref": "#/definitions/InterpolatedRequestOptionsProvider" - InterpolatedRequestOptionsProvider: - type: object - additionalProperties: true - properties: - "$parameters": - "$ref": "#/definitions/$parameters" - request_parameters: - "$ref": "#/definitions/RequestInput" - request_headers: - "$ref": "#/definitions/RequestInput" - request_body_data: - "$ref": "#/definitions/RequestInput" - request_body_json: - "$ref": "#/definitions/RequestInput" +RequestOptionsProvider: + type: object + anyOf: + - "$ref": "#/definitions/InterpolatedRequestOptionsProvider" +InterpolatedRequestOptionsProvider: + type: object + additionalProperties: true + properties: + "$parameters": + "$ref": "#/definitions/$parameters" + request_parameters: + "$ref": "#/definitions/RequestInput" + request_headers: + "$ref": "#/definitions/RequestInput" + request_body_data: + "$ref": "#/definitions/RequestInput" + request_body_json: + "$ref": "#/definitions/RequestInput" ``` Example: @@ -68,25 +68,25 @@ Some components can add request options to the requests sent to the API endpoint Schema: ```yaml - RequestOption: - description: A component that specifies the key field and where in the request a component's value should be inserted into. - type: object - required: - - type - - field_name - - inject_into - properties: - type: - type: string - enum: [RequestOption] - field_name: - type: string - inject_into: - enum: - - request_parameter - - header - - body_data - - body_json +RequestOption: + description: A component that specifies the key field and where in the request a component's value should be inserted into. + type: object + required: + - type + - field_name + - inject_into + properties: + type: + type: string + enum: [RequestOption] + field_name: + type: string + inject_into: + enum: + - request_parameter + - header + - body_data + - body_json ``` ### Request Path @@ -97,15 +97,15 @@ modify the HTTP path of the API endpoint being accessed. Schema: ```yaml - RequestPath: - description: A component that specifies where in the request path a component's value should be inserted into. - type: object - required: - - type - properties: - type: - type: string - enum: [RequestPath] +RequestPath: + description: A component that specifies where in the request path a component's value should be inserted into. 
+ type: object + required: + - type + properties: + type: + type: string + enum: [RequestPath] ``` ## Authenticators @@ -169,4 +169,4 @@ More details on incremental syncs can be found in the [incremental syncs section - [Requester](./requester.md) - [Pagination](./pagination.md) -- [Incremental Syncs](./incremental-syncs.md) \ No newline at end of file +- [Incremental Syncs](./incremental-syncs.md) diff --git a/docs/connector-development/config-based/understanding-the-yaml-file/requester.md b/docs/connector-development/config-based/understanding-the-yaml-file/requester.md index 789c04877f2..3f0319b69e9 100644 --- a/docs/connector-development/config-based/understanding-the-yaml-file/requester.md +++ b/docs/connector-development/config-based/understanding-the-yaml-file/requester.md @@ -13,39 +13,39 @@ There is currently only one implementation, the `HttpRequester`, which is define The schema of a requester object is: ```yaml - Requester: - type: object - anyOf: - - "$ref": "#/definitions/HttpRequester" - HttpRequester: - type: object - additionalProperties: true - required: - - url_base - - path - properties: - "$parameters": - "$ref": "#/definitions/$parameters" - url_base: - type: string - description: "base url" - path: - type: string - description: "path" - http_method: - "$ref": "#/definitions/HttpMethod" - default: "GET" - request_options_provider: - "$ref": "#/definitions/RequestOptionsProvider" - authenticator: - "$ref": "#/definitions/Authenticator" - error_handler: - "$ref": "#/definitions/ErrorHandler" - HttpMethod: - type: string - enum: - - GET - - POST +Requester: + type: object + anyOf: + - "$ref": "#/definitions/HttpRequester" +HttpRequester: + type: object + additionalProperties: true + required: + - url_base + - path + properties: + "$parameters": + "$ref": "#/definitions/$parameters" + url_base: + type: string + description: "base url" + path: + type: string + description: "path" + http_method: + "$ref": "#/definitions/HttpMethod" + default: "GET" + request_options_provider: + "$ref": "#/definitions/RequestOptionsProvider" + authenticator: + "$ref": "#/definitions/Authenticator" + error_handler: + "$ref": "#/definitions/ErrorHandler" +HttpMethod: + type: string + enum: + - GET + - POST ``` ## Configuring request parameters and headers @@ -57,4 +57,4 @@ Additionally, some stateful components use a `RequestOption` to configure the op ## More readings -- [Request options](./request-options.md) \ No newline at end of file +- [Request options](./request-options.md) diff --git a/docs/connector-development/config-based/understanding-the-yaml-file/yaml-overview.md b/docs/connector-development/config-based/understanding-the-yaml-file/yaml-overview.md index 47d25dd8f7d..643249a7ba1 100644 --- a/docs/connector-development/config-based/understanding-the-yaml-file/yaml-overview.md +++ b/docs/connector-development/config-based/understanding-the-yaml-file/yaml-overview.md @@ -7,7 +7,7 @@ The low-code framework involves editing a boilerplate [YAML file](../low-code-cd Streams define the schema of the data to sync, as well as how to read it from the underlying API source. A stream generally corresponds to a resource within the API. They are analogous to tables for a relational database source. -By default, the schema of a stream's data is defined as a [JSONSchema](https://json-schema.org/) file in `/schemas/.json`. +By default, the schema of a stream's data is defined as a [JSONSchema](https://json-schema.org/) file in `/schemas/.json`. 
Alternately, the stream's data schema can be stored in YAML format inline in the YAML file, by including the optional `schema_loader` key. If the data schema is provided inline, any schema on disk for that stream will be ignored. @@ -16,42 +16,42 @@ More information on how to define a stream's schema can be found [here](../../.. The stream object is represented in the YAML file as: ```yaml - DeclarativeStream: - description: A stream whose behavior is described by a set of declarative low code components - type: object - additionalProperties: true - required: - - type - - retriever - properties: - type: - type: string - enum: [DeclarativeStream] - retriever: - "$ref": "#/definitions/Retriever" - schema_loader: - definition: The schema loader used to retrieve the schema for the current stream +DeclarativeStream: + description: A stream whose behavior is described by a set of declarative low code components + type: object + additionalProperties: true + required: + - type + - retriever + properties: + type: + type: string + enum: [DeclarativeStream] + retriever: + "$ref": "#/definitions/Retriever" + schema_loader: + definition: The schema loader used to retrieve the schema for the current stream + anyOf: + - "$ref": "#/definitions/InlineSchemaLoader" + - "$ref": "#/definitions/JsonFileSchemaLoader" + stream_cursor_field: + definition: The field of the records being read that will be used during checkpointing + anyOf: + - type: string + - type: array + items: + - type: string + transformations: + definition: A list of transformations to be applied to each output record in the + type: array + items: anyOf: - - "$ref": "#/definitions/InlineSchemaLoader" - - "$ref": "#/definitions/JsonFileSchemaLoader" - stream_cursor_field: - definition: The field of the records being read that will be used during checkpointing - anyOf: - - type: string - - type: array - items: - - type: string - transformations: - definition: A list of transformations to be applied to each output record in the - type: array - items: - anyOf: - - "$ref": "#/definitions/AddFields" - - "$ref": "#/definitions/CustomTransformation" - - "$ref": "#/definitions/RemoveFields" - $parameters: - type: object - additional_properties: true + - "$ref": "#/definitions/AddFields" + - "$ref": "#/definitions/CustomTransformation" + - "$ref": "#/definitions/RemoveFields" + $parameters: + type: object + additional_properties: true ``` More details on streams and sources can be found in the [basic concepts section](../../cdk-python/basic-concepts.md). @@ -73,7 +73,7 @@ It is described by: 1. [Requester](./requester.md): Describes how to submit requests to the API source 2. [Paginator](./pagination.md): Describes how to navigate through the API's pages 3. [Record selector](./record-selector.md): Describes how to extract records from a HTTP response -4. [Partition router](./partition-router.md): Describes how to retrieve data across multiple resource locations +4. [Partition router](./partition-router.md): Describes how to retrieve data across multiple resource locations Each of those components (and their subcomponents) are defined by an explicit interface and one or many implementations. The developer can choose and configure the implementation they need depending on specifications of the integration they are building against. 
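As an illustration, a retriever that wires these components together could look roughly like this (a minimal sketch against a hypothetical `https://api.example.com/items` endpoint, with partition routing omitted):

```yaml
retriever:
  requester:
    type: HttpRequester
    url_base: "https://api.example.com"
    path: "/items"
    http_method: "GET"
  record_selector:
    type: RecordSelector
    extractor:
      type: DpathExtractor
      # assumes the records sit under a top-level `data` key in the response
      field_path: ["data"]
  paginator:
    type: NoPagination
```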
@@ -83,26 +83,26 @@ Since the `Retriever` is defined as part of the Stream configuration, different The schema of a retriever object is: ```yaml - retriever: - description: Retrieves records by synchronously sending requests to fetch records. The retriever acts as an orchestrator between the requester, the record selector, the paginator, and the partition router. - type: object - required: - - requester - - record_selector - - requester - properties: - "$parameters": - "$ref": "#/definitions/$parameters" - requester: - "$ref": "#/definitions/Requester" - record_selector: - "$ref": "#/definitions/HttpSelector" - paginator: - "$ref": "#/definitions/Paginator" - stream_slicer: - "$ref": "#/definitions/StreamSlicer" - PrimaryKey: - type: string +retriever: + description: Retrieves records by synchronously sending requests to fetch records. The retriever acts as an orchestrator between the requester, the record selector, the paginator, and the partition router. + type: object + required: + - requester + - record_selector + - requester + properties: + "$parameters": + "$ref": "#/definitions/$parameters" + requester: + "$ref": "#/definitions/Requester" + record_selector: + "$ref": "#/definitions/HttpSelector" + paginator: + "$ref": "#/definitions/Paginator" + stream_slicer: + "$ref": "#/definitions/StreamSlicer" +PrimaryKey: + type: string ``` ### Routing to Data that is Partitioned in Multiple Locations @@ -120,7 +120,7 @@ During a sync where both are configured, the Cartesian product of these paramete For example, if we had a `DatetimeBasedCursor` requesting data over a 3-day range partitioned by day and a `ListPartitionRouter` with the following locations `A`, `B`, and `C`. This would result in the following combinations that will be used to request data. | Partition | Date Range | -|-----------|-------------------------------------------| +| --------- | ----------------------------------------- | | A | 2022-01-01T00:00:00 - 2022-01-01T23:59:59 | | B | 2022-01-01T00:00:00 - 2022-01-01T23:59:59 | | C | 2022-01-01T00:00:00 - 2022-01-01T23:59:59 | diff --git a/docs/connector-development/connector-builder-ui/authentication.md b/docs/connector-development/connector-builder-ui/authentication.md index b57aa440cc8..d4ac8796122 100644 --- a/docs/connector-development/connector-builder-ui/authentication.md +++ b/docs/connector-development/connector-builder-ui/authentication.md @@ -13,17 +13,19 @@ If your API doesn't need authentication, leave it set at "No auth". This means t ## Authentication methods Check the documentation of the API you want to integrate for the used authentication method. The following ones are supported in the connector builder: -* [Basic HTTP](#basic-http) -* [Bearer Token](#bearer-token) -* [API Key](#api-key) -* [OAuth](#oauth) -* [Session Token](#session-token) + +- [Basic HTTP](#basic-http) +- [Bearer Token](#bearer-token) +- [API Key](#api-key) +- [OAuth](#oauth) +- [Session Token](#session-token) Select the matching authentication method for your API and check the sections below for more information about individual methods. 
### Basic HTTP If requests are authenticated using the Basic HTTP authentication method, the documentation page will likely contain one of the following keywords: + - "Basic Auth" - "Basic HTTP" - "Authorization: Basic" @@ -39,6 +41,7 @@ Sometimes, only a username and no password is required, like for the [Chargebee In the basic authentication scheme, the supplied username and password are concatenated with a colon `:` and encoded using the base64 algorithm. For username `user` and password `passwd`, the base64-encoding of `user:passwd` is `dXNlcjpwYXNzd2Q=`. When fetching records, this string is sent as part of the `Authorization` header: + ``` curl -X GET \ -H "Authorization: Basic dXNlcjpwYXNzd2Q=" \ @@ -56,6 +59,7 @@ Like the Basic HTTP authentication it does not require further configuration. Th The [Sendgrid API](https://docs.sendgrid.com/api-reference/how-to-use-the-sendgrid-v3-api/authentication) and the [Square API](https://developer.squareup.com/docs/build-basics/access-tokens) are supporting Bearer authentication. When fetching records, the token is sent along as the `Authorization` header: + ``` curl -X GET \ -H "Authorization: Bearer " \ @@ -68,18 +72,19 @@ The API key authentication method is similar to the Bearer authentication but al The following table helps with which mechanism to use for which API: -| Description | Injection mechanism | -|----------|----------| -| (HTTP) header | `header` | -| Query parameter / query string / request parameter / URL parameter | `request_parameter` | -| Form encoded request body / form data | `body_data` | -| JSON encoded request body | `body_json` | +| Description | Injection mechanism | +| ------------------------------------------------------------------ | ------------------- | +| (HTTP) header | `header` | +| Query parameter / query string / request parameter / URL parameter | `request_parameter` | +| Form encoded request body / form data | `body_data` | +| JSON encoded request body | `body_json` | #### Example The [CoinAPI.io API](https://docs.coinapi.io/market-data/rest-api#authorization) is using API key authentication via the `X-CoinAPI-Key` header. When fetching records, the api token is included in the request using the configured header: + ``` curl -X GET \ -H "X-CoinAPI-Key: " \ @@ -97,11 +102,12 @@ In this scheme, the OAuth endpoint of an API is called with client id and client The connector needs to be configured with the endpoint to call to obtain access tokens with the client id/secret and/or the refresh token. OAuth client id/secret and the refresh token are provided via "Testing values" in the connector builder as well as when configuring this connector as a Source. Depending on how the refresh endpoint is implemented exactly, additional configuration might be necessary to specify how to request an access token with the right permissions (configuring OAuth scopes and grant type) and how to extract the access token and the expiry date out of the response (configuring expiry date format and property name as well as the access key property name): -* **Scopes** - the [OAuth scopes](https://oauth.net/2/scope/) the access token will have access to. if not specified, no scopes are sent along with the refresh token request -* **Grant type** - the used OAuth grant type (either refresh token or client credentials). In case of refresh_token, a refresh token has to be provided by the end user when configuring the connector as a Source. 
-* **Token expiry property name** - the name of the property in the response that contains token expiry information. If not specified, it's set to `expires_in` -* **Token expire property date format** - if not specified, the expiry property is interpreted as the number of seconds the access token will be valid -* **Access token property name** - the name of the property in the response that contains the access token to do requests. If not specified, it's set to `access_token` + +- **Scopes** - the [OAuth scopes](https://oauth.net/2/scope/) the access token will have access to. if not specified, no scopes are sent along with the refresh token request +- **Grant type** - the used OAuth grant type (either refresh token or client credentials). In case of refresh_token, a refresh token has to be provided by the end user when configuring the connector as a Source. +- **Token expiry property name** - the name of the property in the response that contains token expiry information. If not specified, it's set to `expires_in` +- **Token expire property date format** - if not specified, the expiry property is interpreted as the number of seconds the access token will be valid +- **Access token property name** - the name of the property in the response that contains the access token to do requests. If not specified, it's set to `access_token` If the API uses other grant types like PKCE are required, it's not possible to use the connector builder with OAuth authentication - check out the [compatibility guide](/connector-development/connector-builder-ui/connector-builder-compatibility#oauth) for more information. @@ -112,10 +118,12 @@ Keep in mind that the OAuth authentication method does not implement a single-cl The [Square API](https://developer.squareup.com/docs/build-basics/access-tokens#get-an-oauth-access-token) supports OAuth. In this case, the authentication method has to be configured like this: -* "Token refresh endpoint" is `https://connect.squareup.com/oauth2/token` -* "Token expiry property name" is `expires_at` + +- "Token refresh endpoint" is `https://connect.squareup.com/oauth2/token` +- "Token expiry property name" is `expires_at` When running a sync, the connector is first sending client id, client secret and refresh token to the token refresh endpoint: + ``` curl -X POST \ @@ -125,6 +133,7 @@ curl -X POST \ ``` The response is a JSON object containing an `access_token` property and an `expires_at` property: + ``` {"access_token":"", "expires_at": "2023-12-12T00:00:00"} ``` @@ -132,6 +141,7 @@ The response is a JSON object containing an `access_token` property and an `expi The `expires_at` date tells the connector how long the access token can be used - if this point in time is passed, a new access token is requested automatically. When fetching records, the access token is sent along as part of the `Authorization` header: + ``` curl -X GET \ -H "Authorization: Bearer " \ @@ -145,9 +155,11 @@ In a lot of cases, OAuth refresh tokens are long-lived and can be used to create This can be done using the "Overwrite config with refresh token response" setting. If enabled, the authenticator expects a new refresh token to be returned from the token refresh endpoint. By default, the property `refresh_token` is used to extract the new refresh token, but this can be configured using the "Refresh token property name" setting. The connector then updates its own configuration with the new refresh token and uses it the next time an access token needs to be generated. 
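In that case, a token refresh response might look like this (a hypothetical example; the property names follow the defaults described above):

```
{
  "access_token": "<new access token>",
  "expires_at": "2023-12-12T00:00:00",
  "refresh_token": "<new refresh token>"
}
```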
If this option is used, it's necessary to specify an initial access token along with its expiry date in the "Testing values" menu. ### Session Token + Some APIs require callers to first fetch a unique token from one endpoint, then make the rest of their calls to all other endpoints using that token to authenticate themselves. These tokens usually have an expiration time, after which a new token needs to be re-fetched to continue making requests. This flow can be achieved through using the Session Token Authenticator. If requests are authenticated using the Session Token authentication method, the API documentation page will likely contain one of the following keywords: + - "Session Token" - "Session ID" - "Auth Token" @@ -155,16 +167,18 @@ If requests are authenticated using the Session Token authentication method, the - "Temporary Token" #### Configuration + The configuration of a Session Token authenticator is a bit more involved than other authenticators, as you need to configure both how to make requests to the session token retrieval endpoint (which requires its own authentication method), as well as how the token is extracted from that response and used for the data requests. We will walk through each part of the configuration below. Throughout this, we will refer to the [Metabase API](https://www.metabase.com/learn/administration/metabase-api#authenticate-your-requests-with-a-session-token) as an example of an API that uses session token authentication. + - `Session Token Retrieval` - this is a group of fields which configures how the session token is fetched from the session token endpoint in your API. Once the session token is retrieved, your connector will reuse that token until it expires, at which point it will retrieve a new session token using this configuration. - `URL` - the full URL of the session token endpoint - For Metabase, this would be `https://.metabaseapp.com/api/session`. - `HTTP Method` - the HTTP method that should be used when retrieving the session token endpoint, either `GET` or `POST` - Metabase requires `POST` for its `/api/session` requests. - `Authentication Method` - configures the method of authentication to use **for the session token retrieval request only** - - Note that this is separate from the parent Session Token Authenticator. It contains the same options as the parent Authenticator Method dropdown, except for OAuth (which is unlikely to be used for obtaining session tokens) and Session Token (as it does not make sense to nest). + - Note that this is separate from the parent Session Token Authenticator. It contains the same options as the parent Authenticator Method dropdown, except for OAuth (which is unlikely to be used for obtaining session tokens) and Session Token (as it does not make sense to nest). - For Metabase, the `/api/session` endpoint takes in a `username` and `password` in the request body. Since this is a non-standard authentication method, we must set this inner `Authentication Method` to `No Auth`, and instead configure the `Request Body` to pass these credentials (discussed below). - `Query Parameters` - used to attach query parameters to the session token retrieval request - Metabase does not require any query parameters in the `/api/session` request, so this is left unset. 
diff --git a/docs/connector-development/connector-builder-ui/connector-builder-compatibility.md b/docs/connector-development/connector-builder-ui/connector-builder-compatibility.md index 73df9137d71..3701d38724b 100644 --- a/docs/connector-development/connector-builder-ui/connector-builder-compatibility.md +++ b/docs/connector-development/connector-builder-ui/connector-builder-compatibility.md @@ -1,16 +1,18 @@ # Compatibility Guide + Answer the following questions to determine whether the Connector Builder is the right tool to build the connector you need: + - [ ] [Is it an HTTP API returning a collection of records synchronously?](#is-the-integration-an-http-api-returning-a-collection-of-records-synchronously) - [ ] [Are data endpoints fixed?](#are-data-endpoints-fixed) - [ ] [Is the API using one of the following authentication mechanism?](#what-type-of-authentication-is-required) - - [Basic HTTP](#basic-http) - - [API key injected in request header or query parameter](#api-key) - - [OAuth2.0 with long-lived refresh token](#is-the-oauth-refresh-token-long-lived) + - [Basic HTTP](#basic-http) + - [API key injected in request header or query parameter](#api-key) + - [OAuth2.0 with long-lived refresh token](#is-the-oauth-refresh-token-long-lived) - [ ] [Is the data returned as JSON?](#is-the-data-returned-as-json) - [ ] [If records are paginated, are they using one of the following mechanism?](#how-are-records-paginated) - - [Limit-offset](#limit-offset--offsetincrement-) - - [Page count](#page-count) - - [Link to the next page](#link-to-next-page--cursorpagination-) + - [Limit-offset](#limit-offset--offsetincrement-) + - [Page count](#page-count) + - [Link to the next page](#link-to-next-page--cursorpagination-) - [ ] [Are the required parameters of the integration key-value pairs?](#are-the-required-parameters-of-the-integration-key-value-pairs) You can use the Connector Builder if the integration checks all the items. @@ -30,6 +32,7 @@ Taking the [Congress API](https://api.congress.gov/#/bill) as an example, Indicates the records can be retrieved by submitting a GET request to the `/bill` path. The sample response shows that the response returns a collection of records, so the Congress API is a REST API returning a collection of records. Sample response: + ``` { "bills":[ @@ -75,6 +78,7 @@ These endpoints are also valid synchronous HTTP endpoints. This differs from the [Amazon Ads reports endpoint](https://advertising.amazon.com/API/docs/en-us/info/api-overview), which returns a report ID, which will be generated asynchronously by the source. This is not a synchronous HTTP API because the reports need to be downloaded separately. Examples: + - Yes: [Congress API](https://api.congress.gov/#/) - No: [Amazon Ads](https://advertising.amazon.com/API/docs/en-us/info/api-overview) @@ -89,10 +93,13 @@ For example, the [Congress API](https://api.congress.gov/#/) specifies the data If an integration has a dynamic list of data endpoints representing separate streams, use the Python CDK. ## What type of authentication is required? + Look up the authentication mechanism in the API documentation, and identify which type it is. ### Basic HTTP + Are requests authenticated using the Basic HTTP authentication method? 
You can search the documentation page for one of the following keywords + - "Basic Auth" - "Basic HTTP" - "Authorization: Basic" @@ -102,6 +109,7 @@ Example: [Greenhouse](https://developers.greenhouse.io/harvest.html#introduction If the authentication mechanism is Basic HTTP, it is compatible with the Connector Builder. ### API Key + Are requests authenticated using an API key injected either as a query parameter or as a request header? Examples: [Congress API](https://api.congress.gov/), [Sendgrid](https://docs.sendgrid.com/for-developers/sending-email/authentication) @@ -109,6 +117,7 @@ Examples: [Congress API](https://api.congress.gov/), [Sendgrid](https://docs.sen If the authentication mechanism is an API key injected as a query parameter or as a request header, it is compatible with the Connector Builder. ### OAuth + Are requests authenticated using an OAuth2.0 flow with a refresh token grant type? Examples: [Square](https://developer.squareup.com/docs/oauth-api/overview), [Woocommerce](https://woocommerce.github.io/woocommerce-rest-api-docs/#introduction) @@ -118,6 +127,7 @@ If the refresh request requires a [grant type](https://oauth.net/2/grant-types/) If the authentication mechanism is OAuth flow 2.0 with refresh token or client credentials and does not require custom query params, it is compatible with the Connector Builder. ### Session Token + Are data requests authenticated using a temporary session token that is obtained through a separate request? Examples: [Metabase](https://www.metabase.com/learn/administration/metabase-api#authenticate-your-requests-with-a-session-token), [Splunk](https://dev.splunk.com/observability/reference/api/sessiontokens/latest) @@ -125,6 +135,7 @@ Examples: [Metabase](https://www.metabase.com/learn/administration/metabase-api# If the authentication mechanism is a session token obtained through calling a separate endpoint, and which expires after some amount of time and needs to be re-obtained, it is compatible with the Connector Builder. ### Other + AWS endpoints are examples of APIs requiring a non-standard authentication mechanism. You can tell from [the documentation](https://docs.aws.amazon.com/pdfs/awscloudtrail/latest/APIReference/awscloudtrail-api.pdf#Welcome) that requests need to be signed with a hash. Example: [AWS Cloudtrail](https://docs.aws.amazon.com/pdfs/awscloudtrail/latest/APIReference/awscloudtrail-api.pdf#Welcome) @@ -132,21 +143,26 @@ Example: [AWS Cloudtrail](https://docs.aws.amazon.com/pdfs/awscloudtrail/latest/ If the integration requires a non-standard authentication mechanism, use Python CDK or low-code with custom components. ## Is the data returned as JSON? + Is the data returned by the API formatted as JSON, or is it formatted as another format such as XML, CSV, gRPC, or PDF? Examples: + - Yes: [Congress API](https://api.congress.gov/) - No: [Federal Railroad Administration (FRA) Safety Data APIs](https://safetydata.fra.dot.gov/MasterWebService/FRASafetyDataAPIs.aspx) If the data is not formatted as JSON, use the Python CDK. ## How are records paginated? + Look up the pagination mechanism in the API documentation, and identify which type it is. Here are the standard pagination mechanisms the connector builder supports: ### Page count + Endpoints using page count pagination accept two pagination parameters + 1. The number of records to be returned (typically called “page_size”) 2. 
The page to request (typically called “page” or “page number“) @@ -155,7 +171,9 @@ Example: [newsapi.ai](https://newsapi.ai/documentation) ![Page-count-example](./assets/connector_builder_compatibility/page_count_example.png) ### Limit-Offset (OffsetIncrement) + Endpoints using limit-offset pagination accept two pagination parameters + 1. The number of records to be returned (typically called “limit”) 2. The index of the first record to return (typically called “offset”) @@ -164,6 +182,7 @@ Endpoints using limit-offset pagination accept two pagination parameters Example: [Congress API](https://api.congress.gov/) ### Link to next page (CursorPagination) + Endpoints paginated with a link to the next page of records typically include either a “Link” field in the response header, or in the response body. You can search the documentation and the sample response for the “next” keyword. @@ -171,6 +190,7 @@ You can search the documentation and the sample response for the “next” keyw Example: [Greenhouse](https://developers.greenhouse.io/harvest.html#pagination) ### Are the required parameters of the integration key-value pairs? + The Connector Builder currently only supports key-value query params and request body parameters. This means endpoints requiring [GraphQL](https://graphql.org/) are not well supported at the moment. @@ -183,7 +203,8 @@ The endpoint requires a list of filters and metrics. This endpoint is not supported by the connector builder because the “filters” and “metrics” fields are lists. Examples: + - Yes: [Shopify GraphQL Admin API](https://shopify.dev/docs/api/admin-graphql#endpoints), [SproutSocial](https://api.sproutsocial.com/docs/#analytics-endpoints) - No: [Congress API](https://api.congress.gov/) -If the integration requires query params or body parameters that are not key-value pairs, use the Python CDK. \ No newline at end of file +If the integration requires query params or body parameters that are not key-value pairs, use the Python CDK. diff --git a/docs/connector-development/connector-builder-ui/error-handling.md b/docs/connector-development/connector-builder-ui/error-handling.md index b4ac595de32..d466c2b0c9a 100644 --- a/docs/connector-development/connector-builder-ui/error-handling.md +++ b/docs/connector-development/connector-builder-ui/error-handling.md @@ -6,7 +6,7 @@ When using the "Test" button to run a test sync of the connector, the Connector Error handlers allow for the connector to decide how to continue fetching data according to the contents of the response from the partner API. Depending on attributes of the response such as status code, text body, or headers, the connector can continue making requests, retry unsuccessful attempts, or fail the sync. -An error handler is made of two parts, "Backoff strategy" and "Response filter". When the conditions of the response filter are met, the connector will proceed with the sync according to behavior specified. See the [Response filter](#response-filter) section for a detailed breakdown of possible response filter actions. In the event of a failed request that needs to be retried, the backoff strategy determines how long the connector should wait before attempting the request again. +An error handler is made of two parts, "Backoff strategy" and "Response filter". When the conditions of the response filter are met, the connector will proceed with the sync according to behavior specified. See the [Response filter](#response-filter) section for a detailed breakdown of possible response filter actions. 
In the event of a failed request that needs to be retried, the backoff strategy determines how long the connector should wait before attempting the request again. When an error handler is not configured for a stream, the connector will default to retrying requests that received a 429 and 5XX status code in the response 5 times using a 5-second exponential backoff. This default retry behavior is recommended if the API documentation does not specify error handling or retry behavior. @@ -15,10 +15,11 @@ Refer to the documentation of the API you are building a connector for to determ ## Backoff strategies The API documentation will usually cover when to reattempt a failed request that is retryable. This is often through a `429 Too Many Requests` response status code, but it can vary for different APIs. The following backoff strategies are supported in the connector builder: -* [Constant](#constant) -* [Exponential](#exponential) -* [Wait time from header](#wait-time-from-header) -* [Wait until time from header](#wait-until-time-from-header) + +- [Constant](#constant) +- [Exponential](#exponential) +- [Wait time from header](#wait-time-from-header) +- [Wait until time from header](#wait-until-time-from-header) ### Constant @@ -38,7 +39,7 @@ Note: When no backoff strategy is defined, the connector defaults to using an ex #### Example -The [Delighted API](https://app.delighted.com/docs/api#rate-limits) is an API that recommends using an exponential backoff. In this case, the API documentation recommends retrying requests after 2 seconds, 4 seconds, then 8 seconds and so on. +The [Delighted API](https://app.delighted.com/docs/api#rate-limits) is an API that recommends using an exponential backoff. In this case, the API documentation recommends retrying requests after 2 seconds, 4 seconds, then 8 seconds and so on. Although a lot of API documentation does not call out using an exponential backoff, some APIs like the [Posthog API](https://posthog.com/docs/api) mention rate limits that are advantageous to use an exponential backoff. In this case, the rate limit of 240 requests/min should work for most syncs. However, if there is a spike in traffic, then the exponential backoff allows the connector to avoid sending more requests than the endpoint can support. @@ -62,7 +63,7 @@ The "Wait until time from header" backoff strategy allows the connector to wait #### Example -The [Recurly API](https://recurly.com/developers/api/v2021-02-25/index.html#section/Getting-Started/Limits) is an API that defines a header `X-RateLimit-Reset` which specifies when the request rate limit will be reset. +The [Recurly API](https://recurly.com/developers/api/v2021-02-25/index.html#section/Getting-Started/Limits) is an API that defines a header `X-RateLimit-Reset` which specifies when the request rate limit will be reset. Take for example a connector that makes a request at 25/04/2023 01:00:00 GMT and receives a response with a 429 status code and the header `X-RateLimit-Reset` set to 1682413200. This epoch time is equivalent to 25/04/2023 02:00:00 GMT. Using the `X-RateLimit-Reset` header value, the connector will pause the sync for one hour before attempting subsequent requests to the Recurly API. @@ -75,9 +76,10 @@ A response filter should be used when a connector needs to interpret an API resp ### Response conditions The following conditions can be specified on the "Response filter" and are used to determine if attributes of the response match the filter. 
When more than one of condition is specified, the filter will take action if the response satisfies any of the conditions: -* [If error message matches](#if-error-message-matches) -* [and predicate is fulfilled](#and-predicate-is-fulfilled) -* [and HTTP codes match](#and-http-codes-match) + +- [If error message matches](#if-error-message-matches) +- [and predicate is fulfilled](#and-predicate-is-fulfilled) +- [and HTTP codes match](#and-http-codes-match) #### If error message matches @@ -106,6 +108,7 @@ The Pocket API emits API responses for rate limiting errors using a 403 error st ### Then execute action If a response from the API matches the predicates of the response filter the connector will continue the sync according to the "Then execute action" definition. This is a list of the actions that a connector can take: + - SUCCESS: The response was successful and the connector will extract records from the response and emit them to a destination. The connector will continue fetching the next set of records from the API. - RETRY: The response was unsuccessful, but the error is transient and may be successful on subsequent attempts. The request will be retried according to the backoff policy defined on the error handler. - IGNORE: The response was unsuccessful, but the error should be ignored. The connector will not emit any records for the current response. The connector will continue fetching the next set of records from the API. diff --git a/docs/connector-development/connector-builder-ui/overview.md b/docs/connector-development/connector-builder-ui/overview.md index 2acccfb717c..178004105a4 100644 --- a/docs/connector-development/connector-builder-ui/overview.md +++ b/docs/connector-development/connector-builder-ui/overview.md @@ -8,14 +8,14 @@ The connector builder UI is in beta, which means it’s still in active developm Developer updates will be announced via our #help-connector-development Slack channel. If you are using the CDK, please join to stay up to date on changes and issues. ::: - ## When should I use the connector builder? The connector builder is the right tool if the following points are met: -* You want to integrate with a JSON-based HTTP API as a source of records -* The API you want to integrate with doesn't exist yet as a connector in the [connector catalog](/integrations/sources/). -* The API is suitable for the connector builder as per the -[compatibility guide](./connector-builder-compatibility.md). + +- You want to integrate with a JSON-based HTTP API as a source of records +- The API you want to integrate with doesn't exist yet as a connector in the [connector catalog](/integrations/sources/). +- The API is suitable for the connector builder as per the + [compatibility guide](./connector-builder-compatibility.md). ## Getting started @@ -73,17 +73,19 @@ A lot of [Airbyte-managed connectors](https://github.com/airbytehq/airbyte/tree/ These `manifest.yaml` files can easily be imported and explored in the builder. 
To do so, follow these steps: -* Navigate to a `manifest.yaml` file of a connector on Github -* Download the raw file -* Go to the connector builder -* Create a new connector with the button in the top right -* Pick the "Import a YAML manifest" option -* Select the downloaded file -* Change and test the connector + +- Navigate to a `manifest.yaml` file of a connector on Github +- Download the raw file +- Go to the connector builder +- Create a new connector with the button in the top right +- Pick the "Import a YAML manifest" option +- Select the downloaded file +- Change and test the connector The following connectors are good showcases for real-world use cases: -* The [Pendo.io API](https://github.com/airbytehq/airbyte/blob/master/airbyte-integrations/connectors/source-pendo/source_pendo/manifest.yaml) is a simple connector implementing multiple streams and API-key based authentication -* The [News API](https://github.com/airbytehq/airbyte/blob/master/airbyte-integrations/connectors/source-news-api/source_news_api/manifest.yaml) implements pagination and user-configurable request parameters -* The [CoinGecko API](https://github.com/airbytehq/airbyte/blob/master/airbyte-integrations/connectors/source-coingecko-coins/source_coingecko_coins/manifest.yaml) implements incremental syncs + +- The [Pendo.io API](https://github.com/airbytehq/airbyte/blob/master/airbyte-integrations/connectors/source-pendo/source_pendo/manifest.yaml) is a simple connector implementing multiple streams and API-key based authentication +- The [News API](https://github.com/airbytehq/airbyte/blob/master/airbyte-integrations/connectors/source-news-api/source_news_api/manifest.yaml) implements pagination and user-configurable request parameters +- The [CoinGecko API](https://github.com/airbytehq/airbyte/blob/master/airbyte-integrations/connectors/source-coingecko-coins/source_coingecko_coins/manifest.yaml) implements incremental syncs Note: Not all `manifest.yaml` files can be edited and tested in the connector builder because some are using [custom python classes](https://docs.airbyte.com/connector-development/config-based/advanced-topics#custom-components) which isn't supported yet. 
diff --git a/docs/connector-development/connector-builder-ui/pagination.md b/docs/connector-development/connector-builder-ui/pagination.md index f10328be681..45314ede802 100644 --- a/docs/connector-development/connector-builder-ui/pagination.md +++ b/docs/connector-development/connector-builder-ui/pagination.md @@ -271,12 +271,13 @@ The following APIs implement cursor pagination in various ways: ## Custom parameter injection Using the "Inject page size / limit / offset into outgoing HTTP request" option in the pagination form works for most cases, but sometimes the API has special requirements that can't be handled this way: -* The API requires to add a prefix or a suffix to the actual value -* Multiple values need to be put together in a single parameter -* The value needs to be injected into the URL path -* Some conditional logic needs to be applied + +- The API requires to add a prefix or a suffix to the actual value +- Multiple values need to be put together in a single parameter +- The value needs to be injected into the URL path +- Some conditional logic needs to be applied To handle these cases, disable injection in the pagination form and use the generic parameter section at the bottom of the stream configuration form to freely configure query parameters, headers and properties of the JSON body, by using jinja expressions and [available variables](/connector-development/config-based/understanding-the-yaml-file/reference/#/variables). You can also use these variables as part of the URL path. For example the [Prestashop API](https://devdocs.prestashop-project.org/8/webservice/cheat-sheet/#list-options) requires to set offset and limit separated by a comma into a single query parameter (`?limit=,`) -For this case, you can use the `next_page_token` variable to configure a query parameter with key `limit` and value `{{ next_page_token['next_page_token'] or '0' }},50` to inject the offset from the pagination strategy and a hardcoded limit of 50 into the same parameter. \ No newline at end of file +For this case, you can use the `next_page_token` variable to configure a query parameter with key `limit` and value `{{ next_page_token['next_page_token'] or '0' }},50` to inject the offset from the pagination strategy and a hardcoded limit of 50 into the same parameter. diff --git a/docs/connector-development/connector-builder-ui/partitioning.md b/docs/connector-development/connector-builder-ui/partitioning.md index 57dc0d5d930..14060a10025 100644 --- a/docs/connector-development/connector-builder-ui/partitioning.md +++ b/docs/connector-development/connector-builder-ui/partitioning.md @@ -2,15 +2,17 @@ Partitioning is required if the records of a stream are grouped into buckets based on an attribute or parent resources that need to be queried separately to extract the records. -Sometimes records belonging to a single stream are partitioned into subsets that need to be fetched separately. In most cases, these partitions are a parent resource type of the resource type targeted by the connector. The partitioning feature can be used to configure your connector to iterate through all partitions. In API documentation, this concept can show up as mandatory parameters that need to be set on the path, query parameters or request body of the request. +Sometimes records belonging to a single stream are partitioned into subsets that need to be fetched separately. In most cases, these partitions are a parent resource type of the resource type targeted by the connector. 
The partitioning feature can be used to configure your connector to iterate through all partitions. In API documentation, this concept can show up as mandatory parameters that need to be set on the path, query parameters or request body of the request. Common API structures look like this: -* The [SurveySparrow API](https://developers.surveysparrow.com/rest-apis/response#getV3Responses) allows to fetch a list of responses to surveys. For the `/responses` endpoint, the id of the survey to fetch responses for needs to be specified via the query parameter `survey_id`. The API does not allow to fetch responses for all available surveys in a single request, there needs to be a separate request per survey. The surveys represent the partitions of the responses stream. -* The [Woocommerce API](https://woocommerce.github.io/woocommerce-rest-api-docs/#order-notes) includes an endpoint to fetch notes of webshop orders via the `/orders//notes` endpoint. The `` placeholder needs to be set to the id of the order to fetch the notes for. The orders represent the partitions of the notes stream. + +- The [SurveySparrow API](https://developers.surveysparrow.com/rest-apis/response#getV3Responses) allows to fetch a list of responses to surveys. For the `/responses` endpoint, the id of the survey to fetch responses for needs to be specified via the query parameter `survey_id`. The API does not allow to fetch responses for all available surveys in a single request, there needs to be a separate request per survey. The surveys represent the partitions of the responses stream. +- The [Woocommerce API](https://woocommerce.github.io/woocommerce-rest-api-docs/#order-notes) includes an endpoint to fetch notes of webshop orders via the `/orders//notes` endpoint. The `` placeholder needs to be set to the id of the order to fetch the notes for. The orders represent the partitions of the notes stream. There are some cases that require multiple requests to fetch all records as well, but partitioning is not the right tool to configure these in the connector builder: -* If your records are spread out across multiple pages that need to be requested individually if there are too many records, use the Pagination feature. -* If your records are spread out over time and multiple requests are necessary to fetch all data (for example one request per day), use the Incremental sync feature. + +- If your records are spread out across multiple pages that need to be requested individually if there are too many records, use the Pagination feature. +- If your records are spread out over time and multiple requests are necessary to fetch all data (for example one request per day), use the Incremental sync feature. ## Dynamic and static partitioning @@ -23,18 +25,21 @@ The API providing the partitions via one or multiple separate requests is a "dyn ### Parameterized Requests To configure static partitioning, enable the `Parameterized Requests` component. The following fields have to be configured: -* The "Parameter Values" can either be set to a list of strings, making the partitions part of the connector itself, or delegated to a user input so the end user configuring a Source based on the connector can control which partitions to fetch. 
When using "user input" mode for the parameter values, create a user input of type array and reference it as the value using the [placeholder](/connector-development/config-based/understanding-the-yaml-file/reference#variables) value using `{{ config[''] }}` -* The "Current Parameter Value Identifier" can be freely choosen and is the identifier of the variable holding the current parameter value. It can for example be used in the path of the stream using the `{{ stream_partition. }}` syntax. -* The "Inject Parameter Value into outgoing HTTP Request" option allows you to configure how to add the current parameter value to the requests + +- The "Parameter Values" can either be set to a list of strings, making the partitions part of the connector itself, or delegated to a user input so the end user configuring a Source based on the connector can control which partitions to fetch. When using "user input" mode for the parameter values, create a user input of type array and reference it as the value using the [placeholder](/connector-development/config-based/understanding-the-yaml-file/reference#variables) value using `{{ config[''] }}` +- The "Current Parameter Value Identifier" can be freely choosen and is the identifier of the variable holding the current parameter value. It can for example be used in the path of the stream using the `{{ stream_partition. }}` syntax. +- The "Inject Parameter Value into outgoing HTTP Request" option allows you to configure how to add the current parameter value to the requests #### Example To enable static partitioning defined as part of the connector for the [SurveySparrow API](https://developers.surveysparrow.com/rest-apis/response#getV3Responses) responses, the Parameterized Requests component needs to be configured as following: -* "Parameter Values" are set to the list of survey ids to fetch -* "Current Parameter Value Identifier" is set to `survey` (this is not used for this example) -* "Inject Parameter Value into outgoing HTTP Request" is set to `request_parameter` for the field name `survey_id` + +- "Parameter Values" are set to the list of survey ids to fetch +- "Current Parameter Value Identifier" is set to `survey` (this is not used for this example) +- "Inject Parameter Value into outgoing HTTP Request" is set to `request_parameter` for the field name `survey_id` When parameter values were set to `123`, `456` and `789`, the following requests will be executed: + ``` curl -X GET https://api.surveysparrow.com/v3/responses?survey_id=123 curl -X GET https://api.surveysparrow.com/v3/responses?survey_id=456 @@ -42,16 +47,18 @@ curl -X GET https://api.surveysparrow.com/v3/responses?survey_id=789 ``` To enable user-configurable static partitions for the [Woocommerce API](https://woocommerce.github.io/woocommerce-rest-api-docs/#order-notes) order notes, the configuration would look like this: -* Set "Parameter Values" to "User input" -* In the "Value" input, click the user icon and create a new user input -* Name it `Order IDs`, set type to `array` and click create -* Set "Current Parameter Value Identifier" to `order` -* "Inject Parameter Value into outgoing HTTP Request" is disabled, because the order id needs to be injected into the path -* In the general section of the stream configuration, the "URL Path" is set to `/orders/{{ stream_partition.order }}/notes` + +- Set "Parameter Values" to "User input" +- In the "Value" input, click the user icon and create a new user input +- Name it `Order IDs`, set type to `array` and click create +- Set "Current 
Parameter Value Identifier" to `order` +- "Inject Parameter Value into outgoing HTTP Request" is disabled, because the order id needs to be injected into the path +- In the general section of the stream configuration, the "URL Path" is set to `/orders/{{ stream_partition.order }}/notes` When Order IDs were set to `123`, `456` and `789` in the testing values, the following requests will be executed: + ``` curl -X GET https://example.com/wp-json/wc/v3/orders/123/notes curl -X GET https://example.com/wp-json/wc/v3/orders/456/notes @@ -63,20 +70,23 @@ curl -X GET https://example.com/wp-json/wc/v3/orders/789/notes To fetch the list of partitions (in this example surveys or orders) from the API itself, the "Parent Stream" component has to be used. It allows you to select another stream of the same connector to serve as the source for partitions to fetch. Each record of the parent stream is used as a partition for the current stream. The following fields have to be configured to use the Parent Stream component: -* The "Parent Stream" defines the records of which stream should be used as partitions -* The "Parent Key" is the property on the parent stream record that should become the partition value (in most cases this is some form of id) -* The "Current Parent Key Value Identifier" can be freely choosen and is the identifier of the variable holding the current partition value. It can for example be used in the path of the stream using the `{{ stream_partition. }}` [interpolation placeholder](/connector-development/config-based/understanding-the-yaml-file/reference#variables). + +- The "Parent Stream" defines the records of which stream should be used as partitions +- The "Parent Key" is the property on the parent stream record that should become the partition value (in most cases this is some form of id) +- The "Current Parent Key Value Identifier" can be freely choosen and is the identifier of the variable holding the current partition value. It can for example be used in the path of the stream using the `{{ stream_partition. }}` [interpolation placeholder](/connector-development/config-based/understanding-the-yaml-file/reference#variables). #### Example To enable dynamic partitioning for the [Woocommerce API](https://woocommerce.github.io/woocommerce-rest-api-docs/#order-notes) order notes, first an orders stream needs to be configured for the `/orders` endpoint to fetch a list of orders. Once this is done, the Parent Stream component for the responses stream has be configured like this: -* "Parent Key" is set to `id` -* "Current Parent Key Value Identifier" is set to `order` -* In the general section of the stream configuration, the "URL Path" is set to `/orders/{{ stream_partition.order }}/notes` + +- "Parent Key" is set to `id` +- "Current Parent Key Value Identifier" is set to `order` +- In the general section of the stream configuration, the "URL Path" is set to `/orders/{{ stream_partition.order }}/notes` When triggering a sync, the connector will first fetch all records of the orders stream. The records will look like this: + ``` { "id": 123, "currency": "EUR", "shipping_total": "12.23", ... } { "id": 456, "currency": "EUR", "shipping_total": "45.56", ... } @@ -84,6 +94,7 @@ When triggering a sync, the connector will first fetch all records of the orders ``` To turn a record into a partition value, the "parent key" is extracted, resulting in the partition values `123`, `456` and `789`. 
In turn, this results in the following requests to fetch the records of the notes stream: + ``` curl -X GET https://example.com/wp-json/wc/v3/orders/123/notes curl -X GET https://example.com/wp-json/wc/v3/orders/456/notes @@ -97,6 +108,7 @@ It is possible to configure multiple partitioning mechanisms on a single stream For example, the [Google Pagespeed API](https://developers.google.com/speed/docs/insights/v5/reference/pagespeedapi/runpagespeed) allows to specify the URL and the "strategy" to run an analysis for. To allow a user to trigger an analysis for multiple URLs and strategies at the same time, two Parameterized Request lists can be used (one injecting the parameter value into the `url` parameter, one injecting it into the `strategy` parameter). If a user configures the URLs `example.com` and `example.org` and the strategies `desktop` and `mobile`, then the following requests will be triggered + ``` curl -X GET https://www.googleapis.com/pagespeedonline/v5/runPagespeed?url=example.com&strategy=desktop curl -X GET https://www.googleapis.com/pagespeedonline/v5/runPagespeed?url=example.com&strategy=mobile @@ -109,15 +121,18 @@ curl -X GET https://www.googleapis.com/pagespeedonline/v5/runPagespeed?url=examp Sometimes it's helpful to attach the partition a record belongs to to the record itself so it can be used during analysis in the destination. This can be done using a transformation to add a field and the `{{ stream_partition. }}` interpolation placeholder. For example when fetching the order notes via the [Woocommerce API](https://woocommerce.github.io/woocommerce-rest-api-docs/#order-notes), the order id itself is not included in the note record, which means it won't be possible to associate which note belongs to which order: + ``` { "id": 999, "author": "Jon Doe", "note": "Great product!" 
} ``` However the order id can be added by taking the following steps: -* Making sure the "Current Parameter Value Identifier" is set to `order` -* Add an "Add field" transformation with "Path" `order_id` and "Value" `{{ stream_partition.order }}` + +- Making sure the "Current Parameter Value Identifier" is set to `order` +- Add an "Add field" transformation with "Path" `order_id` and "Value" `{{ stream_partition.order }}` Using this configuration, the notes record looks like this: + ``` { "id": 999, "author": "Jon Doe", "note": "Great product!", "order_id": 123 } ``` @@ -125,9 +140,10 @@ Using this configuration, the notes record looks like this: ## Custom parameter injection Using the "Inject Parameter / Parent Key Value into outgoing HTTP Request" option in the Parameterized Requests and Parent Stream components works for most cases, but sometimes the API has special requirements that can't be handled this way: -* The API requires to add a prefix or a suffix to the actual value -* Multiple values need to be put together in a single parameter -* The value needs to be injected into the URL path -* Some conditional logic needs to be applied + +- The API requires to add a prefix or a suffix to the actual value +- Multiple values need to be put together in a single parameter +- The value needs to be injected into the URL path +- Some conditional logic needs to be applied To handle these cases, disable injection in the component and use the generic parameter section at the bottom of the stream configuration form to freely configure query parameters, headers and properties of the JSON body, by using jinja expressions and [available variables](/connector-development/config-based/understanding-the-yaml-file/reference/#/variables). You can also use these variables (like `stream_partition`) as part of the URL path as shown in the Woocommerce example above. diff --git a/docs/connector-development/connector-metadata-file.md b/docs/connector-development/connector-metadata-file.md index bdee0bd9fba..38518cecfc8 100644 --- a/docs/connector-development/connector-metadata-file.md +++ b/docs/connector-development/connector-metadata-file.md @@ -49,12 +49,12 @@ This section contains two subsections: `cloud` and `oss` (Open Source Software). Here's how the `registries` section is structured in our previous `metadata.yaml` example: ```yaml - registries: - cloud: - dockerRepository: airbyte/source-postgres-strict-encrypt - enabled: true - oss: - enabled: true +registries: + cloud: + dockerRepository: airbyte/source-postgres-strict-encrypt + enabled: true + oss: + enabled: true ``` In this example, both `cloud` and `oss` registries are enabled, and the Docker repository for the `cloud` registry is overrode to `airbyte/source-postgres-strict-encrypt`. @@ -79,6 +79,7 @@ tags: - "keyword:database" - "keyword:SQL" ``` + In the example above, the connector has three tags. Tags are used for two primary purposes in Airbyte: 1. **Denoting the Programming Language(s)**: Tags that begin with language: are used to specify the programming languages that are utilized by the connector. This information is auto-generated by a script that scans the connector's files for recognized programming languages. In the example above, language:java means that the connector uses Java. @@ -88,6 +89,7 @@ In the example above, the connector has three tags. Tags are used for two primar These are just examples of how tags can be used. As a free-form field, the tags list can be customized as required for each connector. 
This flexibility allows tags to be a powerful tool for managing and discovering connectors. ## The `icon` Field + _⚠️ This property is in the process of being refactored to be a file in the connector folder_ You may notice an `icon.svg` file in the connectors folder. @@ -97,24 +99,28 @@ This is because we are transitioning away from icons being stored in the `airbyt This transition is currently in progress. Once it is complete, the `icon` field in the `metadata.yaml` file will be removed, and the `icon.svg` file will be used instead. ## The `releases` Section + The `releases` section contains extra information about certain types of releases. The current types of releases are: -* `breakingChanges` + +- `breakingChanges` ### `breakingChanges` The `breakingChanges` section of `releases` contains a dictionary of version numbers (usually major versions, i.e. `1.0.0`) and information about their associated breaking changes. Each entry must contain the following parameters: -* `message`: A description of the breaking change, written in a user-friendly format. This message should briefly describe - * What the breaking change is, and which users it effects (e.g. all users of the source, or only those using a certain stream) - * Why the change is better for the user (fixed a bug, something got faster, etc) - * What the user should do to fix the issue (e.g. a full reset, run a SQL query in the destinaton, etc) -* `upgradeDeadline`: (`YYYY-MM-DD`) The date by which the user should upgrade to the new version. -When considering what the `upgradeDeadline` should be, target the amount of time which would be reasonable for the user to make the required changes described in the `message` and upgrade giude. If the required changes are _simple_ (e.g. "do a full reset"), 2 weeks is recommended. Note that you do *not* want to link the duration of `upgradeDeadline` to an upstream API's deprecation date. While it is true that the older version of a connector will continue to work for that period of time, it means that users who are pinned to the older version of the connector will not benefit from future updates and fixes. +- `message`: A description of the breaking change, written in a user-friendly format. This message should briefly describe - What the breaking change is, and which users it affects (e.g. all users of the source, or only those using a certain stream) - Why the change is better for the user (fixed a bug, something got faster, etc) - What the user should do to fix the issue (e.g. a full reset, run a SQL query in the destination, etc) +- `upgradeDeadline`: (`YYYY-MM-DD`) The date by which the user should upgrade to the new version. -Without all 3 of these points, the breaking change message is not helpful to users. +When considering what the `upgradeDeadline` should be, target the amount of time which would be reasonable for the user to make the required changes described in the `message` and upgrade guide. If the required changes are _simple_ (e.g. "do a full reset"), 2 weeks is recommended. Note that you do _not_ want to link the duration of `upgradeDeadline` to an upstream API's deprecation date. While it is true that the older version of a connector will continue to work for that period of time, it means that users who are pinned to the older version of the connector will not benefit from future updates and fixes. + +Without all 3 of these points, the breaking change message is not helpful to users.
Here is an example: + ```yaml releases: breakingChanges: @@ -124,6 +130,7 @@ releases: ``` #### `scopedImpact` + The optional `scopedImpact` property allows you to provide a list of scopes for which the change is breaking. This allows you to reduce the scope of the change; it's assumed that any scopes not listed are unaffected by the breaking change. @@ -145,11 +152,12 @@ if they are not syncing the `users` stream. The supported scope types are listed below. -| Scope Type | Value Type | Value Description | -|------------|------------|------------------| -| stream | `list[str]` | List of stream names | +| Scope Type | Value Type | Value Description | +| ---------- | ----------- | -------------------- | +| stream | `list[str]` | List of stream names | #### `remoteRegistries` + The optional `remoteRegistries` property allows you to configure how a connector should be published to registries like PyPI. **Important note**: Currently no automated publishing will occur. diff --git a/docs/connector-development/connector-specification-reference.md b/docs/connector-development/connector-specification-reference.md index 76d95e76b76..4a69f903fcc 100644 --- a/docs/connector-development/connector-specification-reference.md +++ b/docs/connector-development/connector-specification-reference.md @@ -5,6 +5,7 @@ The [connector specification](../understanding-airbyte/airbyte-protocol.md#spec) ## Demoing your specification While iterating on your specification, you can preview what it will look like in the UI in realtime by following the instructions below. + 1. Open the `ConnectorForm` preview component in our deployed Storybook at: https://components.airbyte.dev/?path=/story/connector-connectorform--preview 2. Press `raw` on the `connectionSpecification` property, so you will be able to paste a JSON structured string 3. Set the string you want to preview the UI for @@ -61,7 +62,7 @@ Additionally, `order` values cannot be duplicated within the same object or grou By default, all optional fields will be collapsed into an `Optional fields` section which can be expanded or collapsed by the user. This helps streamline the UI for setting up a connector by initially focusing attention on the required fields only. For existing connectors, if their configuration contains a non-empty and non-default value for a collapsed optional field, then that section will be automatically opened when the connector is opened in the UI. -These `Optional fields` sections are placed at the bottom of a field group, meaning that all required fields in the same group will be placed above it. To interleave optional fields with required fields, set `always_show: true` on the optional field along with an `order`, which will cause the field to no longer be collapsed in an `Optional fields` section and be ordered as normal. +These `Optional fields` sections are placed at the bottom of a field group, meaning that all required fields in the same group will be placed above it. To interleave optional fields with required fields, set `always_show: true` on the optional field along with an `order`, which will cause the field to no longer be collapsed in an `Optional fields` section and be ordered as normal. **Note:** `always_show` also causes fields that are normally hidden by an OAuth button to still be shown.
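As a quick illustration of how `order` and `always_show` interact with the collapsed `Optional fields` section, here is a small fragment of a connection specification. It is rendered as YAML for brevity and the property names (`api_key`, `start_date`, `lookback_window_days`) are made up; the same keys apply when the spec is written as JSON.

```yaml
# Illustrative spec fragment; property names are examples, not a real connector's spec.
connectionSpecification:
  type: object
  required:
    - api_key
  properties:
    api_key:
      type: string
      title: API Key
      airbyte_secret: true # masked in the UI and treated as a secret
      order: 0
    start_date:
      type: string
      title: Start Date
      order: 1
      always_show: true # optional, but interleaved with required fields instead of collapsed
    lookback_window_days:
      type: integer
      title: Lookback Window (days)
      order: 2 # optional and not always_show, so it lands in the collapsed "Optional fields" section
```

With this layout, `api_key` and `start_date` appear in order at the top of the form, while `lookback_window_days` stays tucked away under `Optional fields` until the user expands that section.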
@@ -332,7 +333,9 @@ In each item in the `oneOf` array, the `option_title` string field exists with t ``` #### oneOf display type + You can also configure the way that oneOf fields are displayed in the Airbyte UI through the `display_type` property. Valid values for this property are: + - `dropdown` - Renders a dropdown menu containing the title of each option for the user to select - This is a compact look that works well in most cases @@ -342,6 +345,7 @@ You can also configure the way that oneOf fields are displayed in the Airbyte UI - This choice draws more attention to the field and shows the descriptions of each option at all times, which can be useful for important or complicated fields Here is an example of setting the `display_type` of a oneOf field to `dropdown`, along with how it looks in the Airbyte UI: + ``` "update_method": { "type": "object", @@ -381,6 +385,7 @@ Here is an example of setting the `display_type` of a oneOf field to `dropdown`, ] } ``` + ![dropdown oneOf](../assets/docs/oneOf-dropdown.png) And here is how it looks if the `display_type` property is set to `radio` instead: diff --git a/docs/connector-development/debugging-docker.md b/docs/connector-development/debugging-docker.md index 3f707fc0d3d..0710842a354 100644 --- a/docs/connector-development/debugging-docker.md +++ b/docs/connector-development/debugging-docker.md @@ -1,14 +1,17 @@ # Debugging Docker Containers + This guide will cover debugging **JVM docker containers** either started via Docker Compose or started by the worker container, such as a Destination container. This guide will assume use of [IntelliJ Community edition](https://www.jetbrains.com/idea/), however the steps could be applied to another IDE or debugger. ## Prerequisites + You should have the airbyte repo downloaded and should be able to [run the platform locally](https://docs.airbyte.com/deploying-airbyte/local-deployment). Also, if you're on macOS you will need to follow the installation steps for [Docker Mac Connect](https://github.com/chipmk/docker-mac-net-connect). ## Connecting your debugger -This solution utilizes the environment variable `JAVA_TOOL_OPTIONS` which when set to a specific value allows us to connect our debugger. + +This solution utilizes the environment variable `JAVA_TOOL_OPTIONS` which when set to a specific value allows us to connect our debugger. We will also be setting up a **Remote JVM Debug** run configuration in IntelliJ which uses the IP address or hostname to connect. > **Note** @@ -16,18 +19,21 @@ We will also be setting up a **Remote JVM Debug** run configuration in IntelliJ > by IP address. ### Docker Compose Extension -By default, the `docker compose` command will look for a `docker-compose.yaml` file in your directory and execute its instructions. However, you can + +By default, the `docker compose` command will look for a `docker-compose.yaml` file in your directory and execute its instructions. However, you can provide multiple files to the `docker compose` command with the `-f` option. You can read more about how Docker compose combines or overrides values when you provide multiple files [on Docker's Website](https://docs.docker.com/compose/extends/). In the Airbyte repo, there is already another file `docker-compose.debug.yaml` which extends the `docker-compose.yaml` file. Our goal is to set the `JAVA_TOOL_OPTIONS` environment variable in the environment of the container we wish to debug. 
If you look at the `server` configuration under `services` in the `docker-compose.debug.yaml` file, it should look like this: + ```yaml - server: - environment: - - JAVA_TOOL_OPTIONS=${DEBUG_SERVER_JAVA_OPTIONS} +server: + environment: + - JAVA_TOOL_OPTIONS=${DEBUG_SERVER_JAVA_OPTIONS} ``` + What this is saying is: For the Service `server` add an environment variable `JAVA_TOOL_OPTIONS` with the value of the variable `DEBUG_SERVER_JAVA_OPTIONS`. `DEBUG_SERVER_JAVA_OPTIONS` has no default value, so if we don't provide one, `JAVA_TOOL_OPTIONS` will be blank or empty. When running the `docker compose` command, Docker will look to your local environment variables, to see if you have set a value for `DEBUG_SERVER_JAVA_OPTIONS` and copy that value. To set this value @@ -42,8 +48,9 @@ DEBUG_SERVER_JAVA_OPTIONS="-agentlib:jdwp=transport=dt_socket,server=y,suspend=y > This command also passes in the `VERSION=dev` environment variable, which is recommended from the comments in the `docker-compose.debug.yaml` ### Connecting the Debugger + Now we need to connect our debugger. In IntelliJ, open `Edit Configurations...` from the run menu (Or search for `Edit Configurations` in the command palette). -Create a new *Remote JVM Debug* Run configuration. The `host` option defaults to `localhost` which if you're on Linux you can leave this unchanged. +Create a new _Remote JVM Debug_ Run configuration. The `host` option defaults to `localhost` which if you're on Linux you can leave this unchanged. On a Mac however, you need to find the IP address of your container. **Make sure you've installed and started the [Docker Mac Connect](https://github.com/chipmk/docker-mac-net-connect) service prior to running the `docker compose` command**. With your containers running, run the following command to easily fetch the IP addresses: @@ -60,51 +67,58 @@ $ docker inspect $(docker ps -q ) --format='{{ printf "%-50s" .Name}} {{printf " /airbyte-db airbyte/db:dev 172.18.0.4172.19.0.3 /airbyte-temporal-ui temporalio/web:1.13.0 172.18.0.3172.19.0.2 ``` + You should see an entry for `/airbyte-server` which is the container we've been targeting so copy its IP address (`172.18.0.9` in the example output above) and replace `localhost` in your IntelliJ Run configuration with the IP address. -Save your Remote JVM Debug run configuration and run it with the debug option. You should now be able to place breakpoints in any code that is being executed by the +Save your Remote JVM Debug run configuration and run it with the debug option. You should now be able to place breakpoints in any code that is being executed by the `server` container. If you need to debug another container from the original `docker-compose.yaml` file, you could modify the `docker-compose.debug.yaml` file with a similar option. ### Debugging Containers Launched by the Worker container + The Airbyte platform launches some containers as needed at runtime, which are not defined in the `docker-compose.yaml` file. These containers are the source or destination tasks, among other things. But if we can't pass environment variables to them through the `docker-compose.debug.yaml` file, then how can we set the -`JAVA_TOOL_OPTIONS` environment variable? Well, the answer is that we can *pass it through* the container which launches the other containers - the `worker` container. +`JAVA_TOOL_OPTIONS` environment variable? Well, the answer is that we can _pass it through_ the container which launches the other containers - the `worker` container. 
For this example, lets say that we want to debug something that happens in the `destination-postgres` connector container. To follow along with this example, you will need to have set up a connection which uses postgres as a destination, however if you want to use a different connector like `source-postgres`, `destination-bigquery`, etc. that's fine. In the `docker-compose.debug.yaml` file you should see an entry for the `worker` service which looks like this + ```yaml - worker: - environment: - - DEBUG_CONTAINER_IMAGE=${DEBUG_CONTAINER_IMAGE} - - DEBUG_CONTAINER_JAVA_OPTS=-agentlib:jdwp=transport=dt_socket,server=y,suspend=y,address=*:5005 +worker: + environment: + - DEBUG_CONTAINER_IMAGE=${DEBUG_CONTAINER_IMAGE} + - DEBUG_CONTAINER_JAVA_OPTS=-agentlib:jdwp=transport=dt_socket,server=y,suspend=y,address=*:5005 ``` -Similar to the previous debugging example, we want to pass an environment variable to the `docker compose` command. This time we're setting the + +Similar to the previous debugging example, we want to pass an environment variable to the `docker compose` command. This time we're setting the `DEBUG_CONTAINER_IMAGE` environment variable to the name of the container we're targeting. For our example that is `destination-postgres` so run the command: + ```bash DEBUG_CONTAINER_IMAGE="destination-postgres:5005" VERSION="dev" docker compose -f docker-compose.yaml -f docker-compose.debug.yaml up ``` + The `worker` container now has an environment variable `DEBUG_CONTAINER_IMAGE` with a value of `destination-postgres` which when it compares when it is spawning containers. If the container name matches the environment variable, it will set the `JAVA_TOOL_OPTIONS` environment variable in the container to the value of its `DEBUG_CONTAINER_JAVA_OPTS` environment variable, which is the same value we used in the `server` example. #### Connecting the Debugger to a Worker Spawned Container -To connect your debugger, **the container must be running**. This `destination-postgres` container will only run when we're running one of its tasks, + +To connect your debugger, **the container must be running**. This `destination-postgres` container will only run when we're running one of its tasks, such as when a replication is running. Navigate to a connection in your local Airbyte instance at http://localhost:8000 which uses postgres as a destination. If you ran through the [Postgres to Postgres replication tutorial](https://airbyte.com/tutorials/postgres-replication), you can use this connection. -On the connection page, trigger a manual sync with the "Sync now" button. Because we set the `suspend` option to `y` in our `JAVA_TOOL_OPTIONS` the +On the connection page, trigger a manual sync with the "Sync now" button. Because we set the `suspend` option to `y` in our `JAVA_TOOL_OPTIONS` the container will pause all execution until the debugger is connected. This can be very useful for methods which run very quickly, such as the Check method. -However, this could be very detrimental if it were pushed into a production environment. For now, it gives us time to set a new Remote JVM Debug Configuraiton. +However, this could be very detrimental if it were pushed into a production environment. For now, it gives us time to set a new Remote JVM Debug Configuraiton. -This container will have a different IP than the `server` Remote JVM Debug Run configuration we set up earlier. 
So lets set up a new one with the IP of +This container will have a different IP than the `server` Remote JVM Debug Run configuration we set up earlier. So lets set up a new one with the IP of the `destination-postgres` container: ```bash $ docker inspect $(docker ps -q ) --format='{{ printf "%-50s" .Name}} {{printf "%-50s" .Config.Image}} {{range .NetworkSettings.Networks}}{{.IPAddress}}{{end}}' -/destination-postgres-write-52-0-grbsw airbyte/destination-postgres:0.3.26 +/destination-postgres-write-52-0-grbsw airbyte/destination-postgres:0.3.26 /airbyte-proxy airbyte/proxy:dev 172.18.0.10172.19.0.4 /airbyte-worker airbyte/worker:dev 172.18.0.8 /airbyte-server airbyte/server:dev 172.18.0.9 @@ -125,25 +139,29 @@ You can now add breakpoints and debug any code which would be executed in the `d Happy Debugging! #### Connecting the Debugger to an Integration Test Spawned Container -You can also debug code contained in containers spawned in an integration test! This can be used to debug integration tests as well as testing code changes. -The steps involved are: + +You can also debug code contained in containers spawned in an integration test! This can be used to debug integration tests as well as testing code changes. +The steps involved are: + 1. Follow all the steps outlined above to set up the **Remote JVM Debug** run configuration. 2. Edit the run configurations associated with the given integration test with the following environment variables:`DEBUG_CONTAINER_IMAGE=source-postgres;DEBUG_CONTAINER_JAVA_OPTS=-agentlib:jdwp=transport=dt_socket,server=y,suspend=y,address=*:5005` -Note that you will have to keep repeating this step for every new integration test run configuration you create. -3. Run the integration test in debug mode. In the debug tab, open up the Remote JVM Debugger run configuration you just created. + Note that you will have to keep repeating this step for every new integration test run configuration you create. +3. Run the integration test in debug mode. In the debug tab, open up the Remote JVM Debugger run configuration you just created. 4. Keep trying to attach the Remote JVM Debugger. It will likely fail a couple of times and eventually connect to the test container. If you want a more -deterministic way to connect the debugger, you can set a break point in the `DockerProcessFactor.localDebuggingOptions()` method. Resume running the integration test run and -then attempt to attach the Remote JVM Debugger (you still might need a couple of tries). - + deterministic way to connect the debugger, you can set a break point in the `DockerProcessFactor.localDebuggingOptions()` method. Resume running the integration test run and + then attempt to attach the Remote JVM Debugger (you still might need a couple of tries). ## Gotchas + So now that your debugger is set up, what else is there to know? ### Code changes + When you're debugging, you might want to make a code change. Anytime you make a code change, your code will become out of sync with the container which is run by the platform. Essentially this means that after you've made a change you will need to rebuild the docker container you're debugging. Additionally, for the connector containers, you may have to navigate to "Settings" in your local Airbyte Platform's web UI and change the version of the container to `dev`. See you connector's `README` for details on how to rebuild the container image. ### Ports -In this tutorial we've been using port `5005` for all debugging. It's the default, so we haven't changed it. 
If you need to debug *multiple* containers however, they will clash on this port. + +In this tutorial we've been using port `5005` for all debugging. It's the default, so we haven't changed it. If you need to debug _multiple_ containers however, they will clash on this port. If you need to do this, you will have to modify your setup to use another port that is not in use. diff --git a/docs/connector-development/migration-to-base-image.md b/docs/connector-development/migration-to-base-image.md index d6bc3bac2d8..03c6f6c9f50 100644 --- a/docs/connector-development/migration-to-base-image.md +++ b/docs/connector-development/migration-to-base-image.md @@ -6,19 +6,21 @@ This guide will help connector developers to migrate their connector to use our N.B: This guide currently only applies to python connectors. ## Prerequisite + [Install the airbyte-ci tool](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md#L1) - ## Definition of a successful migration + 1. The connector `Dockerfile` is removed from the connector folder 2. The connector `metadata.yaml` is referencing the latest base image in the `data.connectorBuildOptions.baseImage` key 3. The connector version is bumped by a patch increment -4. A changelog entry is added to the connector documentation file +4. A changelog entry is added to the connector documentation file 5. The connector is successfully built and tested by our CI 6. If you add `build_customization.py` to your connector, the Connector Operations team has reviewed and approved your changes. ## Semi automated migration -- Run `airbyte-ci connectors --name= migrate_to_base_image ` + +- Run `airbyte-ci connectors --name= migrate_to_base_image ` - Commit and push the changes on your PR ## Manual migration @@ -28,17 +30,19 @@ In order for a connector to use our base image it has to declare it in its `meta Example: ```yaml - connectorBuildOptions: - baseImage: docker.io/airbyte/python-connector-base:1.1.0@sha256:bd98f6505c6764b1b5f99d3aedc23dfc9e9af631a62533f60eb32b1d3dbab20c +connectorBuildOptions: + baseImage: docker.io/airbyte/python-connector-base:1.1.0@sha256:bd98f6505c6764b1b5f99d3aedc23dfc9e9af631a62533f60eb32b1d3dbab20c ``` ### Why are we using long addresses instead of tags? + **For build reproducibility!**. -Using full image address allows us to have a more deterministic build process. +Using full image address allows us to have a more deterministic build process. If we used tags our connector could get built with a different base image if the tag was overwritten. In other word, using the image digest (sha256), we have the guarantee that a build, on the same commit, will always use the same base image. ### What if my connector needs specific system dependencies? + Declaring the base image in the metadata.yaml file makes the Dockerfile obselete and the connector will be built using our internal build process declared [here](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/pipelines/airbyte_ci/connectors/build_image/steps/python_connectors.py#L55). If your connector has specific system dependencies, or has to set environment variables, we have a pre/post build hook framework for that. @@ -47,6 +51,7 @@ This module should contain a `pre_connector_install` and `post_connector_install It will be imported at runtime by our build process and the functions will be called if they exist. 
Here is an example of a `build_customization.py` module: + ```python from __future__ import annotations @@ -67,12 +72,14 @@ async def post_connector_install(connector_container: Container) -> Container: ### Listing migrated / non migrated connectors: -To list all migrated certified connectors you can ran: +To list all migrated certified connectors you can run: + ```bash airbyte-ci connectors --support-level=certified --metadata-query="data.connectorBuildOptions.baseImage is not None" list ``` -To list all non migrated certified connectors you can ran: +To list all non-migrated certified connectors you can run: + ```bash airbyte-ci connectors --metadata-query="data.supportLevel == 'certified' and 'connectorBuildOptions' not in data.keys()" list ``` diff --git a/docs/connector-development/schema-reference.md b/docs/connector-development/schema-reference.md index c7650cca2e2..e243d76d6fa 100644 --- a/docs/connector-development/schema-reference.md +++ b/docs/connector-development/schema-reference.md @@ -1,10 +1,10 @@ # Schema Reference -This document provides instructions on how to create a static schema for your Airbyte stream, which is necessary for integrating data from various sources. +This document provides instructions on how to create a static schema for your Airbyte stream, which is necessary for integrating data from various sources. You can check out all the supported data types and examples at [this link](../understanding-airbyte/supported-data-types.md). - For instance, the example record response for the schema is shown below: + ```json { "id": "hashidstring", @@ -50,23 +50,21 @@ The schema is then translated into the following JSON format. Please note that i "type": ["null", "object"], "additionalProperties": true, "properties": { - "steps": { - "type": ["null", "string"] + "steps": { + "type": ["null", "string"] }, "count_steps": { - "type": ["null", "integer"] + "type": ["null", "integer"] } } }, "example_string_array": { - "items": { - "type": ["null", "string"] - } + "items": { + "type": ["null", "string"] + } } } } ``` We hope this guide helps you create a successful static schema for your Airbyte stream. Please don't hesitate to reach out if you have any further questions or concerns. - - diff --git a/docs/connector-development/testing-connectors/README.md b/docs/connector-development/testing-connectors/README.md index 06377781fd2..446905a1d6f 100644 --- a/docs/connector-development/testing-connectors/README.md +++ b/docs/connector-development/testing-connectors/README.md @@ -3,18 +3,23 @@ Multiple tests suites compose the Airbyte connector testing pyramid ## Common to all connectors -* [Connectors QA checks](https://docs.airbyte.com/contributing-to-airbyte/resources/qa-checks) -* [Connector Acceptance tests](https://docs.airbyte.com/connector-development/testing-connectors/connector-acceptance-tests-reference/) + +- [Connectors QA checks](https://docs.airbyte.com/contributing-to-airbyte/resources/qa-checks) +- [Connector Acceptance tests](https://docs.airbyte.com/connector-development/testing-connectors/connector-acceptance-tests-reference/) ## Connector specific tests + ### 🐍 Python connectors + We use `pytest` to run unit and integration tests: + ```bash # From connector directory poetry run pytest ``` ### ☕ Java connectors + We run Java connector tests with gradle. ```bash @@ -27,6 +32,7 @@ We run Java connector tests with gradle.
Please note that according to the test implementation you might have to provide connector configurations as a `config.json` file in a `.secrets` folder in the connector code directory. ## 🤖 CI + If you want to run the global test suite, exactly like what is run in CI, you should install [`airbyte-ci` CLI](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md) and use the following command: ```bash @@ -36,6 +42,5 @@ airbyte-ci connectors --name= test This will run all the tests for the connector, including the QA checks and the Connector Acceptance tests. Connector Acceptance tests require connector configuration to be provided as a `config.json` file in a `.secrets` folder in the connector code directory. - Our CI infrastructure runs the connector tests with [`airbyte-ci` CLI](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md). Connectors tests are automatically and remotely triggered on your branch according to the changes made in your branch. **Passing tests are required to merge a connector pull request.** diff --git a/docs/connector-development/testing-connectors/connector-acceptance-tests-reference.md b/docs/connector-development/testing-connectors/connector-acceptance-tests-reference.md index 375146b7d83..998126731c5 100644 --- a/docs/connector-development/testing-connectors/connector-acceptance-tests-reference.md +++ b/docs/connector-development/testing-connectors/connector-acceptance-tests-reference.md @@ -145,7 +145,7 @@ These backward compatibility tests can be bypassed by changing the value of the One more test validates the specification against containing exposed secrets. This means fields that potentially could hold a secret value should be explicitly marked with `"airbyte_secret": true`. If an input field like `api_key` / `password` / `client_secret` / etc. is exposed, the test will fail. | Input | Type | Default | Note | -|:-----------------------------------------------------------------|:--------|:--------------------|:----------------------------------------------------------------------------------------------------------------------| +| :--------------------------------------------------------------- | :------ | :------------------ | :-------------------------------------------------------------------------------------------------------------------- | | `spec_path` | string | `secrets/spec.json` | Path to a YAML or JSON file representing the spec expected to be output by this connector | | `backward_compatibility_tests_config.previous_connector_version` | string | `latest` | Previous connector version to use for backward compatibility tests (expects a version following semantic versioning). | | `backward_compatibility_tests_config.disable_for_version` | string | None | Disable the backward compatibility test for a specific version (expects a version following semantic versioning). | @@ -183,30 +183,30 @@ These backward compatibility tests can be bypassed by changing the value of the Configuring all streams in the input catalog to full refresh mode verifies that a read operation produces some RECORD messages. Each stream should have some data, if you can't guarantee this for particular streams - add them to the `empty_streams` list. Set `validate_data_points=True` if possible. This validation is going to be enabled by default and won't be configurable in future releases. 
-| Input | Type | Default | Note | -|:------------------------------------------------|:-----------------|:--------------------------------------------|:--------------------------------------------------------------------------------------------------------------| -| `config_path` | string | `secrets/config.json` | Path to a JSON object representing a valid connector configuration | -| `configured_catalog_path` | string | `integration_tests/configured_catalog.json` | Path to configured catalog | -| `empty_streams` | array of objects | \[\] | List of streams that might be empty with a `bypass_reason` | -| `empty_streams[0].name` | string | | Name of the empty stream | -| `empty_streams[0].bypass_reason` | string | None | Reason why this stream is empty | -| `ignored_fields[stream][0].name` | string | | Name of the ignored field | -| `ignored_fields[stream][0].bypass_reason` | string | None | Reason why this field is ignored | -| `validate_schema` | boolean | True | Verify that structure and types of records matches the schema from discovery command | -| `fail_on_extra_columns` | boolean | True | Fail schema validation if undeclared columns are found in records. Only relevant when `validate_schema=True` | -| `validate_data_points` | boolean | False | Validate that all fields in all streams contained at least one data point | -| `timeout_seconds` | int | 5\*60 | Test execution timeout in seconds | -| `expect_trace_message_on_failure` | boolean | True | Ensure that a trace message is emitted when the connector crashes | -| `expect_records` | object | None | Compare produced records with expected records, see details below | -| `expect_records.path` | string | | File with expected records | -| `expect_records.bypass_reason` | string | | Explain why this test is bypassed | -| `expect_records.exact_order` | boolean | False | Ensure that records produced in exact same order | -| `file_types` | object | None | Configure file-based connectors specific tests | -| `file_types.skip_test` | boolean | False | Skip file-based connectors specific tests for the current config with a `bypass_reason` | -| `file_types.bypass_reason` | string | None | Reason why file-based connectors specific tests are skipped | -| `file_types.unsupported_types` | array of objects | None | Configure file types which are not supported by a source | -| `file_types.unsupported_types[0].extension` | string | | File type in `.csv` format which cannot be added to a test account | -| `file_types.unsupported_types[0].bypass_reason` | string | None | Reason why this file type cannot be added to a test account | +| Input | Type | Default | Note | +| :---------------------------------------------- | :--------------- | :------------------------------------------ | :----------------------------------------------------------------------------------------------------------- | +| `config_path` | string | `secrets/config.json` | Path to a JSON object representing a valid connector configuration | +| `configured_catalog_path` | string | `integration_tests/configured_catalog.json` | Path to configured catalog | +| `empty_streams` | array of objects | \[\] | List of streams that might be empty with a `bypass_reason` | +| `empty_streams[0].name` | string | | Name of the empty stream | +| `empty_streams[0].bypass_reason` | string | None | Reason why this stream is empty | +| `ignored_fields[stream][0].name` | string | | Name of the ignored field | +| `ignored_fields[stream][0].bypass_reason` | string | None | Reason why this field is ignored | +| 
`validate_schema` | boolean | True | Verify that structure and types of records matches the schema from discovery command | +| `fail_on_extra_columns` | boolean | True | Fail schema validation if undeclared columns are found in records. Only relevant when `validate_schema=True` | +| `validate_data_points` | boolean | False | Validate that all fields in all streams contained at least one data point | +| `timeout_seconds` | int | 5\*60 | Test execution timeout in seconds | +| `expect_trace_message_on_failure` | boolean | True | Ensure that a trace message is emitted when the connector crashes | +| `expect_records` | object | None | Compare produced records with expected records, see details below | +| `expect_records.path` | string | | File with expected records | +| `expect_records.bypass_reason` | string | | Explain why this test is bypassed | +| `expect_records.exact_order` | boolean | False | Ensure that records produced in exact same order | +| `file_types` | object | None | Configure file-based connectors specific tests | +| `file_types.skip_test` | boolean | False | Skip file-based connectors specific tests for the current config with a `bypass_reason` | +| `file_types.bypass_reason` | string | None | Reason why file-based connectors specific tests are skipped | +| `file_types.unsupported_types` | array of objects | None | Configure file types which are not supported by a source | +| `file_types.unsupported_types[0].extension` | string | | File type in `.csv` format which cannot be added to a test account | +| `file_types.unsupported_types[0].bypass_reason` | string | None | Reason why this file type cannot be added to a test account | `expect_records` is a nested configuration, if omitted - the part of the test responsible for record matching will be skipped. @@ -284,22 +284,22 @@ This test verifies that sync produces no records when run with the STATE with ab Verifies that certain properties of the connector and its streams guarantee a higher level of usability standards for certified connectors. Some examples of the types of tests covered are verification that streams define primary keys, correct OAuth spec configuration, or a connector emits the correct stream status during a read. 
-| Input | Type | Default | Note | -|:------------------------------------------|:-----------------|:----------------------|:-----------------------------------------------------------------------| -| `config_path` | string | `secrets/config.json` | Path to a JSON object representing a valid connector configuration | -| `streams_without_primary_key` | array of objects | None | List of streams that do not support a primary key like reports streams | -| `streams_without_primary_key.name` | string | None | Name of the stream missing the PK | -| `streams_without_primary_key.bypass_reason` | string | None | The reason the stream doesn't have the PK | -| `allowed_hosts.bypass_reason` | object with `bypass_reason` | None | Defines the `bypass_reason` description about why the `allowedHosts` check for the certified connector should be skipped | -| `suggested_streams.bypass_reason` | object with `bypass_reason` | None | Defines the `bypass_reason` description about why the `suggestedStreams` check for the certified connector should be skipped | +| Input | Type | Default | Note | +| :------------------------------------------ | :-------------------------- | :-------------------- | :--------------------------------------------------------------------------------------------------------------------------- | +| `config_path` | string | `secrets/config.json` | Path to a JSON object representing a valid connector configuration | +| `streams_without_primary_key` | array of objects | None | List of streams that do not support a primary key like reports streams | +| `streams_without_primary_key.name` | string | None | Name of the stream missing the PK | +| `streams_without_primary_key.bypass_reason` | string | None | The reason the stream doesn't have the PK | +| `allowed_hosts.bypass_reason` | object with `bypass_reason` | None | Defines the `bypass_reason` description about why the `allowedHosts` check for the certified connector should be skipped | +| `suggested_streams.bypass_reason` | object with `bypass_reason` | None | Defines the `bypass_reason` description about why the `suggestedStreams` check for the certified connector should be skipped | ## Test Connector Documentation -Verifies that connectors documentation follows our standard template, does have correct order of headings, -does not have missing headings and all required fields in Prerequisites section. +Verifies that connectors documentation follows our standard template, does have correct order of headings, +does not have missing headings and all required fields in Prerequisites section. 
| Input | Type | Default | Note | -|:------------------|:-------|:----------------------|:-------------------------------------------------------------------| +| :---------------- | :----- | :-------------------- | :----------------------------------------------------------------- | | `config_path` | string | `secrets/config.json` | Path to a JSON object representing a valid connector configuration | | `timeout_seconds` | int | 20\*60 | Test execution timeout in seconds | diff --git a/docs/connector-development/tutorials/custom-python-connector/2-reading-a-page.md b/docs/connector-development/tutorials/custom-python-connector/2-reading-a-page.md index c3bd7f223c3..11caf60c91a 100644 --- a/docs/connector-development/tutorials/custom-python-connector/2-reading-a-page.md +++ b/docs/connector-development/tutorials/custom-python-connector/2-reading-a-page.md @@ -333,7 +333,10 @@ poetry run source-survey-monkey-demo read --config secrets/config.json --catalog The connector should've successfully read records. ```json -{ "type": "LOG", "log": { "level": "INFO", "message": "Read 14 records from surveys stream" } } +{ + "type": "LOG", + "log": { "level": "INFO", "message": "Read 14 records from surveys stream" } +} ``` You can also pass in the `--debug` flag to see the real requests and responses sent and received. diff --git a/docs/connector-development/tutorials/the-hard-way/build-a-connector-the-hard-way.md b/docs/connector-development/tutorials/the-hard-way/build-a-connector-the-hard-way.md index 876898b3144..98ab769a442 100644 --- a/docs/connector-development/tutorials/the-hard-way/build-a-connector-the-hard-way.md +++ b/docs/connector-development/tutorials/the-hard-way/build-a-connector-the-hard-way.md @@ -125,7 +125,7 @@ To contact the stock ticker API, we need two things: 2. The API key to use when contacting the API \(you can obtain a free API token from [Polygon.io](https://polygon.io/dashboard/signup) free plan\) -:::info +:::info For reference, the API docs we'll be using [can be found here](https://polygon.io/docs/stocks/get_v2_aggs_ticker__stocksticker__range__multiplier___timespan___from___to). diff --git a/docs/connector-development/ux-handbook.md b/docs/connector-development/ux-handbook.md index 5253df5e9ed..bc31bf2df97 100644 --- a/docs/connector-development/ux-handbook.md +++ b/docs/connector-development/ux-handbook.md @@ -65,9 +65,9 @@ Data replicated by Airbyte must be correct and complete. If a user moves data wi Some tricky examples which can break data integrity if not handled correctly: -* Zipcodes for the US east coast should not lose their leading zeros because of being detected as integer -* Database timezones could affect the value of timestamps -* Esoteric text values (e.g: weird UTF characters) +- Zipcodes for the US east coast should not lose their leading zeros because of being detected as integer +- Database timezones could affect the value of timestamps +- Esoteric text values (e.g: weird UTF characters) **Reliability** @@ -97,10 +97,10 @@ There is also a tension between featureset and ease of use. 
The more features ar Without repeating too many details mentioned elsewhere, the important thing to know is Airbyte serves all the following personas: -| **Persona** | **Level of technical knowledge** | -| ------------------ | ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------- | -| Data Analyst |

    Proficient with:

    Data manipulation tools like Excel or SQL
    Dashboard tools like Looker

    Not very familiar with reading API docs and doesn't know what a curl request is. But might be able to generate an API key if you tell them exactly how.

    | -| Analytics Engineer |

    Proficient with:

    SQL & DBT
    Git
    A scripting language like Python
    Shallow familiarity with infra tools like Docker

    Much more technical than a data analyst, but not as much as a data engineer

    | +| **Persona** | **Level of technical knowledge** | +| ------------------ | ----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------- | +| Data Analyst |

    Proficient with:

    Data manipulation tools like Excel or SQL
    Dashboard tools like Looker

    Not very familiar with reading API docs and doesn't know what a curl request is. But might be able to generate an API key if you tell them exactly how.

    | +| Analytics Engineer |

    Proficient with:

    SQL & DBT
    Git
    A scripting language like Python
    Shallow familiarity with infra tools like Docker

    Much more technical than a data analyst, but not as much as a data engineer

    | | Data Engineer |

    Proficient with:

    SQL & DBT
    Git
    2 or more programming languages
    Infra tools like Docker or Kubernetes
    Cloud technologies like AWS or GCP
    Building or consuming APIs
    orchestration tools like Airflow

    The most technical persona we serve. Think of them like an engineer on your team

    | Keep in mind that the distribution of served personas will differ per connector. Data analysts are highly unlikely to form the majority of users for a very technical connector like say, Kafka. @@ -181,13 +181,13 @@ All configurations must have an unmistakable explanation describing their purpos For example, in some Ads APIs like Facebook, the user’s data may continue to be updated up to 28 days after it is created. This happens because a user may take action because of an ad (like buying a product) many days after they see the ad. In this case, the user may want to configure a “lookback” window for attributing. -Adding a parameter “attribution\_lookback\_window” with no explanation might confuse the user more than it helps them. Instead, we should add a clear title and description which describes what this parameter is and how different values will impact the data output by the connector. +Adding a parameter “attribution_lookback_window” with no explanation might confuse the user more than it helps them. Instead, we should add a clear title and description which describes what this parameter is and how different values will impact the data output by the connector. **Document how users can obtain configuration parameters** If a user needs to obtain an API key or host name, tell them exactly where to find it. Ideally you would show them screenshots, though include a date and API version in those if possible, so it’s clear when they’ve aged out of date. -**Links should point to page anchors where applicable**. +**Links should point to page anchors where applicable**. Often, you are trying to redirect the user to a specific part of the page. For example, if you wanted to point someone to the "Input Configuration" section of this doc, it is better to point them to `https://docs.airbyte.com/connector-development/ux-handbook#input-configuration` instead of `https://docs.airbyte.com/connector-development/ux-handbook`. @@ -247,8 +247,8 @@ Assuming we follow ELT over ETL, and automate generation of output schemas, this If for any reason we need to change the output schema declared by a connector in a backwards breaking way, consider it a necessary evil that should be avoided if possible. Basically, the only reasons for a backwards breaking change should be: -* a connector previously had an incorrect schema, or -* It was not following ELT principles and is now being changed to follow them +- a connector previously had an incorrect schema, or +- It was not following ELT principles and is now being changed to follow them Other breaking changes should probably be escalated for approval. diff --git a/docs/contributing-to-airbyte/README.md b/docs/contributing-to-airbyte/README.md index aa36724dd24..6cadca6289a 100644 --- a/docs/contributing-to-airbyte/README.md +++ b/docs/contributing-to-airbyte/README.md @@ -1,5 +1,5 @@ --- -description: 'We love contributions to Airbyte, big or small.' +description: "We love contributions to Airbyte, big or small." --- # Contributing to Airbyte @@ -19,12 +19,13 @@ A great place to start looking will be our GitHub projects for: Due to project priorities, we may not be able to accept all contributions at this time. We are prioritizing the following contributions: -* Bug fixes, features, and enhancements to existing API source connectors. -* Migrating Python CDK to Low-code or No-Code Framework. -* New connector sources built with the Low-Code CDK or Connector Builder, as these connectors are easier to maintain. 
-* Bug fixes, features, and enhancements to the following database sources: Postgres, MySQL, MSSQL. -* Bug fixes to the following destinations: BigQuery, Snowflake, Redshift, S3, and Postgres. -* Helm Charts features, bug fixes, and other platform bug fixes. + +- Bug fixes, features, and enhancements to existing API source connectors. +- Migrating Python CDK to Low-code or No-Code Framework. +- New connector sources built with the Low-Code CDK or Connector Builder, as these connectors are easier to maintain. +- Bug fixes, features, and enhancements to the following database sources: Postgres, MySQL, MSSQL. +- Bug fixes to the following destinations: BigQuery, Snowflake, Redshift, S3, and Postgres. +- Helm Charts features, bug fixes, and other platform bug fixes. :::warning Airbyte is undergoing a major revamp of the shared core Java destinations codebase, with plans to release a new CDK in 2024. @@ -37,6 +38,7 @@ Contributions outside of these will be evaluated on a case-by-case basis by our ::: The usual workflow of code contribution is: + 1. Fork the Airbyte repository. 2. Clone the repository locally. 3. Create a branch for your feature/bug fix with the format `{YOUR_USERNAME}/{FEATURE/BUG}` (e.g. `jdoe/source-stock-api-stream-fix`) @@ -58,6 +60,7 @@ Pull Requests without updates will be closed due inactivity. ::: Guidelines to common code contributions: + - [Submit code change to existing Source Connector](change-cdk-connector.md) - [Submit a New Connector](submit-new-connector.md) diff --git a/docs/contributing-to-airbyte/change-cdk-connector.md b/docs/contributing-to-airbyte/change-cdk-connector.md index 89466149ac4..07f18ab35d5 100644 --- a/docs/contributing-to-airbyte/change-cdk-connector.md +++ b/docs/contributing-to-airbyte/change-cdk-connector.md @@ -3,11 +3,14 @@ ## Contribution Process ### Open an issue, or find a similar one. + Before jumping into the code please first: -1. Check if the improvement you want to make or bug you want to fix is already captured in an [existing issue](https://github.com/airbytehq/airbyte/issues?q=is%3Aopen+is%3Aissue+label%3Aarea%2Fconnectors+-label%3Aneeds-triage+label%3Acommunity) + +1. Check if the improvement you want to make or bug you want to fix is already captured in an [existing issue](https://github.com/airbytehq/airbyte/issues?q=is%3Aopen+is%3Aissue+label%3Aarea%2Fconnectors+-label%3Aneeds-triage+label%3Acommunity) 2. If you don't find an existing issue, either - - [Report a Connector Bug](https://github.com/airbytehq/airbyte/issues/new?assignees=&labels=type%2Fbug%2Carea%2Fconnectors%2Cneeds-triage&projects=&template=1-issue-connector.yaml), or - - [Request a New Connector Feature](https://github.com/airbytehq/airbyte/issues/new?assignees=&labels=type%2Fenhancement%2Cneeds-triage&projects=&template=6-feature-request.yaml) + +- [Report a Connector Bug](https://github.com/airbytehq/airbyte/issues/new?assignees=&labels=type%2Fbug%2Carea%2Fconnectors%2Cneeds-triage&projects=&template=1-issue-connector.yaml), or +- [Request a New Connector Feature](https://github.com/airbytehq/airbyte/issues/new?assignees=&labels=type%2Fenhancement%2Cneeds-triage&projects=&template=6-feature-request.yaml) This will enable our team to make sure your contribution does not overlap with existing works and will comply with the design orientation we are currently heading the product toward. If you do not receive an update on the issue from our team, please ping us on [Slack](https://slack.airbyte.io)! 
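For orientation, here is a minimal sketch of the fork-and-branch workflow described in the contribution steps above, reusing the hypothetical username and branch name from the example (`jdoe/source-stock-api-stream-fix`); adapt the names to your own fork:

```bash
# Clone your fork of airbytehq/airbyte (hypothetical username "jdoe")
git clone git@github.com:jdoe/airbyte.git
cd airbyte

# Create a branch following the {YOUR_USERNAME}/{FEATURE/BUG} convention
git checkout -b jdoe/source-stock-api-stream-fix

# ...make your changes, add unit tests, update the changelog...

# Push the branch to your fork, then open a pull request against master
git push --set-upstream origin jdoe/source-stock-api-stream-fix
```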
@@ -16,7 +19,8 @@ Make sure you're working on an issue had been already triaged to not have your c ::: ### Code your contribution -1. To contribute to a connector, fork the [Connector repository](https://github.com/airbytehq/airbyte). + +1. To contribute to a connector, fork the [Connector repository](https://github.com/airbytehq/airbyte). 2. Open a branch for your work 3. Code the change 4. Write a unit test for each custom function you added or changed @@ -25,7 +29,6 @@ Make sure you're working on an issue had been already triaged to not have your c 7. Update the changelog entry in documentation in `docs/integrations/.md` 8. Make sure your contribution passes our [QA checks](./resources/qa-checks.md) - :::info There is a README file inside each connector folder containing instructions to run that connector's tests locally. ::: @@ -34,8 +37,8 @@ There is a README file inside each connector folder containing instructions to r Pay attention to breaking changes to connectors. You can read more [here](#breaking-changes-to-connectors). ::: - ### Open a pull request + 1. Rebase master with your branch before submitting a pull request. 2. Open the pull request. 3. Follow the [title convention](./resources/pull-requests-handbook.md#pull-request-title-convention) for Pull Requests @@ -44,34 +47,37 @@ Pay attention to breaking changes to connectors. You can read more [here](#break 6. Wait for a review from a community maintainer or our team. ### Review process -When we review, we look at: -* ‌Does the PR solve the issue? -* Is the proposed solution reasonable? -* Is it tested? \(unit tests or integration tests\) -* Is it introducing security risks? -* Is it introducing a breaking change? -‌Once your PR passes, we will merge it 🎉. +When we review, we look at: + +- ‌Does the PR solve the issue? +- Is the proposed solution reasonable? +- Is it tested? \(unit tests or integration tests\) +- Is it introducing security risks? +- Is it introducing a breaking change? + ‌Once your PR passes, we will merge it 🎉. ## Breaking Changes to Connectors Often times, changes to connectors can be made without impacting the user experience.  However, there are some changes that will require users to take action before they can continue to sync data.  These changes are considered **Breaking Changes** and require: -1. A **Major Version** increase  +1. A **Major Version** increase 2. A [`breakingChanges` entry](https://docs.airbyte.com/connector-development/connector-metadata-file/) in the `releases` section of the `metadata.yaml` file 3. A migration guide which details steps that users should take to resolve the change 4. An Airbyte Engineer to follow the  [Connector Breaking Change Release Playbook](https://docs.google.com/document/u/0/d/1VYQggHbL_PN0dDDu7rCyzBLGRtX-R3cpwXaY8QxEgzw/edit) before merging. ### Types of Breaking Changes + A breaking change is any change that will require users to take action before they can continue to sync data. The following are examples of breaking changes: -- **Spec Change** - The configuration required by users of this connector have been changed and syncs will fail until users reconfigure or re-authenticate.  This change is not possible via a Config Migration  +- **Spec Change** - The configuration required by users of this connector have been changed and syncs will fail until users reconfigure or re-authenticate.  
This change is not possible via a Config Migration - **Schema Change** - The type of property previously present within a record has changed - **Stream or Property Removal** - Data that was previously being synced is no longer going to be synced. - **Destination Format / Normalization Change** - The way the destination writes the final data or how normalization cleans that data is changing in a way that requires a full-refresh. - **State Changes** - The format of the source’s state has changed, and the full dataset will need to be re-synced ### Limiting the Impact of Breaking Changes + Some of the changes listed above may not impact all users of the connector. For example, a change to the schema of a specific stream only impacts users who are syncing that stream. The breaking change metadata allows you to specify narrowed scopes that are specifically affected by a breaking change. See the [`breakingChanges` entry](https://docs.airbyte.com/connector-development/connector-metadata-file/) documentation for supported scopes. diff --git a/docs/contributing-to-airbyte/issues-and-requests.md b/docs/contributing-to-airbyte/issues-and-requests.md index 3ad347bdb68..183705c2c8b 100644 --- a/docs/contributing-to-airbyte/issues-and-requests.md +++ b/docs/contributing-to-airbyte/issues-and-requests.md @@ -1,11 +1,11 @@ # Issues and Requests ## Report a Bug + Bug reports help us make Airbyte better for everyone. We provide a preconfigured template for bugs to make it very clear what information we need. ‌Please search within our [already reported bugs](https://github.com/airbytehq/airbyte/issues?q=is%3Aissue+is%3Aopen+label%3Atype%2Fbug) before raising a new one to make sure you're not raising a duplicate. - ## Request new Features or Connector Requesting new features or connectors is an essential way to contribute. Your input helps us understand your needs and priorities, enabling us to enhance the functionality and versatility of Airbyte. diff --git a/docs/contributing-to-airbyte/resources/code-formatting.md b/docs/contributing-to-airbyte/resources/code-formatting.md index f2e4ab359fa..65bb10e5aaa 100644 --- a/docs/contributing-to-airbyte/resources/code-formatting.md +++ b/docs/contributing-to-airbyte/resources/code-formatting.md @@ -3,36 +3,45 @@ ## Tools ### 🐍 Python + We format our Python code using: -* [Black](https://github.com/psf/black) for code formatting -* [isort](https://pycqa.github.io/isort/) for import sorting + +- [Black](https://github.com/psf/black) for code formatting +- [isort](https://pycqa.github.io/isort/) for import sorting Our configuration for both tools is in the [pyproject.toml](https://github.com/airbytehq/airbyte/blob/master/pyproject.toml) file. ### ☕ Java + We format our Java code using [Spotless](https://github.com/diffplug/spotless). Our configuration for Spotless is in the [spotless-maven-pom.xml](https://github.com/airbytehq/airbyte/blob/master/spotless-maven-pom.xml) file. ### Json and Yaml + We format our Json and Yaml files using [prettier](https://prettier.io/). ## Pre-push hooks and CI + We wrapped all our code formatting tools in [airbyte-ci](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md). ### Local formatting + You can run `airbyte-ci format fix all` to format all the code in the repository. We wrapped this command in a pre-push hook so that you can't push code that is not formatted. 
-To install the pre-push hook, run: +To install the pre-push hook, run: + ```bash make tools.pre-commit.setup ``` + This will install `airbyte-ci` and the pre-push hook. The pre-push hook runs formatting on all the repo files. If the hook attempts to format a file that is not part of your contribution, it means that formatting is also broken in the master branch. Please open a separate PR to fix the formatting in the master branch. ### CI checks + In the CI we run the `airbyte-ci format check all` command to check that all the code is formatted. If it is not, the CI will fail and you will have to run `airbyte-ci format fix all` locally to fix the formatting issues. Failure on the CI is not expected if you installed the pre-push hook. diff --git a/docs/contributing-to-airbyte/resources/developing-locally.md b/docs/contributing-to-airbyte/resources/developing-locally.md index 7bffa0174eb..370fe84d30a 100644 --- a/docs/contributing-to-airbyte/resources/developing-locally.md +++ b/docs/contributing-to-airbyte/resources/developing-locally.md @@ -16,23 +16,24 @@ Manually switching between different language versions can get hairy. We recomme To start contributing: -1. [Fork](https://docs.github.com/en/github/getting-started-with-github/fork-a-repo) the [`airbyte`](https://github.com/airbytehq/airbyte) repository to develop connectors or the [ `airbyte-platform`](https://github.com/airbytehq/airbyte-platform) repository to develop the Airbyte platform. +1. [Fork](https://docs.github.com/en/github/getting-started-with-github/fork-a-repo) the [`airbyte`](https://github.com/airbytehq/airbyte) repository to develop connectors or the [ `airbyte-platform`](https://github.com/airbytehq/airbyte-platform) repository to develop the Airbyte platform. 2. Clone the fork on your workstation: If developing connectors, you can work on connectors locally but additionally start the platform independently locally using : - ```bash - git clone git@github.com:{YOUR_USERNAME}/airbyte.git - cd airbyte - ./run-ab-platform.sh - ``` +```bash +git clone git@github.com:{YOUR_USERNAME}/airbyte.git +cd airbyte +./run-ab-platform.sh +``` + If developing platform: - ```bash - git clone git@github.com:{YOUR_USERNAME}/airbyte-platform.git - cd airbyte-platform - docker compose up - ``` +```bash +git clone git@github.com:{YOUR_USERNAME}/airbyte-platform.git +cd airbyte-platform +docker compose up +``` ## Build with `gradle` @@ -107,12 +108,10 @@ In your local `airbyte` repository, run the following command: ``` - Then, build the connector image: - - Install our [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md) tool to build your connector. + - Install our [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md) tool to build your connector. - Running `airbyte-ci connectors --name source- build` will build your connector image. - Once the command is done, you will find your connector image in your local docker host: `airbyte/source-:dev`. - - :::info The above connector image is tagged with `dev`. You can change this to use another tag if you'd like. @@ -121,7 +120,7 @@ The above connector image is tagged with `dev`. 
You can change this to use anoth - In your browser, visit [http://localhost:8000/](http://localhost:8000/) - Log in with the default user `airbyte` and default password `password` -- Go to `Settings` (gear icon in lower left corner) +- Go to `Settings` (gear icon in lower left corner) - Go to `Sources` or `Destinations` (depending on which connector you are testing) - Update the version number to use your docker image tag (default is `dev`) - Click `Change` to save the changes @@ -132,7 +131,6 @@ Now when you run a sync with that connector, it will use your local docker image In your local `airbyte-platform` repository, run the following commands to run acceptance \(end-to-end\) tests for the platform: - ```bash SUB_BUILD=PLATFORM ./gradlew clean build SUB_BUILD=PLATFORM ./gradlew :airbyte-tests:acceptanceTests @@ -196,6 +194,7 @@ pnpm start When working on the connector builder UI and doing changes to the CDK and the webapp at the same time, you can start the dev server with `CDK_MANIFEST_PATH` or `CDK_VERSION` environment variables set to have the correct Typescript types built. If `CDK_VERSION` is set, it's loading the specified version of the CDK from pypi instead of the default one, if `CDK_MANIFEST_PATH` is set, it's copying the schema file locally. For example: + ``` CDK_MANIFEST_PATH=../../airbyte/airbyte-cdk/python/airbyte_cdk/sources/declarative/declarative_component_schema.yaml pnpm start ``` diff --git a/docs/contributing-to-airbyte/resources/developing-on-docker.md b/docs/contributing-to-airbyte/resources/developing-on-docker.md index c7d141c2560..ed8c7751581 100644 --- a/docs/contributing-to-airbyte/resources/developing-on-docker.md +++ b/docs/contributing-to-airbyte/resources/developing-on-docker.md @@ -1,22 +1,25 @@ # Developing on Docker -## Incrementality +## Incrementality -The docker build is fully incremental for the platform build, which means that it will only build an image if it is needed. We need to keep it that +The docker build is fully incremental for the platform build, which means that it will only build an image if it is needed. We need to keep it that way. The top level `build.gradle` file defines several convenient tasks for building a docker image. -1) The `copyGeneratedTar` task copies a generated TAR file from a default location into the default location used by the [docker plugin](https://github.com/bmuschko/gradle-docker-plugin). -2) The `buildDockerImage` task is a convenience class for configuring the above linked docker plugin that centralizes configuration logic commonly found in our dockerfiles. -3) Makes the `buildDockerImage` task depend on the Gradle `assemble` task. + +1. The `copyGeneratedTar` task copies a generated TAR file from a default location into the default location used by the [docker plugin](https://github.com/bmuschko/gradle-docker-plugin). +2. The `buildDockerImage` task is a convenience class for configuring the above linked docker plugin that centralizes configuration logic commonly found in our dockerfiles. +3. Makes the `buildDockerImage` task depend on the Gradle `assemble` task. These tasks are created in a subproject if the subproject has a `gradle.properties` file with the `dockerImageName` property. This property sets the built docker image's name. ## Adding a new docker build Once you have a `Dockerfile`, generating the docker image is done in the following way: + 1. Create a `gradle.properties` file in the subproject with the `dockerImageName` property set to the docker image name. 
For example: + ```groovy // In the gradle.properties file. dockerImageName=cron @@ -26,6 +29,7 @@ dockerImageName=cron depend on the copy TAR task in the subproject's build.gradle. For example: + ```groovy tasks.named("buildDockerImage") { dependsOn copyGeneratedTar @@ -34,6 +38,7 @@ tasks.named("buildDockerImage") { 3. If this is a subproject with a more custom copy strategy, define your own task to copy the necessary files and configure the build docker task to depend on this custom copy task in the subproject's build.gradle. + ```groovy task copyScripts(type: Copy) { dependsOn copyDocker @@ -56,9 +61,10 @@ The docker images that are running using a jar need to the latest published OSS ### Existing modules -The version should already be present. If a new version is published while a PR is open, it should generate a conflict, that will prevent you from -merging the review. There are scenarios where it is going to generate and error (The Dockerfile is moved for example), the way to avoid any issue +The version should already be present. If a new version is published while a PR is open, it should generate a conflict, that will prevent you from +merging the review. There are scenarios where it is going to generate and error (The Dockerfile is moved for example), the way to avoid any issue is to: + - Check the `.env` file to make sure that the latest version align with the version in the PR - Merge the `master` branch in the PR and make sure that the build is working right before merging. @@ -69,11 +75,13 @@ The version will be automatically replace with new version when releasing the OS ### New module This is trickier than handling the version of an existing module. -First your docker file generating an image need to be added to the `.bumpversion.cfg`. For each and every version you want to build with, the -docker image will need to be manually tag and push until the PR is merge. The reason is that the build has a check to know if all the potential +First your docker file generating an image need to be added to the `.bumpversion.cfg`. For each and every version you want to build with, the +docker image will need to be manually tag and push until the PR is merge. The reason is that the build has a check to know if all the potential docker images are present in the docker repository. It is done the following way: + ```shell docker tag 7d94ea2ad657 airbyte/temporal:0.30.35-alpha docker push airbyte/temporal:0.30.35-alpha ``` + The image ID can be retrieved using `docker images` or the docker desktop UI. diff --git a/docs/contributing-to-airbyte/resources/pull-requests-handbook.md b/docs/contributing-to-airbyte/resources/pull-requests-handbook.md index 2b1944c2d6f..5cc725f9320 100644 --- a/docs/contributing-to-airbyte/resources/pull-requests-handbook.md +++ b/docs/contributing-to-airbyte/resources/pull-requests-handbook.md @@ -6,13 +6,13 @@ When creating a pull request follow the naming conventions depending on the chan In general, the pull request title starts with an emoji with the connector you're doing the changes, eg (✨ Source E-Commerce: add new stream `Users`). Airbyte uses this pattern to automatically assign team reviews and build the product release notes. 
-| Pull Request Type | Emoji | Examples | -| ----------------- | ----- | ---------| -| New Connector (Source or Destination) | 🎉 | 🎉 New Destination: Database | -| Add a feature to an existing connector | ✨ | ✨ Source E-Commerce: add new stream `Users` | -| Fix a bug | 🐛 | 🐛 Source E-Commerce: fix start date parameter in spec | -| Documentation (updates or new entries) | 📝 | 📝 Fix Database connector changelog | -| It's a breaking change | 🚨 | 🚨🚨🐛 Source Kafka: fix a complex bug | +| Pull Request Type | Emoji | Examples | +| -------------------------------------- | ----- | ------------------------------------------------------ | +| New Connector (Source or Destination) | 🎉 | 🎉 New Destination: Database | +| Add a feature to an existing connector | ✨ | ✨ Source E-Commerce: add new stream `Users` | +| Fix a bug | 🐛 | 🐛 Source E-Commerce: fix start date parameter in spec | +| Documentation (updates or new entries) | 📝 | 📝 Fix Database connector changelog | +| It's a breaking change | 🚨 | 🚨🚨🐛 Source Kafka: fix a complex bug | For more information about [breaking changes](#breaking-changes-to-connectors). A maintainer will help and instruct about possible breaking changes. @@ -43,7 +43,7 @@ When creating or updating connectors, we spend a lot of time manually transcribi Changes to connector behavior should always be accompanied by a version bump and a changelog entry. We use [semantic versioning](https://semver.org/) to version changes to connectors. Since connectors are a bit different from APIs, we have our own take on semantic versioning, focusing on maintaining the best user experience of using a connector. - Major: a version in which a change is made which requires manual intervention (update to config or configured catalog) for an existing connection to continue to succeed, or one in which data that was previously being synced will no longer be synced - - Note that a category of "user intervention" is a schema change in the destination, as users will be required to update downstream reports and tools. A change that leads to a different final table in the destination is a breaking change + - Note that a category of "user intervention" is a schema change in the destination, as users will be required to update downstream reports and tools. 
A change that leads to a different final table in the destination is a breaking change - Minor: a version that introduces user-facing functionality in a backwards compatible manner - Patch: a version that introduces backwards compatible bug fixes or performance improvements @@ -52,7 +52,7 @@ Changes to connector behavior should always be accompanied by a version bump and Here are some examples of code changes and their respective version changes: | Change | Impact | Version Change | -|-----------------------------------------------------------------------------------------------|------------------------------------------------------------------------------------------------------------------|----------------| +| --------------------------------------------------------------------------------------------- | ---------------------------------------------------------------------------------------------------------------- | -------------- | | Adding a required parameter to a connector's `spec` | Users will have to add the new parameter to their `config` | Major | | Changing a format of a parameter in a connector's `spec` from a single parameter to a `oneOf` | Users will have to edit their `config` to define their old parameter value in the `oneOf` format | Major | | Removing a stream from a connector's `catalog` | Data that was being synced will no longer be synced | Major | diff --git a/docs/contributing-to-airbyte/resources/qa-checks.md b/docs/contributing-to-airbyte/resources/qa-checks.md index 11bb496fd66..67ae33e7abd 100644 --- a/docs/contributing-to-airbyte/resources/qa-checks.md +++ b/docs/contributing-to-airbyte/resources/qa-checks.md @@ -6,122 +6,150 @@ Meeting these standards means that the connector will be able to be safely integ You can consider these checks as a set of guidelines to follow when developing a connector. They are by no mean replacing the need for a manual review of the connector codebase and the implementation of good test suites. - ## 📄 Documentation ### Breaking changes must be accompanied by a migration guide -*Applies to the following connector types: source, destination* -*Applies to the following connector languages: java, low-code, python* -*Applies to connector with any support level* -When a breaking change is introduced, we check that a migration guide is available. It should be stored under `./docs/integrations/s/-migrations.md`. +_Applies to the following connector types: source, destination_ +_Applies to the following connector languages: java, low-code, python_ +_Applies to connector with any support level_ + +When a breaking change is introduced, we check that a migration guide is available. It should be stored under `./docs/integrations/s/-migrations.md`. This document should contain a section for each breaking change, in order of the version descending. It must explain users which action to take to migrate to the new version. + ### Connectors must have user facing documentation -*Applies to the following connector types: source, destination* -*Applies to the following connector languages: java, low-code, python* -*Applies to connector with any support level* + +_Applies to the following connector types: source, destination_ +_Applies to the following connector languages: java, low-code, python_ +_Applies to connector with any support level_ The user facing connector documentation should be stored under `./docs/integrations/s/.md`. 
+ ### Connectors must have a changelog entry for each version -*Applies to the following connector types: source, destination* -*Applies to the following connector languages: java, low-code, python* -*Applies to connector with any support level* + +_Applies to the following connector types: source, destination_ +_Applies to the following connector languages: java, low-code, python_ +_Applies to connector with any support level_ Each new version of a connector must have a changelog entry defined in the user facing documentation in `./docs/integrations/s/.md`. ## 📝 Metadata ### Connectors must have valid metadata.yaml file -*Applies to the following connector types: source, destination* -*Applies to the following connector languages: java, low-code, python* -*Applies to connector with any support level* + +_Applies to the following connector types: source, destination_ +_Applies to the following connector languages: java, low-code, python_ +_Applies to connector with any support level_ Connectors must have a `metadata.yaml` file at the root of their directory. This file is used to build our connector registry. Its structure must follow our metadata schema. Field values are also validated. This is to ensure that all connectors have the required metadata fields and that the metadata is valid. More details in this [documentation](https://docs.airbyte.com/connector-development/connector-metadata-file). + ### Connector must have a language tag in metadata -*Applies to the following connector types: source, destination* -*Applies to the following connector languages: java, low-code, python* -*Applies to connector with any support level* + +_Applies to the following connector types: source, destination_ +_Applies to the following connector languages: java, low-code, python_ +_Applies to connector with any support level_ Connectors must have a language tag in their metadata. It must be set in the `tags` field in metadata.yaml. The values can be `language:python` or `language:java`. This checks infers the correct language tag based on the presence of certain files in the connector directory. + ### Python connectors must have a CDK tag in metadata -*Applies to the following connector types: source, destination* -*Applies to the following connector languages: python, low-code* -*Applies to connector with any support level* + +_Applies to the following connector types: source, destination_ +_Applies to the following connector languages: python, low-code_ +_Applies to connector with any support level_ Python connectors must have a CDK tag in their metadata. It must be set in the `tags` field in metadata.yaml. The values can be `cdk:low-code`, `cdk:python`, or `cdk:file`. + ### Breaking change deadline should be a week in the future -*Applies to the following connector types: source, destination* -*Applies to the following connector languages: java, low-code, python* -*Applies to connector with any support level* + +_Applies to the following connector types: source, destination_ +_Applies to the following connector languages: java, low-code, python_ +_Applies to connector with any support level_ If the connector version has a breaking change, the deadline field must be set to at least a week in the future. 
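As an illustration only (the authoritative schema is in the [connector metadata documentation](https://docs.airbyte.com/connector-development/connector-metadata-file/)), a breaking-change entry in a hypothetical connector's `metadata.yaml` might look like the sketch below, with the `upgradeDeadline` set at least a week out:

```yaml
# Hypothetical excerpt from a connector's metadata.yaml -- field names follow the
# connector metadata documentation; the version, message, and date are illustrative only.
data:
  dockerImageTag: 2.0.0
  releases:
    breakingChanges:
      2.0.0:
        message: "Stream schemas changed; affected connections must be refreshed."
        upgradeDeadline: "2024-06-01"
```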
+ ### Certified source connector must have a value filled out for maxSecondsBetweenMessages in metadata -*Applies to the following connector types: source* -*Applies to the following connector languages: java, low-code, python* -*Applies to connector with certified support level* + +_Applies to the following connector types: source_ +_Applies to the following connector languages: java, low-code, python_ +_Applies to connector with certified support level_ Certified source connectors must have a value filled out for `maxSecondsBetweenMessages` in metadata. This value represents the maximum number of seconds we could expect between messages for API connectors. And it's used by platform to tune connectors heartbeat timeout. The value must be set in the 'data' field in connector's `metadata.yaml` file. ## 📦 Packaging ### Connectors must use Poetry for dependency management -*Applies to the following connector types: source, destination* -*Applies to the following connector languages: python, low-code* -*Applies to connector with any support level* + +_Applies to the following connector types: source, destination_ +_Applies to the following connector languages: python, low-code_ +_Applies to connector with any support level_ Connectors must use [Poetry](https://python-poetry.org/) for dependency management. This is to ensure that all connectors use a dependency management tool which locks dependencies and ensures reproducible installs. + ### Connectors must be licensed under MIT or Elv2 -*Applies to the following connector types: source, destination* -*Applies to the following connector languages: java, low-code, python* -*Applies to connector with any support level* + +_Applies to the following connector types: source, destination_ +_Applies to the following connector languages: java, low-code, python_ +_Applies to connector with any support level_ Connectors must be licensed under the MIT or Elv2 license. This is to ensure that all connectors are licensed under a permissive license. More details in our [License FAQ](https://docs.airbyte.com/developer-guides/licenses/license-faq). + ### Connector license in metadata.yaml and pyproject.toml file must match -*Applies to the following connector types: source, destination* -*Applies to the following connector languages: python, low-code* -*Applies to connector with any support level* + +_Applies to the following connector types: source, destination_ +_Applies to the following connector languages: python, low-code_ +_Applies to connector with any support level_ Connectors license in metadata.yaml and pyproject.toml file must match. This is to ensure that all connectors are consistently licensed. + ### Connector version must follow Semantic Versioning -*Applies to the following connector types: source, destination* -*Applies to the following connector languages: java, low-code, python* -*Applies to connector with any support level* + +_Applies to the following connector types: source, destination_ +_Applies to the following connector languages: java, low-code, python_ +_Applies to connector with any support level_ Connector version must follow the Semantic Versioning scheme. This is to ensure that all connectors follow a consistent versioning scheme. Refer to our [Semantic Versioning for Connectors](https://docs.airbyte.com/contributing-to-airbyte/#semantic-versioning-for-connectors) for more details. 
+ ### Connector version in metadata.yaml and pyproject.toml file must match -*Applies to the following connector types: source, destination* -*Applies to the following connector languages: python, low-code* -*Applies to connector with any support level* + +_Applies to the following connector types: source, destination_ +_Applies to the following connector languages: python, low-code_ +_Applies to connector with any support level_ Connector version in metadata.yaml and pyproject.toml file must match. This is to ensure that connector release is consistent. + ### Python connectors must have PyPi publishing enabled -*Applies to the following connector types: source* -*Applies to the following connector languages: python, low-code* -*Applies to connector with any support level* + +_Applies to the following connector types: source_ +_Applies to the following connector languages: python, low-code_ +_Applies to connector with any support level_ Python connectors must have [PyPi](https://pypi.org/) publishing enabled in their `metadata.yaml` file. This is declared by setting `remoteRegistries.pypi.enabled` to `true` in metadata.yaml. This is to ensure that all connectors can be published to PyPi and can be used in `PyAirbyte`. ## 💼 Assets ### Connectors must have an icon -*Applies to the following connector types: source, destination* -*Applies to the following connector languages: java, low-code, python* -*Applies to connector with any support level* + +_Applies to the following connector types: source, destination_ +_Applies to the following connector languages: java, low-code, python_ +_Applies to connector with any support level_ Each connector must have an icon available in at the root of the connector code directory. It must be an SVG file named `icon.svg` and must be a square. ## 🔒 Security ### Connectors must use HTTPS only -*Applies to the following connector types: source, destination* -*Applies to the following connector languages: java, low-code, python* -*Applies to connector with any support level* + +_Applies to the following connector types: source, destination_ +_Applies to the following connector languages: java, low-code, python_ +_Applies to connector with any support level_ Connectors must use HTTPS only when making requests to external services. + ### Python connectors must not use a Dockerfile and must declare their base image in metadata.yaml file -*Applies to the following connector types: source, destination* -*Applies to the following connector languages: python, low-code* -*Applies to connector with any support level* + +_Applies to the following connector types: source, destination_ +_Applies to the following connector languages: python, low-code_ +_Applies to connector with any support level_ Connectors must use our Python connector base image (`docker.io/airbyte/python-connector-base`), declared through the `connectorBuildOptions.baseImage` in their `metadata.yaml`. This is to ensure that all connectors use a base image which is maintained and has security updates. diff --git a/docs/contributing-to-airbyte/submit-new-connector.md b/docs/contributing-to-airbyte/submit-new-connector.md index 664064d4cd1..7d60a1e1428 100644 --- a/docs/contributing-to-airbyte/submit-new-connector.md +++ b/docs/contributing-to-airbyte/submit-new-connector.md @@ -1,28 +1,30 @@ # Submit a New Connector :::info -Due to project priorities, we may not be able to accept all contributions at this time. +Due to project priorities, we may not be able to accept all contributions at this time. 
::: #### Find an Issue or Create it! + Before jumping into the code please first: -1. Verify if there is an existing [Issue](https://github.com/airbytehq/airbyte/issues?q=is%3Aopen+is%3Aissue+label%3Aarea%2Fconnectors+-label%3Aneeds-triage+label%3Acommunity) + +1. Verify if there is an existing [Issue](https://github.com/airbytehq/airbyte/issues?q=is%3Aopen+is%3Aissue+label%3Aarea%2Fconnectors+-label%3Aneeds-triage+label%3Acommunity) 2. If you don't find an existing issue, [Request a New Connector](https://github.com/airbytehq/airbyte/issues/new?assignees=&labels=area%2Fconnectors%2Cnew-connector&projects=&template=5-feature-new-connector.yaml) This will enable our team to make sure your contribution does not overlap with existing works and will comply with the design orientation we are currently heading the product toward. If you do not receive an update on the issue from our team, please ping us on [Slack](https://slack.airbyte.io)! - #### Code your contribution -1. To contribute to a connector, fork the [Connector repository](https://github.com/airbytehq/airbyte). + +1. To contribute to a connector, fork the [Connector repository](https://github.com/airbytehq/airbyte). 2. Open a branch for your work 3. Code the change 4. Ensure all tests pass. For connectors, this includes acceptance tests as well. -5. Update documentation in `docs/integrations/.md` +5. Update documentation in `docs/integrations/.md` 6. Make sure your contribution passes our [QA checks](./resources/qa-checks.md) - #### Open a pull request + 1. Rebase master with your branch before submitting a pull request. 2. Open the pull request. 3. Follow the [title convention](./resources/pull-requests-handbook.md#pull-request-title-convention) for Pull Requests @@ -31,9 +33,10 @@ This will enable our team to make sure your contribution does not overlap with e 6. Wait for a review from a community maintainer or our team. #### 4. Review process -When we review, we look at: -* ‌Does the PR add all existing streams, pagination and incremental syncs? -* Is the proposed solution reasonable? -* Is it tested? \(unit tests or integation tests\) -‌Once your PR passes, we will merge it 🎉. +When we review, we look at: + +- ‌Does the PR add all existing streams, pagination and incremental syncs? +- Is the proposed solution reasonable? +- Is it tested? \(unit tests or integation tests\) + ‌Once your PR passes, we will merge it 🎉. diff --git a/docs/contributing-to-airbyte/writing-docs.md b/docs/contributing-to-airbyte/writing-docs.md index 0343e3f8b86..6ebfe196de3 100644 --- a/docs/contributing-to-airbyte/writing-docs.md +++ b/docs/contributing-to-airbyte/writing-docs.md @@ -3,13 +3,13 @@ import TabItem from "@theme/TabItem"; # Updating Documentation -We welcome contributions to the Airbyte documentation! +We welcome contributions to the Airbyte documentation! -Our docs are written in [Markdown](https://guides.github.com/features/mastering-markdown/) following the [Google developer documentation style guide](https://developers.google.com/style/highlights) and the files are stored in our [Github repository](https://github.com/airbytehq/airbyte/tree/master/docs). The docs are published at [docs.airbyte.com](https://docs.airbyte.com/) using [Docusaurus](https://docusaurus.io/) and [GitHub Pages](https://pages.github.com/). 
+Our docs are written in [Markdown](https://guides.github.com/features/mastering-markdown/) following the [Google developer documentation style guide](https://developers.google.com/style/highlights) and the files are stored in our [Github repository](https://github.com/airbytehq/airbyte/tree/master/docs). The docs are published at [docs.airbyte.com](https://docs.airbyte.com/) using [Docusaurus](https://docusaurus.io/) and [GitHub Pages](https://pages.github.com/). ## Finding good first issues -The Docs team maintains a list of [#good-first-issues](https://github.com/airbytehq/airbyte/issues?q=is%3Aopen+is%3Aissue+label%3Aarea%2Fdocumentation+label%3A%22good+first+issue%22) for new contributors. +The Docs team maintains a list of [#good-first-issues](https://github.com/airbytehq/airbyte/issues?q=is%3Aopen+is%3Aissue+label%3Aarea%2Fdocumentation+label%3A%22good+first+issue%22) for new contributors. - If you're new to technical writing, start with the smaller issues (fixing typos, broken links, spelling and grammar, and so on). You can [edit the files directly on GitHub](#editing-directly-on-github). - If you're an experienced technical writer or a developer interested in technical writing, comment on an issue that interests you to discuss it with the Docs team. Once we decide on the approach and the tasks involved, [edit the files and open a Pull Request](#editing-on-your-local-machine) for the Docs team to review. @@ -28,7 +28,7 @@ You can contribute to Airbyte docs in two ways: To make minor changes (example: fixing typos) or edit a single file, you can edit the file directly on GitHub: -1. Click **Edit this page** at the bottom of any published document on [docs.airbyte.com](https://docs.airbyte.com/). You'll be taken to the GitHub editor. +1. Click **Edit this page** at the bottom of any published document on [docs.airbyte.com](https://docs.airbyte.com/). You'll be taken to the GitHub editor. 2. [Edit the file directly on GitHub and open a Pull Request](https://docs.github.com/en/repositories/working-with-files/managing-files/editing-files). ### Editing on your local machine @@ -85,7 +85,7 @@ To make complex changes or edit multiple files, edit the files on your local mac pnpm start ``` - Then navigate to [http://localhost:3005/](http://localhost:3005/). Whenever you make and save changes, you will see them reflected in the server. You can stop the running server in OSX/Linux by pressing `Ctrl-C` in the terminal. + Then navigate to [http://localhost:3005/](http://localhost:3005/). Whenever you make and save changes, you will see them reflected in the server. You can stop the running server in OSX/Linux by pressing `Ctrl-C` in the terminal. You can also build the docs locally and see the resulting changes. This is useful if you introduce changes that need to be run at build-time (e.g. adding a docs plug-in). To do so, run: @@ -93,28 +93,29 @@ To make complex changes or edit multiple files, edit the files on your local mac pnpm build pnpm serve ``` - - Then navigate to [http://localhost:3000/](http://localhost:3000/) to see your changes. You can stop the running server in OSX/Linux by pressing `Ctrl-C` in the terminal. + Then navigate to [http://localhost:3000/](http://localhost:3000/) to see your changes. You can stop the running server in OSX/Linux by pressing `Ctrl-C` in the terminal. 5. [Follow the GitHub workflow](https://docs.github.com/en/get-started/quickstart/contributing-to-projects/) to edit the files and create a pull request. 
- :::note - Before we accept any contributions, you'll need to sign the Contributor License Agreement (CLA). By signing a CLA, we can ensure that the community is free and confident in its ability to use your contributions. You will be prompted to sign the CLA while opening a pull request. - ::: + :::note + Before we accept any contributions, you'll need to sign the Contributor License Agreement (CLA). By signing a CLA, we can ensure that the community is free and confident in its ability to use your contributions. You will be prompted to sign the CLA while opening a pull request. + ::: -6. Assign `airbytehq/docs` as a Reviewer for your pull request. +6. Assign `airbytehq/docs` as a Reviewer for your pull request. ### Custom markdown extensions for connector docs -Airbyte's markdown documentation—particularly connector-specific documentation—needs to gracefully support multiple different contexts: key details may differ between open-source builds and Airbyte Cloud, and the more exhaustive explanations appropriate for https://docs.airbyte.com may bury key details when rendered as inline documentation within the Airbyte application. In order to support all these different contexts without resorting to multiple overlapping files that must be maintained in parallel, Airbyte's documentation tooling supports multiple nonstandard features. -Please familiarize yourself with all the tools available to you when writing documentation for a connector, so that you can provide appropriately tailored information to your readers in whichever context they see it. +Airbyte's markdown documentation—particularly connector-specific documentation—needs to gracefully support multiple different contexts: key details may differ between open-source builds and Airbyte Cloud, and the more exhaustive explanations appropriate for https://docs.airbyte.com may bury key details when rendered as inline documentation within the Airbyte application. In order to support all these different contexts without resorting to multiple overlapping files that must be maintained in parallel, Airbyte's documentation tooling supports multiple nonstandard features. + +Please familiarize yourself with all the tools available to you when writing documentation for a connector, so that you can provide appropriately tailored information to your readers in whichever context they see it. :::note As a general rule, features that introduce new behavior or prevent certain content from rendering will affect how the Airbyte UI displays markdown content, but have no impact on https://docs.airbyte.com. If you want to test out these in-app features in [a local Airbyte build](https://docs.airbyte.com/contributing-to-airbyte/resources/developing-locally/#develop-on-airbyte-webapp), ensure that you have the `airbyte` git repository checked out to the same parent directory as the airbyte platform repository: if so, development builds will by default fetch connector documentation from your local filesystem, allowing you to freely edit their content and view the rendered output. ::: #### Select between mutually-exclusive content options with `` + Tabs are a built-in feature of Docusaurus, the tool we use to build `https://docs.airbyte.com`; please refer to [their documentation](https://docusaurus.io/docs/markdown-features/tabs) for their options and behavior in this context. 
For better site-agnostic documentation, and because we like the feature, we maintain a separate `Tabs` implementation with limited, one-way API compatibility: all usage options we document should behave the same in-app and on `https://docs.airbyte.com`. If you find a discrepancy or breakage, we would appreciate if you [report it as a bug](https://github.com/airbytehq/airbyte/issues/new?assignees=&labels=type%2Fenhancement%2Carea%2Fdocumentation+needs-triage&projects=&template=8-documentation.yaml)! The reverse is not necessarily true, however: Docusaurus supports many use cases besides ours, so supporting its every usage pattern is a deliberate non-goal. :::info @@ -171,9 +172,10 @@ When configuring this hypothetical connector using OAuth authentication, you sho - You should also avoid indenting `TabItem` tags and their content according to html conventions, since text indented by four spaces (common for html nested inside two levels of tags) can be interpreted as a code block; different markdown rendering tools can handle this inconsistently. #### Jump to the relevant documentation section when specific connector setup inputs are focused with `` + In the documentation, the relevant section needs to be wrapped in a `` component. When a user focuses the field identified by the `field` attribute in the connector setup UI, the documentation pane will automatically scroll to the associated section of the documentation, highlighting all content contained inside the `` tag. These are rendered as regular divs in the documentation site, so they have no effect in places other than the in-app documentation panel—however, note that there must be blank lines between a custom tag like `FieldAnchor` the content it wraps for the documentation site to render markdown syntax inside the custom tag to html. -The `field` attribute must be a valid json path to one of the properties nested under `connectionSpecification.properties` in that connector's `spec.json` or `spec.yaml` file. For example, if the connector spec contains a `connectionSpecification.properties.replication_method.replication_slot`, you would mark the start of the related documentation section with `` and its end with ``. It's also possible to highlight the same section for multiple fields by separating them with commas, like ``. To mark a section as highlighted after the user picks an option from a `oneOf`: use a `field` prop like `path.to.field[value-of-selection-key]`, where the `value-of-selection-key` is the value of a `const` field nested inside that `oneOf`. For example, if the specification of the `oneOf` field is: +The `field` attribute must be a valid json path to one of the properties nested under `connectionSpecification.properties` in that connector's `spec.json` or `spec.yaml` file. For example, if the connector spec contains a `connectionSpecification.properties.replication_method.replication_slot`, you would mark the start of the related documentation section with `` and its end with ``. It's also possible to highlight the same section for multiple fields by separating them with commas, like ``. To mark a section as highlighted after the user picks an option from a `oneOf`: use a `field` prop like `path.to.field[value-of-selection-key]`, where the `value-of-selection-key` is the value of a `const` field nested inside that `oneOf`. 
For example, if the specification of the `oneOf` field is: ```json "replication_method": { @@ -217,7 +219,9 @@ Because of their close connection with the connector setup form fields, `` -Certain content is important to document, but unhelpful in the context of the Airbyte UI's inline documentation views: + +Certain content is important to document, but unhelpful in the context of the Airbyte UI's inline documentation views: + - background information that helps users understand a connector but doesn't affect configuration - edge cases that are unusual but time-consuming to solve - context for readers on the documentation site about environment-specific content (see [below](#environment-specific-in-app-content-with-magic-html-comments)) @@ -225,7 +229,9 @@ Certain content is important to document, but unhelpful in the context of the Ai Wrapping such content in a pair of `...` tags will prevent it from being rendered within the Airbyte UI without affecting its presentation on https://docs.airbyte.com. This allows a single markdown file to be the source of truth for both a streamlined in-app reference and a more thorough treatment on the documentation website. #### Environment-specific in-app content with magic html comments + Sometimes, there are connector setup instructions which differ between open-source Airbyte builds and Airbyte Cloud. Document both cases, but wrap each in a pair of special HTML comments: + ```md @@ -235,6 +241,7 @@ Sometimes, there are connector setup instructions which differ between open-sour Only open-source builds of the Airbyte UI will render this content. + @@ -245,13 +252,16 @@ Only open-source builds of the Airbyte UI will render this content. Only cloud builds of the Airbyte UI will render this content. + Content outside of the magic-comment-delimited blocks will be rendered everywhere. ``` + Note that the documentation site will render _all_ environment-specific content, so please introduce environment-specific variants with some documentation-site-only context (like the hidden subheadings in the example above) to disambiguate. #### Contextually-styled callouts with admonition blocks + We have added support for [Docusaurus' admonition syntax](https://docusaurus.io/docs/markdown-features/admonitions) to Airbyte's in-app markdown renderer. To make an admonition, wrap text with lines of three colons, with the first colons immediately followed (no space) by a tag specifying the callout's semantic styling, which will be one of `tip`, `warning`, `caution`, `danger`, `note`, or `info`. The syntax parallells a code block's, but with colons instead of backticks. @@ -340,14 +350,15 @@ Some **dangerous** content with _Markdown_ `syntax`. Back to ordinary markdown content. ``` + Eagle-eyed readers may note that _all_ markdown should support this feature since it's part of the html spec. However, it's worth special mention since these dropdowns have been styled to be a graceful visual fit within our rendered documentation in all environments. #### Documenting PyAirbyte usage PyAirbyte is a Python library that allows to run syncs within a Python script for a subset of connectors. Documentation around PyAirbyte connectors is automatically generated from the connector's JSON schema spec. 
There are a few approaches to combine full control over the documentation with automatic generation for common cases: -* If a connector is PyAirbyte enabled (`remoteRegistries.pypi.enabled` set in the `metadata.yaml` file of the connector) and there is no second-level heading `Usage with PyAirbyte` in the documentation, the documentation will be automatically generated and placed above the `Changelog` section. -* By manually specifying a `Usage with PyAirbyte` section, this automatism is disabled. The following is a good starting point for this section: +- If a connector is PyAirbyte enabled (`remoteRegistries.pypi.enabled` set in the `metadata.yaml` file of the connector) and there is no second-level heading `Usage with PyAirbyte` in the documentation, the documentation will be automatically generated and placed above the `Changelog` section. +- By manually specifying a `Usage with PyAirbyte` section, this automatism is disabled. The following is a good starting point for this section: ```md @@ -368,19 +379,19 @@ The `PyAirbyteExample` component will generate a code example that can be run wi - If you're updating a connector doc, follow the [Connector documentation template](https://hackmd.io/Bz75cgATSbm7DjrAqgl4rw) - If you're adding a new file, update the [sidebars.js file](https://github.com/airbytehq/airbyte/blob/master/docusaurus/sidebars.js) - If you're adding a README to a code module, make sure the README has the following components: - - A brief description of the module - - Development pre-requisites (like which language or binaries are required for development) - - How to install dependencies - - How to build and run the code locally & via Docker - - Any other information needed for local iteration + - A brief description of the module + - Development pre-requisites (like which language or binaries are required for development) + - How to install dependencies + - How to build and run the code locally & via Docker + - Any other information needed for local iteration -## Advanced tasks +## Advanced tasks ### Adding a redirect To add a redirect, open the [`docusaurus/redirects.yml`](https://github.com/airbytehq/airbyte/blob/master/docusaurus/redirects.yml) file and add an entry from which old path to which new path a redirect should happen. -:::note +:::note Your path **needs** a leading slash `/` to work ::: @@ -392,21 +403,21 @@ Only the Airbyte team and maintainers have permissions to deploy the documentati #### Automated documentation site deployment -When `docs/` folder gets changed in `master` branch of the repository, [`Deploy docs.airbyte.com` Github workflow](https://github.com/airbytehq/airbyte/actions/workflows/deploy-docs-site.yml) steps in, builds and deploys the documentation site. This process is automatic, takes five to ten minutes, and needs no human intervention. +When `docs/` folder gets changed in `master` branch of the repository, [`Deploy docs.airbyte.com` Github workflow](https://github.com/airbytehq/airbyte/actions/workflows/deploy-docs-site.yml) steps in, builds and deploys the documentation site. This process is automatic, takes five to ten minutes, and needs no human intervention. #### Manual documentation site deployment :::note -Manual deployment is reserved for emergency cases. Please, bear in mind that automatic deployment is triggered by changes to `docs/` folder, so it needs to be disabled to avoid interference with manual deployment. +Manual deployment is reserved for emergency cases. 
Please, bear in mind that automatic deployment is triggered by changes to `docs/` folder, so it needs to be disabled to avoid interference with manual deployment. ::: -You'll need a GitHub SSH key to deploy the documentation site using the [deployment tool](https://github.com/airbytehq/airbyte/blob/master/tools/bin/deploy_docusaurus). +You'll need a GitHub SSH key to deploy the documentation site using the [deployment tool](https://github.com/airbytehq/airbyte/blob/master/tools/bin/deploy_docusaurus). To deploy the documentation site, run: ```bash cd airbyte -# or cd airbyte-cloud +# or cd airbyte-cloud git checkout master git pull ./tools/bin/deploy_docusaurus @@ -421,14 +432,15 @@ git checkout ``` ### Adding a diagram + We have the docusaurus [Mermaid](https://mermaid.js.org/) plugin which has a variety of diagram types and syntaxes available. :::danger - The connector specific docs do **not** currently support this, only use this for general docs. +The connector specific docs do **not** currently support this, only use this for general docs. ::: -Here is an example from the [Mermaid docs](https://mermaid.js.org/syntax/entityRelationshipDiagram.html) +Here is an example from the [Mermaid docs](https://mermaid.js.org/syntax/entityRelationshipDiagram.html) you would add the following to your markdown wrapped in a code block. ```md @@ -441,7 +453,7 @@ you would add the following to your markdown wrapped in a code block. CUSTOMER }|..|{ DELIVERY-ADDRESS : uses ``` -which produces the following diagram +which produces the following diagram ```mermaid --- @@ -453,5 +465,5 @@ erDiagram CUSTOMER }|..|{ DELIVERY-ADDRESS : uses ``` -check out the rest of the Mermaid documentation for its capabilities just be aware that not all +check out the rest of the Mermaid documentation for its capabilities just be aware that not all the features are available to the docusaurus plugin. diff --git a/docs/deploying-airbyte/docker-compose.md b/docs/deploying-airbyte/docker-compose.md index c1b37fae45d..2af199b0d3b 100644 --- a/docs/deploying-airbyte/docker-compose.md +++ b/docs/deploying-airbyte/docker-compose.md @@ -66,6 +66,7 @@ bash run-ab-platform.sh - Start moving some data! ## Troubleshooting + If you have any questions about the local setup and deployment process, head over to our [Getting Started FAQ](https://github.com/airbytehq/airbyte/discussions/categories/questions) on our Airbyte Forum that answers the following questions and more: - How long does it take to set up Airbyte? @@ -73,4 +74,4 @@ If you have any questions about the local setup and deployment process, head ove - Can I set a start time for my sync? If you encounter any issues, check out [Getting Support](/community/getting-support) documentation -for options how to get in touch with the community or us. \ No newline at end of file +for options how to get in touch with the community or us. diff --git a/docs/deploying-airbyte/local-deployment.md b/docs/deploying-airbyte/local-deployment.md index 101281932c4..7c49f0edc27 100644 --- a/docs/deploying-airbyte/local-deployment.md +++ b/docs/deploying-airbyte/local-deployment.md @@ -2,7 +2,7 @@ :::warning This tool is in active development. Airbyte strives to provide high quality, reliable software, however there may be -bugs or usability issues with this command. If you find an issue with the `abctl` command, please report it as a github +bugs or usability issues with this command. 
If you find an issue with the `abctl` command, please report it as a github issue [here](https://github.com/airbytehq/airbyte/issues) with the type of "🐛 [abctl] Report an issue with the abctl tool". ::: @@ -10,7 +10,7 @@ issue [here](https://github.com/airbytehq/airbyte/issues) with the type of "🐛 :::info These instructions have been tested on MacOS, Windows, Ubuntu and Fedora. -This tool is intended to get Airbyte running as quickly as possible with no additional configuration necessary. +This tool is intended to get Airbyte running as quickly as possible with no additional configuration necessary. Additional configuration options may be added in the future, however, if you need additional configuration options now, use the docker compose solution by following the instructions for the `run_ab_platform.sh` script [here](/deploying-airbyte/docker-compose). @@ -23,12 +23,12 @@ Mac users can use Brew to install the `abctl` command ```bash brew tap airbytehq/tap -brew install abctl +brew install abctl ``` ::: -- Install `Docker Desktop` \(see [instructions](https://docs.docker.com/desktop/install/mac-install/)\). +- Install `Docker Desktop` \(see [instructions](https://docs.docker.com/desktop/install/mac-install/)\). - After `Docker Desktop` is installed, you must enable `Kubernetes` \(see [instructions](https://docs.docker.com/desktop/kubernetes/)\). - If you did not use Brew to install `abctl` then download the latest version of `abctl` from the [releases page](https://github.com/airbytehq/abctl/releases) and run the following command: @@ -38,7 +38,6 @@ users should be able to run the command from the terminal. Airbyte suggests mac ::: - ```bash ./abctl local install ``` @@ -57,6 +56,7 @@ ABCTL_LOCAL_INSTALL_USERNAME=bar - Start moving some data! ## Troubleshooting + If you have any questions about the local setup and deployment process, head over to our [Getting Started FAQ](https://github.com/airbytehq/airbyte/discussions/categories/questions) on our Airbyte Forum that answers the following questions and more: - How long does it take to set up Airbyte? diff --git a/docs/deploying-airbyte/on-aws-ec2.md b/docs/deploying-airbyte/on-aws-ec2.md index 3b352ce5901..0bdaf7a6e95 100644 --- a/docs/deploying-airbyte/on-aws-ec2.md +++ b/docs/deploying-airbyte/on-aws-ec2.md @@ -41,6 +41,7 @@ sudo usermod -a -G docker $USER sudo yum install -y docker-compose-plugin docker compose version ``` + If you encounter an error on this part, you might prefer to follow the documentation to [install the docker compose plugin manually](https://docs.docker.com/compose/install/linux/#install-the-plugin-manually) (_make sure to do it for all users_). 4. To close the SSH connection, run the following command in your SSH session on the instance terminal: diff --git a/docs/deploying-airbyte/on-aws-ecs.md b/docs/deploying-airbyte/on-aws-ecs.md index 8f41dd6fa33..8695604006a 100644 --- a/docs/deploying-airbyte/on-aws-ecs.md +++ b/docs/deploying-airbyte/on-aws-ecs.md @@ -6,7 +6,7 @@ We do not currently support deployment on ECS. ::: -The current iteration is not compatible with ECS. -Airbyte currently relies on docker containers being able to create other docker containers. -ECS does not permit containers to do this. We will be revising this strategy soon, +The current iteration is not compatible with ECS. +Airbyte currently relies on docker containers being able to create other docker containers. +ECS does not permit containers to do this. 
We will be revising this strategy soon, so that we can be compatible with ECS and other container services. diff --git a/docs/deploying-airbyte/on-cloud.md b/docs/deploying-airbyte/on-cloud.md index f40de9389b0..5d900b69f58 100644 --- a/docs/deploying-airbyte/on-cloud.md +++ b/docs/deploying-airbyte/on-cloud.md @@ -19,4 +19,3 @@ You will be provided 1000 credits to get your first few syncs going! ![](../.gitbook/assets/cloud_connection_onboarding.png) **4. You're done!** - diff --git a/docs/deploying-airbyte/on-kubernetes-via-helm.md b/docs/deploying-airbyte/on-kubernetes-via-helm.md index 8cf05d9bacd..6334d8f8de7 100644 --- a/docs/deploying-airbyte/on-kubernetes-via-helm.md +++ b/docs/deploying-airbyte/on-kubernetes-via-helm.md @@ -316,7 +316,7 @@ GCS logging was tested on [Airbyte Helm Chart Version 0.54.69](https://artifacth 2. **Create Service Account**: Click "Create Service Account", enter a name, description, and then click "Create". 3. **Grant Permissions**: Assign the role of "Storage Object Admin" to the service account by selecting it from the role list. 4. **Create Key**: After creating the service account, click on it, go to the "Keys" tab, and then click "Add Key" > "Create new key". Choose JSON as the key type and click "Create". The key file will be downloaded automatically to your computer. -5. **Encode Key**: Encode GCP credentials file contents using Base64. This key will be referenced as `` +5. **Encode Key**: Encode GCP credentials file contents using Base64. This key will be referenced as `` #### Update the values.yaml with the GCS Logging Information below @@ -333,15 +333,15 @@ global: type: "GCS" gcs: bucket: "" - credentials: "/secrets/gcs-log-creds/gcp.json" + credentials: "/secrets/gcs-log-creds/gcp.json" credentialsJson: "" ``` - Update the following Environment Variables in the worker section: + ``` worker: - + extraEnv: - name: STATE_STORAGE_GCS_BUCKET_NAME value: diff --git a/docs/deploying-airbyte/on-oci-vm.md b/docs/deploying-airbyte/on-oci-vm.md index c3056da6ed6..c775ffb2447 100644 --- a/docs/deploying-airbyte/on-oci-vm.md +++ b/docs/deploying-airbyte/on-oci-vm.md @@ -65,7 +65,7 @@ Download the Airbyte repository and deploy it on the VM: 2. Run the following command to get Airbyte running on your OCI VM instance using the installation script: ```bash - ./run-ab-platform.sh -b + ./run-ab-platform.sh -b ``` 3. Open up a Browser and visit port 8000 - [http://localhost:8000/](http://localhost:8000/) diff --git a/docs/deploying-airbyte/on-plural.md b/docs/deploying-airbyte/on-plural.md index 88a61fe12fe..5662544ee72 100644 --- a/docs/deploying-airbyte/on-plural.md +++ b/docs/deploying-airbyte/on-plural.md @@ -6,16 +6,17 @@ If you'd prefer to follow along with a video, check out the Plural Airbyte deplo ## Getting started -1. Create an account on [https://app.plural.sh](https://app.plural.sh). +1. Create an account on [https://app.plural.sh](https://app.plural.sh). 2. Install the Plural CLI by following steps 1, 2, and 3 of the instructions [here](https://docs.plural.sh/getting-started). Through this, you will also configure your cloud provider and the domain name under which your application will be deployed to. We now need a Git repository to store your Plural configuration in. This will also contain the Helm and Terraform files that Plural will autogenerate for you. You have two options: + - Run `plural init` in any directory to let Plural initiate an OAuth workflow to create a Git repo for you. 
- Create a Git repo manually, clone it, and run `plural init` inside it. -Running `plural init` will configure your installation and cloud provider for the repo. +Running `plural init` will configure your installation and cloud provider for the repo. ## Installing Airbyte @@ -50,7 +51,7 @@ plural deploy --commit "deploying airbyte" ## Adding the Plural Console -To make management of your installation as simple as possible, we recommend installing the Plural Console. The console provides tools to manage resource scaling, receiving automated upgrades, dashboards tailored to your Airbyte installation, and log aggregation. Run: +To make management of your installation as simple as possible, we recommend installing the Plural Console. The console provides tools to manage resource scaling, receiving automated upgrades, dashboards tailored to your Airbyte installation, and log aggregation. Run: ```bash plural bundle install console console-aws diff --git a/docs/deploying-airbyte/on-restack.md b/docs/deploying-airbyte/on-restack.md index fbb3f11f26d..3bf4d0800e3 100644 --- a/docs/deploying-airbyte/on-restack.md +++ b/docs/deploying-airbyte/on-restack.md @@ -4,12 +4,12 @@ To deploy Airbyte with Restack: - - [Sign up for a Restack account](#sign-up-for-a-restack-account). - - [Add AWS credentials with AdministratorAccess](#add-aws-credentials-with-administratoraccess). - - [One-click cluster creation with Restack](#one-click-cluster-creation-with-restack). - - [Deploy Airbyte on Restack](#deploy-airbyte-on-restack). - - [Start using Airbyte](#start-using-airbyte). - - [Deploy multiple instances of Airbyte](#deploy-multiple-instances-of-airbyte). +- [Sign up for a Restack account](#sign-up-for-a-restack-account). +- [Add AWS credentials with AdministratorAccess](#add-aws-credentials-with-administratoraccess). +- [One-click cluster creation with Restack](#one-click-cluster-creation-with-restack). +- [Deploy Airbyte on Restack](#deploy-airbyte-on-restack). +- [Start using Airbyte](#start-using-airbyte). +- [Deploy multiple instances of Airbyte](#deploy-multiple-instances-of-airbyte). ## Sign up for a Restack account @@ -18,9 +18,9 @@ If you already have an account, login to Restack at [www.restack.io/login](https ## Add AWS credentials with AdministratorAccess -To deploy Airbyte in your own AWS infrastructure with Restack, you will need to add your credentials as the next step. +To deploy Airbyte in your own AWS infrastructure with Restack, you will need to add your credentials as the next step. -Make sure that this account has *AdministratorAccess*. This is how Restack can ensure an end-to-end cluster creation and cluster management process. +Make sure that this account has _AdministratorAccess_. This is how Restack can ensure an end-to-end cluster creation and cluster management process. 1. Navigate to **Clusters** in the left-hand navigation menu. 2. Select the **Credentials** tab. @@ -32,21 +32,22 @@ Make sure that this account has *AdministratorAccess*. This is how Restack can e ## One-click cluster creation with Restack :::tip -Running your application on a Kubernetes cluster lets you deploy, scale and monitor the application reliably. +Running your application on a Kubernetes cluster lets you deploy, scale and monitor the application reliably. ::: -Once you have added your credentials: +Once you have added your credentials: + 1. Navigate to the **Clusters** tab on the same page and click on **Create cluster**. 2. Give a suitable name to your cluster. 3. 
Select the region you want to deploy the cluster in. 4. Select the AWS credentials you added in the previous step. -The cluster creation process will start automatically. Once the cluster is ready, you will get an email on the email id connected with your account. +The cluster creation process will start automatically. Once the cluster is ready, you will get an email on the email id connected with your account. Creating a cluster is a one-time process. From here you can add other open source tools or multiple instances of Airbyte in the same cluster. -Any application you deploy in your cluster will be accessible via a free **restack domain**. -Contact the Restack team via chat to set a custom domain for your Airbyte instances. +Any application you deploy in your cluster will be accessible via a free **restack domain**. +Contact the Restack team via chat to set a custom domain for your Airbyte instances. ## Deploy Airbyte on Restack @@ -57,10 +58,10 @@ Contact the Restack team via chat to set a custom domain for your Airbyte instan ## Start using Airbyte -Airbyte will be deployed on your cluster and you can access it using the link under the *URL* tab. +Airbyte will be deployed on your cluster and you can access it using the link under the _URL_ tab. You can also check the workloads and volumes that are deployed within Airbyte. ## Deploy multiple instances of Airbyte -Restack makes it easier to deploy multiple instances of Airbyte on the same or multiple clusters. -
    So you can test the latest version before upgrading or have a dedicated instance for development and for production. \ No newline at end of file +Restack makes it easier to deploy multiple instances of Airbyte on the same or multiple clusters. +
    So you can test the latest version before upgrading or have a dedicated instance for development and for production. diff --git a/docs/developer-guides/licenses/README.md b/docs/developer-guides/licenses/README.md index 6601c9e166e..18a86671784 100644 --- a/docs/developer-guides/licenses/README.md +++ b/docs/developer-guides/licenses/README.md @@ -11,8 +11,6 @@ The license for a particular work is defined with following prioritized rules: If you have any questions regarding licenses, just visit our [FAQ](license-faq.md) or [contact us](mailto:license@airbyte.io). -If you want to see a list of examples supported by ELv2, and not, to have a better understanding whether you should be concerned or not, check the [examples](examples.md). +If you want to see a list of examples of what is and is not supported by ELv2, to better understand whether you should be concerned, check the [examples](examples.md). **TL;DR:** Unless you want to host Airbyte yourself and sell it as an ELT/ETL tool, or to sell a product that directly exposes Airbyte’s UI or API, you should be good to go! - - diff --git a/docs/developer-guides/licenses/elv2-license.md b/docs/developer-guides/licenses/elv2-license.md index 2986bc13962..e8e87a48dca 100644 --- a/docs/developer-guides/licenses/elv2-license.md +++ b/docs/developer-guides/licenses/elv2-license.md @@ -35,4 +35,3 @@ _your licenses_ are all the licenses granted to you for the software under these _use_ means anything you do with the software requiring one of your licenses. _trademark_ means trademarks, service marks, and similar rights. - diff --git a/docs/developer-guides/licenses/examples.md b/docs/developer-guides/licenses/examples.md index 0a160a520db..ee3eae37940 100644 --- a/docs/developer-guides/licenses/examples.md +++ b/docs/developer-guides/licenses/examples.md @@ -1,7 +1,7 @@ # Examples -We chose ELv2 because it is very permissive with what you can do with the software. -We are still being asked whether one's project are concerned by the ELv2 license. So we decided to list some projects to make this very explicit. +We chose ELv2 because it is very permissive with what you can do with the software. +We are still being asked whether one's project is concerned by the ELv2 license. So we decided to list some projects to make this very explicit. Don't hesitate to ask us about this or to do a pull request to add your project here. If we merge it, it means you're good to go. @@ -9,11 +9,11 @@ Let's start with the list of projects that falls under ELv2 and for which you ca ## Examples of projects that can't leverage the technology under ELv2 without a contract -* Hosting Airbyte yourself and selling it as an ELT/ETL tool. That means selling a competitive alternative to Airbyte Cloud or the future Airbyte Enterprise. -* Selling a product that directly exposes Airbyte’s UI or API. +- Hosting Airbyte yourself and selling it as an ELT/ETL tool. That means selling a competitive alternative to Airbyte Cloud or the future Airbyte Enterprise. +- Selling a product that directly exposes Airbyte’s UI or API. ## Examples of projects for which you can leverage Airbyte fully -* Creating an analytics or attribution platform for which you want to use Airbyte to bring data in on behalf of your customers. -* Creating any kind of platform on which you offer Airbyte's connectors to your customers to bring their data in, unless you're selling some ELT / ETL solution. -* Building your internal data stack and configuring pipelines through Airbyte's UI or API. 
+- Creating an analytics or attribution platform for which you want to use Airbyte to bring data in on behalf of your customers. +- Creating any kind of platform on which you offer Airbyte's connectors to your customers to bring their data in, unless you're selling some ELT / ETL solution. +- Building your internal data stack and configuring pipelines through Airbyte's UI or API. diff --git a/docs/developer-guides/licenses/mit-license.md b/docs/developer-guides/licenses/mit-license.md index c9cef864ea5..9f40137a05a 100644 --- a/docs/developer-guides/licenses/mit-license.md +++ b/docs/developer-guides/licenses/mit-license.md @@ -9,4 +9,3 @@ Permission is hereby granted, free of charge, to any person obtaining a copy of The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software. THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE. - diff --git a/docs/enterprise-setup/README.md b/docs/enterprise-setup/README.md index c46542f240b..7b29afe4fd5 100644 --- a/docs/enterprise-setup/README.md +++ b/docs/enterprise-setup/README.md @@ -6,16 +6,16 @@ products: oss-enterprise [Airbyte Self-Managed Enterprise](https://airbyte.com/product/airbyte-enterprise) is the best way to run Airbyte yourself. You get all 300+ pre-built connectors, data never leaves your environment, and Airbyte becomes self-serve in your organization with new tools to manage multiple users, and multiple teams using Airbyte all in one place. -A valid license key is required to get started with Airbyte Self-Managed Enterprise. [Talk to sales](https://airbyte.com/company/talk-to-sales) to receive your license key. +A valid license key is required to get started with Airbyte Self-Managed Enterprise. [Talk to sales](https://airbyte.com/company/talk-to-sales) to receive your license key. The following pages outline how to: + 1. [Deploy Airbyte Enterprise using Kubernetes](./implementation-guide.md) 2. [Configure Okta for Single Sign-On (SSO) with Airbyte Self-Managed Self-Managed Enterprise](/access-management/sso.md) -| Feature | Description | -|---------------------------|--------------------------------------------------------------------------------------------------------------| +| Feature | Description | +| ------------------------- | ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------- | | Premium Support | [Priority assistance](https://docs.airbyte.com/operator-guides/contact-support/#airbyte-enterprise-self-hosted-support) with deploying, managing and upgrading Airbyte or troubleshooting any connection issues. 
| -| User Management | [Okta SSO](/access-management/sso.md) to extend each Airbyte workspace to multiple users | -| Multiple Workspaces | Ability to create + manage multiple workspaces on one Airbyte instance | -| Role-Based Access Control | Isolate workspaces from one another with users roles scoped to individual workspaces | - +| User Management | [Okta SSO](/access-management/sso.md) to extend each Airbyte workspace to multiple users | +| Multiple Workspaces | Ability to create + manage multiple workspaces on one Airbyte instance | +| Role-Based Access Control | Isolate workspaces from one another with users roles scoped to individual workspaces | diff --git a/docs/enterprise-setup/api-access-config.md b/docs/enterprise-setup/api-access-config.md index 16f71fc20cc..e213de08706 100644 --- a/docs/enterprise-setup/api-access-config.md +++ b/docs/enterprise-setup/api-access-config.md @@ -25,10 +25,7 @@ POST /api/v1/applications/token Ensure the following JSON Body is attached to the request, populated with your `client_id` and `client_secret`: ```yaml -{ - "client_id" : "", - "client_secret": "" -} +{ "client_id": "", "client_secret": "" } ``` The API response should provide an `access_token` which is a Bearer Token valid for 60 minutes that can be used to make requests to the API. Once your `access_token` expires, you may make a new request to the `applications/token` endpoint to get a new token. @@ -45,20 +42,19 @@ Expect a response like the following: ```json { - "data": [ - { - "workspaceId": "b5367aab-9d68-4fea-800f-0000000000", - "name": "Finance Team", - "dataResidency": "auto" - }, - { - "workspaceId": "b5367aab-9d68-4fea-800f-0000000001", - "name": "Analytics Team", - "dataResidency": "auto" - }, - ] + "data": [ + { + "workspaceId": "b5367aab-9d68-4fea-800f-0000000000", + "name": "Finance Team", + "dataResidency": "auto" + }, + { + "workspaceId": "b5367aab-9d68-4fea-800f-0000000001", + "name": "Analytics Team", + "dataResidency": "auto" + } + ] } ``` To go further, you may use our [Python](https://github.com/airbytehq/airbyte-api-python-sdk) and [Java](https://github.com/airbytehq/airbyte-api-java-sdk) SDKs to make API requests directly in code, or our [Terraform Provider](https://registry.terraform.io/providers/airbytehq/airbyte/latest) (which uses the Airbyte API) to declare your Airbyte configuration as infrastructure. - diff --git a/docs/enterprise-setup/implementation-guide.md b/docs/enterprise-setup/implementation-guide.md index 377a282bcba..43207eeb0a3 100644 --- a/docs/enterprise-setup/implementation-guide.md +++ b/docs/enterprise-setup/implementation-guide.md @@ -21,16 +21,16 @@ For a production-ready deployment of Self-Managed Enterprise, various infrastruc Prior to deploying Self-Managed Enterprise, we recommend having each of the following infrastructure components ready to go. When possible, it's easiest to have all components running in the same [VPC](https://docs.aws.amazon.com/eks/latest/userguide/network_reqs.html). 
The provided recommendations are for customers deploying to AWS: -| Component | Recommendation | -|--------------------------|-----------------------------------------------------------------------------| +| Component | Recommendation | +| ------------------------ | ------------------------------------------------------------------------------------------------------------------------------------------------------------------------- | | Kubernetes Cluster | Amazon EKS cluster running in [2 or more availability zones](https://docs.aws.amazon.com/eks/latest/userguide/disaster-recovery-resiliency.html) on a minimum of 6 nodes. | -| Ingress | [Amazon ALB](#configuring-ingress) and a URL for users to access the Airbyte UI or make API requests. | -| Object Storage | [Amazon S3 bucket](#configuring-external-logging) with two directories for log and state storage. | -| Dedicated Database | [Amazon RDS Postgres](#configuring-the-airbyte-database) with at least one read replica. | -| External Secrets Manager | [Amazon Secrets Manager](/operator-guides/configuring-airbyte#secrets) for storing connector secrets. | - +| Ingress | [Amazon ALB](#configuring-ingress) and a URL for users to access the Airbyte UI or make API requests. | +| Object Storage | [Amazon S3 bucket](#configuring-external-logging) with two directories for log and state storage. | +| Dedicated Database | [Amazon RDS Postgres](#configuring-the-airbyte-database) with at least one read replica. | +| External Secrets Manager | [Amazon Secrets Manager](/operator-guides/configuring-airbyte#secrets) for storing connector secrets. | We require you to install and configure the following Kubernetes tooling: + 1. Install `helm` by following [these instructions](https://helm.sh/docs/intro/install/) 2. Install `kubectl` by following [these instructions](https://kubernetes.io/docs/tasks/tools/). 3. Configure `kubectl` to connect to your cluster by using `kubectl use-context my-cluster-name`: @@ -49,7 +49,7 @@ We require you to install and configure the following Kubernetes tooling: - + 1. Configure `gcloud` with `gcloud auth login`. 2. On the Google Cloud Console, the cluster page will have a "Connect" button, with a command to run locally: `gcloud container clusters get-credentials $CLUSTER_NAME --zone $ZONE_NAME --project $PROJECT_NAME`. @@ -90,7 +90,7 @@ metadata: name: airbyte-config-secrets type: Opaque stringData: -## Storage Secrets + ## Storage Secrets # S3 s3-access-key-id: ## e.g. AKIAIOSFODNN7EXAMPLE s3-secret-access-key: ## e.g. 
wJalrXUtnFEMI/K7MDENG/bPxRfiCYEXAMPLEKEY @@ -111,35 +111,33 @@ Ensure your access key is tied to an IAM user with the [following policies](http ```yaml { - "Version":"2012-10-17", - "Statement":[ + "Version": "2012-10-17", + "Statement": + [ + { "Effect": "Allow", "Action": "s3:ListAllMyBuckets", "Resource": "*" }, { - "Effect":"Allow", - "Action": "s3:ListAllMyBuckets", - "Resource":"*" + "Effect": "Allow", + "Action": ["s3:ListBucket", "s3:GetBucketLocation"], + "Resource": "arn:aws:s3:::YOUR-S3-BUCKET-NAME", }, { - "Effect":"Allow", - "Action":["s3:ListBucket","s3:GetBucketLocation"], - "Resource":"arn:aws:s3:::YOUR-S3-BUCKET-NAME" - }, - { - "Effect":"Allow", - "Action":[ + "Effect": "Allow", + "Action": + [ "s3:PutObject", "s3:PutObjectAcl", "s3:GetObject", "s3:GetObjectAcl", - "s3:DeleteObject" - ], - "Resource":"arn:aws:s3:::YOUR-S3-BUCKET-NAME/*" - } - ] + "s3:DeleteObject", + ], + "Resource": "arn:aws:s3:::YOUR-S3-BUCKET-NAME/*", + }, + ], } ``` - + First, create a new file `gcp.json` containing the credentials JSON blob for the service account you are looking to assume. @@ -163,10 +161,9 @@ kubectl create secret generic gcp-cred-secrets --from-file=gcp.json --namespace
    - #### External Connector Secret Management -Airbyte's default behavior is to store encrypted connector secrets on your cluster as Kubernetes secrets. You may opt to instead store connector secrets in an external secret manager of your choosing (AWS Secrets Manager, Google Secrets Manager or Hashicorp Vault). +Airbyte's default behavior is to store encrypted connector secrets on your cluster as Kubernetes secrets. You may opt to instead store connector secrets in an external secret manager of your choosing (AWS Secrets Manager, Google Secrets Manager or Hashicorp Vault).
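Before switching to an external manager, it can help to confirm where those secrets currently live; a minimal check, assuming Airbyte is installed in the `airbyte` namespace:

```bash
# Lists the Kubernetes secrets in the namespace, including the ones Airbyte
# creates for connector configurations by default. The "airbyte" namespace is
# an assumption; adjust it to match your installation.
kubectl get secrets --namespace airbyte
```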
    Secrets for External Connector Secret Management @@ -197,12 +194,12 @@ kubectl create secret generic airbyte-config-secrets \
    - ## Installation Steps ### Step 1: Add Airbyte Helm Repository Follow these instructions to add the Airbyte helm repository: + 1. Run `helm repo add airbyte https://airbytehq.github.io/helm-charts`, where `airbyte` is the name of the repository that will be indexed locally. 2. Perform the repo indexing process, and ensure your helm repository is up-to-date by running `helm repo update`. 3. You can then browse all charts uploaded to your repository by running `helm search repo airbyte`. @@ -220,9 +217,9 @@ Follow these instructions to add the Airbyte helm repository: webapp-url: # example: http://localhost:8080 initial-user: - email: - first-name: - last-name: + email: + first-name: + last-name: username: # your existing Airbyte instance username password: # your existing Airbyte instance password @@ -235,7 +232,7 @@ license-key: # license key provided by Airbyte team 4. Add your Airbyte Self-Managed Enterprise license key to your `airbyte.yml` in the `license-key` field. -5. To enable SSO authentication, add [SSO auth details](/access-management/sso) to your `airbyte.yml` file. +5. To enable SSO authentication, add [SSO auth details](/access-management/sso) to your `airbyte.yml` file.
    Configuring auth in your airbyte.yml file @@ -245,13 +242,13 @@ license-key: # license key provided by Airbyte team To configure SSO with Okta, add the following at the end of your `airbyte.yml` file: ```yaml -auth: - identity-providers: - - type: okta - domain: $OKTA_DOMAIN - app-name: $OKTA_APP_INTEGRATION_NAME - client-id: $OKTA_CLIENT_ID - client-secret: $OKTA_CLIENT_SECRET +auth: + identity-providers: + - type: okta + domain: $OKTA_DOMAIN + app-name: $OKTA_APP_INTEGRATION_NAME + client-id: $OKTA_CLIENT_ID + client-secret: $OKTA_CLIENT_SECRET ``` See the [following guide](/access-management/sso-providers/okta) on how to collect this information for Okta. @@ -262,13 +259,13 @@ See the [following guide](/access-management/sso-providers/okta) on how to colle To configure SSO with any identity provider via [OpenID Connect (OIDC)](https://openid.net/developers/how-connect-works/), such as Azure Entra ID (formerly ActiveDirectory), add the following at the end of your `values.yml` file: ```yaml -auth: - identity-providers: - - type: oidc - domain: $DOMAIN - app-name: $APP_INTEGRATION_NAME - client-id: $CLIENT_ID - client-secret: $CLIENT_SECRET +auth: + identity-providers: + - type: oidc + domain: $DOMAIN + app-name: $APP_INTEGRATION_NAME + client-id: $CLIENT_ID + client-secret: $CLIENT_SECRET ``` See the [following guide](/access-management/sso-providers/azure-entra-id) on how to collect this information for Azure Entra ID (formerly ActiveDirectory). @@ -293,7 +290,7 @@ global: edition: enterprise ``` -3. The following subsections help you customize your deployment to use an external database, log storage, dedicated ingress, and more. To skip this and deploy a minimal, local version of Self-Managed Enterprise, [jump to Step 4](#step-4-deploy-self-managed-enterprise). +3. The following subsections help you customize your deployment to use an external database, log storage, dedicated ingress, and more. To skip this and deploy a minimal, local version of Self-Managed Enterprise, [jump to Step 4](#step-4-deploy-self-managed-enterprise). #### Configuring the Airbyte Database @@ -308,16 +305,16 @@ We assume in the following that you've already configured a Postgres instance: ```yaml postgresql: - enabled: false + enabled: false -externalDatabase: - host: ## Database host - user: ## Non-root username for the Airbyte database - database: db-airbyte ## Database name - port: 5432 ## Database port number +externalDatabase: + host: ## Database host + user: ## Non-root username for the Airbyte database + database: db-airbyte ## Database name + port: 5432 ## Database port number ``` -2. For the non-root user's password which has database access, you may use `password`, `existingSecret` or `jdbcUrl`. We recommend using `existingSecret`, or injecting sensitive fields from your own external secret store. Each of these parameters is mutually exclusive: +2. For the non-root user's password which has database access, you may use `password`, `existingSecret` or `jdbcUrl`. We recommend using `existingSecret`, or injecting sensitive fields from your own external secret store. 
Each of these parameters is mutually exclusive: ```yaml postgresql: @@ -407,33 +404,33 @@ metadata: spec: ingressClassName: nginx rules: - - host: # host, example: enterprise-demo.airbyte.com - http: - paths: - - backend: - service: - # format is ${RELEASE_NAME}-airbyte-webapp-svc - name: airbyte-enterprise-airbyte-webapp-svc - port: - number: 80 # service port, example: 8080 - path: / - pathType: Prefix - - backend: - service: - # format is ${RELEASE_NAME}-airbyte-keycloak-svc - name: airbyte-enterprise-airbyte-keycloak-svc - port: - number: 8180 - path: /auth - pathType: Prefix - - backend: - service: - # format is ${RELEASE_NAME}-airbyte--server-svc - name: airbyte-enterprise-airbyte-server-svc - port: - number: 8001 - path: /api/public - pathType: Prefix + - host: # host, example: enterprise-demo.airbyte.com + http: + paths: + - backend: + service: + # format is ${RELEASE_NAME}-airbyte-webapp-svc + name: airbyte-enterprise-airbyte-webapp-svc + port: + number: 80 # service port, example: 8080 + path: / + pathType: Prefix + - backend: + service: + # format is ${RELEASE_NAME}-airbyte-keycloak-svc + name: airbyte-enterprise-airbyte-keycloak-svc + port: + number: 8180 + path: /auth + pathType: Prefix + - backend: + service: + # format is ${RELEASE_NAME}-airbyte--server-svc + name: airbyte-enterprise-airbyte-server-svc + port: + number: 8001 + path: /api/public + pathType: Prefix ``` @@ -462,31 +459,31 @@ metadata: # alb.ingress.kubernetes.io/security-groups: spec: rules: - - host: # e.g. enterprise-demo.airbyte.com - http: - paths: - - backend: - service: - name: airbyte-enterprise-airbyte-webapp-svc - port: - number: 80 - path: / - pathType: Prefix - - backend: - service: - name: airbyte-enterprise-airbyte-keycloak-svc - port: - number: 8180 - path: /auth - pathType: Prefix - - backend: - service: - # format is ${RELEASE_NAME}-airbyte-server-svc - name: airbyte-enterprise-airbyte-server-svc - port: - number: 8001 - path: /api/public - pathType: Prefix + - host: # e.g. enterprise-demo.airbyte.com + http: + paths: + - backend: + service: + name: airbyte-enterprise-airbyte-webapp-svc + port: + number: 80 + path: / + pathType: Prefix + - backend: + service: + name: airbyte-enterprise-airbyte-keycloak-svc + port: + number: 8180 + path: /auth + pathType: Prefix + - backend: + service: + # format is ${RELEASE_NAME}-airbyte-server-svc + name: airbyte-enterprise-airbyte-server-svc + port: + number: 8001 + path: /api/public + pathType: Prefix ``` The ALB controller will use a `ServiceAccount` that requires the [following IAM policy](https://raw.githubusercontent.com/kubernetes-sigs/aws-load-balancer-controller/main/docs/install/iam_policy.json) to be attached. @@ -527,7 +524,7 @@ secretsManager: kms: ## Optional - ARN for KMS Decryption. ``` -Set `authenticationType` to `instanceProfile` if the compute infrastructure running Airbyte has pre-existing permissions (e.g. IAM role) to read and write from AWS Secrets Manager. +Set `authenticationType` to `instanceProfile` if the compute infrastructure running Airbyte has pre-existing permissions (e.g. IAM role) to read and write from AWS Secrets Manager. To decrypt secrets in the secret manager with AWS KMS, configure the `kms` field, and ensure your Kubernetes cluster has pre-existing permissions to read and decrypt secrets. @@ -588,7 +585,7 @@ In order to customize your deployment, you need to create an additional `values. 
After specifying your own configuration, run the following command: ```sh -helm upgrade \ +helm upgrade \ --namespace airbyte \ --values path/to/values.yaml --values ./values.yml \ diff --git a/docs/enterprise-setup/upgrading-from-community.md b/docs/enterprise-setup/upgrading-from-community.md index 15217913cc1..300834a74b3 100644 --- a/docs/enterprise-setup/upgrading-from-community.md +++ b/docs/enterprise-setup/upgrading-from-community.md @@ -4,14 +4,15 @@ products: oss-enterprise # Existing Instance Upgrades -This page supplements the [Self-Managed Enterprise implementation guide](./implementation-guide.md). It highlights the steps to take if you are currently using Airbyte Self-Managed Community, our free open source offering, and are ready to upgrade to [Airbyte Self-Managed Enterprise](./README.md). +This page supplements the [Self-Managed Enterprise implementation guide](./implementation-guide.md). It highlights the steps to take if you are currently using Airbyte Self-Managed Community, our free open source offering, and are ready to upgrade to [Airbyte Self-Managed Enterprise](./README.md). -A valid license key is required to get started with Airbyte Enterprise. [Talk to sales](https://airbyte.com/company/talk-to-sales) to receive your license key. +A valid license key is required to get started with Airbyte Enterprise. [Talk to sales](https://airbyte.com/company/talk-to-sales) to receive your license key. These instructions are for you if: -* You want your Self-Managed Enterprise instance to inherit state from your existing deployment. -* You are currently deploying Airbyte on Kubernetes. -* You are comfortable with an in-place upgrade. This guide does not dual-write to a new Airbyte deployment. + +- You want your Self-Managed Enterprise instance to inherit state from your existing deployment. +- You are currently deploying Airbyte on Kubernetes. +- You are comfortable with an in-place upgrade. This guide does not dual-write to a new Airbyte deployment. ### Step 1: Update Airbyte Open Source @@ -35,21 +36,21 @@ At this step, please create and fill out the `airbyte.yml` as explained in the [ webapp-url: # example: localhost:8080 initial-user: - email: - first-name: - last-name: + email: + first-name: + last-name: username: # your existing Airbyte instance username password: # your existing Airbyte instance password -license-key: +license-key: auth: identity-providers: - type: okta - domain: - app-name: - client-id: - client-secret: + domain: + app-name: + client-id: + client-secret: ```
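If you are unsure what to use for `[RELEASE_NAME]` in the upgrade command that follows, you can look up the existing release first; a small sketch, assuming your current deployment lives in the `airbyte` namespace:

```bash
# Shows existing Helm releases; the NAME column is the [RELEASE_NAME] referenced
# below. The "airbyte" namespace is an assumption; adjust it as needed.
helm list --namespace airbyte
```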
    @@ -62,7 +63,7 @@ auth: helm upgrade [RELEASE_NAME] airbyte/airbyte \ --version [RELEASE_VERSION] \ --set-file airbyteYml=./configs/airbyte.yml \ ---values ./charts/airbyte/airbyte-pro-values.yaml [... additional --values] +--values ./charts/airbyte/airbyte-pro-values.yaml [... additional --values] ``` 2. Once this is complete, you will need to upgrade your ingress to include the new `/auth` path. The following is a skimmed down definition of an ingress resource you could use for Self-Managed Enterprise: @@ -79,27 +80,27 @@ metadata: ingress.kubernetes.io/ssl-redirect: "false" spec: rules: - - host: # host, example: enterprise-demo.airbyte.com - http: - paths: - - backend: - service: - # format is ${RELEASE_NAME}-airbyte-webapp-svc - name: airbyte-pro-airbyte-webapp-svc - port: - number: # service port, example: 8080 - path: / - pathType: Prefix - - backend: - service: - # format is ${RELEASE_NAME}-airbyte-keycloak-svc - name: airbyte-pro-airbyte-keycloak-svc - port: - number: # service port, example: 8180 - path: /auth - pathType: Prefix + - host: # host, example: enterprise-demo.airbyte.com + http: + paths: + - backend: + service: + # format is ${RELEASE_NAME}-airbyte-webapp-svc + name: airbyte-pro-airbyte-webapp-svc + port: + number: # service port, example: 8080 + path: / + pathType: Prefix + - backend: + service: + # format is ${RELEASE_NAME}-airbyte-keycloak-svc + name: airbyte-pro-airbyte-keycloak-svc + port: + number: # service port, example: 8180 + path: /auth + pathType: Prefix ``` -All set! When you log in, you should expect all connections, sources and destinations to be present, and configured as prior. \ No newline at end of file +All set! When you log in, you should expect all connections, sources and destinations to be present, and configured as prior. diff --git a/docs/integrations/custom-connectors.md b/docs/integrations/custom-connectors.md index aef3132e0be..d4311984dbc 100644 --- a/docs/integrations/custom-connectors.md +++ b/docs/integrations/custom-connectors.md @@ -38,7 +38,7 @@ Once this is filled, you will see your connector in the UI and your team will be Note that this new connector could just be an updated version of an existing connector that you adapted to your specific edge case. Anything is possible! -When using Airbyte on Kubernetes, the repository name must be a valid Kubernetes name. That is, it must consist of lower case alphanumeric characters or '-', and must start and end with an alphanumeric character (e.g. 'my-name', or '123-abc'). Other names will work locally on Docker but cause an error on Kubernetes (Internal Server Error: Get Spec job failed). +When using Airbyte on Kubernetes, the repository name must be a valid Kubernetes name. That is, it must consist of lower case alphanumeric characters or '-', and must start and end with an alphanumeric character (e.g. 'my-name', or '123-abc'). Other names will work locally on Docker but cause an error on Kubernetes (Internal Server Error: Get Spec job failed). ## Upgrading a connector @@ -47,4 +47,3 @@ To upgrade your connector version, go to the admin panel in the left hand side o ![](../.gitbook/assets/upgrading_connector_admin_panel.png) To browse the available connector versions, simply click on the relevant link in the `Image` column to navigate to the connector's DockerHub page. From there, simply click on the `Tags` section in the top bar. 
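As a quick sanity check for the Kubernetes repository-name rule described earlier on this page, you can test a name against the same pattern before adding the connector; a rough sketch (the regex only mirrors the rule as written here, it is not an official validator):

```bash
# Checks the "lower case alphanumerics or '-', starting and ending with an
# alphanumeric character" rule described above. Illustrative only.
name="my-custom-source"
if [[ "$name" =~ ^[a-z0-9]([a-z0-9-]*[a-z0-9])?$ ]]; then
  echo "'$name' looks like a valid Kubernetes name"
else
  echo "'$name' is not a valid Kubernetes name"
fi
```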
- diff --git a/docs/integrations/destinations/README.md b/docs/integrations/destinations/README.md index 84df3b620ea..c480f56099b 100644 --- a/docs/integrations/destinations/README.md +++ b/docs/integrations/destinations/README.md @@ -6,7 +6,6 @@ A destination is a data warehouse, data lake, database, or an analytics tool whe Read more about our [Connector Support Levels](/integrations/connector-support-levels) to understand what to expect from a connector. - ## Destinations diff --git a/docs/integrations/destinations/astra.md b/docs/integrations/destinations/astra.md index 97cd8d30d95..46c5c2b58b0 100644 --- a/docs/integrations/destinations/astra.md +++ b/docs/integrations/destinations/astra.md @@ -15,11 +15,11 @@ This page contains the setup guide and reference information for the destination - Click Create Database. - In the Create Database dialog, select the Serverless (Vector) deployment type. - In the Configuration section, enter a name for the new database in the Database name field. --- Because database names can’t be changed later, it’s best to name your database something meaningful. Database names must start and end with an alphanumeric character, and may contain the following special characters: & + - _ ( ) < > . , @. + -- Because database names can’t be changed later, it’s best to name your database something meaningful. Database names must start and end with an alphanumeric character, and may contain the following special characters: & + - \_ ( ) < > . , @. - Select your preferred Provider and Region. --- You can select from a limited number of regions if you’re on the Free plan. Regions with a lock icon require that you upgrade to a Pay As You Go plan. + -- You can select from a limited number of regions if you’re on the Free plan. Regions with a lock icon require that you upgrade to a Pay As You Go plan. - Click Create Database. --- You are redirected to your new database’s Overview screen. Your database starts in Pending status before transitioning to Initializing. You’ll receive a notification once your database is initialized. + -- You are redirected to your new database’s Overview screen. Your database starts in Pending status before transitioning to Initializing. You’ll receive a notification once your database is initialized. 
#### Gathering other credentials @@ -37,9 +37,9 @@ This page contains the setup guide and reference information for the destination ## Changelog -| Version | Date | Pull Request | Subject | -| :------ | :--------- | :------------------------------------------------------- | :-------------------------- | -| 0.1.3 | 2024-04-19 | #37405 | Add "airbyte" user-agent in the HTTP requests to Astra DB | -| 0.1.2 | 2024-04-15 | | Moved to Poetry; Updated CDK & pytest versions| -| 0.1.1 | 2024-01-26 | | DS Branding Update | -| 0.1.0 | 2024-01-08 | | Initial Release | +| Version | Date | Pull Request | Subject | +| :------ | :--------- | :----------- | :-------------------------------------------------------- | +| 0.1.3 | 2024-04-19 | #37405 | Add "airbyte" user-agent in the HTTP requests to Astra DB | +| 0.1.2 | 2024-04-15 | | Moved to Poetry; Updated CDK & pytest versions | +| 0.1.1 | 2024-01-26 | | DS Branding Update | +| 0.1.0 | 2024-01-08 | | Initial Release | diff --git a/docs/integrations/destinations/aws-datalake.md b/docs/integrations/destinations/aws-datalake.md index 4a79a87fc0a..855a017c7a8 100644 --- a/docs/integrations/destinations/aws-datalake.md +++ b/docs/integrations/destinations/aws-datalake.md @@ -88,13 +88,13 @@ which will be translated for compatibility with the Glue Data Catalog: ## Changelog -| Version | Date | Pull Request | Subject | -| :------ | :--------- | :--------------------------------------------------------- | :---------------------------------------------------- | -| `0.1.7` | 2024-04-29 | [#33853](https://github.com/airbytehq/airbyte/pull/33853) | Enable STS Role Credential Refresh for Long Sync | -| `0.1.6` | 2024-03-22 | [#36386](https://github.com/airbytehq/airbyte/pull/36386) | Support new state message protocol | -| `0.1.5` | 2024-01-03 | [#33924](https://github.com/airbytehq/airbyte/pull/33924) | Add new ap-southeast-3 AWS region | -| `0.1.4` | 2023-10-25 | [\#29221](https://github.com/airbytehq/airbyte/pull/29221) | Upgrade AWSWrangler | -| `0.1.3` | 2023-03-28 | [\#24642](https://github.com/airbytehq/airbyte/pull/24642) | Prefer airbyte type for complex types when available | -| `0.1.2` | 2022-09-26 | [\#17193](https://github.com/airbytehq/airbyte/pull/17193) | Fix schema keyerror and add parquet support | -| `0.1.1` | 2022-04-20 | [\#11811](https://github.com/airbytehq/airbyte/pull/11811) | Fix name of required param in specification | -| `0.1.0` | 2022-03-29 | [\#10760](https://github.com/airbytehq/airbyte/pull/10760) | Initial release | +| Version | Date | Pull Request | Subject | +| :------ | :--------- | :--------------------------------------------------------- | :--------------------------------------------------- | +| `0.1.7` | 2024-04-29 | [#33853](https://github.com/airbytehq/airbyte/pull/33853) | Enable STS Role Credential Refresh for Long Sync | +| `0.1.6` | 2024-03-22 | [#36386](https://github.com/airbytehq/airbyte/pull/36386) | Support new state message protocol | +| `0.1.5` | 2024-01-03 | [#33924](https://github.com/airbytehq/airbyte/pull/33924) | Add new ap-southeast-3 AWS region | +| `0.1.4` | 2023-10-25 | [\#29221](https://github.com/airbytehq/airbyte/pull/29221) | Upgrade AWSWrangler | +| `0.1.3` | 2023-03-28 | [\#24642](https://github.com/airbytehq/airbyte/pull/24642) | Prefer airbyte type for complex types when available | +| `0.1.2` | 2022-09-26 | [\#17193](https://github.com/airbytehq/airbyte/pull/17193) | Fix schema keyerror and add parquet support | +| `0.1.1` | 2022-04-20 | [\#11811](https://github.com/airbytehq/airbyte/pull/11811) 
| Fix name of required param in specification | +| `0.1.0` | 2022-03-29 | [\#10760](https://github.com/airbytehq/airbyte/pull/10760) | Initial release | diff --git a/docs/integrations/destinations/bigquery-migrations.md b/docs/integrations/destinations/bigquery-migrations.md index 059044e8cec..8c7c4873f80 100644 --- a/docs/integrations/destinations/bigquery-migrations.md +++ b/docs/integrations/destinations/bigquery-migrations.md @@ -2,7 +2,7 @@ ## Upgrading to 2.0.0 -This version introduces [Destinations V2](/release_notes/upgrading_to_destinations_v2/#what-is-destinations-v2), which provides better error handling, incremental delivery of data for large syncs, and improved final table structures. To review the breaking changes, and how to upgrade, see [here](/release_notes/upgrading_to_destinations_v2/#quick-start-to-upgrading). These changes will likely require updates to downstream dbt / SQL models, which we walk through [here](/release_notes/upgrading_to_destinations_v2/#updating-downstream-transformations). Selecting `Upgrade` will upgrade **all** connections using this destination at their next sync. You can manually sync existing connections prior to the next scheduled sync to start the upgrade early. +This version introduces [Destinations V2](/release_notes/upgrading_to_destinations_v2/#what-is-destinations-v2), which provides better error handling, incremental delivery of data for large syncs, and improved final table structures. To review the breaking changes, and how to upgrade, see [here](/release_notes/upgrading_to_destinations_v2/#quick-start-to-upgrading). These changes will likely require updates to downstream dbt / SQL models, which we walk through [here](/release_notes/upgrading_to_destinations_v2/#updating-downstream-transformations). Selecting `Upgrade` will upgrade **all** connections using this destination at their next sync. You can manually sync existing connections prior to the next scheduled sync to start the upgrade early. Worthy of specific mention, this version includes: @@ -11,4 +11,4 @@ Worthy of specific mention, this version includes: - Removal of sub-tables for nested properties - Removal of SCD tables -Learn more about what's new in Destinations V2 [here](/using-airbyte/core-concepts/typing-deduping). \ No newline at end of file +Learn more about what's new in Destinations V2 [here](/using-airbyte/core-concepts/typing-deduping). diff --git a/docs/integrations/destinations/bigquery.md b/docs/integrations/destinations/bigquery.md index bbec53fe9bf..67d6c728916 100644 --- a/docs/integrations/destinations/bigquery.md +++ b/docs/integrations/destinations/bigquery.md @@ -83,13 +83,13 @@ https://github.com/airbytehq/airbyte/issues/3549 6. For **Dataset Location**, select the location of your BigQuery dataset. :::warning -You cannot change the location later. +You cannot change the location later. ::: 7. For **Default Dataset ID**, enter the BigQuery [Dataset ID](https://cloud.google.com/bigquery/docs/datasets#create-dataset). 8. For **Loading Method**, select [Standard Inserts](#using-insert) or - [GCS Staging](#recommended-using-a-google-cloud-storage-bucket). + [GCS Staging](#recommended-using-a-google-cloud-storage-bucket). :::tip We recommend using the GCS Staging option. 
diff --git a/docs/integrations/destinations/chroma.md b/docs/integrations/destinations/chroma.md index c3956cba832..c0eef50c221 100644 --- a/docs/integrations/destinations/chroma.md +++ b/docs/integrations/destinations/chroma.md @@ -1,8 +1,7 @@ # Chroma + This page guides you through the process of setting up the [Chroma](https://docs.trychroma.com/?lang=py) destination connector. - - ## Features | Feature | Supported?\(Yes/No\) | Notes | @@ -13,13 +12,12 @@ This page guides you through the process of setting up the [Chroma](https://docs #### Output Schema -Only one stream will exist to collect data from all source streams. This will be in a [collection](https://docs.trychroma.com/usage-guide#using-collections) in [Chroma](https://docs.trychroma.com/?lang=py) whose name will be defined by the user, and validated and corrected by Airbyte. +Only one stream will exist to collect data from all source streams. This will be in a [collection](https://docs.trychroma.com/usage-guide#using-collections) in [Chroma](https://docs.trychroma.com/?lang=py) whose name will be defined by the user, and validated and corrected by Airbyte. For each record, a UUID string is generated and used as the document id. The embeddings generated as defined will be stored as embeddings. Data in the text fields will be stored as documents and those in the metadata fields will be stored as metadata. ## Getting Started \(Airbyte Open Source\) - You can connect to a Chroma instance either in client/server mode or in a local persistent mode. For the local persistent mode, the database file will be saved in the path defined in the `path` config parameter. Note that `path` must be an absolute path, prefixed with `/local`. :::danger @@ -41,6 +39,7 @@ Please make sure that Docker Desktop has access to `/tmp` (and `/private` on a M #### Requirements To use the Chroma destination, you'll need: + - An account with API access for OpenAI, Cohere (depending on which embedding method you want to use) or neither (if you want to use the [default chroma embedding function](https://docs.trychroma.com/embeddings#default-all-minilm-l6-v2)) - A Chroma db instance (client/server mode or persistent mode) - Credentials (for cient/server mode) @@ -50,7 +49,6 @@ To use the Chroma destination, you'll need: Make sure your Chroma database can be accessed by Airbyte. If your database is within a VPC, you may need to allow access from the IP you're using to expose Airbyte. - ### Setup the Chroma Destination in Airbyte You should now have all the requirements needed to configure Chroma as a destination in the UI. You'll need the following information to configure the Chroma destination: @@ -58,8 +56,8 @@ You should now have all the requirements needed to configure Chroma as a destina - (Required) **Text fields to embed** - (Optional) **Text splitter** Options around configuring the chunking process provided by the [Langchain Python library](https://python.langchain.com/docs/get_started/introduction). 
- (Required) **Fields to store as metadata** -- (Required) **Collection** The name of the collection in Chroma db to store your data -- (Required) Authentication method +- (Required) **Collection** The name of the collection in Chroma db to store your data +- (Required) Authentication method - For client/server mode - **Host** for example localhost - **Port** for example 8000 @@ -67,22 +65,22 @@ You should now have all the requirements needed to configure Chroma as a destina - **Password** (Optional) - For persistent mode - **Path** The path to the local database file. Note that `path` must be an absolute path, prefixed with `/local`. -- (Optional) Embedding +- (Optional) Embedding - **OpenAI API key** if using OpenAI for embedding - **Cohere API key** if using Cohere for embedding - Embedding **Field name** and **Embedding dimensions** if getting the embeddings from stream records ## Changelog -| Version | Date | Pull Request | Subject | -| :------ | :--------- | :--------------------------------------------------------- | :----------------------------------------- | -| 0.0.10| 2024-04-15 | [#37333](https://github.com/airbytehq/airbyte/pull/37333) | Updated CDK & pytest version to fix security vulnerabilities | -| 0.0.9 | 2023-12-11 | [#33303](https://github.com/airbytehq/airbyte/pull/33303) | Fix bug with embedding special tokens | -| 0.0.8 | 2023-12-01 | [#32697](https://github.com/airbytehq/airbyte/pull/32697) | Allow omitting raw text | -| 0.0.7 | 2023-11-16 | [#32608](https://github.com/airbytehq/airbyte/pull/32608) | Support deleting records for CDC sources | -| 0.0.6 | 2023-11-13 | [#32357](https://github.com/airbytehq/airbyte/pull/32357) | Improve spec schema | -| 0.0.5 | 2023-10-23 | [#31563](https://github.com/airbytehq/airbyte/pull/31563) | Add field mapping option | -| 0.0.4 | 2023-10-15 | [#31329](https://github.com/airbytehq/airbyte/pull/31329) | Add OpenAI-compatible embedder option | -| 0.0.3 | 2023-10-04 | [#31075](https://github.com/airbytehq/airbyte/pull/31075) | Fix OpenAI embedder batch size | -| 0.0.2 | 2023-09-29 | [#30820](https://github.com/airbytehq/airbyte/pull/30820) | Update CDK | -| 0.0.1 | 2023-09-08 | [#30023](https://github.com/airbytehq/airbyte/pull/30023) | 🎉 New Destination: Chroma (Vector Database) | +| Version | Date | Pull Request | Subject | +| :------ | :--------- | :-------------------------------------------------------- | :----------------------------------------------------------- | +| 0.0.10 | 2024-04-15 | [#37333](https://github.com/airbytehq/airbyte/pull/37333) | Updated CDK & pytest version to fix security vulnerabilities | +| 0.0.9 | 2023-12-11 | [#33303](https://github.com/airbytehq/airbyte/pull/33303) | Fix bug with embedding special tokens | +| 0.0.8 | 2023-12-01 | [#32697](https://github.com/airbytehq/airbyte/pull/32697) | Allow omitting raw text | +| 0.0.7 | 2023-11-16 | [#32608](https://github.com/airbytehq/airbyte/pull/32608) | Support deleting records for CDC sources | +| 0.0.6 | 2023-11-13 | [#32357](https://github.com/airbytehq/airbyte/pull/32357) | Improve spec schema | +| 0.0.5 | 2023-10-23 | [#31563](https://github.com/airbytehq/airbyte/pull/31563) | Add field mapping option | +| 0.0.4 | 2023-10-15 | [#31329](https://github.com/airbytehq/airbyte/pull/31329) | Add OpenAI-compatible embedder option | +| 0.0.3 | 2023-10-04 | [#31075](https://github.com/airbytehq/airbyte/pull/31075) | Fix OpenAI embedder batch size | +| 0.0.2 | 2023-09-29 | [#30820](https://github.com/airbytehq/airbyte/pull/30820) | Update CDK | +| 0.0.1 | 
2023-09-08 | [#30023](https://github.com/airbytehq/airbyte/pull/30023) | 🎉 New Destination: Chroma (Vector Database) | diff --git a/docs/integrations/destinations/clickhouse-migrations.md b/docs/integrations/destinations/clickhouse-migrations.md index df8590b36a5..f8096c77e84 100644 --- a/docs/integrations/destinations/clickhouse-migrations.md +++ b/docs/integrations/destinations/clickhouse-migrations.md @@ -5,11 +5,11 @@ This version removes the option to use "normalization" with clickhouse. It also changes the schema and database of Airbyte's "raw" tables to be compatible with the new [Destinations V2](https://docs.airbyte.com/release_notes/upgrading_to_destinations_v2/#what-is-destinations-v2) -format. These changes will likely require updates to downstream dbt / SQL models. After this update, -Airbyte will only produce the ‘raw’ v2 tables, which store all content in JSON. These changes remove -the ability to do deduplicated syncs with Clickhouse. (Clickhouse has an overview)[[https://clickhouse.com/docs/en/integrations/dbt]] +format. These changes will likely require updates to downstream dbt / SQL models. After this update, +Airbyte will only produce the ‘raw’ v2 tables, which store all content in JSON. These changes remove +the ability to do deduplicated syncs with Clickhouse. (Clickhouse has an overview)[[https://clickhouse.com/docs/en/integrations/dbt]] for integrating with dbt If you are interested in the Clickhouse destination gaining the full features -of Destinations V2 (including final tables), click [[https://github.com/airbytehq/airbyte/discussions/35339]] +of Destinations V2 (including final tables), click [[https://github.com/airbytehq/airbyte/discussions/35339]] to register your interest. This upgrade will ignore any existing raw tables and will not migrate any data to the new schema. @@ -42,25 +42,26 @@ INSERT INTO `airbyte_internal`.`default_raw__stream_{{stream_name}}` Airbyte will not delete any of your v1 data. ### Database/Schema and the Internal Schema + We have split the raw and final tables into their own schemas, which in clickhouse is analogous to a `database`. For the Clickhouse destination, this means that we will only write into the raw table which will live in the `airbyte_internal` database. -The tables written into this schema will be prefixed with either the default database provided in +The tables written into this schema will be prefixed with either the default database provided in the `DB Name` field when configuring clickhouse (but can also be overridden in the connection). You can -change the "raw" database from the default `airbyte_internal` by supplying a value for +change the "raw" database from the default `airbyte_internal` by supplying a value for `Raw Table Schema Name`. 
For Example: - - DB Name: `default` - - Stream Name: `my_stream` +- DB Name: `default` +- Stream Name: `my_stream` Writes to `airbyte_intneral.default_raw__stream_my_stream` where as: - - DB Name: `default` - - Stream Name: `my_stream` - - Raw Table Schema Name: `raw_data` +- DB Name: `default` +- Stream Name: `my_stream` +- Raw Table Schema Name: `raw_data` Writes to: `raw_data.default_raw__stream_my_stream` diff --git a/docs/integrations/destinations/clickhouse.md b/docs/integrations/destinations/clickhouse.md index 4495cb79e3d..90f722b66de 100644 --- a/docs/integrations/destinations/clickhouse.md +++ b/docs/integrations/destinations/clickhouse.md @@ -89,7 +89,7 @@ Therefore, Airbyte ClickHouse destination will create tables and schemas using t ## Changelog | Version | Date | Pull Request | Subject | -|:--------|:-----------| :--------------------------------------------------------- |:----------------------------------------------------------------------------------------------| +| :------ | :--------- | :--------------------------------------------------------- | :-------------------------------------------------------------------------------------------- | | 1.0.0 | 2024-02-07 | [\#34637](https://github.com/airbytehq/airbyte/pull/34637) | Update the raw table schema | | 0.2.5 | 2023-06-21 | [\#27555](https://github.com/airbytehq/airbyte/pull/27555) | Reduce image size | | 0.2.4 | 2023-06-05 | [\#27036](https://github.com/airbytehq/airbyte/pull/27036) | Internal code change for future development (install normalization packages inside connector) | diff --git a/docs/integrations/destinations/dev-null.md b/docs/integrations/destinations/dev-null.md index 10eab074f26..a29ea3a55c2 100644 --- a/docs/integrations/destinations/dev-null.md +++ b/docs/integrations/destinations/dev-null.md @@ -5,7 +5,7 @@ The Airbyte `dev-null` Destination. This destination is for testing and debuggin ## Changelog | Version | Date | Pull Request | Subject | -|:--------|:-----------| :------------------------------------------------------- |:------------------| +| :------ | :--------- | :------------------------------------------------------- | :---------------- | | 0.3.2 | 2023-05-08 | [25776](https://github.com/airbytehq/airbyte/pull/25776) | Support Refreshes | | 0.3.0 | 2023-05-08 | [25776](https://github.com/airbytehq/airbyte/pull/25776) | Change Schema | | 0.2.7 | 2022-08-08 | [13932](https://github.com/airbytehq/airbyte/pull/13932) | Bump version | diff --git a/docs/integrations/destinations/duckdb.md b/docs/integrations/destinations/duckdb.md index 2812f0e3e56..fc43df5b401 100644 --- a/docs/integrations/destinations/duckdb.md +++ b/docs/integrations/destinations/duckdb.md @@ -104,15 +104,15 @@ Note: If you are running Airbyte on Windows with Docker backed by WSL2, you have ## Changelog -| Version | Date | Pull Request | Subject | -| :------ | :--------- | :------------------------------------------------------- | :--------------------- | -| 0.3.5 | 2024-04-23 | [#37515](https://github.com/airbytehq/airbyte/pull/37515) | Add resource requirements declaration to `metatadat.yml`. | -| :------ | :--------- | :------------------------------------------------------- | :--------------------- | -| 0.3.4 | 2024-04-16 | [#36715](https://github.com/airbytehq/airbyte/pull/36715) | Improve ingestion performance using pyarrow inmem view for writing to DuckDB. | -| 0.3.3 | 2024-04-07 | [#36884](https://github.com/airbytehq/airbyte/pull/36884) | Fix stale dependency versions in lock file, add CLI for internal testing. 
| -| 0.3.2 | 2024-03-20 | [#32635](https://github.com/airbytehq/airbyte/pull/32635) | Instrument custom_user_agent to identify Airbyte-Motherduck connector usage. | -| 0.3.1 | 2023-11-18 | [#32635](https://github.com/airbytehq/airbyte/pull/32635) | Upgrade DuckDB version to [`v0.9.2`](https://github.com/duckdb/duckdb/releases/tag/v0.9.2). | +| Version | Date | Pull Request | Subject | +| :------ | :--------- | :-------------------------------------------------------- | :----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------- | +| 0.3.5 | 2024-04-23 | [#37515](https://github.com/airbytehq/airbyte/pull/37515) | Add resource requirements declaration to `metatadat.yml`. | +| :------ | :--------- | :------------------------------------------------------- | :--------------------- | +| 0.3.4 | 2024-04-16 | [#36715](https://github.com/airbytehq/airbyte/pull/36715) | Improve ingestion performance using pyarrow inmem view for writing to DuckDB. | +| 0.3.3 | 2024-04-07 | [#36884](https://github.com/airbytehq/airbyte/pull/36884) | Fix stale dependency versions in lock file, add CLI for internal testing. | +| 0.3.2 | 2024-03-20 | [#32635](https://github.com/airbytehq/airbyte/pull/32635) | Instrument custom_user_agent to identify Airbyte-Motherduck connector usage. | +| 0.3.1 | 2023-11-18 | [#32635](https://github.com/airbytehq/airbyte/pull/32635) | Upgrade DuckDB version to [`v0.9.2`](https://github.com/duckdb/duckdb/releases/tag/v0.9.2). | | 0.3.0 | 2022-10-23 | [#31744](https://github.com/airbytehq/airbyte/pull/31744) | Upgrade DuckDB version to [`v0.9.1`](https://github.com/duckdb/duckdb/releases/tag/v0.9.1). **Required update for all MotherDuck users.** Note, this is a **BREAKING CHANGE** for users who may have other connections using versions of DuckDB prior to 0.9.x. See the [0.9.0 release notes](https://github.com/duckdb/duckdb/releases/tag/v0.9.0) for more information and for upgrade instructions. | -| 0.2.1 | 2022-10-20 | [#30600](https://github.com/airbytehq/airbyte/pull/30600) | Fix: schema name mapping | -| 0.2.0 | 2022-10-19 | [#29428](https://github.com/airbytehq/airbyte/pull/29428) | Add support for MotherDuck. Upgrade DuckDB version to `v0.8``. | -| 0.1.0 | 2022-10-14 | [17494](https://github.com/airbytehq/airbyte/pull/17494) | New DuckDB destination | +| 0.2.1 | 2022-10-20 | [#30600](https://github.com/airbytehq/airbyte/pull/30600) | Fix: schema name mapping | +| 0.2.0 | 2022-10-19 | [#29428](https://github.com/airbytehq/airbyte/pull/29428) | Add support for MotherDuck. Upgrade DuckDB version to `v0.8``. | +| 0.1.0 | 2022-10-14 | [17494](https://github.com/airbytehq/airbyte/pull/17494) | New DuckDB destination | diff --git a/docs/integrations/destinations/e2e-test.md b/docs/integrations/destinations/e2e-test.md index 7fe6ad1930a..adfdbf862c1 100644 --- a/docs/integrations/destinations/e2e-test.md +++ b/docs/integrations/destinations/e2e-test.md @@ -4,13 +4,13 @@ This destination is for testing of Airbyte connections. 
It can be set up as a so ## Features -| Feature | Supported | Notes | -| :--- | :--- | :--- | -| Full Refresh Sync | Yes | | -| Incremental Sync | Yes | | -| Replicate Incremental Deletes | No | | -| SSL connection | No | | -| SSH Tunnel Support | No | | +| Feature | Supported | Notes | +| :---------------------------- | :-------- | :---- | +| Full Refresh Sync | Yes | | +| Incremental Sync | Yes | | +| Replicate Incremental Deletes | No | | +| SSL connection | No | | +| SSH Tunnel Support | No | | ## Mode @@ -26,11 +26,11 @@ This mode logs the data from the source connector. It will log at most 1,000 dat There are the different logging modes to choose from: -| Mode | Notes | Parameters | -| :--- | :--- | :--- | -| First N entries | Log the first N number of data entries for each data stream. | N: how many entries to log. | -| Every N-th entry | Log every N-th entry for each data stream. When N=1, it will log every entry. When N=2, it will log every other entry. Etc. | N: the N-th entry to log. Max entry count: max number of entries to log. | -| Random sampling | Log a random percentage of the entries for each data stream. | Sampling ratio: a number in range of `[0, 1]`. Optional seed: default to system epoch time. Max entry count: max number of entries to log. | +| Mode | Notes | Parameters | +| :--------------- | :-------------------------------------------------------------------------------------------------------------------------- | :----------------------------------------------------------------------------------------------------------------------------------------- | +| First N entries | Log the first N number of data entries for each data stream. | N: how many entries to log. | +| Every N-th entry | Log every N-th entry for each data stream. When N=1, it will log every entry. When N=2, it will log every other entry. Etc. | N: the N-th entry to log. Max entry count: max number of entries to log. | +| Random sampling | Log a random percentage of the entries for each data stream. | Sampling ratio: a number in range of `[0, 1]`. Optional seed: default to system epoch time. Max entry count: max number of entries to log. | ### Throttling @@ -45,7 +45,7 @@ This mode throws an exception after receiving a configurable number of messages. The OSS and Cloud variants have the same version number starting from version `0.2.2`. | Version | Date | Pull Request | Subject | -|:--------|:-----------| :------------------------------------------------------- |:----------------------------------------------------------| +| :------ | :--------- | :------------------------------------------------------- | :-------------------------------------------------------- | | 0.3.5 | 2024-04-29 | [37366](https://github.com/airbytehq/airbyte/pull/37366) | Support refreshes | | 0.3.4 | 2024-04-16 | [37366](https://github.com/airbytehq/airbyte/pull/37366) | Fix NPE | | 0.3.3 | 2024-04-16 | [37366](https://github.com/airbytehq/airbyte/pull/37366) | Fix Log trace messages | diff --git a/docs/integrations/destinations/elasticsearch.md b/docs/integrations/destinations/elasticsearch.md index 43cc9f677a6..f75bfd9e4b5 100644 --- a/docs/integrations/destinations/elasticsearch.md +++ b/docs/integrations/destinations/elasticsearch.md @@ -4,44 +4,42 @@ ### Output schema - Elasticsearch is a Lucene based search engine that's a type of NoSql storage. Documents are created in an `index`, similar to a `table`in a relation database. -The output schema matches the input schema of a source. 
+The output schema matches the input schema of a source. Each source `stream` becomes a destination `index`. For example, in with a relational database source - -The DB table name is mapped to the destination index. +The DB table name is mapped to the destination index. The DB table columns become fields in the destination document. -Each row becomes a document in the destination index. +Each row becomes a document in the destination index. ### Data type mapping [See Elastic documentation for detailed information about the field types](https://www.elastic.co/guide/en/elasticsearch/reference/current/mapping-types.html) This section should contain a table mapping each of the connector's data types to Airbyte types. At the moment, Airbyte uses the same types used by [JSONSchema](https://json-schema.org/understanding-json-schema/reference/index.html). `string`, `date-time`, `object`, `array`, `boolean`, `integer`, and `number` are the most commonly used data types. -| Integration Type | Airbyte Type | Notes | -| :--- | :--- | :--- | -| text | string | [more info](https://www.elastic.co/guide/en/elasticsearch/reference/current/text.html) -| date | date-time | [more info](https://www.elastic.co/guide/en/elasticsearch/reference/current/date.html) -| object | object | [more info](https://www.elastic.co/guide/en/elasticsearch/reference/current/object.html) -| array | array | [more info](https://www.elastic.co/guide/en/elasticsearch/reference/current/array.html) -| boolean | boolean | [more info](https://www.elastic.co/guide/en/elasticsearch/reference/current/boolean.html) -| numeric | integer | [more info](https://www.elastic.co/guide/en/elasticsearch/reference/current/number.html) -| numeric | number | [more info](https://www.elastic.co/guide/en/elasticsearch/reference/current/number.html) - +| Integration Type | Airbyte Type | Notes | +| :--------------- | :----------- | :---------------------------------------------------------------------------------------- | +| text | string | [more info](https://www.elastic.co/guide/en/elasticsearch/reference/current/text.html) | +| date | date-time | [more info](https://www.elastic.co/guide/en/elasticsearch/reference/current/date.html) | +| object | object | [more info](https://www.elastic.co/guide/en/elasticsearch/reference/current/object.html) | +| array | array | [more info](https://www.elastic.co/guide/en/elasticsearch/reference/current/array.html) | +| boolean | boolean | [more info](https://www.elastic.co/guide/en/elasticsearch/reference/current/boolean.html) | +| numeric | integer | [more info](https://www.elastic.co/guide/en/elasticsearch/reference/current/number.html) | +| numeric | number | [more info](https://www.elastic.co/guide/en/elasticsearch/reference/current/number.html) | ### Features This section should contain a table with the following format: -| Feature | Supported?(Yes/No) | Notes | -| :--- |:-------------------| :--- | -| Full Refresh Sync | yes | | -| Incremental Sync | yes | | -| Replicate Incremental Deletes | no | | -| SSL connection | yes | | -| SSH Tunnel Support | yes | | +| Feature | Supported?(Yes/No) | Notes | +| :---------------------------- | :----------------- | :---- | +| Full Refresh Sync | yes | | +| Incremental Sync | yes | | +| Replicate Incremental Deletes | no | | +| SSL connection | yes | | +| SSH Tunnel Support | yes | | ### Performance considerations @@ -52,23 +50,25 @@ The connector should be enhanced to support variable batch sizes. 
### Requirements -* Elasticsearch >= 7.x -* Configuration - * Endpoint URL [ex. https://elasticsearch.savantly.net:9423] - * Username [optional] (basic auth) - * Password [optional] (basic auth) - * CA certificate [optional] - * Api key ID [optional] - * Api key secret [optional] -* If authentication is used, the user should have permission to create an index if it doesn't exist, and/or be able to `create` documents +- Elasticsearch >= 7.x +- Configuration + - Endpoint URL [ex. https://elasticsearch.savantly.net:9423] + - Username [optional] (basic auth) + - Password [optional] (basic auth) + - CA certificate [optional] + - Api key ID [optional] + - Api key secret [optional] +- If authentication is used, the user should have permission to create an index if it doesn't exist, and/or be able to `create` documents ### CA certificate + Ca certificate may be fetched from the Elasticsearch server from /usr/share/elasticsearch/config/certs/http_ca.crt Fetching example from dockerized Elasticsearch: `docker cp es01:/usr/share/elasticsearch/config/certs/http_ca.crt .` where es01 is a container's name. For more details please visit https://www.elastic.co/guide/en/elasticsearch/reference/current/docker.html - + ### Setup guide -Enter the endpoint URL, select authentication method, and whether to use 'upsert' method when indexing new documents. + +Enter the endpoint URL, select authentication method, and whether to use 'upsert' method when indexing new documents. ### Connection via SSH Tunnel @@ -82,8 +82,8 @@ Using this feature requires additional configuration, when creating the source. 1. Configure all fields for the source as you normally would, except `SSH Tunnel Method`. 2. `SSH Tunnel Method` defaults to `No Tunnel` \(meaning a direct connection\). If you want to use an SSH Tunnel choose `SSH Key Authentication` or `Password Authentication`. - 1. Choose `Key Authentication` if you will be using an RSA private key as your secret for establishing the SSH Tunnel \(see below for more information on generating this key\). - 2. Choose `Password Authentication` if you will be using a password as your secret for establishing the SSH Tunnel. + 1. Choose `Key Authentication` if you will be using an RSA private key as your secret for establishing the SSH Tunnel \(see below for more information on generating this key\). + 2. Choose `Password Authentication` if you will be using a password as your secret for establishing the SSH Tunnel. 3. `SSH Tunnel Jump Server Host` refers to the intermediate \(bastion\) server that Airbyte will connect to. This should be a hostname or an IP Address. 4. `SSH Connection Port` is the port on the bastion server with which to make the SSH connection. The default port for SSH connections is `22`, so unless you have explicitly changed something, go with the default. 5. `SSH Login Username` is the username that Airbyte should use when connection to the bastion server. This is NOT the TiDB username. @@ -92,13 +92,12 @@ Using this feature requires additional configuration, when creating the source. 
## CHANGELOG -| Version | Date | Pull Request | Subject | -| :--- | :--- | :--- | :--- | -| 0.1.6 | 2022-10-26 | [18341](https://github.com/airbytehq/airbyte/pull/18341) | enforce ssl connection on cloud | -| 0.1.5 | 2022-10-24 | [18177](https://github.com/airbytehq/airbyte/pull/18177) | add custom CA certificate processing | -| 0.1.4 | 2022-10-14 | [17805](https://github.com/airbytehq/airbyte/pull/17805) | add SSH Tunneling | -| 0.1.3 | 2022-05-30 | [14640](https://github.com/airbytehq/airbyte/pull/14640) | Include lifecycle management | -| 0.1.2 | 2022-04-19 | [11752](https://github.com/airbytehq/airbyte/pull/11752) | Reduce batch size to 32Mb | -| 0.1.1 | 2022-02-10 | [10256](https://github.com/airbytehq/airbyte/pull/1256) | Add ExitOnOutOfMemoryError connectors | -| 0.1.0 | 2021-10-13 | [7005](https://github.com/airbytehq/airbyte/pull/7005) | Initial release. | - +| Version | Date | Pull Request | Subject | +| :------ | :--------- | :------------------------------------------------------- | :------------------------------------ | +| 0.1.6 | 2022-10-26 | [18341](https://github.com/airbytehq/airbyte/pull/18341) | enforce ssl connection on cloud | +| 0.1.5 | 2022-10-24 | [18177](https://github.com/airbytehq/airbyte/pull/18177) | add custom CA certificate processing | +| 0.1.4 | 2022-10-14 | [17805](https://github.com/airbytehq/airbyte/pull/17805) | add SSH Tunneling | +| 0.1.3 | 2022-05-30 | [14640](https://github.com/airbytehq/airbyte/pull/14640) | Include lifecycle management | +| 0.1.2 | 2022-04-19 | [11752](https://github.com/airbytehq/airbyte/pull/11752) | Reduce batch size to 32Mb | +| 0.1.1 | 2022-02-10 | [10256](https://github.com/airbytehq/airbyte/pull/1256) | Add ExitOnOutOfMemoryError connectors | +| 0.1.0 | 2021-10-13 | [7005](https://github.com/airbytehq/airbyte/pull/7005) | Initial release. | diff --git a/docs/integrations/destinations/firestore.md b/docs/integrations/destinations/firestore.md index 94a6002a70c..16a067e47ae 100644 --- a/docs/integrations/destinations/firestore.md +++ b/docs/integrations/destinations/firestore.md @@ -12,6 +12,7 @@ Google Firestore, officially known as Cloud Firestore, is a flexible, scalable d - A role with permissions to create a Service Account Key in GCP ### Step 1: Create a Service Account + 1. Log in to the Google Cloud Console. Select the project where your Firestore database is located. 2. Navigate to "IAM & Admin" and select "Service Accounts". Create a Service Account and assign appropriate roles. Ensure “Cloud Datastore User” or “Firebase Rules System” are enabled. 3. Navigate to the service account and generate the JSON key. Download and copy the contents to the configuration. @@ -27,9 +28,9 @@ Each stream will be output into a BigQuery table. | Feature | Supported?\(Yes/No\) | Notes | | :----------------------------- | :------------------- | :---- | | Full Refresh Sync | ✅ | | -| Incremental - Append Sync | ✅ | | +| Incremental - Append Sync | ✅ | | | Incremental - Append + Deduped | ✅ | | -| Namespaces | ✅ | | +| Namespaces | ✅ | | ## Changelog diff --git a/docs/integrations/destinations/gcs.md b/docs/integrations/destinations/gcs.md index ee44df9e2fc..cae74cdd35b 100644 --- a/docs/integrations/destinations/gcs.md +++ b/docs/integrations/destinations/gcs.md @@ -10,14 +10,12 @@ The Airbyte GCS destination allows you to sync data to cloud storage buckets. 
Ea #### Features -| Feature | Support | Notes | -| :----------------------------- | :-----: | :-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------- | -| Full Refresh Sync | ✅ | Warning: this mode deletes all previously synced data in the configured bucket path. | +| Feature | Support | Notes | +| :----------------------------- | :-----: | :------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------- | +| Full Refresh Sync | ✅ | Warning: this mode deletes all previously synced data in the configured bucket path. | | Incremental - Append Sync | ✅ | Warning: Airbyte provides at-least-once delivery. Depending on your source, you may see duplicated data. Learn more [here](/using-airbyte/core-concepts/sync-modes/incremental-append#inclusive-cursors) | -| Incremental - Append + Deduped | ❌ | | -| Namespaces | ❌ | Setting a specific bucket path is equivalent to having separate namespaces. | - - +| Incremental - Append + Deduped | ❌ | | +| Namespaces | ❌ | Setting a specific bucket path is equivalent to having separate namespaces. | ## Getting started @@ -235,13 +233,12 @@ These parameters are related to the `ParquetOutputFormat`. See the [Java doc](ht Under the hood, an Airbyte data stream in Json schema is first converted to an Avro schema, then the Json object is converted to an Avro record, and finally the Avro record is outputted to the Parquet format. Because the data stream can come from any data source, the Json to Avro conversion process has arbitrary rules and limitations. Learn more about how source data is converted to Avro and the current limitations [here](https://docs.airbyte.com/understanding-airbyte/json-avro-conversion). - ## CHANGELOG | Version | Date | Pull Request | Subject | | :------ | :--------- | :--------------------------------------------------------- | :------------------------------------------------------------------------------------------------------------------------- | -| 0.4.6 | 2024-02-15 | [35285](https://github.com/airbytehq/airbyte/pull/35285) | Adopt CDK 0.20.8 | -| 0.4.5 | 2024-02-08 | [34745](https://github.com/airbytehq/airbyte/pull/34745) | Adopt CDK 0.19.0 | +| 0.4.6 | 2024-02-15 | [35285](https://github.com/airbytehq/airbyte/pull/35285) | Adopt CDK 0.20.8 | +| 0.4.5 | 2024-02-08 | [34745](https://github.com/airbytehq/airbyte/pull/34745) | Adopt CDK 0.19.0 | | 0.4.4 | 2023-07-14 | [#28345](https://github.com/airbytehq/airbyte/pull/28345) | Increment patch to trigger a rebuild | | 0.4.3 | 2023-07-05 | [#27936](https://github.com/airbytehq/airbyte/pull/27936) | Internal code update | | 0.4.2 | 2023-06-30 | [#27891](https://github.com/airbytehq/airbyte/pull/27891) | Internal code update | diff --git a/docs/integrations/destinations/google-sheets.md b/docs/integrations/destinations/google-sheets.md index 1bf21c51b22..073c2d26f1f 100644 --- a/docs/integrations/destinations/google-sheets.md +++ b/docs/integrations/destinations/google-sheets.md @@ -1,10 +1,10 @@ # Google Sheets -The Google Sheets Destination is configured to push data to a single Google Sheets spreadsheet with multiple Worksheets as streams. To replicate data to multiple spreadsheets, you can create multiple instances of the Google Sheets Destination in your Airbyte instance. 
+The Google Sheets Destination is configured to push data to a single Google Sheets spreadsheet with multiple Worksheets as streams. To replicate data to multiple spreadsheets, you can create multiple instances of the Google Sheets Destination in your Airbyte instance. :::warning -Google Sheets imposes rate limits and hard limits on the amount of data it can receive, which results in sync failure. Only use Google Sheets as a destination for small, non-production use cases, as it is not designed for handling large-scale data operations. +Google Sheets imposes rate limits and hard limits on the amount of data it can receive, which results in sync failure. Only use Google Sheets as a destination for small, non-production use cases, as it is not designed for handling large-scale data operations. Read more about the [limitations](#limitations) of using Google Sheets below. @@ -29,6 +29,7 @@ To create a Google account, visit [Google](https://support.google.com/accounts/a ## Step 2: Set up the Google Sheets destination connector in Airbyte + **For Airbyte Cloud:** 1. Select **Google Sheets** from the Source type dropdown and enter a name for this connector. @@ -38,6 +39,7 @@ To create a Google account, visit [Google](https://support.google.com/accounts/a + **For Airbyte Open Source:** Authentication to Google Sheets is only available using OAuth for authentication. @@ -55,7 +57,7 @@ Authentication to Google Sheets is only available using OAuth for authentication ### Output schema -Each worksheet in the selected spreadsheet will be the output as a separate source-connector stream. +Each worksheet in the selected spreadsheet will be the output as a separate source-connector stream. The output columns are re-ordered in alphabetical order. The output columns should **not** be reordered manually after the sync, as this could cause future syncs to fail. 
@@ -148,12 +150,12 @@ EXAMPLE: ## Changelog -| Version | Date | Pull Request | Subject | -|---------|------------|----------------------------------------------------------|------------------------------------------------| +| Version | Date | Pull Request | Subject | +| ------- | ---------- | -------------------------------------------------------- | ---------------------------------------------------------- | | 0.2.3 | 2023-09-25 | [30748](https://github.com/airbytehq/airbyte/pull/30748) | Performance testing - include socat binary in docker image | -| 0.2.2 | 2023-07-06 | [28035](https://github.com/airbytehq/airbyte/pull/28035) | Migrate from authSpecification to advancedAuth | -| 0.2.1 | 2023-06-26 | [27782](https://github.com/airbytehq/airbyte/pull/27782) | Only allow HTTPS urls | -| 0.2.0 | 2023-06-26 | [27780](https://github.com/airbytehq/airbyte/pull/27780) | License Update: Elv2 | -| 0.1.2 | 2022-10-31 | [18729](https://github.com/airbytehq/airbyte/pull/18729) | Fix empty headers list | -| 0.1.1 | 2022-06-15 | [14751](https://github.com/airbytehq/airbyte/pull/14751) | Yield state only when records saved | -| 0.1.0 | 2022-04-26 | [12135](https://github.com/airbytehq/airbyte/pull/12135) | Initial Release | +| 0.2.2 | 2023-07-06 | [28035](https://github.com/airbytehq/airbyte/pull/28035) | Migrate from authSpecification to advancedAuth | +| 0.2.1 | 2023-06-26 | [27782](https://github.com/airbytehq/airbyte/pull/27782) | Only allow HTTPS urls | +| 0.2.0 | 2023-06-26 | [27780](https://github.com/airbytehq/airbyte/pull/27780) | License Update: Elv2 | +| 0.1.2 | 2022-10-31 | [18729](https://github.com/airbytehq/airbyte/pull/18729) | Fix empty headers list | +| 0.1.1 | 2022-06-15 | [14751](https://github.com/airbytehq/airbyte/pull/14751) | Yield state only when records saved | +| 0.1.0 | 2022-04-26 | [12135](https://github.com/airbytehq/airbyte/pull/12135) | Initial Release | diff --git a/docs/integrations/destinations/langchain-migrations.md b/docs/integrations/destinations/langchain-migrations.md index 005845d0382..90066ddbb0b 100644 --- a/docs/integrations/destinations/langchain-migrations.md +++ b/docs/integrations/destinations/langchain-migrations.md @@ -6,4 +6,4 @@ This version changes the way record ids are tracked internally. If you are using Prior to this version, deduplication only considered the primary key per record, without disambiugating between streams. This could lead to data loss if records from two different streams had the same primary key. -The problem is fixed by appending the namespace and stream name to the `_ab_record_id` field to disambiguate between records originating from different streams. If a connection using **append-dedup** mode is not reset after the upgrade, it will consider all records as new and will not deduplicate them, leading to stale vectors in the destination. \ No newline at end of file +The problem is fixed by appending the namespace and stream name to the `_ab_record_id` field to disambiguate between records originating from different streams. If a connection using **append-dedup** mode is not reset after the upgrade, it will consider all records as new and will not deduplicate them, leading to stale vectors in the destination. 
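As an illustration of the deduplication fix described in the migration note above — the exact `_ab_record_id` composition below is an assumption for illustration, not the connector's verbatim implementation — appending the namespace and stream name keeps records from different streams distinct even when they share a primary key:

```python
def record_id(namespace: str, stream: str, primary_key: str) -> str:
    # Hypothetical format: the real connector may use a different separator,
    # but the idea is the same — the stream identity becomes part of the id.
    return f"{namespace}_{stream}_{primary_key}"

# Before the fix, both of these records would collide on primary key "42";
# with the composed id they remain distinct and deduplication stays per-stream.
assert record_id("public", "users", "42") != record_id("public", "orders", "42")
```
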
diff --git a/docs/integrations/destinations/langchain.md b/docs/integrations/destinations/langchain.md index 4ac1fe15190..2e92fdcd71d 100644 --- a/docs/integrations/destinations/langchain.md +++ b/docs/integrations/destinations/langchain.md @@ -6,21 +6,23 @@ The vector db destination destination has been split into separate destinations Please use the respective destination for the vector database you want to use to ensure you receive updates and support. To following databases are supported: -* [Pinecone](https://docs.airbyte.com/integrations/destinations/pinecone) -* [Weaviate](https://docs.airbyte.com/integrations/destinations/weaviate) -* [Milvus](https://docs.airbyte.com/integrations/destinations/milvus) -* [Chroma](https://docs.airbyte.com/integrations/destinations/chroma) -* [Qdrant](https://docs.airbyte.com/integrations/destinations/qdrant) -::: + +- [Pinecone](https://docs.airbyte.com/integrations/destinations/pinecone) +- [Weaviate](https://docs.airbyte.com/integrations/destinations/weaviate) +- [Milvus](https://docs.airbyte.com/integrations/destinations/milvus) +- [Chroma](https://docs.airbyte.com/integrations/destinations/chroma) +- [Qdrant](https://docs.airbyte.com/integrations/destinations/qdrant) + ::: ## Overview This destination prepares data to be used by [Langchain](https://langchain.com/) to retrieve relevant context for question answering use cases. There are three parts to this: -* Processing - split up individual records in chunks so they will fit the context window and decide which fields to use as context and which are supplementary metadata. -* Embedding - convert the text into a vector representation using a pre-trained model (currently only OpenAI `text-embedding-ada-002` is supported) -* Indexing - store the vectors in a vector database for similarity search + +- Processing - split up individual records in chunks so they will fit the context window and decide which fields to use as context and which are supplementary metadata. +- Embedding - convert the text into a vector representation using a pre-trained model (currently only OpenAI `text-embedding-ada-002` is supported) +- Indexing - store the vectors in a vector database for similarity search ### Processing @@ -72,6 +74,7 @@ For Pinecone pods of type starter, only up to 10,000 chunks can be indexed. For ::: + #### Chroma vector store The [Chroma vector store](https://trychroma.com) is running the Chroma embedding database as persistent client and stores the vectors in a local file. @@ -105,7 +108,6 @@ Please make sure that Docker Desktop has access to `/tmp` (and `/private` on a M ::: - #### DocArrayHnswSearch vector store For local testing, the [DocArrayHnswSearch](https://python.langchain.com/docs/modules/data_connection/vectorstores/integrations/docarray_hnsw) is recommended - it stores the vectors in a local file with a sqlite database for metadata. It is not suitable for production use, but it is the easiest to set up for testing and development purposes. @@ -146,20 +148,21 @@ DocArrayHnswSearch is meant to be used on a local workstation and won't work on Please make sure that Docker Desktop has access to `/tmp` (and `/private` on a MacOS, as /tmp has a symlink that points to /private. It will not work otherwise). You allow it with "File sharing" in `Settings -> Resources -> File sharing -> add the one or two above folder` and hit the "Apply & restart" button. 
::: + ## CHANGELOG -| Version | Date | Pull Request | Subject | -|:--------| :--------- |:--------------------------------------------------------------|:-----------------------------------------------------------------------------------------------------------------------------------------------------| -| 0.1.2 | 2023-11-13 | [#32455](https://github.com/airbytehq/airbyte/pull/32455) | Fix build | -| 0.1.1 | 2023-09-01 | [#30282](https://github.com/airbytehq/airbyte/pull/30282) | Use embedders from CDK | -| 0.1.0 | 2023-09-01 | [#30080](https://github.com/airbytehq/airbyte/pull/30080) | Fix bug with potential data loss on append+dedup syncing. 🚨 Streams using append+dedup mode need to be reset after upgrade. | -| 0.0.8 | 2023-08-21 | [#29515](https://github.com/airbytehq/airbyte/pull/29515) | Clean up generated schema spec | -| 0.0.7 | 2023-08-18 | [#29513](https://github.com/airbytehq/airbyte/pull/29513) | Fix for starter pods | -| 0.0.6 | 2023-08-02 | [#28977](https://github.com/airbytehq/airbyte/pull/28977) | Validate pinecone index dimensions during check | -| 0.0.5 | 2023-07-25 | [#28605](https://github.com/airbytehq/airbyte/pull/28605) | Add Chroma support | -| 0.0.4 | 2023-07-21 | [#28556](https://github.com/airbytehq/airbyte/pull/28556) | Correctly dedupe records with composite and nested primary keys | -| 0.0.3 | 2023-07-20 | [#28509](https://github.com/airbytehq/airbyte/pull/28509) | Change the base image to python:3.9-slim to fix build | -| 0.0.2 | 2023-07-18 | [#26184](https://github.com/airbytehq/airbyte/pull/28398) | Adjust python dependencies and release on cloud | -| 0.0.1 | 2023-07-12 | [#26184](https://github.com/airbytehq/airbyte/pull/26184) | Initial release | +| Version | Date | Pull Request | Subject | +| :------ | :--------- | :-------------------------------------------------------- | :--------------------------------------------------------------------------------------------------------------------------- | +| 0.1.2 | 2023-11-13 | [#32455](https://github.com/airbytehq/airbyte/pull/32455) | Fix build | +| 0.1.1 | 2023-09-01 | [#30282](https://github.com/airbytehq/airbyte/pull/30282) | Use embedders from CDK | +| 0.1.0 | 2023-09-01 | [#30080](https://github.com/airbytehq/airbyte/pull/30080) | Fix bug with potential data loss on append+dedup syncing. 🚨 Streams using append+dedup mode need to be reset after upgrade. 
| +| 0.0.8 | 2023-08-21 | [#29515](https://github.com/airbytehq/airbyte/pull/29515) | Clean up generated schema spec | +| 0.0.7 | 2023-08-18 | [#29513](https://github.com/airbytehq/airbyte/pull/29513) | Fix for starter pods | +| 0.0.6 | 2023-08-02 | [#28977](https://github.com/airbytehq/airbyte/pull/28977) | Validate pinecone index dimensions during check | +| 0.0.5 | 2023-07-25 | [#28605](https://github.com/airbytehq/airbyte/pull/28605) | Add Chroma support | +| 0.0.4 | 2023-07-21 | [#28556](https://github.com/airbytehq/airbyte/pull/28556) | Correctly dedupe records with composite and nested primary keys | +| 0.0.3 | 2023-07-20 | [#28509](https://github.com/airbytehq/airbyte/pull/28509) | Change the base image to python:3.9-slim to fix build | +| 0.0.2 | 2023-07-18 | [#26184](https://github.com/airbytehq/airbyte/pull/28398) | Adjust python dependencies and release on cloud | +| 0.0.1 | 2023-07-12 | [#26184](https://github.com/airbytehq/airbyte/pull/26184) | Initial release | diff --git a/docs/integrations/destinations/mariadb-columnstore.md b/docs/integrations/destinations/mariadb-columnstore.md index 14c191b1a65..d266e31f9b3 100644 --- a/docs/integrations/destinations/mariadb-columnstore.md +++ b/docs/integrations/destinations/mariadb-columnstore.md @@ -6,19 +6,19 @@ Each stream will be output into its own table in MariaDB ColumnStore. Each table will contain 3 columns: -* `_airbyte_ab_id`: a uuid assigned by Airbyte to each event that is processed. The column type in MariaDB ColumnStore is VARCHAR(256). -* `_airbyte_emitted_at`: a timestamp representing when the event was pulled from the data source. The column type in MariaDB ColumnStore is TIMESTAMP. -* `_airbyte_data`: a json blob representing with the event data. The column type in MariaDB ColumnStore is LONGTEXT. +- `_airbyte_ab_id`: a uuid assigned by Airbyte to each event that is processed. The column type in MariaDB ColumnStore is VARCHAR(256). +- `_airbyte_emitted_at`: a timestamp representing when the event was pulled from the data source. The column type in MariaDB ColumnStore is TIMESTAMP. +- `_airbyte_data`: a json blob representing with the event data. The column type in MariaDB ColumnStore is LONGTEXT. ### Features -| Feature | Supported?(Yes/No) | Notes | -| :--- | :--- | :--- | -| Full Refresh Sync | Yes | | -| Incremental Sync | Yes | | -| Replicate Incremental Deletes | Yes | | -| SSL connection | No | | -| SSH Tunnel Support | Yes | | +| Feature | Supported?(Yes/No) | Notes | +| :---------------------------- | :----------------- | :---- | +| Full Refresh Sync | Yes | | +| Incremental Sync | Yes | | +| Replicate Incremental Deletes | Yes | | +| SSL connection | No | | +| SSH Tunnel Support | Yes | | ### Performance considerations @@ -44,15 +44,15 @@ MariaDB ColumnStore doesn't differentiate between a database and schema. A datab ### Setup the MariaDB ColumnStore destination in Airbyte -Before setting up MariaDB ColumnStore destination in Airbyte, you need to set the [local\_infile](https://mariadb.com/kb/en/server-system-variables/#local_infile) system variable to true. You can do this by running the query `SET GLOBAL local_infile = true` . This is required cause Airbyte uses `LOAD DATA LOCAL INFILE` to load data into table. +Before setting up MariaDB ColumnStore destination in Airbyte, you need to set the [local_infile](https://mariadb.com/kb/en/server-system-variables/#local_infile) system variable to true. You can do this by running the query `SET GLOBAL local_infile = true` . 
This is required cause Airbyte uses `LOAD DATA LOCAL INFILE` to load data into table. You should now have all the requirements needed to configure MariaDB ColumnStore as a destination in the UI. You'll need the following information to configure the MariaDB ColumnStore destination: -* **Host** -* **Port** -* **Username** -* **Password** -* **Database** +- **Host** +- **Port** +- **Username** +- **Password** +- **Database** ## Connection via SSH Tunnel @@ -74,14 +74,13 @@ Using this feature requires additional configuration, when creating the destinat ## CHANGELOG -| Version | Date | Pull Request | Subject | -|:--------|:-----------|:---------------------------------------------------------|:---------------------------------------------| -| 0.1.7 | 2022-09-07 | [16391](https://github.com/airbytehq/airbyte/pull/16391) | Add custom JDBC parameters field | -| 0.1.6 | 2022-06-17 | [13864](https://github.com/airbytehq/airbyte/pull/13864) | Updated stacktrace format for any trace message errors | -| 0.1.5 | 2022-05-17 | [12820](https://github.com/airbytehq/airbyte/pull/12820) | Improved 'check' operation performance | -| 0.1.4 | 2022-02-25 | [10421](https://github.com/airbytehq/airbyte/pull/10421) | Refactor JDBC parameters handling | -| 0.1.3 | 2022-02-14 | [10256](https://github.com/airbytehq/airbyte/pull/10256) | Add `-XX:+ExitOnOutOfMemoryError` JVM option | -| 0.1.2 | 2021-12-30 | [\#8809](https://github.com/airbytehq/airbyte/pull/8809) | Update connector fields title/description | -| 0.1.1 | 2021-12-01 | [\#8371](https://github.com/airbytehq/airbyte/pull/8371) | Fixed incorrect handling "\n" in ssh key. | -| 0.1.0 | 2021-11-15 | [\#7961](https://github.com/airbytehq/airbyte/pull/7961) | Added MariaDB ColumnStore destination. | - +| Version | Date | Pull Request | Subject | +| :------ | :--------- | :------------------------------------------------------- | :----------------------------------------------------- | +| 0.1.7 | 2022-09-07 | [16391](https://github.com/airbytehq/airbyte/pull/16391) | Add custom JDBC parameters field | +| 0.1.6 | 2022-06-17 | [13864](https://github.com/airbytehq/airbyte/pull/13864) | Updated stacktrace format for any trace message errors | +| 0.1.5 | 2022-05-17 | [12820](https://github.com/airbytehq/airbyte/pull/12820) | Improved 'check' operation performance | +| 0.1.4 | 2022-02-25 | [10421](https://github.com/airbytehq/airbyte/pull/10421) | Refactor JDBC parameters handling | +| 0.1.3 | 2022-02-14 | [10256](https://github.com/airbytehq/airbyte/pull/10256) | Add `-XX:+ExitOnOutOfMemoryError` JVM option | +| 0.1.2 | 2021-12-30 | [\#8809](https://github.com/airbytehq/airbyte/pull/8809) | Update connector fields title/description | +| 0.1.1 | 2021-12-01 | [\#8371](https://github.com/airbytehq/airbyte/pull/8371) | Fixed incorrect handling "\n" in ssh key. | +| 0.1.0 | 2021-11-15 | [\#7961](https://github.com/airbytehq/airbyte/pull/7961) | Added MariaDB ColumnStore destination. | diff --git a/docs/integrations/destinations/milvus.md b/docs/integrations/destinations/milvus.md index 0af64809cdd..8e1983d2e5b 100644 --- a/docs/integrations/destinations/milvus.md +++ b/docs/integrations/destinations/milvus.md @@ -5,9 +5,10 @@ This page guides you through the process of setting up the [Milvus](https://milvus.io/) destination connector. There are three parts to this: -* Processing - split up individual records in chunks so they will fit the context window and decide which fields to use as context and which are supplementary metadata. 
-* Embedding - convert the text into a vector representation using a pre-trained model (Currently, OpenAI's `text-embedding-ada-002` and Cohere's `embed-english-light-v2.0` are supported.) -* Indexing - store the vectors in a vector database for similarity search + +- Processing - split up individual records in chunks so they will fit the context window and decide which fields to use as context and which are supplementary metadata. +- Embedding - convert the text into a vector representation using a pre-trained model (Currently, OpenAI's `text-embedding-ada-002` and Cohere's `embed-english-light-v2.0` are supported.) +- Indexing - store the vectors in a vector database for similarity search ## Prerequisites @@ -25,13 +26,13 @@ You'll need the following information to configure the destination: ## Features -| Feature | Supported? | Notes | -| :----------------------------- | :------------------- | :---- | -| Full Refresh Sync | Yes | | -| Incremental - Append Sync | Yes | | -| Incremental - Append + Deduped | Yes | | -| Partitions | No | | -| Record-defined ID | No | Auto-id needs to be enabled | +| Feature | Supported? | Notes | +| :----------------------------- | :--------- | :-------------------------- | +| Full Refresh Sync | Yes | | +| Incremental - Append Sync | Yes | | +| Incremental - Append + Deduped | Yes | | +| Partitions | No | | +| Record-defined ID | No | Auto-id needs to be enabled | ## Configuration @@ -51,7 +52,7 @@ The connector can use one of the following embedding methods: 1. OpenAI - using [OpenAI API](https://beta.openai.com/docs/api-reference/text-embedding) , the connector will produce embeddings using the `text-embedding-ada-002` model with **1536 dimensions**. This integration will be constrained by the [speed of the OpenAI embedding API](https://platform.openai.com/docs/guides/rate-limits/overview). -2. Cohere - using the [Cohere API](https://docs.cohere.com/reference/embed), the connector will produce embeddings using the `embed-english-light-v2.0` model with **1024 dimensions**. +2. Cohere - using the [Cohere API](https://docs.cohere.com/reference/embed), the connector will produce embeddings using the `embed-english-light-v2.0` model with **1024 dimensions**. For testing purposes, it's also possible to use the [Fake embeddings](https://python.langchain.com/docs/modules/data_connection/text_embedding/integrations/fake) integration. It will generate random embeddings and is suitable to test a data pipeline without incurring embedding costs. @@ -60,14 +61,16 @@ For testing purposes, it's also possible to use the [Fake embeddings](https://py If the specified collection doesn't exist, the connector will create it for you with a primary key field `pk` and the configured vector field matching the embedding configuration. Dynamic fields will be enabled. The vector field will have an L2 IVF_FLAT index with an `nlist` parameter of 1024. If you want to change any of these settings, create a new collection in your Milvus instance yourself. 
Make sure that -* The primary key field is set to [auto_id](https://milvus.io/docs/create_collection.md) -* There is a vector field with the correct dimensionality (1536 for OpenAI, 1024 for Cohere) and [a configured index](https://milvus.io/docs/build_index.md) + +- The primary key field is set to [auto_id](https://milvus.io/docs/create_collection.md) +- There is a vector field with the correct dimensionality (1536 for OpenAI, 1024 for Cohere) and [a configured index](https://milvus.io/docs/build_index.md) If the record contains a field with the same name as the primary key, it will be prefixed with an underscore so Milvus can control the primary key internally. ### Setting up a collection When using the Zilliz cloud, this can be done using the UI - in this case only the collection name and the vector dimensionality needs to be configured, the vector field with index will be automatically created under the name `vector`. Using the REST API, the following command will create the index: + ``` POST /v1/vector/collections/create { @@ -80,6 +83,7 @@ POST /v1/vector/collections/create ``` When using a self-hosted Milvus cluster, the collection needs to be created using the Milvus CLI or Python client. The following commands will create a collection set up for loading data via Airbyte: + ```python from pymilvus import CollectionSchema, FieldSchema, DataType, connections, Collection @@ -95,6 +99,7 @@ collection.create_index(field_name="vector", index_params={ "metric_type":"L2", ### Langchain integration To initialize a langchain vector store based on the indexed data, use the following code: + ```python embeddings = OpenAIEmbeddings(openai_api_key="my-key") vector_store = Milvus(embeddings=embeddings, collection_name="my-collection", connection_args={"uri": "my-zilliz-endpoint", "token": "my-api-key"}) @@ -104,22 +109,21 @@ vector_store.fields.append("text") vector_store.similarity_search("test") ``` - ## CHANGELOG -| Version | Date | Pull Request | Subject | -|:--------| :--------- |:--------------------------------------------------------------|:-----------------------------------------------------------------------------------------------------------------------------------------------------| -| 0.0.14 | 2024-3-22 | [#37333](https://github.com/airbytehq/airbyte/pull/37333) | Update CDK & pytest version to fix security vulnerabilities | -| 0.0.13 | 2024-3-22 | [#35911](https://github.com/airbytehq/airbyte/pull/35911) | Move to poetry; Fix tests | -| 0.0.12 | 2023-12-11 | [#33303](https://github.com/airbytehq/airbyte/pull/33303) | Fix bug with embedding special tokens | -| 0.0.11 | 2023-12-01 | [#32697](https://github.com/airbytehq/airbyte/pull/32697) | Allow omitting raw text | -| 0.0.10 | 2023-11-16 | [#32608](https://github.com/airbytehq/airbyte/pull/32608) | Support deleting records for CDC sources | -| 0.0.9 | 2023-11-13 | [#32357](https://github.com/airbytehq/airbyte/pull/32357) | Improve spec schema | -| 0.0.8 | 2023-11-08 | [#31563](https://github.com/airbytehq/airbyte/pull/32262) | Auto-create collection if it doesn't exist | -| 0.0.7 | 2023-10-23 | [#31563](https://github.com/airbytehq/airbyte/pull/31563) | Add field mapping option | -| 0.0.6 | 2023-10-19 | [#31599](https://github.com/airbytehq/airbyte/pull/31599) | Base image migration: remove Dockerfile and use the python-connector-base image | -| 0.0.5 | 2023-10-15 | [#31329](https://github.com/airbytehq/airbyte/pull/31329) | Add OpenAI-compatible embedder option | -| 0.0.4 | 2023-10-04 | 
[#31075](https://github.com/airbytehq/airbyte/pull/31075) | Fix OpenAI embedder batch size | -| 0.0.3 | 2023-09-29 | [#30820](https://github.com/airbytehq/airbyte/pull/30820) | Update CDK | -| 0.0.2 | 2023-08-25 | [#30689](https://github.com/airbytehq/airbyte/pull/30689) | Update CDK to support azure OpenAI embeddings and text splitting options, make sure primary key field is not accidentally set, promote to certified | -| 0.0.1 | 2023-08-12 | [#29442](https://github.com/airbytehq/airbyte/pull/29442) | Milvus connector with some embedders | \ No newline at end of file +| Version | Date | Pull Request | Subject | +| :------ | :--------- | :-------------------------------------------------------- | :-------------------------------------------------------------------------------------------------------------------------------------------------- | +| 0.0.14 | 2024-3-22 | [#37333](https://github.com/airbytehq/airbyte/pull/37333) | Update CDK & pytest version to fix security vulnerabilities | +| 0.0.13 | 2024-3-22 | [#35911](https://github.com/airbytehq/airbyte/pull/35911) | Move to poetry; Fix tests | +| 0.0.12 | 2023-12-11 | [#33303](https://github.com/airbytehq/airbyte/pull/33303) | Fix bug with embedding special tokens | +| 0.0.11 | 2023-12-01 | [#32697](https://github.com/airbytehq/airbyte/pull/32697) | Allow omitting raw text | +| 0.0.10 | 2023-11-16 | [#32608](https://github.com/airbytehq/airbyte/pull/32608) | Support deleting records for CDC sources | +| 0.0.9 | 2023-11-13 | [#32357](https://github.com/airbytehq/airbyte/pull/32357) | Improve spec schema | +| 0.0.8 | 2023-11-08 | [#31563](https://github.com/airbytehq/airbyte/pull/32262) | Auto-create collection if it doesn't exist | +| 0.0.7 | 2023-10-23 | [#31563](https://github.com/airbytehq/airbyte/pull/31563) | Add field mapping option | +| 0.0.6 | 2023-10-19 | [#31599](https://github.com/airbytehq/airbyte/pull/31599) | Base image migration: remove Dockerfile and use the python-connector-base image | +| 0.0.5 | 2023-10-15 | [#31329](https://github.com/airbytehq/airbyte/pull/31329) | Add OpenAI-compatible embedder option | +| 0.0.4 | 2023-10-04 | [#31075](https://github.com/airbytehq/airbyte/pull/31075) | Fix OpenAI embedder batch size | +| 0.0.3 | 2023-09-29 | [#30820](https://github.com/airbytehq/airbyte/pull/30820) | Update CDK | +| 0.0.2 | 2023-08-25 | [#30689](https://github.com/airbytehq/airbyte/pull/30689) | Update CDK to support azure OpenAI embeddings and text splitting options, make sure primary key field is not accidentally set, promote to certified | +| 0.0.1 | 2023-08-12 | [#29442](https://github.com/airbytehq/airbyte/pull/29442) | Milvus connector with some embedders | diff --git a/docs/integrations/destinations/mssql-migrations.md b/docs/integrations/destinations/mssql-migrations.md index a966bee8b34..d4166eabacf 100644 --- a/docs/integrations/destinations/mssql-migrations.md +++ b/docs/integrations/destinations/mssql-migrations.md @@ -16,7 +16,6 @@ This upgrade will ignore any existing raw tables and will not migrate any data t For each stream, you should perform the following query to migrate the data from the old raw table to the new raw table: - ```sql -- assumes your schema was 'default' -- replace `{{stream_name}}` with replace your stream name @@ -42,6 +41,7 @@ FROM airbyte._airbyte_raw_{{stream_name}} **Airbyte will not delete any of your v1 data.** ### Schema and the Internal Schema + We have split the raw and final tables into their own schemas. 
For the Microsoft SQL Server destination, this means that we will only write into the raw table which will live in the `airbyte_internal` schema. The tables written into this schema will be prefixed with either the default database provided in diff --git a/docs/integrations/destinations/mssql.md b/docs/integrations/destinations/mssql.md index 9aa0a06dd72..60417298571 100644 --- a/docs/integrations/destinations/mssql.md +++ b/docs/integrations/destinations/mssql.md @@ -115,7 +115,7 @@ Using this feature requires additional configuration, when creating the source. ## Changelog | Version | Date | Pull Request | Subject | -|:--------|:-----------|:-----------------------------------------------------------|:----------------------------------------------------------------------------------------------------| +| :------ | :--------- | :--------------------------------------------------------- | :-------------------------------------------------------------------------------------------------- | | 1.0.0 | 2024-04-11 | [\#36050](https://github.com/airbytehq/airbyte/pull/36050) | Update to Dv2 Table Format and Remove normalization | | 0.2.0 | 2023-06-27 | [\#27781](https://github.com/airbytehq/airbyte/pull/27781) | License Update: Elv2 | | 0.1.25 | 2023-06-21 | [\#27555](https://github.com/airbytehq/airbyte/pull/27555) | Reduce image size | diff --git a/docs/integrations/destinations/mysql-migrations.md b/docs/integrations/destinations/mysql-migrations.md index bb0e33134b0..2fd780d8b81 100644 --- a/docs/integrations/destinations/mysql-migrations.md +++ b/docs/integrations/destinations/mysql-migrations.md @@ -2,7 +2,7 @@ ## Upgrading to 1.0.0 -This version introduces [Destinations V2](/release_notes/upgrading_to_destinations_v2/#what-is-destinations-v2), which provides better error handling and improved final table structures. To review the breaking changes, and how to upgrade, see [here](/release_notes/upgrading_to_destinations_v2/#quick-start-to-upgrading). These changes will likely require updates to downstream dbt / SQL models, which we walk through [here](/release_notes/upgrading_to_destinations_v2/#updating-downstream-transformations). Selecting `Upgrade` will upgrade **all** connections using this destination at their next sync. You can manually sync existing connections prior to the next scheduled sync to start the upgrade early. +This version introduces [Destinations V2](/release_notes/upgrading_to_destinations_v2/#what-is-destinations-v2), which provides better error handling and improved final table structures. To review the breaking changes, and how to upgrade, see [here](/release_notes/upgrading_to_destinations_v2/#quick-start-to-upgrading). These changes will likely require updates to downstream dbt / SQL models, which we walk through [here](/release_notes/upgrading_to_destinations_v2/#updating-downstream-transformations). Selecting `Upgrade` will upgrade **all** connections using this destination at their next sync. You can manually sync existing connections prior to the next scheduled sync to start the upgrade early. 
Worthy of specific mention, this version includes: diff --git a/docs/integrations/destinations/oracle-migrations.md b/docs/integrations/destinations/oracle-migrations.md index 212006e46b5..96b01d83e2f 100644 --- a/docs/integrations/destinations/oracle-migrations.md +++ b/docs/integrations/destinations/oracle-migrations.md @@ -7,7 +7,7 @@ the schema and database of Airbyte's "raw" tables to be compatible with the new [Destinations V2](https://docs.airbyte.com/release_notes/upgrading_to_destinations_v2/#what-is-destinations-v2) format. These changes will likely require updates to downstream dbt / SQL models. After this update, Airbyte will only produce the "raw" v2 tables, which store all content in JSON. These changes remove -the ability to do deduplicated syncs with Oracle. +the ability to do deduplicated syncs with Oracle. If you are interested in the Oracle destination gaining the full features of Destinations V2 (including final tables), click [[https://github.com/airbytehq/airbyte/discussions/37024]] @@ -42,6 +42,7 @@ INSERT INTO airbyte_internal.default_raw__stream_{{stream_name}} **Airbyte will not delete any of your v1 data.** ### Database/Schema and the Internal Schema + We have split the raw and final tables into their own schemas, which means that we will only write into the raw tables which will live in the `airbyte_internal` schema. The tables written into this schema will be prefixed with either the default schema provided in diff --git a/docs/integrations/destinations/oracle.md b/docs/integrations/destinations/oracle.md index 4d6e43f6daa..14d388c0a09 100644 --- a/docs/integrations/destinations/oracle.md +++ b/docs/integrations/destinations/oracle.md @@ -91,7 +91,7 @@ Airbyte has the ability to connect to the Oracle source with 3 network connectiv ## Changelog | Version | Date | Pull Request | Subject | -|:------------|:-----------|:-----------------------------------------------------------|:----------------------------------------------------------------------------------------------------| +| :---------- | :--------- | :--------------------------------------------------------- | :-------------------------------------------------------------------------------------------------- | | 1.0.0 | 2024-04-11 | [\#36048](https://github.com/airbytehq/airbyte/pull/36048) | Removes Normalization, updates to V2 Raw Table Format | | 0.2.0 | 2023-06-27 | [\#27781](https://github.com/airbytehq/airbyte/pull/27781) | License Update: Elv2 | | 0.1.19 | 2022-07-26 | [\#10719](https://github.com/airbytehq/airbyte/pull/) | Destination Oracle: added custom JDBC parameters support. | diff --git a/docs/integrations/destinations/pinecone.md b/docs/integrations/destinations/pinecone.md index f55277c5181..f91731c4e45 100644 --- a/docs/integrations/destinations/pinecone.md +++ b/docs/integrations/destinations/pinecone.md @@ -5,9 +5,10 @@ This page guides you through the process of setting up the [Pinecone](https://pinecone.io/) destination connector. There are three parts to this: -* Processing - split up individual records in chunks so they will fit the context window and decide which fields to use as context and which are supplementary metadata. -* Embedding - convert the text into a vector representation using a pre-trained model (Currently, OpenAI's `text-embedding-ada-002` and Cohere's `embed-english-light-v2.0` are supported.) 
-* Indexing - store the vectors in a vector database for similarity search + +- Processing - split up individual records in chunks so they will fit the context window and decide which fields to use as context and which are supplementary metadata. +- Embedding - convert the text into a vector representation using a pre-trained model (Currently, OpenAI's `text-embedding-ada-002` and Cohere's `embed-english-light-v2.0` are supported.) +- Indexing - store the vectors in a vector database for similarity search ## Prerequisites @@ -25,20 +26,21 @@ You'll need the following information to configure the destination: ## Features -| Feature | Supported? | Notes | -| :----------------------------- | :------------------- | :---- | -| Full Refresh Sync | Yes | | -| Incremental - Append Sync | Yes | | -| Incremental - Append + Deduped | Yes | Deleting records via CDC is not supported (see issue [#29827](https://github.com/airbytehq/airbyte/issues/29827)) | -| Namespaces | Yes | | +| Feature | Supported? | Notes | +| :----------------------------- | :--------- | :---------------------------------------------------------------------------------------------------------------- | +| Full Refresh Sync | Yes | | +| Incremental - Append Sync | Yes | | +| Incremental - Append + Deduped | Yes | Deleting records via CDC is not supported (see issue [#29827](https://github.com/airbytehq/airbyte/issues/29827)) | +| Namespaces | Yes | | ## Data type mapping All fields specified as metadata fields will be stored in the metadata object of the document and can be used for filtering. The following data types are allowed for metadata fields: -* String -* Number (integer or floating point, gets converted to a 64 bit floating point) -* Booleans (true, false) -* List of String + +- String +- Number (integer or floating point, gets converted to a 64 bit floating point) +- Booleans (true, false) +- List of String All other fields are ignored. @@ -46,7 +48,7 @@ All other fields are ignored. ### Processing -Each record will be split into text fields and meta fields as configured in the "Processing" section. All text fields are concatenated into a single string and then split into chunks of configured length. If specified, the metadata fields are stored as-is along with the embedded text chunks. Please note that meta data fields can only be used for filtering and not for retrieval and have to be of type string, number, boolean (all other values are ignored). Please note that there's a 40kb limit on the _total_ size of the metadata saved for each entry. Options around configuring the chunking process use the [Langchain Python library](https://python.langchain.com/docs/get_started/introduction). +Each record will be split into text fields and meta fields as configured in the "Processing" section. All text fields are concatenated into a single string and then split into chunks of configured length. If specified, the metadata fields are stored as-is along with the embedded text chunks. Please note that meta data fields can only be used for filtering and not for retrieval and have to be of type string, number, boolean (all other values are ignored). Please note that there's a 40kb limit on the _total_ size of the metadata saved for each entry. Options around configuring the chunking process use the [Langchain Python library](https://python.langchain.com/docs/get_started/introduction). When specifying text fields, you can access nested fields in the record by using dot notation, e.g. 
`user.name` will access the `name` field in the `user` object. It's also possible to use wildcards to access all fields in an object, e.g. `users.*.name` will access all `names` fields in all entries of the `users` array. @@ -72,30 +74,30 @@ OpenAI and Fake embeddings produce vectors with 1536 dimensions, and the Cohere ## CHANGELOG -| Version | Date | Pull Request | Subject | -|:--------| :--------- |:--------------------------------------------------------------|:-----------------------------------------------------------------------------------------------------------------------------------------------------| -| 0.1.0 | 2023-05-06 | [#37756](https://github.com/airbytehq/airbyte/pull/37756) | Add support for Pinecone Serverless | -| 0.0.24 | 2023-04-15 | [#37333](https://github.com/airbytehq/airbyte/pull/37333) | Update CDK & pytest version to fix security vulnerabilities. | -| 0.0.23 | 2023-03-22 | [#35911](https://github.com/airbytehq/airbyte/pull/35911) | Bump versions to latest, resolves test failures. | -| 0.0.22 | 2023-12-11 | [#33303](https://github.com/airbytehq/airbyte/pull/33303) | Fix bug with embedding special tokens | -| 0.0.21 | 2023-12-01 | [#32697](https://github.com/airbytehq/airbyte/pull/32697) | Allow omitting raw text | -| 0.0.20 | 2023-11-13 | [#32357](https://github.com/airbytehq/airbyte/pull/32357) | Improve spec schema | -| 0.0.19 | 2023-10-20 | [#31329](https://github.com/airbytehq/airbyte/pull/31373) | Improve error messages | -| 0.0.18 | 2023-10-20 | [#31329](https://github.com/airbytehq/airbyte/pull/31373) | Add support for namespaces and fix index cleaning when namespace is defined | -| 0.0.17 | 2023-10-19 | [#31599](https://github.com/airbytehq/airbyte/pull/31599) | Base image migration: remove Dockerfile and use the python-connector-base image | -| 0.0.16 | 2023-10-15 | [#31329](https://github.com/airbytehq/airbyte/pull/31329) | Add OpenAI-compatible embedder option | -| 0.0.15 | 2023-10-04 | [#31075](https://github.com/airbytehq/airbyte/pull/31075) | Fix OpenAI embedder batch size | -| 0.0.14 | 2023-09-29 | [#30820](https://github.com/airbytehq/airbyte/pull/30820) | Update CDK | -| 0.0.13 | 2023-09-26 | [#30649](https://github.com/airbytehq/airbyte/pull/30649) | Allow more text splitting options | -| 0.0.12 | 2023-09-25 | [#30649](https://github.com/airbytehq/airbyte/pull/30649) | Fix bug with stale documents left on starter pods | -| 0.0.11 | 2023-09-22 | [#30649](https://github.com/airbytehq/airbyte/pull/30649) | Set visible certified flag | -| 0.0.10 | 2023-09-20 | [#30514](https://github.com/airbytehq/airbyte/pull/30514) | Fix bug with failing embedding step on large records | -| 0.0.9 | 2023-09-18 | [#30510](https://github.com/airbytehq/airbyte/pull/30510) | Fix bug with overwrite mode on starter pods | -| 0.0.8 | 2023-09-14 | [#30296](https://github.com/airbytehq/airbyte/pull/30296) | Add Azure embedder | -| 0.0.7 | 2023-09-13 | [#30382](https://github.com/airbytehq/airbyte/pull/30382) | Promote to certified/beta | -| 0.0.6 | 2023-09-09 | [#30193](https://github.com/airbytehq/airbyte/pull/30193) | Improve documentation | -| 0.0.5 | 2023-09-07 | [#30133](https://github.com/airbytehq/airbyte/pull/30133) | Refactor internal structure of connector | -| 0.0.4 | 2023-09-05 | [#30086](https://github.com/airbytehq/airbyte/pull/30079) | Switch to GRPC client for improved performance. | -| 0.0.3 | 2023-09-01 | [#30079](https://github.com/airbytehq/airbyte/pull/30079) | Fix bug with potential data loss on append+dedup syncing. 
🚨 Streams using append+dedup mode need to be reset after upgrade. | -| 0.0.2 | 2023-08-31 | [#29442](https://github.com/airbytehq/airbyte/pull/29946) | Improve test coverage | -| 0.0.1 | 2023-08-29 | [#29539](https://github.com/airbytehq/airbyte/pull/29539) | Pinecone connector with some embedders | +| Version | Date | Pull Request | Subject | +| :------ | :--------- | :-------------------------------------------------------- | :--------------------------------------------------------------------------------------------------------------------------- | +| 0.1.0 | 2023-05-06 | [#37756](https://github.com/airbytehq/airbyte/pull/37756) | Add support for Pinecone Serverless | +| 0.0.24 | 2023-04-15 | [#37333](https://github.com/airbytehq/airbyte/pull/37333) | Update CDK & pytest version to fix security vulnerabilities. | +| 0.0.23 | 2023-03-22 | [#35911](https://github.com/airbytehq/airbyte/pull/35911) | Bump versions to latest, resolves test failures. | +| 0.0.22 | 2023-12-11 | [#33303](https://github.com/airbytehq/airbyte/pull/33303) | Fix bug with embedding special tokens | +| 0.0.21 | 2023-12-01 | [#32697](https://github.com/airbytehq/airbyte/pull/32697) | Allow omitting raw text | +| 0.0.20 | 2023-11-13 | [#32357](https://github.com/airbytehq/airbyte/pull/32357) | Improve spec schema | +| 0.0.19 | 2023-10-20 | [#31329](https://github.com/airbytehq/airbyte/pull/31373) | Improve error messages | +| 0.0.18 | 2023-10-20 | [#31329](https://github.com/airbytehq/airbyte/pull/31373) | Add support for namespaces and fix index cleaning when namespace is defined | +| 0.0.17 | 2023-10-19 | [#31599](https://github.com/airbytehq/airbyte/pull/31599) | Base image migration: remove Dockerfile and use the python-connector-base image | +| 0.0.16 | 2023-10-15 | [#31329](https://github.com/airbytehq/airbyte/pull/31329) | Add OpenAI-compatible embedder option | +| 0.0.15 | 2023-10-04 | [#31075](https://github.com/airbytehq/airbyte/pull/31075) | Fix OpenAI embedder batch size | +| 0.0.14 | 2023-09-29 | [#30820](https://github.com/airbytehq/airbyte/pull/30820) | Update CDK | +| 0.0.13 | 2023-09-26 | [#30649](https://github.com/airbytehq/airbyte/pull/30649) | Allow more text splitting options | +| 0.0.12 | 2023-09-25 | [#30649](https://github.com/airbytehq/airbyte/pull/30649) | Fix bug with stale documents left on starter pods | +| 0.0.11 | 2023-09-22 | [#30649](https://github.com/airbytehq/airbyte/pull/30649) | Set visible certified flag | +| 0.0.10 | 2023-09-20 | [#30514](https://github.com/airbytehq/airbyte/pull/30514) | Fix bug with failing embedding step on large records | +| 0.0.9 | 2023-09-18 | [#30510](https://github.com/airbytehq/airbyte/pull/30510) | Fix bug with overwrite mode on starter pods | +| 0.0.8 | 2023-09-14 | [#30296](https://github.com/airbytehq/airbyte/pull/30296) | Add Azure embedder | +| 0.0.7 | 2023-09-13 | [#30382](https://github.com/airbytehq/airbyte/pull/30382) | Promote to certified/beta | +| 0.0.6 | 2023-09-09 | [#30193](https://github.com/airbytehq/airbyte/pull/30193) | Improve documentation | +| 0.0.5 | 2023-09-07 | [#30133](https://github.com/airbytehq/airbyte/pull/30133) | Refactor internal structure of connector | +| 0.0.4 | 2023-09-05 | [#30086](https://github.com/airbytehq/airbyte/pull/30079) | Switch to GRPC client for improved performance. | +| 0.0.3 | 2023-09-01 | [#30079](https://github.com/airbytehq/airbyte/pull/30079) | Fix bug with potential data loss on append+dedup syncing. 🚨 Streams using append+dedup mode need to be reset after upgrade. 
| +| 0.0.2 | 2023-08-31 | [#29442](https://github.com/airbytehq/airbyte/pull/29946) | Improve test coverage | +| 0.0.1 | 2023-08-29 | [#29539](https://github.com/airbytehq/airbyte/pull/29539) | Pinecone connector with some embedders | diff --git a/docs/integrations/destinations/postgres-migrations.md b/docs/integrations/destinations/postgres-migrations.md index 5c6375c6f91..951a3c626c2 100644 --- a/docs/integrations/destinations/postgres-migrations.md +++ b/docs/integrations/destinations/postgres-migrations.md @@ -2,7 +2,7 @@ ## Upgrading to 2.0.0 -This version introduces [Destinations V2](/release_notes/upgrading_to_destinations_v2/#what-is-destinations-v2), which provides better error handling, incremental delivery of data for large syncs, and improved final table structures. To review the breaking changes, and how to upgrade, see [here](/release_notes/upgrading_to_destinations_v2/#quick-start-to-upgrading). These changes will likely require updates to downstream dbt / SQL models, which we walk through [here](/release_notes/upgrading_to_destinations_v2/#updating-downstream-transformations). Selecting `Upgrade` will upgrade **all** connections using this destination at their next sync. You can manually sync existing connections prior to the next scheduled sync to start the upgrade early. +This version introduces [Destinations V2](/release_notes/upgrading_to_destinations_v2/#what-is-destinations-v2), which provides better error handling, incremental delivery of data for large syncs, and improved final table structures. To review the breaking changes, and how to upgrade, see [here](/release_notes/upgrading_to_destinations_v2/#quick-start-to-upgrading). These changes will likely require updates to downstream dbt / SQL models, which we walk through [here](/release_notes/upgrading_to_destinations_v2/#updating-downstream-transformations). Selecting `Upgrade` will upgrade **all** connections using this destination at their next sync. You can manually sync existing connections prior to the next scheduled sync to start the upgrade early. Worthy of specific mention, this version includes: diff --git a/docs/integrations/destinations/postgres.md b/docs/integrations/destinations/postgres.md index d6d2add6a68..9df03b57b85 100644 --- a/docs/integrations/destinations/postgres.md +++ b/docs/integrations/destinations/postgres.md @@ -243,8 +243,9 @@ with this option! You may want to create objects that depend on the tables generated by Airbyte, such as views. If you do so, we strongly recommend: -* Using a tool like `dbt` to automate the creation -* And using an orchestrator to trigger `dbt`. + +- Using a tool like `dbt` to automate the creation +- And using an orchestrator to trigger `dbt`. This is because you will need to enable the "Drop tables with CASCADE" option. 
The connector sometimes needs to recreate the tables; if you have created dependent objects, Postgres will require @@ -262,7 +263,7 @@ Now that you have set up the Postgres destination connector, check out the follo ## Changelog | Version | Date | Pull Request | Subject | -|:--------|:-----------|:-----------------------------------------------------------|:---------------------------------------------------------------------------------------------------------| +| :------ | :--------- | :--------------------------------------------------------- | :------------------------------------------------------------------------------------------------------- | | 2.0.9 | 2024-04-11 | [\#36974](https://github.com/airbytehq/airbyte/pull/36974) | Add option to drop with `CASCADE` | | 2.0.8 | 2024-04-10 | [\#36805](https://github.com/airbytehq/airbyte/pull/36805) | Adopt CDK 0.29.10 to improve long column name handling | | 2.0.7 | 2024-04-08 | [\#36768](https://github.com/airbytehq/airbyte/pull/36768) | Adopt CDK 0.29.7 to improve destination state handling | diff --git a/docs/integrations/destinations/qdrant.md b/docs/integrations/destinations/qdrant.md index 9e56e223684..c57a1e83855 100644 --- a/docs/integrations/destinations/qdrant.md +++ b/docs/integrations/destinations/qdrant.md @@ -1,8 +1,7 @@ # Qdrant + This page guides you through the process of setting up the [Qdrant](https://qdrant.tech/documentation/) destination connector. - - ## Features | Feature | Supported?\(Yes/No\) | Notes | @@ -20,15 +19,16 @@ For each [point](https://qdrant.tech/documentation/concepts/points/) in the coll ## Getting Started You can connect to a Qdrant instance either in local mode or cloud mode. - - For the local mode, you will need to set it up using Docker. Check the Qdrant docs [here](https://qdrant.tech/documentation/guides/installation/#docker) for an official guide. After setting up, you would need your host, port and if applicable, your gRPC port. - - To setup to an instance in Qdrant cloud, check out [this official guide](https://qdrant.tech/documentation/cloud/) to get started. After setting up the instance, you would need the instance url and an API key to connect. + +- For the local mode, you will need to set it up using Docker. Check the Qdrant docs [here](https://qdrant.tech/documentation/guides/installation/#docker) for an official guide. After setting up, you would need your host, port and if applicable, your gRPC port. +- To setup to an instance in Qdrant cloud, check out [this official guide](https://qdrant.tech/documentation/cloud/) to get started. After setting up the instance, you would need the instance url and an API key to connect. Note that this connector does not support a local persistent mode. To test, use the docker option. - #### Requirements To use the Qdrant destination, you'll need: + - An account with API access for OpenAI, Cohere (depending on which embedding method you want to use) or neither (if you want to extract the vectors from the source stream) - A Qdrant db instance (local mode or cloud mode) - Qdrant API Credentials (for cloud mode) @@ -39,7 +39,6 @@ To use the Qdrant destination, you'll need: Make sure your Qdrant database can be accessed by Airbyte. If your database is within a VPC, you may need to allow access from the IP you're using to expose Airbyte. - ### Setup the Qdrant Destination in Airbyte You should now have all the requirements needed to configure Qdrant as a destination in the UI. 
You'll need the following information to configure the Qdrant destination: @@ -47,14 +46,14 @@ You should now have all the requirements needed to configure Qdrant as a destina - (Required) **Text fields to embed** - (Optional) **Text splitter** Options around configuring the chunking process provided by the [Langchain Python library](https://python.langchain.com/docs/get_started/introduction). - (Required) **Fields to store as metadata** -- (Required) **Collection** The name of the collection in Qdrant db to store your data +- (Required) **Collection** The name of the collection in Qdrant db to store your data - (Required) **The field in the payload that contains the embedded text** - (Required) **Prefer gRPC** Whether to prefer gRPC over HTTP. - (Required) **Distance Metric** The Distance metrics used to measure similarities among vectors. Select from: - [Dot product](https://en.wikipedia.org/wiki/Dot_product) - [Cosine similarity](https://en.wikipedia.org/wiki/Cosine_similarity) - [Euclidean distance](https://en.wikipedia.org/wiki/Euclidean_distance) -- (Required) Authentication method +- (Required) Authentication method - For local mode - **Host** for example localhost - **Port** for example 8000 @@ -62,23 +61,23 @@ You should now have all the requirements needed to configure Qdrant as a destina - For cloud mode - **Url** The url of the cloud Qdrant instance. - **API Key** The API Key for the cloud Qdrant instance -- (Optional) Embedding +- (Optional) Embedding - **OpenAI API key** if using OpenAI for embedding - **Cohere API key** if using Cohere for embedding - Embedding **Field name** and **Embedding dimensions** if getting the embeddings from stream records ## Changelog -| Version | Date | Pull Request | Subject | -| :------ | :--------- | :--------------------------------------------------------- | :----------------------------------------- | -| 0.0.11 | 2024-04-15 | [#37333](https://github.com/airbytehq/airbyte/pull/37333) | Updated CDK and pytest versions to fix security vulnerabilities | -| 0.0.10 | 2023-12-11 | [#33303](https://github.com/airbytehq/airbyte/pull/33303) | Fix bug with embedding special tokens | -| 0.0.9 | 2023-12-01 | [#32697](https://github.com/airbytehq/airbyte/pull/32697) | Allow omitting raw text | -| 0.0.8 | 2023-11-29 | [#32608](https://github.com/airbytehq/airbyte/pull/32608) | Support deleting records for CDC sources and fix spec schema | -| 0.0.7 | 2023-11-13 | [#32357](https://github.com/airbytehq/airbyte/pull/32357) | Improve spec schema | -| 0.0.6 | 2023-10-23 | [#31563](https://github.com/airbytehq/airbyte/pull/31563) | Add field mapping option | -| 0.0.5 | 2023-10-15 | [#31329](https://github.com/airbytehq/airbyte/pull/31329) | Add OpenAI-compatible embedder option | -| 0.0.4 | 2023-10-04 | [#31075](https://github.com/airbytehq/airbyte/pull/31075) | Fix OpenAI embedder batch size | -| 0.0.3 | 2023-09-29 | [#30820](https://github.com/airbytehq/airbyte/pull/30820) | Update CDK | +| Version | Date | Pull Request | Subject | +| :------ | :--------- | :-------------------------------------------------------- | :----------------------------------------------------------------------- | +| 0.0.11 | 2024-04-15 | [#37333](https://github.com/airbytehq/airbyte/pull/37333) | Updated CDK and pytest versions to fix security vulnerabilities | +| 0.0.10 | 2023-12-11 | [#33303](https://github.com/airbytehq/airbyte/pull/33303) | Fix bug with embedding special tokens | +| 0.0.9 | 2023-12-01 | [#32697](https://github.com/airbytehq/airbyte/pull/32697) | Allow omitting 
raw text | +| 0.0.8 | 2023-11-29 | [#32608](https://github.com/airbytehq/airbyte/pull/32608) | Support deleting records for CDC sources and fix spec schema | +| 0.0.7 | 2023-11-13 | [#32357](https://github.com/airbytehq/airbyte/pull/32357) | Improve spec schema | +| 0.0.6 | 2023-10-23 | [#31563](https://github.com/airbytehq/airbyte/pull/31563) | Add field mapping option | +| 0.0.5 | 2023-10-15 | [#31329](https://github.com/airbytehq/airbyte/pull/31329) | Add OpenAI-compatible embedder option | +| 0.0.4 | 2023-10-04 | [#31075](https://github.com/airbytehq/airbyte/pull/31075) | Fix OpenAI embedder batch size | +| 0.0.3 | 2023-09-29 | [#30820](https://github.com/airbytehq/airbyte/pull/30820) | Update CDK | | 0.0.2 | 2023-09-25 | [#30689](https://github.com/airbytehq/airbyte/pull/30689) | Update CDK to support Azure OpenAI embeddings and text splitting options | -| 0.0.1 | 2023-09-22 | [#30332](https://github.com/airbytehq/airbyte/pull/30332) | 🎉 New Destination: Qdrant (Vector Database) | +| 0.0.1 | 2023-09-22 | [#30332](https://github.com/airbytehq/airbyte/pull/30332) | 🎉 New Destination: Qdrant (Vector Database) | diff --git a/docs/integrations/destinations/redshift-migrations.md b/docs/integrations/destinations/redshift-migrations.md index 59d91b557f8..7cd43c08cb6 100644 --- a/docs/integrations/destinations/redshift-migrations.md +++ b/docs/integrations/destinations/redshift-migrations.md @@ -2,7 +2,7 @@ ## Upgrading to 2.0.0 -This version introduces [Destinations V2](/release_notes/upgrading_to_destinations_v2/#what-is-destinations-v2), which provides better error handling, incremental delivery of data for large syncs, and improved final table structures. To review the breaking changes, and how to upgrade, see [here](/release_notes/upgrading_to_destinations_v2/#quick-start-to-upgrading). These changes will likely require updates to downstream dbt / SQL models, which we walk through [here](/release_notes/upgrading_to_destinations_v2/#updating-downstream-transformations). Selecting `Upgrade` will upgrade **all** connections using this destination at their next sync. You can manually sync existing connections prior to the next scheduled sync to start the upgrade early. +This version introduces [Destinations V2](/release_notes/upgrading_to_destinations_v2/#what-is-destinations-v2), which provides better error handling, incremental delivery of data for large syncs, and improved final table structures. To review the breaking changes, and how to upgrade, see [here](/release_notes/upgrading_to_destinations_v2/#quick-start-to-upgrading). These changes will likely require updates to downstream dbt / SQL models, which we walk through [here](/release_notes/upgrading_to_destinations_v2/#updating-downstream-transformations). Selecting `Upgrade` will upgrade **all** connections using this destination at their next sync. You can manually sync existing connections prior to the next scheduled sync to start the upgrade early. Worthy of specific mention, this version includes: diff --git a/docs/integrations/destinations/redshift.md b/docs/integrations/destinations/redshift.md index da0fef781ac..5ccf766a15d 100644 --- a/docs/integrations/destinations/redshift.md +++ b/docs/integrations/destinations/redshift.md @@ -241,7 +241,7 @@ Each stream will be output into its own raw table in Redshift. 
Each table will c ## Changelog | Version | Date | Pull Request | Subject | -|:--------|:-----------|:-----------------------------------------------------------|:-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------| +| :------ | :--------- | :--------------------------------------------------------- | :--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------- | | 2.5.0 | 2024-05-06 | [\#34613](https://github.com/airbytehq/airbyte/pull/34613) | Upgrade Redshift driver to work with Cluster patch 181; Adapt to CDK 0.33.0; Minor signature changes | | 2.4.3 | 2024-04-10 | [\#36973](https://github.com/airbytehq/airbyte/pull/36973) | Limit the Standard inserts SQL statement to less than 16MB | | 2.4.2 | 2024-04-05 | [\#36365](https://github.com/airbytehq/airbyte/pull/36365) | Remove unused config option | @@ -307,7 +307,7 @@ Each stream will be output into its own raw table in Redshift. Each table will c | 0.3.55 | 2023-01-26 | [\#20631](https://github.com/airbytehq/airbyte/pull/20631) | Added support for destination checkpointing with staging | | 0.3.54 | 2023-01-18 | [\#21087](https://github.com/airbytehq/airbyte/pull/21087) | Wrap Authentication Errors as Config Exceptions | | 0.3.53 | 2023-01-03 | [\#17273](https://github.com/airbytehq/airbyte/pull/17273) | Flatten JSON arrays to fix maximum size check for SUPER field | -| 0.3.52 | 2022-12-30 | [\#20879](https://github.com/airbytehq/airbyte/pull/20879) | Added configurable parameter for number of file buffers (⛔ this version has a bug and will not work; use `0.3.56` instead) | +| 0.3.52 | 2022-12-30 | [\#20879](https://github.com/airbytehq/airbyte/pull/20879) | Added configurable parameter for number of file buffers (⛔ this version has a bug and will not work; use `0.3.56` instead) | | 0.3.51 | 2022-10-26 | [\#18434](https://github.com/airbytehq/airbyte/pull/18434) | Fix empty S3 bucket path handling | | 0.3.50 | 2022-09-14 | [\#15668](https://github.com/airbytehq/airbyte/pull/15668) | Wrap logs in AirbyteLogMessage | | 0.3.49 | 2022-09-01 | [\#16243](https://github.com/airbytehq/airbyte/pull/16243) | Fix Json to Avro conversion when there is field name clash from combined restrictions (`anyOf`, `oneOf`, `allOf` fields) | diff --git a/docs/integrations/destinations/s3-glue.md b/docs/integrations/destinations/s3-glue.md index ec0a2ac1bd6..8492427c14b 100644 --- a/docs/integrations/destinations/s3-glue.md +++ b/docs/integrations/destinations/s3-glue.md @@ -157,7 +157,10 @@ In order for everything to work correctly, it is also necessary that the user wh { "Effect": "Allow", "Action": "s3:*", - "Resource": ["arn:aws:s3:::YOUR_BUCKET_NAME/*", "arn:aws:s3:::YOUR_BUCKET_NAME"] + "Resource": [ + "arn:aws:s3:::YOUR_BUCKET_NAME/*", + "arn:aws:s3:::YOUR_BUCKET_NAME" + ] } ] } diff --git a/docs/integrations/destinations/s3.md b/docs/integrations/destinations/s3.md index 5a2563f475a..ca54f8115f6 100644 --- a/docs/integrations/destinations/s3.md +++ b/docs/integrations/destinations/s3.md @@ -140,7 +140,10 @@ to use: "s3:AbortMultipartUpload", "s3:GetBucketLocation" ], - "Resource": ["arn:aws:s3:::YOUR_BUCKET_NAME/*", "arn:aws:s3:::YOUR_BUCKET_NAME"] + "Resource": [ + "arn:aws:s3:::YOUR_BUCKET_NAME/*", + "arn:aws:s3:::YOUR_BUCKET_NAME" + ] } ] } @@ -392,7 
+395,10 @@ In order for everything to work correctly, it is also necessary that the user wh { "Effect": "Allow", "Action": "s3:*", - "Resource": ["arn:aws:s3:::YOUR_BUCKET_NAME/*", "arn:aws:s3:::YOUR_BUCKET_NAME"] + "Resource": [ + "arn:aws:s3:::YOUR_BUCKET_NAME/*", + "arn:aws:s3:::YOUR_BUCKET_NAME" + ] } ] } @@ -401,7 +407,7 @@ In order for everything to work correctly, it is also necessary that the user wh ## CHANGELOG | Version | Date | Pull Request | Subject | -|:--------|:-----------|:-----------------------------------------------------------|:-----------------------------------------------------------------------------------------------------------------------------------------------------| +| :------ | :--------- | :--------------------------------------------------------- | :--------------------------------------------------------------------------------------------------------------------------------------------------- | | 0.6.1 | 2024-04-08 | [37546](https://github.com/airbytehq/airbyte/pull/37546) | Adapt to CDK 0.30.8; | | 0.6.0 | 2024-04-08 | [36869](https://github.com/airbytehq/airbyte/pull/36869) | Adapt to CDK 0.29.8; Kotlin converted code. | | 0.5.9 | 2024-02-22 | [35569](https://github.com/airbytehq/airbyte/pull/35569) | Fix logging bug. | diff --git a/docs/integrations/destinations/snowflake-migrations.md b/docs/integrations/destinations/snowflake-migrations.md index adb75e5126e..e535022dff5 100644 --- a/docs/integrations/destinations/snowflake-migrations.md +++ b/docs/integrations/destinations/snowflake-migrations.md @@ -2,7 +2,7 @@ ## Upgrading to 3.0.0 -This version introduces [Destinations V2](/release_notes/upgrading_to_destinations_v2/#what-is-destinations-v2), which provides better error handling, incremental delivery of data for large syncs, and improved final table structures. To review the breaking changes, and how to upgrade, see [here](/release_notes/upgrading_to_destinations_v2/#quick-start-to-upgrading). These changes will likely require updates to downstream dbt / SQL models, which we walk through [here](/release_notes/upgrading_to_destinations_v2/#updating-downstream-transformations). Selecting `Upgrade` will upgrade **all** connections using this destination at their next sync. You can manually sync existing connections prior to the next scheduled sync to start the upgrade early. +This version introduces [Destinations V2](/release_notes/upgrading_to_destinations_v2/#what-is-destinations-v2), which provides better error handling, incremental delivery of data for large syncs, and improved final table structures. To review the breaking changes, and how to upgrade, see [here](/release_notes/upgrading_to_destinations_v2/#quick-start-to-upgrading). These changes will likely require updates to downstream dbt / SQL models, which we walk through [here](/release_notes/upgrading_to_destinations_v2/#updating-downstream-transformations). Selecting `Upgrade` will upgrade **all** connections using this destination at their next sync. You can manually sync existing connections prior to the next scheduled sync to start the upgrade early. Worthy of specific mention, this version includes: diff --git a/docs/integrations/destinations/snowflake.md b/docs/integrations/destinations/snowflake.md index 343ec21273e..6b4ffc3c428 100644 --- a/docs/integrations/destinations/snowflake.md +++ b/docs/integrations/destinations/snowflake.md @@ -275,10 +275,10 @@ desired namespace. 
## Changelog | Version | Date | Pull Request | Subject | -|:----------------|:-----------|:-----------------------------------------------------------|:-----------------------------------------------------------------------------------------------------------------------------------------------------------------| +| :-------------- | :--------- | :--------------------------------------------------------- | :--------------------------------------------------------------------------------------------------------------------------------------------------------------- | | 3.7.2 | 2024-05-06 | [\#37857](https://github.com/airbytehq/airbyte/pull/37857) | Use safe executeMetadata call | | 3.7.1 | 2024-04-30 | [\#36910](https://github.com/airbytehq/airbyte/pull/36910) | Bump CDK version | -| 3.7.0 | 2024-04-08 | [\#35754](https://github.com/airbytehq/airbyte/pull/35754) | Allow configuring `data_retention_time_in_days`; apply to both raw and final tables. *Note*: Existing tables will not be affected; you must manually alter them. | +| 3.7.0 | 2024-04-08 | [\#35754](https://github.com/airbytehq/airbyte/pull/35754) | Allow configuring `data_retention_time_in_days`; apply to both raw and final tables. _Note_: Existing tables will not be affected; you must manually alter them. | | 3.6.6 | 2024-03-26 | [\#36466](https://github.com/airbytehq/airbyte/pull/36466) | Correctly hhandle instances with `QUOTED_IDENTIFIERS_IGNORE_CASE` enabled globally | | 3.6.5 | 2024-03-25 | [\#36461](https://github.com/airbytehq/airbyte/pull/36461) | Internal code change (use published CDK artifact instead of source dependency) | | 3.6.4 | 2024-03-25 | [\#36396](https://github.com/airbytehq/airbyte/pull/36396) | Handle instances with `QUOTED_IDENTIFIERS_IGNORE_CASE` enabled globally | diff --git a/docs/integrations/destinations/teradata.md b/docs/integrations/destinations/teradata.md index 8f6bfd22c0f..cba27e72829 100644 --- a/docs/integrations/destinations/teradata.md +++ b/docs/integrations/destinations/teradata.md @@ -84,11 +84,11 @@ You can also use a pre-existing user but we highly recommend creating a dedicate ## CHANGELOG -| Version | Date | Pull Request | Subject | -|:--------|:-----------| :---------------------------------------------- |:--------------------------------------------------------| -| 0.1.5 | 2024-01-12 | https://github.com/airbytehq/airbyte/pull/33872 | Added Primary Index on _airbyte_ab_id to fix NoPI issue | -| 0.1.4 | 2023-12-04 | https://github.com/airbytehq/airbyte/pull/28667 | Make connector available on Airbyte Cloud | -| 0.1.3 | 2023-08-17 | https://github.com/airbytehq/airbyte/pull/30740 | Enable custom DBT transformation | -| 0.1.2 | 2023-08-09 | https://github.com/airbytehq/airbyte/pull/29174 | Small internal refactor | -| 0.1.1 | 2023-03-03 | https://github.com/airbytehq/airbyte/pull/21760 | Added SSL support | -| 0.1.0 | 2022-12-13 | https://github.com/airbytehq/airbyte/pull/20428 | New Destination Teradata Vantage | \ No newline at end of file +| Version | Date | Pull Request | Subject | +| :------ | :--------- | :---------------------------------------------- | :------------------------------------------------------- | +| 0.1.5 | 2024-01-12 | https://github.com/airbytehq/airbyte/pull/33872 | Added Primary Index on \_airbyte_ab_id to fix NoPI issue | +| 0.1.4 | 2023-12-04 | https://github.com/airbytehq/airbyte/pull/28667 | Make connector available on Airbyte Cloud | +| 0.1.3 | 2023-08-17 | https://github.com/airbytehq/airbyte/pull/30740 | Enable custom DBT transformation | +| 0.1.2 | 
2023-08-09 | https://github.com/airbytehq/airbyte/pull/29174 | Small internal refactor | +| 0.1.1 | 2023-03-03 | https://github.com/airbytehq/airbyte/pull/21760 | Added SSL support | +| 0.1.0 | 2022-12-13 | https://github.com/airbytehq/airbyte/pull/20428 | New Destination Teradata Vantage | diff --git a/docs/integrations/destinations/vectara.md b/docs/integrations/destinations/vectara.md index af29d82dfdf..da0a4c57f09 100644 --- a/docs/integrations/destinations/vectara.md +++ b/docs/integrations/destinations/vectara.md @@ -9,8 +9,8 @@ The Vectara destination connector allows you to connect any Airbyte source to Ve :::info In case of issues, the following public channels are available for support: -* For Airbyte related issues such as data source or processing: [Open a Github issue](https://github.com/airbytehq/airbyte/issues/new?assignees=&labels=type%2Fbug%2Carea%2Fconnectors%2Cneeds-triage&projects=&template=1-issue-connector.yaml) -* For Vectara related issues such as data indexing or RAG: Create a post in the [Vectara forum](https://discuss.vectara.com/) or reach out on [Vectara's Discord server](https://discord.gg/GFb8gMz6UH) +- For Airbyte related issues such as data source or processing: [Open a Github issue](https://github.com/airbytehq/airbyte/issues/new?assignees=&labels=type%2Fbug%2Carea%2Fconnectors%2Cneeds-triage&projects=&template=1-issue-connector.yaml) +- For Vectara related issues such as data indexing or RAG: Create a post in the [Vectara forum](https://discuss.vectara.com/) or reach out on [Vectara's Discord server](https://discord.gg/GFb8gMz6UH) ::: @@ -20,31 +20,32 @@ The Vectara destination connector supports Full Refresh Overwrite, Full Refresh ### Output schema -All streams will be output into a corpus in Vectara whose name must be specified in the config. +All streams will be output into a corpus in Vectara whose name must be specified in the config. Note that there are no restrictions in naming the Vectara corpus and if a corpus with the specified name is not found, a new corpus with that name will be created. Also, if multiple corpora exists with the same name, an error will be returned as Airbyte will be unable to determine the prefered corpus. ### Features -| Feature | Supported? | -| :---------------------------- | :--------- | -| Full Refresh Sync | Yes | -| Incremental - Append Sync | Yes | -| Incremental - Dedupe Sync | Yes | +| Feature | Supported? | +| :------------------------ | :--------- | +| Full Refresh Sync | Yes | +| Incremental - Append Sync | Yes | +| Incremental - Dedupe Sync | Yes | ## Getting started You will need a Vectara account to use Vectara with Airbyte. To get started, use the following steps: + 1. [Sign up](https://vectara.com/integrations/airbyte) for a Vectara account if you don't already have one. Once you have completed your sign up you will have a Vectara customer ID. You can find your customer ID by clicking on your name, on the top-right of the Vectara console window. -2. Within your account you can create your corpus, which represents an area that stores text data you want to ingest into Vectara. - * To create a corpus, use the **"Create Corpus"** button in the console. You then provide a name to your corpus as well as a description. If you click on your created corpus, you can see its name and corpus ID right on the top. You can see more details in this [guide](https://docs.vectara.com/docs/console-ui/creating-a-corpus). - * Optionally you can define filtering attributes and apply some advanced options. 
- * For the Vectara connector to work properly you **must** define a special meta-data field called `_ab_stream` (string typed) which the connector uses to identify source streams. +2. Within your account you can create your corpus, which represents an area that stores text data you want to ingest into Vectara. + - To create a corpus, use the **"Create Corpus"** button in the console. You then provide a name to your corpus as well as a description. If you click on your created corpus, you can see its name and corpus ID right on the top. You can see more details in this [guide](https://docs.vectara.com/docs/console-ui/creating-a-corpus). + - Optionally you can define filtering attributes and apply some advanced options. + - For the Vectara connector to work properly you **must** define a special meta-data field called `_ab_stream` (string typed) which the connector uses to identify source streams. 3. The Vectara destination connector uses [OAuth2.0 Credentials](https://docs.vectara.com/docs/learn/authentication/oauth-2). You will need your `Client ID` and `Client Secret` handy for your connector setup. ### Setup the Vectara Destination in Airbyte -You should now have all the requirements needed to configure Vectara as a destination in the UI. +You should now have all the requirements needed to configure Vectara as a destination in the UI. You'll need the following information to configure the Vectara destination: @@ -55,16 +56,17 @@ You'll need the following information to configure the Vectara destination: - (Required) **Corpus Name**. You can specify a corpus name you've set up manually given the instructions above, or if you specify a corpus name that does not exist, the connector will generate a new corpus with this name and set up the required meta-data filtering fields within that corpus. In addition, in the connector UI you define three sets of fields for this connector: -* `text_fields` define the source fields which are turned into text in the Vectara side and are used for query or summarization. -* `title_field` define the source field which will be used as a title of the document on the Vectara side -* `metadata_fields` define the source fields which will be added to each document as meta-data. + +- `text_fields` defines the source fields that are turned into text on the Vectara side and are used for query or summarization. +- `title_field` defines the source field that will be used as the title of the document on the Vectara side. +- `metadata_fields` defines the source fields that will be added to each document as meta-data.
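As a purely illustrative sketch (this is not the literal connector spec, and `headline`, `body`, `author`, and `url` are hypothetical fields of a source stream), the three field sets could be combined as follows:

```json
{
  "text_fields": ["body"],
  "title_field": "headline",
  "metadata_fields": ["author", "url"]
}
```

With a mapping like this, `body` would be indexed as the searchable document text, `headline` would become the document title on the Vectara side, and `author` and `url` would be attached to each document as meta-data.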
## Changelog -| Version | Date | Pull Request | Subject | -| :------ | :--------- | :------------------------------------------------------- | :---------------------------------------------------------------- | -| 0.2.3 | 2024-03-22 | [#37333](https://github.com/airbytehq/airbyte/pull/37333) | Updated CDK & pytest version to fix security vulnerabilities | -| 0.2.2 | 2024-03-22 | [#36261](https://github.com/airbytehq/airbyte/pull/36261) | Move project to Poetry | -| 0.2.1 | 2024-03-05 | [#35206](https://github.com/airbytehq/airbyte/pull/35206) | Fix: improved title parsing | -| 0.2.0 | 2024-01-29 | [#34579](https://github.com/airbytehq/airbyte/pull/34579) | Add document title file configuration | -| 0.1.0 | 2023-11-10 | [#31958](https://github.com/airbytehq/airbyte/pull/31958) | 🎉 New Destination: Vectara (Vector Database) | +| Version | Date | Pull Request | Subject | +| :------ | :--------- | :-------------------------------------------------------- | :----------------------------------------------------------- | +| 0.2.3 | 2024-03-22 | [#37333](https://github.com/airbytehq/airbyte/pull/37333) | Updated CDK & pytest version to fix security vulnerabilities | +| 0.2.2 | 2024-03-22 | [#36261](https://github.com/airbytehq/airbyte/pull/36261) | Move project to Poetry | +| 0.2.1 | 2024-03-05 | [#35206](https://github.com/airbytehq/airbyte/pull/35206) | Fix: improved title parsing | +| 0.2.0 | 2024-01-29 | [#34579](https://github.com/airbytehq/airbyte/pull/34579) | Add document title file configuration | +| 0.1.0 | 2023-11-10 | [#31958](https://github.com/airbytehq/airbyte/pull/31958) | 🎉 New Destination: Vectara (Vector Database) | diff --git a/docs/integrations/destinations/weaviate-migrations.md b/docs/integrations/destinations/weaviate-migrations.md index 1d54ca39f45..a7d5ff075e5 100644 --- a/docs/integrations/destinations/weaviate-migrations.md +++ b/docs/integrations/destinations/weaviate-migrations.md @@ -15,4 +15,3 @@ It's no longer possible to configure `id` fields in the destination. Instead, th ### Vector fields It's not possible anymore to configure separate vector fields per stream. To load embedding vectors from the records itself, the embedding method `From Field` can be used and configured with a single field name that has to be available in records from all streams. If your records contain multiple vector fields, you need to configure separate destinations and connections to configure separate vector field names. - diff --git a/docs/integrations/destinations/weaviate.md b/docs/integrations/destinations/weaviate.md index 05a1261c574..1ef02222fff 100644 --- a/docs/integrations/destinations/weaviate.md +++ b/docs/integrations/destinations/weaviate.md @@ -5,18 +5,19 @@ This page guides you through the process of setting up the [Weaviate](https://weaviate.io/) destination connector. There are three parts to this: -* Processing - split up individual records in chunks so they will fit the context window and decide which fields to use as context and which are supplementary metadata. -* Embedding - convert the text into a vector representation using a pre-trained model (Currently, OpenAI's `text-embedding-ada-002` and Cohere's `embed-english-light-v2.0` are supported.) -* Indexing - store the vectors in a vector database for similarity search + +- Processing - split up individual records in chunks so they will fit the context window and decide which fields to use as context and which are supplementary metadata. 
+- Embedding - convert the text into a vector representation using a pre-trained model (Currently, OpenAI's `text-embedding-ada-002` and Cohere's `embed-english-light-v2.0` are supported.) +- Indexing - store the vectors in a vector database for similarity search ## Prerequisites To use the Weaviate destination, you'll need: -* Access to a running Weaviate instance (either self-hosted or via Weaviate Cloud Services), minimum version 1.21.2 -* Either - * An account with API access for OpenAI or Cohere (depending on which embedding method you want to use) - * Pre-calculated embeddings stored in a field in your source database +- Access to a running Weaviate instance (either self-hosted or via Weaviate Cloud Services), minimum version 1.21.2 +- Either + - An account with API access for OpenAI or Cohere (depending on which embedding method you want to use) + - Pre-calculated embeddings stored in a field in your source database You'll need the following information to configure the destination: @@ -26,21 +27,22 @@ You'll need the following information to configure the destination: ## Features -| Feature | Supported?\(Yes/No\) | Notes | -| :----------------------------- | :------------------- | :---- | -| Full Refresh Sync | Yes | | -| Incremental - Append Sync | Yes | | -| Incremental - Append + Deduped | Yes | | -| Namespaces | No | | +| Feature | Supported?\(Yes/No\) | Notes | +| :----------------------------- | :------------------- | :------------------------------------------------------- | +| Full Refresh Sync | Yes | | +| Incremental - Append Sync | Yes | | +| Incremental - Append + Deduped | Yes | | +| Namespaces | No | | | Provide vector | Yes | Either from field are calculated during the load process | ## Data type mapping All fields specified as metadata fields will be stored as properties in the object can be used for filtering. The following data types are allowed for metadata fields: -* String -* Number (integer or floating point, gets converted to a 64 bit floating point) -* Booleans (true, false) -* List of String + +- String +- Number (integer or floating point, gets converted to a 64 bit floating point) +- Booleans (true, false) +- List of String All other fields are serialized into their JSON representation. @@ -62,7 +64,7 @@ The connector can use one of the following embedding methods: 1. OpenAI - using [OpenAI API](https://beta.openai.com/docs/api-reference/text-embedding) , the connector will produce embeddings using the `text-embedding-ada-002` model with **1536 dimensions**. This integration will be constrained by the [speed of the OpenAI embedding API](https://platform.openai.com/docs/guides/rate-limits/overview). -2. Cohere - using the [Cohere API](https://docs.cohere.com/reference/embed), the connector will produce embeddings using the `embed-english-light-v2.0` model with **1024 dimensions**. +2. Cohere - using the [Cohere API](https://docs.cohere.com/reference/embed), the connector will produce embeddings using the `embed-english-light-v2.0` model with **1024 dimensions**. 3. From field - if you have pre-calculated embeddings stored in a field in your source database, you can use the `From field` integration to load them into Weaviate. The field must be a JSON array of numbers, e.g. `[0.1, 0.2, 0.3]`. @@ -72,36 +74,36 @@ For testing purposes, it's also possible to use the [Fake embeddings](https://py ### Indexing -All streams will be indexed into separate classes derived from the stream name. 
+All streams will be indexed into separate classes derived from the stream name. If a class doesn't exist in the schema of the cluster, it will be created using the configured vectorizer configuration. In this case, dynamic schema has to be enabled on the server. You can also create the class in Weaviate in advance if you need more control over the schema in Weaviate. In this case, the text properties `_ab_stream` and `_ab_record_id` need to be created for bookkeeping reasons. In case a sync is run in `Overwrite` mode, the class will be deleted and recreated. -As properties have to start will a lowercase letter in Weaviate and can't contain spaces or special characters. Field names might be updated during the loading process. The field names `id`, `_id` and `_additional` are reserved keywords in Weaviate, so they will be renamed to `raw_id`, `raw__id` and `raw_additional` respectively. +As properties have to start with a lowercase letter in Weaviate and can't contain spaces or special characters, field names might be updated during the loading process. The field names `id`, `_id` and `_additional` are reserved keywords in Weaviate, so they will be renamed to `raw_id`, `raw__id` and `raw_additional` respectively. When using [multi-tenancy](https://weaviate.io/developers/weaviate/manage-data/multi-tenancy), the tenant id can be configured in the connector configuration. If not specified, multi-tenancy will be disabled. In case you want to index into an already created class, you need to make sure the class is created with multi-tenancy enabled. In case the class doesn't exist, it will be created with multi-tenancy properly configured. If the class already exists but the tenant id is not associated with the class, the connector will automatically add the tenant id to the class. This allows you to configure multiple connections for different tenants on the same schema. ## Changelog -| Version | Date | Pull Request | Subject | -| :------ | :--------- | :--------------------------------------------------------- | :------------------------------------------------------------------------------------------------------------------------------- | -| 0.2.17 | 2024-04-15 | [#37333](https://github.com/airbytehq/airbyte/pull/37333) | Update CDK & pytest version to fix security vulnerabilities. 
-| 0.2.16 | 2024-03-22 | [#35911](https://github.com/airbytehq/airbyte/pull/35911) | Fix tests and move to Poetry | -| 0.2.15 | 2023-01-25 | [#34529](https://github.com/airbytehq/airbyte/pull/34529) | Fix tests | -| 0.2.14 | 2023-01-15 | [#34229](https://github.com/airbytehq/airbyte/pull/34229) | Allow configuring tenant id | -| 0.2.13 | 2023-12-11 | [#33303](https://github.com/airbytehq/airbyte/pull/33303) | Fix bug with embedding special tokens | -| 0.2.12 | 2023-12-07 | [#33218](https://github.com/airbytehq/airbyte/pull/33218) | Normalize metadata field names | -| 0.2.11 | 2023-12-01 | [#32697](https://github.com/airbytehq/airbyte/pull/32697) | Allow omitting raw text | -| 0.2.10 | 2023-11-16 | [#32608](https://github.com/airbytehq/airbyte/pull/32608) | Support deleting records for CDC sources | -| 0.2.9 | 2023-11-13 | [#32357](https://github.com/airbytehq/airbyte/pull/32357) | Improve spec schema | -| 0.2.8 | 2023-11-03 | [#32134](https://github.com/airbytehq/airbyte/pull/32134) | Improve test coverage | -| 0.2.7 | 2023-11-03 | [#32134](https://github.com/airbytehq/airbyte/pull/32134) | Upgrade weaviate client library | -| 0.2.6 | 2023-11-01 | [#32038](https://github.com/airbytehq/airbyte/pull/32038) | Retry failed object loads | -| 0.2.5 | 2023-10-24 | [#31953](https://github.com/airbytehq/airbyte/pull/31953) | Fix memory leak | -| 0.2.4 | 2023-10-23 | [#31563](https://github.com/airbytehq/airbyte/pull/31563) | Add field mapping option, improve append+dedupe sync performance and remove unnecessary retry logic | -| 0.2.3 | 2023-10-19 | [#31599](https://github.com/airbytehq/airbyte/pull/31599) | Base image migration: remove Dockerfile and use the python-connector-base image | -| 0.2.2 | 2023-10-15 | [#31329](https://github.com/airbytehq/airbyte/pull/31329) | Add OpenAI-compatible embedder option | -| 0.2.1 | 2023-10-04 | [#31075](https://github.com/airbytehq/airbyte/pull/31075) | Fix OpenAI embedder batch size and conflict field name handling | -| 0.2.0 | 2023-09-22 | [#30151](https://github.com/airbytehq/airbyte/pull/30151) | Add embedding capabilities, overwrite and dedup support and API key auth mode, make certified. 🚨 Breaking changes - check migrations guide. | -| 0.1.1 | 2022-02-08 | [\#22527](https://github.com/airbytehq/airbyte/pull/22527) | Multiple bug fixes: Support String based IDs, arrays of uknown type and additionalProperties of type object and array of objects | -| 0.1.0 | 2022-12-06 | [\#20094](https://github.com/airbytehq/airbyte/pull/20094) | Add Weaviate destination | \ No newline at end of file +| Version | Date | Pull Request | Subject | +| :------ | :--------- | :--------------------------------------------------------- | :------------------------------------------------------------------------------------------------------------------------------------------- | +| 0.2.17 | 2024-04-15 | [#37333](https://github.com/airbytehq/airbyte/pull/37333) | Update CDK & pytest version to fix security vulnerabilities. 
| +| 0.2.16 | 2024-03-22 | [#35911](https://github.com/airbytehq/airbyte/pull/35911) | Fix tests and move to Poetry | +| 0.2.15 | 2024-01-25 | [#34529](https://github.com/airbytehq/airbyte/pull/34529) | Fix tests | +| 0.2.14 | 2024-01-15 | [#34229](https://github.com/airbytehq/airbyte/pull/34229) | Allow configuring tenant id | +| 0.2.13 | 2023-12-11 | [#33303](https://github.com/airbytehq/airbyte/pull/33303) | Fix bug with embedding special tokens | +| 0.2.12 | 2023-12-07 | [#33218](https://github.com/airbytehq/airbyte/pull/33218) | Normalize metadata field names | +| 0.2.11 | 2023-12-01 | [#32697](https://github.com/airbytehq/airbyte/pull/32697) | Allow omitting raw text | +| 0.2.10 | 2023-11-16 | [#32608](https://github.com/airbytehq/airbyte/pull/32608) | Support deleting records for CDC sources | +| 0.2.9 | 2023-11-13 | [#32357](https://github.com/airbytehq/airbyte/pull/32357) | Improve spec schema | +| 0.2.8 | 2023-11-03 | [#32134](https://github.com/airbytehq/airbyte/pull/32134) | Improve test coverage | +| 0.2.7 | 2023-11-03 | [#32134](https://github.com/airbytehq/airbyte/pull/32134) | Upgrade weaviate client library | +| 0.2.6 | 2023-11-01 | [#32038](https://github.com/airbytehq/airbyte/pull/32038) | Retry failed object loads | +| 0.2.5 | 2023-10-24 | [#31953](https://github.com/airbytehq/airbyte/pull/31953) | Fix memory leak | +| 0.2.4 | 2023-10-23 | [#31563](https://github.com/airbytehq/airbyte/pull/31563) | Add field mapping option, improve append+dedupe sync performance and remove unnecessary retry logic | +| 0.2.3 | 2023-10-19 | [#31599](https://github.com/airbytehq/airbyte/pull/31599) | Base image migration: remove Dockerfile and use the python-connector-base image | +| 0.2.2 | 2023-10-15 | [#31329](https://github.com/airbytehq/airbyte/pull/31329) | Add OpenAI-compatible embedder option | +| 0.2.1 | 2023-10-04 | [#31075](https://github.com/airbytehq/airbyte/pull/31075) | Fix OpenAI embedder batch size and conflict field name handling | +| 0.2.0 | 2023-09-22 | [#30151](https://github.com/airbytehq/airbyte/pull/30151) | Add embedding capabilities, overwrite and dedup support and API key auth mode, make certified. 🚨 Breaking changes - check migrations guide. | +| 0.1.1 | 2023-02-08 | [\#22527](https://github.com/airbytehq/airbyte/pull/22527) | Multiple bug fixes: Support String based IDs, arrays of unknown type and additionalProperties of type object and array of objects | +| 0.1.0 | 2022-12-06 | [\#20094](https://github.com/airbytehq/airbyte/pull/20094) | Add Weaviate destination | diff --git a/docs/integrations/destinations/yellowbrick.md b/docs/integrations/destinations/yellowbrick.md index 5c3433fc85c..8a7ff8bb41f 100644 --- a/docs/integrations/destinations/yellowbrick.md +++ b/docs/integrations/destinations/yellowbrick.md @@ -134,8 +134,8 @@ following[ sync modes](https://docs.airbyte.com/cloud/core-concepts#connection-s ### Output Schema (Raw Tables) -Each stream will be mapped to a separate raw table in Yellowbrick. The default schema in which the raw tables are -created is `airbyte_internal`. This can be overridden in the configuration. +Each stream will be mapped to a separate raw table in Yellowbrick. The default schema in which the raw tables are +created is `airbyte_internal`. This can be overridden in the configuration. Each table will contain 3 columns: - `_airbyte_raw_id`: a uuid assigned by Airbyte to each event that is processed. 
The column type in @@ -143,13 +143,14 @@ Each table will contain 3 columns: - `_airbyte_extracted_at`: a timestamp representing when the event was pulled from the data source. The column type in Yellowbrick is `TIMESTAMP WITH TIME ZONE`. - `_airbyte_loaded_at`: a timestamp representing when the row was processed into final table. - The column type in Yellowbrick is `TIMESTAMP WITH TIME ZONE`. + The column type in Yellowbrick is `TIMESTAMP WITH TIME ZONE`. - `_airbyte_data`: a json blob representing with the event data. The column type in Yellowbrick is `JSONB`. ### Final Tables Data type mapping + | Airbyte Type | Yellowbrick Type | -|:---------------------------|:-------------------------| +| :------------------------- | :----------------------- | | string | VARCHAR | | number | DECIMAL | | integer | BIGINT | @@ -168,6 +169,6 @@ Each table will contain 3 columns: ## Changelog -| Version | Date | Pull Request | Subject | -|:--------|:-----------|:-------------------------------------------------------------|:----------------------------------------------------------------------------------------------------| -| 0.0.1 | 2024-03-02 | [\#35775](https://github.com/airbytehq/airbyte/pull/35775) | Initial release +| Version | Date | Pull Request | Subject | +| :------ | :--------- | :--------------------------------------------------------- | :-------------- | +| 0.0.1 | 2024-03-02 | [\#35775](https://github.com/airbytehq/airbyte/pull/35775) | Initial release | diff --git a/docs/integrations/destinations/yugabytedb.md b/docs/integrations/destinations/yugabytedb.md index f0fd46c3ac1..0152a15a6cb 100644 --- a/docs/integrations/destinations/yugabytedb.md +++ b/docs/integrations/destinations/yugabytedb.md @@ -8,29 +8,28 @@ TODO: update this doc Is the output schema fixed (e.g: for an API like Stripe)? If so, point to the connector's schema (e.g: link to Stripe’s documentation) or describe the schema here directly (e.g: include a diagram or paragraphs describing the schema). -Describe how the connector's schema is mapped to Airbyte concepts. An example description might be: "MagicDB tables become Airbyte Streams and MagicDB columns become Airbyte Fields. In addition, an extracted\_at column is appended to each row being read." +Describe how the connector's schema is mapped to Airbyte concepts. An example description might be: "MagicDB tables become Airbyte Streams and MagicDB columns become Airbyte Fields. In addition, an extracted_at column is appended to each row being read." ### Data type mapping This section should contain a table mapping each of the connector's data types to Airbyte types. At the moment, Airbyte uses the same types used by [JSONSchema](https://json-schema.org/understanding-json-schema/reference/index.html). `string`, `date-time`, `object`, `array`, `boolean`, `integer`, and `number` are the most commonly used data types. 
| Integration Type | Airbyte Type | Notes | -| :--- | :--- | :--- | - +| :--------------- | :----------- | :---- | ### Features This section should contain a table with the following format: -| Feature | Supported?(Yes/No) | Notes | -| :--- | :--- | :--- | -| Full Refresh Sync | | | -| Incremental Sync | | | -| Replicate Incremental Deletes | | | -| For databases, WAL/Logical replication | | | -| SSL connection | | | -| SSH Tunnel Support | | | -| (Any other source-specific features) | | | +| Feature | Supported?(Yes/No) | Notes | +| :------------------------------------- | :----------------- | :---- | +| Full Refresh Sync | | | +| Incremental Sync | | | +| Replicate Incremental Deletes | | | +| For databases, WAL/Logical replication | | | +| SSL connection | | | +| SSH Tunnel Support | | | +| (Any other source-specific features) | | | ### Performance considerations @@ -40,10 +39,10 @@ Could this connector hurt the user's database/API/etc... or put too much strain ### Requirements -* What versions of this connector does this implementation support? (e.g: `postgres v3.14 and above`) -* What configurations, if any, are required on the connector? (e.g: `buffer_size > 1024`) -* Network accessibility requirements -* Credentials/authentication requirements? (e.g: A DB user with read permissions on certain tables) +- What versions of this connector does this implementation support? (e.g: `postgres v3.14 and above`) +- What configurations, if any, are required on the connector? (e.g: `buffer_size > 1024`) +- Network accessibility requirements +- Credentials/authentication requirements? (e.g: A DB user with read permissions on certain tables) ### Setup guide @@ -51,10 +50,9 @@ For each of the above high-level requirements as appropriate, add or point to a For each major cloud provider we support, also add a follow-along guide for setting up Airbyte to connect to that destination. See the Postgres destination guide for an example of what this should look like. - ## CHANGELOG -| Version | Date | Pull Request | Subject | -|:--------|:-----------|:--------------------------------------------------------------|:------------------------| -| 0.1.1 | 2023-03-17 | [#24180](https://github.com/airbytehq/airbyte/pull/24180) | Fix field order | -| 0.1.0 | 2022-10-28 | [#18039](https://github.com/airbytehq/airbyte/pull/18039) | New Destination YugabyteDB | +| Version | Date | Pull Request | Subject | +| :------ | :--------- | :-------------------------------------------------------- | :------------------------- | +| 0.1.1 | 2023-03-17 | [#24180](https://github.com/airbytehq/airbyte/pull/24180) | Fix field order | +| 0.1.0 | 2022-10-28 | [#18039](https://github.com/airbytehq/airbyte/pull/18039) | New Destination YugabyteDB | diff --git a/docs/integrations/locating-files-local-destination.md b/docs/integrations/locating-files-local-destination.md index d401d795245..35db2da2eec 100644 --- a/docs/integrations/locating-files-local-destination.md +++ b/docs/integrations/locating-files-local-destination.md @@ -35,9 +35,8 @@ Note that this method does not allow direct access to any files directly, instea 3. This will copy the entire `airbyte_local` folder to your host machine. Note that if you know the specific filename or wildcard, you can add append it to the source path of the `docker cp` command. - + ## Notes 1. Local JSON and Local CSV files do not persist between Docker restarts. This means that once you turn off your Docker image, your data is lost. This is consistent with the `tmp` nature of the folder. 2. 
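The streams listed above correspond to resources of the ActiveCampaign v3 REST API, which is addressed per account and authenticated with an `Api-Token` header. The sketch below is illustrative only; `<account>` and `<api-key>` are placeholders for your own account subdomain and API key, not values taken from this connector.

```sh
# Illustrative only: list a page of contacts from the ActiveCampaign v3 API.
# <account> and <api-key> are placeholders for your own account and API key.
curl -s "https://<account>.api-us1.com/api/3/contacts?limit=20" \
  -H "Api-Token: <api-key>"
```

The remaining streams (`campaigns`, `lists`, `deals`, `segments`, `forms`) generally follow the same `/api/3/<resource>` pattern.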
In the root folder of your docker files, it might generate tmp and var folders that only have empty folders inside. - diff --git a/docs/integrations/sources/activecampaign.md b/docs/integrations/sources/activecampaign.md index 186544d5bdb..d55001e5a1c 100644 --- a/docs/integrations/sources/activecampaign.md +++ b/docs/integrations/sources/activecampaign.md @@ -6,19 +6,19 @@ This source can sync data from the [ActiveCampaign API](https://developers.activ ## This Source Supports the Following Streams -* campaigns -* contacts -* lists -* deals -* segments -* forms +- campaigns +- contacts +- lists +- deals +- segments +- forms ### Features -| Feature | Supported?\(Yes/No\) | Notes | -| :--- | :--- | :--- | -| Full Refresh Sync | Yes | | -| Incremental Sync | No | | +| Feature | Supported?\(Yes/No\) | Notes | +| :---------------- | :------------------- | :---- | +| Full Refresh Sync | Yes | | +| Incremental Sync | No | | ### Performance considerations @@ -28,11 +28,11 @@ The connector has a rate limit of 5 requests per second per account. ### Requirements -* ActiveCampaign account -* ActiveCampaign API Key +- ActiveCampaign account +- ActiveCampaign API Key ## Changelog -| Version | Date | Pull Request | Subject | -|:--------|:-----------| :----------- |:-----------------------------------------------------------| -| 0.1.0 | 2022-10-25 | [18335](https://github.com/airbytehq/airbyte/pull/18335) | Initial commit | \ No newline at end of file +| Version | Date | Pull Request | Subject | +| :------ | :--------- | :------------------------------------------------------- | :------------- | +| 0.1.0 | 2022-10-25 | [18335](https://github.com/airbytehq/airbyte/pull/18335) | Initial commit | diff --git a/docs/integrations/sources/adjust.md b/docs/integrations/sources/adjust.md index a359a1e0b3b..ec044778535 100644 --- a/docs/integrations/sources/adjust.md +++ b/docs/integrations/sources/adjust.md @@ -9,6 +9,7 @@ An API token is required to get hold of reports from the Adjust reporting API. S As Adjust allows you to setup custom events etc that are specific to your apps, only a subset of available metrics are made pre-selectable. To list all metrics that are available, query the filters data endpoint. Information about available metrics are available in the [Datascape metrics glossary](https://help.adjust.com/en/article/datascape-metrics-glossary). ### Full Metrics Listing + Take a look at the [filters data endpoint documentation](https://help.adjust.com/en/article/filters-data-endpoint) to see available filters. The example below shows how to obtain the events that are defined for your apps (replace the `API_KEY` with the key obtained in the previous step): ```sh @@ -37,5 +38,5 @@ The source connector supports the following [sync modes](https://docs.airbyte.co ## Changelog | Version | Date | Pull Request | Subject | -|---------|------------|----------------------------------------------------------|------------------| +| ------- | ---------- | -------------------------------------------------------- | ---------------- | | 0.1.0 | 2022-08-26 | [16051](https://github.com/airbytehq/airbyte/pull/16051) | Initial version. 
| diff --git a/docs/integrations/sources/aha.md b/docs/integrations/sources/aha.md index b5ec1d410ae..7214736ef53 100644 --- a/docs/integrations/sources/aha.md +++ b/docs/integrations/sources/aha.md @@ -1,5 +1,7 @@ # Aha API + API Documentation link [here](https://www.aha.io/api) + ## Overview The Aha API source supports full refresh syncs @@ -8,13 +10,13 @@ The Aha API source supports full refresh syncs Two output streams are available from this source: -*[features](https://www.aha.io/api/resources/features/list_features). -*[products](https://www.aha.io/api/resources/products/list_products_in_the_account). +- [features](https://www.aha.io/api/resources/features/list_features). +- [products](https://www.aha.io/api/resources/products/list_products_in_the_account). ### Features | Feature | Supported? | -|:------------------|:-----------| +| :---------------- | :--------- | | Full Refresh Sync | Yes | | Incremental Sync | No | @@ -26,7 +28,7 @@ Rate Limiting information is updated [here](https://www.aha.io/api#rate-limiting ### Requirements -* Aha API Key. +- Aha API Key. ### Connect using `API Key`: @@ -35,9 +37,9 @@ Rate Limiting information is updated [here](https://www.aha.io/api#rate-limiting ## Changelog -| Version | Date | Pull Request | Subject | -|:--------|:-----------|:---------------------------------------------------------|:------------------------------------------------| -| 0.3.1 | 2023-06-05 | [27002](https://github.com/airbytehq/airbyte/pull/27002) | Flag spec `api_key` field as `airbyte-secret` | +| Version | Date | Pull Request | Subject | +| :------ | :--------- | :------------------------------------------------------- | :---------------------------------------------------------------------- | +| 0.3.1 | 2023-06-05 | [27002](https://github.com/airbytehq/airbyte/pull/27002) | Flag spec `api_key` field as `airbyte-secret` | | 0.3.0 | 2023-05-30 | [22642](https://github.com/airbytehq/airbyte/pull/22642) | Add `idea_comments`, `idea_endorsements`, and `idea_categories` streams | -| 0.2.0 | 2023-05-26 | [26666](https://github.com/airbytehq/airbyte/pull/26666) | Fix integration test and schemas | -| 0.1.0 | 2022-11-02 | [18883](https://github.com/airbytehq/airbyte/pull/18893) | 🎉 New Source: Aha | +| 0.2.0 | 2023-05-26 | [26666](https://github.com/airbytehq/airbyte/pull/26666) | Fix integration test and schemas | +| 0.1.0 | 2022-11-02 | [18883](https://github.com/airbytehq/airbyte/pull/18893) | 🎉 New Source: Aha | diff --git a/docs/integrations/sources/aircall.md b/docs/integrations/sources/aircall.md index 01685351dbb..efad7a45605 100644 --- a/docs/integrations/sources/aircall.md +++ b/docs/integrations/sources/aircall.md @@ -13,9 +13,9 @@ Access Token (which acts as bearer token) is mandate for this connector to work, - Get an Aircall access token via settings (ref - https://dashboard.aircall.io/integrations/api-keys) - Setup params (All params are required) - Available params - - api_id: The auto generated id - - api_token: Seen at the Aircall settings (ref - https://dashboard.aircall.io/integrations/api-keys) - - start_date: Date filter for eligible streams, enter + - api_id: The auto generated id + - api_token: Seen at the Aircall settings (ref - https://dashboard.aircall.io/integrations/api-keys) + - start_date: Date filter for eligible streams, enter ### Step 2: Set up the Aircall connector in Airbyte @@ -32,7 +32,7 @@ Access Token (which acts as bearer token) is mandate for this connector to work, 1. Navigate to the Airbyte Open Source dashboard. 2. 
Set the name for your source. 3. Enter your `api_id, api_token and start_date`. -5. Click **Set up source**. +4. Click **Set up source**. ## Supported sync modes @@ -68,7 +68,7 @@ Aircall [API reference](https://api.aircall.io/v1) has v1 at present. The connec ## Changelog -| Version | Date | Pull Request | Subject | -|:--------|:-----------|:-------------------------------------------------------------------------------| :------------- | -| 0.1.0 | 2023-04-19 | [Init](https://github.com/airbytehq/airbyte/pull/) | Initial commit | -| 0.2.0 | 2023-06-20 | [Correcting availablity typo](https://github.com/airbytehq/airbyte/pull/27433) | Correcting availablity typo | \ No newline at end of file +| Version | Date | Pull Request | Subject | +| :------ | :--------- | :----------------------------------------------------------------------------- | :-------------------------- | +| 0.1.0 | 2023-04-19 | [Init](https://github.com/airbytehq/airbyte/pull/) | Initial commit | +| 0.2.0 | 2023-06-20 | [Correcting availablity typo](https://github.com/airbytehq/airbyte/pull/27433) | Correcting availablity typo | diff --git a/docs/integrations/sources/airtable-migrations.md b/docs/integrations/sources/airtable-migrations.md index 66a0d6526f0..0c9d014a05d 100644 --- a/docs/integrations/sources/airtable-migrations.md +++ b/docs/integrations/sources/airtable-migrations.md @@ -1,4 +1,5 @@ # Airtable Migration Guide ## Upgrading to 4.0.0 -Columns with Formulas are narrowing from `array` to `string` or `number`. You may need to refresh the connection schema (with the reset), and run a sync. \ No newline at end of file + +Columns with Formulas are narrowing from `array` to `string` or `number`. You may need to refresh the connection schema (with the reset), and run a sync. diff --git a/docs/integrations/sources/airtable.md b/docs/integrations/sources/airtable.md index 266ef1a1c9f..c6c7c9ded44 100644 --- a/docs/integrations/sources/airtable.md +++ b/docs/integrations/sources/airtable.md @@ -4,49 +4,57 @@ This page contains the setup guide and reference information for the [Airtable]( ## Prerequisites -* An active Airtable account -* [Personal Access Token](https://airtable.com/developers/web/guides/personal-access-tokens) with the following scopes: +- An active Airtable account +- [Personal Access Token](https://airtable.com/developers/web/guides/personal-access-tokens) with the following scopes: - `data.records:read` - `data.recordComments:read` - `schema.bases:read` ## Setup guide + ### Step 1: Set up Airtable + #### For Airbyte Open Source: + 1. Go to https://airtable.com/create/tokens to create new token. - ![Generate new Token](../../.gitbook/assets/source/airtable/generate_new_token.png) + ![Generate new Token](../../.gitbook/assets/source/airtable/generate_new_token.png) 2. Add following scopes: + - `data.records:read` - `data.recordComments:read` - `schema.bases:read` - ![Add Scopes](../../.gitbook/assets/source/airtable/add_scopes.png) + ![Add Scopes](../../.gitbook/assets/source/airtable/add_scopes.png) + 3. Select required bases or allow access to all available and press the `Create Token` button. - ![Add Bases](../../.gitbook/assets/source/airtable/add_bases.png) + ![Add Bases](../../.gitbook/assets/source/airtable/add_bases.png) 4. Save token from the popup window. ### Step 2: Set up Airtable connector in Airbyte + ### For Airbyte Cloud: 1. [Log into your Airbyte Cloud](https://cloud.airbyte.com/workspaces) account. 2. In the left navigation bar, click **Sources**. 
In the top-right corner, click **+new source**. 3. On the Set up the source page, enter the name for the Airtable connector and select **Airtable** from the Source type dropdown. 4. You can use OAuth or a Personal Access Token to authenticate your Airtable account. We recommend using OAuth for Airbyte Cloud. + - To authenticate using OAuth, select **OAuth2.0** from the Authentication dropdown click **Authenticate your Airtable account** to sign in with Airtable, select required workspaces you want to sync and authorize your account. - To authenticate using a Personal Access Token, select **Personal Access Token** from the Authentication dropdown and enter the Access Token for your Airtable account. -:::info -When using OAuth, you may see a `400` or `401` error causing a failed sync. You can re-authenticate your Airtable connector to solve the issue temporarily. We are working on a permanent fix that you can follow [here](https://github.com/airbytehq/airbyte/issues/25278). -::: + :::info + When using OAuth, you may see a `400` or `401` error causing a failed sync. You can re-authenticate your Airtable connector to solve the issue temporarily. We are working on a permanent fix that you can follow [here](https://github.com/airbytehq/airbyte/issues/25278). + ::: 5. Click **Set up source**. + ### For Airbyte OSS: 1. Navigate to the Airbyte Open Source dashboard @@ -57,7 +65,8 @@ When using OAuth, you may see a `400` or `401` error causing a failed sync. You ### Note on changed table names and deleted tables -Please keep in mind that if you start syncing a table via Airbyte, then rename it in your Airtable account, the connector will not continue syncing that table until you reset your connection schema and select it again. At that point, the table will begin syncing to a table with the new name in the destination. This is because there is no way for Airtable to tell Airbyte which tables have been renamed. Similarly, if you delete a table that was previously syncing, the connector will stop syncing it. + +Please keep in mind that if you start syncing a table via Airbyte, then rename it in your Airtable account, the connector will not continue syncing that table until you reset your connection schema and select it again. At that point, the table will begin syncing to a table with the new name in the destination. This is because there is no way for Airtable to tell Airbyte which tables have been renamed. Similarly, if you delete a table that was previously syncing, the connector will stop syncing it. ## Supported sync modes @@ -77,7 +86,7 @@ See information about rate limits [here](https://airtable.com/developers/web/api ## Data type map | Integration Type | Airbyte Type | Nullable | -|:------------------------|:---------------------------------------|----------| +| :---------------------- | :------------------------------------- | -------- | | `multipleAttachments` | `string` | Yes | | `autoNumber` | `string` | Yes | | `barcode` | `string` | Yes | @@ -110,16 +119,16 @@ See information about rate limits [here](https://airtable.com/developers/web/api | `multipleLookupValues` | `array with any` | Yes | | `rollup` | `array with any` | Yes | -* All the fields are `nullable` by default, meaning that the field could be empty. -* The `array with any` - represents the classic array with one of the other Airtable data types inside, such as: - - string - - number/integer - - nested lists/objects +- All the fields are `nullable` by default, meaning that the field could be empty. 
+- The `array with any` - represents the classic array with one of the other Airtable data types inside, such as: + - string + - number/integer + - nested lists/objects ## Changelog | Version | Date | Pull Request | Subject | -|:--------|:-----------|:---------------------------------------------------------|:---------------------------------------------------------------------------------------| +| :------ | :--------- | :------------------------------------------------------- | :------------------------------------------------------------------------------------- | | 4.2.0 | 2024-03-19 | [36267](https://github.com/airbytehq/airbyte/pull/36267) | Pin airbyte-cdk version to `^0` | | 4.1.6 | 2024-02-12 | [35149](https://github.com/airbytehq/airbyte/pull/35149) | Manage dependencies with Poetry. | | 4.1.5 | 2023-10-19 | [31599](https://github.com/airbytehq/airbyte/pull/31599) | Base image migration: remove Dockerfile and use the python-connector-base image | diff --git a/docs/integrations/sources/alpha-vantage.md b/docs/integrations/sources/alpha-vantage.md index 72789514e19..a9074e86cba 100644 --- a/docs/integrations/sources/alpha-vantage.md +++ b/docs/integrations/sources/alpha-vantage.md @@ -6,23 +6,22 @@ This source retrieves time series data from the free [Alpha Vantage](https://www.alphavantage.co/) API. It supports intraday, daily, weekly and monthly time series data. - ### Output schema This source is capable of syncing the following streams: -* `time_series_intraday` -* `time_series_daily` -* `time_series_daily_adjusted` (premium only) -* `time_series_weekly` -* `time_series_weekly_adjusted` -* `time_series_monthly` -* `time_series_monthly_adjusted` +- `time_series_intraday` +- `time_series_daily` +- `time_series_daily_adjusted` (premium only) +- `time_series_weekly` +- `time_series_weekly_adjusted` +- `time_series_monthly` +- `time_series_monthly_adjusted` ### Features | Feature | Supported? \(Yes/No\) | Notes | -|:------------------|:----------------------|:--------------------------------------------------------| +| :---------------- | :-------------------- | :------------------------------------------------------ | | Full Refresh Sync | Yes | | | Incremental Sync | No | | | API Environments | Yes | Both sandbox and production environments are supported. | @@ -30,7 +29,7 @@ This source is capable of syncing the following streams: ### Performance considerations Since a single API call returns the full history of a time series if -configured, it is recommended to use `Full Refresh` with `Overwrite` to avoid +configured, it is recommended to use `Full Refresh` with `Overwrite` to avoid storing duplicate data. Also, the data returned can be quite large. 
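To make the note above concrete, the underlying Alpha Vantage endpoint can be queried directly as sketched below. The `IBM` symbol and `demo` key are the public example values from the Alpha Vantage documentation, and `outputsize=full` is the option that returns the complete available history rather than only the most recent data points.

```sh
# Illustrative only: a single call that returns the full daily history for one symbol.
# Because the whole history arrives at once, Full Refresh + Overwrite avoids
# piling up duplicate rows in the destination.
curl -s "https://www.alphavantage.co/query?function=TIME_SERIES_DAILY&symbol=IBM&outputsize=full&apikey=demo"
```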
@@ -57,7 +56,7 @@ The following fields are required fields for the connector to work: ## Changelog -| Version | Date | Pull Request | Subject | -|:--------|:-----------|:---------------------------------------------------------|:-----------| +| Version | Date | Pull Request | Subject | +| :------ | :--------- | :------------------------------------------------------- | :-------------------------------- | | 0.1.1 | 2022-12-16 | [20564](https://github.com/airbytehq/airbyte/pull/20564) | add quote stream to alpha-vantage | -| 0.1.0 | 2022-10-21 | [18320](https://github.com/airbytehq/airbyte/pull/18320) | New source | +| 0.1.0 | 2022-10-21 | [18320](https://github.com/airbytehq/airbyte/pull/18320) | New source | diff --git a/docs/integrations/sources/amazon-ads-migrations.md b/docs/integrations/sources/amazon-ads-migrations.md index b9447fd491f..f8a4b028547 100644 --- a/docs/integrations/sources/amazon-ads-migrations.md +++ b/docs/integrations/sources/amazon-ads-migrations.md @@ -4,43 +4,47 @@ The following streams have updated schemas due to a change with the Amazon Ads API: -* `SponsoredBrandsCampaigns` -* `SponsoredBrandsAdGroups` -* `SponsoredProductsCampaigns` -* `SponsoredProductsAdGroupBidRecommendations` +- `SponsoredBrandsCampaigns` +- `SponsoredBrandsAdGroups` +- `SponsoredProductsCampaigns` +- `SponsoredProductsAdGroupBidRecommendations` ### Schema Changes - Removed/Added Fields -| Stream Name | Removed Fields | Added Fields | -|-------------------------------------------------|-----------------------------|--------------------------| -| `SponsoredBrandsCampaigns` | `serviceStatus`, `bidOptimization`, `bidMultiplier`, `adFormat`, `bidAdjustments`, `creative`, `landingPage`, `supplySource` | `ruleBasedBudget`, `bidding`, `productLocation`, `costType`, `smartDefault`, `extendedData` | -| `SponsoredBrandsAdGroups` | `bid`, `keywordId`, `keywordText`, `nativeLanuageKeyword`, `matchType` | `extendedData` | -| `SponsoredProductsCampaigns` | `campaignType`, `dailyBudget`, `ruleBasedBudget`, `premiumBidAdjustment`, `networks` | `dynamicBidding`, `budget`, `extendedData` | -| `SponsoredProductsAdGroupBidRecommendations` | `suggestedBid` | `theme`, `bidRecommendationsForTargetingExpressions` | +| Stream Name | Removed Fields | Added Fields | +| -------------------------------------------- | ---------------------------------------------------------------------------------------------------------------------------- | ------------------------------------------------------------------------------------------- | +| `SponsoredBrandsCampaigns` | `serviceStatus`, `bidOptimization`, `bidMultiplier`, `adFormat`, `bidAdjustments`, `creative`, `landingPage`, `supplySource` | `ruleBasedBudget`, `bidding`, `productLocation`, `costType`, `smartDefault`, `extendedData` | +| `SponsoredBrandsAdGroups` | `bid`, `keywordId`, `keywordText`, `nativeLanuageKeyword`, `matchType` | `extendedData` | +| `SponsoredProductsCampaigns` | `campaignType`, `dailyBudget`, `ruleBasedBudget`, `premiumBidAdjustment`, `networks` | `dynamicBidding`, `budget`, `extendedData` | +| `SponsoredProductsAdGroupBidRecommendations` | `suggestedBid` | `theme`, `bidRecommendationsForTargetingExpressions` | ### Refresh affected schemas and reset data 1. Select **Connections** in the main navbar. - 1. Select the connection(s) affected by the update. + 1. Select the connection(s) affected by the update. 2. Select the **Replication** tab. - 1. Select **Refresh source schema**. - 2. Select **OK**. + 1. Select **Refresh source schema**. + 2. 
Select **OK**. + ```note Any detected schema changes will be listed for your review. ``` + 3. Select **Save changes** at the bottom of the page. - 1. Ensure the **Reset affected streams** option is checked. + 1. Ensure the **Reset affected streams** option is checked. + ```note Depending on destination type you may not be prompted to reset your data. ``` + 4. Select **Save connection**. + ```note This will reset the data in your destination and initiate a fresh sync. ``` For more information on resetting your data in Airbyte, see [this page](https://docs.airbyte.com/operator-guides/reset). - ## Upgrading to 4.0.0 Streams `SponsoredBrandsAdGroups` and `SponsoredBrandsKeywords` now have updated schemas. @@ -48,19 +52,24 @@ Streams `SponsoredBrandsAdGroups` and `SponsoredBrandsKeywords` now have updated ### Refresh affected schemas and reset data 1. Select **Connections** in the main navbar. - 1. Select the connection(s) affected by the update. + 1. Select the connection(s) affected by the update. 2. Select the **Replication** tab. - 1. Select **Refresh source schema**. - 2. Select **OK**. + 1. Select **Refresh source schema**. + 2. Select **OK**. + ```note Any detected schema changes will be listed for your review. ``` + 3. Select **Save changes** at the bottom of the page. - 1. Ensure the **Reset affected streams** option is checked. + 1. Ensure the **Reset affected streams** option is checked. + ```note Depending on destination type you may not be prompted to reset your data. ``` + 4. Select **Save connection**. + ```note This will reset the data in your destination and initiate a fresh sync. ``` @@ -70,4 +79,4 @@ For more information on resetting your data in Airbyte, see [this page](https:// ## Upgrading to 3.0.0 A major update of attribution report stream schemas. -For a smooth migration, a data reset and a schema refresh are needed. \ No newline at end of file +For a smooth migration, a data reset and a schema refresh are needed. diff --git a/docs/integrations/sources/amazon-ads.md b/docs/integrations/sources/amazon-ads.md index f0e1cce041b..3904d87e3f6 100644 --- a/docs/integrations/sources/amazon-ads.md +++ b/docs/integrations/sources/amazon-ads.md @@ -1,28 +1,34 @@ # Amazon Ads + This page contains the setup guide and reference information for the Amazon Ads source connector. ## Prerequisites -* Client ID -* Client Secret -* Refresh Token -* Region -* Start Date (Optional) -* Profile IDs (Optional) -* Marketplace IDs (Optional) +- Client ID +- Client Secret +- Refresh Token +- Region +- Start Date (Optional) +- Profile IDs (Optional) +- Marketplace IDs (Optional) ## Setup guide + ### Step 1: Set up Amazon Ads + Create an [Amazon user](https://www.amazon.com) with access to an [Amazon Ads account](https://advertising.amazon.com). + **For Airbyte Open Source:** To use the [Amazon Ads API](https://advertising.amazon.com/API/docs/en-us), you must first complete the [onboarding process](https://advertising.amazon.com/API/docs/en-us/setting-up/overview). The onboarding process has several steps and may take several days to complete. After completing all steps you will have to get the Amazon client application's `Client ID`, `Client Secret` and `Refresh Token`. + ### Step 2: Set up the Amazon Ads connector in Airbyte + **For Airbyte Cloud:** 1. [Log into your Airbyte Cloud](https://cloud.airbyte.com/workspaces) account. @@ -38,6 +44,7 @@ To use the [Amazon Ads API](https://advertising.amazon.com/API/docs/en-us), you + **For Airbyte Open Source:** 1. 
**Client ID** of your Amazon Ads developer application. See [onboarding process](https://advertising.amazon.com/API/docs/en-us/setting-up/overview) for more details. @@ -50,44 +57,46 @@ To use the [Amazon Ads API](https://advertising.amazon.com/API/docs/en-us), you :::note -The Amazon Ads source connector uses Sponsored Products, Sponsored Brands, and Sponsored Display APIs which are not compatible with agency account type. See [docs](https://advertising.amazon.com/API/docs/en-us/concepts/authorization/profiles) for more details. +The Amazon Ads source connector uses Sponsored Products, Sponsored Brands, and Sponsored Display APIs which are not compatible with agency account type. See [docs](https://advertising.amazon.com/API/docs/en-us/concepts/authorization/profiles) for more details. If you have only agency profile, please use accounts associated with the profile of seller/vendor type. ::: - ## Supported sync modes + The Amazon Ads source connector supports the following [sync modes](https://docs.airbyte.com/cloud/core-concepts/#connection-sync-mode): - - Full Refresh - - Incremental + +- Full Refresh +- Incremental ## Supported Streams + This source is capable of syncing the following streams: -* [Profiles](https://advertising.amazon.com/API/docs/en-us/reference/2/profiles#/Profiles) -* [Portfolios](https://advertising.amazon.com/API/docs/en-us/reference/2/portfolios#/Portfolios%20extended) -* [Sponsored Brands Campaigns](https://advertising.amazon.com/API/docs/en-us/sponsored-brands/3-0/openapi#/Campaigns) -* [Sponsored Brands Ad groups](https://advertising.amazon.com/API/docs/en-us/sponsored-brands/3-0/openapi#/Ad%20groups) -* [Sponsored Brands Keywords](https://advertising.amazon.com/API/docs/en-us/sponsored-brands/3-0/openapi#/Keywords) -* [Sponsored Display Campaigns](https://advertising.amazon.com/API/docs/en-us/sponsored-display/3-0/openapi#/Campaigns) -* [Sponsored Display Ad groups](https://advertising.amazon.com/API/docs/en-us/sponsored-display/3-0/openapi#/Ad%20groups) -* [Sponsored Display Product Ads](https://advertising.amazon.com/API/docs/en-us/sponsored-display/3-0/openapi#/Product%20ads) -* [Sponsored Display Targetings](https://advertising.amazon.com/API/docs/en-us/sponsored-display/3-0/openapi#/Targeting) -* [Sponsored Display Creatives](https://advertising.amazon.com/API/docs/en-us/sponsored-display/3-0/openapi#/Creatives) -* [Sponsored Display Budget Rules](https://advertising.amazon.com/API/docs/en-us/sponsored-display/3-0/openapi/prod#/BudgetRules/GetSDBudgetRulesForAdvertiser) -* [Sponsored Products Campaigns](https://advertising.amazon.com/API/docs/en-us/sponsored-display/3-0/openapi#/Campaigns) -* [Sponsored Products Ad groups](https://advertising.amazon.com/API/docs/en-us/sponsored-products/2-0/openapi#/Ad%20groups) -* [Sponsored Products Ad Group Bid Recommendations](https://advertising.amazon.com/API/docs/en-us/sponsored-products/2-0/openapi#/Bid%20recommendations/getAdGroupBidRecommendations) -* [Sponsored Products Ad Group Suggested Keywords](https://advertising.amazon.com/API/docs/en-us/sponsored-products/2-0/openapi#/Suggested%20keywords) -* [Sponsored Products Keywords](https://advertising.amazon.com/API/docs/en-us/sponsored-products/2-0/openapi#/Keywords) -* [Sponsored Products Negative keywords](https://advertising.amazon.com/API/docs/en-us/sponsored-products/2-0/openapi#/Negative%20keywords) -* [Sponsored Products Campaign Negative keywords](https://advertising.amazon.com/API/docs/en-us/sponsored-products/2-0/openapi#/Negative%20keywords) -* 
[Sponsored Products Ads](https://advertising.amazon.com/API/docs/en-us/sponsored-products/2-0/openapi#/Product%20ads) -* [Sponsored Products Targetings](https://advertising.amazon.com/API/docs/en-us/sponsored-products/2-0/openapi#/Product%20targeting) -* [Brands Reports](https://advertising.amazon.com/API/docs/en-us/reference/sponsored-brands/2/reports) -* [Brand Video Reports](https://advertising.amazon.com/API/docs/en-us/reference/sponsored-brands/2/reports) -* [Display Reports](https://advertising.amazon.com/API/docs/en-us/sponsored-display/3-0/openapi#/Reports) (Contextual targeting only) -* [Products Reports](https://advertising.amazon.com/API/docs/en-us/sponsored-products/2-0/openapi#/Reports) -* [Attribution Reports](https://advertising.amazon.com/API/docs/en-us/amazon-attribution-prod-3p/#/) +- [Profiles](https://advertising.amazon.com/API/docs/en-us/reference/2/profiles#/Profiles) +- [Portfolios](https://advertising.amazon.com/API/docs/en-us/reference/2/portfolios#/Portfolios%20extended) +- [Sponsored Brands Campaigns](https://advertising.amazon.com/API/docs/en-us/sponsored-brands/3-0/openapi#/Campaigns) +- [Sponsored Brands Ad groups](https://advertising.amazon.com/API/docs/en-us/sponsored-brands/3-0/openapi#/Ad%20groups) +- [Sponsored Brands Keywords](https://advertising.amazon.com/API/docs/en-us/sponsored-brands/3-0/openapi#/Keywords) +- [Sponsored Display Campaigns](https://advertising.amazon.com/API/docs/en-us/sponsored-display/3-0/openapi#/Campaigns) +- [Sponsored Display Ad groups](https://advertising.amazon.com/API/docs/en-us/sponsored-display/3-0/openapi#/Ad%20groups) +- [Sponsored Display Product Ads](https://advertising.amazon.com/API/docs/en-us/sponsored-display/3-0/openapi#/Product%20ads) +- [Sponsored Display Targetings](https://advertising.amazon.com/API/docs/en-us/sponsored-display/3-0/openapi#/Targeting) +- [Sponsored Display Creatives](https://advertising.amazon.com/API/docs/en-us/sponsored-display/3-0/openapi#/Creatives) +- [Sponsored Display Budget Rules](https://advertising.amazon.com/API/docs/en-us/sponsored-display/3-0/openapi/prod#/BudgetRules/GetSDBudgetRulesForAdvertiser) +- [Sponsored Products Campaigns](https://advertising.amazon.com/API/docs/en-us/sponsored-display/3-0/openapi#/Campaigns) +- [Sponsored Products Ad groups](https://advertising.amazon.com/API/docs/en-us/sponsored-products/2-0/openapi#/Ad%20groups) +- [Sponsored Products Ad Group Bid Recommendations](https://advertising.amazon.com/API/docs/en-us/sponsored-products/2-0/openapi#/Bid%20recommendations/getAdGroupBidRecommendations) +- [Sponsored Products Ad Group Suggested Keywords](https://advertising.amazon.com/API/docs/en-us/sponsored-products/2-0/openapi#/Suggested%20keywords) +- [Sponsored Products Keywords](https://advertising.amazon.com/API/docs/en-us/sponsored-products/2-0/openapi#/Keywords) +- [Sponsored Products Negative keywords](https://advertising.amazon.com/API/docs/en-us/sponsored-products/2-0/openapi#/Negative%20keywords) +- [Sponsored Products Campaign Negative keywords](https://advertising.amazon.com/API/docs/en-us/sponsored-products/2-0/openapi#/Negative%20keywords) +- [Sponsored Products Ads](https://advertising.amazon.com/API/docs/en-us/sponsored-products/2-0/openapi#/Product%20ads) +- [Sponsored Products Targetings](https://advertising.amazon.com/API/docs/en-us/sponsored-products/2-0/openapi#/Product%20targeting) +- [Brands Reports](https://advertising.amazon.com/API/docs/en-us/reference/sponsored-brands/2/reports) +- [Brand Video 
Reports](https://advertising.amazon.com/API/docs/en-us/reference/sponsored-brands/2/reports) +- [Display Reports](https://advertising.amazon.com/API/docs/en-us/sponsored-display/3-0/openapi#/Reports) (Contextual targeting only) +- [Products Reports](https://advertising.amazon.com/API/docs/en-us/sponsored-products/2-0/openapi#/Reports) +- [Attribution Reports](https://advertising.amazon.com/API/docs/en-us/amazon-attribution-prod-3p/#/) :::note As of connector version 5.0.0, the `Sponsored Products Ad Group Bid Recommendations` stream provides bid recommendations and impact metrics for an existing automatic targeting ad group. The stream returns bid recommendations for match types `CLOSE_MATCH`, `LOOSE_MATCH`, `SUBSTITUTES`, and `COMPLEMENTS` per theme. For more detail on theme-based bid recommendations, review Amazon's [Theme-base bid suggestions - Quick-start guide](https://advertising.amazon.com/API/docs/en-us/guides/sponsored-products/bid-suggestions/theme-based-bid-suggestions-quickstart-guide). @@ -108,7 +117,7 @@ Information about expected report generation waiting time can be found [here](ht ### Data type mapping | Integration Type | Airbyte Type | -|:-------------------------|:-------------| +| :----------------------- | :----------- | | `string` | `string` | | `int`, `float`, `number` | `number` | | `date` | `date` | @@ -119,7 +128,7 @@ Information about expected report generation waiting time can be found [here](ht ## CHANGELOG | Version | Date | Pull Request | Subject | -|:--------|:-----------|:---------------------------------------------------------|:----------------------------------------------------------------------------------------------------------------| +| :------ | :--------- | :------------------------------------------------------- | :-------------------------------------------------------------------------------------------------------------- | | 5.0.1 | 2024-04-29 | [37655](https://github.com/airbytehq/airbyte/pull/37655) | Update error messages and spec with info about `agency` profile type. | | 5.0.0 | 2024-03-22 | [36169](https://github.com/airbytehq/airbyte/pull/36169) | Update `SponsoredBrand` and `SponsoredProduct` streams due to API endpoint deprecation | | 4.1.0 | 2024-03-19 | [36267](https://github.com/airbytehq/airbyte/pull/36267) | Pin airbyte-cdk version to `^0` | diff --git a/docs/integrations/sources/amazon-seller-partner-migrations.md b/docs/integrations/sources/amazon-seller-partner-migrations.md index 8cb9deade9b..710a8f25a7a 100644 --- a/docs/integrations/sources/amazon-seller-partner-migrations.md +++ b/docs/integrations/sources/amazon-seller-partner-migrations.md @@ -9,26 +9,30 @@ Users will need to refresh the source schema and reset this stream after upgradi ### Refresh affected schemas and reset data 1. Select **Connections** in the main navbar. - 1. Select the connection(s) affected by the update. + 1. Select the connection(s) affected by the update. 2. Select the **Replication** tab. - 1. Select **Refresh source schema**. - 2. Select **OK**. + 1. Select **Refresh source schema**. + 2. Select **OK**. + ```note Any detected schema changes will be listed for your review. ``` + 3. Select **Save changes** at the bottom of the page. - 1. Ensure the **Reset affected streams** option is checked. + 1. Ensure the **Reset affected streams** option is checked. + ```note Depending on destination type you may not be prompted to reset your data. ``` -4. Select **Save connection**. + +4. Select **Save connection**. 
+ ```note This will reset the data in your destination and initiate a fresh sync. ``` For more information on resetting your data in Airbyte, see [this page](https://docs.airbyte.com/operator-guides/reset). - ## Upgrading to 3.0.0 Streams `GET_FLAT_FILE_ALL_ORDERS_DATA_BY_ORDER_DATE_GENERAL` and `GET_FLAT_FILE_ALL_ORDERS_DATA_BY_LAST_UPDATE_GENERAL` now have updated schemas. @@ -36,10 +40,10 @@ Streams `GET_FLAT_FILE_ALL_ORDERS_DATA_BY_ORDER_DATE_GENERAL` and `GET_FLAT_FILE The following streams now have date-time formatted fields: | Stream | Affected fields | Format change | -|-----------------------------------------------|-------------------------------------------------------------------------------|----------------------------------------------------------------------| +| --------------------------------------------- | ----------------------------------------------------------------------------- | -------------------------------------------------------------------- | | `GET_AMAZON_FULFILLED_SHIPMENTS_DATA_GENERAL` | `estimated-arrival-date` | `string YYYY-MM-DDTHH:mm:ssZ` -> `date-time YYYY-MM-DDTHH:mm:ssZ` | | `GET_LEDGER_DETAIL_VIEW_DATA` | `Date and Time` | `string YYYY-MM-DDTHH:mm:ssZ` -> `date-time YYYY-MM-DDTHH:mm:ssZ` | -| `GET_MERCHANTS_LISTINGS_FYP_REPORT` | `Status Change Date` | `string MMM D[,] YYYY` -> `date-time YYYY-MM-DD` | +| `GET_MERCHANTS_LISTINGS_FYP_REPORT` | `Status Change Date` | `string MMM D[,] YYYY` -> `date-time YYYY-MM-DD` | | `GET_STRANDED_INVENTORY_UI_DATA` | `Date-to-take-auto-removal` | `string YYYY-MM-DDTHH:mm:ssZ` -> `date-time YYYY-MM-DDTHH:mm:ssZ` | | `GET_V2_SETTLEMENT_REPORT_DATA_FLAT_FILE` | `settlement-start-date`, `settlement-end-date`, `deposit-date`, `posted-date` | `string YYYY-MM-DDTHH:mm:ssZ` -> `date-time YYYY-MM-DDTHH:mm:ssZ` | | `GET_MERCHANT_LISTINGS_ALL_DATA` | `open-date` | `string YYYY-MM-DD HH:mm:ss ZZZ` -> `date-time YYYY-MM-DDTHH:mm:ssZ` | @@ -47,48 +51,53 @@ The following streams now have date-time formatted fields: | `GET_MERCHANT_LISTINGS_INACTIVE_DATA` | `open-date` | `string YYYY-MM-DD HH:mm:ss ZZZ` -> `date-time YYYY-MM-DDTHH:mm:ssZ` | | `GET_MERCHANT_LISTINGS_DATA_BACK_COMPAT` | `open-date` | `string YYYY-MM-DD HH:mm:ss ZZZ` -> `date-time YYYY-MM-DDTHH:mm:ssZ` | - Users will need to refresh the source schemas and reset these streams after upgrading. ### Refresh affected schemas and reset data 1. Select **Connections** in the main navbar. - 1. Select the connection(s) affected by the update. + 1. Select the connection(s) affected by the update. 2. Select the **Replication** tab. - 1. Select **Refresh source schema**. - 2. Select **OK**. + 1. Select **Refresh source schema**. + 2. Select **OK**. + ```note Any detected schema changes will be listed for your review. ``` + 3. Select **Save changes** at the bottom of the page. - 1. Ensure the **Reset affected streams** option is checked. + 1. Ensure the **Reset affected streams** option is checked. + ```note Depending on destination type you may not be prompted to reset your data. ``` -4. Select **Save connection**. + +4. Select **Save connection**. + ```note This will reset the data in your destination and initiate a fresh sync. ``` For more information on resetting your data in Airbyte, see [this page](https://docs.airbyte.com/operator-guides/reset). - ## Upgrading to 2.0.0 This change removes Brand Analytics and permanently removes deprecated FBA reports (from Airbyte Cloud). 
Customers who have those streams must refresh their schema OR disable the following streams: -* `GET_BRAND_ANALYTICS_MARKET_BASKET_REPORT` -* `GET_BRAND_ANALYTICS_SEARCH_TERMS_REPORT` -* `GET_BRAND_ANALYTICS_REPEAT_PURCHASE_REPORT` -* `GET_BRAND_ANALYTICS_ALTERNATE_PURCHASE_REPORT` -* `GET_BRAND_ANALYTICS_ITEM_COMPARISON_REPORT` -* `GET_SALES_AND_TRAFFIC_REPORT` -* `GET_VENDOR_SALES_REPORT` -* `GET_VENDOR_INVENTORY_REPORT` + +- `GET_BRAND_ANALYTICS_MARKET_BASKET_REPORT` +- `GET_BRAND_ANALYTICS_SEARCH_TERMS_REPORT` +- `GET_BRAND_ANALYTICS_REPEAT_PURCHASE_REPORT` +- `GET_BRAND_ANALYTICS_ALTERNATE_PURCHASE_REPORT` +- `GET_BRAND_ANALYTICS_ITEM_COMPARISON_REPORT` +- `GET_SALES_AND_TRAFFIC_REPORT` +- `GET_VENDOR_SALES_REPORT` +- `GET_VENDOR_INVENTORY_REPORT` Customers, who have the following streams, will have to disable them: -* `GET_FBA_FULFILLMENT_INVENTORY_ADJUSTMENTS_DATA` -* `GET_FBA_FULFILLMENT_CURRENT_INVENTORY_DATA` -* `GET_FBA_FULFILLMENT_INVENTORY_RECEIPTS_DATA` -* `GET_FBA_FULFILLMENT_INVENTORY_SUMMARY_DATA` -* `GET_FBA_FULFILLMENT_MONTHLY_INVENTORY_DATA` + +- `GET_FBA_FULFILLMENT_INVENTORY_ADJUSTMENTS_DATA` +- `GET_FBA_FULFILLMENT_CURRENT_INVENTORY_DATA` +- `GET_FBA_FULFILLMENT_INVENTORY_RECEIPTS_DATA` +- `GET_FBA_FULFILLMENT_INVENTORY_SUMMARY_DATA` +- `GET_FBA_FULFILLMENT_MONTHLY_INVENTORY_DATA` diff --git a/docs/integrations/sources/amazon-seller-partner.md b/docs/integrations/sources/amazon-seller-partner.md index a2a3c52527c..c91578530c0 100644 --- a/docs/integrations/sources/amazon-seller-partner.md +++ b/docs/integrations/sources/amazon-seller-partner.md @@ -69,10 +69,10 @@ To pass the check for Seller and Vendor accounts, you must have access to the [O **For Airbyte Open Source:** -1. Using developer application from Step 1, [generate](https://developer-docs.amazon.com/sp-api/docs/self-authorization) refresh token. +1. Using developer application from Step 1, [generate](https://developer-docs.amazon.com/sp-api/docs/self-authorization) refresh token. 2. Go to local Airbyte page. 3. On the Set up the source page, select **Amazon Seller Partner** from the **Source type** dropdown. -4. Enter a name for the Amazon Seller Partner connector. +4. Enter a name for the Amazon Seller Partner connector. 5. For Start Date, enter the date in YYYY-MM-DD format. The data added on and after this date will be replicated. This field is optional - if not provided, the date 2 years ago from today will be used. 6. For End Date, enter the date in YYYY-MM-DD format. Any data after this date will not be replicated. This field is optional - if not provided, today's date will be used. 7. You can specify report options for each stream using **Report Options** section. Available options can be found in corresponding category [here](https://developer-docs.amazon.com/sp-api/docs/report-type-values). @@ -83,8 +83,9 @@ To pass the check for Seller and Vendor accounts, you must have access to the [O ## Supported sync modes The Amazon Seller Partner source connector supports the following [sync modes](https://docs.airbyte.com/cloud/core-concepts/#connection-sync-mode): - - Full Refresh - - Incremental + +- Full Refresh +- Incremental ## Supported streams @@ -160,7 +161,7 @@ Information about rate limits you may find [here](https://developer-docs.amazon. 
## Data type map | Integration Type | Airbyte Type | -|:-------------------------|:-------------| +| :----------------------- | :----------- | | `string` | `string` | | `int`, `float`, `number` | `number` | | `date` | `date` | @@ -171,7 +172,7 @@ Information about rate limits you may find [here](https://developer-docs.amazon. ## Changelog | Version | Date | Pull Request | Subject | -|:--------|:-----------|:----------------------------------------------------------|:------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------| +| :------ | :--------- | :-------------------------------------------------------- | :---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------- | | 4.2.2 | 2024-04-24 | [#36630](https://github.com/airbytehq/airbyte/pull/36630) | Schema descriptions and CDK 0.80.0 | | 4.2.1 | 2024-04-08 | [#36895](https://github.com/airbytehq/airbyte/pull/36895) | Fix `reportPeriod` day query params | | 4.2.0 | 2024-03-19 | [#36267](https://github.com/airbytehq/airbyte/pull/36267) | Pin airbyte-cdk version to `^0` | diff --git a/docs/integrations/sources/amplitude.md b/docs/integrations/sources/amplitude.md index 5fa611b4e3b..af3bbba4ccc 100644 --- a/docs/integrations/sources/amplitude.md +++ b/docs/integrations/sources/amplitude.md @@ -9,7 +9,7 @@ To set up the Amplitude source connector, you'll need your Amplitude [`API Key` ## Set up the Amplitude source connector 1. Log into your [Airbyte Cloud](https://cloud.airbyte.com/workspaces) or Airbyte Open Source account. -2. Click **Sources** and then click **+ New source**. +2. Click **Sources** and then click **+ New source**. 3. On the Set up the source page, select **Amplitude** from the Source type dropdown. 4. Enter a name for your source. 5. For **API Key** and **Secret Key**, enter the Amplitude [API key and secret key](https://help.amplitude.com/hc/en-us/articles/360058073772-Create-and-manage-organizations-and-projects#view-and-edit-your-project-information). 
@@ -20,14 +20,16 @@ To set up the Amplitude source connector, you'll need your Amplitude [`API Key` The Amplitude source connector supports the following streams: -* [Active Users Counts](https://www.docs.developers.amplitude.com/analytics/apis/dashboard-rest-api/#get-active-and-new-user-counts) \(Incremental sync\) -* [Annotations](https://www.docs.developers.amplitude.com/analytics/apis/chart-annotations-api/#get-all-chart-annotations) -* [Average Session Length](https://www.docs.developers.amplitude.com/analytics/apis/dashboard-rest-api/#get-average-session-length) \(Incremental sync\) -* [Cohorts](https://www.docs.developers.amplitude.com/analytics/apis/behavioral-cohorts-api/#get-all-cohorts-response) -* [Events](https://www.docs.developers.amplitude.com/analytics/apis/export-api/#response-schema) \(Incremental sync\) +- [Active Users Counts](https://www.docs.developers.amplitude.com/analytics/apis/dashboard-rest-api/#get-active-and-new-user-counts) \(Incremental sync\) +- [Annotations](https://www.docs.developers.amplitude.com/analytics/apis/chart-annotations-api/#get-all-chart-annotations) +- [Average Session Length](https://www.docs.developers.amplitude.com/analytics/apis/dashboard-rest-api/#get-average-session-length) \(Incremental sync\) +- [Cohorts](https://www.docs.developers.amplitude.com/analytics/apis/behavioral-cohorts-api/#get-all-cohorts-response) +- [Events](https://www.docs.developers.amplitude.com/analytics/apis/export-api/#response-schema) \(Incremental sync\) If there are more endpoints you'd like Airbyte to support, please [create an issue.](https://github.com/airbytehq/airbyte/issues/new/choose) + + ## Supported sync modes The Amplitude source connector supports the following [sync modes](https://docs.airbyte.com/cloud/core-concepts#connection-sync-modes): @@ -50,47 +52,48 @@ The Amplitude connector ideally should gracefully handle Amplitude API limitatio ## Changelog -| Version | Date | Pull Request | Subject | -|:--------|:-----------|:---------------------------------------------------------|:----------------------------------------------------------------------------------------------------------| -| 0.3.10 | 2024-04-19 | [36631](https://github.com/airbytehq/airbyte/pull/36631) | Updating to 0.80.0 CDK | -| 0.3.9 | 2024-04-12 | [36631](https://github.com/airbytehq/airbyte/pull/36631) | schema descriptions | -| 0.3.8 | 2024-03-12 | [35987](https://github.com/airbytehq/airbyte/pull/35987) | Unpin CDK version | -| 0.3.7 | 2024-02-12 | [35162](https://github.com/airbytehq/airbyte/pull/35162) | Manage dependencies with Poetry. 
| -| 0.3.6 | 2023-10-23 | [31702](https://github.com/airbytehq/airbyte/pull/31702) | Base image migration: remove Dockerfile and use the python-connector-base image | -| 0.3.5 | 2023-09-28 | [30846](https://github.com/airbytehq/airbyte/pull/30846) | Add support of multiple cursor date formats | -| 0.3.4 | 2023-09-28 | [30831](https://github.com/airbytehq/airbyte/pull/30831) | Add user friendly error description on 403 error | -| 0.3.3 | 2023-09-21 | [30652](https://github.com/airbytehq/airbyte/pull/30652) | Update spec: declare `start_date` type as `date-time` | -| 0.3.2 | 2023-09-18 | [30525](https://github.com/airbytehq/airbyte/pull/30525) | Fix `KeyError` while getting `data_region` from config | -| 0.3.1 | 2023-09-15 | [30471](https://github.com/airbytehq/airbyte/pull/30471) | Fix `Event` stream: Use `start_time` instead of cursor in the case of more recent | -| 0.3.0 | 2023-09-13 | [30378](https://github.com/airbytehq/airbyte/pull/30378) | Switch to latest CDK version | -| 0.2.4 | 2023-05-05 | [25842](https://github.com/airbytehq/airbyte/pull/25842) | added missing attrs in events schema, enabled default availability strategy | -| 0.2.3 | 2023-04-20 | [25317](https://github.com/airbytehq/airbyte/pull/25317) | Refactor Events Stream, use pre-YAML version based on Python CDK | -| 0.2.2 | 2023-04-19 | [25315](https://github.com/airbytehq/airbyte/pull/25315) | Refactor to only fetch date_time_fields once per request | -| 0.2.1 | 2023-02-03 | [25281](https://github.com/airbytehq/airbyte/pull/25281) | Reduce request_time_range to 4 hours | -| 0.2.0 | 2023-02-03 | [22362](https://github.com/airbytehq/airbyte/pull/22362) | Migrate to YAML | -| 0.1.24 | 2023-03-28 | [21022](https://github.com/airbytehq/airbyte/pull/21022) | Enable event stream time interval selection | -| 0.1.23 | 2023-03-02 | [23087](https://github.com/airbytehq/airbyte/pull/23087) | Specified date formatting in specification | -| 0.1.22 | 2023-02-17 | [23192](https://github.com/airbytehq/airbyte/pull/23192) | Skip the stream if `start_date` is specified in the future. | -| 0.1.21 | 2023-02-01 | [21888](https://github.com/airbytehq/airbyte/pull/21888) | Set `AvailabilityStrategy` for streams explicitly to `None` | -| 0.1.20 | 2023-01-27 | [21957](https://github.com/airbytehq/airbyte/pull/21957) | Handle null values and empty strings in date-time fields | -| 0.1.19 | 2022-12-09 | [19727](https://github.com/airbytehq/airbyte/pull/19727) | Remove `data_region` as required | -| 0.1.18 | 2022-12-08 | [19727](https://github.com/airbytehq/airbyte/pull/19727) | Add parameter to select region | -| 0.1.17 | 2022-10-31 | [18684](https://github.com/airbytehq/airbyte/pull/18684) | Add empty `series` validation for `AverageSessionLength` stream | -| 0.1.16 | 2022-10-11 | [17854](https://github.com/airbytehq/airbyte/pull/17854) | Add empty `series` validation for `ActtiveUsers` steam | -| 0.1.15 | 2022-10-03 | [17320](https://github.com/airbytehq/airbyte/pull/17320) | Add validation `start_date` filed if it's in the future | -| 0.1.14 | 2022-09-28 | [17326](https://github.com/airbytehq/airbyte/pull/17326) | Migrate to per-stream states. 
| -| 0.1.13 | 2022-08-31 | [16185](https://github.com/airbytehq/airbyte/pull/16185) | Re-release on new `airbyte_cdk==0.1.81` | -| 0.1.12 | 2022-08-11 | [15506](https://github.com/airbytehq/airbyte/pull/15506) | Changed slice day window to 1, instead of 3 for Events stream | -| 0.1.11 | 2022-07-21 | [14924](https://github.com/airbytehq/airbyte/pull/14924) | Remove `additionalProperties` field from spec | -| 0.1.10 | 2022-06-16 | [13846](https://github.com/airbytehq/airbyte/pull/13846) | Try-catch the BadZipFile error | -| 0.1.9 | 2022-06-10 | [13638](https://github.com/airbytehq/airbyte/pull/13638) | Fixed an infinite loop when fetching Amplitude data | -| 0.1.8 | 2022-06-01 | [13373](https://github.com/airbytehq/airbyte/pull/13373) | Fixed the issue when JSON Validator produces errors on `date-time` check | -| 0.1.7 | 2022-05-21 | [13074](https://github.com/airbytehq/airbyte/pull/13074) | Removed time offset for `Events` stream, which caused a lot of duplicated records | -| 0.1.6 | 2022-04-30 | [12500](https://github.com/airbytehq/airbyte/pull/12500) | Improve input configuration copy | -| 0.1.5 | 2022-04-28 | [12430](https://github.com/airbytehq/airbyte/pull/12430) | Added HTTP error descriptions and fixed `Events` stream fail caused by `404` HTTP Error | -| 0.1.4 | 2021-12-23 | [8434](https://github.com/airbytehq/airbyte/pull/8434) | Update fields in source-connectors specifications | -| 0.1.3 | 2021-10-12 | [6375](https://github.com/airbytehq/airbyte/pull/6375) | Log Transient 404 Error in Events stream | -| 0.1.2 | 2021-09-21 | [6353](https://github.com/airbytehq/airbyte/pull/6353) | Correct output schemas on cohorts, events, active\_users, and average\_session\_lengths streams | -| 0.1.1 | 2021-06-09 | [3973](https://github.com/airbytehq/airbyte/pull/3973) | Add AIRBYTE\_ENTRYPOINT for kubernetes support | -| 0.1.0 | 2021-06-08 | [3664](https://github.com/airbytehq/airbyte/pull/3664) | New Source: Amplitude | +| Version | Date | Pull Request | Subject | +| :------ | :--------- | :------------------------------------------------------- | :------------------------------------------------------------------------------------------- | +| 0.3.10 | 2024-04-19 | [36631](https://github.com/airbytehq/airbyte/pull/36631) | Updating to 0.80.0 CDK | +| 0.3.9 | 2024-04-12 | [36631](https://github.com/airbytehq/airbyte/pull/36631) | schema descriptions | +| 0.3.8 | 2024-03-12 | [35987](https://github.com/airbytehq/airbyte/pull/35987) | Unpin CDK version | +| 0.3.7 | 2024-02-12 | [35162](https://github.com/airbytehq/airbyte/pull/35162) | Manage dependencies with Poetry. 
| +| 0.3.6 | 2023-10-23 | [31702](https://github.com/airbytehq/airbyte/pull/31702) | Base image migration: remove Dockerfile and use the python-connector-base image | +| 0.3.5 | 2023-09-28 | [30846](https://github.com/airbytehq/airbyte/pull/30846) | Add support of multiple cursor date formats | +| 0.3.4 | 2023-09-28 | [30831](https://github.com/airbytehq/airbyte/pull/30831) | Add user friendly error description on 403 error | +| 0.3.3 | 2023-09-21 | [30652](https://github.com/airbytehq/airbyte/pull/30652) | Update spec: declare `start_date` type as `date-time` | +| 0.3.2 | 2023-09-18 | [30525](https://github.com/airbytehq/airbyte/pull/30525) | Fix `KeyError` while getting `data_region` from config | +| 0.3.1 | 2023-09-15 | [30471](https://github.com/airbytehq/airbyte/pull/30471) | Fix `Event` stream: Use `start_time` instead of cursor in the case of more recent | +| 0.3.0 | 2023-09-13 | [30378](https://github.com/airbytehq/airbyte/pull/30378) | Switch to latest CDK version | +| 0.2.4 | 2023-05-05 | [25842](https://github.com/airbytehq/airbyte/pull/25842) | added missing attrs in events schema, enabled default availability strategy | +| 0.2.3 | 2023-04-20 | [25317](https://github.com/airbytehq/airbyte/pull/25317) | Refactor Events Stream, use pre-YAML version based on Python CDK | +| 0.2.2 | 2023-04-19 | [25315](https://github.com/airbytehq/airbyte/pull/25315) | Refactor to only fetch date_time_fields once per request | +| 0.2.1 | 2023-02-03 | [25281](https://github.com/airbytehq/airbyte/pull/25281) | Reduce request_time_range to 4 hours | +| 0.2.0 | 2023-02-03 | [22362](https://github.com/airbytehq/airbyte/pull/22362) | Migrate to YAML | +| 0.1.24 | 2023-03-28 | [21022](https://github.com/airbytehq/airbyte/pull/21022) | Enable event stream time interval selection | +| 0.1.23 | 2023-03-02 | [23087](https://github.com/airbytehq/airbyte/pull/23087) | Specified date formatting in specification | +| 0.1.22 | 2023-02-17 | [23192](https://github.com/airbytehq/airbyte/pull/23192) | Skip the stream if `start_date` is specified in the future. | +| 0.1.21 | 2023-02-01 | [21888](https://github.com/airbytehq/airbyte/pull/21888) | Set `AvailabilityStrategy` for streams explicitly to `None` | +| 0.1.20 | 2023-01-27 | [21957](https://github.com/airbytehq/airbyte/pull/21957) | Handle null values and empty strings in date-time fields | +| 0.1.19 | 2022-12-09 | [19727](https://github.com/airbytehq/airbyte/pull/19727) | Remove `data_region` as required | +| 0.1.18 | 2022-12-08 | [19727](https://github.com/airbytehq/airbyte/pull/19727) | Add parameter to select region | +| 0.1.17 | 2022-10-31 | [18684](https://github.com/airbytehq/airbyte/pull/18684) | Add empty `series` validation for `AverageSessionLength` stream | +| 0.1.16 | 2022-10-11 | [17854](https://github.com/airbytehq/airbyte/pull/17854) | Add empty `series` validation for `ActtiveUsers` steam | +| 0.1.15 | 2022-10-03 | [17320](https://github.com/airbytehq/airbyte/pull/17320) | Add validation `start_date` filed if it's in the future | +| 0.1.14 | 2022-09-28 | [17326](https://github.com/airbytehq/airbyte/pull/17326) | Migrate to per-stream states. 
| +| 0.1.13 | 2022-08-31 | [16185](https://github.com/airbytehq/airbyte/pull/16185) | Re-release on new `airbyte_cdk==0.1.81` | +| 0.1.12 | 2022-08-11 | [15506](https://github.com/airbytehq/airbyte/pull/15506) | Changed slice day window to 1, instead of 3 for Events stream | +| 0.1.11 | 2022-07-21 | [14924](https://github.com/airbytehq/airbyte/pull/14924) | Remove `additionalProperties` field from spec | +| 0.1.10 | 2022-06-16 | [13846](https://github.com/airbytehq/airbyte/pull/13846) | Try-catch the BadZipFile error | +| 0.1.9 | 2022-06-10 | [13638](https://github.com/airbytehq/airbyte/pull/13638) | Fixed an infinite loop when fetching Amplitude data | +| 0.1.8 | 2022-06-01 | [13373](https://github.com/airbytehq/airbyte/pull/13373) | Fixed the issue when JSON Validator produces errors on `date-time` check | +| 0.1.7 | 2022-05-21 | [13074](https://github.com/airbytehq/airbyte/pull/13074) | Removed time offset for `Events` stream, which caused a lot of duplicated records | +| 0.1.6 | 2022-04-30 | [12500](https://github.com/airbytehq/airbyte/pull/12500) | Improve input configuration copy | +| 0.1.5 | 2022-04-28 | [12430](https://github.com/airbytehq/airbyte/pull/12430) | Added HTTP error descriptions and fixed `Events` stream fail caused by `404` HTTP Error | +| 0.1.4 | 2021-12-23 | [8434](https://github.com/airbytehq/airbyte/pull/8434) | Update fields in source-connectors specifications | +| 0.1.3 | 2021-10-12 | [6375](https://github.com/airbytehq/airbyte/pull/6375) | Log Transient 404 Error in Events stream | +| 0.1.2 | 2021-09-21 | [6353](https://github.com/airbytehq/airbyte/pull/6353) | Correct output schemas on cohorts, events, active_users, and average_session_lengths streams | +| 0.1.1 | 2021-06-09 | [3973](https://github.com/airbytehq/airbyte/pull/3973) | Add AIRBYTE_ENTRYPOINT for kubernetes support | +| 0.1.0 | 2021-06-08 | [3664](https://github.com/airbytehq/airbyte/pull/3664) | New Source: Amplitude | + diff --git a/docs/integrations/sources/apify-dataset-migrations.md b/docs/integrations/sources/apify-dataset-migrations.md index f4bb1ed7c32..585614d997c 100644 --- a/docs/integrations/sources/apify-dataset-migrations.md +++ b/docs/integrations/sources/apify-dataset-migrations.md @@ -5,6 +5,7 @@ Major update: The old broken Item Collection stream has been removed and replaced with a new Item Collection (WCC) stream specific for the datasets produced by [Website Content Crawler](https://apify.com/apify/website-content-crawler) Actor. In a follow-up release 2.1.0, a generic item collection stream will be added to support all other datasets. 
After upgrading, users should: + - Reconfigure dataset id and API key - Reset all streams diff --git a/docs/integrations/sources/apify-dataset.md b/docs/integrations/sources/apify-dataset.md index 94507474edd..f8a51d89f91 100644 --- a/docs/integrations/sources/apify-dataset.md +++ b/docs/integrations/sources/apify-dataset.md @@ -41,45 +41,45 @@ The Apify dataset connector uses [Apify Python Client](https://docs.apify.com/ap - Calls `api.apify.com/v2/datasets` ([docs](https://docs.apify.com/api/v2#/reference/datasets/dataset-collection/get-list-of-datasets)) - Properties: - - Apify Personal API token (you can find it [here](https://console.apify.com/account/integrations)) + - Apify Personal API token (you can find it [here](https://console.apify.com/account/integrations)) ### `dataset` - Calls `https://api.apify.com/v2/datasets/{datasetId}` ([docs](https://docs.apify.com/api/v2#/reference/datasets/dataset/get-dataset)) - Properties: - - Apify Personal API token (you can find it [here](https://console.apify.com/account/integrations)) - - Dataset ID (check the [docs](https://docs.apify.com/platform/storage/dataset)) + - Apify Personal API token (you can find it [here](https://console.apify.com/account/integrations)) + - Dataset ID (check the [docs](https://docs.apify.com/platform/storage/dataset)) ### `item_collection` - Calls `api.apify.com/v2/datasets/{datasetId}/items` ([docs](https://docs.apify.com/api/v2#/reference/datasets/item-collection/get-items)) - Properties: - - Apify Personal API token (you can find it [here](https://console.apify.com/account/integrations)) - - Dataset ID (check the [docs](https://docs.apify.com/platform/storage/dataset)) + - Apify Personal API token (you can find it [here](https://console.apify.com/account/integrations)) + - Dataset ID (check the [docs](https://docs.apify.com/platform/storage/dataset)) - Limitations: - - The stream uses a dynamic schema (all the data are stored under the `"data"` key), so it should support all the Apify Datasets (produced by whatever Actor). + - The stream uses a dynamic schema (all the data are stored under the `"data"` key), so it should support all the Apify Datasets (produced by whatever Actor). ### `item_collection_website_content_crawler` - Calls the same endpoint and uses the same properties as the `item_collection` stream. - Limitations: - - The stream uses a static schema which corresponds to the datasets produced by [Website Content Crawler](https://apify.com/apify/website-content-crawler) Actor. So only datasets produced by this Actor are supported. + - The stream uses a static schema which corresponds to the datasets produced by [Website Content Crawler](https://apify.com/apify/website-content-crawler) Actor. So only datasets produced by this Actor are supported. ## Changelog -| Version | Date | Pull Request | Subject | -| :------ | :--------- | :----------------------------------------------------------- | :-------------------------------------------------------------------------- | -| 2.1.5 | 2024-04-19 | [37115](https://github.com/airbytehq/airbyte/pull/37115) | Updating to 0.80.0 CDK | -| 2.1.4 | 2024-04-18 | [37115](https://github.com/airbytehq/airbyte/pull/37115) | Manage dependencies with Poetry. 
| -| 2.1.3 | 2024-04-15 | [37115](https://github.com/airbytehq/airbyte/pull/37115) | Base image migration: remove Dockerfile and use the python-connector-base image | -| 2.1.2 | 2024-04-12 | [37115](https://github.com/airbytehq/airbyte/pull/37115) | schema descriptions | -| 2.1.1 | 2023-12-14 | [33414](https://github.com/airbytehq/airbyte/pull/33414) | Prepare for airbyte-lib | -| 2.1.0 | 2023-10-13 | [31333](https://github.com/airbytehq/airbyte/pull/31333) | Add stream for arbitrary datasets | -| 2.0.0 | 2023-09-18 | [30428](https://github.com/airbytehq/airbyte/pull/30428) | Fix broken stream, manifest refactor | -| 1.0.0 | 2023-08-25 | [29859](https://github.com/airbytehq/airbyte/pull/29859) | Migrate to lowcode | -| 0.2.0 | 2022-06-20 | [28290](https://github.com/airbytehq/airbyte/pull/28290) | Make connector work with platform changes not syncing empty stream schemas. | -| 0.1.11 | 2022-04-27 | [12397](https://github.com/airbytehq/airbyte/pull/12397) | No changes. Used connector to test publish workflow changes. | -| 0.1.9 | 2022-04-05 | [PR\#11712](https://github.com/airbytehq/airbyte/pull/11712) | No changes from 0.1.4. Used connector to test publish workflow changes. | -| 0.1.4 | 2021-12-23 | [PR\#8434](https://github.com/airbytehq/airbyte/pull/8434) | Update fields in source-connectors specifications | -| 0.1.2 | 2021-11-08 | [PR\#7499](https://github.com/airbytehq/airbyte/pull/7499) | Remove base-python dependencies | -| 0.1.0 | 2021-07-29 | [PR\#5069](https://github.com/airbytehq/airbyte/pull/5069) | Initial version of the connector | +| Version | Date | Pull Request | Subject | +| :------ | :--------- | :----------------------------------------------------------- | :------------------------------------------------------------------------------ | +| 2.1.5 | 2024-04-19 | [37115](https://github.com/airbytehq/airbyte/pull/37115) | Updating to 0.80.0 CDK | +| 2.1.4 | 2024-04-18 | [37115](https://github.com/airbytehq/airbyte/pull/37115) | Manage dependencies with Poetry. | +| 2.1.3 | 2024-04-15 | [37115](https://github.com/airbytehq/airbyte/pull/37115) | Base image migration: remove Dockerfile and use the python-connector-base image | +| 2.1.2 | 2024-04-12 | [37115](https://github.com/airbytehq/airbyte/pull/37115) | schema descriptions | +| 2.1.1 | 2023-12-14 | [33414](https://github.com/airbytehq/airbyte/pull/33414) | Prepare for airbyte-lib | +| 2.1.0 | 2023-10-13 | [31333](https://github.com/airbytehq/airbyte/pull/31333) | Add stream for arbitrary datasets | +| 2.0.0 | 2023-09-18 | [30428](https://github.com/airbytehq/airbyte/pull/30428) | Fix broken stream, manifest refactor | +| 1.0.0 | 2023-08-25 | [29859](https://github.com/airbytehq/airbyte/pull/29859) | Migrate to lowcode | +| 0.2.0 | 2022-06-20 | [28290](https://github.com/airbytehq/airbyte/pull/28290) | Make connector work with platform changes not syncing empty stream schemas. | +| 0.1.11 | 2022-04-27 | [12397](https://github.com/airbytehq/airbyte/pull/12397) | No changes. Used connector to test publish workflow changes. | +| 0.1.9 | 2022-04-05 | [PR\#11712](https://github.com/airbytehq/airbyte/pull/11712) | No changes from 0.1.4. Used connector to test publish workflow changes. 
| +| 0.1.4 | 2021-12-23 | [PR\#8434](https://github.com/airbytehq/airbyte/pull/8434) | Update fields in source-connectors specifications | +| 0.1.2 | 2021-11-08 | [PR\#7499](https://github.com/airbytehq/airbyte/pull/7499) | Remove base-python dependencies | +| 0.1.0 | 2021-07-29 | [PR\#5069](https://github.com/airbytehq/airbyte/pull/5069) | Initial version of the connector | diff --git a/docs/integrations/sources/appfollow-migrations.md b/docs/integrations/sources/appfollow-migrations.md index 69485b8ad80..6e8a9844660 100644 --- a/docs/integrations/sources/appfollow-migrations.md +++ b/docs/integrations/sources/appfollow-migrations.md @@ -2,4 +2,4 @@ ## Upgrading to 1.0.0 -Remove connector parameters to ingest all possible apps and add new streams. \ No newline at end of file +Remove connector parameters to ingest all possible apps and add new streams. diff --git a/docs/integrations/sources/appfollow.md b/docs/integrations/sources/appfollow.md index 7d7fdb9e233..150e8d9fb30 100644 --- a/docs/integrations/sources/appfollow.md +++ b/docs/integrations/sources/appfollow.md @@ -35,7 +35,7 @@ The Appfollow connector ideally should gracefully handle Appfollow API limitatio ## Changelog -| Version | Date | Pull Request | Subject | -| :------ | :--------- | :------------------------------------------------------- | :-------------------- | -| 1.0.0 | 2023-08-05 | [29128](https://github.com/airbytehq/airbyte/pull/29128) | Migrate to low-code and add new streams | -| 0.1.1 | 2022-08-11 | [14418](https://github.com/airbytehq/airbyte/pull/14418) | New Source: Appfollow | +| Version | Date | Pull Request | Subject | +| :------ | :--------- | :------------------------------------------------------- | :-------------------------------------- | +| 1.0.0 | 2023-08-05 | [29128](https://github.com/airbytehq/airbyte/pull/29128) | Migrate to low-code and add new streams | +| 0.1.1 | 2022-08-11 | [14418](https://github.com/airbytehq/airbyte/pull/14418) | New Source: Appfollow | diff --git a/docs/integrations/sources/appstore.md b/docs/integrations/sources/appstore.md index a9989304841..16b26090f00 100644 --- a/docs/integrations/sources/appstore.md +++ b/docs/integrations/sources/appstore.md @@ -4,7 +4,7 @@ ## Deprecation Notice -The Appstore source connector is scheduled for deprecation on March 5th, 2024 due to incompatibility with upcoming platform updates as we prepare to launch Airbyte 1.0. This means it will no longer be supported or available for use in Airbyte. +The Appstore source connector is scheduled for deprecation on March 5th, 2024 due to incompatibility with upcoming platform updates as we prepare to launch Airbyte 1.0. This means it will no longer be supported or available for use in Airbyte. This connector does not support new per-stream features which are vital for ensuring data integrity in Airbyte's synchronization processes. Without these capabilities, we cannot enforce our standards of reliability and correctness for data syncing operations. @@ -14,7 +14,6 @@ Users who still wish to sync data from this connector are advised to explore cre ::: - ## Sync overview This source can sync data for the [Appstore API](https://developer.apple.com/documentation/appstoreconnectapi). It supports only Incremental syncs. The Appstore API is available for [many types of services](https://developer.apple.com/documentation/appstoreconnectapi). Currently, this API supports syncing Sales and Trends reports. If you'd like to sync data from other endpoints, please create an issue on Github. 
@@ -25,31 +24,31 @@ This Source Connector is based on a [Singer Tap](https://github.com/miroapp/tap- This Source is capable of syncing the following "Sales and Trends" Streams: -* [SALES](https://help.apple.com/app-store-connect/#/dev15f9508ca) -* [SUBSCRIPTION](https://help.apple.com/app-store-connect/#/itc5dcdf6693) -* [SUBSCRIPTION\_EVENT](https://help.apple.com/app-store-connect/#/itc0b9b9d5b2) -* [SUBSCRIBER](https://help.apple.com/app-store-connect/#/itcf20f3392e) +- [SALES](https://help.apple.com/app-store-connect/#/dev15f9508ca) +- [SUBSCRIPTION](https://help.apple.com/app-store-connect/#/itc5dcdf6693) +- [SUBSCRIPTION_EVENT](https://help.apple.com/app-store-connect/#/itc0b9b9d5b2) +- [SUBSCRIBER](https://help.apple.com/app-store-connect/#/itcf20f3392e) Note that depending on the credentials you enter, you may only be able to sync some of these reports. For example, if your app does not offer subscriptions, then it is not possible to sync subscription related reports. ### Data type mapping -| Integration Type | Airbyte Type | Notes | -| :--- | :--- | :--- | -| `string` | `string` | | -| `int`, `float`, `number` | `number` | | -| `date` | `date` | | -| `datetime` | `datetime` | | -| `array` | `array` | | -| `object` | `object` | | +| Integration Type | Airbyte Type | Notes | +| :----------------------- | :----------- | :---- | +| `string` | `string` | | +| `int`, `float`, `number` | `number` | | +| `date` | `date` | | +| `datetime` | `datetime` | | +| `array` | `array` | | +| `object` | `object` | | ### Features -| Feature | Supported?\(Yes/No\) | Notes | -| :--- | :--- | :--- | -| Full Refresh Sync | no | | -| Incremental Sync | yes | | -| Namespaces | No | | +| Feature | Supported?\(Yes/No\) | Notes | +| :---------------- | :------------------- | :---- | +| Full Refresh Sync | no | | +| Incremental Sync | yes | | +| Namespaces | No | | ### Performance considerations @@ -63,11 +62,11 @@ One issue that can happen is the API not having the data available for the perio ### Requirements -* Key ID -* Private Key The contents of the private API key file, which is in the P8 format and should start with `-----BEGIN PRIVATE KEY-----` and end with `-----END PRIVATE KEY-----`. -* Issuer ID -* Vendor ID Go to "Sales and Trends", then choose "Reports" from the drop-down menu in the top left. On the next screen, there'll be a drop-down menu for "Vendor". Your name and ID will be shown there. Use the numeric Vendor ID. -* Start Date \(The date that will be used in the first sync. Apple only allows to go back 365 days from today.\) Example: `2020-11-16T00:00:00Z` +- Key ID +- Private Key The contents of the private API key file, which is in the P8 format and should start with `-----BEGIN PRIVATE KEY-----` and end with `-----END PRIVATE KEY-----`. +- Issuer ID +- Vendor ID Go to "Sales and Trends", then choose "Reports" from the drop-down menu in the top left. On the next screen, there'll be a drop-down menu for "Vendor". Your name and ID will be shown there. Use the numeric Vendor ID. +- Start Date \(The date that will be used in the first sync. 
Apple only allows to go back 365 days from today.\) Example: `2020-11-16T00:00:00Z` ### Setup guide @@ -75,9 +74,8 @@ Generate/Find all requirements using this [external article](https://leapfin.com ## Changelog -| Version | Date | Pull Request | Subject | -|:--------|:-----------| :--- |:------------------------------------------------| +| Version | Date | Pull Request | Subject | +| :------ | :--------- | :----------------------------------------------------- | :------------------------------------------------ | | 0.2.6 | 2021-12-23 | [8434](https://github.com/airbytehq/airbyte/pull/8434) | Update fields in source-connectors specifications | -| 0.2.5 | 2021-12-09 | [7757](https://github.com/airbytehq/airbyte/pull/7757) | Migrate to the CDK | -| 0.2.4 | 2021-07-06 | [4539](https://github.com/airbytehq/airbyte/pull/4539) | Add `AIRBYTE_ENTRYPOINT` for Kubernetes support | - +| 0.2.5 | 2021-12-09 | [7757](https://github.com/airbytehq/airbyte/pull/7757) | Migrate to the CDK | +| 0.2.4 | 2021-07-06 | [4539](https://github.com/airbytehq/airbyte/pull/4539) | Add `AIRBYTE_ENTRYPOINT` for Kubernetes support | diff --git a/docs/integrations/sources/asana.md b/docs/integrations/sources/asana.md index a53215a84ac..d3ac292d917 100644 --- a/docs/integrations/sources/asana.md +++ b/docs/integrations/sources/asana.md @@ -23,6 +23,7 @@ This connector supports **OAuth** and **Personal Access Tokens**. Please follow 5. Click **Set up source**. #### Syncing Multiple Projects + If you have access to multiple projects, Airbyte will sync data related to all projects you have access to. The ability to filter to specific projects is not available at this time. @@ -52,6 +53,7 @@ The Asana source connector supports the following [sync modes](https://docs.airb | Namespaces | No | ## Supported Streams + - [Attachments](https://developers.asana.com/docs/attachments) - [Custom fields](https://developers.asana.com/docs/custom-fields) - [Projects](https://developers.asana.com/docs/projects) @@ -92,8 +94,8 @@ The connector is restricted by [Asana rate limits](https://developers.asana.com/ ### Troubleshooting -* If you encounter access errors while using **OAuth** authentication, please make sure you've followed this [Asana Article](https://developers.asana.com/docs/oauth). -* Check out common troubleshooting issues for the Asana source connector on our Airbyte Forum [here](https://github.com/airbytehq/airbyte/discussions). +- If you encounter access errors while using **OAuth** authentication, please make sure you've followed this [Asana Article](https://developers.asana.com/docs/oauth). +- Check out common troubleshooting issues for the Asana source connector on our Airbyte Forum [here](https://github.com/airbytehq/airbyte/discussions). 
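As a quick way to narrow down such access errors, a Personal Access Token can be checked directly against Asana's REST API before configuring the source. A minimal sketch in Python, assuming the standard `https://app.asana.com/api/1.0/users/me` endpoint and a placeholder token:

```python
import requests

# Placeholder token for illustration only -- substitute your own Personal Access Token.
ASANA_PAT = "0/123456789abcdef"

response = requests.get(
    "https://app.asana.com/api/1.0/users/me",
    headers={"Authorization": f"Bearer {ASANA_PAT}"},
    timeout=30,
)
# 200 confirms the token works; 401/403 points at the credential rather than Airbyte.
print(response.status_code)
print(response.json().get("data", {}).get("name"))
```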
@@ -101,9 +103,9 @@ The connector is restricted by [Asana rate limits](https://developers.asana.com/ | Version | Date | Pull Request | Subject | | :------ | :--------- | :------------------------------------------------------- | :---------------------------------------------------------------- | -| 0.6.1 | 2023-11-13 | [31110](https://github.com/airbytehq/airbyte/pull/31110) | Fix hidden config access | +| 0.6.1 | 2023-11-13 | [31110](https://github.com/airbytehq/airbyte/pull/31110) | Fix hidden config access | | 0.6.0 | 2023-11-03 | [31110](https://github.com/airbytehq/airbyte/pull/31110) | Add new stream Portfolio Memberships with Parent Portfolio | -| 0.5.0 | 2023-10-30 | [31114](https://github.com/airbytehq/airbyte/pull/31114) | Add Portfolios stream | +| 0.5.0 | 2023-10-30 | [31114](https://github.com/airbytehq/airbyte/pull/31114) | Add Portfolios stream | | 0.4.0 | 2023-10-24 | [31084](https://github.com/airbytehq/airbyte/pull/31084) | Add StoriesCompact stream | | 0.3.0 | 2023-10-24 | [31634](https://github.com/airbytehq/airbyte/pull/31634) | Add OrganizationExports stream | | 0.2.0 | 2023-10-17 | [31090](https://github.com/airbytehq/airbyte/pull/31090) | Add Attachments stream | @@ -118,4 +120,4 @@ The connector is restricted by [Asana rate limits](https://developers.asana.com/ | 0.1.1 | 2021-06-09 | [3973](https://github.com/airbytehq/airbyte/pull/3973) | Add entrypoint and bump version for connector | | 0.1.0 | 2021-05-25 | [3510](https://github.com/airbytehq/airbyte/pull/3510) | New Source: Asana | - \ No newline at end of file + diff --git a/docs/integrations/sources/ashby.md b/docs/integrations/sources/ashby.md index 2a7e51c1359..a7888636b16 100644 --- a/docs/integrations/sources/ashby.md +++ b/docs/integrations/sources/ashby.md @@ -43,6 +43,6 @@ The Ashby connector should not run into Ashby API limitations under normal usage ## Changelog -| Version | Date | Pull Request | Subject | -| :------ | :--------- | :--------------------------------------------------- | :------------------------- | +| Version | Date | Pull Request | Subject | +| :------ | :--------- | :------------------------------------------------------- | :------------------------- | | 0.1.0 | 2022-10-22 | [18334](https://github.com/airbytehq/airbyte/pull/18334) | Add Ashby Source Connector | diff --git a/docs/integrations/sources/auth0.md b/docs/integrations/sources/auth0.md index 5514e370067..5fae7ca3759 100644 --- a/docs/integrations/sources/auth0.md +++ b/docs/integrations/sources/auth0.md @@ -56,12 +56,12 @@ The connector is restricted by Auth0 [rate limits](https://auth0.com/docs/troubl ## Changelog | Version | Date | Pull Request | Subject | -| :------ | :--------- | :------------------------------------------------------- |:------------------------------------------------------------------------| -| 0.5.2 | 2024-05-02 | [37770](https://github.com/airbytehq/airbyte/pull/37770) | Add Selective Authenticator. 
Migrate to poetry |
-| 0.5.1 | 2023-10-20 | [31643](https://github.com/airbytehq/airbyte/pull/31643) | Upgrade base image to airbyte/python-connector-base:1.1.0 |
-| 0.5.0 | 2023-10-11 | [30467](https://github.com/airbytehq/airbyte/pull/30467) | Use Python base image |
-| 0.4.1 | 2023-08-24 | [29804](https://github.com/airbytehq/airbyte/pull/29804) | Fix low code migration bugs |
-| 0.4.0 | 2023-08-03 | [28972](https://github.com/airbytehq/airbyte/pull/28972) | Migrate to Low-Code CDK |
-| 0.3.0 | 2023-06-20 | [29001](https://github.com/airbytehq/airbyte/pull/29001) | Add Organizations, OrganizationMembers, OrganizationMemberRoles streams |
-| 0.2.0 | 2023-05-23 | [26445](https://github.com/airbytehq/airbyte/pull/26445) | Add Clients stream |
-| 0.1.0 | 2022-10-21 | [18338](https://github.com/airbytehq/airbyte/pull/18338) | Add Auth0 and Users stream |
+| :------ | :--------- | :------------------------------------------------------- | :---------------------------------------------------------------------- |
+| 0.5.2 | 2024-05-02 | [37770](https://github.com/airbytehq/airbyte/pull/37770) | Add Selective Authenticator. Migrate to poetry |
+| 0.5.1 | 2023-10-20 | [31643](https://github.com/airbytehq/airbyte/pull/31643) | Upgrade base image to airbyte/python-connector-base:1.1.0 |
+| 0.5.0 | 2023-10-11 | [30467](https://github.com/airbytehq/airbyte/pull/30467) | Use Python base image |
+| 0.4.1 | 2023-08-24 | [29804](https://github.com/airbytehq/airbyte/pull/29804) | Fix low code migration bugs |
+| 0.4.0 | 2023-08-03 | [28972](https://github.com/airbytehq/airbyte/pull/28972) | Migrate to Low-Code CDK |
+| 0.3.0 | 2023-06-20 | [29001](https://github.com/airbytehq/airbyte/pull/29001) | Add Organizations, OrganizationMembers, OrganizationMemberRoles streams |
+| 0.2.0 | 2023-05-23 | [26445](https://github.com/airbytehq/airbyte/pull/26445) | Add Clients stream |
+| 0.1.0 | 2022-10-21 | [18338](https://github.com/airbytehq/airbyte/pull/18338) | Add Auth0 and Users stream |
diff --git a/docs/integrations/sources/avni.md b/docs/integrations/sources/avni.md
index 8e2272a8643..05eaa8f854c 100644
--- a/docs/integrations/sources/avni.md
+++ b/docs/integrations/sources/avni.md
@@ -36,7 +36,6 @@ The Avni source connector supports the following[ sync modes](https://docs.airby
 - [Incremental Sync - Append](https://docs.airbyte.com/understanding-airbyte/connections/incremental-append)
 - (Recommended)[ Incremental Sync - Deduped History](https://docs.airbyte.com/understanding-airbyte/connections/incremental-deduped-history)
-
 ## Supported Streams
 Avni Source connector Support Following Streams:
@@ -47,7 +46,8 @@ Avni Source connector Support Following Streams:
 - **Subject Encounter Stream**, This stream provides data about encounters involving subjects, excluding program encounters. You can obtain information about all the encounters that subjects have had outside of program-encounter.
+ ## Changelog | Version | Date | Pull Request | Subject | -| 0.1.0 | 2023-09-07 | [30222](https://github.com/airbytehq/airbyte/pull/30222) | Avni Source Connector | \ No newline at end of file +| 0.1.0 | 2023-09-07 | [30222](https://github.com/airbytehq/airbyte/pull/30222) | Avni Source Connector | diff --git a/docs/integrations/sources/aws-cloudtrail.md b/docs/integrations/sources/aws-cloudtrail.md index e7fa055be8a..c8794d77e03 100644 --- a/docs/integrations/sources/aws-cloudtrail.md +++ b/docs/integrations/sources/aws-cloudtrail.md @@ -49,13 +49,13 @@ Please, follow this [steps](https://docs.aws.amazon.com/powershell/latest/usergu ## Changelog -| Version | Date | Pull Request | Subject | -| :------ | :--------- | :------------------------------------------------------- | :------------------------------------------------ | -| 0.1.7 | 2024-04-15 | [37122](https://github.com/airbytehq/airbyte/pull/37122) | Base image migration: remove Dockerfile and use the python-connector-base image | -| 0.1.6 | 2024-04-12 | [37122](https://github.com/airbytehq/airbyte/pull/37122) | schema descriptions | -| 0.1.5 | 2023-02-15 | [23083](https://github.com/airbytehq/airbyte/pull/23083) | Specified date formatting in specification | -| 0.1.4 | 2022-04-11 | [11763](https://github.com/airbytehq/airbyte/pull/11763) | Upgrade to Python 3.9 | -| 0.1.3 | 2021-12-23 | [8434](https://github.com/airbytehq/airbyte/pull/8434) | Update fields in source-connectors specifications | -| 0.1.2 | 2021-08-04 | [5152](https://github.com/airbytehq/airbyte/pull/5152) | Fix connector spec.json | -| 0.1.1 | 2021-07-06 | [4539](https://github.com/airbytehq/airbyte/pull/4539) | Add `AIRBYTE_ENTRYPOINT` for Kubernetes support | -| 0.1.0 | 2021-06-23 | [4122](https://github.com/airbytehq/airbyte/pull/4122) | Initial release supporting the LookupEvent API | +| Version | Date | Pull Request | Subject | +| :------ | :--------- | :------------------------------------------------------- | :------------------------------------------------------------------------------ | +| 0.1.7 | 2024-04-15 | [37122](https://github.com/airbytehq/airbyte/pull/37122) | Base image migration: remove Dockerfile and use the python-connector-base image | +| 0.1.6 | 2024-04-12 | [37122](https://github.com/airbytehq/airbyte/pull/37122) | schema descriptions | +| 0.1.5 | 2023-02-15 | [23083](https://github.com/airbytehq/airbyte/pull/23083) | Specified date formatting in specification | +| 0.1.4 | 2022-04-11 | [11763](https://github.com/airbytehq/airbyte/pull/11763) | Upgrade to Python 3.9 | +| 0.1.3 | 2021-12-23 | [8434](https://github.com/airbytehq/airbyte/pull/8434) | Update fields in source-connectors specifications | +| 0.1.2 | 2021-08-04 | [5152](https://github.com/airbytehq/airbyte/pull/5152) | Fix connector spec.json | +| 0.1.1 | 2021-07-06 | [4539](https://github.com/airbytehq/airbyte/pull/4539) | Add `AIRBYTE_ENTRYPOINT` for Kubernetes support | +| 0.1.0 | 2021-06-23 | [4122](https://github.com/airbytehq/airbyte/pull/4122) | Initial release supporting the LookupEvent API | diff --git a/docs/integrations/sources/azure-blob-storage.md b/docs/integrations/sources/azure-blob-storage.md index 7cfe87ddc0c..70b79b56bc1 100644 --- a/docs/integrations/sources/azure-blob-storage.md +++ b/docs/integrations/sources/azure-blob-storage.md @@ -37,8 +37,7 @@ Minimum permissions (role [Storage Blob Data Reader](https://learn.microsoft.com ### Step 1: Set up Azure Blob Storage -* Create a storage account with the permissions 
[details](https://learn.microsoft.com/en-us/azure/storage/common/storage-account-create?tabs=azure-portal) - +- Create a storage account with the permissions [details](https://learn.microsoft.com/en-us/azure/storage/common/storage-account-create?tabs=azure-portal) :::warning To use Oauth 2.0 Authentication method, Access Control (IAM) should be setup. @@ -70,7 +69,7 @@ Follow these steps to set up an IAM role: 7. Enter the name of the **Container** containing your files to replicate. 8. Add a stream 1. Write the **File Type** - 2. In the **Format** box, use the dropdown menu to select the format of the files you'd like to replicate. The supported formats are **CSV**, **Parquet**, **Avro** and **JSONL**. Toggling the **Optional fields** button within the **Format** box will allow you to enter additional configurations based on the selected format. For a detailed breakdown of these settings, refer to the [File Format section](#file-format-settings) below. + 2. In the **Format** box, use the dropdown menu to select the format of the files you'd like to replicate. The supported formats are **CSV**, **Parquet**, **Avro** and **JSONL**. Toggling the **Optional fields** button within the **Format** box will allow you to enter additional configurations based on the selected format. For a detailed breakdown of these settings, refer to the [File Format section](#file-format-settings) below. 3. Give a **Name** to the stream 4. (Optional)—If you want to enforce a specific schema, you can enter a **Input schema**. By default, this value is set to `{}` and will automatically infer the schema from the file\(s\) you are replicating. For details on providing a custom schema, refer to the [User Schema section](#user-schema). 5. Optionally, enter the **Globs** which dictates which files to be synced. This is a regular expression that allows Airbyte to pattern match the specific files to replicate. If you are replicating all the files within your bucket, use `**` as the pattern. For more precise pattern matching options, refer to the [Path Patterns section](#path-patterns) below. @@ -82,7 +81,7 @@ Follow these steps to set up an IAM role: The Azure Blob Storage source connector supports the following [sync modes](https://docs.airbyte.com/cloud/core-concepts#connection-sync-modes): | Feature | Supported? | -|:-----------------------------------------------|:-----------| +| :--------------------------------------------- | :--------- | | Full Refresh Sync | Yes | | Incremental Sync | Yes | | Replicate Incremental Deletes | No | @@ -93,7 +92,7 @@ The Azure Blob Storage source connector supports the following [sync modes](http ### File Compressions | Compression | Supported? | -|:------------|:-----------| +| :---------- | :--------- | | Gzip | Yes | | Zip | No | | Bzip2 | Yes | @@ -194,7 +193,7 @@ Product,Description,Price Jeans,"Navy Blue, Bootcut, 34\"",49.99 ``` -The backslash (`\`) is used directly before the second double quote (`"`) to indicate that it is _not_ the closing quote for the field, but rather a literal double quote character that should be included in the value (in this example, denoting the size of the jeans in inches: `34"` ). +The backslash (`\`) is used directly before the second double quote (`"`) to indicate that it is _not_ the closing quote for the field, but rather a literal double quote character that should be included in the value (in this example, denoting the size of the jeans in inches: `34"` ). Leaving this field blank (default option) will disallow escaping. 
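The same behavior can be reproduced with any standard CSV parser. A minimal Python sketch using the sample row above, assuming the connector's backslash Escape Character setting corresponds to an `escapechar` of `\`:

```python
import csv
from io import StringIO

# The sample row from the docs above, with a backslash escaping the inner quote.
sample = 'Product,Description,Price\nJeans,"Navy Blue, Bootcut, 34\\"",49.99\n'

# escapechar="\\" mirrors setting the connector's Escape Character to a backslash.
reader = csv.reader(StringIO(sample), quotechar='"', escapechar="\\", doublequote=False)
for row in reader:
    print(row)
# ['Product', 'Description', 'Price']
# ['Jeans', 'Navy Blue, Bootcut, 34"', '49.99']
```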
@@ -206,7 +205,6 @@ Leaving this field blank (default option) will disallow escaping. - **Strings Can Be Null**: Whether strings can be interpreted as null values. If true, strings that match the null_values set will be interpreted as null. If false, strings that match the null_values set will be interpreted as the string itself. - **True Values**: A set of case-sensitive strings that should be interpreted as true values. - #### Parquet Apache Parquet is a column-oriented data storage format of the Apache Hadoop ecosystem. It provides efficient data compression and encoding schemes with enhanced performance to handle complex data in bulk. At the moment, partitioned parquet datasets are unsupported. The following settings are available: @@ -216,6 +214,7 @@ Apache Parquet is a column-oriented data storage format of the Apache Hadoop eco #### Avro The Avro parser uses the [Fastavro library](https://fastavro.readthedocs.io/en/latest/). The following settings are available: + - **Convert Double Fields to Strings**: Whether to convert double fields to strings. This is recommended if you have decimal numbers with a high degree of precision because there can be a loss precision when handling floating point numbers. #### JSONL @@ -247,7 +246,7 @@ The Azure Blob Storage connector should not encounter any [Microsoft API limitat ## Changelog | Version | Date | Pull Request | Subject | -|:--------|:-----------|:---------------------------------------------------------|:---------------------------------------------------------------------------------------------| +| :------ | :--------- | :------------------------------------------------------- | :------------------------------------------------------------------------------------------- | | 0.4.2 | 2024-04-23 | [37504](https://github.com/airbytehq/airbyte/pull/37504) | Update specification | | 0.4.1 | 2024-04-22 | [37467](https://github.com/airbytehq/airbyte/pull/37467) | Fix start date filter | | 0.4.0 | 2024-04-05 | [36825](https://github.com/airbytehq/airbyte/pull/36825) | Add oauth 2.0 support | diff --git a/docs/integrations/sources/azure-table.md b/docs/integrations/sources/azure-table.md index f617018961a..5051a44d547 100644 --- a/docs/integrations/sources/azure-table.md +++ b/docs/integrations/sources/azure-table.md @@ -7,7 +7,7 @@ The Azure table storage supports Full Refresh and Incremental syncs. You can cho ### Output schema This Source have generic schema for all streams. -Azure Table storage is a service that stores non-relational structured data (also known as structured NoSQL data). There is no efficient way to read schema for the given table. We use `data` property to have all the properties for any given row. +Azure Table storage is a service that stores non-relational structured data (also known as structured NoSQL data). There is no efficient way to read schema for the given table. We use `data` property to have all the properties for any given row. - data - This property contains all values - additionalProperties - This property denotes that all the values are in `data` property. @@ -49,16 +49,17 @@ The Azure table storage connector should not run into API limitations under norm ### Requirements -* Azure Storage Account -* Azure Storage Account Key -* Azure Storage Endpoint Suffix +- Azure Storage Account +- Azure Storage Account Key +- Azure Storage Endpoint Suffix ### Setup guide Visit the [Azure Portal](https://portal.azure.com). 
Go to your storage account, you can find : - - Azure Storage Account - under the overview tab - - Azure Storage Account Key - under the Access keys tab - - Azure Storage Endpoint Suffix - under the Endpoint tab + +- Azure Storage Account - under the overview tab +- Azure Storage Account Key - under the Access keys tab +- Azure Storage Endpoint Suffix - under the Endpoint tab We recommend creating a restricted key specifically for Airbyte access. This will allow you to control which resources Airbyte should be able to access. However, shared access key authentication is not supported by this connector yet. diff --git a/docs/integrations/sources/babelforce.md b/docs/integrations/sources/babelforce.md index 749fbf11059..3f80a43e858 100644 --- a/docs/integrations/sources/babelforce.md +++ b/docs/integrations/sources/babelforce.md @@ -2,7 +2,7 @@ ## Overview -The Babelforce source supports _Full Refresh_ as well as _Incremental_ syncs. +The Babelforce source supports _Full Refresh_ as well as _Incremental_ syncs. _Full Refresh_ sync means every time a sync is run, Airbyte will copy all rows in the tables and columns you set up for replication into the destination in a new table. _Incremental_ syn means only changed resources are copied from Babelformce. For the first run, it will be a Full Refresh sync. @@ -11,20 +11,19 @@ _Incremental_ syn means only changed resources are copied from Babelformce. For Several output streams are available from this source: -* [Calls](https://api.babelforce.com/#af7a6b6e-b262-487f-aabd-c59e6fe7ba41) - +- [Calls](https://api.babelforce.com/#af7a6b6e-b262-487f-aabd-c59e6fe7ba41) If there are more endpoints you'd like Airbyte to support, please [create an issue.](https://github.com/airbytehq/airbyte/issues/new/choose) ### Features -| Feature | Supported? | -| :--- | :--- | -| Full Refresh Sync | Yes | -| Incremental Sync | Yes | +| Feature | Supported? | +| :---------------------------- | :---------- | +| Full Refresh Sync | Yes | +| Incremental Sync | Yes | | Replicate Incremental Deletes | Coming soon | -| SSL connection | Yes | -| Namespaces | No | +| SSL connection | Yes | +| Namespaces | No | ### Performance considerations @@ -34,10 +33,10 @@ There are no performance consideration in the current version. 
### Requirements -* Region/environment as listed in the `Regions & environments` section [here](https://api.babelforce.com/#intro) -* Babelforce access key ID -* Babelforce access token -* (Optional) start date from when the import starts in epoch Unix timestamp +- Region/environment as listed in the `Regions & environments` section [here](https://api.babelforce.com/#intro) +- Babelforce access key ID +- Babelforce access token +- (Optional) start date from when the import starts in epoch Unix timestamp ### Setup guide @@ -46,6 +45,6 @@ Generate a API access key ID and token using the [Babelforce documentation](http ## CHANGELOG | Version | Date | Pull Request | Subject | -|:--------|:-----------|:---------------------------------------------------------|:----------------------------| - 0.2.0 | 2023-08-24 | [29314](https://github.com/airbytehq/airbyte/pull/29314) | Migrate to Low Code | - 0.1.0 | 2022-05-09 | [12700](https://github.com/airbytehq/airbyte/pull/12700) | Introduce Babelforce source | +| :------ | :--------- | :------------------------------------------------------- | :-------------------------- | +| 0.2.0 | 2023-08-24 | [29314](https://github.com/airbytehq/airbyte/pull/29314) | Migrate to Low Code | +| 0.1.0 | 2022-05-09 | [12700](https://github.com/airbytehq/airbyte/pull/12700) | Introduce Babelforce source | diff --git a/docs/integrations/sources/bamboo-hr.md b/docs/integrations/sources/bamboo-hr.md index 58ca1064947..07702749d10 100644 --- a/docs/integrations/sources/bamboo-hr.md +++ b/docs/integrations/sources/bamboo-hr.md @@ -8,8 +8,8 @@ This page contains the setup guide and reference information for the [BambooHR]( ## Prerequisites -* BambooHR Account -* BambooHR [API key](https://documentation.bamboohr.com/docs) +- BambooHR Account +- BambooHR [API key](https://documentation.bamboohr.com/docs) ## Setup Guide @@ -22,11 +22,11 @@ This page contains the setup guide and reference information for the [BambooHR]( 1. [Log into your Airbyte Cloud](https://cloud.airbyte.com/workspaces) account. 2. In the left navigation bar, click **Sources**. In the top-right corner, click **+ New source**. 3. On the Set up the source page, enter the name for the BambooHR connector and select **BambooHR** from the Source type dropdown. -3. Enter your `subdomain`. If you access BambooHR at https://mycompany.bamboohr.com, then the subdomain is "mycompany". -4. Enter your `api_key`. To generate an API key, log in and click your name in the upper right-hand corner of any page to get to the user context menu. If you have sufficient administrator permissions, there will be an "API Keys" option in that menu to go to the page. -5. (Optional) Enter any `Custom Report Fields` as a comma-separated list of fields to include in your custom reports. Example: `firstName,lastName`. If none are listed, then the [default fields](https://documentation.bamboohr.com/docs/list-of-field-names) will be returned. -6. Toggle `Custom Reports Include Default Fields`. If true, then the [default fields](https://documentation.bamboohr.com/docs/list-of-field-names) will be returned. If false, then the values defined in `Custom Report Fields` will be returned. -7. Click **Set up source** +4. Enter your `subdomain`. If you access BambooHR at https://mycompany.bamboohr.com, then the subdomain is "mycompany". +5. Enter your `api_key`. To generate an API key, log in and click your name in the upper right-hand corner of any page to get to the user context menu. 
If you have sufficient administrator permissions, there will be an "API Keys" option in that menu to go to the page. +6. (Optional) Enter any `Custom Report Fields` as a comma-separated list of fields to include in your custom reports. Example: `firstName,lastName`. If none are listed, then the [default fields](https://documentation.bamboohr.com/docs/list-of-field-names) will be returned. +7. Toggle `Custom Reports Include Default Fields`. If true, then the [default fields](https://documentation.bamboohr.com/docs/list-of-field-names) will be returned. If false, then the values defined in `Custom Report Fields` will be returned. +8. Click **Set up source** @@ -50,17 +50,16 @@ This page contains the setup guide and reference information for the [BambooHR]( The BambooHR source connector supports the following [sync modes](https://docs.airbyte.com/cloud/core-concepts#connection-sync-modes): -| Feature | Supported? | -| :--- | :--- | -| Full Refresh Sync | Yes | -| Incremental - Append Sync | No | -| SSL connection | Yes | -| Namespaces | No | - +| Feature | Supported? | +| :------------------------ | :--------- | +| Full Refresh Sync | Yes | +| Incremental - Append Sync | No | +| SSL connection | Yes | +| Namespaces | No | ## Supported Streams -* [Custom Reports](https://documentation.bamboohr.com/reference/request-custom-report-1) +- [Custom Reports](https://documentation.bamboohr.com/reference/request-custom-report-1) ## Limitations & Troubleshooting @@ -79,21 +78,21 @@ Please [create an issue](https://github.com/airbytehq/airbyte/issues) if you see ### Troubleshooting -* Check out common troubleshooting issues for the BambooHR source connector on our [Airbyte Forum](https://github.com/airbytehq/airbyte/discussions). +- Check out common troubleshooting issues for the BambooHR source connector on our [Airbyte Forum](https://github.com/airbytehq/airbyte/discussions). ## Changelog -| Version | Date | Pull Request | Subject | -|:--------| :--------- | :------------------------------------------------------ | :---------------------------------------- | -| 0.2.6 | 2024-04-19 | [37124](https://github.com/airbytehq/airbyte/pull/37124) | Updating to 0.80.0 CDK | -| 0.2.5 | 2024-04-18 | [37124](https://github.com/airbytehq/airbyte/pull/37124) | Manage dependencies with Poetry. | -| 0.2.4 | 2024-04-15 | [37124](https://github.com/airbytehq/airbyte/pull/37124) | Base image migration: remove Dockerfile and use the python-connector-base image | -| 0.2.3 | 2024-04-12 | [37124](https://github.com/airbytehq/airbyte/pull/37124) | schema descriptions | -| 0.2.2 | 2022-09-16 | [17684](https://github.com/airbytehq/airbyte/pull/17684) | Fix custom field validation retrieve | -| 0.2.1 | 2022-09-16 | [16826](https://github.com/airbytehq/airbyte/pull/16826) | Add custom fields validation during check | -| 0.2.0 | 2022-03-24 | [11326](https://github.com/airbytehq/airbyte/pull/11326) | Add support for Custom Reports endpoint | -| 0.1.0 | 2021-08-27 | [5054](https://github.com/airbytehq/airbyte/pull/5054) | Initial release with Employees API | +| Version | Date | Pull Request | Subject | +| :------ | :--------- | :------------------------------------------------------- | :------------------------------------------------------------------------------ | +| 0.2.6 | 2024-04-19 | [37124](https://github.com/airbytehq/airbyte/pull/37124) | Updating to 0.80.0 CDK | +| 0.2.5 | 2024-04-18 | [37124](https://github.com/airbytehq/airbyte/pull/37124) | Manage dependencies with Poetry. 
| +| 0.2.4 | 2024-04-15 | [37124](https://github.com/airbytehq/airbyte/pull/37124) | Base image migration: remove Dockerfile and use the python-connector-base image | +| 0.2.3 | 2024-04-12 | [37124](https://github.com/airbytehq/airbyte/pull/37124) | schema descriptions | +| 0.2.2 | 2022-09-16 | [17684](https://github.com/airbytehq/airbyte/pull/17684) | Fix custom field validation retrieve | +| 0.2.1 | 2022-09-16 | [16826](https://github.com/airbytehq/airbyte/pull/16826) | Add custom fields validation during check | +| 0.2.0 | 2022-03-24 | [11326](https://github.com/airbytehq/airbyte/pull/11326) | Add support for Custom Reports endpoint | +| 0.1.0 | 2021-08-27 | [5054](https://github.com/airbytehq/airbyte/pull/5054) | Initial release with Employees API | diff --git a/docs/integrations/sources/bigcommerce.md b/docs/integrations/sources/bigcommerce.md index 251d0e4376e..fe6936da53f 100644 --- a/docs/integrations/sources/bigcommerce.md +++ b/docs/integrations/sources/bigcommerce.md @@ -54,7 +54,7 @@ BigCommerce has some [rate limit restrictions](https://developer.bigcommerce.com ## Changelog | Version | Date | Pull Request | Subject | -|:--------|:-----------|:---------------------------------------------------------|:------------------------------------------------------------| +| :------ | :--------- | :------------------------------------------------------- | :---------------------------------------------------------- | | 0.2.0 | 2023-08-16 | [29469](https://github.com/airbytehq/airbyte/pull/29469) | Migrate Python CDK to Low Code | | 0.1.10 | 2022-12-16 | [20518](https://github.com/airbytehq/airbyte/pull/20518) | Add brands and categories streams | | 0.1.9 | 2022-12-15 | [20540](https://github.com/airbytehq/airbyte/pull/20540) | Rebuild on CDK 0.15.0 | diff --git a/docs/integrations/sources/bigquery.md b/docs/integrations/sources/bigquery.md index b0d73b12429..6a5ccfe6172 100644 --- a/docs/integrations/sources/bigquery.md +++ b/docs/integrations/sources/bigquery.md @@ -87,7 +87,7 @@ Once you've configured BigQuery as a source, delete the Service Account Key from ### source-bigquery | Version | Date | Pull Request | Subject | -|:--------|:-----------| :------------------------------------------------------- |:------------------------------------------------------------------------------------------------------------------------------------------| +| :------ | :--------- | :------------------------------------------------------- | :---------------------------------------------------------------------------------------------------------------------------------------- | | 0.4.2 | 2024-02-22 | [35503](https://github.com/airbytehq/airbyte/pull/35503) | Source BigQuery: replicating RECORD REPEATED fields | | 0.4.1 | 2024-01-24 | [34453](https://github.com/airbytehq/airbyte/pull/34453) | bump CDK version | | 0.4.0 | 2023-12-18 | [33484](https://github.com/airbytehq/airbyte/pull/33484) | Remove LEGACY state | diff --git a/docs/integrations/sources/bing-ads-migrations.md b/docs/integrations/sources/bing-ads-migrations.md index c078d1d0cb5..dad0f47aa8e 100644 --- a/docs/integrations/sources/bing-ads-migrations.md +++ b/docs/integrations/sources/bing-ads-migrations.md @@ -6,7 +6,7 @@ This version update affects all hourly reports (end in report_hourly) and the fo - Accounts - Campaigns -- Search Query Performance Report +- Search Query Performance Report - AppInstallAds - AppInstallAdLabels - Labels @@ -21,7 +21,7 @@ All `date` and `date-time` fields will be converted to standard `RFC3339`. 
Strea For the changes to take effect, please refresh the source schema and reset affected streams after you have applied the upgrade. | Stream field | Current Airbyte Type | New Airbyte Type | -|-----------------------------|----------------------|-------------------| +| --------------------------- | -------------------- | ----------------- | | LinkedAgencies | string | object | | BiddingScheme.MaxCpc.Amount | string | number | | CostPerConversion | integer | number | @@ -31,17 +31,17 @@ For the changes to take effect, please refresh the source schema and reset affec Detailed date-time field change examples: -| Affected streams | Field_name | Old type | New type (`RFC3339`) | -|----------------------------------------------------------------------------------------------------------------------|-----------------|---------------------------|---------------------------------| -| `AppInstallAds`, `AppInstallAdLabels`, `Labels`, `Campaign Labels`, `Keyword Labels`, `Ad Group Labels`, `Keywords` | `Modified Time` | `04/27/2023 18:00:14.970` | `2023-04-27T16:00:14.970+00:00` | -| `Budget Summary Report` | `Date` | `6/10/2021` | `2021-06-10` | -| `* Report Hourly` | `TimePeriod` | `2023-11-04\|11` | `2023-11-04T11:00:00+00:00` | +| Affected streams | Field_name | Old type | New type (`RFC3339`) | +| ------------------------------------------------------------------------------------------------------------------- | --------------- | ------------------------- | ------------------------------- | +| `AppInstallAds`, `AppInstallAdLabels`, `Labels`, `Campaign Labels`, `Keyword Labels`, `Ad Group Labels`, `Keywords` | `Modified Time` | `04/27/2023 18:00:14.970` | `2023-04-27T16:00:14.970+00:00` | +| `Budget Summary Report` | `Date` | `6/10/2021` | `2021-06-10` | +| `* Report Hourly` | `TimePeriod` | `2023-11-04\|11` | `2023-11-04T11:00:00+00:00` | ## Upgrading to 1.0.0 -This version update only affects the geographic performance reports streams. +This version update only affects the geographic performance reports streams. -Version 1.0.0 prevents the data loss by removing the primary keys from the `GeographicPerformanceReportMonthly`, `GeographicPerformanceReportWeekly`, `GeographicPerformanceReportDaily`, `GeographicPerformanceReportHourly` streams. +Version 1.0.0 prevents the data loss by removing the primary keys from the `GeographicPerformanceReportMonthly`, `GeographicPerformanceReportWeekly`, `GeographicPerformanceReportDaily`, `GeographicPerformanceReportHourly` streams. Due to multiple records with the same primary key, users could experience data loss in the incremental append+dedup mode because of deduplication. -For the changes to take effect, please reset your data and refresh the stream schemas after you have applied the upgrade. \ No newline at end of file +For the changes to take effect, please reset your data and refresh the stream schemas after you have applied the upgrade. 
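If downstream models still expect the legacy `MM/DD/YYYY` timestamps, a small transform can bridge the two formats during the migration window. A hedged sketch in Python, assuming the legacy value is already in UTC (the connector's own conversion, as in the `Modified Time` example above, also normalizes the timezone offset):

```python
from datetime import datetime, timezone

def legacy_to_rfc3339(value: str) -> str:
    """Convert an old-format timestamp to RFC3339, assuming the value is UTC."""
    parsed = datetime.strptime(value, "%m/%d/%Y %H:%M:%S.%f")
    return parsed.replace(tzinfo=timezone.utc).isoformat(timespec="milliseconds")

print(legacy_to_rfc3339("04/27/2023 18:00:14.970"))
# 2023-04-27T18:00:14.970+00:00
```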
diff --git a/docs/integrations/sources/bing-ads.md b/docs/integrations/sources/bing-ads.md index 54b490ac6f2..bf6189fde9e 100644 --- a/docs/integrations/sources/bing-ads.md +++ b/docs/integrations/sources/bing-ads.md @@ -7,6 +7,7 @@ This page contains the setup guide and reference information for the Bing Ads so ## Prerequisites + - Microsoft Advertising account - Microsoft Developer Token @@ -14,7 +15,7 @@ This page contains the setup guide and reference information for the Bing Ads so -For Airbyte Open Source set up your application to get **Client ID**, **Client Secret**, **Refresh Token** +For Airbyte Open Source set up your application to get **Client ID**, **Client Secret**, **Refresh Token** 1. [Register your application](https://docs.microsoft.com/en-us/advertising/guides/authentication-oauth-register?view=bingads-13) in the Azure portal. 2. [Request user consent](https://docs.microsoft.com/en-us/advertising/guides/authentication-oauth-consent?view=bingads-13l) to get the authorization code. @@ -31,8 +32,9 @@ Please be sure to authenticate with the email (personal or work) that you used t ### Step 1: Set up Bing Ads 1. Get your [Microsoft developer token](https://docs.microsoft.com/en-us/advertising/guides/get-started?view=bingads-13#get-developer-token). To use Bing Ads APIs, you must have a developer token and valid user credentials. See [Microsoft Advertising docs](https://docs.microsoft.com/en-us/advertising/guides/get-started?view=bingads-13#get-developer-token) for more info. - 1. Sign in with [Super Admin](https://learn.microsoft.com/en-us/advertising/guides/account-hierarchy-permissions?view=bingads-13#user-roles-permissions) credentials at the [Microsoft Advertising Developer Portal](https://developers.ads.microsoft.com/Account) account tab. - 2. Choose the user that you want associated with the developer token. Typically an application only needs one universal token regardless how many users will be supported. + + 1. Sign in with [Super Admin](https://learn.microsoft.com/en-us/advertising/guides/account-hierarchy-permissions?view=bingads-13#user-roles-permissions) credentials at the [Microsoft Advertising Developer Portal](https://developers.ads.microsoft.com/Account) account tab. + 2. Choose the user that you want associated with the developer token. Typically an application only needs one universal token regardless how many users will be supported. 3. Click on the Request Token button. 2. If your OAuth app has a custom tenant, and you cannot use Microsoft’s recommended common tenant, use the custom tenant in the **Tenant ID** field when you set up the connector. @@ -56,16 +58,16 @@ The tenant is used in the authentication URL, for example: `https://login.micros 5. For **Tenant ID**, enter the custom tenant or use the common tenant. 6. Add the developer token from [Step 1](#step-1-set-up-bing-ads). 7. For **Account Names Predicates** - see [predicates](https://learn.microsoft.com/en-us/advertising/customer-management-service/predicate?view=bingads-13) in bing ads docs. Will be used to filter your accounts by specified operator and account name. You can use multiple predicates pairs. The **Operator** is a one of Contains or Equals. The **Account Name** is a value to compare Accounts Name field in rows by specified operator. For example, for operator=Contains and name=Dev, all accounts where name contains dev will be replicated. And for operator=Equals and name=Airbyte, all accounts where name is equal to Airbyte will be replicated. 
Account Name value is not case-sensitive. -8. For **Reports Replication Start Date**, enter the date in YYYY-MM-DD format. The data added on and after this date will be replicated. If this field is blank, Airbyte will replicate all data from previous and current calendar years. +8. For **Reports Replication Start Date**, enter the date in YYYY-MM-DD format. The data added on and after this date will be replicated. If this field is blank, Airbyte will replicate all data from previous and current calendar years. 9. For **Lookback window** (also known as attribution or conversion window) enter the number of **days** to look into the past. If your conversion window has an hours/minutes granularity, round it up to the number of days exceeding. If you're not using performance report streams in incremental mode and Reports Start Date is not provided, let it with 0 default value. -10. For *Custom Reports* - see [custom reports](#custom-reports) section, list of custom reports object: - 1. For *Report Name* enter the name that you want for your custom report. - 2. For *Reporting Data Object* add the Bing Ads Reporting Object that you want to sync in the custom report. - 3. For *Columns* add list columns of Reporting Data Object that you want to see in the custom report. - 4. For *Aggregation* add time aggregation. See [report aggregation](#report-aggregation) section. -11. Click **Authenticate your Bing Ads account**. -12. Log in and authorize the Bing Ads account. -13. Click **Set up source**. +10. For _Custom Reports_ - see [custom reports](#custom-reports) section, list of custom reports object: +11. For _Report Name_ enter the name that you want for your custom report. +12. For _Reporting Data Object_ add the Bing Ads Reporting Object that you want to sync in the custom report. +13. For _Columns_ add list columns of Reporting Data Object that you want to see in the custom report. +14. For _Aggregation_ add time aggregation. See [report aggregation](#report-aggregation) section. +15. Click **Authenticate your Bing Ads account**. +16. Log in and authorize the Bing Ads account. +17. Click **Set up source**. @@ -81,13 +83,13 @@ The tenant is used in the authentication URL, for example: `https://login.micros 7. For **Account Names Predicates** - see [predicates](https://learn.microsoft.com/en-us/advertising/customer-management-service/predicate?view=bingads-13) in bing ads docs. Will be used to filter your accounts by specified operator and account name. You can use multiple predicates pairs. The **Operator** is a one of Contains or Equals. The **Account Name** is a value to compare Accounts Name field in rows by specified operator. For example, for operator=Contains and name=Dev, all accounts where name contains dev will be replicated. And for operator=Equals and name=Airbyte, all accounts where name is equal to Airbyte will be replicated. Account Name value is not case-sensitive. 8. For **Reports Replication Start Date**, enter the date in YYYY-MM-DD format. The data added on and after this date will be replicated. If this field is blank, Airbyte will replicate all data from previous and current calendar years. 9. For **Lookback window** (also known as attribution or conversion window) enter the number of **days** to look into the past. If your conversion window has an hours/minutes granularity, round it up to the number of days exceeding. If you're not using performance report streams in incremental mode and Reports Start Date is not provided, let it with 0 default value. -10. 
For *Custom Reports* - see [custom reports](#custom-reports) section: - 1. For *Report Name* enter the name that you want for your custom report. - 2. For *Reporting Data Object* add the Bing Ads Reporting Object that you want to sync in the custom report. - 3. For *Columns* add columns of Reporting Data Object that you want to see in the custom report. - 4. For *Aggregation* select time aggregation. See [report aggregation](#report-aggregation) section. +10. For _Custom Reports_ - see [custom reports](#custom-reports) section: +11. For _Report Name_ enter the name that you want for your custom report. +12. For _Reporting Data Object_ add the Bing Ads Reporting Object that you want to sync in the custom report. +13. For _Columns_ add columns of Reporting Data Object that you want to see in the custom report. +14. For _Aggregation_ select time aggregation. See [report aggregation](#report-aggregation) section. -11. Click **Set up source**. +15. Click **Set up source**. @@ -198,12 +200,12 @@ If you faced this issue please use custom report, where you can define only that :::info -Ad Group Impression Performance Report, Geographic Performance Report, Account Impression Performance Report have user-defined primary key. -This means that you can define your own primary key in Replication tab in your connection for these streams. +Ad Group Impression Performance Report, Geographic Performance Report, Account Impression Performance Report have user-defined primary key. +This means that you can define your own primary key in Replication tab in your connection for these streams. Example pk: -Ad Group Impression Performance Report: composite pk - [AdGroupId, Status, TimePeriod, AccountId] -Geographic Performance Report: composite pk - [AdGroupId, Country, State, MetroArea, City] +Ad Group Impression Performance Report: composite pk - [AdGroupId, Status, TimePeriod, AccountId] +Geographic Performance Report: composite pk - [AdGroupId, Country, State, MetroArea, City] Account Impression Performance Report: composite pk - [AccountName, AccountNumber, AccountId, TimePeriod] Note: These are just examples, and you should consider your own data and needs in order to correctly define the primary key. @@ -213,12 +215,14 @@ See more info about user-defined pk [here](https://docs.airbyte.com/understandin ::: ### Custom Reports + You can build your own report by providing: -- *Report Name* - name of the stream -- *Reporting Data Object* - Bing Ads reporting data object that you can find [here](https://learn.microsoft.com/en-us/advertising/reporting-service/reporting-data-objects?view=bingads-13). All data object with ending ReportRequest can be used as data object in custom reports. -- *Columns* - Reporting object columns that you want to sync. You can find it on ReportRequest data object page by clicking the ...ReportColumn link in [Bing Ads docs](https://learn.microsoft.com/en-us/advertising/reporting-service/reporting-value-sets?view=bingads-13). -The report must include the Required Columns (you can find it under list of all columns of reporting object) at a minimum. As a general rule, each report must include at least one attribute column and at least one non-impression share performance statistics column. Be careful you can't add extra columns that not specified in Bing Ads docs and not all fields can be skipped. -- *Aggregation* - Hourly, Daily, Weekly, Monthly, DayOfWeek, HourOfDay, WeeklyStartingMonday, Summary. See [report aggregation](#report-aggregation). 
+ +- _Report Name_ - name of the stream +- _Reporting Data Object_ - Bing Ads reporting data object that you can find [here](https://learn.microsoft.com/en-us/advertising/reporting-service/reporting-data-objects?view=bingads-13). All data object with ending ReportRequest can be used as data object in custom reports. +- _Columns_ - Reporting object columns that you want to sync. You can find it on ReportRequest data object page by clicking the ...ReportColumn link in [Bing Ads docs](https://learn.microsoft.com/en-us/advertising/reporting-service/reporting-value-sets?view=bingads-13). + The report must include the Required Columns (you can find it under list of all columns of reporting object) at a minimum. As a general rule, each report must include at least one attribute column and at least one non-impression share performance statistics column. Be careful you can't add extra columns that not specified in Bing Ads docs and not all fields can be skipped. +- _Aggregation_ - Hourly, Daily, Weekly, Monthly, DayOfWeek, HourOfDay, WeeklyStartingMonday, Summary. See [report aggregation](#report-aggregation). ### Report aggregation @@ -243,73 +247,73 @@ The Bing Ads API limits the number of requests for all Microsoft Advertising cli ### Troubleshooting -* Check out common troubleshooting issues for the Bing Ads source connector on our [Airbyte Forum](https://github.com/airbytehq/airbyte/discussions). +- Check out common troubleshooting issues for the Bing Ads source connector on our [Airbyte Forum](https://github.com/airbytehq/airbyte/discussions). ## Changelog -| Version | Date | Pull Request | Subject | -|:--------|:-----------|:---------------------------------------------------------------------------------------------------------------------------------|:------------------------------------------------------------------------------------------------------------------------------------------------| -| 2.6.1 | 2024-05-02 | [36632](https://github.com/airbytehq/airbyte/pull/36632) | Schema descriptions | -| 2.6.0 | 2024-04-25 | [35878](https://github.com/airbytehq/airbyte/pull/35878) | Add missing fields in keyword_performance_report | -| 2.5.0 | 2024-03-21 | [35891](https://github.com/airbytehq/airbyte/pull/35891) | Accounts stream: add TaxCertificate field to schema | -| 2.4.0 | 2024-03-19 | [36267](https://github.com/airbytehq/airbyte/pull/36267) | Pin airbyte-cdk version to `^0` | -| 2.3.0 | 2024-03-05 | [35812](https://github.com/airbytehq/airbyte/pull/35812) | New streams: Audience Performance Report, Goals And Funnels Report, Product Dimension Performance Report. | -| 2.2.0 | 2024-02-13 | [35201](https://github.com/airbytehq/airbyte/pull/35201) | New streams: Budget and Product Dimension Performance. 
| -| 2.1.4 | 2024-02-12 | [35179](https://github.com/airbytehq/airbyte/pull/35179) | Manage dependencies with Poetry | -| 2.1.3 | 2024-01-31 | [34712](https://github.com/airbytehq/airbyte/pull/34712) | Fix duplicated records for report-based streams | -| 2.1.2 | 2024-01-09 | [34045](https://github.com/airbytehq/airbyte/pull/34045) | Speed up record transformation | -| 2.1.1 | 2023-12-15 | [33500](https://github.com/airbytehq/airbyte/pull/33500) | Fix state setter when state was provided | -| 2.1.0 | 2023-12-05 | [33095](https://github.com/airbytehq/airbyte/pull/33095) | Add account filtering | -| 2.0.1 | 2023-11-16 | [32597](https://github.com/airbytehq/airbyte/pull/32597) | Fix start date parsing from stream state | -| 2.0.0 | 2023-11-07 | [31995](https://github.com/airbytehq/airbyte/pull/31995) | Schema update for Accounts, Campaigns and Search Query Performance Report streams. Convert `date` and `date-time` fields to standard `RFC3339` | -| 1.13.0 | 2023-11-13 | [32306](https://github.com/airbytehq/airbyte/pull/32306) | Add Custom reports and decrease backoff max tries number | -| 1.12.1 | 2023-11-10 | [32422](https://github.com/airbytehq/airbyte/pull/32422) | Normalize numeric values in reports | -| 1.12.0 | 2023-11-09 | [32340](https://github.com/airbytehq/airbyte/pull/32340) | Remove default start date in favor of Time Period - Last Year and This Year, if start date is not provided | -| 1.11.0 | 2023-11-06 | [32201](https://github.com/airbytehq/airbyte/pull/32201) | Skip broken CSV report files | -| 1.10.0 | 2023-11-06 | [32148](https://github.com/airbytehq/airbyte/pull/32148) | Add new fields to stream Ads: "BusinessName", "CallToAction", "Headline", "Images", "Videos", "Text" | -| 1.9.0 | 2023-11-03 | [32131](https://github.com/airbytehq/airbyte/pull/32131) | Add "CampaignId", "AccountId", "CustomerId" fields to Ad Groups, Ads and Campaigns streams. 
| -| 1.8.0 | 2023-11-02 | [32059](https://github.com/airbytehq/airbyte/pull/32059) | Add new streams `CampaignImpressionPerformanceReport` (daily, hourly, weekly, monthly) | -| 1.7.1 | 2023-11-02 | [32088](https://github.com/airbytehq/airbyte/pull/32088) | Raise config error when user does not have accounts | -| 1.7.0 | 2023-11-01 | [32027](https://github.com/airbytehq/airbyte/pull/32027) | Add new streams `AdGroupImpressionPerformanceReport` | -| 1.6.0 | 2023-10-31 | [32008](https://github.com/airbytehq/airbyte/pull/32008) | Add new streams `Keywords` | -| 1.5.0 | 2023-10-30 | [31952](https://github.com/airbytehq/airbyte/pull/31952) | Add new streams `Labels`, `App install ads`, `Keyword Labels`, `Campaign Labels`, `App Install Ad Labels`, `Ad Group Labels` | -| 1.4.0 | 2023-10-27 | [31885](https://github.com/airbytehq/airbyte/pull/31885) | Add new stream: `AccountImpressionPerformanceReport` (daily, hourly, weekly, monthly) | -| 1.3.0 | 2023-10-26 | [31837](https://github.com/airbytehq/airbyte/pull/31837) | Add new stream: `UserLocationPerformanceReport` (daily, hourly, weekly, monthly) | -| 1.2.0 | 2023-10-24 | [31783](https://github.com/airbytehq/airbyte/pull/31783) | Add new stream: `SearchQueryPerformanceReport` (daily, hourly, weekly, monthly) | -| 1.1.0 | 2023-10-24 | [31712](https://github.com/airbytehq/airbyte/pull/31712) | Add new stream: `AgeGenderAudienceReport` (daily, hourly, weekly, monthly) | -| 1.0.2 | 2023-10-19 | [31599](https://github.com/airbytehq/airbyte/pull/31599) | Base image migration: remove Dockerfile and use the python-connector-base image | -| 1.0.1 | 2023-10-16 | [31432](https://github.com/airbytehq/airbyte/pull/31432) | Remove primary keys from the geographic performance reports - complete what was missed in version 1.0.0 | -| 1.0.0 | 2023-10-11 | [31277](https://github.com/airbytehq/airbyte/pull/31277) | Remove primary keys from the geographic performance reports | -| 0.2.3 | 2023-09-28 | [30834](https://github.com/airbytehq/airbyte/pull/30834) | Wrap auth error with the config error | -| 0.2.2 | 2023-09-27 | [30791](https://github.com/airbytehq/airbyte/pull/30791) | Fix missing fields for geographic performance reports | -| 0.2.1 | 2023-09-04 | [30128](https://github.com/airbytehq/airbyte/pull/30128) | Add increasing download timeout if ReportingDownloadException occurs | -| 0.2.0 | 2023-08-17 | [27619](https://github.com/airbytehq/airbyte/pull/27619) | Add Geographic Performance Report | -| 0.1.24 | 2023-06-22 | [27619](https://github.com/airbytehq/airbyte/pull/27619) | Retry request after facing temporary name resolution error | -| 0.1.23 | 2023-05-11 | [25996](https://github.com/airbytehq/airbyte/pull/25996) | Implement a retry logic if SSL certificate validation fails | -| 0.1.22 | 2023-05-08 | [24223](https://github.com/airbytehq/airbyte/pull/24223) | Add CampaignLabels report column in campaign performance report | -| 0.1.21 | 2023-04-28 | [25668](https://github.com/airbytehq/airbyte/pull/25668) | Add undeclared fields to accounts, campaigns, campaign_performance_report, keyword_performance_report and account_performance_report streams | -| 0.1.20 | 2023-03-09 | [23663](https://github.com/airbytehq/airbyte/pull/23663) | Add lookback window for performance reports in incremental mode | -| 0.1.19 | 2023-03-08 | [23868](https://github.com/airbytehq/airbyte/pull/23868) | Add dimensional-type columns for reports | -| 0.1.18 | 2023-01-30 | [22073](https://github.com/airbytehq/airbyte/pull/22073) | Fix null values in the `Keyword` column of 
`keyword_performance_report` streams | -| 0.1.17 | 2022-12-10 | [20005](https://github.com/airbytehq/airbyte/pull/20005) | Add `Keyword` to `keyword_performance_report` stream | -| 0.1.16 | 2022-10-12 | [17873](https://github.com/airbytehq/airbyte/pull/17873) | Fix: added missing campaign types in (Audience, Shopping and DynamicSearchAds) in campaigns stream | -| 0.1.15 | 2022-10-03 | [17505](https://github.com/airbytehq/airbyte/pull/17505) | Fix: limit cache size for ServiceClient instances | -| 0.1.14 | 2022-09-29 | [17403](https://github.com/airbytehq/airbyte/pull/17403) | Fix: limit cache size for ReportingServiceManager instances | -| 0.1.13 | 2022-09-29 | [17386](https://github.com/airbytehq/airbyte/pull/17386) | Migrate to per-stream states | -| 0.1.12 | 2022-09-05 | [16335](https://github.com/airbytehq/airbyte/pull/16335) | Added backoff for socket.timeout | -| 0.1.11 | 2022-08-25 | [15684](https://github.com/airbytehq/airbyte/pull/15684) (published in [15987](https://github.com/airbytehq/airbyte/pull/15987)) | Fixed log messages being unreadable | -| 0.1.10 | 2022-08-12 | [15602](https://github.com/airbytehq/airbyte/pull/15602) | Fixed bug caused Hourly Reports to crash due to invalid fields set | -| 0.1.9 | 2022-08-02 | [14862](https://github.com/airbytehq/airbyte/pull/14862) | Added missing columns | -| 0.1.8 | 2022-06-15 | [13801](https://github.com/airbytehq/airbyte/pull/13801) | All reports `hourly/daily/weekly/monthly` will be generated by default, these options are removed from input configuration | -| 0.1.7 | 2022-05-17 | [12937](https://github.com/airbytehq/airbyte/pull/12937) | Added OAuth2.0 authentication method, removed `redirect_uri` from input configuration | -| 0.1.6 | 2022-04-30 | [12500](https://github.com/airbytehq/airbyte/pull/12500) | Improve input configuration copy | -| 0.1.5 | 2022-01-01 | [11652](https://github.com/airbytehq/airbyte/pull/11652) | Rebump attempt after DockerHub failure at registring the 0.1.4 | -| 0.1.4 | 2022-03-22 | [11311](https://github.com/airbytehq/airbyte/pull/11311) | Added optional Redirect URI & Tenant ID to spec | -| 0.1.3 | 2022-01-14 | [9510](https://github.com/airbytehq/airbyte/pull/9510) | Fixed broken dependency that blocked connector's operations | -| 0.1.2 | 2021-12-14 | [8429](https://github.com/airbytehq/airbyte/pull/8429) | Update titles and descriptions | -| 0.1.1 | 2021-08-31 | [5750](https://github.com/airbytehq/airbyte/pull/5750) | Added reporting streams | -| 0.1.0 | 2021-07-22 | [4911](https://github.com/airbytehq/airbyte/pull/4911) | Initial release supported core streams \(Accounts, Campaigns, Ads, AdGroups\) | +| Version | Date | Pull Request | Subject | +| :------ | :--------- | :------------------------------------------------------------------------------------------------------------------------------- | :--------------------------------------------------------------------------------------------------------------------------------------------- | +| 2.6.1 | 2024-05-02 | [36632](https://github.com/airbytehq/airbyte/pull/36632) | Schema descriptions | +| 2.6.0 | 2024-04-25 | [35878](https://github.com/airbytehq/airbyte/pull/35878) | Add missing fields in keyword_performance_report | +| 2.5.0 | 2024-03-21 | [35891](https://github.com/airbytehq/airbyte/pull/35891) | Accounts stream: add TaxCertificate field to schema | +| 2.4.0 | 2024-03-19 | [36267](https://github.com/airbytehq/airbyte/pull/36267) | Pin airbyte-cdk version to `^0` | +| 2.3.0 | 2024-03-05 | 
[35812](https://github.com/airbytehq/airbyte/pull/35812) | New streams: Audience Performance Report, Goals And Funnels Report, Product Dimension Performance Report. | +| 2.2.0 | 2024-02-13 | [35201](https://github.com/airbytehq/airbyte/pull/35201) | New streams: Budget and Product Dimension Performance. | +| 2.1.4 | 2024-02-12 | [35179](https://github.com/airbytehq/airbyte/pull/35179) | Manage dependencies with Poetry | +| 2.1.3 | 2024-01-31 | [34712](https://github.com/airbytehq/airbyte/pull/34712) | Fix duplicated records for report-based streams | +| 2.1.2 | 2024-01-09 | [34045](https://github.com/airbytehq/airbyte/pull/34045) | Speed up record transformation | +| 2.1.1 | 2023-12-15 | [33500](https://github.com/airbytehq/airbyte/pull/33500) | Fix state setter when state was provided | +| 2.1.0 | 2023-12-05 | [33095](https://github.com/airbytehq/airbyte/pull/33095) | Add account filtering | +| 2.0.1 | 2023-11-16 | [32597](https://github.com/airbytehq/airbyte/pull/32597) | Fix start date parsing from stream state | +| 2.0.0 | 2023-11-07 | [31995](https://github.com/airbytehq/airbyte/pull/31995) | Schema update for Accounts, Campaigns and Search Query Performance Report streams. Convert `date` and `date-time` fields to standard `RFC3339` | +| 1.13.0 | 2023-11-13 | [32306](https://github.com/airbytehq/airbyte/pull/32306) | Add Custom reports and decrease backoff max tries number | +| 1.12.1 | 2023-11-10 | [32422](https://github.com/airbytehq/airbyte/pull/32422) | Normalize numeric values in reports | +| 1.12.0 | 2023-11-09 | [32340](https://github.com/airbytehq/airbyte/pull/32340) | Remove default start date in favor of Time Period - Last Year and This Year, if start date is not provided | +| 1.11.0 | 2023-11-06 | [32201](https://github.com/airbytehq/airbyte/pull/32201) | Skip broken CSV report files | +| 1.10.0 | 2023-11-06 | [32148](https://github.com/airbytehq/airbyte/pull/32148) | Add new fields to stream Ads: "BusinessName", "CallToAction", "Headline", "Images", "Videos", "Text" | +| 1.9.0 | 2023-11-03 | [32131](https://github.com/airbytehq/airbyte/pull/32131) | Add "CampaignId", "AccountId", "CustomerId" fields to Ad Groups, Ads and Campaigns streams. 
| +| 1.8.0 | 2023-11-02 | [32059](https://github.com/airbytehq/airbyte/pull/32059) | Add new streams `CampaignImpressionPerformanceReport` (daily, hourly, weekly, monthly) | +| 1.7.1 | 2023-11-02 | [32088](https://github.com/airbytehq/airbyte/pull/32088) | Raise config error when user does not have accounts | +| 1.7.0 | 2023-11-01 | [32027](https://github.com/airbytehq/airbyte/pull/32027) | Add new streams `AdGroupImpressionPerformanceReport` | +| 1.6.0 | 2023-10-31 | [32008](https://github.com/airbytehq/airbyte/pull/32008) | Add new streams `Keywords` | +| 1.5.0 | 2023-10-30 | [31952](https://github.com/airbytehq/airbyte/pull/31952) | Add new streams `Labels`, `App install ads`, `Keyword Labels`, `Campaign Labels`, `App Install Ad Labels`, `Ad Group Labels` | +| 1.4.0 | 2023-10-27 | [31885](https://github.com/airbytehq/airbyte/pull/31885) | Add new stream: `AccountImpressionPerformanceReport` (daily, hourly, weekly, monthly) | +| 1.3.0 | 2023-10-26 | [31837](https://github.com/airbytehq/airbyte/pull/31837) | Add new stream: `UserLocationPerformanceReport` (daily, hourly, weekly, monthly) | +| 1.2.0 | 2023-10-24 | [31783](https://github.com/airbytehq/airbyte/pull/31783) | Add new stream: `SearchQueryPerformanceReport` (daily, hourly, weekly, monthly) | +| 1.1.0 | 2023-10-24 | [31712](https://github.com/airbytehq/airbyte/pull/31712) | Add new stream: `AgeGenderAudienceReport` (daily, hourly, weekly, monthly) | +| 1.0.2 | 2023-10-19 | [31599](https://github.com/airbytehq/airbyte/pull/31599) | Base image migration: remove Dockerfile and use the python-connector-base image | +| 1.0.1 | 2023-10-16 | [31432](https://github.com/airbytehq/airbyte/pull/31432) | Remove primary keys from the geographic performance reports - complete what was missed in version 1.0.0 | +| 1.0.0 | 2023-10-11 | [31277](https://github.com/airbytehq/airbyte/pull/31277) | Remove primary keys from the geographic performance reports | +| 0.2.3 | 2023-09-28 | [30834](https://github.com/airbytehq/airbyte/pull/30834) | Wrap auth error with the config error | +| 0.2.2 | 2023-09-27 | [30791](https://github.com/airbytehq/airbyte/pull/30791) | Fix missing fields for geographic performance reports | +| 0.2.1 | 2023-09-04 | [30128](https://github.com/airbytehq/airbyte/pull/30128) | Add increasing download timeout if ReportingDownloadException occurs | +| 0.2.0 | 2023-08-17 | [27619](https://github.com/airbytehq/airbyte/pull/27619) | Add Geographic Performance Report | +| 0.1.24 | 2023-06-22 | [27619](https://github.com/airbytehq/airbyte/pull/27619) | Retry request after facing temporary name resolution error | +| 0.1.23 | 2023-05-11 | [25996](https://github.com/airbytehq/airbyte/pull/25996) | Implement a retry logic if SSL certificate validation fails | +| 0.1.22 | 2023-05-08 | [24223](https://github.com/airbytehq/airbyte/pull/24223) | Add CampaignLabels report column in campaign performance report | +| 0.1.21 | 2023-04-28 | [25668](https://github.com/airbytehq/airbyte/pull/25668) | Add undeclared fields to accounts, campaigns, campaign_performance_report, keyword_performance_report and account_performance_report streams | +| 0.1.20 | 2023-03-09 | [23663](https://github.com/airbytehq/airbyte/pull/23663) | Add lookback window for performance reports in incremental mode | +| 0.1.19 | 2023-03-08 | [23868](https://github.com/airbytehq/airbyte/pull/23868) | Add dimensional-type columns for reports | +| 0.1.18 | 2023-01-30 | [22073](https://github.com/airbytehq/airbyte/pull/22073) | Fix null values in the `Keyword` column of 
`keyword_performance_report` streams | +| 0.1.17 | 2022-12-10 | [20005](https://github.com/airbytehq/airbyte/pull/20005) | Add `Keyword` to `keyword_performance_report` stream | +| 0.1.16 | 2022-10-12 | [17873](https://github.com/airbytehq/airbyte/pull/17873) | Fix: added missing campaign types in (Audience, Shopping and DynamicSearchAds) in campaigns stream | +| 0.1.15 | 2022-10-03 | [17505](https://github.com/airbytehq/airbyte/pull/17505) | Fix: limit cache size for ServiceClient instances | +| 0.1.14 | 2022-09-29 | [17403](https://github.com/airbytehq/airbyte/pull/17403) | Fix: limit cache size for ReportingServiceManager instances | +| 0.1.13 | 2022-09-29 | [17386](https://github.com/airbytehq/airbyte/pull/17386) | Migrate to per-stream states | +| 0.1.12 | 2022-09-05 | [16335](https://github.com/airbytehq/airbyte/pull/16335) | Added backoff for socket.timeout | +| 0.1.11 | 2022-08-25 | [15684](https://github.com/airbytehq/airbyte/pull/15684) (published in [15987](https://github.com/airbytehq/airbyte/pull/15987)) | Fixed log messages being unreadable | +| 0.1.10 | 2022-08-12 | [15602](https://github.com/airbytehq/airbyte/pull/15602) | Fixed bug caused Hourly Reports to crash due to invalid fields set | +| 0.1.9 | 2022-08-02 | [14862](https://github.com/airbytehq/airbyte/pull/14862) | Added missing columns | +| 0.1.8 | 2022-06-15 | [13801](https://github.com/airbytehq/airbyte/pull/13801) | All reports `hourly/daily/weekly/monthly` will be generated by default, these options are removed from input configuration | +| 0.1.7 | 2022-05-17 | [12937](https://github.com/airbytehq/airbyte/pull/12937) | Added OAuth2.0 authentication method, removed `redirect_uri` from input configuration | +| 0.1.6 | 2022-04-30 | [12500](https://github.com/airbytehq/airbyte/pull/12500) | Improve input configuration copy | +| 0.1.5 | 2022-01-01 | [11652](https://github.com/airbytehq/airbyte/pull/11652) | Rebump attempt after DockerHub failure at registring the 0.1.4 | +| 0.1.4 | 2022-03-22 | [11311](https://github.com/airbytehq/airbyte/pull/11311) | Added optional Redirect URI & Tenant ID to spec | +| 0.1.3 | 2022-01-14 | [9510](https://github.com/airbytehq/airbyte/pull/9510) | Fixed broken dependency that blocked connector's operations | +| 0.1.2 | 2021-12-14 | [8429](https://github.com/airbytehq/airbyte/pull/8429) | Update titles and descriptions | +| 0.1.1 | 2021-08-31 | [5750](https://github.com/airbytehq/airbyte/pull/5750) | Added reporting streams | +| 0.1.0 | 2021-07-22 | [4911](https://github.com/airbytehq/airbyte/pull/4911) | Initial release supported core streams \(Accounts, Campaigns, Ads, AdGroups\) | diff --git a/docs/integrations/sources/braintree.md b/docs/integrations/sources/braintree.md index f16c2cabae2..b718992cd95 100644 --- a/docs/integrations/sources/braintree.md +++ b/docs/integrations/sources/braintree.md @@ -32,33 +32,32 @@ This source can sync data for the [Braintree API](https://developers.braintreepa ### Supported Sync Modes -| Feature | Supported?\(Yes/No\) | Notes | -| :--- | :--- | :--- | -| Full Refresh Sync | Yes | | -| Incremental Sync | Yes | | -| Namespaces | No | | +| Feature | Supported?\(Yes/No\) | Notes | +| :---------------- | :------------------- | :---- | +| Full Refresh Sync | Yes | | +| Incremental Sync | Yes | | +| Namespaces | No | | ## Supported Streams The following streams are supported: -* [Customers](https://developer.paypal.com/braintree/docs/reference/request/customer/search) -* 
[Discounts](https://developer.paypal.com/braintree/docs/reference/response/discount) -* [Disputes](https://developer.paypal.com/braintree/docs/reference/request/dispute/search) -* [Transactions](https://developers.braintreepayments.com/reference/response/transaction/python) -* [Merchant Accounts](https://developer.paypal.com/braintree/docs/reference/response/merchant-account) -* [Plans](https://developer.paypal.com/braintree/docs/reference/response/plan) -* [Subscriptions](https://developer.paypal.com/braintree/docs/reference/response/subscription) +- [Customers](https://developer.paypal.com/braintree/docs/reference/request/customer/search) +- [Discounts](https://developer.paypal.com/braintree/docs/reference/response/discount) +- [Disputes](https://developer.paypal.com/braintree/docs/reference/request/dispute/search) +- [Transactions](https://developers.braintreepayments.com/reference/response/transaction/python) +- [Merchant Accounts](https://developer.paypal.com/braintree/docs/reference/response/merchant-account) +- [Plans](https://developer.paypal.com/braintree/docs/reference/response/plan) +- [Subscriptions](https://developer.paypal.com/braintree/docs/reference/response/subscription) ## Data type mapping | Integration Type | Airbyte Type | Notes | -| :--- | :--- | :--- | -| `string` | `string` | | -| `number` | `number` | | -| `array` | `array` | | -| `object` | `object` | | - +| :--------------- | :----------- | :---- | +| `string` | `string` | | +| `number` | `number` | | +| `array` | `array` | | +| `object` | `object` | | ## Performance considerations @@ -66,16 +65,15 @@ The connector is restricted by normal Braintree [requests limitation](https://de The Braintree connector should not run into Braintree API limitations under normal usage. Please [create an issue](https://github.com/airbytehq/airbyte/issues) if you see any rate limit issues that are not automatically retried successfully. 
- ## Changelog -| Version | Date | Pull Request | Subject | -| :--- | :--- | :--- | :--- | -| 0.2.1 | 2023-11-08 | [31489](https://github.com/airbytehq/airbyte/pull/31489) | Fix transaction stream custom fields | -| 0.2.0 | 2023-07-17 | [29200](https://github.com/airbytehq/airbyte/pull/29200) | Migrate connector to low-code framework | -| 0.1.5 | 2023-05-24 | [26340](https://github.com/airbytehq/airbyte/pull/26340) | Fix error in `check_connection` in integration tests | -| 0.1.4 | 2023-03-13 | [23548](https://github.com/airbytehq/airbyte/pull/23548) | Update braintree python library version to 4.18.1 | -| 0.1.3 | 2021-12-23 | [8434](https://github.com/airbytehq/airbyte/pull/8434) | Update fields in source-connectors specifications | -| 0.1.2 | 2021-12-22 | [9042](https://github.com/airbytehq/airbyte/pull/9042) | Fix `$ref` in schema and spec | -| 0.1.1 | 2021-10-27 | [7432](https://github.com/airbytehq/airbyte/pull/7432) | Dispute model should accept multiple Evidences | -| 0.1.0 | 2021-08-17 | [5362](https://github.com/airbytehq/airbyte/pull/5362) | Initial version | +| Version | Date | Pull Request | Subject | +| :------ | :--------- | :------------------------------------------------------- | :--------------------------------------------------- | +| 0.2.1 | 2023-11-08 | [31489](https://github.com/airbytehq/airbyte/pull/31489) | Fix transaction stream custom fields | +| 0.2.0 | 2023-07-17 | [29200](https://github.com/airbytehq/airbyte/pull/29200) | Migrate connector to low-code framework | +| 0.1.5 | 2023-05-24 | [26340](https://github.com/airbytehq/airbyte/pull/26340) | Fix error in `check_connection` in integration tests | +| 0.1.4 | 2023-03-13 | [23548](https://github.com/airbytehq/airbyte/pull/23548) | Update braintree python library version to 4.18.1 | +| 0.1.3 | 2021-12-23 | [8434](https://github.com/airbytehq/airbyte/pull/8434) | Update fields in source-connectors specifications | +| 0.1.2 | 2021-12-22 | [9042](https://github.com/airbytehq/airbyte/pull/9042) | Fix `$ref` in schema and spec | +| 0.1.1 | 2021-10-27 | [7432](https://github.com/airbytehq/airbyte/pull/7432) | Dispute model should accept multiple Evidences | +| 0.1.0 | 2021-08-17 | [5362](https://github.com/airbytehq/airbyte/pull/5362) | Initial version | diff --git a/docs/integrations/sources/braze.md b/docs/integrations/sources/braze.md index 9f97ff8a595..f52c6afa901 100644 --- a/docs/integrations/sources/braze.md +++ b/docs/integrations/sources/braze.md @@ -5,6 +5,7 @@ This page contains the setup guide and reference information for the Braze sourc ## Prerequisites It is required to have an account on Braze to provide us with `URL` and `Rest API Key` during set up. + - `Rest API Key` could be found on Braze Dashboard -> Developer Console tab -> API Settings -> Rest API Keys - `URL` could be found on Braze Dashboard -> Manage Settings -> Settings tab -> `Your App name` -> SDK Endpoint @@ -44,16 +45,13 @@ The Braze source connector supports the following [ sync modes](https://docs.air Rate limits differ depending on stream. 
-Rate limits table: https://www.braze.com/docs/api/api_limits/#rate-limits-by-request-type - +Rate limits table: https://www.braze.com/docs/api/api_limits/#rate-limits-by-request-type ## Changelog -| Version | Date | Pull Request | Subject | -|:--------|:-----------|:----------------------------------------------------------|:-----------------------------------| -| 0.3.0 | 2023-11-04 | [31857](https://github.com/airbytehq/airbyte/pull/31857) | Add Campaigns, Canvases, Segments Details Streams | -| 0.2.0 | 2023-10-28 | [31607](https://github.com/airbytehq/airbyte/pull/31607) | Fix CanvasAnalytics Stream Null Data for step_stats, variant_stats | -| 0.1.4 | 2023-11-03 | [20520](https://github.com/airbytehq/airbyte/pull/20520) | Fix integration tests | -| 0.1.3 | 2022-12-15 | [20520](https://github.com/airbytehq/airbyte/pull/20520) | The Braze connector born | - - +| Version | Date | Pull Request | Subject | +| :------ | :--------- | :------------------------------------------------------- | :----------------------------------------------------------------- | +| 0.3.0 | 2023-11-04 | [31857](https://github.com/airbytehq/airbyte/pull/31857) | Add Campaigns, Canvases, Segments Details Streams | +| 0.2.0 | 2023-10-28 | [31607](https://github.com/airbytehq/airbyte/pull/31607) | Fix CanvasAnalytics Stream Null Data for step_stats, variant_stats | +| 0.1.4 | 2023-11-03 | [20520](https://github.com/airbytehq/airbyte/pull/20520) | Fix integration tests | +| 0.1.3 | 2022-12-15 | [20520](https://github.com/airbytehq/airbyte/pull/20520) | The Braze connector born | diff --git a/docs/integrations/sources/breezometer.md b/docs/integrations/sources/breezometer.md index 2d7ac09a0ca..e63cbefe6b0 100644 --- a/docs/integrations/sources/breezometer.md +++ b/docs/integrations/sources/breezometer.md @@ -3,8 +3,9 @@ Breezometer connector lets you request environment information like air quality, pollen forecast, current and forecasted weather and wildfires for a specific location. ## Prerequisites -* A Breezometer -* An `api_key`, that can be found on your Breezometer account home page. + +- A Breezometer +- An `api_key`, that can be found on your Breezometer account home page. ## Supported sync modes @@ -12,14 +13,13 @@ The Breezometer connector supports full sync refresh. ## Airbyte Open Source -* API Key -* Latitude -* Longitude -* Days to Forecast -* Hours to Forecast -* Historic Hours -* Radius - +- API Key +- Latitude +- Longitude +- Days to Forecast +- Hours to Forecast +- Historic Hours +- Radius ## Supported Streams @@ -32,9 +32,8 @@ The Breezometer connector supports full sync refresh. - [Wildfire - Burnt Area](https://docs.breezometer.com/api-documentation/wildfire-tracker-api/v1/#burnt-area-api) - [Wildfire - Locate](https://docs.breezometer.com/api-documentation/wildfire-tracker-api/v1/#current-conditions) - ## Changelog -| Version | Date | Pull Request | Subject | -|:--------|:-----------|:---------------------------------------------------------|:-------------------------------------------------------------------------------------------------------------------------------------------------------| -| 0.1.0 | 2022-10-29 | [18650](https://github.com/airbytehq/airbyte/pull/18650) | Initial version/release of the connector. 
\ No newline at end of file +| Version | Date | Pull Request | Subject | +| :------ | :--------- | :------------------------------------------------------- | :---------------------------------------- | +| 0.1.0 | 2022-10-29 | [18650](https://github.com/airbytehq/airbyte/pull/18650) | Initial version/release of the connector. | diff --git a/docs/integrations/sources/callrail.md b/docs/integrations/sources/callrail.md index 75ad1587331..1d9281c572b 100644 --- a/docs/integrations/sources/callrail.md +++ b/docs/integrations/sources/callrail.md @@ -2,37 +2,36 @@ ## Overview -The CailRail source supports Full Refresh and Incremental syncs. +The CailRail source supports Full Refresh and Incremental syncs. ### Output schema This Source is capable of syncing the following core Streams: -* [Calls](https://apidocs.callrail.com/#calls) -* [Companies](https://apidocs.callrail.com/#companies) -* [Text Messages](https://apidocs.callrail.com/#text-messages) -* [Users](https://apidocs.callrail.com/#users) - +- [Calls](https://apidocs.callrail.com/#calls) +- [Companies](https://apidocs.callrail.com/#companies) +- [Text Messages](https://apidocs.callrail.com/#text-messages) +- [Users](https://apidocs.callrail.com/#users) ### Features -| Feature | Supported? | -| :--- |:-----------| -| Full Refresh Sync | Yes | +| Feature | Supported? | +| :------------------------ | :--------- | +| Full Refresh Sync | Yes | | Incremental - Append Sync | Yes | | Incremental - Dedupe Sync | Yes | -| SSL connection | No | -| Namespaces | No | +| SSL connection | No | +| Namespaces | No | ## Getting started ### Requirements -* CallRail Account -* CallRail API Token +- CallRail Account +- CallRail API Token ## Changelog -| Version | Date | Pull Request | Subject | -| :--- |:-----------|:--------------------------------------------------------|:----------------------------------| -| 0.1.0 | 2022-10-31 | [18739](https://github.com/airbytehq/airbyte/pull/18739) | 🎉 New Source: CallRail | \ No newline at end of file +| Version | Date | Pull Request | Subject | +| :------ | :--------- | :------------------------------------------------------- | :---------------------- | +| 0.1.0 | 2022-10-31 | [18739](https://github.com/airbytehq/airbyte/pull/18739) | 🎉 New Source: CallRail | diff --git a/docs/integrations/sources/cart.md b/docs/integrations/sources/cart.md index 3e809108425..fd78e5d8b37 100644 --- a/docs/integrations/sources/cart.md +++ b/docs/integrations/sources/cart.md @@ -50,16 +50,16 @@ Please follow these [steps](https://developers.cart.com/docs/rest-api/docs/READM | Version | Date | Pull Request | Subject | | :------ | :--------- | :------------------------------------------------------- | :------------------------------------------------------------------------------------- | -| 0.3.5 | 2024-04-19 | [37131](https://github.com/airbytehq/airbyte/pull/37131) | Updating to 0.80.0 CDK | -| 0.3.4 | 2024-04-18 | [37131](https://github.com/airbytehq/airbyte/pull/37131) | Manage dependencies with Poetry. 
| -| 0.3.3 | 2024-04-15 | [37131](https://github.com/airbytehq/airbyte/pull/37131) | Base image migration: remove Dockerfile and use the python-connector-base image | -| 0.3.2 | 2024-04-12 | [37131](https://github.com/airbytehq/airbyte/pull/37131) | schema descriptions | -| 0.3.1 | 2023-11-21 | [32705](https://github.com/airbytehq/airbyte/pull/32705) | Update CDK version | -| 0.3.0 | 2023-11-14 | [23317](https://github.com/airbytehq/airbyte/pull/23317) | Update schemas | -| 0.2.1 | 2023-02-22 | [23317](https://github.com/airbytehq/airbyte/pull/23317) | Remove support for incremental for `order_statuses` stream | -| 0.2.0 | 2022-09-21 | [16612](https://github.com/airbytehq/airbyte/pull/16612) | Source Cart.com: implement Central API Router access method and improve backoff policy | -| 0.1.6 | 2022-07-15 | [14752](https://github.com/airbytehq/airbyte/pull/14752) | Add `order_statuses` stream | -| 0.1.5 | 2021-12-23 | [8434](https://github.com/airbytehq/airbyte/pull/8434) | Update fields in source-connectors specifications | -| 0.1.3 | 2021-08-26 | [5465](https://github.com/airbytehq/airbyte/pull/5465) | Add the end_date option for limitation of the amount of synced data | -| 0.1.2 | 2021-08-23 | [1111](https://github.com/airbytehq/airbyte/pull/1111) | Add `order_items` stream | -| 0.1.0 | 2021-06-08 | [4574](https://github.com/airbytehq/airbyte/pull/4574) | Initial Release | +| 0.3.5 | 2024-04-19 | [37131](https://github.com/airbytehq/airbyte/pull/37131) | Updating to 0.80.0 CDK | +| 0.3.4 | 2024-04-18 | [37131](https://github.com/airbytehq/airbyte/pull/37131) | Manage dependencies with Poetry. | +| 0.3.3 | 2024-04-15 | [37131](https://github.com/airbytehq/airbyte/pull/37131) | Base image migration: remove Dockerfile and use the python-connector-base image | +| 0.3.2 | 2024-04-12 | [37131](https://github.com/airbytehq/airbyte/pull/37131) | schema descriptions | +| 0.3.1 | 2023-11-21 | [32705](https://github.com/airbytehq/airbyte/pull/32705) | Update CDK version | +| 0.3.0 | 2023-11-14 | [23317](https://github.com/airbytehq/airbyte/pull/23317) | Update schemas | +| 0.2.1 | 2023-02-22 | [23317](https://github.com/airbytehq/airbyte/pull/23317) | Remove support for incremental for `order_statuses` stream | +| 0.2.0 | 2022-09-21 | [16612](https://github.com/airbytehq/airbyte/pull/16612) | Source Cart.com: implement Central API Router access method and improve backoff policy | +| 0.1.6 | 2022-07-15 | [14752](https://github.com/airbytehq/airbyte/pull/14752) | Add `order_statuses` stream | +| 0.1.5 | 2021-12-23 | [8434](https://github.com/airbytehq/airbyte/pull/8434) | Update fields in source-connectors specifications | +| 0.1.3 | 2021-08-26 | [5465](https://github.com/airbytehq/airbyte/pull/5465) | Add the end_date option for limitation of the amount of synced data | +| 0.1.2 | 2021-08-23 | [1111](https://github.com/airbytehq/airbyte/pull/1111) | Add `order_items` stream | +| 0.1.0 | 2021-06-08 | [4574](https://github.com/airbytehq/airbyte/pull/4574) | Initial Release | diff --git a/docs/integrations/sources/chargebee.md b/docs/integrations/sources/chargebee.md index 4aa7c18678e..afa5238d59e 100644 --- a/docs/integrations/sources/chargebee.md +++ b/docs/integrations/sources/chargebee.md @@ -9,8 +9,9 @@ This page contains the setup guide and reference information for the Chargebee s ## Prerequisites To set up the Chargebee source connector, you will need: - - [Chargebee API key](https://apidocs.chargebee.com/docs/api/auth) - - [Product Catalog 
version](https://www.chargebee.com/docs/1.0/upgrade-product-catalog.html) of the Chargebee site you are syncing. + +- [Chargebee API key](https://apidocs.chargebee.com/docs/api/auth) +- [Product Catalog version](https://www.chargebee.com/docs/1.0/upgrade-product-catalog.html) of the Chargebee site you are syncing. :::info All Chargebee sites created from May 5, 2021 onward will have [Product Catalog 2.0](https://www.chargebee.com/docs/2.0/product-catalog.html) enabled by default. Sites created prior to this date will use [Product Catalog 1.0](https://www.chargebee.com/docs/1.0/product-catalog.html). @@ -34,10 +35,10 @@ All Chargebee sites created from May 5, 2021 onward will have [Product Catalog 2 The Chargebee source connector supports the following [sync modes](https://docs.airbyte.com/cloud/core-concepts#connection-sync-modes): -* [Full Refresh - Overwrite](https://docs.airbyte.com/understanding-airbyte/connections/full-refresh-overwrite/) -* [Full Refresh - Append](https://docs.airbyte.com/understanding-airbyte/connections/full-refresh-append) -* [Incremental - Append](https://docs.airbyte.com/understanding-airbyte/connections/incremental-append) -* [Incremental - Append + Deduped](https://docs.airbyte.com/understanding-airbyte/connections/incremental-append-deduped) +- [Full Refresh - Overwrite](https://docs.airbyte.com/understanding-airbyte/connections/full-refresh-overwrite/) +- [Full Refresh - Append](https://docs.airbyte.com/understanding-airbyte/connections/full-refresh-append) +- [Incremental - Append](https://docs.airbyte.com/understanding-airbyte/connections/incremental-append) +- [Incremental - Append + Deduped](https://docs.airbyte.com/understanding-airbyte/connections/incremental-append-deduped) ## Supported streams @@ -45,32 +46,32 @@ Most streams are supported regardless of your Chargebee site's [Product Catalog | Stream | Product Catalog 1.0 | Product Catalog 2.0 | | ------------------------------------------------------------------------------------------------------ | ------------------- | ------------------- | -| [Addons](https://apidocs.chargebee.com/docs/api/addons?prod_cat_ver=1) | ✔ | | -| [Attached Items](https://apidocs.chargebee.com/docs/api/attached_items?prod_cat_ver=2) | | ✔ | -| [Comments](https://apidocs.chargebee.com/docs/api/comments?prod_cat_ver=2) | ✔ | ✔ | -| [Contacts](https://apidocs.chargebee.com/docs/api/customers?lang=curl#list_of_contacts_for_a_customer) | ✔ | ✔ | -| [Coupons](https://apidocs.chargebee.com/docs/api/coupons) | ✔ | ✔ | -| [Credit Notes](https://apidocs.chargebee.com/docs/api/credit_notes) | ✔ | ✔ | -| [Customers](https://apidocs.chargebee.com/docs/api/customers) | ✔ | ✔ | -| [Differential Prices](https://apidocs.chargebee.com/docs/api/differential_prices) | ✔ | ✔ | -| [Events](https://apidocs.chargebee.com/docs/api/events) | ✔ | ✔ | -| [Gifts](https://apidocs.chargebee.com/docs/api/gifts) | ✔ | ✔ | -| [Hosted Pages](https://apidocs.chargebee.com/docs/api/hosted_pages) | ✔ | ✔ | -| [Invoices](https://apidocs.chargebee.com/docs/api/invoices) | ✔ | ✔ | -| [Items](https://apidocs.chargebee.com/docs/api/items?prod_cat_ver=2) | | ✔ | -| [Item Prices](https://apidocs.chargebee.com/docs/api/item_prices?prod_cat_ver=2) | | ✔ | -| [Item Families](https://apidocs.chargebee.com/docs/api/item_families?prod_cat_ver=2) | | ✔ | -| [Orders](https://apidocs.chargebee.com/docs/api/orders) | ✔ | ✔ | -| [Payment Sources](https://apidocs.chargebee.com/docs/api/payment_sources) | ✔ | ✔ | -| 
[Plans](https://apidocs.chargebee.com/docs/api/plans?prod_cat_ver=1) | ✔ | | -| [Promotional Credits](https://apidocs.chargebee.com/docs/api/promotional_credits) | ✔ | ✔ | -| [Quotes](https://apidocs.chargebee.com/docs/api/quotes) | ✔ | ✔ | -| [Quote Line Groups](https://apidocs.chargebee.com/docs/api/quote_line_groups) | ✔ | ✔ | -| [Site Migration Details](https://apidocs.chargebee.com/docs/api/site_migration_details) | ✔ | ✔ | -| [Subscriptions](https://apidocs.chargebee.com/docs/api/subscriptions) | ✔ | ✔ | -| [Transactions](https://apidocs.chargebee.com/docs/api/transactions) | ✔ | ✔ | -| [Unbilled Charges](https://apidocs.chargebee.com/docs/api/unbilled_charges) | ✔ | ✔ | -| [Virtual Bank Accounts](https://apidocs.chargebee.com/docs/api/virtual_bank_accounts) | ✔ | ✔ | +| [Addons](https://apidocs.chargebee.com/docs/api/addons?prod_cat_ver=1) | ✔ | | +| [Attached Items](https://apidocs.chargebee.com/docs/api/attached_items?prod_cat_ver=2) | | ✔ | +| [Comments](https://apidocs.chargebee.com/docs/api/comments?prod_cat_ver=2) | ✔ | ✔ | +| [Contacts](https://apidocs.chargebee.com/docs/api/customers?lang=curl#list_of_contacts_for_a_customer) | ✔ | ✔ | +| [Coupons](https://apidocs.chargebee.com/docs/api/coupons) | ✔ | ✔ | +| [Credit Notes](https://apidocs.chargebee.com/docs/api/credit_notes) | ✔ | ✔ | +| [Customers](https://apidocs.chargebee.com/docs/api/customers) | ✔ | ✔ | +| [Differential Prices](https://apidocs.chargebee.com/docs/api/differential_prices) | ✔ | ✔ | +| [Events](https://apidocs.chargebee.com/docs/api/events) | ✔ | ✔ | +| [Gifts](https://apidocs.chargebee.com/docs/api/gifts) | ✔ | ✔ | +| [Hosted Pages](https://apidocs.chargebee.com/docs/api/hosted_pages) | ✔ | ✔ | +| [Invoices](https://apidocs.chargebee.com/docs/api/invoices) | ✔ | ✔ | +| [Items](https://apidocs.chargebee.com/docs/api/items?prod_cat_ver=2) | | ✔ | +| [Item Prices](https://apidocs.chargebee.com/docs/api/item_prices?prod_cat_ver=2) | | ✔ | +| [Item Families](https://apidocs.chargebee.com/docs/api/item_families?prod_cat_ver=2) | | ✔ | +| [Orders](https://apidocs.chargebee.com/docs/api/orders) | ✔ | ✔ | +| [Payment Sources](https://apidocs.chargebee.com/docs/api/payment_sources) | ✔ | ✔ | +| [Plans](https://apidocs.chargebee.com/docs/api/plans?prod_cat_ver=1) | ✔ | | +| [Promotional Credits](https://apidocs.chargebee.com/docs/api/promotional_credits) | ✔ | ✔ | +| [Quotes](https://apidocs.chargebee.com/docs/api/quotes) | ✔ | ✔ | +| [Quote Line Groups](https://apidocs.chargebee.com/docs/api/quote_line_groups) | ✔ | ✔ | +| [Site Migration Details](https://apidocs.chargebee.com/docs/api/site_migration_details) | ✔ | ✔ | +| [Subscriptions](https://apidocs.chargebee.com/docs/api/subscriptions) | ✔ | ✔ | +| [Transactions](https://apidocs.chargebee.com/docs/api/transactions) | ✔ | ✔ | +| [Unbilled Charges](https://apidocs.chargebee.com/docs/api/unbilled_charges) | ✔ | ✔ | +| [Virtual Bank Accounts](https://apidocs.chargebee.com/docs/api/virtual_bank_accounts) | ✔ | ✔ | :::note When using incremental sync mode, the `Attached Items` stream behaves differently than the other streams. Whereas other incremental streams read and output _only new_ records, the `Attached Items` stream reads _all_ records but only outputs _new_ records, making it more demanding on your Chargebee API quota. Each sync incurs API calls equal to the total number of attached items in your Chargebee instance divided by 100, regardless of the actual number of `Attached Items` changed or synced. 
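As a rough illustration of that cost: a Chargebee site with 25,000 attached items would spend roughly 250 list calls on this stream every sync, even when none of those items changed.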
@@ -91,43 +92,43 @@ The Chargebee connector should not run into [Chargebee API](https://apidocs.char ### Troubleshooting -* Check out common troubleshooting issues for the Instagram source connector on our [Airbyte Forum](https://github.com/airbytehq/airbyte/discussions). +- Check out common troubleshooting issues for the Instagram source connector on our [Airbyte Forum](https://github.com/airbytehq/airbyte/discussions). ## Changelog -| Version | Date | Pull Request | Subject | -| :------ | :--------- | :------------------------------------------------------- | :-------------------------------------------------------------------------------------------------- | -| 0.5.0 | 2024-03-28 | [36518](https://github.com/airbytehq/airbyte/pull/36518) | Updates CDK to ^0, updates IncrementalSingleSliceCursor | -| 0.4.2 | 2024-03-14 | [36037](https://github.com/airbytehq/airbyte/pull/36037) | Adds fields: `coupon_constraints` to `coupon` stream, `billing_month` to `customer stream`, and `error_detail` to `transaction` stream schemas | -| 0.4.1 | 2024-03-13 | [35509](https://github.com/airbytehq/airbyte/pull/35509) | Updates CDK version to latest (0.67.1), updates `site_migration_detail` record filtering | -| 0.4.0 | 2024-02-12 | [34053](https://github.com/airbytehq/airbyte/pull/34053) | Add missing fields to and cleans up schemas, adds incremental support for `gift`, `site_migration_detail`, and `unbilled_charge` streams.` | -| 0.3.1 | 2024-02-12 | [35169](https://github.com/airbytehq/airbyte/pull/35169) | Manage dependencies with Poetry. | -| 0.3.0 | 2023-12-26 | [33696](https://github.com/airbytehq/airbyte/pull/33696) | Add new stream, add fields to existing streams | -| 0.2.6 | 2023-12-19 | [32100](https://github.com/airbytehq/airbyte/pull/32100) | Add new fields in streams | -| 0.2.5 | 2023-10-19 | [31599](https://github.com/airbytehq/airbyte/pull/31599) | Base image migration: remove Dockerfile and use the python-connector-base image | -| 0.2.4 | 2023-08-01 | [28905](https://github.com/airbytehq/airbyte/pull/28905) | Updated the connector to use latest CDK version | -| 0.2.3 | 2023-03-22 | [24370](https://github.com/airbytehq/airbyte/pull/24370) | Ignore 404 errors for `Contact` stream | -| 0.2.2 | 2023-02-17 | [21688](https://github.com/airbytehq/airbyte/pull/21688) | Migrate to CDK beta 0.29; fix schemas | -| 0.2.1 | 2023-02-17 | [23207](https://github.com/airbytehq/airbyte/pull/23207) | Edited stream schemas to get rid of unnecessary `enum` | -| 0.2.0 | 2023-01-21 | [21688](https://github.com/airbytehq/airbyte/pull/21688) | Migrate to YAML; add new streams | -| 0.1.16 | 2022-10-06 | [17661](https://github.com/airbytehq/airbyte/pull/17661) | Make `transaction` stream to be consistent with `S3` by using type transformer | -| 0.1.15 | 2022-09-28 | [17304](https://github.com/airbytehq/airbyte/pull/17304) | Migrate to per-stream state. 
| -| 0.1.14 | 2022-09-23 | [17056](https://github.com/airbytehq/airbyte/pull/17056) | Add "custom fields" to the relevant Chargebee source data streams | -| 0.1.13 | 2022-08-18 | [15743](https://github.com/airbytehq/airbyte/pull/15743) | Fix transaction `exchange_rate` field type | -| 0.1.12 | 2022-07-13 | [14672](https://github.com/airbytehq/airbyte/pull/14672) | Fix transaction sort by | -| 0.1.11 | 2022-03-03 | [10827](https://github.com/airbytehq/airbyte/pull/10827) | Fix Credit Note stream | -| 0.1.10 | 2022-03-02 | [10795](https://github.com/airbytehq/airbyte/pull/10795) | Add support for Credit Note stream | -| 0.1.9 | 2022-0224 | [10312](https://github.com/airbytehq/airbyte/pull/10312) | Add support for Transaction Stream | -| 0.1.8 | 2022-02-22 | [10366](https://github.com/airbytehq/airbyte/pull/10366) | Fix broken `coupon` stream + add unit tests | -| 0.1.7 | 2022-02-14 | [10269](https://github.com/airbytehq/airbyte/pull/10269) | Add support for Coupon stream | -| 0.1.6 | 2022-02-10 | [10143](https://github.com/airbytehq/airbyte/pull/10143) | Add support for Event stream | -| 0.1.5 | 2021-12-23 | [8434](https://github.com/airbytehq/airbyte/pull/8434) | Update fields in source-connectors specifications | -| 0.1.4 | 2021-09-27 | [6454](https://github.com/airbytehq/airbyte/pull/6454) | Fix examples in spec file | -| 0.1.3 | 2021-08-17 | [5421](https://github.com/airbytehq/airbyte/pull/5421) | Add support for "Product Catalog 2.0" specific streams: `Items`, `Item prices` and `Attached Items` | -| 0.1.2 | 2021-07-30 | [5067](https://github.com/airbytehq/airbyte/pull/5067) | Prepare connector for publishing | -| 0.1.1 | 2021-07-07 | [4539](https://github.com/airbytehq/airbyte/pull/4539) | Add entrypoint and bump version for connector | -| 0.1.0 | 2021-06-30 | [3410](https://github.com/airbytehq/airbyte/pull/3410) | New Source: Chargebee | +| Version | Date | Pull Request | Subject | +| :------ | :--------- | :------------------------------------------------------- | :--------------------------------------------------------------------------------------------------------------------------------------------- | +| 0.5.0 | 2024-03-28 | [36518](https://github.com/airbytehq/airbyte/pull/36518) | Updates CDK to ^0, updates IncrementalSingleSliceCursor | +| 0.4.2 | 2024-03-14 | [36037](https://github.com/airbytehq/airbyte/pull/36037) | Adds fields: `coupon_constraints` to `coupon` stream, `billing_month` to `customer stream`, and `error_detail` to `transaction` stream schemas | +| 0.4.1 | 2024-03-13 | [35509](https://github.com/airbytehq/airbyte/pull/35509) | Updates CDK version to latest (0.67.1), updates `site_migration_detail` record filtering | +| 0.4.0 | 2024-02-12 | [34053](https://github.com/airbytehq/airbyte/pull/34053) | Add missing fields to and cleans up schemas, adds incremental support for `gift`, `site_migration_detail`, and `unbilled_charge` streams.` | +| 0.3.1 | 2024-02-12 | [35169](https://github.com/airbytehq/airbyte/pull/35169) | Manage dependencies with Poetry. 
| +| 0.3.0 | 2023-12-26 | [33696](https://github.com/airbytehq/airbyte/pull/33696) | Add new stream, add fields to existing streams | +| 0.2.6 | 2023-12-19 | [32100](https://github.com/airbytehq/airbyte/pull/32100) | Add new fields in streams | +| 0.2.5 | 2023-10-19 | [31599](https://github.com/airbytehq/airbyte/pull/31599) | Base image migration: remove Dockerfile and use the python-connector-base image | +| 0.2.4 | 2023-08-01 | [28905](https://github.com/airbytehq/airbyte/pull/28905) | Updated the connector to use latest CDK version | +| 0.2.3 | 2023-03-22 | [24370](https://github.com/airbytehq/airbyte/pull/24370) | Ignore 404 errors for `Contact` stream | +| 0.2.2 | 2023-02-17 | [21688](https://github.com/airbytehq/airbyte/pull/21688) | Migrate to CDK beta 0.29; fix schemas | +| 0.2.1 | 2023-02-17 | [23207](https://github.com/airbytehq/airbyte/pull/23207) | Edited stream schemas to get rid of unnecessary `enum` | +| 0.2.0 | 2023-01-21 | [21688](https://github.com/airbytehq/airbyte/pull/21688) | Migrate to YAML; add new streams | +| 0.1.16 | 2022-10-06 | [17661](https://github.com/airbytehq/airbyte/pull/17661) | Make `transaction` stream to be consistent with `S3` by using type transformer | +| 0.1.15 | 2022-09-28 | [17304](https://github.com/airbytehq/airbyte/pull/17304) | Migrate to per-stream state. | +| 0.1.14 | 2022-09-23 | [17056](https://github.com/airbytehq/airbyte/pull/17056) | Add "custom fields" to the relevant Chargebee source data streams | +| 0.1.13 | 2022-08-18 | [15743](https://github.com/airbytehq/airbyte/pull/15743) | Fix transaction `exchange_rate` field type | +| 0.1.12 | 2022-07-13 | [14672](https://github.com/airbytehq/airbyte/pull/14672) | Fix transaction sort by | +| 0.1.11 | 2022-03-03 | [10827](https://github.com/airbytehq/airbyte/pull/10827) | Fix Credit Note stream | +| 0.1.10 | 2022-03-02 | [10795](https://github.com/airbytehq/airbyte/pull/10795) | Add support for Credit Note stream | +| 0.1.9 | 2022-0224 | [10312](https://github.com/airbytehq/airbyte/pull/10312) | Add support for Transaction Stream | +| 0.1.8 | 2022-02-22 | [10366](https://github.com/airbytehq/airbyte/pull/10366) | Fix broken `coupon` stream + add unit tests | +| 0.1.7 | 2022-02-14 | [10269](https://github.com/airbytehq/airbyte/pull/10269) | Add support for Coupon stream | +| 0.1.6 | 2022-02-10 | [10143](https://github.com/airbytehq/airbyte/pull/10143) | Add support for Event stream | +| 0.1.5 | 2021-12-23 | [8434](https://github.com/airbytehq/airbyte/pull/8434) | Update fields in source-connectors specifications | +| 0.1.4 | 2021-09-27 | [6454](https://github.com/airbytehq/airbyte/pull/6454) | Fix examples in spec file | +| 0.1.3 | 2021-08-17 | [5421](https://github.com/airbytehq/airbyte/pull/5421) | Add support for "Product Catalog 2.0" specific streams: `Items`, `Item prices` and `Attached Items` | +| 0.1.2 | 2021-07-30 | [5067](https://github.com/airbytehq/airbyte/pull/5067) | Prepare connector for publishing | +| 0.1.1 | 2021-07-07 | [4539](https://github.com/airbytehq/airbyte/pull/4539) | Add entrypoint and bump version for connector | +| 0.1.0 | 2021-06-30 | [3410](https://github.com/airbytehq/airbyte/pull/3410) | New Source: Chargebee | diff --git a/docs/integrations/sources/chargify.md b/docs/integrations/sources/chargify.md index 42ea9fb9bed..c4c64a8d5ca 100644 --- a/docs/integrations/sources/chargify.md +++ b/docs/integrations/sources/chargify.md @@ -40,9 +40,9 @@ Please follow the [Chargify documentation for generating an API key](https://dev ## Changelog -| Version | Date | Pull 
Request | Subject | -| :------ | :--------- | :------------------------------------------------------- | :----------------------------- | +| Version | Date | Pull Request | Subject | +| :------ | :--------- | :------------------------------------------------------- | :------------------------------------------ | | 0.4.0 | 2023-10-16 | [31116](https://github.com/airbytehq/airbyte/pull/31116) | Add Coupons, Transactions, Invoices Streams | -| 0.3.0 | 2023-08-10 | [29130](https://github.com/airbytehq/airbyte/pull/29130) | Migrate Python CDK to Low Code | -| 0.2.0 | 2023-08-08 | [29218](https://github.com/airbytehq/airbyte/pull/29218) | Fix schema | -| 0.1.0 | 2022-03-16 | [10853](https://github.com/airbytehq/airbyte/pull/10853) | Initial release | +| 0.3.0 | 2023-08-10 | [29130](https://github.com/airbytehq/airbyte/pull/29130) | Migrate Python CDK to Low Code | +| 0.2.0 | 2023-08-08 | [29218](https://github.com/airbytehq/airbyte/pull/29218) | Fix schema | +| 0.1.0 | 2022-03-16 | [10853](https://github.com/airbytehq/airbyte/pull/10853) | Initial release | diff --git a/docs/integrations/sources/chartmogul.md b/docs/integrations/sources/chartmogul.md index f92aad3361f..660e652f5bc 100644 --- a/docs/integrations/sources/chartmogul.md +++ b/docs/integrations/sources/chartmogul.md @@ -1,12 +1,16 @@ # Chartmogul + This page contains the setup guide and reference information for the [Chartmogul](https://chartmogul.com/) source connector. ## Prerequisites + - A Chartmogul API Key. - A desired start date from which to begin replicating data. ## Setup guide + ### Step 1: Set up a Chartmogul API key + 1. Log in to your Chartmogul account. 2. In the left navbar, select **Profile** > **View Profile**. 3. Select **NEW API KEY**. @@ -15,10 +19,11 @@ This page contains the setup guide and reference information for the [Chartmogul 6. Click **ADD** to create the key. 7. Click the **Reveal** icon to see the key, and the **Copy** icon to copy it to your clipboard. -For further reading on Chartmogul API Key creation and maintenance, please refer to the official +For further reading on Chartmogul API Key creation and maintenance, please refer to the official [Chartmogul documentation](https://help.chartmogul.com/hc/en-us/articles/4407796325906-Creating-and-Managing-API-keys#creating-an-api-key). ### Step 2: Set up the Chartmogul connector in Airbyte + 1. [Log in to your Airbyte Cloud](https://cloud.airbyte.com/workspaces) account, or navigate to the Airbyte Open Source dashboard. 2. From the Airbyte UI, click **Sources**, then click on **+ New Source** and select **Chartmogul** from the list of available sources. 3. Enter a **Source name** of your choosing. @@ -35,19 +40,19 @@ The **Start date** will only apply to the `Activities` stream. 
The `Customers` e The Chartmogul source connector supports the following [sync modes](https://docs.airbyte.com/cloud/core-concepts#connection-sync-modes): -* [Full Refresh - Overwrite](https://docs.airbyte.com/understanding-airbyte/connections/full-refresh-overwrite) -* [Full Refresh - Append](https://docs.airbyte.com/understanding-airbyte/connections/full-refresh-append) +- [Full Refresh - Overwrite](https://docs.airbyte.com/understanding-airbyte/connections/full-refresh-overwrite) +- [Full Refresh - Append](https://docs.airbyte.com/understanding-airbyte/connections/full-refresh-append) ## Supported streams This connector outputs the following full refresh streams: -* [Activities](https://dev.chartmogul.com/reference/list-activities) -* [CustomerCountDaily](https://dev.chartmogul.com/reference/retrieve-customer-count) -* [CustomerCountWeekly](https://dev.chartmogul.com/reference/retrieve-customer-count) -* [CustomerCountMonthly](https://dev.chartmogul.com/reference/retrieve-customer-count) -* [CustomerCountQuarterly](https://dev.chartmogul.com/reference/retrieve-customer-count) -* [Customers](https://dev.chartmogul.com/reference/list-customers) +- [Activities](https://dev.chartmogul.com/reference/list-activities) +- [CustomerCountDaily](https://dev.chartmogul.com/reference/retrieve-customer-count) +- [CustomerCountWeekly](https://dev.chartmogul.com/reference/retrieve-customer-count) +- [CustomerCountMonthly](https://dev.chartmogul.com/reference/retrieve-customer-count) +- [CustomerCountQuarterly](https://dev.chartmogul.com/reference/retrieve-customer-count) +- [Customers](https://dev.chartmogul.com/reference/list-customers) ## Performance considerations @@ -55,10 +60,10 @@ The Chartmogul connector should not run into Chartmogul API limitations under no ## Changelog -| Version | Date | Pull Request | Subject | -| :--- | :--- | :--- | :--- | -| 1.0.0 | 2023-11-09 | [23075](https://github.com/airbytehq/airbyte/pull/23075) | Refactor CustomerCount stream into CustomerCountDaily, CustomerCountWeekly, CustomerCountMonthly, CustomerCountQuarterly Streams | -| 0.2.1 | 2023-02-15 | [23075](https://github.com/airbytehq/airbyte/pull/23075) | Specified date formatting in specification | -| 0.2.0 | 2022-11-15 | [19276](https://github.com/airbytehq/airbyte/pull/19276) | Migrate connector from Alpha (Python) to Beta (YAML) | -| 0.1.1 | 2022-03-02 | [10756](https://github.com/airbytehq/airbyte/pull/10756) | Add new stream: customer-count | -| 0.1.0 | 2022-01-10 | [9381](https://github.com/airbytehq/airbyte/pull/9381) | New Source: Chartmogul | +| Version | Date | Pull Request | Subject | +| :------ | :--------- | :------------------------------------------------------- | :------------------------------------------------------------------------------------------------------------------------------- | +| 1.0.0 | 2023-11-09 | [23075](https://github.com/airbytehq/airbyte/pull/23075) | Refactor CustomerCount stream into CustomerCountDaily, CustomerCountWeekly, CustomerCountMonthly, CustomerCountQuarterly Streams | +| 0.2.1 | 2023-02-15 | [23075](https://github.com/airbytehq/airbyte/pull/23075) | Specified date formatting in specification | +| 0.2.0 | 2022-11-15 | [19276](https://github.com/airbytehq/airbyte/pull/19276) | Migrate connector from Alpha (Python) to Beta (YAML) | +| 0.1.1 | 2022-03-02 | [10756](https://github.com/airbytehq/airbyte/pull/10756) | Add new stream: customer-count | +| 0.1.0 | 2022-01-10 | [9381](https://github.com/airbytehq/airbyte/pull/9381) | New Source: Chartmogul | diff --git 
a/docs/integrations/sources/clickhouse.md b/docs/integrations/sources/clickhouse.md index 2fa69ac5a2a..b2ecb27f5e1 100644 --- a/docs/integrations/sources/clickhouse.md +++ b/docs/integrations/sources/clickhouse.md @@ -12,15 +12,15 @@ The ClickHouse source does not alter the schema present in your warehouse. Depen ### Features -| Feature | Supported | Notes | -| :--- | :--- | :--- | -| Full Refresh Sync | Yes | | -| Incremental Sync | Yes | | -| Replicate Incremental Deletes | Coming soon | | -| Logical Replication \(WAL\) | Coming soon | | -| SSL Support | Yes | | -| SSH Tunnel Connection | Yes | | -| Namespaces | Yes | Enabled by default | +| Feature | Supported | Notes | +| :---------------------------- | :---------- | :----------------- | +| Full Refresh Sync | Yes | | +| Incremental Sync | Yes | | +| Replicate Incremental Deletes | Coming soon | | +| Logical Replication \(WAL\) | Coming soon | | +| SSL Support | Yes | | +| SSH Tunnel Connection | Yes | | +| Namespaces | Yes | Enabled by default | ## Getting started @@ -73,45 +73,43 @@ Using this feature requires additional configuration, when creating the source. 6. If you are using `Password Authentication`, then `SSH Login Username` should be set to the password of the User from the previous step. If you are using `SSH Key Authentication` leave this blank. Again, this is not the Clickhouse password, but the password for the OS-user that Airbyte is using to perform commands on the bastion. 7. If you are using `SSH Key Authentication`, then `SSH Private Key` should be set to the RSA Private Key that you are using to create the SSH connection. This should be the full contents of the key file starting with `-----BEGIN RSA PRIVATE KEY-----` and ending with `-----END RSA PRIVATE KEY-----`. - ## Changelog -| Version | Date | Pull Request | Subject | -|:--------| :--- |:---------------------------------------------------------|:----------------------------------------------------------------------------------------------------------| -| 0.2.2 | 2024-02-13 | [35235](https://github.com/airbytehq/airbyte/pull/35235) | Adopt CDK 0.20.4 | -| 0.2.1 | 2024-01-24 | [34453](https://github.com/airbytehq/airbyte/pull/34453) | bump CDK version | -| 0.1.17 | 2023-03-22 | [20760](https://github.com/airbytehq/airbyte/pull/20760) | Removed redundant date-time datatypes formatting | -| 0.1.16 |2023-03-06 | [23455](https://github.com/airbytehq/airbyte/pull/23455) | For network isolation, source connector accepts a list of hosts it is allowed to connect to | +| Version | Date | Pull Request | Subject | +| :------ | :--------- | :--------------------------------------------------------- | :-------------------------------------------------------------------------------------------------------- | +| 0.2.2 | 2024-02-13 | [35235](https://github.com/airbytehq/airbyte/pull/35235) | Adopt CDK 0.20.4 | +| 0.2.1 | 2024-01-24 | [34453](https://github.com/airbytehq/airbyte/pull/34453) | bump CDK version | +| 0.1.17 | 2023-03-22 | [20760](https://github.com/airbytehq/airbyte/pull/20760) | Removed redundant date-time datatypes formatting | +| 0.1.16 | 2023-03-06 | [23455](https://github.com/airbytehq/airbyte/pull/23455) | For network isolation, source connector accepts a list of hosts it is allowed to connect to | | 0.1.15 | 2022-12-14 | [20436](https://github.com/airbytehq/airbyte/pull/20346) | Consolidate date/time values mapping for JDBC sources | -| 0.1.14 | 2022-09-27 | [17031](https://github.com/airbytehq/airbyte/pull/17031) | Added custom jdbc url parameters field | 
-| 0.1.13 | 2022-09-01 | [16238](https://github.com/airbytehq/airbyte/pull/16238) | Emit state messages more frequently | -| 0.1.12 | 2022-08-18 | [14356](https://github.com/airbytehq/airbyte/pull/14356) | DB Sources: only show a table can sync incrementally if at least one column can be used as a cursor field | -| 0.1.10 | 2022-04-12 | [11729](https://github.com/airbytehq/airbyte/pull/11514) | Bump mina-sshd from 2.7.0 to 2.8.0 | +| 0.1.14 | 2022-09-27 | [17031](https://github.com/airbytehq/airbyte/pull/17031) | Added custom jdbc url parameters field | +| 0.1.13 | 2022-09-01 | [16238](https://github.com/airbytehq/airbyte/pull/16238) | Emit state messages more frequently | +| 0.1.12 | 2022-08-18 | [14356](https://github.com/airbytehq/airbyte/pull/14356) | DB Sources: only show a table can sync incrementally if at least one column can be used as a cursor field | +| 0.1.10 | 2022-04-12 | [11729](https://github.com/airbytehq/airbyte/pull/11514) | Bump mina-sshd from 2.7.0 to 2.8.0 | | 0.1.9 | 2022-02-09 | [\#10214](https://github.com/airbytehq/airbyte/pull/10214) | Fix exception in case `password` field is not provided | -| 0.1.8 | 2022-02-14 | [10256](https://github.com/airbytehq/airbyte/pull/10256) | Add `-XX:+ExitOnOutOfMemoryError` JVM option | -| 0.1.7 | 2021-12-24 | [\#8958](https://github.com/airbytehq/airbyte/pull/8958) | Add support for JdbcType.ARRAY | -| 0.1.6 | 2021-12-15 | [\#8429](https://github.com/airbytehq/airbyte/pull/8429) | Update titles and descriptions | -| 0.1.5 | 2021-12-01 | [\#8371](https://github.com/airbytehq/airbyte/pull/8371) | Fixed incorrect handling "\n" in ssh key | -| 0.1.4 | 20.10.2021 | [\#7327](https://github.com/airbytehq/airbyte/pull/7327) | Added support for connection via SSH tunnel(aka Bastion server). | -| 0.1.3 | 20.10.2021 | [\#7127](https://github.com/airbytehq/airbyte/pull/7127) | Added SSL connections support. | -| 0.1.2 | 13.08.2021 | [\#4699](https://github.com/airbytehq/airbyte/pull/4699) | Added json config validator. | - +| 0.1.8 | 2022-02-14 | [10256](https://github.com/airbytehq/airbyte/pull/10256) | Add `-XX:+ExitOnOutOfMemoryError` JVM option | +| 0.1.7 | 2021-12-24 | [\#8958](https://github.com/airbytehq/airbyte/pull/8958) | Add support for JdbcType.ARRAY | +| 0.1.6 | 2021-12-15 | [\#8429](https://github.com/airbytehq/airbyte/pull/8429) | Update titles and descriptions | +| 0.1.5 | 2021-12-01 | [\#8371](https://github.com/airbytehq/airbyte/pull/8371) | Fixed incorrect handling "\n" in ssh key | +| 0.1.4 | 20.10.2021 | [\#7327](https://github.com/airbytehq/airbyte/pull/7327) | Added support for connection via SSH tunnel(aka Bastion server). | +| 0.1.3 | 20.10.2021 | [\#7127](https://github.com/airbytehq/airbyte/pull/7127) | Added SSL connections support. | +| 0.1.2 | 13.08.2021 | [\#4699](https://github.com/airbytehq/airbyte/pull/4699) | Added json config validator. 
| ## CHANGELOG source-clickhouse-strict-encrypt -| Version | Date | Pull Request | Subject | -|:--------|:-----------|:---------------------------------------------------------|:------------------------------------------------------------------------------------------------------------------------------------------| -| 0.2.0 | 2023-12-18 | [33485](https://github.com/airbytehq/airbyte/pull/33485) | Remove LEGACY state | -| 0.1.17 | 2022-03-22 | [20760](https://github.com/airbytehq/airbyte/pull/20760) | Removed redundant date-time datatypes formatting | -| 0.1.16 | 2023-03-06 | [23455](https://github.com/airbytehq/airbyte/pull/23455) | For network isolation, source connector accepts a list of hosts it is allowed to connect to | -| 0.1.15 | 2022-12-14 | [20436](https://github.com/airbytehq/airbyte/pull/20346) | Consolidate date/time values mapping for JDBC sources | -| | 2022-10-13 | [15535](https://github.com/airbytehq/airbyte/pull/16238) | Update incremental query to avoid data missing when new data is inserted at the same time as a sync starts under non-CDC incremental mode | -| 0.1.14 | 2022-09-27 | [17031](https://github.com/airbytehq/airbyte/pull/17031) | Added custom jdbc url parameters field | -| 0.1.13 | 2022-09-01 | [16238](https://github.com/airbytehq/airbyte/pull/16238) | Emit state messages more frequently | -| 0.1.9 | 2022-08-18 | [14356](https://github.com/airbytehq/airbyte/pull/14356) | DB Sources: only show a table can sync incrementally if at least one column can be used as a cursor field | -| 0.1.6 | 2022-02-09 | [\#10214](https://github.com/airbytehq/airbyte/pull/10214) | Fix exception in case `password` field is not provided | -| 0.1.5 | 2022-02-14 | [10256](https://github.com/airbytehq/airbyte/pull/10256) | Add `-XX:+ExitOnOutOfMemoryError` JVM option | +| Version | Date | Pull Request | Subject | +| :------ | :--------- | :---------------------------------------------------------------------------------------------------------------- | :---------------------------------------------------------------------------------------------------------------------------------------- | +| 0.2.0 | 2023-12-18 | [33485](https://github.com/airbytehq/airbyte/pull/33485) | Remove LEGACY state | +| 0.1.17 | 2022-03-22 | [20760](https://github.com/airbytehq/airbyte/pull/20760) | Removed redundant date-time datatypes formatting | +| 0.1.16 | 2023-03-06 | [23455](https://github.com/airbytehq/airbyte/pull/23455) | For network isolation, source connector accepts a list of hosts it is allowed to connect to | +| 0.1.15 | 2022-12-14 | [20436](https://github.com/airbytehq/airbyte/pull/20346) | Consolidate date/time values mapping for JDBC sources | +| | 2022-10-13 | [15535](https://github.com/airbytehq/airbyte/pull/16238) | Update incremental query to avoid data missing when new data is inserted at the same time as a sync starts under non-CDC incremental mode | +| 0.1.14 | 2022-09-27 | [17031](https://github.com/airbytehq/airbyte/pull/17031) | Added custom jdbc url parameters field | +| 0.1.13 | 2022-09-01 | [16238](https://github.com/airbytehq/airbyte/pull/16238) | Emit state messages more frequently | +| 0.1.9 | 2022-08-18 | [14356](https://github.com/airbytehq/airbyte/pull/14356) | DB Sources: only show a table can sync incrementally if at least one column can be used as a cursor field | +| 0.1.6 | 2022-02-09 | [\#10214](https://github.com/airbytehq/airbyte/pull/10214) | Fix exception in case `password` field is not provided | +| 0.1.5 | 2022-02-14 | 
[10256](https://github.com/airbytehq/airbyte/pull/10256) | Add `-XX:+ExitOnOutOfMemoryError` JVM option | | 0.1.3 | 2021-12-29 | [\#9182](https://github.com/airbytehq/airbyte/pull/9182) [\#8958](https://github.com/airbytehq/airbyte/pull/8958) | Add support for JdbcType.ARRAY. Fixed tests | -| 0.1.2 | 2021-12-01 | [\#8371](https://github.com/airbytehq/airbyte/pull/8371) | Fixed incorrect handling "\n" in ssh key | -| 0.1.1 | 20.10.2021 | [\#7327](https://github.com/airbytehq/airbyte/pull/7327) | Added support for connection via SSH tunnel(aka Bastion server). | -| 0.1.0 | 20.10.2021 | [\#7127](https://github.com/airbytehq/airbyte/pull/7127) | Added source-clickhouse-strict-encrypt that supports SSL connections only. | +| 0.1.2 | 2021-12-01 | [\#8371](https://github.com/airbytehq/airbyte/pull/8371) | Fixed incorrect handling "\n" in ssh key | +| 0.1.1 | 20.10.2021 | [\#7327](https://github.com/airbytehq/airbyte/pull/7327) | Added support for connection via SSH tunnel(aka Bastion server). | +| 0.1.0 | 20.10.2021 | [\#7127](https://github.com/airbytehq/airbyte/pull/7127) | Added source-clickhouse-strict-encrypt that supports SSL connections only. | diff --git a/docs/integrations/sources/clickup-api.md b/docs/integrations/sources/clickup-api.md index be23780c7db..1a126636248 100644 --- a/docs/integrations/sources/clickup-api.md +++ b/docs/integrations/sources/clickup-api.md @@ -4,25 +4,23 @@ This source can sync data from [ClickUp API](https://clickup.com/api/). Currently, this connector only supports full refresh syncs. That is, every time a sync is run, all the records are fetched from the source. - ### Output schema This source is capable of syncing the following streams: -* [`user`](https://clickup.com/api/clickupreference/operation/GetAuthorizedUser/) -* [`teams`](https://clickup.com/api/clickupreference/operation/GetAuthorizedTeams/) -* [`spaces`](https://clickup.com/api/clickupreference/operation/GetSpaces/) -* [`folders`](https://clickup.com/api/clickupreference/operation/GetFolders/) -* [`lists`](https://clickup.com/api/clickupreference/operation/GetLists/) -* [`tasks`](https://clickup.com/api/clickupreference/operation/GetTasks) - +- [`user`](https://clickup.com/api/clickupreference/operation/GetAuthorizedUser/) +- [`teams`](https://clickup.com/api/clickupreference/operation/GetAuthorizedTeams/) +- [`spaces`](https://clickup.com/api/clickupreference/operation/GetSpaces/) +- [`folders`](https://clickup.com/api/clickupreference/operation/GetFolders/) +- [`lists`](https://clickup.com/api/clickupreference/operation/GetLists/) +- [`tasks`](https://clickup.com/api/clickupreference/operation/GetTasks) ### Features -| Feature | Supported? \(Yes/No\) | Notes | -|:------------------|:----------------------|:--------------------------------------------------------| -| Full Refresh Sync | Yes | | -| Incremental Sync | No | | +| Feature | Supported? \(Yes/No\) | Notes | +| :---------------- | :-------------------- | :---- | +| Full Refresh Sync | Yes | | +| Incremental Sync | No | | ### Performance considerations @@ -38,23 +36,23 @@ The ClickUp API enforces request rate limits per token. The rate limits are depe The following fields are required fields for the connector to work: -* `api_token`: Your ClickUp API Token. +- `api_token`: Your ClickUp API Token. Here are some optional fields for different streams: -* `team_id`: Your team ID in your ClickUp workspace. It is required for `space` stream. +- `team_id`: Your team ID in your ClickUp workspace. It is required for `space` stream. 
-* `space_id`: Your space ID in your ClickUp workspace. It is required for `folder` stream. +- `space_id`: Your space ID in your ClickUp workspace. It is required for `folder` stream. -* `folder_id`: Your folder ID in your ClickUp space. It is required for `list` stream. +- `folder_id`: Your folder ID in your ClickUp space. It is required for `list` stream. -* `list_id`: Your list ID in your folder of space. It is required for `task` stream. +- `list_id`: Your list ID in your folder of space. It is required for `task` stream. -* `Include Closed Tasks`: Toggle to include or exclude closed tasks. By default, they are excluded. +- `Include Closed Tasks`: Toggle to include or exclude closed tasks. By default, they are excluded. ## Changelog -| Version | Date | Pull Request | Subject | -|:--------|:-----------|:-------------------------------------------------------------|:----------------------------------| -| 0.1.1 | 2023-02-10 | [23951](https://github.com/airbytehq/airbyte/pull/23951) | Add optional include Closed Tasks | -| 0.1.0 | 2022-11-07 | [17770](https://github.com/airbytehq/airbyte/pull/17770) | New source | \ No newline at end of file +| Version | Date | Pull Request | Subject | +| :------ | :--------- | :------------------------------------------------------- | :-------------------------------- | +| 0.1.1 | 2023-02-10 | [23951](https://github.com/airbytehq/airbyte/pull/23951) | Add optional include Closed Tasks | +| 0.1.0 | 2022-11-07 | [17770](https://github.com/airbytehq/airbyte/pull/17770) | New source | diff --git a/docs/integrations/sources/clockify.md b/docs/integrations/sources/clockify.md index eea81da8589..026372bc5e1 100644 --- a/docs/integrations/sources/clockify.md +++ b/docs/integrations/sources/clockify.md @@ -4,12 +4,12 @@ The Airbyte Source for [Clockify](https://clockify.me) ## Changelog -| Version | Date | Pull Request | Subject | -| :------ | :--------- | :------------------------------------------------------- | :-------------------------------------------------------- | -| 0.3.3 | 2024-04-19 | [37135](https://github.com/airbytehq/airbyte/pull/37135) | Upgrade to CDK 0.80.0 and manage dependencies with Poetry. | -| 0.3.2 | 2024-04-15 | [37135](https://github.com/airbytehq/airbyte/pull/37135) | Base image migration: remove Dockerfile and use the python-connector-base image | -| 0.3.1 | 2024-04-12 | [37135](https://github.com/airbytehq/airbyte/pull/37135) | schema descriptions | -| 0.3.0 | 2023-08-27 | [TBD](https://github.com/airbytehq/airbyte/pull/TBD) | ✨ Source Clockify: Migrate to LowCode CDK | -| 0.2.1 | 2023-08-01 | [27881](https://github.com/airbytehq/airbyte/pull/27881) | 🐛 Source Clockify: Source Clockify: Fix pagination logic | -| 0.2.0 | 2023-08-01 | [27689](https://github.com/airbytehq/airbyte/pull/27689) | ✨ Source Clockify: Add Optional API Url parameter | -| 0.1.0 | 2022-10-26 | [17767](https://github.com/airbytehq/airbyte/pull/17767) | 🎉 New Connector: Clockify [python cdk] | +| Version | Date | Pull Request | Subject | +| :------ | :--------- | :------------------------------------------------------- | :------------------------------------------------------------------------------ | +| 0.3.3 | 2024-04-19 | [37135](https://github.com/airbytehq/airbyte/pull/37135) | Upgrade to CDK 0.80.0 and manage dependencies with Poetry. 
| +| 0.3.2 | 2024-04-15 | [37135](https://github.com/airbytehq/airbyte/pull/37135) | Base image migration: remove Dockerfile and use the python-connector-base image | +| 0.3.1 | 2024-04-12 | [37135](https://github.com/airbytehq/airbyte/pull/37135) | schema descriptions | +| 0.3.0 | 2023-08-27 | [TBD](https://github.com/airbytehq/airbyte/pull/TBD) | ✨ Source Clockify: Migrate to LowCode CDK | +| 0.2.1 | 2023-08-01 | [27881](https://github.com/airbytehq/airbyte/pull/27881) | 🐛 Source Clockify: Fix pagination logic | +| 0.2.0 | 2023-08-01 | [27689](https://github.com/airbytehq/airbyte/pull/27689) | ✨ Source Clockify: Add Optional API Url parameter | +| 0.1.0 | 2022-10-26 | [17767](https://github.com/airbytehq/airbyte/pull/17767) | 🎉 New Connector: Clockify [python cdk] | diff --git a/docs/integrations/sources/close-com.md b/docs/integrations/sources/close-com.md index d152f845fe9..406cb339de9 100644 --- a/docs/integrations/sources/close-com.md +++ b/docs/integrations/sources/close-com.md @@ -4,13 +4,14 @@ This page contains the setup guide and reference information for the [Close.com] ## Prerequisites -* Close.com API Key +- Close.com API Key We recommend creating a restricted key specifically for Airbyte access. This will allow you to control which resources Airbyte should be able to access. For ease of use, we recommend using read permissions for all resources and configuring which resource to replicate in the Airbyte UI. ## Setup guide ### Step 1: Set up your Close.com API Key + 1. [Log in to your Close.com](https://www.close.com) account. 2. At the bottom of the left navbar, select **Settings**. 3. In the left menu, select **Developer**. @@ -20,7 +21,7 @@ We recommend creating a restricted key specifically for Airbyte access. This wil For security purposes, the API Key will only be displayed once upon creation. Be sure to copy and store the key in a secure location. ::: -For further reading on creating and maintaining Close.com API keys, refer to the +For further reading on creating and maintaining Close.com API keys, refer to the [official documentation](https://help.close.com/docs/api-keys-oauth). ### Step 2: Set up the Close.com connector in Airbyte @@ -29,7 +30,7 @@ For further reading on creating and maintaining Close.com API keys, refer to the 2. From the Airbyte UI, click **Sources**, then click on **+ New Source** and select **Close.com** from the list of available sources. 3. Enter a **Source name** of your choosing. 4. In the **API Key** field, enter your Close.com **API Key** -5. *Optional* - In the **Replication Start Date** field, you may enter a starting date cutoff for the data you want to replicate. The format for this date should be as such: `YYYY-MM-DD`. Leaving this field blank will replicate all data. +5. _Optional_ - In the **Replication Start Date** field, you may enter a starting date cutoff for the data you want to replicate. The format for this date should be as such: `YYYY-MM-DD`. Leaving this field blank will replicate all data. 6. Click **Set up source** and wait for the tests to complete. ## Supported sync modes @@ -40,51 +41,51 @@ The Close.com source supports both **Full Refresh** and **Incremental** syncs.
Y This source is capable of syncing the following core streams: -* [Leads](https://developer.close.com/#leads) \(Incremental\) -* [Created Activities](https://developer.close.com/#activities-list-or-filter-all-created-activities) \(Incremental\) -* [Opportunity Status Change Activities](https://developer.close.com/#activities-list-or-filter-all-opportunitystatuschange-activities) \(Incremental\) -* [Note Activities](https://developer.close.com/#activities-list-or-filter-all-note-activities) \(Incremental\) -* [Meeting Activities](https://developer.close.com/#activities-list-or-filter-all-meeting-activities) \(Incremental\) -* [Call Activities](https://developer.close.com/#activities-list-or-filter-all-call-activities) \(Incremental\) -* [Email Activities](https://developer.close.com/#activities-list-or-filter-all-email-activities) \(Incremental\) -* [Email Thread Activities](https://developer.close.com/#activities-list-or-filter-all-emailthread-activities) \(Incremental\) -* [Lead Status Change Activities](https://developer.close.com/#activities-list-or-filter-all-leadstatuschange-activities) \(Incremental\) -* [SMS Activities](https://developer.close.com/#activities-list-or-filter-all-sms-activities) \(Incremental\) -* [Task Completed Activities](https://developer.close.com/#activities-list-or-filter-all-taskcompleted-activities) \(Incremental\) -* [Lead Tasks](https://developer.close.com/#tasks) \(Incremental\) -* [Incoming Email Tasks](https://developer.close.com/#tasks) \(Incremental\) -* [Email Followup Tasks](https://developer.close.com/#tasks) \(Incremental\) -* [Missed Call Tasks](https://developer.close.com/#tasks) \(Incremental\) -* [Answered Detached Call Tasks](https://developer.close.com/#tasks) \(Incremental\) -* [Voicemail Tasks](https://developer.close.com/#tasks) \(Incremental\) -* [Opportunity Due Tasks](https://developer.close.com/#tasks) \(Incremental\) -* [Incoming SMS Tasks](https://developer.close.com/#tasks) \(Incremental\) -* [Events](https://developer.close.com/#event-log) \(Incremental\) -* [Lead Custom Fields](https://developer.close.com/#custom-fields-list-all-the-lead-custom-fields-for-your-organization) -* [Contact Custom Fields](https://developer.close.com/#custom-fields-list-all-the-contact-custom-fields-for-your-organization) -* [Opportunity Custom Fields](https://developer.close.com/#custom-fields-list-all-the-opportunity-custom-fields-for-your-organization) -* [Activity Custom Fields](https://developer.close.com/#custom-fields-list-all-the-activity-custom-fields-for-your-organization) -* [Users](https://developer.close.com/#users) -* [Contacts](https://developer.close.com/#contacts) -* [Opportunities](https://developer.close.com/#opportunities) \(Incremental\) -* [Roles](https://developer.close.com/#roles) -* [Lead Statuses](https://developer.close.com/#lead-statuses) -* [Opportunity Statuses](https://developer.close.com/#opportunity-statuses) -* [Pipelines](https://developer.close.com/#pipelines) -* [Email Templates](https://developer.close.com/#email-templates) -* [Google Connected Accounts](https://developer.close.com/#connected-accounts) -* [Custom Email Connected Accounts](https://developer.close.com/#connected-accounts) -* [Zoom Connected Accounts](https://developer.close.com/#connected-accounts) -* [Send As](https://developer.close.com/#send-as) -* [Email Sequences](https://developer.close.com/#email-sequences) -* [Dialer](https://developer.close.com/#dialer) -* [Smart Views](https://developer.close.com/#smart-views) -* [Email Bulk 
Actions](https://developer.close.com/#bulk-actions-list-bulk-emails) -* [Sequence Subscription Bulk Actions](https://developer.close.com/#bulk-actions-list-bulk-sequence-subscriptions) -* [Delete Bulk Actions](https://developer.close.com/#bulk-actions-list-bulk-deletes) -* [Edit Bulk Actions](https://developer.close.com/#bulk-actions-list-bulk-edits) -* [Integration Links](https://developer.close.com/#integration-links) -* [Custom Activities](https://developer.close.com/#custom-activities) +- [Leads](https://developer.close.com/#leads) \(Incremental\) +- [Created Activities](https://developer.close.com/#activities-list-or-filter-all-created-activities) \(Incremental\) +- [Opportunity Status Change Activities](https://developer.close.com/#activities-list-or-filter-all-opportunitystatuschange-activities) \(Incremental\) +- [Note Activities](https://developer.close.com/#activities-list-or-filter-all-note-activities) \(Incremental\) +- [Meeting Activities](https://developer.close.com/#activities-list-or-filter-all-meeting-activities) \(Incremental\) +- [Call Activities](https://developer.close.com/#activities-list-or-filter-all-call-activities) \(Incremental\) +- [Email Activities](https://developer.close.com/#activities-list-or-filter-all-email-activities) \(Incremental\) +- [Email Thread Activities](https://developer.close.com/#activities-list-or-filter-all-emailthread-activities) \(Incremental\) +- [Lead Status Change Activities](https://developer.close.com/#activities-list-or-filter-all-leadstatuschange-activities) \(Incremental\) +- [SMS Activities](https://developer.close.com/#activities-list-or-filter-all-sms-activities) \(Incremental\) +- [Task Completed Activities](https://developer.close.com/#activities-list-or-filter-all-taskcompleted-activities) \(Incremental\) +- [Lead Tasks](https://developer.close.com/#tasks) \(Incremental\) +- [Incoming Email Tasks](https://developer.close.com/#tasks) \(Incremental\) +- [Email Followup Tasks](https://developer.close.com/#tasks) \(Incremental\) +- [Missed Call Tasks](https://developer.close.com/#tasks) \(Incremental\) +- [Answered Detached Call Tasks](https://developer.close.com/#tasks) \(Incremental\) +- [Voicemail Tasks](https://developer.close.com/#tasks) \(Incremental\) +- [Opportunity Due Tasks](https://developer.close.com/#tasks) \(Incremental\) +- [Incoming SMS Tasks](https://developer.close.com/#tasks) \(Incremental\) +- [Events](https://developer.close.com/#event-log) \(Incremental\) +- [Lead Custom Fields](https://developer.close.com/#custom-fields-list-all-the-lead-custom-fields-for-your-organization) +- [Contact Custom Fields](https://developer.close.com/#custom-fields-list-all-the-contact-custom-fields-for-your-organization) +- [Opportunity Custom Fields](https://developer.close.com/#custom-fields-list-all-the-opportunity-custom-fields-for-your-organization) +- [Activity Custom Fields](https://developer.close.com/#custom-fields-list-all-the-activity-custom-fields-for-your-organization) +- [Users](https://developer.close.com/#users) +- [Contacts](https://developer.close.com/#contacts) +- [Opportunities](https://developer.close.com/#opportunities) \(Incremental\) +- [Roles](https://developer.close.com/#roles) +- [Lead Statuses](https://developer.close.com/#lead-statuses) +- [Opportunity Statuses](https://developer.close.com/#opportunity-statuses) +- [Pipelines](https://developer.close.com/#pipelines) +- [Email Templates](https://developer.close.com/#email-templates) +- [Google Connected 
Accounts](https://developer.close.com/#connected-accounts) +- [Custom Email Connected Accounts](https://developer.close.com/#connected-accounts) +- [Zoom Connected Accounts](https://developer.close.com/#connected-accounts) +- [Send As](https://developer.close.com/#send-as) +- [Email Sequences](https://developer.close.com/#email-sequences) +- [Dialer](https://developer.close.com/#dialer) +- [Smart Views](https://developer.close.com/#smart-views) +- [Email Bulk Actions](https://developer.close.com/#bulk-actions-list-bulk-emails) +- [Sequence Subscription Bulk Actions](https://developer.close.com/#bulk-actions-list-bulk-sequence-subscriptions) +- [Delete Bulk Actions](https://developer.close.com/#bulk-actions-list-bulk-deletes) +- [Edit Bulk Actions](https://developer.close.com/#bulk-actions-list-bulk-edits) +- [Integration Links](https://developer.close.com/#integration-links) +- [Custom Activities](https://developer.close.com/#custom-activities) ### Notes @@ -104,7 +105,7 @@ The Close.com connector is subject to rate limits. For more information on this ## Changelog | Version | Date | Pull Request | Subject | -|:--------|:-----------|:---------------------------------------------------------|:-------------------------------------------------------------------------------------------------------| +| :------ | :--------- | :------------------------------------------------------- | :----------------------------------------------------------------------------------------------------- | | 0.5.0 | 2023-11-30 | [32984](https://github.com/airbytehq/airbyte/pull/32984) | Add support for custom fields | | 0.4.3 | 2023-10-28 | [31534](https://github.com/airbytehq/airbyte/pull/31534) | Fixed Email Activities Stream Pagination | | 0.4.2 | 2023-08-08 | [29206](https://github.com/airbytehq/airbyte/pull/29206) | Fixed the issue with `DatePicker` format for `start date` | @@ -115,4 +116,3 @@ The Close.com connector is subject to rate limits. For more information on this | 0.2.1 | 2023-02-15 | [23074](https://github.com/airbytehq/airbyte/pull/23074) | Specified date formatting in specification | | 0.2.0 | 2022-11-04 | [18968](https://github.com/airbytehq/airbyte/pull/18968) | Migrate to Low-Code | | 0.1.0 | 2021-08-10 | [5366](https://github.com/airbytehq/airbyte/pull/5366) | Initial release of Close.com connector for Airbyte | - diff --git a/docs/integrations/sources/cockroachdb.md b/docs/integrations/sources/cockroachdb.md index 689f2b0ae81..c3ce4442dc8 100644 --- a/docs/integrations/sources/cockroachdb.md +++ b/docs/integrations/sources/cockroachdb.md @@ -12,39 +12,39 @@ The CockroachDb source does not alter the schema present in your database. 
Depen CockroachDb data types are mapped to the following data types when synchronizing data: -| CockroachDb Type | Resulting Type | Notes | -| :--- | :--- | :--- | -| `bigint` | integer | | -| `bit` | boolean | | -| `boolean` | boolean | | -| `character` | string | | -| `character varying` | string | | -| `date` | string | | -| `double precision` | string | | -| `enum` | number | | -| `inet` | string | | -| `int` | integer | | -| `json` | string | | -| `jsonb` | string | | -| `numeric` | number | | -| `smallint` | integer | | -| `text` | string | | -| `time with timezone` | string | may be written as a native date type depending on the destination | -| `time without timezone` | string | may be written as a native date type depending on the destination | -| `timestamp with timezone` | string | may be written as a native date type depending on the destination | -| `timestamp without timezone` | string | may be written as a native date type depending on the destination | -| `uuid` | string | | +| CockroachDb Type | Resulting Type | Notes | +| :--------------------------- | :------------- | :---------------------------------------------------------------- | +| `bigint` | integer | | +| `bit` | boolean | | +| `boolean` | boolean | | +| `character` | string | | +| `character varying` | string | | +| `date` | string | | +| `double precision` | string | | +| `enum` | number | | +| `inet` | string | | +| `int` | integer | | +| `json` | string | | +| `jsonb` | string | | +| `numeric` | number | | +| `smallint` | integer | | +| `text` | string | | +| `time with timezone` | string | may be written as a native date type depending on the destination | +| `time without timezone` | string | may be written as a native date type depending on the destination | +| `timestamp with timezone` | string | may be written as a native date type depending on the destination | +| `timestamp without timezone` | string | may be written as a native date type depending on the destination | +| `uuid` | string | | **Note:** arrays for all the above types as well as custom types are supported, although they may be de-nested depending on the destination. ### Features -| Feature | Supported | Notes | -| :--- | :--- | :--- | -| Full Refresh Sync | Yes | | -| Incremental Sync | Yes | | -| Change Data Capture | No | | -| SSL Support | Yes | | +| Feature | Supported | Notes | +| :------------------ | :-------- | :---- | +| Full Refresh Sync | Yes | | +| Incremental Sync | Yes | | +| Change Data Capture | No | | +| SSL Support | Yes | | ## Getting started @@ -93,15 +93,15 @@ Your database user should now be ready for use with Airbyte. 
## Changelog -| Version | Date | Pull Request | Subject | -|:--------|:-----------| :--- |:------------------------------------------------------------------------------------------------------------------------------------------| -| 0.2.2 | 2024-02-13 | [35234](https://github.com/airbytehq/airbyte/pull/35234) | Adopt CDK 0.20.4 | +| Version | Date | Pull Request | Subject | +| :------ | :--------- | :------------------------------------------------------- | :---------------------------------------------------------------------------------------------------------------------------------------- | +| 0.2.2 | 2024-02-13 | [35234](https://github.com/airbytehq/airbyte/pull/35234) | Adopt CDK 0.20.4 | | 0.2.1 | 2024-01-24 | [34453](https://github.com/airbytehq/airbyte/pull/34453) | bump CDK version | | 0.2.0 | 2023-12-18 | [33485](https://github.com/airbytehq/airbyte/pull/33485) | Removed LEGACY state | | 0.1.22 | 2023-03-22 | [20760](https://github.com/airbytehq/airbyte/pull/20760) | Removed redundant date-time datatypes formatting | | 0.1.21 | 2023-03-14 | [24000](https://github.com/airbytehq/airbyte/pull/24000) | Removed check method call on read. | | 0.1.20 | 2023-03-06 | [23455](https://github.com/airbytehq/airbyte/pull/23455) | For network isolation, source connector accepts a list of hosts it is allowed to connect | -| 0.1.19 | 2022-12-14 | [20436](https://github.com/airbytehq/airbyte/pull/20346) | Consolidate date/time values mapping for JDBC sources | +| 0.1.19 | 2022-12-14 | [20436](https://github.com/airbytehq/airbyte/pull/20346) | Consolidate date/time values mapping for JDBC sources | | | 2022-10-13 | [15535](https://github.com/airbytehq/airbyte/pull/16238) | Update incremental query to avoid data missing when new data is inserted at the same time as a sync starts under non-CDC incremental mode | | 0.1.18 | 2022-09-01 | [16394](https://github.com/airbytehq/airbyte/pull/16394) | Added custom jdbc properties field | | 0.1.17 | 2022-09-01 | [16238](https://github.com/airbytehq/airbyte/pull/16238) | Emit state messages more frequently | @@ -113,8 +113,8 @@ Your database user should now be ready for use with Airbyte. | 0.1.9 | 2022-02-21 | [10242](https://github.com/airbytehq/airbyte/pull/10242) | Fixed cursor for old connectors that use non-microsecond format. 
Now connectors work with both formats | | 0.1.8 | 2022-02-18 | [10242](https://github.com/airbytehq/airbyte/pull/10242) | Updated timestamp transformation with microseconds | | 0.1.7 | 2022-02-14 | [10256](https://github.com/airbytehq/airbyte/pull/10256) | Add `-XX:+ExitOnOutOfMemoryError` JVM option | -| 0.1.6 | 2022-02-08 | [10173](https://github.com/airbytehq/airbyte/pull/10173) | Improved discovering tables in case if user does not have permissions to any table | -| 0.1.5 | 2021-12-24 | [9004](https://github.com/airbytehq/airbyte/pull/9004) | User can see only permmited tables during discovery | -| 0.1.4 | 2021-12-24 | [8958](https://github.com/airbytehq/airbyte/pull/8958) | Add support for JdbcType.ARRAY | -| 0.1.3 | 2021-10-10 | [7819](https://github.com/airbytehq/airbyte/pull/7819) | Fixed Datatype errors during Cockroach DB parsing | -| 0.1.2 | 2021-08-13 | [4699](https://github.com/airbytehq/airbyte/pull/4699) | Added json config validator | +| 0.1.6 | 2022-02-08 | [10173](https://github.com/airbytehq/airbyte/pull/10173) | Improved table discovery when the user does not have permissions to any table | +| 0.1.5 | 2021-12-24 | [9004](https://github.com/airbytehq/airbyte/pull/9004) | User can see only permitted tables during discovery | +| 0.1.4 | 2021-12-24 | [8958](https://github.com/airbytehq/airbyte/pull/8958) | Add support for JdbcType.ARRAY | +| 0.1.3 | 2021-10-10 | [7819](https://github.com/airbytehq/airbyte/pull/7819) | Fixed Datatype errors during Cockroach DB parsing | +| 0.1.2 | 2021-08-13 | [4699](https://github.com/airbytehq/airbyte/pull/4699) | Added json config validator | diff --git a/docs/integrations/sources/coda.md b/docs/integrations/sources/coda.md index 5ba1c005a74..4674535ea12 100755 --- a/docs/integrations/sources/coda.md +++ b/docs/integrations/sources/coda.md @@ -63,9 +63,9 @@ The Coda source connector supports the following [sync modes](https://docs.airby ## Changelog | Version | Date | Pull Request | Subject | -| :------ | :--------- | :------------------------------------------------------- |:------------------------------------------------------------------------------------------------------------| -| 1.2.1 | 2024-04-02 | [36775](https://github.com/airbytehq/airbyte/pull/36775) | Migrate to base image, manage dependencies with Poetry, and stop using last_records interpolation variable. | -| 1.2.0 | 2023-08-13 | [29288](https://github.com/airbytehq/airbyte/pull/29288) | Migrate python cdk to low-code | -| 1.1.0 | 2023-07-10 | [27797](https://github.com/airbytehq/airbyte/pull/27797) | Add `rows` stream | -| 1.0.0 | 2023-07-10 | [28093](https://github.com/airbytehq/airbyte/pull/28093) | Update `docs` and `pages` schemas | -| 0.1.0 | 2022-11-17 | [18675](https://github.com/airbytehq/airbyte/pull/18675) | 🎉 New source: Coda [python cdk] | +| :------ | :--------- | :------------------------------------------------------- | :---------------------------------------------------------------------------------------------------------- | +| 1.2.1 | 2024-04-02 | [36775](https://github.com/airbytehq/airbyte/pull/36775) | Migrate to base image, manage dependencies with Poetry, and stop using last_records interpolation variable.
| +| 1.2.0 | 2023-08-13 | [29288](https://github.com/airbytehq/airbyte/pull/29288) | Migrate python cdk to low-code | +| 1.1.0 | 2023-07-10 | [27797](https://github.com/airbytehq/airbyte/pull/27797) | Add `rows` stream | +| 1.0.0 | 2023-07-10 | [28093](https://github.com/airbytehq/airbyte/pull/28093) | Update `docs` and `pages` schemas | +| 0.1.0 | 2022-11-17 | [18675](https://github.com/airbytehq/airbyte/pull/18675) | 🎉 New source: Coda [python cdk] | diff --git a/docs/integrations/sources/coin-api.md b/docs/integrations/sources/coin-api.md index b6cdd71d6db..5ecc58e82ce 100644 --- a/docs/integrations/sources/coin-api.md +++ b/docs/integrations/sources/coin-api.md @@ -2,7 +2,7 @@ ## Sync overview -This source can sync OHLCV and trades historical data for a single coin listed on +This source can sync OHLCV and trades historical data for a single coin listed on [CoinAPI](https://www.coinapi.io/). It currently only supports Full Refresh syncs. @@ -16,7 +16,7 @@ This source is capable of syncing the following streams: ### Features | Feature | Supported? \(Yes/No\) | Notes | -|:------------------|:----------------------|:--------------------------------------------------------| +| :---------------- | :-------------------- | :------------------------------------------------------ | | Full Refresh Sync | Yes | | | Incremental Sync | No | | | API Environments | Yes | Both sandbox and production environments are supported. | @@ -31,7 +31,7 @@ may require a paid plan. ### Requirements 1. Obtain an API key from [CoinAPI](https://www.coinapi.io/). -2. Choose a symbol to pull data for. You can find a list of symbols [here](https://docs.coinapi.io/#list-all-symbols-get). +2. Choose a symbol to pull data for. You can find a list of symbols [here](https://docs.coinapi.io/#list-all-symbols-get). 3. Choose a time interval to pull data for. You can find a list of intervals [here](https://docs.coinapi.io/#list-all-periods-get). ### Setup guide @@ -48,12 +48,12 @@ The following fields are required fields for the connector to work: ## Changelog -| Version | Date | Pull Request | Subject | -|:--------|:-----------|:---------------------------------------------------------|:-----------| -| 0.2.4 | 2024-04-19 | [37138](https://github.com/airbytehq/airbyte/pull/37138) | Updating to 0.80.0 CDK | -| 0.2.3 | 2024-04-18 | [37138](https://github.com/airbytehq/airbyte/pull/37138) | Manage dependencies with Poetry. | -| 0.2.2 | 2024-04-15 | [37138](https://github.com/airbytehq/airbyte/pull/37138) | Base image migration: remove Dockerfile and use the python-connector-base image | -| 0.2.1 | 2024-04-12 | [37138](https://github.com/airbytehq/airbyte/pull/37138) | schema descriptions | -| 0.2.0 | 2024-02-05 | [#34826](https://github.com/airbytehq/airbyte/pull/34826) | Fix catalog types for fields `bid_price` and `bid_size` in stream `quotes_historical_data`. | -| 0.1.1 | 2022-12-19 | [#20600](https://github.com/airbytehq/airbyte/pull/20600) | Add quotes historical data stream| -| 0.1.0 | 2022-10-21 | [#18302](https://github.com/airbytehq/airbyte/pull/18302) | New source | +| Version | Date | Pull Request | Subject | +| :------ | :--------- | :-------------------------------------------------------- | :------------------------------------------------------------------------------------------ | +| 0.2.4 | 2024-04-19 | [37138](https://github.com/airbytehq/airbyte/pull/37138) | Updating to 0.80.0 CDK | +| 0.2.3 | 2024-04-18 | [37138](https://github.com/airbytehq/airbyte/pull/37138) | Manage dependencies with Poetry. 
| +| 0.2.2 | 2024-04-15 | [37138](https://github.com/airbytehq/airbyte/pull/37138) | Base image migration: remove Dockerfile and use the python-connector-base image | +| 0.2.1 | 2024-04-12 | [37138](https://github.com/airbytehq/airbyte/pull/37138) | schema descriptions | +| 0.2.0 | 2024-02-05 | [#34826](https://github.com/airbytehq/airbyte/pull/34826) | Fix catalog types for fields `bid_price` and `bid_size` in stream `quotes_historical_data`. | +| 0.1.1 | 2022-12-19 | [#20600](https://github.com/airbytehq/airbyte/pull/20600) | Add quotes historical data stream | +| 0.1.0 | 2022-10-21 | [#18302](https://github.com/airbytehq/airbyte/pull/18302) | New source | diff --git a/docs/integrations/sources/coingecko-coins.md b/docs/integrations/sources/coingecko-coins.md index ddf461b8f00..4d0dc706b9d 100644 --- a/docs/integrations/sources/coingecko-coins.md +++ b/docs/integrations/sources/coingecko-coins.md @@ -9,13 +9,13 @@ This source can sync market chart and historical data for a single coin listed o This source is capable of syncing the following streams: -* `market_chart` -* `history` +- `market_chart` +- `history` ### Features | Feature | Supported? \(Yes/No\) | Notes | -|:------------------|:----------------------|:-------------------------------------------------------| +| :---------------- | :-------------------- | :----------------------------------------------------- | | Full Refresh Sync | Yes | | | Incremental Sync | No | | | CoinGecko Pro API | Yes | Will default to free API unless an API key is provided | @@ -30,7 +30,6 @@ this [here](https://www.coingecko.com/en/branding). ## Getting started - ### Requirements 1. Choose a coin to pull data from. The coin must be listed on CoinGecko, and can be listed via the `/coins/list` endpoint. @@ -48,8 +47,7 @@ The following fields are required fields for the connector to work: ## Changelog -| Version | Date | Pull Request | Subject | -|:--------|:-----------|:---------------------------------------------------------|:-----------| +| Version | Date | Pull Request | Subject | +| :------ | :--------- | :------------------------------------------------------- | :-------------------------------------------- | | 0.1.1 | 2023-04-30 | [25558](https://github.com/airbytehq/airbyte/pull/25558) | Make manifest.yaml connector builder-friendly | -| 0.1.0 | 2022-10-20 | [18248](https://github.com/airbytehq/airbyte/pull/18248) | New source | - +| 0.1.0 | 2022-10-20 | [18248](https://github.com/airbytehq/airbyte/pull/18248) | New source | diff --git a/docs/integrations/sources/commcare.md b/docs/integrations/sources/commcare.md index 091bf496a40..e278913b613 100644 --- a/docs/integrations/sources/commcare.md +++ b/docs/integrations/sources/commcare.md @@ -35,6 +35,6 @@ The Commcare source connector supports the following streams: ## Changelog -| Version | Date | Pull Request | Subject | -|---------|------|--------------|---------| -| 0.1.0 | 2022-11-08 | [20220](https://github.com/airbytehq/airbyte/pull/20220) | Commcare Source Connector | +| Version | Date | Pull Request | Subject | +| ------- | ---------- | -------------------------------------------------------- | ------------------------- | +| 0.1.0 | 2022-11-08 | [20220](https://github.com/airbytehq/airbyte/pull/20220) | Commcare Source Connector | diff --git a/docs/integrations/sources/commercetools.md b/docs/integrations/sources/commercetools.md index a65bf4b7616..bd5b557842a 100644 --- a/docs/integrations/sources/commercetools.md +++ b/docs/integrations/sources/commercetools.md @@ -10,28 +10,28 @@ 
This source can sync data for the [Commercetools API](https://docs.commercetools This Source is capable of syncing the following core Streams: -* [Customers](https://docs.commercetools.com/api/projects/customers) -* [Orders](https://docs.commercetools.com/api/projects/orders) -* [Products](https://docs.commercetools.com/api/projects/products) -* [DiscountCodes](https://docs.commercetools.com/api/projects/discountCodes) -* [Payments](https://docs.commercetools.com/api/projects/payments) +- [Customers](https://docs.commercetools.com/api/projects/customers) +- [Orders](https://docs.commercetools.com/api/projects/orders) +- [Products](https://docs.commercetools.com/api/projects/products) +- [DiscountCodes](https://docs.commercetools.com/api/projects/discountCodes) +- [Payments](https://docs.commercetools.com/api/projects/payments) ### Data type mapping | Integration Type | Airbyte Type | Notes | -| :--- | :--- | :--- | -| `string` | `string` | | -| `number` | `number` | | -| `array` | `array` | | -| `object` | `object` | | +| :--------------- | :----------- | :---- | +| `string` | `string` | | +| `number` | `number` | | +| `array` | `array` | | +| `object` | `object` | | ### Features -| Feature | Supported?\(Yes/No\) | Notes | -| :--- | :--- | :--- | -| Full Refresh Sync | Yes | | -| Incremental - Append Sync | Yes | | -| Namespaces | No | | +| Feature | Supported?\(Yes/No\) | Notes | +| :------------------------ | :------------------- | :---- | +| Full Refresh Sync | Yes | | +| Incremental - Append Sync | Yes | | +| Namespaces | No | | ### Performance considerations @@ -41,15 +41,14 @@ Commercetools has some [rate limit restrictions](https://docs.commercetools.com/ 1. Create an API Client in the admin interface 2. Decide scopes for the API client. Airbyte only needs read-level access. - * Note: The UI will show all possible data sources and will show errors when syncing if it doesn't have permissions to access a resource. + - Note: The UI will show all possible data sources and will show errors when syncing if it doesn't have permissions to access a resource. 3. The `projectKey` of the store, the generated `client_id` and `client_secret` are required for the integration -5. You're ready to set up Commercetools in Airbyte! - +4. You're ready to set up Commercetools in Airbyte! ## Changelog -| Version | Date | Pull Request | Subject | -| :------ | :-------- | :----- | :------ | -| 0.2.0 | 2023-08-24 | [29384](https://github.com/airbytehq/airbyte/pull/29384) | Migrate to low code | -| 0.1.1 | 2023-08-23 | [5957](https://github.com/airbytehq/airbyte/pull/5957) | Fix schemas | -| 0.1.0 | 2021-08-19 | [5957](https://github.com/airbytehq/airbyte/pull/5957) | Initial Release. Source Commercetools | +| Version | Date | Pull Request | Subject | +| :------ | :--------- | :------------------------------------------------------- | :------------------------------------ | +| 0.2.0 | 2023-08-24 | [29384](https://github.com/airbytehq/airbyte/pull/29384) | Migrate to low code | +| 0.1.1 | 2023-08-23 | [5957](https://github.com/airbytehq/airbyte/pull/5957) | Fix schemas | +| 0.1.0 | 2021-08-19 | [5957](https://github.com/airbytehq/airbyte/pull/5957) | Initial Release. 
Source Commercetools | diff --git a/docs/integrations/sources/configcat.md b/docs/integrations/sources/configcat.md index 8459adef39b..097041b35e9 100644 --- a/docs/integrations/sources/configcat.md +++ b/docs/integrations/sources/configcat.md @@ -6,18 +6,18 @@ This source can sync data from the [Configcat API](https://api.configcat.com/doc ## This Source Supports the Following Streams -* organizations -* organization_members -* products -* tags -* environments +- organizations +- organization_members +- products +- tags +- environments ### Features -| Feature | Supported?\(Yes/No\) | Notes | -| :--- | :--- | :--- | -| Full Refresh Sync | Yes | | -| Incremental Sync | No | | +| Feature | Supported?\(Yes/No\) | Notes | +| :---------------- | :------------------- | :---- | +| Full Refresh Sync | Yes | | +| Incremental Sync | No | | ### Performance considerations @@ -27,11 +27,11 @@ Configcat APIs are under rate limits for the number of API calls allowed per API ### Requirements -* Username -* Password +- Username +- Password ## Changelog -| Version | Date | Pull Request | Subject | -| :------ | :--------- | :-------------------------------------------------------- | :----------------------------------------- | -| 0.1.0 | 2022-10-30 | [#18649](https://github.com/airbytehq/airbyte/pull/18649) | 🎉 New Source: Configcat API [low-code CDK] | \ No newline at end of file +| Version | Date | Pull Request | Subject | +| :------ | :--------- | :-------------------------------------------------------- | :------------------------------------------ | +| 0.1.0 | 2022-10-30 | [#18649](https://github.com/airbytehq/airbyte/pull/18649) | 🎉 New Source: Configcat API [low-code CDK] | diff --git a/docs/integrations/sources/confluence.md b/docs/integrations/sources/confluence.md index 7a36c1e80ed..708e8f17487 100644 --- a/docs/integrations/sources/confluence.md +++ b/docs/integrations/sources/confluence.md @@ -58,13 +58,13 @@ The Confluence connector should not run into Confluence API limitations under no ## Changelog -| Version | Date | Pull Request | Subject | -| :------ | :--------- | :------------------------------------------------------- | :------------------------------------------------------------- | -| 0.2.3 | 2024-04-19 | [37143](https://github.com/airbytehq/airbyte/pull/37143) | Upgrade to CDK 0.80.0 and manage dependencies with Poetry. 
| -| 0.2.2 | 2024-04-15 | [37143](https://github.com/airbytehq/airbyte/pull/37143) | Base image migration: remove Dockerfile and use the python-connector-base image | -| 0.2.1 | 2024-04-12 | [37143](https://github.com/airbytehq/airbyte/pull/37143) | schema descriptions | -| 0.2.0 | 2023-08-14 | [29125](https://github.com/airbytehq/airbyte/pull/29125) | Migrate Confluence Source Connector to Low Code | -| 0.1.3 | 2023-03-13 | [23988](https://github.com/airbytehq/airbyte/pull/23988) | Add view and storage to pages body, add check for stream Audit | -| 0.1.2 | 2023-03-06 | [23775](https://github.com/airbytehq/airbyte/pull/23775) | Set additionalProperties: true, update docs and spec | -| 0.1.1 | 2022-01-31 | [9831](https://github.com/airbytehq/airbyte/pull/9831) | Fix: Spec was not pushed to cache | -| 0.1.0 | 2021-11-05 | [7241](https://github.com/airbytehq/airbyte/pull/7241) | 🎉 New Source: Confluence | +| Version | Date | Pull Request | Subject | +| :------ | :--------- | :------------------------------------------------------- | :------------------------------------------------------------------------------ | +| 0.2.3 | 2024-04-19 | [37143](https://github.com/airbytehq/airbyte/pull/37143) | Upgrade to CDK 0.80.0 and manage dependencies with Poetry. | +| 0.2.2 | 2024-04-15 | [37143](https://github.com/airbytehq/airbyte/pull/37143) | Base image migration: remove Dockerfile and use the python-connector-base image | +| 0.2.1 | 2024-04-12 | [37143](https://github.com/airbytehq/airbyte/pull/37143) | schema descriptions | +| 0.2.0 | 2023-08-14 | [29125](https://github.com/airbytehq/airbyte/pull/29125) | Migrate Confluence Source Connector to Low Code | +| 0.1.3 | 2023-03-13 | [23988](https://github.com/airbytehq/airbyte/pull/23988) | Add view and storage to pages body, add check for stream Audit | +| 0.1.2 | 2023-03-06 | [23775](https://github.com/airbytehq/airbyte/pull/23775) | Set additionalProperties: true, update docs and spec | +| 0.1.1 | 2022-01-31 | [9831](https://github.com/airbytehq/airbyte/pull/9831) | Fix: Spec was not pushed to cache | +| 0.1.0 | 2021-11-05 | [7241](https://github.com/airbytehq/airbyte/pull/7241) | 🎉 New Source: Confluence | diff --git a/docs/integrations/sources/convertkit.md b/docs/integrations/sources/convertkit.md index 63d3cba8cd7..0c09f383d7a 100644 --- a/docs/integrations/sources/convertkit.md +++ b/docs/integrations/sources/convertkit.md @@ -6,18 +6,18 @@ This source can sync data from the [ConvertKit API](https://developers.convertki ## This Source Supports the Following Streams -* sequences -* subscribers -* broadcasts -* tags -* forms +- sequences +- subscribers +- broadcasts +- tags +- forms ### Features -| Feature | Supported?\(Yes/No\) | Notes | -| :--- | :--- | :--- | -| Full Refresh Sync | Yes | | -| Incremental Sync | No | | +| Feature | Supported?\(Yes/No\) | Notes | +| :---------------- | :------------------- | :---- | +| Full Refresh Sync | Yes | | +| Incremental Sync | No | | ### Performance considerations @@ -27,10 +27,10 @@ The connector has a rate limit of no more than 120 requests over a rolling 60 se ### Requirements -* ConvertKit API Secret +- ConvertKit API Secret ## Changelog -| Version | Date | Pull Request | Subject | -|:--------|:-----------| :----------- |:-----------------------------------------------------------| -| 0.1.0 | 2022-10-25 | [18455](https://github.com/airbytehq/airbyte/pull/18455) | Initial commit | \ No newline at end of file +| Version | Date | Pull Request | Subject | +| :------ | :--------- | 
:------------------------------------------------------- | :------------- | +| 0.1.0 | 2022-10-25 | [18455](https://github.com/airbytehq/airbyte/pull/18455) | Initial commit | diff --git a/docs/integrations/sources/copper.md b/docs/integrations/sources/copper.md index 4d7265010fb..7c2147b428e 100644 --- a/docs/integrations/sources/copper.md +++ b/docs/integrations/sources/copper.md @@ -39,12 +39,12 @@ The Copper source connector supports the following [sync modes](https://docs.air ## Changelog -| Version | Date | Pull Request | Subject | -| :------ | :--------- | :-------------------------------------------------------- | :---------------------------------- | -| 0.3.4 | 2024-04-19 | [37145](https://github.com/airbytehq/airbyte/pull/37145) | Updating to 0.80.0 CDK | -| 0.3.3 | 2024-04-18 | [37145](https://github.com/airbytehq/airbyte/pull/37145) | Manage dependencies with Poetry. | -| 0.3.2 | 2024-04-15 | [37145](https://github.com/airbytehq/airbyte/pull/37145) | Base image migration: remove Dockerfile and use the python-connector-base image | -| 0.3.1 | 2024-04-12 | [37145](https://github.com/airbytehq/airbyte/pull/37145) | schema descriptions | -| 0.3.0 | 2023-08-10 | [*****](https://github.com/airbytehq/airbyte/pull/*****) | Migrate to low code | -| 0.2.0 | 2023-04-17 | [24824](https://github.com/airbytehq/airbyte/pull/24824) | Add `opportunities` stream | -| 0.1.0 | 2022-11-17 | [18848](https://github.com/airbytehq/airbyte/pull/18848) | 🎉 New Source: Copper [python cdk] | +| Version | Date | Pull Request | Subject | +| :------ | :--------- | :-------------------------------------------------------- | :------------------------------------------------------------------------------ | +| 0.3.4 | 2024-04-19 | [37145](https://github.com/airbytehq/airbyte/pull/37145) | Updating to 0.80.0 CDK | +| 0.3.3 | 2024-04-18 | [37145](https://github.com/airbytehq/airbyte/pull/37145) | Manage dependencies with Poetry. | +| 0.3.2 | 2024-04-15 | [37145](https://github.com/airbytehq/airbyte/pull/37145) | Base image migration: remove Dockerfile and use the python-connector-base image | +| 0.3.1 | 2024-04-12 | [37145](https://github.com/airbytehq/airbyte/pull/37145) | schema descriptions | +| 0.3.0 | 2023-08-10 | [**\***](https://github.com/airbytehq/airbyte/pull/*****) | Migrate to low code | +| 0.2.0 | 2023-04-17 | [24824](https://github.com/airbytehq/airbyte/pull/24824) | Add `opportunities` stream | +| 0.1.0 | 2022-11-17 | [18848](https://github.com/airbytehq/airbyte/pull/18848) | 🎉 New Source: Copper [python cdk] | diff --git a/docs/integrations/sources/courier.md b/docs/integrations/sources/courier.md index 8f0b9ed55c3..055a36b7f10 100644 --- a/docs/integrations/sources/courier.md +++ b/docs/integrations/sources/courier.md @@ -4,7 +4,7 @@ ## Deprecation Notice -The Courier source connector is scheduled for deprecation on March 5th, 2024 due to incompatibility with upcoming platform updates as we prepare to launch Airbyte 1.0. This means it will no longer be supported or available for use in Airbyte. +The Courier source connector is scheduled for deprecation on March 5th, 2024 due to incompatibility with upcoming platform updates as we prepare to launch Airbyte 1.0. This means it will no longer be supported or available for use in Airbyte. This connector does not support new per-stream features which are vital for ensuring data integrity in Airbyte's synchronization processes. Without these capabilities, we cannot enforce our standards of reliability and correctness for data syncing operations. 
diff --git a/docs/integrations/sources/customer-io.md b/docs/integrations/sources/customer-io.md index 768332dd99a..88f7912593d 100644 --- a/docs/integrations/sources/customer-io.md +++ b/docs/integrations/sources/customer-io.md @@ -13,27 +13,27 @@ in the tables and columns you set up for replication, every time a sync is run. Several output streams are available from this source: -* [Campaigns](https://customer.io/docs/api/#operation/listCampaigns) \(Incremental\) -* [Campaign Actions](https://customer.io/docs/api/#operation/listCampaignActions) \(Incremental\) -* [Newsletters](https://customer.io/docs/api/#operation/listNewsletters) \(Incremental\) +- [Campaigns](https://customer.io/docs/api/#operation/listCampaigns) \(Incremental\) +- [Campaign Actions](https://customer.io/docs/api/#operation/listCampaignActions) \(Incremental\) +- [Newsletters](https://customer.io/docs/api/#operation/listNewsletters) \(Incremental\) If there are more endpoints you'd like Faros AI to support, please [create an issue.](https://github.com/faros-ai/airbyte-connectors/issues/new) ### Features -| Feature | Supported? | -| :--- | :--- | -| Full Refresh Sync | Yes | -| Incremental Sync | Yes | -| SSL connection | Yes | -| Namespaces | No | +| Feature | Supported? | +| :---------------- | :--------- | +| Full Refresh Sync | Yes | +| Incremental Sync | Yes | +| SSL connection | Yes | +| Namespaces | No | ### Performance considerations The Customer.io API is divided into three different hosts, each serving a different component of Customer.io. This source only uses the Beta API host, -which enforces a rate limit of 10 requests per second. Please [create an +which enforces a rate limit of 10 requests per second. Please [create an issue](https://github.com/faros-ai/airbyte-connectors/issues/new) if you see any rate limit issues. @@ -41,13 +41,13 @@ rate limit issues. ### Requirements -* Customer.io App API Key +- Customer.io App API Key Please follow the [their documentation for generating an App API Key](https://customer.io/docs/managing-credentials/). ## Changelog -| Version | Date | Pull Request | Subject | -| :-------- | :----------- | :------------------------------------------------------------- | :-------------------------------------------- | -| 0.2.0 | 2021-11-09 | [29385](https://github.com/airbytehq/airbyte/pull/29385) | Migrate TS CDK to Low code | -| 0.1.23 | 2021-11-09 | [126](https://github.com/faros-ai/airbyte-connectors/pull/126) | Add Customer.io source | +| Version | Date | Pull Request | Subject | +| :------ | :--------- | :------------------------------------------------------------- | :------------------------- | +| 0.2.0 | 2021-11-09 | [29385](https://github.com/airbytehq/airbyte/pull/29385) | Migrate TS CDK to Low code | +| 0.1.23 | 2021-11-09 | [126](https://github.com/faros-ai/airbyte-connectors/pull/126) | Add Customer.io source | diff --git a/docs/integrations/sources/datadog.md b/docs/integrations/sources/datadog.md index be4deee99bc..ab41e329a57 100644 --- a/docs/integrations/sources/datadog.md +++ b/docs/integrations/sources/datadog.md @@ -30,17 +30,17 @@ An API key is required as well as an API application key. See the [Datadog API a ### For Airbyte OSS: 1. Navigate to the Airbyte Open Source dashboard. -2. Set the name for your source. -4. Enter your `api_key` - Datadog API key. -5. Enter your `application_key` - Datadog application key. -6. Enter your `query` - Optional. Type your query to filter records when collecting data from Logs and AuditLogs stream. -7. 
Enter your `limit` - Number of records to collect per request. -8. Enter your `start_date` - Optional. Start date to filter records when collecting data from Logs and AuditLogs stream. -9. Enter your `end_date` - Optional. End date to filter records when collecting data from Logs and AuditLogs stream. -10. Enter your `queries` - Optional. Multiple queries resulting in multiple streams. - 1. Enter the `name`- Required. Query Name. - 2. Select the `data_source` - Required. Supported data sources - metrics, cloud_cost, logs, rum. - 3. Enter the `query`- Required. A classic query string. Example - `"kubernetes_state.node.count{*}"`, `"@type:resource @resource.status_code:>=400 @resource.type:(xhr OR fetch)"` +2. Set the name for your source. +3. Enter your `api_key` - Datadog API key. +4. Enter your `application_key` - Datadog application key. +5. Enter your `query` - Optional. Type your query to filter records when collecting data from Logs and AuditLogs stream. +6. Enter your `limit` - Number of records to collect per request. +7. Enter your `start_date` - Optional. Start date to filter records when collecting data from Logs and AuditLogs stream. +8. Enter your `end_date` - Optional. End date to filter records when collecting data from Logs and AuditLogs stream. +9. Enter your `queries` - Optional. Multiple queries resulting in multiple streams. + 1. Enter the `name`- Required. Query Name. + 2. Select the `data_source` - Required. Supported data sources - metrics, cloud_cost, logs, rum. + 3. Enter the `query`- Required. A classic query string. Example - `"kubernetes_state.node.count{*}"`, `"@type:resource @resource.status_code:>=400 @resource.type:(xhr OR fetch)"` 10. Click **Set up source**. ## Supported sync modes @@ -48,7 +48,7 @@ An API key is required as well as an API application key. See the [Datadog API a The Datadog source connector supports the following [sync modes](https://docs.airbyte.com/cloud/core-concepts#connection-sync-modes): | Feature | Supported? 
| -| :---------------- |:-----------| +| :---------------- | :--------- | | Full Refresh Sync | Yes | | Incremental Sync | Yes | | SSL connection | Yes | @@ -56,27 +56,27 @@ The Datadog source connector supports the following [sync modes](https://docs.ai ## Supported Streams -* [AuditLogs](https://docs.datadoghq.com/api/latest/audit/#search-audit-logs-events) -* [Dashboards](https://docs.datadoghq.com/api/latest/dashboards/#get-all-dashboards) -* [Downtimes](https://docs.datadoghq.com/api/latest/downtimes/#get-all-downtimes) -* [IncidentTeams](https://docs.datadoghq.com/api/latest/incident-teams/#get-a-list-of-all-incident-teams) -* [Incidents](https://docs.datadoghq.com/api/latest/incidents/#get-a-list-of-incidents) -* [Logs](https://docs.datadoghq.com/api/latest/logs/#search-logs) -* [Metrics](https://docs.datadoghq.com/api/latest/metrics/#get-a-list-of-metrics) -* [Monitors](https://docs.datadoghq.com/api/latest/monitors/#get-all-monitor-details) -* [ServiceLevelObjectives](https://docs.datadoghq.com/api/latest/service-level-objectives/#get-all-slos) -* [SyntheticTests](https://docs.datadoghq.com/api/latest/synthetics/#get-the-list-of-all-tests) -* [Users](https://docs.datadoghq.com/api/latest/users/#list-all-users) -* [Series](https://docs.datadoghq.com/api/latest/metrics/?code-lang=curl#query-timeseries-data-across-multiple-products) +- [AuditLogs](https://docs.datadoghq.com/api/latest/audit/#search-audit-logs-events) +- [Dashboards](https://docs.datadoghq.com/api/latest/dashboards/#get-all-dashboards) +- [Downtimes](https://docs.datadoghq.com/api/latest/downtimes/#get-all-downtimes) +- [IncidentTeams](https://docs.datadoghq.com/api/latest/incident-teams/#get-a-list-of-all-incident-teams) +- [Incidents](https://docs.datadoghq.com/api/latest/incidents/#get-a-list-of-incidents) +- [Logs](https://docs.datadoghq.com/api/latest/logs/#search-logs) +- [Metrics](https://docs.datadoghq.com/api/latest/metrics/#get-a-list-of-metrics) +- [Monitors](https://docs.datadoghq.com/api/latest/monitors/#get-all-monitor-details) +- [ServiceLevelObjectives](https://docs.datadoghq.com/api/latest/service-level-objectives/#get-all-slos) +- [SyntheticTests](https://docs.datadoghq.com/api/latest/synthetics/#get-the-list-of-all-tests) +- [Users](https://docs.datadoghq.com/api/latest/users/#list-all-users) +- [Series](https://docs.datadoghq.com/api/latest/metrics/?code-lang=curl#query-timeseries-data-across-multiple-products) ## Changelog -| Version | Date | Pull Request | Subject | -|:--------|:-----------|:----------------------------------------------------------|:-----------------------------------------------------------------------------| -| 0.4.0 | 2023-12-04 | [30999](https://github.com/airbytehq/airbyte/pull/30999) | Add `monitors` and `service_level_objectives` Streams | -| 0.3.0 | 2023-08-27 | [29885](https://github.com/airbytehq/airbyte/pull/29885) | Migrate to low code | -| 0.2.2 | 2023-07-10 | [28089](https://github.com/airbytehq/airbyte/pull/28089) | Additional stream and query details in response | -| 0.2.1 | 2023-06-28 | [26534](https://github.com/airbytehq/airbyte/pull/26534) | Support multiple query streams and pulling data from different datadog sites | -| 0.2.0 | 2023-06-28 | [27784](https://github.com/airbytehq/airbyte/pull/27784) | Add necessary fields to schemas | -| 0.1.1 | 2023-04-27 | [25562](https://github.com/airbytehq/airbyte/pull/25562) | Update testing dependencies | -| 0.1.0 | 2022-10-18 | [18150](https://github.com/airbytehq/airbyte/pull/18150) | New Source: Datadog | +| Version 
| Date | Pull Request | Subject | +| :------ | :--------- | :------------------------------------------------------- | :--------------------------------------------------------------------------- | +| 0.4.0 | 2023-12-04 | [30999](https://github.com/airbytehq/airbyte/pull/30999) | Add `monitors` and `service_level_objectives` Streams | +| 0.3.0 | 2023-08-27 | [29885](https://github.com/airbytehq/airbyte/pull/29885) | Migrate to low code | +| 0.2.2 | 2023-07-10 | [28089](https://github.com/airbytehq/airbyte/pull/28089) | Additional stream and query details in response | +| 0.2.1 | 2023-06-28 | [26534](https://github.com/airbytehq/airbyte/pull/26534) | Support multiple query streams and pulling data from different datadog sites | +| 0.2.0 | 2023-06-28 | [27784](https://github.com/airbytehq/airbyte/pull/27784) | Add necessary fields to schemas | +| 0.1.1 | 2023-04-27 | [25562](https://github.com/airbytehq/airbyte/pull/25562) | Update testing dependencies | +| 0.1.0 | 2022-10-18 | [18150](https://github.com/airbytehq/airbyte/pull/18150) | New Source: Datadog | diff --git a/docs/integrations/sources/datascope.md b/docs/integrations/sources/datascope.md index 3fa3285786b..71e49a392a5 100644 --- a/docs/integrations/sources/datascope.md +++ b/docs/integrations/sources/datascope.md @@ -4,8 +4,7 @@ This page contains the setup guide and reference information for the [DataScope] ## Prerequisites -A DataScope account with access to the API. You can create a free account [here](https://www.mydatascope.com/webhooks). - +A DataScope account with access to the API. You can create a free account [here](https://www.mydatascope.com/webhooks). ## Setup guide @@ -30,7 +29,7 @@ A DataScope account with access to the API. You can create a free account [here] 1. Navigate to the Airbyte Open Source dashboard. 2. Set the name for your source. 3. Enter your `api_key` which will be flagged with Authorization header. -6. Click **Set up source**. +4. Click **Set up source**. ## Supported sync modes @@ -50,6 +49,7 @@ The DataScope source connector supports the following [sync modes](https://docs. - answers Implemented but not added streams: + - Lists - Notifications @@ -60,5 +60,5 @@ GET https://www.mydatascope.com/api/external/locations ## Changelog | Version | Date | Pull Request | Subject | -| :------ |:-----------|:----------------------------------------------------------| :------------- | +| :------ | :--------- | :-------------------------------------------------------- | :------------- | | 0.1.0 | 2022-10-31 | [#18725](https://github.com/airbytehq/airbyte/pull/18725) | Initial commit | diff --git a/docs/integrations/sources/db2.md b/docs/integrations/sources/db2.md index d138124ed68..849c435b6a5 100644 --- a/docs/integrations/sources/db2.md +++ b/docs/integrations/sources/db2.md @@ -12,11 +12,11 @@ The IBM Db2 source does not alter the schema present in your warehouse. 
Dependin #### Features -| Feature | Supported?\(Yes/No\) | Notes | -| :--- | :--- | :--- | -| Full Refresh Sync | Yes | | -| Incremental - Append Sync | Yes | | -| Namespaces | Yes | | +| Feature | Supported?\(Yes/No\) | Notes | +| :------------------------ | :------------------- | :---- | +| Full Refresh Sync | Yes | | +| Incremental - Append Sync | Yes | | +| Namespaces | Yes | | ## Getting started @@ -58,30 +58,30 @@ You can also enter your own password for the keystore, but if you don't, the pas ## Changelog -| Version | Date | Pull Request | Subject | -|:--------|:-----------| :--- |:------------------------------------------------------------------------------------------------------------------------------------------| -| 0.2.2 | 2024-02-13 | [35233](https://github.com/airbytehq/airbyte/pull/35233) | Adopt CDK 0.20.4 | -| 0.2.1 | 2024-01-24 | [34453](https://github.com/airbytehq/airbyte/pull/34453) | bump CDK version | -| 0.2.0 | 2023-12-18 | [33485](https://github.com/airbytehq/airbyte/pull/33485) | Remove LEGACY state | -| 0.1.20 | 2023-06-20 | [27212](https://github.com/airbytehq/airbyte/pull/27212) | Fix silent exception swallowing in StreamingJdbcDatabase | -| 0.1.19 | 2023-03-22 | [20760](https://github.com/airbytehq/airbyte/pull/20760) | Removed redundant date-time datatypes formatting | -| 0.1.18 | 2023-03-06 | [23455](https://github.com/airbytehq/airbyte/pull/23455) | For network isolation, source connector accepts a list of hosts it is allowed to connect to | -| 0.1.17 | 2022-12-14 | [20436](https://github.com/airbytehq/airbyte/pull/20346) | Consolidate date/time values mapping for JDBC sources | -| | 2022-10-13 | [15535](https://github.com/airbytehq/airbyte/pull/16238) | Update incremental query to avoid data missing when new data is inserted at the same time as a sync starts under non-CDC incremental mode | -| 0.1.16 | 2022-09-06 | [16354](https://github.com/airbytehq/airbyte/pull/16354) | Add custom JDBC params | -| 0.1.15 | 2022-09-01 | [16238](https://github.com/airbytehq/airbyte/pull/16238) | Emit state messages more frequently | -| 0.1.14 | 2022-08-18 | [14356](https://github.com/airbytehq/airbyte/pull/14356) | DB Sources: only show a table can sync incrementally if at least one column can be used as a cursor field | -| 0.1.13 | 2022-07-22 | [14714](https://github.com/airbytehq/airbyte/pull/14714) | Clarified error message when invalid cursor column selected | -| 0.1.12 | 2022-07-14 | [14574](https://github.com/airbytehq/airbyte/pull/14574) | Removed additionalProperties:false from JDBC source connectors | -| 0.1.11 | 2022-06-17 | [13864](https://github.com/airbytehq/airbyte/pull/13864) | Updated stacktrace format for any trace message errors | -| 0.1.10 | 2022-04-29 | [12480](https://github.com/airbytehq/airbyte/pull/12480) | Query tables with adaptive fetch size to optimize JDBC memory consumption | -| 0.1.9 | 2022-02-21 | [10242](https://github.com/airbytehq/airbyte/pull/10242) | Fixed cursor for old connectors that use non-microsecond format. 
Now connectors work with both formats | -| 0.1.8 | 2022-02-18 | [10242](https://github.com/airbytehq/airbyte/pull/10242) | Updated timestamp transformation with microseconds | -| 0.1.7 | 2022-02-14 | [10256](https://github.com/airbytehq/airbyte/pull/10256) | Add `-XX:+ExitOnOutOfMemoryError` JVM option |**** -| 0.1.6 | 2022-02-08 | [10173](https://github.com/airbytehq/airbyte/pull/10173) | Improved discovering tables in case if user does not have permissions to any table | -| 0.1.5 | 2022-02-01 | [9875](https://github.com/airbytehq/airbyte/pull/9875) | Discover only permitted for user tables | +| Version | Date | Pull Request | Subject | +| :------ | :--------- | :------------------------------------------------------------------------------------------------------------ | :---------------------------------------------------------------------------------------------------------------------------------------- | -------- | +| 0.2.2 | 2024-02-13 | [35233](https://github.com/airbytehq/airbyte/pull/35233) | Adopt CDK 0.20.4 | +| 0.2.1 | 2024-01-24 | [34453](https://github.com/airbytehq/airbyte/pull/34453) | bump CDK version | +| 0.2.0 | 2023-12-18 | [33485](https://github.com/airbytehq/airbyte/pull/33485) | Remove LEGACY state | +| 0.1.20 | 2023-06-20 | [27212](https://github.com/airbytehq/airbyte/pull/27212) | Fix silent exception swallowing in StreamingJdbcDatabase | +| 0.1.19 | 2023-03-22 | [20760](https://github.com/airbytehq/airbyte/pull/20760) | Removed redundant date-time datatypes formatting | +| 0.1.18 | 2023-03-06 | [23455](https://github.com/airbytehq/airbyte/pull/23455) | For network isolation, source connector accepts a list of hosts it is allowed to connect to | +| 0.1.17 | 2022-12-14 | [20436](https://github.com/airbytehq/airbyte/pull/20346) | Consolidate date/time values mapping for JDBC sources | +| | 2022-10-13 | [15535](https://github.com/airbytehq/airbyte/pull/16238) | Update incremental query to avoid data missing when new data is inserted at the same time as a sync starts under non-CDC incremental mode | +| 0.1.16 | 2022-09-06 | [16354](https://github.com/airbytehq/airbyte/pull/16354) | Add custom JDBC params | +| 0.1.15 | 2022-09-01 | [16238](https://github.com/airbytehq/airbyte/pull/16238) | Emit state messages more frequently | +| 0.1.14 | 2022-08-18 | [14356](https://github.com/airbytehq/airbyte/pull/14356) | DB Sources: only show a table can sync incrementally if at least one column can be used as a cursor field | +| 0.1.13 | 2022-07-22 | [14714](https://github.com/airbytehq/airbyte/pull/14714) | Clarified error message when invalid cursor column selected | +| 0.1.12 | 2022-07-14 | [14574](https://github.com/airbytehq/airbyte/pull/14574) | Removed additionalProperties:false from JDBC source connectors | +| 0.1.11 | 2022-06-17 | [13864](https://github.com/airbytehq/airbyte/pull/13864) | Updated stacktrace format for any trace message errors | +| 0.1.10 | 2022-04-29 | [12480](https://github.com/airbytehq/airbyte/pull/12480) | Query tables with adaptive fetch size to optimize JDBC memory consumption | +| 0.1.9 | 2022-02-21 | [10242](https://github.com/airbytehq/airbyte/pull/10242) | Fixed cursor for old connectors that use non-microsecond format. 
Now connectors work with both formats | +| 0.1.8 | 2022-02-18 | [10242](https://github.com/airbytehq/airbyte/pull/10242) | Updated timestamp transformation with microseconds | +| 0.1.7 | 2022-02-14 | [10256](https://github.com/airbytehq/airbyte/pull/10256) | Add `-XX:+ExitOnOutOfMemoryError` JVM option | \*\*\*\* | +| 0.1.6 | 2022-02-08 | [10173](https://github.com/airbytehq/airbyte/pull/10173) | Improved discovering tables in case if user does not have permissions to any table | +| 0.1.5 | 2022-02-01 | [9875](https://github.com/airbytehq/airbyte/pull/9875) | Discover only permitted for user tables | | 0.1.4 | 2021-12-30 | [9187](https://github.com/airbytehq/airbyte/pull/9187) [8749](https://github.com/airbytehq/airbyte/pull/8749) | Add support of JdbcType.ARRAY to JdbcSourceOperations. | -| 0.1.3 | 2021-11-05 | [7670](https://github.com/airbytehq/airbyte/pull/7670) | Updated unique DB2 types transformation | -| 0.1.2 | 2021-10-25 | [7355](https://github.com/airbytehq/airbyte/pull/7355) | Added ssl support | -| 0.1.1 | 2021-08-13 | [4699](https://github.com/airbytehq/airbyte/pull/4699) | Added json config validator | -| 0.1.0 | 2021-06-22 | [4197](https://github.com/airbytehq/airbyte/pull/4197) | New Source: IBM DB2 | +| 0.1.3 | 2021-11-05 | [7670](https://github.com/airbytehq/airbyte/pull/7670) | Updated unique DB2 types transformation | +| 0.1.2 | 2021-10-25 | [7355](https://github.com/airbytehq/airbyte/pull/7355) | Added ssl support | +| 0.1.1 | 2021-08-13 | [4699](https://github.com/airbytehq/airbyte/pull/4699) | Added json config validator | +| 0.1.0 | 2021-06-22 | [4197](https://github.com/airbytehq/airbyte/pull/4197) | New Source: IBM DB2 | diff --git a/docs/integrations/sources/delighted.md b/docs/integrations/sources/delighted.md index 9898aa98360..7ad0b9db8bf 100644 --- a/docs/integrations/sources/delighted.md +++ b/docs/integrations/sources/delighted.md @@ -50,17 +50,17 @@ This source is capable of syncing the following core streams: ## Changelog | Version | Date | Pull Request | Subject | -|:--------|:-----------|:---------------------------------------------------------|:-----------------------------------------------------------------------------------------------------| -| 0.2.7 | 2024-04-19 | [37149](https://github.com/airbytehq/airbyte/pull/37149) | Updating to 0.80.0 CDK | -| 0.2.6 | 2024-04-18 | [37149](https://github.com/airbytehq/airbyte/pull/37149) | Manage dependencies with Poetry. 
| -| 0.2.5 | 2024-04-15 | [37149](https://github.com/airbytehq/airbyte/pull/37149) | Base image migration: remove Dockerfile and use the python-connector-base image | -| 0.2.4 | 2024-04-12 | [37149](https://github.com/airbytehq/airbyte/pull/37149) | schema descriptions | -| 0.2.3 | 2023-09-08 | [27946](https://github.com/airbytehq/airbyte/pull/27946) | Changed `Date Since` input field title to `Replication Start Date` | -| 0.2.2 | 2023-03-09 | [23909](https://github.com/airbytehq/airbyte/pull/23909) | Updated the input config pattern to accept both `RFC3339` and `datetime string` formats in UI | -| 0.2.1 | 2023-02-14 | [23009](https://github.com/airbytehq/airbyte/pull/23009) | Specified date formatting in specification | -| 0.2.0 | 2022-11-22 | [19822](https://github.com/airbytehq/airbyte/pull/19822) | Migrate to Low code + certify to Beta | -| 0.1.4 | 2022-06-10 | [13439](https://github.com/airbytehq/airbyte/pull/13439) | Change since parameter input to iso date | -| 0.1.3 | 2022-01-31 | [9550](https://github.com/airbytehq/airbyte/pull/9550) | Output only records in which cursor field is greater than the value in state for incremental streams | -| 0.1.2 | 2022-01-06 | [9333](https://github.com/airbytehq/airbyte/pull/9333) | Add incremental sync mode to streams in `integration_tests/configured_catalog.json` | -| 0.1.1 | 2022-01-04 | [9275](https://github.com/airbytehq/airbyte/pull/9275) | Fix pagination handling for `survey_responses`, `bounces` and `unsubscribes` streams | -| 0.1.0 | 2021-10-27 | [4551](https://github.com/airbytehq/airbyte/pull/4551) | Add Delighted source connector | +| :------ | :--------- | :------------------------------------------------------- | :--------------------------------------------------------------------------------------------------- | +| 0.2.7 | 2024-04-19 | [37149](https://github.com/airbytehq/airbyte/pull/37149) | Updating to 0.80.0 CDK | +| 0.2.6 | 2024-04-18 | [37149](https://github.com/airbytehq/airbyte/pull/37149) | Manage dependencies with Poetry. 
| +| 0.2.5 | 2024-04-15 | [37149](https://github.com/airbytehq/airbyte/pull/37149) | Base image migration: remove Dockerfile and use the python-connector-base image | +| 0.2.4 | 2024-04-12 | [37149](https://github.com/airbytehq/airbyte/pull/37149) | schema descriptions | +| 0.2.3 | 2023-09-08 | [27946](https://github.com/airbytehq/airbyte/pull/27946) | Changed `Date Since` input field title to `Replication Start Date` | +| 0.2.2 | 2023-03-09 | [23909](https://github.com/airbytehq/airbyte/pull/23909) | Updated the input config pattern to accept both `RFC3339` and `datetime string` formats in UI | +| 0.2.1 | 2023-02-14 | [23009](https://github.com/airbytehq/airbyte/pull/23009) | Specified date formatting in specification | +| 0.2.0 | 2022-11-22 | [19822](https://github.com/airbytehq/airbyte/pull/19822) | Migrate to Low code + certify to Beta | +| 0.1.4 | 2022-06-10 | [13439](https://github.com/airbytehq/airbyte/pull/13439) | Change since parameter input to iso date | +| 0.1.3 | 2022-01-31 | [9550](https://github.com/airbytehq/airbyte/pull/9550) | Output only records in which cursor field is greater than the value in state for incremental streams | +| 0.1.2 | 2022-01-06 | [9333](https://github.com/airbytehq/airbyte/pull/9333) | Add incremental sync mode to streams in `integration_tests/configured_catalog.json` | +| 0.1.1 | 2022-01-04 | [9275](https://github.com/airbytehq/airbyte/pull/9275) | Fix pagination handling for `survey_responses`, `bounces` and `unsubscribes` streams | +| 0.1.0 | 2021-10-27 | [4551](https://github.com/airbytehq/airbyte/pull/4551) | Add Delighted source connector | diff --git a/docs/integrations/sources/dixa.md b/docs/integrations/sources/dixa.md index 23e5f2cbc12..c9f12b65d00 100644 --- a/docs/integrations/sources/dixa.md +++ b/docs/integrations/sources/dixa.md @@ -51,7 +51,7 @@ When using the connector, keep in mind that increasing the `batch_size` paramete | Version | Date | Pull Request | Subject | | :------ | :--------- | :------------------------------------------------------- | :-------------------------------------------------------------------- | -| 0.3.0 | 2023-10-17 | [30994](https://github.com/airbytehq/airbyte/pull/30994) | Migrate to Low-code Framework | +| 0.3.0 | 2023-10-17 | [30994](https://github.com/airbytehq/airbyte/pull/30994) | Migrate to Low-code Framework | | 0.2.0 | 2023-06-08 | [25103](https://github.com/airbytehq/airbyte/pull/25103) | Add fields to `conversation_export` stream | | 0.1.3 | 2022-07-07 | [14437](https://github.com/airbytehq/airbyte/pull/14437) | 🎉 Source Dixa: bump version 0.1.3 | | 0.1.2 | 2021-11-08 | [7499](https://github.com/airbytehq/airbyte/pull/7499) | Remove base-python dependencies | diff --git a/docs/integrations/sources/dockerhub.md b/docs/integrations/sources/dockerhub.md index 756664276f6..ff83696a543 100644 --- a/docs/integrations/sources/dockerhub.md +++ b/docs/integrations/sources/dockerhub.md @@ -8,15 +8,15 @@ This source can sync data for the DockerHub API. 
It currently supports only [lis This Source is capable of syncing the following Streams: -* DockerHub +- DockerHub ### Features -| Feature | Supported?\(Yes/No\) | Notes | -| :--- | :--- | :--- | -| Full Refresh Sync | Yes | | -| Incremental Sync | No | | -| Namespaces | No | | +| Feature | Supported?\(Yes/No\) | Notes | +| :---------------- | :------------------- | :---- | +| Full Refresh Sync | Yes | | +| Incremental Sync | No | | +| Namespaces | No | | ### Performance considerations @@ -26,7 +26,7 @@ This connector has been tested for the Airbyte organization, which has 266 repos ### Requirements -* None +- None ### Setup guide @@ -34,13 +34,12 @@ This connector has been tested for the Airbyte organization, which has 266 repos ## Changelog -| Version | Date | Pull Request | Subject | -| :--- | :--- | :--- | :--- | -| 0.2.4 | 2024-04-19 | [37151](https://github.com/airbytehq/airbyte/pull/37151) | Updating to 0.80.0 CDK | -| 0.2.3 | 2024-04-18 | [37151](https://github.com/airbytehq/airbyte/pull/37151) | Manage dependencies with Poetry. | -| 0.2.2 | 2024-04-15 | [37151](https://github.com/airbytehq/airbyte/pull/37151) | Base image migration: remove Dockerfile and use the python-connector-base image | -| 0.2.1 | 2024-04-12 | [37151](https://github.com/airbytehq/airbyte/pull/37151) | schema descriptions | -| 0.2.0 | 2023-08-24 | [29320](https://github.com/airbytehq/airbyte/pull/29320) | Migrate to Low Code | -| 0.1.1 | 2023-08-16 | [13007](https://github.com/airbytehq/airbyte/pull/13007) | Fix schema and tests | -| 0.1.0 | 2022-05-20 | [13007](https://github.com/airbytehq/airbyte/pull/13007) | New source | - +| Version | Date | Pull Request | Subject | +| :------ | :--------- | :------------------------------------------------------- | :------------------------------------------------------------------------------ | +| 0.2.4 | 2024-04-19 | [37151](https://github.com/airbytehq/airbyte/pull/37151) | Updating to 0.80.0 CDK | +| 0.2.3 | 2024-04-18 | [37151](https://github.com/airbytehq/airbyte/pull/37151) | Manage dependencies with Poetry. | +| 0.2.2 | 2024-04-15 | [37151](https://github.com/airbytehq/airbyte/pull/37151) | Base image migration: remove Dockerfile and use the python-connector-base image | +| 0.2.1 | 2024-04-12 | [37151](https://github.com/airbytehq/airbyte/pull/37151) | schema descriptions | +| 0.2.0 | 2023-08-24 | [29320](https://github.com/airbytehq/airbyte/pull/29320) | Migrate to Low Code | +| 0.1.1 | 2023-08-16 | [13007](https://github.com/airbytehq/airbyte/pull/13007) | Fix schema and tests | +| 0.1.0 | 2022-05-20 | [13007](https://github.com/airbytehq/airbyte/pull/13007) | New source | diff --git a/docs/integrations/sources/dremio.md b/docs/integrations/sources/dremio.md index 0c3166340df..141b8303876 100644 --- a/docs/integrations/sources/dremio.md +++ b/docs/integrations/sources/dremio.md @@ -14,28 +14,28 @@ If there are more endpoints you'd like Airbyte to support, please [create an iss ### Features -| Feature | Supported? | -| :--- | :--- | -| Full Refresh Sync | Yes | -| Incremental - Append Sync | No | -| SSL connection | Yes | -| Namespaces | No | +| Feature | Supported? | +| :------------------------ | :--------- | +| Full Refresh Sync | Yes | +| Incremental - Append Sync | No | +| SSL connection | Yes | +| Namespaces | No | ## Getting started ### Requirements -* API Key -* Base URL +- API Key +- Base URL ### Setup guide + Connector needs a self-hosted instance of Dremio, this way you can access the Dremio REST API on which this source is based. 
Please refer to [Dremio Deployment Models](https://docs.dremio.com/software/deployment/deployment-models/) document, or take a look at [Dremio OSS](https://github.com/dremio/dremio-oss) for reference. Please read [How to get your APIs credentials](https://docs.dremio.com/software/rest-api/#authenticationn). ## Changelog -| Version | Date | Pull Request | Subject | -| :--- | :--- | :--- | :--- | -| 0.1.0 | 2022-12-01 | [19912](https://github.com/airbytehq/airbyte/pull/19912) | New Source: Dremio | - +| Version | Date | Pull Request | Subject | +| :------ | :--------- | :------------------------------------------------------- | :----------------- | +| 0.1.0 | 2022-12-01 | [19912](https://github.com/airbytehq/airbyte/pull/19912) | New Source: Dremio | diff --git a/docs/integrations/sources/drift.md b/docs/integrations/sources/drift.md index 03595a998b2..6ce8277721e 100644 --- a/docs/integrations/sources/drift.md +++ b/docs/integrations/sources/drift.md @@ -8,16 +8,16 @@ The Drift source supports Full Refresh syncs. That is, every time a sync is run, Several output streams are available from this source: -* [Accounts](https://devdocs.drift.com/docs/account-model) -* [Conversations](https://devdocs.drift.com/docs/conversation-model) -* [Users](https://devdocs.drift.com/docs/user-model) +- [Accounts](https://devdocs.drift.com/docs/account-model) +- [Conversations](https://devdocs.drift.com/docs/conversation-model) +- [Users](https://devdocs.drift.com/docs/user-model) If there are more endpoints you'd like Airbyte to support, please [create an issue.](https://github.com/airbytehq/airbyte/issues/new/choose) ### Features | Feature | Supported? | -|:------------------------------|:------------| +| :---------------------------- | :---------- | | Full Refresh Sync | Yes | | Incremental Sync | Coming soon | | Replicate Incremental Deletes | Coming soon | @@ -32,17 +32,19 @@ The Drift connector should not run into Drift API limitations under normal usage ### Requirements -* A Drift API token linked to a Drift App with the following scopes: - * `conversation_read` to access Conversions - * `user_read` to access Users - * `account_read` to access Accounts +- A Drift API token linked to a Drift App with the following scopes: + - `conversation_read` to access Conversions + - `user_read` to access Users + - `account_read` to access Accounts ### Setup guide #### Authenticate using `Access Token` -* Follow Drift's [Setting Things Up ](https://devdocs.drift.com/docs/quick-start)guide for a more detailed description of how to obtain the API token. + +- Follow Drift's [Setting Things Up ](https://devdocs.drift.com/docs/quick-start)guide for a more detailed description of how to obtain the API token. #### Authenticate using `OAuth2.0` + 1. Select `OAuth2.0` from `Authorization Method` dropdown 2. Click on `Authenticate your Drift account` 3. 
Proceed the authentication in order to obtain the `access_token` @@ -50,7 +52,7 @@ The Drift connector should not run into Drift API limitations under normal usage ## CHANGELOG | Version | Date | Pull Request | Subject | -|:--------|:-----------|:---------------------------------------------------------|:--------------------------------------------------------------------------------| +| :------ | :--------- | :------------------------------------------------------- | :------------------------------------------------------------------------------ | | 0.3.4 | 2024-05-03 | [37592](https://github.com/airbytehq/airbyte/pull/37592) | Change `last_records` to `last_record` | | 0.3.3 | 2024-04-19 | [37153](https://github.com/airbytehq/airbyte/pull/37153) | Upgrade to CDK 0.80.0 and manage dependencies with Poetry. | | 0.3.2 | 2024-04-15 | [37153](https://github.com/airbytehq/airbyte/pull/37153) | Base image migration: remove Dockerfile and use the python-connector-base image | diff --git a/docs/integrations/sources/drupal.md b/docs/integrations/sources/drupal.md index f7e3591c75e..73c46709fe6 100644 --- a/docs/integrations/sources/drupal.md +++ b/docs/integrations/sources/drupal.md @@ -12,10 +12,10 @@ You will only be able to connect to a self-hosted instance of Drupal using these Drupal can run on MySQL, Percona, MariaDb, MSSQL, MongoDB, Postgres, or SQL-Lite. If you're not using SQL-lite, you can use Airbyte to sync your Drupal instance by connecting to the underlying database using the appropriate Airbyte connector: -* [MySQL/Percona/MariaDB](mysql.md) -* [MSSQL](mssql.md) -* [Mongo](mongodb-v2.md) -* [Postgres](postgres.md) +- [MySQL/Percona/MariaDB](mysql.md) +- [MSSQL](mssql.md) +- [Mongo](mongodb-v2.md) +- [Postgres](postgres.md) :::info @@ -26,4 +26,3 @@ Reach out to your service representative or system admin to find the parameters ### Output schema The schema will be loaded according to the rules of the underlying database's connector. - diff --git a/docs/integrations/sources/dv-360.md b/docs/integrations/sources/dv-360.md index ebdcad8d041..504d39ffc1c 100644 --- a/docs/integrations/sources/dv-360.md +++ b/docs/integrations/sources/dv-360.md @@ -4,7 +4,7 @@ ## Deprecation Notice -The Display & Video 360 source connector is scheduled for deprecation on March 5th, 2024 due to incompatibility with upcoming platform updates as we prepare to launch Airbyte 1.0. This means it will no longer be supported or available for use in Airbyte. +The Display & Video 360 source connector is scheduled for deprecation on March 5th, 2024 due to incompatibility with upcoming platform updates as we prepare to launch Airbyte 1.0. This means it will no longer be supported or available for use in Airbyte. This connector does not support new per-stream features which are vital for ensuring data integrity in Airbyte's synchronization processes. Without these capabilities, we cannot enforce our standards of reliability and correctness for data syncing operations. diff --git a/docs/integrations/sources/dynamodb.md b/docs/integrations/sources/dynamodb.md index f81340759c5..3d96d84902c 100644 --- a/docs/integrations/sources/dynamodb.md +++ b/docs/integrations/sources/dynamodb.md @@ -58,8 +58,8 @@ This guide describes in details how you can configure the connector to connect w ## Role Based Access Defining **_access_key_id_** and **_secret_access_key_** will use User based Access. Role based access can be achieved -by omitting both values from the configuration. 
The connector will then use DefaultCredentialsProvider which will use -the underlying role executing the container workload in AWS. +by omitting both values from the configuration. The connector will then use DefaultCredentialsProvider which will use +the underlying role executing the container workload in AWS. ### Сonfiguration Parameters @@ -73,15 +73,15 @@ the underlying role executing the container workload in AWS. ## Changelog -| Version | Date | Pull Request | Subject | -|:--------| :--------- | :-------------------------------------------------------- |:-----------------------------------------------------------------------| -| 0.3.2 | 2024-05-01 | [27045](https://github.com/airbytehq/airbyte/pull/27045) | Fix missing scan permissions | -| 0.3.1 | 2024-05-01 | [31935](https://github.com/airbytehq/airbyte/pull/31935) | Fix list more than 100 tables | -| 0.3.0 | 2024-04-24 | [37530](https://github.com/airbytehq/airbyte/pull/37530) | Allow role based access | -| 0.2.3 | 2024-02-13 | [35232](https://github.com/airbytehq/airbyte/pull/35232) | Adopt CDK 0.20.4 | -| 0.2.2 | 2024-01-24 | [34453](https://github.com/airbytehq/airbyte/pull/34453) | bump CDK version | -| 0.2.1 | 2024-01-03 | [#33924](https://github.com/airbytehq/airbyte/pull/33924) | Add new ap-southeast-3 AWS region | -| 0.2.0 | 18-12-2023 | https://github.com/airbytehq/airbyte/pull/33485 | Remove LEGACY state | -| 0.1.2 | 01-19-2023 | https://github.com/airbytehq/airbyte/pull/20172 | Fix reserved words in projection expression & make them configurable | -| 0.1.1 | 02-09-2023 | https://github.com/airbytehq/airbyte/pull/22682 | Fix build | -| 0.1.0 | 11-14-2022 | https://github.com/airbytehq/airbyte/pull/18750 | Initial version | +| Version | Date | Pull Request | Subject | +| :------ | :--------- | :-------------------------------------------------------- | :------------------------------------------------------------------- | +| 0.3.2 | 2024-05-01 | [27045](https://github.com/airbytehq/airbyte/pull/27045) | Fix missing scan permissions | +| 0.3.1 | 2024-05-01 | [31935](https://github.com/airbytehq/airbyte/pull/31935) | Fix list more than 100 tables | +| 0.3.0 | 2024-04-24 | [37530](https://github.com/airbytehq/airbyte/pull/37530) | Allow role based access | +| 0.2.3 | 2024-02-13 | [35232](https://github.com/airbytehq/airbyte/pull/35232) | Adopt CDK 0.20.4 | +| 0.2.2 | 2024-01-24 | [34453](https://github.com/airbytehq/airbyte/pull/34453) | bump CDK version | +| 0.2.1 | 2024-01-03 | [#33924](https://github.com/airbytehq/airbyte/pull/33924) | Add new ap-southeast-3 AWS region | +| 0.2.0 | 18-12-2023 | https://github.com/airbytehq/airbyte/pull/33485 | Remove LEGACY state | +| 0.1.2 | 01-19-2023 | https://github.com/airbytehq/airbyte/pull/20172 | Fix reserved words in projection expression & make them configurable | +| 0.1.1 | 02-09-2023 | https://github.com/airbytehq/airbyte/pull/22682 | Fix build | +| 0.1.0 | 11-14-2022 | https://github.com/airbytehq/airbyte/pull/18750 | Initial version | diff --git a/docs/integrations/sources/e2e-test-cloud.md b/docs/integrations/sources/e2e-test-cloud.md index 3e18bdf9f30..2cc09e5928c 100644 --- a/docs/integrations/sources/e2e-test-cloud.md +++ b/docs/integrations/sources/e2e-test-cloud.md @@ -29,7 +29,7 @@ Here is its configuration: The OSS and Cloud variants have the same version number. The Cloud variant was initially released at version `1.0.0`. 
| Version | Date | Pull request | Subject | -|---------|------------|----------------------------------------------------------|-----------------------------------------------------| +| ------- | ---------- | -------------------------------------------------------- | --------------------------------------------------- | | 2.2.1 | 2024-02-13 | [35231](https://github.com/airbytehq/airbyte/pull/35231) | Adopt JDK 0.20.4. | | 2.1.5 | 2023-10-06 | [31092](https://github.com/airbytehq/airbyte/pull/31092) | Bring in changes from oss | | 2.1.4 | 2023-03-01 | [23656](https://github.com/airbytehq/airbyte/pull/23656) | Fix inheritance between e2e-test and e2e-test-cloud | diff --git a/docs/integrations/sources/e2e-test.md b/docs/integrations/sources/e2e-test.md index 0b459ebe238..e6e6d76dd4f 100644 --- a/docs/integrations/sources/e2e-test.md +++ b/docs/integrations/sources/e2e-test.md @@ -28,10 +28,10 @@ Here is its configuration: | | random seed | integer | no | current time millis | The seed is used in random Json object generation. Min 0. Max 1 million. | | | message interval | integer | no | 0 | The time interval between messages in millisecond. Min 0 ms. Max 60000 ms (1 minute). | - #### Example Stream Schemas + If you need a stream for testing performance simulating a wide table, we have an example [500 column stream](https://gist.github.com/jbfbell/9b7db8fdf0de0187c7da92df2f699502) -or use the form below to generate your own with an arbitrary width, then copy+paste the resulting schema into your configuration. +or use the form below to generate your own with an arbitrary width, then copy+paste the resulting schema into your configuration. @@ -42,10 +42,9 @@ This is a legacy mode used in Airbyte integration tests. It has been removed sin ```json { "type": "object", - "properties": - { - "column1": { "type": "string" } - } + "properties": { + "column1": { "type": "string" } + } } ``` @@ -70,19 +69,19 @@ This mode is also excluded from the Cloud variant of this connector. The OSS and Cloud variants have the same version number. The Cloud variant was initially released at version `1.0.0`. -| Version | Date | Pull request | Subject | -|---------|------------| ------------------------------------------------------------------ |-------------------------------------------------------------------------------------------------------| -| 2.2.2 | 2024-04-25 | [37581](https://github.com/airbytehq/airbyte/pull/37581) | bump jsonschemafriend to 0.12.4 | -| 2.2.1 | 2024-02-13 | [35231](https://github.com/airbytehq/airbyte/pull/35231) | Adopt JDK 0.20.4. | -| 2.2.0 | 2023-12-18 | [33485](https://github.com/airbytehq/airbyte/pull/33485) | Remove LEGACY state | -| 2.1.5 | 2023-10-04 | [31092](https://github.com/airbytehq/airbyte/pull/31092) | Bump jsonschemafriend dependency version to fix bug | -| 2.1.4 | 2023-03-01 | [23656](https://github.com/airbytehq/airbyte/pull/23656) | Add speed benchmark mode to e2e test | -| 2.1.3 | 2022-08-25 | [15591](https://github.com/airbytehq/airbyte/pull/15591) | Declare supported sync modes in catalogs | -| 2.1.1 | 2022-06-17 | [13864](https://github.com/airbytehq/airbyte/pull/13864) | Updated stacktrace format for any trace message errors | -| 2.1.0 | 2021-02-12 | [\#10298](https://github.com/airbytehq/airbyte/pull/10298) | Support stream duplication to quickly create a multi-stream catalog. | -| 2.0.0 | 2021-02-01 | [\#9954](https://github.com/airbytehq/airbyte/pull/9954) | Remove legacy modes. Use more efficient Json generator. 
| -| 1.0.1 | 2021-01-29 | [\#9745](https://github.com/airbytehq/airbyte/pull/9745) | Integrate with Sentry. | -| 1.0.0 | 2021-01-23 | [\#9720](https://github.com/airbytehq/airbyte/pull/9720) | Add new continuous feed mode that supports arbitrary catalog specification. Initial release to cloud. | -| 0.1.2 | 2022-10-18 | [\#18100](https://github.com/airbytehq/airbyte/pull/18100) | Set supported sync mode on streams | -| 0.1.1 | 2021-12-16 | [\#8217](https://github.com/airbytehq/airbyte/pull/8217) | Fix sleep time in infinite feed mode. | +| Version | Date | Pull request | Subject | +| ------- | ---------- | ----------------------------------------------------------------------------------------------------------------- | ----------------------------------------------------------------------------------------------------- | +| 2.2.2 | 2024-04-25 | [37581](https://github.com/airbytehq/airbyte/pull/37581) | bump jsonschemafriend to 0.12.4 | +| 2.2.1 | 2024-02-13 | [35231](https://github.com/airbytehq/airbyte/pull/35231) | Adopt JDK 0.20.4. | +| 2.2.0 | 2023-12-18 | [33485](https://github.com/airbytehq/airbyte/pull/33485) | Remove LEGACY state | +| 2.1.5 | 2023-10-04 | [31092](https://github.com/airbytehq/airbyte/pull/31092) | Bump jsonschemafriend dependency version to fix bug | +| 2.1.4 | 2023-03-01 | [23656](https://github.com/airbytehq/airbyte/pull/23656) | Add speed benchmark mode to e2e test | +| 2.1.3 | 2022-08-25 | [15591](https://github.com/airbytehq/airbyte/pull/15591) | Declare supported sync modes in catalogs | +| 2.1.1 | 2022-06-17 | [13864](https://github.com/airbytehq/airbyte/pull/13864) | Updated stacktrace format for any trace message errors | +| 2.1.0 | 2021-02-12 | [\#10298](https://github.com/airbytehq/airbyte/pull/10298) | Support stream duplication to quickly create a multi-stream catalog. | +| 2.0.0 | 2021-02-01 | [\#9954](https://github.com/airbytehq/airbyte/pull/9954) | Remove legacy modes. Use more efficient Json generator. | +| 1.0.1 | 2021-01-29 | [\#9745](https://github.com/airbytehq/airbyte/pull/9745) | Integrate with Sentry. | +| 1.0.0 | 2021-01-23 | [\#9720](https://github.com/airbytehq/airbyte/pull/9720) | Add new continuous feed mode that supports arbitrary catalog specification. Initial release to cloud. | +| 0.1.2 | 2022-10-18 | [\#18100](https://github.com/airbytehq/airbyte/pull/18100) | Set supported sync mode on streams | +| 0.1.1 | 2021-12-16 | [\#8217](https://github.com/airbytehq/airbyte/pull/8217) | Fix sleep time in infinite feed mode. | | 0.1.0 | 2021-07-23 | [\#3290](https://github.com/airbytehq/airbyte/pull/3290) [\#4939](https://github.com/airbytehq/airbyte/pull/4939) | Initial release. | diff --git a/docs/integrations/sources/elasticsearch.md b/docs/integrations/sources/elasticsearch.md index 2aa1a3fbb61..8c7d5a2932c 100644 --- a/docs/integrations/sources/elasticsearch.md +++ b/docs/integrations/sources/elasticsearch.md @@ -82,9 +82,9 @@ all values in the array must be of the same data type. 
Hence, every field can be ## Changelog -| Version | Date | Pull Request | Subject | -|:--------| :--------- | :------------------------------------------------------- | :-------------- | -| 0.1.2 | 2024-02-13 | [35230](https://github.com/airbytehq/airbyte/pull/35230) | Adopt CDK 0.20.4 | -| `0.1.2` | 2024-01-24 | [34453](https://github.com/airbytehq/airbyte/pull/34453) | bump CDK version | +| Version | Date | Pull Request | Subject | +| :------ | :--------- | :------------------------------------------------------- | :----------------------------- | +| 0.1.2 | 2024-02-13 | [35230](https://github.com/airbytehq/airbyte/pull/35230) | Adopt CDK 0.20.4 | +| `0.1.2` | 2024-01-24 | [34453](https://github.com/airbytehq/airbyte/pull/34453) | bump CDK version | | `0.1.1` | 2022-12-02 | [18118](https://github.com/airbytehq/airbyte/pull/18118) | Avoid too_long_frame_exception | -| `0.1.0` | 2022-07-12 | [14118](https://github.com/airbytehq/airbyte/pull/14118) | Initial Release | +| `0.1.0` | 2022-07-12 | [14118](https://github.com/airbytehq/airbyte/pull/14118) | Initial Release | diff --git a/docs/integrations/sources/emailoctopus.md b/docs/integrations/sources/emailoctopus.md index dcbadd43b42..432adfee4b4 100644 --- a/docs/integrations/sources/emailoctopus.md +++ b/docs/integrations/sources/emailoctopus.md @@ -1,20 +1,21 @@ # EmailOctopus ## Requirements -* [EmailOctopus account](https://help.emailoctopus.com) -* EmailOctopus [API key](https://help.emailoctopus.com/article/165-how-to-create-and-delete-api-keys) + +- [EmailOctopus account](https://help.emailoctopus.com) +- EmailOctopus [API key](https://help.emailoctopus.com/article/165-how-to-create-and-delete-api-keys) ## Supported sync modes -| Feature | Supported?\(Yes/No\) | Notes | -| :--- | :--- | :--- | -| Full Refresh Sync | Yes | [Overwrite](https://docs.airbyte.com/understanding-airbyte/connections/full-refresh-overwrite) | -| Incremental Sync | No | | +| Feature | Supported?\(Yes/No\) | Notes | +| :---------------- | :------------------- | :--------------------------------------------------------------------------------------------- | +| Full Refresh Sync | Yes | [Overwrite](https://docs.airbyte.com/understanding-airbyte/connections/full-refresh-overwrite) | +| Incremental Sync | No | | ## Supported Streams -* [Get all campaigns](https://emailoctopus.com/api-documentation/campaigns/get-all) -* [Get all lists](https://emailoctopus.com/api-documentation/lists/get-all) +- [Get all campaigns](https://emailoctopus.com/api-documentation/campaigns/get-all) +- [Get all lists](https://emailoctopus.com/api-documentation/lists/get-all) ## Performance considerations @@ -22,9 +23,9 @@ No documented strict rate limit. ## Changelog -| Version | Date | Pull Request | Subject | -|:--------|:-----------| :----------- |:-----------------------------------------------------------| -| 0.1.3 | 2024-04-19 | [37154](https://github.com/airbytehq/airbyte/pull/37154) | Upgrade to CDK 0.80.0 and manage dependencies with Poetry. 
| -| 0.1.2 | 2024-04-15 | [37154](https://github.com/airbytehq/airbyte/pull/37154) | Base image migration: remove Dockerfile and use the python-connector-base image | -| 0.1.1 | 2024-04-12 | [37154](https://github.com/airbytehq/airbyte/pull/37154) | schema descriptions | -| 0.1.0 | 2022-10-29 | [18647](https://github.com/airbytehq/airbyte/pull/18647) | Initial commit | +| Version | Date | Pull Request | Subject | +| :------ | :--------- | :------------------------------------------------------- | :------------------------------------------------------------------------------ | +| 0.1.3 | 2024-04-19 | [37154](https://github.com/airbytehq/airbyte/pull/37154) | Upgrade to CDK 0.80.0 and manage dependencies with Poetry. | +| 0.1.2 | 2024-04-15 | [37154](https://github.com/airbytehq/airbyte/pull/37154) | Base image migration: remove Dockerfile and use the python-connector-base image | +| 0.1.1 | 2024-04-12 | [37154](https://github.com/airbytehq/airbyte/pull/37154) | schema descriptions | +| 0.1.0 | 2022-10-29 | [18647](https://github.com/airbytehq/airbyte/pull/18647) | Initial commit | diff --git a/docs/integrations/sources/everhour.md b/docs/integrations/sources/everhour.md index d8a99925fc5..ed6634d018a 100644 --- a/docs/integrations/sources/everhour.md +++ b/docs/integrations/sources/everhour.md @@ -8,7 +8,7 @@ This page contains the setup guide and reference information for the [Everhour]( ## Supported sync modes -Currently, this project only supports full sync mode. +Currently, this project only supports full sync mode. ## Supported Streams @@ -23,6 +23,6 @@ This project supports the following streams: ## Changelog -| Version | Date | Pull Request | Subject | -|:--------|:-----------|:---------------------------------------------------------|:-------------------------------------------------------------------------------| -| 0.1.0 | 2023-02-28 | [23593](https://github.com/airbytehq/airbyte/pull/23593) | Initial Release | +| Version | Date | Pull Request | Subject | +| :------ | :--------- | :------------------------------------------------------- | :-------------- | +| 0.1.0 | 2023-02-28 | [23593](https://github.com/airbytehq/airbyte/pull/23593) | Initial Release | diff --git a/docs/integrations/sources/exchange-rates.md b/docs/integrations/sources/exchange-rates.md index 635cea69122..4e890f95341 100644 --- a/docs/integrations/sources/exchange-rates.md +++ b/docs/integrations/sources/exchange-rates.md @@ -27,7 +27,7 @@ If you have a `free` subscription plan, you will have two limitations to the pla 1. Limit of 1,000 API calls per month 2. You won't be able to specify the `base` parameter, meaning that you will be only be allowed to use the default base value which is `EUR`. -::: + ::: ### Step 2: Set up the Exchange Rates connector in Airbyte @@ -58,10 +58,10 @@ Each record in the stream contains many fields: ## Data type map -| Field | Airbyte Type | -| :------------------------ | :----------- | -| Currency | `number` | -| Date | `string` | +| Field | Airbyte Type | +| :------- | :----------- | +| Currency | `number` | +| Date | `string` | ## Limitations & Troubleshooting @@ -78,8 +78,8 @@ The Exchange Rates API has rate limits that vary per pricing plan. The free plan ### Troubleshooting -* With the free plan, you won't be able to specify the `base` parameter, meaning that you will be only be allowed to use the default base value which is `EUR`. 
-* Check out common troubleshooting issues for the Exchange Rates API source connector on our [Airbyte Forum](https://github.com/airbytehq/airbyte/discussions). +- With the free plan, you won't be able to specify the `base` parameter, meaning that you will be only be allowed to use the default base value which is `EUR`. +- Check out common troubleshooting issues for the Exchange Rates API source connector on our [Airbyte Forum](https://github.com/airbytehq/airbyte/discussions). @@ -87,8 +87,8 @@ The Exchange Rates API has rate limits that vary per pricing plan. The free plan | Version | Date | Pull Request | Subject | | :------ | :--------- | :------------------------------------------------------- | :------------------------------------------------------------------------------------------------------------------ | -| 1.3.0 | 2023-08-25 | [29299](https://github.com/airbytehq/airbyte/pull/29299) | Migrate to low-code | -| 1.2.9 | 2023-08-15 | [23000](https://github.com/airbytehq/airbyte/pull/23000) | Fix schema and tests | +| 1.3.0 | 2023-08-25 | [29299](https://github.com/airbytehq/airbyte/pull/29299) | Migrate to low-code | +| 1.2.9 | 2023-08-15 | [23000](https://github.com/airbytehq/airbyte/pull/23000) | Fix schema and tests | | 1.2.8 | 2023-02-14 | [23000](https://github.com/airbytehq/airbyte/pull/23000) | Specified date formatting in specification | | 1.2.7 | 2022-10-31 | [18726](https://github.com/airbytehq/airbyte/pull/18726) | Fix handling error during check connection | | 1.2.6 | 2022-08-23 | [15884](https://github.com/airbytehq/airbyte/pull/15884) | Migrated to new API Layer endpoint | @@ -100,4 +100,4 @@ The Exchange Rates API has rate limits that vary per pricing plan. The free plan | 0.2.0 | 2021-05-26 | [3566](https://github.com/airbytehq/airbyte/pull/3566) | Move from `api.ratesapi.io/` to `api.exchangeratesapi.io/`. Add required field `access_key` to `config.json`. | | 0.1.0 | 2021-04-19 | [2942](https://github.com/airbytehq/airbyte/pull/2942) | Implement Exchange API using the CDK | - \ No newline at end of file + diff --git a/docs/integrations/sources/facebook-marketing-migrations.md b/docs/integrations/sources/facebook-marketing-migrations.md index d4c4c06765a..77fe1a1f517 100644 --- a/docs/integrations/sources/facebook-marketing-migrations.md +++ b/docs/integrations/sources/facebook-marketing-migrations.md @@ -2,7 +2,7 @@ ## Upgrading to 2.0.0 -Streams Ads-Insights-* streams now have updated schemas. +Streams Ads-Insights-\* streams now have updated schemas. :::danger Please note that data older than 37 months will become unavailable due to Facebook limitations. @@ -12,7 +12,7 @@ It is recommended to create a backup at the destination before proceeding with m ### Update Custom Insights Reports (this step can be skipped if you did not define any) 1. Select **Sources** in the main navbar. - 1. Select the Facebook Marketing Connector. + 1. Select the Facebook Marketing Connector. 2. Select the **Retest saved source**. 3. Remove unsupported fields from the list in Custom Insights section. 4. Select **Test and Save**. @@ -20,23 +20,22 @@ It is recommended to create a backup at the destination before proceeding with m ### Refresh affected schemas and reset data 1. Select **Connections** in the main navbar. - 1. Select the connection(s) affected by the update. + 1. Select the connection(s) affected by the update. 2. Select the **Replication** tab. - 1. Select **Refresh source schema**. - 2. Select **OK**. + 1. Select **Refresh source schema**. + 2. Select **OK**. 
:::note Any detected schema changes will be listed for your review. ::: -3. Select **Save changes** at the bottom of the page. - 1. Ensure the **Reset affected streams** option is checked. -:::note -Depending on destination type you may not be prompted to reset your data. -::: -4. Select **Save connection**. -:::note -This will reset the data in your destination and initiate a fresh sync. -::: +3. Select **Save changes** at the bottom of the page. 1. Ensure the **Reset affected streams** option is checked. + :::note + Depending on destination type you may not be prompted to reset your data. + ::: +4. Select **Save connection**. + :::note + This will reset the data in your destination and initiate a fresh sync. + ::: For more information on resetting your data in Airbyte, see [this page](https://docs.airbyte.com/operator-guides/reset). diff --git a/docs/integrations/sources/facebook-marketing.md b/docs/integrations/sources/facebook-marketing.md index 911144f8371..3a49498f2fe 100644 --- a/docs/integrations/sources/facebook-marketing.md +++ b/docs/integrations/sources/facebook-marketing.md @@ -20,10 +20,10 @@ If you are not the owner/admin of the Ad account, you must be granted [permissio A [Facebook app](https://developers.facebook.com/apps/) with the Marketing API enabled and the following permissions: - - [ads_management](https://developers.facebook.com/docs/permissions#a) - - [ads_read](https://developers.facebook.com/docs/permissions#a) - - [business_management](https://developers.facebook.com/docs/permissions#b) - - [read_insights](https://developers.facebook.com/docs/permissions#r) +- [ads_management](https://developers.facebook.com/docs/permissions#a) +- [ads_read](https://developers.facebook.com/docs/permissions#a) +- [business_management](https://developers.facebook.com/docs/permissions#b) +- [read_insights](https://developers.facebook.com/docs/permissions#r) @@ -62,9 +62,11 @@ You can use the [Access Token Tool](https://developers.facebook.com/tools/access 5. To authenticate the connection: + **For Airbyte Cloud**: Click **Authenticate your account** to authorize your Facebook account. Make sure you are logged into the right account, as Airbyte will authenticate the account you are currently logged in to. + **For Airbyte Open Source**: In the **Access Token** field, enter the access token you generated with your Facebook app. @@ -90,21 +92,21 @@ You can use the [Access Token Tool](https://developers.facebook.com/tools/access To configure Custom Insights: - 1. For **Name**, enter a name for the insight. This will be used as the Airbyte stream name. - 2. (Optional) For **Level**, enter the level of granularity for the data you want to pull from the Facebook Marketing API (`account`, `ad`, `adset`, `campaign`). Set to `ad` by default. - 3. (Optional) For **Fields**, use the dropdown list to select the fields you want to pull from the Facebook Marketing API. - 4. (Optional) For **Breakdowns**, use the dropdown list to select the breakdowns you want to configure. - 5. (Optional) For **Action Breakdowns**, use the dropdown list to select the action breakdowns you want to configure. - 6. (Optional) For **Action Report Time**, enter the action report time you want to configure. This value determines the timing used to report action statistics. For example, if a user sees an ad on Jan 1st but converts on Jan 2nd, this value will determine how the action is reported. + 1. For **Name**, enter a name for the insight. This will be used as the Airbyte stream name. + 2. 
(Optional) For **Level**, enter the level of granularity for the data you want to pull from the Facebook Marketing API (`account`, `ad`, `adset`, `campaign`). Set to `ad` by default. + 3. (Optional) For **Fields**, use the dropdown list to select the fields you want to pull from the Facebook Marketing API. + 4. (Optional) For **Breakdowns**, use the dropdown list to select the breakdowns you want to configure. + 5. (Optional) For **Action Breakdowns**, use the dropdown list to select the action breakdowns you want to configure. + 6. (Optional) For **Action Report Time**, enter the action report time you want to configure. This value determines the timing used to report action statistics. For example, if a user sees an ad on Jan 1st but converts on Jan 2nd, this value will determine how the action is reported. - - `impression`: Actions are attributed to the time the ad was viewed (Jan 1st). - - `conversion`: Actions are attributed to the time the action was taken (Jan 2nd). - - `mixed`: Click-through actions are attributed to the time the ad was viewed (Jan 1st), and view-through actions are attributed to the time the action was taken (Jan 2nd). + - `impression`: Actions are attributed to the time the ad was viewed (Jan 1st). + - `conversion`: Actions are attributed to the time the action was taken (Jan 2nd). + - `mixed`: Click-through actions are attributed to the time the ad was viewed (Jan 1st), and view-through actions are attributed to the time the action was taken (Jan 2nd). - 7. (Optional) For **Time Increment**, you may provide a value in days by which to aggregate statistics. The sync will be chunked into intervals of this size. For example, if you set this value to 7, the sync will be chunked into 7-day intervals. The default value is 1 day. - 8. (Optional) For **Start Date**, enter the date in the `YYYY-MM-DDTHH:mm:ssZ` format. The data added on and after this date will be replicated. If this field is left blank, Airbyte will replicate all data. - 9. (Optional) For **End Date**, enter the date in the `YYYY-MM-DDTHH:mm:ssZ` format. The data added on and before this date will be replicated. If this field is left blank, Airbyte will replicate the latest data. - 10. (Optional) For **Custom Insights Lookback Window**, you may set a window in days to revisit data during syncing to capture updated conversion data from the API. Facebook allows for attribution windows of up to 28 days, during which time a conversion can be attributed to an ad. If you have set a custom attribution window in your Facebook account, please set the same value here. Otherwise, you may leave it at the default value of 28. For more information on action attributions, please refer to [the Meta Help Center](https://www.facebook.com/business/help/458681590974355?id=768381033531365). + 7. (Optional) For **Time Increment**, you may provide a value in days by which to aggregate statistics. The sync will be chunked into intervals of this size. For example, if you set this value to 7, the sync will be chunked into 7-day intervals. The default value is 1 day. + 8. (Optional) For **Start Date**, enter the date in the `YYYY-MM-DDTHH:mm:ssZ` format. The data added on and after this date will be replicated. If this field is left blank, Airbyte will replicate all data. + 9. (Optional) For **End Date**, enter the date in the `YYYY-MM-DDTHH:mm:ssZ` format. The data added on and before this date will be replicated. If this field is left blank, Airbyte will replicate the latest data. + 10. 
(Optional) For **Custom Insights Lookback Window**, you may set a window in days to revisit data during syncing to capture updated conversion data from the API. Facebook allows for attribution windows of up to 28 days, during which time a conversion can be attributed to an ad. If you have set a custom attribution window in your Facebook account, please set the same value here. Otherwise, you may leave it at the default value of 28. For more information on action attributions, please refer to [the Meta Help Center](https://www.facebook.com/business/help/458681590974355?id=768381033531365). :::warning Additional data streams for your Facebook Marketing connector are dynamically generated according to the Custom Insights you specify. If you have an existing Facebook Marketing source and you decide to update or remove some of your Custom Insights, you must also adjust the connections that sync to these streams. Specifically, you should either disable these connections or refresh the source schema associated with them to reflect the changes. @@ -137,16 +139,16 @@ The Facebook Marketing source connector supports the following sync modes: - [Campaigns](https://developers.facebook.com/docs/marketing-api/reference/ad-campaign-group#fields) - [CustomConversions](https://developers.facebook.com/docs/marketing-api/reference/custom-conversion) - [CustomAudiences](https://developers.facebook.com/docs/marketing-api/reference/custom-audience) -:::caution CustomAudiences -The `rule` field may not be synced for all records because it caused the error message `Please reduce the amount of data...`. -::: + :::caution CustomAudiences + The `rule` field may not be synced for all records because it caused the error message `Please reduce the amount of data...`. + ::: - [Images](https://developers.facebook.com/docs/marketing-api/reference/ad-image) - [Videos](https://developers.facebook.com/docs/marketing-api/reference/video) Airbyte also supports the following Prebuilt Facebook Ad Insights Reports: | Stream | Breakdowns | Action Breakdowns | -|:--------------------------------------------------|:--------------------------------------------------------------:|:-------------------------------------------------------:| +| :------------------------------------------------ | :------------------------------------------------------------: | :-----------------------------------------------------: | | Ad Insights Action Carousel Card | --- | `action_carousel_card_id`, `action_carousel_card_name` | | Ad Insights Action Conversion Device | `device_platform` | `action_type` | | Ad Insights Action Product ID | `product_id` | --- | @@ -190,7 +192,7 @@ The Facebook Marketing connector uses the `lookback_window` parameter to repeate ## Data type mapping | Integration Type | Airbyte Type | -|:----------------:|:------------:| +| :--------------: | :----------: | | string | string | | number | number | | array | array | @@ -199,7 +201,7 @@ The Facebook Marketing connector uses the `lookback_window` parameter to repeate ## Changelog | Version | Date | Pull Request | Subject | -|:--------|:-----------|:---------------------------------------------------------|:--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------| +| :------ | :--------- | :------------------------------------------------------- | 
:------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------ | | 2.1.7 | 2024-04-24 | [36634](https://github.com/airbytehq/airbyte/pull/36634) | Update to CDK 0.80.0 | | 2.1.6 | 2024-04-24 | [36634](https://github.com/airbytehq/airbyte/pull/36634) | Schema descriptions | | 2.1.5 | 2024-04-17 | [37341](https://github.com/airbytehq/airbyte/pull/37341) | Move rate limit errors to transient errors. | @@ -239,7 +241,7 @@ The Facebook Marketing connector uses the `lookback_window` parameter to repeate | 1.1.2 | 2023-08-03 | [29042](https://github.com/airbytehq/airbyte/pull/29042) | Fix broken `advancedAuth` references for `spec` | | 1.1.1 | 2023-07-26 | [27996](https://github.com/airbytehq/airbyte/pull/27996) | Remove reference to authSpecification | | 1.1.0 | 2023-07-11 | [26345](https://github.com/airbytehq/airbyte/pull/26345) | Add new `action_report_time` attribute to `AdInsights` class | -| 1.0.1 | 2023-07-07 | [27979](https://github.com/airbytehq/airbyte/pull/27979) | Added the ability to restore the reduced request record limit after the successful retry, and handle the `unknown error` (code 99) with the retry strategy | +| 1.0.1 | 2023-07-07 | [27979](https://github.com/airbytehq/airbyte/pull/27979) | Added the ability to restore the reduced request record limit after the successful retry, and handle the `unknown error` (code 99) with the retry strategy | | 1.0.0 | 2023-07-05 | [27563](https://github.com/airbytehq/airbyte/pull/27563) | Migrate to FB SDK version 17 | | 0.5.0 | 2023-06-26 | [27728](https://github.com/airbytehq/airbyte/pull/27728) | License Update: Elv2 | | 0.4.3 | 2023-05-12 | [27483](https://github.com/airbytehq/airbyte/pull/27483) | Reduce replication start date by one more day | diff --git a/docs/integrations/sources/facebook-pages-migrations.md b/docs/integrations/sources/facebook-pages-migrations.md index b6396e05951..195583ebd4d 100644 --- a/docs/integrations/sources/facebook-pages-migrations.md +++ b/docs/integrations/sources/facebook-pages-migrations.md @@ -6,36 +6,37 @@ This change is only breaking if you are syncing stream `Page`. ::: -This version brings an updated schema for the `v19.0` API version of the `Page` stream. +This version brings an updated schema for the `v19.0` API version of the `Page` stream. The `messenger_ads_default_page_welcome_message` field has been deleted, and `call_to_actions`, `posts`, `published_posts`, `ratings`, `tabs` and `tagged` fields have been added. Users should: - - Refresh the source schema for the `Page` stream. - - Reset the stream after upgrading to ensure uninterrupted syncs. + +- Refresh the source schema for the `Page` stream. +- Reset the stream after upgrading to ensure uninterrupted syncs. ### Refresh affected schemas and reset data 1. Select **Connections** in the main nav bar. - 1. Select the connection affected by the update. + 1. Select the connection affected by the update. 2. Select the **Replication** tab. - 1. Select **Refresh source schema**. - 2. Select **OK**. + 1. Select **Refresh source schema**. + 2. Select **OK**. :::note Any detected schema changes will be listed for your review. ::: 3. Select **Save changes** at the bottom of the page. - 1. Ensure the **Reset affected streams** option is checked. + 1. Ensure the **Reset affected streams** option is checked. 
:::note Depending on destination type you may not be prompted to reset your data. ::: -4. Select **Save connection**. +4. Select **Save connection**. :::note This will reset the data in your destination and initiate a fresh sync. ::: -For more information on resetting your data in Airbyte, see [this page](https://docs.airbyte.com/operator-guides/reset) \ No newline at end of file +For more information on resetting your data in Airbyte, see [this page](https://docs.airbyte.com/operator-guides/reset) diff --git a/docs/integrations/sources/facebook-pages.md b/docs/integrations/sources/facebook-pages.md index 1b92243d452..75f05f0cc43 100644 --- a/docs/integrations/sources/facebook-pages.md +++ b/docs/integrations/sources/facebook-pages.md @@ -1,7 +1,7 @@ # Facebook Pages :::danger -The Facebook Pages API utilized by this connector has been deprecated. You will not be able to make a successful connection. If you would like to make a community contribution or track API upgrade status, visit: https://github.com/airbytehq/airbyte/issues/25515. +The Facebook Pages API utilized by this connector has been deprecated. You will not be able to make a successful connection. If you would like to make a community contribution or track API upgrade status, visit: https://github.com/airbytehq/airbyte/issues/25515. ::: This page contains the setup guide and reference information for the Facebook Pages source connector. @@ -83,20 +83,20 @@ See Facebook's [documentation on rate limiting](https://developers.facebook.com/ ## Changelog -| Version | Date | Pull Request | Subject | -|:--------|:-----------| :------------------------------------------------------- |:-------------------------------------------------------------------------------------| -| 1.0.0 | 2024-03-14 | [36015](https://github.com/airbytehq/airbyte/pull/36015) | Upgrade Facebook API to v19.0 | -| 0.3.0 | 2023-06-26 | [27728](https://github.com/airbytehq/airbyte/pull/27728) | License Update: Elv2 | -| 0.2.5 | 2023-04-13 | [26939](https://github.com/airbytehq/airbyte/pull/26939) | Add advancedAuth to the connector spec | -| 0.2.4 | 2023-04-13 | [25143](https://github.com/airbytehq/airbyte/pull/25143) | Update insight metrics request params | -| 0.2.3 | 2023-02-23 | [23395](https://github.com/airbytehq/airbyte/pull/23395) | Parse datetime to rfc3339 | -| 0.2.2 | 2023-02-10 | [22804](https://github.com/airbytehq/airbyte/pull/22804) | Retry 500 errors | -| 0.2.1 | 2022-12-29 | [20925](https://github.com/airbytehq/airbyte/pull/20925) | Fix tests; modify expected records | -| 0.2.0 | 2022-11-24 | [19788](https://github.com/airbytehq/airbyte/pull/19788) | Migrate lo low-code; Beta certification; Upgrade Facebook API to v.15 | -| 0.1.6 | 2021-12-22 | [9032](https://github.com/airbytehq/airbyte/pull/9032) | Remove deprecated field `live_encoders` from Page stream | -| 0.1.5 | 2021-11-26 | [8267](https://github.com/airbytehq/airbyte/pull/8267) | updated all empty objects in schemas for Page and Post streams | -| 0.1.4 | 2021-11-26 | [](https://github.com/airbytehq/airbyte/pull/) | Remove unsupported insights_export field from Pages request | -| 0.1.3 | 2021-10-28 | [7440](https://github.com/airbytehq/airbyte/pull/7440) | Generate Page token from config access token | -| 0.1.2 | 2021-10-18 | [7128](https://github.com/airbytehq/airbyte/pull/7128) | Upgrade Facebook API to v.12 | -| 0.1.1 | 2021-09-30 | [6438](https://github.com/airbytehq/airbyte/pull/6438) | Annotate Oauth2 flow initialization parameters in connector specification | -| 0.1.0 | 2021-09-01 | 
[5158](https://github.com/airbytehq/airbyte/pull/5158) | Initial Release | +| Version | Date | Pull Request | Subject | +| :------ | :--------- | :------------------------------------------------------- | :------------------------------------------------------------------------ | +| 1.0.0 | 2024-03-14 | [36015](https://github.com/airbytehq/airbyte/pull/36015) | Upgrade Facebook API to v19.0 | +| 0.3.0 | 2023-06-26 | [27728](https://github.com/airbytehq/airbyte/pull/27728) | License Update: Elv2 | +| 0.2.5 | 2023-04-13 | [26939](https://github.com/airbytehq/airbyte/pull/26939) | Add advancedAuth to the connector spec | +| 0.2.4 | 2023-04-13 | [25143](https://github.com/airbytehq/airbyte/pull/25143) | Update insight metrics request params | +| 0.2.3 | 2023-02-23 | [23395](https://github.com/airbytehq/airbyte/pull/23395) | Parse datetime to rfc3339 | +| 0.2.2 | 2023-02-10 | [22804](https://github.com/airbytehq/airbyte/pull/22804) | Retry 500 errors | +| 0.2.1 | 2022-12-29 | [20925](https://github.com/airbytehq/airbyte/pull/20925) | Fix tests; modify expected records | +| 0.2.0 | 2022-11-24 | [19788](https://github.com/airbytehq/airbyte/pull/19788) | Migrate lo low-code; Beta certification; Upgrade Facebook API to v.15 | +| 0.1.6 | 2021-12-22 | [9032](https://github.com/airbytehq/airbyte/pull/9032) | Remove deprecated field `live_encoders` from Page stream | +| 0.1.5 | 2021-11-26 | [8267](https://github.com/airbytehq/airbyte/pull/8267) | updated all empty objects in schemas for Page and Post streams | +| 0.1.4 | 2021-11-26 | [](https://github.com/airbytehq/airbyte/pull/) | Remove unsupported insights_export field from Pages request | +| 0.1.3 | 2021-10-28 | [7440](https://github.com/airbytehq/airbyte/pull/7440) | Generate Page token from config access token | +| 0.1.2 | 2021-10-18 | [7128](https://github.com/airbytehq/airbyte/pull/7128) | Upgrade Facebook API to v.12 | +| 0.1.1 | 2021-09-30 | [6438](https://github.com/airbytehq/airbyte/pull/6438) | Annotate Oauth2 flow initialization parameters in connector specification | +| 0.1.0 | 2021-09-01 | [5158](https://github.com/airbytehq/airbyte/pull/5158) | Initial Release | diff --git a/docs/integrations/sources/fastbill.md b/docs/integrations/sources/fastbill.md index be2e8c85fdb..ceb76a9986f 100644 --- a/docs/integrations/sources/fastbill.md +++ b/docs/integrations/sources/fastbill.md @@ -1,4 +1,4 @@ -# Fastbill +# Fastbill This page contains the setup guide and reference information for the [Fastbill](https://www.fastbill.com/) source connector. @@ -24,7 +24,7 @@ You can find your Project ID and find or create an API key within [Fastbill](htt ### For Airbyte OSS: 1. Navigate to the Airbyte Open Source dashboard. -2. Set the name for your source. +2. Set the name for your source. 3. Enter your `project_id` - Fastbill Project ID. 4. Enter your `api_key` - Fastbill API key with read permissions. 5. Click **Set up source**. @@ -34,7 +34,7 @@ You can find your Project ID and find or create an API key within [Fastbill](htt The Fastbill source connector supports the following [sync modes](https://docs.airbyte.com/cloud/core-concepts#connection-sync-modes): | Feature | Supported? 
| -| :---------------- |:-----------| +| :---------------- | :--------- | | Full Refresh Sync | Yes | | Incremental Sync | No | | SSL connection | No | @@ -42,11 +42,11 @@ The Fastbill source connector supports the following [sync modes](https://docs.a ## Supported Streams -* [Customers](https://apidocs.fastbill.com/fastbill/de/customer.html#customer.get) -* [Invoices](https://apidocs.fastbill.com/fastbill/de/invoice.html#invoice.get) -* [Products](https://apidocs.fastbill.com/fastbill/de/recurring.html#recurring.get) -* [Recurring_invoices](https://apidocs.fastbill.com/fastbill/de/recurring.html#recurring.get) -* [Revenues](https://apidocs.fastbill.com/fastbill/de/revenue.html#revenue.get) +- [Customers](https://apidocs.fastbill.com/fastbill/de/customer.html#customer.get) +- [Invoices](https://apidocs.fastbill.com/fastbill/de/invoice.html#invoice.get) +- [Products](https://apidocs.fastbill.com/fastbill/de/recurring.html#recurring.get) +- [Recurring_invoices](https://apidocs.fastbill.com/fastbill/de/recurring.html#recurring.get) +- [Revenues](https://apidocs.fastbill.com/fastbill/de/revenue.html#revenue.get) ## Data type map @@ -59,11 +59,11 @@ The Fastbill source connector supports the following [sync modes](https://docs.a ## Changelog -| Version | Date | Pull Request | Subject | -|:--------|:------------|:---------------------------------------------------------|:--------------------------------------------------| -| 0.2.4 | 2024-04-19 | [37159](https://github.com/airbytehq/airbyte/pull/37159) | Updating to 0.80.0 CDK | -| 0.2.3 | 2024-04-18 | [37159](https://github.com/airbytehq/airbyte/pull/37159) | Manage dependencies with Poetry. | -| 0.2.2 | 2024-04-15 | [37159](https://github.com/airbytehq/airbyte/pull/37159) | Base image migration: remove Dockerfile and use the python-connector-base image | -| 0.2.1 | 2024-04-12 | [37159](https://github.com/airbytehq/airbyte/pull/37159) | schema descriptions | -| 0.2.0 | 2023-08-13 | [29390](https://github.com/airbytehq/airbyte/pull/29390) | Migrated to Low Code CDK | -| 0.1.0 | 2022-11-08 | [18522](https://github.com/airbytehq/airbyte/pull/18593) | New Source: Fastbill | +| Version | Date | Pull Request | Subject | +| :------ | :--------- | :------------------------------------------------------- | :------------------------------------------------------------------------------ | +| 0.2.4 | 2024-04-19 | [37159](https://github.com/airbytehq/airbyte/pull/37159) | Updating to 0.80.0 CDK | +| 0.2.3 | 2024-04-18 | [37159](https://github.com/airbytehq/airbyte/pull/37159) | Manage dependencies with Poetry. | +| 0.2.2 | 2024-04-15 | [37159](https://github.com/airbytehq/airbyte/pull/37159) | Base image migration: remove Dockerfile and use the python-connector-base image | +| 0.2.1 | 2024-04-12 | [37159](https://github.com/airbytehq/airbyte/pull/37159) | schema descriptions | +| 0.2.0 | 2023-08-13 | [29390](https://github.com/airbytehq/airbyte/pull/29390) | Migrated to Low Code CDK | +| 0.1.0 | 2022-11-08 | [18522](https://github.com/airbytehq/airbyte/pull/18593) | New Source: Fastbill | diff --git a/docs/integrations/sources/fauna.md b/docs/integrations/sources/fauna.md index 5e93bbd4800..198e356c393 100644 --- a/docs/integrations/sources/fauna.md +++ b/docs/integrations/sources/fauna.md @@ -21,60 +21,64 @@ Enter the domain of the collection's database that you are exporting. The URL ca Follow these steps if you want this connection to perform a full sync. 1. Create a role that can read the collection that you are exporting. 
You can create the role in the [Dashboard](https://dashboard.fauna.com/) or the [fauna shell](https://github.com/fauna/fauna-shell) with the following query: + ```javascript CreateRole({ name: "airbyte-readonly", privileges: [ { resource: Collections(), - actions: { read: true } + actions: { read: true }, }, { resource: Indexes(), - actions: { read: true } + actions: { read: true }, }, { resource: Collection("COLLECTION_NAME"), - actions: { read: true } - } + actions: { read: true }, + }, ], -}) +}); ``` Replace `COLLECTION_NAME` with the name of the collection configured for this connector. If you'd like to sync multiple collections, add an entry for each additional collection you'd like to sync. For example, to sync `users` and `products`, run this query instead: + ```javascript CreateRole({ name: "airbyte-readonly", privileges: [ { resource: Collections(), - actions: { read: true } + actions: { read: true }, }, { resource: Indexes(), - actions: { read: true } + actions: { read: true }, }, { resource: Collection("users"), - actions: { read: true } + actions: { read: true }, }, { resource: Collection("products"), - actions: { read: true } - } + actions: { read: true }, + }, ], -}) +}); ``` 2. Create a key with that role. You can create a key using this query: + ```javascript CreateKey({ name: "airbyte-readonly", role: Role("airbyte-readonly"), -}) +}); ``` + 3. Copy the `secret` output by the `CreateKey` command and enter that as the "Fauna Secret" on the left. **Important**: The secret is only ever displayed once. If you lose it, you would have to create a new key. @@ -83,16 +87,14 @@ CreateKey({ Follow these steps if you want this connection to perform incremental syncs. 1. Create the "Incremental Sync Index". This allows the connector to perform incremental syncs. You can create the index with the [fauna shell](https://github.com/fauna/fauna-shell) or in the [Dashboard](https://dashboard.fauna.com/) with the following query: + ```javascript CreateIndex({ name: "INDEX_NAME", source: Collection("COLLECTION_NAME"), terms: [], - values: [ - { "field": "ts" }, - { "field": "ref" } - ] -}) + values: [{ field: "ts" }, { field: "ref" }], +}); ``` Replace `COLLECTION_NAME` with the name of the collection configured for this connector. @@ -101,28 +103,29 @@ Replace `INDEX_NAME` with the name that you configured for the Incremental Sync Repeat this step for every collection you'd like to sync. 2. Create a role that can read the collection, the index, and the metadata of all indexes. It needs access to index metadata in order to validate the index settings. You can create the role with this query: + ```javascript CreateRole({ name: "airbyte-readonly", privileges: [ { resource: Collections(), - actions: { read: true } + actions: { read: true }, }, { resource: Indexes(), - actions: { read: true } + actions: { read: true }, }, { resource: Collection("COLLECTION_NAME"), - actions: { read: true } + actions: { read: true }, }, { resource: Index("INDEX_NAME"), - actions: { read: true } - } + actions: { read: true }, + }, ], -}) +}); ``` Replace `COLLECTION_NAME` with the name of the collection configured for this connector. @@ -131,46 +134,48 @@ Replace `INDEX_NAME` with the name that you configured for the Incremental Sync If you'd like to sync multiple collections, add an entry for every collection and index you'd like to sync. 
For example, to sync `users` and `products` with Incremental Sync, run the following query: + ```javascript CreateRole({ name: "airbyte-readonly", privileges: [ { resource: Collections(), - actions: { read: true } + actions: { read: true }, }, { resource: Indexes(), - actions: { read: true } + actions: { read: true }, }, { resource: Collection("users"), - actions: { read: true } + actions: { read: true }, }, { resource: Index("users-ts"), - actions: { read: true } + actions: { read: true }, }, { resource: Collection("products"), - actions: { read: true } + actions: { read: true }, }, { resource: Index("products-ts"), - actions: { read: true } - } + actions: { read: true }, + }, ], -}) +}); ``` - 3. Create a key with that role. You can create a key using this query: + ```javascript CreateKey({ name: "airbyte-readonly", role: Role("airbyte-readonly"), -}) +}); ``` + 4. Copy the `secret` output by the `CreateKey` command and enter that as the "Fauna Secret" on the left. **Important**: The secret is only ever displayed once. If you lose it, you would have to create a new key. @@ -182,7 +187,7 @@ Note that the `ref` column in the exported database contains only the document I reference (or "ref"). Since only one collection is involved in each connector configuration, it is inferred that the document ID refers to a document within the synced collection. -| Fauna Type | Format | Note | +| Fauna Type | Format | Note | | ----------------------------------------------------------------------------------- | ------------------------------------------------------------------- | ------------------------------------------- | | [Document Ref](https://docs.fauna.com/fauna/current/learn/understanding/types#ref) | `{ id: "id", "collection": "collection-name", "type": "document" }` | | | [Other Ref](https://docs.fauna.com/fauna/current/learn/understanding/types#ref) | `{ id: "id", "type": "ref-type" }` | This includes all other refs, listed below. | @@ -195,7 +200,7 @@ inferred that the document ID refers to a document within the synced collection. Every ref is serialized as a JSON object with 2 or 3 fields, as listed above. The `type` field must be one of these strings: -| Reference Type | `type` string | +| Reference Type | `type` string | | --------------------------------------------------------------------------------------- | ------------------- | | Document | `"document"` | | [Collection](https://docs.fauna.com/fauna/current/api/fql/functions/collection) | `"collection"` | diff --git a/docs/integrations/sources/file.md b/docs/integrations/sources/file.md index 4c90ef11458..4817951c4dc 100644 --- a/docs/integrations/sources/file.md +++ b/docs/integrations/sources/file.md @@ -26,16 +26,19 @@ This page contains the setup guide and reference information for the Files sourc 1. For **Storage Provider**, use the dropdown menu to select the _Storage Provider_ or _Location_ of the file(s) which should be replicated, then configure the provider-specific fields as needed: #### HTTPS: Public Web [Default] + - `User-Agent` (Optional) Set this to active if you want to add the [User-Agent header](https://developer.mozilla.org/en-US/docs/Web/HTTP/Headers/User-Agent) to requests (inactive by default). #### GCS: Google Cloud Storage + - `Service Account JSON` (Required for **private** buckets) To access **private** buckets stored on Google Cloud, this connector requires a service account JSON credentials file with the appropriate permissions. 
A detailed breakdown of this topic can be found at the [Google Cloud service accounts page](https://cloud.google.com/iam/docs/service-accounts). Please generate the "credentials.json" file and copy its content to this field, ensuring it is in JSON format. **If you are accessing publicly available data**, this field is not required. #### S3: Amazon Web Services + - `AWS Access Key ID` (Required for **private** buckets) - `AWS Secret Access Key` (Required for **private** buckets) @@ -45,6 +48,7 @@ More information on setting permissions in AWS can be found [here](https://docs.aws.amazon.com/IAM/latest/UserGuide/access_policies.html). **If you are accessing publicly available data**, these fields are not required. #### AzBlob: Azure Blob Storage + - `Storage Account` (Required) This is the globally unique name of the storage account that the desired blob sits within. See the [Azure documentation](https://docs.microsoft.com/en-us/azure/storage/common/storage-account-overview) for more details. @@ -55,21 +59,27 @@ This is the globally unique name of the storage account that the desired blob si - `Shared Key`: [Find more information here](https://learn.microsoft.com/en-us/rest/api/storageservices/authorize-with-shared-key). #### SSH: Secure Shell / SCP: Secure Copy Protocol / SFTP: Secure File Transfer Protocol + - `Host` (Required) Enter the _hostname_ or _IP address_ of the remote server where the file trasfer will take place. + - `User` (Required) Enter the _username_ associated with your account on the remote server. + - `Password` (Optional) **If required by the remote server**, enter the _password_ associated with your user account. Otherwise, leave this field blank. + - `Port` (Optional) Specify the _port number_ to use for the connection. The default port is usually 22. However, if your remote server uses a non-standard port, you can enter the appropriate port number here. + #### Local Filesystem (Airbyte Open Source only) + - `Storage` :::caution @@ -77,14 +87,17 @@ Currently, the local storage URL for reading must start with the local mount "/l ::: Please note that if you are replicating data from a locally stored file on Windows OS, you will need to open the `.env` file in your local Airbyte root folder and change the values for: + - `LOCAL_ROOT` - `LOCAL_DOCKER_MOUNT` - `HACK_LOCAL_ROOT_PARENT` Please set these to an existing absolute path on your machine. Colons in the path need to be replaced with a double forward slash, `//`. `LOCAL_ROOT` & `LOCAL_DOCKER_MOUNT` should be set to the same value, and `HACK_LOCAL_ROOT_PARENT` should be set to their parent directory. + ### Step 3: Complete the connector setup + 1. For **URL**, enter the _URL path_ of the file to be replicated. :::note @@ -127,7 +140,7 @@ This connector does not support syncing unstructured data files such as raw text ## Supported sync modes | Feature | Supported? | -|------------------------------------------|------------| +| ---------------------------------------- | ---------- | | Full Refresh Sync | Yes | | Incremental Sync | No | | Replicate Incremental Deletes | No | @@ -141,7 +154,7 @@ This source produces a single table for the target file as it replicates only on ## File / Stream Compression | Compression | Supported? | -|-------------|------------| +| ----------- | ---------- | | Gzip | Yes | | Zip | Yes | | Bzip2 | No | @@ -152,7 +165,7 @@ This source produces a single table for the target file as it replicates only on ## Storage Providers | Storage Providers | Supported? 
| -|------------------------|-------------------------------------------------| +| ---------------------- | ----------------------------------------------- | | HTTPS | Yes | | Google Cloud Storage | Yes | | Amazon Web Services S3 | Yes | @@ -163,7 +176,7 @@ This source produces a single table for the target file as it replicates only on ### File Formats | Format | Supported? | -|-----------------------|------------| +| --------------------- | ---------- | | CSV | Yes | | JSON/JSONL | Yes | | HTML | No | @@ -185,7 +198,7 @@ Normally, Airbyte tries to infer the data type from the source, but you can use Here are a list of examples of possible file inputs: | Dataset Name | Storage | URL | Reader Impl | Service Account | Description | -|-------------------|---------|------------------------------------------------------------------------------------------------------------------------------------------------------------|--------------------|------------------------------------------------------------------|-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------| +| ----------------- | ------- | ---------------------------------------------------------------------------------------------------------------------------------------------------------- | ------------------ | ---------------------------------------------------------------- | --------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------- | | epidemiology | HTTPS | [https://storage.googleapis.com/covid19-open-data/v2/latest/epidemiology.csv](https://storage.googleapis.com/covid19-open-data/v2/latest/epidemiology.csv) | | | [COVID-19 Public dataset](https://console.cloud.google.com/marketplace/product/bigquery-public-datasets/covid19-public-data-program?filter=solution-type:dataset&id=7d6cc408-53c8-4485-a187-b8cb9a5c0b56) on BigQuery | | hr_and_financials | GCS | gs://airbyte-vault/financial.csv | smart_open or gcfs | `{"type": "service_account", "private_key_id": "XXXXXXXX", ...}` | data from a private bucket, a service account is necessary | | landsat_index | GCS | gcp-public-data-landsat/index.csv.gz | smart_open | | Using smart_open, we don't need to specify the compression (note the gs:// is optional too, same for other providers) | @@ -193,7 +206,7 @@ Here are a list of examples of possible file inputs: Examples with reader options: | Dataset Name | Storage | URL | Reader Impl | Reader Options | Description | -|---------------|---------|-------------------------------------------------|-------------|---------------------------------|--------------------------------------------------------------------------------------------------------------------------------------------------| +| ------------- | ------- | ----------------------------------------------- | ----------- | ------------------------------- | ------------------------------------------------------------------------------------------------------------------------------------------------ | | landsat_index | GCS | gs://gcp-public-data-landsat/index.csv.gz | GCFS | `{"compression": "gzip"}` | Additional reader options to specify a compression option to `read_csv` | | GDELT | S3 | s3://gdelt-open-data/events/20190914.export.csv | | `{"sep": "\t", "header": null}` | Here 
is TSV data separated by tabs without header row from [AWS Open Data](https://registry.opendata.aws/gdelt/) | | server_logs | local | /local/logs.log | | `{"sep": ";"}` | After making sure a local text file exists at `/tmp/airbyte_local/logs.log` with logs file from some server that are delimited by ';' delimiters | @@ -201,7 +214,7 @@ Examples with reader options: Example for SFTP: | Dataset Name | Storage | User | Password | Host | URL | Reader Options | Description | -|--------------|---------|------|----------|-----------------|-------------------------|---------------------------------------------------------------------------|-----------------------------------------------------------------------------------------------------------------------------------| +| ------------ | ------- | ---- | -------- | --------------- | ----------------------- | ------------------------------------------------------------------------- | --------------------------------------------------------------------------------------------------------------------------------- | | Test Rebext | SFTP | demo | password | test.rebext.net | /pub/example/readme.txt | `{"sep": "\r\n", "header": null, "names": \["text"], "engine": "python"}` | We use `python` engine for `read_csv` in order to handle delimiter of more than 1 character while providing our own column names. | Please see (or add) more at `airbyte-integrations/connectors/source-file/integration_tests/integration_source_test.py` for further usages examples. @@ -217,7 +230,7 @@ In order to read large files from a remote location, this connector uses the [sm ## Changelog | Version | Date | Pull Request | Subject | -|:--------|:-----------|:---------------------------------------------------------|:--------------------------------------------------------------------------------------------------------| +| :------ | :--------- | :------------------------------------------------------- | :------------------------------------------------------------------------------------------------------ | | 0.5.1 | 2024-05-03 | [37799](https://github.com/airbytehq/airbyte/pull/37799) | Add fastparquet engine for parquet file reader. | | 0.5.0 | 2024-03-19 | [36267](https://github.com/airbytehq/airbyte/pull/36267) | Pin airbyte-cdk version to `^0` | | 0.4.1 | 2024-03-04 | [35800](https://github.com/airbytehq/airbyte/pull/35800) | Add PyAirbyte support on Python 3.11 | diff --git a/docs/integrations/sources/firebase-realtime-database.md b/docs/integrations/sources/firebase-realtime-database.md index 0304a9ac94a..28dd6d294b0 100644 --- a/docs/integrations/sources/firebase-realtime-database.md +++ b/docs/integrations/sources/firebase-realtime-database.md @@ -12,12 +12,12 @@ If your database has data as below at path `https://{your-database-name}.firebas ```json { - "liam": {"address": "somewhere", "age": 24}, - "olivia": {"address": "somewhere", "age": 30} + "liam": { "address": "somewhere", "age": 24 }, + "olivia": { "address": "somewhere", "age": 30 } } ``` -and you specified a `store-a/users` as a path in configuration, you would sync records like below ... +and you specified a `store-a/users` as a path in configuration, you would sync records like below ... 
```json {"key": "liam", "value": "{\"address\": \"somewhere\", \"age\": 24}}"} @@ -26,12 +26,12 @@ and you specified a `store-a/users` as a path in configuration, you would sync ### Features -| Feature | Supported | Notes | -| :--- | :--- | :--- | -| Full Refresh Sync | Yes | | -| Incremental Sync | No | | -| Change Data Capture | No | | -| SSL Support | Yes | | +| Feature | Supported | Notes | +| :------------------ | :-------- | :---- | +| Full Refresh Sync | Yes | | +| Incremental Sync | No | | +| Change Data Capture | No | | +| SSL Support | Yes | | ## Getting started @@ -39,9 +39,9 @@ and you specified a `store-a/users` as a path in configuration, you would sync To use the Firebase Realtime Database source, you'll need: -* A Google Cloud Project with Firebase enabled -* A Google Cloud Service Account with the "Firebase Realtime Database Viewer" roles in your Google Cloud project -* A Service Account Key to authenticate into your Service Account +- A Google Cloud Project with Firebase enabled +- A Google Cloud Service Account with the "Firebase Realtime Database Viewer" roles in your Google Cloud project +- A Service Account Key to authenticate into your Service Account See the setup guide for more information about how to create the required resources. @@ -65,10 +65,10 @@ Follow the [Creating and Managing Service Account Keys](https://cloud.google.com You should now have all the requirements needed to configure Firebase Realtime Database as a source in the UI. You'll need the following information to configure the Firebase Realtime Database source: -* **Database Name** -* **Service Account Key JSON**: the contents of your Service Account Key JSON file. -* **Node Path \[Optional\]**: node path in your database's data which you want to sync. default value is ""(root node). -* **Buffer Size \[Optional\]**: number of records to fetch at one time (buffered). default value is 10000. +- **Database Name** +- **Service Account Key JSON**: the contents of your Service Account Key JSON file. +- **Node Path \[Optional\]**: node path in your database's data which you want to sync. default value is ""(root node). +- **Buffer Size \[Optional\]**: number of records to fetch at one time (buffered). default value is 10000. Once you've configured Firebase Realtime Database as a source, delete the Service Account Key from your computer. @@ -76,7 +76,6 @@ Once you've configured Firebase Realtime Database as a source, delete the Servic ### source-firebase-realtime-database -| Version | Date | Pull Request | Subject | -| :--- | :--- | :--- | :--- | -| 0.1.0 | 2022-10-16 | [\#18029](https://github.com/airbytehq/airbyte/pull/18029) | 🎉 New Source: Firebase Realtime Database. | - +| Version | Date | Pull Request | Subject | +| :------ | :--------- | :--------------------------------------------------------- | :----------------------------------------- | +| 0.1.0 | 2022-10-16 | [\#18029](https://github.com/airbytehq/airbyte/pull/18029) | 🎉 New Source: Firebase Realtime Database. | diff --git a/docs/integrations/sources/firebolt.md b/docs/integrations/sources/firebolt.md index c5927123766..ffb1c6bc76f 100644 --- a/docs/integrations/sources/firebolt.md +++ b/docs/integrations/sources/firebolt.md @@ -49,9 +49,9 @@ You can now use the Airbyte Firebolt source. 
## Changelog -| Version | Date | Pull Request | Subject | -| :------ | :--------- | :------------------------------------------------------- | :------------------------------------------------ | +| Version | Date | Pull Request | Subject | +| :------ | :--------- | :------------------------------------------------------- | :----------------------------------------------------------- | | 1.0.0 | 2023-07-20 | [21842](https://github.com/airbytehq/airbyte/pull/21842) | PGDate, TimestampTZ, TimestampNTZ and Boolean column support | -| 0.2.1 | 2022-05-10 | [25965](https://github.com/airbytehq/airbyte/pull/25965) | Fix DATETIME conversion to Airbyte date-time type | -| 0.2.0 | 2022-09-09 | [16583](https://github.com/airbytehq/airbyte/pull/16583) | Reading from views | -| 0.1.0 | 2022-04-28 | [13874](https://github.com/airbytehq/airbyte/pull/13874) | Create Firebolt source | +| 0.2.1 | 2022-05-10 | [25965](https://github.com/airbytehq/airbyte/pull/25965) | Fix DATETIME conversion to Airbyte date-time type | +| 0.2.0 | 2022-09-09 | [16583](https://github.com/airbytehq/airbyte/pull/16583) | Reading from views | +| 0.1.0 | 2022-04-28 | [13874](https://github.com/airbytehq/airbyte/pull/13874) | Create Firebolt source | diff --git a/docs/integrations/sources/flexport.md b/docs/integrations/sources/flexport.md index 20cb5f41a8d..195de747506 100644 --- a/docs/integrations/sources/flexport.md +++ b/docs/integrations/sources/flexport.md @@ -16,25 +16,25 @@ This Source is capable of syncing the following data as streams: ### Data type mapping -| Integration Type | Airbyte Type | Notes | -| :--- | :--- | :--- | -| `number` | `number` | float number | -| `integer` | `integer` | whole number | -| `date` | `string` | FORMAT YYYY-MM-DD | -| `datetime` | `string` | FORMAT YYYY-MM-DDThh:mm:ss | -| `array` | `array` | | -| `boolean` | `boolean` | True/False | -| `string` | `string` | | +| Integration Type | Airbyte Type | Notes | +| :--------------- | :----------- | :------------------------- | +| `number` | `number` | float number | +| `integer` | `integer` | whole number | +| `date` | `string` | FORMAT YYYY-MM-DD | +| `datetime` | `string` | FORMAT YYYY-MM-DDThh:mm:ss | +| `array` | `array` | | +| `boolean` | `boolean` | True/False | +| `string` | `string` | | ### Features -| Feature | Supported?\(Yes/No\) | Notes | -| :--- | :--- | :--- | -| Full Refresh Overwrite Sync | Yes | | -| Full Refresh Append Sync | Yes | | -| Incremental - Append Sync | Yes | | -| Incremental - Append + Deduplication Sync | Yes | | -| Namespaces | No | | +| Feature | Supported?\(Yes/No\) | Notes | +| :---------------------------------------- | :------------------- | :---- | +| Full Refresh Overwrite Sync | Yes | | +| Full Refresh Append Sync | Yes | | +| Incremental - Append Sync | Yes | | +| Incremental - Append + Deduplication Sync | Yes | | +| Namespaces | No | | ## Getting started @@ -44,8 +44,8 @@ Authentication uses a pre-created API token which can be [created in the UI](htt ## Changelog -| Version | Date | Pull Request | Subject | -| :--- | :--- | :--- | :--- | -| 0.2.0 | 2023-08-23 | [29151](https://github.com/airbytehq/airbyte/pull/29151) | Migrate to low-code | -| 0.1.1 | 2022-07-26 | [15033](https://github.com/airbytehq/airbyte/pull/15033) | Source Flexport: Update schemas | -| 0.1.0 | 2021-12-14 | [8777](https://github.com/airbytehq/airbyte/pull/8777) | New Source: Flexport | +| Version | Date | Pull Request | Subject | +| :------ | :--------- | :------------------------------------------------------- | 
:------------------------------ | +| 0.2.0 | 2023-08-23 | [29151](https://github.com/airbytehq/airbyte/pull/29151) | Migrate to low-code | +| 0.1.1 | 2022-07-26 | [15033](https://github.com/airbytehq/airbyte/pull/15033) | Source Flexport: Update schemas | +| 0.1.0 | 2021-12-14 | [8777](https://github.com/airbytehq/airbyte/pull/8777) | New Source: Flexport | diff --git a/docs/integrations/sources/freshcaller.md b/docs/integrations/sources/freshcaller.md index 72e9dd857f7..b2760ef319c 100644 --- a/docs/integrations/sources/freshcaller.md +++ b/docs/integrations/sources/freshcaller.md @@ -8,21 +8,21 @@ The Freshcaller source supports full refresh and incremental sync. Depending on The following endpoints are supported from this source: -* [Users](https://developers.freshcaller.com/api/#users) -* [Teams](https://developers.freshcaller.com/api/#teams) -* [Calls](https://developers.freshcaller.com/api/#calls) -* [Call Metrics](https://developers.freshcaller.com/api/#call-metrics) +- [Users](https://developers.freshcaller.com/api/#users) +- [Teams](https://developers.freshcaller.com/api/#teams) +- [Calls](https://developers.freshcaller.com/api/#calls) +- [Call Metrics](https://developers.freshcaller.com/api/#call-metrics) If there are more endpoints you'd like Airbyte to support, please [create an issue.](https://github.com/airbytehq/airbyte/issues/new/choose) ### Features -| Feature | Supported? | -| :--- | :--- | -| Full Refresh Sync | Yes | -| Incremental Sync | Yes | -| SSL connection | Yes | -| Namespaces | No | +| Feature | Supported? | +| :---------------- | :--------- | +| Full Refresh Sync | Yes | +| Incremental Sync | Yes | +| SSL connection | Yes | +| Namespaces | No | ### Performance considerations @@ -32,8 +32,8 @@ The Freshcaller connector should not run into Freshcaller API limitations under ### Requirements -* Freshcaller Account -* Freshcaller API Key +- Freshcaller Account +- Freshcaller API Key ### Setup guide @@ -41,9 +41,9 @@ Please read [How to find your API key](https://support.freshdesk.com/en/support/ ## Changelog -| Version | Date | Pull Request | Subject | -| :------ | :--------- | :------------------------------------------------------- | :---------------------------------------------------------- | -| 0.3.1 | 2023-11-28 | [32874](https://github.com/airbytehq/airbyte/pull/32874) | 🐛 Source: fix page_size_option parameter in spec | -| 0.3.0 | 2023-10-24 | [31102](https://github.com/airbytehq/airbyte/pull/14759) | ✨ Source: Migrate to Low Code CDK | -| 0.2.0 | 2023-05-15 | [26065](https://github.com/airbytehq/airbyte/pull/26065) | Fix spec type check for `start_date` | -| 0.1.0 | 2022-08-11 | [14759](https://github.com/airbytehq/airbyte/pull/14759) | 🎉 New Source: Freshcaller | +| Version | Date | Pull Request | Subject | +| :------ | :--------- | :------------------------------------------------------- | :------------------------------------------------ | +| 0.3.1 | 2023-11-28 | [32874](https://github.com/airbytehq/airbyte/pull/32874) | 🐛 Source: fix page_size_option parameter in spec | +| 0.3.0 | 2023-10-24 | [31102](https://github.com/airbytehq/airbyte/pull/14759) | ✨ Source: Migrate to Low Code CDK | +| 0.2.0 | 2023-05-15 | [26065](https://github.com/airbytehq/airbyte/pull/26065) | Fix spec type check for `start_date` | +| 0.1.0 | 2022-08-11 | [14759](https://github.com/airbytehq/airbyte/pull/14759) | 🎉 New Source: Freshcaller | diff --git a/docs/integrations/sources/freshdesk.md b/docs/integrations/sources/freshdesk.md index da3c7f6ed60..0bfb5648ed0 100644 --- 
a/docs/integrations/sources/freshdesk.md +++ b/docs/integrations/sources/freshdesk.md @@ -68,7 +68,7 @@ If you don't use the start date Freshdesk will retrieve only the last 30 days. M ## Changelog | Version | Date | Pull Request | Subject | -|:--------|:-----------| :------------------------------------------------------- |:--------------------------------------------------------------------------------------| +| :------ | :--------- | :------------------------------------------------------- | :------------------------------------------------------------------------------------ | | 3.1.0 | 2024-03-12 | [35699](https://github.com/airbytehq/airbyte/pull/35699) | Migrate to low-code | | 3.0.7 | 2024-02-12 | [35187](https://github.com/airbytehq/airbyte/pull/35187) | Manage dependencies with Poetry. | | 3.0.6 | 2024-01-10 | [34101](https://github.com/airbytehq/airbyte/pull/34101) | Base image migration: remove Dockerfile and use the python-connector-base image | diff --git a/docs/integrations/sources/freshsales-migrations.md b/docs/integrations/sources/freshsales-migrations.md index 42b98fbb668..6a6dd4162e9 100644 --- a/docs/integrations/sources/freshsales-migrations.md +++ b/docs/integrations/sources/freshsales-migrations.md @@ -2,6 +2,6 @@ ## Upgrading to 1.0.0 -This version migrates the Freshsales connector to our low-code framework for greater maintainability. +This version migrates the Freshsales connector to our low-code framework for greater maintainability. As part of this release, we've also updated data types across streams to match the correct return types from the upstream API. You will need to run a reset on connections using this connector after upgrading to continue syncing. diff --git a/docs/integrations/sources/freshsales.md b/docs/integrations/sources/freshsales.md index b8b100a0301..88e472c0559 100644 --- a/docs/integrations/sources/freshsales.md +++ b/docs/integrations/sources/freshsales.md @@ -4,9 +4,9 @@ This page contains the setup guide and reference information for the Freshsales ## Prerequisites -* Freshsales Account -* Freshsales API Key -* Freshsales Domain Name +- Freshsales Account +- Freshsales API Key +- Freshsales Domain Name Please read [How to find your API key](https://crmsupport.freshworks.com/support/solutions/articles/50000002503-how-to-find-my-api-key-). @@ -23,7 +23,6 @@ Please read [How to find your API key](https://crmsupport.freshworks.com/support 5. Enter your `API Key` obtained from [these steps](https://crmsupport.freshworks.com/support/solutions/articles/50000002503-how-to-find-my-api-key-) 6. Click **Set up source** - ### For Airbyte OSS: 1. Navigate to the Airbyte Open Source dashboard @@ -33,30 +32,28 @@ Please read [How to find your API key](https://crmsupport.freshworks.com/support 5. Enter your `API Key` obtained from [these steps](https://crmsupport.freshworks.com/support/solutions/articles/50000002503-how-to-find-my-api-key-) 6. Click **Set up source** - ## Supported sync modes | Feature | Supported? 
| -|:------------------|:-----------| +| :---------------- | :--------- | | Full Refresh Sync | Yes | | Incremental Sync | No | | SSL connection | No | | Namespaces | No | - ## Supported Streams Several output streams are available from this source: -* [Contacts](https://developers.freshworks.com/crm/api/#contacts) -* [Accounts](https://developers.freshworks.com/crm/api/#accounts) -* [Open Deals](https://developers.freshworks.com/crm/api/#deals) -* [Won Deals](https://developers.freshworks.com/crm/api/#deals) -* [Lost Deals](https://developers.freshworks.com/crm/api/#deals) -* [Open Tasks](https://developers.freshworks.com/crm/api/#tasks) -* [Completed Tasks](https://developers.freshworks.com/crm/api/#tasks) -* [Past appointments](https://developers.freshworks.com/crm/api/#appointments) -* [Upcoming appointments](https://developers.freshworks.com/crm/api/#appointments) +- [Contacts](https://developers.freshworks.com/crm/api/#contacts) +- [Accounts](https://developers.freshworks.com/crm/api/#accounts) +- [Open Deals](https://developers.freshworks.com/crm/api/#deals) +- [Won Deals](https://developers.freshworks.com/crm/api/#deals) +- [Lost Deals](https://developers.freshworks.com/crm/api/#deals) +- [Open Tasks](https://developers.freshworks.com/crm/api/#tasks) +- [Completed Tasks](https://developers.freshworks.com/crm/api/#tasks) +- [Past appointments](https://developers.freshworks.com/crm/api/#appointments) +- [Upcoming appointments](https://developers.freshworks.com/crm/api/#appointments) If there are more endpoints you'd like Airbyte to support, please [create an issue.](https://github.com/airbytehq/airbyte/issues/new/choose) @@ -64,14 +61,13 @@ If there are more endpoints you'd like Airbyte to support, please [create an iss The Freshsales connector should not run into Freshsales API limitations under normal usage. Please [create an issue](https://github.com/airbytehq/airbyte/issues) if you see any rate limit issues that are not automatically retried successfully. 
- ## Changelog -| Version | Date | Pull Request | Subject | -|:--------|:-----------|:---------------------------------------------------------|:--------------------------------| -| 1.0.0 | 2023-10-21 | [31685](https://github.com/airbytehq/airbyte/pull/31685) | Migrate to Low-Code CDK | -| 0.1.4 | 2023-03-23 | [24396](https://github.com/airbytehq/airbyte/pull/24396) | Certify to Beta | +| Version | Date | Pull Request | Subject | +| :------ | :--------- | :------------------------------------------------------- | :----------------------------------------------------------- | +| 1.0.0 | 2023-10-21 | [31685](https://github.com/airbytehq/airbyte/pull/31685) | Migrate to Low-Code CDK | +| 0.1.4 | 2023-03-23 | [24396](https://github.com/airbytehq/airbyte/pull/24396) | Certify to Beta | | 0.1.3 | 2023-03-16 | [24155](https://github.com/airbytehq/airbyte/pull/24155) | Set `additionalProperties` to `True` in `spec` to support BC | -| 0.1.2 | 2022-07-14 | [00000](https://github.com/airbytehq/airbyte/pull/00000) | Tune the `get_view_id` function | -| 0.1.1 | 2021-12-24 | [9101](https://github.com/airbytehq/airbyte/pull/9101) | Update fields and descriptions | -| 0.1.0 | 2021-11-03 | [6963](https://github.com/airbytehq/airbyte/pull/6963) | 🎉 New Source: Freshsales | +| 0.1.2 | 2022-07-14 | [00000](https://github.com/airbytehq/airbyte/pull/00000) | Tune the `get_view_id` function | +| 0.1.1 | 2021-12-24 | [9101](https://github.com/airbytehq/airbyte/pull/9101) | Update fields and descriptions | +| 0.1.0 | 2021-11-03 | [6963](https://github.com/airbytehq/airbyte/pull/6963) | 🎉 New Source: Freshsales | diff --git a/docs/integrations/sources/freshservice.md b/docs/integrations/sources/freshservice.md index ef87a8d3a0a..70e1e39e68b 100644 --- a/docs/integrations/sources/freshservice.md +++ b/docs/integrations/sources/freshservice.md @@ -8,30 +8,30 @@ The Freshservice supports full refresh syncs. 
You can choose if this connector w Several output streams are available from this source: -* [Tickets](https://api.freshservice.com/v2/#view_all_ticket) (Incremental) -* [Problems](https://api.freshservice.com/v2/#problems) (Incremental) -* [Changes](https://api.freshservice.com/v2/#changes) (Incremental) -* [Releases](https://api.freshservice.com/v2/#releases) (Incremental) -* [Requesters](https://api.freshservice.com/v2/#requesters) -* [Agents](https://api.freshservice.com/v2/#agents) -* [Locations](https://api.freshservice.com/v2/#locations) -* [Products](https://api.freshservice.com/v2/#products) -* [Vendors](https://api.freshservice.com/v2/#vendors) -* [Assets](https://api.freshservice.com/v2/#assets) -* [PurchaseOrders](https://api.freshservice.com/v2/#purchase-order) -* [Software](https://api.freshservice.com/v2/#software) -* [Satisfaction Survey Responses](https://api.freshservice.com/#ticket_csat_attributes) +- [Tickets](https://api.freshservice.com/v2/#view_all_ticket) (Incremental) +- [Problems](https://api.freshservice.com/v2/#problems) (Incremental) +- [Changes](https://api.freshservice.com/v2/#changes) (Incremental) +- [Releases](https://api.freshservice.com/v2/#releases) (Incremental) +- [Requesters](https://api.freshservice.com/v2/#requesters) +- [Agents](https://api.freshservice.com/v2/#agents) +- [Locations](https://api.freshservice.com/v2/#locations) +- [Products](https://api.freshservice.com/v2/#products) +- [Vendors](https://api.freshservice.com/v2/#vendors) +- [Assets](https://api.freshservice.com/v2/#assets) +- [PurchaseOrders](https://api.freshservice.com/v2/#purchase-order) +- [Software](https://api.freshservice.com/v2/#software) +- [Satisfaction Survey Responses](https://api.freshservice.com/#ticket_csat_attributes) If there are more endpoints you'd like Airbyte to support, please [create an issue.](https://github.com/airbytehq/airbyte/issues/new/choose) ### Features -| Feature | Supported? | -| :--- | :--- | -| Full Refresh Sync | Yes | -| Incremental Sync | Yes | -| SSL connection | No | -| Namespaces | No | +| Feature | Supported? | +| :---------------- | :--------- | +| Full Refresh Sync | Yes | +| Incremental Sync | Yes | +| SSL connection | No | +| Namespaces | No | ### Performance considerations @@ -41,10 +41,10 @@ The Freshservice connector should not run into Freshservice API limitations unde ### Requirements -* Freshservice Account -* Freshservice API Key -* Freshservice domain name -* Replciation Start Date +- Freshservice Account +- Freshservice API Key +- Freshservice domain name +- Replciation Start Date ### Setup guide @@ -52,16 +52,16 @@ Please read [How to find your API key](https://api.freshservice.com/#authenticat ## Changelog -| Version | Date | Pull Request | Subject | -| :--- | :--- | :--- | :--- | -| 1.3.5 | 2024-04-19 | [37162](https://github.com/airbytehq/airbyte/pull/37162) | Updating to 0.80.0 CDK | -| 1.3.4 | 2024-04-18 | [37162](https://github.com/airbytehq/airbyte/pull/37162) | Manage dependencies with Poetry. 
| -| 1.3.3 | 2024-04-15 | [37162](https://github.com/airbytehq/airbyte/pull/37162) | Base image migration: remove Dockerfile and use the python-connector-base image | -| 1.3.2 | 2024-04-12 | [37162](https://github.com/airbytehq/airbyte/pull/37162) | schema descriptions | -| 1.3.1 | 2024-01-29 | [34633](https://github.com/airbytehq/airbyte/pull/34633) | Add backoff policy for `Requested Items` stream | -| 1.3.0 | 2024-01-15 | [29126](https://github.com/airbytehq/airbyte/pull/29126) | Add `Requested Items` stream | -| 1.2.0 | 2023-08-06 | [29126](https://github.com/airbytehq/airbyte/pull/29126) | Migrated to Low-Code CDK | -| 1.1.0 | 2023-05-09 | [25929](https://github.com/airbytehq/airbyte/pull/25929) | Add stream for customer satisfaction survey responses endpoint | -| 1.0.0 | 2023-05-02 | [25743](https://github.com/airbytehq/airbyte/pull/25743) | Correct data types in tickets, agents and requesters schemas to match Freshservice API | -| 0.1.1 | 2021-12-28 | [9143](https://github.com/airbytehq/airbyte/pull/9143) | Update titles and descriptions | -| 0.1.0 | 2021-10-29 | [6967](https://github.com/airbytehq/airbyte/pull/6967) | 🎉 New Source: Freshservice | +| Version | Date | Pull Request | Subject | +| :------ | :--------- | :------------------------------------------------------- | :------------------------------------------------------------------------------------- | +| 1.3.5 | 2024-04-19 | [37162](https://github.com/airbytehq/airbyte/pull/37162) | Updating to 0.80.0 CDK | +| 1.3.4 | 2024-04-18 | [37162](https://github.com/airbytehq/airbyte/pull/37162) | Manage dependencies with Poetry. | +| 1.3.3 | 2024-04-15 | [37162](https://github.com/airbytehq/airbyte/pull/37162) | Base image migration: remove Dockerfile and use the python-connector-base image | +| 1.3.2 | 2024-04-12 | [37162](https://github.com/airbytehq/airbyte/pull/37162) | schema descriptions | +| 1.3.1 | 2024-01-29 | [34633](https://github.com/airbytehq/airbyte/pull/34633) | Add backoff policy for `Requested Items` stream | +| 1.3.0 | 2024-01-15 | [29126](https://github.com/airbytehq/airbyte/pull/29126) | Add `Requested Items` stream | +| 1.2.0 | 2023-08-06 | [29126](https://github.com/airbytehq/airbyte/pull/29126) | Migrated to Low-Code CDK | +| 1.1.0 | 2023-05-09 | [25929](https://github.com/airbytehq/airbyte/pull/25929) | Add stream for customer satisfaction survey responses endpoint | +| 1.0.0 | 2023-05-02 | [25743](https://github.com/airbytehq/airbyte/pull/25743) | Correct data types in tickets, agents and requesters schemas to match Freshservice API | +| 0.1.1 | 2021-12-28 | [9143](https://github.com/airbytehq/airbyte/pull/9143) | Update titles and descriptions | +| 0.1.0 | 2021-10-29 | [6967](https://github.com/airbytehq/airbyte/pull/6967) | 🎉 New Source: Freshservice | diff --git a/docs/integrations/sources/gainsight-px.md b/docs/integrations/sources/gainsight-px.md index 171df4fda9b..8a23d1133e3 100644 --- a/docs/integrations/sources/gainsight-px.md +++ b/docs/integrations/sources/gainsight-px.md @@ -4,7 +4,7 @@ This page contains the setup guide and reference information for the [Gainsight- ## Prerequisites -Api key is mandate for this connector to work, It could be generated from the dashboard settings (ref - https://app.aptrinsic.com/settings/api-keys). +Api key is mandate for this connector to work, It could be generated from the dashboard settings (ref - https://app.aptrinsic.com/settings/api-keys). 
## Setup guide @@ -13,7 +13,7 @@ Api key is mandate for this connector to work, It could be generated from the da - Generate an API key (Example: 12345) - Params (If specific info is needed) - Available params - - api_key: The aptrinsic api_key + - api_key: The aptrinsic api_key ## Step 2: Set up the Gainsight-APIs connector in Airbyte @@ -23,8 +23,8 @@ Api key is mandate for this connector to work, It could be generated from the da 2. In the left navigation bar, click **Sources**. In the top-right corner, click **+new source**. 3. On the Set up the source page, enter the name for the Gainsight-API connector and select **Gainsight-API** from the Source type dropdown. 4. Enter your `api_key`. -5. Enter the params configuration if needed. Supported params are: query, orientation, size, color, locale, collection_id \ -video_id, photo_id +5. Enter the params configuration if needed. Supported params are: query, orientation, size, color, locale, collection_id \ + video_id, photo_id 6. Click **Set up source**. ### For Airbyte OSS: @@ -32,8 +32,8 @@ video_id, photo_id 1. Navigate to the Airbyte Open Source dashboard. 2. Set the name for your source. 3. Enter your `api_key`. -4. Enter the params configuration if needed. Supported params are: query, orientation, size, color, locale, collection_id \ -video_id, photo_id +4. Enter the params configuration if needed. Supported params are: query, orientation, size, color, locale, collection_id \ + video_id, photo_id 5. Click **Set up source**. ## Supported sync modes @@ -70,6 +70,6 @@ Gainsight-PX-API's [API reference](https://gainsightpx.docs.apiary.io/) has v1 a ## Changelog | Version | Date | Pull Request | Subject | -|:--------|:-----------|:---------------------------------------------------------|:----------------------------------------| +| :------ | :--------- | :------------------------------------------------------- | :-------------------------------------- | | 0.1.1 | 2024-05-03 | [37593](https://github.com/airbytehq/airbyte/pull/37593) | Changed `last_records` to `last_record` | | 0.1.0 | 2023-05-10 | [26998](https://github.com/airbytehq/airbyte/pull/26998) | Initial PR | diff --git a/docs/integrations/sources/gcs.md b/docs/integrations/sources/gcs.md index e0d24e0716a..7ae84e9e4e9 100644 --- a/docs/integrations/sources/gcs.md +++ b/docs/integrations/sources/gcs.md @@ -32,10 +32,10 @@ Use the service account ID from above, grant read access to your target bucket. - Paste the service account JSON key to the `Service Account Information` field - Enter your GCS bucket name to the `Bucket` field - Add a stream - 1. Give a **Name** to the stream - 2. In the **Format** box, use the dropdown menu to select the format of the files you'd like to replicate. The supported format is **CSV**. Toggling the **Optional fields** button within the **Format** box will allow you to enter additional configurations based on the selected format. For a detailed breakdown of these settings, refer to the [File Format section](#file-format-settings) below. - 3. Optionally, enter the **Globs** which dictates which files to be synced. This is a regular expression that allows Airbyte to pattern match the specific files to replicate. If you are replicating all the files within your bucket, use `**` as the pattern. For more precise pattern matching options, refer to the [Path Patterns section](#path-patterns) below. - 4. (Optional) - If you want to enforce a specific schema, you can enter a **Input schema**. 
By default, this value is set to `{}` and will automatically infer the schema from the file\(s\) you are replicating. For details on providing a custom schema, refer to the [User Schema section](#user-schema). + 1. Give a **Name** to the stream + 2. In the **Format** box, use the dropdown menu to select the format of the files you'd like to replicate. The supported format is **CSV**. Toggling the **Optional fields** button within the **Format** box will allow you to enter additional configurations based on the selected format. For a detailed breakdown of these settings, refer to the [File Format section](#file-format-settings) below. + 3. Optionally, enter the **Globs** which dictates which files to be synced. This is a regular expression that allows Airbyte to pattern match the specific files to replicate. If you are replicating all the files within your bucket, use `**` as the pattern. For more precise pattern matching options, refer to the [Path Patterns section](#path-patterns) below. + 4. (Optional) - If you want to enforce a specific schema, you can enter a **Input schema**. By default, this value is set to `{}` and will automatically infer the schema from the file\(s\) you are replicating. For details on providing a custom schema, refer to the [User Schema section](#user-schema). - Configure the optional **Start Date** parameter that marks a starting date and time in UTC for data replication. Any files that have _not_ been modified since this specified date/time will _not_ be replicated. Use the provided datepicker (recommended) or enter the desired date programmatically in the format `YYYY-MM-DDTHH:mm:ssZ`. Leaving this field blank will replicate data from all files that have not been excluded by the **Path Pattern** and **Path Prefix**. - Click **Set up source** and wait for the tests to complete. @@ -132,7 +132,7 @@ Product,Description,Price Jeans,"Navy Blue, Bootcut, 34\"",49.99 ``` -The backslash (`\`) is used directly before the second double quote (`"`) to indicate that it is _not_ the closing quote for the field, but rather a literal double quote character that should be included in the value (in this example, denoting the size of the jeans in inches: `34"` ). +The backslash (`\`) is used directly before the second double quote (`"`) to indicate that it is _not_ the closing quote for the field, but rather a literal double quote character that should be included in the value (in this example, denoting the size of the jeans in inches: `34"` ). Leaving this field blank (default option) will disallow escaping. @@ -146,16 +146,16 @@ Leaving this field blank (default option) will disallow escaping. ## Changelog -| Version | Date | Pull Request | Subject | -|:--------|:-----------|:---------------------------------------------------------|:-------------------------------------------------------------------------| -| 0.4.0 | 2024-03-21 | [36373](https://github.com/airbytehq/airbyte/pull/36373) | Add Gzip and Bzip compression support. Manage dependencies with Poetry. 
| -| 0.3.7 | 2024-02-06 | [34936](https://github.com/airbytehq/airbyte/pull/34936) | Bump CDK version to avoid missing SyncMode errors | -| 0.3.6 | 2024-01-30 | [34681](https://github.com/airbytehq/airbyte/pull/34681) | Unpin CDK version to make compatible with the Concurrent CDK | -| 0.3.5 | 2024-01-30 | [34661](https://github.com/airbytehq/airbyte/pull/34661) | Pin CDK version until upgrade for compatibility with the Concurrent CDK | -| 0.3.4 | 2024-01-11 | [34158](https://github.com/airbytehq/airbyte/pull/34158) | Fix issue in stream reader for document file type parser | -| 0.3.3 | 2023-12-06 | [33187](https://github.com/airbytehq/airbyte/pull/33187) | Bump CDK version to hide source-defined primary key | -| 0.3.2 | 2023-11-16 | [32608](https://github.com/airbytehq/airbyte/pull/32608) | Improve document file type parser | -| 0.3.1 | 2023-11-13 | [32357](https://github.com/airbytehq/airbyte/pull/32357) | Improve spec schema | -| 0.3.0 | 2023-10-11 | [31212](https://github.com/airbytehq/airbyte/pull/31212) | Migrated to file based CDK | -| 0.2.0 | 2023-06-26 | [27725](https://github.com/airbytehq/airbyte/pull/27725) | License Update: Elv2 | -| 0.1.0 | 2023-02-16 | [23186](https://github.com/airbytehq/airbyte/pull/23186) | New Source: GCS | +| Version | Date | Pull Request | Subject | +| :------ | :--------- | :------------------------------------------------------- | :---------------------------------------------------------------------- | +| 0.4.0 | 2024-03-21 | [36373](https://github.com/airbytehq/airbyte/pull/36373) | Add Gzip and Bzip compression support. Manage dependencies with Poetry. | +| 0.3.7 | 2024-02-06 | [34936](https://github.com/airbytehq/airbyte/pull/34936) | Bump CDK version to avoid missing SyncMode errors | +| 0.3.6 | 2024-01-30 | [34681](https://github.com/airbytehq/airbyte/pull/34681) | Unpin CDK version to make compatible with the Concurrent CDK | +| 0.3.5 | 2024-01-30 | [34661](https://github.com/airbytehq/airbyte/pull/34661) | Pin CDK version until upgrade for compatibility with the Concurrent CDK | +| 0.3.4 | 2024-01-11 | [34158](https://github.com/airbytehq/airbyte/pull/34158) | Fix issue in stream reader for document file type parser | +| 0.3.3 | 2023-12-06 | [33187](https://github.com/airbytehq/airbyte/pull/33187) | Bump CDK version to hide source-defined primary key | +| 0.3.2 | 2023-11-16 | [32608](https://github.com/airbytehq/airbyte/pull/32608) | Improve document file type parser | +| 0.3.1 | 2023-11-13 | [32357](https://github.com/airbytehq/airbyte/pull/32357) | Improve spec schema | +| 0.3.0 | 2023-10-11 | [31212](https://github.com/airbytehq/airbyte/pull/31212) | Migrated to file based CDK | +| 0.2.0 | 2023-06-26 | [27725](https://github.com/airbytehq/airbyte/pull/27725) | License Update: Elv2 | +| 0.1.0 | 2023-02-16 | [23186](https://github.com/airbytehq/airbyte/pull/23186) | New Source: GCS | diff --git a/docs/integrations/sources/genesys.md b/docs/integrations/sources/genesys.md index 63847913d43..d64c5fc914d 100644 --- a/docs/integrations/sources/genesys.md +++ b/docs/integrations/sources/genesys.md @@ -1,12 +1,14 @@ # Genesys ## Overview + The Genesys source retrieves data from [Genesys](https://www.genesys.com/) using their [JSON REST APIs](https://developer.genesys.cloud/devapps/api-explorer). ## Setup Guide ### Requirements -We are using `OAuth2` as this is the only supported authentication method. So you will need to follow the steps below to generate the `Client ID` and `Client Secret`. 
+ +We are using `OAuth2` as this is the only supported authentication method. So you will need to follow the steps below to generate the `Client ID` and `Client Secret`. - Genesys region - Client ID @@ -15,6 +17,7 @@ We are using `OAuth2` as this is the only supported authentication method. So yo You can follow the documentation on [API credentials](https://developer.genesys.cloud/authorization/platform-auth/use-client-credentials#obtain-an-access-token) or you can login directly to the [OAuth admin page](https://apps.mypurecloud.com/directory/#/admin/integrations/oauth) ## Supported Streams + - [Locations](https://developer.genesys.cloud/telephony/locations-apis) - [Routing](https://developer.genesys.cloud/routing/routing/) - [Stations](https://developer.genesys.cloud/telephony/stations-apis) @@ -23,7 +26,7 @@ You can follow the documentation on [API credentials](https://developer.genesys. ## Changelog -| Version | Date | Pull Request | Subject | -| :------ | :--------- | :------------------------------------------------------- | :-------------------------- | +| Version | Date | Pull Request | Subject | +| :------ | :--------- | :------------------------------------------------------- | :----------------------------- | | 0.1.1 | 2023-04-27 | [25598](https://github.com/airbytehq/airbyte/pull/25598) | Use region specific API server | -| 0.1.0 | 2022-10-06 | [17559](https://github.com/airbytehq/airbyte/pull/17559) | The Genesys Source is created | +| 0.1.0 | 2022-10-06 | [17559](https://github.com/airbytehq/airbyte/pull/17559) | The Genesys Source is created | diff --git a/docs/integrations/sources/getlago.md b/docs/integrations/sources/getlago.md index 33c296afd36..b91109f71cb 100644 --- a/docs/integrations/sources/getlago.md +++ b/docs/integrations/sources/getlago.md @@ -6,32 +6,32 @@ This source can sync data from the [Lago API](https://doc.getlago.com/docs/guide ## This Source Supports the Following Streams - * billable_metrics - * plans - * coupons - * add_ons - * invoices - * customers - * subscriptions +- billable_metrics +- plans +- coupons +- add_ons +- invoices +- customers +- subscriptions ### Features -| Feature | Supported?\(Yes/No\) | Notes | -| :--- | :--- | :--- | -| Full Refresh Sync | Yes | | -| Incremental Sync | No | | - +| Feature | Supported?\(Yes/No\) | Notes | +| :---------------- | :------------------- | :---- | +| Full Refresh Sync | Yes | | +| Incremental Sync | No | | ## Getting started ### Requirements -* Lago API URL -* Lago API KEY + +- Lago API URL +- Lago API KEY ## Changelog -| Version | Date | Pull Request | Subject | -| :------ | :--------- | :-------------------------------------------------------- | :----------------------------------------- | -| 0.3.0 | 2023-10-05 | [#31099](https://github.com/airbytehq/airbyte/pull/31099) | Added customer_usage and wallet stream | +| Version | Date | Pull Request | Subject | +| :------ | :--------- | :-------------------------------------------------------- | :---------------------------------------- | +| 0.3.0 | 2023-10-05 | [#31099](https://github.com/airbytehq/airbyte/pull/31099) | Added customer_usage and wallet stream | | 0.2.0 | 2023-09-19 | [#30572](https://github.com/airbytehq/airbyte/pull/30572) | Source GetLago: Support API URL | -| 0.1.0 | 2022-10-26 | [#18727](https://github.com/airbytehq/airbyte/pull/18727) | 🎉 New Source: getLago API [low-code CDK] | \ No newline at end of file +| 0.1.0 | 2022-10-26 | [#18727](https://github.com/airbytehq/airbyte/pull/18727) | 🎉 New Source: getLago API [low-code CDK] | 
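The getLago requirements above come down to two values: the Lago API URL and the Lago API key. A quick way to sanity-check them before configuring the source is to request a single page of one of the streams listed in that section. The sketch below is illustrative only and is not part of this patch or the connector: the `https://api.getlago.com/api/v1` base URL, the Bearer-token `Authorization` header, and the `per_page` parameter are assumptions about the Lago API rather than details taken from this document, so adjust them to your deployment (especially for self-hosted instances).

```python
# Illustrative credential check for the getLago source described above.
# Assumptions (not taken from this patch): Lago exposes /billable_metrics under
# an /api/v1 base URL and authenticates requests with a Bearer token.
import requests

LAGO_API_URL = "https://api.getlago.com/api/v1"  # assumed default; self-hosted URLs differ
LAGO_API_KEY = "<your Lago API key>"


def check_lago_credentials() -> bool:
    """Fetch one page of billable_metrics (a stream listed above) to verify the key."""
    response = requests.get(
        f"{LAGO_API_URL}/billable_metrics",
        headers={"Authorization": f"Bearer {LAGO_API_KEY}"},
        params={"per_page": 1},
        timeout=30,
    )
    return response.status_code == 200


if __name__ == "__main__":
    print("Lago credentials valid:", check_lago_credentials())
```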
diff --git a/docs/integrations/sources/github.md b/docs/integrations/sources/github.md index 5ca3e36afb2..a8d91ebd1df 100644 --- a/docs/integrations/sources/github.md +++ b/docs/integrations/sources/github.md @@ -11,6 +11,7 @@ This page contains the setup guide and reference information for the [GitHub](ht - List of GitHub Repositories (and access for them in case they are private) + **For Airbyte Cloud:** - OAuth @@ -18,6 +19,7 @@ This page contains the setup guide and reference information for the [GitHub](ht + **For Airbyte Open Source:** - Personal Access Token (see [Permissions and scopes](https://docs.airbyte.com/integrations/sources/github#permissions-and-scopes)) @@ -30,14 +32,17 @@ This page contains the setup guide and reference information for the [GitHub](ht Create a [GitHub Account](https://github.com). + **Airbyte Open Source additional setup steps** Log into [GitHub](https://github.com) and then generate a [personal access token](https://github.com/settings/tokens). To load balance your API quota consumption across multiple API tokens, input multiple tokens separated with `,`. + ### Step 2: Set up the GitHub connector in Airbyte + **For Airbyte Cloud:** 1. [Log into your Airbyte Cloud](https://cloud.airbyte.com/workspaces) account. @@ -47,11 +52,11 @@ Log into [GitHub](https://github.com) and then generate a [personal access token 5. To authenticate: - - **For Airbyte Cloud:** **Authenticate your GitHub account** to authorize your GitHub account. Airbyte will authenticate the GitHub account you are already logged in to. Please make sure you are logged into the right account. - - +- **For Airbyte Cloud:** **Authenticate your GitHub account** to authorize your GitHub account. Airbyte will authenticate the GitHub account you are already logged in to. Please make sure you are logged into the right account. + + - - **For Airbyte Open Source:** Authenticate with **Personal Access Token**. To generate a personal access token, log into [GitHub](https://github.com) and then generate a [personal access token](https://github.com/settings/tokens). Enter your GitHub personal access token. To load balance your API quota consumption across multiple API tokens, input multiple tokens separated with `,`. +- **For Airbyte Open Source:** Authenticate with **Personal Access Token**. To generate a personal access token, log into [GitHub](https://github.com) and then generate a [personal access token](https://github.com/settings/tokens). Enter your GitHub personal access token. To load balance your API quota consumption across multiple API tokens, input multiple tokens separated with `,`. 6. **GitHub Repositories** - Enter a list of GitHub organizations/repositories, e.g. `airbytehq/airbyte` for single repository, `airbytehq/airbyte airbytehq/another-repo` for multiple repositories. If you want to specify the organization to receive data from all its repositories, then you should specify it according to the following example: `airbytehq/*`. 
@@ -64,7 +69,7 @@ Repositories with the wrong name or repositories that do not exist or have the w - These streams will only sync records generated on or after the **Start Date**: `comments`, `commit_comment_reactions`, `commit_comments`, `commits`, `deployments`, `events`, `issue_comment_reactions`, `issue_events`, `issue_milestones`, `issue_reactions`, `issues`, `project_cards`, `project_columns`, `projects`, `pull_request_comment_reactions`, `pull_requests`, `pull_requeststats`, `releases`, `review_comments`, `reviews`, `stargazers`, `workflow_runs`, `workflows`. -- The **Start Date** does not apply to the streams below and all data will be synced for these streams: `assignees`, `branches`, `collaborators`, `issue_labels`, `organizations`, `pull_request_commits`, `pull_request_stats`, `repositories`, `tags`, `teams`, `users` +- The **Start Date** does not apply to the streams below and all data will be synced for these streams: `assignees`, `branches`, `collaborators`, `issue_labels`, `organizations`, `pull_request_commits`, `pull_request_stats`, `repositories`, `tags`, `teams`, `users` 8. **Branch (Optional)** - List of GitHub repository branches to pull commits from, e.g. `airbytehq/airbyte/master`. If no branches are specified for a repository, the default branch will be pulled. (e.g. `airbytehq/airbyte/master airbytehq/airbyte/my-branch`). @@ -172,17 +177,18 @@ Expand to see details about GitHub connector limitations and troubleshooting. #### Rate limiting You can use a personal access token to make API requests. Additionally, you can authorize a GitHub App or OAuth app, which can then make API requests on your behalf. -All of these requests count towards your personal rate limit of 5,000 requests per hour (15,000 requests per hour if the app is owned by a GitHub Enterprise Cloud organization ). +All of these requests count towards your personal rate limit of 5,000 requests per hour (15,000 requests per hour if the app is owned by a GitHub Enterprise Cloud organization ). :::info `REST API` and `GraphQL API` rate limits are counted separately ::: :::tip In the event that limits are reached before all streams have been read, it is recommended to take the following actions: + 1. Utilize Incremental sync mode. 2. Set a higher sync interval. 3. Divide the sync into separate connections with a smaller number of streams. -::: + ::: Refer to GitHub article [Rate limits for the REST API](https://docs.github.com/en/rest/overview/rate-limits-for-the-rest-api). @@ -198,17 +204,17 @@ Your token should have at least the `repo` scope. 
Depending on which streams you ### Troubleshooting -* Check out common troubleshooting issues for the GitHub source connector on our [Airbyte Forum](https://github.com/airbytehq/airbyte/discussions) +- Check out common troubleshooting issues for the GitHub source connector on our [Airbyte Forum](https://github.com/airbytehq/airbyte/discussions) ## Changelog | Version | Date | Pull Request | Subject | -|:--------|:-----------|:------------------------------------------------------------------------------------------------------------------|:--------------------------------------------------------------------------------------------------------------------------------------------------------------------| -| 1.7.2 | 2024-04-19 | [36636](https://github.com/airbytehq/airbyte/pull/36636) | Updating to 0.80.0 CDK | -| 1.7.1 | 2024-04-12 | [36636](https://github.com/airbytehq/airbyte/pull/36636) | schema descriptions | -| 1.7.0 | 2024-03-19 | [36267](https://github.com/airbytehq/airbyte/pull/36267) | Pin airbyte-cdk version to `^0` | +| :------ | :--------- | :---------------------------------------------------------------------------------------------------------------- | :------------------------------------------------------------------------------------------------------------------------------------------------------------------ | +| 1.7.2 | 2024-04-19 | [36636](https://github.com/airbytehq/airbyte/pull/36636) | Updating to 0.80.0 CDK | +| 1.7.1 | 2024-04-12 | [36636](https://github.com/airbytehq/airbyte/pull/36636) | schema descriptions | +| 1.7.0 | 2024-03-19 | [36267](https://github.com/airbytehq/airbyte/pull/36267) | Pin airbyte-cdk version to `^0` | | 1.6.5 | 2024-03-12 | [35986](https://github.com/airbytehq/airbyte/pull/35986) | Handle rate limit exception as config error | | 1.6.4 | 2024-03-08 | [35915](https://github.com/airbytehq/airbyte/pull/35915) | Fix per stream error handler; Make use the latest CDK version | | 1.6.3 | 2024-02-15 | [35271](https://github.com/airbytehq/airbyte/pull/35271) | Update branches schema | diff --git a/docs/integrations/sources/gitlab-migrations.md b/docs/integrations/sources/gitlab-migrations.md index a96dd9b0fc4..194bfb12df8 100644 --- a/docs/integrations/sources/gitlab-migrations.md +++ b/docs/integrations/sources/gitlab-migrations.md @@ -1,6 +1,5 @@ # Gitlab Migration Guide - ## Upgrading to 4.0.0 We're continuously striving to enhance the quality and reliability of our connectors at Airbyte. @@ -18,19 +17,18 @@ Users will need to reset the affected streams after upgrading. Airbyte Open Source users must manually update the connector image in their local registry before proceeding with the migration. To do so: 1. Select **Settings** in the main navbar. - 1. Select **Sources**. -2. Find Gitlab in the list of connectors. + 1. Select **Sources**. +2. Find Gitlab in the list of connectors. :::note You will see two versions listed, the current in-use version and the latest version available. -::: +::: 3. Select **Change** to update your OSS version to the latest available version. - ### Update the connector version -1. Select **Sources** in the main navbar. +1. Select **Sources** in the main navbar. 2. Select the instance of the connector you wish to upgrade. :::note @@ -38,31 +36,27 @@ Each instance of the connector must be updated separately. If you have created m ::: 3. Select **Upgrade** - 1. Follow the prompt to confirm you are ready to upgrade to the new version. + 1. Follow the prompt to confirm you are ready to upgrade to the new version. 
### Refresh affected schemas and reset data 1. Select **Connections** in the main nav bar. - 1. Select the connection(s) affected by the update. -2. Select the **Replication** tab. - 1. Select **Refresh source schema**. - 2. Select **OK**. -:::note -Any detected schema changes will be listed for your review. -::: -3. Select **Save changes** at the bottom of the page. - 1. Ensure the **Reset affected streams** option is checked. -:::note -Depending on destination type you may not be prompted to reset your data. -::: -4. Select **Save connection**. -:::note -This will reset the data in your destination and initiate a fresh sync. -::: + 1. Select the connection(s) affected by the update. +2. Select the **Replication** tab. 1. Select **Refresh source schema**. 2. Select **OK**. + :::note + Any detected schema changes will be listed for your review. + ::: +3. Select **Save changes** at the bottom of the page. 1. Ensure the **Reset affected streams** option is checked. + :::note + Depending on destination type you may not be prompted to reset your data. + ::: +4. Select **Save connection**. + :::note + This will reset the data in your destination and initiate a fresh sync. + ::: For more information on resetting your data in Airbyte, see [this page](https://docs.airbyte.com/operator-guides/reset). - ## Upgrading to 3.0.0 In this release, `merge_request_commits` stream schema has been fixed so that it returns commits for each merge_request. @@ -75,18 +69,18 @@ Users will need to refresh the source schema and reset `merge_request_commits` s Airbyte Open Source users must manually update the connector image in their local registry before proceeding with the migration. To do so: 1. Select **Settings** in the main navbar. - 1. Select **Sources**. -2. Find Gitlab in the list of connectors. + 1. Select **Sources**. +2. Find Gitlab in the list of connectors. :::note You will see two versions listed, the current in-use version and the latest version available. -::: +::: 3. Select **Change** to update your OSS version to the latest available version. ### Update the connector version -1. Select **Sources** in the main navbar. +1. Select **Sources** in the main navbar. 2. Select the instance of the connector you wish to upgrade. :::note @@ -94,31 +88,27 @@ Each instance of the connector must be updated separately. If you have created m ::: 3. Select **Upgrade** - 1. Follow the prompt to confirm you are ready to upgrade to the new version. + 1. Follow the prompt to confirm you are ready to upgrade to the new version. ### Refresh affected schemas and reset data 1. Select **Connections** in the main nav bar. - 1. Select the connection(s) affected by the update. -2. Select the **Replication** tab. - 1. Select **Refresh source schema**. - 2. Select **OK**. -:::note -Any detected schema changes will be listed for your review. -::: -3. Select **Save changes** at the bottom of the page. - 1. Ensure the **Reset affected streams** option is checked. -:::note -Depending on destination type you may not be prompted to reset your data. -::: -4. Select **Save connection**. -:::note -This will reset the data in your destination and initiate a fresh sync. -::: + 1. Select the connection(s) affected by the update. +2. Select the **Replication** tab. 1. Select **Refresh source schema**. 2. Select **OK**. + :::note + Any detected schema changes will be listed for your review. + ::: +3. Select **Save changes** at the bottom of the page. 1. Ensure the **Reset affected streams** option is checked. 
+ :::note + Depending on destination type you may not be prompted to reset your data. + ::: +4. Select **Save connection**. + :::note + This will reset the data in your destination and initiate a fresh sync. + ::: For more information on resetting your data in Airbyte, see [this page](https://docs.airbyte.com/operator-guides/reset). - ## Upgrading to 2.0.0 In the 2.0.0 config change, several streams were updated to date-time field format, as declared in the Gitlab API. diff --git a/docs/integrations/sources/gitlab.md b/docs/integrations/sources/gitlab.md index 845eeca7d3e..772a04b4112 100644 --- a/docs/integrations/sources/gitlab.md +++ b/docs/integrations/sources/gitlab.md @@ -108,7 +108,7 @@ Gitlab has the [rate limits](https://docs.gitlab.com/ee/user/gitlab_com/index.ht ## Changelog | Version | Date | Pull Request | Subject | -|:--------|:-----------|:---------------------------------------------------------|:-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------| +| :------ | :--------- | :------------------------------------------------------- | :--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------- | | 4.0.2 | 2024-04-24 | [36637](https://github.com/airbytehq/airbyte/pull/36637) | Schema descriptions and CDK 0.80.0 | | 4.0.1 | 2024-04-23 | [37505](https://github.com/airbytehq/airbyte/pull/37505) | Set error code `500` as retryable | | 4.0.0 | 2024-03-25 | [35989](https://github.com/airbytehq/airbyte/pull/35989) | Migrate to low-code | diff --git a/docs/integrations/sources/glassfrog.md b/docs/integrations/sources/glassfrog.md index ac5d834b551..d866c0f7c6a 100644 --- a/docs/integrations/sources/glassfrog.md +++ b/docs/integrations/sources/glassfrog.md @@ -10,32 +10,31 @@ This Source Connector is based on the [Airbyte CDK](https://docs.airbyte.com/con This Source is capable of syncing the following Streams: -* [Assignments](https://documenter.getpostman.com/view/1014385/glassfrog-api-v3/2SJViY#db2934bd-8c07-1951-b273-51fbc2dc6422) -* [Checklist items](https://documenter.getpostman.com/view/1014385/glassfrog-api-v3/2SJViY#a81716d4-b492-79ff-1348-9048fd9dc527) -* [Circles](https://documenter.getpostman.com/view/1014385/glassfrog-api-v3/2SJViY#ed696857-c3d8-fba1-a174-fbe63de07798) -* [Custom fields](https://documenter.getpostman.com/view/1014385/glassfrog-api-v3/2SJViY#901f8ec2-a986-0291-2fa2-281c16622107) -* [Metrics](https://documenter.getpostman.com/view/1014385/glassfrog-api-v3/2SJViY#00d4f5fb-d6e5-5521-a77d-bdce50a9fb84) -* [People](https://documenter.getpostman.com/view/1014385/glassfrog-api-v3/2SJViY#78b74b9f-72b7-63fc-a18c-18518932944b) -* [Projects](https://documenter.getpostman.com/view/1014385/glassfrog-api-v3/2SJViY#110bde88-a319-ae9c-077a-9752fd2f0843) -* [Roles](https://documenter.getpostman.com/view/1014385/glassfrog-api-v3/2SJViY#d1f31f7a-1d42-8c86-be1d-a36e640bf993) - +- [Assignments](https://documenter.getpostman.com/view/1014385/glassfrog-api-v3/2SJViY#db2934bd-8c07-1951-b273-51fbc2dc6422) +- [Checklist items](https://documenter.getpostman.com/view/1014385/glassfrog-api-v3/2SJViY#a81716d4-b492-79ff-1348-9048fd9dc527) +- [Circles](https://documenter.getpostman.com/view/1014385/glassfrog-api-v3/2SJViY#ed696857-c3d8-fba1-a174-fbe63de07798) +- [Custom 
fields](https://documenter.getpostman.com/view/1014385/glassfrog-api-v3/2SJViY#901f8ec2-a986-0291-2fa2-281c16622107) +- [Metrics](https://documenter.getpostman.com/view/1014385/glassfrog-api-v3/2SJViY#00d4f5fb-d6e5-5521-a77d-bdce50a9fb84) +- [People](https://documenter.getpostman.com/view/1014385/glassfrog-api-v3/2SJViY#78b74b9f-72b7-63fc-a18c-18518932944b) +- [Projects](https://documenter.getpostman.com/view/1014385/glassfrog-api-v3/2SJViY#110bde88-a319-ae9c-077a-9752fd2f0843) +- [Roles](https://documenter.getpostman.com/view/1014385/glassfrog-api-v3/2SJViY#d1f31f7a-1d42-8c86-be1d-a36e640bf993) ### Data type mapping | Integration Type | Airbyte Type | Notes | -| :--- | :--- | :--- | -| `string` | `string` | | -| `number` | `number` | | -| `array` | `array` | | -| `object` | `object` | | +| :--------------- | :----------- | :---- | +| `string` | `string` | | +| `number` | `number` | | +| `array` | `array` | | +| `object` | `object` | | ### Features -| Feature | Supported?\(Yes/No\) | Notes | -| :--- | :--- | :--- | -| Full Refresh Sync | Yes | | -| Incremental - Append Sync | No | | -| Namespaces | No | | +| Feature | Supported?\(Yes/No\) | Notes | +| :------------------------ | :------------------- | :---- | +| Full Refresh Sync | Yes | | +| Incremental - Append Sync | No | | +| Namespaces | No | | ## Getting started @@ -46,13 +45,12 @@ This Source is capable of syncing the following Streams: ## Changelog -| Version | Date | Pull Request | Subject | -| :--- | :--- | :--- | :--- | -| 0.2.4 | 2024-04-19 | [37167](https://github.com/airbytehq/airbyte/pull/37167) | Updating to 0.80.0 CDK | -| 0.2.3 | 2024-04-18 | [37167](https://github.com/airbytehq/airbyte/pull/37167) | Manage dependencies with Poetry. | -| 0.2.2 | 2024-04-15 | [37167](https://github.com/airbytehq/airbyte/pull/37167) | Base image migration: remove Dockerfile and use the python-connector-base image | -| 0.2.1 | 2024-04-12 | [37167](https://github.com/airbytehq/airbyte/pull/37167) | schema descriptions | -| 0.2.0 | 2023-08-10 | [29306](https://github.com/airbytehq/airbyte/pull/29306) | Migrated to LowCode CDK | -| 0.1.1 | 2023-08-15 | [13868](https://github.com/airbytehq/airbyte/pull/13868) | Fix schema and tests | -| 0.1.0 | 2022-06-16 | [13868](https://github.com/airbytehq/airbyte/pull/13868) | Add Native Glassfrog Source Connector | - +| Version | Date | Pull Request | Subject | +| :------ | :--------- | :------------------------------------------------------- | :------------------------------------------------------------------------------ | +| 0.2.4 | 2024-04-19 | [37167](https://github.com/airbytehq/airbyte/pull/37167) | Updating to 0.80.0 CDK | +| 0.2.3 | 2024-04-18 | [37167](https://github.com/airbytehq/airbyte/pull/37167) | Manage dependencies with Poetry. 
| +| 0.2.2 | 2024-04-15 | [37167](https://github.com/airbytehq/airbyte/pull/37167) | Base image migration: remove Dockerfile and use the python-connector-base image | +| 0.2.1 | 2024-04-12 | [37167](https://github.com/airbytehq/airbyte/pull/37167) | schema descriptions | +| 0.2.0 | 2023-08-10 | [29306](https://github.com/airbytehq/airbyte/pull/29306) | Migrated to LowCode CDK | +| 0.1.1 | 2023-08-15 | [13868](https://github.com/airbytehq/airbyte/pull/13868) | Fix schema and tests | +| 0.1.0 | 2022-06-16 | [13868](https://github.com/airbytehq/airbyte/pull/13868) | Add Native Glassfrog Source Connector | diff --git a/docs/integrations/sources/gnews.md b/docs/integrations/sources/gnews.md index ebbb9e45812..be8a23ae195 100644 --- a/docs/integrations/sources/gnews.md +++ b/docs/integrations/sources/gnews.md @@ -8,13 +8,13 @@ The GNews source supports full refresh syncs Two output streams are available from this source: -*[Search](https://gnews.io/docs/v4?shell#search-endpoint). -*[Top Headlines](https://gnews.io/docs/v4?shell#top-headlines-endpoint). +_[Search](https://gnews.io/docs/v4?shell#search-endpoint). +_[Top Headlines](https://gnews.io/docs/v4?shell#top-headlines-endpoint). ### Features | Feature | Supported? | -|:------------------|:-----------| +| :---------------- | :--------- | | Full Refresh Sync | Yes | | Incremental Sync | Yes | @@ -26,7 +26,7 @@ Rate Limiting is based on the API Key tier subscription, get more info [here](ht ### Requirements -* GNews API Key. +- GNews API Key. ### Connect using `API Key`: @@ -36,7 +36,7 @@ Rate Limiting is based on the API Key tier subscription, get more info [here](ht ## Changelog | Version | Date | Pull Request | Subject | -|:--------|:-----------|:---------------------------------------------------------|:-------------------------------------------------| +| :------ | :--------- | :------------------------------------------------------- | :----------------------------------------------- | | 0.1.3 | 2022-12-16 | [21322](https://github.com/airbytehq/airbyte/pull/21322) | Reorganize manifest inline stream schemas | | 0.1.2 | 2022-12-16 | [20405](https://github.com/airbytehq/airbyte/pull/20405) | Update the manifest to use inline stream schemas | | 0.1.1 | 2022-12-13 | [20460](https://github.com/airbytehq/airbyte/pull/20460) | Update source acceptance test config | diff --git a/docs/integrations/sources/gocardless.md b/docs/integrations/sources/gocardless.md index 8e58c914294..b9d705783ed 100644 --- a/docs/integrations/sources/gocardless.md +++ b/docs/integrations/sources/gocardless.md @@ -7,29 +7,29 @@ The GoCardless source can sync data from the [GoCardless API](https://gocardless #### Output schema This source is capable of syncing the following streams: -* Mandates -* Payments -* Payouts -* Refunds +- Mandates +- Payments +- Payouts +- Refunds #### Features -| Feature | Supported? | -| :--- | :--- | -| Full Refresh Sync | Yes | -| Incremental - Append Sync | No | -| Namespaces | No | +| Feature | Supported? 
| +| :------------------------ | :--------- | +| Full Refresh Sync | Yes | +| Incremental - Append Sync | No | +| Namespaces | No | ### Requirements / Setup Guide -* Access Token -* GoCardless Environment -* GoCardless Version -* Start Date +- Access Token +- GoCardless Environment +- GoCardless Version +- Start Date ## Changelog -| Version | Date | Pull Request | Subject | -| :--- | :--- | :--- | :--- | -| 0.1.0 | 2022-10-19 | [17792](https://github.com/airbytehq/airbyte/pull/17792) | Initial release supporting the GoCardless | \ No newline at end of file +| Version | Date | Pull Request | Subject | +| :------ | :--------- | :------------------------------------------------------- | :---------------------------------------- | +| 0.1.0 | 2022-10-19 | [17792](https://github.com/airbytehq/airbyte/pull/17792) | Initial release supporting the GoCardless | diff --git a/docs/integrations/sources/gong.md b/docs/integrations/sources/gong.md index 142aa404383..b7e3bf6fd22 100644 --- a/docs/integrations/sources/gong.md +++ b/docs/integrations/sources/gong.md @@ -34,11 +34,11 @@ By default Gong limits your company's access to the service to 3 API calls per s ## Changelog -| Version | Date | Pull Request | Subject | -| :------ | :--------- | :------------------------------------------------------- | :------------------------ | -| 0.1.5 | 2024-04-19 | [37169](https://github.com/airbytehq/airbyte/pull/37169) | Updating to 0.80.0 CDK | -| 0.1.4 | 2024-04-18 | [37169](https://github.com/airbytehq/airbyte/pull/37169) | Manage dependencies with Poetry. | -| 0.1.3 | 2024-04-15 | [37169](https://github.com/airbytehq/airbyte/pull/37169) | Base image migration: remove Dockerfile and use the python-connector-base image | -| 0.1.2 | 2024-04-12 | [37169](https://github.com/airbytehq/airbyte/pull/37169) | schema descriptions | -| 0.1.1 | 2024-02-05 | [34847](https://github.com/airbytehq/airbyte/pull/34847) | Adjust stream schemas and make ready for airbyte-lib | -| 0.1.0 | 2022-10-27 | [18819](https://github.com/airbytehq/airbyte/pull/18819) | Add Gong Source Connector | +| Version | Date | Pull Request | Subject | +| :------ | :--------- | :------------------------------------------------------- | :------------------------------------------------------------------------------ | +| 0.1.5 | 2024-04-19 | [37169](https://github.com/airbytehq/airbyte/pull/37169) | Updating to 0.80.0 CDK | +| 0.1.4 | 2024-04-18 | [37169](https://github.com/airbytehq/airbyte/pull/37169) | Manage dependencies with Poetry. 
| +| 0.1.3 | 2024-04-15 | [37169](https://github.com/airbytehq/airbyte/pull/37169) | Base image migration: remove Dockerfile and use the python-connector-base image | +| 0.1.2 | 2024-04-12 | [37169](https://github.com/airbytehq/airbyte/pull/37169) | schema descriptions | +| 0.1.1 | 2024-02-05 | [34847](https://github.com/airbytehq/airbyte/pull/34847) | Adjust stream schemas and make ready for airbyte-lib | +| 0.1.0 | 2022-10-27 | [18819](https://github.com/airbytehq/airbyte/pull/18819) | Add Gong Source Connector | diff --git a/docs/integrations/sources/google-ads-migrations.md b/docs/integrations/sources/google-ads-migrations.md index 22dcc734b26..0c1ba27d4a6 100644 --- a/docs/integrations/sources/google-ads-migrations.md +++ b/docs/integrations/sources/google-ads-migrations.md @@ -5,7 +5,7 @@ This release upgrades the Google Ads API from Version 13 to Version 15 which causes the following changes in the schemas: | Stream | Current field name | New field name | -|----------------------------|----------------------------------------------------------------------------|--------------------------------------------------------------------------| +| -------------------------- | -------------------------------------------------------------------------- | ------------------------------------------------------------------------ | | ad_listing_group_criterion | ad_group_criterion.listing_group.case_value.product_bidding_category.id | ad_group_criterion.listing_group.case_value.product_category.category_id | | ad_listing_group_criterion | ad_group_criterion.listing_group.case_value.product_bidding_category.level | ad_group_criterion.listing_group.case_value.product_category.level | | shopping_performance_view | segments.product_bidding_category_level1 | segments.product_category_level1 | @@ -16,37 +16,43 @@ This release upgrades the Google Ads API from Version 13 to Version 15 which cau | campaign | campaign.shopping_setting.sales_country | This field has been deleted | Users should: + - Refresh the source schema - Reset affected streams after upgrading to ensure uninterrupted syncs. ### Refresh affected schemas and reset data 1. Select **Connections** in the main navbar. - 1. Select the connection(s) affected by the update. + 1. Select the connection(s) affected by the update. 2. Select the **Replication** tab. - 1. Select **Refresh source schema**. - 2. Select **OK**. + 1. Select **Refresh source schema**. + 2. Select **OK**. + ```note Any detected schema changes will be listed for your review. ``` + 3. Select **Save changes** at the bottom of the page. - 1. Ensure the **Reset affected streams** option is checked. + 1. Ensure the **Reset affected streams** option is checked. + ```note Depending on destination type you may not be prompted to reset your data. ``` -4. Select **Save connection**. + +4. Select **Save connection**. + ```note This will reset the data in your destination and initiate a fresh sync. ``` For more information on resetting your data in Airbyte, see [this page](https://docs.airbyte.com/operator-guides/reset). - ## Upgrading to 2.0.0 This release updates the Source Google Ads connector so that its default streams and stream names match the related resources in [Google Ads API](https://developers.google.com/google-ads/api/fields/v14/ad_group_ad). Users should: + - Refresh the source schema - And reset affected streams after upgrading to ensure uninterrupted syncs. 
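Where custom GAQL queries or downstream transformations still reference fields renamed in the v13 → v15 upgrade described above, the renames can be applied mechanically. The helper below is a hypothetical sketch, not part of the connector or this patch; it only restates the renames visible in the migration table above (the deleted `campaign.shopping_setting.sales_country` field has no replacement and is omitted), so extend the mapping from the full migration guide as needed.

```python
# Hypothetical helper (not part of this patch): restates the Google Ads API v13 -> v15
# field renames shown in the migration table above so that custom GAQL queries or
# downstream column references can be rewritten mechanically.
FIELD_RENAMES = {
    "ad_group_criterion.listing_group.case_value.product_bidding_category.id":
        "ad_group_criterion.listing_group.case_value.product_category.category_id",
    "ad_group_criterion.listing_group.case_value.product_bidding_category.level":
        "ad_group_criterion.listing_group.case_value.product_category.level",
    "segments.product_bidding_category_level1":
        "segments.product_category_level1",
}


def rewrite_gaql(query: str) -> str:
    """Replace any renamed field that appears in a custom GAQL query string."""
    for old_field, new_field in FIELD_RENAMES.items():
        query = query.replace(old_field, new_field)
    return query


if __name__ == "__main__":
    sample = "SELECT segments.product_bidding_category_level1 FROM shopping_performance_view"
    print(rewrite_gaql(sample))
```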
diff --git a/docs/integrations/sources/google-ads.md b/docs/integrations/sources/google-ads.md index 79c93f8b1fc..6b8f916ead7 100644 --- a/docs/integrations/sources/google-ads.md +++ b/docs/integrations/sources/google-ads.md @@ -48,6 +48,7 @@ A single access token can grant varying degrees of access to multiple APIs. A va The scope for the Google Ads API is: https://www.googleapis.com/auth/adwords Each Google Ads API developer token is assigned an access level and "permissible use". The access level determines whether you can affect production accounts and the number of operations and requests that you can execute daily. Permissible use determines the specific Google Ads API features that the developer token is allowed to use. Read more about it and apply for higher access [here](https://developers.google.com/google-ads/api/docs/access-levels#access_levels_2). + ### Step 3: Set up the Google Ads connector in Airbyte @@ -62,7 +63,7 @@ To set up Google Ads as a source in Airbyte Cloud: 3. Find and select **Google Ads** from the list of available sources. 4. Enter a **Source name** of your choosing. 5. Click **Sign in with Google** to authenticate your Google Ads account. In the pop-up, select the appropriate Google account and click **Continue** to proceed. -6. (Optional) Enter a comma-separated list of the **Customer ID(s)** for your account. These IDs are 10-digit numbers that uniquely identify your account. To find your Customer ID, please follow [Google's instructions](https://support.google.com/google-ads/answer/1704344). Leaving this field blank will replicate data from all connected accounts. +6. (Optional) Enter a comma-separated list of the **Customer ID(s)** for your account. These IDs are 10-digit numbers that uniquely identify your account. To find your Customer ID, please follow [Google's instructions](https://support.google.com/google-ads/answer/1704344). Leaving this field blank will replicate data from all connected accounts. 7. (Optional) Enter customer statuses to filter customers. Leaving this field blank will replicate data from all accounts. Check [Google Ads documentation](https://developers.google.com/google-ads/api/reference/rpc/v15/CustomerStatusEnum.CustomerStatus) for more info. 8. (Optional) Enter a **Start Date** using the provided datepicker, or by programmatically entering the date in YYYY-MM-DD format. The data added on and after this date will be replicated. (Default start date is 2 years ago) 9. (Optional) You can use the **Custom GAQL Queries** field to enter a custom query using Google Ads Query Language. Click **Add** and enter your query, as well as the desired name of the table for this data in the destination. Multiple queries can be provided. For more information on formulating these queries, refer to our [guide below](#custom-query-understanding-google-ads-query-language). @@ -84,7 +85,7 @@ To set up Google Ads as a source in Airbyte Open Source: 4. Enter a **Source name** of your choosing. 5. Enter the **Developer Token** you obtained from Google. 6. To authenticate your Google account, enter your Google application's **Client ID**, **Client Secret**, **Refresh Token**, and optionally, the **Access Token**. -7. (Optional) Enter a comma-separated list of the **Customer ID(s)** for your account. These IDs are 10-digit numbers that uniquely identify your account. To find your Customer ID, please follow [Google's instructions](https://support.google.com/google-ads/answer/1704344). Leaving this field blank will replicate data from all connected accounts. 
+7. (Optional) Enter a comma-separated list of the **Customer ID(s)** for your account. These IDs are 10-digit numbers that uniquely identify your account. To find your Customer ID, please follow [Google's instructions](https://support.google.com/google-ads/answer/1704344). Leaving this field blank will replicate data from all connected accounts. 8. (Optional) Enter customer statuses to filter customers. Leaving this field blank will replicate data from all accounts. Check [Google Ads documentation](https://developers.google.com/google-ads/api/reference/rpc/v15/CustomerStatusEnum.CustomerStatus) for more info. 9. (Optional) Enter a **Start Date** using the provided datepicker, or by programmatically entering the date in YYYY-MM-DD format. The data added on and after this date will be replicated. (Default start date is 2 years ago) 10. (Optional) You can use the **Custom GAQL Queries** field to enter a custom query using Google Ads Query Language. Click **Add** and enter your query, as well as the desired name of the table for this data in the destination. Multiple queries can be provided. For more information on formulating these queries, refer to our [guide below](#custom-query-understanding-google-ads-query-language). @@ -105,7 +106,9 @@ The Google Ads source connector supports the following [sync modes](https://docs - [Incremental Sync - Append + Deduped](https://docs.airbyte.com/understanding-airbyte/connections/incremental-append-deduped) #### Incremental Events Streams + List of stream: + - [ad_group_criterions](https://developers.google.com/google-ads/api/fields/v15/ad_group_criterion) - [ad_listing_group_criterions](https://developers.google.com/google-ads/api/fields/v15/ad_group_criterion) - [campaign_criterion](https://developers.google.com/google-ads/api/fields/v15/campaign_criterion) @@ -117,6 +120,7 @@ The initial sync operates as a full refresh. Subsequent syncs begin by reading u :::warning It's important to note that the Google Ads API resource ChangeStatus has a limit of 10,000 records per request. That's why you cannot sync stream with more than 10,000 updates in a single microsecond. In such cases, it's recommended to use a full refresh sync to ensure all updates are captured. ::: + ## Supported Streams The Google Ads source connector can sync the following tables. It can also sync custom queries using GAQL. @@ -126,41 +130,52 @@ The Google Ads source connector can sync the following tables. It can also sync - [customer](https://developers.google.com/google-ads/api/fields/v15/customer) Highlights the setup and configurations of a Google Ads account. It encompasses features like call reporting and conversion tracking, giving a clear picture of the account's operational settings and features. + - [customer_label](https://developers.google.com/google-ads/api/fields/v15/customer_label) - [campaign_criterion](https://developers.google.com/google-ads/api/fields/v15/campaign_criterion) Targeting option for a campaign, such as a keyword, placement, or audience. + - [campaign_bidding_strategy](https://developers.google.com/google-ads/api/fields/v15/campaign) Represents the bidding strategy at the campaign level. + - [campaign_label](https://developers.google.com/google-ads/api/fields/v15/campaign_label) - [label](https://developers.google.com/google-ads/api/fields/v15/label) Represents labels that can be attached to different entities such as campaigns or ads. 
+ - [ad_group_ad](https://developers.google.com/google-ads/api/fields/v15/ad_group_ad) Different attributes of ads from ad groups segmented by date. + - [ad_group_ad_label](https://developers.google.com/google-ads/api/fields/v15/ad_group_ad_label) - [ad_group](https://developers.google.com/google-ads/api/fields/v15/ad_group) Represents an ad group within a campaign. Ad groups contain one or more ads which target a shared set of keywords. + - [ad_group_label](https://developers.google.com/google-ads/api/fields/v15/ad_group_label) - [ad_group_bidding_strategy](https://developers.google.com/google-ads/api/fields/v15/ad_group) Represents the bidding strategy at the ad group level. + - [ad_group_criterion](https://developers.google.com/google-ads/api/fields/v15/ad_group_criterion) Represents criteria in an ad group, such as keywords or placements. + - [ad_listing_group_criterion](https://developers.google.com/google-ads/api/fields/v15/ad_group_criterion) Represents criteria for listing group ads. + - [ad_group_criterion_label](https://developers.google.com/google-ads/api/fields/v15/ad_group_criterion_label) - [audience](https://developers.google.com/google-ads/api/fields/v15/audience) Represents user lists that are defined by the advertiser to target specific users. + - [user_interest](https://developers.google.com/google-ads/api/fields/v15/user_interest) A particular interest-based vertical to be targeted. + - [click_view](https://developers.google.com/google-ads/api/reference/rpc/v15/ClickView) A click view with metrics aggregated at each click level, including both valid and invalid clicks. @@ -172,30 +187,39 @@ Note that `ad_group`, `ad_group_ad`, and `campaign` contain a `labels` field, wh - [account_performance_report](https://developers.google.com/google-ads/api/docs/migration/mapping#account_performance) Provides in-depth metrics related to ads interactions, including viewability, click-through rates, and conversions. Segments data by various factors, offering a granular look into how ads perform across different contexts. + - [campaign](https://developers.google.com/google-ads/api/fields/v15/campaign) Represents a campaign in Google Ads. + - [campaign_budget](https://developers.google.com/google-ads/api/fields/v15/campaign_budget) Represents the budget settings of a campaign. + - [geographic_view](https://developers.google.com/google-ads/api/fields/v15/geographic_view) Geographic View includes all metrics aggregated at the country level. It reports metrics at either actual physical location of the user or an area of interest. + - [user_location_view](https://developers.google.com/google-ads/api/fields/v15/user_location_view) User Location View includes all metrics aggregated at the country level. It reports metrics at the actual physical location of the user by targeted or not targeted location. + - [display_keyword_view](https://developers.google.com/google-ads/api/fields/v15/display_keyword_view) Metrics for display keywords, which are keywords that are targeted in display campaigns. + - [topic_view](https://developers.google.com/google-ads/api/fields/v15/topic_view) Reporting view that shows metrics aggregated by topic, which are broad categories of interests that users have. + - [shopping_performance_view](https://developers.google.com/google-ads/api/fields/v15/shopping_performance_view) Provides Shopping campaign statistics aggregated at several product dimension levels. 
Product dimension values from Merchant Center such as brand, category, custom attributes, product condition and product type will reflect the state of each dimension as of the date and time when the corresponding event was recorded. + - [keyword_view](https://developers.google.com/google-ads/api/fields/v15/keyword_view) Provides metrics related to the performance of keywords in the campaign. + - [ad_group_ad_legacy](https://developers.google.com/google-ads/api/fields/v15/ad_group_ad) Metrics and attributes of legacy ads from ad groups. @@ -205,14 +229,14 @@ Due to Google Ads API constraints, the `click_view` stream retrieves data one da ::: :::warning -Google Ads doesn't support `PERFORMANCE_MAX` campaigns on `ad_group` or `ad` stream level, only on `campaign` level. +Google Ads doesn't support `PERFORMANCE_MAX` campaigns on `ad_group` or `ad` stream level, only on `campaign` level. If you have this type of campaign Google will remove them from the results for the `ads` reports. More [info](https://github.com/airbytehq/airbyte/issues/11062) and [Google Discussions](https://groups.google.com/g/adwords-api/c/_mxbgNckaLQ). ::: For incremental streams, data is synced up to the previous day using your Google Ads account time zone since Google Ads can filter data only by [date](https://developers.google.com/google-ads/api/fields/v15/ad_group_ad#segments.date) without time. Also, some reports cannot load data real-time due to Google Ads [limitations](https://support.google.com/google-ads/answer/2544985?hl=en). -### Reasoning Behind Primary Key Selection +### Reasoning Behind Primary Key Selection Primary keys are chosen to uniquely identify records within streams. In this selection, we considered the scope of ID uniqueness as detailed in [the Google Ads API structure documentation](https://developers.google.com/google-ads/api/docs/concepts/api-structure#object_ids). This approach guarantees that each record remains unique across various scopes and contexts. Moreover, in the Google Ads API, segmentation is crucial for dissecting performance data. As pointed out in [the Google Ads support documentation](https://developers.google.com/google-ads/api/docs/reporting/segmentation), segments offer a granular insight into data based on specific criteria, like device type or click interactions. @@ -242,14 +266,13 @@ Follow Google's guidance on [Selectability between segments and metrics](https:/ For an existing Google Ads source, when you are updating or removing Custom GAQL Queries, you should also subsequently refresh your source schema to pull in any changes. ::: - ## Difference between manager and client accounts A manager account isn't an "upgrade" of your Google Ads account. Instead, it's an entirely new Google Ads account you create. Think of a manager account as an umbrella Google Ads account with several individual Google Ads accounts linked to it. You can link new and existing Google Ads accounts, as well as other manager accounts. You can then monitor ad performance, update campaigns, and manage other account tasks for those client accounts. Your manager account can also be given ownership of a client account. This allows you to manage user access for the client account. -[Link](https://support.google.com/google-ads/answer/6139186?hl=en#) for more details on how it works and how you can create it. +[Link](https://support.google.com/google-ads/answer/6139186?hl=en#) for more details on how it works and how you can create it. 
**Manager Accounts (MCC)** primarily focus on account management and oversight. They can access and manage multiple client accounts, view shared resources, and handle invitations to link with client accounts. @@ -279,7 +302,7 @@ Due to a limitation in the Google Ads API which does not allow getting performan ## Changelog | Version | Date | Pull Request | Subject | -|:---------|:-----------|:---------------------------------------------------------|:-------------------------------------------------------------------------------------------------------------------------------------| +| :------- | :--------- | :------------------------------------------------------- | :----------------------------------------------------------------------------------------------------------------------------------- | | `3.4.2` | 2024-04-24 | [36638](https://github.com/airbytehq/airbyte/pull/36638) | Schema descriptions and CDK 0.80.0 | | `3.4.1` | 2024-04-08 | [36891](https://github.com/airbytehq/airbyte/pull/36891) | Optimize `check` method | | `3.4.0` | 2024-03-19 | [36267](https://github.com/airbytehq/airbyte/pull/36267) | Pin airbyte-cdk version to `^0` | diff --git a/docs/integrations/sources/google-analytics-data-api-migrations.md b/docs/integrations/sources/google-analytics-data-api-migrations.md index 84ac3684a6d..6872c06e870 100644 --- a/docs/integrations/sources/google-analytics-data-api-migrations.md +++ b/docs/integrations/sources/google-analytics-data-api-migrations.md @@ -2,24 +2,26 @@ ## Upgrading to 2.0.0 -This version update only affects the schema of GA4 connections that sync more than one property. +This version update only affects the schema of GA4 connections that sync more than one property. -Version 2.0.0 prevents the duplication of stream names by renaming some property streams with a new stream name that includes the property ID. +Version 2.0.0 prevents the duplication of stream names by renaming some property streams with a new stream name that includes the property ID. - If you only are syncing from one property, no changes will occur when you upgrade to the new version. The stream names will continue to appear as: - - "daily_active_users", - - "weekly_active_users" +If you only are syncing from one property, no changes will occur when you upgrade to the new version. The stream names will continue to appear as: -If you are syncing more than one property, any property after the first will have the property ID appended to the stream name. +- "daily_active_users", +- "weekly_active_users" + +If you are syncing more than one property, any property after the first will have the property ID appended to the stream name. For example, if your property IDs are: `0001`, `0002`, `0003`, the streams related to properties `0002` and `0003` will have the property ID appended to the end of the stream name. - - "daily_active_users", - - "daily_active_users_property_0002", - - "daily_active_users_property_0003", - - "weekly_active_users", - - "weekly_active_users_property_0002" - - "weekly_active_users_property_0003" + +- "daily_active_users", +- "daily_active_users_property_0002", +- "daily_active_users_property_0003", +- "weekly_active_users", +- "weekly_active_users_property_0002" +- "weekly_active_users_property_0003" If you are syncing more than one property ID, you will need to reset those streams to ensure syncing continues accurately. -In the future, if you add an additional property ID, all new streams will append the property ID to the stream name without affecting existing streams. 
A reset is not required if you add the consecutive property after upgrading to 2.0.0. \ No newline at end of file +In the future, if you add an additional property ID, all new streams will append the property ID to the stream name without affecting existing streams. A reset is not required if you add the consecutive property after upgrading to 2.0.0. diff --git a/docs/integrations/sources/google-analytics-data-api.md b/docs/integrations/sources/google-analytics-data-api.md index 5ba658c1b18..4bd58a4eacd 100644 --- a/docs/integrations/sources/google-analytics-data-api.md +++ b/docs/integrations/sources/google-analytics-data-api.md @@ -17,6 +17,7 @@ The [Google Analytics Universal Analytics (UA) connector](https://docs.airbyte.c ### For Airbyte Cloud + For **Airbyte Cloud** users, we highly recommend using OAuth for authentication, as this significantly simplifies the setup process by allowing you to authenticate your Google Analytics account directly in the Airbyte UI. Please follow the steps below to set up the connector using this method. 1. [Log in to your Airbyte Cloud](https://cloud.airbyte.com/workspaces) account. @@ -25,9 +26,9 @@ For **Airbyte Cloud** users, we highly recommend using OAuth for authentication, 4. In the **Source name** field, enter a name to help you identify this source. 5. Select **Authenticate via Google (Oauth)** from the dropdown menu and click **Authenticate your Google Analytics 4 (GA4) account**. This will open a pop-up window where you can log in to your Google account and grant Airbyte access to your Google Analytics account. 6. Enter the **Property ID** whose events are tracked. This ID should be a numeric value, such as `123456789`. If you are unsure where to find this value, refer to [Google's documentation](https://developers.google.com/analytics/devguides/reporting/data/v1/property-id#what_is_my_property_id). -:::note -If the Property Settings shows a "Tracking Id" such as "UA-123...-1", this denotes that the property is a Universal Analytics property, and the Analytics data for that property cannot be reported on using this connector. You can create a new Google Analytics 4 property by following [these instructions](https://support.google.com/analytics/answer/9744165?hl=en). -::: + :::note + If the Property Settings shows a "Tracking Id" such as "UA-123...-1", this denotes that the property is a Universal Analytics property, and the Analytics data for that property cannot be reported on using this connector. You can create a new Google Analytics 4 property by following [these instructions](https://support.google.com/analytics/answer/9744165?hl=en). + ::: 7. (Optional) In the **Start Date** field, use the provided datepicker or enter a date programmatically in the format `YYYY-MM-DD`. All data added from this date onward will be replicated. Note that this setting is _not_ applied to custom Cohort reports. 8. (Optional) In the **Custom Reports** field, you may optionally describe any custom reports you want to sync from Google Analytics. See the [Custom Reports](#custom-reports) section below for more information on formulating these reports. @@ -37,7 +38,7 @@ If the Property Settings shows a "Tracking Id" such as "UA-123...-1", this denot It's important to consider how dimensions like `month` or `yearMonth` are specified. These dimensions organize the data according to your preferences. However, keep in mind that the data presentation is also influenced by the chosen date range for the report. 
In cases where a very specific date range is selected, such as a single day (**Data Request Interval (Days)** set to one day), duplicated data entries for each day might appear. -To mitigate this, we recommend adjusting the **Data Request Interval (Days)** value to 364. By doing so, you can obtain more precise results and prevent the occurrence of duplicated data. +To mitigate this, we recommend adjusting the **Data Request Interval (Days)** value to 364. By doing so, you can obtain more precise results and prevent the occurrence of duplicated data. ::: @@ -77,9 +78,9 @@ Before you can use the service account to access Google Analytics data, you need 3. Find and select **Google Analytics 4 (GA4)** from the list of available sources. 4. Select **Service Account Key Authenication** dropdown list and enter **Service Account JSON Key** from Step 1. 5. Enter the **Property ID** whose events are tracked. This ID should be a numeric value, such as `123456789`. If you are unsure where to find this value, refer to [Google's documentation](https://developers.google.com/analytics/devguides/reporting/data/v1/property-id#what_is_my_property_id). -:::note -If the Property Settings shows a "Tracking Id" such as "UA-123...-1", this denotes that the property is a Universal Analytics property, and the Analytics data for that property cannot be reported on in the Data API. You can create a new Google Analytics 4 property by following [these instructions](https://support.google.com/analytics/answer/9744165?hl=en). -::: + :::note + If the Property Settings shows a "Tracking Id" such as "UA-123...-1", this denotes that the property is a Universal Analytics property, and the Analytics data for that property cannot be reported on in the Data API. You can create a new Google Analytics 4 property by following [these instructions](https://support.google.com/analytics/answer/9744165?hl=en). + ::: 6. (Optional) In the **Start Date** field, use the provided datepicker or enter a date programmatically in the format `YYYY-MM-DD`. All data added from this date onward will be replicated. Note that this setting is _not_ applied to custom Cohort reports. @@ -99,7 +100,7 @@ Many analyses and data investigations may require 24-48 hours to process informa It's important to consider how dimensions like `month` or `yearMonth` are specified. These dimensions organize the data according to your preferences. However, keep in mind that the data presentation is also influenced by the chosen date range for the report. In cases where a very specific date range is selected, such as a single day (**Data Request Interval (Days)** set to one day), duplicated data entries for each day might appear. -To mitigate this, we recommend adjusting the **Data Request Interval (Days)** value to 364. By doing so, you can obtain more precise results and prevent the occurrence of duplicated data. +To mitigate this, we recommend adjusting the **Data Request Interval (Days)** value to 364. By doing so, you can obtain more precise results and prevent the occurrence of duplicated data. ::: @@ -193,7 +194,6 @@ Custom reports in Google Analytics allow for flexibility in querying specific da A full list of dimensions and metrics supported in the API can be found [here](https://developers.google.com/analytics/devguides/reporting/data/v1/api-schema). To ensure your dimensions and metrics are compatible for your GA4 property, you can use the [GA4 Dimensions & Metrics Explorer](https://ga-dev-tools.google/ga4/dimensions-metrics-explorer/). 
- The following is an example of a basic User Engagement report to track sessions and bounce rate, segmented by city: ```json @@ -263,8 +263,8 @@ The Google Analytics connector is subject to Google Analytics Data API quotas. P ## Changelog | Version | Date | Pull Request | Subject | -|:--------|:-----------| :------------------------------------------------------- | :------------------------------------------------------------------------------------- | -| 2.4.2 | 2024-03-20 | [36302](https://github.com/airbytehq/airbyte/pull/36302) | Don't extract state from the latest record if stream doesn't have a cursor_field | +| :------ | :--------- | :------------------------------------------------------- | :------------------------------------------------------------------------------------- | +| 2.4.2 | 2024-03-20 | [36302](https://github.com/airbytehq/airbyte/pull/36302) | Don't extract state from the latest record if stream doesn't have a cursor_field | | 2.4.1 | 2024-02-09 | [35073](https://github.com/airbytehq/airbyte/pull/35073) | Manage dependencies with Poetry. | | 2.4.0 | 2024-02-07 | [34951](https://github.com/airbytehq/airbyte/pull/34951) | Replace the spec parameter from previous version to convert all `conversions:*` fields | | 2.3.0 | 2024-02-06 | [34907](https://github.com/airbytehq/airbyte/pull/34907) | Add new parameter to spec to convert `conversions:purchase` field to float | @@ -304,4 +304,4 @@ The Google Analytics connector is subject to Google Analytics Data API quotas. P | 0.1.0 | 2023-01-08 | [20889](https://github.com/airbytehq/airbyte/pull/20889) | Improved config validation, SAT | | 0.0.3 | 2022-08-15 | [15229](https://github.com/airbytehq/airbyte/pull/15229) | Source Google Analytics Data Api: code refactoring | | 0.0.2 | 2022-07-27 | [15087](https://github.com/airbytehq/airbyte/pull/15087) | fix documentationUrl | -| 0.0.1 | 2022-05-09 | [12701](https://github.com/airbytehq/airbyte/pull/12701) | Introduce Google Analytics Data API source | \ No newline at end of file +| 0.0.1 | 2022-05-09 | [12701](https://github.com/airbytehq/airbyte/pull/12701) | Introduce Google Analytics Data API source | diff --git a/docs/integrations/sources/google-analytics-v4-service-account-only.md b/docs/integrations/sources/google-analytics-v4-service-account-only.md index 9cd7dd1a221..8f7a4cc46a8 100644 --- a/docs/integrations/sources/google-analytics-v4-service-account-only.md +++ b/docs/integrations/sources/google-analytics-v4-service-account-only.md @@ -59,11 +59,11 @@ A Google Cloud account with [Viewer permissions](https://support.google.com/anal 4. Enter a name for the Google Analytics connector. 5. Authenticate your Google account via Service Account Key Authentication: - To authenticate your Google account via Service Account Key Authentication, enter your [Google Cloud service account key](https://cloud.google.com/iam/docs/creating-managing-service-account-keys#creating_service_account_keys) in JSON format. Use the service account email address to [add a user](https://support.google.com/analytics/answer/1009702) to the Google analytics view you want to access via the API and grant [Read and Analyze permissions](https://support.google.com/analytics/answer/2884495). -5. Enter the **Replication Start Date** in YYYY-MM-DD format. The data added on and after this date will be replicated. If this field is blank, Airbyte will replicate all data. +6. Enter the **Replication Start Date** in YYYY-MM-DD format. The data added on and after this date will be replicated. 
If this field is blank, Airbyte will replicate all data. -6. Enter the [**View ID**](https://ga-dev-tools.appspot.com/account-explorer/) for the Google Analytics View you want to fetch data from. -7. Optionally, enter a JSON object as a string in the **Custom Reports** field. For details, refer to [Requesting custom reports](#requesting-custom-reports) -8. Leave **Data request time increment in days (Optional)** blank or set to 1. For faster syncs, set this value to more than 1 but that might result in the Google Analytics API returning [sampled data](#sampled-data-in-reports), potentially causing inaccuracies in the returned results. The maximum allowed value is 364. +7. Enter the [**View ID**](https://ga-dev-tools.appspot.com/account-explorer/) for the Google Analytics View you want to fetch data from. +8. Optionally, enter a JSON object as a string in the **Custom Reports** field. For details, refer to [Requesting custom reports](#requesting-custom-reports) +9. Leave **Data request time increment in days (Optional)** blank or set to 1. For faster syncs, set this value to more than 1 but that might result in the Google Analytics API returning [sampled data](#sampled-data-in-reports), potentially causing inaccuracies in the returned results. The maximum allowed value is 364. @@ -87,7 +87,7 @@ You need to add the service account email address on the account level, not the The Google Analytics (Universal Analytics) source connector can sync the following tables: | Stream name | Schema | -|:-------------------------|:------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------| +| :----------------------- | :---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------- | | website_overview | `{"ga_date":"2021-02-11","ga_users":1,"ga_newUsers":0,"ga_sessions":9,"ga_sessionsPerUser":9.0,"ga_avgSessionDuration":28.77777777777778,"ga_pageviews":63,"ga_pageviewsPerSession":7.0,"ga_avgTimeOnPage":4.685185185185185,"ga_bounceRate":0.0,"ga_exitRate":14.285714285714285,"view_id":"211669975"}` | | traffic_sources | `{"ga_date":"2021-02-11","ga_source":"(direct)","ga_medium":"(none)","ga_socialNetwork":"(not set)","ga_users":1,"ga_newUsers":0,"ga_sessions":9,"ga_sessionsPerUser":9.0,"ga_avgSessionDuration":28.77777777777778,"ga_pageviews":63,"ga_pageviewsPerSession":7.0,"ga_avgTimeOnPage":4.685185185185185,"ga_bounceRate":0.0,"ga_exitRate":14.285714285714285,"view_id":"211669975"}` | | pages | `{"ga_date":"2021-02-11","ga_hostname":"mydemo.com","ga_pagePath":"/home5","ga_pageviews":63,"ga_uniquePageviews":9,"ga_avgTimeOnPage":4.685185185185185,"ga_entrances":9,"ga_entranceRate":14.285714285714285,"ga_bounceRate":0.0,"ga_exits":9,"ga_exitRate":14.285714285714285,"view_id":"211669975"}` | @@ -127,50 +127,41 @@ Custom Reports allow for 
flexibility in the reporting dimensions and metrics to A custom report is formatted as: `[{"name": "", "dimensions": ["", ...], "metrics": ["", ...]}]` Example of a custom report: -```json -[{ - "name" : "page_views_and_users", - "dimensions" :[ - "ga:date", - "ga:pagePath", - "ga:sessionDefaultChannelGrouping" - ], - "metrics" :[ - "ga:screenPageViews", - "ga:totalUsers" - ] -}] -``` -Multiple custom reports should be entered with a comma separator. Each custom report is created as it's own stream. -Example of multiple custom reports: + ```json [ { - "name" : "page_views_and_users", - "dimensions" :[ + "name": "page_views_and_users", + "dimensions": [ "ga:date", - "ga:pagePath" + "ga:pagePath", + "ga:sessionDefaultChannelGrouping" ], - "metrics" :[ - "ga:screenPageViews", - "ga:totalUsers" - ] + "metrics": ["ga:screenPageViews", "ga:totalUsers"] + } +] +``` + +Multiple custom reports should be entered with a comma separator. Each custom report is created as it's own stream. +Example of multiple custom reports: + +```json +[ + { + "name": "page_views_and_users", + "dimensions": ["ga:date", "ga:pagePath"], + "metrics": ["ga:screenPageViews", "ga:totalUsers"] }, { - "name" : "sessions_by_region", - "dimensions" :[ - "ga:date", - "ga:region" - ], - "metrics" :[ - "ga:totalUsers", - "ga:sessions" - ] + "name": "sessions_by_region", + "dimensions": ["ga:date", "ga:region"], + "metrics": ["ga:totalUsers", "ga:sessions"] } ] ``` Custom reports can also include segments and filters to pull a subset of your data. The report should be formatted as: + ```json [ { @@ -183,27 +174,20 @@ Custom reports can also include segments and filters to pull a subset of your da ] ``` -* When using segments, make sure you also add the `ga:segment` dimension. +- When using segments, make sure you also add the `ga:segment` dimension. Example of a custom report with segments and/or filters: + ```json -[{ "name" : "page_views_and_users", - "dimensions" :[ - "ga:date", - "ga:pagePath", - "ga:segment" - ], - "metrics" :[ - "ga:sessions", - "ga:totalUsers" - ], - "segments" :[ - "ga:sessionSource!=(direct)" - ], - "filter" :[ - "ga:sessionSource!=(direct);ga:sessionSource!=(not set)" - ] -}] +[ + { + "name": "page_views_and_users", + "dimensions": ["ga:date", "ga:pagePath", "ga:segment"], + "metrics": ["ga:sessions", "ga:totalUsers"], + "segments": ["ga:sessionSource!=(direct)"], + "filter": ["ga:sessionSource!=(direct);ga:sessionSource!=(not set)"] + } +] ``` To create a list of dimensions, you can use default Google Analytics dimensions (listed below) or custom dimensions if you have some defined. Each report can contain no more than 7 dimensions, and they must all be unique. The default Google Analytics dimensions are: @@ -273,15 +257,15 @@ The Google Analytics connector should not run into the "requests per 100 seconds -* Check out common troubleshooting issues for the Google Analytics v4 source connector on our [Airbyte Forum](https://github.com/airbytehq/airbyte/discussions). +- Check out common troubleshooting issues for the Google Analytics v4 source connector on our [Airbyte Forum](https://github.com/airbytehq/airbyte/discussions). 
## Changelog | Version | Date | Pull Request | Subject | -|:--------|:-----------|:---------------------------------------------------------|:-----------------------------------------| +| :------ | :--------- | :------------------------------------------------------- | :--------------------------------------- | | 0.0.2 | 2024-04-19 | [37432](https://github.com/airbytehq/airbyte/pull/36267) | Fix empty response error for test stream | | 0.0.1 | 2024-01-29 | [34323](https://github.com/airbytehq/airbyte/pull/34323) | Initial Release | - \ No newline at end of file + diff --git a/docs/integrations/sources/google-analytics-v4.md b/docs/integrations/sources/google-analytics-v4.md index 44105537d1a..172788e601b 100644 --- a/docs/integrations/sources/google-analytics-v4.md +++ b/docs/integrations/sources/google-analytics-v4.md @@ -61,11 +61,11 @@ A Google Cloud account with [Viewer permissions](https://support.google.com/anal 5. Authenticate your Google account via OAuth or Service Account Key Authentication: - To authenticate your Google account via OAuth, enter your Google application's [client ID, client secret, and refresh token](https://developers.google.com/identity/protocols/oauth2). - To authenticate your Google account via Service Account Key Authentication, enter your [Google Cloud service account key](https://cloud.google.com/iam/docs/creating-managing-service-account-keys#creating_service_account_keys) in JSON format. Use the service account email address to [add a user](https://support.google.com/analytics/answer/1009702) to the Google analytics view you want to access via the API and grant [Read and Analyze permissions](https://support.google.com/analytics/answer/2884495). -5. Enter the **Replication Start Date** in YYYY-MM-DD format. The data added on and after this date will be replicated. If this field is blank, Airbyte will replicate all data. +6. Enter the **Replication Start Date** in YYYY-MM-DD format. The data added on and after this date will be replicated. If this field is blank, Airbyte will replicate all data. -6. Enter the [**View ID**](https://ga-dev-tools.appspot.com/account-explorer/) for the Google Analytics View you want to fetch data from. -7. Optionally, enter a JSON object as a string in the **Custom Reports** field. For details, refer to [Requesting custom reports](#requesting-custom-reports) -8. Leave **Data request time increment in days (Optional)** blank or set to 1. For faster syncs, set this value to more than 1 but that might result in the Google Analytics API returning [sampled data](#sampled-data-in-reports), potentially causing inaccuracies in the returned results. The maximum allowed value is 364. +7. Enter the [**View ID**](https://ga-dev-tools.appspot.com/account-explorer/) for the Google Analytics View you want to fetch data from. +8. Optionally, enter a JSON object as a string in the **Custom Reports** field. For details, refer to [Requesting custom reports](#requesting-custom-reports) +9. Leave **Data request time increment in days (Optional)** blank or set to 1. For faster syncs, set this value to more than 1 but that might result in the Google Analytics API returning [sampled data](#sampled-data-in-reports), potentially causing inaccuracies in the returned results. The maximum allowed value is 364. 
@@ -129,50 +129,41 @@ Custom Reports allow for flexibility in the reporting dimensions and metrics to A custom report is formatted as: `[{"name": "", "dimensions": ["", ...], "metrics": ["", ...]}]` Example of a custom report: -```json -[{ - "name" : "page_views_and_users", - "dimensions" :[ - "ga:date", - "ga:pagePath", - "ga:sessionDefaultChannelGrouping" - ], - "metrics" :[ - "ga:screenPageViews", - "ga:totalUsers" - ] -}] -``` -Multiple custom reports should be entered with a comma separator. Each custom report is created as it's own stream. -Example of multiple custom reports: + ```json [ { - "name" : "page_views_and_users", - "dimensions" :[ + "name": "page_views_and_users", + "dimensions": [ "ga:date", - "ga:pagePath" + "ga:pagePath", + "ga:sessionDefaultChannelGrouping" ], - "metrics" :[ - "ga:screenPageViews", - "ga:totalUsers" - ] + "metrics": ["ga:screenPageViews", "ga:totalUsers"] + } +] +``` + +Multiple custom reports should be entered with a comma separator. Each custom report is created as it's own stream. +Example of multiple custom reports: + +```json +[ + { + "name": "page_views_and_users", + "dimensions": ["ga:date", "ga:pagePath"], + "metrics": ["ga:screenPageViews", "ga:totalUsers"] }, { - "name" : "sessions_by_region", - "dimensions" :[ - "ga:date", - "ga:region" - ], - "metrics" :[ - "ga:totalUsers", - "ga:sessions" - ] + "name": "sessions_by_region", + "dimensions": ["ga:date", "ga:region"], + "metrics": ["ga:totalUsers", "ga:sessions"] } ] ``` Custom reports can also include segments and filters to pull a subset of your data. The report should be formatted as: + ```json [ { @@ -185,27 +176,20 @@ Custom reports can also include segments and filters to pull a subset of your da ] ``` -* When using segments, make sure you also add the `ga:segment` dimension. +- When using segments, make sure you also add the `ga:segment` dimension. Example of a custom report with segments and/or filters: + ```json -[{ "name" : "page_views_and_users", - "dimensions" :[ - "ga:date", - "ga:pagePath", - "ga:segment" - ], - "metrics" :[ - "ga:sessions", - "ga:totalUsers" - ], - "segments" :[ - "ga:sessionSource!=(direct)" - ], - "filter" :[ - "ga:sessionSource!=(direct);ga:sessionSource!=(not set)" - ] -}] +[ + { + "name": "page_views_and_users", + "dimensions": ["ga:date", "ga:pagePath", "ga:segment"], + "metrics": ["ga:sessions", "ga:totalUsers"], + "segments": ["ga:sessionSource!=(direct)"], + "filter": ["ga:sessionSource!=(direct);ga:sessionSource!=(not set)"] + } +] ``` To create a list of dimensions, you can use default Google Analytics dimensions (listed below) or custom dimensions if you have some defined. Each report can contain no more than 7 dimensions, and they must all be unique. The default Google Analytics dimensions are: @@ -275,14 +259,14 @@ The Google Analytics connector should not run into the "requests per 100 seconds -* Check out common troubleshooting issues for the Google Analytics v4 source connector on our [Airbyte Forum](https://github.com/airbytehq/airbyte/discussions). +- Check out common troubleshooting issues for the Google Analytics v4 source connector on our [Airbyte Forum](https://github.com/airbytehq/airbyte/discussions). 
## Changelog | Version | Date | Pull Request | Subject | -|:--------|:-----------|:---------------------------------------------------------|:---------------------------------------------------------------------------------------------| +| :------ | :--------- | :------------------------------------------------------- | :------------------------------------------------------------------------------------------- | | 0.3.1 | 2024-04-19 | [37432](https://github.com/airbytehq/airbyte/pull/36267) | Fix empty response error for test stream | | 0.3.0 | 2024-03-19 | [36267](https://github.com/airbytehq/airbyte/pull/36267) | Pin airbyte-cdk version to `^0` | | 0.2.5 | 2024-02-09 | [35101](https://github.com/airbytehq/airbyte/pull/35101) | Manage dependencies with Poetry. | diff --git a/docs/integrations/sources/google-directory.md b/docs/integrations/sources/google-directory.md index d263d9efc93..e5f8863d20d 100644 --- a/docs/integrations/sources/google-directory.md +++ b/docs/integrations/sources/google-directory.md @@ -8,28 +8,28 @@ The Directory source supports Full Refresh syncs. It uses [Google Directory API] This Source is capable of syncing the following Streams: -* [users](https://developers.google.com/admin-sdk/directory/v1/guides/manage-users#get_all_users) -* [groups](https://developers.google.com/admin-sdk/directory/v1/guides/manage-groups#get_all_domain_groups) -* [group members](https://developers.google.com/admin-sdk/directory/v1/guides/manage-group-members#get_all_members) +- [users](https://developers.google.com/admin-sdk/directory/v1/guides/manage-users#get_all_users) +- [groups](https://developers.google.com/admin-sdk/directory/v1/guides/manage-groups#get_all_domain_groups) +- [group members](https://developers.google.com/admin-sdk/directory/v1/guides/manage-group-members#get_all_members) ### Data type mapping | Integration Type | Airbyte Type | Notes | -| :--- | :--- | :--- | -| `string` | `string` | | -| `number` | `number` | | -| `array` | `array` | | -| `object` | `object` | | +| :--------------- | :----------- | :---- | +| `string` | `string` | | +| `number` | `number` | | +| `array` | `array` | | +| `object` | `object` | | ### Features -| Feature | Supported?\(Yes/No\) | Notes | -| :--- | :--- | :--- | -| Full Refresh Sync | Yes | | -| Incremental Sync | No | | -| Replicate Incremental Deletes | Coming soon | | -| SSL connection | Yes | | -| Namespaces | No | | +| Feature | Supported?\(Yes/No\) | Notes | +| :---------------------------- | :------------------- | :---- | +| Full Refresh Sync | Yes | | +| Incremental Sync | No | | +| Replicate Incremental Deletes | Coming soon | | +| SSL connection | Yes | | +| Namespaces | No | | ### Performance considerations @@ -44,13 +44,13 @@ This connector attempts to back off gracefully when it hits Directory API's rate Google APIs use the OAuth 2.0 protocol for authentication and authorization. This connector supports [Web server application](https://developers.google.com/identity/protocols/oauth2#webserver) and [Service accounts](https://developers.google.com/identity/protocols/oauth2#serviceaccount) scenarios. Therefore, there are 2 options of setting up authorization for this source: -* Use your Google account and authorize over Google's OAuth on connection setup. Select "Default OAuth2.0 authorization" from dropdown list. -* Create service account specifically for Airbyte. +- Use your Google account and authorize over Google's OAuth on connection setup. Select "Default OAuth2.0 authorization" from dropdown list. 
+- Create service account specifically for Airbyte. ### Service account requirements -* Credentials to a Google Service Account with delegated Domain Wide Authority -* Email address of the workspace admin which created the Service Account +- Credentials to a Google Service Account with delegated Domain Wide Authority +- Email address of the workspace admin which created the Service Account ### Create a Service Account with delegated domain wide authority @@ -63,11 +63,10 @@ At the end of this process, you should have JSON credentials to this Google Serv You should now be ready to use the Google Directory connector in Airbyte. - ## Changelog | Version | Date | Pull Request | Subject | -|:--------|:-----------|:---------------------------------------------------------|:-------------------------------------------------------------| +| :------ | :--------- | :------------------------------------------------------- | :----------------------------------------------------------- | | 0.2.1 | 2023-05-30 | [27236](https://github.com/airbytehq/airbyte/pull/27236) | Autoformat code | | 0.2.0 | 2023-05-30 | [26775](https://github.com/airbytehq/airbyte/pull/26775) | Remove `authSpecification` from spec; update stream schemas. | | 0.1.9 | 2021-12-06 | [8524](https://github.com/airbytehq/airbyte/pull/8524) | Update connector fields title/description | diff --git a/docs/integrations/sources/google-drive.md b/docs/integrations/sources/google-drive.md index 66b1fc01603..11234250377 100644 --- a/docs/integrations/sources/google-drive.md +++ b/docs/integrations/sources/google-drive.md @@ -10,19 +10,21 @@ The Google Drive source connector pulls data from a single folder in Google Driv - Drive folder link - The link to the Google Drive folder you want to sync files from (includes files located in subfolders) -- **For Airbyte Cloud** A Google Workspace user with access to the spreadsheet - - -- **For Airbyte Open Source:** +- **For Airbyte Cloud** A Google Workspace user with access to the spreadsheet + + +- **For Airbyte Open Source:** - A GCP project - Enable the Google Drive API in your GCP project - Service Account Key with access to the Spreadsheet you want to replicate - + ## Setup guide The Google Drive source connector supports authentication via either OAuth or Service Account Key Authentication. + + For **Airbyte Cloud** users, we highly recommend using OAuth, as it significantly simplifies the setup process and allows you to authenticate [directly from the Airbyte UI](#set-up-the-google-drive-source-connector-in-airbyte). @@ -85,9 +87,9 @@ To set up Google Drive as a source in Airbyte Cloud: - **(Recommended)** Select **Service Account Key Authentication** from the dropdown and enter your Google Cloud service account key in JSON format: - ```js - { "type": "service_account", "project_id": "YOUR_PROJECT_ID", "private_key_id": "YOUR_PRIVATE_KEY", ... } - ``` + ```js + { "type": "service_account", "project_id": "YOUR_PROJECT_ID", "private_key_id": "YOUR_PRIVATE_KEY", ... } + ``` - To authenticate your Google account via OAuth, select **Authenticate via Google (OAuth)** from the dropdown and enter your Google application's client ID, client secret, and refresh token. 
@@ -203,7 +205,7 @@ Product,Description,Price Jeans,"Navy Blue, Bootcut, 34\"",49.99 ``` -The backslash (`\`) is used directly before the second double quote (`"`) to indicate that it is _not_ the closing quote for the field, but rather a literal double quote character that should be included in the value (in this example, denoting the size of the jeans in inches: `34"` ). +The backslash (`\`) is used directly before the second double quote (`"`) to indicate that it is _not_ the closing quote for the field, but rather a literal double quote character that should be included in the value (in this example, denoting the size of the jeans in inches: `34"` ). Leaving this field blank (default option) will disallow escaping. @@ -215,7 +217,6 @@ Leaving this field blank (default option) will disallow escaping. - **Strings Can Be Null**: Whether strings can be interpreted as null values. If true, strings that match the null_values set will be interpreted as null. If false, strings that match the null_values set will be interpreted as the string itself. - **True Values**: A set of case-sensitive strings that should be interpreted as true values. - ### Parquet Apache Parquet is a column-oriented data storage format of the Apache Hadoop ecosystem. It provides efficient data compression and encoding schemes with enhanced performance to handle complex data in bulk. At the moment, partitioned parquet datasets are unsupported. The following settings are available: @@ -225,6 +226,7 @@ Apache Parquet is a column-oriented data storage format of the Apache Hadoop eco ### Avro The Avro parser uses the [Fastavro library](https://fastavro.readthedocs.io/en/latest/). The following settings are available: + - **Convert Double Fields to Strings**: Whether to convert double fields to strings. This is recommended if you have decimal numbers with a high degree of precision because there can be a loss precision when handling floating point numbers. 
### JSONL @@ -250,7 +252,7 @@ This connector utilizes the open source [Unstructured](https://unstructured-io.g ## Changelog | Version | Date | Pull Request | Subject | -|---------|------------|----------------------------------------------------------|----------------------------------------------------------------------------------------------| +| ------- | ---------- | -------------------------------------------------------- | -------------------------------------------------------------------------------------------- | | 0.0.10 | 2024-03-28 | [36581](https://github.com/airbytehq/airbyte/pull/36581) | Manage dependencies with Poetry | | 0.0.9 | 2024-02-06 | [34936](https://github.com/airbytehq/airbyte/pull/34936) | Bump CDK version to avoid missing SyncMode errors | | 0.0.8 | 2024-01-30 | [34681](https://github.com/airbytehq/airbyte/pull/34681) | Unpin CDK version to make compatible with the Concurrent CDK | diff --git a/docs/integrations/sources/google-pagespeed-insights.md b/docs/integrations/sources/google-pagespeed-insights.md index a1e06ec8955..21216b3896f 100644 --- a/docs/integrations/sources/google-pagespeed-insights.md +++ b/docs/integrations/sources/google-pagespeed-insights.md @@ -5,6 +5,7 @@ This page guides you through the process of setting up the Google PageSpeed Insi ## Sync overview ## Prerequisites + - Your [Google PageSpeed `API Key`](https://developers.google.com/speed/docs/insights/v5/get-started#APIKey) ## Set up the Google PageSpeed Insights source connector @@ -19,7 +20,7 @@ This page guides you through the process of setting up the Google PageSpeed Insi 8. For **Lighthouse Categories**, select one or many of the provided options. Categories are also called "audits" in some of the [Google Lighthouse documentation](https://developer.chrome.com/docs/lighthouse/overview/). 9. Click **Set up source**. -> **IMPORTANT:** As of 2022-12-13, the PageSpeed Insights API - as well as this Airbyte Connector - allow to specify a URL with prefix "origin:" - like ``origin:https://www.google.com``. This results in condensed, aggregated reports about the specified origin - see [this FAQ](https://developers.google.com/speed/docs/insights/faq). **However**: This option is not specified in any official documentation anymore, therefore it might be deprecated anytime soon! +> **IMPORTANT:** As of 2022-12-13, the PageSpeed Insights API - as well as this Airbyte Connector - allow to specify a URL with prefix "origin:" - like `origin:https://www.google.com`. This results in condensed, aggregated reports about the specified origin - see [this FAQ](https://developers.google.com/speed/docs/insights/faq). **However**: This option is not specified in any official documentation anymore, therefore it might be deprecated anytime soon! ## Supported sync modes @@ -32,12 +33,13 @@ The Google PageSpeed Insights source connector supports the following [sync mode The Google PageSpeed Insights source connector supports the following stream: - [pagespeed](https://developers.google.com/speed/docs/insights/v5/get-started#cli): Full pagespeed report of the selected URLs, lighthouse categories and analyses strategies. 
+ ### Features -| Feature | Supported?\(Yes/No\) | Notes | -| :--- | :--- | :--- | -| Full Refresh Sync | Yes | | -| Incremental Sync | No | | +| Feature | Supported?\(Yes/No\) | Notes | +| :---------------- | :------------------- | :---- | +| Full Refresh Sync | Yes | | +| Incremental Sync | No | | ### Performance considerations @@ -48,11 +50,11 @@ If the connector is used with an API key, Google allows for 25.000 queries per d ## Changelog -| Version | Date | Pull Request | Subject | -| :------ | :--------- | :-------------------------------------------------------- | :----------------------------------------- | -| 0.1.5 | 2024-04-19 | [37171](https://github.com/airbytehq/airbyte/pull/37171) | Updating to 0.80.0 CDK | -| 0.1.4 | 2024-04-18 | [37171](https://github.com/airbytehq/airbyte/pull/37171) | Manage dependencies with Poetry. | -| 0.1.3 | 2024-04-15 | [37171](https://github.com/airbytehq/airbyte/pull/37171) | Base image migration: remove Dockerfile and use the python-connector-base image | -| 0.1.2 | 2024-04-12 | [37171](https://github.com/airbytehq/airbyte/pull/37171) | schema descriptions | -| 0.1.1 | 2023-05-25 | [#22287](https://github.com/airbytehq/airbyte/pull/22287) | 🐛 Fix URL pattern regex | -| 0.1.0 | 2022-11-26 | [#19813](https://github.com/airbytehq/airbyte/pull/19813) | 🎉 New Source: Google PageSpeed Insights [low-code CDK] | +| Version | Date | Pull Request | Subject | +| :------ | :--------- | :-------------------------------------------------------- | :------------------------------------------------------------------------------ | +| 0.1.5 | 2024-04-19 | [37171](https://github.com/airbytehq/airbyte/pull/37171) | Updating to 0.80.0 CDK | +| 0.1.4 | 2024-04-18 | [37171](https://github.com/airbytehq/airbyte/pull/37171) | Manage dependencies with Poetry. | +| 0.1.3 | 2024-04-15 | [37171](https://github.com/airbytehq/airbyte/pull/37171) | Base image migration: remove Dockerfile and use the python-connector-base image | +| 0.1.2 | 2024-04-12 | [37171](https://github.com/airbytehq/airbyte/pull/37171) | schema descriptions | +| 0.1.1 | 2023-05-25 | [#22287](https://github.com/airbytehq/airbyte/pull/22287) | 🐛 Fix URL pattern regex | +| 0.1.0 | 2022-11-26 | [#19813](https://github.com/airbytehq/airbyte/pull/19813) | 🎉 New Source: Google PageSpeed Insights [low-code CDK] | diff --git a/docs/integrations/sources/google-search-console.md b/docs/integrations/sources/google-search-console.md index d33b10dc6ac..bf616ecbb79 100644 --- a/docs/integrations/sources/google-search-console.md +++ b/docs/integrations/sources/google-search-console.md @@ -20,12 +20,15 @@ This page contains the setup guide and reference information for the Google Sear To authenticate the Google Search Console connector, you will need to use one of the following methods: + #### OAuth (Recommended for Airbyte Cloud) You can authenticate using your Google Account with OAuth if you are the owner of the Google Search Console property or have view permissions. Follow [Google's instructions](https://support.google.com/webmasters/answer/7687615?sjid=11103698321670173176-NA) to ensure that your account has the necessary permissions (**Owner** or **Full User**) to view the Google Search Console property. This option is recommended for **Airbyte Cloud** users, as it significantly simplifies the setup process and allows you to authenticate the connection [directly from the Airbyte UI](#step-2-set-up-the-google-search-console-connector-in-airbyte). 
+ + To authenticate with OAuth in **Airbyte Open Source**, you will need to create an authentication app and obtain the following credentials and tokens: - Client ID @@ -70,11 +73,13 @@ To enable delegated domain-wide authority, follow the steps listed in the [Googl - `https://www.googleapis.com/auth/webmasters.readonly` For more information on this topic, please refer to [this Google article](https://support.google.com/a/answer/162106?hl=en). + ### Step 2: Set up the Google Search Console connector in Airbyte + **For Airbyte Cloud:** 1. [Log in to your Airbyte Cloud](https://cloud.airbyte.com/workspaces) account. @@ -85,14 +90,16 @@ For more information on this topic, please refer to [this Google article](https: 6. For **Start Date**, by default the `2021-01-01` is set, use the provided datepicker or enter a date in the format `YYYY-MM-DD`. Any data created on or after this date will be replicated. 7. To authenticate the connection: + - **For Airbyte Cloud:** - Select **Oauth** from the Authentication dropdown, then click **Sign in with Google** to authorize your account. - - + + - **For Airbyte Open Source:** - (Recommended) Select **Service Account Key Authorization** from the Authentication dropdown, then enter the **Admin Email** and **Service Account JSON Key**. For the key, copy and paste the JSON key you obtained during the service account setup. It should begin with `{"type": "service account", "project_id": YOUR_PROJECT_ID, "private_key_id": YOUR_PRIVATE_KEY, ...}` - Select **Oauth** from the Authentication dropdown, then enter your **Client ID**, **Client Secret**, **Access Token** and **Refresh Token**. - + + 8. (Optional) For **End Date**, you may optionally provide a date in the format `YYYY-MM-DD`. Any data created between the defined Start Date and End Date will be replicated. Leaving this field blank will replicate all data created on or after the Start Date to the present. 9. (Optional) For **Custom Reports**, you may optionally provide an array of JSON objects representing any custom reports you wish to query the API with. Refer to the [Custom reports](#custom-reports) section below for more information on formulating these reports. 10. (Optional) For **Data Freshness**, you may choose whether to include "fresh" data that has not been finalized by Google, and may be subject to change. Please note that if you are using Incremental sync mode, we highly recommend leaving this option to its default value of `final`. Refer to the [Data Freshness](#data-freshness) section below for more information on this parameter. @@ -151,8 +158,8 @@ The available `Dimensions` are: For example, to query the API for a report that groups results by country, then by date, you could enter the following custom report: -* Name: country_date -* Dimensions: ["country", "date"] +- Name: country_date +- Dimensions: ["country", "date"] Please note, that for technical reasons `date` is the default dimension which will be included in your query whether you specify it or not. By specifying it you can change the order the results are grouped in. Primary key will consist of your custom dimensions and the default dimension along with `site_url` and `search_type`. 
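For illustration, the `country_date` report described above might be entered in the **Custom Reports** field roughly as follows. This is a minimal sketch based on the array-of-JSON-objects format described in this section; the lower-case `name` and `dimensions` keys are an assumption, not confirmed by this patch.

```json
[
  {
    "name": "country_date",
    "dimensions": ["country", "date"]
  }
]
```

Note that, per the text above, `date` is included by default even if it is omitted from the `dimensions` list.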
@@ -173,12 +180,12 @@ When using Incremental Sync mode, we recommend leaving this parameter to its def ## Data type map -| Integration Type | Airbyte Type | -|:------------------|:-------------| -| `string` | `string` | -| `number` | `number` | -| `array` | `array` | -| `object` | `object` | +| Integration Type | Airbyte Type | +| :--------------- | :----------- | +| `string` | `string` | +| `number` | `number` | +| `array` | `array` | +| `object` | `object` | ## Limitations & Troubleshooting @@ -199,53 +206,53 @@ Google Search Console only retains data for websites from the last 16 months. An ### Troubleshooting -* Check out common troubleshooting issues for the Google Search Console source connector on our [Airbyte Forum](https://github.com/airbytehq/airbyte/discussions). +- Check out common troubleshooting issues for the Google Search Console source connector on our [Airbyte Forum](https://github.com/airbytehq/airbyte/discussions). ## Changelog -| Version | Date | Pull Request | Subject | -|:--------|:-----------|:----------------------------------------------------------|:-------------------------------------------------------------------------------------------------------------------------------| -| 1.4.2 | 2024-04-19 | [36639](https://github.com/airbytehq/airbyte/pull/36639) | Updating to 0.80.0 CDK | -| 1.4.1 | 2024-04-12 | [36639](https://github.com/airbytehq/airbyte/pull/36639) | Schema descriptions | -| 1.4.0 | 2024-03-19 | [36267](https://github.com/airbytehq/airbyte/pull/36267) | Pin airbyte-cdk version to `^0` | -| 1.3.7 | 2024-02-12 | [35163](https://github.com/airbytehq/airbyte/pull/35163) | Manage dependencies with Poetry | -| 1.3.6 | 2023-10-26 | [31863](https://github.com/airbytehq/airbyte/pull/31863) | Base image migration: remove Dockerfile and use the python-connector-base image | -| 1.3.5 | 2023-09-28 | [30822](https://github.com/airbytehq/airbyte/pull/30822) | Fix primary key for custom reports | -| 1.3.4 | 2023-09-27 | [30785](https://github.com/airbytehq/airbyte/pull/30785) | Do not migrate config for the newly created connections | -| 1.3.3 | 2023-08-29 | [29941](https://github.com/airbytehq/airbyte/pull/29941) | Added `primary key` to each stream, added `custom_report` config migration | -| 1.3.2 | 2023-08-25 | [29829](https://github.com/airbytehq/airbyte/pull/29829) | Make `Start Date` a non-required, added the `suggested streams`, corrected public docs | -| 1.3.1 | 2023-08-24 | [29329](https://github.com/airbytehq/airbyte/pull/29329) | Update tooltip descriptions | -| 1.3.0 | 2023-08-24 | [29750](https://github.com/airbytehq/airbyte/pull/29750) | Add new `Keyword-Site-Report-By-Site` stream | -| 1.2.2 | 2023-08-23 | [29741](https://github.com/airbytehq/airbyte/pull/29741) | Handle `HTTP-401`, `HTTP-403` errors | -| 1.2.1 | 2023-07-04 | [27952](https://github.com/airbytehq/airbyte/pull/27952) | Removed deprecated `searchType`, added `discover`(Discover results) and `googleNews`(Results from news.google.com, etc.) 
types | -| 1.2.0 | 2023-06-29 | [27831](https://github.com/airbytehq/airbyte/pull/27831) | Add new streams | -| 1.1.0 | 2023-06-26 | [27738](https://github.com/airbytehq/airbyte/pull/27738) | License Update: Elv2 | -| 1.0.2 | 2023-06-13 | [27307](https://github.com/airbytehq/airbyte/pull/27307) | Fix `data_state` config typo | -| 1.0.1 | 2023-05-30 | [26746](https://github.com/airbytehq/airbyte/pull/26746) | Remove `authSpecification` from connector spec in favour of advancedAuth | -| 1.0.0 | 2023-05-24 | [26452](https://github.com/airbytehq/airbyte/pull/26452) | Add data_state parameter to specification | -| 0.1.22 | 2023-03-20 | [22295](https://github.com/airbytehq/airbyte/pull/22295) | Update specification examples | -| 0.1.21 | 2023-02-14 | [22984](https://github.com/airbytehq/airbyte/pull/22984) | Specified date formatting in specification | -| 0.1.20 | 2023-02-02 | [22334](https://github.com/airbytehq/airbyte/pull/22334) | Turn on default HttpAvailabilityStrategy | -| 0.1.19 | 2023-01-27 | [22007](https://github.com/airbytehq/airbyte/pull/22007) | Set `AvailabilityStrategy` for streams explicitly to `None` | -| 0.1.18 | 2022-10-27 | [18568](https://github.com/airbytehq/airbyte/pull/18568) | Improved config validation: custom_reports.dimension | -| 0.1.17 | 2022-10-08 | [17751](https://github.com/airbytehq/airbyte/pull/17751) | Improved config validation: start_date, end_date, site_urls | -| 0.1.16 | 2022-09-28 | [17304](https://github.com/airbytehq/airbyte/pull/17304) | Migrate to per-stream state. | -| 0.1.15 | 2022-09-16 | [16819](https://github.com/airbytehq/airbyte/pull/16819) | Check available site urls to avoid 403 error on sync | -| 0.1.14 | 2022-09-08 | [16433](https://github.com/airbytehq/airbyte/pull/16433) | Add custom analytics stream. 
| -| 0.1.13 | 2022-07-21 | [14924](https://github.com/airbytehq/airbyte/pull/14924) | Remove `additionalProperties` field from specs | -| 0.1.12 | 2022-05-04 | [12482](https://github.com/airbytehq/airbyte/pull/12482) | Update input configuration copy | -| 0.1.11 | 2022-01-05 | [9186](https://github.com/airbytehq/airbyte/pull/9186) | Fix incremental sync: keep all urls in state object | -| 0.1.10 | 2021-12-23 | [9073](https://github.com/airbytehq/airbyte/pull/9073) | Add slicing by date range | -| 0.1.9 | 2021-12-22 | [9047](https://github.com/airbytehq/airbyte/pull/9047) | Add 'order' to spec.json props | -| 0.1.8 | 2021-12-21 | [8248](https://github.com/airbytehq/airbyte/pull/8248) | Enable Sentry for performance and errors tracking | -| 0.1.7 | 2021-11-26 | [7431](https://github.com/airbytehq/airbyte/pull/7431) | Add default `end_date` param value | -| 0.1.6 | 2021-09-27 | [6460](https://github.com/airbytehq/airbyte/pull/6460) | Update OAuth Spec File | -| 0.1.4 | 2021-09-23 | [6394](https://github.com/airbytehq/airbyte/pull/6394) | Update Doc link Spec File | -| 0.1.3 | 2021-09-23 | [6405](https://github.com/airbytehq/airbyte/pull/6405) | Correct Spec File | -| 0.1.2 | 2021-09-17 | [6222](https://github.com/airbytehq/airbyte/pull/6222) | Correct Spec File | -| 0.1.1 | 2021-09-22 | [6315](https://github.com/airbytehq/airbyte/pull/6315) | Verify access to all sites when performing connection check | -| 0.1.0` | 2021-09-03 | [5350](https://github.com/airbytehq/airbyte/pull/5350) | Initial Release | +| Version | Date | Pull Request | Subject | +| :------ | :--------- | :------------------------------------------------------- | :----------------------------------------------------------------------------------------------------------------------------- | +| 1.4.2 | 2024-04-19 | [36639](https://github.com/airbytehq/airbyte/pull/36639) | Updating to 0.80.0 CDK | +| 1.4.1 | 2024-04-12 | [36639](https://github.com/airbytehq/airbyte/pull/36639) | Schema descriptions | +| 1.4.0 | 2024-03-19 | [36267](https://github.com/airbytehq/airbyte/pull/36267) | Pin airbyte-cdk version to `^0` | +| 1.3.7 | 2024-02-12 | [35163](https://github.com/airbytehq/airbyte/pull/35163) | Manage dependencies with Poetry | +| 1.3.6 | 2023-10-26 | [31863](https://github.com/airbytehq/airbyte/pull/31863) | Base image migration: remove Dockerfile and use the python-connector-base image | +| 1.3.5 | 2023-09-28 | [30822](https://github.com/airbytehq/airbyte/pull/30822) | Fix primary key for custom reports | +| 1.3.4 | 2023-09-27 | [30785](https://github.com/airbytehq/airbyte/pull/30785) | Do not migrate config for the newly created connections | +| 1.3.3 | 2023-08-29 | [29941](https://github.com/airbytehq/airbyte/pull/29941) | Added `primary key` to each stream, added `custom_report` config migration | +| 1.3.2 | 2023-08-25 | [29829](https://github.com/airbytehq/airbyte/pull/29829) | Make `Start Date` a non-required, added the `suggested streams`, corrected public docs | +| 1.3.1 | 2023-08-24 | [29329](https://github.com/airbytehq/airbyte/pull/29329) | Update tooltip descriptions | +| 1.3.0 | 2023-08-24 | [29750](https://github.com/airbytehq/airbyte/pull/29750) | Add new `Keyword-Site-Report-By-Site` stream | +| 1.2.2 | 2023-08-23 | [29741](https://github.com/airbytehq/airbyte/pull/29741) | Handle `HTTP-401`, `HTTP-403` errors | +| 1.2.1 | 2023-07-04 | [27952](https://github.com/airbytehq/airbyte/pull/27952) | Removed deprecated `searchType`, added `discover`(Discover results) and `googleNews`(Results from news.google.com, 
etc.) types | +| 1.2.0 | 2023-06-29 | [27831](https://github.com/airbytehq/airbyte/pull/27831) | Add new streams | +| 1.1.0 | 2023-06-26 | [27738](https://github.com/airbytehq/airbyte/pull/27738) | License Update: Elv2 | +| 1.0.2 | 2023-06-13 | [27307](https://github.com/airbytehq/airbyte/pull/27307) | Fix `data_state` config typo | +| 1.0.1 | 2023-05-30 | [26746](https://github.com/airbytehq/airbyte/pull/26746) | Remove `authSpecification` from connector spec in favour of advancedAuth | +| 1.0.0 | 2023-05-24 | [26452](https://github.com/airbytehq/airbyte/pull/26452) | Add data_state parameter to specification | +| 0.1.22 | 2023-03-20 | [22295](https://github.com/airbytehq/airbyte/pull/22295) | Update specification examples | +| 0.1.21 | 2023-02-14 | [22984](https://github.com/airbytehq/airbyte/pull/22984) | Specified date formatting in specification | +| 0.1.20 | 2023-02-02 | [22334](https://github.com/airbytehq/airbyte/pull/22334) | Turn on default HttpAvailabilityStrategy | +| 0.1.19 | 2023-01-27 | [22007](https://github.com/airbytehq/airbyte/pull/22007) | Set `AvailabilityStrategy` for streams explicitly to `None` | +| 0.1.18 | 2022-10-27 | [18568](https://github.com/airbytehq/airbyte/pull/18568) | Improved config validation: custom_reports.dimension | +| 0.1.17 | 2022-10-08 | [17751](https://github.com/airbytehq/airbyte/pull/17751) | Improved config validation: start_date, end_date, site_urls | +| 0.1.16 | 2022-09-28 | [17304](https://github.com/airbytehq/airbyte/pull/17304) | Migrate to per-stream state. | +| 0.1.15 | 2022-09-16 | [16819](https://github.com/airbytehq/airbyte/pull/16819) | Check available site urls to avoid 403 error on sync | +| 0.1.14 | 2022-09-08 | [16433](https://github.com/airbytehq/airbyte/pull/16433) | Add custom analytics stream. 
| +| 0.1.13 | 2022-07-21 | [14924](https://github.com/airbytehq/airbyte/pull/14924) | Remove `additionalProperties` field from specs | +| 0.1.12 | 2022-05-04 | [12482](https://github.com/airbytehq/airbyte/pull/12482) | Update input configuration copy | +| 0.1.11 | 2022-01-05 | [9186](https://github.com/airbytehq/airbyte/pull/9186) | Fix incremental sync: keep all urls in state object | +| 0.1.10 | 2021-12-23 | [9073](https://github.com/airbytehq/airbyte/pull/9073) | Add slicing by date range | +| 0.1.9 | 2021-12-22 | [9047](https://github.com/airbytehq/airbyte/pull/9047) | Add 'order' to spec.json props | +| 0.1.8 | 2021-12-21 | [8248](https://github.com/airbytehq/airbyte/pull/8248) | Enable Sentry for performance and errors tracking | +| 0.1.7 | 2021-11-26 | [7431](https://github.com/airbytehq/airbyte/pull/7431) | Add default `end_date` param value | +| 0.1.6 | 2021-09-27 | [6460](https://github.com/airbytehq/airbyte/pull/6460) | Update OAuth Spec File | +| 0.1.4 | 2021-09-23 | [6394](https://github.com/airbytehq/airbyte/pull/6394) | Update Doc link Spec File | +| 0.1.3 | 2021-09-23 | [6405](https://github.com/airbytehq/airbyte/pull/6405) | Correct Spec File | +| 0.1.2 | 2021-09-17 | [6222](https://github.com/airbytehq/airbyte/pull/6222) | Correct Spec File | +| 0.1.1 | 2021-09-22 | [6315](https://github.com/airbytehq/airbyte/pull/6315) | Verify access to all sites when performing connection check | +| 0.1.0` | 2021-09-03 | [5350](https://github.com/airbytehq/airbyte/pull/5350) | Initial Release | diff --git a/docs/integrations/sources/google-sheets.md b/docs/integrations/sources/google-sheets.md index e6cfd24ffc8..d7518e27e6b 100644 --- a/docs/integrations/sources/google-sheets.md +++ b/docs/integrations/sources/google-sheets.md @@ -11,15 +11,16 @@ The Google Sheets source connector pulls data from a single Google Sheets spread ::: ### Prerequisites + - Spreadsheet Link - The link to the Google spreadsheet you want to sync. - **For Airbyte Cloud** A Google Workspace user with access to the spreadsheet - - -- **For Airbyte Open Source:** - - A GCP project - - Enable the Google Sheets API in your GCP project - - Service Account Key with access to the Spreadsheet you want to replicate + + +- **For Airbyte Open Source:** +- A GCP project +- Enable the Google Sheets API in your GCP project +- Service Account Key with access to the Spreadsheet you want to replicate ## Setup guide @@ -27,6 +28,7 @@ The Google Sheets source connector pulls data from a single Google Sheets spread The Google Sheets source connector supports authentication via either OAuth or Service Account Key Authentication. + **For Airbyte Cloud:** We highly recommend using OAuth, as it significantly simplifies the setup process and allows you to authenticate [directly from the Airbyte UI](#set-up-the-google-sheets-source-connector-in-airbyte). @@ -72,41 +74,41 @@ If your spreadsheet is viewable by anyone with its link, no further action is ne ### Set up the Google Sheets source connector in Airbyte - - 1. [Log in to your Airbyte Cloud](https://cloud.airbyte.com/workspaces) account. 2. In the left navigation bar, click **Sources**. In the top-right corner, click **+ New source**. 3. Find and select **Google Sheets** from the list of available sources. 4. For **Source name**, enter a name to help you identify this source. 5. 
Select your authentication method: - - **For Airbyte Cloud: (Recommended)** Select **Authenticate via Google (OAuth)** from the Authentication dropdown, click **Sign in with Google** and complete the authentication workflow. - - - - **For Airbyte Open Source: (Recommended)** Select **Service Account Key Authentication** from the dropdown and enter your Google Cloud service account key in JSON format: - ```json - { - "type": "service_account", - "project_id": "YOUR_PROJECT_ID", - "private_key_id": "YOUR_PRIVATE_KEY", - ... - } - ``` +- **For Airbyte Cloud: (Recommended)** Select **Authenticate via Google (OAuth)** from the Authentication dropdown, click **Sign in with Google** and complete the authentication workflow. + + +- **For Airbyte Open Source: (Recommended)** Select **Service Account Key Authentication** from the dropdown and enter your Google Cloud service account key in JSON format: - - To authenticate your Google account via OAuth, select **Authenticate via Google (OAuth)** from the dropdown and enter your Google application's client ID, client secret, and refresh token. +```json + { + "type": "service_account", + "project_id": "YOUR_PROJECT_ID", + "private_key_id": "YOUR_PRIVATE_KEY", + ... + } +``` + +- To authenticate your Google account via OAuth, select **Authenticate via Google (OAuth)** from the dropdown and enter your Google application's client ID, client secret, and refresh token. + 6. For **Spreadsheet Link**, enter the link to the Google spreadsheet. To get the link, go to the Google spreadsheet you want to sync, click **Share** in the top right corner, and click **Copy Link**. 7. For **Batch Size**, enter an integer which represents batch size when processing a Google Sheet. Default value is 200. -Batch size is an integer representing row batch size for each sent request to Google Sheets API. -Row batch size means how many rows are processed from the google sheet, for example default value 200 -would process rows 1-201, then 201-401 and so on. -Based on [Google Sheets API limits documentation](https://developers.google.com/sheets/api/limits), -it is possible to send up to 300 requests per minute, but each individual request has to be processed under 180 seconds, -otherwise the request returns a timeout error. In regards to this information, consider network speed and -number of columns of the google sheet when deciding a batch_size value. -Default value should cover most of the cases, but if a google sheet has over 100,000 records or more, -consider increasing batch_size value. + Batch size is an integer representing row batch size for each sent request to Google Sheets API. + Row batch size means how many rows are processed from the google sheet, for example default value 200 + would process rows 1-201, then 201-401 and so on. + Based on [Google Sheets API limits documentation](https://developers.google.com/sheets/api/limits), + it is possible to send up to 300 requests per minute, but each individual request has to be processed under 180 seconds, + otherwise the request returns a timeout error. In regards to this information, consider network speed and + number of columns of the google sheet when deciding a batch_size value. + Default value should cover most of the cases, but if a google sheet has over 100,000 records or more, + consider increasing batch_size value. 8. (Optional) You may enable the option to **Convert Column Names to SQL-Compliant Format**. Enabling this option will allow the connector to convert column names to a standardized, SQL-friendly format. 
For example, a column name of `Café Earnings 2022` will be converted to `cafe_earnings_2022`. We recommend enabling this option if your target destination is SQL-based (ie Postgres, MySQL). Set to false by default. 9. Click **Set up source** and wait for the tests to complete. @@ -151,17 +153,17 @@ Airbyte batches requests to the API in order to efficiently pull data and respec ### Troubleshooting -* If your sheet is completely empty (no header rows) or deleted, Airbyte will not delete the table in the destination. If this happens, the sync logs will contain a message saying the sheet has been skipped when syncing the full spreadsheet. -* Connector setup will fail if the spreadsheet is not a Google Sheets file. If the file was saved or imported as another file type the setup could fail. -* Check out common troubleshooting issues for the Google Sheets source connector on our [Airbyte Forum](https://github.com/airbytehq/airbyte/discussions). +- If your sheet is completely empty (no header rows) or deleted, Airbyte will not delete the table in the destination. If this happens, the sync logs will contain a message saying the sheet has been skipped when syncing the full spreadsheet. +- Connector setup will fail if the spreadsheet is not a Google Sheets file. If the file was saved or imported as another file type the setup could fail. +- Check out common troubleshooting issues for the Google Sheets source connector on our [Airbyte Forum](https://github.com/airbytehq/airbyte/discussions). ## Changelog | Version | Date | Pull Request | Subject | -|---------|------------|----------------------------------------------------------|-----------------------------------------------------------------------------------| -| 0.5.1 | 2024-04-11 | [35404](https://github.com/airbytehq/airbyte/pull/35404) | Add `row_batch_size` parameter more granular control read records | +| ------- | ---------- | -------------------------------------------------------- | --------------------------------------------------------------------------------- | +| 0.5.1 | 2024-04-11 | [35404](https://github.com/airbytehq/airbyte/pull/35404) | Add `row_batch_size` parameter more granular control read records | | 0.5.0 | 2024-03-26 | [36515](https://github.com/airbytehq/airbyte/pull/36515) | Resolve poetry dependency conflict, add record counts to state messages | | 0.4.0 | 2024-03-19 | [36267](https://github.com/airbytehq/airbyte/pull/36267) | Pin airbyte-cdk version to `^0` | | 0.3.17 | 2024-02-29 | [35722](https://github.com/airbytehq/airbyte/pull/35722) | Add logic to emit stream statuses | diff --git a/docs/integrations/sources/google-webfonts.md b/docs/integrations/sources/google-webfonts.md index bca5acea8bd..a1cfab9ecde 100644 --- a/docs/integrations/sources/google-webfonts.md +++ b/docs/integrations/sources/google-webfonts.md @@ -34,8 +34,8 @@ Just pass the generated API key and optional parameters for establishing the con 1. Navigate to the Airbyte Open Source dashboard. 2. Set the name for your source. 3. Enter your `api_key`. -5. Enter the params configuration if needed. Supported params are: sort, alt, prettyPrint (Optional) -6. Click **Set up source**. +4. Enter the params configuration if needed. Supported params are: sort, alt, prettyPrint (Optional) +5. Click **Set up source**. 
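Before saving the source, it can help to confirm that the `api_key` and the optional params behave as expected. The snippet below is an illustrative sketch only: the endpoint URL and the accepted `sort` values are taken from Google's Fonts Developer API documentation rather than from this connector's code, so treat them as assumptions and adjust if your setup differs.

```python
# Illustrative sanity check for the api_key and optional params (sort, prettyPrint).
# Endpoint and parameter names follow the Google Fonts Developer API docs.
import requests

API_KEY = "YOUR_API_KEY"  # placeholder

params = {
    "key": API_KEY,
    "sort": "popularity",   # optional: alpha, date, popularity, style, trending
    "prettyPrint": "true",  # optional
}

resp = requests.get(
    "https://www.googleapis.com/webfonts/v1/webfonts",
    params=params,
    timeout=30,
)
resp.raise_for_status()
fonts = resp.json().get("items", [])
print(f"Fetched {len(fonts)} font families")
```

A `200` response with a populated `items` array indicates the key and params are usable.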
## Supported sync modes @@ -63,9 +63,9 @@ Google Webfont's [API reference](https://developers.google.com/fonts/docs/develo ## Changelog -| Version | Date | Pull Request | Subject | -| :------ | :--------- | :----------------------------------------------------- | :------------- | -| 0.1.3 | 2024-04-19 | [37172](https://github.com/airbytehq/airbyte/pull/37172) | Upgrade to CDK 0.80.0 and manage dependencies with Poetry. | -| 0.1.2 | 2024-04-15 | [37172](https://github.com/airbytehq/airbyte/pull/37172) | Base image migration: remove Dockerfile and use the python-connector-base image | -| 0.1.1 | 2024-04-12 | [37172](https://github.com/airbytehq/airbyte/pull/37172) | schema descriptions | -| 0.1.0 | 2022-10-26 | [Init](https://github.com/airbytehq/airbyte/pull/18496)| Initial commit | +| Version | Date | Pull Request | Subject | +| :------ | :--------- | :------------------------------------------------------- | :------------------------------------------------------------------------------ | +| 0.1.3 | 2024-04-19 | [37172](https://github.com/airbytehq/airbyte/pull/37172) | Upgrade to CDK 0.80.0 and manage dependencies with Poetry. | +| 0.1.2 | 2024-04-15 | [37172](https://github.com/airbytehq/airbyte/pull/37172) | Base image migration: remove Dockerfile and use the python-connector-base image | +| 0.1.1 | 2024-04-12 | [37172](https://github.com/airbytehq/airbyte/pull/37172) | schema descriptions | +| 0.1.0 | 2022-10-26 | [Init](https://github.com/airbytehq/airbyte/pull/18496) | Initial commit | diff --git a/docs/integrations/sources/google-workspace-admin-reports.md b/docs/integrations/sources/google-workspace-admin-reports.md index 6d244239d56..68492572038 100644 --- a/docs/integrations/sources/google-workspace-admin-reports.md +++ b/docs/integrations/sources/google-workspace-admin-reports.md @@ -8,29 +8,29 @@ This source supports Full Refresh syncs. 
It uses the [Reports API](https://devel This Source is capable of syncing the following Streams: -* [admin](https://developers.google.com/admin-sdk/reports/v1/guides/manage-audit-admin) -* [drive](https://developers.google.com/admin-sdk/reports/v1/guides/manage-audit-drive) -* [logins](https://developers.google.com/admin-sdk/reports/v1/guides/manage-audit-login) -* [mobile](https://developers.google.com/admin-sdk/reports/v1/guides/manage-audit-mobile) -* [oauth\_tokens](https://developers.google.com/admin-sdk/reports/v1/guides/manage-audit-tokens) +- [admin](https://developers.google.com/admin-sdk/reports/v1/guides/manage-audit-admin) +- [drive](https://developers.google.com/admin-sdk/reports/v1/guides/manage-audit-drive) +- [logins](https://developers.google.com/admin-sdk/reports/v1/guides/manage-audit-login) +- [mobile](https://developers.google.com/admin-sdk/reports/v1/guides/manage-audit-mobile) +- [oauth_tokens](https://developers.google.com/admin-sdk/reports/v1/guides/manage-audit-tokens) ### Data type mapping | Integration Type | Airbyte Type | Notes | -| :--- | :--- | :--- | -| `string` | `string` | | -| `number` | `number` | | -| `array` | `array` | | -| `object` | `object` | | +| :--------------- | :----------- | :---- | +| `string` | `string` | | +| `number` | `number` | | +| `array` | `array` | | +| `object` | `object` | | ### Features -| Feature | Supported?\(Yes/No\) | Notes | -| :--- | :--- | :--- | -| Full Refresh Sync | Yes | | -| Incremental Sync | No | | -| SSL connection | Yes | | -| Namespaces | No | | +| Feature | Supported?\(Yes/No\) | Notes | +| :---------------- | :------------------- | :---- | +| Full Refresh Sync | Yes | | +| Incremental Sync | No | | +| SSL connection | Yes | | +| Namespaces | No | | ### Performance considerations @@ -40,8 +40,8 @@ This connector attempts to back off gracefully when it hits Reports API's rate l ### Requirements -* Credentials to a Google Service Account with delegated Domain Wide Authority -* Email address of the workspace admin which created the Service Account +- Credentials to a Google Service Account with delegated Domain Wide Authority +- Email address of the workspace admin which created the Service Account ### Create a Service Account with delegated domain wide authority @@ -56,9 +56,9 @@ You should now be ready to use the Google Workspace Admin Reports API connector ## Changelog -| Version | Date | Pull Request | Subject | -| :------ | :-------- | :----- | :------ | -| 0.1.8 | 2022-02-24 | [10244](https://github.com/airbytehq/airbyte/pull/10244) | Add Meet Stream | -| 0.1.7 | 2021-12-06 | [8524](https://github.com/airbytehq/airbyte/pull/8524) | Update connector fields title/description | -| 0.1.6 | 2021-11-02 | [7623](https://github.com/airbytehq/airbyte/pull/7623) | Migrate to the CDK | -| 0.1.5 | 2021-10-07 | [6878](https://github.com/airbytehq/airbyte/pull/6878) | Improve testing & output schemas | +| Version | Date | Pull Request | Subject | +| :------ | :--------- | :------------------------------------------------------- | :---------------------------------------- | +| 0.1.8 | 2022-02-24 | [10244](https://github.com/airbytehq/airbyte/pull/10244) | Add Meet Stream | +| 0.1.7 | 2021-12-06 | [8524](https://github.com/airbytehq/airbyte/pull/8524) | Update connector fields title/description | +| 0.1.6 | 2021-11-02 | [7623](https://github.com/airbytehq/airbyte/pull/7623) | Migrate to the CDK | +| 0.1.5 | 2021-10-07 | [6878](https://github.com/airbytehq/airbyte/pull/6878) | Improve testing & output schemas | diff --git 
a/docs/integrations/sources/greenhouse.md b/docs/integrations/sources/greenhouse.md index 132d1cb9368..4429e572aed 100644 --- a/docs/integrations/sources/greenhouse.md +++ b/docs/integrations/sources/greenhouse.md @@ -59,7 +59,7 @@ The Greenhouse source connector supports the following [sync modes](https://docs - [Scorecards](https://developers.greenhouse.io/harvest.html#get-list-scorecards) \(Incremental\) - [Sources](https://developers.greenhouse.io/harvest.html#get-list-sources) - [Tags](https://developers.greenhouse.io/harvest.html#get-list-candidate-tags) -- [Users](https://developers.greenhouse.io/harvest.html#get-list-users) \(Incremental\) +- [Users](https://developers.greenhouse.io/harvest.html#get-list-users) \(Incremental\) - [User Permissions](https://developers.greenhouse.io/harvest.html#get-list-job-permissions) - [User Roles](https://developers.greenhouse.io/harvest.html#the-user-role-object) @@ -69,25 +69,25 @@ The Greenhouse connector should not run into Greenhouse API limitations under no ## Changelog -| Version | Date | Pull Request | Subject | -|:--------|:-----------|:---------------------------------------------------------|:----------------------------------------------------------------------------------| -| 0.5.3 | 2024-04-19 | [36640](https://github.com/airbytehq/airbyte/pull/36640) | Updating to 0.80.0 CDK | -| 0.5.2 | 2024-04-12 | [36640](https://github.com/airbytehq/airbyte/pull/36640) | schema descriptions | -| 0.5.1 | 2024-03-12 | [35988](https://github.com/airbytehq/airbyte/pull/35988) | Unpin CDK version | -| 0.5.0 | 2024-02-20 | [35465](https://github.com/airbytehq/airbyte/pull/35465) | Per-error reporting and continue sync on stream failures | -| 0.4.5 | 2024-02-09 | [35077](https://github.com/airbytehq/airbyte/pull/35077) | Manage dependencies with Poetry. | -| 0.4.4 | 2023-11-29 | [32397](https://github.com/airbytehq/airbyte/pull/32397) | Increase test coverage and migrate to base image | -| 0.4.3 | 2023-09-20 | [30648](https://github.com/airbytehq/airbyte/pull/30648) | Update candidates.json | -| 0.4.2 | 2023-08-02 | [28969](https://github.com/airbytehq/airbyte/pull/28969) | Update CDK version | -| 0.4.1 | 2023-06-28 | [27773](https://github.com/airbytehq/airbyte/pull/27773) | Update following state breaking changes | +| Version | Date | Pull Request | Subject | +| :------ | :--------- | :------------------------------------------------------- | :------------------------------------------------------------------------------------------------------------------------------------------------------------------- | +| 0.5.3 | 2024-04-19 | [36640](https://github.com/airbytehq/airbyte/pull/36640) | Updating to 0.80.0 CDK | +| 0.5.2 | 2024-04-12 | [36640](https://github.com/airbytehq/airbyte/pull/36640) | schema descriptions | +| 0.5.1 | 2024-03-12 | [35988](https://github.com/airbytehq/airbyte/pull/35988) | Unpin CDK version | +| 0.5.0 | 2024-02-20 | [35465](https://github.com/airbytehq/airbyte/pull/35465) | Per-error reporting and continue sync on stream failures | +| 0.4.5 | 2024-02-09 | [35077](https://github.com/airbytehq/airbyte/pull/35077) | Manage dependencies with Poetry. 
| +| 0.4.4 | 2023-11-29 | [32397](https://github.com/airbytehq/airbyte/pull/32397) | Increase test coverage and migrate to base image | +| 0.4.3 | 2023-09-20 | [30648](https://github.com/airbytehq/airbyte/pull/30648) | Update candidates.json | +| 0.4.2 | 2023-08-02 | [28969](https://github.com/airbytehq/airbyte/pull/28969) | Update CDK version | +| 0.4.1 | 2023-06-28 | [27773](https://github.com/airbytehq/airbyte/pull/27773) | Update following state breaking changes | | 0.4.0 | 2023-04-26 | [25332](https://github.com/airbytehq/airbyte/pull/25332) | Add new streams: `ActivityFeed`, `Approvals`, `Disciplines`, `Eeoc`, `EmailTemplates`, `Offices`, `ProspectPools`, `Schools`, `Tags`, `UserPermissions`, `UserRoles` | -| 0.3.1 | 2023-03-06 | [23231](https://github.com/airbytehq/airbyte/pull/23231) | Publish using low-code CDK Beta version | -| 0.3.0 | 2022-10-19 | [18154](https://github.com/airbytehq/airbyte/pull/18154) | Extend `Users` stream schema | -| 0.2.11 | 2022-09-27 | [17239](https://github.com/airbytehq/airbyte/pull/17239) | Always install the latest version of Airbyte CDK | -| 0.2.10 | 2022-09-05 | [16338](https://github.com/airbytehq/airbyte/pull/16338) | Implement incremental syncs & fix SATs | -| 0.2.9 | 2022-08-22 | [15800](https://github.com/airbytehq/airbyte/pull/15800) | Bugfix to allow reading sentry.yaml and schemas at runtime | -| 0.2.8 | 2022-08-10 | [15344](https://github.com/airbytehq/airbyte/pull/15344) | Migrate connector to config-based framework | -| 0.2.7 | 2022-04-15 | [11941](https://github.com/airbytehq/airbyte/pull/11941) | Correct Schema data type for Applications, Candidates, Scorecards and Users | -| 0.2.6 | 2021-11-08 | [7607](https://github.com/airbytehq/airbyte/pull/7607) | Implement demographics streams support. Update SAT for demographics streams | -| 0.2.5 | 2021-09-22 | [6377](https://github.com/airbytehq/airbyte/pull/6377) | Refactor the connector to use CDK. Implement additional stream support | -| 0.2.4 | 2021-09-15 | [6238](https://github.com/airbytehq/airbyte/pull/6238) | Add identification of accessible streams for API keys with limited permissions | +| 0.3.1 | 2023-03-06 | [23231](https://github.com/airbytehq/airbyte/pull/23231) | Publish using low-code CDK Beta version | +| 0.3.0 | 2022-10-19 | [18154](https://github.com/airbytehq/airbyte/pull/18154) | Extend `Users` stream schema | +| 0.2.11 | 2022-09-27 | [17239](https://github.com/airbytehq/airbyte/pull/17239) | Always install the latest version of Airbyte CDK | +| 0.2.10 | 2022-09-05 | [16338](https://github.com/airbytehq/airbyte/pull/16338) | Implement incremental syncs & fix SATs | +| 0.2.9 | 2022-08-22 | [15800](https://github.com/airbytehq/airbyte/pull/15800) | Bugfix to allow reading sentry.yaml and schemas at runtime | +| 0.2.8 | 2022-08-10 | [15344](https://github.com/airbytehq/airbyte/pull/15344) | Migrate connector to config-based framework | +| 0.2.7 | 2022-04-15 | [11941](https://github.com/airbytehq/airbyte/pull/11941) | Correct Schema data type for Applications, Candidates, Scorecards and Users | +| 0.2.6 | 2021-11-08 | [7607](https://github.com/airbytehq/airbyte/pull/7607) | Implement demographics streams support. Update SAT for demographics streams | +| 0.2.5 | 2021-09-22 | [6377](https://github.com/airbytehq/airbyte/pull/6377) | Refactor the connector to use CDK. 
Implement additional stream support | +| 0.2.4 | 2021-09-15 | [6238](https://github.com/airbytehq/airbyte/pull/6238) | Add identification of accessible streams for API keys with limited permissions | diff --git a/docs/integrations/sources/gutendex.md b/docs/integrations/sources/gutendex.md index 434276e2db2..06f3b5a6d4c 100644 --- a/docs/integrations/sources/gutendex.md +++ b/docs/integrations/sources/gutendex.md @@ -8,27 +8,46 @@ The Gutendex source can sync data from the [Gutendex API](https://gutendex.com/) Gutendex requires no access token/API key to make requests. The following (optional) parameters can be provided to the connector :- -___ + +--- + ##### `author_year_start` and `author_year_end` -Use these to find books with at least one author alive in a given range of years. They must have positive (CE) or negative (BCE) integer values. + +Use these to find books with at least one author alive in a given range of years. They must have positive (CE) or negative (BCE) integer values. For example, `/books?author_year_start=1800&author_year_end=1899` gives books with authors alive in the 19th Century. -___ + +--- + ##### `copyright` + Use this to find books with a certain copyright status: true for books with existing copyrights, false for books in the public domain in the USA, or null for books with no available copyright information. -___ + +--- + ##### `languages` + Use this to find books in any of a list of languages. They must be comma-separated, two-character language codes. For example, `/books?languages=en` gives books in English, and `/books?languages=fr,fi` gives books in either French or Finnish or both. -___ + +--- + ##### `search` + Use this to search author names and book titles with given words. They must be separated by a space (i.e. %20 in URL-encoded format) and are case-insensitive. For example, `/books?search=dickens%20great` includes Great Expectations by Charles Dickens. -___ + +--- + ##### `sort` + Use this to sort books: ascending for Project Gutenberg ID numbers from lowest to highest, descending for IDs highest to lowest, or popular (the default) for most popular to least popular by number of downloads. -___ + +--- + ##### `topic` + Use this to search for a case-insensitive key-phrase in books' bookshelves or subjects. For example, `/books?topic=children` gives books on the "Children's Literature" bookshelf, with the subject "Sick children -- Fiction", and so on. -___ + +--- ## Output schema diff --git a/docs/integrations/sources/harness.md b/docs/integrations/sources/harness.md index b6433e30483..2ea177e49fa 100644 --- a/docs/integrations/sources/harness.md +++ b/docs/integrations/sources/harness.md @@ -13,24 +13,24 @@ the tables and columns you set up for replication, every time a sync is run. Only one stream is currently available from this source: -* [Organization](https://apidocs.harness.io/tag/Organization#operation/getOrganizationList) +- [Organization](https://apidocs.harness.io/tag/Organization#operation/getOrganizationList) If there are more endpoints you'd like Faros AI to support, please [create an issue.](https://github.com/faros-ai/airbyte-connectors/issues/new) ### Features -| Feature | Supported? | -| :----------------- | :--------- | -| Full Refresh Sync | Yes | -| Incremental Sync | No | -| SSL connection | No | -| Namespaces | No | +| Feature | Supported? 
| +| :---------------- | :--------- | +| Full Refresh Sync | Yes | +| Incremental Sync | No | +| SSL connection | No | +| Namespaces | No | ### Performance considerations The Harness source should not run into Harness API limitations under normal -usage. Please [create an +usage. Please [create an issue](https://github.com/faros-ai/airbyte-connectors/issues/new) if you see any rate limit issues that are not automatically retried successfully. @@ -38,16 +38,16 @@ rate limit issues that are not automatically retried successfully. ### Requirements -* Harness Account Id -* Harness API Key -* Harness API URL, if using a self-hosted Harness instance +- Harness Account Id +- Harness API Key +- Harness API URL, if using a self-hosted Harness instance Please follow the [their documentation for generating a Harness API Key](https://ngdocs.harness.io/article/tdoad7xrh9-add-and-manage-api-keys#harness_api_key). ## Changelog -| Version | Date | Pull Request | Subject | -| :--------- | :--------- | :------------------------------------------------------------------ | :---------------------------------------------------- | -| 0.1.0 | 2023-10-10 | [31103](https://github.com/airbytehq/airbyte/pull/31103) | Migrate to low code | -| 0.1.23 | 2021-11-16 | [153](https://github.com/faros-ai/airbyte-connectors/pull/153) | Add Harness source and Faros destination's converter | +| Version | Date | Pull Request | Subject | +| :------ | :--------- | :------------------------------------------------------------- | :--------------------------------------------------- | +| 0.1.0 | 2023-10-10 | [31103](https://github.com/airbytehq/airbyte/pull/31103) | Migrate to low code | +| 0.1.23 | 2021-11-16 | [153](https://github.com/faros-ai/airbyte-connectors/pull/153) | Add Harness source and Faros destination's converter | diff --git a/docs/integrations/sources/harvest-migrations.md b/docs/integrations/sources/harvest-migrations.md index 6fe39fb7f12..97e2f76cfdb 100644 --- a/docs/integrations/sources/harvest-migrations.md +++ b/docs/integrations/sources/harvest-migrations.md @@ -3,6 +3,7 @@ ## Upgrading to 1.0.0 This update results in a change the following streams, requiring them to be cleared and completely synced again: + - `expenses_clients` - `expenses_categories` - `expenses_projects` @@ -24,8 +25,8 @@ We're continuously striving to enhance the quality and reliability of our connec To clear your data for the impacted streams, follow the steps below: 1. Select **Connections** in the main nav bar. - 1. Select the connection(s) affected by the update. + 1. Select the connection(s) affected by the update. 2. Select the **Status** tab. - 1. In the **Enabled streams** list, click the three dots on the right side of the stream and select **Clear Data**. + 1. In the **Enabled streams** list, click the three dots on the right side of the stream and select **Clear Data**. After the clear succeeds, trigger a sync by clicking **Sync Now**. For more information on clearing your data in Airbyte, see [this page](https://docs.airbyte.com/operator-guides/reset). 
diff --git a/docs/integrations/sources/harvest.md b/docs/integrations/sources/harvest.md index 3db70c2f13d..fd10fb8b173 100644 --- a/docs/integrations/sources/harvest.md +++ b/docs/integrations/sources/harvest.md @@ -86,32 +86,32 @@ The connector is restricted by the [Harvest rate limits](https://help.getharvest ## Changelog -| Version | Date | Pull Request | Subject | -|:--------|:-----------|:---------------------------------------------------------|:-----------------------------------------------------------------------------------| -| 1.0.1 | 2024-04-24 | [36641](https://github.com/airbytehq/airbyte/pull/36641) | Schema descriptions and CDK 0.80.0 | -| 1.0.0 | 2024-04-15 | [35863](https://github.com/airbytehq/airbyte/pull/35863) | Migrates connector to Low Code CDK, Updates incremental substream state to per-partition state | -| 0.2.0 | 2024-04-08 | [36889](https://github.com/airbytehq/airbyte/pull/36889) | Unpin CDK version | -| 0.1.24 | 2024-02-26 | [35541](https://github.com/airbytehq/airbyte/pull/35541) | Improve check command to avoid missing alerts | -| 0.1.23 | 2024-02-19 | [35305](https://github.com/airbytehq/airbyte/pull/35305) | Fix pendulum parsing error | -| 0.1.22 | 2024-02-12 | [35154](https://github.com/airbytehq/airbyte/pull/35154) | Manage dependencies with Poetry. | -| 0.1.21 | 2023-11-30 | [33003](https://github.com/airbytehq/airbyte/pull/33003) | Update expected records | -| 0.1.20 | 2023-10-19 | [31599](https://github.com/airbytehq/airbyte/pull/31599) | Base image migration: remove Dockerfile and use the python-connector-base image | -| 0.1.19 | 2023-07-26 | [28755](https://github.com/airbytehq/airbyte/pull/28755) | Changed parameters for Time Reports to use 365 days as opposed to 1 year | -| 0.1.18 | 2023-05-29 | [26714](https://github.com/airbytehq/airbyte/pull/26714) | Remove `authSpecification` from spec in favour of `advancedAuth` | -| 0.1.17 | 2023-03-03 | [22983](https://github.com/airbytehq/airbyte/pull/22983) | Specified date formatting in specification | -| 0.1.16 | 2023-02-07 | [22417](https://github.com/airbytehq/airbyte/pull/22417) | Turn on default HttpAvailabilityStrategy | -| 0.1.15 | 2023-01-27 | [22008](https://github.com/airbytehq/airbyte/pull/22008) | Set `AvailabilityStrategy` for streams explicitly to `None` | -| 0.1.14 | 2023-01-09 | [21151](https://github.com/airbytehq/airbyte/pull/21151) | Skip 403 FORBIDDEN for all stream | -| 0.1.13 | 2022-12-22 | [20810](https://github.com/airbytehq/airbyte/pull/20810) | Skip 403 FORBIDDEN for `EstimateItemCategories` stream | -| 0.1.12 | 2022-12-16 | [20572](https://github.com/airbytehq/airbyte/pull/20572) | Introduce replication end date | -| 0.1.11 | 2022-09-28 | [17326](https://github.com/airbytehq/airbyte/pull/17326) | Migrate to per-stream states. 
| -| 0.1.10 | 2022-08-08 | [15221](https://github.com/airbytehq/airbyte/pull/15221) | Added `parent_id` for all streams which have parent stream | -| 0.1.9 | 2022-08-04 | [15312](https://github.com/airbytehq/airbyte/pull/15312) | Fix `started_time` and `ended_time` format schema error and updated report slicing | -| 0.1.8 | 2021-12-14 | [8429](https://github.com/airbytehq/airbyte/pull/8429) | Update titles and descriptions | -| 0.1.6 | 2021-11-14 | [7952](https://github.com/airbytehq/airbyte/pull/7952) | Implement OAuth 2.0 support | -| 0.1.5 | 2021-09-28 | [5747](https://github.com/airbytehq/airbyte/pull/5747) | Update schema date-time fields | -| 0.1.4 | 2021-06-22 | [5701](https://github.com/airbytehq/airbyte/pull/5071) | Harvest normalization failure: fixing the schemas | -| 0.1.3 | 2021-06-22 | [4274](https://github.com/airbytehq/airbyte/pull/4274) | Fix wrong data type on `statement_key` in `clients` stream | -| 0.1.2 | 2021-06-07 | [4222](https://github.com/airbytehq/airbyte/pull/4222) | Correct specification parameter name | -| 0.1.1 | 2021-06-09 | [3973](https://github.com/airbytehq/airbyte/pull/3973) | Add `AIRBYTE_ENTRYPOINT` for Kubernetes support | -| 0.1.0 | 2021-06-07 | [3709](https://github.com/airbytehq/airbyte/pull/3709) | Release Harvest connector! | +| Version | Date | Pull Request | Subject | +| :------ | :--------- | :------------------------------------------------------- | :--------------------------------------------------------------------------------------------- | +| 1.0.1 | 2024-04-24 | [36641](https://github.com/airbytehq/airbyte/pull/36641) | Schema descriptions and CDK 0.80.0 | +| 1.0.0 | 2024-04-15 | [35863](https://github.com/airbytehq/airbyte/pull/35863) | Migrates connector to Low Code CDK, Updates incremental substream state to per-partition state | +| 0.2.0 | 2024-04-08 | [36889](https://github.com/airbytehq/airbyte/pull/36889) | Unpin CDK version | +| 0.1.24 | 2024-02-26 | [35541](https://github.com/airbytehq/airbyte/pull/35541) | Improve check command to avoid missing alerts | +| 0.1.23 | 2024-02-19 | [35305](https://github.com/airbytehq/airbyte/pull/35305) | Fix pendulum parsing error | +| 0.1.22 | 2024-02-12 | [35154](https://github.com/airbytehq/airbyte/pull/35154) | Manage dependencies with Poetry. 
| +| 0.1.21 | 2023-11-30 | [33003](https://github.com/airbytehq/airbyte/pull/33003) | Update expected records | +| 0.1.20 | 2023-10-19 | [31599](https://github.com/airbytehq/airbyte/pull/31599) | Base image migration: remove Dockerfile and use the python-connector-base image | +| 0.1.19 | 2023-07-26 | [28755](https://github.com/airbytehq/airbyte/pull/28755) | Changed parameters for Time Reports to use 365 days as opposed to 1 year | +| 0.1.18 | 2023-05-29 | [26714](https://github.com/airbytehq/airbyte/pull/26714) | Remove `authSpecification` from spec in favour of `advancedAuth` | +| 0.1.17 | 2023-03-03 | [22983](https://github.com/airbytehq/airbyte/pull/22983) | Specified date formatting in specification | +| 0.1.16 | 2023-02-07 | [22417](https://github.com/airbytehq/airbyte/pull/22417) | Turn on default HttpAvailabilityStrategy | +| 0.1.15 | 2023-01-27 | [22008](https://github.com/airbytehq/airbyte/pull/22008) | Set `AvailabilityStrategy` for streams explicitly to `None` | +| 0.1.14 | 2023-01-09 | [21151](https://github.com/airbytehq/airbyte/pull/21151) | Skip 403 FORBIDDEN for all stream | +| 0.1.13 | 2022-12-22 | [20810](https://github.com/airbytehq/airbyte/pull/20810) | Skip 403 FORBIDDEN for `EstimateItemCategories` stream | +| 0.1.12 | 2022-12-16 | [20572](https://github.com/airbytehq/airbyte/pull/20572) | Introduce replication end date | +| 0.1.11 | 2022-09-28 | [17326](https://github.com/airbytehq/airbyte/pull/17326) | Migrate to per-stream states. | +| 0.1.10 | 2022-08-08 | [15221](https://github.com/airbytehq/airbyte/pull/15221) | Added `parent_id` for all streams which have parent stream | +| 0.1.9 | 2022-08-04 | [15312](https://github.com/airbytehq/airbyte/pull/15312) | Fix `started_time` and `ended_time` format schema error and updated report slicing | +| 0.1.8 | 2021-12-14 | [8429](https://github.com/airbytehq/airbyte/pull/8429) | Update titles and descriptions | +| 0.1.6 | 2021-11-14 | [7952](https://github.com/airbytehq/airbyte/pull/7952) | Implement OAuth 2.0 support | +| 0.1.5 | 2021-09-28 | [5747](https://github.com/airbytehq/airbyte/pull/5747) | Update schema date-time fields | +| 0.1.4 | 2021-06-22 | [5701](https://github.com/airbytehq/airbyte/pull/5071) | Harvest normalization failure: fixing the schemas | +| 0.1.3 | 2021-06-22 | [4274](https://github.com/airbytehq/airbyte/pull/4274) | Fix wrong data type on `statement_key` in `clients` stream | +| 0.1.2 | 2021-06-07 | [4222](https://github.com/airbytehq/airbyte/pull/4222) | Correct specification parameter name | +| 0.1.1 | 2021-06-09 | [3973](https://github.com/airbytehq/airbyte/pull/3973) | Add `AIRBYTE_ENTRYPOINT` for Kubernetes support | +| 0.1.0 | 2021-06-07 | [3709](https://github.com/airbytehq/airbyte/pull/3709) | Release Harvest connector! | diff --git a/docs/integrations/sources/hellobaton.md b/docs/integrations/sources/hellobaton.md index 3b6d38a1ba6..13ba6965fe9 100644 --- a/docs/integrations/sources/hellobaton.md +++ b/docs/integrations/sources/hellobaton.md @@ -51,7 +51,7 @@ The connector is rate limited at 1000 requests per minute per api key. 
If you fi ## Changelog -| Version | Date | Pull Request | Subject | -| :------ | :--------- | :----------------------------------------------------- | :------------------------ | +| Version | Date | Pull Request | Subject | +| :------ | :--------- | :------------------------------------------------------- | :---------------------------------- | | 0.2.0 | 2023-08-19 | [29490](https://github.com/airbytehq/airbyte/pull/29490) | Migrate CDK from Python to Low Code | -| 0.1.0 | 2022-01-14 | [8461](https://github.com/airbytehq/airbyte/pull/8461) | 🎉 New Source: Hellobaton | +| 0.1.0 | 2022-01-14 | [8461](https://github.com/airbytehq/airbyte/pull/8461) | 🎉 New Source: Hellobaton | diff --git a/docs/integrations/sources/http-request.md b/docs/integrations/sources/http-request.md index 2cb1b0cb60d..48af446c45f 100644 --- a/docs/integrations/sources/http-request.md +++ b/docs/integrations/sources/http-request.md @@ -8,13 +8,13 @@ This connector is graveyarded and will not be receiving any updates from the Air ## Overview -This connector allows you to generally connect to any HTTP API. In order to use this connector, you must manually bring it in as a custom connector. The steps to do this can be found [here](../../connector-development/tutorials/custom-python-connector/0-getting-started.md). +This connector allows you to generally connect to any HTTP API. In order to use this connector, you must manually bring it in as a custom connector. The steps to do this can be found [here](../../connector-development/tutorials/custom-python-connector/0-getting-started.md). ## Where do I find the Docker image? -The Docker image for the HTTP Request connector image can be found at our DockerHub [here](https://hub.docker.com/r/airbyte/source-http-request). +The Docker image for the HTTP Request connector image can be found at our DockerHub [here](https://hub.docker.com/r/airbyte/source-http-request). ## Why was this connector graveyarded? We found that there are lots of cases in which using a general connector leads to poor user experience, as there are countless edge cases for different API structures, different authentication policies, and varied approaches to rate-limiting. We believe that enabling users to more easily -create connectors is a more scalable and resilient approach to maximizing the quality of the user experience. \ No newline at end of file +create connectors is a more scalable and resilient approach to maximizing the quality of the user experience. diff --git a/docs/integrations/sources/hubplanner.md b/docs/integrations/sources/hubplanner.md index 429f7176341..a9435c30d20 100644 --- a/docs/integrations/sources/hubplanner.md +++ b/docs/integrations/sources/hubplanner.md @@ -3,16 +3,19 @@ Hubplanner is a tool to plan, schedule, report and manage your entire team. ## Prerequisites -* Create the API Key to access your data in Hubplanner. + +- Create the API Key to access your data in Hubplanner. ## Airbyte Open Source -* API Key + +- API Key ## Airbyte Cloud -* Comming Soon. +- Comming Soon. ## Setup guide + ### For Airbyte Open Source: 1. Access https://your-domain.hubplanner.com/settings#api or access the panel in left side Integrations/Hub Planner API @@ -21,7 +24,8 @@ Hubplanner is a tool to plan, schedule, report and manage your entire team. 
## Supported sync modes The Okta source connector supports the following [sync modes](https://docs.airbyte.com/cloud/core-concepts#connection-sync-modes): - - Full Refresh + +- Full Refresh ## Supported Streams @@ -33,11 +37,10 @@ The Okta source connector supports the following [sync modes](https://docs.airby - [Projects](https://github.com/hubplanner/API/blob/master/Sections/project.md) - [Resources](https://github.com/hubplanner/API/blob/master/Sections/resource.md) - ## Changelog -| Version | Date | Pull Request | Subject | -|:--------|:-----------|:---------------------------------------------------------|:-------------------------------------------------------------------------------| +| Version | Date | Pull Request | Subject | +| :------ | :--- | :----------- | :------ | -| 0.2.0 | 2021-09-31 | [29311](https://github.com/airbytehq/airbyte/pull/29311) | Migrated to LowCode CDK | -| 0.1.0 | 2021-08-10 | [12145](https://github.com/airbytehq/airbyte/pull/12145) | Initial Release | +| 0.2.0 | 2021-09-31 | [29311](https://github.com/airbytehq/airbyte/pull/29311) | Migrated to LowCode CDK | +| 0.1.0 | 2021-08-10 | [12145](https://github.com/airbytehq/airbyte/pull/12145) | Initial Release | diff --git a/docs/integrations/sources/hubspot-migrations.md b/docs/integrations/sources/hubspot-migrations.md index 73219f9d927..a3768f6209d 100644 --- a/docs/integrations/sources/hubspot-migrations.md +++ b/docs/integrations/sources/hubspot-migrations.md @@ -9,29 +9,30 @@ This change is only breaking if you are syncing streams `Deals Property History` This update brings extended schema with data type changes for the Marketing Emails stream. Users should: - - Refresh the source schema for the Marketing Emails stream. - - Reset the stream after upgrading to ensure uninterrupted syncs. + +- Refresh the source schema for the Marketing Emails stream. +- Reset the stream after upgrading to ensure uninterrupted syncs. ### Refresh affected schemas and reset data 1. Select **Connections** in the main nav bar. - 1. Select the connection affected by the update. + 1. Select the connection affected by the update. 2. Select the **Replication** tab. - 1. Select **Refresh source schema**. - 2. Select **OK**. + 1. Select **Refresh source schema**. + 2. Select **OK**. :::note Any detected schema changes will be listed for your review. ::: 3. Select **Save changes** at the bottom of the page. - 1. Ensure the **Reset affected streams** option is checked. + 1. Ensure the **Reset affected streams** option is checked. :::note Depending on destination type you may not be prompted to reset your data. ::: -4. Select **Save connection**. +4. Select **Save connection**. :::note This will reset the data in your destination and initiate a fresh sync. @@ -39,7 +40,6 @@ This will reset the data in your destination and initiate a fresh sync. For more information on resetting your data in Airbyte, see [this page](https://docs.airbyte.com/operator-guides/reset) - ## Upgrading to 3.0.0 :::note @@ -49,29 +49,30 @@ This change is only breaking if you are syncing the Marketing Emails stream. This update brings extended schema with data type changes for the Marketing Emails stream. Users should: - - Refresh the source schema for the Marketing Emails stream. - - Reset the stream after upgrading to ensure uninterrupted syncs. + +- Refresh the source schema for the Marketing Emails stream. +- Reset the stream after upgrading to ensure uninterrupted syncs. ### Refresh affected schemas and reset data 1. 
Select **Connections** in the main nav bar. - 1. Select the connection affected by the update. + 1. Select the connection affected by the update. 2. Select the **Replication** tab. - 1. Select **Refresh source schema**. - 2. Select **OK**. + 1. Select **Refresh source schema**. + 2. Select **OK**. :::note Any detected schema changes will be listed for your review. ::: 3. Select **Save changes** at the bottom of the page. - 1. Ensure the **Reset affected streams** option is checked. + 1. Ensure the **Reset affected streams** option is checked. :::note Depending on destination type you may not be prompted to reset your data. ::: -4. Select **Save connection**. +4. Select **Save connection**. :::note This will reset the data in your destination and initiate a fresh sync. @@ -79,7 +80,6 @@ This will reset the data in your destination and initiate a fresh sync. For more information on resetting your data in Airbyte, see [this page](https://docs.airbyte.com/operator-guides/reset) - ## Upgrading to 2.0.0 :::note @@ -91,16 +91,16 @@ With this update, you can now access historical property changes for Deals and C This constitutes a breaking change as the Property History stream has been deprecated and replaced with the Contacts Property History. Please follow the instructions below to migrate to version 2.0.0: 1. Select **Connections** in the main navbar. - 1. Select the connection(s) affected by the update. + 1. Select the connection(s) affected by the update. 2. Select the **Replication** tab. - 1. Select **Refresh source schema**. + 1. Select **Refresh source schema**. :::note Any detected schema changes will be listed for your review. Select **OK** to proceed. ::: 3. Select **Save changes** at the bottom of the page. - 1. Ensure the **Reset affected streams** option is checked. + 1. Ensure the **Reset affected streams** option is checked. :::note Depending on destination type you may not be prompted to reset your data diff --git a/docs/integrations/sources/hubspot.md b/docs/integrations/sources/hubspot.md index 4b4bf7381d7..bd54ffd342d 100644 --- a/docs/integrations/sources/hubspot.md +++ b/docs/integrations/sources/hubspot.md @@ -11,21 +11,26 @@ This page contains the setup guide and reference information for the [HubSpot](h - HubSpot Account + - **For Airbyte Open Source**: Private App with Access Token ## Setup guide + **For Airbyte Cloud:** We highly recommend you use OAuth rather than Private App authentication, as it significantly simplifies the setup process. + + **For Airbyte Open Source:** We recommend Private App authentication. + More information on HubSpot authentication methods can be found @@ -34,14 +39,17 @@ More information on HubSpot authentication methods can be found ### Step 1: Set up Hubspot + **For Airbyte Cloud:** **- OAuth** (Recommended) **- Private App:** If you are using a Private App, you will need to use your Access Token to set up the connector. Please refer to the [official HubSpot documentation](https://developers.hubspot.com/docs/api/private-apps) for a detailed guide. + + **For Airbyte Open Source:** **- Private App setup** (Recommended): If you are authenticating via a Private App, you will need to use your Access Token to set up the connector. Please refer to the [official HubSpot documentation](https://developers.hubspot.com/docs/api/private-apps) for a detailed guide. @@ -89,32 +97,34 @@ Next, you need to configure the appropriate scopes for the following streams. Pl ### Step 3: Set up the HubSpot source connector in Airbyte + **For Airbyte Cloud:** 1. 
Log in to your [Airbyte Cloud](https://cloud.airbyte.com/workspaces) account. 2. From the Airbyte UI, click **Sources**, then click on **+ New Source** and select **HubSpot** from the list of available sources. 3. Enter a **Source name** of your choosing. 4. From the **Authentication** dropdown, select your chosen authentication method: - - **Recommended:** To authenticate using OAuth, select **OAuth** and click **Authenticate your HubSpot account** to sign in with HubSpot and authorize your account. - :::tip HubSpot Authentication issues - You might encounter errors during the connection process in the popup window, such as `An invalid scope name was provided`. - To resolve this, close the window and attempt authentication again. - ::: - - **Not Recommended:**To authenticate using a Private App, select **Private App** and enter the Access Token for your HubSpot account. + - **Recommended:** To authenticate using OAuth, select **OAuth** and click **Authenticate your HubSpot account** to sign in with HubSpot and authorize your account. + :::tip HubSpot Authentication issues + You might encounter errors during the connection process in the popup window, such as `An invalid scope name was provided`. + To resolve this, close the window and attempt authentication again. + ::: + - **Not Recommended:**To authenticate using a Private App, select **Private App** and enter the Access Token for your HubSpot account. 5. For **Start date**, use the provided datepicker or enter the date programmatically in the following format: `yyyy-mm-ddThh:mm:ssZ`. The data added on and after this date will be replicated. If not set, "2006-06-01T00:00:00Z" (Hubspot creation date) will be used as start date. It's recommended to provide relevant to your data start date value to optimize synchronization. 6. Click **Set up source** and wait for the tests to complete. + #### For Airbyte Open Source: 1. Navigate to the Airbyte Open Source dashboard. 2. From the Airbyte UI, click **Sources**, then click on **+ New Source** and select **HubSpot** from the list of available sources. 3. Enter a **Source name** of your choosing. 4. From the **Authentication** dropdown, select your chosen authentication method: - - **Recommended:** To authenticate using a Private App, select **Private App** and enter the Access Token for your HubSpot account. - - **Not Recommended:**To authenticate using OAuth, select **OAuth** and enter your Client ID, Client Secret, and Refresh Token. + - **Recommended:** To authenticate using a Private App, select **Private App** and enter the Access Token for your HubSpot account. + - **Not Recommended:**To authenticate using OAuth, select **OAuth** and enter your Client ID, Client Secret, and Refresh Token. 5. For **Start date**, use the provided datepicker or enter the date programmatically in the following format: `yyyy-mm-ddThh:mm:ssZ`. The data added on and after this date will be replicated. If not set, "2006-06-01T00:00:00Z" (Hubspot creation date) will be used as start date. It's recommended to provide relevant to your data start date value to optimize synchronization. 6. Click **Set up source** and wait for the tests to complete. @@ -136,6 +146,7 @@ If you set up your connections before April 15th, 2023 (on Airbyte Cloud) or bef First you need to give the connector some additional permissions: + - **If you are using OAuth on Airbyte Cloud** go to the Hubspot source settings page in the Airbyte UI and re-authenticate via OAuth to allow Airbyte the permissions to access custom objects. 
- **If you are using OAuth on OSS or Private App auth** go into the Hubspot UI where you created your Private App or OAuth application and add the `crm.objects.custom.read` scope to your app's scopes. See HubSpot's instructions [here](https://developers.hubspot.com/docs/api/working-with-oauth#scopes). @@ -156,7 +167,7 @@ There are two types of incremental sync: 1. Incremental (standard server-side, where API returns only the data updated or generated since the last sync) 2. Client-Side Incremental (API returns all available data and connector filters out only new records) -::: + ::: ## Supported streams @@ -213,7 +224,6 @@ The HubSpot source connector supports the following streams: Even though the stream is Incremental, there are some record types that are not affected by the last sync timestamp pointer. For example records of type `CALCULATED` will allways have most recent timestamp equal to the requset time, so whenever you sync there will be a bunch of records in return. - ### Notes on the `engagements` stream 1. Objects in the `engagements` stream can have one of the following types: `note`, `email`, `task`, `meeting`, `call`. Depending on the type of engagement, different properties are set for that object in the `engagements_metadata` table in the destination: @@ -250,78 +260,80 @@ Expand to see details about Hubspot connector limitations and troubleshooting. The connector is restricted by normal HubSpot [rate limitations](https://legacydocs.hubspot.com/apps/api_guidelines). -| Product tier | Limits | -|:----------------------------|:-----------------------------------------| -| `Free & Starter` | Burst: 100/10 seconds, Daily: 250,000 | -| `Professional & Enterprise` | Burst: 150/10 seconds, Daily: 500,000 | -| `API add-on (any tier)` | Burst: 200/10 seconds, Daily: 1,000,000 | - +| Product tier | Limits | +| :-------------------------- | :-------------------------------------- | +| `Free & Starter` | Burst: 100/10 seconds, Daily: 250,000 | +| `Professional & Enterprise` | Burst: 150/10 seconds, Daily: 500,000 | +| `API add-on (any tier)` | Burst: 200/10 seconds, Daily: 1,000,000 | ### Troubleshooting -* Consider checking out the following Hubspot tutorial: [Build a single customer view with open-source tools](https://airbyte.com/tutorials/single-customer-view). -* **Enabling streams:** Some streams, such as `workflows`, need to be enabled before they can be read using a connector authenticated using an `API Key`. If reading a stream that is not enabled, a log message returned to the output and the sync operation will only sync the other streams available. +- Consider checking out the following Hubspot tutorial: [Build a single customer view with open-source tools](https://airbyte.com/tutorials/single-customer-view). +- **Enabling streams:** Some streams, such as `workflows`, need to be enabled before they can be read using a connector authenticated using an `API Key`. If reading a stream that is not enabled, a log message returned to the output and the sync operation will only sync the other streams available. - Example of the output message when trying to read `workflows` stream with missing permissions for the `API Key`: + Example of the output message when trying to read `workflows` stream with missing permissions for the `API Key`: - ```json - { - "type": "LOG", - "log": { - "level": "WARN", - "message": "Stream `workflows` cannot be proceed. This API Key (EXAMPLE_API_KEY) does not have proper permissions! 
(requires any of [automation-access])" - } + ```json + { + "type": "LOG", + "log": { + "level": "WARN", + "message": "Stream `workflows` cannot be proceed. This API Key (EXAMPLE_API_KEY) does not have proper permissions! (requires any of [automation-access])" } - ``` + } + ``` -* **Unnesting top level properties**: Since version 1.5.0, in order to not make the users query their destinations for complicated json fields, we duplicate most of nested data as top level fields. +- **Unnesting top level properties**: Since version 1.5.0, in order to not make the users query their destinations for complicated json fields, we duplicate most of nested data as top level fields. - For instance: + For instance: - ```json - { - "id": 1, - "updatedAt": "2020-01-01", - "properties": { - "hs_note_body": "World's best boss", - "hs_created_by": "Michael Scott" - } + ```json + { + "id": 1, + "updatedAt": "2020-01-01", + "properties": { + "hs_note_body": "World's best boss", + "hs_created_by": "Michael Scott" } - ``` + } + ``` - becomes + becomes - ```json - { - "id": 1, - "updatedAt": "2020-01-01", - "properties": { - "hs_note_body": "World's best boss", - "hs_created_by": "Michael Scott" - }, - "properties_hs_note_body": "World's best boss", - "properties_hs_created_by": "Michael Scott" - } - ``` -* **403 Forbidden Error** - * Hubspot has **scopes** for each API call. - * Each stream is tied to a scope and will need access to that scope to sync data. - * Review the Hubspot OAuth scope documentation [here](https://developers.hubspot.com/docs/api/working-with-oauth#scopes). - * Additional permissions: + ```json + { + "id": 1, + "updatedAt": "2020-01-01", + "properties": { + "hs_note_body": "World's best boss", + "hs_created_by": "Michael Scott" + }, + "properties_hs_note_body": "World's best boss", + "properties_hs_created_by": "Michael Scott" + } + ``` - `feedback_submissions`: Service Hub Professional account +- **403 Forbidden Error** - `marketing_emails`: Market Hub Starter account + - Hubspot has **scopes** for each API call. + - Each stream is tied to a scope and will need access to that scope to sync data. + - Review the Hubspot OAuth scope documentation [here](https://developers.hubspot.com/docs/api/working-with-oauth#scopes). + - Additional permissions: - `workflows`: Sales, Service, and Marketing Hub Professional accounts -* Check out common troubleshooting issues for the Hubspot source connector on our [Airbyte Forum](https://github.com/airbytehq/airbyte/discussions). + `feedback_submissions`: Service Hub Professional account + + `marketing_emails`: Market Hub Starter account + + `workflows`: Sales, Service, and Marketing Hub Professional accounts + +- Check out common troubleshooting issues for the Hubspot source connector on our [Airbyte Forum](https://github.com/airbytehq/airbyte/discussions). 
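Relatedly, the property unnesting described in the troubleshooting notes above can be reproduced when post-processing raw records yourself. Below is a minimal sketch of that `properties_*` duplication using the documented example record; the function name is illustrative and this is not the connector's own code.

```python
# Minimal illustration of the top-level "unnesting" described above:
# every key under "properties" is duplicated as a "properties_<key>" field.
# Sketch only; it mirrors the documented behavior, not the connector's code.
from typing import Any, Dict


def flatten_properties(record: Dict[str, Any]) -> Dict[str, Any]:
    flattened = dict(record)
    for key, value in record.get("properties", {}).items():
        flattened[f"properties_{key}"] = value
    return flattened


record = {
    "id": 1,
    "updatedAt": "2020-01-01",
    "properties": {
        "hs_note_body": "World's best boss",
        "hs_created_by": "Michael Scott",
    },
}
print(flatten_properties(record))
# adds "properties_hs_note_body" and "properties_hs_created_by" alongside "properties"
```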
## Changelog | Version | Date | Pull Request | Subject | -|:--------|:-----------|:---------------------------------------------------------|:---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------| +| :------ | :--------- | :------------------------------------------------------- | :------------------------------------------------------------------------------------------------------------------------------------------------------------------------------- | | 4.1.2 | 2024-04-24 | [36642](https://github.com/airbytehq/airbyte/pull/36642) | Schema descriptions and CDK 0.80.0 | | 4.1.1 | 2024-04-11 | [35945](https://github.com/airbytehq/airbyte/pull/35945) | Add integration tests | | 4.1.0 | 2024-03-27 | [36541](https://github.com/airbytehq/airbyte/pull/36541) | Added test configuration features, fixed type hints | @@ -342,7 +354,7 @@ The connector is restricted by normal HubSpot [rate limitations](https://legacyd | 1.6.0 | 2023-10-19 | [31606](https://github.com/airbytehq/airbyte/pull/31606) | Add new field `aifeatures` to the `marketing emails` stream schema | | 1.5.1 | 2023-10-04 | [31050](https://github.com/airbytehq/airbyte/pull/31050) | Add type transformer for `Engagements` stream | | 1.5.0 | 2023-09-11 | [30322](https://github.com/airbytehq/airbyte/pull/30322) | Unnest stream schemas | -| 1.4.1 | 2023-08-22 | [29715](https://github.com/airbytehq/airbyte/pull/29715) | Fix python package configuration stream | +| 1.4.1 | 2023-08-22 | [29715](https://github.com/airbytehq/airbyte/pull/29715) | Fix python package configuration stream | | 1.4.0 | 2023-08-11 | [29249](https://github.com/airbytehq/airbyte/pull/29249) | Add `OwnersArchived` stream | | 1.3.3 | 2023-08-10 | [29248](https://github.com/airbytehq/airbyte/pull/29248) | Specify `threadId` in `engagements` stream to type string | | 1.3.2 | 2023-08-10 | [29326](https://github.com/airbytehq/airbyte/pull/29326) | Add primary keys to streams `ContactLists` and `PropertyHistory` | diff --git a/docs/integrations/sources/insightly.md b/docs/integrations/sources/insightly.md index 71ec935fc96..abb47c15ddc 100644 --- a/docs/integrations/sources/insightly.md +++ b/docs/integrations/sources/insightly.md @@ -16,67 +16,65 @@ This page guides you through the process of setting up the Insightly source conn The Insightly source connector supports the following [sync modes](https://docs.airbyte.com/cloud/core-concepts#connection-sync-modes): - - Full Refresh - - Incremental +- Full Refresh +- Incremental ## Supported Streams The Insightly source connector supports the following streams, some of them may need elevated permissions: -* [Activity Sets](https://api.na1.insightly.com/v3.1/#!/ActivitySets/GetActivitySets) \(Full table\) -* [Contacts](https://api.na1.insightly.com/v3.1/#!/Contacts/GetEntities) \(Incremental\) -* [Countries](https://api.na1.insightly.com/v3.1/#!/Countries/GetCountries) \(Full table\) -* [Currencies](https://api.na1.insightly.com/v3.1/#!/Currencies/GetCurrencies) \(Full table\) -* [Emails](https://api.na1.insightly.com/v3.1/#!/Emails/GetEntities) \(Full table\) -* [Events](https://api.na1.insightly.com/v3.1/#!/Events/GetEntities) \(Incremental\) -* [Knowledge Article Categories](https://api.na1.insightly.com/v3.1/#!/KnowledgeArticleCategories/GetEntities) \(Incremental\) -* [Knowledge Article Folders](https://api.na1.insightly.com/v3.1/#!/KnowledgeArticleFolders/GetEntities) \(Incremental\) -* [Knowledge 
Articles](https://api.na1.insightly.com/v3.1/#!/KnowledgeArticles/GetEntities) \(Incremental\) -* [Leads](https://api.na1.insightly.com/v3.1/#!/Leads/GetEntities) \(Incremental\) -* [Lead Sources](https://api.na1.insightly.com/v3.1/#!/LeadSources/GetLeadSources) \(Full table\) -* [Lead Statuses](https://api.na1.insightly.com/v3.1/#!/LeadStatuses/GetLeadStatuses) \(Full table\) -* [Milestones](https://api.na1.insightly.com/v3.1/#!/Milestones/GetEntities) \(Incremental\) -* [Notes](https://api.na1.insightly.com/v3.1/#!/Notes/GetEntities) \(Incremental\) -* [Opportunities](https://api.na1.insightly.com/v3.1/#!/Opportunities/GetEntities) \(Incremental\) -* [Opportunity Categories](https://api.na1.insightly.com/v3.1/#!/OpportunityCategories/GetOpportunityCategories) \(Full table\) -* [Opportunity Products](https://api.na1.insightly.com/v3.1/#!/OpportunityProducts/GetEntities) \(Incremental\) -* [Opportunity State Reasons](https://api.na1.insightly.com/v3.1/#!/OpportunityStateReasons/GetOpportunityStateReasons) \(Full table\) -* [Organisations](https://api.na1.insightly.com/v3.1/#!/Organisations/GetEntities) \(Incremental\) -* [Pipelines](https://api.na1.insightly.com/v3.1/#!/Pipelines/GetPipelines) \(Full table\) -* [Pipeline Stages](https://api.na1.insightly.com/v3.1/#!/PipelineStages/GetPipelineStages) \(Full table\) -* [Price Book Entries](https://api.na1.insightly.com/v3.1/#!/PriceBookEntries/GetEntities) \(Incremental\) -* [Price Books](https://api.na1.insightly.com/v3.1/#!/PriceBooks/GetEntities) \(Incremental\) -* [Products](https://api.na1.insightly.com/v3.1/#!/Products/GetEntities) \(Incremental\) -* [Project Categories](https://api.na1.insightly.com/v3.1/#!/ProjectCategories/GetProjectCategories) \(Full table\) -* [Projects](https://api.na1.insightly.com/v3.1/#!/Projects/GetEntities) \(Incremental\) -* [Prospects](https://api.na1.insightly.com/v3.1/#!/Prospects/GetEntities) \(Incremental\) -* [Quote Products](https://api.na1.insightly.com/v3.1/#!/QuoteProducts/GetEntities) \(Incremental\) -* [Quotes](https://api.na1.insightly.com/v3.1/#!/Quotes/GetEntities) \(Incremental\) -* [Relationships](https://api.na1.insightly.com/v3.1/#!/Relationships/GetRelationships) \(Full table\) -* [Tags](https://api.na1.insightly.com/v3.1/#!/Tags/GetTags) \(Full table\) -* [Task Categories](https://api.na1.insightly.com/v3.1/#!/TaskCategories/GetTaskCategories) \(Full table\) -* [Tasks](https://api.na1.insightly.com/v3.1/#!/Tasks/GetEntities) \(Incremental\) -* [Team Members](https://api.na1.insightly.com/v3.1/#!/TeamMembers/GetTeamMembers) \(Full table\) -* [Teams](https://api.na1.insightly.com/v3.1/#!/Teams/GetTeams) \(Full table\) -* [Tickets](https://api.na1.insightly.com/v3.1/#!/Tickets/GetEntities) \(Incremental\) -* [Users](https://api.na1.insightly.com/v3.1/#!/Users/GetUsers) \(Incremental\) - +- [Activity Sets](https://api.na1.insightly.com/v3.1/#!/ActivitySets/GetActivitySets) \(Full table\) +- [Contacts](https://api.na1.insightly.com/v3.1/#!/Contacts/GetEntities) \(Incremental\) +- [Countries](https://api.na1.insightly.com/v3.1/#!/Countries/GetCountries) \(Full table\) +- [Currencies](https://api.na1.insightly.com/v3.1/#!/Currencies/GetCurrencies) \(Full table\) +- [Emails](https://api.na1.insightly.com/v3.1/#!/Emails/GetEntities) \(Full table\) +- [Events](https://api.na1.insightly.com/v3.1/#!/Events/GetEntities) \(Incremental\) +- [Knowledge Article Categories](https://api.na1.insightly.com/v3.1/#!/KnowledgeArticleCategories/GetEntities) \(Incremental\) +- [Knowledge Article 
Folders](https://api.na1.insightly.com/v3.1/#!/KnowledgeArticleFolders/GetEntities) \(Incremental\) +- [Knowledge Articles](https://api.na1.insightly.com/v3.1/#!/KnowledgeArticles/GetEntities) \(Incremental\) +- [Leads](https://api.na1.insightly.com/v3.1/#!/Leads/GetEntities) \(Incremental\) +- [Lead Sources](https://api.na1.insightly.com/v3.1/#!/LeadSources/GetLeadSources) \(Full table\) +- [Lead Statuses](https://api.na1.insightly.com/v3.1/#!/LeadStatuses/GetLeadStatuses) \(Full table\) +- [Milestones](https://api.na1.insightly.com/v3.1/#!/Milestones/GetEntities) \(Incremental\) +- [Notes](https://api.na1.insightly.com/v3.1/#!/Notes/GetEntities) \(Incremental\) +- [Opportunities](https://api.na1.insightly.com/v3.1/#!/Opportunities/GetEntities) \(Incremental\) +- [Opportunity Categories](https://api.na1.insightly.com/v3.1/#!/OpportunityCategories/GetOpportunityCategories) \(Full table\) +- [Opportunity Products](https://api.na1.insightly.com/v3.1/#!/OpportunityProducts/GetEntities) \(Incremental\) +- [Opportunity State Reasons](https://api.na1.insightly.com/v3.1/#!/OpportunityStateReasons/GetOpportunityStateReasons) \(Full table\) +- [Organisations](https://api.na1.insightly.com/v3.1/#!/Organisations/GetEntities) \(Incremental\) +- [Pipelines](https://api.na1.insightly.com/v3.1/#!/Pipelines/GetPipelines) \(Full table\) +- [Pipeline Stages](https://api.na1.insightly.com/v3.1/#!/PipelineStages/GetPipelineStages) \(Full table\) +- [Price Book Entries](https://api.na1.insightly.com/v3.1/#!/PriceBookEntries/GetEntities) \(Incremental\) +- [Price Books](https://api.na1.insightly.com/v3.1/#!/PriceBooks/GetEntities) \(Incremental\) +- [Products](https://api.na1.insightly.com/v3.1/#!/Products/GetEntities) \(Incremental\) +- [Project Categories](https://api.na1.insightly.com/v3.1/#!/ProjectCategories/GetProjectCategories) \(Full table\) +- [Projects](https://api.na1.insightly.com/v3.1/#!/Projects/GetEntities) \(Incremental\) +- [Prospects](https://api.na1.insightly.com/v3.1/#!/Prospects/GetEntities) \(Incremental\) +- [Quote Products](https://api.na1.insightly.com/v3.1/#!/QuoteProducts/GetEntities) \(Incremental\) +- [Quotes](https://api.na1.insightly.com/v3.1/#!/Quotes/GetEntities) \(Incremental\) +- [Relationships](https://api.na1.insightly.com/v3.1/#!/Relationships/GetRelationships) \(Full table\) +- [Tags](https://api.na1.insightly.com/v3.1/#!/Tags/GetTags) \(Full table\) +- [Task Categories](https://api.na1.insightly.com/v3.1/#!/TaskCategories/GetTaskCategories) \(Full table\) +- [Tasks](https://api.na1.insightly.com/v3.1/#!/Tasks/GetEntities) \(Incremental\) +- [Team Members](https://api.na1.insightly.com/v3.1/#!/TeamMembers/GetTeamMembers) \(Full table\) +- [Teams](https://api.na1.insightly.com/v3.1/#!/Teams/GetTeams) \(Full table\) +- [Tickets](https://api.na1.insightly.com/v3.1/#!/Tickets/GetEntities) \(Incremental\) +- [Users](https://api.na1.insightly.com/v3.1/#!/Users/GetUsers) \(Incremental\) ## Performance considerations The connector is restricted by Insightly [requests limitation](https://api.na1.insightly.com/v3.1/#!/Overview/Introduction). 
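Because of these request limits, long-running extractions may see throttled (HTTP 429) responses. A minimal retry-and-paging sketch against the v3.1 endpoints linked above, assuming HTTP Basic auth with the API key and `skip`/`top` paging parameters; verify both assumptions against the linked Insightly API docs before relying on them.

```python
# Hypothetical backoff/paging sketch for the Insightly v3.1 API.
# Assumptions (not confirmed by this doc): Basic auth with the API key as the
# username and an empty password, and `skip`/`top` query parameters for paging.
import time
import requests

BASE_URL = "https://api.na1.insightly.com/v3.1"
API_KEY = "YOUR_INSIGHTLY_API_KEY"  # placeholder


def fetch_all(endpoint: str, page_size: int = 500) -> list:
    records, skip = [], 0
    while True:
        resp = requests.get(
            f"{BASE_URL}/{endpoint}",
            params={"skip": skip, "top": page_size},
            auth=(API_KEY, ""),
            timeout=30,
        )
        if resp.status_code == 429:
            # Throttled by the request limits mentioned above: wait and retry.
            time.sleep(10)
            continue
        resp.raise_for_status()
        page = resp.json()
        records.extend(page)
        if len(page) < page_size:
            return records
        skip += page_size


# Example: fetch_all("Contacts")
```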
- ## Changelog -| Version | Date | Pull Request | Subject | -| :------ | :--------- | :------------------------------------------------------- | :-------------------------------------------------------------------------------- | -| 0.2.4 | 2024-04-19 | [37177](https://github.com/airbytehq/airbyte/pull/37177) | Updating to 0.80.0 CDK | -| 0.2.3 | 2024-04-18 | [37177](https://github.com/airbytehq/airbyte/pull/37177) | Manage dependencies with Poetry. | -| 0.2.2 | 2024-04-15 | [37177](https://github.com/airbytehq/airbyte/pull/37177) | Base image migration: remove Dockerfile and use the python-connector-base image | -| 0.2.1 | 2024-04-12 | [37177](https://github.com/airbytehq/airbyte/pull/37177) | schema descriptions | -| 0.2.0 | 2023-10-23 | [31162](https://github.com/airbytehq/airbyte/pull/31162) | Migrate to low-code framework | -| 0.1.3 | 2023-05-15 | [26079](https://github.com/airbytehq/airbyte/pull/26079) | Make incremental syncs timestamp inclusive | -| 0.1.2 | 2023-03-23 | [24422](https://github.com/airbytehq/airbyte/pull/24422) | Fix incremental timedelta causing missing records | -| 0.1.1 | 2022-11-11 | [19356](https://github.com/airbytehq/airbyte/pull/19356) | Fix state date parse bug | -| 0.1.0 | 2022-10-19 | [18164](https://github.com/airbytehq/airbyte/pull/18164) | Release Insightly CDK Connector | +| Version | Date | Pull Request | Subject | +| :------ | :--------- | :------------------------------------------------------- | :------------------------------------------------------------------------------ | +| 0.2.4 | 2024-04-19 | [37177](https://github.com/airbytehq/airbyte/pull/37177) | Updating to 0.80.0 CDK | +| 0.2.3 | 2024-04-18 | [37177](https://github.com/airbytehq/airbyte/pull/37177) | Manage dependencies with Poetry. | +| 0.2.2 | 2024-04-15 | [37177](https://github.com/airbytehq/airbyte/pull/37177) | Base image migration: remove Dockerfile and use the python-connector-base image | +| 0.2.1 | 2024-04-12 | [37177](https://github.com/airbytehq/airbyte/pull/37177) | schema descriptions | +| 0.2.0 | 2023-10-23 | [31162](https://github.com/airbytehq/airbyte/pull/31162) | Migrate to low-code framework | +| 0.1.3 | 2023-05-15 | [26079](https://github.com/airbytehq/airbyte/pull/26079) | Make incremental syncs timestamp inclusive | +| 0.1.2 | 2023-03-23 | [24422](https://github.com/airbytehq/airbyte/pull/24422) | Fix incremental timedelta causing missing records | +| 0.1.1 | 2022-11-11 | [19356](https://github.com/airbytehq/airbyte/pull/19356) | Fix state date parse bug | +| 0.1.0 | 2022-10-19 | [18164](https://github.com/airbytehq/airbyte/pull/18164) | Release Insightly CDK Connector | diff --git a/docs/integrations/sources/instagram-migrations.md b/docs/integrations/sources/instagram-migrations.md index 49326bc1e4f..d1844a0c54a 100644 --- a/docs/integrations/sources/instagram-migrations.md +++ b/docs/integrations/sources/instagram-migrations.md @@ -5,15 +5,15 @@ The Instagram connector has been upgrade to API v18 (following the deprecation of v11). Connector will be upgraded to API v18. Affected Streams and their corresponding changes are listed below: - `Media Insights` - + Old metric will be replaced with the new ones, refer to the [IG Media Insights](https://developers.facebook.com/docs/instagram-api/reference/ig-media/insights#metrics) for more info. 
| Old metric | New metric | - |----------------------------|--------------------| + | -------------------------- | ------------------ | | carousel_album_engagement | total_interactions | | carousel_album_impressions | impressions | | carousel_album_reach | reach | - | carousel_album_saved | saved | + | carousel_album_saved | saved | | carousel_album_video_views | video_views | | engagement | total_interactions | @@ -23,13 +23,13 @@ You may see different results: `engagement` count includes likes, comments, and ::: - New metrics for Reels: `ig_reels_avg_watch_time`, `ig_reels_video_view_total_time` +New metrics for Reels: `ig_reels_avg_watch_time`, `ig_reels_video_view_total_time` - `User Lifetime Insights` - - Metric `audience_locale` will become unavailable. - - Metrics `audience_city`, `audience_country`, and `audience_gender_age` will be consolidated into a single metric named `follower_demographics`, featuring respective breakdowns for `city`, `country`, and `age,gender`. - - Primary key will be changed to `["business_account_id", "breakdown"]`. + - Metric `audience_locale` will become unavailable. + - Metrics `audience_city`, `audience_country`, and `audience_gender_age` will be consolidated into a single metric named `follower_demographics`, featuring respective breakdowns for `city`, `country`, and `age,gender`. + - Primary key will be changed to `["business_account_id", "breakdown"]`. :::note @@ -37,31 +37,29 @@ Due to Instagram limitations, the "Metric Type" will be set to `total_value` for ::: - - `Story Insights` Metrics: `exits`, `taps_back`, `taps_forward` will become unavailable. - Please follow the instructions below to migrate to version 3.0.0: 1. Select **Connections** in the main navbar. -1.1 Select the connection(s) affected by the update. + 1.1 Select the connection(s) affected by the update. 2. Select the **Replication** tab. -2.1 Select **Refresh source schema**. - ```note + 2.1 Select **Refresh source schema**. + `note Any detected schema changes will be listed for your review. - ``` -2.2 Select **OK**. + ` + 2.2 Select **OK**. 3. Select **Save changes** at the bottom of the page. -3.1 Ensure the **Reset affected streams** option is checked. - ```note + 3.1 Ensure the **Reset affected streams** option is checked. + `note Depending on destination type you may not be prompted to reset your data - ``` + ` 4. Select **Save connection**. - ```note - This will reset the data in your destination and initiate a fresh sync. - ``` + `note + This will reset the data in your destination and initiate a fresh sync. + ` For more information on resetting your data in Airbyte, see [this page](https://docs.airbyte.com/operator-guides/reset). @@ -70,5 +68,6 @@ For more information on resetting your data in Airbyte, see [this page](https:// This release adds a default primary key for the streams UserLifetimeInsights and UserInsights, and updates the format of timestamp fields in the UserLifetimeInsights, UserInsights, Media and Stories streams to include timezone information. 
To ensure uninterrupted syncs, users should: + - Refresh the source schema -- Reset affected streams \ No newline at end of file +- Reset affected streams diff --git a/docs/integrations/sources/instagram.md b/docs/integrations/sources/instagram.md index 3e6a1ffc9d8..1461fb4257a 100644 --- a/docs/integrations/sources/instagram.md +++ b/docs/integrations/sources/instagram.md @@ -84,7 +84,7 @@ The Instagram connector syncs data related to Users, Media, and Stories and thei AirbyteRecords are required to conform to the [Airbyte type](https://docs.airbyte.com/understanding-airbyte/supported-data-types/) system. This means that all sources must produce schemas and records within these types and all destinations must handle records that conform to this type system. | Integration Type | Airbyte Type | -|:-----------------|:-------------| +| :--------------- | :----------- | | `string` | `string` | | `number` | `number` | | `array` | `array` | @@ -105,14 +105,14 @@ Instagram limits the number of requests that can be made at a time. See Facebook ### Troubleshooting -* Check out common troubleshooting issues for the Instagram source connector on our [Airbyte Forum](https://github.com/airbytehq/airbyte/discussions). +- Check out common troubleshooting issues for the Instagram source connector on our [Airbyte Forum](https://github.com/airbytehq/airbyte/discussions). ## Changelog | Version | Date | Pull Request | Subject | -|:--------|:-----------|:---------------------------------------------------------|:--------------------------------------------------------------------------------------------------------------------------| +| :------ | :--------- | :------------------------------------------------------- | :------------------------------------------------------------------------------------------------------------------------ | | 3.0.7 | 2024-04-19 | [36643](https://github.com/airbytehq/airbyte/pull/36643) | Updating to 0.80.0 CDK | | 3.0.6 | 2024-04-12 | [36643](https://github.com/airbytehq/airbyte/pull/36643) | Schema descriptions | | 3.0.5 | 2024-03-20 | [36314](https://github.com/airbytehq/airbyte/pull/36314) | Unpin CDK version | diff --git a/docs/integrations/sources/instatus.md b/docs/integrations/sources/instatus.md index c4f2c751ceb..3d54d7ac022 100644 --- a/docs/integrations/sources/instatus.md +++ b/docs/integrations/sources/instatus.md @@ -1,44 +1,51 @@ # Instatus + This page contains the setup guide and reference information for the Instatus source connector. ## Prerequisites -To set up Metabase you need: - * `api_key` - Requests to Instatus API must provide an API token. +To set up Instatus you need: + +- `api_key` - Requests to the Instatus API must provide an API token. ## Setup guide + ### Step 1: Set up Instatus account + ### Step 2: Generate an API key + You can get your API key from [User settings](https://dashboard.instatus.com/developer). Make sure that you are an owner of the pages you want to sync; otherwise, this data will be skipped.
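Before entering the key in Airbyte, it can help to confirm that it works. Below is a minimal verification sketch, assuming the Instatus REST API is served at `https://api.instatus.com/v2` and accepts the key as a Bearer token; confirm the endpoint and auth scheme against the API references linked in the Supported Streams section before relying on it.

```python
# Hypothetical check that an Instatus API key is valid and can list your pages.
# Assumptions (not confirmed by this doc): base URL https://api.instatus.com/v2
# and Bearer-token authorization.
import requests

API_KEY = "YOUR_INSTATUS_API_KEY"  # placeholder

resp = requests.get(
    "https://api.instatus.com/v2/pages",
    headers={"Authorization": f"Bearer {API_KEY}"},
    timeout=30,
)
resp.raise_for_status()
pages = resp.json()
print(f"API key accepted; {len(pages)} status page(s) visible to this account.")
```

If the request returns an authorization error, regenerate the key from the User settings page mentioned above and confirm you own the pages you intend to sync.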
+ ### Step 2: Set up the Instatus connector in Airbyte ## Supported sync modes + The Instatus source connector supports the following [sync modes](https://docs.airbyte.com/cloud/core-concepts#connection-sync-modes): -* [Full Refresh - Overwrite](https://docs.airbyte.com/understanding-airbyte/connections/full-refresh-overwrite) - +- [Full Refresh - Overwrite](https://docs.airbyte.com/understanding-airbyte/connections/full-refresh-overwrite) ## Supported Streams -* [Status pages](https://instatus.com/help/api/status-pages) -* [Components](https://instatus.com/help/api/components) -* [Incidents](https://instatus.com/help/api/incidents) -* [Incident updates](https://instatus.com/help/api/incident-updates) -* [Maintenances](https://instatus.com/help/api/maintenances) -* [Maintenance updates](https://instatus.com/help/api/maintenance-updates) -* [Templates](https://instatus.com/help/api/templates) -* [Team](https://instatus.com/help/api/teammates) -* [Subscribers](https://instatus.com/help/api/subscribers) -* [Metrics](https://instatus.com/help/api/metrics) -* [User](https://instatus.com/help/api/user-profile) -* [Public data](https://instatus.com/help/api/public-data) + +- [Status pages](https://instatus.com/help/api/status-pages) +- [Components](https://instatus.com/help/api/components) +- [Incidents](https://instatus.com/help/api/incidents) +- [Incident updates](https://instatus.com/help/api/incident-updates) +- [Maintenances](https://instatus.com/help/api/maintenances) +- [Maintenance updates](https://instatus.com/help/api/maintenance-updates) +- [Templates](https://instatus.com/help/api/templates) +- [Team](https://instatus.com/help/api/teammates) +- [Subscribers](https://instatus.com/help/api/subscribers) +- [Metrics](https://instatus.com/help/api/metrics) +- [User](https://instatus.com/help/api/user-profile) +- [Public data](https://instatus.com/help/api/public-data) ## Tutorials ### Data type mapping | Integration Type | Airbyte Type | Notes | -|:--------------------|:-------------|:------| +| :------------------ | :----------- | :---- | | `string` | `string` | | | `integer`, `number` | `number` | | | `array` | `array` | | @@ -47,7 +54,7 @@ The Instatus source connector supports the following [sync modes](https://docs.a ### Features | Feature | Supported?\(Yes/No\) | Notes | -|:------------------|:---------------------|:------| +| :---------------- | :------------------- | :---- | | Full Refresh Sync | Yes | | | Incremental Sync | No | | | SSL connection | Yes | @@ -55,6 +62,6 @@ The Instatus source connector supports the following [sync modes](https://docs.a ## Changelog -| Version | Date | Pull Request | Subject | -|:--------|:-----------|:---------------------------------------------------------|:---------------------------| -| 0.1.0 | 2023-04-01 | [21008](https://github.com/airbytehq/airbyte/pull/21008) | Initial (alpha) release | \ No newline at end of file +| Version | Date | Pull Request | Subject | +| :------ | :--------- | :------------------------------------------------------- | :---------------------- | +| 0.1.0 | 2023-04-01 | [21008](https://github.com/airbytehq/airbyte/pull/21008) | Initial (alpha) release | diff --git a/docs/integrations/sources/intercom.md b/docs/integrations/sources/intercom.md index 4f9da48667c..a5931a701bb 100644 --- a/docs/integrations/sources/intercom.md +++ b/docs/integrations/sources/intercom.md @@ -31,9 +31,10 @@ To authenticate the connector in **Airbyte Open Source**, you will need to obtai 5. 
To authenticate: + - For **Airbyte Cloud**, click **Authenticate your Intercom account**. When the pop-up appears, select the appropriate workspace from the dropdown and click **Authorize access**. - - + + - For **Airbyte Open Source**, enter your access token to authenticate your account. @@ -72,51 +73,51 @@ The Intercom connector should not run into Intercom API limitations under normal ## Changelog -| Version | Date | Pull Request | Subject | -|:--------|:-----------|:---------------------------------------------------------|:---------------------------------------------------------------------------------------------------------| -| 0.6.3 | 2024-03-23 | [36414](https://github.com/airbytehq/airbyte/pull/36414) | Fixed `pagination` regression bug for `conversations` stream | -| 0.6.2 | 2024-03-22 | [36277](https://github.com/airbytehq/airbyte/pull/36277) | Fixed the bug for `conversations` stream failed due to `404 - User Not Found`, when the `2.10` API version is used | -| 0.6.1 | 2024-03-18 | [36232](https://github.com/airbytehq/airbyte/pull/36232) | Fixed the bug caused the regression when setting the `Intercom-Version` header, updated the source to use the latest CDK version | -| 0.6.0 | 2024-02-12 | [35176](https://github.com/airbytehq/airbyte/pull/35176) | Update the connector to use `2.10` API version | -| 0.5.1 | 2024-02-12 | [35148](https://github.com/airbytehq/airbyte/pull/35148) | Manage dependencies with Poetry. | -| 0.5.0 | 2024-02-09 | [35063](https://github.com/airbytehq/airbyte/pull/35063) | Add missing fields for mutiple streams | -| 0.4.0 | 2024-01-11 | [33882](https://github.com/airbytehq/airbyte/pull/33882) | Add new stream `Activity Logs` | -| 0.3.2 | 2023-12-07 | [33223](https://github.com/airbytehq/airbyte/pull/33223) | Ignore 404 error for `Conversation Parts` | -| 0.3.1 | 2023-10-19 | [31599](https://github.com/airbytehq/airbyte/pull/31599) | Base image migration: remove Dockerfile and use the python-connector-base image | -| 0.3.0 | 2023-05-25 | [29598](https://github.com/airbytehq/airbyte/pull/29598) | Update custom components to make them compatible with latest cdk version, simplify logic, update schemas | -| 0.2.1 | 2023-05-25 | [26571](https://github.com/airbytehq/airbyte/pull/26571) | Remove authSpecification from spec.json in favour of advancedAuth | -| 0.2.0 | 2023-04-05 | [23013](https://github.com/airbytehq/airbyte/pull/23013) | Migrated to Low-code (YAML Frramework) | -| 0.1.33 | 2023-03-20 | [22980](https://github.com/airbytehq/airbyte/pull/22980) | Specified date formatting in specification | -| 0.1.32 | 2023-02-27 | [22095](https://github.com/airbytehq/airbyte/pull/22095) | Extended `Contacts` schema adding `opted_out_subscription_types` property | -| 0.1.31 | 2023-02-17 | [23152](https://github.com/airbytehq/airbyte/pull/23152) | Add `TypeTransformer` to stream `companies` | -| 0.1.30 | 2023-01-27 | [22010](https://github.com/airbytehq/airbyte/pull/22010) | Set `AvailabilityStrategy` for streams explicitly to `None` | -| 0.1.29 | 2022-10-31 | [18681](https://github.com/airbytehq/airbyte/pull/18681) | Define correct version for airbyte-cdk~=0.2 | -| 0.1.28 | 2022-10-20 | [18216](https://github.com/airbytehq/airbyte/pull/18216) | Use airbyte-cdk~=0.2.0 with SQLite caching | -| 0.1.27 | 2022-08-28 | [17326](https://github.com/airbytehq/airbyte/pull/17326) | Migrate to per-stream states. 
| -| 0.1.26 | 2022-08-18 | [16540](https://github.com/airbytehq/airbyte/pull/16540) | Fix JSON schema | -| 0.1.25 | 2022-08-18 | [15681](https://github.com/airbytehq/airbyte/pull/15681) | Update Intercom API to v 2.5 | -| 0.1.24 | 2022-07-21 | [14924](https://github.com/airbytehq/airbyte/pull/14924) | Remove `additionalProperties` field from schemas | -| 0.1.23 | 2022-07-19 | [14830](https://github.com/airbytehq/airbyte/pull/14830) | Added `checkpoint_interval` for Incremental streams | -| 0.1.22 | 2022-07-09 | [14554](https://github.com/airbytehq/airbyte/pull/14554) | Fixed `conversation_parts` stream schema definition | -| 0.1.21 | 2022-07-05 | [14403](https://github.com/airbytehq/airbyte/pull/14403) | Refactored `Conversations`, `Conversation Parts`, `Company Segments` to increase performance | -| 0.1.20 | 2022-06-24 | [14099](https://github.com/airbytehq/airbyte/pull/14099) | Extended `Contacts` stream schema with `sms_consent`,`unsubscribe_from_sms` properties | -| 0.1.19 | 2022-05-25 | [13204](https://github.com/airbytehq/airbyte/pull/13204) | Fixed `conversation_parts` stream schema definition | -| 0.1.18 | 2022-05-04 | [12482](https://github.com/airbytehq/airbyte/pull/12482) | Update input configuration copy | -| 0.1.17 | 2022-04-29 | [12374](https://github.com/airbytehq/airbyte/pull/12374) | Fixed filtering of conversation_parts | -| 0.1.16 | 2022-03-23 | [11206](https://github.com/airbytehq/airbyte/pull/11206) | Added conversation_id field to conversation_part records | -| 0.1.15 | 2022-03-22 | [11176](https://github.com/airbytehq/airbyte/pull/11176) | Correct `check_connection` URL | -| 0.1.14 | 2022-03-16 | [11208](https://github.com/airbytehq/airbyte/pull/11208) | Improve 'conversations' incremental sync speed | -| 0.1.13 | 2022-01-14 | [9513](https://github.com/airbytehq/airbyte/pull/9513) | Added handling of scroll param when it expired | -| 0.1.12 | 2021-12-14 | [8429](https://github.com/airbytehq/airbyte/pull/8429) | Updated fields and descriptions | -| 0.1.11 | 2021-12-13 | [8685](https://github.com/airbytehq/airbyte/pull/8685) | Remove time.sleep for rate limit | -| 0.1.10 | 2021-12-10 | [8637](https://github.com/airbytehq/airbyte/pull/8637) | Fix 'conversations' order and sorting. 
Correction of the companies stream | -| 0.1.9 | 2021-12-03 | [8395](https://github.com/airbytehq/airbyte/pull/8395) | Fix backoff of 'companies' stream | -| 0.1.8 | 2021-11-09 | [7060](https://github.com/airbytehq/airbyte/pull/7060) | Added oauth support | -| 0.1.7 | 2021-11-08 | [7499](https://github.com/airbytehq/airbyte/pull/7499) | Remove base-python dependencies | -| 0.1.6 | 2021-10-07 | [6879](https://github.com/airbytehq/airbyte/pull/6879) | Corrected pagination for contacts | -| 0.1.5 | 2021-09-28 | [6082](https://github.com/airbytehq/airbyte/pull/6082) | Corrected android\_last\_seen\_at field data type in schemas | -| 0.1.4 | 2021-09-20 | [6087](https://github.com/airbytehq/airbyte/pull/6087) | Corrected updated\_at field data type in schemas | -| 0.1.3 | 2021-09-08 | [5908](https://github.com/airbytehq/airbyte/pull/5908) | Corrected timestamp and arrays in schemas | -| 0.1.2 | 2021-08-19 | [5531](https://github.com/airbytehq/airbyte/pull/5531) | Corrected pagination | -| 0.1.1 | 2021-07-31 | [5123](https://github.com/airbytehq/airbyte/pull/5123) | Corrected rate limit | -| 0.1.0 | 2021-07-19 | [4676](https://github.com/airbytehq/airbyte/pull/4676) | Release Intercom CDK Connector | +| Version | Date | Pull Request | Subject | +| :------ | :--------- | :------------------------------------------------------- | :------------------------------------------------------------------------------------------------------------------------------- | +| 0.6.3 | 2024-03-23 | [36414](https://github.com/airbytehq/airbyte/pull/36414) | Fixed `pagination` regression bug for `conversations` stream | +| 0.6.2 | 2024-03-22 | [36277](https://github.com/airbytehq/airbyte/pull/36277) | Fixed the bug for `conversations` stream failed due to `404 - User Not Found`, when the `2.10` API version is used | +| 0.6.1 | 2024-03-18 | [36232](https://github.com/airbytehq/airbyte/pull/36232) | Fixed the bug caused the regression when setting the `Intercom-Version` header, updated the source to use the latest CDK version | +| 0.6.0 | 2024-02-12 | [35176](https://github.com/airbytehq/airbyte/pull/35176) | Update the connector to use `2.10` API version | +| 0.5.1 | 2024-02-12 | [35148](https://github.com/airbytehq/airbyte/pull/35148) | Manage dependencies with Poetry. 
| +| 0.5.0 | 2024-02-09 | [35063](https://github.com/airbytehq/airbyte/pull/35063) | Add missing fields for mutiple streams | +| 0.4.0 | 2024-01-11 | [33882](https://github.com/airbytehq/airbyte/pull/33882) | Add new stream `Activity Logs` | +| 0.3.2 | 2023-12-07 | [33223](https://github.com/airbytehq/airbyte/pull/33223) | Ignore 404 error for `Conversation Parts` | +| 0.3.1 | 2023-10-19 | [31599](https://github.com/airbytehq/airbyte/pull/31599) | Base image migration: remove Dockerfile and use the python-connector-base image | +| 0.3.0 | 2023-05-25 | [29598](https://github.com/airbytehq/airbyte/pull/29598) | Update custom components to make them compatible with latest cdk version, simplify logic, update schemas | +| 0.2.1 | 2023-05-25 | [26571](https://github.com/airbytehq/airbyte/pull/26571) | Remove authSpecification from spec.json in favour of advancedAuth | +| 0.2.0 | 2023-04-05 | [23013](https://github.com/airbytehq/airbyte/pull/23013) | Migrated to Low-code (YAML Frramework) | +| 0.1.33 | 2023-03-20 | [22980](https://github.com/airbytehq/airbyte/pull/22980) | Specified date formatting in specification | +| 0.1.32 | 2023-02-27 | [22095](https://github.com/airbytehq/airbyte/pull/22095) | Extended `Contacts` schema adding `opted_out_subscription_types` property | +| 0.1.31 | 2023-02-17 | [23152](https://github.com/airbytehq/airbyte/pull/23152) | Add `TypeTransformer` to stream `companies` | +| 0.1.30 | 2023-01-27 | [22010](https://github.com/airbytehq/airbyte/pull/22010) | Set `AvailabilityStrategy` for streams explicitly to `None` | +| 0.1.29 | 2022-10-31 | [18681](https://github.com/airbytehq/airbyte/pull/18681) | Define correct version for airbyte-cdk~=0.2 | +| 0.1.28 | 2022-10-20 | [18216](https://github.com/airbytehq/airbyte/pull/18216) | Use airbyte-cdk~=0.2.0 with SQLite caching | +| 0.1.27 | 2022-08-28 | [17326](https://github.com/airbytehq/airbyte/pull/17326) | Migrate to per-stream states. 
| +| 0.1.26 | 2022-08-18 | [16540](https://github.com/airbytehq/airbyte/pull/16540) | Fix JSON schema | +| 0.1.25 | 2022-08-18 | [15681](https://github.com/airbytehq/airbyte/pull/15681) | Update Intercom API to v 2.5 | +| 0.1.24 | 2022-07-21 | [14924](https://github.com/airbytehq/airbyte/pull/14924) | Remove `additionalProperties` field from schemas | +| 0.1.23 | 2022-07-19 | [14830](https://github.com/airbytehq/airbyte/pull/14830) | Added `checkpoint_interval` for Incremental streams | +| 0.1.22 | 2022-07-09 | [14554](https://github.com/airbytehq/airbyte/pull/14554) | Fixed `conversation_parts` stream schema definition | +| 0.1.21 | 2022-07-05 | [14403](https://github.com/airbytehq/airbyte/pull/14403) | Refactored `Conversations`, `Conversation Parts`, `Company Segments` to increase performance | +| 0.1.20 | 2022-06-24 | [14099](https://github.com/airbytehq/airbyte/pull/14099) | Extended `Contacts` stream schema with `sms_consent`,`unsubscribe_from_sms` properties | +| 0.1.19 | 2022-05-25 | [13204](https://github.com/airbytehq/airbyte/pull/13204) | Fixed `conversation_parts` stream schema definition | +| 0.1.18 | 2022-05-04 | [12482](https://github.com/airbytehq/airbyte/pull/12482) | Update input configuration copy | +| 0.1.17 | 2022-04-29 | [12374](https://github.com/airbytehq/airbyte/pull/12374) | Fixed filtering of conversation_parts | +| 0.1.16 | 2022-03-23 | [11206](https://github.com/airbytehq/airbyte/pull/11206) | Added conversation_id field to conversation_part records | +| 0.1.15 | 2022-03-22 | [11176](https://github.com/airbytehq/airbyte/pull/11176) | Correct `check_connection` URL | +| 0.1.14 | 2022-03-16 | [11208](https://github.com/airbytehq/airbyte/pull/11208) | Improve 'conversations' incremental sync speed | +| 0.1.13 | 2022-01-14 | [9513](https://github.com/airbytehq/airbyte/pull/9513) | Added handling of scroll param when it expired | +| 0.1.12 | 2021-12-14 | [8429](https://github.com/airbytehq/airbyte/pull/8429) | Updated fields and descriptions | +| 0.1.11 | 2021-12-13 | [8685](https://github.com/airbytehq/airbyte/pull/8685) | Remove time.sleep for rate limit | +| 0.1.10 | 2021-12-10 | [8637](https://github.com/airbytehq/airbyte/pull/8637) | Fix 'conversations' order and sorting. 
Correction of the companies stream | +| 0.1.9 | 2021-12-03 | [8395](https://github.com/airbytehq/airbyte/pull/8395) | Fix backoff of 'companies' stream | +| 0.1.8 | 2021-11-09 | [7060](https://github.com/airbytehq/airbyte/pull/7060) | Added oauth support | +| 0.1.7 | 2021-11-08 | [7499](https://github.com/airbytehq/airbyte/pull/7499) | Remove base-python dependencies | +| 0.1.6 | 2021-10-07 | [6879](https://github.com/airbytehq/airbyte/pull/6879) | Corrected pagination for contacts | +| 0.1.5 | 2021-09-28 | [6082](https://github.com/airbytehq/airbyte/pull/6082) | Corrected android_last_seen_at field data type in schemas | +| 0.1.4 | 2021-09-20 | [6087](https://github.com/airbytehq/airbyte/pull/6087) | Corrected updated_at field data type in schemas | +| 0.1.3 | 2021-09-08 | [5908](https://github.com/airbytehq/airbyte/pull/5908) | Corrected timestamp and arrays in schemas | +| 0.1.2 | 2021-08-19 | [5531](https://github.com/airbytehq/airbyte/pull/5531) | Corrected pagination | +| 0.1.1 | 2021-07-31 | [5123](https://github.com/airbytehq/airbyte/pull/5123) | Corrected rate limit | +| 0.1.0 | 2021-07-19 | [4676](https://github.com/airbytehq/airbyte/pull/4676) | Release Intercom CDK Connector | diff --git a/docs/integrations/sources/intruder.md b/docs/integrations/sources/intruder.md index bb8bbb7553c..c65cc055b0e 100644 --- a/docs/integrations/sources/intruder.md +++ b/docs/integrations/sources/intruder.md @@ -6,17 +6,17 @@ This source can sync data from the [Intruder.io API](https://dev.Intruder.io.com ## This Source Supports the Following Streams -* Issues -* Occurrences issue -* Targets -* Scans +- Issues +- Occurrences issue +- Targets +- Scans ### Features -| Feature | Supported?\(Yes/No\) | Notes | -| :--- | :--- | :--- | -| Full Refresh Sync | Yes | | -| Incremental Sync | No | | +| Feature | Supported?\(Yes/No\) | Notes | +| :---------------- | :------------------- | :---- | +| Full Refresh Sync | Yes | | +| Incremental Sync | No | | ### Performance considerations @@ -26,10 +26,10 @@ Intruder.io APIs are under rate limits for the number of API calls allowed per A ### Requirements -* Intruder.io Access token +- Intruder.io Access token ## Changelog -| Version | Date | Pull Request | Subject | -| :------ | :--------- | :-------------------------------------------------------- | :----------------------------------------- | -| 0.1.0 | 2022-10-30 | [#18668](https://github.com/airbytehq/airbyte/pull/18668) | 🎉 New Source: Intruder.io API [low-code CDK] | \ No newline at end of file +| Version | Date | Pull Request | Subject | +| :------ | :--------- | :-------------------------------------------------------- | :-------------------------------------------- | +| 0.1.0 | 2022-10-30 | [#18668](https://github.com/airbytehq/airbyte/pull/18668) | 🎉 New Source: Intruder.io API [low-code CDK] | diff --git a/docs/integrations/sources/ip2whois.md b/docs/integrations/sources/ip2whois.md index 6b972f4e9d0..89f5a0cf543 100644 --- a/docs/integrations/sources/ip2whois.md +++ b/docs/integrations/sources/ip2whois.md @@ -6,15 +6,14 @@ This source can sync data from the [Ip2whois API](https://www.ip2whois.com/devel ## This Source Supports the Following Streams -* [whois](https://www.ip2whois.com/developers-api) - +- [whois](https://www.ip2whois.com/developers-api) ### Features -| Feature | Supported?\(Yes/No\) | Notes | -| :--- | :--- | :--- | -| Full Refresh Sync | Yes | | -| Incremental Sync | No | | +| Feature | Supported?\(Yes/No\) | Notes | +| :---------------- | :------------------- | :---- | +| Full 
Refresh Sync | Yes | | +| Incremental Sync | No | | ### Performance considerations @@ -24,15 +23,13 @@ Ip2whois APIs allows you to query up to 500 WHOIS domain name per month. ### Requirements -* [API token](https://www.ip2whois.com/register) - +- [API token](https://www.ip2whois.com/register) ## Changelog -| Version | Date | Pull Request | Subject | -| :------ | :--------- | :-------------------------------------------------------- | :----------------------------------------- | -| 0.1.3 | 2024-04-19 | [37180](https://github.com/airbytehq/airbyte/pull/37180) | Upgrade to CDK 0.80.0 and manage dependencies with Poetry. | -| 0.1.2 | 2024-04-15 | [37180](https://github.com/airbytehq/airbyte/pull/37180) | Base image migration: remove Dockerfile and use the python-connector-base image | -| 0.1.1 | 2024-04-12 | [37180](https://github.com/airbytehq/airbyte/pull/37180) | schema descriptions | -| 0.1.0 | 2022-10-29 | [#18651](https://github.com/airbytehq/airbyte/pull/18651) | 🎉 New source: Ip2whois [low-code SDK]| - +| Version | Date | Pull Request | Subject | +| :------ | :--------- | :-------------------------------------------------------- | :------------------------------------------------------------------------------ | +| 0.1.3 | 2024-04-19 | [37180](https://github.com/airbytehq/airbyte/pull/37180) | Upgrade to CDK 0.80.0 and manage dependencies with Poetry. | +| 0.1.2 | 2024-04-15 | [37180](https://github.com/airbytehq/airbyte/pull/37180) | Base image migration: remove Dockerfile and use the python-connector-base image | +| 0.1.1 | 2024-04-12 | [37180](https://github.com/airbytehq/airbyte/pull/37180) | schema descriptions | +| 0.1.0 | 2022-10-29 | [#18651](https://github.com/airbytehq/airbyte/pull/18651) | 🎉 New source: Ip2whois [low-code SDK] | diff --git a/docs/integrations/sources/iterable.md b/docs/integrations/sources/iterable.md index cbf982a1940..bb7731fbfe2 100644 --- a/docs/integrations/sources/iterable.md +++ b/docs/integrations/sources/iterable.md @@ -79,7 +79,7 @@ The Iterable source connector supports the following [sync modes](https://docs.a ## Changelog | Version | Date | Pull Request | Subject | -|:--------|:-----------|:---------------------------------------------------------|:---------------------------------------------------------------------------------------------------------------------------------------------------------------------------| +| :------ | :--------- | :------------------------------------------------------- | :------------------------------------------------------------------------------------------------------------------------------------------------------------------------- | | 0.5.1 | 2024-04-24 | [36645](https://github.com/airbytehq/airbyte/pull/36645) | Schema descriptions and CDK 0.80.0 | | 0.5.0 | 2024-03-18 | [36231](https://github.com/airbytehq/airbyte/pull/36231) | Migrate connector to low-code | | 0.4.0 | 2024-03-19 | [36267](https://github.com/airbytehq/airbyte/pull/36267) | Pin airbyte-cdk version to `^0` | diff --git a/docs/integrations/sources/jenkins.md b/docs/integrations/sources/jenkins.md index 43f58dbe113..fb960abb3c4 100644 --- a/docs/integrations/sources/jenkins.md +++ b/docs/integrations/sources/jenkins.md @@ -13,8 +13,8 @@ in the tables and columns you set up for replication, every time a sync is run. 
Several output streams are available from this source: -* [Builds](https://your.jenkins.url/job/$JOB_NAME/$BUILD_NUMBER/api/json?pretty=true) \(Incremental\) -* [Jobs](https://your.jenkins.url/job/$JOB_NAME/api/json?pretty=true) +- [Builds](https://your.jenkins.url/job/$JOB_NAME/$BUILD_NUMBER/api/json?pretty=true) \(Incremental\) +- [Jobs](https://your.jenkins.url/job/$JOB_NAME/api/json?pretty=true) In the above links, replace `your.jenkins.url` with the url of your Jenkins instance, and replace any environment variables with an existing Jenkins job or @@ -25,12 +25,12 @@ issue.](https://github.com/faros-ai/airbyte-connectors/issues/new) ### Features -| Feature | Supported? | -| :--- | :--- | -| Full Refresh Sync | Yes | -| Incremental Sync | Yes | -| SSL connection | Yes | -| Namespaces | No | +| Feature | Supported? | +| :---------------- | :--------- | +| Full Refresh Sync | Yes | +| Incremental Sync | Yes | +| SSL connection | Yes | +| Namespaces | No | ### Performance considerations @@ -43,9 +43,9 @@ rate limit issues that are not automatically retried successfully. ### Requirements -* Jenkins Server -* Jenkins User -* Jenkins API Token +- Jenkins Server +- Jenkins User +- Jenkins API Token ### Setup guide @@ -54,11 +54,10 @@ Login to your Jenkins server in your browser and go to ## Changelog -| Version | Date | Pull Request | Subject | -| :--- | :--- | :--- | :--- | -| 0.1.23 | 2021-10-01 | [114](https://github.com/faros-ai/airbyte-connectors/pull/114) | Added projects stream to Phabricator + cleanup | -| 0.1.22 | 2021-10-01 | [113](https://github.com/faros-ai/airbyte-connectors/pull/113) | Added revisions & users streams to Phabricator source + bump version | -| 0.1.21 | 2021-09-27 | [101](https://github.com/faros-ai/airbyte-connectors/pull/101) | Exclude tests from Docker + fix path + bump version | -| 0.1.20 | 2021-09-27 | [100](https://github.com/faros-ai/airbyte-connectors/pull/100) | Update Jenkins spec + refactor + add Phabricator source skeleton | -| 0.1.7 | 2021-09-25 | [64](https://github.com/faros-ai/airbyte-connectors/pull/64) | Add Jenkins source | - +| Version | Date | Pull Request | Subject | +| :------ | :--------- | :------------------------------------------------------------- | :------------------------------------------------------------------- | +| 0.1.23 | 2021-10-01 | [114](https://github.com/faros-ai/airbyte-connectors/pull/114) | Added projects stream to Phabricator + cleanup | +| 0.1.22 | 2021-10-01 | [113](https://github.com/faros-ai/airbyte-connectors/pull/113) | Added revisions & users streams to Phabricator source + bump version | +| 0.1.21 | 2021-09-27 | [101](https://github.com/faros-ai/airbyte-connectors/pull/101) | Exclude tests from Docker + fix path + bump version | +| 0.1.20 | 2021-09-27 | [100](https://github.com/faros-ai/airbyte-connectors/pull/100) | Update Jenkins spec + refactor + add Phabricator source skeleton | +| 0.1.7 | 2021-09-25 | [64](https://github.com/faros-ai/airbyte-connectors/pull/64) | Add Jenkins source | diff --git a/docs/integrations/sources/jira-migrations.md b/docs/integrations/sources/jira-migrations.md index 9dc0955b49d..aba47c32ba5 100644 --- a/docs/integrations/sources/jira-migrations.md +++ b/docs/integrations/sources/jira-migrations.md @@ -7,21 +7,21 @@ Note: this change is only breaking if you are using the `Boards Issues` stream i This is a breaking change because Stream State for `Boards Issues` will be changed, so please follow the instructions below to migrate to version 1.0.0: 1. 
Select **Connections** in the main navbar. -1.1 Select the connection(s) affected by the update. + 1.1 Select the connection(s) affected by the update. 2. Select the **Replication** tab. -2.1 Select **Refresh source schema**. - ```note + 2.1 Select **Refresh source schema**. + `note Any detected schema changes will be listed for your review. - ``` -2.2 Select **OK**. + ` + 2.2 Select **OK**. 3. Select **Save changes** at the bottom of the page. -3.1 Ensure the **Reset affected streams** option is checked. - ```note + 3.1 Ensure the **Reset affected streams** option is checked. + `note Depending on destination type you may not be prompted to reset your data - ``` + ` 4. Select **Save connection**. - ```note - This will reset the data in your destination and initiate a fresh sync. - ``` + `note + This will reset the data in your destination and initiate a fresh sync. + ` -For more information on resetting your data in Airbyte, see [this page](https://docs.airbyte.com/operator-guides/reset). \ No newline at end of file +For more information on resetting your data in Airbyte, see [this page](https://docs.airbyte.com/operator-guides/reset). diff --git a/docs/integrations/sources/jira.md b/docs/integrations/sources/jira.md index 4c22c47823c..e840629bf79 100644 --- a/docs/integrations/sources/jira.md +++ b/docs/integrations/sources/jira.md @@ -123,10 +123,10 @@ The Jira connector should not run into Jira API limitations under normal usage. ## CHANGELOG | Version | Date | Pull Request | Subject | -|:--------|:-----------|:-----------------------------------------------------------|:-----------------------------------------------------------------------------------------------------------------------------------------------------------------| -| 1.2.2 | 2024-04-19 | [36646](https://github.com/airbytehq/airbyte/pull/36646) | Updating to 0.80.0 CDK | -| 1.2.1 | 2024-04-12 | [36646](https://github.com/airbytehq/airbyte/pull/36646) | schema descriptions | -| 1.2.0 | 2024-03-19 | [36267](https://github.com/airbytehq/airbyte/pull/36267) | Pin airbyte-cdk version to `^0` | +| :------ | :--------- | :--------------------------------------------------------- | :--------------------------------------------------------------------------------------------------------------------------------------------------------------- | +| 1.2.2 | 2024-04-19 | [36646](https://github.com/airbytehq/airbyte/pull/36646) | Updating to 0.80.0 CDK | +| 1.2.1 | 2024-04-12 | [36646](https://github.com/airbytehq/airbyte/pull/36646) | schema descriptions | +| 1.2.0 | 2024-03-19 | [36267](https://github.com/airbytehq/airbyte/pull/36267) | Pin airbyte-cdk version to `^0` | | 1.1.0 | 2024-02-27 | [35656](https://github.com/airbytehq/airbyte/pull/35656) | Add new fields to streams `board_issues`, `filter_sharing`, `filters`, `issues`, `permission_schemes`, `sprint_issues`, `users_groups_detailed`, and `workflows` | | 1.0.2 | 2024-02-12 | [35160](https://github.com/airbytehq/airbyte/pull/35160) | Manage dependencies with Poetry. | | 1.0.1 | 2024-01-24 | [34470](https://github.com/airbytehq/airbyte/pull/34470) | Add state checkpoint interval for all streams | diff --git a/docs/integrations/sources/k6-cloud.md b/docs/integrations/sources/k6-cloud.md index 6bae33a4039..48a14a238dc 100644 --- a/docs/integrations/sources/k6-cloud.md +++ b/docs/integrations/sources/k6-cloud.md @@ -6,16 +6,16 @@ This source can sync data from the [K6 Cloud API](https://developers.k6.io). 
At ## This Source Supports the Following Streams -* organizations -* projects -* tests +- organizations +- projects +- tests ### Features -| Feature | Supported?\(Yes/No\) | Notes | -| :--- | :--- | :--- | -| Full Refresh Sync | Yes | | -| Incremental Sync | No | | +| Feature | Supported?\(Yes/No\) | Notes | +| :---------------- | :------------------- | :---- | +| Full Refresh Sync | Yes | | +| Incremental Sync | No | | ### Performance considerations @@ -23,13 +23,13 @@ This source can sync data from the [K6 Cloud API](https://developers.k6.io). At ### Requirements -* API Token +- API Token ## Changelog -| Version | Date | Pull Request | Subject | -| :------ | :--------- | :-------------------------------------------------------- | :----------------------------------------- | -| 0.1.3 | 2024-04-19 | [37181](https://github.com/airbytehq/airbyte/pull/37181) | Upgrade to CDK 0.80.0 and manage dependencies with Poetry. | -| 0.1.2 | 2024-04-15 | [37181](https://github.com/airbytehq/airbyte/pull/37181) | Base image migration: remove Dockerfile and use the python-connector-base image | -| 0.1.1 | 2024-04-12 | [37181](https://github.com/airbytehq/airbyte/pull/37181) | schema descriptions | -| 0.1.0 | 2022-10-27 | [#18393](https://github.com/airbytehq/airbyte/pull/18393) | 🎉 New Source: K6 Cloud API [low-code CDK] | +| Version | Date | Pull Request | Subject | +| :------ | :--------- | :-------------------------------------------------------- | :------------------------------------------------------------------------------ | +| 0.1.3 | 2024-04-19 | [37181](https://github.com/airbytehq/airbyte/pull/37181) | Upgrade to CDK 0.80.0 and manage dependencies with Poetry. | +| 0.1.2 | 2024-04-15 | [37181](https://github.com/airbytehq/airbyte/pull/37181) | Base image migration: remove Dockerfile and use the python-connector-base image | +| 0.1.1 | 2024-04-12 | [37181](https://github.com/airbytehq/airbyte/pull/37181) | schema descriptions | +| 0.1.0 | 2022-10-27 | [#18393](https://github.com/airbytehq/airbyte/pull/18393) | 🎉 New Source: K6 Cloud API [low-code CDK] | diff --git a/docs/integrations/sources/kafka.md b/docs/integrations/sources/kafka.md index 7eed0d3c74f..a4a6767feb7 100644 --- a/docs/integrations/sources/kafka.md +++ b/docs/integrations/sources/kafka.md @@ -8,21 +8,21 @@ This page guides you through the process of setting up the Kafka source connecto To use the Kafka source connector, you'll need: -* [A Kafka cluster 1.0 or above](https://kafka.apache.org/quickstart) -* Airbyte user should be allowed to read messages from topics, and these topics should be created before reading from Kafka. +- [A Kafka cluster 1.0 or above](https://kafka.apache.org/quickstart) +- Airbyte user should be allowed to read messages from topics, and these topics should be created before reading from Kafka. ## Step 2: Setup the Kafka source in Airbyte You'll need the following information to configure the Kafka source: -* **Group ID** - The Group ID is how you distinguish different consumer groups. (e.g. group.id) -* **Protocol** - The Protocol used to communicate with brokers. -* **Client ID** - An ID string to pass to the server when making requests. The purpose of this is to be able to track the source of requests beyond just ip/port by allowing a logical application name to be included in server-side request logging. (e.g. airbyte-consumer) -* **Test Topic** - The Topic to test in case the Airbyte can consume messages. (e.g. 
test.topic) -* **Subscription Method** - You can choose to manually assign a list of partitions, or subscribe to all topics matching specified pattern to get dynamically assigned partitions. -* **List of topic** -* **Bootstrap Servers** - A list of host/port pairs to use for establishing the initial connection to the Kafka cluster. -* **Schema Registry** - Host/port to connect schema registry server. Note: It supports for AVRO format only. +- **Group ID** - The Group ID is how you distinguish different consumer groups. (e.g. group.id) +- **Protocol** - The Protocol used to communicate with brokers. +- **Client ID** - An ID string to pass to the server when making requests. The purpose of this is to be able to track the source of requests beyond just ip/port by allowing a logical application name to be included in server-side request logging. (e.g. airbyte-consumer) +- **Test Topic** - The Topic to test in case the Airbyte can consume messages. (e.g. test.topic) +- **Subscription Method** - You can choose to manually assign a list of partitions, or subscribe to all topics matching specified pattern to get dynamically assigned partitions. +- **List of topic** +- **Bootstrap Servers** - A list of host/port pairs to use for establishing the initial connection to the Kafka cluster. +- **Schema Registry** - Host/port to connect schema registry server. Note: It supports for AVRO format only. ### For Airbyte Open Source: @@ -34,32 +34,32 @@ You'll need the following information to configure the Kafka source: The Kafka source connector supports the following [sync modes](https://docs.airbyte.com/cloud/core-concepts#connection-sync-modes): -| Feature | Supported?\(Yes/No\) | Notes | -| :--- | :--- | :--- | -| Full Refresh Sync | Yes | | -| Incremental - Append Sync | Yes | | -| Namespaces | No | | +| Feature | Supported?\(Yes/No\) | Notes | +| :------------------------ | :------------------- | :---- | +| Full Refresh Sync | Yes | | +| Incremental - Append Sync | Yes | | +| Namespaces | No | | ## Supported Format - JSON - Json value messages. It does not support schema registry now. - - AVRO - deserialize Using confluent API. Please refer (https://docs.confluent.io/platform/current/schema-registry/serdes-develop/serdes-avro.html) - + +JSON - Json value messages. It does not support schema registry now. + +AVRO - deserialize Using confluent API. 
Please refer (https://docs.confluent.io/platform/current/schema-registry/serdes-develop/serdes-avro.html) ## Changelog -| Version | Date | Pull Request | Subject | -| :------ | :-------- | :------------------------------------------------------| :---------------------------------------- | -| 0.2.4 | 2024-02-13 | [35229](https://github.com/airbytehq/airbyte/pull/35229) | Adopt CDK 0.20.4 | -| 0.2.4 | 2024-01-24 | [34453](https://github.com/airbytehq/airbyte/pull/34453) | bump CDK version | -| 0.2.3 | 2022-12-06 | [19587](https://github.com/airbytehq/airbyte/pull/19587) | Fix missing data before consumer is closed | -| 0.2.2 | 2022-11-04 | [18648](https://github.com/airbytehq/airbyte/pull/18648) | Add missing record_count increment for JSON| -| 0.2.1 | 2022-11-04 | This version was the same as 0.2.0 and was committed so using 0.2.2 next to keep versions in order| -| 0.2.0 | 2022-08-22 | [13864](https://github.com/airbytehq/airbyte/pull/13864) | Added AVRO format support and Support for maximum records to process| -| 0.1.7 | 2022-06-17 | [13864](https://github.com/airbytehq/airbyte/pull/13864) | Updated stacktrace format for any trace message errors | -| 0.1.6 | 2022-05-29 | [12903](https://github.com/airbytehq/airbyte/pull/12903) | Add Polling Time to Specification (default 100 ms) | -| 0.1.5 | 2022-04-19 | [12134](https://github.com/airbytehq/airbyte/pull/12134) | Add PLAIN Auth | -| 0.1.4 | 2022-02-15 | [10186](https://github.com/airbytehq/airbyte/pull/10186) | Add SCRAM-SHA-512 Auth | -| 0.1.3 | 2022-02-14 | [10256](https://github.com/airbytehq/airbyte/pull/10256) | Add `-XX:+ExitOnOutOfMemoryError` JVM option | -| 0.1.2 | 2021-12-21 | [8865](https://github.com/airbytehq/airbyte/pull/8865) | Fix SASL config read issue | -| 0.1.1 | 2021-12-06 | [8524](https://github.com/airbytehq/airbyte/pull/8524) | Update connector fields title/description | +| Version | Date | Pull Request | Subject | +| :------ | :--------- | :------------------------------------------------------------------------------------------------- | :------------------------------------------------------------------- | +| 0.2.4 | 2024-02-13 | [35229](https://github.com/airbytehq/airbyte/pull/35229) | Adopt CDK 0.20.4 | +| 0.2.4 | 2024-01-24 | [34453](https://github.com/airbytehq/airbyte/pull/34453) | bump CDK version | +| 0.2.3 | 2022-12-06 | [19587](https://github.com/airbytehq/airbyte/pull/19587) | Fix missing data before consumer is closed | +| 0.2.2 | 2022-11-04 | [18648](https://github.com/airbytehq/airbyte/pull/18648) | Add missing record_count increment for JSON | +| 0.2.1 | 2022-11-04 | This version was the same as 0.2.0 and was committed so using 0.2.2 next to keep versions in order | +| 0.2.0 | 2022-08-22 | [13864](https://github.com/airbytehq/airbyte/pull/13864) | Added AVRO format support and Support for maximum records to process | +| 0.1.7 | 2022-06-17 | [13864](https://github.com/airbytehq/airbyte/pull/13864) | Updated stacktrace format for any trace message errors | +| 0.1.6 | 2022-05-29 | [12903](https://github.com/airbytehq/airbyte/pull/12903) | Add Polling Time to Specification (default 100 ms) | +| 0.1.5 | 2022-04-19 | [12134](https://github.com/airbytehq/airbyte/pull/12134) | Add PLAIN Auth | +| 0.1.4 | 2022-02-15 | [10186](https://github.com/airbytehq/airbyte/pull/10186) | Add SCRAM-SHA-512 Auth | +| 0.1.3 | 2022-02-14 | [10256](https://github.com/airbytehq/airbyte/pull/10256) | Add `-XX:+ExitOnOutOfMemoryError` JVM option | +| 0.1.2 | 2021-12-21 | [8865](https://github.com/airbytehq/airbyte/pull/8865) | 
Fix SASL config read issue | +| 0.1.1 | 2021-12-06 | [8524](https://github.com/airbytehq/airbyte/pull/8524) | Update connector fields title/description | diff --git a/docs/integrations/sources/klarna.md b/docs/integrations/sources/klarna.md index e28fc8b1876..3a89ec32b42 100644 --- a/docs/integrations/sources/klarna.md +++ b/docs/integrations/sources/klarna.md @@ -7,6 +7,7 @@ This page contains the setup guide and reference information for the Klarna sour The [Klarna Settlements API](https://developers.klarna.com/api/#settlements-api) is used to get the payouts and transactions for a Klarna account. ## Setup guide + ### Step 1: Set up Klarna In order to get an `Username (UID)` and `Password` please go to [this](https://docs.klarna.com/) page here you should find **Merchant Portal** button. Using this button you could log in to your production / playground in proper region. After registration / login you may find and create `Username (UID)` and `Password` in settings tab. @@ -20,6 +21,7 @@ Klarna Source Connector does not support OAuth at this time due to limitations o ## Step 2: Set up the Klarna connector in Airbyte ### For Airbyte Open Source: + 1. Navigate to the Airbyte Open Source dashboard 2. Set the name for your source 3. Choose if your account is sandbox @@ -33,17 +35,16 @@ Klarna Source Connector does not support OAuth at this time due to limitations o The Klarna source connector supports the following [sync modes](https://docs.airbyte.com/cloud/core-concepts#connection-sync-modes): | Feature | Supported? | -| :------------------------ |:-----------| +| :------------------------ | :--------- | | Full Refresh Sync | Yes | | Incremental - Append Sync | No | - ## Supported Streams This Source is capable of syncing the following Klarna Settlements Streams: -* [Payouts](https://developers.klarna.com/api/#settlements-api-get-all-payouts) -* [Transactions](https://developers.klarna.com/api/#settlements-api-get-transactions) +- [Payouts](https://developers.klarna.com/api/#settlements-api-get-all-payouts) +- [Transactions](https://developers.klarna.com/api/#settlements-api-get-transactions) ## Performance considerations @@ -56,11 +57,11 @@ Connector will handle an issue with rate limiting as Klarna returns 429 status c ## Changelog -| Version | Date | Pull Request | Subject | -|:--------|:-----------|:---------------------------------------------------------|:-----------------------------------------------| -| 0.2.4 | 2024-04-19 | [37182](https://github.com/airbytehq/airbyte/pull/37182) | Updating to 0.80.0 CDK | -| 0.2.3 | 2024-04-18 | [37182](https://github.com/airbytehq/airbyte/pull/37182) | Manage dependencies with Poetry. 
| -| 0.2.2 | 2024-04-15 | [37182](https://github.com/airbytehq/airbyte/pull/37182) | Base image migration: remove Dockerfile and use the python-connector-base image | -| 0.2.1 | 2024-04-12 | [37182](https://github.com/airbytehq/airbyte/pull/37182) | schema descriptions | -| 0.2.0 | 2023-10-23 | [31003](https://github.com/airbytehq/airbyte/pull/31003) | Migrate to low-code | -| 0.1.0 | 2022-10-24 | [18385](https://github.com/airbytehq/airbyte/pull/18385) | Klarna Settlements Payout and Transactions API | +| Version | Date | Pull Request | Subject | +| :------ | :--------- | :------------------------------------------------------- | :------------------------------------------------------------------------------ | +| 0.2.4 | 2024-04-19 | [37182](https://github.com/airbytehq/airbyte/pull/37182) | Updating to 0.80.0 CDK | +| 0.2.3 | 2024-04-18 | [37182](https://github.com/airbytehq/airbyte/pull/37182) | Manage dependencies with Poetry. | +| 0.2.2 | 2024-04-15 | [37182](https://github.com/airbytehq/airbyte/pull/37182) | Base image migration: remove Dockerfile and use the python-connector-base image | +| 0.2.1 | 2024-04-12 | [37182](https://github.com/airbytehq/airbyte/pull/37182) | schema descriptions | +| 0.2.0 | 2023-10-23 | [31003](https://github.com/airbytehq/airbyte/pull/31003) | Migrate to low-code | +| 0.1.0 | 2022-10-24 | [18385](https://github.com/airbytehq/airbyte/pull/18385) | Klarna Settlements Payout and Transactions API | diff --git a/docs/integrations/sources/klaus-api.md b/docs/integrations/sources/klaus-api.md index bf38b64ea60..413c3f9aed0 100644 --- a/docs/integrations/sources/klaus-api.md +++ b/docs/integrations/sources/klaus-api.md @@ -18,7 +18,7 @@ This Source is capable of syncing the following core Streams: ### Features | Feature | Supported?\(Yes/No\) | Notes | -| :------------------------ |:---------------------| :---- | +| :------------------------ | :------------------- | :---- | | Full Refresh Sync | Yes | | | Incremental - Append Sync | Yes | | | Namespaces | No | | @@ -30,5 +30,5 @@ This Source is capable of syncing the following core Streams: ## Changelog | Version | Date | Pull Request | Subject | -| :------ |:-----------| :------------------------------------------------------- |:-------------------------------| +| :------ | :--------- | :------------------------------------------------------- | :----------------------------- | | 0.1.0 | 2023-05-04 | [25790](https://github.com/airbytehq/airbyte/pull/25790) | Add Klaus API Source Connector | diff --git a/docs/integrations/sources/klaviyo-migrations.md b/docs/integrations/sources/klaviyo-migrations.md index 9dc27b89ad8..4fe81563d09 100644 --- a/docs/integrations/sources/klaviyo-migrations.md +++ b/docs/integrations/sources/klaviyo-migrations.md @@ -7,7 +7,7 @@ data using latest API which has a different schema. Users will need to refresh t streams after upgrading. See the chart below for the API version change. | Stream | Current API version | New API version | -|-------------------|---------------------|-----------------| +| ----------------- | ------------------- | --------------- | | campaigns | v1 | 2023-06-15 | | email_templates | v1 | 2023-10-15 | | events | v1 | 2023-10-15 | @@ -20,4 +20,4 @@ streams after upgrading. See the chart below for the API version change. ## Upgrading to 1.0.0 `event_properties/items/quantity` for `Events` stream is changed from `integer` to `number`. -For a smooth migration, data reset and schema refresh are needed. 
\ No newline at end of file +For a smooth migration, data reset and schema refresh are needed. diff --git a/docs/integrations/sources/klaviyo.md b/docs/integrations/sources/klaviyo.md index 86e22272664..54c197f1602 100644 --- a/docs/integrations/sources/klaviyo.md +++ b/docs/integrations/sources/klaviyo.md @@ -59,7 +59,7 @@ Stream `Lists Detailed` contains field `profile_count` in addition to info from ## Data type map | Integration Type | Airbyte Type | Notes | -|:-----------------|:-------------|:------| +| :--------------- | :----------- | :---- | | `string` | `string` | | | `number` | `number` | | | `array` | `array` | | @@ -68,7 +68,7 @@ Stream `Lists Detailed` contains field `profile_count` in addition to info from ## Changelog | Version | Date | Pull Request | Subject | -|:---------|:-----------|:-----------------------------------------------------------|:------------------------------------------------------------------------------------------------------------------------------| +| :------- | :--------- | :--------------------------------------------------------- | :---------------------------------------------------------------------------------------------------------------------------- | | `2.6.1` | 2024-05-07 | [38010](https://github.com/airbytehq/airbyte/pull/38010) | Add error handler for `5XX` status codes | | `2.6.0` | 2024-04-19 | [37370](https://github.com/airbytehq/airbyte/pull/37370) | Add streams `campaigns_detailed` and `lists_detailed` | | `2.5.0` | 2024-04-15 | [36264](https://github.com/airbytehq/airbyte/pull/36264) | Migrate to low-code | diff --git a/docs/integrations/sources/kustomer-singer.md b/docs/integrations/sources/kustomer-singer.md index 60cf45ce7f9..b7822b50f00 100644 --- a/docs/integrations/sources/kustomer-singer.md +++ b/docs/integrations/sources/kustomer-singer.md @@ -4,7 +4,7 @@ ## Deprecation Notice -The Kustomer source connector is scheduled for deprecation on March 5th, 2024 due to incompatibility with upcoming platform updates as we prepare to launch Airbyte 1.0. This means it will no longer be supported or available for use in Airbyte. +The Kustomer source connector is scheduled for deprecation on March 5th, 2024 due to incompatibility with upcoming platform updates as we prepare to launch Airbyte 1.0. This means it will no longer be supported or available for use in Airbyte. This connector does not support new per-stream features which are vital for ensuring data integrity in Airbyte's synchronization processes. Without these capabilities, we cannot enforce our standards of reliability and correctness for data syncing operations. diff --git a/docs/integrations/sources/kyriba.md b/docs/integrations/sources/kyriba.md index 6587b81cb0a..af600700d78 100644 --- a/docs/integrations/sources/kyriba.md +++ b/docs/integrations/sources/kyriba.md @@ -7,9 +7,11 @@ This page contains the setup guide and reference information for the [Kyriba](ht ## Overview + The Kyriba source retrieves data from [Kyriba](https://kyriba.com/) using their [JSON REST APIs](https://developer.kyriba.com/apiCatalog/). ## Prerequisites + - Kyriba domain - Username - Password @@ -17,6 +19,7 @@ The Kyriba source retrieves data from [Kyriba](https://kyriba.com/) using their ## Setup Guide ### Set up the Kyriba source connector in Airbyte + 1. Log in to your [Airbyte Cloud](https://cloud.airbyte.com/workspaces) account or your Airbyte Open Source account. 2. Navigate to **Sources** in the left sidebar and click **+ New source**. in the top-right corner. 3. 
Choose **Kyriba** from the list of available sources. @@ -36,6 +39,7 @@ The Kyriba source connector supports the following [sync modes](https://docs.air - Incremental ## Supported Streams + - [Accounts](https://developer.kyriba.com/site/global/apis/accounts/index.gsp) - [Bank Balances](https://developer.kyriba.com/site/global/apis/bank-statement-balances/index.gsp) - End of Day and Intraday - [Cash Balances](https://developer.kyriba.com/site/global/apis/cash-balances/index.gsp) - End of Day and Intraday @@ -56,17 +60,17 @@ The Kyriba connector should not run into API limitations under normal usage. [Cr ### Troubleshooting -* Check out common troubleshooting issues for the Stripe source connector on our [Airbyte Forum](https://github.com/airbytehq/airbyte/discussions). +- Check out common troubleshooting issues for the Kyriba source connector on our [Airbyte Forum](https://github.com/airbytehq/airbyte/discussions). ## Changelog -| Version | Date | Pull Request | Subject | -| :------ | :--------- | :------------------------------------------------------- | :--------------------------- | -| 0.1.3 | 2024-04-19 | [37184](https://github.com/airbytehq/airbyte/pull/37184) | Upgrade to CDK 0.80.0 and manage dependencies with Poetry. | -| 0.1.2 | 2024-04-12 | [37184](https://github.com/airbytehq/airbyte/pull/37184) | schema descriptions | -| 0.1.1 | 2024-01-30 | [34545](https://github.com/airbytehq/airbyte/pull/34545) | Updates CDK, Base image migration: remove Dockerfile and use the python-connector-base image | -| 0.1.0 | 2022-07-13 | [12748](https://github.com/airbytehq/airbyte/pull/12748) | The Kyriba Source is created | +| Version | Date | Pull Request | Subject | +| :------ | :--------- | :------------------------------------------------------- | :------------------------------------------------------------------------------------------- | +| 0.1.3 | 2024-04-19 | [37184](https://github.com/airbytehq/airbyte/pull/37184) | Upgrade to CDK 0.80.0 and manage dependencies with Poetry. | +| 0.1.2 | 2024-04-12 | [37184](https://github.com/airbytehq/airbyte/pull/37184) | schema descriptions | +| 0.1.1 | 2024-01-30 | [34545](https://github.com/airbytehq/airbyte/pull/34545) | Updates CDK, Base image migration: remove Dockerfile and use the python-connector-base image | +| 0.1.0 | 2022-07-13 | [12748](https://github.com/airbytehq/airbyte/pull/12748) | The Kyriba Source is created | diff --git a/docs/integrations/sources/kyve.md b/docs/integrations/sources/kyve.md index ca023fcf656..5074da3757d 100644 --- a/docs/integrations/sources/kyve.md +++ b/docs/integrations/sources/kyve.md @@ -9,21 +9,22 @@ For information about how to setup an end to end pipeline with this connector, s ## Source configuration setup -1. In order to create an ELT pipeline with KYVE source you should specify the **`Pool-ID`** of [KYVE storage pool](https://app.kyve.network/#/pools) from which you want to retrieve data. +1. In order to create an ELT pipeline with KYVE source you should specify the **`Pool-ID`** of [KYVE storage pool](https://app.kyve.network/#/pools) from which you want to retrieve data. 2. You can specify a specific **`Bundle-Start-ID`** in case you want to narrow the records that will be retrieved from the pool. You can find the valid bundles of in the KYVE app (e.g. [Cosmos Hub pool](https://app.kyve.network/#/pools/0/bundles)). 3. In order to extract the validated from KYVE, you can specify the endpoint which will be requested **`KYVE-API URL Base`**. 
By default, the official KYVE **`mainnet`** endpoint will be used, providing the data of [these pools](https://app.kyve.network/#/pools). - ***Note:*** - KYVE Network consists of three individual networks: *Korellia* is the `devnet` used for development purposes, *Kaon* is the `testnet` used for testing purposes, and **`mainnet`** is the official network. Although through Kaon and Korellia validated data can be used for development purposes, it is recommended to only trust the data validated on Mainnet. + **_Note:_** + KYVE Network consists of three individual networks: _Korellia_ is the `devnet` used for development purposes, _Kaon_ is the `testnet` used for testing purposes, and **`mainnet`** is the official network. Although through Kaon and Korellia validated data can be used for development purposes, it is recommended to only trust the data validated on Mainnet. ## Multiple pools + You can fetch with one source configuration more than one pool simultaneously. You just need to specify the **`Pool-IDs`** and the **`Bundle-Start-ID`** for the KYVE storage pool you want to archive separated with comma. ## Changelog -| Version | Date | Pull Request | Subject | -| :------ |:---------|:-------------|:----------------------------------------------------| -| 0.2.0 | 2023-11-10 | | Update KYVE source to support to Mainnet and Testnet| -| 0.1.0 | 2023-05-25 | | Initial release of KYVE source connector | +| Version | Date | Pull Request | Subject | +| :------ | :--------- | :----------- | :--------------------------------------------------- | +| 0.2.0 | 2023-11-10 | | Update KYVE source to support to Mainnet and Testnet | +| 0.1.0 | 2023-05-25 | | Initial release of KYVE source connector | diff --git a/docs/integrations/sources/launchdarkly.md b/docs/integrations/sources/launchdarkly.md index ab0fb40dad8..c88927f6e2c 100644 --- a/docs/integrations/sources/launchdarkly.md +++ b/docs/integrations/sources/launchdarkly.md @@ -6,19 +6,19 @@ This source can sync data from the [Launchdarkly API](https://apidocs.launchdark ## This Source Supports the Following Streams -* projects -* environments -* metrics -* members -* audit_log -* flags +- projects +- environments +- metrics +- members +- audit_log +- flags ### Features -| Feature | Supported?\(Yes/No\) | Notes | -| :--- | :--- | :--- | -| Full Refresh Sync | Yes | | -| Incremental Sync | No | | +| Feature | Supported?\(Yes/No\) | Notes | +| :---------------- | :------------------- | :---- | +| Full Refresh Sync | Yes | | +| Incremental Sync | No | | ### Performance considerations @@ -28,10 +28,10 @@ Launchdarkly APIs are under rate limits for the number of API calls allowed per ### Requirements -* Access Token +- Access Token ## Changelog -| Version | Date | Pull Request | Subject | -| :------ | :--------- | :-------------------------------------------------------- | :----------------------------------------- | -| 0.1.0 | 2022-10-30 | [#18660](https://github.com/airbytehq/airbyte/pull/18660) | 🎉 New Source: Launchdarkly API [low-code CDK] | \ No newline at end of file +| Version | Date | Pull Request | Subject | +| :------ | :--------- | :-------------------------------------------------------- | :--------------------------------------------- | +| 0.1.0 | 2022-10-30 | [#18660](https://github.com/airbytehq/airbyte/pull/18660) | 🎉 New Source: Launchdarkly API [low-code CDK] | diff --git a/docs/integrations/sources/lemlist.md b/docs/integrations/sources/lemlist.md index abb0e936c6b..62dac1825e3 100644 --- a/docs/integrations/sources/lemlist.md +++ 
b/docs/integrations/sources/lemlist.md @@ -35,9 +35,9 @@ The Lemlist connector should not run into Lemlist API limitations under normal u ## Changelog -| Version | Date | Pull Request | Subject | -| :------ | :--------- | :----------------------------------------------------- | :-------------- | -| 0.2.1 | 2024-05-15 | [37100](https://github.com/airbytehq/airbyte/pull/37100) | Add new A/B test columns | -| 0.2.0 | 2023-08-14 | [29406](https://github.com/airbytehq/airbyte/pull/29406) | Migrated to LowCode Cdk | -| 0.1.1 | Unknown | Unknown | Bump Version | -| 0.1.0 | 2021-10-14 | [7062](https://github.com/airbytehq/airbyte/pull/7062) | Initial Release | +| Version | Date | Pull Request | Subject | +| :------ | :--------- | :------------------------------------------------------- | :----------------------- | +| 0.2.1 | 2024-05-15 | [37100](https://github.com/airbytehq/airbyte/pull/37100) | Add new A/B test columns | +| 0.2.0 | 2023-08-14 | [29406](https://github.com/airbytehq/airbyte/pull/29406) | Migrated to LowCode Cdk | +| 0.1.1 | Unknown | Unknown | Bump Version | +| 0.1.0 | 2021-10-14 | [7062](https://github.com/airbytehq/airbyte/pull/7062) | Initial Release | diff --git a/docs/integrations/sources/lever-hiring.md b/docs/integrations/sources/lever-hiring.md index d364368a96a..556260ebb2a 100644 --- a/docs/integrations/sources/lever-hiring.md +++ b/docs/integrations/sources/lever-hiring.md @@ -10,22 +10,22 @@ This source can sync data for the [Lever Hiring API](https://hire.lever.co/devel This Source is capable of syncing the following core Streams: -* [Applications](https://hire.lever.co/developer/documentation#list-all-applications) -* [Interviews](https://hire.lever.co/developer/documentation#list-all-interviews) -* [Notes](https://hire.lever.co/developer/documentation#list-all-notes) -* [Offers](https://hire.lever.co/developer/documentation#list-all-offers) -* [Opportunities](https://hire.lever.co/developer/documentation#list-all-opportunities) -* [Referrals](https://hire.lever.co/developer/documentation#list-all-referrals) -* [Users](https://hire.lever.co/developer/documentation#list-all-users) +- [Applications](https://hire.lever.co/developer/documentation#list-all-applications) +- [Interviews](https://hire.lever.co/developer/documentation#list-all-interviews) +- [Notes](https://hire.lever.co/developer/documentation#list-all-notes) +- [Offers](https://hire.lever.co/developer/documentation#list-all-offers) +- [Opportunities](https://hire.lever.co/developer/documentation#list-all-opportunities) +- [Referrals](https://hire.lever.co/developer/documentation#list-all-referrals) +- [Users](https://hire.lever.co/developer/documentation#list-all-users) ### Features -| Feature | Supported?\(Yes/No\) | Notes | -| :--- | :--- | :--- | -| Full Refresh Sync | Yes | | -| Incremental - Append Sync | Yes | | -| SSL connection | Yes | | -| Namespaces | No | | +| Feature | Supported?\(Yes/No\) | Notes | +| :------------------------ | :------------------- | :---- | +| Full Refresh Sync | Yes | | +| Incremental - Append Sync | Yes | | +| SSL connection | Yes | | +| Namespaces | No | | ### Performance considerations @@ -35,18 +35,16 @@ The Lever Hiring connector should not run into Lever Hiring API limitations unde ### Requirements -* Lever Hiring Client Id -* Lever Hiring Client Secret -* Lever Hiring Refresh Token +- Lever Hiring Client Id +- Lever Hiring Client Secret +- Lever Hiring Refresh Token ## Changelog | Version | Date | Pull Request | Subject | 
-|:--------|:-----------|:---------------------------------------------------------|:----------------------------------| +| :------ | :--------- | :------------------------------------------------------- | :-------------------------------- | | 0.2.0 | 2023-05-25 | [26564](https://github.com/airbytehq/airbyte/pull/26564) | Migrate to advancedAuth | | 0.1.3 | 2022-10-14 | [17996](https://github.com/airbytehq/airbyte/pull/17996) | Add Basic Auth management | | 0.1.2 | 2021-12-30 | [9214](https://github.com/airbytehq/airbyte/pull/9214) | Update title and descriptions | | 0.1.1 | 2021-12-16 | [7677](https://github.com/airbytehq/airbyte/pull/7677) | OAuth Automated Authentication | | 0.1.0 | 2021-09-22 | [6141](https://github.com/airbytehq/airbyte/pull/6141) | Add Lever Hiring Source Connector | - - diff --git a/docs/integrations/sources/linkedin-ads-migrations.md b/docs/integrations/sources/linkedin-ads-migrations.md index 20d419340cf..a1caf71a3bb 100644 --- a/docs/integrations/sources/linkedin-ads-migrations.md +++ b/docs/integrations/sources/linkedin-ads-migrations.md @@ -2,7 +2,7 @@ ## Upgrading to 2.0.0 -Version 2.0.0 introduces changes in the primary key selected for all *-analytics streams (including custom ones) from pivotValues[array of strings] to string_of_pivot_values[string] so that it is compatible with more destination types. +Version 2.0.0 introduces changes in the primary key selected for all \*-analytics streams (including custom ones) from pivotValues[array of strings] to string_of_pivot_values[string] so that it is compatible with more destination types. - "ad_campaign_analytics" - "ad_creative_analytics" @@ -21,19 +21,18 @@ Version 2.0.0 introduces changes in the primary key selected for all *-analytics Clearing your data is required for the affected streams in order to continue syncing successfully. To clear your data for the affected streams, follow the steps below: 1. Select **Connections** in the main navbar and select the connection(s) affected by the update. -2. Select the **Schema** tab. - 1. Select **Refresh source schema** to bring in any schema changes. Any detected schema changes will be listed for your review. - 2. Select **OK** to approve changes. -3. Select **Save changes** at the bottom of the page. - 1. Ensure the **Clear affected streams** option is checked to ensure your streams continue syncing successfully with the new schema. -4. Select **Save connection**. +2. Select the **Schema** tab. + 1. Select **Refresh source schema** to bring in any schema changes. Any detected schema changes will be listed for your review. + 2. Select **OK** to approve changes. +3. Select **Save changes** at the bottom of the page. + 1. Ensure the **Clear affected streams** option is checked to ensure your streams continue syncing successfully with the new schema. +4. Select **Save connection**. This will clear the data in your destination for the subset of streams with schema changes. After the clear succeeds, trigger a sync by clicking **Sync Now**. For more information on clearing your data in Airbyte, see [this page](https://docs.airbyte.com/operator-guides/reset). - ## Upgrading to 1.0.0 -Version 1.0.0 introduces changes in the primary key selected for all *-analytics streams (including custom ones). +Version 1.0.0 introduces changes in the primary key selected for all \*-analytics streams (including custom ones). 
- "ad_campaign_analytics" - "ad_creative_analytics" @@ -52,11 +51,11 @@ Version 1.0.0 introduces changes in the primary key selected for all *-analytics Clearing your data is required for the affected streams in order to continue syncing successfully. To clear your data for the affected streams, follow the steps below: 1. Select **Connections** in the main navbar and select the connection(s) affected by the update. -2. Select the **Schema** tab. - 1. Select **Refresh source schema** to bring in any schema changes. Any detected schema changes will be listed for your review. - 2. Select **OK** to approve changes. -3. Select **Save changes** at the bottom of the page. - 1. Ensure the **Clear affected streams** option is checked to ensure your streams continue syncing successfully with the new schema. -4. Select **Save connection**. +2. Select the **Schema** tab. + 1. Select **Refresh source schema** to bring in any schema changes. Any detected schema changes will be listed for your review. + 2. Select **OK** to approve changes. +3. Select **Save changes** at the bottom of the page. + 1. Ensure the **Clear affected streams** option is checked to ensure your streams continue syncing successfully with the new schema. +4. Select **Save connection**. -This will clear the data in your destination for the subset of streams with schema changes. After the clear succeeds, trigger a sync by clicking **Sync Now**. For more information on clearing your data in Airbyte, see [this page](https://docs.airbyte.com/operator-guides/reset). \ No newline at end of file +This will clear the data in your destination for the subset of streams with schema changes. After the clear succeeds, trigger a sync by clicking **Sync Now**. For more information on clearing your data in Airbyte, see [this page](https://docs.airbyte.com/operator-guides/reset). diff --git a/docs/integrations/sources/linkedin-ads.md b/docs/integrations/sources/linkedin-ads.md index 713b1007755..969a72e88a9 100644 --- a/docs/integrations/sources/linkedin-ads.md +++ b/docs/integrations/sources/linkedin-ads.md @@ -34,16 +34,18 @@ You can follow the steps laid out below to create the application and obtain the 1. [Log in to LinkedIn](https://developer.linkedin.com/) with a developer account. 2. Navigate to the [Apps page](https://www.linkedin.com/developers/apps) and click the **Create App** icon. Fill in the fields below: - 1. For **App Name**, enter a name. - 2. For **LinkedIn Page**, enter your company's name or LinkedIn Company Page URL. - 3. For **Privacy policy URL**, enter the link to your company's privacy policy. - 4. For **App logo**, upload your company's logo. - 5. Check **I have read and agree to these terms**, then click **Create App**. LinkedIn redirects you to a page showing the details of your application. + + 1. For **App Name**, enter a name. + 2. For **LinkedIn Page**, enter your company's name or LinkedIn Company Page URL. + 3. For **Privacy policy URL**, enter the link to your company's privacy policy. + 4. For **App logo**, upload your company's logo. + 5. Check **I have read and agree to these terms**, then click **Create App**. LinkedIn redirects you to a page showing the details of your application. 3. You can verify your app using the following steps: - 1. Click the **Settings** tab. On the **App Settings** section, click **Verify** under **Company**. A popup window will be displayed. To generate the verification URL, click on **Generate URL**, then copy and send the URL to the Page Admin (this may be you). 
Click on **I'm done**. If you are the administrator of your Page, simply run the URL in a new tab (if not, an administrator will have to do the next step). Click on **Verify**. - 2. To display the Products page, click the **Product** tab. For **Marketing Developer Platform**, click **Request access**. A popup window will be displayed. Review and Select **I have read and agree to these terms**. Finally, click **Request access**. + 1. Click the **Settings** tab. On the **App Settings** section, click **Verify** under **Company**. A popup window will be displayed. To generate the verification URL, click on **Generate URL**, then copy and send the URL to the Page Admin (this may be you). Click on **I'm done**. If you are the administrator of your Page, simply run the URL in a new tab (if not, an administrator will have to do the next step). Click on **Verify**. + + 2. To display the Products page, click the **Product** tab. For **Marketing Developer Platform**, click **Request access**. A popup window will be displayed. Review and Select **I have read and agree to these terms**. Finally, click **Request access**. #### Authorize your app @@ -52,11 +54,11 @@ You can follow the steps laid out below to create the application and obtain the 2. Click the **OAuth 2.0 tools** link in the **Understanding authentication and OAuth 2.0** section on the right side of the page. 3. Click **Create token**. 4. Select the scopes you want to use for your app. We recommend using the following scopes: - - `r_emailaddress` - - `r_liteprofile` - - `r_ads` - - `r_ads_reporting` - - `r_organization_social` + - `r_emailaddress` + - `r_liteprofile` + - `r_ads` + - `r_ads_reporting` + - `r_organization_social` 5. Click **Request access token**. You will be redirected to an authorization page. Use your LinkedIn credentials to log in and authorize your app and obtain your **Access Token** and **Refresh Token**. :::caution @@ -78,18 +80,20 @@ If either of your tokens expire, you can generate new ones by returning to Linke 5. To authenticate: + #### For Airbyte Cloud - Select **OAuth2.0** from the Authentication dropdown, then click **Authenticate your LinkedIn Ads account**. Sign in to your account and click **Allow**. + #### For Airbyte Open Source - Select an option from the Authentication dropdown: 1. **OAuth2.0:** Enter your **Client ID**, **Client Secret** and **Refresh Token**. Please note that the refresh token expires after 12 months. 2. **Access Token:** Enter your **Access Token**. Please note that the access token expires after 60 days. - + 6. For **Start Date**, use the provided datepicker or enter a date programmatically in the format YYYY-MM-DD. Any data before this date will not be replicated. 7. (Optional) For **Account IDs**, you may optionally provide a space separated list of Account IDs to pull data from. If you do not specify any account IDs, the connector will replicate data from all accounts accessible using your credentials. @@ -157,20 +161,20 @@ After 5 unsuccessful attempts - the connector will stop the sync operation. 
In s ## Data type map -| Integration Type | Airbyte Type | Notes | -|:------------------|:--------------|:----------------------------| -| `number` | `number` | float number | -| `integer` | `integer` | whole number | -| `date` | `string` | FORMAT YYYY-MM-DD | -| `datetime` | `string` | FORMAT YYYY-MM-DDThh:mm: ss | -| `array` | `array` | | -| `boolean` | `boolean` | True/False | -| `string` | `string` | | +| Integration Type | Airbyte Type | Notes | +| :--------------- | :----------- | :-------------------------- | +| `number` | `number` | float number | +| `integer` | `integer` | whole number | +| `date` | `string` | FORMAT YYYY-MM-DD | +| `datetime` | `string` | FORMAT YYYY-MM-DDThh:mm: ss | +| `array` | `array` | | +| `boolean` | `boolean` | True/False | +| `string` | `string` | | ## Changelog | Version | Date | Pull Request | Subject | -|:--------|:-----------|:---------------------------------------------------------|:----------------------------------------------------------------------------------------------------------------| +| :------ | :--------- | :------------------------------------------------------- | :-------------------------------------------------------------------------------------------------------------- | | 2.1.2 | 2024-05-07 | [36648](https://github.com/airbytehq/airbyte/pull/36648) | Schema descriptions | | 2.1.1 | 2024-05-07 | [38013](https://github.com/airbytehq/airbyte/pull/38013) | Fix an issue where the `Accounts` stream did not correctly handle provided account IDs | | 2.1.0 | 2024-04-30 | [37573](https://github.com/airbytehq/airbyte/pull/37573) | Update API version to `202404`; add cursor-based pagination | diff --git a/docs/integrations/sources/linkedin-pages.md b/docs/integrations/sources/linkedin-pages.md index c333dab7131..61365d460d6 100644 --- a/docs/integrations/sources/linkedin-pages.md +++ b/docs/integrations/sources/linkedin-pages.md @@ -10,13 +10,13 @@ Airbyte uses [LinkedIn Marketing Developer Platform - API](https://docs.microsof This Source is capable of syncing the following data as streams: -* [Organization Lookup](https://docs.microsoft.com/en-us/linkedin/marketing/integrations/community-management/organizations/organization-lookup-api?tabs=http#retrieve-organizations) -* [Follower Statistics](https://docs.microsoft.com/en-us/linkedin/marketing/integrations/community-management/organizations/follower-statistics?tabs=http#retrieve-lifetime-follower-statistics) -* [Page Statistics](https://docs.microsoft.com/en-us/linkedin/marketing/integrations/community-management/organizations/page-statistics?tabs=http#retrieve-lifetime-organization-page-statistics) -* [Share Statistics](https://docs.microsoft.com/en-us/linkedin/marketing/integrations/community-management/organizations/share-statistics?tabs=http#retrieve-lifetime-share-statistics) -* [Shares (Latest 50)](https://docs.microsoft.com/en-us/linkedin/marketing/integrations/community-management/shares/share-api?tabs=http#find-shares-by-owner) -* [Total Follower Count](https://docs.microsoft.com/en-us/linkedin/marketing/integrations/community-management/organizations/organization-lookup-api?tabs=http#retrieve-organization-follower-count) -* [UGC Posts](https://docs.microsoft.com/en-us/linkedin/marketing/integrations/community-management/shares/ugc-post-api?tabs=http#find-ugc-posts-by-authors) +- [Organization Lookup](https://docs.microsoft.com/en-us/linkedin/marketing/integrations/community-management/organizations/organization-lookup-api?tabs=http#retrieve-organizations) +- [Follower 
Statistics](https://docs.microsoft.com/en-us/linkedin/marketing/integrations/community-management/organizations/follower-statistics?tabs=http#retrieve-lifetime-follower-statistics) +- [Page Statistics](https://docs.microsoft.com/en-us/linkedin/marketing/integrations/community-management/organizations/page-statistics?tabs=http#retrieve-lifetime-organization-page-statistics) +- [Share Statistics](https://docs.microsoft.com/en-us/linkedin/marketing/integrations/community-management/organizations/share-statistics?tabs=http#retrieve-lifetime-share-statistics) +- [Shares (Latest 50)](https://docs.microsoft.com/en-us/linkedin/marketing/integrations/community-management/shares/share-api?tabs=http#find-shares-by-owner) +- [Total Follower Count](https://docs.microsoft.com/en-us/linkedin/marketing/integrations/community-management/organizations/organization-lookup-api?tabs=http#retrieve-organization-follower-count) +- [UGC Posts](https://docs.microsoft.com/en-us/linkedin/marketing/integrations/community-management/shares/ugc-post-api?tabs=http#find-ugc-posts-by-authors) ### NOTE: @@ -24,13 +24,13 @@ All streams only sync all-time statistics at this time. A `start_date` field wil ### Data type mapping -| Integration Type | Airbyte Type | Notes | -| :--------------- | :----------- | :------------------------- | -| `number` | `number` | float number | -| `integer` | `integer` | whole number | -| `array` | `array` | | -| `boolean` | `boolean` | True/False | -| `string` | `string` | | +| Integration Type | Airbyte Type | Notes | +| :--------------- | :----------- | :----------- | +| `number` | `number` | float number | +| `integer` | `integer` | whole number | +| `array` | `array` | | +| `boolean` | `boolean` | True/False | +| `string` | `string` | | ### Features @@ -56,60 +56,68 @@ This is expected when the connector hits the 429 - Rate Limit Exceeded HTTP Erro "Max try rate limit exceded..." ``` -After 5 unsuccessful attempts - the connector will stop the sync operation. In such cases check your Rate Limits [on this page](https://www.linkedin.com/developers/apps) > Choose your app > Analytics. +After 5 unsuccessful attempts - the connector will stop the sync operation. In such cases check your Rate Limits [on this page](https://www.linkedin.com/developers/apps) > Choose your app > Analytics. ## Getting started + The API user account should be assigned the following permissions for the API endpoints: Endpoints such as: `Organization Lookup API`, `Follower Statistics`, `Page Statistics`, `Share Statistics`, `Shares`, `UGC Posts` require these permissions: -* `r_organization_social`: Retrieve your organization's posts, comments, reactions, and other engagement data. -* `rw_organization_admin`: Manage your organization's pages and retrieve reporting data. + +- `r_organization_social`: Retrieve your organization's posts, comments, reactions, and other engagement data. +- `rw_organization_admin`: Manage your organization's pages and retrieve reporting data. The API user account should be assigned the `ADMIN` role. ### Authentication + There are 2 authentication methods: Access Token or OAuth2.0. OAuth2.0 is recommended since it will continue streaming data for 12 months instead of 2 months with an access token. ##### Create the `Refresh_Token` or `Access_Token`: + The source LinkedIn Pages can use either the `client_id`, `client_secret` and `refresh_token` for OAuth2.0 authentication or simply use an `access_token` in the UI connector's settings to make API requests. 
Access tokens expire after `2 months from creation date (60 days)` and require a user to manually authenticate again. Refresh tokens expire after `12 months from creation date (365 days)`. If you receive a `401 invalid token response`, the error logs will state that your token has expired and to re-authenticate your connection to generate a new token. This is described more [here](https://docs.microsoft.com/en-us/linkedin/shared/authentication/authorization-code-flow?context=linkedin/context). 1. **Log in to LinkedIn as the API user** 2. **Create an App** [here](https://www.linkedin.com/developers/apps): - * `App Name`: airbyte-source - * `Company`: search and find your LinkedIn Company Page - * `Privacy policy URL`: link to company privacy policy - * `Business email`: developer/admin email address - * `App logo`: Airbyte's \(or Company's\) logo - * Review/agree to legal terms and create app - * Review the **Auth** tab: - * **Save your `client_id` and `client_secret`** \(for later steps\) - * Oauth 2.0 settings: Provide a `redirect_uri` \(for later steps\): `https://airbyte.com` + + - `App Name`: airbyte-source + - `Company`: search and find your LinkedIn Company Page + - `Privacy policy URL`: link to company privacy policy + - `Business email`: developer/admin email address + - `App logo`: Airbyte's \(or Company's\) logo + - Review/agree to legal terms and create app + - Review the **Auth** tab: + - **Save your `client_id` and `client_secret`** \(for later steps\) + - Oauth 2.0 settings: Provide a `redirect_uri` \(for later steps\): `https://airbyte.com` 3. **Verify App**: - * In the **Settings** tab of your app dashboard, you'll see a **Verify** button. Click that button! - * Generate and provide the verify URL to your Company's LinkedIn Admin to verify the app. + + - In the **Settings** tab of your app dashboard, you'll see a **Verify** button. Click that button! + - Generate and provide the verify URL to your Company's LinkedIn Admin to verify the app. 4. **Request API Access**: - * Navigate to the **Products** tab - * Select the [Marketing Developer Platform](https://docs.microsoft.com/en-us/linkedin/marketing/) and agree to the legal terms - * After a few minutes, refresh the page to see a link to `View access form` in place of the **Select** button - * Fill out the access form and access should be granted **within 72 hours** (usually quicker) + + - Navigate to the **Products** tab + - Select the [Marketing Developer Platform](https://docs.microsoft.com/en-us/linkedin/marketing/) and agree to the legal terms + - After a few minutes, refresh the page to see a link to `View access form` in place of the **Select** button + - Fill out the access form and access should be granted **within 72 hours** (usually quicker) 5. **Create A Refresh Token** (or Access Token): - * Navigate to the LinkedIn Developers' [OAuth Token Tools](https://www.linkedin.com/developers/tools/oauth) and click **Create token** - * Select your newly created app and check the boxes for the following scopes: - * `r_organization_social` - * `rw_organization_admin` - * Click **Request access token** and once generated, **save your Refresh token** + + - Navigate to the LinkedIn Developers' [OAuth Token Tools](https://www.linkedin.com/developers/tools/oauth) and click **Create token** + - Select your newly created app and check the boxes for the following scopes: + - `r_organization_social` + - `rw_organization_admin` + - Click **Request access token** and once generated, **save your Refresh token** 6. 
**Use the `client_id`, `client_secret` and `refresh_token`** from Steps 2 and 5 to autorize the LinkedIn Pages connector within the Airbyte UI. - * As mentioned earlier, you can also simply use the Access token auth method for 60-day access. + - As mentioned earlier, you can also simply use the Access token auth method for 60-day access. ## Changelog | Version | Date | Pull Request | Subject | -|:--------|:-----------|:---------------------------------------------------------|:-----------------------------------------------------| +| :------ | :--------- | :------------------------------------------------------- | :--------------------------------------------------- | | 1.0.2 | 2023-05-30 | [24352](https://github.com/airbytehq/airbyte/pull/24352) | Remove duplicate streams | | 1.0.1 | 2023-03-22 | [24352](https://github.com/airbytehq/airbyte/pull/24352) | Remove `authSpecification` as it's not yet supported | | 1.0.0 | 2023-03-16 | [18967](https://github.com/airbytehq/airbyte/pull/18967) | Fixed failing connection checks | diff --git a/docs/integrations/sources/linnworks.md b/docs/integrations/sources/linnworks.md index b6b53dd2a95..48f6a51656e 100644 --- a/docs/integrations/sources/linnworks.md +++ b/docs/integrations/sources/linnworks.md @@ -32,25 +32,25 @@ The value of your API Token can be viewed at any time from the main dashboard of The Linnworks source connector supports the following streams and [sync modes](https://docs.airbyte.com/cloud/core-concepts/#connection-sync-mode): -| Stream Name | Full Refresh | Incremental | -| :--------------------------------------------------------------------------------------------- | :----------- | :----------- | -| [ProcessedOrders](https://apps.linnworks.net/Api/Method/ProcessedOrders-SearchProcessedOrders) | ✓ | ✓ | -| [ProcessedOrderDetails](https://apps.linnworks.net/Api/Method/Orders-GetOrdersById) | ✓ | ✓ | -| [StockItems](https://apps.linnworks.net//Api/Method/Stock-GetStockItemsFull) | ✓ | X | -| [StockLocations](https://apps.linnworks.net/Api/Method/Inventory-GetStockLocations) | ✓ | X | -| [StockLocationDetails](https://apps.linnworks.net/Api/Method/Locations-GetLocation) | ✓ | X | +| Stream Name | Full Refresh | Incremental | +| :--------------------------------------------------------------------------------------------- | :----------- | :---------- | +| [ProcessedOrders](https://apps.linnworks.net/Api/Method/ProcessedOrders-SearchProcessedOrders) | ✓ | ✓ | +| [ProcessedOrderDetails](https://apps.linnworks.net/Api/Method/Orders-GetOrdersById) | ✓ | ✓ | +| [StockItems](https://apps.linnworks.net//Api/Method/Stock-GetStockItemsFull) | ✓ | X | +| [StockLocations](https://apps.linnworks.net/Api/Method/Inventory-GetStockLocations) | ✓ | X | +| [StockLocationDetails](https://apps.linnworks.net/Api/Method/Locations-GetLocation) | ✓ | X | ### Data type mapping -| Integration Type | Airbyte Type | Example | -| :--------------- | :----------- | :------------------------- | -| `number` | `number` | 50.23 | -| `integer` | `integer` | 50 | -| `date` | `string` | 2020-12-31 | -| `datetime` | `string` | 2020-12-31T07:30:00 | -| `array` | `array` | ["Item 1", "Item 2"] | -| `boolean` | `boolean` | True/False | -| `string` | `string` | Item 3 | +| Integration Type | Airbyte Type | Example | +| :--------------- | :----------- | :------------------- | +| `number` | `number` | 50.23 | +| `integer` | `integer` | 50 | +| `date` | `string` | 2020-12-31 | +| `datetime` | `string` | 2020-12-31T07:30:00 | +| `array` | `array` | ["Item 1", "Item 2"] | +| 
`boolean` | `boolean` | True/False | +| `string` | `string` | Item 3 | ## Limitations & Troubleshooting @@ -71,8 +71,8 @@ Rate limits for the Linnworks API vary across endpoints. Use the [links in the * | Version | Date | Pull Request | Subject | | :------ | :--------- | :------------------------------------------------------- | :-------------------------------------------------------------------------- | -| 0.1.9 | 2024-04-19 | [37188](https://github.com/airbytehq/airbyte/pull/37188) | Updating to 0.80.0 CDK | -| 0.1.8 | 2024-04-12 | [37188](https://github.com/airbytehq/airbyte/pull/37188) | schema descriptions | +| 0.1.9 | 2024-04-19 | [37188](https://github.com/airbytehq/airbyte/pull/37188) | Updating to 0.80.0 CDK | +| 0.1.8 | 2024-04-12 | [37188](https://github.com/airbytehq/airbyte/pull/37188) | schema descriptions | | 0.1.7 | 2024-02-22 | [35557](https://github.com/airbytehq/airbyte/pull/35557) | Manage dependencies with Poetry | | 0.1.6 | 2024-01-31 | [34717](https://github.com/airbytehq/airbyte/pull/34717) | Update CDK and migrate to base image | | 0.1.5 | 2022-11-20 | [19865](https://github.com/airbytehq/airbyte/pull/19865) | Bump Version | diff --git a/docs/integrations/sources/lokalise.md b/docs/integrations/sources/lokalise.md index aa4ab5ef7f0..90ba8ca622a 100644 --- a/docs/integrations/sources/lokalise.md +++ b/docs/integrations/sources/lokalise.md @@ -1,4 +1,4 @@ -# Lokalise +# Lokalise This page contains the setup guide and reference information for the [Lokalise](https://lokalise.com/) source connector. @@ -24,7 +24,7 @@ You can find your Project ID and find or create an API key within [Lokalise](htt ### For Airbyte OSS: 1. Navigate to the Airbyte Open Source dashboard. -2. Set the name for your source. +2. Set the name for your source. 3. Enter your `project_id` - Lokalise Project ID. 4. Enter your `api_key` - Lokalise API key with read permissions. 5. Click **Set up source**. 
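For reference, the `project_id` and `api_key` described in the Lokalise setup steps above map onto an ordinary Lokalise REST call. The sketch below is illustrative only and not part of the connector or this patch; it assumes the public Lokalise v2 endpoint and the `X-Api-Token` header, and all values are placeholders.

```python
# Illustrative sketch of the request behind the "Keys" stream
# (assumed Lokalise v2 endpoint and X-Api-Token header; placeholder credentials).
import requests

PROJECT_ID = "123456.abcdef"     # placeholder Lokalise Project ID
API_KEY = "read-only-api-token"  # placeholder API key with read permissions

response = requests.get(
    f"https://api.lokalise.com/api2/projects/{PROJECT_ID}/keys",
    headers={"X-Api-Token": API_KEY},
    params={"limit": 100, "page": 1},
    timeout=30,
)
response.raise_for_status()

# Each entry corresponds to one record of the Keys stream.
for key in response.json().get("keys", []):
    print(key["key_id"])
```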
@@ -42,11 +42,11 @@ The Lokalise source connector supports the following [sync modes](https://docs.a ## Supported Streams -* [Keys](https://developers.lokalise.com/reference/list-all-keys) -* [Languages](https://developers.lokalise.com/reference/list-project-languages) -* [Comments](https://developers.lokalise.com/reference/list-project-comments) -* [Contributors](https://developers.lokalise.com/reference/list-all-contributors) -* [Translations](https://developers.lokalise.com/reference/list-all-translations) +- [Keys](https://developers.lokalise.com/reference/list-all-keys) +- [Languages](https://developers.lokalise.com/reference/list-project-languages) +- [Comments](https://developers.lokalise.com/reference/list-project-comments) +- [Contributors](https://developers.lokalise.com/reference/list-all-contributors) +- [Translations](https://developers.lokalise.com/reference/list-all-translations) ## Data type map @@ -59,6 +59,6 @@ The Lokalise source connector supports the following [sync modes](https://docs.a ## Changelog -| Version | Date | Pull Request | Subject | -|:--------|:-----------|:---------------------------------------------------------|:--------------------------------------------------| -| 0.1.0 | 2022-10-27 | [18522](https://github.com/airbytehq/airbyte/pull/18522) | New Source: Lokalise | +| Version | Date | Pull Request | Subject | +| :------ | :--------- | :------------------------------------------------------- | :------------------- | +| 0.1.0 | 2022-10-27 | [18522](https://github.com/airbytehq/airbyte/pull/18522) | New Source: Lokalise | diff --git a/docs/integrations/sources/looker.md b/docs/integrations/sources/looker.md index d898fb6c35c..7e8cd8f273a 100644 --- a/docs/integrations/sources/looker.md +++ b/docs/integrations/sources/looker.md @@ -8,55 +8,55 @@ The Looker source supports Full Refresh syncs. 
That is, every time a sync is run Several output streams are available from this source: -* [Color Collections](https://docs.looker.com/reference/api-and-integration/api-reference/v3.1/color-collection#get_all_color_collections) -* [Connections](https://docs.looker.com/reference/api-and-integration/api-reference/v3.1/connection#get_all_connections) -* [Content Metadata](https://docs.looker.com/reference/api-and-integration/api-reference/v3.1/content#get_all_content_metadatas) -* [Content Metadata Access](https://docs.looker.com/reference/api-and-integration/api-reference/v3.1/content#get_all_content_metadata_accesses) -* [Dashboards](https://docs.looker.com/reference/api-and-integration/api-reference/v3.1/dashboard#get_all_dashboards) - * [Dashboard Elements](https://docs.looker.com/reference/api-and-integration/api-reference/v3.1/dashboard#get_all_dashboardelements) - * [Dashboard Filters](https://docs.looker.com/reference/api-and-integration/api-reference/v3.1/dashboard#get_all_dashboard_filters) - * [Dashboard Layouts](https://docs.looker.com/reference/api-and-integration/api-reference/v3.1/dashboard#get_all_dashboardlayouts) -* [Datagroups](https://docs.looker.com/reference/api-and-integration/api-reference/v3.1/datagroup#get_all_datagroups) -* [Folders](https://docs.looker.com/reference/api-and-integration/api-reference/v3.1/folder#get_all_folders) -* [Groups](https://docs.looker.com/reference/api-and-integration/api-reference/v3.1/group#get_all_groups) -* [Homepages](https://docs.looker.com/reference/api-and-integration/api-reference/v3.1/homepage#get_all_homepages) -* [Integration Hubs](https://docs.looker.com/reference/api-and-integration/api-reference/v3.1/integration#get_all_integration_hubs) -* [Integrations](https://docs.looker.com/reference/api-and-integration/api-reference/v3.1/integration#get_all_integrations) -* [Lookml Dashboards](https://docs.looker.com/reference/api-and-integration/api-reference/v3.1/dashboard#get_all_dashboards) -* [Lookml Models](https://docs.looker.com/reference/api-and-integration/api-reference/v3.1/lookml-model#get_all_lookml_models) -* [Looks](https://docs.looker.com/reference/api-and-integration/api-reference/v3.1/look#get_all_looks) - * [Run Look](https://docs.looker.com/reference/api-and-integration/api-reference/v3.1/look#run_look) -* [Projects](https://docs.looker.com/reference/api-and-integration/api-reference/v3.1/project#get_all_projects) - * [Project Files](https://docs.looker.com/reference/api-and-integration/api-reference/v3.1/project#get_all_project_files) - * [Git Branches](https://docs.looker.com/reference/api-and-integration/api-reference/v3.1/project#get_all_git_branches) -* [Query History](https://docs.looker.com/reference/api-and-integration/api-reference/v3.1/query#run_query) -* [Roles](https://docs.looker.com/reference/api-and-integration/api-reference/v3.1/role#get_all_roles) - * [Model Sets](https://docs.looker.com/reference/api-and-integration/api-reference/v3.1/role#get_all_model_sets) - * [Permission Sets](https://docs.looker.com/reference/api-and-integration/api-reference/v3.1/role#get_all_permission_sets) - * [Permissions](https://docs.looker.com/reference/api-and-integration/api-reference/v3.1/role#get_all_permissions) - * [Role Groups](https://docs.looker.com/reference/api-and-integration/api-reference/v3.1/role#get_role_groups) -* [Scheduled Plans](https://docs.looker.com/reference/api-and-integration/api-reference/v3.1/scheduled-plan#get_all_scheduled_plans) -* 
[Spaces](https://docs.looker.com/reference/api-and-integration/api-reference/v3.1/space#get_all_spaces) -* [User Attributes](https://docs.looker.com/reference/api-and-integration/api-reference/v3.1/user-attribute#get_all_user_attributes) - * [User Attribute Group Value](https://docs.looker.com/reference/api-and-integration/api-reference/v3.1/user-attribute#get_user_attribute_group_values) -* [User Login Lockouts](https://docs.looker.com/reference/api-and-integration/api-reference/v3.1/auth#get_all_user_login_lockouts) -* [Users](https://docs.looker.com/reference/api-and-integration/api-reference/v3.1/user#get_all_users) - * [User Attribute Values](https://docs.looker.com/reference/api-and-integration/api-reference/v3.1/user#get_user_attribute_values) - * [User Sessions](https://docs.looker.com/reference/api-and-integration/api-reference/v3.1/user#get_all_web_login_sessions) -* [Versions](https://docs.looker.com/reference/api-and-integration/api-reference/v3.1/config#get_apiversion) -* [Workspaces](https://docs.looker.com/reference/api-and-integration/api-reference/v3.1/workspace) +- [Color Collections](https://docs.looker.com/reference/api-and-integration/api-reference/v3.1/color-collection#get_all_color_collections) +- [Connections](https://docs.looker.com/reference/api-and-integration/api-reference/v3.1/connection#get_all_connections) +- [Content Metadata](https://docs.looker.com/reference/api-and-integration/api-reference/v3.1/content#get_all_content_metadatas) +- [Content Metadata Access](https://docs.looker.com/reference/api-and-integration/api-reference/v3.1/content#get_all_content_metadata_accesses) +- [Dashboards](https://docs.looker.com/reference/api-and-integration/api-reference/v3.1/dashboard#get_all_dashboards) + - [Dashboard Elements](https://docs.looker.com/reference/api-and-integration/api-reference/v3.1/dashboard#get_all_dashboardelements) + - [Dashboard Filters](https://docs.looker.com/reference/api-and-integration/api-reference/v3.1/dashboard#get_all_dashboard_filters) + - [Dashboard Layouts](https://docs.looker.com/reference/api-and-integration/api-reference/v3.1/dashboard#get_all_dashboardlayouts) +- [Datagroups](https://docs.looker.com/reference/api-and-integration/api-reference/v3.1/datagroup#get_all_datagroups) +- [Folders](https://docs.looker.com/reference/api-and-integration/api-reference/v3.1/folder#get_all_folders) +- [Groups](https://docs.looker.com/reference/api-and-integration/api-reference/v3.1/group#get_all_groups) +- [Homepages](https://docs.looker.com/reference/api-and-integration/api-reference/v3.1/homepage#get_all_homepages) +- [Integration Hubs](https://docs.looker.com/reference/api-and-integration/api-reference/v3.1/integration#get_all_integration_hubs) +- [Integrations](https://docs.looker.com/reference/api-and-integration/api-reference/v3.1/integration#get_all_integrations) +- [Lookml Dashboards](https://docs.looker.com/reference/api-and-integration/api-reference/v3.1/dashboard#get_all_dashboards) +- [Lookml Models](https://docs.looker.com/reference/api-and-integration/api-reference/v3.1/lookml-model#get_all_lookml_models) +- [Looks](https://docs.looker.com/reference/api-and-integration/api-reference/v3.1/look#get_all_looks) + - [Run Look](https://docs.looker.com/reference/api-and-integration/api-reference/v3.1/look#run_look) +- [Projects](https://docs.looker.com/reference/api-and-integration/api-reference/v3.1/project#get_all_projects) + - [Project 
Files](https://docs.looker.com/reference/api-and-integration/api-reference/v3.1/project#get_all_project_files) + - [Git Branches](https://docs.looker.com/reference/api-and-integration/api-reference/v3.1/project#get_all_git_branches) +- [Query History](https://docs.looker.com/reference/api-and-integration/api-reference/v3.1/query#run_query) +- [Roles](https://docs.looker.com/reference/api-and-integration/api-reference/v3.1/role#get_all_roles) + - [Model Sets](https://docs.looker.com/reference/api-and-integration/api-reference/v3.1/role#get_all_model_sets) + - [Permission Sets](https://docs.looker.com/reference/api-and-integration/api-reference/v3.1/role#get_all_permission_sets) + - [Permissions](https://docs.looker.com/reference/api-and-integration/api-reference/v3.1/role#get_all_permissions) + - [Role Groups](https://docs.looker.com/reference/api-and-integration/api-reference/v3.1/role#get_role_groups) +- [Scheduled Plans](https://docs.looker.com/reference/api-and-integration/api-reference/v3.1/scheduled-plan#get_all_scheduled_plans) +- [Spaces](https://docs.looker.com/reference/api-and-integration/api-reference/v3.1/space#get_all_spaces) +- [User Attributes](https://docs.looker.com/reference/api-and-integration/api-reference/v3.1/user-attribute#get_all_user_attributes) + - [User Attribute Group Value](https://docs.looker.com/reference/api-and-integration/api-reference/v3.1/user-attribute#get_user_attribute_group_values) +- [User Login Lockouts](https://docs.looker.com/reference/api-and-integration/api-reference/v3.1/auth#get_all_user_login_lockouts) +- [Users](https://docs.looker.com/reference/api-and-integration/api-reference/v3.1/user#get_all_users) + - [User Attribute Values](https://docs.looker.com/reference/api-and-integration/api-reference/v3.1/user#get_user_attribute_values) + - [User Sessions](https://docs.looker.com/reference/api-and-integration/api-reference/v3.1/user#get_all_web_login_sessions) +- [Versions](https://docs.looker.com/reference/api-and-integration/api-reference/v3.1/config#get_apiversion) +- [Workspaces](https://docs.looker.com/reference/api-and-integration/api-reference/v3.1/workspace) If there are more endpoints you'd like Airbyte to support, please [create an issue.](https://github.com/airbytehq/airbyte/issues/new/choose) ### Features -| Feature | Supported? | -| :--- | :--- | -| Full Refresh Sync | Yes | -| Incremental Sync | Coming soon | +| Feature | Supported? | +| :---------------------------- | :---------- | +| Full Refresh Sync | Yes | +| Incremental Sync | Coming soon | | Replicate Incremental Deletes | Coming soon | -| SSL connection | Yes | -| Namespaces | No | +| SSL connection | Yes | +| Namespaces | No | ### Performance considerations @@ -66,9 +66,9 @@ The Looker connector should not run into Looker API limitations under normal usa ### Requirements -* Client Id -* Client Secret -* Domain +- Client Id +- Client Secret +- Domain ### Setup guide @@ -76,17 +76,16 @@ Please read the "API3 Key" section in [Looker's information for users docs](http ## CHANGELOG -| Version | Date | Pull Request | Subject | -| :--- | :--- | :--- | :--- | -| 0.2.8 | 2022-12-07 | [20182](https://github.com/airbytehq/airbyte/pull/20182) | Fix schema transformation issue | -| 0.2.7 | 2022-01-24 | [9609](https://github.com/airbytehq/airbyte/pull/9609) | Migrate to native CDK and fixing of intergration tests. | -| 0.2.6 | 2021-12-07 | [8578](https://github.com/airbytehq/airbyte/pull/8578) | Update titles and descriptions. 
| -| 0.2.5 | 2021-10-27 | [7284](https://github.com/airbytehq/airbyte/pull/7284) | Migrate Looker source to CDK structure, add SAT testing. | -| 0.2.4 | 2021-06-25 | [3911](https://github.com/airbytehq/airbyte/pull/3911) | Add `run_look` endpoint. | -| 0.2.3 | 2021-06-22 | [3587](https://github.com/airbytehq/airbyte/pull/3587) | Add support for self-hosted instances. | -| 0.2.2 | 2021-06-09 | [3973](https://github.com/airbytehq/airbyte/pull/3973) | Add `AIRBYTE_ENTRYPOINT` for kubernetes support. | -| 0.2.1 | 2021-04-02 | [2726](https://github.com/airbytehq/airbyte/pull/2726) | Fix connector base versioning. | -| 0.2.0 | 2021-03-09 | [2238](https://github.com/airbytehq/airbyte/pull/2238) | Allow future / unknown properties in the protocol. | -| 0.1.1 | 2021-01-27 | [1857](https://github.com/airbytehq/airbyte/pull/1857) | Fix failed CI tests. | -| 0.1.0 | 2020-12-24 | [1441](https://github.com/airbytehq/airbyte/pull/1441) | Add looker connector. | - +| Version | Date | Pull Request | Subject | +| :------ | :--------- | :------------------------------------------------------- | :------------------------------------------------------- | +| 0.2.8 | 2022-12-07 | [20182](https://github.com/airbytehq/airbyte/pull/20182) | Fix schema transformation issue | +| 0.2.7 | 2022-01-24 | [9609](https://github.com/airbytehq/airbyte/pull/9609) | Migrate to native CDK and fixing of intergration tests. | +| 0.2.6 | 2021-12-07 | [8578](https://github.com/airbytehq/airbyte/pull/8578) | Update titles and descriptions. | +| 0.2.5 | 2021-10-27 | [7284](https://github.com/airbytehq/airbyte/pull/7284) | Migrate Looker source to CDK structure, add SAT testing. | +| 0.2.4 | 2021-06-25 | [3911](https://github.com/airbytehq/airbyte/pull/3911) | Add `run_look` endpoint. | +| 0.2.3 | 2021-06-22 | [3587](https://github.com/airbytehq/airbyte/pull/3587) | Add support for self-hosted instances. | +| 0.2.2 | 2021-06-09 | [3973](https://github.com/airbytehq/airbyte/pull/3973) | Add `AIRBYTE_ENTRYPOINT` for kubernetes support. | +| 0.2.1 | 2021-04-02 | [2726](https://github.com/airbytehq/airbyte/pull/2726) | Fix connector base versioning. | +| 0.2.0 | 2021-03-09 | [2238](https://github.com/airbytehq/airbyte/pull/2238) | Allow future / unknown properties in the protocol. | +| 0.1.1 | 2021-01-27 | [1857](https://github.com/airbytehq/airbyte/pull/1857) | Fix failed CI tests. | +| 0.1.0 | 2020-12-24 | [1441](https://github.com/airbytehq/airbyte/pull/1441) | Add looker connector. 
| diff --git a/docs/integrations/sources/low-code.md b/docs/integrations/sources/low-code.md index 9a5cc291e16..a42060ac57d 100644 --- a/docs/integrations/sources/low-code.md +++ b/docs/integrations/sources/low-code.md @@ -8,23 +8,23 @@ The changelog below is automatically updated by the `bump_version` command as pa ## CHANGELOG | Version | Date | Pull Request | Subject | -|:--------|:-----------|:---------------------------------------------------------|:---------------------------------------------------------------------| -| 0.86.2 | 2024-05-02 | [36501](https://github.com/airbytehq/airbyte/pull/36501) | Bump CDK version to 0.86.3 | -| 0.86.1 | 2024-05-02 | [36501](https://github.com/airbytehq/airbyte/pull/36501) | Bump CDK version to 0.86.2 | -| 0.86.0 | 2024-04-30 | [36501](https://github.com/airbytehq/airbyte/pull/36501) | Bump CDK version to 0.86.0 | -| 0.85.0 | 2024-04-24 | [36501](https://github.com/airbytehq/airbyte/pull/36501) | Bump CDK version to 0.85.0 | -| 0.84.0 | 2024-04-23 | [36501](https://github.com/airbytehq/airbyte/pull/36501) | Bump CDK version to 0.84.0 | -| 0.83.1 | 2024-04-19 | [36501](https://github.com/airbytehq/airbyte/pull/36501) | Bump CDK version to 0.83.1 | -| 0.83.0 | 2024-04-19 | [36501](https://github.com/airbytehq/airbyte/pull/36501) | Bump CDK version to 0.83.0 | -| 0.82.0 | 2024-04-19 | [36501](https://github.com/airbytehq/airbyte/pull/36501) | Bump CDK version to 0.82.0 | -| 0.81.8 | 2024-04-18 | [36501](https://github.com/airbytehq/airbyte/pull/36501) | Bump CDK version to 0.81.8 | -| 0.81.7 | 2024-04-18 | [36501](https://github.com/airbytehq/airbyte/pull/36501) | Bump CDK version to 0.81.7 | -| 0.81.3 | 2024-04-12 | [36501](https://github.com/airbytehq/airbyte/pull/36501) | Bump CDK version to 0.81.4 | -| 0.81.2 | 2024-04-11 | [36501](https://github.com/airbytehq/airbyte/pull/36501) | Bump CDK version to 0.81.3 | -| 0.81.1 | 2024-04-11 | [36501](https://github.com/airbytehq/airbyte/pull/36501) | Bump CDK version to 0.81.1 | -| 0.81.0 | 2024-04-09 | [36501](https://github.com/airbytehq/airbyte/pull/36501) | Bump CDK version to 0.81.0 | -| 0.80.0 | 2024-04-09 | [36501](https://github.com/airbytehq/airbyte/pull/36501) | Bump CDK version to 0.80.0 | -| 0.79.2 | 2024-04-09 | [36501](https://github.com/airbytehq/airbyte/pull/36501) | Bump CDK version to 0.79.2 | -| 0.79.1 | 2024-04-05 | [36501](https://github.com/airbytehq/airbyte/pull/36501) | Bump CDK version to 0.79.1 | -| 0.79.0 | 2024-04-05 | [36501](https://github.com/airbytehq/airbyte/pull/36501) | Bump CDK version to 0.79.0 | -| 0.78.9 | 2024-04-04 | [36834](https://github.com/airbytehq/airbyte/pull/36834) | Update CDK dependency to version 0.78.9 (before new publishing flow) | +| :------ | :--------- | :------------------------------------------------------- | :------------------------------------------------------------------- | +| 0.86.2 | 2024-05-02 | [36501](https://github.com/airbytehq/airbyte/pull/36501) | Bump CDK version to 0.86.3 | +| 0.86.1 | 2024-05-02 | [36501](https://github.com/airbytehq/airbyte/pull/36501) | Bump CDK version to 0.86.2 | +| 0.86.0 | 2024-04-30 | [36501](https://github.com/airbytehq/airbyte/pull/36501) | Bump CDK version to 0.86.0 | +| 0.85.0 | 2024-04-24 | [36501](https://github.com/airbytehq/airbyte/pull/36501) | Bump CDK version to 0.85.0 | +| 0.84.0 | 2024-04-23 | [36501](https://github.com/airbytehq/airbyte/pull/36501) | Bump CDK version to 0.84.0 | +| 0.83.1 | 2024-04-19 | [36501](https://github.com/airbytehq/airbyte/pull/36501) | Bump CDK version to 0.83.1 | 
+| 0.83.0 | 2024-04-19 | [36501](https://github.com/airbytehq/airbyte/pull/36501) | Bump CDK version to 0.83.0 | +| 0.82.0 | 2024-04-19 | [36501](https://github.com/airbytehq/airbyte/pull/36501) | Bump CDK version to 0.82.0 | +| 0.81.8 | 2024-04-18 | [36501](https://github.com/airbytehq/airbyte/pull/36501) | Bump CDK version to 0.81.8 | +| 0.81.7 | 2024-04-18 | [36501](https://github.com/airbytehq/airbyte/pull/36501) | Bump CDK version to 0.81.7 | +| 0.81.3 | 2024-04-12 | [36501](https://github.com/airbytehq/airbyte/pull/36501) | Bump CDK version to 0.81.4 | +| 0.81.2 | 2024-04-11 | [36501](https://github.com/airbytehq/airbyte/pull/36501) | Bump CDK version to 0.81.3 | +| 0.81.1 | 2024-04-11 | [36501](https://github.com/airbytehq/airbyte/pull/36501) | Bump CDK version to 0.81.1 | +| 0.81.0 | 2024-04-09 | [36501](https://github.com/airbytehq/airbyte/pull/36501) | Bump CDK version to 0.81.0 | +| 0.80.0 | 2024-04-09 | [36501](https://github.com/airbytehq/airbyte/pull/36501) | Bump CDK version to 0.80.0 | +| 0.79.2 | 2024-04-09 | [36501](https://github.com/airbytehq/airbyte/pull/36501) | Bump CDK version to 0.79.2 | +| 0.79.1 | 2024-04-05 | [36501](https://github.com/airbytehq/airbyte/pull/36501) | Bump CDK version to 0.79.1 | +| 0.79.0 | 2024-04-05 | [36501](https://github.com/airbytehq/airbyte/pull/36501) | Bump CDK version to 0.79.0 | +| 0.78.9 | 2024-04-04 | [36834](https://github.com/airbytehq/airbyte/pull/36834) | Update CDK dependency to version 0.78.9 (before new publishing flow) | diff --git a/docs/integrations/sources/magento.md b/docs/integrations/sources/magento.md index d398d0d363f..45a05f2258a 100644 --- a/docs/integrations/sources/magento.md +++ b/docs/integrations/sources/magento.md @@ -15,4 +15,3 @@ Reach out to your service representative or system admin to find the parameters ### Output schema The output schema is described in the [Magento docs](https://docs.magento.com/mbi/data-analyst/importing-data/integrations/magento-data.html). See the [MySQL connector](mysql.md) for more info on general rules followed by the MySQL connector when moving data. - diff --git a/docs/integrations/sources/mailchimp-migrations.md b/docs/integrations/sources/mailchimp-migrations.md index d09683b8e64..87427665315 100644 --- a/docs/integrations/sources/mailchimp-migrations.md +++ b/docs/integrations/sources/mailchimp-migrations.md @@ -2,40 +2,36 @@ ## Upgrading to 2.0.0 -Version 2.0.0 introduces changes in primary key for streams `Segment Members` and `List Members`. +Version 2.0.0 introduces changes in primary key for streams `Segment Members` and `List Members`. ## Migration Steps ### Refresh affected schemas and reset data 1. Select **Connections** in the main nav bar. - 1. Select the connection(s) affected by the update. -2. Select the **Replication** tab. - 1. Select **Refresh source schema**. - 2. Select **OK**. -:::note -Any detected schema changes will be listed for your review. -::: -3. Select **Save changes** at the bottom of the page. - 1. Ensure the **Reset affected streams** option is checked. -:::note -Depending on destination type you may not be prompted to reset your data. -::: -4. Select **Save connection**. -:::note -This will reset the data in your destination and initiate a fresh sync. -::: + 1. Select the connection(s) affected by the update. +2. Select the **Replication** tab. 1. Select **Refresh source schema**. 2. Select **OK**. + :::note + Any detected schema changes will be listed for your review. + ::: +3. Select **Save changes** at the bottom of the page. 1. 
Ensure the **Reset affected streams** option is checked. + :::note + Depending on destination type you may not be prompted to reset your data. + ::: +4. Select **Save connection**. + :::note + This will reset the data in your destination and initiate a fresh sync. + ::: For more information on resetting your data in Airbyte, see [this page](https://docs.airbyte.com/operator-guides/reset). - ## Upgrading to 1.0.0 Version 1.0.0 of the Source Mailchimp connector introduces a number of breaking changes to the schemas of all incremental streams. A full schema refresh and data reset are required when upgrading to this version. ### Upgrade steps -1. Select **Connections** in the main navbar. +1. Select **Connections** in the main navbar. 2. From the list of your existing connections, select the connection(s) affected by the update. 3. Select the **Replication** tab, then select **Refresh source schema**. @@ -64,10 +60,12 @@ Depending on the destination type, you may not be prompted to reset your data ### Updated datetime fields - Automations: + - `create_time` - `send_time` - Campaigns: + - `create_time` - `send_time` - `rss_opts.last_sent` @@ -76,21 +74,25 @@ Depending on the destination type, you may not be prompted to reset your data - `variate_settings.send_times` (Array of datetime fields) - Email Activity: + - `timestamp` - List Members: + - `timestamp_signup` - `timestamp_opt` - `last_changed` - `created_at` - Lists: + - `date_created` - `stats.campaign_last_sent` - `stats.last_sub_date` - `stats.last_unsub_date` - Reports: + - `send_time` - `rss_last_send` - `opens.last_open` @@ -101,12 +103,14 @@ Depending on the destination type, you may not be prompted to reset your data - `timeseries.timestamp` - Segment Members: + - `timestamp_signup` - `timestamp_opt` - `last_changed` - `last_note.created_at` - Segments: + - `created_at` - `updated_at` diff --git a/docs/integrations/sources/mailchimp.md b/docs/integrations/sources/mailchimp.md index cedead9e032..74afd99f8f1 100644 --- a/docs/integrations/sources/mailchimp.md +++ b/docs/integrations/sources/mailchimp.md @@ -64,7 +64,7 @@ For more information on Mailchimp API Keys, please refer to the [official Mailch The Mailchimp source connector supports the following streams and [sync modes](https://docs.airbyte.com/cloud/core-concepts/#connection-sync-mode): | Stream | Full Refresh | Incremental | -|:-------------------------------------------------------------------------------------------------------------------|:-------------|:------------| +| :----------------------------------------------------------------------------------------------------------------- | :----------- | :---------- | | [Automations](https://mailchimp.com/developer/marketing/api/automation/list-automations/) | ✓ | ✓ | | [Campaigns](https://mailchimp.com/developer/marketing/api/campaigns/get-campaign-info/) | ✓ | ✓ | | [Email Activity](https://mailchimp.com/developer/marketing/api/email-activity-reports/list-email-activity/) | ✓ | ✓ | @@ -89,14 +89,14 @@ All other streams contain an `id` primary key. 
## Data type mapping -| Integration Type | Airbyte Type | Notes | -|:---------------------|:---------------------------|:------------------------------------------------------------------------------------| -| `array` | `array` | the type of elements in the array is determined based on the mappings in this table | -| `string` | `string` | | -| `float`, `number` | `number` | | -| `integer` | `integer` | | -| `object` | `object` | properties within objects are mapped based on the mappings in this table | -| `string` (timestamp) | `timestamp_with_timezone` | Mailchimp timestamps are formatted as `YYYY-MM-DDTHH:MM:SS+00:00` | +| Integration Type | Airbyte Type | Notes | +| :------------------- | :------------------------ | :---------------------------------------------------------------------------------- | +| `array` | `array` | the type of elements in the array is determined based on the mappings in this table | +| `string` | `string` | | +| `float`, `number` | `number` | | +| `integer` | `integer` | | +| `object` | `object` | properties within objects are mapped based on the mappings in this table | +| `string` (timestamp) | `timestamp_with_timezone` | Mailchimp timestamps are formatted as `YYYY-MM-DDTHH:MM:SS+00:00` | ## Limitations & Troubleshooting @@ -121,50 +121,50 @@ Now that you have set up the Mailchimp source connector, check out the following ## Changelog -| Version | Date | Pull Request | Subject | -|---------|------------|-----------------------------------------------------------|----------------------------------------------------------------------------| -| 2.0.3 | 2024-05-02 | [36649](https://github.com/airbytehq/airbyte/pull/36649) | Schema descriptions | -| 2.0.2 | 2024-04-25 | [37572](https://github.com/airbytehq/airbyte/pull/37572) | Fixed `start_date` format issue for the `email_activity` stream | -| 2.0.1 | 2024-04-19 | [37434](https://github.com/airbytehq/airbyte/pull/37434) | Fixed cursor format for the `email_activity` stream | -| 2.0.0 | 2024-04-01 | [35281](https://github.com/airbytehq/airbyte/pull/35281) | Migrate to Low-Code | -| 1.2.0 | 2024-03-28 | [36600](https://github.com/airbytehq/airbyte/pull/36600) | Migrate to latest Airbyte-CDK. | -| 1.1.2 | 2024-02-09 | [35092](https://github.com/airbytehq/airbyte/pull/35092) | Manage dependencies with Poetry. 
| -| 1.1.1 | 2024-01-11 | [34157](https://github.com/airbytehq/airbyte/pull/34157) | Prepare for airbyte-lib | -| 1.1.0 | 2023-12-20 | [32852](https://github.com/airbytehq/airbyte/pull/32852) | Add optional start_date for incremental streams | -| 1.0.0 | 2023-12-19 | [32836](https://github.com/airbytehq/airbyte/pull/32836) | Add airbyte-type to `datetime` columns and remove `._links` column | -| 0.10.0 | 2023-11-23 | [32782](https://github.com/airbytehq/airbyte/pull/32782) | Add SegmentMembers stream | -| 0.9.0 | 2023-11-17 | [32218](https://github.com/airbytehq/airbyte/pull/32218) | Add Interests, InterestCategories, Tags streams | -| 0.8.3 | 2023-11-15 | [32543](https://github.com/airbytehq/airbyte/pull/32543) | Handle empty datetime fields in Reports stream | -| 0.8.2 | 2023-11-13 | [32466](https://github.com/airbytehq/airbyte/pull/32466) | Improve error handling during connection check | -| 0.8.1 | 2023-11-06 | [32226](https://github.com/airbytehq/airbyte/pull/32226) | Unmute expected records test after data anonymisation | -| 0.8.0 | 2023-11-01 | [32032](https://github.com/airbytehq/airbyte/pull/32032) | Add ListMembers stream | -| 0.7.0 | 2023-10-27 | [31940](https://github.com/airbytehq/airbyte/pull/31940) | Implement availability strategy | -| 0.6.0 | 2023-10-27 | [31922](https://github.com/airbytehq/airbyte/pull/31922) | Add Segments stream | -| 0.5.0 | 2023-10-20 | [31675](https://github.com/airbytehq/airbyte/pull/31675) | Add Unsubscribes stream | -| 0.4.1 | 2023-05-02 | [25717](https://github.com/airbytehq/airbyte/pull/25717) | Handle unknown error in EmailActivity | -| 0.4.0 | 2023-04-11 | [23290](https://github.com/airbytehq/airbyte/pull/23290) | Add Automations stream | -| 0.3.5 | 2023-02-28 | [23464](https://github.com/airbytehq/airbyte/pull/23464) | Add Reports stream | -| 0.3.4 | 2023-02-06 | [22405](https://github.com/airbytehq/airbyte/pull/22405) | Revert extra logging | -| 0.3.3 | 2023-02-01 | [22228](https://github.com/airbytehq/airbyte/pull/22228) | Add extra logging | -| 0.3.2 | 2023-01-27 | [22014](https://github.com/airbytehq/airbyte/pull/22014) | Set `AvailabilityStrategy` for streams explicitly to `None` | -| 0.3.1 | 2022-12-20 | [20720](https://github.com/airbytehq/airbyte/pull/20720) | Use stream slices as a source for request params instead of a stream state | -| 0.3.0 | 2022-11-07 | [19023](https://github.com/airbytehq/airbyte/pull/19023) | Set primary key for Email Activity stream. | -| 0.2.15 | 2022-09-28 | [17326](https://github.com/airbytehq/airbyte/pull/17326) | Migrate to per-stream states. 
| -| 0.2.14 | 2022-04-12 | [11352](https://github.com/airbytehq/airbyte/pull/11352) | Update documentation | -| 0.2.13 | 2022-04-11 | [11632](https://github.com/airbytehq/airbyte/pull/11632) | Add unit tests | -| 0.2.12 | 2022-03-17 | [10975](https://github.com/airbytehq/airbyte/pull/10975) | Fix campaign's stream normalization | -| 0.2.11 | 2021-12-24 | [7159](https://github.com/airbytehq/airbyte/pull/7159) | Add oauth2.0 support | -| 0.2.10 | 2021-12-21 | [9000](https://github.com/airbytehq/airbyte/pull/9000) | Update connector fields title/description | -| 0.2.9 | 2021-12-13 | [7975](https://github.com/airbytehq/airbyte/pull/7975) | Updated JSON schemas | -| 0.2.8 | 2021-08-17 | [5481](https://github.com/airbytehq/airbyte/pull/5481) | Remove date-time type from some fields | -| 0.2.7 | 2021-08-03 | [5137](https://github.com/airbytehq/airbyte/pull/5137) | Source Mailchimp: fix primary key for email activities | -| 0.2.6 | 2021-07-28 | [5024](https://github.com/airbytehq/airbyte/pull/5024) | Source Mailchimp: handle records with no no "activity" field in response | -| 0.2.5 | 2021-07-08 | [4621](https://github.com/airbytehq/airbyte/pull/4621) | Mailchimp fix url-base | -| 0.2.4 | 2021-06-09 | [4285](https://github.com/airbytehq/airbyte/pull/4285) | Use datacenter URL parameter from apikey | -| 0.2.3 | 2021-06-08 | [3973](https://github.com/airbytehq/airbyte/pull/3973) | Add AIRBYTE\_ENTRYPOINT for Kubernetes support | -| 0.2.2 | 2021-06-08 | [3415](https://github.com/airbytehq/airbyte/pull/3415) | Get Members activities | -| 0.2.1 | 2021-04-03 | [2726](https://github.com/airbytehq/airbyte/pull/2726) | Fix base connector versioning | -| 0.2.0 | 2021-03-09 | [2238](https://github.com/airbytehq/airbyte/pull/2238) | Protocol allows future/unknown properties | -| 0.1.4 | 2020-11-30 | [1046](https://github.com/airbytehq/airbyte/pull/1046) | Add connectors using an index YAML file | +| Version | Date | Pull Request | Subject | +| ------- | ---------- | -------------------------------------------------------- | -------------------------------------------------------------------------- | +| 2.0.3 | 2024-05-02 | [36649](https://github.com/airbytehq/airbyte/pull/36649) | Schema descriptions | +| 2.0.2 | 2024-04-25 | [37572](https://github.com/airbytehq/airbyte/pull/37572) | Fixed `start_date` format issue for the `email_activity` stream | +| 2.0.1 | 2024-04-19 | [37434](https://github.com/airbytehq/airbyte/pull/37434) | Fixed cursor format for the `email_activity` stream | +| 2.0.0 | 2024-04-01 | [35281](https://github.com/airbytehq/airbyte/pull/35281) | Migrate to Low-Code | +| 1.2.0 | 2024-03-28 | [36600](https://github.com/airbytehq/airbyte/pull/36600) | Migrate to latest Airbyte-CDK. | +| 1.1.2 | 2024-02-09 | [35092](https://github.com/airbytehq/airbyte/pull/35092) | Manage dependencies with Poetry. 
| +| 1.1.1 | 2024-01-11 | [34157](https://github.com/airbytehq/airbyte/pull/34157) | Prepare for airbyte-lib | +| 1.1.0 | 2023-12-20 | [32852](https://github.com/airbytehq/airbyte/pull/32852) | Add optional start_date for incremental streams | +| 1.0.0 | 2023-12-19 | [32836](https://github.com/airbytehq/airbyte/pull/32836) | Add airbyte-type to `datetime` columns and remove `._links` column | +| 0.10.0 | 2023-11-23 | [32782](https://github.com/airbytehq/airbyte/pull/32782) | Add SegmentMembers stream | +| 0.9.0 | 2023-11-17 | [32218](https://github.com/airbytehq/airbyte/pull/32218) | Add Interests, InterestCategories, Tags streams | +| 0.8.3 | 2023-11-15 | [32543](https://github.com/airbytehq/airbyte/pull/32543) | Handle empty datetime fields in Reports stream | +| 0.8.2 | 2023-11-13 | [32466](https://github.com/airbytehq/airbyte/pull/32466) | Improve error handling during connection check | +| 0.8.1 | 2023-11-06 | [32226](https://github.com/airbytehq/airbyte/pull/32226) | Unmute expected records test after data anonymisation | +| 0.8.0 | 2023-11-01 | [32032](https://github.com/airbytehq/airbyte/pull/32032) | Add ListMembers stream | +| 0.7.0 | 2023-10-27 | [31940](https://github.com/airbytehq/airbyte/pull/31940) | Implement availability strategy | +| 0.6.0 | 2023-10-27 | [31922](https://github.com/airbytehq/airbyte/pull/31922) | Add Segments stream | +| 0.5.0 | 2023-10-20 | [31675](https://github.com/airbytehq/airbyte/pull/31675) | Add Unsubscribes stream | +| 0.4.1 | 2023-05-02 | [25717](https://github.com/airbytehq/airbyte/pull/25717) | Handle unknown error in EmailActivity | +| 0.4.0 | 2023-04-11 | [23290](https://github.com/airbytehq/airbyte/pull/23290) | Add Automations stream | +| 0.3.5 | 2023-02-28 | [23464](https://github.com/airbytehq/airbyte/pull/23464) | Add Reports stream | +| 0.3.4 | 2023-02-06 | [22405](https://github.com/airbytehq/airbyte/pull/22405) | Revert extra logging | +| 0.3.3 | 2023-02-01 | [22228](https://github.com/airbytehq/airbyte/pull/22228) | Add extra logging | +| 0.3.2 | 2023-01-27 | [22014](https://github.com/airbytehq/airbyte/pull/22014) | Set `AvailabilityStrategy` for streams explicitly to `None` | +| 0.3.1 | 2022-12-20 | [20720](https://github.com/airbytehq/airbyte/pull/20720) | Use stream slices as a source for request params instead of a stream state | +| 0.3.0 | 2022-11-07 | [19023](https://github.com/airbytehq/airbyte/pull/19023) | Set primary key for Email Activity stream. | +| 0.2.15 | 2022-09-28 | [17326](https://github.com/airbytehq/airbyte/pull/17326) | Migrate to per-stream states. 
| +| 0.2.14 | 2022-04-12 | [11352](https://github.com/airbytehq/airbyte/pull/11352) | Update documentation | +| 0.2.13 | 2022-04-11 | [11632](https://github.com/airbytehq/airbyte/pull/11632) | Add unit tests | +| 0.2.12 | 2022-03-17 | [10975](https://github.com/airbytehq/airbyte/pull/10975) | Fix campaign's stream normalization | +| 0.2.11 | 2021-12-24 | [7159](https://github.com/airbytehq/airbyte/pull/7159) | Add oauth2.0 support | +| 0.2.10 | 2021-12-21 | [9000](https://github.com/airbytehq/airbyte/pull/9000) | Update connector fields title/description | +| 0.2.9 | 2021-12-13 | [7975](https://github.com/airbytehq/airbyte/pull/7975) | Updated JSON schemas | +| 0.2.8 | 2021-08-17 | [5481](https://github.com/airbytehq/airbyte/pull/5481) | Remove date-time type from some fields | +| 0.2.7 | 2021-08-03 | [5137](https://github.com/airbytehq/airbyte/pull/5137) | Source Mailchimp: fix primary key for email activities | +| 0.2.6 | 2021-07-28 | [5024](https://github.com/airbytehq/airbyte/pull/5024) | Source Mailchimp: handle records with no no "activity" field in response | +| 0.2.5 | 2021-07-08 | [4621](https://github.com/airbytehq/airbyte/pull/4621) | Mailchimp fix url-base | +| 0.2.4 | 2021-06-09 | [4285](https://github.com/airbytehq/airbyte/pull/4285) | Use datacenter URL parameter from apikey | +| 0.2.3 | 2021-06-08 | [3973](https://github.com/airbytehq/airbyte/pull/3973) | Add AIRBYTE_ENTRYPOINT for Kubernetes support | +| 0.2.2 | 2021-06-08 | [3415](https://github.com/airbytehq/airbyte/pull/3415) | Get Members activities | +| 0.2.1 | 2021-04-03 | [2726](https://github.com/airbytehq/airbyte/pull/2726) | Fix base connector versioning | +| 0.2.0 | 2021-03-09 | [2238](https://github.com/airbytehq/airbyte/pull/2238) | Protocol allows future/unknown properties | +| 0.1.4 | 2020-11-30 | [1046](https://github.com/airbytehq/airbyte/pull/1046) | Add connectors using an index YAML file | diff --git a/docs/integrations/sources/mailerlite.md b/docs/integrations/sources/mailerlite.md index d5e87337bf3..d74745ad9ec 100644 --- a/docs/integrations/sources/mailerlite.md +++ b/docs/integrations/sources/mailerlite.md @@ -6,21 +6,21 @@ This source can sync data from the [MailerLite API](https://developers.mailerlit ## This Source Supports the Following Streams -* campaigns -* subscribers -* automations -* timezones -* segments -* forms_popup -* forms_embedded -* forms_promotion +- campaigns +- subscribers +- automations +- timezones +- segments +- forms_popup +- forms_embedded +- forms_promotion ### Features -| Feature | Supported?\(Yes/No\) | Notes | -| :--- | :--- | :--- | -| Full Refresh Sync | Yes | | -| Incremental Sync | No | | +| Feature | Supported?\(Yes/No\) | Notes | +| :---------------- | :------------------- | :---- | +| Full Refresh Sync | Yes | | +| Incremental Sync | No | | ### Performance considerations @@ -30,7 +30,7 @@ MailerLite API has a global rate limit of 120 requests per minute. ### Requirements -* MailerLite API Key +- MailerLite API Key ## Changelog diff --git a/docs/integrations/sources/mailgun.md b/docs/integrations/sources/mailgun.md index d085098638c..75c02dea81e 100644 --- a/docs/integrations/sources/mailgun.md +++ b/docs/integrations/sources/mailgun.md @@ -41,7 +41,7 @@ Just pass the generated API key for establishing the connection. The MailGun source connector supports the following [sync modes](https://docs.airbyte.com/cloud/core-concepts#connection-sync-modes): | Feature | Supported? 
| -|:------------------------------|:-----------| +| :---------------------------- | :--------- | | Full Refresh Sync | Yes | | Incremental Sync | Yes | | Replicate Incremental Deletes | No | @@ -64,7 +64,7 @@ MailGun's [API reference](https://documentation.mailgun.com/en/latest/api_refere ## Changelog | Version | Date | Pull Request | Subject | -|:--------|:-----------|:---------------------------------------------------------|:--------------------------------------------------------------------------------| +| :------ | :--------- | :------------------------------------------------------- | :------------------------------------------------------------------------------ | | 0.2.6 | 2024-05-02 | [37594](https://github.com/airbytehq/airbyte/pull/37594) | Change `last_recrods` to `last_record` | | 0.2.5 | 2024-04-19 | [37193](https://github.com/airbytehq/airbyte/pull/37193) | Updating to 0.80.0 CDK | | 0.2.4 | 2024-04-18 | [37193](https://github.com/airbytehq/airbyte/pull/37193) | Manage dependencies with Poetry. | diff --git a/docs/integrations/sources/mailjet-mail.md b/docs/integrations/sources/mailjet-mail.md index 85c89b0fda5..e332c0bcb80 100644 --- a/docs/integrations/sources/mailjet-mail.md +++ b/docs/integrations/sources/mailjet-mail.md @@ -6,18 +6,18 @@ This source can sync data from the [Mailjet Mail API](https://dev.mailjet.com/em ## This Source Supports the Following Streams -* contact list -* contacts -* messages -* campaigns -* stats +- contact list +- contacts +- messages +- campaigns +- stats ### Features -| Feature | Supported?\(Yes/No\) | Notes | -| :--- | :--- | :--- | -| Full Refresh Sync | Yes | | -| Incremental Sync | No | | +| Feature | Supported?\(Yes/No\) | Notes | +| :---------------- | :------------------- | :---- | +| Full Refresh Sync | Yes | | +| Incremental Sync | No | | ### Performance considerations @@ -27,13 +27,13 @@ Mailjet APIs are under rate limits for the number of API calls allowed per API k ### Requirements -* Mailjet Mail API_KEY -* Mailjet Mail SECRET_KEY +- Mailjet Mail API_KEY +- Mailjet Mail SECRET_KEY ## Changelog -| Version | Date | Pull Request | Subject | -| :------ | :--------- | :-------------------------------------------------------- | :----------------------------------------- | -| 0.1.2 | 2022-12-18 | [#30924](https://github.com/airbytehq/airbyte/pull/30924) | Adds Subject field to `message` stream | -| 0.1.1 | 2022-04-19 | [#24689](https://github.com/airbytehq/airbyte/pull/24689) | Add listrecipient stream | +| Version | Date | Pull Request | Subject | +| :------ | :--------- | :-------------------------------------------------------- | :--------------------------------------------- | +| 0.1.2 | 2022-12-18 | [#30924](https://github.com/airbytehq/airbyte/pull/30924) | Adds Subject field to `message` stream | +| 0.1.1 | 2022-04-19 | [#24689](https://github.com/airbytehq/airbyte/pull/24689) | Add listrecipient stream | | 0.1.0 | 2022-10-26 | [#18332](https://github.com/airbytehq/airbyte/pull/18332) | 🎉 New Source: Mailjet Mail API [low-code CDK] | diff --git a/docs/integrations/sources/mailjet-sms.md b/docs/integrations/sources/mailjet-sms.md index e8f726f2589..d5c9e5090dc 100644 --- a/docs/integrations/sources/mailjet-sms.md +++ b/docs/integrations/sources/mailjet-sms.md @@ -6,14 +6,14 @@ This source can sync data from the [Mailjet SMS API](https://dev.mailjet.com/sms ## This Source Supports the Following Streams -* SMS +- SMS ### Features -| Feature | Supported?\(Yes/No\) | Notes | -| :--- | :--- | :--- | -| Full Refresh Sync | Yes | | 
-| Incremental Sync | No | | +| Feature | Supported?\(Yes/No\) | Notes | +| :---------------- | :------------------- | :---- | +| Full Refresh Sync | Yes | | +| Incremental Sync | No | | ### Performance considerations @@ -23,13 +23,13 @@ Mailjet APIs are under rate limits for the number of API calls allowed per API k ### Requirements -* Mailjet SMS TOKEN +- Mailjet SMS TOKEN ## Changelog -| Version | Date | Pull Request | Subject | -| :------ | :--------- | :-------------------------------------------------------- | :----------------------------------------- | -| 0.1.3 | 2024-04-19 | [37195](https://github.com/airbytehq/airbyte/pull/37195) | Upgrade to CDK 0.80.0 and manage dependencies with Poetry. | -| 0.1.2 | 2024-04-15 | [37195](https://github.com/airbytehq/airbyte/pull/37195) | Base image migration: remove Dockerfile and use the python-connector-base image | -| 0.1.1 | 2024-04-12 | [37195](https://github.com/airbytehq/airbyte/pull/37195) | schema descriptions | -| 0.1.0 | 2022-10-26 | [#18345](https://github.com/airbytehq/airbyte/pull/18345) | 🎉 New Source: Mailjet SMS API [low-code CDK] | +| Version | Date | Pull Request | Subject | +| :------ | :--------- | :-------------------------------------------------------- | :------------------------------------------------------------------------------ | +| 0.1.3 | 2024-04-19 | [37195](https://github.com/airbytehq/airbyte/pull/37195) | Upgrade to CDK 0.80.0 and manage dependencies with Poetry. | +| 0.1.2 | 2024-04-15 | [37195](https://github.com/airbytehq/airbyte/pull/37195) | Base image migration: remove Dockerfile and use the python-connector-base image | +| 0.1.1 | 2024-04-12 | [37195](https://github.com/airbytehq/airbyte/pull/37195) | schema descriptions | +| 0.1.0 | 2022-10-26 | [#18345](https://github.com/airbytehq/airbyte/pull/18345) | 🎉 New Source: Mailjet SMS API [low-code CDK] | diff --git a/docs/integrations/sources/marketo.md b/docs/integrations/sources/marketo.md index 1305fe8f5b9..d16a16085c1 100644 --- a/docs/integrations/sources/marketo.md +++ b/docs/integrations/sources/marketo.md @@ -91,6 +91,7 @@ This connector can be used to sync the following tables from Marketo: Available fields are limited by what is presented in the static schema. ::: + - **[Lists](https://developers.marketo.com/rest-api/endpoint-reference/lead-database-endpoint-reference/#!/Static_Lists/getListByIdUsingGET)**: Contains info about your Marketo static lists. - **[Programs](https://developers.marketo.com/rest-api/endpoint-reference/asset-endpoint-reference/#!/Programs/browseProgramsUsingGET)**: Contains info about your Marketo programs. - **[Segmentations](https://developers.marketo.com/rest-api/endpoint-reference/asset-endpoint-reference/#!/Segments/getSegmentationUsingGET)**: Contains info about your Marketo programs. 
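The Marketo stream list above (Lists, Programs, Segmentations) maps directly onto Marketo's REST endpoints linked in each bullet. As a rough, illustrative sketch of what the connector reads, the snippet below fetches the static lists endpoint using the standard client-credentials token exchange. The Munchkin base URL, client ID, and client secret are placeholders, and this is not the connector's own implementation.

```python
# Minimal sketch: query the Marketo REST "static lists" endpoint referenced above.
# BASE_URL, CLIENT_ID, and CLIENT_SECRET are placeholders for your own instance;
# the Airbyte connector handles authentication and pagination itself.
import requests

BASE_URL = "https://<munchkin-id>.mktorest.com"  # hypothetical instance URL
CLIENT_ID = "<client-id>"
CLIENT_SECRET = "<client-secret>"


def get_access_token() -> str:
    """Exchange client credentials for a short-lived REST access token."""
    resp = requests.get(
        f"{BASE_URL}/identity/oauth/token",
        params={
            "grant_type": "client_credentials",
            "client_id": CLIENT_ID,
            "client_secret": CLIENT_SECRET,
        },
    )
    resp.raise_for_status()
    return resp.json()["access_token"]


def get_static_lists(token: str) -> list:
    """Fetch the records exposed by the Lists stream described above."""
    resp = requests.get(
        f"{BASE_URL}/rest/v1/lists.json",
        headers={"Authorization": f"Bearer {token}"},
    )
    resp.raise_for_status()
    return resp.json().get("result", [])


if __name__ == "__main__":
    token = get_access_token()
    for static_list in get_static_lists(token):
        print(static_list["id"], static_list["name"])
```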
@@ -116,7 +117,7 @@ If the 50,000 limit is too stringent, contact Marketo support for a quota increa ## Changelog | Version | Date | Pull Request | Subject | -|:---------|:-----------|:---------------------------------------------------------|:-------------------------------------------------------------------------------------------------| +| :------- | :--------- | :------------------------------------------------------- | :----------------------------------------------------------------------------------------------- | | `1.4.0` | 2024-04-15 | [36854](https://github.com/airbytehq/airbyte/pull/36854) | Migrate to low-code | | 1.3.2 | 2024-04-19 | [36650](https://github.com/airbytehq/airbyte/pull/36650) | Updating to 0.80.0 CDK | | 1.3.1 | 2024-04-12 | [36650](https://github.com/airbytehq/airbyte/pull/36650) | schema descriptions | diff --git a/docs/integrations/sources/merge.md b/docs/integrations/sources/merge.md index 75333c1fa27..ce903e6271a 100644 --- a/docs/integrations/sources/merge.md +++ b/docs/integrations/sources/merge.md @@ -4,7 +4,7 @@ This page contains the setup guide and reference information for the [Merge](htt ## Prerequisites -Access Token (which acts as bearer token) and linked accounts tokens are mandate for this connector to work, It could be seen at settings (Bearer ref - https://app.merge.dev/keys) and (Account token ref - https://app.merge.dev/keys). +Access Token (which acts as bearer token) and linked accounts tokens are mandate for this connector to work, It could be seen at settings (Bearer ref - https://app.merge.dev/keys) and (Account token ref - https://app.merge.dev/keys). ## Setup guide @@ -14,9 +14,9 @@ Access Token (which acts as bearer token) and linked accounts tokens are mandate - Get your bearer token on keys section (ref - https://app.merge.dev/keys) - Setup params (All params are required) - Available params - - account_token: Linked account token seen after integration at linked account section - - api_token: Bearer token seen at keys section, try to use production keys - - start_date: Date filter for eligible streams + - account_token: Linked account token seen after integration at linked account section + - api_token: Bearer token seen at keys section, try to use production keys + - start_date: Date filter for eligible streams ## Step 2: Set up the Merge connector in Airbyte @@ -33,7 +33,7 @@ Access Token (which acts as bearer token) and linked accounts tokens are mandate 1. Navigate to the Airbyte Open Source dashboard. 2. Set the name for your source. 3. Enter your `account_token, api_token and start_date`. -5. Click **Set up source**. +4. Click **Set up source**. ## Supported sync modes @@ -74,6 +74,6 @@ Merge [API reference](https://api.merge.dev/api/ats/v1/) has v1 at present. 
The ## Changelog -| Version | Date | Pull Request | Subject | -| :------ | :--------- | :----------------------------------------------------- | :------------- | -| 0.1.0 | 2023-04-18 | [Init](https://github.com/airbytehq/airbyte/pull/)| Initial commit | \ No newline at end of file +| Version | Date | Pull Request | Subject | +| :------ | :--------- | :------------------------------------------------- | :------------- | +| 0.1.0 | 2023-04-18 | [Init](https://github.com/airbytehq/airbyte/pull/) | Initial commit | diff --git a/docs/integrations/sources/metabase-migrations.md b/docs/integrations/sources/metabase-migrations.md index d0744086087..47d4cfade27 100644 --- a/docs/integrations/sources/metabase-migrations.md +++ b/docs/integrations/sources/metabase-migrations.md @@ -11,7 +11,7 @@ Source Metabase has updated the `dashboards` stream's endpoint due to the previo Airbyte Open Source users must manually update the connector image in their local registry before proceeding with the migration. To do so: 1. Select **Settings** in the main navbar. - 1. Select **Sources**. + 1. Select **Sources**. 2. Find Metabase in the list of connectors. :::note @@ -30,26 +30,23 @@ Each instance of the connector must be updated separately. If you have created m ::: 3. Select **Upgrade** - 1. Follow the prompt to confirm you are ready to upgrade to the new version. + 1. Follow the prompt to confirm you are ready to upgrade to the new version. ### Refresh affected schemas and reset data 1. Select **Connections** in the main navbar. - 1. Select the connection(s) affected by the update. -2. Select the **Replication** tab. - 1. Select **Refresh source schema**. - 2. Select **OK**. -:::note -Any detected schema changes will be listed for your review. -::: -3. Select **Save changes** at the bottom of the page. - 1. Ensure the **Reset affected streams** option is checked. -:::note -Depending on destination type you may not be prompted to reset your data. -::: + 1. Select the connection(s) affected by the update. +2. Select the **Replication** tab. 1. Select **Refresh source schema**. 2. Select **OK**. + :::note + Any detected schema changes will be listed for your review. + ::: +3. Select **Save changes** at the bottom of the page. 1. Ensure the **Reset affected streams** option is checked. + :::note + Depending on destination type you may not be prompted to reset your data. + ::: 4. Select **Save connection**. -:::note -This will reset the data in your destination and initiate a fresh sync. -::: + :::note + This will reset the data in your destination and initiate a fresh sync. + ::: -For more information on resetting your data in Airbyte, see [this page](https://docs.airbyte.com/operator-guides/reset). \ No newline at end of file +For more information on resetting your data in Airbyte, see [this page](https://docs.airbyte.com/operator-guides/reset). diff --git a/docs/integrations/sources/metabase.md b/docs/integrations/sources/metabase.md index 156f753768b..b8135c93594 100644 --- a/docs/integrations/sources/metabase.md +++ b/docs/integrations/sources/metabase.md @@ -1,12 +1,14 @@ # Metabase + This page contains the setup guide and reference information for the Metabase source connector. ## Prerequisites To set up Metabase you need: - * `username` and `password` - Credential pairs to authenticate with Metabase instance. This may be used to generate a new `session_token` if necessary. An email from Metabase may be sent to the owner's account every time this is being used to open a new session. 
- * `session_token` - Credential token to authenticate requests sent to Metabase API. Usually expires after 14 days. - * `instance_api_url` - URL to interact with Metabase instance API, that uses https. + +- `username` and `password` - Credential pairs to authenticate with Metabase instance. This may be used to generate a new `session_token` if necessary. An email from Metabase may be sent to the owner's account every time this is being used to open a new session. +- `session_token` - Credential token to authenticate requests sent to Metabase API. Usually expires after 14 days. +- `instance_api_url` - URL to interact with Metabase instance API, that uses https. ## Setup guide @@ -37,22 +39,23 @@ authenticated query is running, which might trigger security alerts on the user' The Metabase source connector supports the following [sync modes](https://docs.airbyte.com/cloud/core-concepts#connection-sync-modes): -* [Full Refresh - Overwrite](https://docs.airbyte.com/understanding-airbyte/connections/full-refresh-overwrite/) +- [Full Refresh - Overwrite](https://docs.airbyte.com/understanding-airbyte/connections/full-refresh-overwrite/) ## Supported Streams -* [Card](https://www.metabase.com/docs/latest/api/card.html#get-apicard) -* [Collections](https://www.metabase.com/docs/latest/api/collection.html#get-apicollection) -* [Dashboard](https://www.metabase.com/docs/latest/api/dashboard.html#get-apidashboard) -* [User](https://www.metabase.com/docs/latest/api/user.html#get-apiuser) -* [Databases](https://www.metabase.com/docs/latest/api/user.html#get-apiuser) -* [Native Query Snippet](https://www.metabase.com/docs/latest/api/native-query-snippet#get-apinative-query-snippetid) + +- [Card](https://www.metabase.com/docs/latest/api/card.html#get-apicard) +- [Collections](https://www.metabase.com/docs/latest/api/collection.html#get-apicollection) +- [Dashboard](https://www.metabase.com/docs/latest/api/dashboard.html#get-apidashboard) +- [User](https://www.metabase.com/docs/latest/api/user.html#get-apiuser) +- [Databases](https://www.metabase.com/docs/latest/api/user.html#get-apiuser) +- [Native Query Snippet](https://www.metabase.com/docs/latest/api/native-query-snippet#get-apinative-query-snippetid) ## Tutorials ### Data type mapping | Integration Type | Airbyte Type | Notes | -|:--------------------|:-------------|:------| +| :------------------ | :----------- | :---- | | `string` | `string` | | | `integer`, `number` | `number` | | | `array` | `array` | | @@ -61,22 +64,21 @@ The Metabase source connector supports the following [sync modes](https://docs.a ### Features | Feature | Supported?\(Yes/No\) | Notes | -|:------------------|:---------------------|:------| +| :---------------- | :------------------- | :---- | | Full Refresh Sync | Yes | | | Incremental Sync | No | | | SSL connection | Yes | | | Namespaces | No | | - ## Changelog -| Version | Date | Pull Request | Subject | -|:--------|:-----------|:---------------------------------------------------------|:---------------------------| -| 2.0.0 | 2024-03-01 | [35680](https://github.com/airbytehq/airbyte/pull/35680) | Updates `dashboards` stream, Base image migration: remove Dockerfile and use the python-connector-base image, migrated to poetry | -| 1.1.0 | 2023-10-31 | [31909](https://github.com/airbytehq/airbyte/pull/31909) | Add `databases` and `native_query_snippets` streams | -| 1.0.1 | 2023-07-20 | [28470](https://github.com/airbytehq/airbyte/pull/27777) | Update CDK to 0.47.0 | -| 1.0.0 | 2023-06-27 | 
[27777](https://github.com/airbytehq/airbyte/pull/27777) | Remove Activity Stream | -| 0.3.1 | 2022-12-15 | [20535](https://github.com/airbytehq/airbyte/pull/20535) | Run on CDK 0.15.0 | -| 0.3.0 | 2022-12-13 | [19236](https://github.com/airbytehq/airbyte/pull/19236) | Migrate to YAML. | -| 0.2.0 | 2022-10-28 | [18607](https://github.com/airbytehq/airbyte/pull/18607) | Disallow using `http` URLs | -| 0.1.0 | 2022-06-15 | [6975](https://github.com/airbytehq/airbyte/pull/13752) | Initial (alpha) release | +| Version | Date | Pull Request | Subject | +| :------ | :--------- | :------------------------------------------------------- | :------------------------------------------------------------------------------------------------------------------------------- | +| 2.0.0 | 2024-03-01 | [35680](https://github.com/airbytehq/airbyte/pull/35680) | Updates `dashboards` stream, Base image migration: remove Dockerfile and use the python-connector-base image, migrated to poetry | +| 1.1.0 | 2023-10-31 | [31909](https://github.com/airbytehq/airbyte/pull/31909) | Add `databases` and `native_query_snippets` streams | +| 1.0.1 | 2023-07-20 | [28470](https://github.com/airbytehq/airbyte/pull/27777) | Update CDK to 0.47.0 | +| 1.0.0 | 2023-06-27 | [27777](https://github.com/airbytehq/airbyte/pull/27777) | Remove Activity Stream | +| 0.3.1 | 2022-12-15 | [20535](https://github.com/airbytehq/airbyte/pull/20535) | Run on CDK 0.15.0 | +| 0.3.0 | 2022-12-13 | [19236](https://github.com/airbytehq/airbyte/pull/19236) | Migrate to YAML. | +| 0.2.0 | 2022-10-28 | [18607](https://github.com/airbytehq/airbyte/pull/18607) | Disallow using `http` URLs | +| 0.1.0 | 2022-06-15 | [6975](https://github.com/airbytehq/airbyte/pull/13752) | Initial (alpha) release | diff --git a/docs/integrations/sources/microsoft-dataverse.md b/docs/integrations/sources/microsoft-dataverse.md index 4e3138cff79..a3886061862 100644 --- a/docs/integrations/sources/microsoft-dataverse.md +++ b/docs/integrations/sources/microsoft-dataverse.md @@ -59,8 +59,8 @@ https://blog.magnetismsolutions.com/blog/paulnieuwelaar/2021/9/21/setting-up-an- ## CHANGELOG -| Version | Date | Pull Request | Subject | -| :------ | :--------- | :------------------------------------------------------- | :---------------------------------------------- | +| Version | Date | Pull Request | Subject | +| :------ | :--------- | :------------------------------------------------------- | :------------------------------------------------------------------------------------- | | 0.1.2 | 2023-08-24 | [29732](https://github.com/airbytehq/airbyte/pull/29732) | 🐛 Source Microsoft Dataverse: Adjust source_default_cursor when modifiedon not exists | -| 0.1.1 | 2023-03-16 | [22805](https://github.com/airbytehq/airbyte/pull/22805) | Fixed deduped cursor field value update | -| 0.1.0 | 2022-11-14 | [18646](https://github.com/airbytehq/airbyte/pull/18646) | 🎉 New Source: Microsoft Dataverse [python cdk] | +| 0.1.1 | 2023-03-16 | [22805](https://github.com/airbytehq/airbyte/pull/22805) | Fixed deduped cursor field value update | +| 0.1.0 | 2022-11-14 | [18646](https://github.com/airbytehq/airbyte/pull/18646) | 🎉 New Source: Microsoft Dataverse [python cdk] | diff --git a/docs/integrations/sources/microsoft-dynamics-ax.md b/docs/integrations/sources/microsoft-dynamics-ax.md index cff5b16acb2..01a1303abc9 100644 --- a/docs/integrations/sources/microsoft-dynamics-ax.md +++ b/docs/integrations/sources/microsoft-dynamics-ax.md @@ -9,4 +9,3 @@ MS Dynamics AX runs on the MSSQL database. 
You can use the [MSSQL connector](mss ### Output schema To understand your MS Dynamics AX database schema, see the [Microsoft docs](https://docs.microsoft.com/en-us/dynamicsax-2012/developer/database-erds-on-the-axerd-website). Otherwise, the schema will be loaded according to the rules of MSSQL connector. - diff --git a/docs/integrations/sources/microsoft-dynamics-customer-engagement.md b/docs/integrations/sources/microsoft-dynamics-customer-engagement.md index 7c0467f85c6..f2bec8809f1 100644 --- a/docs/integrations/sources/microsoft-dynamics-customer-engagement.md +++ b/docs/integrations/sources/microsoft-dynamics-customer-engagement.md @@ -15,4 +15,3 @@ Reach out to your service representative or system admin to find the parameters ### Output schema To understand your MS Dynamics Customer Engagement database schema, see the [Entity Reference documentation](https://docs.microsoft.com/en-us/dynamics365/customerengagement/on-premises/developer/about-entity-reference?view=op-9-1). Otherwise, the schema will be loaded according to the rules of MSSQL connector. - diff --git a/docs/integrations/sources/microsoft-dynamics-gp.md b/docs/integrations/sources/microsoft-dynamics-gp.md index 00c72bb5e48..e7e9ecabc08 100644 --- a/docs/integrations/sources/microsoft-dynamics-gp.md +++ b/docs/integrations/sources/microsoft-dynamics-gp.md @@ -15,4 +15,3 @@ Reach out to your service representative or system admin to find the parameters ### Output schema To understand your MS Dynamics GP database schema, see the [Microsoft docs](https://docs.microsoft.com/en-us/dynamicsax-2012/developer/tables-overview). Otherwise, the schema will be loaded according to the rules of MSSQL connector. - diff --git a/docs/integrations/sources/microsoft-dynamics-nav.md b/docs/integrations/sources/microsoft-dynamics-nav.md index 8b1a6fabe2a..16913528b8d 100644 --- a/docs/integrations/sources/microsoft-dynamics-nav.md +++ b/docs/integrations/sources/microsoft-dynamics-nav.md @@ -15,4 +15,3 @@ Reach out to your service representative or system admin to find the parameters ### Output schema To understand your MS Dynamics NAV database schema, see the [Microsoft docs](https://docs.microsoft.com/en-us/dynamics-nav-app/). Otherwise, the schema will be loaded according to the rules of MSSQL connector. - diff --git a/docs/integrations/sources/microsoft-onedrive.md b/docs/integrations/sources/microsoft-onedrive.md index f5ac00ffed8..edb2fe26ba5 100644 --- a/docs/integrations/sources/microsoft-onedrive.md +++ b/docs/integrations/sources/microsoft-onedrive.md @@ -4,11 +4,11 @@ This page contains the setup guide and reference information for the Microsoft O ### Requirements -* Application \(client\) ID -* Directory \(tenant\) ID -* Drive name -* Folder Path -* Client secrets +- Application \(client\) ID +- Directory \(tenant\) ID +- Drive name +- Folder Path +- Client secrets ## Setup guide @@ -23,14 +23,14 @@ This page contains the setup guide and reference information for the Microsoft O 5. Enter **Drive Name**. To find your drive name go to settings and at the top of setting menu you can find the name of your drive. 6. Select **Search Scope**. Specifies the location(s) to search for files. Valid options are 'ACCESSIBLE_DRIVES' to search in the selected OneDrive drive, 'SHARED_ITEMS' for shared items the user has access to, and 'ALL' to search both. Default value is 'ALL'. 7. Enter **Folder Path**. Leave empty to search all folders of the drives. This does not apply to shared items. -8. 
The **OAuth2.0** authorization method is selected by default. Click **Authenticate your Microsoft OneDrive account**. Log in and authorize your Microsoft account. -9. For **Start Date**, enter the date in YYYY-MM-DD format. The data added on and after this date will be replicated. +8. The **OAuth2.0** authorization method is selected by default. Click **Authenticate your Microsoft OneDrive account**. Log in and authorize your Microsoft account. +9. For **Start Date**, enter the date in YYYY-MM-DD format. The data added on and after this date will be replicated. 10. Add a stream: 1. Write the **File Type** - 2. In the **Format** box, use the dropdown menu to select the format of the files you'd like to replicate. The supported formats are **CSV**, **Parquet**, **Avro** and **JSONL**. Toggling the **Optional fields** button within the **Format** box will allow you to enter additional configurations based on the selected format. For a detailed breakdown of these settings, refer to the [File Format section](#file-format-settings) below. + 2. In the **Format** box, use the dropdown menu to select the format of the files you'd like to replicate. The supported formats are **CSV**, **Parquet**, **Avro** and **JSONL**. Toggling the **Optional fields** button within the **Format** box will allow you to enter additional configurations based on the selected format. For a detailed breakdown of these settings, refer to the [File Format section](#file-format-settings) below. 3. Give a **Name** to the stream 4. (Optional) - If you want to enforce a specific schema, you can enter a **Input schema**. By default, this value is set to `{}` and will automatically infer the schema from the file\(s\) you are replicating. For details on providing a custom schema, refer to the [User Schema section](#user-schema). - 5. Optionally, enter the **Globs** which dictates which files to be synced. This is a regular expression that allows Airbyte to pattern match the specific files to replicate. If you are replicating all the files within your bucket, use `**` as the pattern. For more precise pattern matching options, refer to the [Path Patterns section](#path-patterns) below. + 5. Optionally, enter the **Globs** which dictates which files to be synced. This is a regular expression that allows Airbyte to pattern match the specific files to replicate. If you are replicating all the files within your bucket, use `**` as the pattern. For more precise pattern matching options, refer to the [Path Patterns section](#path-patterns) below. 11. Click **Set up source** @@ -44,8 +44,8 @@ The Microsoft Graph API uses OAuth for authentication. Microsoft Graph exposes g Microsoft Graph has two types of permissions: -* **Delegated permissions** are used by apps that have a signed-in user present. For these apps, either the user or an administrator consents to the permissions that the app requests, and the app can act as the signed-in user when making calls to Microsoft Graph. Some delegated permissions can be consented by non-administrative users, but some higher-privileged permissions require administrator consent. -* **Application permissions** are used by apps that run without a signed-in user present; for example, apps that run as background services or daemons. Application permissions can only be consented by an administrator. +- **Delegated permissions** are used by apps that have a signed-in user present. 
For these apps, either the user or an administrator consents to the permissions that the app requests, and the app can act as the signed-in user when making calls to Microsoft Graph. Some delegated permissions can be consented by non-administrative users, but some higher-privileged permissions require administrator consent. +- **Application permissions** are used by apps that run without a signed-in user present; for example, apps that run as background services or daemons. Application permissions can only be consented by an administrator. This source requires **Application permissions**. Follow these [instructions](https://docs.microsoft.com/en-us/graph/auth-v2-service?context=graph%2Fapi%2F1.0&view=graph-rest-1.0) for creating an app in the Azure portal. This process will produce the `client_id`, `client_secret`, and `tenant_id` needed for the tap configuration file. @@ -54,23 +54,23 @@ This source requires **Application permissions**. Follow these [instructions](ht 3. Select **App Registrations** 4. Click **New registration** 5. Register an application - 1. Name: + 1. Name: 2. Supported account types: Accounts in this organizational directory only 3. Register \(button\) -6. Record the client\_id and tenant\_id which will be used by the tap for authentication and API integration. +6. Record the client_id and tenant_id which will be used by the tap for authentication and API integration. 7. Select **Certificates & secrets** 8. Provide **Description and Expires** 1. Description: tap-microsoft-onedrive client secret 2. Expires: 1-year 3. Add -9. Copy the client secret value, this will be the client\_secret +9. Copy the client secret value, this will be the client_secret 10. Select **API permissions** 1. Click **Add a permission** 11. Select **Microsoft Graph** 12. Select **Application permissions** 13. Select the following permissions: - 1. Files - * Files.Read.All + 1. Files + - Files.Read.All 14. Click **Add permissions** 15. Click **Grant admin consent** @@ -84,15 +84,15 @@ This source requires **Application permissions**. Follow these [instructions](ht 6. Select **Search Scope**. Specifies the location(s) to search for files. Valid options are 'ACCESSIBLE_DRIVES' to search in the selected OneDrive drive, 'SHARED_ITEMS' for shared items the user has access to, and 'ALL' to search both. Default value is 'ALL'. 7. Enter **Folder Path**. Leave empty to search all folders of the drives. This does not apply to shared items. 8. Switch to **Service Key Authentication** -9. For **User Practical Name**, enter the [UPN](https://learn.microsoft.com/en-us/sharepoint/list-onedrive-urls) for your user. -10. Enter **Tenant ID**, **Client ID** and **Client secret**. -11. For **Start Date**, enter the date in YYYY-MM-DD format. The data added on and after this date will be replicated. +9. For **User Practical Name**, enter the [UPN](https://learn.microsoft.com/en-us/sharepoint/list-onedrive-urls) for your user. +10. Enter **Tenant ID**, **Client ID** and **Client secret**. +11. For **Start Date**, enter the date in YYYY-MM-DD format. The data added on and after this date will be replicated. 12. Add a stream: 1. Write the **File Type** - 2. In the **Format** box, use the dropdown menu to select the format of the files you'd like to replicate. The supported formats are **CSV**, **Parquet**, **Avro** and **JSONL**. Toggling the **Optional fields** button within the **Format** box will allow you to enter additional configurations based on the selected format. 
For a detailed breakdown of these settings, refer to the [File Format section](#file-format-settings) below. + 2. In the **Format** box, use the dropdown menu to select the format of the files you'd like to replicate. The supported formats are **CSV**, **Parquet**, **Avro** and **JSONL**. Toggling the **Optional fields** button within the **Format** box will allow you to enter additional configurations based on the selected format. For a detailed breakdown of these settings, refer to the [File Format section](#file-format-settings) below. 3. Give a **Name** to the stream 4. (Optional) - If you want to enforce a specific schema, you can enter a **Input schema**. By default, this value is set to `{}` and will automatically infer the schema from the file\(s\) you are replicating. For details on providing a custom schema, refer to the [User Schema section](#user-schema). - 5. Optionally, enter the **Globs** which dictates which files to be synced. This is a regular expression that allows Airbyte to pattern match the specific files to replicate. If you are replicating all the files within your bucket, use `**` as the pattern. For more precise pattern matching options, refer to the [Path Patterns section](#path-patterns) below. + 5. Optionally, enter the **Globs** which dictates which files to be synced. This is a regular expression that allows Airbyte to pattern match the specific files to replicate. If you are replicating all the files within your bucket, use `**` as the pattern. For more precise pattern matching options, refer to the [Path Patterns section](#path-patterns) below. 13. Click **Set up source** @@ -101,8 +101,8 @@ This source requires **Application permissions**. Follow these [instructions](ht ### Data type mapping -| Integration Type | Airbyte Type | -|:-----------------|:-------------| +| Integration Type | Airbyte Type | +| :--------------- | :----------- | | `string` | `string` | | `number` | `number` | | `array` | `array` | @@ -110,10 +110,10 @@ This source requires **Application permissions**. 
Follow these [instructions](ht ### Features -| Feature | Supported?\(Yes/No\) | -|:------------------------------|:---------------------| -| Full Refresh Sync | Yes | -| Incremental Sync | Yes | +| Feature | Supported?\(Yes/No\) | +| :---------------- | :------------------- | +| Full Refresh Sync | Yes | +| Incremental Sync | Yes | ### Performance considerations @@ -121,16 +121,16 @@ The connector is restricted by normal Microsoft Graph [requests limitation](http ## Changelog -| Version | Date | Pull Request | Subject | -|:--------|:-----------|:---------------------------------------------------------|:------------------------------------------------------------------------------------------------| -| 0.2.0 | 2024-03-12 | [35849](https://github.com/airbytehq/airbyte/pull/35849) | Add fetching shared items | -| 0.1.9 | 2024-03-11 | [35956](https://github.com/airbytehq/airbyte/pull/35956) | Pin `transformers` transitive dependency | -| 0.1.8 | 2024-03-06 | [35858](https://github.com/airbytehq/airbyte/pull/35858) | Bump poetry.lock to upgrade transitive dependency | -| 0.1.7 | 2024-03-04 | [35584](https://github.com/airbytehq/airbyte/pull/35584) | Enable in Cloud | -| 0.1.6 | 2024-02-06 | [34936](https://github.com/airbytehq/airbyte/pull/34936) | Bump CDK version to avoid missing SyncMode errors | -| 0.1.5 | 2024-01-30 | [34681](https://github.com/airbytehq/airbyte/pull/34681) | Unpin CDK version to make compatible with the Concurrent CDK | -| 0.1.4 | 2024-01-30 | [34661](https://github.com/airbytehq/airbyte/pull/34661) | Pin CDK version until upgrade for compatibility with the Concurrent CDK | -| 0.1.3 | 2024-01-24 | [34478](https://github.com/airbytehq/airbyte/pull/34478) | Fix OAuth | -| 0.1.2 | 2021-12-22 | [33745](https://github.com/airbytehq/airbyte/pull/33745) | Add ql and sl to metadata | -| 0.1.1 | 2021-12-15 | [33758](https://github.com/airbytehq/airbyte/pull/33758) | Fix for docs name | -| 0.1.0 | 2021-12-06 | [32655](https://github.com/airbytehq/airbyte/pull/32655) | New source | +| Version | Date | Pull Request | Subject | +| :------ | :--------- | :------------------------------------------------------- | :---------------------------------------------------------------------- | +| 0.2.0 | 2024-03-12 | [35849](https://github.com/airbytehq/airbyte/pull/35849) | Add fetching shared items | +| 0.1.9 | 2024-03-11 | [35956](https://github.com/airbytehq/airbyte/pull/35956) | Pin `transformers` transitive dependency | +| 0.1.8 | 2024-03-06 | [35858](https://github.com/airbytehq/airbyte/pull/35858) | Bump poetry.lock to upgrade transitive dependency | +| 0.1.7 | 2024-03-04 | [35584](https://github.com/airbytehq/airbyte/pull/35584) | Enable in Cloud | +| 0.1.6 | 2024-02-06 | [34936](https://github.com/airbytehq/airbyte/pull/34936) | Bump CDK version to avoid missing SyncMode errors | +| 0.1.5 | 2024-01-30 | [34681](https://github.com/airbytehq/airbyte/pull/34681) | Unpin CDK version to make compatible with the Concurrent CDK | +| 0.1.4 | 2024-01-30 | [34661](https://github.com/airbytehq/airbyte/pull/34661) | Pin CDK version until upgrade for compatibility with the Concurrent CDK | +| 0.1.3 | 2024-01-24 | [34478](https://github.com/airbytehq/airbyte/pull/34478) | Fix OAuth | +| 0.1.2 | 2021-12-22 | [33745](https://github.com/airbytehq/airbyte/pull/33745) | Add ql and sl to metadata | +| 0.1.1 | 2021-12-15 | [33758](https://github.com/airbytehq/airbyte/pull/33758) | Fix for docs name | +| 0.1.0 | 2021-12-06 | [32655](https://github.com/airbytehq/airbyte/pull/32655) | New source | diff 
--git a/docs/integrations/sources/microsoft-sharepoint.md b/docs/integrations/sources/microsoft-sharepoint.md index 298fc6b06fa..c80182ad49c 100644 --- a/docs/integrations/sources/microsoft-sharepoint.md +++ b/docs/integrations/sources/microsoft-sharepoint.md @@ -1,4 +1,5 @@ # Microsoft SharePoint + This page contains the setup guide and reference information for the Microsoft SharePoint source connector. @@ -6,11 +7,11 @@ This page contains the setup guide and reference information for the Microsoft S ### Requirements -* Application \(client\) ID -* Directory \(tenant\) ID -* Drive name -* Folder Path -* Client secrets +- Application \(client\) ID +- Directory \(tenant\) ID +- Drive name +- Folder Path +- Client secrets ## Setup guide @@ -50,8 +51,8 @@ The Microsoft Graph API uses OAuth for authentication. Microsoft Graph exposes g Microsoft Graph has two types of permissions: -* **Delegated permissions** are used by apps that have a signed-in user present. For these apps, either the user or an administrator consents to the permissions that the app requests, and the app can act as the signed-in user when making calls to Microsoft Graph. Some delegated permissions can be consented by non-administrative users, but some higher-privileged permissions require administrator consent. -* **Application permissions** are used by apps that run without a signed-in user present; for example, apps that run as background services or daemons. Application permissions can only be consented by an administrator. +- **Delegated permissions** are used by apps that have a signed-in user present. For these apps, either the user or an administrator consents to the permissions that the app requests, and the app can act as the signed-in user when making calls to Microsoft Graph. Some delegated permissions can be consented by non-administrative users, but some higher-privileged permissions require administrator consent. +- **Application permissions** are used by apps that run without a signed-in user present; for example, apps that run as background services or daemons. Application permissions can only be consented by an administrator. This source requires **Application permissions**. Follow these [instructions](https://docs.microsoft.com/en-us/graph/auth-v2-service?context=graph%2Fapi%2F1.0&view=graph-rest-1.0) for creating an app in the Azure portal. This process will produce the `client_id`, `client_secret`, and `tenant_id` needed for the tap configuration file. @@ -60,23 +61,23 @@ This source requires **Application permissions**. Follow these [instructions](ht 3. Select **App Registrations** 4. Click **New registration** 5. Register an application - 1. Name: + 1. Name: 2. Supported account types: Accounts in this organizational directory only 3. Register \(button\) -6. Record the client\_id and tenant\_id which will be used by the tap for authentication and API integration. +6. Record the client_id and tenant_id which will be used by the tap for authentication and API integration. 7. Select **Certificates & secrets** 8. Provide **Description and Expires** 1. Description: tap-microsoft-teams client secret 2. Expires: 1-year 3. Add -9. Copy the client secret value, this will be the client\_secret +9. Copy the client secret value, this will be the client_secret 10. Select **API permissions** 1. Click **Add a permission** 11. Select **Microsoft Graph** 12. Select **Application permissions** 13. Select the following permissions: - 1. Files - * Files.Read.All + 1. Files + - Files.Read.All 14. 
Click **Add permissions** 15. Click **Grant admin consent** @@ -90,15 +91,15 @@ This source requires **Application permissions**. Follow these [instructions](ht 6. Select **Search Scope**. Specifies the location(s) to search for files. Valid options are 'ACCESSIBLE_DRIVES' for all SharePoint drives the user can access, 'SHARED_ITEMS' for shared items the user has access to, and 'ALL' to search both. Default value is 'ALL'. 7. Enter **Folder Path**. Leave empty to search all folders of the drives. This does not apply to shared items. 8. Switch to **Service Key Authentication** -9. For **User Practical Name**, enter the [UPN](https://learn.microsoft.com/en-us/sharepoint/list-onedrive-urls) for your user. -10. Enter **Tenant ID**, **Client ID** and **Client secret**. -11. For **Start Date**, enter the date in YYYY-MM-DD format. The data added on and after this date will be replicated. +9. For **User Practical Name**, enter the [UPN](https://learn.microsoft.com/en-us/sharepoint/list-onedrive-urls) for your user. +10. Enter **Tenant ID**, **Client ID** and **Client secret**. +11. For **Start Date**, enter the date in YYYY-MM-DD format. The data added on and after this date will be replicated. 12. Add a stream: 1. Write the **File Type** - 2. In the **Format** box, use the dropdown menu to select the format of the files you'd like to replicate. The supported formats are **CSV**, **Parquet**, **Avro** and **JSONL**. Toggling the **Optional fields** button within the **Format** box will allow you to enter additional configurations based on the selected format. For a detailed breakdown of these settings, refer to the [File Format section](#file-format-settings) below. + 2. In the **Format** box, use the dropdown menu to select the format of the files you'd like to replicate. The supported formats are **CSV**, **Parquet**, **Avro** and **JSONL**. Toggling the **Optional fields** button within the **Format** box will allow you to enter additional configurations based on the selected format. For a detailed breakdown of these settings, refer to the [File Format section](#file-format-settings) below. 3. Give a **Name** to the stream 4. (Optional) - If you want to enforce a specific schema, you can enter a **Input schema**. By default, this value is set to `{}` and will automatically infer the schema from the file\(s\) you are replicating. For details on providing a custom schema, refer to the [User Schema section](#user-schema). - 5. Optionally, enter the **Globs** which dictates which files to be synced. This is a regular expression that allows Airbyte to pattern match the specific files to replicate. If you are replicating all the files within your bucket, use `**` as the pattern. For more precise pattern matching options, refer to the [Path Patterns section](#path-patterns) below. + 5. Optionally, enter the **Globs** which dictates which files to be synced. This is a regular expression that allows Airbyte to pattern match the specific files to replicate. If you are replicating all the files within your bucket, use `**` as the pattern. For more precise pattern matching options, refer to the [Path Patterns section](#path-patterns) below. 13. Click **Set up source** @@ -109,8 +110,8 @@ This source requires **Application permissions**. 
Follow these [instructions](ht ### Data type mapping -| Integration Type | Airbyte Type | -|:-----------------|:-------------| +| Integration Type | Airbyte Type | +| :--------------- | :----------- | | `string` | `string` | | `number` | `number` | | `array` | `array` | @@ -118,10 +119,10 @@ This source requires **Application permissions**. Follow these [instructions](ht ### Features -| Feature | Supported?\(Yes/No\) | -|:------------------------------|:---------------------| -| Full Refresh Sync | Yes | -| Incremental Sync | Yes | +| Feature | Supported?\(Yes/No\) | +| :---------------- | :------------------- | +| Full Refresh Sync | Yes | +| Incremental Sync | Yes | ### Performance considerations @@ -130,7 +131,7 @@ The connector is restricted by normal Microsoft Graph [requests limitation](http ## Changelog | Version | Date | Pull Request | Subject | -|:--------|:-----------|:---------------------------------------------------------|:----------------------------| +| :------ | :--------- | :------------------------------------------------------- | :-------------------------- | | 0.2.3 | 2024-04-17 | [37372](https://github.com/airbytehq/airbyte/pull/37372) | Make refresh token optional | | 0.2.2 | 2024-03-28 | [36573](https://github.com/airbytehq/airbyte/pull/36573) | Update QL to 400 | | 0.2.1 | 2024-03-22 | [36381](https://github.com/airbytehq/airbyte/pull/36381) | Unpin CDK | diff --git a/docs/integrations/sources/microsoft-teams.md b/docs/integrations/sources/microsoft-teams.md index 7e306bffaf0..660e92c61f5 100644 --- a/docs/integrations/sources/microsoft-teams.md +++ b/docs/integrations/sources/microsoft-teams.md @@ -10,18 +10,18 @@ There are currently 2 versions of [Microsoft Graph REST APIs](https://docs.micro This Source is capable of syncing the following core Streams: -* [users](https://docs.microsoft.com/en-us/graph/api/user-list?view=graph-rest-beta&tabs=http) -* [groups](https://docs.microsoft.com/en-us/graph/teams-list-all-teams?context=graph%2Fapi%2F1.0&view=graph-rest-1.0) -* [group\_members](https://docs.microsoft.com/en-us/graph/api/group-list-members?view=graph-rest-1.0&tabs=http) -* [group\_owners](https://docs.microsoft.com/en-us/graph/api/group-list-owners?view=graph-rest-1.0&tabs=http) -* [channels](https://docs.microsoft.com/en-us/graph/api/channel-list?view=graph-rest-1.0&tabs=http) -* [channel\_members](https://docs.microsoft.com/en-us/graph/api/channel-list-members?view=graph-rest-1.0&tabs=http) -* [channel\_tabs](https://docs.microsoft.com/en-us/graph/api/channel-list-tabs?view=graph-rest-1.0&tabs=http) -* [conversations](https://docs.microsoft.com/en-us/graph/api/group-list-conversations?view=graph-rest-beta&tabs=http) -* [conversation\_threads](https://docs.microsoft.com/en-us/graph/api/conversation-list-threads?view=graph-rest-beta&tabs=http) -* [conversation\_posts](https://docs.microsoft.com/en-us/graph/api/conversationthread-list-posts?view=graph-rest-beta&tabs=http) -* [team\_drives](https://docs.microsoft.com/en-us/graph/api/drive-get?view=graph-rest-beta&tabs=http#get-the-document-library-associated-with-a-group) -* [team\_device\_usage\_report](https://docs.microsoft.com/en-us/graph/api/reportroot-getteamsdeviceusageuserdetail?view=graph-rest-1.0) +- [users](https://docs.microsoft.com/en-us/graph/api/user-list?view=graph-rest-beta&tabs=http) +- [groups](https://docs.microsoft.com/en-us/graph/teams-list-all-teams?context=graph%2Fapi%2F1.0&view=graph-rest-1.0) +- 
[group_members](https://docs.microsoft.com/en-us/graph/api/group-list-members?view=graph-rest-1.0&tabs=http) +- [group_owners](https://docs.microsoft.com/en-us/graph/api/group-list-owners?view=graph-rest-1.0&tabs=http) +- [channels](https://docs.microsoft.com/en-us/graph/api/channel-list?view=graph-rest-1.0&tabs=http) +- [channel_members](https://docs.microsoft.com/en-us/graph/api/channel-list-members?view=graph-rest-1.0&tabs=http) +- [channel_tabs](https://docs.microsoft.com/en-us/graph/api/channel-list-tabs?view=graph-rest-1.0&tabs=http) +- [conversations](https://docs.microsoft.com/en-us/graph/api/group-list-conversations?view=graph-rest-beta&tabs=http) +- [conversation_threads](https://docs.microsoft.com/en-us/graph/api/conversation-list-threads?view=graph-rest-beta&tabs=http) +- [conversation_posts](https://docs.microsoft.com/en-us/graph/api/conversationthread-list-posts?view=graph-rest-beta&tabs=http) +- [team_drives](https://docs.microsoft.com/en-us/graph/api/drive-get?view=graph-rest-beta&tabs=http#get-the-document-library-associated-with-a-group) +- [team_device_usage_report](https://docs.microsoft.com/en-us/graph/api/reportroot-getteamsdeviceusageuserdetail?view=graph-rest-1.0) If there are more endpoints you'd like Airbyte to support, please [create an issue.](https://github.com/airbytehq/airbyte/issues/new/choose) @@ -56,9 +56,9 @@ The connector is restricted by normal Microsoft Graph [requests limitation](http ### Requirements -* Application \(client\) ID -* Directory \(tenant\) ID -* Client secrets +- Application \(client\) ID +- Directory \(tenant\) ID +- Client secrets ### Setup guide @@ -66,8 +66,8 @@ The Microsoft Graph API uses OAuth for authentication. Microsoft Graph exposes g Microsoft Graph has two types of permissions: -* **Delegated permissions** are used by apps that have a signed-in user present. For these apps, either the user or an administrator consents to the permissions that the app requests, and the app can act as the signed-in user when making calls to Microsoft Graph. Some delegated permissions can be consented by non-administrative users, but some higher-privileged permissions require administrator consent. -* **Application permissions** are used by apps that run without a signed-in user present; for example, apps that run as background services or daemons. Application permissions can only be consented by an administrator. +- **Delegated permissions** are used by apps that have a signed-in user present. For these apps, either the user or an administrator consents to the permissions that the app requests, and the app can act as the signed-in user when making calls to Microsoft Graph. Some delegated permissions can be consented by non-administrative users, but some higher-privileged permissions require administrator consent. +- **Application permissions** are used by apps that run without a signed-in user present; for example, apps that run as background services or daemons. Application permissions can only be consented by an administrator. This source requires **Application permissions**. Follow these [instructions](https://docs.microsoft.com/en-us/graph/auth-v2-service?context=graph%2Fapi%2F1.0&view=graph-rest-1.0) for creating an app in the Azure portal. This process will produce the `client_id`, `client_secret`, and `tenant_id` needed for the tap configuration file. @@ -76,83 +76,83 @@ This source requires **Application permissions**. Follow these [instructions](ht 3. Select App Registrations 4. Click New registration 5. Register an application - 1. 
Name: + 1. Name: 2. Supported account types: Accounts in this organizational directory only 3. Register \(button\) -6. Record the client\_id, tenant\_id, and which will be used by the tap for authentication and API integration. +6. Record the client_id, tenant_id, and which will be used by the tap for authentication and API integration. 7. Select Certificates & secrets 8. Provide Description and Expires 1. Description: tap-microsoft-teams client secret 2. Expires: 1-year 3. Add -9. Copy the client secret value, this will be the client\_secret +9. Copy the client secret value, this will be the client_secret 10. Select API permissions 1. Click Add a permission 11. Select Microsoft Graph 12. Select Application permissions 13. Select the following permissions: - 1. Users - * User.Read.All - * User.ReadWrite.All - * Directory.Read.All - * Directory.ReadWrite.All + 1. Users + - User.Read.All + - User.ReadWrite.All + - Directory.Read.All + - Directory.ReadWrite.All 2. Groups - * GroupMember.Read.All - * Group.Read.All - * Directory.Read.All - * Group.ReadWrite.All - * Directory.ReadWrite.All + - GroupMember.Read.All + - Group.Read.All + - Directory.Read.All + - Group.ReadWrite.All + - Directory.ReadWrite.All 3. Group members - * GroupMember.Read.All - * Group.Read.All - * Directory.Read.All + - GroupMember.Read.All + - Group.Read.All + - Directory.Read.All 4. Group owners - * Group.Read.All - * User.Read.All - * Group.Read.All - * User.ReadWrite.All - * Group.Read.All - * User.Read.All - * Application.Read.All + - Group.Read.All + - User.Read.All + - Group.Read.All + - User.ReadWrite.All + - Group.Read.All + - User.Read.All + - Application.Read.All 5. Channels - * ChannelSettings.Read.Group - * ChannelSettings.ReadWrite.Group - * Channel.ReadBasic.All - * ChannelSettings.Read.All - * ChannelSettings.ReadWrite.All - * Group.Read.All - * Group.ReadWrite.All - * Directory.Read.All - * Directory.ReadWrite.All + - ChannelSettings.Read.Group + - ChannelSettings.ReadWrite.Group + - Channel.ReadBasic.All + - ChannelSettings.Read.All + - ChannelSettings.ReadWrite.All + - Group.Read.All + - Group.ReadWrite.All + - Directory.Read.All + - Directory.ReadWrite.All 6. Channel members - * ChannelMember.Read.All - * ChannelMember.ReadWrite.All + - ChannelMember.Read.All + - ChannelMember.ReadWrite.All 7. Channel tabs - * TeamsTab.Read.Group - * TeamsTab.ReadWrite.Group - * TeamsTab.Read.All - * TeamsTab.ReadWriteForTeam.All - * TeamsTab.ReadWrite.All - * Group.Read.All - * Group.ReadWrite.All - * Directory.Read.All - * Directory.ReadWrite.All + - TeamsTab.Read.Group + - TeamsTab.ReadWrite.Group + - TeamsTab.Read.All + - TeamsTab.ReadWriteForTeam.All + - TeamsTab.ReadWrite.All + - Group.Read.All + - Group.ReadWrite.All + - Directory.Read.All + - Directory.ReadWrite.All 8. Conversations - * Group.Read.All - * Group.ReadWrite.All + - Group.Read.All + - Group.ReadWrite.All 9. Conversation threads - * Group.Read.All - * Group.ReadWrite.All + - Group.Read.All + - Group.ReadWrite.All 10. Conversation posts - * Group.Read.All - * Group.ReadWrite.All + - Group.Read.All + - Group.ReadWrite.All 11. Team drives - * Files.Read.All - * Files.ReadWrite.All - * Sites.Read.All - * Sites.ReadWrite.All + - Files.Read.All + - Files.ReadWrite.All + - Sites.Read.All + - Sites.ReadWrite.All 12. Team device usage report - * Reports.Read.All + - Reports.Read.All 14. 
Click Add permissions Token acquiring implemented by [instantiate](https://docs.microsoft.com/en-us/azure/active-directory/develop/scenario-daemon-app-configuration?tabs=python#instantiate-the-msal-application) the confidential client application with a client secret and [calling](https://docs.microsoft.com/en-us/azure/active-directory/develop/scenario-daemon-acquire-token?tabs=python) AcquireTokenForClient from [Microsoft Authentication Library \(MSAL\) for Python](https://github.com/AzureAD/microsoft-authentication-library-for-python) @@ -160,7 +160,7 @@ Token acquiring implemented by [instantiate](https://docs.microsoft.com/en-us/az ## CHANGELOG | Version | Date | Pull Request | Subject | -|:------- |:---------- | :------------------------------------------------------- | :----------------------------- | +| :------ | :--------- | :------------------------------------------------------- | :----------------------------- | | 1.1.0 | 2024-03-24 | [36223](https://github.com/airbytehq/airbyte/pull/36223) | Migration to low code | | 1.0.0 | 2024-01-04 | [33959](https://github.com/airbytehq/airbyte/pull/33959) | Schema updates | | 0.2.5 | 2021-12-14 | [8429](https://github.com/airbytehq/airbyte/pull/8429) | Update titles and descriptions | diff --git a/docs/integrations/sources/mixpanel.md b/docs/integrations/sources/mixpanel.md index d7d041844f0..8e1b0c452d8 100644 --- a/docs/integrations/sources/mixpanel.md +++ b/docs/integrations/sources/mixpanel.md @@ -54,7 +54,7 @@ Syncing huge date windows may take longer due to Mixpanel's low API rate-limits ## CHANGELOG | Version | Date | Pull Request | Subject | -|:--------|:-----------|:---------------------------------------------------------|:------------------------------------------------------------------------------------------------------------| +| :------ | :--------- | :------------------------------------------------------- | :---------------------------------------------------------------------------------------------------------- | | 2.2.2 | 2024-04-19 | [36651](https://github.com/airbytehq/airbyte/pull/36651) | Updating to 0.80.0 CDK | | 2.2.1 | 2024-04-12 | [36651](https://github.com/airbytehq/airbyte/pull/36651) | Schema descriptions | | 2.2.0 | 2024-03-19 | [36267](https://github.com/airbytehq/airbyte/pull/36267) | Pin airbyte-cdk version to `^0` | diff --git a/docs/integrations/sources/monday-migrations.md b/docs/integrations/sources/monday-migrations.md index 9d095b9e127..fceec380dea 100644 --- a/docs/integrations/sources/monday-migrations.md +++ b/docs/integrations/sources/monday-migrations.md @@ -11,18 +11,18 @@ Source Monday has deprecated API version 2023-07. We have upgraded the connector Airbyte Open Source users must manually update the connector image in their local registry before proceeding with the migration. To do so: 1. Select **Settings** in the main navbar. - 1. Select **Sources**. -2. Find Monday in the list of connectors. + 1. Select **Sources**. +2. Find Monday in the list of connectors. :::note You will see two versions listed, the current in-use version and the latest version available. -::: +::: 3. Select **Change** to update your OSS version to the latest available version. ### Update the connector version -1. Select **Sources** in the main navbar. +1. Select **Sources** in the main navbar. 2. Select the instance of the connector you wish to upgrade. :::note @@ -30,48 +30,40 @@ Each instance of the connector must be updated separately. If you have created m ::: 3. Select **Upgrade** - 1. 
Follow the prompt to confirm you are ready to upgrade to the new version. - + 1. Follow the prompt to confirm you are ready to upgrade to the new version. ### Refresh schemas and reset data 1. Select **Connections** in the main navbar. 2. Select the connection(s) affected by the update. -3. Select the **Replication** tab. - 1. Select **Refresh source schema**. - 2. Select **OK**. -:::note -Any detected schema changes will be listed for your review. -::: -4. Select **Save changes** at the bottom of the page. - 1. Ensure the **Reset all streams** option is checked. -5. Select **Save connection**. -:::note -This will reset the data in your destination and initiate a fresh sync. -::: +3. Select the **Replication** tab. 1. Select **Refresh source schema**. 2. Select **OK**. + :::note + Any detected schema changes will be listed for your review. + ::: +4. Select **Save changes** at the bottom of the page. + 1. Ensure the **Reset all streams** option is checked. +5. Select **Save connection**. + :::note + This will reset the data in your destination and initiate a fresh sync. + ::: For more information on resetting your data in Airbyte, see [this page](https://docs.airbyte.com/operator-guides/reset). - ### Refresh affected schemas and reset data 1. Select **Connections** in the main navb nar. - 1. Select the connection(s) affected by the update. -2. Select the **Replication** tab. - 1. Select **Refresh source schema**. - 2. Select **OK**. -:::note -Any detected schema changes will be listed for your review. -::: -3. Select **Save changes** at the bottom of the page. - 1. Ensure the **Reset affected streams** option is checked. -:::note -Depending on destination type you may not be prompted to reset your data. -::: -4. Select **Save connection**. -:::note -This will reset the data in your destination and initiate a fresh sync. -::: + 1. Select the connection(s) affected by the update. +2. Select the **Replication** tab. 1. Select **Refresh source schema**. 2. Select **OK**. + :::note + Any detected schema changes will be listed for your review. + ::: +3. Select **Save changes** at the bottom of the page. 1. Ensure the **Reset affected streams** option is checked. + :::note + Depending on destination type you may not be prompted to reset your data. + ::: +4. Select **Save connection**. + :::note + This will reset the data in your destination and initiate a fresh sync. + ::: For more information on resetting your data in Airbyte, see [this page](https://docs.airbyte.com/operator-guides/reset). - diff --git a/docs/integrations/sources/monday.md b/docs/integrations/sources/monday.md index 62d67d92722..0aad5bf8360 100644 --- a/docs/integrations/sources/monday.md +++ b/docs/integrations/sources/monday.md @@ -4,7 +4,7 @@ This page contains the setup guide and reference information for the [Monday](ht ## Prerequisites -* Monday API Token / Monday Access Token +- Monday API Token / Monday Access Token You can find your Oauth application in Monday main page -> Profile picture (bottom left corner) -> Developers -> My Apps -> Select your app. @@ -33,7 +33,7 @@ You can get the API token for Monday by going to Profile picture (bottom left co The Monday source connector supports the following features: | Feature | Supported? 
| -|:------------------|:-----------| +| :---------------- | :--------- | | Full Refresh Sync | Yes | | Incremental Sync | Yes | | SSL connection | No | @@ -43,28 +43,28 @@ The Monday source connector supports the following features: Several output streams are available from this source: -* [Activity logs](https://developer.monday.com/api-reference/docs/activity-logs) -* [Items](https://developer.monday.com/api-reference/docs/items-queries) -* [Boards](https://developer.monday.com/api-reference/docs/groups-queries#groups-queries) -* [Teams](https://developer.monday.com/api-reference/docs/teams-queries) -* [Updates](https://developer.monday.com/api-reference/docs/updates-queries) -* [Users](https://developer.monday.com/api-reference/docs/users-queries-1) -* [Tags](https://developer.monday.com/api-reference/docs/tags-queries) -* [Workspaces](https://developer.monday.com/api-reference/docs/workspaces) +- [Activity logs](https://developer.monday.com/api-reference/docs/activity-logs) +- [Items](https://developer.monday.com/api-reference/docs/items-queries) +- [Boards](https://developer.monday.com/api-reference/docs/groups-queries#groups-queries) +- [Teams](https://developer.monday.com/api-reference/docs/teams-queries) +- [Updates](https://developer.monday.com/api-reference/docs/updates-queries) +- [Users](https://developer.monday.com/api-reference/docs/users-queries-1) +- [Tags](https://developer.monday.com/api-reference/docs/tags-queries) +- [Workspaces](https://developer.monday.com/api-reference/docs/workspaces) Important Notes: -* `Columns` are available from the `Boards` stream. By syncing the `Boards` stream you will get the `Columns` for each `Board` synced in the database -The typical name of the table depends on the `destination` you use like `boards.columns`, for instance. +- `Columns` are available from the `Boards` stream. By syncing the `Boards` stream you will get the `Columns` for each `Board` synced in the database + The typical name of the table depends on the `destination` you use like `boards.columns`, for instance. -* `Column Values` are available from the `Items` stream. By syncing the `Items` stream you will get the `Column Values` for each `Item` (row) of the board. -The typical name of the table depends on the `destination` you use like `items.column_values`, for instance. -If there are more endpoints you'd like Airbyte to support, please [create an issue.](https://github.com/airbytehq/airbyte/issues/new/choose) +- `Column Values` are available from the `Items` stream. By syncing the `Items` stream you will get the `Column Values` for each `Item` (row) of the board. + The typical name of the table depends on the `destination` you use like `items.column_values`, for instance. + If there are more endpoints you'd like Airbyte to support, please [create an issue.](https://github.com/airbytehq/airbyte/issues/new/choose) -* Incremental sync for `Items` and `Boards` streams is done using the `Activity logs` stream. -Ids of boards and items are extracted from activity logs events and used to selectively sync boards and items. -Some data may be lost if the time between incremental syncs is longer than the activity logs retention time for your plan. -Check your Monday plan at https://monday.com/pricing. +- Incremental sync for `Items` and `Boards` streams is done using the `Activity logs` stream. + Ids of boards and items are extracted from activity logs events and used to selectively sync boards and items. 
+ Some data may be lost if the time between incremental syncs is longer than the activity logs retention time for your plan. + Check your Monday plan at https://monday.com/pricing. ## Performance considerations @@ -73,7 +73,7 @@ The Monday connector should not run into Monday API limitations under normal usa ## Changelog | Version | Date | Pull Request | Subject | -|:--------|:-----------|:----------------------------------------------------------|:--------------------------------------------------------------------------------------------------| +| :------ | :--------- | :-------------------------------------------------------- | :------------------------------------------------------------------------------------------------ | | 2.1.2 | 2024-04-30 | [37722](https://github.com/airbytehq/airbyte/pull/37722) | Fetch `display_value` field for column values of `Mirror`, `Dependency` and `Connect Board` types | | 2.1.1 | 2024-04-05 | [36717](https://github.com/airbytehq/airbyte/pull/36717) | Add handling of complexityBudgetExhausted error. | | 2.1.0 | 2024-04-03 | [36746](https://github.com/airbytehq/airbyte/pull/36746) | Pin airbyte-cdk version to `^0` | @@ -88,8 +88,8 @@ The Monday connector should not run into Monday API limitations under normal usa | 1.1.1 | 2023-08-15 | [29429](https://github.com/airbytehq/airbyte/pull/29429) | Ignore `null` records in response | | 1.1.0 | 2023-07-05 | [27944](https://github.com/airbytehq/airbyte/pull/27944) | Add incremental sync for Items and Boards streams | | 1.0.0 | 2023-06-20 | [27410](https://github.com/airbytehq/airbyte/pull/27410) | Add new streams: Tags, Workspaces. Add new fields for existing streams. | -| 0.2.6 | 2023-06-12 | [27244](https://github.com/airbytehq/airbyte/pull/27244) | Added http error handling for `403` and `500` HTTP errors | -| 0.2.5 | 2023-05-22 | [225881](https://github.com/airbytehq/airbyte/pull/25881) | Fix pagination for the items stream | +| 0.2.6 | 2023-06-12 | [27244](https://github.com/airbytehq/airbyte/pull/27244) | Added http error handling for `403` and `500` HTTP errors | +| 0.2.5 | 2023-05-22 | [225881](https://github.com/airbytehq/airbyte/pull/25881) | Fix pagination for the items stream | | 0.2.4 | 2023-04-26 | [25277](https://github.com/airbytehq/airbyte/pull/25277) | Increase row limit to 100 | | 0.2.3 | 2023-03-06 | [23231](https://github.com/airbytehq/airbyte/pull/23231) | Publish using low-code CDK Beta version | | 0.2.2 | 2023-01-04 | [20996](https://github.com/airbytehq/airbyte/pull/20996) | Fix json schema loader | diff --git a/docs/integrations/sources/mongodb-v2-migrations.md b/docs/integrations/sources/mongodb-v2-migrations.md index 93211e70e93..610c2493773 100644 --- a/docs/integrations/sources/mongodb-v2-migrations.md +++ b/docs/integrations/sources/mongodb-v2-migrations.md @@ -2,16 +2,16 @@ ## Upgrading to 1.0.0 -This version introduces a general availability version of the MongoDB V2 source connector, which leverages -[Change Data Capture (CDC)](https://docs.airbyte.com/understanding-airbyte/cdc) to improve the performance and -reliability of syncs. This version provides better error handling, incremental delivery of data and improved -reliability of large syncs via frequent checkpointing. +This version introduces a general availability version of the MongoDB V2 source connector, which leverages +[Change Data Capture (CDC)](https://docs.airbyte.com/understanding-airbyte/cdc) to improve the performance and +reliability of syncs. 
This version provides better error handling, incremental delivery of data and improved +reliability of large syncs via frequent checkpointing. **THIS VERSION INCLUDES BREAKING CHANGES FROM PREVIOUS VERSIONS OF THE CONNECTOR!** -The changes will require you to reconfigure your existing MongoDB V2 configured source connectors. To review the +The changes will require you to reconfigure your existing MongoDB V2 configured source connectors. To review the breaking changes and to learn how to upgrade the connector, refer to the [MongoDB V2 source connector documentation](mongodb-v2#upgrade-from-previous-version). -Additionally, you can manually update existing connections prior to the next scheduled sync to perform the upgrade or +Additionally, you can manually update existing connections prior to the next scheduled sync to perform the upgrade or re-create the source using the new configuration. Worthy of specific mention, this version includes: @@ -22,4 +22,4 @@ Worthy of specific mention, this version includes: - Sampling of fields for schema discovery - Required SSL/TLS connections -Learn more about what's new in the connection, view the updated documentation [here](mongodb-v2). \ No newline at end of file +Learn more about what's new in the connection, view the updated documentation [here](mongodb-v2). diff --git a/docs/integrations/sources/mongodb-v2.md b/docs/integrations/sources/mongodb-v2.md index a5431a1432d..e3806fbdbea 100644 --- a/docs/integrations/sources/mongodb-v2.md +++ b/docs/integrations/sources/mongodb-v2.md @@ -2,13 +2,13 @@ Airbyte's certified MongoDB connector offers the following features: -* [Change Data Capture (CDC)](https://docs.airbyte.com/understanding-airbyte/cdc) via [MongoDB's change streams](https://www.mongodb.com/docs/manual/changeStreams/)/[Replica Set Oplog](https://www.mongodb.com/docs/manual/core/replica-set-oplog/). -* Reliable replication of any collection size with [checkpointing](https://docs.airbyte.com/understanding-airbyte/airbyte-protocol/#state--checkpointing) and chunking of data reads. -* ***NEW*** Full refresh syncing of collections. +- [Change Data Capture (CDC)](https://docs.airbyte.com/understanding-airbyte/cdc) via [MongoDB's change streams](https://www.mongodb.com/docs/manual/changeStreams/)/[Replica Set Oplog](https://www.mongodb.com/docs/manual/core/replica-set-oplog/). +- Reliable replication of any collection size with [checkpointing](https://docs.airbyte.com/understanding-airbyte/airbyte-protocol/#state--checkpointing) and chunking of data reads. +- **_NEW_** Full refresh syncing of collections. ## Quick Start -This section provides information about configuring the MongoDB V2 source connector. If you are upgrading from a +This section provides information about configuring the MongoDB V2 source connector. If you are upgrading from a previous version of the MongoDB V2 source connector, please refer to the [upgrade](#upgrade-from-previous-version) instructions in this document. @@ -24,7 +24,7 @@ Once this is complete, you will be able to select MongoDB as a source for replic #### Step 1: Create a dedicated read-only MongoDB user -These steps create a dedicated, read-only user for replicating data. Alternatively, you can use an existing MongoDB user with +These steps create a dedicated, read-only user for replicating data. Alternatively, you can use an existing MongoDB user with access to the database. ##### MongoDB Atlas @@ -60,19 +60,24 @@ access to the database. 
##### Self Hosted -These instructions assume that the [MongoDB shell](https://www.mongodb.com/docs/mongodb-shell/) is installed. To +These instructions assume that the [MongoDB shell](https://www.mongodb.com/docs/mongodb-shell/) is installed. To install the MongoDB shell, please follow [these instructions](https://www.mongodb.com/docs/mongodb-shell/install/#std-label-mdb-shell-install). 1. From a terminal window, launch the MongoDB shell: + ```shell > mongosh --username ; -``` +``` + 2. Switch to the `admin` database: + ```shell test> use admin switched to db admin ``` + 3. Create the `READ_ONLY_USER` user with the `read` role: + ```shell admin> db.createUser({user: "READ_ONLY_USER", pwd: "READ_ONLY_PASSWORD", roles: [{role: "read", db: "TARGET_DATABASE"}]}) ``` @@ -81,7 +86,8 @@ admin> db.createUser({user: "READ_ONLY_USER", pwd: "READ_ONLY_PASSWORD", roles: Replace `READ_ONLY_PASSWORD` with a password of your choice and `TARGET_DATABASE` with the name of the database to be replicated. ::: -4. Next, enable authentication, if not already enabled. Start by editing the `/etc/mongodb.conf` by adding/editing these specific keys: +4. Next, enable authentication, if not already enabled. Start by editing the `/etc/mongodb.conf` by adding/editing these specific keys: + ```yaml net: bindIp: 0.0.0.0 @@ -90,8 +96,8 @@ security: authorization: enabled ``` -:::note -Setting the `bindIp` key to `0.0.0.0` will allow connections to database from any IP address. Setting the `security.authorization` key to `enabled` will enable security and only allow authenticated users to access the database. +:::note +Setting the `bindIp` key to `0.0.0.0` will allow connections to database from any IP address. Setting the `security.authorization` key to `enabled` will enable security and only allow authenticated users to access the database. ::: #### Step 2: Discover the MongoDB cluster connection string @@ -100,7 +106,7 @@ These steps outline how to discover the connection string of your MongoDB instan ##### MongoDB Atlas -Atlas is MongoDB's [cloud-hosted offering](https://www.mongodb.com/atlas/database). Below are the steps to discover +Atlas is MongoDB's [cloud-hosted offering](https://www.mongodb.com/atlas/database). Below are the steps to discover the connection configuration for a MongoDB Atlas-hosted replica set cluster: 1. Log in to the [MongoDB Atlas dashboard](https://cloud.mongodb.com/). @@ -118,11 +124,11 @@ the connection configuration for a MongoDB Atlas-hosted replica set cluster: ##### Self Hosted Cluster -Self-hosted clusters are MongoDB instances that are hosted outside of [MongoDB Atlas](https://www.mongodb.com/atlas/database). Below are the steps to discover +Self-hosted clusters are MongoDB instances that are hosted outside of [MongoDB Atlas](https://www.mongodb.com/atlas/database). Below are the steps to discover the connection string for a MongoDB self-hosted replica set cluster. 1. Refer to the [MongoDB connection string documentation](https://www.mongodb.com/docs/manual/reference/connection-string/#find-your-self-hosted-deployment-s-connection-string) for instructions -on discovering a self-hosted deployment connection string. + on discovering a self-hosted deployment connection string. #### Step 3: Configure the Airbyte MongoDB Source @@ -139,6 +145,7 @@ In addtion MongoDB source now allows for syncing in a full refresh mode. 
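Since the incremental sync described next relies on change streams, it can be worth a quick sanity check that the cluster is a replica set and that the dedicated read-only user can open a change stream before finishing the source setup. The following is a minimal sketch using the MongoDB shell; the connection string, user, database, and collection names are placeholders carried over from the steps above, not values required by the connector.

```shell
# Connect with the dedicated read-only user (all values are placeholders).
> mongosh "YOUR_CONNECTION_STRING" --username READ_ONLY_USER

# Change streams require a replica set; setName is only reported by replica
# set members, so an undefined result here points at a standalone deployment.
test> db.hello().setName

# The read role includes the changeStream action, so opening a change stream
# on a readable collection should succeed; an error usually means the
# deployment is standalone or the user lacks access to the collection.
test> use TARGET_DATABASE
TARGET_DATABASE> db.getCollection("TARGET_COLLECTION").watch()
```

If either check fails, resolve it before configuring the connector, since the incremental (CDC) sync described below depends on change streams being available to this user.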
Airbyte utilizes [the change streams feature](https://www.mongodb.com/docs/manual/changeStreams/) of a [MongoDB replica set](https://www.mongodb.com/docs/manual/replication/) to incrementally capture inserts, updates and deletes using a replication plugin. To learn more about how Airbyte implements CDC, refer to [Change Data Capture (CDC)](https://docs.airbyte.com/understanding-airbyte/cdc/). ### Full Refresh + The Full refresh sync mode added in v4.0.0 allows for reading the entire contents of a collection, repeatedly. The MongoDB source connector uses checkpointing in Full Refresh reads, so a sync job that failed (for example, due to a network error) will continue its full refresh read from the last known point rather than starting over. @@ -150,12 +157,14 @@ By default the MongoDB V2 source connector enforces a schema. This means that wh When the schema enforced option is disabled, MongoDB collections are read in schema-less mode which doesn't assume documents share the same structure. This allows for greater flexibility in reading data that is unstructured or varies a lot between documents in a single collection. When schema is not enforced, each document will generate a record that only contains the following top-level fields: + ```json { "_id": , "data": {} } + The contents of `data` will vary according to the contents of each document read from MongoDB. Unlike in Schema enforced mode, the same field can vary in type between documents. For example, field `"xyz"` may be a String on one document and a Date on another. As a result no field will be omitted and no document will be rejected. @@ -165,11 +174,12 @@ When Schema is not enforced there is no way to deselect fields as all fields ar ### MongoDB Oplog and Change Streams -[MongoDB's Change Streams](https://www.mongodb.com/docs/manual/changeStreams/) are based on the [Replica Set Oplog](https://www.mongodb.com/docs/manual/core/replica-set-oplog/). This has retention limitations. Syncs that run less frequently than the retention period of the Oplog may encounter issues with missing data. +[MongoDB's Change Streams](https://www.mongodb.com/docs/manual/changeStreams/) are based on the [Replica Set Oplog](https://www.mongodb.com/docs/manual/core/replica-set-oplog/). This has retention limitations. Syncs that run less frequently than the retention period of the Oplog may encounter issues with missing data. We recommend adjusting the Oplog size for your MongoDB cluster to ensure it holds at least 24 hours of changes. For optimal results, we suggest expanding it to maintain a week's worth of data. To adjust your Oplog size, see the corresponding tutorials for [MongoDB Atlas](https://www.mongodb.com/docs/atlas/cluster-additional-settings/#set-oplog-size) (fully-managed) and [MongoDB shell](https://www.mongodb.com/docs/manual/tutorial/change-oplog-size/) (self-hosted). If you are running into an issue similar to "invalid resume token", it may mean you need to: + 1. Increase the Oplog retention period. 2. Increase the Oplog size. 3. Increase the Airbyte sync frequency. @@ -177,51 +187,51 @@ You can run the commands outlined [in this tutorial](https://www.mongodb.com/docs/manual/tutorial/troubleshoot-replica-sets/#check-the-size-of-the-oplog) to verify the current size of your Oplog.
The expect output is: ```yaml -configured oplog size: 10.10546875MB +configured oplog size: 10.10546875MB log length start to end: 94400 (26.22hrs) -oplog first event time: Mon Mar 19 2012 13:50:38 GMT-0400 (EDT) -oplog last event time: Wed Oct 03 2012 14:59:10 GMT-0400 (EDT) -now: Wed Oct 03 2012 15:00:21 GMT-0400 (EDT) +oplog first event time: Mon Mar 19 2012 13:50:38 GMT-0400 (EDT) +oplog last event time: Wed Oct 03 2012 14:59:10 GMT-0400 (EDT) +now: Wed Oct 03 2012 15:00:21 GMT-0400 (EDT) ``` When importing a large MongoDB collection for the first time, the import duration might exceed the Oplog retention period. The Oplog is crucial for incremental updates, and an invalid resume token will require the MongoDB collection to be re-imported to ensure no source updates were missed. ### Supported MongoDB Clusters -* Only supports [replica set](https://www.mongodb.com/docs/manual/replication/) cluster type. -* TLS/SSL is required by this connector. TLS/SSL is enabled by default for MongoDB Atlas clusters. To enable TSL/SSL connection for a self-hosted MongoDB instance, please refer to [MongoDb Documentation](https://docs.mongodb.com/manual/tutorial/configure-ssl/). -* Views, capped collections and clustered collections are not supported. -* Empty collections are excluded from schema discovery. -* Collections with different data types for the values in the `_id` field among the documents in a collection are not supported. All `_id` values within the collection must be the same data type. -* Atlas DB cluster are only supported in a dedicated M10 tier and above. Lower tiers may fail during connection setup. +- Only supports [replica set](https://www.mongodb.com/docs/manual/replication/) cluster type. +- TLS/SSL is required by this connector. TLS/SSL is enabled by default for MongoDB Atlas clusters. To enable TSL/SSL connection for a self-hosted MongoDB instance, please refer to [MongoDb Documentation](https://docs.mongodb.com/manual/tutorial/configure-ssl/). +- Views, capped collections and clustered collections are not supported. +- Empty collections are excluded from schema discovery. +- Collections with different data types for the values in the `_id` field among the documents in a collection are not supported. All `_id` values within the collection must be the same data type. +- Atlas DB cluster are only supported in a dedicated M10 tier and above. Lower tiers may fail during connection setup. ### Schema Discovery & Enforcement -* Schema discovery uses [sampling](https://www.mongodb.com/docs/manual/reference/operator/aggregation/sample/) of the documents to collect all distinct top-level fields. This value is universally applied to all collections discovered in the target database. The approach is modelled after [MongoDB Compass sampling](https://www.mongodb.com/docs/compass/current/sampling/) and is used for efficiency. By default, 10,000 documents are sampled. This value can be increased up to 100,000 documents to increase the likelihood that all fields will be discovered. However, the trade-off is time, as a higher value will take the process longer to sample the collection. -* When Running with Schema Enforced set to `false` there is no attempt to discover any schema. See more in [Schema Enforcement](#Schema-Enforcement). +- Schema discovery uses [sampling](https://www.mongodb.com/docs/manual/reference/operator/aggregation/sample/) of the documents to collect all distinct top-level fields. This value is universally applied to all collections discovered in the target database. 
The approach is modelled after [MongoDB Compass sampling](https://www.mongodb.com/docs/compass/current/sampling/) and is used for efficiency. By default, 10,000 documents are sampled. This value can be increased up to 100,000 documents to increase the likelihood that all fields will be discovered. However, the trade-off is time, as a higher value will take the process longer to sample the collection. +- When Running with Schema Enforced set to `false` there is no attempt to discover any schema. See more in [Schema Enforcement](#Schema-Enforcement). ## Configuration Parameters -| Parameter Name | Description | -|:-------------------------------------------|:---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------| -| Cluster Type | The type of the MongoDB cluster ([MongoDB Atlas](https://www.mongodb.com/atlas/database) replica set or self-hosted replica set). | -| Connection String | The connection string of the source MongoDB cluster. For Atlas hosted clusters, see [the quick start guide](#step-2-discover-the-mongodb-cluster-connection-string) for steps to find the connection string. For self-hosted clusters, refer to the [MongoDB connection string documentation](https://www.mongodb.com/docs/manual/reference/connection-string/#find-your-self-hosted-deployment-s-connection-string) for more information. | -| Database Name | The name of the database that contains the source collection(s) to sync. | -| Username | The username which is used to access the database. Required for MongoDB Atlas clusters. | -| Password | The password associated with this username. Required for MongoDB Atlas clusters. | -| Authentication Source | (MongoDB Atlas clusters only) Specifies the database that the supplied credentials should be validated against. Defaults to `admin`. See the [MongoDB documentation](https://www.mongodb.com/docs/manual/reference/connection-string/#mongodb-urioption-urioption.authSource) for more details. | -| Schema Enforced | Controls whether schema is discovered and enforced. See discussion in [Schema Enforcement](#Schema-Enforcement). | -| Initial Waiting Time in Seconds (Advanced) | The amount of time the connector will wait when it launches to determine if there is new data to sync or not. Defaults to 300 seconds. Valid range: 120 seconds to 1200 seconds. | -| Size of the queue (Advanced) | The size of the internal queue. This may interfere with memory consumption and efficiency of the connector, please be careful. | -| Discovery Sample Size (Advanced) | The maximum number of documents to sample when attempting to discover the unique fields for a collection. Default is 10,000 with a valid range of 1,000 to 100,000. See the [MongoDB sampling method](https://www.mongodb.com/docs/compass/current/sampling/#sampling-method) for more details. | -| Update Capture Mode (Advanced) | Determines how Airbyte looks up the value of an updated document. Default is "Lookup". **IMPORTANT** : "Post image" is only supported in MongoDB version 6.0+. In addition, the collections of interest must be setup to [return pre and post images](https://www.mongodb.com/docs/manual/changeStreams/#change-streams-with-document-pre-and-post-images). 
Failure to do so will lead to data loss. | +| Parameter Name | Description | +| :----------------------------------------- | :----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------- | +| Cluster Type | The type of the MongoDB cluster ([MongoDB Atlas](https://www.mongodb.com/atlas/database) replica set or self-hosted replica set). | +| Connection String | The connection string of the source MongoDB cluster. For Atlas hosted clusters, see [the quick start guide](#step-2-discover-the-mongodb-cluster-connection-string) for steps to find the connection string. For self-hosted clusters, refer to the [MongoDB connection string documentation](https://www.mongodb.com/docs/manual/reference/connection-string/#find-your-self-hosted-deployment-s-connection-string) for more information. | +| Database Name | The name of the database that contains the source collection(s) to sync. | +| Username | The username which is used to access the database. Required for MongoDB Atlas clusters. | +| Password | The password associated with this username. Required for MongoDB Atlas clusters. | +| Authentication Source | (MongoDB Atlas clusters only) Specifies the database that the supplied credentials should be validated against. Defaults to `admin`. See the [MongoDB documentation](https://www.mongodb.com/docs/manual/reference/connection-string/#mongodb-urioption-urioption.authSource) for more details. | +| Schema Enforced | Controls whether schema is discovered and enforced. See discussion in [Schema Enforcement](#Schema-Enforcement). | +| Initial Waiting Time in Seconds (Advanced) | The amount of time the connector will wait when it launches to determine if there is new data to sync or not. Defaults to 300 seconds. Valid range: 120 seconds to 1200 seconds. | +| Size of the queue (Advanced) | The size of the internal queue. This may interfere with memory consumption and efficiency of the connector, please be careful. | +| Discovery Sample Size (Advanced) | The maximum number of documents to sample when attempting to discover the unique fields for a collection. Default is 10,000 with a valid range of 1,000 to 100,000. See the [MongoDB sampling method](https://www.mongodb.com/docs/compass/current/sampling/#sampling-method) for more details. | +| Update Capture Mode (Advanced) | Determines how Airbyte looks up the value of an updated document. Default is "Lookup". **IMPORTANT** : "Post image" is only supported in MongoDB version 6.0+. In addition, the collections of interest must be setup to [return pre and post images](https://www.mongodb.com/docs/manual/changeStreams/#change-streams-with-document-pre-and-post-images). Failure to do so will lead to data loss. | For more information regarding configuration parameters, please see [MongoDb Documentation](https://docs.mongodb.com/drivers/java/sync/v4.10/fundamentals/connection/). 
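One row in the table above deserves extra care: if you switch **Update Capture Mode** to "Post image", every collection included in the sync must have change stream pre- and post-images enabled first, otherwise updates can be lost. Below is a hedged sketch of how this can be done from the MongoDB shell on MongoDB 6.0+; the collection name is a placeholder, and the command has to be run by a user with `collMod` privileges (for example `dbAdmin`), not by the read-only user created for the connector.

```shell
# Run once per collection included in the connection (MongoDB 6.0+ only).
TARGET_DATABASE> db.runCommand({
  collMod: "TARGET_COLLECTION",
  changeStreamPreAndPostImages: { enabled: true }
})

# Confirm the option is now set on the collection.
TARGET_DATABASE> db.getCollectionInfos({ name: "TARGET_COLLECTION" })[0].options
```

Collections added to the connection later need the same treatment before their first sync in this mode.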
## Changelog | Version | Date | Pull Request | Subject | -|:--------|:-----------|:---------------------------------------------------------|:----------------------------------------------------------------------------------------------------------| +| :------ | :--------- | :------------------------------------------------------- | :-------------------------------------------------------------------------------------------------------- | | 1.3.12 | 2024-05-07 | [36851](https://github.com/airbytehq/airbyte/pull/36851) | Upgrade debezium to version 2.5.1. | | 1.3.11 | 2024-05-02 | [37753](https://github.com/airbytehq/airbyte/pull/37753) | Chunk size(limit) should correspond to ~1GB of data. | | 1.3.10 | 2024-05-02 | [37781](https://github.com/airbytehq/airbyte/pull/37781) | Adopt latest CDK. | @@ -261,7 +271,7 @@ For more information regarding configuration parameters, please see [MongoDb Doc | 1.0.7 | 2023-11-07 | [32250](https://github.com/airbytehq/airbyte/pull/32250) | Add support to read UUIDs. | | 1.0.6 | 2023-11-06 | [32193](https://github.com/airbytehq/airbyte/pull/32193) | Adopt java CDK version 0.4.1. | | 1.0.5 | 2023-10-31 | [32028](https://github.com/airbytehq/airbyte/pull/32028) | url encode username and password.
    Handle a case of document update and delete in a single sync. | -| 1.0.3 | 2023-10-19 | [31629](https://github.com/airbytehq/airbyte/pull/31629) | Allow discover operation use of disk file when an operation goes over max allowed mem | +| 1.0.3 | 2023-10-19 | [31629](https://github.com/airbytehq/airbyte/pull/31629) | Allow discover operation use of disk file when an operation goes over max allowed mem | | 1.0.2 | 2023-10-19 | [31596](https://github.com/airbytehq/airbyte/pull/31596) | Allow use of temp disk file when an operation goes over max allowed mem | | 1.0.1 | 2023-10-03 | [31034](https://github.com/airbytehq/airbyte/pull/31034) | Fix field filtering logic related to nested documents | | 1.0.0 | 2023-10-03 | [29969](https://github.com/airbytehq/airbyte/pull/29969) | General availability release using Change Data Capture (CDC) | diff --git a/docs/integrations/sources/mssql-migrations.md b/docs/integrations/sources/mssql-migrations.md index dc0c892f5d5..087f07fca7a 100644 --- a/docs/integrations/sources/mssql-migrations.md +++ b/docs/integrations/sources/mssql-migrations.md @@ -1,14 +1,16 @@ # Microsoft SQL Server (MSSQL) Migration Guide ## Upgrading to 4.0.0 + Source MSSQL provides incremental sync that can read unlimited sized tables and can resume if the initial read has failed. Upgrading from previous versions will be seamless and does not require any intervention. ## Upgrading to 3.0.0 + This change remapped date, datetime, datetime2, datetimeoffset, smalldatetime, and time data type to their correct Airbyte types. Customers whose streams have columns with the affected datatype must refresh their schema and reset their data. See chart below for the mapping change. | Mssql type | Current Airbyte Type | New Airbyte Type | -|----------------|----------------------|-------------------| +| -------------- | -------------------- | ----------------- | | date | string | date | | datetime | string | timestamp | | datetime2 | string | timestamp | @@ -16,11 +18,13 @@ This change remapped date, datetime, datetime2, datetimeoffset, smalldatetime, a | smalldatetime | string | timestamp | | time | string | time | -For current source-mssql users: +For current source-mssql users: + - If your streams do not contain any column of an affected data type, your connection will be unaffected. No further action is required from you. -- If your streams contain at least one column of an affected data type, you can opt in, refresh your schema, but *do not* reset your stream data. Once the sync starts, the Airbyte platform will trigger a schema change that will propagate to the destination tables. *Note:* In the case that your sync fails, please reset your data and rerun the sync. This will drop, recreate all the necessary tables, and reread the source data from the beginning. +- If your streams contain at least one column of an affected data type, you can opt in, refresh your schema, but _do not_ reset your stream data. Once the sync starts, the Airbyte platform will trigger a schema change that will propagate to the destination tables. _Note:_ In the case that your sync fails, please reset your data and rerun the sync. This will drop, recreate all the necessary tables, and reread the source data from the beginning. If resetting your stream data is an issue, please reach out to Airbyte Cloud support for assistance. ## Upgrading to 2.0.0 + CDC syncs now has default cursor field called `_ab_cdc_cursor`. 
You will need to force normalization to rebuild your destination tables by manually dropping the SCD tables, refreshing the connection schema (skipping the reset), and running a sync. Alternatively, you can just run a reset. diff --git a/docs/integrations/sources/mssql.md b/docs/integrations/sources/mssql.md index 1a2b3cb8612..0ac5a6713df 100644 --- a/docs/integrations/sources/mssql.md +++ b/docs/integrations/sources/mssql.md @@ -418,7 +418,7 @@ WHERE actor_definition_id ='b5ea17b1-f170-46dc-bc31-cc744ca984c1' AND (configura ## Changelog | Version | Date | Pull Request | Subject | -|:--------|:-----------|:------------------------------------------------------------------------------------------------------------------|:------------------------------------------------------------------------------------------------------------------------------------------------| +| :------ | :--------- | :---------------------------------------------------------------------------------------------------------------- | :---------------------------------------------------------------------------------------------------------------------------------------------- | | 4.0.18 | 2024-04-30 | [37451](https://github.com/airbytehq/airbyte/pull/37451) | Resumable full refresh read of tables. | | 4.0.17 | 2024-05-02 | [37781](https://github.com/airbytehq/airbyte/pull/37781) | Adopt latest CDK. | | 4.0.16 | 2024-05-01 | [37742](https://github.com/airbytehq/airbyte/pull/37742) | Adopt latest CDK. Remove Debezium retries. | diff --git a/docs/integrations/sources/my-hours.md b/docs/integrations/sources/my-hours.md index c4409372533..c3d949546dd 100644 --- a/docs/integrations/sources/my-hours.md +++ b/docs/integrations/sources/my-hours.md @@ -2,10 +2,10 @@ ## Features -| Feature | Supported?\(Yes/No\) | Notes | -| :--- | :--- | :--- | -| Full Refresh Sync | Yes | | -| Incremental Sync | No | | +| Feature | Supported?\(Yes/No\) | Notes | +| :---------------- | :------------------- | :---- | +| Full Refresh Sync | Yes | | +| Incremental Sync | No | | This source syncs data from the [My Hours API](https://documenter.getpostman.com/view/8879268/TVmV4YYU). @@ -13,27 +13,27 @@ This source syncs data from the [My Hours API](https://documenter.getpostman.com This source allows you to synchronize the following data tables: -* Time logs -* Clients -* Projects -* Team members -* Tags +- Time logs +- Clients +- Projects +- Team members +- Tags ## Getting started **Requirements** + - In order to use the My Hours API you need to provide the credentials to an admin My Hours account. ### Performance Considerations (Airbyte Open Source) Depending on the amount of team members and time logs the source provides a property to change the pagination size for the time logs query. Typically a pagination of 30 days is a correct balance between reliability and speed. But if you have a big amount of monthly entries you might want to change this value to a lower value. 
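The practical effect of lowering that pagination value is that the connector splits the requested period into more, shorter time-log queries. A rough sketch of the idea (illustrative only, not the connector's actual implementation):

```python
# Illustrative only: how a reporting period breaks into smaller windows when
# the time-logs pagination size (in days) is lowered.
from datetime import date, timedelta


def date_windows(start: date, end: date, window_days: int):
    """Yield (window_start, window_end) pairs covering [start, end] inclusive."""
    cursor = start
    while cursor <= end:
        window_end = min(cursor + timedelta(days=window_days - 1), end)
        yield cursor, window_end
        cursor = window_end + timedelta(days=1)


# A 90-day period needs 3 requests with the default 30-day window,
# but 13 smaller (more reliable) requests with a 7-day window.
period = (date(2024, 1, 1), date(2024, 3, 30))
print(len(list(date_windows(*period, 30))))  # 3
print(len(list(date_windows(*period, 7))))   # 13
```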
- ## CHANGELOG -| Version | Date | Pull Request | Subject | -| :------ | :--------- | :------------------------------------------------------- | :----------------------------------- | -| 0.2.0 | 2024-03-15 | [36063](https://github.com/airbytehq/airbyte/pull/36063) | Migrate to Low Code | -| 0.1.2 | 2023-11-20 | [32680](https://github.com/airbytehq/airbyte/pull/32680) | Schema and CDK updates | -| 0.1.1 | 2022-06-08 | [12964](https://github.com/airbytehq/airbyte/pull/12964) | Update schema for time_logs stream | -| 0.1.0 | 2021-11-26 | [8270](https://github.com/airbytehq/airbyte/pull/8270) | New Source: My Hours | +| Version | Date | Pull Request | Subject | +| :------ | :--------- | :------------------------------------------------------- | :--------------------------------- | +| 0.2.0 | 2024-03-15 | [36063](https://github.com/airbytehq/airbyte/pull/36063) | Migrate to Low Code | +| 0.1.2 | 2023-11-20 | [32680](https://github.com/airbytehq/airbyte/pull/32680) | Schema and CDK updates | +| 0.1.1 | 2022-06-08 | [12964](https://github.com/airbytehq/airbyte/pull/12964) | Update schema for time_logs stream | +| 0.1.0 | 2021-11-26 | [8270](https://github.com/airbytehq/airbyte/pull/8270) | New Source: My Hours | diff --git a/docs/integrations/sources/mysql-migrations.md b/docs/integrations/sources/mysql-migrations.md index b593d3af36d..3648438f2de 100644 --- a/docs/integrations/sources/mysql-migrations.md +++ b/docs/integrations/sources/mysql-migrations.md @@ -1,4 +1,5 @@ # MySQL Migration Guide ## Upgrading to 3.0.0 -CDC syncs now has default cursor field called `_ab_cdc_cursor`. You will need to force normalization to rebuild your destination tables by manually dropping the SCD tables, refreshing the connection schema (skipping the reset), and running a sync. Alternatively, you can just run a reset. \ No newline at end of file + +CDC syncs now has default cursor field called `_ab_cdc_cursor`. You will need to force normalization to rebuild your destination tables by manually dropping the SCD tables, refreshing the connection schema (skipping the reset), and running a sync. Alternatively, you can just run a reset. diff --git a/docs/integrations/sources/mysql.md b/docs/integrations/sources/mysql.md index 1756ebeca45..ea11b9ae2d9 100644 --- a/docs/integrations/sources/mysql.md +++ b/docs/integrations/sources/mysql.md @@ -1,20 +1,21 @@ # MySQL Airbyte's certified MySQL connector offers the following features: -* Multiple methods of keeping your data fresh, including [Change Data Capture (CDC)](https://docs.airbyte.com/understanding-airbyte/cdc) using the [binlog](https://dev.mysql.com/doc/refman/8.0/en/binary-log.html). -* All available [sync modes](https://docs.airbyte.com/cloud/core-concepts#connection-sync-modes), providing flexibility in how data is delivered to your destination. -* Reliable replication at any table size with [checkpointing](https://docs.airbyte.com/understanding-airbyte/airbyte-protocol/#state--checkpointing) and chunking of database reads. + +- Multiple methods of keeping your data fresh, including [Change Data Capture (CDC)](https://docs.airbyte.com/understanding-airbyte/cdc) using the [binlog](https://dev.mysql.com/doc/refman/8.0/en/binary-log.html). +- All available [sync modes](https://docs.airbyte.com/cloud/core-concepts#connection-sync-modes), providing flexibility in how data is delivered to your destination. 
+- Reliable replication at any table size with [checkpointing](https://docs.airbyte.com/understanding-airbyte/airbyte-protocol/#state--checkpointing) and chunking of database reads. The contents below include a 'Quick Start' guide, advanced setup steps, and reference information (data type mapping and changelogs). **Please note the minimum required platform version is v0.58.0 to run source-mysql 3.4.0.** - ![Airbyte MySQL Connection](https://raw.githubusercontent.com/airbytehq/airbyte/3a9264666b7b9b9d10ef8d174b8454a6c7e57560/docs/integrations/sources/mysql/assets/airbyte_mysql_source.png) ## Quick Start Here is an outline of the minimum required steps to configure a MySQL connector: + 1. Create a dedicated read-only MySQL user with permissions for replicating data 2. Create a new MySQL source in the Airbyte UI using CDC logical replication 3. (Airbyte Cloud Only) Allow inbound traffic from Airbyte IPs @@ -85,18 +86,21 @@ From your [Airbyte Cloud](https://cloud.airbyte.com/workspaces) or Airbyte Open To fill out the required information: + 1. Enter the hostname, port number, and name for your MySQL database. 2. Enter the username and password you created in [Step 1](#step-1-create-a-dedicated-read-only-mysql-user). 3. Select an SSL mode. You will most frequently choose `require` or `verify-ca`. Both of these always require encryption. `verify-ca` also requires certificates from your MySQL database. See [here](#ssl-modes) to learn about other SSL modes and SSH tunneling. 4. Select `Read Changes using Binary Log (CDC)` from available replication methods. + #### Step 4: (Airbyte Cloud Only) Allow inbound traffic from Airbyte IPs. If you are on Airbyte Cloud, you will always need to modify your database configuration to allow inbound traffic from Airbyte IPs. You can find a list of all IPs that need to be allowlisted in our [Airbyte Security docs](../../operating-airbyte/security#network-security-1). Now, click `Set up source` in the Airbyte UI. Airbyte will now test connecting to your database. Once this succeeds, you've configured an Airbyte MySQL source! + @@ -106,6 +110,7 @@ Now, click `Set up source` in the Airbyte UI. Airbyte will now test connecting t ### Change Data Capture \(CDC\) Airbyte uses logical replication of the [MySQL binlog](https://dev.mysql.com/doc/refman/8.0/en/binary-log.html) to incrementally capture deletes. To learn more how Airbyte implements CDC, refer to [Change Data Capture (CDC)](https://docs.airbyte.com/understanding-airbyte/cdc/). We generally recommend configure your MySQL source with CDC whenever possible, as it provides: + - A record of deletions, if needed. - Scalable replication to large tables (1 TB and more). - A reliable cursor not reliant on the nature of your data. For example, if your table has a primary key but doesn't have a reasonable cursor field for incremental syncing \(i.e. `updated_at`\), CDC allows you to sync your table incrementally. @@ -115,6 +120,7 @@ Airbyte uses logical replication of the [MySQL binlog](https://dev.mysql.com/doc ### Standard Airbyte offers incremental replication using a custom cursor available in your source tables (e.g. `updated_at`). We generally recommend against this replication method, but it is well suited for the following cases: + - Your MySQL server does not expose the binlog. - Your data set is small, and you just want snapshot of your table in the destination. @@ -129,6 +135,7 @@ Airbyte offers incremental replication using a custom cursor available in your s Airbyte Cloud uses SSL by default. 
You are not permitted to `disable` SSL while using Airbyte Cloud. Here is a breakdown of available SSL connection modes: + - `disable` to disable encrypted communication between Airbyte and the source - `allow` to enable encrypted communication only when required by the source - `prefer` to allow unencrypted communication only when the source doesn't support encryption @@ -147,14 +154,14 @@ When using an SSH tunnel, you are configuring Airbyte to connect to an intermedi To connect to a MySQL server via an SSH tunnel: 1. While setting up the MySQL source connector, from the SSH tunnel dropdown, select: - - SSH Key Authentication to use a private as your secret for establishing the SSH tunnel - - Password Authentication to use a password as your secret for establishing the SSH Tunnel + - SSH Key Authentication to use a private as your secret for establishing the SSH tunnel + - Password Authentication to use a password as your secret for establishing the SSH Tunnel 2. For **SSH Tunnel Jump Server Host**, enter the hostname or IP address for the intermediate (bastion) server that Airbyte will connect to. 3. For **SSH Connection Port**, enter the port on the bastion server. The default port for SSH connections is 22. 4. For **SSH Login Username**, enter the username to use when connecting to the bastion server. **Note:** This is the operating system username and not the MySQL username. 5. For authentication: - - If you selected **SSH Key Authentication**, set the **SSH Private Key** to the [private Key](#generating-a-private-key-for-ssh-tunneling) that you are using to create the SSH connection. - - If you selected **Password Authentication**, enter the password for the operating system user to connect to the bastion server. **Note:** This is the operating system password and not the MySQL password. + - If you selected **SSH Key Authentication**, set the **SSH Private Key** to the [private Key](#generating-a-private-key-for-ssh-tunneling) that you are using to create the SSH connection. + - If you selected **Password Authentication**, enter the password for the operating system user to connect to the bastion server. **Note:** This is the operating system password and not the MySQL password. #### Generating a private key for SSH Tunneling @@ -180,7 +187,6 @@ Any database or table encoding combination of charset and collation is supported
    MySQL Data Type Mapping - | MySQL Type | Resulting Type | Notes | | :---------------------------------------- | :--------------------- | :------------------------------------------------------------------------------------------------------------- | | `bit(1)` | boolean | | @@ -218,14 +224,12 @@ Any database or table encoding combination of charset and collation is supported | `set` | string | E.g. `blue,green,yellow` | | `geometry` | base64 binary string | | -
    ## Changelog - | Version | Date | Pull Request | Subject | -|:--------|:-----------|:-----------------------------------------------------------|:------------------------------------------------------------------------------------------------------------------------------------------------| +| :------ | :--------- | :--------------------------------------------------------- | :---------------------------------------------------------------------------------------------------------------------------------------------- | | 3.4.1 | 2024-05-03 | [37824](https://github.com/airbytehq/airbyte/pull/37824) | Fixed a bug on Resumeable full refresh where cursor based source throw NPE. | | 3.4.0 | 2024-05-02 | [36932](https://github.com/airbytehq/airbyte/pull/36932) | Resumeable full refresh. Note please upgrade your platform - minimum platform version is 0.58.0. | | 3.3.25 | 2024-05-02 | [37781](https://github.com/airbytehq/airbyte/pull/37781) | Adopt latest CDK. | diff --git a/docs/integrations/sources/mysql/mysql-troubleshooting.md b/docs/integrations/sources/mysql/mysql-troubleshooting.md index 50d78109c02..690ee5d11c1 100644 --- a/docs/integrations/sources/mysql/mysql-troubleshooting.md +++ b/docs/integrations/sources/mysql/mysql-troubleshooting.md @@ -15,9 +15,9 @@ ### Common Config Errors -* Mapping MySQL's DateTime field: There may be problems with mapping values in MySQL's datetime field to other relational data stores. MySQL permits zero values for date/time instead of NULL which may not be accepted by other data stores. To work around this problem, you can pass the following key value pair in the JDBC connector of the source setting `zerodatetimebehavior=Converttonull`. -* Amazon RDS MySQL or MariaDB connection issues: If you see the following `Cannot create a PoolableConnectionFactory` error, please add `enabledTLSProtocols=TLSv1.2` in the JDBC parameters. -* Amazon RDS MySQL connection issues: If you see `Error: HikariPool-1 - Connection is not available, request timed out after 30001ms.`, many times this due to your VPC not allowing public traffic. We recommend going through [this AWS troubleshooting checklist](https://aws.amazon.com/premiumsupport/knowledge-center/rds-cannot-connect/) to ensure the correct permissions/settings have been granted to allow Airbyte to connect to your database. +- Mapping MySQL's DateTime field: There may be problems with mapping values in MySQL's datetime field to other relational data stores. MySQL permits zero values for date/time instead of NULL which may not be accepted by other data stores. To work around this problem, you can pass the following key value pair in the JDBC connector of the source setting `zerodatetimebehavior=Converttonull`. +- Amazon RDS MySQL or MariaDB connection issues: If you see the following `Cannot create a PoolableConnectionFactory` error, please add `enabledTLSProtocols=TLSv1.2` in the JDBC parameters. +- Amazon RDS MySQL connection issues: If you see `Error: HikariPool-1 - Connection is not available, request timed out after 30001ms.`, many times this due to your VPC not allowing public traffic. We recommend going through [this AWS troubleshooting checklist](https://aws.amazon.com/premiumsupport/knowledge-center/rds-cannot-connect/) to ensure the correct permissions/settings have been granted to allow Airbyte to connect to your database. 
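Both of the RDS-related workarounds above are passed through the connector's JDBC URL parameters field. As a hypothetical illustration of the combined value (parameter casing copied from the bullets above; verify against your MySQL JDBC driver's documentation):

```python
# Illustrative only: compose the extra JDBC URL parameters mentioned above
# into the single string you would paste into the JDBC parameters field.
jdbc_params = {
    "zerodatetimebehavior": "Converttonull",  # map zero date/time values to NULL
    "enabledTLSProtocols": "TLSv1.2",         # work around RDS MySQL/MariaDB TLS errors
}

jdbc_url_params = "&".join(f"{key}={value}" for key, value in jdbc_params.items())
print(jdbc_url_params)
# zerodatetimebehavior=Converttonull&enabledTLSProtocols=TLSv1.2
```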
### Under CDC incremental mode, there are still full refresh syncs @@ -28,8 +28,8 @@ Normally under the CDC mode, the MySQL source will first run a full refresh sync The root cause is that the binlogs needed for the incremental sync have been removed by MySQL. This can occur under the following scenarios: - When there are lots of database updates resulting in more binlog files than the retention settings allow, MySQL will purge or archive the binlog files. This scenario is preventable. Possible solutions include: - - Sync the data source more frequently. - - Set a higher `binlog_expire_logs_seconds`. It's recommended to set this value to a time period of 7 days. See detailed documentation [here](https://dev.mysql.com/doc/refman/8.0/en/replication-options-binary-log.html#sysvar_binlog_expire_logs_seconds). The downside of this approach is that more disk space will be needed. + - Sync the data source more frequently. + - Set a higher `binlog_expire_logs_seconds`. It's recommended to set this value to a time period of 7 days. See detailed documentation [here](https://dev.mysql.com/doc/refman/8.0/en/replication-options-binary-log.html#sysvar_binlog_expire_logs_seconds). The downside of this approach is that more disk space will be needed. ### EventDataDeserializationException errors during initial snapshot diff --git a/docs/integrations/sources/nasa.md b/docs/integrations/sources/nasa.md index 5c3cde2a886..bfd3b5390c4 100644 --- a/docs/integrations/sources/nasa.md +++ b/docs/integrations/sources/nasa.md @@ -8,14 +8,14 @@ The NASA source supports full refresh syncs A single output stream is available (at the moment) from this source: -*[APOD](https://github.com/nasa/apod-api#docs-). +\*[APOD](https://github.com/nasa/apod-api#docs-). If there are more endpoints you'd like Airbyte to support, please [create an issue.](https://github.com/airbytehq/airbyte/issues/new/choose) ### Features | Feature | Supported? | -|:------------------|:-----------| +| :---------------- | :--------- | | Full Refresh Sync | Yes | | Incremental Sync | Yes | | SSL connection | No | @@ -29,16 +29,17 @@ The NASA connector should not run into NASA API limitations under normal usage. ### Requirements -* NASA API Key. You can use `DEMO_KEY` (see rate limits [here](https://api.nasa.gov/)). +- NASA API Key. You can use `DEMO_KEY` (see rate limits [here](https://api.nasa.gov/)). ### Connect using `API Key`: + 1. Generate an API Key as described [here](https://api.nasa.gov/). 2. Use the generated `API Key` in the Airbyte connection. 
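Before wiring the key into Airbyte, you can confirm it works against the APOD endpoint directly. A minimal sketch using `requests` (the heavily rate-limited `DEMO_KEY` is shown; substitute your own key for real use):

```python
# Illustrative sketch: verify a NASA API key against the APOD endpoint that
# backs the APOD stream. DEMO_KEY is rate limited; prefer your own key.
import requests

response = requests.get(
    "https://api.nasa.gov/planetary/apod",
    params={"api_key": "DEMO_KEY"},
    timeout=30,
)
response.raise_for_status()
apod = response.json()
print(apod["date"], apod.get("title"))
```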
## Changelog -| Version | Date | Pull Request | Subject | -|:--------|:-----------|:---------------------------------------------------------|:------------------------------------------------| -| 0.2.0 | 2023-10-10 | [31051](https://github.com/airbytehq/airbyte/pull/31051) | Migrate to lowcode | -| 0.1.1 | 2023-02-13 | [22934](https://github.com/airbytehq/airbyte/pull/22934) | Specified date formatting in specification | -| 0.1.0 | 2022-10-24 | [18394](https://github.com/airbytehq/airbyte/pull/18394) | 🎉 New Source: NASA APOD | +| Version | Date | Pull Request | Subject | +| :------ | :--------- | :------------------------------------------------------- | :----------------------------------------- | +| 0.2.0 | 2023-10-10 | [31051](https://github.com/airbytehq/airbyte/pull/31051) | Migrate to lowcode | +| 0.1.1 | 2023-02-13 | [22934](https://github.com/airbytehq/airbyte/pull/22934) | Specified date formatting in specification | +| 0.1.0 | 2022-10-24 | [18394](https://github.com/airbytehq/airbyte/pull/18394) | 🎉 New Source: NASA APOD | diff --git a/docs/integrations/sources/netsuite.md b/docs/integrations/sources/netsuite.md index 4183b4722c8..c80ef17385c 100644 --- a/docs/integrations/sources/netsuite.md +++ b/docs/integrations/sources/netsuite.md @@ -5,28 +5,35 @@ One unified business management suite, encompassing ERP/Financials, CRM and ecom This connector implements the [SuiteTalk REST Web Services](https://docs.oracle.com/en/cloud/saas/netsuite/ns-online-help/chapter_1540391670.html) and uses REST API to fetch the customers data. ## Prerequisites -* Oracle NetSuite [account](https://system.netsuite.com/pages/customerlogin.jsp?country=US) -* Allowed access to all Account permissions options + +- Oracle NetSuite [account](https://system.netsuite.com/pages/customerlogin.jsp?country=US) +- Allowed access to all Account permissions options ## Airbyte OSS and Airbyte Cloud -* Realm (Account ID) -* Consumer Key -* Consumer Secret -* Token ID -* Token Secret + +- Realm (Account ID) +- Consumer Key +- Consumer Secret +- Token ID +- Token Secret ## Setup guide + ### Step 1: Create NetSuite account 1. Create [account](https://system.netsuite.com/pages/customerlogin.jsp?country=US) on Oracle NetSuite 2. Confirm your Email ### Step 2: Setup NetSuite account + #### Step 2.1: Obtain Realm info + 1. Login into your NetSuite [account](https://system.netsuite.com/pages/customerlogin.jsp?country=US) 2. Go to **Setup** » **Company** » **Company Information** 3. Copy your Account ID (Realm). It should look like **1234567** for the `Production` env. or **1234567_SB2** - for a `Sandbox` + #### Step 2.2: Enable features + 1. Go to **Setup** » **Company** » **Enable Features** 2. Click on **SuiteCloud** tab 3. Scroll down to **SuiteScript** section @@ -36,14 +43,18 @@ This connector implements the [SuiteTalk REST Web Services](https://docs.oracle. 7. Scroll down to **SuiteTalk (Web Services)** 8. Enable checkbox `REST WEB SERVISES` 9. Save the changes + #### Step 2.3: Create Integration (obtain Consumer Key and Consumer Secret) + 1. Go to **Setup** » **Integration** » **Manage Integrations** » **New** 2. Fill the **Name** field (we recommend to put `airbyte-rest-integration` for a name) 3. Make sure the **State** is `enabled` 4. Enable checkbox `Token-Based Authentication` in **Authentication** section 5. Save changes 6. After that, **Consumer Key** and **Consumer Secret** will be showed once (copy them to the safe place) + #### Step 2.4: Setup Role + 1. 
Go to **Setup** » **Users/Roles** » **Manage Roles** » **New** 2. Fill the **Name** field (we recommend to put `airbyte-integration-role` for a name) 3. Scroll down to **Permissions** tab @@ -51,10 +62,12 @@ This connector implements the [SuiteTalk REST Web Services](https://docs.oracle. 5. (REQUIRED) Click on `Reports` and manually `add` all the dropdown entities with either `full` or `view` access level. 6. (REQUIRED) Click on `Lists` and manually `add` all the dropdown entities with either `full` or `view` access level. 7. (REQUIRED) Click on `Setup` and manually `add` all the dropdown entities with either `full` or `view` access level. -* Make sure you've done all `REQUIRED` steps correctly, to avoid sync issues in the future. -* Please edit these params again when you `rename` or `customise` any `Object` in Netsuite for `airbyte-integration-role` to reflect such changes. + +- Make sure you've done all `REQUIRED` steps correctly, to avoid sync issues in the future. +- Please edit these params again when you `rename` or `customise` any `Object` in Netsuite for `airbyte-integration-role` to reflect such changes. #### Step 2.5: Setup User + 1. Go to **Setup** » **Users/Roles** » **Manage Users** 2. In column `Name` click on the user’s name you want to give access to the `airbyte-integration-role` 3. Then click on **Edit** button under the user’s name @@ -63,6 +76,7 @@ This connector implements the [SuiteTalk REST Web Services](https://docs.oracle. 6. Save changes #### Step 2.6: Create Access Token for role + 1. Go to **Setup** » **Users/Roles** » **Access Tokens** » **New** 2. Select an **Application Name** 3. Under **User** select the user you assigned the `airbyte-integration-role` in the step **2.4** @@ -72,15 +86,18 @@ This connector implements the [SuiteTalk REST Web Services](https://docs.oracle. 7. After that, **Token ID** and **Token Secret** will be showed once (copy them to the safe place) #### Step 2.7: Summary + You have copied next parameters -* Realm (Account ID) -* Consumer Key -* Consumer Secret -* Token ID -* Token Secret -Also you have properly **Configured Account** with **Correct Permissions** and **Access Token** for User and Role you've created early. + +- Realm (Account ID) +- Consumer Key +- Consumer Secret +- Token ID +- Token Secret + Also you have properly **Configured Account** with **Correct Permissions** and **Access Token** for User and Role you've created early. ### Step 3: Set up the source connector in Airbyte + ### For Airbyte Cloud: 1. [Log into your Airbyte Cloud](https://cloud.airbyte.com/workspaces) account. @@ -105,26 +122,25 @@ Also you have properly **Configured Account** with **Correct Permissions** and * 8. Add **Token Secret** 9. 
Click `Set up source` - ## Supported sync modes The NetSuite source connector supports the following [sync modes](https://docs.airbyte.com/cloud/core-concepts#connection-sync-modes): - - Full Refresh - - Incremental + +- Full Refresh +- Incremental ## Supported Streams - Streams are generated based on `ROLE` and `USER` access to them as well as `Account` settings, make sure you're using the correct role assigned in our case `airbyte-integration-role` or any other custom `ROLE` granted to the Access Token, having the access to the NetSuite objects for data sync, please refer to the **Setup guide** > **Step 2.4** and **Setup guide** > **Step 2.5** - ## Performance considerations The connector is restricted by Netsuite [Concurrency Limit per Integration](https://docs.oracle.com/en/cloud/saas/netsuite/ns-online-help/bridgehead_156224824287.html). ## Changelog -| Version | Date | Pull Request | Subject | -| :------ | :--------- | :------------------------------------------------------- | :-------------------------- | +| Version | Date | Pull Request | Subject | +| :------ | :--------- | :------------------------------------------------------- | :-------------------------------------------------------- | | 0.1.3 | 2023-01-20 | [21645](https://github.com/airbytehq/airbyte/pull/21645) | Minor issues fix, Setup Guide corrections for public docs | -| 0.1.1 | 2022-09-28 | [17304](https://github.com/airbytehq/airbyte/pull/17304) | Migrate to per-stream state | -| 0.1.0 | 2022-09-15 | [16093](https://github.com/airbytehq/airbyte/pull/16093) | Initial Alpha release | +| 0.1.1 | 2022-09-28 | [17304](https://github.com/airbytehq/airbyte/pull/17304) | Migrate to per-stream state | +| 0.1.0 | 2022-09-15 | [16093](https://github.com/airbytehq/airbyte/pull/16093) | Initial Alpha release | diff --git a/docs/integrations/sources/news-api.md b/docs/integrations/sources/news-api.md index 6e6645d14a6..f0f05f609bc 100644 --- a/docs/integrations/sources/news-api.md +++ b/docs/integrations/sources/news-api.md @@ -10,13 +10,13 @@ chosen, or just top headlines. This source is capable of syncing the following streams: -* `everything` -* `top_headlines` +- `everything` +- `top_headlines` ### Features | Feature | Supported? \(Yes/No\) | Notes | -|:------------------|:----------------------|:------| +| :---------------- | :-------------------- | :---- | | Full Refresh Sync | Yes | | | Incremental Sync | No | | @@ -56,7 +56,7 @@ The following fields are required fields for the connector to work: ## Changelog -| Version | Date | Pull Request | Subject | -|:--------|:-----------|:---------------------------------------------------------|:-----------| +| Version | Date | Pull Request | Subject | +| :------ | :--------- | :------------------------------------------------------- | :--------------------------------------- | | 0.1.1 | 2023-04-30 | [25554](https://github.com/airbytehq/airbyte/pull/25554) | Make manifest connector builder friendly | -| 0.1.0 | 2022-10-21 | [18301](https://github.com/airbytehq/airbyte/pull/18301) | New source | +| 0.1.0 | 2022-10-21 | [18301](https://github.com/airbytehq/airbyte/pull/18301) | New source | diff --git a/docs/integrations/sources/newsdata.md b/docs/integrations/sources/newsdata.md index 9446b593f6f..aa826b689fc 100644 --- a/docs/integrations/sources/newsdata.md +++ b/docs/integrations/sources/newsdata.md @@ -8,17 +8,17 @@ This source retrieves the latests news from the [Newsdata API](https://newsdata. 
This source is capable of syncing the following streams: -* `latest` -* `sources` - - __NOTE__: `category`, `language` and `country` input parameters only accept a single value, not multiple like `latest` stream. - Thus, if several values are supplied, the first one will be the one to be used. +- `latest` +- `sources` + - **NOTE**: `category`, `language` and `country` input parameters only accept a single value, not multiple like `latest` stream. + Thus, if several values are supplied, the first one will be the one to be used. If there are more endpoints you'd like Airbyte to support, please [create an issue.](https://github.com/airbytehq/airbyte/issues/new/choose) ### Features | Feature | Supported? | Notes | -|:------------------|------------|:------| +| :---------------- | ---------- | :---- | | Full Refresh Sync | Yes | | | Incremental Sync | No | | @@ -43,9 +43,9 @@ The following fields are required fields for the connector to work: ## Changelog -| Version | Date | Pull Request | Subject | -|:--------|:-----------|:---------------------------------------------------------|:------------------------| -| 0.1.3 | 2024-04-19 | [37203](https://github.com/airbytehq/airbyte/pull/37203) | Upgrade to CDK 0.80.0 and manage dependencies with Poetry. | -| 0.1.2 | 2024-04-15 | [37203](https://github.com/airbytehq/airbyte/pull/37203) | Base image migration: remove Dockerfile and use the python-connector-base image | -| 0.1.1 | 2024-04-12 | [37203](https://github.com/airbytehq/airbyte/pull/37203) | schema descriptions | -| 0.1.0 | 2022-10-21 | [18576](https://github.com/airbytehq/airbyte/pull/18576) | 🎉 New Source: Newsdata | +| Version | Date | Pull Request | Subject | +| :------ | :--------- | :------------------------------------------------------- | :------------------------------------------------------------------------------ | +| 0.1.3 | 2024-04-19 | [37203](https://github.com/airbytehq/airbyte/pull/37203) | Upgrade to CDK 0.80.0 and manage dependencies with Poetry. | +| 0.1.2 | 2024-04-15 | [37203](https://github.com/airbytehq/airbyte/pull/37203) | Base image migration: remove Dockerfile and use the python-connector-base image | +| 0.1.1 | 2024-04-12 | [37203](https://github.com/airbytehq/airbyte/pull/37203) | schema descriptions | +| 0.1.0 | 2022-10-21 | [18576](https://github.com/airbytehq/airbyte/pull/18576) | 🎉 New Source: Newsdata | diff --git a/docs/integrations/sources/notion-migrations.md b/docs/integrations/sources/notion-migrations.md index 0f77fa22d13..bb52c859917 100644 --- a/docs/integrations/sources/notion-migrations.md +++ b/docs/integrations/sources/notion-migrations.md @@ -13,9 +13,9 @@ If you are not syncing data from the `Comments` stream, this change is non-break Data for the `Comments` stream will need to cleared to ensure your syncs continue successfully. To clear your data for the `Comments` stream, follow the steps below: 1. Select **Connections** in the main nav bar. - 1. Select the connection(s) affected by the update. + 1. Select the connection(s) affected by the update. 2. Select the **Status** tab. - 1. In the **Enabled streams** list, click the three dots on the right side of the **Comments** stream and select **Clear data**. + 1. In the **Enabled streams** list, click the three dots on the right side of the **Comments** stream and select **Clear data**. After the clear succeeds, trigger a sync for the `Comments` stream by clicking "Sync Now". For more information on clearing your data in Airbyte, see [this page](https://docs.airbyte.com/operator-guides/reset). 
diff --git a/docs/integrations/sources/notion.md b/docs/integrations/sources/notion.md index 553118163d3..b58bd7a0032 100644 --- a/docs/integrations/sources/notion.md +++ b/docs/integrations/sources/notion.md @@ -14,9 +14,11 @@ To authenticate the Notion source connector, you need to use **one** of the foll - Access Token + :::note **For Airbyte Cloud users:** We highly recommend using OAuth2.0 authorization to connect to Notion, as this method significantly simplifies the setup process. If you use OAuth2.0 authorization in Airbyte Cloud, you do **not** need to create and configure a new integration in Notion. Instead, you can proceed straight to [setting up the connector in Airbyte](#step-3-set-up-the-notion-connector-in-airbyte). ::: + We have provided a quick setup guide for creating an integration in Notion below. If you would like more detailed information and context on Notion integrations, or experience any difficulties with the integration setup process, please refer to the [official Notion documentation](https://developers.notion.com/docs). @@ -66,6 +68,7 @@ If you are authenticating via OAuth2.0 for **Airbyte Open Source**, you will nee 5. Choose the method of authentication from the dropdown menu: + #### Authentication for Airbyte Cloud - **OAuth2.0** (Recommended): Click **Authenticate your Notion account**. When the popup appears, click **Select pages**. Check the pages you want to give Airbyte access to, and click **Allow access**. @@ -73,6 +76,7 @@ If you are authenticating via OAuth2.0 for **Airbyte Open Source**, you will nee + #### Authentication for Airbyte Open Source - **Access Token**: Copy and paste the Access Token found in the **Secrets** tab of your private integration's page. @@ -87,12 +91,12 @@ If you are authenticating via OAuth2.0 for **Airbyte Open Source**, you will nee The Notion source connector supports the following [sync modes](https://docs.airbyte.com/cloud/core-concepts#connection-sync-modes): | Stream | Full Refresh (Overwrite/Append) | Incremental (Append/Append + Deduped) | -|-----------|:------------:|:-----------:| -| Blocks | ✓ | ✓ | -| Comments | ✓ | ✓ | -| Databases | ✓ | ✓ | -| Pages | ✓ | ✓ | -| Users | ✓ | | +| --------- | :-----------------------------: | :-----------------------------------: | +| Blocks | ✓ | ✓ | +| Comments | ✓ | ✓ | +| Databases | ✓ | ✓ | +| Pages | ✓ | ✓ | +| Users | ✓ | | ## Supported Streams @@ -111,7 +115,7 @@ The connector is restricted by Notion [request limits](https://developers.notion ## Changelog | Version | Date | Pull Request | Subject | -|:--------|:-----------|:---------------------------------------------------------|:-----------------------------------------------------------------------------------------------------| +| :------ | :--------- | :------------------------------------------------------- | :--------------------------------------------------------------------------------------------------- | | 3.0.1 | 2024-04-24 | [36653](https://github.com/airbytehq/airbyte/pull/36653) | Schema descriptions and CDK 0.80.0 | | 3.0.0 | 2024-04-12 | [35794](https://github.com/airbytehq/airbyte/pull/35974) | Migrate to low-code CDK (python CDK for Blocks stream) | | 2.2.0 | 2024-04-08 | [36890](https://github.com/airbytehq/airbyte/pull/36890) | Unpin CDK version | diff --git a/docs/integrations/sources/nytimes.md b/docs/integrations/sources/nytimes.md index a7569d16e79..ee842f954a2 100644 --- a/docs/integrations/sources/nytimes.md +++ b/docs/integrations/sources/nytimes.md @@ -8,17 +8,17 @@ The New York Times 
source supports full refresh syncs Several output streams are available from this source: -*[Archive](https://developer.nytimes.com/docs/archive-product/1/overview). -*[Most Popular Emailed Articles](https://developer.nytimes.com/docs/most-popular-product/1/routes/emailed/%7Bperiod%7D.json/get). -*[Most Popular Shared Articles](https://developer.nytimes.com/docs/most-popular-product/1/routes/shared/%7Bperiod%7D.json/get). -*[Most Popular Viewed Articles](https://developer.nytimes.com/docs/most-popular-product/1/routes/viewed/%7Bperiod%7D.json/get). +_[Archive](https://developer.nytimes.com/docs/archive-product/1/overview). +_[Most Popular Emailed Articles](https://developer.nytimes.com/docs/most-popular-product/1/routes/emailed/%7Bperiod%7D.json/get). +_[Most Popular Shared Articles](https://developer.nytimes.com/docs/most-popular-product/1/routes/shared/%7Bperiod%7D.json/get). +_[Most Popular Viewed Articles](https://developer.nytimes.com/docs/most-popular-product/1/routes/viewed/%7Bperiod%7D.json/get). If there are more endpoints you'd like Airbyte to support, please [create an issue.](https://github.com/airbytehq/airbyte/issues/new/choose) ### Features | Feature | Supported? | -|:------------------|:-----------| +| :---------------- | :--------- | | Full Refresh Sync | Yes | | Incremental Sync | Yes | @@ -30,7 +30,7 @@ The New York Times connector should not run into limitations under normal usage. ### Requirements -* New York Times API Key. +- New York Times API Key. ### Connect using `API Key`: @@ -40,11 +40,11 @@ The New York Times connector should not run into limitations under normal usage. ## Changelog -| Version | Date | Pull Request | Subject | -|:--------|:-----------|:---------------------------------------------------------|:------------------------------------------------| -| 0.1.5 | 2024-04-19 | [37204](https://github.com/airbytehq/airbyte/pull/37204) | Updating to 0.80.0 CDK | -| 0.1.4 | 2024-04-18 | [37204](https://github.com/airbytehq/airbyte/pull/37204) | Manage dependencies with Poetry. | -| 0.1.3 | 2024-04-15 | [37204](https://github.com/airbytehq/airbyte/pull/37204) | Base image migration: remove Dockerfile and use the python-connector-base image | -| 0.1.2 | 2024-04-12 | [37204](https://github.com/airbytehq/airbyte/pull/37204) | schema descriptions | -| 0.1.1 | 2023-02-13 | [22925](https://github.com/airbytehq/airbyte/pull/22925) | Specified date formatting in specification | -| 0.1.0 | 2022-11-01 | [18746](https://github.com/airbytehq/airbyte/pull/18746) | 🎉 New Source: New York Times | +| Version | Date | Pull Request | Subject | +| :------ | :--------- | :------------------------------------------------------- | :------------------------------------------------------------------------------ | +| 0.1.5 | 2024-04-19 | [37204](https://github.com/airbytehq/airbyte/pull/37204) | Updating to 0.80.0 CDK | +| 0.1.4 | 2024-04-18 | [37204](https://github.com/airbytehq/airbyte/pull/37204) | Manage dependencies with Poetry. 
| +| 0.1.3 | 2024-04-15 | [37204](https://github.com/airbytehq/airbyte/pull/37204) | Base image migration: remove Dockerfile and use the python-connector-base image | +| 0.1.2 | 2024-04-12 | [37204](https://github.com/airbytehq/airbyte/pull/37204) | schema descriptions | +| 0.1.1 | 2023-02-13 | [22925](https://github.com/airbytehq/airbyte/pull/22925) | Specified date formatting in specification | +| 0.1.0 | 2022-11-01 | [18746](https://github.com/airbytehq/airbyte/pull/18746) | 🎉 New Source: New York Times | diff --git a/docs/integrations/sources/okta.md b/docs/integrations/sources/okta.md index 2cb60f58492..ae0de902d20 100644 --- a/docs/integrations/sources/okta.md +++ b/docs/integrations/sources/okta.md @@ -3,22 +3,26 @@ Okta is the complete identity solution for all your apps and people that’s universal, reliable, and easy ## Prerequisites -* Created Okta account with added application on [Add Application Page](https://okta-domain.okta.com/enduser/catalog) page. (change okta-domain to you'r domain received after complete registration) + +- Created Okta account with added application on [Add Application Page](https://okta-domain.okta.com/enduser/catalog) page. (change okta-domain to you'r domain received after complete registration) ## Airbyte Open Source -* Name -* Okta-Domain -* Start Date -* Personal Api Token (look [here](https://developer.okta.com/docs/guides/find-your-domain/-/main/) to find it) + +- Name +- Okta-Domain +- Start Date +- Personal Api Token (look [here](https://developer.okta.com/docs/guides/find-your-domain/-/main/) to find it) ## Airbyte Cloud -* Name -* Start Date -* Client ID (received when application was added). -* Client Secret (received when application was added). -* Refresh Token (received when application was added) + +- Name +- Start Date +- Client ID (received when application was added). +- Client Secret (received when application was added). +- Refresh Token (received when application was added) ## Setup guide + ### Step 1: Set up Okta 1. Create account on Okta by following link [signup](https://www.okta.com/free-trial/) @@ -35,29 +39,29 @@ Okta is the complete identity solution for all your apps and people that’s uni 5. Add **Okta Domain** (If your Okta URL is `https://MY_DOMAIN.okta.com/`, then `MY_DOMAIN` is your Okta domain.) 6. Add **Start date** (defaults to 7 days if no date is included) 7. Choose the method of authentication -8. If you select Token authentication - fill the field **Personal Api Token** +8. If you select Token authentication - fill the field **Personal Api Token** 9. If you select OAuth2.0 authorization - fill the fields **Client ID**, **Client Secret**, **Refresh Token** 10. Click `Set up source`. ### For Airbyte Open Source: 1. Go to local Airbyte page. -2. Use API token from requirements and Okta [domain](https://developer.okta.com/docs/guides/find-your-domain/-/main/). +2. Use API token from requirements and Okta [domain](https://developer.okta.com/docs/guides/find-your-domain/-/main/). 3. Go to local Airbyte page. -4. In the left navigation bar, click **Sources**. In the top-right corner, click **+ new source**. -5. On the Set up the source page select **Okta** from the Source type dropdown. +4. In the left navigation bar, click **Sources**. In the top-right corner, click **+ new source**. +5. On the Set up the source page select **Okta** from the Source type dropdown. 6. Add **Name** 7. Add **Okta-Domain** 8. Add **Start date** 9. 
Paste all data to required fields fill the fields **Client ID**, **Client Secret**, **Refresh Token** 10. Click `Set up source`. - ## Supported sync modes The Okta source connector supports the following [sync modes](https://docs.airbyte.com/cloud/core-concepts#connection-sync-modes): - - Full Refresh - - Incremental + +- Full Refresh +- Incremental ## Supported Streams @@ -78,14 +82,14 @@ The connector is restricted by normal Okta [requests limitation](https://develop ## Changelog | Version | Date | Pull Request | Subject | -|:--------|:-----------|:---------------------------------------------------------|:-------------------------------------------------------------------------------| +| :------ | :--------- | :------------------------------------------------------- | :----------------------------------------------------------------------------- | | 0.1.16 | 2023-07-07 | [20833](https://github.com/airbytehq/airbyte/pull/20833) | Fix infinite loop for GroupMembers stream | -| 0.1.15 | 2023-06-20 | [27533](https://github.com/airbytehq/airbyte/pull/27533) | Fixed group member stream and resource sets stream pagination | +| 0.1.15 | 2023-06-20 | [27533](https://github.com/airbytehq/airbyte/pull/27533) | Fixed group member stream and resource sets stream pagination | | 0.1.14 | 2022-12-24 | [20877](https://github.com/airbytehq/airbyte/pull/20877) | Disabled OAuth2.0 authorization method | | 0.1.13 | 2022-08-12 | [14700](https://github.com/airbytehq/airbyte/pull/14700) | Add resource sets | | 0.1.12 | 2022-08-05 | [15050](https://github.com/airbytehq/airbyte/pull/15050) | Add parameter `start_date` for Logs stream | | 0.1.11 | 2022-08-03 | [14739](https://github.com/airbytehq/airbyte/pull/14739) | Add permissions for custom roles | -| 0.1.10 | 2022-08-01 | [15179](https://github.com/airbytehq/airbyte/pull/15179) | Fix broken schemas for all streams | +| 0.1.10 | 2022-08-01 | [15179](https://github.com/airbytehq/airbyte/pull/15179) | Fix broken schemas for all streams | | 0.1.9 | 2022-07-25 | [15001](https://github.com/airbytehq/airbyte/pull/15001) | Return deprovisioned users | | 0.1.8 | 2022-07-19 | [14710](https://github.com/airbytehq/airbyte/pull/14710) | Implement OAuth2.0 authorization method | | 0.1.7 | 2022-07-13 | [14556](https://github.com/airbytehq/airbyte/pull/14556) | Add User_Role_Assignments and Group_Role_Assignments streams (full fetch only) | diff --git a/docs/integrations/sources/omnisend.md b/docs/integrations/sources/omnisend.md index 613b9c54b8a..c1e4b3cd6e6 100644 --- a/docs/integrations/sources/omnisend.md +++ b/docs/integrations/sources/omnisend.md @@ -6,18 +6,18 @@ This source can sync data from the [Omnisend API](https://api-docs.omnisend.com/ ## This Source Supports the Following Streams -* contacts -* campaigns -* carts -* orders -* products +- contacts +- campaigns +- carts +- orders +- products ### Features -| Feature | Supported?\(Yes/No\) | Notes | -| :--- | :--- | :--- | -| Full Refresh Sync | Yes | | -| Incremental Sync | No | | +| Feature | Supported?\(Yes/No\) | Notes | +| :---------------- | :------------------- | :---- | +| Full Refresh Sync | Yes | | +| Incremental Sync | No | | ### Performance considerations @@ -27,10 +27,10 @@ The connector has a rate limit of 400 requests per 1 minute. 
### Requirements -* Omnisend API Key +- Omnisend API Key ## Changelog -| Version | Date | Pull Request | Subject | -|:--------|:-----------| :----------- |:-----------------------------------------------------------| -| 0.1.0 | 2022-10-25 | [18577](https://github.com/airbytehq/airbyte/pull/18577) | Initial commit | \ No newline at end of file +| Version | Date | Pull Request | Subject | +| :------ | :--------- | :------------------------------------------------------- | :------------- | +| 0.1.0 | 2022-10-25 | [18577](https://github.com/airbytehq/airbyte/pull/18577) | Initial commit | diff --git a/docs/integrations/sources/onesignal.md b/docs/integrations/sources/onesignal.md index 2f9df7587ac..c8978a3c941 100644 --- a/docs/integrations/sources/onesignal.md +++ b/docs/integrations/sources/onesignal.md @@ -74,7 +74,7 @@ The connector is restricted by normal OneSignal [rate limits](https://documentat ## Changelog | Version | Date | Pull Request | Subject | -|:--------|:-----------|:---------------------------------------------------------|:---------------------------------------------| +| :------ | :--------- | :------------------------------------------------------- | :------------------------------------------- | | 1.1.0 | 2023-08-31 | [28941](https://github.com/airbytehq/airbyte/pull/28941) | Migrate connector to low-code | | 1.0.1 | 2023-03-14 | [24076](https://github.com/airbytehq/airbyte/pull/24076) | Fix schema and add additionalProperties true | | 1.0.0 | 2023-03-14 | [24076](https://github.com/airbytehq/airbyte/pull/24076) | Update connectors spec; fix incremental sync | diff --git a/docs/integrations/sources/open-exchange-rates.md b/docs/integrations/sources/open-exchange-rates.md index d5f602908a3..6b3a33ef2e4 100644 --- a/docs/integrations/sources/open-exchange-rates.md +++ b/docs/integrations/sources/open-exchange-rates.md @@ -10,9 +10,9 @@ It contains one stream: `open_exchange_rates` Each record in the stream contains many fields: -* The timestamp of the record -* Base currency -* The conversion rates from the base currency to the target currency +- The timestamp of the record +- Base currency +- The conversion rates from the base currency to the target currency #### Data type mapping @@ -20,17 +20,17 @@ Currencies are `number` and the date is a `string`. #### Features -| Feature | Supported? | -| :--- | :--- | -| Full Refresh Sync | Yes | -| Incremental - Append Sync | Yes | -| Namespaces | No | +| Feature | Supported? | +| :------------------------ | :--------- | +| Full Refresh Sync | Yes | +| Incremental - Append Sync | Yes | +| Namespaces | No | ### Getting started ### Requirements -* App ID +- App ID ### Setup guide @@ -43,11 +43,11 @@ If you have `free` subscription plan \(you may check it [here](https://openexcha ## CHANGELOG -| Version | Date | Pull Request | Subject | -| :------ | :--------- | :------------------------------------------------------- | :--------------------------------------------------------------------------------------------------- | -| 0.2.4 | 2024-04-19 | [37208](https://github.com/airbytehq/airbyte/pull/37208) | Updating to 0.80.0 CDK | -| 0.2.3 | 2024-04-18 | [37208](https://github.com/airbytehq/airbyte/pull/37208) | Manage dependencies with Poetry. 
| -| 0.2.2 | 2024-04-15 | [37208](https://github.com/airbytehq/airbyte/pull/37208) | Base image migration: remove Dockerfile and use the python-connector-base image | -| 0.2.1 | 2024-04-12 | [37208](https://github.com/airbytehq/airbyte/pull/37208) | schema descriptions | -| 0.2.0 | 2023-10-03 | [30983](https://github.com/airbytehq/airbyte/pull/30983) | Migrate to low code | -| 0.1.0 | 2022-11-15 | [19436](https://github.com/airbytehq/airbyte/issues/19436) | Created CDK native Open Exchange Rates connector | +| Version | Date | Pull Request | Subject | +| :------ | :--------- | :--------------------------------------------------------- | :------------------------------------------------------------------------------ | +| 0.2.4 | 2024-04-19 | [37208](https://github.com/airbytehq/airbyte/pull/37208) | Updating to 0.80.0 CDK | +| 0.2.3 | 2024-04-18 | [37208](https://github.com/airbytehq/airbyte/pull/37208) | Manage dependencies with Poetry. | +| 0.2.2 | 2024-04-15 | [37208](https://github.com/airbytehq/airbyte/pull/37208) | Base image migration: remove Dockerfile and use the python-connector-base image | +| 0.2.1 | 2024-04-12 | [37208](https://github.com/airbytehq/airbyte/pull/37208) | schema descriptions | +| 0.2.0 | 2023-10-03 | [30983](https://github.com/airbytehq/airbyte/pull/30983) | Migrate to low code | +| 0.1.0 | 2022-11-15 | [19436](https://github.com/airbytehq/airbyte/issues/19436) | Created CDK native Open Exchange Rates connector | diff --git a/docs/integrations/sources/openweather.md b/docs/integrations/sources/openweather.md index af164bfae79..86f532e7442 100644 --- a/docs/integrations/sources/openweather.md +++ b/docs/integrations/sources/openweather.md @@ -10,36 +10,36 @@ This source currently has a single stream, `openweather_one_call`. An example of ### Features -| Feature | Supported? | -| :--- | :--- | -| Full Refresh Sync - (append only) | Yes | -| Incremental - Append Sync | Yes | -| Namespaces | No | +| Feature | Supported? | +| :-------------------------------- | :--------- | +| Full Refresh Sync - (append only) | Yes | +| Incremental - Append Sync | Yes | +| Namespaces | No | ## Getting started ### Requirements -* An OpenWeather API key -* Latitude and longitude of the location for which you want to get weather data +- An OpenWeather API key +- Latitude and longitude of the location for which you want to get weather data ### Setup guide -Visit the [OpenWeather](https://openweathermap.org) to create a user account and obtain an API key. The *One Call API* is available with the free plan. +Visit the [OpenWeather](https://openweathermap.org) to create a user account and obtain an API key. The _One Call API_ is available with the free plan. ## Rate limiting + The free plan allows 60 calls per minute and 1,000,000 calls per month, you won't get beyond these limits with existing Airbyte's sync frequencies. ## Changelog -| Version | Date | Pull Request | Subject | -| :--- | :--- | :--- | :--- | -| 0.2.3 | 2024-04-19 | [37209](https://github.com/airbytehq/airbyte/pull/37209) | Upgrade to CDK 0.80.0 and manage dependencies with Poetry. 
| -| 0.2.2 | 2024-04-15 | [37209](https://github.com/airbytehq/airbyte/pull/37209) | Base image migration: remove Dockerfile and use the python-connector-base image | -| 0.2.1 | 2024-04-12 | [37209](https://github.com/airbytehq/airbyte/pull/37209) | schema descriptions | -| 0.2.0 | 2023-08-31 | [29983](https://github.com/airbytehq/airbyte/pull/29983) | Migrate to Low Code Framework | -| 0.1.6 | 2022-06-21 | [16136](https://github.com/airbytehq/airbyte/pull/16136) | Update openweather onecall api to 3.0. | -| 0.1.5 | 2022-06-21 | [13864](https://github.com/airbytehq/airbyte/pull/13864) | No changes. Used connector to test publish workflow changes. | -| 0.1.4 | 2022-04-27 | [12397](https://github.com/airbytehq/airbyte/pull/12397) | No changes. Used connector to test publish workflow changes. | -| 0.1.0 | 2021-10-27 | [7434](https://github.com/airbytehq/airbyte/pull/7434) | Initial release | - +| Version | Date | Pull Request | Subject | +| :------ | :--------- | :------------------------------------------------------- | :------------------------------------------------------------------------------ | +| 0.2.3 | 2024-04-19 | [37209](https://github.com/airbytehq/airbyte/pull/37209) | Upgrade to CDK 0.80.0 and manage dependencies with Poetry. | +| 0.2.2 | 2024-04-15 | [37209](https://github.com/airbytehq/airbyte/pull/37209) | Base image migration: remove Dockerfile and use the python-connector-base image | +| 0.2.1 | 2024-04-12 | [37209](https://github.com/airbytehq/airbyte/pull/37209) | schema descriptions | +| 0.2.0 | 2023-08-31 | [29983](https://github.com/airbytehq/airbyte/pull/29983) | Migrate to Low Code Framework | +| 0.1.6 | 2022-06-21 | [16136](https://github.com/airbytehq/airbyte/pull/16136) | Update openweather onecall api to 3.0. | +| 0.1.5 | 2022-06-21 | [13864](https://github.com/airbytehq/airbyte/pull/13864) | No changes. Used connector to test publish workflow changes. | +| 0.1.4 | 2022-04-27 | [12397](https://github.com/airbytehq/airbyte/pull/12397) | No changes. Used connector to test publish workflow changes. 
| +| 0.1.0 | 2021-10-27 | [7434](https://github.com/airbytehq/airbyte/pull/7434) | Initial release | diff --git a/docs/integrations/sources/opsgenie.md b/docs/integrations/sources/opsgenie.md index 96f43541d46..ab18c01861d 100644 --- a/docs/integrations/sources/opsgenie.md +++ b/docs/integrations/sources/opsgenie.md @@ -8,23 +8,23 @@ This page contains the setup guide and reference information for the Opsgenie so This connector outputs the following streams: -* [Alerts](https://docs.opsgenie.com/docs/alert-api) \(Incremental\) -* [Alert Logs](https://docs.opsgenie.com/docs/alert-api-continued#list-alert-logs) \(Incremental\) -* [Alert Recipients](https://docs.opsgenie.com/docs/alert-api-continued#list-alert-recipients) \(Incremental\) -* [Services](https://docs.opsgenie.com/docs/service-api) -* [Incidents](https://docs.opsgenie.com/docs/incident-api) \(Incremental\) -* [Integrations](https://docs.opsgenie.com/docs/integration-api) -* [Users](https://docs.opsgenie.com/docs/user-api) -* [Teams](https://docs.opsgenie.com/docs/team-api) -* [Team Members](https://docs.opsgenie.com/docs/team-member-api) +- [Alerts](https://docs.opsgenie.com/docs/alert-api) \(Incremental\) +- [Alert Logs](https://docs.opsgenie.com/docs/alert-api-continued#list-alert-logs) \(Incremental\) +- [Alert Recipients](https://docs.opsgenie.com/docs/alert-api-continued#list-alert-recipients) \(Incremental\) +- [Services](https://docs.opsgenie.com/docs/service-api) +- [Incidents](https://docs.opsgenie.com/docs/incident-api) \(Incremental\) +- [Integrations](https://docs.opsgenie.com/docs/integration-api) +- [Users](https://docs.opsgenie.com/docs/user-api) +- [Teams](https://docs.opsgenie.com/docs/team-api) +- [Team Members](https://docs.opsgenie.com/docs/team-member-api) ### Features -| Feature | Supported? | -|:--------------------------| :--- | -| Full Refresh Sync | Yes | +| Feature | Supported? | +| :------------------------ | :---------------------------- | +| Full Refresh Sync | Yes | | Incremental - Append Sync | Partially \(not all streams\) | -| EU Instance | Yes | +| EU Instance | Yes | ### Performance Considerations @@ -34,8 +34,8 @@ Opsgenie has [rate limits](https://docs.opsgenie.com/docs/api-rate-limiting), bu ### Requirements -* Opsgenie Account -* Opsgenie API Key wih the necessary permissions \(described below\) +- Opsgenie Account +- Opsgenie API Key wih the necessary permissions \(described below\) ### Setup Guide @@ -49,13 +49,13 @@ The Opsgenie connector uses the most recent API version for each source of data. ## Changelog -| Version | Date | Pull Request | Subject | -|:--------|:-----------|:-----------------------------------------------------| :--- | -| 0.3.5 | 2024-04-19 | [37210](https://github.com/airbytehq/airbyte/pull/37210) | Updating to 0.80.0 CDK | -| 0.3.4 | 2024-04-18 | [37210](https://github.com/airbytehq/airbyte/pull/37210) | Manage dependencies with Poetry. 
| -| 0.3.3 | 2024-04-15 | [37210](https://github.com/airbytehq/airbyte/pull/37210) | Base image migration: remove Dockerfile and use the python-connector-base image | -| 0.3.2 | 2024-04-12 | [37210](https://github.com/airbytehq/airbyte/pull/37210) | schema descriptions | -| 0.3.1 | 2024-02-14 | [35269](https://github.com/airbytehq/airbyte/pull/35269) | Fix parsing of updated_at timestamps in alerts | -| 0.3.0 | 2023-10-19 | [31552](https://github.com/airbytehq/airbyte/pull/31552) | Migrated to Low Code | -| 0.2.0 | 2023-10-24 | [31777](https://github.com/airbytehq/airbyte/pull/31777) | Fix schema | -| 0.1.0 | 2022-09-14 | [16768](https://github.com/airbytehq/airbyte/pull/16768) | Initial Release | +| Version | Date | Pull Request | Subject | +| :------ | :--------- | :------------------------------------------------------- | :------------------------------------------------------------------------------ | +| 0.3.5 | 2024-04-19 | [37210](https://github.com/airbytehq/airbyte/pull/37210) | Updating to 0.80.0 CDK | +| 0.3.4 | 2024-04-18 | [37210](https://github.com/airbytehq/airbyte/pull/37210) | Manage dependencies with Poetry. | +| 0.3.3 | 2024-04-15 | [37210](https://github.com/airbytehq/airbyte/pull/37210) | Base image migration: remove Dockerfile and use the python-connector-base image | +| 0.3.2 | 2024-04-12 | [37210](https://github.com/airbytehq/airbyte/pull/37210) | schema descriptions | +| 0.3.1 | 2024-02-14 | [35269](https://github.com/airbytehq/airbyte/pull/35269) | Fix parsing of updated_at timestamps in alerts | +| 0.3.0 | 2023-10-19 | [31552](https://github.com/airbytehq/airbyte/pull/31552) | Migrated to Low Code | +| 0.2.0 | 2023-10-24 | [31777](https://github.com/airbytehq/airbyte/pull/31777) | Fix schema | +| 0.1.0 | 2022-09-14 | [16768](https://github.com/airbytehq/airbyte/pull/16768) | Initial Release | diff --git a/docs/integrations/sources/oracle-peoplesoft.md b/docs/integrations/sources/oracle-peoplesoft.md index 3e721147b3b..c428f59372b 100644 --- a/docs/integrations/sources/oracle-peoplesoft.md +++ b/docs/integrations/sources/oracle-peoplesoft.md @@ -6,9 +6,9 @@ Oracle PeopleSoft can run on the [Oracle, MSSQL, or IBM DB2](https://docs.oracle.com/en/applications/peoplesoft/peopletools/index.html) databases. You can use Airbyte to sync your Oracle PeopleSoft instance by connecting to the underlying database using the appropriate Airbyte connector: -* [DB2](db2.md) -* [MSSQL](mssql.md) -* [Oracle](oracle.md) +- [DB2](db2.md) +- [MSSQL](mssql.md) +- [Oracle](oracle.md) :::info @@ -19,4 +19,3 @@ Reach out to your service representative or system admin to find the parameters ### Output schema The schema will be loaded according to the rules of the underlying database's connector. Oracle provides ERD diagrams but they are behind a paywall. Contact your Oracle rep to gain access. - diff --git a/docs/integrations/sources/oracle-siebel-crm.md b/docs/integrations/sources/oracle-siebel-crm.md index ee73bd13216..b5806179de2 100644 --- a/docs/integrations/sources/oracle-siebel-crm.md +++ b/docs/integrations/sources/oracle-siebel-crm.md @@ -6,9 +6,9 @@ Oracle Siebel CRM can run on the [Oracle, MSSQL, or IBM DB2](https://docs.oracle.com/cd/E88140_01/books/DevDep/installing-and-configuring-siebel-crm.html#PrerequisiteSoftware) databases. 
You can use Airbyte to sync your Oracle Siebel CRM instance by connecting to the underlying database using the appropriate Airbyte connector: -* [DB2](db2.md) -* [MSSQL](mssql.md) -* [Oracle](oracle.md) +- [DB2](db2.md) +- [MSSQL](mssql.md) +- [Oracle](oracle.md) :::info @@ -19,4 +19,3 @@ Reach out to your service representative or system admin to find the parameters ### Output schema To understand your Oracle Siebel CRM database schema, see the [Organization Setup Overview docs](https://docs.oracle.com/cd/E88140_01/books/DevDep/basic-organization-setup-overview.html#basic-organization-setup-overview) documentation. Otherwise, the schema will be loaded according to the rules of the underlying database's connector. - diff --git a/docs/integrations/sources/oracle.md b/docs/integrations/sources/oracle.md index 61b9b69b0f4..645903cc2f8 100644 --- a/docs/integrations/sources/oracle.md +++ b/docs/integrations/sources/oracle.md @@ -131,8 +131,8 @@ Airbyte has the ability to connect to the Oracle source with 3 network connectiv ## Changelog | Version | Date | Pull Request | Subject | -|:--------|:-----------| :------------------------------------------------------- |:------------------------------------------------------------------------------------------------------------------------------------------| -| 0.5.2 | 2024-02-13 | [35225](https://github.com/airbytehq/airbyte/pull/35225) | Adopt CDK 0.20.4 | +| :------ | :--------- | :------------------------------------------------------- | :---------------------------------------------------------------------------------------------------------------------------------------- | +| 0.5.2 | 2024-02-13 | [35225](https://github.com/airbytehq/airbyte/pull/35225) | Adopt CDK 0.20.4 | | 0.5.1 | 2024-01-24 | [34453](https://github.com/airbytehq/airbyte/pull/34453) | bump CDK version | | 0.5.0 | 2023-12-18 | [33485](https://github.com/airbytehq/airbyte/pull/33485) | Remove LEGACY state | | 0.4.0 | 2023-06-26 | [27737](https://github.com/airbytehq/airbyte/pull/27737) | License Update: Elv2 | diff --git a/docs/integrations/sources/orb.md b/docs/integrations/sources/orb.md index 8d2d9d4e38d..6c21ac3c28b 100644 --- a/docs/integrations/sources/orb.md +++ b/docs/integrations/sources/orb.md @@ -9,11 +9,11 @@ will only read and output new records based on their `created_at` timestamp. This Source is capable of syncing the following core resources, each of which has a separate Stream. Note that all of the streams are incremental: -* [Subscriptions](https://docs.withorb.com/reference/list-subscriptions) -* [Plans](https://docs.withorb.com/reference/list-plans) -* [Customers](https://docs.withorb.com/reference/list-customers) -* [Credits Ledger Entries](https://docs.withorb.com/reference/fetch-customer-credits-ledger) -* [Subscription Usage](https://docs.withorb.com/reference/fetch-subscription-usage) +- [Subscriptions](https://docs.withorb.com/reference/list-subscriptions) +- [Plans](https://docs.withorb.com/reference/list-plans) +- [Customers](https://docs.withorb.com/reference/list-customers) +- [Credits Ledger Entries](https://docs.withorb.com/reference/fetch-customer-credits-ledger) +- [Subscription Usage](https://docs.withorb.com/reference/fetch-subscription-usage) As a caveat, the Credits Ledger Entries must read all Customers for an incremental sync, but will only incrementally return new ledger entries for each customers. @@ -27,12 +27,12 @@ In order to capture data that has been updated after creation, please run a peri ### Features -| Feature | Supported? 
| -| :--- | :--- | -| Full Refresh Sync | Yes | -| Incremental - Append Sync | Yes | -| Incremental - Dedupe Sync | Yes | -| SSL connection | Yes | +| Feature | Supported? | +| :------------------------ | :--------- | +| Full Refresh Sync | Yes | +| Incremental - Append Sync | Yes | +| Incremental - Dedupe Sync | Yes | +| SSL connection | Yes | ### Performance considerations @@ -43,15 +43,15 @@ The `credit_ledger_entries` stream will now include `events` data. This upgrade ::: :::info -If you are using the `start_date` and `end_date` parameter with the `credit_ledger_entries` stream it will sync all customers created during the that time window. It isn't possible to query data directly to `credit_ledger_entries`. The connector need to retrieve data from customers first to ingest the credit data. +If you are using the `start_date` and `end_date` parameter with the `credit_ledger_entries` stream it will sync all customers created during the that time window. It isn't possible to query data directly to `credit_ledger_entries`. The connector need to retrieve data from customers first to ingest the credit data. ::: ## Getting started ### Requirements -* Orb Account -* Orb API Key +- Orb Account +- Orb API Key ### Setup guide @@ -60,19 +60,18 @@ an Orb Account and API Key. ## Changelog -| Version | Date | Pull Request | Subject | -| --- |------------|----------------------------------------------------------| --- | -| 1.2.2 | 2024-04-19 | [37211](https://github.com/airbytehq/airbyte/pull/37211) | Updating to 0.80.0 CDK | -| 1.2.1 | 2024-04-12 | [37211](https://github.com/airbytehq/airbyte/pull/37211) | schema descriptions | -| 1.2.0 | 2024-03-19 | [x](https://github.com/airbytehq/airbyte/pull/x) | Expose `end_date`parameter | -| 1.1.2 | 2024-03-13 | [x](https://github.com/airbytehq/airbyte/pull/x) | Fix window to 30 days for events query timesframe start and query | -| 1.1.1 | 2024-02-07 | [35005](https://github.com/airbytehq/airbyte/pull/35005) | Pass timeframe_start, timeframe_end to events query | -| 1.1.0 | 2023-03-03 | [24567](https://github.com/airbytehq/airbyte/pull/24567) | Add Invoices incremental stream merged from [#24737](https://github.com/airbytehq/airbyte/pull/24737) | -| 1.0.0 | 2023-02-02 | [21951](https://github.com/airbytehq/airbyte/pull/21951) | Add SubscriptionUsage stream, and made `start_date` a required field | -| 0.1.4 | 2022-10-07 | [17761](https://github.com/airbytehq/airbyte/pull/17761) | Fix bug with enriching ledger entries with multiple credit blocks | -| 0.1.3 | 2022-08-26 | [16017](https://github.com/airbytehq/airbyte/pull/16017) | Add credit block id to ledger entries | -| 0.1.2 | 2022-04-20 | [11528](https://github.com/airbytehq/airbyte/pull/11528) | Add cost basis to ledger entries, update expiration date, sync only committed entries | -| 0.1.1 | 2022-03-03 | [10839](https://github.com/airbytehq/airbyte/pull/10839) | Support ledger entries with numeric properties + schema fixes | -| 0.1.0 | 2022-02-01 | | New Source: Orb | -| :--- | :--- | :--- | :--- | - +| Version | Date | Pull Request | Subject | +| ------- | ---------- | -------------------------------------------------------- | ----------------------------------------------------------------------------------------------------- | +| 1.2.2 | 2024-04-19 | [37211](https://github.com/airbytehq/airbyte/pull/37211) | Updating to 0.80.0 CDK | +| 1.2.1 | 2024-04-12 | [37211](https://github.com/airbytehq/airbyte/pull/37211) | schema descriptions | +| 1.2.0 | 2024-03-19 | 
[x](https://github.com/airbytehq/airbyte/pull/x) | Expose `end_date`parameter | +| 1.1.2 | 2024-03-13 | [x](https://github.com/airbytehq/airbyte/pull/x) | Fix window to 30 days for events query timesframe start and query | +| 1.1.1 | 2024-02-07 | [35005](https://github.com/airbytehq/airbyte/pull/35005) | Pass timeframe_start, timeframe_end to events query | +| 1.1.0 | 2023-03-03 | [24567](https://github.com/airbytehq/airbyte/pull/24567) | Add Invoices incremental stream merged from [#24737](https://github.com/airbytehq/airbyte/pull/24737) | +| 1.0.0 | 2023-02-02 | [21951](https://github.com/airbytehq/airbyte/pull/21951) | Add SubscriptionUsage stream, and made `start_date` a required field | +| 0.1.4 | 2022-10-07 | [17761](https://github.com/airbytehq/airbyte/pull/17761) | Fix bug with enriching ledger entries with multiple credit blocks | +| 0.1.3 | 2022-08-26 | [16017](https://github.com/airbytehq/airbyte/pull/16017) | Add credit block id to ledger entries | +| 0.1.2 | 2022-04-20 | [11528](https://github.com/airbytehq/airbyte/pull/11528) | Add cost basis to ledger entries, update expiration date, sync only committed entries | +| 0.1.1 | 2022-03-03 | [10839](https://github.com/airbytehq/airbyte/pull/10839) | Support ledger entries with numeric properties + schema fixes | +| 0.1.0 | 2022-02-01 | | New Source: Orb | +| :--- | :--- | :--- | :--- | diff --git a/docs/integrations/sources/orbit.md b/docs/integrations/sources/orbit.md index 5293e032378..9f3f28cf928 100644 --- a/docs/integrations/sources/orbit.md +++ b/docs/integrations/sources/orbit.md @@ -2,23 +2,23 @@ ## Sync overview -This source can sync data for the [Orbit API](https://docs.orbit.love/reference/about-the-orbit-api). It currently only supports Full Refresh syncs. +This source can sync data for the [Orbit API](https://docs.orbit.love/reference/about-the-orbit-api). It currently only supports Full Refresh syncs. ### Output schema This Source is capable of syncing the following core Streams: -* [Members](https://api.orbit.love/reference/get_workspace-slug-members) -* [Workspaces](https://docs.orbit.love/reference/get_workspaces-workspace-slug) +- [Members](https://api.orbit.love/reference/get_workspace-slug-members) +- [Workspaces](https://docs.orbit.love/reference/get_workspaces-workspace-slug) ### Features -| Feature | Supported?\(Yes/No\) | Notes | -| :--- | :--- | :--- | -| Full Refresh Sync | Yes | | -| Incremental Sync | No | | -| Namespaces | No | | -| Pagination | Yes | | +| Feature | Supported?\(Yes/No\) | Notes | +| :---------------- | :------------------- | :---- | +| Full Refresh Sync | Yes | | +| Incremental Sync | No | | +| Namespaces | No | | +| Pagination | Yes | | ### Performance considerations / Rate Limiting @@ -30,7 +30,7 @@ Please [create an issue](https://github.com/airbytehq/airbyte/issues) if you see ### Requirements -* Orbit API key - This can either be a workspace-tied key or a general personal key. +- Orbit API key - This can either be a workspace-tied key or a general personal key. ### Setup guide @@ -43,13 +43,13 @@ The Orbit API Key should be available to you immediately as an Orbit user. ## Changelog -| Version | Date | Pull Request | Subject | -| :--- | :--- | :--- | :--- | -| 0.3.4 | 2024-04-19 | [37212](https://github.com/airbytehq/airbyte/pull/37212) | Updating to 0.80.0 CDK | -| 0.3.3 | 2024-04-18 | [37212](https://github.com/airbytehq/airbyte/pull/37212) | Manage dependencies with Poetry. 
| -| 0.3.2 | 2024-04-15 | [37212](https://github.com/airbytehq/airbyte/pull/37212) | Base image migration: remove Dockerfile and use the python-connector-base image | -| 0.3.1 | 2024-04-12 | [37212](https://github.com/airbytehq/airbyte/pull/37212) | schema descriptions | -| 0.3.0 | 2023-10-25 | [30976](https://github.com/airbytehq/airbyte/pull/30976) | Migrate to low-code framework | -| 0.2.0 | 2023-10-23 | [31747](https://github.com/airbytehq/airbyte/pull/31747) | Update schema | -| 0.1.1 | 2022-06-28 | [14208](https://github.com/airbytehq/airbyte/pull/14208) | Remove unused schema | -| 0.1.0 | 2022-06-27 | [13390](https://github.com/airbytehq/airbyte/pull/13390) | Initial Release | +| Version | Date | Pull Request | Subject | +| :------ | :--------- | :------------------------------------------------------- | :------------------------------------------------------------------------------ | +| 0.3.4 | 2024-04-19 | [37212](https://github.com/airbytehq/airbyte/pull/37212) | Updating to 0.80.0 CDK | +| 0.3.3 | 2024-04-18 | [37212](https://github.com/airbytehq/airbyte/pull/37212) | Manage dependencies with Poetry. | +| 0.3.2 | 2024-04-15 | [37212](https://github.com/airbytehq/airbyte/pull/37212) | Base image migration: remove Dockerfile and use the python-connector-base image | +| 0.3.1 | 2024-04-12 | [37212](https://github.com/airbytehq/airbyte/pull/37212) | schema descriptions | +| 0.3.0 | 2023-10-25 | [30976](https://github.com/airbytehq/airbyte/pull/30976) | Migrate to low-code framework | +| 0.2.0 | 2023-10-23 | [31747](https://github.com/airbytehq/airbyte/pull/31747) | Update schema | +| 0.1.1 | 2022-06-28 | [14208](https://github.com/airbytehq/airbyte/pull/14208) | Remove unused schema | +| 0.1.0 | 2022-06-27 | [13390](https://github.com/airbytehq/airbyte/pull/13390) | Initial Release | diff --git a/docs/integrations/sources/oura.md b/docs/integrations/sources/oura.md index c302c69237f..c8b000a8886 100644 --- a/docs/integrations/sources/oura.md +++ b/docs/integrations/sources/oura.md @@ -22,7 +22,7 @@ This source is capable of syncing the following streams: ### Features | Feature | Supported? \(Yes/No\) | Notes | -|:------------------|:----------------------|:----------------------------------| +| :---------------- | :-------------------- | :-------------------------------- | | Full Refresh Sync | Yes | | | Incremental Sync | No | | | Multiple rings | No | May be implemented in the future. | @@ -52,5 +52,5 @@ The following fields are required fields for the connector to work: ## Changelog | Version | Date | Pull Request | Subject | -|:--------|:-----------|:---------------------------------------------------------|:-----------| +| :------ | :--------- | :------------------------------------------------------- | :--------- | | 0.1.0 | 2022-10-20 | [18224](https://github.com/airbytehq/airbyte/pull/18224) | New source | diff --git a/docs/integrations/sources/outbrain-amplify.md b/docs/integrations/sources/outbrain-amplify.md index 3ebe25b2b8f..1c6af212881 100644 --- a/docs/integrations/sources/outbrain-amplify.md +++ b/docs/integrations/sources/outbrain-amplify.md @@ -8,41 +8,41 @@ This source can sync data for the [Outbrain Amplify API](https://amplifyv01.docs This Source is capable of syncing the following core Streams: -* marketers stream. -* campaigns by marketers stream.-Incremental -* campaigns geo-location stream. -* promoted links for campaigns stream. -* promoted links sequence for campaigns stream. -* budgets for marketers stream. 
-* performance report campaigns by marketers stream. -* performance report periodic by marketers stream. -* performance report periodic by marketers campaign stream. -* performance report periodic content by promoted links campaign stream. -* performance report marketers by publisher stream. -* performance report publishers by campaigns stream. -* performance report marketers by platforms stream. -* performance report marketers campaigns by platforms stream. -* performance report marketers by geo performance stream. -* performance report marketers campaigns by geo stream. -* performance report marketers by Interest stream. +- marketers stream. +- campaigns by marketers stream.-Incremental +- campaigns geo-location stream. +- promoted links for campaigns stream. +- promoted links sequence for campaigns stream. +- budgets for marketers stream. +- performance report campaigns by marketers stream. +- performance report periodic by marketers stream. +- performance report periodic by marketers campaign stream. +- performance report periodic content by promoted links campaign stream. +- performance report marketers by publisher stream. +- performance report publishers by campaigns stream. +- performance report marketers by platforms stream. +- performance report marketers campaigns by platforms stream. +- performance report marketers by geo performance stream. +- performance report marketers campaigns by geo stream. +- performance report marketers by Interest stream. ### Data type mapping | Integration Type | Airbyte Type | Notes | -| :--- | :--- | :--- | -| `string` | `string` | | -| `integer` | `integer` | | -| `number` | `number` | | -| `array` | `array` | | -| `object` | `object` | | +| :--------------- | :----------- | :---- | +| `string` | `string` | | +| `integer` | `integer` | | +| `number` | `number` | | +| `array` | `array` | | +| `object` | `object` | | ### Features -| Feature | Supported?\(Yes/No\) | Notes | -| :--- | :--- | :--- | -| Full Refresh Sync | Yes | | -| Incremental Sync | Yes | | -| Namespaces | No | | +| Feature | Supported?\(Yes/No\) | Notes | +| :---------------- | :------------------- | :---- | +| Full Refresh Sync | Yes | | +| Incremental Sync | Yes | | +| Namespaces | No | | ### Performance considerations @@ -52,7 +52,7 @@ The Outbrain Amplify connector should not run into Outbrain Amplify API limitati ### Requirements -* Credentials and start-date. +- Credentials and start-date. ### Setup guide @@ -60,8 +60,8 @@ Specify credentials and a start date. 
## Changelog -| Version | Date | Pull Request | Subject | -| :--- | :--- | :--- | :--- | -| 0.1.2 | 2022-08-25 | [15667](https://github.com/airbytehq/airbyte/pull/15667) | Add message when no data available | -| 0.1.1 | 2022-05-30 | [11732](https://github.com/airbytehq/airbyte/pull/11732) | Fix docs | -| 0.1.0 | 2022-05-30 | [11732](https://github.com/airbytehq/airbyte/pull/11732) | Initial Release | +| Version | Date | Pull Request | Subject | +| :------ | :--------- | :------------------------------------------------------- | :--------------------------------- | +| 0.1.2 | 2022-08-25 | [15667](https://github.com/airbytehq/airbyte/pull/15667) | Add message when no data available | +| 0.1.1 | 2022-05-30 | [11732](https://github.com/airbytehq/airbyte/pull/11732) | Fix docs | +| 0.1.0 | 2022-05-30 | [11732](https://github.com/airbytehq/airbyte/pull/11732) | Initial Release | diff --git a/docs/integrations/sources/outreach.md b/docs/integrations/sources/outreach.md index 56bb608ffcb..a19c745d5a6 100644 --- a/docs/integrations/sources/outreach.md +++ b/docs/integrations/sources/outreach.md @@ -51,16 +51,16 @@ List of available streams: ## Changelog -| Version | Date | Pull Request | Subject | -| :------ |:-----------| :----- | :------ | -| 0.5.4 | 2024-04-19 | [37215](https://github.com/airbytehq/airbyte/pull/37215) | Updating to 0.80.0 CDK | -| 0.5.3 | 2024-04-18 | [37215](https://github.com/airbytehq/airbyte/pull/37215) | Manage dependencies with Poetry. | -| 0.5.2 | 2024-04-15 | [37215](https://github.com/airbytehq/airbyte/pull/37215) | Base image migration: remove Dockerfile and use the python-connector-base image | -| 0.5.1 | 2024-04-12 | [37215](https://github.com/airbytehq/airbyte/pull/37215) | schema descriptions | -| 0.5.0 | 2023-09-20 | [30639](https://github.com/airbytehq/airbyte/pull/30639) | Add Call Purposes and Call Dispositions streams -| 0.4.0 | 2023-06-14 | [27343](https://github.com/airbytehq/airbyte/pull/27343) | Add Users, Tasks, Templates, Snippets streams -| 0.3.0 | 2023-05-17 | [26211](https://github.com/airbytehq/airbyte/pull/26211) | Add SequenceStates Stream -| 0.2.0 | 2022-10-27 | [17385](https://github.com/airbytehq/airbyte/pull/17385) | Add new streams + page size variable + relationship data | -| 0.1.2 | 2022-07-04 | [14386](https://github.com/airbytehq/airbyte/pull/14386) | Fix stream schema and cursor field | -| 0.1.1 | 2021-12-07 | [8582](https://github.com/airbytehq/airbyte/pull/8582) | Update connector fields title/description | -| 0.1.0 | 2021-11-03 | [7507](https://github.com/airbytehq/airbyte/pull/7507) | Outreach Connector | +| Version | Date | Pull Request | Subject | +| :------ | :--------- | :------------------------------------------------------- | :------------------------------------------------------------------------------ | +| 0.5.4 | 2024-04-19 | [37215](https://github.com/airbytehq/airbyte/pull/37215) | Updating to 0.80.0 CDK | +| 0.5.3 | 2024-04-18 | [37215](https://github.com/airbytehq/airbyte/pull/37215) | Manage dependencies with Poetry. 
| +| 0.5.2 | 2024-04-15 | [37215](https://github.com/airbytehq/airbyte/pull/37215) | Base image migration: remove Dockerfile and use the python-connector-base image | +| 0.5.1 | 2024-04-12 | [37215](https://github.com/airbytehq/airbyte/pull/37215) | schema descriptions | +| 0.5.0 | 2023-09-20 | [30639](https://github.com/airbytehq/airbyte/pull/30639) | Add Call Purposes and Call Dispositions streams | +| 0.4.0 | 2023-06-14 | [27343](https://github.com/airbytehq/airbyte/pull/27343) | Add Users, Tasks, Templates, Snippets streams | +| 0.3.0 | 2023-05-17 | [26211](https://github.com/airbytehq/airbyte/pull/26211) | Add SequenceStates Stream | +| 0.2.0 | 2022-10-27 | [17385](https://github.com/airbytehq/airbyte/pull/17385) | Add new streams + page size variable + relationship data | +| 0.1.2 | 2022-07-04 | [14386](https://github.com/airbytehq/airbyte/pull/14386) | Fix stream schema and cursor field | +| 0.1.1 | 2021-12-07 | [8582](https://github.com/airbytehq/airbyte/pull/8582) | Update connector fields title/description | +| 0.1.0 | 2021-11-03 | [7507](https://github.com/airbytehq/airbyte/pull/7507) | Outreach Connector | diff --git a/docs/integrations/sources/pagerduty.md b/docs/integrations/sources/pagerduty.md index e517fecf87c..3530ae2d10f 100644 --- a/docs/integrations/sources/pagerduty.md +++ b/docs/integrations/sources/pagerduty.md @@ -13,27 +13,27 @@ the tables and columns you set up for replication, every time a sync is run. Several output streams are available from this source: -* [Incidents](https://developer.pagerduty.com/api-reference/b3A6Mjc0ODEzOA-list-incidents) \(Incremental\) -* [Incident Log Entries](https://developer.pagerduty.com/api-reference/b3A6Mjc0ODE1NA-list-log-entries) \(Incremental\) -* [Priorities](https://developer.pagerduty.com/api-reference/b3A6Mjc0ODE2NA-list-priorities) -* [Users](https://developer.pagerduty.com/api-reference/b3A6Mjc0ODIzMw-list-users) +- [Incidents](https://developer.pagerduty.com/api-reference/b3A6Mjc0ODEzOA-list-incidents) \(Incremental\) +- [Incident Log Entries](https://developer.pagerduty.com/api-reference/b3A6Mjc0ODE1NA-list-log-entries) \(Incremental\) +- [Priorities](https://developer.pagerduty.com/api-reference/b3A6Mjc0ODE2NA-list-priorities) +- [Users](https://developer.pagerduty.com/api-reference/b3A6Mjc0ODIzMw-list-users) If there are more endpoints you'd like Faros AI to support, please [create an issue.](https://github.com/faros-ai/airbyte-connectors/issues/new) ### Features -| Feature | Supported? | -| :--- | :--- | -| Full Refresh Sync | Yes | -| Incremental Sync | Yes | -| SSL connection | Yes | -| Namespaces | No | +| Feature | Supported? | +| :---------------- | :--------- | +| Full Refresh Sync | Yes | +| Incremental Sync | Yes | +| SSL connection | Yes | +| Namespaces | No | ### Performance considerations The PagerDuty source should not run into PagerDuty API limitations under normal -usage. Please [create an +usage. Please [create an issue](https://github.com/faros-ai/airbyte-connectors/issues/new) if you see any rate limit issues that are not automatically retried successfully. @@ -41,14 +41,14 @@ rate limit issues that are not automatically retried successfully. ### Requirements -* PagerDuty API Key +- PagerDuty API Key Please follow the [their documentation for generating a PagerDuty API Key](https://support.pagerduty.com/docs/generating-api-keys#section-generating-a-general-access-rest-api-key). 
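As a quick sanity check on the API key requirement above — this is not part of the connector, only an illustrative sketch — you can call the same Users endpoint the source reads from. The key value below is a placeholder; the header format is PagerDuty's standard REST token scheme.

```python
# Illustrative sketch only: verify a PagerDuty REST API key by listing one user.
# The API key value is a placeholder; replace it with your own.
import requests

API_KEY = "YOUR_PAGERDUTY_API_KEY"

response = requests.get(
    "https://api.pagerduty.com/users",
    headers={
        "Authorization": f"Token token={API_KEY}",
        "Accept": "application/json",
    },
    params={"limit": 1},
)
response.raise_for_status()
print(response.json()["users"][0]["name"])
```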
## Changelog -| Version | Date | Pull Request | Subject | -| :------- | :--------- | :----------------------------------------------------------------- | :----------------------------------- | -| 0.2.0 | 2023-10-20 | [31160](https://github.com/airbytehq/airbyte/pull/31160) | Migrate to low code | -| 0.1.23 | 2021-11-12 | [125](https://github.com/faros-ai/airbyte-connectors/pull/125) | Add Pagerduty source and destination | +| Version | Date | Pull Request | Subject | +| :------ | :--------- | :------------------------------------------------------------- | :----------------------------------- | +| 0.2.0 | 2023-10-20 | [31160](https://github.com/airbytehq/airbyte/pull/31160) | Migrate to low code | +| 0.1.23 | 2021-11-12 | [125](https://github.com/faros-ai/airbyte-connectors/pull/125) | Add Pagerduty source and destination | diff --git a/docs/integrations/sources/pardot.md b/docs/integrations/sources/pardot.md index 5de66dd1732..9bdc99f105f 100644 --- a/docs/integrations/sources/pardot.md +++ b/docs/integrations/sources/pardot.md @@ -10,27 +10,27 @@ The Pardot supports full refresh syncs Several output streams are available from this source: -* [Campaigns](https://developer.salesforce.com/docs/marketing/pardot/guide/campaigns-v4.html) -* [EmailClicks](https://developer.salesforce.com/docs/marketing/pardot/guide/batch-email-clicks-v4.html) -* [ListMembership](https://developer.salesforce.com/docs/marketing/pardot/guide/list-memberships-v4.html) -* [Lists](https://developer.salesforce.com/docs/marketing/pardot/guide/lists-v4.html) -* [ProspectAccounts](https://developer.salesforce.com/docs/marketing/pardot/guide/prospect-accounts-v4.html) -* [Prospects](https://developer.salesforce.com/docs/marketing/pardot/guide/prospects-v4.html) -* [Users](https://developer.salesforce.com/docs/marketing/pardot/guide/users-v4.html) -* [VisitorActivities](https://developer.salesforce.com/docs/marketing/pardot/guide/visitor-activities-v4.html) -* [Visitors](https://developer.salesforce.com/docs/marketing/pardot/guide/visitors-v4.html) -* [Visits](https://developer.salesforce.com/docs/marketing/pardot/guide/visits-v4.html) +- [Campaigns](https://developer.salesforce.com/docs/marketing/pardot/guide/campaigns-v4.html) +- [EmailClicks](https://developer.salesforce.com/docs/marketing/pardot/guide/batch-email-clicks-v4.html) +- [ListMembership](https://developer.salesforce.com/docs/marketing/pardot/guide/list-memberships-v4.html) +- [Lists](https://developer.salesforce.com/docs/marketing/pardot/guide/lists-v4.html) +- [ProspectAccounts](https://developer.salesforce.com/docs/marketing/pardot/guide/prospect-accounts-v4.html) +- [Prospects](https://developer.salesforce.com/docs/marketing/pardot/guide/prospects-v4.html) +- [Users](https://developer.salesforce.com/docs/marketing/pardot/guide/users-v4.html) +- [VisitorActivities](https://developer.salesforce.com/docs/marketing/pardot/guide/visitor-activities-v4.html) +- [Visitors](https://developer.salesforce.com/docs/marketing/pardot/guide/visitors-v4.html) +- [Visits](https://developer.salesforce.com/docs/marketing/pardot/guide/visits-v4.html) If there are more endpoints you'd like Airbyte to support, please [create an issue.](https://github.com/airbytehq/airbyte/issues/new/choose) ### Features -| Feature | Supported? | -| :--- | :--- | -| Full Refresh Sync | Yes | -| Incremental Sync | No | -| SSL connection | No | -| Namespaces | No | +| Feature | Supported? 
| +| :---------------- | :--------- | +| Full Refresh Sync | Yes | +| Incremental Sync | No | +| SSL connection | No | +| Namespaces | No | ### Performance considerations @@ -40,22 +40,22 @@ The Pardot connector should not run into Pardot API limitations under normal usa ### Requirements -* Pardot Account -* Pardot Business Unit ID -* Client ID -* Client Secret -* Refresh Token -* Start Date -* Is Sandbox environment? +- Pardot Account +- Pardot Business Unit ID +- Client ID +- Client Secret +- Refresh Token +- Start Date +- Is Sandbox environment? ### Setup guide -* `pardot_business_unit_id`: Pardot Business ID, can be found at Setup > Pardot > Pardot Account Setup -* `client_id`: The Consumer Key that can be found when viewing your app in Salesforce -* `client_secret`: The Consumer Secret that can be found when viewing your app in Salesforce -* `refresh_token`: Salesforce Refresh Token used for Airbyte to access your Salesforce account. If you don't know what this is, follow [this guide](https://medium.com/@bpmmendis94/obtain-access-refresh-tokens-from-salesforce-rest-api-a324fe4ccd9b) to retrieve it. -* `start_date`: UTC date and time in the format 2017-01-25T00:00:00Z. Any data before this date will not be replicated. Leave blank to skip this filter -* `is_sandbox`: Whether or not the app is in a Salesforce sandbox. If you do not know what this is, assume it is false. +- `pardot_business_unit_id`: Pardot Business ID, can be found at Setup > Pardot > Pardot Account Setup +- `client_id`: The Consumer Key that can be found when viewing your app in Salesforce +- `client_secret`: The Consumer Secret that can be found when viewing your app in Salesforce +- `refresh_token`: Salesforce Refresh Token used for Airbyte to access your Salesforce account. If you don't know what this is, follow [this guide](https://medium.com/@bpmmendis94/obtain-access-refresh-tokens-from-salesforce-rest-api-a324fe4ccd9b) to retrieve it. +- `start_date`: UTC date and time in the format 2017-01-25T00:00:00Z. Any data before this date will not be replicated. Leave blank to skip this filter +- `is_sandbox`: Whether or not the app is in a Salesforce sandbox. If you do not know what this is, assume it is false. ## Changelog diff --git a/docs/integrations/sources/paypal-transaction-migrations.md b/docs/integrations/sources/paypal-transaction-migrations.md index abe8b5f5590..ad81a2a79ed 100644 --- a/docs/integrations/sources/paypal-transaction-migrations.md +++ b/docs/integrations/sources/paypal-transaction-migrations.md @@ -5,7 +5,8 @@ Version 2.1.0 changes the format of the state object. Upgrading to 2.1.0 is safe, but downgrading to 2.0.0 is not. To downgrade to 2.0.0: + - Edit your connection state: - Change the keys for the transactions and balances streams to "date" - Change the format of the cursor to "yyyy-MM-dd'T'HH:mm:ss+00:00" -Alternatively, you can also reset your connection. + Alternatively, you can also reset your connection. diff --git a/docs/integrations/sources/paypal-transaction.md b/docs/integrations/sources/paypal-transaction.md index 87296616893..6fbad66bb57 100644 --- a/docs/integrations/sources/paypal-transaction.md +++ b/docs/integrations/sources/paypal-transaction.md @@ -2,7 +2,7 @@ This page contains the setup guide and reference information for the Paypal source connector. -This connector uses [PayPal APIs](https://developer.paypal.com/api/rest/authentication/) OAuth 2.0 access token to authenticate requests. 
+This connector authenticates requests with an OAuth 2.0 access token obtained from the [PayPal APIs](https://developer.paypal.com/api/rest/authentication/). ## Prerequisites @@ -11,42 +11,35 @@ You will need a Paypal account, which you can get following [these steps](https: On the same page, you will also find how to set up a Sandbox so you can test the connector before using it in production. ## Setup guide + ### Step 1: Get your Paypal secrets After creating your account you will be able to get your `Client ID` and `Secret`. You can find them in your [Apps & Credentials page](https://developer.paypal.com/dashboard/applications/live). - ### Step 2: Set up the Paypal Transaction connector in Airbyte - 1. Log into your Airbyte account - - For Cloud [Log in here](https://cloud.airbyte.com/workspaces). + + - For Cloud [Log in here](https://cloud.airbyte.com/workspaces). 2. In the left navigation bar, click **Sources**. - + a. If this is your first time creating a source, use the search bar and enter **Paypal Transaction** and select it. b. If you already have sources configured, go to the top-right corner and click **+new source**. Then enter **Paypal Transaction** in the search bar and select the connector. - -3. Set the name for your source -4. Enter your `Client ID` + +3. Set the name for your source +4. Enter your `Client ID` 5. Enter your `Client secret` -6. `Start Date`: Use the provided datepicker or enter manually a UTC date and time in the format `YYYY-MM-DDTHH:MM:SSZ`. +6. `Start Date`: Use the provided datepicker or enter manually a UTC date and time in the format `YYYY-MM-DDTHH:MM:SSZ`. 7. Switch ON/Off the Sandbox toggle. By default the toggle is OFF, meaning it works only in a production environment. -8. _(Optional) `Dispute Start Date Range`: Use the provided datepicker or enter manually a UTC date and time in the format `YYYY-MM-DDTHH:MM:SS.sssZ`. - - If you don't add a date and you sync the `lists_disputes stream`, it will use the default value of 180 days in the past to retrieve data - - It is mandatory to add the milliseconds is you enter a datetime. - - This option only works for `lists_disputes stream` +8. _(Optional)_ `Dispute Start Date Range`: Use the provided datepicker or enter manually a UTC date and time in the format `YYYY-MM-DDTHH:MM:SS.sssZ`. - If you don't add a date and you sync the `lists_disputes stream`, it will use the default value of 180 days in the past to retrieve data - It is mandatory to add the milliseconds if you enter a datetime. - This option only works for `lists_disputes stream` 9. _(Optional)`Refresh Token`:_ You can manually enter a refresh token. Right now the stream does this automatically. -10. _(Optional)`Number of days per request`:_ You can specify the days used by the connector when requesting data from the Paypal API. This helps in cases when you have a rate limit and you want to lower the window of retrieving data. - - Paypal has a 10K record limit per request. This option is useful if your sync is every week and you have more than 10K per week - - The default value is 7 - - This Max value you can enter is 31 days - +10. _(Optional)`Number of days per request`:_ You can specify the days used by the connector when requesting data from the Paypal API. This helps in cases when you have a rate limit and you want to lower the window of retrieving data. - Paypal has a 10K record limit per request. This option is useful if your sync is every week and you have more than 10K per week - The default value is 7 - The max value you can enter is 31 days 11.
Click **Set up source** -:::info +:::info By default, syncs are run with a slice period of 7 days. If you see errors with the message `Result set size is greater than the maximum limit` or an error code like `RESULTSET_TOO_LARGE`: @@ -55,7 +48,6 @@ By default, syncs are run with a slice period of 7 days. If you see errors with ::: - ## Supported sync modes The PayPal Transaction source connector supports the following [sync modes](https://docs.airbyte.com/cloud/core-concepts#connection-sync-modes): @@ -66,156 +58,151 @@ The PayPal Transaction source connector supports the following [sync modes](http | Incremental - Append Sync | Yes | | Namespaces | No | - ## Supported Streams This Source is capable of syncing the following core Streams: -* [Transactions](https://developer.paypal.com/docs/api/transaction-search/v1/#transactions) -* [Balances](https://developer.paypal.com/docs/api/transaction-search/v1/#balances) -* [List Products](https://developer.paypal.com/docs/api/catalog-products/v1/#products_list) -* [Show Product Details](https://developer.paypal.com/docs/api/catalog-products/v1/#products_get) -* [List Disputes](https://developer.paypal.com/docs/api/customer-disputes/v1/#disputes_list) -* [Search Invoices](https://developer.paypal.com/docs/api/invoicing/v2/#invoices_search-invoices) -* [List Payments](https://developer.paypal.com/docs/api/payments/v1/#payment_list) +- [Transactions](https://developer.paypal.com/docs/api/transaction-search/v1/#transactions) +- [Balances](https://developer.paypal.com/docs/api/transaction-search/v1/#balances) +- [List Products](https://developer.paypal.com/docs/api/catalog-products/v1/#products_list) +- [Show Product Details](https://developer.paypal.com/docs/api/catalog-products/v1/#products_get) +- [List Disputes](https://developer.paypal.com/docs/api/customer-disputes/v1/#disputes_list) +- [Search Invoices](https://developer.paypal.com/docs/api/invoicing/v2/#invoices_search-invoices) +- [List Payments](https://developer.paypal.com/docs/api/payments/v1/#payment_list) +### Transactions Stream -### Transactions Stream +The below table contains the configuraiton parameters available for this connector and the default values and available features -The below table contains the configuraiton parameters available for this connector and the default values and available features - -| **Param/Feature** | `Transactions` | -| :-------------------------- | :------------------------ | -| `Start Date` | Timestamp with TZ (no ms) | -| `Dispute Start Date Range` | NA | -| `Refresh token` | Auto | -| `Number of days per request`| Max 31 , 7(D) | -| `Pagination Strategy` | Page Increment | -| `Page size ` | Max 500 (F) | -| `Full Refresh` | :white_check_mark: | -| `Incremental` | :white_check_mark: (D) | +| **Param/Feature** | `Transactions` | +| :--------------------------- | :------------------------ | +| `Start Date` | Timestamp with TZ (no ms) | +| `Dispute Start Date Range` | NA | +| `Refresh token` | Auto | +| `Number of days per request` | Max 31 , 7(D) | +| `Pagination Strategy` | Page Increment | +| `Page size ` | Max 500 (F) | +| `Full Refresh` | :white_check_mark: | +| `Incremental` | :white_check_mark: (D) | **D:** Default configured Value **F:** Fixed Value. This means it is not configurable. 
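For intuition on the `Number of days per request` option above, here is a minimal sketch — not the connector's actual code — of how a date range can be sliced into windows of at most N days before each API request:

```python
# Minimal sketch, not the connector's implementation: slice a date range into
# windows of at most `days_per_request` days, as the option above implies.
from datetime import datetime, timedelta, timezone

def date_windows(start: datetime, end: datetime, days_per_request: int = 7):
    step = timedelta(days=days_per_request)
    cursor = start
    while cursor < end:
        window_end = min(cursor + step, end)
        yield cursor, window_end
        cursor = window_end

start = datetime(2024, 1, 1, tzinfo=timezone.utc)
end = datetime(2024, 2, 15, tzinfo=timezone.utc)
for window_start, window_end in date_windows(start, end, days_per_request=31):
    print(window_start.isoformat(), "->", window_end.isoformat())
```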
-___ +--- -### Balances Stream +### Balances Stream -The below table contains the configuraiton parameters available for this connector and the default values and available features +The below table contains the configuraiton parameters available for this connector and the default values and available features -| **Param/Feature** |`Balances` | -| :-------------------------- |:------------------------ | -| `Start Date` |Timestamp with TZ (no ms) | -| `Dispute Start Date Range` |NA | -| `Refresh token` |Auto | -| `Number of days per request`|NA | -| `Pagination Strategy` |NA | -| `Page size ` |NA | -| `Full Refresh` |:white_check_mark: | -| `Incremental` |:white_check_mark: (D) | +| **Param/Feature** | `Balances` | +| :--------------------------- | :------------------------ | +| `Start Date` | Timestamp with TZ (no ms) | +| `Dispute Start Date Range` | NA | +| `Refresh token` | Auto | +| `Number of days per request` | NA | +| `Pagination Strategy` | NA | +| `Page size ` | NA | +| `Full Refresh` | :white_check_mark: | +| `Incremental` | :white_check_mark: (D) | **D:** Default configured Value **F:** Fixed Value. This means it is not configurable. -___ +--- +### List Products Stream -### List Products Stream +The below table contains the configuraiton parameters available for this connector and the default values and available features -The below table contains the configuraiton parameters available for this connector and the default values and available features - - -| **Param/Feature** |`List Products` | -| :-------------------------- |:------------------------ | -| `Start Date` |NA | -| `Dispute Start Date Range` |NA | -| `Refresh token` |Auto | -| `Number of days per request`|NA | -| `Pagination Strategy` |Page Increment | -| `Page size ` |Max 20 (F) | -| `Full Refresh` |:white_check_mark: (D) | -| `Incremental` |:x: | +| **Param/Feature** | `List Products` | +| :--------------------------- | :--------------------- | +| `Start Date` | NA | +| `Dispute Start Date Range` | NA | +| `Refresh token` | Auto | +| `Number of days per request` | NA | +| `Pagination Strategy` | Page Increment | +| `Page size ` | Max 20 (F) | +| `Full Refresh` | :white_check_mark: (D) | +| `Incremental` | :x: | **D:** Default configured Value **F:** Fixed Value. This means it is not configurable. -:::caution +:::caution -When configuring your stream take in consideration that the way the API works limits the speed on retreiving data. In some cases a +30K catalog retrieval could take between 10-15 minutes. +When configuring your stream take in consideration that the way the API works limits the speed on retreiving data. In some cases a +30K catalog retrieval could take between 10-15 minutes. ::: -___ +--- -### Show Products Stream +### Show Products Stream -The below table contains the configuraiton parameters available for this connector and the default values and available features +The below table contains the configuraiton parameters available for this connector and the default values and available features -| **Param/Feature** |`Show Prod. Details` | -| :-------------------------- |:------------------------ | -| `Start Date` |NA | -| `Dispute Start Date Range` |NA | -| `Refresh token` |Auto | -| `Number of days per request`|NA | -| `Pagination Strategy` |NA | -| `Page size ` |NA | -| `Full Refresh` |:white_check_mark: (D) | -| `Incremental` |:x: | +| **Param/Feature** | `Show Prod. 
Details` | +| :--------------------------- | :--------------------- | +| `Start Date` | NA | +| `Dispute Start Date Range` | NA | +| `Refresh token` | Auto | +| `Number of days per request` | NA | +| `Pagination Strategy` | NA | +| `Page size ` | NA | +| `Full Refresh` | :white_check_mark: (D) | +| `Incremental` | :x: | **D:** Default configured Value **F:** Fixed Value. This means it is not configurable. +:::caution -:::caution - -When configuring this stream consider that the parent stream paginates with 20 number of items (Max alowed page size). The Paypal API calls are not concurrent, so the time it takes depends entirely on the server side. -This stream could take a considerable time syncing, so you should consider running the sync of this and the parent stream (`list_products`) at the end of the day. -Depending on the size of the catalog it could take several hours to sync. +When configuring this stream consider that the parent stream paginates with 20 number of items (Max alowed page size). The Paypal API calls are not concurrent, so the time it takes depends entirely on the server side. +This stream could take a considerable time syncing, so you should consider running the sync of this and the parent stream (`list_products`) at the end of the day. +Depending on the size of the catalog it could take several hours to sync. ::: -___ +--- -### List Disputes Stream +### List Disputes Stream -The below table contains the configuraiton parameters available for this connector and the default values and available features +The below table contains the configuraiton parameters available for this connector and the default values and available features -| **Param/Feature** |`List Disputes` | -| :-------------------------- |:------------------------ | -| `Start Date` |NA | -| `Dispute Start Date Range` |Timestamp with TZ (w/ms) | -| `Refresh token` |Auto | -| `Number of days per request`|Max 180 , 7(D) | -| `Pagination Strategy` |Page Token | -| `Page size ` |Max 50 (F) | -| `Full Refresh` |:white_check_mark: | -| `Incremental` |:white_check_mark: (D) | +| **Param/Feature** | `List Disputes` | +| :--------------------------- | :----------------------- | +| `Start Date` | NA | +| `Dispute Start Date Range` | Timestamp with TZ (w/ms) | +| `Refresh token` | Auto | +| `Number of days per request` | Max 180 , 7(D) | +| `Pagination Strategy` | Page Token | +| `Page size ` | Max 50 (F) | +| `Full Refresh` | :white_check_mark: | +| `Incremental` | :white_check_mark: (D) | **D:** Default configured Value **F:** Fixed Value. This means it is not configurable. 
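To illustrate the page-token pagination used by the List Disputes stream above, here is a rough sketch; the endpoint path and the `next_page_token` field name are assumptions for illustration only, not the connector's verified request format:

```python
# Rough sketch of page-token pagination; the endpoint path and the
# `next_page_token` field name are assumptions for illustration only.
import requests

def fetch_all_disputes(base_url: str, access_token: str, start_time: str, page_size: int = 50):
    items, next_page_token = [], None
    while True:
        params = {"start_time": start_time, "page_size": page_size}
        if next_page_token:
            params["next_page_token"] = next_page_token
        response = requests.get(
            f"{base_url}/v1/customer/disputes",
            headers={"Authorization": f"Bearer {access_token}"},
            params=params,
        )
        response.raise_for_status()
        payload = response.json()
        items.extend(payload.get("items", []))
        next_page_token = payload.get("next_page_token")  # assumed field name
        if not next_page_token:
            return items
```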
-___ +--- -### Search Invoices Stream +### Search Invoices Stream -The below table contains the configuraiton parameters available for this connector and the default values and available features +The below table contains the configuraiton parameters available for this connector and the default values and available features -| **Param/Feature** |`Search Invoices` | -| :-------------------------- |:------------------------ | -| `Start Date` |Timestamp with TZ (no ms) | -| `Dispute Start Date Range` |NA | -| `Refresh token` |Auto | -| `Number of days per request`|ND | -| `Pagination Strategy` |Page Number | -| `Page size ` |Max 100 (F) | -| `Full Refresh` |:white_check_mark: (D) | -| `Incremental` |:x: | +| **Param/Feature** | `Search Invoices` | +| :--------------------------- | :------------------------ | +| `Start Date` | Timestamp with TZ (no ms) | +| `Dispute Start Date Range` | NA | +| `Refresh token` | Auto | +| `Number of days per request` | ND | +| `Pagination Strategy` | Page Number | +| `Page size ` | Max 100 (F) | +| `Full Refresh` | :white_check_mark: (D) | +| `Incremental` | :x: | **D:** Default configured Value @@ -223,47 +210,43 @@ The below table contains the configuraiton parameters available for this connec **ND:** Not Defined in the source. +:::info -:::info - -The `start_end` from the configuration, is passed to the body of the request and uses the `creation_date_range.start` and `creation_date_range.end`. More information in the [Paypal Developer API documentation](https://developer.paypal.com/docs/api/invoicing/v2/#invoices_search-invoices). +The `start_end` from the configuration, is passed to the body of the request and uses the `creation_date_range.start` and `creation_date_range.end`. More information in the [Paypal Developer API documentation](https://developer.paypal.com/docs/api/invoicing/v2/#invoices_search-invoices). ::: +--- -___ +### List Payments Stream -### List Payments Stream +The below table contains the configuraiton parameters available for this connector and the default values and available features. -The below table contains the configuraiton parameters available for this connector and the default values and available features. - -| **Param/Feature** |`List Payments` | -| :-------------------------- |:------------------------ | -| `Start Date` |Timestamp with TZ (no ms) | -| `Dispute Start Date Range` |NA | -| `Refresh token` |Auto | -| `Number of days per request`|NA , 7(D) | -| `Pagination Strategy` |Page Cursor | -| `Page size ` |Max 20 (F) | -| `Full Refresh` |:white_check_mark: | -| `Incremental` |:white_check_mark: (D) | +| **Param/Feature** | `List Payments` | +| :--------------------------- | :------------------------ | +| `Start Date` | Timestamp with TZ (no ms) | +| `Dispute Start Date Range` | NA | +| `Refresh token` | Auto | +| `Number of days per request` | NA , 7(D) | +| `Pagination Strategy` | Page Cursor | +| `Page size ` | Max 20 (F) | +| `Full Refresh` | :white_check_mark: | +| `Incremental` | :white_check_mark: (D) | **D:** Default configured Value **F:** Fixed Value. This means it is not configurable. -___ +--- ## Performance Considerations -* **Data Availability:** It takes a maximum of 3 hours for executed transactions to appear in the list transactions call. -* **Number of days per request:** The maximum supported date range is 31 days. -* **Historical Data:** You can't retrieve more than 3yrs of data for the `transactions` stream. 
For `dispute_start_date` you can only retrieve 180 days of data (see specifications per stream) -* `records_per_request`: The maximum number of records in a single request are 10K (API Server restriction) -* `page_size`: The number of records per page is differs per stream. `source-paypal-transaction` sets maximum allowed page size for each stream by default. -* `requests_per_minute`: The maximum limit is 50 requests per minute from IP address to all endpoint (API Server restriction). - - +- **Data Availability:** It takes a maximum of 3 hours for executed transactions to appear in the list transactions call. +- **Number of days per request:** The maximum supported date range is 31 days. +- **Historical Data:** You can't retrieve more than 3 years of data for the `transactions` stream. For `dispute_start_date` you can only retrieve 180 days of data (see specifications per stream) +- `records_per_request`: The maximum number of records in a single request is 10K (API Server restriction) +- `page_size`: The number of records per page differs per stream. `source-paypal-transaction` sets the maximum allowed page size for each stream by default. +- `requests_per_minute`: The maximum limit is 50 requests per minute from an IP address across all endpoints (API Server restriction). ## Data type map @@ -274,11 +257,10 @@ ___ | `array` | `array` | | `object` | `object` | - ## Changelog | Version | Date | Pull Request | Subject | -|:--------|:-----------|:---------------------------------------------------------|:-----------------------------------------------------------------------------------------------------------------------------| +| :------ | :--------- | :------------------------------------------------------- | :--------------------------------------------------------------------------------------------------------------------------- | | 2.5.3 | 2024-04-24 | [36654](https://github.com/airbytehq/airbyte/pull/36654) | Schema descriptions | | 2.5.2 | 2024-04-19 | [37435](https://github.com/airbytehq/airbyte/pull/37435) | Updated `manifest.yaml` to use the latest CDK Manifest version to fix the Incremental STATE values | | 2.5.1 | 2024-03-15 | [36165](https://github.com/airbytehq/airbyte/pull/36165) | Unpin CDK Version | diff --git a/docs/integrations/sources/paystack.md b/docs/integrations/sources/paystack.md index f769e0194d9..3156a87aff0 100644 --- a/docs/integrations/sources/paystack.md +++ b/docs/integrations/sources/paystack.md @@ -1,36 +1,42 @@ # Paystack + This page contains the setup guide and reference information for the Paystack source connector. ## Prerequisites -* Secret Key -* Start Day -* Lookback Window + +- Secret Key +- Start Day +- Lookback Window ## Setup guide + ### Step 1: Set up Paystack connector + 1. Log into your [Airbyte Cloud](https://cloud.airbyte.io/workspaces) or Airbyte Open Source account. -2. Click **Sources** and then click **+ New source**. +2. Click **Sources** and then click **+ New source**. 3. On the Set up the source page, select **Paystack** from the Source type dropdown. 4. Enter a name for your source. -5. For **Secret Key** enter your secret key. The Paystack API key usually starts with **'sk_live_'**. You can find yours secret key [here](https://dashboard.paystack.com/#/settings/developer). +5. For **Secret Key** enter your secret key. The Paystack API key usually starts with `sk_live_`. You can find your secret key [here](https://dashboard.paystack.com/#/settings/developer). 6.
For **Start Date** enter UTC date and time in the format **2017-01-25T00:00:00Z**. Any data before this date will not be replicated. 7. For **Lookback Window (in days)** enter the number of days. When set, the connector will always reload data from the past N days, where N is the value set here. This is useful if your data is updated after creation. ## Supported sync modes + The Paystack source connector supports the following [sync modes](https://docs.airbyte.com/cloud/core-concepts#connection-sync-modes): -* Full Refresh -* Incremental + +- Full Refresh +- Incremental ## Supported Streams -* [Customers](https://paystack.com/docs/api/customer#list) \(Incremental\) -* [Disputes](https://paystack.com/docs/api/dispute) \(Incremental\) -* [Invoices](https://paystack.com/docs/api/payment-request) \(Incremental\) -* [Refunds](https://paystack.com/docs/api/refund) \(Incremental\) -* [Settlements](https://paystack.com/docs/api/settlement) \(Incremental\) -* [Subscriptions](https://paystack.com/docs/api/subscription) \(Incremental\) -* [Transactions](https://paystack.com/docs/api/transaction) \(Incremental\) -* [Transfers](https://paystack.com/docs/api/transfer) \(Incremental\) +- [Customers](https://paystack.com/docs/api/customer#list) \(Incremental\) +- [Disputes](https://paystack.com/docs/api/dispute) \(Incremental\) +- [Invoices](https://paystack.com/docs/api/payment-request) \(Incremental\) +- [Refunds](https://paystack.com/docs/api/refund) \(Incremental\) +- [Settlements](https://paystack.com/docs/api/settlement) \(Incremental\) +- [Subscriptions](https://paystack.com/docs/api/subscription) \(Incremental\) +- [Transactions](https://paystack.com/docs/api/transaction) \(Incremental\) +- [Transfers](https://paystack.com/docs/api/transfer) \(Incremental\) ### Note on Incremental Syncs @@ -45,7 +51,7 @@ The [Paystack API](https://paystack.com/docs/api) is compatible with the [JSONSc ### Features | Feature | Supported? | -|:--------------------------|:-----------| +| :------------------------ | :--------- | | Full Refresh Sync | Yes | | Incremental - Append Sync | Yes | | Incremental - Dedupe Sync | Yes | @@ -55,12 +61,11 @@ The [Paystack API](https://paystack.com/docs/api) is compatible with the [JSONSc The Paystack connector should not run into Paystack API limitations under normal usage. Please [create an issue](https://github.com/airbytehq/airbyte/issues) if you see any rate limit issues that are not automatically retried successfully. 
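As a small illustration of the lookback window described above — a sketch, not the connector's code — the effective start of each incremental sync is simply the saved cursor shifted back by N days, so records updated after creation are re-read:

```python
# Illustrative sketch: a lookback window shifts the effective start of an
# incremental sync back by N days so late-updated records are re-read.
from datetime import datetime, timedelta, timezone

def effective_start(saved_cursor: datetime, lookback_days: int) -> datetime:
    return saved_cursor - timedelta(days=lookback_days)

cursor = datetime(2024, 5, 1, tzinfo=timezone.utc)  # last synced position
print(effective_start(cursor, lookback_days=7))     # re-reads the prior 7 days
```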
- ## Changelog | Version | Date | Pull Request | Subject | -|:--------|:-----------|:---------------------------------------------------------|:---------------------------------------------------------------| +| :------ | :--------- | :------------------------------------------------------- | :------------------------------------------------------------- | | 0.1.3 | 2023-03-21 | [24247](https://github.com/airbytehq/airbyte/pull/24247) | Specified date formatting in specification | | 0.1.2 | 2023-03-15 | [24085](https://github.com/airbytehq/airbyte/pull/24085) | Set additionalProperties: true, add TypeTransformer to Refunds | | 0.1.1 | 2021-12-07 | [8582](https://github.com/airbytehq/airbyte/pull/8582) | Update connector fields title/description | -| 0.1.0 | 2021-10-20 | [7214](https://github.com/airbytehq/airbyte/pull/7214) | Add Paystack source connector | \ No newline at end of file +| 0.1.0 | 2021-10-20 | [7214](https://github.com/airbytehq/airbyte/pull/7214) | Add Paystack source connector | diff --git a/docs/integrations/sources/pendo.md b/docs/integrations/sources/pendo.md index b07240a6fb5..e01f16e3d08 100644 --- a/docs/integrations/sources/pendo.md +++ b/docs/integrations/sources/pendo.md @@ -1,18 +1,21 @@ # Pendo -Pendo is powerful product analytics tool. The source connector here allows you to fetch data from the v1 API. +Pendo is powerful product analytics tool. The source connector here allows you to fetch data from the v1 API. Currently, the aggregation endpoint is not supported. Please [create an issue](https://github.com/airbytehq/airbyte/issues/new/choose) if you want more streams supported here. ## Prerequisites -* Created Pendo and enable the integration feature -* Generate [an API token](https://app.pendo.io/admin/integrationkeys) + +- Created Pendo and enable the integration feature +- Generate [an API token](https://app.pendo.io/admin/integrationkeys) ## Airbyte Open Source -* Api Token + +- Api Token ## Airbyte Cloud -* Api Token + +- Api Token ## Setup guide @@ -41,12 +44,11 @@ Currently, the aggregation endpoint is not supported. Please [create an issue](h 4. Add **API Key** from the last step 5. Click `Set up source`. - ## Supported sync modes The Pendo source connector supports the following [sync modes](https://docs.airbyte.com/cloud/core-concepts#connection-sync-modes): - - Full Refresh +- Full Refresh ## Supported Streams @@ -55,13 +57,12 @@ The Pendo source connector supports the following [sync modes](https://docs.airb - [Report](https://engageapi.pendo.io/#2ac0699a-b653-4082-be11-563e5c0c9410) - [Guide](https://engageapi.pendo.io/#4f1e3ca1-fc41-4469-bf4b-da90ee8caf3d) - ## Changelog -| Version | Date | Pull Request | Subject | -|:--------|:-----------|:---------------------------------------------------------|:-------------------------------------------------------------------------------| -| 0.1.4 | 2024-04-19 | [37220](https://github.com/airbytehq/airbyte/pull/37220) | Updating to 0.80.0 CDK | -| 0.1.3 | 2024-04-18 | [37220](https://github.com/airbytehq/airbyte/pull/37220) | Manage dependencies with Poetry. 
| -| 0.1.2 | 2024-04-15 | [37220](https://github.com/airbytehq/airbyte/pull/37220) | Base image migration: remove Dockerfile and use the python-connector-base image | -| 0.1.1 | 2024-04-12 | [37220](https://github.com/airbytehq/airbyte/pull/37220) | schema descriptions | -| 0.1.0 | 2023-03-14 | [3563](https://github.com/airbytehq/airbyte/pull/3563) | Initial Release | +| Version | Date | Pull Request | Subject | +| :------ | :--------- | :------------------------------------------------------- | :------------------------------------------------------------------------------ | +| 0.1.4 | 2024-04-19 | [37220](https://github.com/airbytehq/airbyte/pull/37220) | Updating to 0.80.0 CDK | +| 0.1.3 | 2024-04-18 | [37220](https://github.com/airbytehq/airbyte/pull/37220) | Manage dependencies with Poetry. | +| 0.1.2 | 2024-04-15 | [37220](https://github.com/airbytehq/airbyte/pull/37220) | Base image migration: remove Dockerfile and use the python-connector-base image | +| 0.1.1 | 2024-04-12 | [37220](https://github.com/airbytehq/airbyte/pull/37220) | schema descriptions | +| 0.1.0 | 2023-03-14 | [3563](https://github.com/airbytehq/airbyte/pull/3563) | Initial Release | diff --git a/docs/integrations/sources/persistiq.md b/docs/integrations/sources/persistiq.md index 10beb248508..3a481b2d6a6 100644 --- a/docs/integrations/sources/persistiq.md +++ b/docs/integrations/sources/persistiq.md @@ -40,5 +40,5 @@ Please read [How to find your API key](https://apidocs.persistiq.com/#introducti | Version | Date | Pull Request | Subject | | :------ | :--------- | :----------------------------------------------------- | :----------------------- | -| 0.2.0 | 2023-10-10 | [TBD](https://github.com/airbytehq/airbyte/pull/TBD) | Migrate to low code | +| 0.2.0 | 2023-10-10 | [TBD](https://github.com/airbytehq/airbyte/pull/TBD) | Migrate to low code | | 0.1.0 | 2022-01-21 | [9515](https://github.com/airbytehq/airbyte/pull/9515) | 🎉 New Source: PersistIq | diff --git a/docs/integrations/sources/pinterest-migrations.md b/docs/integrations/sources/pinterest-migrations.md index 42e0e32efb3..cd722c1d711 100644 --- a/docs/integrations/sources/pinterest-migrations.md +++ b/docs/integrations/sources/pinterest-migrations.md @@ -5,5 +5,6 @@ This release updates date-time fields with airbyte_type: timestamp_without_timezone for streams BoardPins, BoardSectionPins, Boards, Catalogs, CatalogFeeds. Additionally, the stream names AdvertizerReport and AdvertizerTargetingReport have been renamed to AdvertiserReport and AdvertiserTargetingReport, respectively. To ensure uninterrupted syncs, users should: + - Refresh the source schema - Reset affected streams diff --git a/docs/integrations/sources/pipedrive-migrations.md b/docs/integrations/sources/pipedrive-migrations.md index d4cef6e3242..491d3d94646 100644 --- a/docs/integrations/sources/pipedrive-migrations.md +++ b/docs/integrations/sources/pipedrive-migrations.md @@ -1,6 +1,7 @@ # Pipedrive Migration Guide ## Upgrading to 2.0.0 + Please update your config and reset your data (to match the new format). This version has changed the config to only require an API key. -This version also removes the `pipeline_ids` field from the `deal_fields` stream. +This version also removes the `pipeline_ids` field from the `deal_fields` stream. 
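If you are unsure whether an existing API token will still work with the simplified 2.0.0 configuration, you can exercise it directly. This is a rough, illustrative call — the `v1` base path and `api_token` query parameter are assumptions based on Pipedrive's public REST API documentation, so confirm them for your plan.

```sh
# Illustrative only: check that the API token the 2.0.0 config relies on still works.
# Substitute your own token; the endpoint and api_token parameter are assumptions
# based on Pipedrive's public REST API documentation.
curl "https://api.pipedrive.com/v1/deals?limit=1&api_token=$PIPEDRIVE_API_TOKEN"
```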
diff --git a/docs/integrations/sources/pipedrive.md b/docs/integrations/sources/pipedrive.md index cb87f1d27fb..6d71735a501 100644 --- a/docs/integrations/sources/pipedrive.md +++ b/docs/integrations/sources/pipedrive.md @@ -4,9 +4,9 @@ This page contains the setup guide and reference information for the Pipedrive c ## Prerequisites -* A Pipedrive account; -* An `API token`; -* A `client_id`, `client_secret`, and `refresh_token`. +- A Pipedrive account; +- An `API token`; +- A `client_id`, `client_secret`, and `refresh_token`. ## Setup guide @@ -20,7 +20,6 @@ If you don't see API next to the `Your companies` section, it's due to the permi For more information, access [enabling API for company users](https://pipedrive.readme.io/docs/enabling-api-for-company-users). - Step 2 - Find the API Token: You can get the API Token manually from the Pipedrive web app by going to account name (on the top right) > Company settings > Personal preferences > API. @@ -53,58 +52,57 @@ The Pipedrive connector supports the following sync modes: | SSL connection | Yes | | Namespaces | No | - ## Supported Streams Apart from `Fields` streams, all other streams support incremental. -* [Activities](https://developers.pipedrive.com/docs/api/v1/Activities#getActivities) +- [Activities](https://developers.pipedrive.com/docs/api/v1/Activities#getActivities) -* [ActivityFields](https://developers.pipedrive.com/docs/api/v1/ActivityFields#getActivityFields) +- [ActivityFields](https://developers.pipedrive.com/docs/api/v1/ActivityFields#getActivityFields) -* [ActivityTypes](https://developers.pipedrive.com/docs/api/v1/ActivityTypes#getActivityTypes) +- [ActivityTypes](https://developers.pipedrive.com/docs/api/v1/ActivityTypes#getActivityTypes) -* [Currencies](https://developers.pipedrive.com/docs/api/v1/Currencies#getCurrencies) +- [Currencies](https://developers.pipedrive.com/docs/api/v1/Currencies#getCurrencies) -* [DealFields](https://developers.pipedrive.com/docs/api/v1/DealFields#getDealFields) +- [DealFields](https://developers.pipedrive.com/docs/api/v1/DealFields#getDealFields) -* [DealProducts](https://developers.pipedrive.com/docs/api/v1/Deals#getDealProducts) +- [DealProducts](https://developers.pipedrive.com/docs/api/v1/Deals#getDealProducts) -* [Deals](https://developers.pipedrive.com/docs/api/v1/Deals#getDeals) +- [Deals](https://developers.pipedrive.com/docs/api/v1/Deals#getDeals) -* [Files](https://developers.pipedrive.com/docs/api/v1/Files#getFiles) +- [Files](https://developers.pipedrive.com/docs/api/v1/Files#getFiles) -* [Filters](https://developers.pipedrive.com/docs/api/v1/Filters#getFilters) +- [Filters](https://developers.pipedrive.com/docs/api/v1/Filters#getFilters) -* [Goals](https://developers.pipedrive.com/docs/api/v1/Goals#getGoals) +- [Goals](https://developers.pipedrive.com/docs/api/v1/Goals#getGoals) -* [LeadLabels](https://developers.pipedrive.com/docs/api/v1/LeadLabels#getLeadLabels) +- [LeadLabels](https://developers.pipedrive.com/docs/api/v1/LeadLabels#getLeadLabels) -* [Leads](https://developers.pipedrive.com/docs/api/v1/Leads#getLeads) +- [Leads](https://developers.pipedrive.com/docs/api/v1/Leads#getLeads) -* [Notes](https://developers.pipedrive.com/docs/api/v1/Notes#getNotes) +- [Notes](https://developers.pipedrive.com/docs/api/v1/Notes#getNotes) -* [OrganizationFields](https://developers.pipedrive.com/docs/api/v1/OrganizationFields#getOrganizationFields) +- [OrganizationFields](https://developers.pipedrive.com/docs/api/v1/OrganizationFields#getOrganizationFields) -* 
[Organizations](https://developers.pipedrive.com/docs/api/v1/Organizations#getOrganizations) +- [Organizations](https://developers.pipedrive.com/docs/api/v1/Organizations#getOrganizations) -* [PermissionSets](https://developers.pipedrive.com/docs/api/v1/PermissionSets#getPermissionSets) +- [PermissionSets](https://developers.pipedrive.com/docs/api/v1/PermissionSets#getPermissionSets) -* [PersonFields](https://developers.pipedrive.com/docs/api/v1/PersonFields#getPersonFields) +- [PersonFields](https://developers.pipedrive.com/docs/api/v1/PersonFields#getPersonFields) -* [Persons](https://developers.pipedrive.com/docs/api/v1/Persons#getPersons) +- [Persons](https://developers.pipedrive.com/docs/api/v1/Persons#getPersons) -* [Pipelines](https://developers.pipedrive.com/docs/api/v1/Pipelines#getPipelines) +- [Pipelines](https://developers.pipedrive.com/docs/api/v1/Pipelines#getPipelines) -* [ProductFields](https://developers.pipedrive.com/docs/api/v1/ProductFields#getProductFields) +- [ProductFields](https://developers.pipedrive.com/docs/api/v1/ProductFields#getProductFields) -* [Products](https://developers.pipedrive.com/docs/api/v1/Products#getProducts) +- [Products](https://developers.pipedrive.com/docs/api/v1/Products#getProducts) -* [Roles](https://developers.pipedrive.com/docs/api/v1/Roles#getRoles) +- [Roles](https://developers.pipedrive.com/docs/api/v1/Roles#getRoles) -* [Stages](https://developers.pipedrive.com/docs/api/v1/Stages#getStages) +- [Stages](https://developers.pipedrive.com/docs/api/v1/Stages#getStages) -* [Users](https://developers.pipedrive.com/docs/api/v1/Users#getUsers) +- [Users](https://developers.pipedrive.com/docs/api/v1/Users#getUsers) ## Performance considerations @@ -113,8 +111,8 @@ The Pipedrive connector will gracefully handle rate limits. For more information ## Changelog | Version | Date | Pull Request | Subject | -|:--------|:-----------|:---------------------------------------------------------|:---------------------------------------------------------------------------| -| 2.2.2 | 2024-01-11 | [34153](https://github.com/airbytehq/airbyte/pull/34153) | prepare for airbyte-lib | +| :------ | :--------- | :------------------------------------------------------- | :------------------------------------------------------------------------- | +| 2.2.2 | 2024-01-11 | [34153](https://github.com/airbytehq/airbyte/pull/34153) | prepare for airbyte-lib | | 2.2.1 | 2023-11-06 | [31147](https://github.com/airbytehq/airbyte/pull/31147) | Bugfix: handle records with a null data field | | 2.2.0 | 2023-10-25 | [31707](https://github.com/airbytehq/airbyte/pull/31707) | Add new stream mail | | 2.1.0 | 2023-10-10 | [31184](https://github.com/airbytehq/airbyte/pull/31184) | Add new stream goals | diff --git a/docs/integrations/sources/pivotal-tracker.md b/docs/integrations/sources/pivotal-tracker.md index 214eaf95538..f1dc4cb9072 100644 --- a/docs/integrations/sources/pivotal-tracker.md +++ b/docs/integrations/sources/pivotal-tracker.md @@ -51,7 +51,7 @@ Use this to pull data from Pivotal Tracker. 
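Before configuring the source, you can confirm the token has access with a direct API call. The endpoint and header below are assumptions drawn from Pivotal Tracker's public v5 API documentation — a sketch for verification, not part of the connector itself.

```sh
# Illustrative sketch: list the projects visible to this API token.
# Assumes the token is exported as PIVOTAL_TRACKER_API_TOKEN; the v5 path and
# X-TrackerToken header come from Pivotal Tracker's public API docs.
curl -H "X-TrackerToken: $PIVOTAL_TRACKER_API_TOKEN" \
  "https://www.pivotaltracker.com/services/v5/projects"
```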
## Changelog -| Version | Date | Pull Request | Subject | -| :------ | :--------- | :------------------------------------------------------- | :-------------- | +| Version | Date | Pull Request | Subject | +| :------ | :--------- | :------------------------------------------------------- | :------------------------------ | | 0.1.1 | 2023-10-25 | [11060](https://github.com/airbytehq/airbyte/pull/11060) | Fix schema and check connection | -| 0.1.0 | 2022-04-04 | [11060](https://github.com/airbytehq/airbyte/pull/11060) | Initial Release | +| 0.1.0 | 2022-04-04 | [11060](https://github.com/airbytehq/airbyte/pull/11060) | Initial Release | diff --git a/docs/integrations/sources/plausible.md b/docs/integrations/sources/plausible.md index 02cf5238043..833f5a5a90a 100644 --- a/docs/integrations/sources/plausible.md +++ b/docs/integrations/sources/plausible.md @@ -1,19 +1,23 @@ # Plausible ## Requirements -* [Plausible account](https://plausible.io/) -* Plausible [API key](https://plausible.io/docs/stats-api) + +- [Plausible account](https://plausible.io/) +- Plausible [API key](https://plausible.io/docs/stats-api) ## Supported sync modes -| Feature | Supported?\(Yes/No\) | Notes | -| :--- | :--- | :--- | -| Full Refresh Sync | Yes | [Overwrite](https://docs.airbyte.com/understanding-airbyte/connections/full-refresh-overwrite) | -| Incremental Sync | No | | + +| Feature | Supported?\(Yes/No\) | Notes | +| :---------------- | :------------------- | :--------------------------------------------------------------------------------------------- | +| Full Refresh Sync | Yes | [Overwrite](https://docs.airbyte.com/understanding-airbyte/connections/full-refresh-overwrite) | +| Incremental Sync | No | | ## Supported Streams -* [Stats - Time Series](https://plausible.io/docs/stats-api#get-apiv1statstimeseries) + +- [Stats - Time Series](https://plausible.io/docs/stats-api#get-apiv1statstimeseries) ### Notes + Plausible is a privacy-first analytics service, and the data available from its API is intentionally 1) less granular and 2) less comprehensive than those available from Google Analytics. As such: 1. when retrieving multi-day data, [metrics](https://plausible.io/docs/stats-api#metrics) are aggregated to a daily grain; and @@ -22,10 +26,11 @@ Plausible is a privacy-first analytics service, and the data available from its Thus, this source connector retrieves [all possible metrics](https://plausible.io/docs/stats-api#metrics) on a daily grain, for all days with nonzero website activity. ## Performance Considerations + The [stated rate limit](https://plausible.io/docs/stats-api) is 600 requests per hour per API key, with higher capacities potentially available [upon request](https://plausible.io/contact). 
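For reference, a single daily-grain request of the kind described above sits comfortably inside that limit. The call below is an illustrative sketch based on the public Stats API docs — the endpoint, `site_id`, and `period` values are placeholders to adapt, not connector internals.

```sh
# Illustrative sketch: fetch 30 days of daily-grain metrics for one site.
# Assumes PLAUSIBLE_API_KEY is exported; the endpoint and parameters are taken from the
# public Plausible Stats API docs and should be verified for your account.
curl -H "Authorization: Bearer $PLAUSIBLE_API_KEY" \
  "https://plausible.io/api/v1/stats/timeseries?site_id=example.com&period=30d&metrics=visitors,pageviews"
```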
## Changelog -| Version | Date | Pull Request | Subject | -|:--------|:-----------| :----------- |:-----------------------------------------------------------| -| 0.1.0 | 2022-10-30 | [18657](https://github.com/airbytehq/airbyte/pull/18657) | Initial commit | \ No newline at end of file +| Version | Date | Pull Request | Subject | +| :------ | :--------- | :------------------------------------------------------- | :------------- | +| 0.1.0 | 2022-10-30 | [18657](https://github.com/airbytehq/airbyte/pull/18657) | Initial commit | diff --git a/docs/integrations/sources/pocket.md b/docs/integrations/sources/pocket.md index f3e8c71f4aa..d7eff408ee3 100644 --- a/docs/integrations/sources/pocket.md +++ b/docs/integrations/sources/pocket.md @@ -8,12 +8,12 @@ The Pocket source connector only supports full refresh syncs A single output stream is available from this source: -* [Retrieve](https://getpocket.com/developer/docs/v3/retrieve) +- [Retrieve](https://getpocket.com/developer/docs/v3/retrieve) ### Features | Feature | Supported? | -|:------------------|:-----------| +| :---------------- | :--------- | | Full Refresh Sync | Yes | | Incremental Sync | No | @@ -25,8 +25,8 @@ For more info on rate limiting, please refer to [Pocket Docs > Rate Limits](http ### Requirements -* Consumer Key -* Access Token +- Consumer Key +- Access Token ### Setup Guide @@ -36,12 +36,15 @@ It's nevertheless, very recommended to follow [this guide](https://www.jamesfmac 1. Create an App in the [Pocket Developer Portal](https://getpocket.com/developer/apps/new), give it Retrieve permissions and get your Consumer Key. 2. Obtain a Request Token. To do so, you need to issue a POST request to get a temporary Request Token. You can execute the command below: + ```sh curl --insecure -X POST -H 'Content-Type: application/json' -H 'X-Accept: application/json' \ https://getpocket.com/v3/oauth/request -d '{"consumer_key":"REPLACE-ME","redirect_uri":"http://www.google.com"}' ``` + 3. Visit the following website from your browser, and authorize the app: `https://getpocket.com/auth/authorize?request_token=REPLACE-ME&redirect_uri=http://www.google.com` 4. Convert your Request Token Into a Pocket Access Token. To do so, you can execute the following command: + ```sh curl --insecure -X POST -H 'Content-Type: application/json' -H 'X-Accept: application/json' \ https://getpocket.com/v3/oauth/authorize -d '{"consumer_key":"REPLACE-ME","code":"REQUEST-TOKEN"}' @@ -49,9 +52,9 @@ curl --insecure -X POST -H 'Content-Type: application/json' -H 'X-Accept: applic ## Changelog -| Version | Date | Pull Request | Subject | -|:--------|:-----------|:-----------------------------------------------------------|:------------------------------------------------| -| 0.1.3 | 2024-04-19 | [37228](https://github.com/airbytehq/airbyte/pull/37228) | Upgrade to CDK 0.80.0 and manage dependencies with Poetry. 
| -| 0.1.2 | 2024-04-15 | [37228](https://github.com/airbytehq/airbyte/pull/37228) | Base image migration: remove Dockerfile and use the python-connector-base image | -| 0.1.1 | 2024-04-12 | [37228](https://github.com/airbytehq/airbyte/pull/37228) | schema descriptions | -| 0.1.0 | 2022-10-30 | [18655](https://github.com/airbytehq/airbyte/pull/18655) | 🎉 New Source: Pocket | +| Version | Date | Pull Request | Subject | +| :------ | :--------- | :------------------------------------------------------- | :------------------------------------------------------------------------------ | +| 0.1.3 | 2024-04-19 | [37228](https://github.com/airbytehq/airbyte/pull/37228) | Upgrade to CDK 0.80.0 and manage dependencies with Poetry. | +| 0.1.2 | 2024-04-15 | [37228](https://github.com/airbytehq/airbyte/pull/37228) | Base image migration: remove Dockerfile and use the python-connector-base image | +| 0.1.1 | 2024-04-12 | [37228](https://github.com/airbytehq/airbyte/pull/37228) | schema descriptions | +| 0.1.0 | 2022-10-30 | [18655](https://github.com/airbytehq/airbyte/pull/18655) | 🎉 New Source: Pocket | diff --git a/docs/integrations/sources/pokeapi.md b/docs/integrations/sources/pokeapi.md index ee543b33e02..f23208bcd03 100644 --- a/docs/integrations/sources/pokeapi.md +++ b/docs/integrations/sources/pokeapi.md @@ -36,7 +36,7 @@ The PokéAPI uses the same [JSONSchema](https://json-schema.org/understanding-js | Version | Date | Pull Request | Subject | | :------ | :--------- | :------------------------------------------------------- | :---------------------------------------------- | -| 0.2.0 | 2023-10-02 | [30969](https://github.com/airbytehq/airbyte/pull/30969) | Migrated to Low code +| 0.2.0 | 2023-10-02 | [30969](https://github.com/airbytehq/airbyte/pull/30969) | Migrated to Low code | | 0.1.5 | 2022-05-18 | [12942](https://github.com/airbytehq/airbyte/pull/12942) | Fix example inputs | | 0.1.4 | 2021-12-07 | [8582](https://github.com/airbytehq/airbyte/pull/8582) | Update connector fields title/description | | 0.1.3 | 2021-12-03 | [8432](https://github.com/airbytehq/airbyte/pull/8432) | Migrate from base_python to CDK, add SAT tests. | diff --git a/docs/integrations/sources/polygon-stock-api.md b/docs/integrations/sources/polygon-stock-api.md index b9589b91f32..631f8e65ac2 100644 --- a/docs/integrations/sources/polygon-stock-api.md +++ b/docs/integrations/sources/polygon-stock-api.md @@ -2,7 +2,7 @@ ## Sync overview -This source can give information about stocks data available on +This source can give information about stocks data available on [PolygonStocksApi](https://polygon.io). It currently only supports Full Refresh syncs. @@ -10,14 +10,14 @@ syncs. This source is capable of syncing the following streams: -* `stock_api` +- `stock_api` ### Features -| Feature | Supported? \(Yes/No\) | Notes | -|:------------------|:----------------------|:--------------------------------------------------------| -| Full Refresh Sync | Yes | | -| Incremental Sync | No | | +| Feature | Supported? \(Yes/No\) | Notes | +| :---------------- | :-------------------- | :---- | +| Full Refresh Sync | Yes | | +| Incremental Sync | No | | ### Performance considerations @@ -35,27 +35,24 @@ may require a paid plan based upon your requirements. ### Setup guide The following fields are required fields for the connector to work: + - `apiKey`: Your Polygon Stocks API key. - `stocksTicker`: The ticker symbol of the `stock/equity`. - `multiplier`: The size of the timespan multiplier. 
-- `timespan`: The +- `timespan`: The - `from`: The start of the aggregate time window. Either a date with the format YYYY-MM-DD or a millisecond timestamp. - `to`: The end of the aggregate time window. Either a date with the format YYYY-MM-DD or a millisecond timestamp. - (optional) `adjusted`: determines whether or not the results are adjusted for splits. By default, results are adjusted and set to true. Set this to false to get results that are NOT adjusted for splits. - (optional) `sort`: Sort the results by timestamp. asc will return results in ascending order (oldest at the top), desc will return results in descending order (newest at the top). - (optional) `limit`: Limits the number of base aggregates queried to create the aggregate results. Max 50000 and Default 5000. Read more about how limit is used to calculate aggregate results in our article on Aggregate Data API Improvements [Find-more](https://polygon.io/blog/aggs-api-updates/). - - - - ## Changelog -| Version | Date | Pull Request | Subject | -|:--------|:-----------|:---------------------------------------------------------|:-----------| -| 0.1.5 | 2024-04-19 | [37230](https://github.com/airbytehq/airbyte/pull/37230) | Updating to 0.80.0 CDK | -| 0.1.4 | 2024-04-18 | [37230](https://github.com/airbytehq/airbyte/pull/37230) | Manage dependencies with Poetry. | -| 0.1.3 | 2024-04-15 | [37230](https://github.com/airbytehq/airbyte/pull/37230) | Base image migration: remove Dockerfile and use the python-connector-base image | -| 0.1.2 | 2024-04-12 | [37230](https://github.com/airbytehq/airbyte/pull/37230) | schema descriptions | -| 0.1.1 | 2023-02-13 | [22908](https://github.com/airbytehq/airbyte/pull/22908) | Specified date formatting in specificatition | -| 0.1.0 | 2022-11-02 | [18842](https://github.com/airbytehq/airbyte/pull/18842) | New source | +| Version | Date | Pull Request | Subject | +| :------ | :--------- | :------------------------------------------------------- | :------------------------------------------------------------------------------ | +| 0.1.5 | 2024-04-19 | [37230](https://github.com/airbytehq/airbyte/pull/37230) | Updating to 0.80.0 CDK | +| 0.1.4 | 2024-04-18 | [37230](https://github.com/airbytehq/airbyte/pull/37230) | Manage dependencies with Poetry. | +| 0.1.3 | 2024-04-15 | [37230](https://github.com/airbytehq/airbyte/pull/37230) | Base image migration: remove Dockerfile and use the python-connector-base image | +| 0.1.2 | 2024-04-12 | [37230](https://github.com/airbytehq/airbyte/pull/37230) | schema descriptions | +| 0.1.1 | 2023-02-13 | [22908](https://github.com/airbytehq/airbyte/pull/22908) | Specified date formatting in specificatition | +| 0.1.0 | 2022-11-02 | [18842](https://github.com/airbytehq/airbyte/pull/18842) | New source | diff --git a/docs/integrations/sources/postgres.md b/docs/integrations/sources/postgres.md index 238e1266ff0..6d8c3e84985 100644 --- a/docs/integrations/sources/postgres.md +++ b/docs/integrations/sources/postgres.md @@ -1,10 +1,11 @@ # Postgres Airbyte's certified Postgres connector offers the following features: -* Replicate data from tables, views and materilized views. Other data objects won't be replicated to the destination like indexes, permissions. -* Multiple methods of keeping your data fresh, including [Change Data Capture (CDC)](https://docs.airbyte.com/understanding-airbyte/cdc) and replication using the [xmin system column](#xmin). 
-* All available [sync modes](https://docs.airbyte.com/cloud/core-concepts#connection-sync-modes), providing flexibility in how data is delivered to your destination. -* Reliable replication at any table size with [checkpointing](https://docs.airbyte.com/understanding-airbyte/airbyte-protocol/#state--checkpointing) and chunking of database reads. + +- Replicate data from tables, views and materilized views. Other data objects won't be replicated to the destination like indexes, permissions. +- Multiple methods of keeping your data fresh, including [Change Data Capture (CDC)](https://docs.airbyte.com/understanding-airbyte/cdc) and replication using the [xmin system column](#xmin). +- All available [sync modes](https://docs.airbyte.com/cloud/core-concepts#connection-sync-modes), providing flexibility in how data is delivered to your destination. +- Reliable replication at any table size with [checkpointing](https://docs.airbyte.com/understanding-airbyte/airbyte-protocol/#state--checkpointing) and chunking of database reads. The contents below include a 'Quick Start' guide, advanced setup steps, and reference information (data type mapping, and changelogs). See [here](https://docs.airbyte.com/integrations/sources/postgres/postgres-troubleshooting) to troubleshooting issues with the Postgres connector. @@ -13,6 +14,7 @@ The contents below include a 'Quick Start' guide, advanced setup steps, and refe ## Quick Start Here is an outline of the minimum required steps to configure a Postgres connector: + 1. Create a dedicated read-only Postgres user with permissions for replicating data 2. Create a new Postgres source in the Airbyte UI using `xmin` system column 3. (Airbyte Cloud Only) Allow inbound traffic from Airbyte IPs @@ -44,6 +46,7 @@ From your [Airbyte Cloud](https://cloud.airbyte.com/workspaces) or Airbyte Open ![Create an Airbyte source](https://github.com/airbytehq/airbyte/blob/c078e8ed6703020a584d9362efa5665fbe8db77f/docs/integrations/sources/postgres/assets/airbyte_source_selection.png?raw=true) To fill out the required information: + 1. Enter the hostname, port number, and name for your Postgres database. 2. You may optionally opt to list each of the schemas you want to sync. These are case-sensitive, and multiple schemas may be entered. By default, `public` is the only selected schema. 3. Enter the username and password you created in [Step 1](#step-1-create-a-dedicated-read-only-postgres-user). @@ -52,12 +55,14 @@ To fill out the required information: 1. If your database is particularly large (> 500 GB), you will benefit from [configuring your Postgres source using logical replication (CDC)](#cdc). + #### Step 3: (Airbyte Cloud Only) Allow inbound traffic from Airbyte IPs. If you are on Airbyte Cloud, you will always need to modify your database configuration to allow inbound traffic from Airbyte IPs. You can find a list of all IPs that need to be allowlisted in our [Airbyte Security docs](../../operating-airbyte/security#network-security-1). Now, click `Set up source` in the Airbyte UI. Airbyte will now test connecting to your database. Once this succeeds, you've configured an Airbyte Postgres source! + ## Advanced Configuration @@ -65,15 +70,18 @@ Now, click `Set up source` in the Airbyte UI. 
Airbyte will now test connecting t ### Setup using CDC Airbyte uses [logical replication](https://www.postgresql.org/docs/10/logical-replication.html) of the Postgres write-ahead log (WAL) to incrementally capture deletes using a replication plugin: -* See [here](https://docs.airbyte.com/understanding-airbyte/cdc) to learn more on how Airbyte implements CDC. -* See [here](https://docs.airbyte.com/integrations/sources/postgres/postgres-troubleshooting#cdc-requirements) to learn more about Postgres CDC requirements and limitations. + +- See [here](https://docs.airbyte.com/understanding-airbyte/cdc) to learn more on how Airbyte implements CDC. +- See [here](https://docs.airbyte.com/integrations/sources/postgres/postgres-troubleshooting#cdc-requirements) to learn more about Postgres CDC requirements and limitations. We recommend configuring your Postgres source with CDC when: + - You need a record of deletions. - You have a very large database (500 GB or more). - Your table has a primary key but doesn't have a reasonable cursor field for incremental syncing (`updated_at`). These are the additional steps required (after following the [quick start](#quick-start)) to configure your Postgres source using CDC: + 1. Provide additional `REPLICATION` permissions to read-only user 2. Enable logical replication on your Postgres database 3. Create a replication slot on your Postgres database @@ -89,6 +97,7 @@ For CDC, you must connect to primary/master databases. Pointing the connector co #### Step 2: Provide additional permissions to read-only user To configure CDC for the Postgres source connector, grant `REPLICATION` permissions to the user created in [step 1 of the quick start](#step-1-create-a-dedicated-read-only-postgres-user): + ``` ALTER USER REPLICATION; ``` @@ -98,16 +107,17 @@ ALTER USER REPLICATION; To enable logical replication on bare metal, VMs (EC2/GCE/etc), or Docker, configure the following parameters in the
    postgresql.conf file for your Postgres database: | Parameter | Description | Set value to | -|-----------------------|--------------------------------------------------------------------------------|--------------------------------------------------------------------------------------------------------------------------------------| +| --------------------- | ------------------------------------------------------------------------------ | ------------------------------------------------------------------------------------------------------------------------------------ | | wal_level | Type of coding used within the Postgres write-ahead log | `logical ` | | max_wal_senders | The maximum number of processes used for handling WAL changes | `min: 1` | | max_replication_slots | The maximum number of replication slots that are allowed to stream WAL changes | `1` (if Airbyte is the only service reading subscribing to WAL changes. More than 1 if other services are also reading from the WAL) | To enable logical replication on AWS Postgres RDS or Aurora: -* Go to the Configuration tab for your DB cluster. -* Find your cluster parameter group. Either edit the parameters for this group or create a copy of this parameter group to edit. If you create a copy, change your cluster's parameter group before restarting. -* Within the parameter group page, search for `rds.logical_replication`. Select this row and click Edit parameters. Set this value to 1. -* Wait for a maintenance window to automatically restart the instance or restart it manually. + +- Go to the Configuration tab for your DB cluster. +- Find your cluster parameter group. Either edit the parameters for this group or create a copy of this parameter group to edit. If you create a copy, change your cluster's parameter group before restarting. +- Within the parameter group page, search for `rds.logical_replication`. Select this row and click Edit parameters. Set this value to 1. +- Wait for a maintenance window to automatically restart the instance or restart it manually. To enable logical replication on Azure Database for Postgres, change the replication mode of your Postgres DB on Azure to `logical` using the replication menu of your PostgreSQL instance in the Azure Portal. Alternatively, use the Azure CLI to run the following command: @@ -164,6 +174,7 @@ The Postgres source currently offers 3 methods of replicating updates to your de #### CDC Airbyte uses [logical replication](https://www.postgresql.org/docs/10/logical-replication.html) of the Postgres write-ahead log (WAL) to incrementally capture deletes using a replication plugin. To learn more how Airbyte implements CDC, refer to [Change Data Capture (CDC)](https://docs.airbyte.com/understanding-airbyte/cdc/). We recommend configuring your Postgres source with CDC when: + - You need a record of deletions. - You have a very large database (500 GB or more). - Your table has a primary key but doesn't have a reasonable cursor field for incremental syncing (`updated_at`). @@ -175,18 +186,20 @@ If your goal is to maintain a snapshot of your table in the destination but the Xmin replication is the new cursor-less replication method for Postgres. Cursorless syncs enable syncing new or updated rows without explicitly choosing a cursor field. The xmin system column which (available in all Postgres databases) is used to track inserts and updates to your source data. This is a good solution if: + - There is not a well-defined cursor candidate to use for Standard incremental mode. 
- You want to replace a previously configured full-refresh sync. - You are replicating Postgres tables less than 500GB. -- You are not replicating non-materialized views. Non-materialized views are not supported by xmin replication. +- You are not replicating non-materialized views. Non-materialized views are not supported by xmin replication. ## Connecting with SSL or SSH Tunneling ### SSL Modes -Airbyte Cloud uses SSL by default. You are not permitted to `disable` SSL while using Airbyte Cloud. +Airbyte Cloud uses SSL by default. You are not permitted to `disable` SSL while using Airbyte Cloud. Here is a breakdown of available SSL connection modes: + - `disable` to disable encrypted communication between Airbyte and the source - `allow` to enable encrypted communication only when required by the source - `prefer` to allow unencrypted communication only when the source doesn't support encryption @@ -199,6 +212,7 @@ Here is a breakdown of available SSL connection modes: If you are using SSH tunneling, as Airbyte Cloud requires encrypted communication, select `SSH Key Authentication` or `Password Authentication` if you selected `disable`, `allow`, or `prefer` as the SSL Mode; otherwise, the connection will fail. For SSH Tunnel Method, select: + - `No Tunnel` for a direct connection to the database - `SSH Key Authentication` to use an RSA Private as your secret for establishing the SSH tunnel - `Password Authentication` to use a password as your secret for establishing the SSH tunnel @@ -212,14 +226,14 @@ When using an SSH tunnel, you are configuring Airbyte to connect to an intermedi To connect to a Postgres instance via an SSH tunnel: 1. While [setting up](#step-2-create-a-new-postgres-source-in-airbyte-ui) the Postgres source connector, from the SSH tunnel dropdown, select: - - SSH Key Authentication to use a private as your secret for establishing the SSH tunnel - - Password Authentication to use a password as your secret for establishing the SSH Tunnel + - SSH Key Authentication to use a private as your secret for establishing the SSH tunnel + - Password Authentication to use a password as your secret for establishing the SSH Tunnel 2. For **SSH Tunnel Jump Server Host**, enter the hostname or IP address for the intermediate (bastion) server that Airbyte will connect to. 3. For **SSH Connection Port**, enter the port on the bastion server. The default port for SSH connections is 22. 4. For **SSH Login Username**, enter the username to use when connecting to the bastion server. **Note:** This is the operating system username and not the Postgres username. 5. For authentication: - - If you selected **SSH Key Authentication**, set the **SSH Private Key** to the [private Key](#generating-a-private-key-for-ssh-tunneling) that you are using to create the SSH connection. - - If you selected **Password Authentication**, enter the password for the operating system user to connect to the bastion server. **Note:** This is the operating system password and not the Postgres password. + - If you selected **SSH Key Authentication**, set the **SSH Private Key** to the [private Key](#generating-a-private-key-for-ssh-tunneling) that you are using to create the SSH connection. + - If you selected **Password Authentication**, enter the password for the operating system user to connect to the bastion server. **Note:** This is the operating system password and not the Postgres password. 
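To sanity-check these values before saving the source, you can open an equivalent tunnel manually from any machine that can reach the bastion. This is only a rough sketch of what the tunnel option does — the hostnames, port, username, and key path are placeholders, not values Airbyte requires.

```sh
# Rough, illustrative equivalent of the SSH tunnel option: forward a local port
# through the bastion to the database host, then connect through it.
# All hostnames, users, and paths below are placeholders — substitute your own.
ssh -i ~/.ssh/airbyte_rsa -N -L 5432:your-postgres-host.internal:5432 ssh_user@bastion.example.com &

# With the tunnel open, the database should answer on localhost:5432:
psql "host=localhost port=5432 dbname=your_db user=airbyte_readonly" -c "SELECT 1;"
```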
#### Generating a private key for SSH Tunneling @@ -240,7 +254,7 @@ To see connector limitations, or troubleshoot your Postgres connector, see more According to Postgres [documentation](https://www.postgresql.org/docs/14/datatype.html), Postgres data types are mapped to the following data types when synchronizing data. You can check the test values examples [here](https://github.com/airbytehq/airbyte/blob/master/airbyte-integrations/connectors/source-postgres/src/test-integration/java/io/airbyte/integrations/io/airbyte/integration_tests/sources/PostgresSourceDatatypeTest.java). If you can't find the data type you are looking for or have any problems feel free to add a new test! | Postgres Type | Resulting Type | Notes | -|---------------------------------------|----------------|------------------------------------------------------------------------------------------------------------------------------------------------------| +| ------------------------------------- | -------------- | ---------------------------------------------------------------------------------------------------------------------------------------------------- | | `bigint` | number | | | `bigserial`, `serial8` | number | | | `bit` | string | Fixed-length bit string (e.g. "0100"). | @@ -291,7 +305,7 @@ According to Postgres [documentation](https://www.postgresql.org/docs/14/datatyp ## Changelog | Version | Date | Pull Request | Subject | -|---------|------------|----------------------------------------------------------|----------------------------------------------------------------------------------------------------------------------------------------------------------------------------| +| ------- | ---------- | -------------------------------------------------------- | -------------------------------------------------------------------------------------------------------------------------------------------------------------------------- | --- | | 3.3.32 | 2024-04-30 | [37758](https://github.com/airbytehq/airbyte/pull/37758) | Correct previous release to disable debezium retries | | 3.3.31 | 2024-04-30 | [37754](https://github.com/airbytehq/airbyte/pull/37754) | Add CDC logs | | 3.3.30 | 2024-04-30 | [37726](https://github.com/airbytehq/airbyte/pull/37726) | Remove debezium retries | @@ -453,13 +467,13 @@ According to Postgres [documentation](https://www.postgresql.org/docs/14/datatyp | 0.4.43 | 2022-08-03 | [15226](https://github.com/airbytehq/airbyte/pull/15226) | Make connectionTimeoutMs configurable through JDBC url parameters | | 0.4.42 | 2022-08-03 | [15273](https://github.com/airbytehq/airbyte/pull/15273) | Fix a bug in `0.4.36` and correctly parse the CDC initial record waiting time | | 0.4.41 | 2022-08-03 | [15077](https://github.com/airbytehq/airbyte/pull/15077) | Sync data from beginning if the LSN is no longer valid in CDC | -| | 2022-08-03 | [14903](https://github.com/airbytehq/airbyte/pull/14903) | Emit state messages more frequently (⛔ this version has a bug; use `1.0.1` instead | +| | 2022-08-03 | [14903](https://github.com/airbytehq/airbyte/pull/14903) | Emit state messages more frequently (⛔ this version has a bug; use `1.0.1` instead | | 0.4.40 | 2022-08-03 | [15187](https://github.com/airbytehq/airbyte/pull/15187) | Add support for BCE dates/timestamps | | | 2022-08-03 | [14534](https://github.com/airbytehq/airbyte/pull/14534) | Align regular and CDC integration tests and data mappers | | 0.4.39 | 2022-08-02 | [14801](https://github.com/airbytehq/airbyte/pull/14801) | Fix multiple log bindings | 
| 0.4.38 | 2022-07-26 | [14362](https://github.com/airbytehq/airbyte/pull/14362) | Integral columns are now discovered as int64 fields. | | 0.4.37 | 2022-07-22 | [14714](https://github.com/airbytehq/airbyte/pull/14714) | Clarified error message when invalid cursor column selected | -| 0.4.36 | 2022-07-21 | [14451](https://github.com/airbytehq/airbyte/pull/14451) | Make initial CDC waiting time configurable (⛔ this version has a bug and will not work; use `0.4.42` instead) | | +| 0.4.36 | 2022-07-21 | [14451](https://github.com/airbytehq/airbyte/pull/14451) | Make initial CDC waiting time configurable (⛔ this version has a bug and will not work; use `0.4.42` instead) | | | 0.4.35 | 2022-07-14 | [14574](https://github.com/airbytehq/airbyte/pull/14574) | Removed additionalProperties:false from JDBC source connectors | | 0.4.34 | 2022-07-17 | [13840](https://github.com/airbytehq/airbyte/pull/13840) | Added the ability to connect using different SSL modes and SSL certificates. | | 0.4.33 | 2022-07-14 | [14586](https://github.com/airbytehq/airbyte/pull/14586) | Validate source JDBC url parameters | diff --git a/docs/integrations/sources/postgres/cloud-sql-postgres.md b/docs/integrations/sources/postgres/cloud-sql-postgres.md index 670d268f82d..ea1079b99f7 100644 --- a/docs/integrations/sources/postgres/cloud-sql-postgres.md +++ b/docs/integrations/sources/postgres/cloud-sql-postgres.md @@ -1,9 +1,10 @@ # Cloud SQL for PostgreSQL Airbyte's certified Postgres connector offers the following features: -* Multiple methods of keeping your data fresh, including [Change Data Capture (CDC)](https://docs.airbyte.com/understanding-airbyte/cdc) and replication using the [xmin system column](https://docs.airbyte.com/integrations/sources/postgres#xmin). -* All available [sync modes](https://docs.airbyte.com/cloud/core-concepts#connection-sync-modes), providing flexibility in how data is delivered to your destination. -* Reliable replication at any table size with [checkpointing](https://docs.airbyte.com/understanding-airbyte/airbyte-protocol/#state--checkpointing) and chunking of database reads. + +- Multiple methods of keeping your data fresh, including [Change Data Capture (CDC)](https://docs.airbyte.com/understanding-airbyte/cdc) and replication using the [xmin system column](https://docs.airbyte.com/integrations/sources/postgres#xmin). +- All available [sync modes](https://docs.airbyte.com/cloud/core-concepts#connection-sync-modes), providing flexibility in how data is delivered to your destination. +- Reliable replication at any table size with [checkpointing](https://docs.airbyte.com/understanding-airbyte/airbyte-protocol/#state--checkpointing) and chunking of database reads. ![Airbyte Postgres Connection](https://raw.githubusercontent.com/airbytehq/airbyte/c078e8ed6703020a584d9362efa5665fbe8db77f/docs/integrations/sources/postgres/assets/airbyte_postgres_source.png?raw=true) @@ -12,6 +13,7 @@ Airbyte's certified Postgres connector offers the following features: ![Cloud SQL for PostgreSQL](./assets/airbyte_cloud_sql_postgres_db.png) Here is an outline of the minimum required steps to configure a connection to Postgres on Google Cloud SQL: + 1. Create a dedicated read-only Postgres user with permissions for replicating data 2. Create a new Postgres source in the Airbyte UI using `xmin` system column 3. 
(Airbyte Cloud Only) Allow inbound traffic from Airbyte IPs @@ -43,17 +45,20 @@ From your [Airbyte Cloud](https://cloud.airbyte.com/workspaces) or Airbyte Open ![Create an Airbyte source](https://github.com/airbytehq/airbyte/blob/c078e8ed6703020a584d9362efa5665fbe8db77f/docs/integrations/sources/postgres/assets/airbyte_source_selection.png?raw=true) To fill out the required information: + 1. Enter the hostname, port number, and name for your Postgres database. 2. You may optionally opt to list each of the schemas you want to sync. These are case-sensitive, and multiple schemas may be entered. By default, `public` is the only selected schema. 3. Enter the username and password you created in [Step 1](#step-1-create-a-dedicated-read-only-postgres-user). 4. Select an SSL mode. You will most frequently choose `require` or `verify-ca`. Both of these always require encryption. `verify-ca` also requires certificates from your Postgres database. See here to learn about other SSL modes and SSH tunneling. 5. Select `Standard (xmin)` from available replication methods. This uses the [xmin system column](https://docs.airbyte.com/integrations/sources/postgres#xmin) to reliably replicate data from your database. - 1. If your database is particularly large (> 500 GB), you will benefit from [configuring your Postgres source using logical replication (CDC)](https://docs.airbyte.com/integrations/sources/postgres#cdc). + 1. If your database is particularly large (> 500 GB), you will benefit from [configuring your Postgres source using logical replication (CDC)](https://docs.airbyte.com/integrations/sources/postgres#cdc). + #### Step 3: (Airbyte Cloud Only) Allow inbound traffic from Airbyte IPs. If you are on Airbyte Cloud, you will always need to modify your database configuration to allow inbound traffic from Airbyte IPs. To allowlist IPs in Cloud SQL: + 1. In your Google Cloud SQL database dashboard, select `Connections` from the left menu. Then, select `Add Network` under the `Connectivity` section. ![Add a Network](./assets/airbyte_cloud_sql_postgres_add_network.png) @@ -69,15 +74,18 @@ Now, click `Set up source` in the Airbyte UI. Airbyte will now test connecting t ### Setup using CDC Airbyte uses [logical replication](https://www.postgresql.org/docs/10/logical-replication.html) of the Postgres write-ahead log (WAL) to incrementally capture deletes using a replication plugin: -* See [here](https://docs.airbyte.com/understanding-airbyte/cdc) to learn more on how Airbyte implements CDC. -* See [here](https://docs.airbyte.com/integrations/sources/postgres/postgres-troubleshooting#cdc-requirements) to learn more about Postgres CDC requirements and limitations. + +- See [here](https://docs.airbyte.com/understanding-airbyte/cdc) to learn more on how Airbyte implements CDC. +- See [here](https://docs.airbyte.com/integrations/sources/postgres/postgres-troubleshooting#cdc-requirements) to learn more about Postgres CDC requirements and limitations. We recommend configuring your Postgres source with CDC when: + - You need a record of deletions. - You have a very large database (500 GB or more). - Your table has a primary key but doesn't have a reasonable cursor field for incremental syncing (`updated_at`). These are the additional steps required (after following the [quick start](#quick-start)) to configure your Postgres source using CDC: + 1. Provide additional `REPLICATION` permissions to read-only user 2. Enable logical replication on your Postgres database 3. 
Create a replication slot on your Postgres database @@ -93,6 +101,7 @@ For CDC, you must connect to primary/master databases. Pointing the connector co #### Step 2: Provide additional permissions to read-only user To configure CDC for the Postgres source connector, grant `REPLICATION` permissions to the user created in [step 1 of the quick start](#step-1-create-a-dedicated-read-only-postgres-user): + ``` ALTER USER REPLICATION; ``` diff --git a/docs/integrations/sources/postgres/postgres-troubleshooting.md b/docs/integrations/sources/postgres/postgres-troubleshooting.md index 329cc2af727..b28770c5d5a 100644 --- a/docs/integrations/sources/postgres/postgres-troubleshooting.md +++ b/docs/integrations/sources/postgres/postgres-troubleshooting.md @@ -7,9 +7,9 @@ - The Postgres source connector currently does not handle schemas larger than 4MB. - The Postgres source connector does not alter the schema present in your database. Depending on the destination connected to this source, however, the schema may be altered. See the destination's documentation for more details. - The following two schema evolution actions are currently supported: - - Adding/removing tables without resetting the entire connection at the destination - Caveat: In the CDC mode, adding a new table to a connection may become a temporary bottleneck. When a new table is added, the next sync job takes a full snapshot of the new table before it proceeds to handle any changes. - - Resetting a single table within the connection without resetting the rest of the destination tables in that connection + - Adding/removing tables without resetting the entire connection at the destination + Caveat: In the CDC mode, adding a new table to a connection may become a temporary bottleneck. When a new table is added, the next sync job takes a full snapshot of the new table before it proceeds to handle any changes. + - Resetting a single table within the connection without resetting the rest of the destination tables in that connection - Changing a column data type or removing a column might break connections. ### Version Requirements @@ -28,8 +28,8 @@ - Log-based replication only works for master instances of Postgres. CDC cannot be run from a read-replica of your primary database. - An Airbyte database source using CDC replication can only be used with a single Airbyte destination. This is due to how Postgres CDC is implemented - each destination would recieve only part of the data available in the replication slot. - Using logical replication increases disk space used on the database server. The additional data is stored until it is consumed. - - Set frequent syncs for CDC to ensure that the data doesn't fill up your disk space. - - If you stop syncing a CDC-configured Postgres instance with Airbyte, delete the replication slot. Otherwise, it may fill up your disk space. + - Set frequent syncs for CDC to ensure that the data doesn't fill up your disk space. + - If you stop syncing a CDC-configured Postgres instance with Airbyte, delete the replication slot. Otherwise, it may fill up your disk space. ### Supported cursors @@ -78,15 +78,14 @@ Normally under the CDC mode, the Postgres source will first run a full refresh s The root causes is that the WALs needed for the incremental sync has been removed by Postgres. This can occur under the following scenarios: - When there are lots of database updates resulting in more WAL files than allowed in the `pg_wal` directory, Postgres will purge or archive the WAL files. This scenario is preventable. 
Possible solutions include:
-  - Sync the data source more frequently.
-  - Set a higher `wal_keep_size`. If no unit is provided, it is in megabytes, and the default is `0`. See detailed documentation [here](https://www.postgresql.org/docs/current/runtime-config-replication.html#GUC-WAL-KEEP-SIZE). The downside of this approach is that more disk space will be needed.
+  - Sync the data source more frequently.
+  - Set a higher `wal_keep_size`. If no unit is provided, it is in megabytes, and the default is `0`. See detailed documentation [here](https://www.postgresql.org/docs/current/runtime-config-replication.html#GUC-WAL-KEEP-SIZE). The downside of this approach is that more disk space will be needed.
- When the Postgres connector successfully reads the WAL and acknowledges it to Postgres, but the destination connector fails to consume the data, the Postgres connector will try to read the same WAL again, which may have been removed by Postgres, since the WAL record is already acknowledged. This scenario is rare; it can happen, and currently there is no way to prevent it. The correct behavior is to perform a full refresh.

### Temporary File Size Limit

Some larger tables may encounter an error related to the temporary file size limit such as `temporary file size exceeds temp_file_limit`. To correct this error, increase the [temp_file_limit](https://postgresqlco.nf/doc/en/param/temp_file_limit/).

-
### (Advanced) Custom JDBC Connection Strings

To customize the JDBC connection beyond common options, specify additional supported [JDBC URL parameters](https://jdbc.postgresql.org/documentation/head/connect.html) as key-value pairs separated by the symbol & in the **JDBC URL Parameters (Advanced)** field.

@@ -103,6 +102,7 @@ The connector now supports `connectTimeout` and defaults to 60 seconds. Setting
### (Advanced) Setting up initial CDC waiting time

The Postgres connector may need some time to start processing the data in the CDC mode in the following scenarios:
+
- When the connection is set up for the first time and a snapshot is needed
- When the connector has a lot of change logs to process

@@ -114,10 +114,10 @@ If you know there are database changes to be synced, but the connector cannot re

In certain situations, WAL disk consumption increases. This can occur when there is a large volume of changes, but only a small percentage of them are being made to the databases, schemas and tables configured for capture.

-A workaround for this situation is to artificially add events to a heartbeat table that the Airbyte use has write access to. This will ensure that Airbyte can process the WAL and prevent disk space to spike. To configure this:
-1. Create a table (e.g. `airbyte_heartbeat`) in the database and schema being tracked.
-2. Add this table to the airbyte publication.
+A workaround for this situation is to artificially add events to a heartbeat table that the Airbyte user has write access to. This will ensure that Airbyte can process the WAL and prevent disk space from spiking. To configure this:
+
+1. Create a table (e.g. `airbyte_heartbeat`) in the database and schema being tracked.
+2. Add this table to the airbyte publication.
3. Configure the `heartbeat_action_query` property while setting up the source-postgres connector. This query will be periodically executed by Airbyte on the `airbyte_heartbeat` table. For example, this param can be set to a query like `INSERT INTO airbyte_heartbeat (text) VALUES ('heartbeat')`. A sketch of the supporting table and publication setup is shown below.
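The snippet below is a hypothetical sketch of steps 1 and 2, assuming a connector user named `airbyte_user` and a publication named `airbyte_publication` — substitute the names from your own setup; it is not something the connector runs for you.

```sh
# Hypothetical sketch of steps 1 and 2 above. The user name, publication name, and
# column layout are assumptions — adjust them to match your environment.
psql "$POSTGRES_CONNECTION_URL" <<'SQL'
CREATE TABLE airbyte_heartbeat (text TEXT, updated_at TIMESTAMPTZ DEFAULT now());
GRANT INSERT, UPDATE ON airbyte_heartbeat TO airbyte_user;
ALTER PUBLICATION airbyte_publication ADD TABLE airbyte_heartbeat;
SQL
```

With this in place, the `heartbeat_action_query` from step 3 gives Airbyte a row it can write whenever the WAL needs to advance.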
- See detailed documentation [here](https://debezium.io/documentation/reference/stable/connectors/postgresql.html#postgresql-wal-disk-space). diff --git a/docs/integrations/sources/posthog.md b/docs/integrations/sources/posthog.md index 2c40b751526..192e1229b16 100644 --- a/docs/integrations/sources/posthog.md +++ b/docs/integrations/sources/posthog.md @@ -46,7 +46,7 @@ This page contains the setup guide and reference information for the PostHog sou ### Rate limiting -Private `GET`, `POST`, `PATCH`, `DELETE` endpoints are rate limited. Public POST-only endpoints are **not** rate limited. A rule of thumb for whether rate limits apply is if the personal API key is used for authentication. +Private `GET`, `POST`, `PATCH`, `DELETE` endpoints are rate limited. Public POST-only endpoints are **not** rate limited. A rule of thumb for whether rate limits apply is if the personal API key is used for authentication. There are separate limits for different kinds of resources. @@ -67,9 +67,9 @@ Want to use the PostHog API beyond these limits? Email Posthog at `customers@pos ## Changelog | Version | Date | Pull Request | Subject | -|:--------|:-----------|:---------------------------------------------------------|:-------------------------------------------------------------------------------------------------------------------| +| :------ | :--------- | :------------------------------------------------------- | :----------------------------------------------------------------------------------------------------------------- | | 1.0.0 | 2023-12-04 | [28593](https://github.com/airbytehq/airbyte/pull/28593) | Fix events.event type | -| 0.1.15 | 2023-10-28 | [31265](https://github.com/airbytehq/airbyte/pull/31265) | Fix Events stream datetime format | +| 0.1.15 | 2023-10-28 | [31265](https://github.com/airbytehq/airbyte/pull/31265) | Fix Events stream datetime format | | 0.1.14 | 2023-08-29 | [29947](https://github.com/airbytehq/airbyte/pull/29947) | Add optional field to spec: `events_time_step` | | 0.1.13 | 2023-07-19 | [28461](https://github.com/airbytehq/airbyte/pull/28461) | Fixed EventsSimpleRetriever declaration | | 0.1.12 | 2023-06-28 | [27764](https://github.com/airbytehq/airbyte/pull/27764) | Update following state breaking changes | diff --git a/docs/integrations/sources/postmarkapp.md b/docs/integrations/sources/postmarkapp.md index 8e7edc01c5a..b8adf316a41 100644 --- a/docs/integrations/sources/postmarkapp.md +++ b/docs/integrations/sources/postmarkapp.md @@ -9,21 +9,21 @@ The Postmarkapp source can sync data from the [Postmarkapp API](https://postmark Postmarkapp requires an API key to make request and retrieve data. You can find your API key in the [Postmarkapp dashboard](https://account.postmarkapp.com/servers/9708911/credentials). ## Streams -Current supported streams: + +Current supported streams: Server-API + - [Bounces: Deliverystats](https://postmarkapp.com/developer/api/bounce-api#delivery-stats): Lets you access all reports regarding your bounces for a specific server. Bounces are available for 45 days after a bounce. - [Message-Streams](https://postmarkapp.com/developer/api/message-streams-api#list-message-streams): Lets you manage message streams for a specific server. Please note: A Server may have up to 10 Streams, including the default ones. Default Streams cannot be deleted, and Servers can only have 1 Inbound Stream. 
- [Outbound stats](https://account.postmarkapp.com/servers/9708911/credentials): Lets you get all of the statistics of your outbound emails for a specific server. These statistics are stored permantently and do not expire. All stats use EST timezone Account-API + - [Servers](https://postmarkapp.com/developer/api/servers-api): Lets you manage servers for a specific account. - [Domains](https://postmarkapp.com/developer/api/domains-api): Gets a list of domains containing an overview of the domain and authentication status. - [Sender signatures](https://postmarkapp.com/developer/api/signatures-api): Gets a list of sender signatures containing brief details associated with your account. - - - ## Setup guide ## Step 1: Set up the Postmarkapp connector in Airbyte @@ -51,12 +51,11 @@ The Postmarkapp source connector supports the following [sync modes](https://doc | Incremental Sync | No | | Namespaces | No | - ## Changelog -| Version | Date | Pull Request | Subject | -| :------ |:-----|:-------------| :----------------------------------------- | -| 0.1.3 | 2024-04-19 | [37232](https://github.com/airbytehq/airbyte/pull/37232) | Upgrade to CDK 0.80.0 and manage dependencies with Poetry. | -| 0.1.2 | 2024-04-15 | [37232](https://github.com/airbytehq/airbyte/pull/37232) | Base image migration: remove Dockerfile and use the python-connector-base image | -| 0.1.1 | 2024-04-12 | [37232](https://github.com/airbytehq/airbyte/pull/37232) | schema descriptions | -| 0.1.0 | 2022-11-09 | 18220 | 🎉 New Source: Postmarkapp API [low-code CDK] | +| Version | Date | Pull Request | Subject | +| :------ | :--------- | :------------------------------------------------------- | :------------------------------------------------------------------------------ | +| 0.1.3 | 2024-04-19 | [37232](https://github.com/airbytehq/airbyte/pull/37232) | Upgrade to CDK 0.80.0 and manage dependencies with Poetry. | +| 0.1.2 | 2024-04-15 | [37232](https://github.com/airbytehq/airbyte/pull/37232) | Base image migration: remove Dockerfile and use the python-connector-base image | +| 0.1.1 | 2024-04-12 | [37232](https://github.com/airbytehq/airbyte/pull/37232) | schema descriptions | +| 0.1.0 | 2022-11-09 | 18220 | 🎉 New Source: Postmarkapp API [low-code CDK] | diff --git a/docs/integrations/sources/prestashop.md b/docs/integrations/sources/prestashop.md index 2760d1f2fbf..12f90dd8f97 100644 --- a/docs/integrations/sources/prestashop.md +++ b/docs/integrations/sources/prestashop.md @@ -102,14 +102,14 @@ If there are more endpoints you'd like Airbyte to support, please [create an iss ## CHANGELOG -| Version | Date | Pull Request | Subject | -| :------ | :--------- | :-------------------------------------------------------- | :--------------------------------------------------- | -| 1.0.4 | 2024-04-19 | [37233](https://github.com/airbytehq/airbyte/pull/37233) | Updating to 0.80.0 CDK | -| 1.0.3 | 2024-04-18 | [37233](https://github.com/airbytehq/airbyte/pull/37233) | Manage dependencies with Poetry. 
| -| 1.0.2 | 2024-04-15 | [37233](https://github.com/airbytehq/airbyte/pull/37233) | Base image migration: remove Dockerfile and use the python-connector-base image | -| 1.0.1 | 2024-04-12 | [37233](https://github.com/airbytehq/airbyte/pull/37233) | schema descriptions | -| 1.0.0 | 2023-06-26 | [27716](https://github.com/airbytehq/airbyte/pull/27716) | update schema; remove empty datetime fields | -| 0.3.1 | 2023-02-13 | [22905](https://github.com/airbytehq/airbyte/pull/22905) | Specified date formatting in specification | -| 0.3.0 | 2022-11-08 | [#18927](https://github.com/airbytehq/airbyte/pull/18927) | Migrate connector from Alpha (Python) to Beta (YAML) | -| 0.2.0 | 2022-10-31 | [#18599](https://github.com/airbytehq/airbyte/pull/18599) | Only https scheme is allowed | -| 0.1.0 | 2021-07-02 | [#4465](https://github.com/airbytehq/airbyte/pull/4465) | Initial implementation | +| Version | Date | Pull Request | Subject | +| :------ | :--------- | :-------------------------------------------------------- | :------------------------------------------------------------------------------ | +| 1.0.4 | 2024-04-19 | [37233](https://github.com/airbytehq/airbyte/pull/37233) | Updating to 0.80.0 CDK | +| 1.0.3 | 2024-04-18 | [37233](https://github.com/airbytehq/airbyte/pull/37233) | Manage dependencies with Poetry. | +| 1.0.2 | 2024-04-15 | [37233](https://github.com/airbytehq/airbyte/pull/37233) | Base image migration: remove Dockerfile and use the python-connector-base image | +| 1.0.1 | 2024-04-12 | [37233](https://github.com/airbytehq/airbyte/pull/37233) | schema descriptions | +| 1.0.0 | 2023-06-26 | [27716](https://github.com/airbytehq/airbyte/pull/27716) | update schema; remove empty datetime fields | +| 0.3.1 | 2023-02-13 | [22905](https://github.com/airbytehq/airbyte/pull/22905) | Specified date formatting in specification | +| 0.3.0 | 2022-11-08 | [#18927](https://github.com/airbytehq/airbyte/pull/18927) | Migrate connector from Alpha (Python) to Beta (YAML) | +| 0.2.0 | 2022-10-31 | [#18599](https://github.com/airbytehq/airbyte/pull/18599) | Only https scheme is allowed | +| 0.1.0 | 2021-07-02 | [#4465](https://github.com/airbytehq/airbyte/pull/4465) | Initial implementation | diff --git a/docs/integrations/sources/primetric.md b/docs/integrations/sources/primetric.md index f3d48aac8f5..35cb97b618e 100644 --- a/docs/integrations/sources/primetric.md +++ b/docs/integrations/sources/primetric.md @@ -55,7 +55,7 @@ your employees to the right projects. 
## CHANGELOG -| Version | Date | Pull Request | Subject | -| :------ | :--------- | :--------------------------------------------------------- | :--------------------- | +| Version | Date | Pull Request | Subject | +| :------ | :--------- | :------------------------------------------------------- | :---------------------- | | 1.0.0 | 2024-04-01 | [36508](https://github.com/airbytehq/airbyte/pull/36508) | Migrate to low code cdk | -| 0.1.0 | 2022-09-05 | [15880](https://github.com/airbytehq/airbyte/pull/15880) | Initial implementation | +| 0.1.0 | 2022-09-05 | [15880](https://github.com/airbytehq/airbyte/pull/15880) | Initial implementation | diff --git a/docs/integrations/sources/public-apis.md b/docs/integrations/sources/public-apis.md index 220e2040546..9b4b7b8d84a 100644 --- a/docs/integrations/sources/public-apis.md +++ b/docs/integrations/sources/public-apis.md @@ -8,26 +8,26 @@ This source can sync data for the [Public APIs](https://api.publicapis.org/) RES This Source is capable of syncing the following Streams: -* [Services](https://api.publicapis.org#get-entries) -* [Categories](https://api.publicapis.org#get-categories) +- [Services](https://api.publicapis.org#get-entries) +- [Categories](https://api.publicapis.org#get-categories) ### Data type mapping -| Integration Type | Airbyte Type | Notes | -| :--- | :--- | :--- | -| `string` | `string` | | -| `integer`, `number` | `number` | | -| `boolean` | `boolean` | | +| Integration Type | Airbyte Type | Notes | +| :------------------ | :----------- | :---- | +| `string` | `string` | | +| `integer`, `number` | `number` | | +| `boolean` | `boolean` | | ### Features -| Feature | Supported?\(Yes/No\) | Notes | -| :--- | :--- | :--- | -| Full Refresh Sync | Yes | | -| Incremental Sync | No | | -| SSL connection | Yes | -| Namespaces | No | | -| Pagination | No | | +| Feature | Supported?\(Yes/No\) | Notes | +| :---------------- | :------------------- | :---- | +| Full Refresh Sync | Yes | | +| Incremental Sync | No | | +| SSL connection | Yes | +| Namespaces | No | | +| Pagination | No | | ## Getting started @@ -41,7 +41,7 @@ This source requires no setup. ## Changelog -| Version | Date | Pull Request | Subject | -| :--- | :--- | :--- | :--- | -| 0.2.0 | 2023-06-15 | [29391](https://github.com/airbytehq/airbyte/pull/29391) | Migrated to Low Code | -| 0.1.0 | 2022-10-28 | [18471](https://github.com/airbytehq/airbyte/pull/18471) | Initial Release | +| Version | Date | Pull Request | Subject | +| :------ | :--------- | :------------------------------------------------------- | :------------------- | +| 0.2.0 | 2023-06-15 | [29391](https://github.com/airbytehq/airbyte/pull/29391) | Migrated to Low Code | +| 0.1.0 | 2022-10-28 | [18471](https://github.com/airbytehq/airbyte/pull/18471) | Initial Release | diff --git a/docs/integrations/sources/punk-api.md b/docs/integrations/sources/punk-api.md index 27d4871ada0..1662b8d563f 100644 --- a/docs/integrations/sources/punk-api.md +++ b/docs/integrations/sources/punk-api.md @@ -28,9 +28,9 @@ Api key is not required for this connector to work,But a dummy key need to be pa 1. Navigate to the Airbyte Open Source dashboard. 2. Set the name for your source. -4. Enter your dummy `api_key`. -5. Enter the params configuration if needed: ID (Optional) -6. Click **Set up source**. +3. Enter your dummy `api_key`. +4. Enter the params configuration if needed: ID (Optional) +5. Click **Set up source**. 
## Supported sync modes @@ -59,6 +59,6 @@ Punk API's [API reference](https://punkapi.com/documentation/v2) has v2 at prese ## Changelog -| Version | Date | Pull Request | Subject | -| :------ | :--------- | :----------------------------------------------------- | :------------- | -| 0.1.0 | 2022-10-31 | [Init](https://github.com/airbytehq/airbyte/pull/)| Initial commit | +| Version | Date | Pull Request | Subject | +| :------ | :--------- | :------------------------------------------------------ | :------------- | +| 0.1.0 | 2022-10-31 | [Init](https://github.com/airbytehq/airbyte/pull/) | Initial commit | diff --git a/docs/integrations/sources/pypi.md b/docs/integrations/sources/pypi.md index fa6c2432f60..b9ef5aff8fd 100644 --- a/docs/integrations/sources/pypi.md +++ b/docs/integrations/sources/pypi.md @@ -3,16 +3,18 @@ This page guides you through the process of setting up the PyPI source connector. ## Setup guide + ### Get package name from PyPI + This is the name given in `pip install package_name` box. For example, `airbyte-cdk` is the package name for [airbyte-cdk](https://pypi.org/project/airbyte-cdk/). Optianlly, provide a version name. If not provided, the release stream, containing data for particular version, cannot be used. The project stream is as same as release stream but contains data for all versions. ## Supported streams and sync modes -* [Project](https://warehouse.pypa.io/api-reference/json.html#project) -* [Release](https://warehouse.pypa.io/api-reference/json.html#release) -* [Stats](https://warehouse.pypa.io/api-reference/stats.html) +- [Project](https://warehouse.pypa.io/api-reference/json.html#project) +- [Release](https://warehouse.pypa.io/api-reference/json.html#release) +- [Stats](https://warehouse.pypa.io/api-reference/stats.html) ### Performance considerations @@ -24,11 +26,9 @@ Try not to make a lot of requests (thousands) in a short amount of time (minutes ## Changelog -| Version | Date | Pull Request | Subject | -|:--------|:-----------|:---------------------------------------------------------|:----------------| -| 0.1.3 | 2024-04-19 | [37237](https://github.com/airbytehq/airbyte/pull/37237) | Upgrade to CDK 0.80.0 and manage dependencies with Poetry. | -| 0.1.2 | 2024-04-15 | [37237](https://github.com/airbytehq/airbyte/pull/37237) | Base image migration: remove Dockerfile and use the python-connector-base image | -| 0.1.1 | 2024-04-12 | [37237](https://github.com/airbytehq/airbyte/pull/37237) | schema descriptions | -| 0.1.0 | 2022-10-29 | [18632](https://github.com/airbytehq/airbyte/pull/18632) | Initial Release | - - +| Version | Date | Pull Request | Subject | +| :------ | :--------- | :------------------------------------------------------- | :------------------------------------------------------------------------------ | +| 0.1.3 | 2024-04-19 | [37237](https://github.com/airbytehq/airbyte/pull/37237) | Upgrade to CDK 0.80.0 and manage dependencies with Poetry. 
| +| 0.1.2 | 2024-04-15 | [37237](https://github.com/airbytehq/airbyte/pull/37237) | Base image migration: remove Dockerfile and use the python-connector-base image | +| 0.1.1 | 2024-04-12 | [37237](https://github.com/airbytehq/airbyte/pull/37237) | schema descriptions | +| 0.1.0 | 2022-10-29 | [18632](https://github.com/airbytehq/airbyte/pull/18632) | Initial Release | diff --git a/docs/integrations/sources/qonto.md b/docs/integrations/sources/qonto.md index 20feb35aab7..dd90f4a4136 100644 --- a/docs/integrations/sources/qonto.md +++ b/docs/integrations/sources/qonto.md @@ -6,5 +6,5 @@ The Airbyte Source for [Qonto](https://qonto.com) | Version | Date | Pull Request | Subject | | :------ | :--------- | :------------------------------------------------------- | :-------------------------------- | -| 0.2.0 | 2023-10-25 | [31603](https://github.com/airbytehq/airbyte/pull/31603) | Migrate to low-code framework | +| 0.2.0 | 2023-10-25 | [31603](https://github.com/airbytehq/airbyte/pull/31603) | Migrate to low-code framework | | 0.1.0 | 2022-11-14 | [17452](https://github.com/airbytehq/airbyte/pull/17452) | 🎉 New Source: Qonto [python cdk] | diff --git a/docs/integrations/sources/qualaroo.md b/docs/integrations/sources/qualaroo.md index b6ae702205d..863bb7109d1 100644 --- a/docs/integrations/sources/qualaroo.md +++ b/docs/integrations/sources/qualaroo.md @@ -8,19 +8,19 @@ The Qualaroo source supports Full Refresh syncs. You can choose if this connecto Several output streams are available from this source: -* [Surveys](https://help.qualaroo.com/hc/en-us/articles/201969438-The-REST-Reporting-API) \(Full table\) - * [Responses](https://help.qualaroo.com/hc/en-us/articles/201969438-The-REST-Reporting-API) \(Full table\) +- [Surveys](https://help.qualaroo.com/hc/en-us/articles/201969438-The-REST-Reporting-API) \(Full table\) + - [Responses](https://help.qualaroo.com/hc/en-us/articles/201969438-The-REST-Reporting-API) \(Full table\) If there are more endpoints you'd like Airbyte to support, please [create an issue.](https://github.com/airbytehq/airbyte/issues/new/choose) ### Features -| Feature | Supported? | -| :--- | :--- | -| Full Refresh Sync | Yes | -| Incremental - Append Sync | NO | -| SSL connection | Yes | -| Namespaces | No | +| Feature | Supported? | +| :------------------------ | :--------- | +| Full Refresh Sync | Yes | +| Incremental - Append Sync | NO | +| SSL connection | Yes | +| Namespaces | No | ### Performance considerations @@ -30,20 +30,21 @@ The connector is **not** yet restricted by normal requests limitation. As a resu ### Requirements -* Qualaroo API Key -* Qualaroo API Token +- Qualaroo API Key +- Qualaroo API Token ### Setup guide + + Please read [How to get your APIs Token and Key](https://help.qualaroo.com/hc/en-us/articles/201969438-The-REST-Reporting-API) or you can log in to Qualaroo and visit [Reporting API](https://app.qualaroo.com/account). 
## Changelog | Version | Date | Pull Request | Subject | -|:--------|:-----------|:---------------------------------------------------------|:---------------------------------------------------------------------------------------------------------| -| 0.3.0 | 2023-10-25 | [31070](https://github.com/airbytehq/airbyte/pull/31070) | Migrate to low-code framework | +| :------ | :--------- | :------------------------------------------------------- | :------------------------------------------------------------------------------------------------------- | +| 0.3.0 | 2023-10-25 | [31070](https://github.com/airbytehq/airbyte/pull/31070) | Migrate to low-code framework | | 0.2.0 | 2023-05-24 | [26491](https://github.com/airbytehq/airbyte/pull/26491) | Remove authSpecification from spec.json as OAuth is not supported by Qualaroo + update stream schema | | 0.1.2 | 2022-05-24 | [13121](https://github.com/airbytehq/airbyte/pull/13121) | Fix `start_date` and `survey_ids` schema formatting. Separate source and stream files. Add stream_slices | | 0.1.1 | 2022-05-20 | [13042](https://github.com/airbytehq/airbyte/pull/13042) | Update stream specs | | 0.1.0 | 2021-08-18 | [8623](https://github.com/airbytehq/airbyte/pull/8623) | New source: Qualaroo | - diff --git a/docs/integrations/sources/quickbooks-migrations.md b/docs/integrations/sources/quickbooks-migrations.md index aeee6abf297..48735fefaca 100644 --- a/docs/integrations/sources/quickbooks-migrations.md +++ b/docs/integrations/sources/quickbooks-migrations.md @@ -1,4 +1,5 @@ # QuickBooks Migration Guide ## Upgrading to 3.0.0 + Some fields in `bills`, `credit_memos`, `items`, `refund_receipts`, and `sales_receipts` streams have been changed from `integer` to `number` to fix normalization. You may need to refresh the connection schema for those streams (skipping the reset), and running a sync. Alternatively, you can just run a reset. 
diff --git a/docs/integrations/sources/quickbooks.md b/docs/integrations/sources/quickbooks.md index fa2075cff34..b0bd930ce82 100644 --- a/docs/integrations/sources/quickbooks.md +++ b/docs/integrations/sources/quickbooks.md @@ -103,20 +103,20 @@ This Source is capable of syncing the following [Streams](https://developer.intu ## Changelog -| Version | Date | Pull Request | Subject | -| :------ | :--------- | :------------------------------------------------------- | :------------------------------------------------------------------- | -| `3.0.3` | 2024-03-22 | [36389](https://github.com/airbytehq/airbyte/pull/36389) | Add refresh token updater and add missing properties to streams | -| `3.0.2` | 2024-02-20 | [32236](https://github.com/airbytehq/airbyte/pull/32236) | Small typo in spec correction | -| `3.0.1` | 2023-11-06 | [32236](https://github.com/airbytehq/airbyte/pull/32236) | Upgrade to `airbyte-cdk>=0.52.10` to resolve refresh token issues | -| `3.0.0` | 2023-09-26 | [30770](https://github.com/airbytehq/airbyte/pull/30770) | Update schema to use `number` instead of `integer` | -| `2.0.5` | 2023-09-26 | [30766](https://github.com/airbytehq/airbyte/pull/30766) | Fix improperly named keyword argument | -| `2.0.4` | 2023-06-28 | [27803](https://github.com/airbytehq/airbyte/pull/27803) | Update following state breaking changes | -| `2.0.3` | 2023-06-08 | [27148](https://github.com/airbytehq/airbyte/pull/27148) | Update description and example values of a Start Date in spec.json | -| `2.0.2` | 2023-06-07 | [26722](https://github.com/airbytehq/airbyte/pull/27053) | Update CDK version and adjust authenticator configuration | -| `2.0.1` | 2023-05-28 | [26722](https://github.com/airbytehq/airbyte/pull/26722) | Change datatype for undisclosed amount field in payments | -| `2.0.0` | 2023-04-11 | [25045](https://github.com/airbytehq/airbyte/pull/25045) | Fix datetime format, disable OAuth button in cloud | -| `1.0.0` | 2023-03-20 | [24324](https://github.com/airbytehq/airbyte/pull/24324) | Migrate to Low-Code | -| `0.1.5` | 2022-02-17 | [10346](https://github.com/airbytehq/airbyte/pull/10346) | Update label `Quickbooks` -> `QuickBooks` | -| `0.1.4` | 2021-12-20 | [8960](https://github.com/airbytehq/airbyte/pull/8960) | Update connector fields title/description | -| `0.1.3` | 2021-08-10 | [4986](https://github.com/airbytehq/airbyte/pull/4986) | Using number data type for decimal fields instead string | -| `0.1.2` | 2021-07-06 | [4539](https://github.com/airbytehq/airbyte/pull/4539) | Add `AIRBYTE_ENTRYPOINT` for Kubernetes support | +| Version | Date | Pull Request | Subject | +| :------ | :--------- | :------------------------------------------------------- | :----------------------------------------------------------------- | +| `3.0.3` | 2024-03-22 | [36389](https://github.com/airbytehq/airbyte/pull/36389) | Add refresh token updater and add missing properties to streams | +| `3.0.2` | 2024-02-20 | [32236](https://github.com/airbytehq/airbyte/pull/32236) | Small typo in spec correction | +| `3.0.1` | 2023-11-06 | [32236](https://github.com/airbytehq/airbyte/pull/32236) | Upgrade to `airbyte-cdk>=0.52.10` to resolve refresh token issues | +| `3.0.0` | 2023-09-26 | [30770](https://github.com/airbytehq/airbyte/pull/30770) | Update schema to use `number` instead of `integer` | +| `2.0.5` | 2023-09-26 | [30766](https://github.com/airbytehq/airbyte/pull/30766) | Fix improperly named keyword argument | +| `2.0.4` | 2023-06-28 | [27803](https://github.com/airbytehq/airbyte/pull/27803) | Update 
following state breaking changes | +| `2.0.3` | 2023-06-08 | [27148](https://github.com/airbytehq/airbyte/pull/27148) | Update description and example values of a Start Date in spec.json | +| `2.0.2` | 2023-06-07 | [26722](https://github.com/airbytehq/airbyte/pull/27053) | Update CDK version and adjust authenticator configuration | +| `2.0.1` | 2023-05-28 | [26722](https://github.com/airbytehq/airbyte/pull/26722) | Change datatype for undisclosed amount field in payments | +| `2.0.0` | 2023-04-11 | [25045](https://github.com/airbytehq/airbyte/pull/25045) | Fix datetime format, disable OAuth button in cloud | +| `1.0.0` | 2023-03-20 | [24324](https://github.com/airbytehq/airbyte/pull/24324) | Migrate to Low-Code | +| `0.1.5` | 2022-02-17 | [10346](https://github.com/airbytehq/airbyte/pull/10346) | Update label `Quickbooks` -> `QuickBooks` | +| `0.1.4` | 2021-12-20 | [8960](https://github.com/airbytehq/airbyte/pull/8960) | Update connector fields title/description | +| `0.1.3` | 2021-08-10 | [4986](https://github.com/airbytehq/airbyte/pull/4986) | Using number data type for decimal fields instead string | +| `0.1.2` | 2021-07-06 | [4539](https://github.com/airbytehq/airbyte/pull/4539) | Add `AIRBYTE_ENTRYPOINT` for Kubernetes support | diff --git a/docs/integrations/sources/railz.md b/docs/integrations/sources/railz.md index b3ca0f27c4f..04640a3df4d 100644 --- a/docs/integrations/sources/railz.md +++ b/docs/integrations/sources/railz.md @@ -90,6 +90,6 @@ The Railz connector should gracefully handle Railz API limitations under normal ## Changelog -| Version | Date | Pull Request | Subject | -|:--------|:-----------|:---------------------------------------------------------|:------------------------------------------------------------------------------------------| -| 0.1.1 | 2023-02-16 | [20960](https://github.com/airbytehq/airbyte/pull/20960) | New Source: Railz | +| Version | Date | Pull Request | Subject | +| :------ | :--------- | :------------------------------------------------------- | :---------------- | +| 0.1.1 | 2023-02-16 | [20960](https://github.com/airbytehq/airbyte/pull/20960) | New Source: Railz | diff --git a/docs/integrations/sources/rd-station-marketing.md b/docs/integrations/sources/rd-station-marketing.md index bf649567fca..427224bb471 100644 --- a/docs/integrations/sources/rd-station-marketing.md +++ b/docs/integrations/sources/rd-station-marketing.md @@ -3,35 +3,38 @@ RD Station Marketing is the leading Marketing Automation tool in Latin America. It is a software application that helps your company carry out better campaigns, nurture Leads, generate qualified business opportunities and achieve more results. From social media to email, Landing Pages, Pop-ups, even Automations and Analytics. ## Prerequisites -* An RD Station account -* A callback URL to receive the first account credential (can be done using localhost) -* `client_id` and `client_secret` credentials. Access [this link](https://appstore.rdstation.com/en/publisher) to register a new application and start the authentication flow. + +- An RD Station account +- A callback URL to receive the first account credential (can be done using localhost) +- `client_id` and `client_secret` credentials. Access [this link](https://appstore.rdstation.com/en/publisher) to register a new application and start the authentication flow. 
## Airbyte Open Source -* Start Date -* Client Id -* Client Secret -* Refresh token + +- Start Date +- Client Id +- Client Secret +- Refresh token ## Supported sync modes The RD Station Marketing source connector supports the following [sync modes](https://docs.airbyte.com/cloud/core-concepts#connection-sync-modes): - - Full Refresh - - Incremental (for analytics endpoints) + +- Full Refresh +- Incremental (for analytics endpoints) ## Supported Streams -* conversions (analytics endpoint) -* emails (analytics endpoint) -* funnel (analytics endpoint) -* workflow_emails_statistics (analytics endpoint) -* emails -* embeddables -* fields -* landing_pages -* popups -* segmentations -* workflows +- conversions (analytics endpoint) +- emails (analytics endpoint) +- funnel (analytics endpoint) +- workflow_emails_statistics (analytics endpoint) +- emails +- embeddables +- fields +- landing_pages +- popups +- segmentations +- workflows ## Performance considerations @@ -40,7 +43,7 @@ Each endpoint has its own performance limitations, which also consider the accou ## Changelog | Version | Date | Pull Request | Subject | -|:--------|:-----------|:----------------------------------------------------------|:---------------------------------| +| :------ | :--------- | :-------------------------------------------------------- | :------------------------------- | | 0.1.2 | 2022-07-06 | [28009](https://github.com/airbytehq/airbyte/pull/28009/) | Migrated to advancedOAuth | | 0.1.1 | 2022-11-01 | [18826](https://github.com/airbytehq/airbyte/pull/18826) | Fix stream analytics_conversions | | 0.1.0 | 2022-10-23 | [18348](https://github.com/airbytehq/airbyte/pull/18348) | Initial Release | diff --git a/docs/integrations/sources/recharge.md b/docs/integrations/sources/recharge.md index 307c4256dbf..3e97084e9e3 100644 --- a/docs/integrations/sources/recharge.md +++ b/docs/integrations/sources/recharge.md @@ -74,34 +74,34 @@ The Recharge connector should gracefully handle Recharge API limitations under n ## Changelog -| Version | Date | Pull Request | Subject | -|:--------|:-----------|:---------------------------------------------------------|:------------------------------------------------------------------------------------------| -| 1.2.0 | 2024-03-13 | [35450](https://github.com/airbytehq/airbyte/pull/35450) | Migrated to low-code | -| 1.1.6 | 2024-03-12 | [35982](https://github.com/airbytehq/airbyte/pull/35982) | Added additional `query param` to guarantee the records are in `asc` order | -| 1.1.5 | 2024-02-12 | [35182](https://github.com/airbytehq/airbyte/pull/35182) | Manage dependencies with Poetry. 
| -| 1.1.4 | 2024-02-02 | [34772](https://github.com/airbytehq/airbyte/pull/34772) | Fix airbyte-lib distribution | -| 1.1.3 | 2024-01-31 | [34707](https://github.com/airbytehq/airbyte/pull/34707) | Added the UI toggle `Use 'Orders' Deprecated API` to switch between `deprecated` and `modern` api versions for `Orders` stream | -| 1.1.2 | 2023-11-03 | [32132](https://github.com/airbytehq/airbyte/pull/32132) | Reduced `period in days` value for `Subscriptions` stream, to avoid `504 - Gateway TimeOut` error | -| 1.1.1 | 2023-09-26 | [30782](https://github.com/airbytehq/airbyte/pull/30782) | For the new style pagination, pass only limit along with cursor | -| 1.1.0 | 2023-09-26 | [30756](https://github.com/airbytehq/airbyte/pull/30756) | Fix pagination and slicing | -| 1.0.1 | 2023-08-30 | [29992](https://github.com/airbytehq/airbyte/pull/29992) | Revert for orders stream to use old API version 2021-01 | -| 1.0.0 | 2023-06-22 | [27612](https://github.com/airbytehq/airbyte/pull/27612) | Change data type of the `shopify_variant_id_not_found` field of the `Charges` stream | -| 0.2.10 | 2023-06-20 | [27503](https://github.com/airbytehq/airbyte/pull/27503) | Update API version to 2021-11 | -| 0.2.9 | 2023-04-10 | [25009](https://github.com/airbytehq/airbyte/pull/25009) | Fix owner slicing for `Metafields` stream | -| 0.2.8 | 2023-04-07 | [24990](https://github.com/airbytehq/airbyte/pull/24990) | Add slicing to connector | -| 0.2.7 | 2023-02-13 | [22901](https://github.com/airbytehq/airbyte/pull/22901) | Specified date formatting in specification | -| 0.2.6 | 2023-02-21 | [22473](https://github.com/airbytehq/airbyte/pull/22473) | Use default availability strategy | -| 0.2.5 | 2023-01-27 | [22021](https://github.com/airbytehq/airbyte/pull/22021) | Set `AvailabilityStrategy` for streams explicitly to `None` | -| 0.2.4 | 2022-10-11 | [17822](https://github.com/airbytehq/airbyte/pull/17822) | Do not parse JSON in `should_retry` | -| 0.2.3 | 2022-10-11 | [17822](https://github.com/airbytehq/airbyte/pull/17822) | Do not parse JSON in `should_retry` | -| 0.2.2 | 2022-10-05 | [17608](https://github.com/airbytehq/airbyte/pull/17608) | Skip stream if we receive 403 error | -| 0.2.2 | 2022-09-28 | [17304](https://github.com/airbytehq/airbyte/pull/17304) | Migrate to per-stream state. 
| -| 0.2.1 | 2022-09-23 | [17080](https://github.com/airbytehq/airbyte/pull/17080) | Fix `total_weight` value to be `int` instead of `float` | -| 0.2.0 | 2022-09-21 | [16959](https://github.com/airbytehq/airbyte/pull/16959) | Use TypeTransformer to reliably convert to schema declared data types | -| 0.1.8 | 2022-08-27 | [16045](https://github.com/airbytehq/airbyte/pull/16045) | Force total_weight to be an integer | -| 0.1.7 | 2022-07-24 | [14978](https://github.com/airbytehq/airbyte/pull/14978) | Set `additionalProperties` to True, to guarantee backward cababilities | -| 0.1.6 | 2022-07-21 | [14902](https://github.com/airbytehq/airbyte/pull/14902) | Increased test coverage, fixed broken `charges`, `orders` schemas, added state checkpoint | -| 0.1.5 | 2022-01-26 | [9808](https://github.com/airbytehq/airbyte/pull/9808) | Update connector fields title/description | -| 0.1.4 | 2021-11-05 | [7626](https://github.com/airbytehq/airbyte/pull/7626) | Improve 'backoff' for HTTP requests | -| 0.1.3 | 2021-09-17 | [6149](https://github.com/airbytehq/airbyte/pull/6149) | Update `discount` and `order` schema | -| 0.1.2 | 2021-09-17 | [6149](https://github.com/airbytehq/airbyte/pull/6149) | Change `cursor_field` for Incremental streams | +| Version | Date | Pull Request | Subject | +| :------ | :--------- | :------------------------------------------------------- | :----------------------------------------------------------------------------------------------------------------------------- | +| 1.2.0 | 2024-03-13 | [35450](https://github.com/airbytehq/airbyte/pull/35450) | Migrated to low-code | +| 1.1.6 | 2024-03-12 | [35982](https://github.com/airbytehq/airbyte/pull/35982) | Added additional `query param` to guarantee the records are in `asc` order | +| 1.1.5 | 2024-02-12 | [35182](https://github.com/airbytehq/airbyte/pull/35182) | Manage dependencies with Poetry. 
| +| 1.1.4 | 2024-02-02 | [34772](https://github.com/airbytehq/airbyte/pull/34772) | Fix airbyte-lib distribution | +| 1.1.3 | 2024-01-31 | [34707](https://github.com/airbytehq/airbyte/pull/34707) | Added the UI toggle `Use 'Orders' Deprecated API` to switch between `deprecated` and `modern` api versions for `Orders` stream | +| 1.1.2 | 2023-11-03 | [32132](https://github.com/airbytehq/airbyte/pull/32132) | Reduced `period in days` value for `Subscriptions` stream, to avoid `504 - Gateway TimeOut` error | +| 1.1.1 | 2023-09-26 | [30782](https://github.com/airbytehq/airbyte/pull/30782) | For the new style pagination, pass only limit along with cursor | +| 1.1.0 | 2023-09-26 | [30756](https://github.com/airbytehq/airbyte/pull/30756) | Fix pagination and slicing | +| 1.0.1 | 2023-08-30 | [29992](https://github.com/airbytehq/airbyte/pull/29992) | Revert for orders stream to use old API version 2021-01 | +| 1.0.0 | 2023-06-22 | [27612](https://github.com/airbytehq/airbyte/pull/27612) | Change data type of the `shopify_variant_id_not_found` field of the `Charges` stream | +| 0.2.10 | 2023-06-20 | [27503](https://github.com/airbytehq/airbyte/pull/27503) | Update API version to 2021-11 | +| 0.2.9 | 2023-04-10 | [25009](https://github.com/airbytehq/airbyte/pull/25009) | Fix owner slicing for `Metafields` stream | +| 0.2.8 | 2023-04-07 | [24990](https://github.com/airbytehq/airbyte/pull/24990) | Add slicing to connector | +| 0.2.7 | 2023-02-13 | [22901](https://github.com/airbytehq/airbyte/pull/22901) | Specified date formatting in specification | +| 0.2.6 | 2023-02-21 | [22473](https://github.com/airbytehq/airbyte/pull/22473) | Use default availability strategy | +| 0.2.5 | 2023-01-27 | [22021](https://github.com/airbytehq/airbyte/pull/22021) | Set `AvailabilityStrategy` for streams explicitly to `None` | +| 0.2.4 | 2022-10-11 | [17822](https://github.com/airbytehq/airbyte/pull/17822) | Do not parse JSON in `should_retry` | +| 0.2.3 | 2022-10-11 | [17822](https://github.com/airbytehq/airbyte/pull/17822) | Do not parse JSON in `should_retry` | +| 0.2.2 | 2022-10-05 | [17608](https://github.com/airbytehq/airbyte/pull/17608) | Skip stream if we receive 403 error | +| 0.2.2 | 2022-09-28 | [17304](https://github.com/airbytehq/airbyte/pull/17304) | Migrate to per-stream state. 
| +| 0.2.1 | 2022-09-23 | [17080](https://github.com/airbytehq/airbyte/pull/17080) | Fix `total_weight` value to be `int` instead of `float` | +| 0.2.0 | 2022-09-21 | [16959](https://github.com/airbytehq/airbyte/pull/16959) | Use TypeTransformer to reliably convert to schema declared data types | +| 0.1.8 | 2022-08-27 | [16045](https://github.com/airbytehq/airbyte/pull/16045) | Force total_weight to be an integer | +| 0.1.7 | 2022-07-24 | [14978](https://github.com/airbytehq/airbyte/pull/14978) | Set `additionalProperties` to True, to guarantee backward cababilities | +| 0.1.6 | 2022-07-21 | [14902](https://github.com/airbytehq/airbyte/pull/14902) | Increased test coverage, fixed broken `charges`, `orders` schemas, added state checkpoint | +| 0.1.5 | 2022-01-26 | [9808](https://github.com/airbytehq/airbyte/pull/9808) | Update connector fields title/description | +| 0.1.4 | 2021-11-05 | [7626](https://github.com/airbytehq/airbyte/pull/7626) | Improve 'backoff' for HTTP requests | +| 0.1.3 | 2021-09-17 | [6149](https://github.com/airbytehq/airbyte/pull/6149) | Update `discount` and `order` schema | +| 0.1.2 | 2021-09-17 | [6149](https://github.com/airbytehq/airbyte/pull/6149) | Change `cursor_field` for Incremental streams | diff --git a/docs/integrations/sources/recreation.md b/docs/integrations/sources/recreation.md index e249812ece9..daff9e4e811 100644 --- a/docs/integrations/sources/recreation.md +++ b/docs/integrations/sources/recreation.md @@ -3,35 +3,36 @@ ## Sync overview **Recreation Information Database - RIDB** -RIDB is a part of the Recreation One Stop (R1S) program, -which oversees the operation of Recreation.gov -- a user-friendly, web-based -resource to citizens, offering a single point of access to information about -recreational opportunities nationwide. The website represents an authoritative -source of information and services for millions of visitors to federal lands, +RIDB is a part of the Recreation One Stop (R1S) program, +which oversees the operation of Recreation.gov -- a user-friendly, web-based +resource to citizens, offering a single point of access to information about +recreational opportunities nationwide. The website represents an authoritative +source of information and services for millions of visitors to federal lands, historic sites, museums, waterways and other activities and destinations. This source retrieves data from the [Recreation API](https://ridb.recreation.gov/landing). + ### Output schema This source is capable of syncing the following streams: -* Activities -* Campsites -* Events -* Facilities -* Facility Addresses -* Links -* Media -* Organizations -* Permit Entrances -* Recreation Areas -* Recreation Area Addresses -* Tours +- Activities +- Campsites +- Events +- Facilities +- Facility Addresses +- Links +- Media +- Organizations +- Permit Entrances +- Recreation Areas +- Recreation Area Addresses +- Tours ### Features | Feature | Supported? \(Yes/No\) | Notes | -|:------------------|:----------------------|:------| +| :---------------- | :-------------------- | :---- | | Full Refresh Sync | Yes | | | Incremental Sync | No | | @@ -54,9 +55,9 @@ The following fields are required fields for the connector to work: ## Changelog -| Version | Date | Pull Request | Subject | -|:--------|:-----------|:-------------|:-------------| -| 0.1.3 | 2024-04-19 | [37244](https://github.com/airbytehq/airbyte/pull/37244) | Upgrade to CDK 0.80.0 and manage dependencies with Poetry. 
| -| 0.1.2 | 2024-04-15 | [37244](https://github.com/airbytehq/airbyte/pull/37244) | Base image migration: remove Dockerfile and use the python-connector-base image | -| 0.1.1 | 2024-04-12 | [37244](https://github.com/airbytehq/airbyte/pull/37244) | schema descriptions | -| 0.1.0 | 2022-11-02 | TBA | First Commit | +| Version | Date | Pull Request | Subject | +| :------ | :--------- | :------------------------------------------------------- | :------------------------------------------------------------------------------ | +| 0.1.3 | 2024-04-19 | [37244](https://github.com/airbytehq/airbyte/pull/37244) | Upgrade to CDK 0.80.0 and manage dependencies with Poetry. | +| 0.1.2 | 2024-04-15 | [37244](https://github.com/airbytehq/airbyte/pull/37244) | Base image migration: remove Dockerfile and use the python-connector-base image | +| 0.1.1 | 2024-04-12 | [37244](https://github.com/airbytehq/airbyte/pull/37244) | schema descriptions | +| 0.1.0 | 2022-11-02 | TBA | First Commit | diff --git a/docs/integrations/sources/recruitee.md b/docs/integrations/sources/recruitee.md index 310ba261d00..25e4802857d 100644 --- a/docs/integrations/sources/recruitee.md +++ b/docs/integrations/sources/recruitee.md @@ -24,9 +24,9 @@ You can find your Company ID and find or create an API key within [Recruitee](ht ### For Airbyte OSS: 1. Navigate to the Airbyte Open Source dashboard. -2. Set the name for your source. -4. Enter your `company_id` - Recruitee Company ID. -5. Enter your `api_key` - Recruitee API key. +2. Set the name for your source. +3. Enter your `company_id` - Recruitee Company ID. +4. Enter your `api_key` - Recruitee API key. 5. Click **Set up source**. ## Supported sync modes @@ -42,12 +42,12 @@ The Recruitee source connector supports the following [sync modes](https://docs. ## Supported Streams -* [Candidates](https://docs.recruitee.com/reference/candidates-get) -* [Offers](https://docs.recruitee.com/reference/offers-get) -* [Departments](https://docs.recruitee.com/reference/departments-get) +- [Candidates](https://docs.recruitee.com/reference/candidates-get) +- [Offers](https://docs.recruitee.com/reference/offers-get) +- [Departments](https://docs.recruitee.com/reference/departments-get) ## Changelog -| Version | Date | Pull Request | Subject | -|:--------|:-----------|:---------------------------------------------------------|:--------------------------------------------------| -| 0.1.0 | 2022-10-30 | [18671](https://github.com/airbytehq/airbyte/pull/18671) | New Source: Recruitee | +| Version | Date | Pull Request | Subject | +| :------ | :--------- | :------------------------------------------------------- | :-------------------- | +| 0.1.0 | 2022-10-30 | [18671](https://github.com/airbytehq/airbyte/pull/18671) | New Source: Recruitee | diff --git a/docs/integrations/sources/recurly-migrations.md b/docs/integrations/sources/recurly-migrations.md index 251b70ae1d9..d17e1607aad 100644 --- a/docs/integrations/sources/recurly-migrations.md +++ b/docs/integrations/sources/recurly-migrations.md @@ -13,7 +13,7 @@ Once you have migrated to the new version, we highly recommend all users refresh Airbyte Open Source users with existing connections must manually update the connector image in their local registry before proceeding with the migration. To do so: 1. Select **Settings** in the main navbar. - 1. Select **Sources**. + 1. Select **Sources**. 2. Find Recurly in the list of connectors. 
:::note @@ -24,7 +24,7 @@ You will see two versions listed, the current in-use version and the latest vers ### Update the connector version -1. Select **Sources** in the main navbar. +1. Select **Sources** in the main navbar. 2. Select the instance of the connector you wish to upgrade. :::note @@ -32,23 +32,23 @@ Each instance of the connector must be updated separately. If you have created m ::: 3. Select **Upgrade** - 1. Follow the prompt to confirm you are ready to upgrade to the new version. + 1. Follow the prompt to confirm you are ready to upgrade to the new version. ### Refresh schemas and reset data 1. Select **Connections** in the main navbar. 2. Select the connection(s) affected by the update. -3. Select the **Replication** tab. - 1. Select **Refresh source schema**. - 2. Select **OK**. +3. Select the **Replication** tab. + 1. Select **Refresh source schema**. + 2. Select **OK**. :::note Any detected schema changes will be listed for your review. ::: 4. Select **Save changes** at the bottom of the page. - 1. Ensure the **Reset all streams** option is checked. -5. Select **Save connection**. + 1. Ensure the **Reset all streams** option is checked. +5. Select **Save connection**. :::note This will reset the data in your destination and initiate a fresh sync. diff --git a/docs/integrations/sources/recurly.md b/docs/integrations/sources/recurly.md index 26d9af6bc57..27101b2c8db 100644 --- a/docs/integrations/sources/recurly.md +++ b/docs/integrations/sources/recurly.md @@ -2,7 +2,7 @@ ## Overview -The Recurly source supports _Full Refresh_ as well as _Incremental_ syncs. +The Recurly source supports _Full Refresh_ as well as _Incremental_ syncs. _Full Refresh_ sync means every time a sync is run, Airbyte will copy all rows in the tables and columns you set up for replication into the destination in a new table. _Incremental_ syn means only changed resources are copied from Recurly. For the first run, it will be a Full Refresh sync. @@ -11,37 +11,36 @@ _Incremental_ syn means only changed resources are copied from Recurly. 
For the Several output streams are available from this source: -* [Accounts](https://docs.recurly.com/docs/accounts) -* [Account Notes](https://docs.recurly.com/docs/accounts#account-notes) -* [Account Coupon Redemptions](https://docs.recurly.com/docs/coupons#redemptions) -* [Add Ons](https://docs.recurly.com/docs/plans#add-ons-1) -* [Billing Infos](https://docs.recurly.com/docs/accounts#billing-info) -* [Coupons](https://docs.recurly.com/docs/coupons) -* [Unique Coupons](https://docs.recurly.com/docs/bulk-unique-coupons) -* [Credit Payments](https://docs.recurly.com/docs/invoices) -* [Automated Exports](https://docs.recurly.com/docs/export-overview) -* [Invoices](https://docs.recurly.com/docs/invoices) -* [Measured Units](https://developers.recurly.com/api/v2021-02-25/index.html#tag/measured_unit) -* [Line Items](https://docs.recurly.com/docs/invoices#line-items) -* [Plans](https://docs.recurly.com/docs/plans) -* [Shipping Addresses](https://docs.recurly.com/docs/shipping-addresses) -* [Shipping Methods](https://docs.recurly.com/docs/shipping#shipping-methods) -* [Subscriptions](https://docs.recurly.com/docs/subscriptions) -* [Subscription Changes](https://docs.recurly.com/docs/change-subscription#subscription-changes) -* [Transactions](https://docs.recurly.com/docs/transactions) - +- [Accounts](https://docs.recurly.com/docs/accounts) +- [Account Notes](https://docs.recurly.com/docs/accounts#account-notes) +- [Account Coupon Redemptions](https://docs.recurly.com/docs/coupons#redemptions) +- [Add Ons](https://docs.recurly.com/docs/plans#add-ons-1) +- [Billing Infos](https://docs.recurly.com/docs/accounts#billing-info) +- [Coupons](https://docs.recurly.com/docs/coupons) +- [Unique Coupons](https://docs.recurly.com/docs/bulk-unique-coupons) +- [Credit Payments](https://docs.recurly.com/docs/invoices) +- [Automated Exports](https://docs.recurly.com/docs/export-overview) +- [Invoices](https://docs.recurly.com/docs/invoices) +- [Measured Units](https://developers.recurly.com/api/v2021-02-25/index.html#tag/measured_unit) +- [Line Items](https://docs.recurly.com/docs/invoices#line-items) +- [Plans](https://docs.recurly.com/docs/plans) +- [Shipping Addresses](https://docs.recurly.com/docs/shipping-addresses) +- [Shipping Methods](https://docs.recurly.com/docs/shipping#shipping-methods) +- [Subscriptions](https://docs.recurly.com/docs/subscriptions) +- [Subscription Changes](https://docs.recurly.com/docs/change-subscription#subscription-changes) +- [Transactions](https://docs.recurly.com/docs/transactions) If there are more endpoints you'd like Airbyte to support, please [create an issue.](https://github.com/airbytehq/airbyte/issues/new/choose) ### Features -| Feature | Supported? | -| :--- | :--- | -| Full Refresh Sync | Yes | -| Incremental Sync | Yes | +| Feature | Supported? 
| +| :---------------------------- | :---------- | +| Full Refresh Sync | Yes | +| Incremental Sync | Yes | | Replicate Incremental Deletes | Coming soon | -| SSL connection | Yes | -| Namespaces | No | +| SSL connection | Yes | +| Namespaces | No | ### Performance considerations @@ -51,8 +50,8 @@ The Recurly connector should not run into Recurly API limitations under normal u ### Requirements -* Recurly Account -* Recurly API Key +- Recurly Account +- Recurly API Key ### Setup guide @@ -62,15 +61,15 @@ We recommend creating a restricted, read-only key specifically for Airbyte acces ## CHANGELOG -| Version | Date | Pull Request | Subject | -|:--------|:-----------| :--------------------------------------------------------| :--------------------------------------------------------------------------------------- | -| 1.0.3 | 2024-04-19 | [37246](https://github.com/airbytehq/airbyte/pull/37246) | Updating to 0.80.0 CDK | -| 1.0.2 | 2024-04-12 | [37246](https://github.com/airbytehq/airbyte/pull/37246) | schema descriptions | -| 1.0.1 | 2024-03-05 | [35828](https://github.com/airbytehq/airbyte/pull/35828) | Bump version to unarchive supportLevel in Cloud productionDB | -| 1.0.0 | 2024-03-01 | [35763](https://github.com/airbytehq/airbyte/pull/35763) | Re-introduce updated connector to catalog from archival repo | -| 0.5.0 | 2024-02-22 | [34622](https://github.com/airbytehq/airbyte/pull/34622) | Republish connector using base image/Poetry, update schemas | -| 0.4.1 | 2022-06-10 | [13685](https://github.com/airbytehq/airbyte/pull/13685) | Add state_checkpoint_interval to Recurly stream | -| 0.4.0 | 2022-01-28 | [9866](https://github.com/airbytehq/airbyte/pull/9866) | Revamp Recurly Schema and add more resources | -| 0.3.2 | 2022-01-20 | [8617](https://github.com/airbytehq/airbyte/pull/8617) | Update connector fields title/description | -| 0.3.1 | 2022-01-10 | [9382](https://github.com/airbytehq/airbyte/pull/9382) | Source Recurly: avoid loading all accounts when importing account coupon redemptions | -| 0.3.0 | 2021-12-08 | [8468](https://github.com/airbytehq/airbyte/pull/8468) | Support Incremental Sync Mode | +| Version | Date | Pull Request | Subject | +| :------ | :--------- | :------------------------------------------------------- | :----------------------------------------------------------------------------------- | +| 1.0.3 | 2024-04-19 | [37246](https://github.com/airbytehq/airbyte/pull/37246) | Updating to 0.80.0 CDK | +| 1.0.2 | 2024-04-12 | [37246](https://github.com/airbytehq/airbyte/pull/37246) | schema descriptions | +| 1.0.1 | 2024-03-05 | [35828](https://github.com/airbytehq/airbyte/pull/35828) | Bump version to unarchive supportLevel in Cloud productionDB | +| 1.0.0 | 2024-03-01 | [35763](https://github.com/airbytehq/airbyte/pull/35763) | Re-introduce updated connector to catalog from archival repo | +| 0.5.0 | 2024-02-22 | [34622](https://github.com/airbytehq/airbyte/pull/34622) | Republish connector using base image/Poetry, update schemas | +| 0.4.1 | 2022-06-10 | [13685](https://github.com/airbytehq/airbyte/pull/13685) | Add state_checkpoint_interval to Recurly stream | +| 0.4.0 | 2022-01-28 | [9866](https://github.com/airbytehq/airbyte/pull/9866) | Revamp Recurly Schema and add more resources | +| 0.3.2 | 2022-01-20 | [8617](https://github.com/airbytehq/airbyte/pull/8617) | Update connector fields title/description | +| 0.3.1 | 2022-01-10 | [9382](https://github.com/airbytehq/airbyte/pull/9382) | Source Recurly: avoid loading all accounts when importing account coupon 
redemptions | +| 0.3.0 | 2021-12-08 | [8468](https://github.com/airbytehq/airbyte/pull/8468) | Support Incremental Sync Mode | diff --git a/docs/integrations/sources/redshift.md b/docs/integrations/sources/redshift.md index 4594602feb2..625b9bbb6ec 100644 --- a/docs/integrations/sources/redshift.md +++ b/docs/integrations/sources/redshift.md @@ -55,8 +55,8 @@ All Redshift connections are encrypted using SSL ## Changelog | Version | Date | Pull Request | Subject | -|:--------|:-----------| :------------------------------------------------------- |:------------------------------------------------------------------------------------------------------------------------------------------| -| 0.5.2 | 2024-02-13 | [35223](https://github.com/airbytehq/airbyte/pull/35223) | Adopt CDK 0.20.4 | +| :------ | :--------- | :------------------------------------------------------- | :---------------------------------------------------------------------------------------------------------------------------------------- | +| 0.5.2 | 2024-02-13 | [35223](https://github.com/airbytehq/airbyte/pull/35223) | Adopt CDK 0.20.4 | | 0.5.1 | 2024-01-24 | [34453](https://github.com/airbytehq/airbyte/pull/34453) | bump CDK version | | 0.5.0 | 2023-12-18 | [33484](https://github.com/airbytehq/airbyte/pull/33484) | Remove LEGACY state | | (none) | 2023-11-17 | [32616](https://github.com/airbytehq/airbyte/pull/32616) | Improve timestamptz handling | diff --git a/docs/integrations/sources/retently.md b/docs/integrations/sources/retently.md index a90ef5a2b2f..b06fbcc5252 100644 --- a/docs/integrations/sources/retently.md +++ b/docs/integrations/sources/retently.md @@ -44,17 +44,17 @@ OAuth application is [here](https://app.retently.com/settings/oauth). ## Changelog -| Version | Date | Pull Request | Subject | -| :------ | :--------- | :------------------------------------------------------- | :--------------------------------------------------- | -| 0.2.4 | 2024-04-19 | [37248](https://github.com/airbytehq/airbyte/pull/37248) | Updating to 0.80.0 CDK | -| 0.2.3 | 2024-04-18 | [37248](https://github.com/airbytehq/airbyte/pull/37248) | Manage dependencies with Poetry. 
| -| 0.2.2 | 2024-04-15 | [37248](https://github.com/airbytehq/airbyte/pull/37248) | Base image migration: remove Dockerfile and use the python-connector-base image | -| 0.2.1 | 2024-04-12 | [37248](https://github.com/airbytehq/airbyte/pull/37248) | schema descriptions | -| 0.2.0 | 2023-08-03 | [29040](https://github.com/airbytehq/airbyte/pull/29040) | Migrate to Low-Code CDK | -| 0.1.6 | 2023-05-10 | [25714](https://github.com/airbytehq/airbyte/pull/25714) | Fix invalid json schema for nps stream | -| 0.1.5 | 2023-05-08 | [25900](https://github.com/airbytehq/airbyte/pull/25900) | Fix integration tests | -| 0.1.4 | 2023-05-08 | [25900](https://github.com/airbytehq/airbyte/pull/25900) | Fix integration tests | -| 0.1.3 | 2022-11-15 | [19456](https://github.com/airbytehq/airbyte/pull/19456) | Add campaign, feedback, outbox and templates streams | -| 0.1.2 | 2021-12-28 | [9045](https://github.com/airbytehq/airbyte/pull/9045) | Update titles and descriptions | -| 0.1.1 | 2021-12-06 | [8043](https://github.com/airbytehq/airbyte/pull/8043) | 🎉 Source Retently: add OAuth 2.0 | -| 0.1.0 | 2021-11-02 | [6966](https://github.com/airbytehq/airbyte/pull/6966) | 🎉 New Source: Retently | +| Version | Date | Pull Request | Subject | +| :------ | :--------- | :------------------------------------------------------- | :------------------------------------------------------------------------------ | +| 0.2.4 | 2024-04-19 | [37248](https://github.com/airbytehq/airbyte/pull/37248) | Updating to 0.80.0 CDK | +| 0.2.3 | 2024-04-18 | [37248](https://github.com/airbytehq/airbyte/pull/37248) | Manage dependencies with Poetry. | +| 0.2.2 | 2024-04-15 | [37248](https://github.com/airbytehq/airbyte/pull/37248) | Base image migration: remove Dockerfile and use the python-connector-base image | +| 0.2.1 | 2024-04-12 | [37248](https://github.com/airbytehq/airbyte/pull/37248) | schema descriptions | +| 0.2.0 | 2023-08-03 | [29040](https://github.com/airbytehq/airbyte/pull/29040) | Migrate to Low-Code CDK | +| 0.1.6 | 2023-05-10 | [25714](https://github.com/airbytehq/airbyte/pull/25714) | Fix invalid json schema for nps stream | +| 0.1.5 | 2023-05-08 | [25900](https://github.com/airbytehq/airbyte/pull/25900) | Fix integration tests | +| 0.1.4 | 2023-05-08 | [25900](https://github.com/airbytehq/airbyte/pull/25900) | Fix integration tests | +| 0.1.3 | 2022-11-15 | [19456](https://github.com/airbytehq/airbyte/pull/19456) | Add campaign, feedback, outbox and templates streams | +| 0.1.2 | 2021-12-28 | [9045](https://github.com/airbytehq/airbyte/pull/9045) | Update titles and descriptions | +| 0.1.1 | 2021-12-06 | [8043](https://github.com/airbytehq/airbyte/pull/8043) | 🎉 Source Retently: add OAuth 2.0 | +| 0.1.0 | 2021-11-02 | [6966](https://github.com/airbytehq/airbyte/pull/6966) | 🎉 New Source: Retently | diff --git a/docs/integrations/sources/ringcentral.md b/docs/integrations/sources/ringcentral.md index 59087df0818..dbdcaea966d 100644 --- a/docs/integrations/sources/ringcentral.md +++ b/docs/integrations/sources/ringcentral.md @@ -1,7 +1,6 @@ # RingCentral -This page contains the setup guide and reference information for the [RingCentral](https://developers.ringcentral.com/api-reference/ -) source +This page contains the setup guide and reference information for the [RingCentral](https://developers.ringcentral.com/api-reference/) source ## Prerequisites @@ -14,11 +13,11 @@ Auth Token (which acts as bearer token), account id and extension id are mandate - Get your bearer token by following auth section (ref - 
https://developers.ringcentral.com/api-reference/authentication) - Setup params (All params are required) - Available params - - auth_token: Recieved by following https://developers.ringcentral.com/api-reference/authentication - - account_id: Could be seen at response to basic api call to an endpoint with ~ operator. \ - \ Example- (https://platform.devtest.ringcentral.com/restapi/v1.0/account/~/extension/~/business-hours) - - extension_id: Could be seen at response to basic api call to an endpoint with ~ operator. \ - \ Example- (https://platform.devtest.ringcentral.com/restapi/v1.0/account/~/extension/~/business-hours) + - auth_token: Received by following https://developers.ringcentral.com/api-reference/authentication + - account_id: Can be found in the response to a basic API call to an endpoint with the ~ operator. \ + \ Example- (https://platform.devtest.ringcentral.com/restapi/v1.0/account/~/extension/~/business-hours) + - extension_id: Can be found in the response to a basic API call to an endpoint with the ~ operator. \ + \ Example- (https://platform.devtest.ringcentral.com/restapi/v1.0/account/~/extension/~/business-hours) ## Step 2: Set up the RingCentral connector in Airbyte @@ -35,7 +34,6 @@ Auth Token (which acts as bearer token), account id and extension id are mandate 1. Navigate to the Airbyte Open Source dashboard. 2. Set the name for your source. 3. Enter your `auth_token, account_id, extension_id`. -5. Click **Set up source**. +4. Click **Set up source**. ## Supported sync modes @@ -66,7 +65,6 @@ The RingCentral source connector supports the following [sync modes](https://doc - ivr_prompts - fax_cover - ## API method example GET https://platform.devtest.ringcentral.com/restapi/v1.0/account/~/extension/~/business-hours @@ -77,6 +75,6 @@ RingCentral [API reference](https://platform.devtest.ringcentral.com/restapi/v1.
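As a hedged illustration of the `GET .../business-hours` example above — not an official snippet; the `requests` library and the placeholder token are assumptions — the `~` endpoint can be called directly with the bearer token to see the resolved account and extension ids in the response:

```python
# Minimal sketch: call the business-hours endpoint with the ~ operator and a
# bearer token, then inspect the payload for your account_id and extension_id.
import requests

AUTH_TOKEN = "..."  # placeholder: bearer token from the RingCentral auth flow
URL = "https://platform.devtest.ringcentral.com/restapi/v1.0/account/~/extension/~/business-hours"

response = requests.get(URL, headers={"Authorization": f"Bearer {AUTH_TOKEN}"})
response.raise_for_status()
print(response.json())  # the resolved account and extension ids appear in the returned resource
```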
## Changelog -| Version | Date | Pull Request | Subject | -| :------ | :--------- | :----------------------------------------------------- | :------------- | -| 0.1.0 | 2023-05-10 | [Init](https://github.com/airbytehq/airbyte/pull/)| Initial commit | \ No newline at end of file +| Version | Date | Pull Request | Subject | +| :------ | :--------- | :------------------------------------------------- | :------------- | +| 0.1.0 | 2023-05-10 | [Init](https://github.com/airbytehq/airbyte/pull/) | Initial commit | diff --git a/docs/integrations/sources/rki-covid.md b/docs/integrations/sources/rki-covid.md index f9e695aab44..833793b7cc9 100644 --- a/docs/integrations/sources/rki-covid.md +++ b/docs/integrations/sources/rki-covid.md @@ -8,32 +8,32 @@ This source can sync data for the [Robert Koch-Institut Covid API](https://api.c This Source is capable of syncing the following core Streams (only for Germany cases): -* Germany -* Germany by age and groups -* Germany cases by days -* Germany incidences by days -* Germany deaths by days -* Germany recovered by days -* Germany frozen-incidence by days -* Germany hospitalization by days +- Germany +- Germany by age and groups +- Germany cases by days +- Germany incidences by days +- Germany deaths by days +- Germany recovered by days +- Germany frozen-incidence by days +- Germany hospitalization by days ### Data type mapping | Integration Type | Airbyte Type | Notes | -| :--- | :--- | :--- | -| `string` | `string` | | -| `integer` | `integer` | | -| `number` | `number` | | -| `array` | `array` | | -| `object` | `object` | | +| :--------------- | :----------- | :---- | +| `string` | `string` | | +| `integer` | `integer` | | +| `number` | `number` | | +| `array` | `array` | | +| `object` | `object` | | ### Features -| Feature | Supported?\(Yes/No\) | Notes | -| :--- | :--- | :--- | -| Full Refresh Sync | Yes | | -| Incremental Sync | Yes | | -| Namespaces | No | | +| Feature | Supported?\(Yes/No\) | Notes | +| :---------------- | :------------------- | :---- | +| Full Refresh Sync | Yes | | +| Incremental Sync | Yes | | +| Namespaces | No | | ### Performance considerations @@ -43,7 +43,7 @@ The RKI Covid connector should not run into RKI Covid API limitations under norm ### Requirements -* Start Date +- Start Date ### Setup guide @@ -51,8 +51,8 @@ Select start date ## Changelog -| Version | Date | Pull Request | Subject | -| :--- | :--- | :--- | :--- | -| 0.1.2 | 2022-08-25 | [15667](https://github.com/airbytehq/airbyte/pull/15667) | Add message when no data available | -| 0.1.1 | 2022-05-30 | [11732](https://github.com/airbytehq/airbyte/pull/11732) | Fix docs | -| 0.1.0 | 2022-05-30 | [11732](https://github.com/airbytehq/airbyte/pull/11732) | Initial Release | +| Version | Date | Pull Request | Subject | +| :------ | :--------- | :------------------------------------------------------- | :--------------------------------- | +| 0.1.2 | 2022-08-25 | [15667](https://github.com/airbytehq/airbyte/pull/15667) | Add message when no data available | +| 0.1.1 | 2022-05-30 | [11732](https://github.com/airbytehq/airbyte/pull/11732) | Fix docs | +| 0.1.0 | 2022-05-30 | [11732](https://github.com/airbytehq/airbyte/pull/11732) | Initial Release | diff --git a/docs/integrations/sources/rocket-chat.md b/docs/integrations/sources/rocket-chat.md index 030cc6aad43..a8d131bd0e2 100644 --- a/docs/integrations/sources/rocket-chat.md +++ b/docs/integrations/sources/rocket-chat.md @@ -6,19 +6,19 @@ This source can sync data from the [Rocket.chat 
API](https://developer.rocket.ch ## This Source Supports the Following Streams -* teams -* rooms -* channels -* roles -* subscriptions -* users +- teams +- rooms +- channels +- roles +- subscriptions +- users ### Features | Feature | Supported?\(Yes/No\) | Notes | -| :--* | :--* | :--* | -| Full Refresh Sync | Yes | | -| Incremental Sync | No | | +| :---------------- | :------------------- | :---- | +| Full Refresh Sync | Yes | | +| Incremental Sync | No | | ### Performance considerations @@ -37,5 +37,5 @@ You need to setup a personal access token within the Rocket.chat workspace, see ## Changelog | Version | Date | Pull Request | Subject | -| :------ | :--------- | :-------------------------------------------------------- | :----------------------------------------- | +| :------ | :--------- | :-------------------------------------------------------- | :-------------------------------------------- | | 0.1.0 | 2022-10-29 | [#18635](https://github.com/airbytehq/airbyte/pull/18635) | 🎉 New Source: Rocket.chat API [low-code CDK] | diff --git a/docs/integrations/sources/rss-migrations.md b/docs/integrations/sources/rss-migrations.md index 329ad553a71..9759a020645 100644 --- a/docs/integrations/sources/rss-migrations.md +++ b/docs/integrations/sources/rss-migrations.md @@ -1,6 +1,7 @@ # Rss Migration Guide ## Upgrading to 1.0.0 + We're continuously striving to enhance the quality and reliability of our connectors at Airbyte. As part of our commitment to delivering exceptional service, we are transitioning our RSS source from the Python Connector Development Kit (CDK) -to our new low-code framework improving maintainability and reliability of the c Clearing your data is required for the affected streams in order to continue syncing successfully. To clear your data for the affected streams, follow the steps below: 1. Select **Connections** in the main navbar and select the connection(s) affected by the update. -2. Select the **Schema** tab. - 1. Select **Refresh source schema** to bring in any schema changes. Any detected schema changes will be listed for your review. - 2. Select **OK** to approve changes. -3. Select **Save changes** at the bottom of the page. - 1. Ensure the **Clear affected streams** option is checked to ensure your streams continue syncing successfully with the new schema. -4. Select **Save connection**. +2. Select the **Schema** tab. + 1. Select **Refresh source schema** to bring in any schema changes. Any detected schema changes will be listed for your review. + 2. Select **OK** to approve changes. +3. Select **Save changes** at the bottom of the page. + 1. Ensure the **Clear affected streams** option is checked to ensure your streams continue syncing successfully with the new schema. +4. Select **Save connection**. This will clear the data in your destination for the subset of streams with schema changes. After the clear succeeds, trigger a sync by clicking **Sync Now**. For more information on clearing your data in Airbyte, see [this page](https://docs.airbyte.com/operator-guides/reset). diff --git a/docs/integrations/sources/rss.md b/docs/integrations/sources/rss.md index e68a7c97b99..2b67980fe31 100644 --- a/docs/integrations/sources/rss.md +++ b/docs/integrations/sources/rss.md @@ -7,20 +7,21 @@ The RSS source allows you to read data from any individual RSS feed. #### Output schema This source is capable of syncing the following streams: -* `items` - * Provides stats about specific RSS items.
- * Most fields are simply kept from RSS items as strings if present (`title`, `link`, `description`, `author`, `category`, `comments`, `enclosure`, `guid`). - * The date field is handled differently. It's transformed into a UTC datetime in a `published` field for easier use in data warehouses and other destinations. - * The RSS feed you're subscribing to must have a valid `pubDate` field for each item for incremental syncs to work properly. - * Since `guid` is not a required field, there is no primary key for the feed, only a cursor on the published date. + +- `items` + - Provides stats about specific RSS items. + - Most fields are simply kept from RSS items as strings if present (`title`, `link`, `description`, `author`, `category`, `comments`, `enclosure`, `guid`). + - The date field is handled differently. It's transformed into a UTC datetime in a `published` field for easier use in data warehouses and other destinations. + - The RSS feed you're subscribing to must have a valid `pubDate` field for each item for incremental syncs to work properly. + - Since `guid` is not a required field, there is no primary key for the feed, only a cursor on the published date. #### Features -| Feature | Supported? | -| :--- | :--- | -| Full Refresh Sync | Yes | -| Incremental - Append Sync | Yes | -| Namespaces | No | +| Feature | Supported? | +| :------------------------ | :--------- | +| Full Refresh Sync | Yes | +| Incremental - Append Sync | Yes | +| Namespaces | No | ### Requirements / Setup Guide @@ -32,8 +33,8 @@ None ## Changelog -| Version | Date | Pull Request | Subject | -| :------ | :---------- | :------------------------------------------------------- | :----------------------------- | -| 1.0.1 | 2024-04-30 | [37535](https://github.com/airbytehq/airbyte/pull/37535) | Fix incremental sync | -| 1.0.0 | 2024-04-20 | [36418](https://github.com/airbytehq/airbyte/pull/36418) | Migrate python cdk to low code | -| 0.1.0 | 2022-10-12 | [18838](https://github.com/airbytehq/airbyte/pull/18838) | Initial release supporting RSS | +| Version | Date | Pull Request | Subject | +| :------ | :--------- | :------------------------------------------------------- | :----------------------------- | +| 1.0.1 | 2024-04-30 | [37535](https://github.com/airbytehq/airbyte/pull/37535) | Fix incremental sync | +| 1.0.0 | 2024-04-20 | [36418](https://github.com/airbytehq/airbyte/pull/36418) | Migrate python cdk to low code | +| 0.1.0 | 2022-10-12 | [18838](https://github.com/airbytehq/airbyte/pull/18838) | Initial release supporting RSS | diff --git a/docs/integrations/sources/s3-migrations.md b/docs/integrations/sources/s3-migrations.md index 18e6bdb1194..5c80e2b3730 100644 --- a/docs/integrations/sources/s3-migrations.md +++ b/docs/integrations/sources/s3-migrations.md @@ -6,23 +6,24 @@ Note: This change is only breaking if you created S3 sources using the API and d Following 4.0.0 config change, we are removing `streams.*.file_type` field which was redundant with `streams.*.format`. This is a breaking change as `format` now needs to be required. Given that the UI would always populate `format`, only users creating actors using the API and not providing `format` are be affected. In order to fix that, simply set `streams.*.format` to `{"filetype": }`. - ## Upgrading to 4.0.0 We have revamped the implementation to use the File-Based CDK. The goal is to increase resiliency and reduce development time. 
Here are the breaking changes: -* [CSV] Mapping of type `array` and `object`: before, they were mapped as `large_string` and hence casted as strings. Given the new changes, if `array` or `object` is specified, the value will be casted as `array` and `object` respectively. -* [CSV] `decimal_point` option is deprecated: It is not possible anymore to use another character than `.` to separate the integer part from non-integer part. Given that the float is format with another character than this, it will be considered as a string. -* [Parquet] `columns` option is deprecated: You can use Airbyte column selection in order to have the same behavior. We don't expect it, but this could have impact on the performance as payload could be bigger. + +- [CSV] Mapping of type `array` and `object`: before, they were mapped as `large_string` and hence cast as strings. Given the new changes, if `array` or `object` is specified, the value will be cast as `array` or `object`, respectively. +- [CSV] `decimal_point` option is deprecated: It is no longer possible to use a character other than `.` to separate the integer part from the decimal part. If a float is formatted with any other separator, it will be treated as a string. +- [Parquet] `columns` option is deprecated: You can use Airbyte column selection to get the same behavior. We don't expect it, but this could have an impact on performance, as the payload could be bigger. Given that you are not affected by the above, your migration should proceed automatically once you run a sync with the new connector. To leverage this: -* Upgrade source-s3 to use v4.0.0 -* Run at least one sync for all your source-s3 connectors - * Migration will be performed and an AirbyteControlMessage will be emitted to the platform so that the migrated config is persisted + +- Upgrade source-s3 to use v4.0.0 +- Run at least one sync for all your source-s3 connectors + - Migration will be performed and an AirbyteControlMessage will be emitted to the platform so that the migrated config is persisted If a user tries to modify the config after source-s3 is upgraded to v4.0.0 and before there was a sync or a periodic discover check, they will have to update the already provided fields manually. To avoid this, a sync can be executed on any of the connections for this source. Other than breaking changes, we have changed the UI from which the user configures the source: -* You can now configure multiple streams by clicking on `Add` under `Streams`. -* `Output Stream Name` has been renamed to `Name` when configuring a specific stream. -* `Pattern of files to replicate` field has been renamed `Globs` under the stream configuration. +- You can now configure multiple streams by clicking on `Add` under `Streams`. +- `Output Stream Name` has been renamed to `Name` when configuring a specific stream. +- `Pattern of files to replicate` field has been renamed `Globs` under the stream configuration. diff --git a/docs/integrations/sources/s3.md b/docs/integrations/sources/s3.md index e65a80df0d4..594a3baf55e 100644 --- a/docs/integrations/sources/s3.md +++ b/docs/integrations/sources/s3.md @@ -51,14 +51,17 @@ At this time, object-level permissions alone are not sufficient to successfully #### Option 1: Using an IAM Role (Most secure) + :::note Currently this feature is available only for the users in a Sales Assist workflow. Please contact your Solutions Engineer if you are interested in using this. ::: + 1. In the IAM dashboard, click **Roles**, then **Create role**.
-2. Choose the appropriate trust entity and attach the policy you created. +2. Choose the appropriate trust entity and attach the policy you created. 3. Set up a trust relationship for the role. For example for **AWS account** trusted entity use default AWS account on your instance (it will be used to assume role). To use **External ID** set it to environment variables as `export AWS_ASSUME_ROLE_EXTERNAL_ID="{your-external-id}"`. Edit the trust relationship policy to reflect this: + ``` { "Version": "2012-10-17", @@ -77,11 +80,14 @@ Currently this feature is available only for the users in a Sales Assist workflo } ] } -``` +``` + -2. Choose the **AWS account** trusted entity type. + +2. Choose the **AWS account** trusted entity type. 3. Set up a trust relationship for the role. This allows the Airbyte instance's AWS account to assume this role. You will also need to specify an external ID, which is a secret key that the trusting service (Airbyte) and the trusted role (the role you're creating) both know. This ID is used to prevent the "confused deputy" problem. The External ID should be your Airbyte workspace ID, which can be found in the URL of your workspace page. Edit the trust relationship policy to include the external ID: + ``` { "Version": "2012-10-17", @@ -101,7 +107,9 @@ Currently this feature is available only for the users in a Sales Assist workflo ] } ``` + + 4. Complete the role creation and note the Role ARN. #### Option 2: Using an IAM User @@ -115,7 +123,7 @@ Currently this feature is available only for the users in a Sales Assist workflo Your `Secret Access Key` will only be visible once upon creation. Be sure to copy and store it securely for future use. ::: -For more information on managing your access keys, please refer to the +For more information on managing your access keys, please refer to the [official AWS documentation](https://docs.aws.amazon.com/IAM/latest/UserGuide/id_credentials_access-keys.html). ### Step 2: Set up the Amazon S3 connector in Airbyte @@ -126,14 +134,14 @@ For more information on managing your access keys, please refer to the 4. Enter the name of the **Bucket** containing your files to replicate. 5. Add a stream 1. Choose the **File Format** - 2. In the **Format** box, use the dropdown menu to select the format of the files you'd like to replicate. The supported formats are **CSV**, **Parquet**, **Avro** and **JSONL**. Toggling the **Optional fields** button within the **Format** box will allow you to enter additional configurations based on the selected format. For a detailed breakdown of these settings, refer to the [File Format section](#file-format-settings) below. + 2. In the **Format** box, use the dropdown menu to select the format of the files you'd like to replicate. The supported formats are **CSV**, **Parquet**, **Avro** and **JSONL**. Toggling the **Optional fields** button within the **Format** box will allow you to enter additional configurations based on the selected format. For a detailed breakdown of these settings, refer to the [File Format section](#file-format-settings) below. 3. Give a **Name** to the stream 4. (Optional) Enter the **Globs** which dictates which files to be synced. This is a regular expression that allows Airbyte to pattern match the specific files to replicate. If you are replicating all the files within your bucket, use `**` as the pattern. For more precise pattern matching options, refer to the [Globs section](#globs) below. 5. (Optional) Modify the **Days To Sync If History Is Full** value. 
This gives you control of the lookback window that we will use to determine which files to sync if the state history is full. Details are in the [State section](#state) below. 6. (Optional) If you want to enforce a specific schema, you can enter a **Input schema**. By default, this value is set to `{}` and will automatically infer the schema from the file\(s\) you are replicating. For details on providing a custom schema, refer to the [User Schema section](#user-schema). 7. (Optional) Select the **Schemaless** option, to skip all validation of the records against a schema. If this option is selected the schema will be `{"data": "object"}` and all downstream data will be nested in a "data" field. This is a good option if the schema of your records changes frequently. 8. (Optional) Select a **Validation Policy** to tell Airbyte how to handle records that do not match the schema. You may choose to emit the record anyway (fields that aren't present in the schema may not arrive at the destination), skip the record altogether, or wait until the next discovery (which will happen in the next 24 hours). -6. **To authenticate your private bucket**: +6. **To authenticate your private bucket**: - If using an IAM role, enter the **AWS Role ARN**. - If using IAM user credentials, fill the **AWS Access Key ID** and **AWS Secret Access Key** fields with the appropriate credentials. @@ -221,7 +229,7 @@ As you can probably tell, there are many ways to achieve the same goal with path ## State -To perform incremental syncs, Airbyte syncs files from oldest to newest. Each file that's synced (up to 10,000 files) will be added as an entry in a "history" section of the connection's state message. +To perform incremental syncs, Airbyte syncs files from oldest to newest. Each file that's synced (up to 10,000 files) will be added as an entry in a "history" section of the connection's state message. Once history is full, we drop the older messages out of the file, and only read files that were last modified between the date of the newest file in history and `Days to Sync if History is Full` days prior. ## User Schema @@ -278,7 +286,7 @@ Product,Description,Price Jeans,"Navy Blue, Bootcut, 34\"",49.99 ``` -The backslash (`\`) is used directly before the second double quote (`"`) to indicate that it is _not_ the closing quote for the field, but rather a literal double quote character that should be included in the value (in this example, denoting the size of the jeans in inches: `34"` ). +The backslash (`\`) is used directly before the second double quote (`"`) to indicate that it is _not_ the closing quote for the field, but rather a literal double quote character that should be included in the value (in this example, denoting the size of the jeans in inches: `34"` ). Leaving this field blank (default option) will disallow escaping. @@ -290,7 +298,6 @@ Leaving this field blank (default option) will disallow escaping. - **Strings Can Be Null**: Whether strings can be interpreted as null values. If true, strings that match the null_values set will be interpreted as null. If false, strings that match the null_values set will be interpreted as the string itself. - **True Values**: A set of case-sensitive strings that should be interpreted as true values. - ### Parquet Apache Parquet is a column-oriented data storage format of the Apache Hadoop ecosystem. It provides efficient data compression and encoding schemes with enhanced performance to handle complex data in bulk. 
At the moment, partitioned parquet datasets are unsupported. The following settings are available: @@ -300,6 +307,7 @@ Apache Parquet is a column-oriented data storage format of the Apache Hadoop eco ### Avro The Avro parser uses the [Fastavro library](https://fastavro.readthedocs.io/en/latest/). The following settings are available: + - **Convert Double Fields to Strings**: Whether to convert double fields to strings. This is recommended if you have decimal numbers with a high degree of precision because there can be a loss precision when handling floating point numbers. ### JSONL @@ -327,7 +335,7 @@ This connector utilizes the open source [Unstructured](https://unstructured-io.g ## Changelog | Version | Date | Pull Request | Subject | -|:--------|:-----------|:----------------------------------------------------------------------------------------------------------------|:---------------------------------------------------------------------------------------------------------------------| +| :------ | :--------- | :-------------------------------------------------------------------------------------------------------------- | :------------------------------------------------------------------------------------------------------------------- | | 4.5.13 | 2024-05-03 | [37776](https://github.com/airbytehq/airbyte/pull/37776) | Update `airbyte-cdk` to fix the `discovery` command issue | | 4.5.12 | 2024-04-11 | [37001](https://github.com/airbytehq/airbyte/pull/37001) | Update airbyte-cdk to flush print buffer for every message | | 4.5.11 | 2024-03-14 | [36160](https://github.com/airbytehq/airbyte/pull/36160) | Bump python-cdk version to include CSV tab delimiter fix | @@ -357,7 +365,7 @@ This connector utilizes the open source [Unstructured](https://unstructured-io.g | 4.1.1 | 2023-10-19 | [31601](https://github.com/airbytehq/airbyte/pull/31601) | Base image migration: remove Dockerfile and use the python-connector-base image | | 4.1.0 | 2023-10-17 | [31340](https://github.com/airbytehq/airbyte/pull/31340) | Add reading files inside zip archive | | 4.0.5 | 2023-10-16 | [31209](https://github.com/airbytehq/airbyte/pull/31209) | Add experimental Markdown/PDF/Docx file format | -| 4.0.4 | 2023-09-18 | [30476](https://github.com/airbytehq/airbyte/pull/30476) | Remove streams.*.file_type from source-s3 configuration | +| 4.0.4 | 2023-09-18 | [30476](https://github.com/airbytehq/airbyte/pull/30476) | Remove streams.\*.file_type from source-s3 configuration | | 4.0.3 | 2023-09-13 | [30387](https://github.com/airbytehq/airbyte/pull/30387) | Bump Airbyte-CDK version to improve messages for record parse errors | | 4.0.2 | 2023-09-07 | [28639](https://github.com/airbytehq/airbyte/pull/28639) | Always show S3 Key fields | | 4.0.1 | 2023-09-06 | [30217](https://github.com/airbytehq/airbyte/pull/30217) | Migrate inference error to config errors and avoir sentry alerts | diff --git a/docs/integrations/sources/salesforce.md b/docs/integrations/sources/salesforce.md index c3347c420f6..b33509a7764 100644 --- a/docs/integrations/sources/salesforce.md +++ b/docs/integrations/sources/salesforce.md @@ -14,7 +14,6 @@ This page contains the setup guide and reference information for the [Salesforce - (For Airbyte Open Source) Salesforce [OAuth](https://help.salesforce.com/s/articleView?id=sf.remoteaccess_oauth_tokens_scopes.htm&type=5) credentials - :::tip To use this connector, you'll need at least the Enterprise edition of Salesforce or the Professional Edition with API access purchased as an add-on. 
Reference the [Salesforce docs about API access](https://help.salesforce.com/s/articleView?id=000385436&type=1) for more information. @@ -120,7 +119,6 @@ Airbyte allows exporting all available Salesforce objects dynamically based on: - If the authenticated Salesforce user has the Role and Permissions to read and fetch objects - If the salesforce object has the queryable property set to true. Airbyte can only fetch objects which are queryable. If you don’t see an object available via Airbyte, and it is queryable, check if it is API-accessible to the Salesforce user you authenticated with. - ## Limitations & Troubleshooting
    @@ -135,6 +133,7 @@ Expand to see details about Salesforce connector limitations and troubleshooting The Salesforce connector is restricted by Salesforce’s [Daily Rate Limits](https://developer.salesforce.com/docs/atlas.en-us.salesforce_app_limits_cheatsheet.meta/salesforce_app_limits_cheatsheet/salesforce_app_limits_platform_api.htm). The connector syncs data until it hits the daily rate limit, then ends the sync early with success status, and starts the next sync from where it left off. Note that picking up from where it ends will work only for incremental sync, which is why we recommend using the [Incremental Sync - Append + Deduped](https://docs.airbyte.com/understanding-airbyte/connections/incremental-append-deduped) sync mode. #### A note on the BULK API vs REST API and their limitations + ## Syncing Formula Fields The Salesforce connector syncs formula field outputs from Salesforce. If the formula of a field changes in Salesforce and no other field on the record is updated, you will need to reset the stream and sync a historical backfill to pull in all the updated values of the field. @@ -149,26 +148,26 @@ Salesforce allows extracting data using either the [BULK API](https://developer. - The Salesforce object has columns which are unsupported by the BULK API, like columns with a `base64` or `complexvalue` type - The Salesforce object is not supported by BULK API. In this case we sync the objects via the REST API which will occasionally cost more of your API quota. This includes the following objects: - - AcceptedEventRelation - - Attachment - - CaseStatus - - ContractStatus - - DeclinedEventRelation - - FieldSecurityClassification - - KnowledgeArticle - - KnowledgeArticleVersion - - KnowledgeArticleVersionHistory - - KnowledgeArticleViewStat - - KnowledgeArticleVoteStat - - OrderStatus - - PartnerRole - - RecentlyViewed - - ServiceAppointmentStatus - - ShiftStatus - - SolutionStatus - - TaskPriority - - TaskStatus - - UndecidedEventRelation + - AcceptedEventRelation + - Attachment + - CaseStatus + - ContractStatus + - DeclinedEventRelation + - FieldSecurityClassification + - KnowledgeArticle + - KnowledgeArticleVersion + - KnowledgeArticleVersionHistory + - KnowledgeArticleViewStat + - KnowledgeArticleVoteStat + - OrderStatus + - PartnerRole + - RecentlyViewed + - ServiceAppointmentStatus + - ShiftStatus + - SolutionStatus + - TaskPriority + - TaskStatus + - UndecidedEventRelation More information on the differences between various Salesforce APIs can be found [here](https://help.salesforce.com/s/articleView?id=sf.integrate_what_is_api.htm&type=5). 
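If an object you expect to see is missing from the catalog, one quick way to check the `queryable` property mentioned above is to list sObjects through the REST API directly. The snippet below is an illustrative sketch only, not part of the connector: it assumes you already have an OAuth access token and instance URL, and the API version shown is a placeholder.

```python
import requests

# Hypothetical placeholders; obtain these from your Salesforce OAuth flow.
INSTANCE_URL = "https://your-instance.my.salesforce.com"
ACCESS_TOKEN = "<access-token>"

# List the sObjects visible to the authenticated user and flag the ones that
# are not queryable, since only queryable objects can be synced.
response = requests.get(
    f"{INSTANCE_URL}/services/data/v57.0/sobjects",
    headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
)
response.raise_for_status()
for sobject in response.json()["sobjects"]:
    if not sobject["queryable"]:
        print(f"{sobject['name']} is not queryable and cannot be synced")
```

Running this with the same user that Airbyte authenticates with shows whether a missing object is hidden by permissions or simply not queryable.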
@@ -192,7 +191,7 @@ Now that you have set up the Salesforce source connector, check out the followin ## Changelog | Version | Date | Pull Request | Subject | -|:--------|:-----------|:---------------------------------------------------------|:-------------------------------------------------------------------------------------------------------------------------------------| +| :------ | :--------- | :------------------------------------------------------- | :----------------------------------------------------------------------------------------------------------------------------------- | | 2.5.9 | 2024-05-02 | [37749](https://github.com/airbytehq/airbyte/pull/37749) | Adding mock server tests for bulk streams | | 2.5.8 | 2024-04-30 | [37340](https://github.com/airbytehq/airbyte/pull/37340) | Source Salesforce: reduce info logs | | 2.5.7 | 2024-04-24 | [36657](https://github.com/airbytehq/airbyte/pull/36657) | Schema descriptions | diff --git a/docs/integrations/sources/sap-business-one.md b/docs/integrations/sources/sap-business-one.md index 9acbfb75fd2..cf7ea7ce3db 100644 --- a/docs/integrations/sources/sap-business-one.md +++ b/docs/integrations/sources/sap-business-one.md @@ -15,4 +15,3 @@ Reach out to your service representative or system admin to find the parameters ### Output schema The schema will be loaded according to the rules of the underlying database's connector and the data available in your B1 instance. - diff --git a/docs/integrations/sources/sap-fieldglass.md b/docs/integrations/sources/sap-fieldglass.md index a1e94cc498f..49b8e42b45b 100644 --- a/docs/integrations/sources/sap-fieldglass.md +++ b/docs/integrations/sources/sap-fieldglass.md @@ -9,17 +9,17 @@ This page contains the setup guide and reference information for the SAP Fieldgl ## Supported sync modes -| Feature | Supported?\(Yes/No\) | Notes | -| :--- | :--- | :--- | -| Full Refresh Sync | Yes | | -| Incremental Sync | No | | +| Feature | Supported?\(Yes/No\) | Notes | +| :---------------- | :------------------- | :---- | +| Full Refresh Sync | Yes | | +| Incremental Sync | No | | ## Supported Streams -* [Active Worker Download](https://api.sap.com/api/activeWorkerDownload/resource) +- [Active Worker Download](https://api.sap.com/api/activeWorkerDownload/resource) ## Changelog -| Version | Date | Pull Request | Subject | -|:--------|:-----------| :----------- |:-----------------------------------------------------------| -| 0.1.0 | 2022-10-22 | https://github.com/airbytehq/airbyte/pull/18656 | Initial commit | \ No newline at end of file +| Version | Date | Pull Request | Subject | +| :------ | :--------- | :---------------------------------------------- | :------------- | +| 0.1.0 | 2022-10-22 | https://github.com/airbytehq/airbyte/pull/18656 | Initial commit | diff --git a/docs/integrations/sources/search-metrics.md b/docs/integrations/sources/search-metrics.md index 19ae17af87b..afdb831ad70 100644 --- a/docs/integrations/sources/search-metrics.md +++ b/docs/integrations/sources/search-metrics.md @@ -4,7 +4,7 @@ ## Deprecation Notice -The SearchMetrics source connector is scheduled for deprecation on March 5th, 2024 due to incompatibility with upcoming platform updates as we prepare to launch Airbyte 1.0. This means it will no longer be supported or available for use in Airbyte. +The SearchMetrics source connector is scheduled for deprecation on March 5th, 2024 due to incompatibility with upcoming platform updates as we prepare to launch Airbyte 1.0. 
This means it will no longer be supported or available for use in Airbyte. This connector does not support new per-stream features which are vital for ensuring data integrity in Airbyte's synchronization processes. Without these capabilities, we cannot enforce our standards of reliability and correctness for data syncing operations. @@ -14,7 +14,6 @@ Users who still wish to sync data from this connector are advised to explore cre ::: - ## Overview The SearchMetrics source supports both Full Refresh and Incremental syncs. You can choose if this connector will copy only the new or updated data, or all rows in the tables and columns you set up for replication, every time a sync is run. @@ -23,39 +22,38 @@ The SearchMetrics source supports both Full Refresh and Incremental syncs. You c Several output streams are available from this source: -* [Projects](https://developer.searchmetrics.com/docs/apiv4-documentation/ZG9jOjQwODQ5ODE-get-list-projects) \(Full table\) -* [BenchmarkRankingsS7](https://developer.searchmetrics.com/docs/apiv4-documentation/ZG9jOjQzNjc0NDY-get-list-benchmark-rankings-s7) \(Full table\) -* [CompetitorRankingsS7](https://developer.searchmetrics.com/docs/apiv4-documentation/ZG9jOjQzNjc0NDc-get-list-competitor-rankings-s7) \(Full table\) -* [DistributionKeywordsS7](https://developer.searchmetrics.com/docs/apiv4-documentation/ZG9jOjQzNjc0NDg-get-list-distribution-keywords-s7) \(Full table\) -* [KeywordPotentialsS7](https://developer.searchmetrics.com/docs/apiv4-documentation/ZG9jOjQzNjc0NTA-get-list-keyword-potentials-s7) \(Full table\) -* [ListCompetitors](https://developer.searchmetrics.com/docs/apiv4-documentation/ZG9jOjQwODQ5OTI-get-list-competitors) \(Full table\) -* [ListCompetitorsRelevancy](https://developer.searchmetrics.com/docs/apiv4-documentation/ZG9jOjQxODQxNjU-get-list-competitors-relevancy) \(Full table\) -* [ListLosersS7](https://developer.searchmetrics.com/docs/apiv4-documentation/ZG9jOjQzNjc0NTE-get-list-losers-s7) \(Full table\) -* [ListMarketShareS7](https://developer.searchmetrics.com/docs/apiv4-documentation/ZG9jOjQzNjc0NTI-get-list-market-share-s7) \(Incremental\) -* [ListPositionSpreadHistoricS7](https://developer.searchmetrics.com/docs/apiv4-documentation/ZG9jOjQzNjc0NTM-get-list-position-spread-historic-s7) \(Incremental\) -* [ListRankingsDomain](https://developer.searchmetrics.com/docs/apiv4-documentation/ZG9jOjQwODQ5OTg-get-list-rankings-domain) \(Full table\) -* [ListRankingsHistoricS7](https://developer.searchmetrics.com/docs/apiv4-documentation/ZG9jOjQzNjc0NTY-get-list-rankings-historic-s7) \(Full table\) -* [ListSeoVisibilityCountry](https://developer.searchmetrics.com/docs/apiv4-documentation/ZG9jOjQyMjg4NDk-get-list-seo-visibility-country) \(Full table\) -* [ListSeoVisibilityHistoricS7](https://developer.searchmetrics.com/docs/apiv4-documentation/ZG9jOjQzNjc0NTc-get-list-seo-visibility-historic-s7) \(Incremental\) -* [ListSerpSpreadS7](https://developer.searchmetrics.com/docs/apiv4-documentation/ZG9jOjQzNjc0NTg-get-list-serp-spread-s7) \(Full table\) -* [ListWinnersS7](https://developer.searchmetrics.com/docs/apiv4-documentation/ZG9jOjQzNjc0NjQ-get-list-winners-s7) \(Full table\) -* [SeoVisibilityValueS7](https://developer.searchmetrics.com/docs/apiv4-documentation/ZG9jOjQyMzQzMjk-get-value-seo-visibility) \(Full table\) -* [SerpSpreadValueS7](https://developer.searchmetrics.com/docs/apiv4-documentation/ZG9jOjQzNjc0Njc-get-value-serp-spread-s7) \(Full table\) -* 
[TagPotentialsS7](https://developer.searchmetrics.com/docs/apiv4-documentation/ZG9jOjQzNjc0NTk-get-list-tag-potentials-s7) \(Full table\) -* [Tags](https://developer.searchmetrics.com/docs/apiv4-documentation/ZG9jOjE4NzQ0ODMz-get-list-project-tags) \(Full table\) -* [UrlRankingsS7](https://developer.searchmetrics.com/docs/apiv4-documentation/ZG9jOjQzNjc0NjM-get-list-url-rankings-s7) \(Full table\) +- [Projects](https://developer.searchmetrics.com/docs/apiv4-documentation/ZG9jOjQwODQ5ODE-get-list-projects) \(Full table\) +- [BenchmarkRankingsS7](https://developer.searchmetrics.com/docs/apiv4-documentation/ZG9jOjQzNjc0NDY-get-list-benchmark-rankings-s7) \(Full table\) +- [CompetitorRankingsS7](https://developer.searchmetrics.com/docs/apiv4-documentation/ZG9jOjQzNjc0NDc-get-list-competitor-rankings-s7) \(Full table\) +- [DistributionKeywordsS7](https://developer.searchmetrics.com/docs/apiv4-documentation/ZG9jOjQzNjc0NDg-get-list-distribution-keywords-s7) \(Full table\) +- [KeywordPotentialsS7](https://developer.searchmetrics.com/docs/apiv4-documentation/ZG9jOjQzNjc0NTA-get-list-keyword-potentials-s7) \(Full table\) +- [ListCompetitors](https://developer.searchmetrics.com/docs/apiv4-documentation/ZG9jOjQwODQ5OTI-get-list-competitors) \(Full table\) +- [ListCompetitorsRelevancy](https://developer.searchmetrics.com/docs/apiv4-documentation/ZG9jOjQxODQxNjU-get-list-competitors-relevancy) \(Full table\) +- [ListLosersS7](https://developer.searchmetrics.com/docs/apiv4-documentation/ZG9jOjQzNjc0NTE-get-list-losers-s7) \(Full table\) +- [ListMarketShareS7](https://developer.searchmetrics.com/docs/apiv4-documentation/ZG9jOjQzNjc0NTI-get-list-market-share-s7) \(Incremental\) +- [ListPositionSpreadHistoricS7](https://developer.searchmetrics.com/docs/apiv4-documentation/ZG9jOjQzNjc0NTM-get-list-position-spread-historic-s7) \(Incremental\) +- [ListRankingsDomain](https://developer.searchmetrics.com/docs/apiv4-documentation/ZG9jOjQwODQ5OTg-get-list-rankings-domain) \(Full table\) +- [ListRankingsHistoricS7](https://developer.searchmetrics.com/docs/apiv4-documentation/ZG9jOjQzNjc0NTY-get-list-rankings-historic-s7) \(Full table\) +- [ListSeoVisibilityCountry](https://developer.searchmetrics.com/docs/apiv4-documentation/ZG9jOjQyMjg4NDk-get-list-seo-visibility-country) \(Full table\) +- [ListSeoVisibilityHistoricS7](https://developer.searchmetrics.com/docs/apiv4-documentation/ZG9jOjQzNjc0NTc-get-list-seo-visibility-historic-s7) \(Incremental\) +- [ListSerpSpreadS7](https://developer.searchmetrics.com/docs/apiv4-documentation/ZG9jOjQzNjc0NTg-get-list-serp-spread-s7) \(Full table\) +- [ListWinnersS7](https://developer.searchmetrics.com/docs/apiv4-documentation/ZG9jOjQzNjc0NjQ-get-list-winners-s7) \(Full table\) +- [SeoVisibilityValueS7](https://developer.searchmetrics.com/docs/apiv4-documentation/ZG9jOjQyMzQzMjk-get-value-seo-visibility) \(Full table\) +- [SerpSpreadValueS7](https://developer.searchmetrics.com/docs/apiv4-documentation/ZG9jOjQzNjc0Njc-get-value-serp-spread-s7) \(Full table\) +- [TagPotentialsS7](https://developer.searchmetrics.com/docs/apiv4-documentation/ZG9jOjQzNjc0NTk-get-list-tag-potentials-s7) \(Full table\) +- [Tags](https://developer.searchmetrics.com/docs/apiv4-documentation/ZG9jOjE4NzQ0ODMz-get-list-project-tags) \(Full table\) +- [UrlRankingsS7](https://developer.searchmetrics.com/docs/apiv4-documentation/ZG9jOjQzNjc0NjM-get-list-url-rankings-s7) \(Full table\) If there are more endpoints you'd like Airbyte to support, please [create an 
issue.](https://github.com/airbytehq/airbyte/issues/new/choose) ### Features -| Feature | Supported? | -| :--- | :--- | -| Full Refresh Sync | Yes | -| Incremental - Append Sync | Yes | -| SSL connection | Yes | -| Namespaces | No | - +| Feature | Supported? | +| :------------------------ | :--------- | +| Full Refresh Sync | Yes | +| Incremental - Append Sync | Yes | +| SSL connection | Yes | +| Namespaces | No | The SearchMetrics connector should not run into SearchMetrics API limitations under normal usage. Please [create an issue](https://github.com/airbytehq/airbyte/issues) if you see any rate limit issues that are not automatically retried successfully. @@ -63,8 +61,8 @@ The SearchMetrics connector should not run into SearchMetrics API limitations un ### Requirements -* SearchMetrics Client Secret -* SearchMetrics API Key +- SearchMetrics Client Secret +- SearchMetrics API Key ### Setup guide @@ -72,7 +70,7 @@ Please read [How to get your API Key and Client Secret](https://developer.search ## Changelog -| Version | Date | Pull Request | Subject | -| :------ | :-------- | :----- | :------ | +| Version | Date | Pull Request | Subject | +| :------ | :--------- | :----------------------------------------------------- | :---------------------------------- | | 0.1.1 | 2021-12-22 | [6992](https://github.com/airbytehq/airbyte/pull/6992) | Deleted windows in days from config | | 0.1.0 | 2021-10-13 | [6992](https://github.com/airbytehq/airbyte/pull/6992) | Release SearchMetrics CDK Connector | diff --git a/docs/integrations/sources/secoda.md b/docs/integrations/sources/secoda.md index dda0935be9a..dcb1c47bbac 100644 --- a/docs/integrations/sources/secoda.md +++ b/docs/integrations/sources/secoda.md @@ -6,16 +6,16 @@ This source can sync data from the [Secoda API](https://docs.secoda.co/secoda-ap ## This Source Supports the Following Streams -* collections -* tables -* terms +- collections +- tables +- terms ### Features -| Feature | Supported?\(Yes/No\) | Notes | -| :--- | :--- | :--- | -| Full Refresh Sync | Yes | | -| Incremental Sync | No | | +| Feature | Supported?\(Yes/No\) | Notes | +| :---------------- | :------------------- | :---- | +| Full Refresh Sync | Yes | | +| Incremental Sync | No | | ### Performance considerations @@ -23,10 +23,10 @@ This source can sync data from the [Secoda API](https://docs.secoda.co/secoda-ap ### Requirements -* API Access +- API Access ## Changelog -| Version | Date | Pull Request | Subject | -| :------ | :--------- | :-------------------------------------------------------- | :----------------------------------------- | -| 0.1.0 | 2022-10-27 | [#18378](https://github.com/airbytehq/airbyte/pull/18378) | 🎉 New Source: Secoda API [low-code CDK] | \ No newline at end of file +| Version | Date | Pull Request | Subject | +| :------ | :--------- | :-------------------------------------------------------- | :--------------------------------------- | +| 0.1.0 | 2022-10-27 | [#18378](https://github.com/airbytehq/airbyte/pull/18378) | 🎉 New Source: Secoda API [low-code CDK] | diff --git a/docs/integrations/sources/sendgrid-migrations.md b/docs/integrations/sources/sendgrid-migrations.md index e06eac15698..0179d10eb9f 100644 --- a/docs/integrations/sources/sendgrid-migrations.md +++ b/docs/integrations/sources/sendgrid-migrations.md @@ -6,13 +6,13 @@ We're continuously striving to enhance the quality and reliability of our connec As part of our commitment to delivering exceptional service, we are transitioning Source Sendgrid from the Python Connector Development 
Kit (CDK) to our new low-code framework improving maintainability and reliability of the connector. Due to differences between the Python and low-code CDKs, this migration constitutes a breaking change. - -* The configuration options have been renamed to `api_key` and `start_date`. -* The `unsubscribe_groups` stream has been removed as it was a duplicate of `suppression_groups`. You can use `suppression_groups` and get the same data you were previously receiving in `unsubscribe_groups`. -* The `single_sends` stream has been renamed to `singlesend_stats`. This was done to more closely match the data from the Sendgrid API. -* The `segments` stream has been upgraded to use the Sendgrid 2.0 API as the previous version of the API has been deprecated. As a result, fields within the stream have changed to reflect the new API. -To ensure a smooth upgrade, please clear your streams and trigger a sync to bring in historical data. +- The configuration options have been renamed to `api_key` and `start_date`. +- The `unsubscribe_groups` stream has been removed as it was a duplicate of `suppression_groups`. You can use `suppression_groups` and get the same data you were previously receiving in `unsubscribe_groups`. +- The `single_sends` stream has been renamed to `singlesend_stats`. This was done to more closely match the data from the Sendgrid API. +- The `segments` stream has been upgraded to use the Sendgrid 2.0 API as the previous version of the API has been deprecated. As a result, fields within the stream have changed to reflect the new API. + +To ensure a smooth upgrade, please clear your streams and trigger a sync to bring in historical data. ## Migration Steps @@ -21,18 +21,18 @@ To ensure a smooth upgrade, please clear your streams and trigger a sync to brin Airbyte Open Source users must manually update the connector image in their local registry before proceeding with the migration. To do so: 1. Select **Settings** in the main navbar. - 1. Select **Sources**. -2. Find Sendgrid in the list of connectors. + 1. Select **Sources**. +2. Find Sendgrid in the list of connectors. :::note You will see two versions listed, the current in-use version and the latest version available. -::: +::: 3. Select **Change** to update your OSS version to the latest available version. ### Update the connector version -1. Select **Sources** in the main navbar. +1. Select **Sources** in the main navbar. 2. Select the instance of the connector you wish to upgrade. :::note @@ -40,18 +40,18 @@ Each instance of the connector must be updated separately. If you have created m ::: 3. Select **Upgrade** - 1. Follow the prompt to confirm you are ready to upgrade to the new version. + 1. Follow the prompt to confirm you are ready to upgrade to the new version. ### For Airbyte Cloud and Open Source: Steps to Update Schema and Clear Streams To clear your data for the affected streams, follow the steps below: 1. Select **Connections** in the main navbar and select the connection(s) affected by the update. -2. Select the **Schema** tab. - 1. Select **Refresh source schema** to bring in any schema changes. Any detected schema changes will be listed for your review. - 2. Select **OK** to approve changes. -3. Select **Save changes** at the bottom of the page. - 1. Ensure the **Clear affected streams** option is checked to ensure your streams continue syncing successfully with the new schema. -4. Select **Save connection**. +2. Select the **Schema** tab. + 1. Select **Refresh source schema** to bring in any schema changes. 
Any detected schema changes will be listed for your review. + 2. Select **OK** to approve changes. +3. Select **Save changes** at the bottom of the page. + 1. Ensure the **Clear affected streams** option is checked to ensure your streams continue syncing successfully with the new schema. +4. Select **Save connection**. -This will clear the data in your destination for the subset of streams with schema changes. After the clear succeeds, trigger a sync by clicking **Sync Now**. For more information on clearing your data in Airbyte, see [this page](https://docs.airbyte.com/operator-guides/reset). \ No newline at end of file +This will clear the data in your destination for the subset of streams with schema changes. After the clear succeeds, trigger a sync by clicking **Sync Now**. For more information on clearing your data in Airbyte, see [this page](https://docs.airbyte.com/operator-guides/reset). diff --git a/docs/integrations/sources/sendgrid.md b/docs/integrations/sources/sendgrid.md index 179f30ba5e0..539d623cd8e 100644 --- a/docs/integrations/sources/sendgrid.md +++ b/docs/integrations/sources/sendgrid.md @@ -8,15 +8,16 @@ This page contains the setup guide and reference information for the [Sendgrid]( ## Prerequisites -* [Sendgrid API Key](https://docs.sendgrid.com/ui/account-and-settings/api-keys#creating-an-api-key) +- [Sendgrid API Key](https://docs.sendgrid.com/ui/account-and-settings/api-keys#creating-an-api-key) ## Setup guide + ### Step 1: Set up Sendgrid -* Sendgrid Account -* [Create Sendgrid API Key](https://docs.sendgrid.com/ui/account-and-settings/api-keys#creating-an-api-key) with the following permissions: -* Read-only access to all resources -* Full access to marketing resources +- Sendgrid Account +- [Create Sendgrid API Key](https://docs.sendgrid.com/ui/account-and-settings/api-keys#creating-an-api-key) with the following permissions: +- Read-only access to all resources +- Full access to marketing resources ### Step 2: Set up the Sendgrid connector in Airbyte @@ -33,28 +34,27 @@ This page contains the setup guide and reference information for the [Sendgrid]( The Sendgrid source connector supports the following [sync modes](https://docs.airbyte.com/cloud/core-concepts#connection-sync-modes): -* [Full Refresh - Overwrite](https://docs.airbyte.com/understanding-airbyte/connections/full-refresh-overwrite/) -* [Full Refresh - Append](https://docs.airbyte.com/understanding-airbyte/connections/full-refresh-append) -* [Incremental - Append](https://docs.airbyte.com/understanding-airbyte/connections/incremental-append) +- [Full Refresh - Overwrite](https://docs.airbyte.com/understanding-airbyte/connections/full-refresh-overwrite/) +- [Full Refresh - Append](https://docs.airbyte.com/understanding-airbyte/connections/full-refresh-append) +- [Incremental - Append](https://docs.airbyte.com/understanding-airbyte/connections/incremental-append) ## Supported Streams -* [Campaigns](https://docs.sendgrid.com/api-reference/campaigns-api/retrieve-all-campaigns) -* [Lists](https://docs.sendgrid.com/api-reference/lists/get-all-lists) -* [Contacts](https://docs.sendgrid.com/api-reference/contacts/export-contacts) -* [Stats automations](https://docs.sendgrid.com/api-reference/marketing-campaign-stats/get-all-automation-stats) -* [Segments](https://docs.sendgrid.com/api-reference/segmenting-contacts/get-list-of-segments) -* [Single Sends](https://docs.sendgrid.com/api-reference/marketing-campaign-stats/get-all-single-sends-stats) -* 
[Templates](https://docs.sendgrid.com/api-reference/transactional-templates/retrieve-paged-transactional-templates) -* [Global suppression](https://docs.sendgrid.com/api-reference/suppressions-global-suppressions/retrieve-all-global-suppressions) \(Incremental\) -* [Suppression groups](https://docs.sendgrid.com/api-reference/suppressions-unsubscribe-groups/retrieve-all-suppression-groups-associated-with-the-user) -* [Suppression group members](https://docs.sendgrid.com/api-reference/suppressions-suppressions/retrieve-all-suppressions) \(Incremental\) -* [Blocks](https://docs.sendgrid.com/api-reference/blocks-api/retrieve-all-blocks) \(Incremental\) -* [Bounces](https://docs.sendgrid.com/api-reference/bounces-api/retrieve-all-bounces) \(Incremental\) -* [Invalid emails](https://docs.sendgrid.com/api-reference/invalid-e-mails-api/retrieve-all-invalid-emails) \(Incremental\) -* [Spam reports](https://docs.sendgrid.com/api-reference/spam-reports-api/retrieve-all-spam-reports) -* [Unsubscribe Groups](https://docs.sendgrid.com/api-reference/suppressions-unsubscribe-groups/retrieve-all-suppression-groups-associated-with-the-user) - +- [Campaigns](https://docs.sendgrid.com/api-reference/campaigns-api/retrieve-all-campaigns) +- [Lists](https://docs.sendgrid.com/api-reference/lists/get-all-lists) +- [Contacts](https://docs.sendgrid.com/api-reference/contacts/export-contacts) +- [Stats automations](https://docs.sendgrid.com/api-reference/marketing-campaign-stats/get-all-automation-stats) +- [Segments](https://docs.sendgrid.com/api-reference/segmenting-contacts/get-list-of-segments) +- [Single Sends](https://docs.sendgrid.com/api-reference/marketing-campaign-stats/get-all-single-sends-stats) +- [Templates](https://docs.sendgrid.com/api-reference/transactional-templates/retrieve-paged-transactional-templates) +- [Global suppression](https://docs.sendgrid.com/api-reference/suppressions-global-suppressions/retrieve-all-global-suppressions) \(Incremental\) +- [Suppression groups](https://docs.sendgrid.com/api-reference/suppressions-unsubscribe-groups/retrieve-all-suppression-groups-associated-with-the-user) +- [Suppression group members](https://docs.sendgrid.com/api-reference/suppressions-suppressions/retrieve-all-suppressions) \(Incremental\) +- [Blocks](https://docs.sendgrid.com/api-reference/blocks-api/retrieve-all-blocks) \(Incremental\) +- [Bounces](https://docs.sendgrid.com/api-reference/bounces-api/retrieve-all-bounces) \(Incremental\) +- [Invalid emails](https://docs.sendgrid.com/api-reference/invalid-e-mails-api/retrieve-all-invalid-emails) \(Incremental\) +- [Spam reports](https://docs.sendgrid.com/api-reference/spam-reports-api/retrieve-all-spam-reports) +- [Unsubscribe Groups](https://docs.sendgrid.com/api-reference/suppressions-unsubscribe-groups/retrieve-all-suppression-groups-associated-with-the-user) ## Create a read-only API key (Optional) @@ -76,15 +76,16 @@ Expand to see details about Sendgrid connector limitations and troubleshooting. The connector is restricted by normal Sendgrid [requests limitation](https://docs.sendgrid.com/api-reference/how-to-use-the-sendgrid-v3-api/rate-limits). ### Troubleshooting -* **Legacy marketing campaigns are not supported by this source connector**. Sendgrid provides two different kinds of marketing campaigns, "legacy marketing campaigns" and "new marketing campaigns". If you are seeing a `403 FORBIDDEN error message for https://api.sendgrid.com/v3/marketing/campaigns`, it might be because your SendGrid account uses legacy marketing campaigns. 
-* Check out common troubleshooting issues for the Sendgrid source connector on our [Airbyte Forum](https://github.com/airbytehq/airbyte/discussions). + +- **Legacy marketing campaigns are not supported by this source connector**. Sendgrid provides two different kinds of marketing campaigns, "legacy marketing campaigns" and "new marketing campaigns". If you are seeing a `403 FORBIDDEN error message for https://api.sendgrid.com/v3/marketing/campaigns`, it might be because your SendGrid account uses legacy marketing campaigns. +- Check out common troubleshooting issues for the Sendgrid source connector on our [Airbyte Forum](https://github.com/airbytehq/airbyte/discussions).
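To confirm whether the `403 FORBIDDEN` described above is caused by legacy marketing campaigns, you can probe the endpoint directly with your API key. This is a minimal sketch, assuming the standard SendGrid bearer-token authentication; the key value is a placeholder.

```python
import requests

# Hypothetical placeholder; use the (ideally read-only) API key created for Airbyte.
SENDGRID_API_KEY = "<your-api-key>"

response = requests.get(
    "https://api.sendgrid.com/v3/marketing/campaigns",
    headers={"Authorization": f"Bearer {SENDGRID_API_KEY}"},
)
if response.status_code == 403:
    # A 403 here usually means the account is on legacy marketing campaigns,
    # which this source connector does not support.
    print("403 FORBIDDEN: the account likely uses legacy marketing campaigns")
else:
    response.raise_for_status()
    print("Marketing campaigns endpoint is accessible with this key")
```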
    ## Changelog | Version | Date | Pull Request | Subject | -|:--------|:-----------|:---------------------------------------------------------|:--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------| +| :------ | :--------- | :------------------------------------------------------- | :------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------ | | 1.0.0 | 2024-04-15 | [35776](https://github.com/airbytehq/airbyte/pull/35776) | Migration to low-code CDK. Breaking change that updates configuration keys, removes unsubscribe_groups stream, renames a stream to singlesend_stats, and adds the singlesends stream. | | 0.5.0 | 2024-03-26 | [36455](https://github.com/airbytehq/airbyte/pull/36455) | Unpin CDK version, add record counts to state messages | | 0.4.3 | 2024-02-21 | [35181](https://github.com/airbytehq/airbyte/pull/35343) | Handle uncompressed contacts downloads. | diff --git a/docs/integrations/sources/sendinblue.md b/docs/integrations/sources/sendinblue.md index 0e56d9bbc2c..485533e0163 100644 --- a/docs/integrations/sources/sendinblue.md +++ b/docs/integrations/sources/sendinblue.md @@ -6,16 +6,16 @@ This source can sync data from the [Sendinblue API](https://developers.sendinblu ## This Source Supports the Following Streams -* [contacts](https://developers.brevo.com/reference/getcontacts-1) *(Incremental Sync)* -* [campaigns](https://developers.brevo.com/reference/getemailcampaigns-1) -* [templates](https://developers.brevo.com/reference/getsmtptemplates) +- [contacts](https://developers.brevo.com/reference/getcontacts-1) _(Incremental Sync)_ +- [campaigns](https://developers.brevo.com/reference/getemailcampaigns-1) +- [templates](https://developers.brevo.com/reference/getsmtptemplates) ### Features -| Feature | Supported?\(Yes/No\) | Notes | -| :--- | :--- | :--- | -| Full Refresh Sync | Yes | | -| Incremental Sync | Yes | | +| Feature | Supported?\(Yes/No\) | Notes | +| :---------------- | :------------------- | :---- | +| Full Refresh Sync | Yes | | +| Incremental Sync | Yes | | ### Performance considerations @@ -25,11 +25,11 @@ Sendinblue APIs are under rate limits for the number of API calls allowed per AP ### Requirements -* Sendinblue API KEY +- Sendinblue API KEY ## Changelog -| Version | Date | Pull Request | Subject | -| :------ | :--------- | :-------------------------------------------------------- | :----------------------------------------- | +| Version | Date | Pull Request | Subject | +| :------ | :--------- | :-------------------------------------------------------- | :------------------------------------------------------------ | | 0.1.1 | 2022-08-31 | [#30022](https://github.com/airbytehq/airbyte/pull/30022) | ✨ Source SendInBlue: Add incremental sync to contacts stream | -| 0.1.0 | 2022-11-01 | [#18771](https://github.com/airbytehq/airbyte/pull/18771) | 🎉 New Source: Sendinblue API [low-code CDK] | +| 0.1.0 | 2022-11-01 | [#18771](https://github.com/airbytehq/airbyte/pull/18771) | 🎉 New Source: Sendinblue API [low-code CDK] | diff --git a/docs/integrations/sources/senseforce.md b/docs/integrations/sources/senseforce.md index ba04db94493..79859b60e91 100644 --- a/docs/integrations/sources/senseforce.md 
+++ b/docs/integrations/sources/senseforce.md @@ -5,15 +5,18 @@ This page guides you through the process of setting up the Senseforce source con ## Sync overview ## Prerequisites + - A [Senseforce Dataset](https://manual.senseforce.io/manual/sf-platform/dataset-builder) to export - Your [Senseforce `API Access Token`](https://manual.senseforce.io/manual/sf-platform/public-api/get-your-access-token) - Your [Senseforce `Backend URL`](https://manual.senseforce.io/manual/sf-platform/public-api/endpoints#prerequisites) - Your [Senseforce `Dataset ID`](https://manual.senseforce.io/manual/sf-platform/public-api/endpoints#prerequisites) ## Creating a Senseforce Dataset to Export -The Senseforce Airbyte connector allows to export custom datasets built bei Senseforce users. Follow these steps to configure a dataset which can be exported with the Airbyte connector: + +The Senseforce Airbyte connector allows you to export custom datasets built by Senseforce users. Follow these steps to configure a dataset that can be exported with the Airbyte connector: + 1. Create a new, empty dataset as documented [here](https://manual.senseforce.io/manual/sf-platform/dataset-builder) -2. Add at least the following columns (these columns are Senseforce system columns and available for all of your custom data models/event schemas): +2. Add at least the following columns (these columns are Senseforce system columns and available for all of your custom data models/event schemas): 1. Metadata -> Timestamp 2. Metadata -> Thing 3. Metadata -> Id @@ -25,20 +28,19 @@ The Senseforce Airbyte connector allows to export custom datasets built bei Sens > **IMPORTANT:** The Timestamp, Thing and Id column are mandatory for the Connector to work as intended. While it still works without eg. the "Id", functionality might be impaired if one of these 3 columns is missing. Make sure to not rename these columns - keep them at their default names. - ## Set up the Senseforce source connector 1. Log into your [Airbyte Cloud](https://cloud.airbyte.com/workspaces) or Airbyte Open Source account. -2. Click **Sources** and then click **+ New source**. +2. Click **Sources** and then click **+ New source**. 3. On the Set up the source page, select **Senseforce** from the Source type dropdown. 4. Enter a name for your source. 5. For **API Access Token**, enter your [Senseforce `API Access Token`](https://manual.senseforce.io/manual/sf-platform/public-api/get-your-access-token). 6. For **Senseforce backend URL**, enter your [Senseforce `Backend URL`](https://manual.senseforce.io/manual/sf-platform/public-api/endpoints#prerequisites). -6. For **Dataset ID**, enter your [Senseforce `Dataset ID`](https://manual.senseforce.io/manual/sf-platform/public-api/endpoints#prerequisites). +7. For **Dataset ID**, enter your [Senseforce `Dataset ID`](https://manual.senseforce.io/manual/sf-platform/public-api/endpoints#prerequisites). - We recommend creating an api access token specifically for Airbyte to control which resources Airbyte can access.
For good operations, we recommend to create a separate Airbyte User as well as a separate Senseforce [Airbyte Group](https://manual.senseforce.io/manual/sf-platform/user-and-group-management). Share the dataset with this group and grant Dataset Read, Event Schema Read and Machine Master Data Read permissions. -7. For **The first day (in UTC) when to read data from**, enter the day in YYYY-MM-DD format. The data added on and after this day will be replicated. +8. For **The first day (in UTC) when to read data from**, enter the day in YYYY-MM-DD format. The data added on and after this day will be replicated. 9. Click **Set up source**. ## Supported sync modes @@ -49,8 +51,10 @@ The Senseforce source connector supports the following [sync modes](https://docs - Incremental > **NOTE:** The Senseforce Airbyte connector uses the Timestamp column to determine, which data were already read. Data inserted AFTER a finished sync, with timestamps less than already synced ones, are not considered for the next sync anymore. -If this behavior does not fit your use case, follow the next section +> If this behavior does not fit your use case, follow the next section + ### Using Inserted Timestamp instead of Data Timestamp for incremental modes + 1. Rename your "Timestamp" column to "Timestamp_data" 2. Add the Metadata -> Inserted column to your dataset. 3. Move the newly added "Inserted" column to position 1. @@ -61,15 +65,15 @@ Now the inserted timestamp will be used for creating the Airbyte cursor. Note th ## Supported Streams The Senseforce source connector supports the following streams: -- [Senseforce Datasets](https://manual.senseforce.io/manual/sf-platform/public-api/endpoints) +- [Senseforce Datasets](https://manual.senseforce.io/manual/sf-platform/public-api/endpoints) ### Features -| Feature | Supported?\(Yes/No\) | Notes | -| :--- | :--- | :--- | -| Full Refresh Sync | Yes | | -| Incremental Sync | Yes | | +| Feature | Supported?\(Yes/No\) | Notes | +| :---------------- | :------------------- | :---- | +| Full Refresh Sync | Yes | | +| Incremental Sync | Yes | | ### Performance considerations @@ -78,7 +82,7 @@ Senseforce utilizes an undocumented rate limit which - under normal use - should ## Changelog -| Version | Date | Pull Request | Subject | -| :------ | :--------- | :-------------------------------------------------------- | :----------------------------------------- | -| 0.1.1 | 2023-02-13 | [22892](https://github.com/airbytehq/airbyte/pull/22892) | Specified date formatting in specification | +| Version | Date | Pull Request | Subject | +| :------ | :--------- | :-------------------------------------------------------- | :-------------------------------------------- | +| 0.1.1 | 2023-02-13 | [22892](https://github.com/airbytehq/airbyte/pull/22892) | Specified date formatting in specification | | 0.1.0 | 2022-10-26 | [#18775](https://github.com/airbytehq/airbyte/pull/18775) | 🎉 New Source: Mailjet SMS API [low-code CDK] | diff --git a/docs/integrations/sources/sentry.md b/docs/integrations/sources/sentry.md index dab0ff0f04f..6c3127f631c 100644 --- a/docs/integrations/sources/sentry.md +++ b/docs/integrations/sources/sentry.md @@ -46,7 +46,7 @@ The Sentry source connector supports the following [sync modes](https://docs.air ## Changelog | Version | Date | Pull Request | Subject | -|:--------|:-----------|:---------------------------------------------------------|:------------------------------------------------------------------------| +| :------ | :--------- | 
:------------------------------------------------------- | :---------------------------------------------------------------------- | | 0.5.1 | 2024-04-01 | [36731](https://github.com/airbytehq/airbyte/pull/36731) | Add `%Y-%m-%dT%H:%M:%S%z` to date time formats. | | 0.5.0 | 2024-03-27 | [35755](https://github.com/airbytehq/airbyte/pull/35755) | Migrate to low-code. | | 0.4.2 | 2024-03-25 | [36448](https://github.com/airbytehq/airbyte/pull/36448) | Unpin CDK version | diff --git a/docs/integrations/sources/serpstat.md b/docs/integrations/sources/serpstat.md index 55ea29d539d..aef8004f89e 100644 --- a/docs/integrations/sources/serpstat.md +++ b/docs/integrations/sources/serpstat.md @@ -3,7 +3,8 @@ This page contains the setup guide and reference information for the Serpstat source connector. ## Setup guide -### Step 1: Get Serpstat API key + +### Step 1: Get Serpstat API key #### For new Serpstat users @@ -30,23 +31,23 @@ Go to [My account](https://serpstat.com/users/profile/) page and click **Copy** The Serpstat source connector supports the following [sync modes](https://docs.airbyte.com/cloud/core-concepts#connection-sync-modes): -* Full refresh +- Full refresh ## Supported Streams -* [Domains summary](https://serpstat.com/api/412-summarnij-otchet-po-domenu-v4-serpstatdomainproceduregetdomainsinfo/) -* [Domain history](https://serpstat.com/api/420-istoriya-po-domenu-v4-serpstatdomainproceduregetdomainshistory/) -* [Domain keywords](https://serpstat.com/api/584-top-search-engine-keywords-by-v4-domain-serpstatdomainproceduregetdomainkeywords/) -* [Domain keywords by region](https://serpstat.com/api/sorting-the-domain-by-keywords/) -* [Domain competitors](https://serpstat.com/api/590-domain-competitors-in-v4-search-result-serpstatdomainproceduregetcompetitors/) -* [Domain top pages](https://serpstat.com/api/588-domain-top-urls-v4-serpstatdomainproceduregettopurls/) +- [Domains summary](https://serpstat.com/api/412-summarnij-otchet-po-domenu-v4-serpstatdomainproceduregetdomainsinfo/) +- [Domain history](https://serpstat.com/api/420-istoriya-po-domenu-v4-serpstatdomainproceduregetdomainshistory/) +- [Domain keywords](https://serpstat.com/api/584-top-search-engine-keywords-by-v4-domain-serpstatdomainproceduregetdomainkeywords/) +- [Domain keywords by region](https://serpstat.com/api/sorting-the-domain-by-keywords/) +- [Domain competitors](https://serpstat.com/api/590-domain-competitors-in-v4-search-result-serpstatdomainproceduregetcompetitors/) +- [Domain top pages](https://serpstat.com/api/588-domain-top-urls-v4-serpstatdomainproceduregettopurls/) + +## Performance considerations -## Performance considerations - The maximum sync speed is limited by the number of requests per second per API key. See this limit in your [Serpstat account](https://serpstat.com/users/profile/). 
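To stay under that per-second limit when calling these endpoints yourself, a simple client-side throttle is usually enough. The sketch below is illustrative only: the endpoint URL, the `token` parameter, and the payload shape are assumptions rather than values taken from this page, and the procedure name is inferred from the stream links above, so verify everything against the Serpstat API reference and your account's actual limit.

```python
import time
import requests

# Minimal sketch of client-side throttling for Serpstat API calls.
# Endpoint, token parameter, and payload shape are assumptions; check the
# Serpstat API reference and your account page for the real values and limit.
API_URL = "https://api.serpstat.com/v4/"   # assumed endpoint
API_TOKEN = "your_api_token"               # placeholder
MAX_REQUESTS_PER_SECOND = 1                # placeholder; use your plan's limit


def call_serpstat(method: str, params: dict) -> dict:
    """Send one request and return the parsed JSON response."""
    response = requests.post(
        API_URL,
        params={"token": API_TOKEN},
        json={"id": 1, "method": method, "params": params},
        timeout=30,
    )
    response.raise_for_status()
    return response.json()


for domain in ["example.com", "example.org"]:
    # assumed procedure name, inferred from the "Domains summary" stream link above
    call_serpstat("SerpstatDomainProcedure.getDomainsInfo", {"domains": [domain], "se": "g_us"})
    time.sleep(1.0 / MAX_REQUESTS_PER_SECOND)  # crude throttle between calls
```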
## Changelog -| Version | Date | Pull Request | Subject | -|:--------| :--------- | :------------------------------------------------------- | :-------------------------------------------------------------------------------------------- | -| 0.1.0 | 2023-08-21 | [28147](https://github.com/airbytehq/airbyte/pull/28147) | Release Serpstat Connector | +| Version | Date | Pull Request | Subject | +| :------ | :--------- | :------------------------------------------------------- | :------------------------- | +| 0.1.0 | 2023-08-21 | [28147](https://github.com/airbytehq/airbyte/pull/28147) | Release Serpstat Connector | diff --git a/docs/integrations/sources/sftp-bulk-migrations.md b/docs/integrations/sources/sftp-bulk-migrations.md index 7178311368e..e40936c08e0 100644 --- a/docs/integrations/sources/sftp-bulk-migrations.md +++ b/docs/integrations/sources/sftp-bulk-migrations.md @@ -1,6 +1,7 @@ # SFTP Bulk Migration Guide ## Upgrading to 1.0.0 + We're continuously striving to enhance the quality and reliability of our connectors at Airbyte. As part of our commitment to delivering exceptional service, we are transitioning our SFTP Bulk source from the Python Connector Development Kit (CDK) to our new low-code framework improving maintainability and reliability of the connector. Due to differences between the Python and low-code CDKs, this migration constitutes a breaking change for the following: @@ -10,16 +11,16 @@ As part of our commitment to delivering exceptional service, we are transitionin ## Migration Steps -This version change requires you to re-verify the configuration of your source. To do this, click on "Source" in the left-hand sidebar and navigate to your SFTP Bulk source. Test the source and ensure the configuration is valid before moving forward. +This version change requires you to re-verify the configuration of your source. To do this, click on "Source" in the left-hand sidebar and navigate to your SFTP Bulk source. Test the source and ensure the configuration is valid before moving forward. Clearing your data is required for the affected streams in order to continue syncing successfully. To clear your data for the affected streams, follow the steps below: 1. Select **Connections** in the main navbar and select the connection(s) affected by the update. -2. Select the **Schema** tab. - 1. Select **Refresh source schema** to bring in any schema changes. Any detected schema changes will be listed for your review. - 2. Select **OK** to approve changes. -3. Select **Save changes** at the bottom of the page. - 1. Ensure the **Clear affected streams** option is checked to ensure your streams continue syncing successfully with the new schema. -4. Select **Save connection**. +2. Select the **Schema** tab. + 1. Select **Refresh source schema** to bring in any schema changes. Any detected schema changes will be listed for your review. + 2. Select **OK** to approve changes. +3. Select **Save changes** at the bottom of the page. + 1. Ensure the **Clear affected streams** option is checked to ensure your streams continue syncing successfully with the new schema. +4. Select **Save connection**. -This will clear the data in your destination for the subset of streams with schema changes. After the clear succeeds, trigger a sync by clicking **Sync Now**. For more information on clearing your data in Airbyte, see [this page](https://docs.airbyte.com/operator-guides/reset). \ No newline at end of file +This will clear the data in your destination for the subset of streams with schema changes. 
After the clear succeeds, trigger a sync by clicking **Sync Now**. For more information on clearing your data in Airbyte, see [this page](https://docs.airbyte.com/operator-guides/reset). diff --git a/docs/integrations/sources/sftp-bulk.md b/docs/integrations/sources/sftp-bulk.md index 56bb62a7fa0..e9c0a35394b 100644 --- a/docs/integrations/sources/sftp-bulk.md +++ b/docs/integrations/sources/sftp-bulk.md @@ -119,7 +119,7 @@ More formats \(e.g. Apache Avro\) will be supported in the future. ## Changelog | Version | Date | Pull Request | Subject | -|:--------|:-----------|:---------------------------------------------------------|:------------------------------------------------------------| +| :------ | :--------- | :------------------------------------------------------- | :---------------------------------------------------------- | | 1.0.0 | 2024-03-22 | [36256](https://github.com/airbytehq/airbyte/pull/36256) | Migrate to File-Based CDK. Manage dependencies with Poetry. | | 0.1.2 | 2023-04-19 | [19224](https://github.com/airbytehq/airbyte/pull/19224) | Support custom CSV separators | | 0.1.1 | 2023-03-17 | [24180](https://github.com/airbytehq/airbyte/pull/24180) | Fix field order | diff --git a/docs/integrations/sources/sftp.md b/docs/integrations/sources/sftp.md index 13cd4a0979f..a16741ade69 100644 --- a/docs/integrations/sources/sftp.md +++ b/docs/integrations/sources/sftp.md @@ -107,8 +107,8 @@ More formats \(e.g. Apache Avro\) will be supported in the future. ## Changelog | Version | Date | Pull Request | Subject | -|:--------|:-----------|:---------------------------------------------------------|:-------------------------------------------------------| -| 0.2.2 | 2024-02-13 | [35221](https://github.com/airbytehq/airbyte/pull/35221) | Adopt CDK 0.20.4 | +| :------ | :--------- | :------------------------------------------------------- | :----------------------------------------------------- | +| 0.2.2 | 2024-02-13 | [35221](https://github.com/airbytehq/airbyte/pull/35221) | Adopt CDK 0.20.4 | | 0.2.1 | 2024-01-24 | [34453](https://github.com/airbytehq/airbyte/pull/34453) | bump CDK version | | 0.2.0 | 2024-01-15 | [34265](https://github.com/airbytehq/airbyte/pull/34265) | Remove LEGACY state flag | | 0.1.2 | 2022-06-17 | [13864](https://github.com/airbytehq/airbyte/pull/13864) | Updated stacktrace format for any trace message errors | diff --git a/docs/integrations/sources/shopify-migrations.md b/docs/integrations/sources/shopify-migrations.md index 0ecf880c31a..699d4be8fb4 100644 --- a/docs/integrations/sources/shopify-migrations.md +++ b/docs/integrations/sources/shopify-migrations.md @@ -1,50 +1,56 @@ # Shopify Migration Guide ## Upgrading to 2.0.0 + This version implements `Shopify GraphQL BULK Operations` to speed up the following streams: - - `Collections` - - `Customer Address` - - `Discount Codes` - - `Fulfillment Orders` - - `Inventory Items` - - `Inventory Levels` - - `Metafield Collections` - - `Metafield Customers` - - `Metafield Draft_orders` - - `Metafield Locations` - - `Metafield Orders` - - `Metafield Product Images` - - `Metafield Product Variants` - - `Transactions Graphql` (duplicated `Transactions` stream to provide faster fetch) + +- `Collections` +- `Customer Address` +- `Discount Codes` +- `Fulfillment Orders` +- `Inventory Items` +- `Inventory Levels` +- `Metafield Collections` +- `Metafield Customers` +- `Metafield Draft_orders` +- `Metafield Locations` +- `Metafield Orders` +- `Metafield Product Images` +- `Metafield Product Variants` +- `Transactions 
Graphql` (duplicated `Transactions` stream to provide faster fetch) Increased the performance for the following streams: + - `Fulfillments` - `Order Refunds` - `Product Images` - `Product Variants` - + Other bug fixes and improvements, more info: `https://github.com/airbytehq/airbyte/pull/32345` ### Action items required for 2.0.0 -* The `Fulfillments` stream now has the cursor field `updated_at`, instead of the `id`. -* The `Order Refunds` stream, now has the schema `refund_line_items.line_item.properties` to array of `strings`, instead of `object` with properties. -* The `Fulfillment Orders` stream now has the `supported_actions` schema as `array of objects` instead of `array of strings`. -* The `Collections` stream now requires additional api scope `read_publications` to fetch the `published_at` field with `GraphQL BULK Operations`. - - if `API_PASSWORD` is used for authentication: - - BEFORE UPDATING to the `2.0.0`: update your `Private Developer Application` scopes with `read_publications` and save the changes, in your Shopify Account. - - if `OAuth2.0` is used for authentication: - - `re-auth` in order to obtain new scope automatically, after the upgrade. - - `Refresh Schema` + `Reset` is required for these streams after the upgrade from previous version. +- The `Fulfillments` stream now has the cursor field `updated_at`, instead of the `id`. +- The `Order Refunds` stream, now has the schema `refund_line_items.line_item.properties` to array of `strings`, instead of `object` with properties. +- The `Fulfillment Orders` stream now has the `supported_actions` schema as `array of objects` instead of `array of strings`. +- The `Collections` stream now requires additional api scope `read_publications` to fetch the `published_at` field with `GraphQL BULK Operations`. + - if `API_PASSWORD` is used for authentication: + - BEFORE UPDATING to the `2.0.0`: update your `Private Developer Application` scopes with `read_publications` and save the changes, in your Shopify Account. + - if `OAuth2.0` is used for authentication: + - `re-auth` in order to obtain new scope automatically, after the upgrade. + - `Refresh Schema` + `Reset` is required for these streams after the upgrade from previous version. ## Upgrading to 1.0.0 + This version uses Shopify API version `2023-07` which brings changes to the following streams: - - removed `gateway, payment_details, processing_method` properties from `Order` stream, they are no longer supplied. - - added `company, confirmation_number, current_total_additional_fees_set, original_total_additional_fees_set, tax_exempt, po_number` properties to `Orders` stream - - added `total_unsettled_set, payment_id` to `Transactions` stream - - added `return` property to `Order Refund` stream - - added `created_at, updated_at` to `Fulfillment Order` stream + +- removed `gateway, payment_details, processing_method` properties from `Order` stream, they are no longer supplied. +- added `company, confirmation_number, current_total_additional_fees_set, original_total_additional_fees_set, tax_exempt, po_number` properties to `Orders` stream +- added `total_unsettled_set, payment_id` to `Transactions` stream +- added `return` property to `Order Refund` stream +- added `created_at, updated_at` to `Fulfillment Order` stream ### Action items required for 1.0.0 - * The `reset` and `full-refresh` for `Orders` stream is required after upgrading to this version. + +- The `reset` and `full-refresh` for `Orders` stream is required after upgrading to this version. 
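If you authenticate with an API password and are unsure whether your custom app already has the scopes called out above (for example `read_publications`, required as of 2.0.0), you can list the scopes granted to your token before upgrading. This is a hedged sketch: the shop domain and token are placeholders, and the `access_scopes.json` endpoint should be confirmed against Shopify's current Admin API documentation.

```python
import requests

SHOP = "your-store.myshopify.com"   # placeholder shop domain
ACCESS_TOKEN = "shpat_xxx"          # placeholder Admin API access token / API password

# Assumed endpoint: Shopify's AccessScope resource; verify in the Shopify Admin API docs.
resp = requests.get(
    f"https://{SHOP}/admin/oauth/access_scopes.json",
    headers={"X-Shopify-Access-Token": ACCESS_TOKEN},
    timeout=30,
)
resp.raise_for_status()
granted = {scope["handle"] for scope in resp.json().get("access_scopes", [])}

print("read_publications granted:", "read_publications" in granted)
```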
diff --git a/docs/integrations/sources/shopify.md b/docs/integrations/sources/shopify.md index 7872eb84cb5..92dfca573f4 100644 --- a/docs/integrations/sources/shopify.md +++ b/docs/integrations/sources/shopify.md @@ -8,10 +8,10 @@ This page contains the setup guide and reference information for the [Shopify](h ## Prerequisites -* An active [Shopify store](https://www.shopify.com). -* If you are syncing data from a store that you do not own, you will need to [request access to your client's store](https://help.shopify.com/en/partners/dashboard/managing-stores/request-access#request-access) (not required for account owners). +- An active [Shopify store](https://www.shopify.com). +- If you are syncing data from a store that you do not own, you will need to [request access to your client's store](https://help.shopify.com/en/partners/dashboard/managing-stores/request-access#request-access) (not required for account owners). -* For **Airbyte Open Source** users: A custom Shopify application with [`read_` scopes enabled](#scopes-required-for-custom-app). +- For **Airbyte Open Source** users: A custom Shopify application with [`read_` scopes enabled](#scopes-required-for-custom-app). ## Setup guide @@ -19,6 +19,7 @@ This page contains the setup guide and reference information for the [Shopify](h This connector supports **OAuth2.0** and **API Password** (for private applications) authentication methods. + :::note For existing **Airbyte Cloud** customers, if you are currently using the **API Password** authentication method, please switch to **OAuth2.0**, as the API Password will be deprecated shortly. This change will not affect **Airbyte Open Source** connections. ::: @@ -38,6 +39,7 @@ For existing **Airbyte Cloud** customers, if you are currently using the **API P + ### Airbyte Open Source #### Create a custom app @@ -64,39 +66,39 @@ Authentication to the Shopify API requires a [custom application](https://help.s Add the following scopes to your custom app to ensure Airbyte can sync all available data. For more information on access scopes, see the [Shopify docs](https://shopify.dev/docs/api/usage/access-scopes). 
-* `read_analytics` -* `read_assigned_fulfillment_orders` -* `read_content` -* `read_customers` -* `read_discounts` -* `read_draft_orders` -* `read_fulfillments` -* `read_gdpr_data_request` -* `read_gift_cards` -* `read_inventory` -* `read_legal_policies` -* `read_locations` -* `read_locales` -* `read_marketing_events` -* `read_merchant_managed_fulfillment_orders` -* `read_online_store_pages` -* `read_order_edits` -* `read_orders` -* `read_price_rules` -* `read_product_listings` -* `read_products` -* `read_publications` -* `read_reports` -* `read_resource_feedbacks` -* `read_script_tags` -* `read_shipping` -* `read_shopify_payments_accounts` -* `read_shopify_payments_bank_accounts` -* `read_shopify_payments_disputes` -* `read_shopify_payments_payouts` -* `read_themes` -* `read_third_party_fulfillment_orders` -* `read_translations` +- `read_analytics` +- `read_assigned_fulfillment_orders` +- `read_content` +- `read_customers` +- `read_discounts` +- `read_draft_orders` +- `read_fulfillments` +- `read_gdpr_data_request` +- `read_gift_cards` +- `read_inventory` +- `read_legal_policies` +- `read_locations` +- `read_locales` +- `read_marketing_events` +- `read_merchant_managed_fulfillment_orders` +- `read_online_store_pages` +- `read_order_edits` +- `read_orders` +- `read_price_rules` +- `read_product_listings` +- `read_products` +- `read_publications` +- `read_reports` +- `read_resource_feedbacks` +- `read_script_tags` +- `read_shipping` +- `read_shopify_payments_accounts` +- `read_shopify_payments_bank_accounts` +- `read_shopify_payments_disputes` +- `read_shopify_payments_payouts` +- `read_themes` +- `read_third_party_fulfillment_orders` +- `read_translations` @@ -187,27 +189,26 @@ Expand to see details about Shopify connector limitations and troubleshooting Shopify has some [rate limit restrictions](https://shopify.dev/concepts/about-apis/rate-limits). Typically, there should not be issues with throttling or exceeding the rate limits but, in some edge cases, you may encounter the following warning message: ```text -"Caught retryable error ' or null' after tries. +"Caught retryable error ' or null' after tries. Waiting seconds then retrying..." ``` This is expected when the connector hits a `429 - Rate Limit Exceeded` HTTP Error. The sync operation will continue successfully after a short backoff period. -For all `Shopify GraphQL BULK` api requests these limitations are applied: https://shopify.dev/docs/api/usage/bulk-operations/queries#operation-restrictions - +For all `Shopify GraphQL BULK` api requests these limitations are applied: https://shopify.dev/docs/api/usage/bulk-operations/queries#operation-restrictions ### Troubleshooting -* If you encounter access errors while using **OAuth2.0** authentication, please make sure you've followed this [Shopify Article](https://help.shopify.com/en/partners/dashboard/managing-stores/request-access#request-access) to request the access to the client's store first. Once the access is granted, you should be able to proceed with **OAuth2.0** authentication. -* Check out common troubleshooting issues for the Shopify source connector on our Airbyte Forum [here](https://github.com/airbytehq/airbyte/discussions). +- If you encounter access errors while using **OAuth2.0** authentication, please make sure you've followed this [Shopify Article](https://help.shopify.com/en/partners/dashboard/managing-stores/request-access#request-access) to request the access to the client's store first. 
Once the access is granted, you should be able to proceed with **OAuth2.0** authentication. +- Check out common troubleshooting issues for the Shopify source connector on our Airbyte Forum [here](https://github.com/airbytehq/airbyte/discussions). ## Changelog | Version | Date | Pull Request | Subject | -|:--------|:-----------|:---------------------------------------------------------|:------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------| -| 2.0.8 | 2024-05-02 | [37589](https://github.com/airbytehq/airbyte/pull/37589) | Added retry for known HTTP Errors for BULK streams | +| :------ | :--------- | :------------------------------------------------------- | :---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------- | +| 2.0.8 | 2024-05-02 | [37589](https://github.com/airbytehq/airbyte/pull/37589) | Added retry for known HTTP Errors for BULK streams | | 2.0.7 | 2024-04-24 | [36660](https://github.com/airbytehq/airbyte/pull/36660) | Schema descriptions | | 2.0.6 | 2024-04-22 | [37468](https://github.com/airbytehq/airbyte/pull/37468) | Fixed one time retry for `Internal Server Error` for BULK streams | | 2.0.5 | 2024-04-03 | [36788](https://github.com/airbytehq/airbyte/pull/36788) | Added ability to dynamically adjust the size of the `slice` | diff --git a/docs/integrations/sources/shortio.md b/docs/integrations/sources/shortio.md index 35c1935d16e..2bebe991510 100644 --- a/docs/integrations/sources/shortio.md +++ b/docs/integrations/sources/shortio.md @@ -10,13 +10,13 @@ This source can sync data for the [Shortio API](https://developers.short.io/refe This Source is capable of syncing the following Streams: -* [Clicks](https://developers.short.io/reference#getdomaindomainidlink_clicks) -* [Links](https://developers.short.io/reference#apilinksget) +- [Clicks](https://developers.short.io/reference#getdomaindomainidlink_clicks) +- [Links](https://developers.short.io/reference#apilinksget) ### Data type mapping | Integration Type | Airbyte Type | Notes | -|:-----------------|:-------------|:------| +| :--------------- | :----------- | :---- | | `string` | `string` | | | `number` | `number` | | | `array` | `array` | | @@ -25,7 +25,7 @@ This Source is capable of syncing the following Streams: ### Features | Feature | Supported?\(Yes/No\) | Notes | -|:--------------------------|:---------------------|:------| +| :------------------------ | :------------------- | :---- | | Full Refresh Sync | Yes | | | Incremental - Append Sync | Yes | | | Namespaces | No | | @@ -40,11 +40,10 @@ This Source is capable of syncing the following Streams: ## Changelog | Version | Date | Pull Request | Subject | -|:--------|:-----------|:---------------------------------------------------------|:--------------------------------------------------| +| :------ | :--------- | :------------------------------------------------------- | :------------------------------------------------ | | 0.2.1 | 
2024-05-02 | [37597](https://github.com/airbytehq/airbyte/pull/37597) | Change `last_records` to `last_record` | | 0.2.0 | 2023-08-02 | [28950](https://github.com/airbytehq/airbyte/pull/28950) | Migrate to Low-Code CDK | | 0.1.3 | 2022-08-01 | [15066](https://github.com/airbytehq/airbyte/pull/15066) | Update primary key to `idString` | | 0.1.2 | 2021-12-28 | [8628](https://github.com/airbytehq/airbyte/pull/8628) | Update fields in source-connectors specifications | | 0.1.1 | 2021-11-08 | [7499](https://github.com/airbytehq/airbyte/pull/7499) | Remove base-python dependencies | | 0.1.0 | 2021-08-16 | [3787](https://github.com/airbytehq/airbyte/pull/5418) | Add Native Shortio Source Connector | - diff --git a/docs/integrations/sources/slack-migrations.md b/docs/integrations/sources/slack-migrations.md index e2d0fb2129d..d5a1b101957 100644 --- a/docs/integrations/sources/slack-migrations.md +++ b/docs/integrations/sources/slack-migrations.md @@ -2,7 +2,7 @@ ## Upgrading to 1.0.0 -We're continuously striving to enhance the quality and reliability of our connectors at Airbyte. As part of our commitment to delivering exceptional service, we are transitioning source Slack from the Python Connector Development Kit (CDK) to our innovative low-code framework. This is part of a strategic move to streamline many processes across connectors, bolstering maintainability and freeing us to focus more of our efforts on improving the performance and features of our evolving platform and growing catalog. However, due to differences between the Python and low-code CDKs, this migration constitutes a breaking change. +We're continuously striving to enhance the quality and reliability of our connectors at Airbyte. As part of our commitment to delivering exceptional service, we are transitioning source Slack from the Python Connector Development Kit (CDK) to our innovative low-code framework. This is part of a strategic move to streamline many processes across connectors, bolstering maintainability and freeing us to focus more of our efforts on improving the performance and features of our evolving platform and growing catalog. However, due to differences between the Python and low-code CDKs, this migration constitutes a breaking change. We’ve evolved and standardized how state is managed for incremental streams that are nested within a parent stream. This change impacts how individual states are tracked and stored for each partition, using a more structured approach to ensure the most granular and flexible state management. This change will affect the `Channel Messages` stream. @@ -11,9 +11,8 @@ We’ve evolved and standardized how state is managed for incremental streams th Clearing your data is required in order to continue syncing `Channel Messages` successfully. To clear your data for the `Channel Messages` stream, follow the steps below: 1. Select **Connections** in the main nav bar. - 1. Select the connection(s) affected by the update. + 1. Select the connection(s) affected by the update. 2. Select the **Status** tab. - 1. In the **Enabled streams** list, click the three dots on the right side of the `Channel Messages` and select **Clear Data**. + 1. In the **Enabled streams** list, click the three dots on the right side of the `Channel Messages` and select **Clear Data**. After the clear succeeds, trigger a sync by clicking **Sync Now**. For more information on clearing your data in Airbyte, see [this page](https://docs.airbyte.com/operator-guides/reset). 
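For readers curious what the state change behind this migration looks like, the sketch below contrasts a single global cursor with per-channel (per-partition) cursors. The field names are illustrative assumptions, not the exact schema Airbyte stores; the point is only that `Channel Messages` now keeps one cursor per channel instead of one for the whole stream.

```python
# Illustrative only: field names are assumptions, not the exact Airbyte state schema.

# Before: one cursor shared by every channel in the stream.
legacy_state = {"float_ts": 1700000000.0}

# After: structured, per-partition state with one cursor per channel.
partitioned_state = {
    "states": [
        {"partition": {"channel_id": "C01AAAAAA"}, "cursor": {"float_ts": 1700000000.0}},
        {"partition": {"channel_id": "C02BBBBBB"}, "cursor": {"float_ts": 1699990000.0}},
    ]
}
```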
- diff --git a/docs/integrations/sources/slack.md b/docs/integrations/sources/slack.md index 2db7c1f1b2d..f3e88b2aada 100644 --- a/docs/integrations/sources/slack.md +++ b/docs/integrations/sources/slack.md @@ -65,19 +65,20 @@ This tutorial assumes that you are an administrator on your slack instance. If y 8. In Airbyte, create a Slack source. The "Bot User OAuth Access Token" from the earlier should be used as the token. 9. You can now pull data from your slack instance! - + **Airbyte Open Source additional setup steps** You can no longer create "Legacy" API Keys, but if you already have one, you can use it with this source. Fill it into the API key section. We recommend creating a restricted, read-only key specifically for Airbyte access. This will allow you to control which resources Airbyte should be able to access. - + ### Step 2: Set up the Slack connector in Airbyte + **For Airbyte Cloud:** 1. [Log into your Airbyte Cloud](https://cloud.airbyte.com/workspaces) account. @@ -93,6 +94,7 @@ We recommend creating a restricted, read-only key specifically for Airbyte acces + **For Airbyte Open Source:** 1. Navigate to the Airbyte Open Source dashboard. @@ -113,7 +115,7 @@ We recommend creating a restricted, read-only key specifically for Airbyte acces The Slack source connector supports the following [sync modes](https://docs.airbyte.com/cloud/core-concepts#connection-sync-modes): | Feature | Supported? | -|:------------------|:-----------| +| :---------------- | :--------- | | Full Refresh Sync | Yes | | Incremental Sync | Yes | | Namespaces | No | @@ -122,11 +124,11 @@ The Slack source connector supports the following [sync modes](https://docs.airb For most of the streams, the Slack source connector uses the [Conversations API](https://api.slack.com/docs/conversations-api) under the hood. -* [Channels \(Conversations\)](https://api.slack.com/methods/conversations.list) -* [Channel Members \(Conversation Members\)](https://api.slack.com/methods/conversations.members) -* [Messages \(Conversation History\)](https://api.slack.com/methods/conversations.history) It will only replicate messages from non-archive, public and private channels that the Slack App is a member of. -* [Users](https://api.slack.com/methods/users.list) -* [Threads \(Conversation Replies\)](https://api.slack.com/methods/conversations.replies) +- [Channels \(Conversations\)](https://api.slack.com/methods/conversations.list) +- [Channel Members \(Conversation Members\)](https://api.slack.com/methods/conversations.members) +- [Messages \(Conversation History\)](https://api.slack.com/methods/conversations.history) It will only replicate messages from non-archive, public and private channels that the Slack App is a member of. +- [Users](https://api.slack.com/methods/users.list) +- [Threads \(Conversation Replies\)](https://api.slack.com/methods/conversations.replies) ## Performance considerations @@ -136,12 +138,12 @@ It is recommended to sync required channels only, this can be done by specifying ## Data type map -| Integration Type | Airbyte Type | -|:------------------|:-------------| -| `string` | `string` | -| `number` | `number` | -| `array` | `array` | -| `object` | `object` | +| Integration Type | Airbyte Type | +| :--------------- | :----------- | +| `string` | `string` | +| `number` | `number` | +| `array` | `array` | +| `object` | `object` | ## Limitations & Troubleshooting @@ -153,20 +155,21 @@ Expand to see details about Slack connector limitations and troubleshooting. 
### Connector limitations #### Rate limiting + Slack has [rate limit restrictions](https://api.slack.com/docs/rate-limits). ### Troubleshooting -* Check out common troubleshooting issues for the Slack source connector on our Airbyte Forum [here](https://github.com/airbytehq/airbyte/discussions). +- Check out common troubleshooting issues for the Slack source connector on our Airbyte Forum [here](https://github.com/airbytehq/airbyte/discussions). ## Changelog | Version | Date | Pull Request | Subject | -|:--------|:-----------|:---------------------------------------------------------|:-------------------------------------------------------------------------------------| +| :------ | :--------- | :------------------------------------------------------- | :----------------------------------------------------------------------------------- | | 1.1.1 | 2024-05-02 | [36661](https://github.com/airbytehq/airbyte/pull/36661) | Schema descriptions | -| 1.1.0 | 2024-04-18 | [37332](https://github.com/airbytehq/airbyte/pull/37332) | Add the capability to sync from private channels | +| 1.1.0 | 2024-04-18 | [37332](https://github.com/airbytehq/airbyte/pull/37332) | Add the capability to sync from private channels | | 1.0.0 | 2024-04-02 | [35477](https://github.com/airbytehq/airbyte/pull/35477) | Migration to low-code CDK | | 0.4.1 | 2024-03-27 | [36579](https://github.com/airbytehq/airbyte/pull/36579) | Upgrade airbyte-cdk version to emit record counts as floats | | 0.4.0 | 2024-03-19 | [36267](https://github.com/airbytehq/airbyte/pull/36267) | Pin airbyte-cdk version to `^0` | @@ -199,7 +202,7 @@ Slack has [rate limit restrictions](https://api.slack.com/docs/rate-limits). | 0.1.11 | 2021-08-27 | [5830](https://github.com/airbytehq/airbyte/pull/5830) | Fix sync operations hang forever issue | | 0.1.10 | 2021-08-27 | [5697](https://github.com/airbytehq/airbyte/pull/5697) | Fix max retries issue | | 0.1.9 | 2021-07-20 | [4860](https://github.com/airbytehq/airbyte/pull/4860) | Fix reading threads issue | -| 0.1.8 | 2021-07-14 | [4683](https://github.com/airbytehq/airbyte/pull/4683) | Add float\_ts primary key | +| 0.1.8 | 2021-07-14 | [4683](https://github.com/airbytehq/airbyte/pull/4683) | Add float_ts primary key | | 0.1.7 | 2021-06-25 | [3978](https://github.com/airbytehq/airbyte/pull/3978) | Release Slack CDK Connector | diff --git a/docs/integrations/sources/smaily.md b/docs/integrations/sources/smaily.md index 19d81701109..f45586b1452 100644 --- a/docs/integrations/sources/smaily.md +++ b/docs/integrations/sources/smaily.md @@ -6,19 +6,19 @@ This source can sync data from the [Smaily API](https://smaily.com/help/api/). A ## This Source Supports the Following Streams -* users -* segments -* campaigns -* templates -* automations -* A/B tests +- users +- segments +- campaigns +- templates +- automations +- A/B tests ### Features -| Feature | Supported?\(Yes/No\) | Notes | -| :--- | :--- | :--- | -| Full Refresh Sync | Yes | | -| Incremental Sync | No | | +| Feature | Supported?\(Yes/No\) | Notes | +| :---------------- | :------------------- | :---- | +| Full Refresh Sync | Yes | | +| Incremental Sync | No | | ### Performance considerations @@ -28,12 +28,12 @@ The connector has a rate limit of 5 API requests per second per IP-address. 
### Requirements -* Smaily API user username -* Smaily API user password -* Smaily API subdomain +- Smaily API user username +- Smaily API user password +- Smaily API subdomain ## Changelog -| Version | Date | Pull Request | Subject | -|:--------|:-----------| :----------- |:-----------------------------------------------------------| -| 0.1.0 | 2022-10-25 | [18674](https://github.com/airbytehq/airbyte/pull/18674) | Initial commit | \ No newline at end of file +| Version | Date | Pull Request | Subject | +| :------ | :--------- | :------------------------------------------------------- | :------------- | +| 0.1.0 | 2022-10-25 | [18674](https://github.com/airbytehq/airbyte/pull/18674) | Initial commit | diff --git a/docs/integrations/sources/smartengage.md b/docs/integrations/sources/smartengage.md index 9c277e7512c..968ec495dcd 100644 --- a/docs/integrations/sources/smartengage.md +++ b/docs/integrations/sources/smartengage.md @@ -6,30 +6,29 @@ This source can sync data from the [SmartEngage API](https://smartengage.com/doc ## This Source Supports the Following Streams -* avatars -* tags -* custom_fields -* sequences +- avatars +- tags +- custom_fields +- sequences ### Features -| Feature | Supported?\(Yes/No\) | Notes | -| :--- | :--- | :--- | -| Full Refresh Sync | Yes | | -| Incremental Sync | No | | - +| Feature | Supported?\(Yes/No\) | Notes | +| :---------------- | :------------------- | :---- | +| Full Refresh Sync | Yes | | +| Incremental Sync | No | | ## Getting started ### Requirements -* SmartEngage API Key +- SmartEngage API Key ## Changelog -| Version | Date | Pull Request | Subject | -|:--------|:-----------| :----------- |:-----------------------------------------------------------| -| 0.1.3 | 2024-04-19 | [37261](https://github.com/airbytehq/airbyte/pull/37261) | Upgrade to CDK 0.80.0 and manage dependencies with Poetry. | -| 0.1.2 | 2024-04-15 | [37261](https://github.com/airbytehq/airbyte/pull/37261) | Base image migration: remove Dockerfile and use the python-connector-base image | -| 0.1.1 | 2024-04-12 | [37261](https://github.com/airbytehq/airbyte/pull/37261) | schema descriptions | -| 0.1.0 | 2022-10-25 | [18701](https://github.com/airbytehq/airbyte/pull/18701) | Initial commit | +| Version | Date | Pull Request | Subject | +| :------ | :--------- | :------------------------------------------------------- | :------------------------------------------------------------------------------ | +| 0.1.3 | 2024-04-19 | [37261](https://github.com/airbytehq/airbyte/pull/37261) | Upgrade to CDK 0.80.0 and manage dependencies with Poetry. 
| +| 0.1.2 | 2024-04-15 | [37261](https://github.com/airbytehq/airbyte/pull/37261) | Base image migration: remove Dockerfile and use the python-connector-base image | +| 0.1.1 | 2024-04-12 | [37261](https://github.com/airbytehq/airbyte/pull/37261) | schema descriptions | +| 0.1.0 | 2022-10-25 | [18701](https://github.com/airbytehq/airbyte/pull/18701) | Initial commit | diff --git a/docs/integrations/sources/smartsheets.md b/docs/integrations/sources/smartsheets.md index 52359b70e88..5fb63be6fa2 100644 --- a/docs/integrations/sources/smartsheets.md +++ b/docs/integrations/sources/smartsheets.md @@ -6,8 +6,8 @@ This page guides you through the process of setting up the Smartsheets source co To configure the Smartsheet Source for syncs, you'll need the following: -* A Smartsheets API access token - generated by a Smartsheets user with at least **read** access -* The ID of the spreadsheet you'd like to sync +- A Smartsheets API access token - generated by a Smartsheets user with at least **read** access +- The ID of the spreadsheet you'd like to sync ## Step 1: Set up Smartsheets @@ -15,10 +15,10 @@ To configure the Smartsheet Source for syncs, you'll need the following: You can generate an API key for your account from a session of your Smartsheet webapp by clicking: -* Account (top-right icon) -* Apps & Integrations -* API Access -* Generate new access token +- Account (top-right icon) +- Apps & Integrations +- API Access +- Generate new access token For questions on advanced authorization flows, refer to [this](https://www.smartsheet.com/content-center/best-practices/tips-tricks/api-getting-started). @@ -26,8 +26,8 @@ For questions on advanced authorization flows, refer to [this](https://www.smart You'll also need the ID of the Spreadsheet you'd like to sync. Unlike Google Sheets, this ID is not found in the URL. You can find the required spreadsheet ID from your Smartsheet app session by going to: -* File -* Properties +- File +- Properties ## Step 2: Set up the Smartsheets connector in Airbyte @@ -41,6 +41,7 @@ You'll also need the ID of the Spreadsheet you'd like to sync. Unlike Google She 6. Submit the form **For Airbyte Open Source:** + 1. Navigate to the Airbyte Open Source dashboard 2. Set the name for your source 3. Enter the API access token from Prerequisites @@ -51,10 +52,11 @@ You'll also need the ID of the Spreadsheet you'd like to sync. 
Unlike Google She ## Supported sync modes The Smartsheets source connector supports the following [sync modes](https://docs.airbyte.com/cloud/core-concepts#connection-sync-modes): - - Full Refresh | Overwrite - - Full Refresh | Append - - Incremental | Append - - Incremental | Deduped + +- Full Refresh | Overwrite +- Full Refresh | Append +- Incremental | Append +- Incremental | Deduped ## Performance considerations @@ -68,49 +70,51 @@ For example, having a spreadsheet `Customers`, the connector would introduce a s Additionallly specific metadata fields related to the sheet or row can be include in the stream, these must be specified in the configuration in order to be included in the data stream | Supported Metadata Fields | -|------| -|sheetcreatedAt| -|sheetid| -|sheetmodifiedAt| -|sheetname| -|sheetpermalink| -|sheetversion| -|sheetaccess_level| -|row_id| -|row_access_level| -|row_created_at| -|row_created_by| -|row_expanded| -|row_modified_by| -|row_parent_id| -|row_permalink| -|row_number| -|row_version| +| ------------------------- | +| sheetcreatedAt | +| sheetid | +| sheetmodifiedAt | +| sheetname | +| sheetpermalink | +| sheetversion | +| sheetaccess_level | +| row_id | +| row_access_level | +| row_created_at | +| row_created_by | +| row_expanded | +| row_modified_by | +| row_parent_id | +| row_permalink | +| row_number | +| row_version | ## Important highlights + The Smartsheet Source is written to pull data from a single Smartsheet spreadsheet. Unlike Google Sheets, Smartsheets only allows one sheet per Smartsheet - so a given Airbyte connector instance can sync only one sheet at a time. To replicate multiple spreadsheets, you can create multiple instances of the Smartsheet Source in Airbyte, reusing the API token for all your sheets that you need to sync. **Note: Column headers must contain only alphanumeric characters or `_` , as specified in the** [**Airbyte Protocol**](../../understanding-airbyte/airbyte-protocol.md). ## Data type map + The data type mapping adopted by this connector is based on the Smartsheet [documentation](https://smartsheet-platform.github.io/api-docs/index.html?python#column-types). **NOTE**: For any column datatypes interpreted by Smartsheets beside `DATE` and `DATETIME`, this connector's source schema generation assumes a `string` type, in which case the `format` field is not required by Airbyte. -| Integration Type | Airbyte Type | Airbyte Format | -|:-----------------|:-------------|:---------------------| -| `TEXT_NUMBER` | `string` | | -| `DATE` | `string` | `format: date` | -| `DATETIME` | `string` | `format: date-time` | -| `anything else` | `string` | | +| Integration Type | Airbyte Type | Airbyte Format | +| :--------------- | :----------- | :------------------ | +| `TEXT_NUMBER` | `string` | | +| `DATE` | `string` | `format: date` | +| `DATETIME` | `string` | `format: date-time` | +| `anything else` | `string` | | The remaining column datatypes supported by Smartsheets are more complex types (e.g. Predecessor, Dropdown List) and are not supported by this connector beyond its `string` representation. 
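As a quick illustration of the mapping in the table above, the hypothetical helper below shows how a Smartsheet column type could be turned into a JSON-schema property, with everything other than `DATE` and `DATETIME` falling back to a plain string. It is a sketch of the rule the table describes, not the connector's actual code.

```python
# Sketch of the column-type mapping described above; not the connector's implementation.
COLUMN_TYPE_TO_PROPERTY = {
    "DATE": {"type": "string", "format": "date"},
    "DATETIME": {"type": "string", "format": "date-time"},
    "TEXT_NUMBER": {"type": "string"},
}


def column_to_property(smartsheet_type: str) -> dict:
    """Anything beyond DATE/DATETIME (e.g. Predecessor, Dropdown List) falls back to string."""
    return COLUMN_TYPE_TO_PROPERTY.get(smartsheet_type, {"type": "string"})


assert column_to_property("DATETIME") == {"type": "string", "format": "date-time"}
assert column_to_property("PICKLIST") == {"type": "string"}
```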
## Changelog | Version | Date | Pull Request | Subject | -|:--------|:-----------|:---------------------------------------------------------|:----------------------------------------------------------| -| 1.1.2 | 2024-01-08 | [1234](https://github.com/airbytehq/airbyte/pull/1234) | prepare for airbyte-lib | +| :------ | :--------- | :------------------------------------------------------- | :-------------------------------------------------------- | +| 1.1.2 | 2024-01-08 | [1234](https://github.com/airbytehq/airbyte/pull/1234) | prepare for airbyte-lib | | 1.1.1 | 2023-06-06 | [27096](https://github.com/airbytehq/airbyte/pull/27096) | Fix error when optional metadata fields are not set | | 1.1.0 | 2023-06-02 | [22382](https://github.com/airbytehq/airbyte/pull/22382) | Add support for ingesting metadata fields | | 1.0.2 | 2023-05-12 | [26024](https://github.com/airbytehq/airbyte/pull/26024) | Fix dependencies conflict | diff --git a/docs/integrations/sources/snapchat-marketing.md b/docs/integrations/sources/snapchat-marketing.md index 188c019abd3..1cdf641e5d3 100644 --- a/docs/integrations/sources/snapchat-marketing.md +++ b/docs/integrations/sources/snapchat-marketing.md @@ -5,22 +5,24 @@ This page guides you through the process of setting up the Snapchat Marketing so ## Prerequisites + **For Airbyte Cloud:** -* A Snapchat Marketing account with permission to access data from accounts you want to sync +- A Snapchat Marketing account with permission to access data from accounts you want to sync + **For Airbyte Open Source:** -* client_id -* client_secret -* refresh_token -* start_date -* end_date -* action_report_time (Optional, Default value is conversion) It specifies the principle for conversion reporting. -* swipe_up_attribution_window (Optional, Default value is 1_DAY) This is the attribution window for swipe up. -* view_attribution_window (Optional, Default value is 28_DAY) This is the attribution window for views. +- client_id +- client_secret +- refresh_token +- start_date +- end_date +- action_report_time (Optional, Default value is conversion) It specifies the principle for conversion reporting. +- swipe_up_attribution_window (Optional, Default value is 1_DAY) This is the attribution window for swipe up. +- view_attribution_window (Optional, Default value is 28_DAY) This is the attribution window for views. ## Setup guide @@ -30,20 +32,21 @@ This page guides you through the process of setting up the Snapchat Marketing so 1. [Set up Snapchat Business account](https://businesshelp.snapchat.com/s/article/get-started?language=en_US) + **For Airbyte Open Source:** -2. [Activate Access to the Snapchat Marketing API](https://businesshelp.snapchat.com/s/article/api-apply?language=en_US) +2. [Activate Access to the Snapchat Marketing API](https://businesshelp.snapchat.com/s/article/api-apply?language=en_US) 3. Add the OAuth2 app: - * Adding the OAuth2 app requires the `redirect_url` parameter. + - Adding the OAuth2 app requires the `redirect_url` parameter. - If you have the API endpoint that will handle next OAuth process - write it to this parameter. - If not - just use some valid url. Here's the discussion about it: [Snapchat Redirect URL - Clarity in documentation please](https://github.com/Snap-Kit/bitmoji-sample/issues/3) - * save **Client ID** and **Client Secret** + - save **Client ID** and **Client Secret** 4. 
Get refresh token using OAuth2 authentication workflow: - * Open the authorize link in a browser: [https://accounts.snapchat.com/login/oauth2/authorize?response\_type=code&client\_id=CLIENT\_ID&redirect\_uri=REDIRECT\_URI&scope=snapchat-marketing-api&state=wmKkg0TWgppW8PTBZ20sldUmF7hwvU](https://accounts.snapchat.com/login/oauth2/authorize?response_type=code&client_id=CLIENT_ID&redirect_uri=REDIRECT_URI&scope=snapchat-marketing-api&state=wmKkg0TWgppW8PTBZ20sldUmF7hwvU) - * Login & Authorize via UI - * Locate "code" query parameter in the redirect - * Exchange code for access token + refresh token - ```text + - Open the authorize link in a browser: [https://accounts.snapchat.com/login/oauth2/authorize?response_type=code&client_id=CLIENT_ID&redirect_uri=REDIRECT_URI&scope=snapchat-marketing-api&state=wmKkg0TWgppW8PTBZ20sldUmF7hwvU](https://accounts.snapchat.com/login/oauth2/authorize?response_type=code&client_id=CLIENT_ID&redirect_uri=REDIRECT_URI&scope=snapchat-marketing-api&state=wmKkg0TWgppW8PTBZ20sldUmF7hwvU) + - Login & Authorize via UI + - Locate "code" query parameter in the redirect + - Exchange code for access token + refresh token + `text curl -X POST \ -d "code={one_time_use_code}" \ -d "client_id={client_id}" \ @@ -51,14 +54,15 @@ This page guides you through the process of setting up the Snapchat Marketing so -d "grant_type=authorization_code" \ -d "redirect_uri=redirect_uri" https://accounts.snapchat.com/login/oauth2/access_token - ``` -You will receive the API key and refresh token in response. Use this refresh token in the connector specifications. -The useful link to Authentication process is [here](https://marketingapi.snapchat.com/docs/#authentication) - + ` + You will receive the API key and refresh token in response. Use this refresh token in the connector specifications. + The useful link to Authentication process is [here](https://marketingapi.snapchat.com/docs/#authentication) + ### Step 2: Set up the source connector in Airbyte + **For Airbyte Cloud:** 1. [Log into your Airbyte Cloud](https://cloud.airbyte.com/workspaces) account. @@ -71,6 +75,7 @@ The useful link to Authentication process is [here](https://marketingapi.snapcha + **For Airbyte Open Source:** 1. Go to local Airbyte page. @@ -106,7 +111,6 @@ The useful link to Authentication process is [here](https://marketingapi.snapcha | CampaignsStatsDaily | Yes | ["id", "granularity", "start_time"] | | CampaignsStatsLifetime | No | ["id", "granularity"] | - ## Performance considerations Hourly streams can be slowly because they generate a lot of records. @@ -116,7 +120,7 @@ Snapchat Marketing API has limitations to 1000 items per page. ## Changelog | Version | Date | Pull Request | Subject | -|:--------|:-----------|:---------------------------------------------------------|:-------------------------------------------------------------------------------| +| :------ | :--------- | :------------------------------------------------------- | :----------------------------------------------------------------------------- | | 0.6.1 | 2024-04-24 | [36662](https://github.com/airbytehq/airbyte/pull/36662) | Schema descriptions | | 0.6.0 | 2024-04-10 | [30586](https://github.com/airbytehq/airbyte/pull/30586) | Add `attribution_windows`,`action_report_time` as optional configurable params | | 0.5.0 | 2024-03-19 | [36267](https://github.com/airbytehq/airbyte/pull/36267) | Pin airbyte-cdk version to `^0` | @@ -124,8 +128,8 @@ Snapchat Marketing API has limitations to 1000 items per page. 
| 0.3.2 | 2024-02-12 | [35171](https://github.com/airbytehq/airbyte/pull/35171) | Manage dependencies with Poetry. | | 0.3.0 | 2023-05-22 | [26358](https://github.com/airbytehq/airbyte/pull/26358) | Remove deprecated authSpecification in favour of advancedAuth | | 0.2.0 | 2023-05-10 | [25948](https://github.com/airbytehq/airbyte/pull/25948) | Introduce new field in the `Campaigns` stream schema | -| 0.1.16 | 2023-04-20 | [20897](https://github.com/airbytehq/airbyte/pull/20897) | Add missing fields to Basic Stats schema | -| 0.1.15 | 2023-03-02 | [22869](https://github.com/airbytehq/airbyte/pull/22869) | Specified date formatting in specification | +| 0.1.16 | 2023-04-20 | [20897](https://github.com/airbytehq/airbyte/pull/20897) | Add missing fields to Basic Stats schema | +| 0.1.15 | 2023-03-02 | [22869](https://github.com/airbytehq/airbyte/pull/22869) | Specified date formatting in specification | | 0.1.14 | 2023-02-10 | [22808](https://github.com/airbytehq/airbyte/pull/22808) | Enable default `AvailabilityStrategy` | | 0.1.13 | 2023-01-27 | [22023](https://github.com/airbytehq/airbyte/pull/22023) | Set `AvailabilityStrategy` for streams explicitly to `None` | | 0.1.12 | 2023-01-11 | [21267](https://github.com/airbytehq/airbyte/pull/21267) | Fix parse empty error response | @@ -138,5 +142,5 @@ Snapchat Marketing API has limitations to 1000 items per page. | 0.1.4 | 2021-12-07 | [8429](https://github.com/airbytehq/airbyte/pull/8429) | Update titles and descriptions | | 0.1.3 | 2021-11-10 | [7811](https://github.com/airbytehq/airbyte/pull/7811) | Add oauth2.0, fix stream_state | | 0.1.2 | 2021-11-08 | [7499](https://github.com/airbytehq/airbyte/pull/7499) | Remove base-python dependencies | -| 0.1.1 | 2021-07-29 | [5072](https://github.com/airbytehq/airbyte/pull/5072) | Fix bug with incorrect stream\_state value | +| 0.1.1 | 2021-07-29 | [5072](https://github.com/airbytehq/airbyte/pull/5072) | Fix bug with incorrect stream_state value | | 0.1.0 | 2021-07-26 | [4843](https://github.com/airbytehq/airbyte/pull/4843) | Initial release supporting the Snapchat Marketing API | diff --git a/docs/integrations/sources/snowflake.md b/docs/integrations/sources/snowflake.md index 7032ff83d72..e19a7bea954 100644 --- a/docs/integrations/sources/snowflake.md +++ b/docs/integrations/sources/snowflake.md @@ -126,8 +126,8 @@ To read more please check official [Snowflake documentation](https://docs.snowfl ## Changelog | Version | Date | Pull Request | Subject | -|:--------|:-----------|:---------------------------------------------------------|:------------------------------------------------------------------------------------------------------------------------------------------| -| 0.3.1 | 2024-02-13 | [35220](https://github.com/airbytehq/airbyte/pull/35220) | Adopt CDK 0.20.4 | +| :------ | :--------- | :------------------------------------------------------- | :---------------------------------------------------------------------------------------------------------------------------------------- | +| 0.3.1 | 2024-02-13 | [35220](https://github.com/airbytehq/airbyte/pull/35220) | Adopt CDK 0.20.4 | | 0.3.1 | 2024-01-24 | [34453](https://github.com/airbytehq/airbyte/pull/34453) | bump CDK version | | 0.3.0 | 2023-12-18 | [33484](https://github.com/airbytehq/airbyte/pull/33484) | Remove LEGACY state | | 0.2.2 | 2023-10-20 | [31613](https://github.com/airbytehq/airbyte/pull/31613) | Fixed handling of TIMESTAMP_TZ columns. 
upgrade | diff --git a/docs/integrations/sources/sonar-cloud.md b/docs/integrations/sources/sonar-cloud.md index 1e2df4d5ddf..6228d021a1b 100644 --- a/docs/integrations/sources/sonar-cloud.md +++ b/docs/integrations/sources/sonar-cloud.md @@ -6,31 +6,30 @@ This source can sync data from the [Sonar cloud API](https://sonarcloud.io/web_a ## This Source Supports the Following Streams -* components -* issues -* metrics +- components +- issues +- metrics ### Features -| Feature | Supported?\(Yes/No\) | Notes | -| :--- | :--- | :--- | -| Full Refresh Sync | Yes | | -| Incremental Sync | No | | - +| Feature | Supported?\(Yes/No\) | Notes | +| :---------------- | :------------------- | :---- | +| Full Refresh Sync | Yes | | +| Incremental Sync | No | | ## Getting started ### Requirements -* Sonar cloud User Token +- Sonar cloud User Token ## Changelog -| Version | Date | Pull Request | Subject | -| :------ | :--------- | :-------------------------------------------------------- | :----------------------------------------- | -| 0.1.5 | 2024-04-19 | [37262](https://github.com/airbytehq/airbyte/pull/37262) | Updating to 0.80.0 CDK | -| 0.1.4 | 2024-04-18 | [37262](https://github.com/airbytehq/airbyte/pull/37262) | Manage dependencies with Poetry. | -| 0.1.3 | 2024-04-15 | [37262](https://github.com/airbytehq/airbyte/pull/37262) | Base image migration: remove Dockerfile and use the python-connector-base image | -| 0.1.2 | 2024-04-12 | [37262](https://github.com/airbytehq/airbyte/pull/37262) | schema descriptions | -| 0.1.1 | 2023-02-11 l [22868](https://github.com/airbytehq/airbyte/pull/22868) | Specified date formatting in specification | -| 0.1.0 | 2022-10-26 | [#18475](https://github.com/airbytehq/airbyte/pull/18475) | 🎉 New Source: Sonar Cloud API [low-code CDK] | +| Version | Date | Pull Request | Subject | +| :------ | :-------------------------------------------------------------------- | :-------------------------------------------------------- | :------------------------------------------------------------------------------ | +| 0.1.5 | 2024-04-19 | [37262](https://github.com/airbytehq/airbyte/pull/37262) | Updating to 0.80.0 CDK | +| 0.1.4 | 2024-04-18 | [37262](https://github.com/airbytehq/airbyte/pull/37262) | Manage dependencies with Poetry. | +| 0.1.3 | 2024-04-15 | [37262](https://github.com/airbytehq/airbyte/pull/37262) | Base image migration: remove Dockerfile and use the python-connector-base image | +| 0.1.2 | 2024-04-12 | [37262](https://github.com/airbytehq/airbyte/pull/37262) | schema descriptions | +| 0.1.1 | 2023-02-11 l [22868](https://github.com/airbytehq/airbyte/pull/22868) | Specified date formatting in specification | +| 0.1.0 | 2022-10-26 | [#18475](https://github.com/airbytehq/airbyte/pull/18475) | 🎉 New Source: Sonar Cloud API [low-code CDK] | diff --git a/docs/integrations/sources/spacex-api.md b/docs/integrations/sources/spacex-api.md index ee3730f8d5e..4503009b125 100644 --- a/docs/integrations/sources/spacex-api.md +++ b/docs/integrations/sources/spacex-api.md @@ -29,8 +29,8 @@ No prerequisites, but a dummy api_key is required as it enhances security in fut 1. Navigate to the Airbyte Open Source dashboard. 2. Set the name for your source. 3. Enter your `api_key`. -5. Enter your `id` if needed. (Optional) -6. Click **Set up source**. +4. Enter your `id` if needed. (Optional) +5. Click **Set up source**. 
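Because the underlying r-spacex API is public (the `api_key` here is only a dummy), you can preview the data this source will sync with a single request. The URL and field names below are assumptions based on the v4/v5 note later on this page; confirm them against the r-spacex API documentation.

```python
import requests

# Assumed public endpoint for the latest launch (no authentication required);
# verify the URL and field names against the r-spacex API docs.
resp = requests.get("https://api.spacexdata.com/v5/launches/latest", timeout=30)
resp.raise_for_status()
launch = resp.json()
print(launch.get("name"), launch.get("date_utc"))
```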
## Supported sync modes @@ -70,7 +70,7 @@ The SpaceX API has both v4 and v5 for [launches](https://github.com/r-spacex/Spa ## Changelog -| Version | Date | Pull Request | Subject | -| :------ | :--------- | :----------------------------------------------------- | :------------- | +| Version | Date | Pull Request | Subject | +| :------ | :--------- | :------------------------------------------------------- | :------------------------------------------------ | | 0.1.1 | 2023-11-08 | [32202](https://github.com/airbytehq/airbyte/pull/32202) | Adjust schemas to cover all fields in the records | -| 0.1.0 | 2022-10-22 | [Init](https://github.com/airbytehq/airbyte/pull/18311) | Initial commit | +| 0.1.0 | 2022-10-22 | [Init](https://github.com/airbytehq/airbyte/pull/18311) | Initial commit | diff --git a/docs/integrations/sources/spree-commerce.md b/docs/integrations/sources/spree-commerce.md index bd2ad15e292..690ad2c5fd3 100644 --- a/docs/integrations/sources/spree-commerce.md +++ b/docs/integrations/sources/spree-commerce.md @@ -6,8 +6,8 @@ Spree Commerce can run on the MySQL or Postgres databases. You can use Airbyte to sync your Spree Commerce instance by connecting to the underlying database using the appropriate Airbyte connector: -* [MySQL](mysql.md) -* [Postgres](postgres.md) +- [MySQL](mysql.md) +- [Postgres](postgres.md) :::info @@ -18,4 +18,3 @@ Reach out to your service representative or system admin to find the parameters ### Output schema The Spree Commerce schema is described in the [Spree Internals](https://dev-docs.spreecommerce.org/internals/) section of the Spree docs. Otherwise, the schema will follow the rules of the MySQL or Postgres connectors. - diff --git a/docs/integrations/sources/square.md b/docs/integrations/sources/square.md index b071ce93975..bfddf145b08 100644 --- a/docs/integrations/sources/square.md +++ b/docs/integrations/sources/square.md @@ -99,7 +99,7 @@ Exponential [Backoff](https://developer.squareup.com/forums/t/current-square-api ## Changelog | Version | Date | Pull Request | Subject | -|:--------|:-----------|:---------------------------------------------------------|:--------------------------------------------------------------------------| +| :------ | :--------- | :------------------------------------------------------- | :------------------------------------------------------------------------ | | 1.6.2 | 2024-05-03 | [37800](https://github.com/airbytehq/airbyte/pull/37800) | Migrate to Poetry. 
Replace custom components with default classes | | 1.6.1 | 2023-11-07 | [31481](https://github.com/airbytehq/airbyte/pull/31481) | Fix duplicate records for `Payments` and `Refunds` stream | | 1.6.0 | 2023-10-18 | [31115](https://github.com/airbytehq/airbyte/pull/31115) | Add `customer_id` field to `Payments` and `Orders` streams | diff --git a/docs/integrations/sources/statuspage.md b/docs/integrations/sources/statuspage.md index e7314072c1d..d6133d54932 100644 --- a/docs/integrations/sources/statuspage.md +++ b/docs/integrations/sources/statuspage.md @@ -6,20 +6,20 @@ This source can sync data from the [Statuspage.io API](https://developer.statusp ## This Source Supports the Following Streams - * pages - * subscribers - * subscribers_histogram_by_state - * incident_templates - * incidents - * components - * metrics +- pages +- subscribers +- subscribers_histogram_by_state +- incident_templates +- incidents +- components +- metrics ### Features -| Feature | Supported?\(Yes/No\) | Notes | -| :--- | :--- | :--- | -| Full Refresh Sync | Yes | | -| Incremental Sync | No | | +| Feature | Supported?\(Yes/No\) | Notes | +| :---------------- | :------------------- | :---- | +| Full Refresh Sync | Yes | | +| Incremental Sync | No | | ### Performance considerations @@ -29,10 +29,10 @@ Mailjet APIs are under rate limits for the number of API calls allowed per API k ### Requirements -* Statuspage.io API KEY +- Statuspage.io API KEY ## Changelog -| Version | Date | Pull Request | Subject | -| :------ | :--------- | :-------------------------------------------------------- | :----------------------------------------- | -| 0.1.0 | 2022-10-30 | [#18664](https://github.com/airbytehq/airbyte/pull/18664) | 🎉 New Source: Statuspage.io API [low-code CDK] | \ No newline at end of file +| Version | Date | Pull Request | Subject | +| :------ | :--------- | :-------------------------------------------------------- | :---------------------------------------------- | +| 0.1.0 | 2022-10-30 | [#18664](https://github.com/airbytehq/airbyte/pull/18664) | 🎉 New Source: Statuspage.io API [low-code CDK] | diff --git a/docs/integrations/sources/strava.md b/docs/integrations/sources/strava.md index 82bf68efe30..f7501e5789a 100644 --- a/docs/integrations/sources/strava.md +++ b/docs/integrations/sources/strava.md @@ -122,15 +122,15 @@ More information about Strava rate limits and adjustments to those limits can be ## Changelog -| Version | Date | Pull Request | Subject | -| :------ | :--------- | :------------------------------------------------------- | :--------------------------------------------------------- | -| 0.2.4 | 2024-04-19 | [37266](https://github.com/airbytehq/airbyte/pull/37266) | Updating to 0.80.0 CDK | -| 0.2.3 | 2024-04-18 | [37266](https://github.com/airbytehq/airbyte/pull/37266) | Manage dependencies with Poetry. 
| -| 0.2.2 | 2024-04-15 | [37266](https://github.com/airbytehq/airbyte/pull/37266) | Base image migration: remove Dockerfile and use the python-connector-base image | -| 0.2.1 | 2024-04-12 | [37266](https://github.com/airbytehq/airbyte/pull/37266) | schema descriptions | -| 0.2.0 | 2023-10-24 | [31007](https://github.com/airbytehq/airbyte/pull/31007) | Migrate to low-code framework | -| 0.1.4 | 2023-03-23 | [24368](https://github.com/airbytehq/airbyte/pull/24368) | Add date-time format for input | -| 0.1.3 | 2023-03-15 | [24101](https://github.com/airbytehq/airbyte/pull/24101) | certified to beta, fixed spec, fixed SAT, added unit tests | -| 0.1.2 | 2021-12-15 | [8799](https://github.com/airbytehq/airbyte/pull/8799) | Implement OAuth 2.0 support | -| 0.1.1 | 2021-12-06 | [8425](https://github.com/airbytehq/airbyte/pull/8425) | Update title, description fields in spec | -| 0.1.0 | 2021-10-18 | [7151](https://github.com/airbytehq/airbyte/pull/7151) | Initial release supporting Strava API | +| Version | Date | Pull Request | Subject | +| :------ | :--------- | :------------------------------------------------------- | :------------------------------------------------------------------------------ | +| 0.2.4 | 2024-04-19 | [37266](https://github.com/airbytehq/airbyte/pull/37266) | Updating to 0.80.0 CDK | +| 0.2.3 | 2024-04-18 | [37266](https://github.com/airbytehq/airbyte/pull/37266) | Manage dependencies with Poetry. | +| 0.2.2 | 2024-04-15 | [37266](https://github.com/airbytehq/airbyte/pull/37266) | Base image migration: remove Dockerfile and use the python-connector-base image | +| 0.2.1 | 2024-04-12 | [37266](https://github.com/airbytehq/airbyte/pull/37266) | schema descriptions | +| 0.2.0 | 2023-10-24 | [31007](https://github.com/airbytehq/airbyte/pull/31007) | Migrate to low-code framework | +| 0.1.4 | 2023-03-23 | [24368](https://github.com/airbytehq/airbyte/pull/24368) | Add date-time format for input | +| 0.1.3 | 2023-03-15 | [24101](https://github.com/airbytehq/airbyte/pull/24101) | certified to beta, fixed spec, fixed SAT, added unit tests | +| 0.1.2 | 2021-12-15 | [8799](https://github.com/airbytehq/airbyte/pull/8799) | Implement OAuth 2.0 support | +| 0.1.1 | 2021-12-06 | [8425](https://github.com/airbytehq/airbyte/pull/8425) | Update title, description fields in spec | +| 0.1.0 | 2021-10-18 | [7151](https://github.com/airbytehq/airbyte/pull/7151) | Initial release supporting Strava API | diff --git a/docs/integrations/sources/stripe-migrations.md b/docs/integrations/sources/stripe-migrations.md index 60f4be4d4ab..3a23fd6a92f 100644 --- a/docs/integrations/sources/stripe-migrations.md +++ b/docs/integrations/sources/stripe-migrations.md @@ -3,10 +3,11 @@ ## Upgrading to 5.0.0 This change fixes multiple incremental sync issues with the `Refunds`, `Checkout Sessions` and `Checkout Sessions Line Items` streams: - - `Refunds` stream was not syncing data in the incremental sync mode. Cursor field has been updated to "created" to allow for incremental syncs. Because of the changed cursor field of the `Refunds` stream, incremental syncs will not reflect every update of the records that have been previously replicated. Only newly created records will be synced. To always have the up-to-date data, users are encouraged to make use of the lookback window. - - `CheckoutSessions` stream had been missing data for one day when using the incremental sync mode after a reset; this has been resolved. - - `CheckoutSessionsLineItems` previously had potential data loss. 
It has been updated to use a new cursor field `checkout_session_updated`. - - Incremental streams with the `created` cursor had been duplicating some data; this has been fixed. + +- `Refunds` stream was not syncing data in the incremental sync mode. Cursor field has been updated to "created" to allow for incremental syncs. Because of the changed cursor field of the `Refunds` stream, incremental syncs will not reflect every update of the records that have been previously replicated. Only newly created records will be synced. To always have the up-to-date data, users are encouraged to make use of the lookback window. +- `CheckoutSessions` stream had been missing data for one day when using the incremental sync mode after a reset; this has been resolved. +- `CheckoutSessionsLineItems` previously had potential data loss. It has been updated to use a new cursor field `checkout_session_updated`. +- Incremental streams with the `created` cursor had been duplicating some data; this has been fixed. Stream schema update is a breaking change as well as changing the cursor field for the `Refunds` and the `CheckoutSessionsLineItems` stream. A schema refresh and data reset of all effected streams is required after the update is applied. @@ -18,4 +19,4 @@ Because of the changed cursor field of the `Refunds` stream, incremental syncs w ## Upgrading to 4.0.0 A major update of most streams to support event-based incremental sync mode. This allows the connector to pull not only the newly created data since the last sync, but the modified data as well. -A schema refresh is required for the connector to use the new cursor format. \ No newline at end of file +A schema refresh is required for the connector to use the new cursor format. diff --git a/docs/integrations/sources/stripe.md b/docs/integrations/sources/stripe.md index af13144d800..bad3dddc681 100644 --- a/docs/integrations/sources/stripe.md +++ b/docs/integrations/sources/stripe.md @@ -116,10 +116,6 @@ The Stripe source connector supports the following streams: - [Transfer Reversals](https://stripe.com/docs/api/transfer_reversals/list) - [Usage Records](https://stripe.com/docs/api/usage_records/subscription_item_summary_list) - - - - ### Data type mapping The [Stripe API](https://stripe.com/docs/api) uses the same [JSON Schema](https://json-schema.org/understanding-json-schema/reference/index.html) types that Airbyte uses internally \(`string`, `date-time`, `object`, `array`, `boolean`, `integer`, and `number`\), so no type conversions are performed for the Stripe connector. @@ -146,6 +142,7 @@ Please be aware: this also means that any change older than 30 days will not be Since the Stripe API does not allow querying objects which were updated since the last sync, the Stripe connector uses the Events API under the hood to implement incremental syncs and export data based on its update date. However, not all the entities are supported by the Events API, so the Stripe connector uses the `created` field or its analogue to query for new data in your Stripe account. 
These are the entities synced based on the date of creation: + - `Balance Transactions` - `Events` - `File Links` @@ -199,6 +196,7 @@ On the other hand, the following streams use the `updated` field value as a curs ## Incremental deletes The Stripe API also provides a way to implement incremental deletes for a limited number of streams: + - `Bank Accounts` - `Coupons` - `Customers` @@ -213,7 +211,8 @@ The Stripe API also provides a way to implement incremental deletes for a limite - `Subscriptions` Each record is marked with `is_deleted` flag when the appropriate event happens upstream. -* Check out common troubleshooting issues for the Stripe source connector on our [Airbyte Forum](https://github.com/airbytehq/airbyte/discussions). + +- Check out common troubleshooting issues for the Stripe source connector on our [Airbyte Forum](https://github.com/airbytehq/airbyte/discussions). ### Data type mapping @@ -221,109 +220,110 @@ Each record is marked with `is_deleted` flag when the appropriate event happens ## Changelog -| Version | Date | Pull Request | Subject | -|:--------|:-----------|:------------------------------------------------------------|:------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------| -| 5.3.7 | 2024-04-24 | [36663](https://github.com/airbytehq/airbyte/pull/36663) | Schema descriptions | -| 5.3.6 | 2024-04-18 | [37448](https://github.com/airbytehq/airbyte/pull/37448) | Ensure AirbyteTracedException in concurrent CDK are emitted with the right type | -| 5.3.5 | 2024-04-18 | [37418](https://github.com/airbytehq/airbyte/pull/37418) | Ensure python return code != 0 in case of error | -| 5.3.4 | 2024-04-11 | [37406](https://github.com/airbytehq/airbyte/pull/37406) | Update CDK version to have partitioned state fix | -| 5.3.3 | 2024-04-11 | [37001](https://github.com/airbytehq/airbyte/pull/37001) | Update airbyte-cdk to flush print buffer for every message | -| 5.3.2 | 2024-04-11 | [36964](https://github.com/airbytehq/airbyte/pull/36964) | Update CDK version to fix breaking change before another devs work on it | -| 5.3.1 | 2024-04-10 | [36960](https://github.com/airbytehq/airbyte/pull/36960) | Remove unused imports | -| 5.3.0 | 2024-03-12 | [35978](https://github.com/airbytehq/airbyte/pull/35978) | Upgrade CDK to start emitting record counts with state and full refresh state | -| 5.2.4 | 2024-02-12 | [35137](https://github.com/airbytehq/airbyte/pull/35137) | Fix license in `pyproject.toml` | -| 5.2.3 | 2024-02-09 | [35068](https://github.com/airbytehq/airbyte/pull/35068) | Manage dependencies with Poetry. | -| 5.2.2 | 2024-01-31 | [34619](https://github.com/airbytehq/airbyte/pull/34619) | Events stream concurrent on incremental syncs | -| 5.2.1 | 2024-01-18 | [34495](https://github.com/airbytehq/airbyte/pull/34495) | Fix deadlock issue | -| 5.2.0 | 2024-01-18 | [34347](https://github.com/airbytehq/airbyte/pull//34347) | Add new fields invoices and subscription streams. Upgrade the CDK for better memory usage. 
| -| 5.1.3 | 2023-12-18 | [33306](https://github.com/airbytehq/airbyte/pull/33306/) | Adding integration tests | -| 5.1.2 | 2024-01-04 | [33414](https://github.com/airbytehq/airbyte/pull/33414) | Prepare for airbyte-lib | -| 5.1.1 | 2024-01-04 | [33926](https://github.com/airbytehq/airbyte/pull/33926/) | Update endpoint for `bank_accounts` stream | -| 5.1.0 | 2023-12-11 | [32908](https://github.com/airbytehq/airbyte/pull/32908/) | Read full refresh streams concurrently | -| 5.0.2 | 2023-12-01 | [33038](https://github.com/airbytehq/airbyte/pull/33038) | Add stream slice logging for SubStream | -| 5.0.1 | 2023-11-17 | [32638](https://github.com/airbytehq/airbyte/pull/32638/) | Availability stretegy: check availability of both endpoints (if applicable) - common API + events API | -| 5.0.0 | 2023-11-16 | [32286](https://github.com/airbytehq/airbyte/pull/32286/) | Fix multiple issues regarding usage of the incremental sync mode for the `Refunds`, `CheckoutSessions`, `CheckoutSessionsLineItems` streams. Fix schemas for the streams: `Invoices`, `Subscriptions`, `SubscriptionSchedule` | -| 4.5.4 | 2023-11-16 | [32284](https://github.com/airbytehq/airbyte/pull/32284/) | Enable client-side rate limiting | -| 4.5.3 | 2023-11-14 | [32473](https://github.com/airbytehq/airbyte/pull/32473/) | Have all full_refresh stream syncs be concurrent | -| 4.5.2 | 2023-11-03 | [32146](https://github.com/airbytehq/airbyte/pull/32146/) | Fix multiple BankAccount issues | -| 4.5.1 | 2023-11-01 | [32056](https://github.com/airbytehq/airbyte/pull/32056/) | Use CDK version 0.52.8 | -| 4.5.0 | 2023-10-25 | [31327](https://github.com/airbytehq/airbyte/pull/31327/) | Use concurrent CDK when running in full-refresh | -| 4.4.2 | 2023-10-24 | [31764](https://github.com/airbytehq/airbyte/pull/31764) | Base image migration: remove Dockerfile and use the python-connector-base image | -| 4.4.1 | 2023-10-18 | [31553](https://github.com/airbytehq/airbyte/pull/31553) | Adjusted `Setup Attempts` and extended `Checkout Sessions` stream schemas | -| 4.4.0 | 2023-10-04 | [31046](https://github.com/airbytehq/airbyte/pull/31046) | Added margins field to invoice_line_items stream. 
| -| 4.3.1 | 2023-09-27 | [30800](https://github.com/airbytehq/airbyte/pull/30800) | Handle permission issues a non breaking | -| 4.3.0 | 2023-09-26 | [30752](https://github.com/airbytehq/airbyte/pull/30752) | Do not sync upcoming invoices, extend stream schemas | -| 4.2.0 | 2023-09-21 | [30660](https://github.com/airbytehq/airbyte/pull/30660) | Fix updated state for the incremental syncs | -| 4.1.1 | 2023-09-15 | [30494](https://github.com/airbytehq/airbyte/pull/30494) | Fix datatype of invoices.lines property | -| 4.1.0 | 2023-08-29 | [29950](https://github.com/airbytehq/airbyte/pull/29950) | Implement incremental deletes, add suggested streams | -| 4.0.1 | 2023-09-07 | [30254](https://github.com/airbytehq/airbyte/pull/30254) | Fix cursorless incremental streams | -| 4.0.0 | 2023-08-15 | [29330](https://github.com/airbytehq/airbyte/pull/29330) | Implement incremental syncs based on date of update | -| 3.17.4 | 2023-08-15 | [29425](https://github.com/airbytehq/airbyte/pull/29425) | Revert 3.17.3 | -| 3.17.3 | 2023-08-01 | [28911](https://github.com/airbytehq/airbyte/pull/28911) | Revert 3.17.2 and fix atm_fee property | -| 3.17.2 | 2023-08-01 | [28911](https://github.com/airbytehq/airbyte/pull/28911) | Fix stream schemas, remove custom 403 error handling | -| 3.17.1 | 2023-08-01 | [28887](https://github.com/airbytehq/airbyte/pull/28887) | Fix `Invoices` schema | -| 3.17.0 | 2023-07-28 | [26127](https://github.com/airbytehq/airbyte/pull/26127) | Add `Prices` stream | -| 3.16.0 | 2023-07-27 | [28776](https://github.com/airbytehq/airbyte/pull/28776) | Add new fields to stream schemas | -| 3.15.0 | 2023-07-09 | [28709](https://github.com/airbytehq/airbyte/pull/28709) | Remove duplicate streams | -| 3.14.0 | 2023-07-09 | [27217](https://github.com/airbytehq/airbyte/pull/27217) | Add `ShippingRates` stream | -| 3.13.0 | 2023-07-18 | [28466](https://github.com/airbytehq/airbyte/pull/28466) | Pin source API version | -| 3.12.0 | 2023-05-20 | [26208](https://github.com/airbytehq/airbyte/pull/26208) | Add new stream `Persons` | -| 3.11.0 | 2023-06-26 | [27734](https://github.com/airbytehq/airbyte/pull/27734) | License Update: Elv2 stream | -| 3.10.0 | 2023-06-22 | [27132](https://github.com/airbytehq/airbyte/pull/27132) | Add `CreditNotes` stream | -| 3.9.1 | 2023-06-20 | [27522](https://github.com/airbytehq/airbyte/pull/27522) | Fix formatting | -| 3.9.0 | 2023-06-19 | [27362](https://github.com/airbytehq/airbyte/pull/27362) | Add new Streams: Transfer Reversals, Setup Attempts, Usage Records, Transactions | -| 3.8.0 | 2023-06-12 | [27238](https://github.com/airbytehq/airbyte/pull/27238) | Add `Topups` stream; Add `Files` stream; Add `FileLinks` stream | -| 3.7.0 | 2023-06-06 | [27083](https://github.com/airbytehq/airbyte/pull/27083) | Add new Streams: Authorizations, Cardholders, Cards, Payment Methods, Reviews | -| 3.6.0 | 2023-05-24 | [25893](https://github.com/airbytehq/airbyte/pull/25893) | Add `ApplicationFeesRefunds` stream with parent `ApplicationFees` | -| 3.5.0 | 2023-05-20 | [22859](https://github.com/airbytehq/airbyte/pull/22859) | Add stream `Early Fraud Warnings` | -| 3.4.3 | 2023-05-10 | [25965](https://github.com/airbytehq/airbyte/pull/25965) | Fix Airbyte date-time data-types | -| 3.4.2 | 2023-05-04 | [25795](https://github.com/airbytehq/airbyte/pull/25795) | Added `CDK TypeTransformer` to guarantee declared JSON Schema data-types | -| 3.4.1 | 2023-04-24 | [23389](https://github.com/airbytehq/airbyte/pull/23389) | Add `customer_tax_ids` to `Invoices` | -| 3.4.0 | 2023-03-20 | 
[23963](https://github.com/airbytehq/airbyte/pull/23963) | Add `SetupIntents` stream | -| 3.3.0 | 2023-04-12 | [25136](https://github.com/airbytehq/airbyte/pull/25136) | Add stream `Accounts` | -| 3.2.0 | 2023-04-10 | [23624](https://github.com/airbytehq/airbyte/pull/23624) | Add new stream `Subscription Schedule` | -| 3.1.0 | 2023-03-10 | [19906](https://github.com/airbytehq/airbyte/pull/19906) | Expand `tiers` when syncing `Plans` streams | -| 3.0.5 | 2023-03-25 | [22866](https://github.com/airbytehq/airbyte/pull/22866) | Specified date formatting in specification | -| 3.0.4 | 2023-03-24 | [24471](https://github.com/airbytehq/airbyte/pull/24471) | Fix stream slices for single sliced streams | -| 3.0.3 | 2023-03-17 | [24179](https://github.com/airbytehq/airbyte/pull/24179) | Get customer's attributes safely | -| 3.0.2 | 2023-03-13 | [24051](https://github.com/airbytehq/airbyte/pull/24051) | Cache `customers` stream; Do not request transactions of customers with zero balance. | -| 3.0.1 | 2023-02-22 | [22898](https://github.com/airbytehq/airbyte/pull/22898) | Add missing column to Subscriptions stream | -| 3.0.0 | 2023-02-21 | [23295](https://github.com/airbytehq/airbyte/pull/23295) | Fix invoice schema | -| 2.0.0 | 2023-02-14 | [22312](https://github.com/airbytehq/airbyte/pull/22312) | Another fix of `Invoices` stream schema + Remove http urls from openapi_spec.json | -| 1.0.2 | 2023-02-09 | [22659](https://github.com/airbytehq/airbyte/pull/22659) | Set `AvailabilityStrategy` for all streams | -| 1.0.1 | 2023-01-27 | [22042](https://github.com/airbytehq/airbyte/pull/22042) | Set `AvailabilityStrategy` for streams explicitly to `None` | -| 1.0.0 | 2023-01-25 | [21858](https://github.com/airbytehq/airbyte/pull/21858) | Update the `Subscriptions` and `Invoices` stream schemas | -| 0.1.40 | 2022-10-20 | [18228](https://github.com/airbytehq/airbyte/pull/18228) | Update the `PaymentIntents` stream schema | -| 0.1.39 | 2022-09-28 | [17304](https://github.com/airbytehq/airbyte/pull/17304) | Migrate to per-stream states. 
| -| 0.1.38 | 2022-09-09 | [16537](https://github.com/airbytehq/airbyte/pull/16537) | Fix `redeem_by` field type for `customers` stream | -| 0.1.37 | 2022-08-16 | [15686](https://github.com/airbytehq/airbyte/pull/15686) | Fix the bug when the stream couldn't be fetched due to limited permission set, if so - it should be skipped | -| 0.1.36 | 2022-08-04 | [15292](https://github.com/airbytehq/airbyte/pull/15292) | Implement slicing | -| 0.1.35 | 2022-07-21 | [14924](https://github.com/airbytehq/airbyte/pull/14924) | Remove `additionalProperties` field from spec and schema | -| 0.1.34 | 2022-07-01 | [14357](https://github.com/airbytehq/airbyte/pull/14357) | Add external account streams - | -| 0.1.33 | 2022-06-06 | [13449](https://github.com/airbytehq/airbyte/pull/13449) | Add semi-incremental support for CheckoutSessions and CheckoutSessionsLineItems streams, fixed big in StripeSubStream, added unittests, updated docs | -| 0.1.32 | 2022-04-30 | [12500](https://github.com/airbytehq/airbyte/pull/12500) | Improve input configuration copy | -| 0.1.31 | 2022-04-20 | [12230](https://github.com/airbytehq/airbyte/pull/12230) | Update connector to use a `spec.yaml` | -| 0.1.30 | 2022-03-21 | [11286](https://github.com/airbytehq/airbyte/pull/11286) | Minor corrections to documentation and connector specification | -| 0.1.29 | 2022-03-08 | [10359](https://github.com/airbytehq/airbyte/pull/10359) | Improved performance for streams with substreams: invoice_line_items, subscription_items, bank_accounts | -| 0.1.28 | 2022-02-08 | [10165](https://github.com/airbytehq/airbyte/pull/10165) | Improve 404 handling for `CheckoutSessionsLineItems` stream | -| 0.1.27 | 2021-12-28 | [9148](https://github.com/airbytehq/airbyte/pull/9148) | Fix `date`, `arrival\_date` fields | -| 0.1.26 | 2021-12-21 | [8992](https://github.com/airbytehq/airbyte/pull/8992) | Fix type `events.request` in schema | -| 0.1.25 | 2021-11-25 | [8250](https://github.com/airbytehq/airbyte/pull/8250) | Rearrange setup fields | -| 0.1.24 | 2021-11-08 | [7729](https://github.com/airbytehq/airbyte/pull/7729) | Include tax data in `checkout_sessions_line_items` stream | -| 0.1.23 | 2021-11-08 | [7729](https://github.com/airbytehq/airbyte/pull/7729) | Correct `payment_intents` schema | -| 0.1.22 | 2021-11-05 | [7345](https://github.com/airbytehq/airbyte/pull/7345) | Add 3 new streams | -| 0.1.21 | 2021-10-07 | [6841](https://github.com/airbytehq/airbyte/pull/6841) | Fix missing `start_date` argument + update json files for SAT | -| 0.1.20 | 2021-09-30 | [6017](https://github.com/airbytehq/airbyte/pull/6017) | Add lookback_window_days parameter | -| 0.1.19 | 2021-09-27 | [6466](https://github.com/airbytehq/airbyte/pull/6466) | Use `start_date` parameter in incremental streams | -| 0.1.18 | 2021-09-14 | [6004](https://github.com/airbytehq/airbyte/pull/6004) | Fix coupons and subscriptions stream schemas by removing incorrect timestamp formatting | -| 0.1.17 | 2021-09-14 | [6004](https://github.com/airbytehq/airbyte/pull/6004) | Add `PaymentIntents` stream | -| 0.1.16 | 2021-07-28 | [4980](https://github.com/airbytehq/airbyte/pull/4980) | Remove Updated field from schemas | -| 0.1.15 | 2021-07-21 | [4878](https://github.com/airbytehq/airbyte/pull/4878) | Fix incorrect percent_off and discounts data filed types | -| 0.1.14 | 2021-07-09 | [4669](https://github.com/airbytehq/airbyte/pull/4669) | Subscriptions Stream now returns all kinds of subscriptions \(including expired and canceled\) | -| 0.1.13 | 2021-07-03 | 
[4528](https://github.com/airbytehq/airbyte/pull/4528) | Remove regex for acc validation | -| 0.1.12 | 2021-06-08 | [3973](https://github.com/airbytehq/airbyte/pull/3973) | Add `AIRBYTE_ENTRYPOINT` for Kubernetes support | -| 0.1.11 | 2021-05-30 | [3744](https://github.com/airbytehq/airbyte/pull/3744) | Fix types in schema | -| 0.1.10 | 2021-05-28 | [3728](https://github.com/airbytehq/airbyte/pull/3728) | Update data types to be number instead of int | -| 0.1.9 | 2021-05-13 | [3367](https://github.com/airbytehq/airbyte/pull/3367) | Add acceptance tests for connected accounts | -| 0.1.8 | 2021-05-11 | [3566](https://github.com/airbytehq/airbyte/pull/3368) | Bump CDK connectors | +| Version | Date | Pull Request | Subject | +| :------ | :--------- | :-------------------------------------------------------- | :---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------- | +| 5.3.7 | 2024-04-24 | [36663](https://github.com/airbytehq/airbyte/pull/36663) | Schema descriptions | +| 5.3.6 | 2024-04-18 | [37448](https://github.com/airbytehq/airbyte/pull/37448) | Ensure AirbyteTracedException in concurrent CDK are emitted with the right type | +| 5.3.5 | 2024-04-18 | [37418](https://github.com/airbytehq/airbyte/pull/37418) | Ensure python return code != 0 in case of error | +| 5.3.4 | 2024-04-11 | [37406](https://github.com/airbytehq/airbyte/pull/37406) | Update CDK version to have partitioned state fix | +| 5.3.3 | 2024-04-11 | [37001](https://github.com/airbytehq/airbyte/pull/37001) | Update airbyte-cdk to flush print buffer for every message | +| 5.3.2 | 2024-04-11 | [36964](https://github.com/airbytehq/airbyte/pull/36964) | Update CDK version to fix breaking change before another devs work on it | +| 5.3.1 | 2024-04-10 | [36960](https://github.com/airbytehq/airbyte/pull/36960) | Remove unused imports | +| 5.3.0 | 2024-03-12 | [35978](https://github.com/airbytehq/airbyte/pull/35978) | Upgrade CDK to start emitting record counts with state and full refresh state | +| 5.2.4 | 2024-02-12 | [35137](https://github.com/airbytehq/airbyte/pull/35137) | Fix license in `pyproject.toml` | +| 5.2.3 | 2024-02-09 | [35068](https://github.com/airbytehq/airbyte/pull/35068) | Manage dependencies with Poetry. | +| 5.2.2 | 2024-01-31 | [34619](https://github.com/airbytehq/airbyte/pull/34619) | Events stream concurrent on incremental syncs | +| 5.2.1 | 2024-01-18 | [34495](https://github.com/airbytehq/airbyte/pull/34495) | Fix deadlock issue | +| 5.2.0 | 2024-01-18 | [34347](https://github.com/airbytehq/airbyte/pull//34347) | Add new fields invoices and subscription streams. Upgrade the CDK for better memory usage. 
| +| 5.1.3 | 2023-12-18 | [33306](https://github.com/airbytehq/airbyte/pull/33306/) | Adding integration tests | +| 5.1.2 | 2024-01-04 | [33414](https://github.com/airbytehq/airbyte/pull/33414) | Prepare for airbyte-lib | +| 5.1.1 | 2024-01-04 | [33926](https://github.com/airbytehq/airbyte/pull/33926/) | Update endpoint for `bank_accounts` stream | +| 5.1.0 | 2023-12-11 | [32908](https://github.com/airbytehq/airbyte/pull/32908/) | Read full refresh streams concurrently | +| 5.0.2 | 2023-12-01 | [33038](https://github.com/airbytehq/airbyte/pull/33038) | Add stream slice logging for SubStream | +| 5.0.1 | 2023-11-17 | [32638](https://github.com/airbytehq/airbyte/pull/32638/) | Availability stretegy: check availability of both endpoints (if applicable) - common API + events API | +| 5.0.0 | 2023-11-16 | [32286](https://github.com/airbytehq/airbyte/pull/32286/) | Fix multiple issues regarding usage of the incremental sync mode for the `Refunds`, `CheckoutSessions`, `CheckoutSessionsLineItems` streams. Fix schemas for the streams: `Invoices`, `Subscriptions`, `SubscriptionSchedule` | +| 4.5.4 | 2023-11-16 | [32284](https://github.com/airbytehq/airbyte/pull/32284/) | Enable client-side rate limiting | +| 4.5.3 | 2023-11-14 | [32473](https://github.com/airbytehq/airbyte/pull/32473/) | Have all full_refresh stream syncs be concurrent | +| 4.5.2 | 2023-11-03 | [32146](https://github.com/airbytehq/airbyte/pull/32146/) | Fix multiple BankAccount issues | +| 4.5.1 | 2023-11-01 | [32056](https://github.com/airbytehq/airbyte/pull/32056/) | Use CDK version 0.52.8 | +| 4.5.0 | 2023-10-25 | [31327](https://github.com/airbytehq/airbyte/pull/31327/) | Use concurrent CDK when running in full-refresh | +| 4.4.2 | 2023-10-24 | [31764](https://github.com/airbytehq/airbyte/pull/31764) | Base image migration: remove Dockerfile and use the python-connector-base image | +| 4.4.1 | 2023-10-18 | [31553](https://github.com/airbytehq/airbyte/pull/31553) | Adjusted `Setup Attempts` and extended `Checkout Sessions` stream schemas | +| 4.4.0 | 2023-10-04 | [31046](https://github.com/airbytehq/airbyte/pull/31046) | Added margins field to invoice_line_items stream. 
| +| 4.3.1 | 2023-09-27 | [30800](https://github.com/airbytehq/airbyte/pull/30800) | Handle permission issues a non breaking | +| 4.3.0 | 2023-09-26 | [30752](https://github.com/airbytehq/airbyte/pull/30752) | Do not sync upcoming invoices, extend stream schemas | +| 4.2.0 | 2023-09-21 | [30660](https://github.com/airbytehq/airbyte/pull/30660) | Fix updated state for the incremental syncs | +| 4.1.1 | 2023-09-15 | [30494](https://github.com/airbytehq/airbyte/pull/30494) | Fix datatype of invoices.lines property | +| 4.1.0 | 2023-08-29 | [29950](https://github.com/airbytehq/airbyte/pull/29950) | Implement incremental deletes, add suggested streams | +| 4.0.1 | 2023-09-07 | [30254](https://github.com/airbytehq/airbyte/pull/30254) | Fix cursorless incremental streams | +| 4.0.0 | 2023-08-15 | [29330](https://github.com/airbytehq/airbyte/pull/29330) | Implement incremental syncs based on date of update | +| 3.17.4 | 2023-08-15 | [29425](https://github.com/airbytehq/airbyte/pull/29425) | Revert 3.17.3 | +| 3.17.3 | 2023-08-01 | [28911](https://github.com/airbytehq/airbyte/pull/28911) | Revert 3.17.2 and fix atm_fee property | +| 3.17.2 | 2023-08-01 | [28911](https://github.com/airbytehq/airbyte/pull/28911) | Fix stream schemas, remove custom 403 error handling | +| 3.17.1 | 2023-08-01 | [28887](https://github.com/airbytehq/airbyte/pull/28887) | Fix `Invoices` schema | +| 3.17.0 | 2023-07-28 | [26127](https://github.com/airbytehq/airbyte/pull/26127) | Add `Prices` stream | +| 3.16.0 | 2023-07-27 | [28776](https://github.com/airbytehq/airbyte/pull/28776) | Add new fields to stream schemas | +| 3.15.0 | 2023-07-09 | [28709](https://github.com/airbytehq/airbyte/pull/28709) | Remove duplicate streams | +| 3.14.0 | 2023-07-09 | [27217](https://github.com/airbytehq/airbyte/pull/27217) | Add `ShippingRates` stream | +| 3.13.0 | 2023-07-18 | [28466](https://github.com/airbytehq/airbyte/pull/28466) | Pin source API version | +| 3.12.0 | 2023-05-20 | [26208](https://github.com/airbytehq/airbyte/pull/26208) | Add new stream `Persons` | +| 3.11.0 | 2023-06-26 | [27734](https://github.com/airbytehq/airbyte/pull/27734) | License Update: Elv2 stream | +| 3.10.0 | 2023-06-22 | [27132](https://github.com/airbytehq/airbyte/pull/27132) | Add `CreditNotes` stream | +| 3.9.1 | 2023-06-20 | [27522](https://github.com/airbytehq/airbyte/pull/27522) | Fix formatting | +| 3.9.0 | 2023-06-19 | [27362](https://github.com/airbytehq/airbyte/pull/27362) | Add new Streams: Transfer Reversals, Setup Attempts, Usage Records, Transactions | +| 3.8.0 | 2023-06-12 | [27238](https://github.com/airbytehq/airbyte/pull/27238) | Add `Topups` stream; Add `Files` stream; Add `FileLinks` stream | +| 3.7.0 | 2023-06-06 | [27083](https://github.com/airbytehq/airbyte/pull/27083) | Add new Streams: Authorizations, Cardholders, Cards, Payment Methods, Reviews | +| 3.6.0 | 2023-05-24 | [25893](https://github.com/airbytehq/airbyte/pull/25893) | Add `ApplicationFeesRefunds` stream with parent `ApplicationFees` | +| 3.5.0 | 2023-05-20 | [22859](https://github.com/airbytehq/airbyte/pull/22859) | Add stream `Early Fraud Warnings` | +| 3.4.3 | 2023-05-10 | [25965](https://github.com/airbytehq/airbyte/pull/25965) | Fix Airbyte date-time data-types | +| 3.4.2 | 2023-05-04 | [25795](https://github.com/airbytehq/airbyte/pull/25795) | Added `CDK TypeTransformer` to guarantee declared JSON Schema data-types | +| 3.4.1 | 2023-04-24 | [23389](https://github.com/airbytehq/airbyte/pull/23389) | Add `customer_tax_ids` to `Invoices` | +| 3.4.0 | 2023-03-20 | 
[23963](https://github.com/airbytehq/airbyte/pull/23963) | Add `SetupIntents` stream | +| 3.3.0 | 2023-04-12 | [25136](https://github.com/airbytehq/airbyte/pull/25136) | Add stream `Accounts` | +| 3.2.0 | 2023-04-10 | [23624](https://github.com/airbytehq/airbyte/pull/23624) | Add new stream `Subscription Schedule` | +| 3.1.0 | 2023-03-10 | [19906](https://github.com/airbytehq/airbyte/pull/19906) | Expand `tiers` when syncing `Plans` streams | +| 3.0.5 | 2023-03-25 | [22866](https://github.com/airbytehq/airbyte/pull/22866) | Specified date formatting in specification | +| 3.0.4 | 2023-03-24 | [24471](https://github.com/airbytehq/airbyte/pull/24471) | Fix stream slices for single sliced streams | +| 3.0.3 | 2023-03-17 | [24179](https://github.com/airbytehq/airbyte/pull/24179) | Get customer's attributes safely | +| 3.0.2 | 2023-03-13 | [24051](https://github.com/airbytehq/airbyte/pull/24051) | Cache `customers` stream; Do not request transactions of customers with zero balance. | +| 3.0.1 | 2023-02-22 | [22898](https://github.com/airbytehq/airbyte/pull/22898) | Add missing column to Subscriptions stream | +| 3.0.0 | 2023-02-21 | [23295](https://github.com/airbytehq/airbyte/pull/23295) | Fix invoice schema | +| 2.0.0 | 2023-02-14 | [22312](https://github.com/airbytehq/airbyte/pull/22312) | Another fix of `Invoices` stream schema + Remove http urls from openapi_spec.json | +| 1.0.2 | 2023-02-09 | [22659](https://github.com/airbytehq/airbyte/pull/22659) | Set `AvailabilityStrategy` for all streams | +| 1.0.1 | 2023-01-27 | [22042](https://github.com/airbytehq/airbyte/pull/22042) | Set `AvailabilityStrategy` for streams explicitly to `None` | +| 1.0.0 | 2023-01-25 | [21858](https://github.com/airbytehq/airbyte/pull/21858) | Update the `Subscriptions` and `Invoices` stream schemas | +| 0.1.40 | 2022-10-20 | [18228](https://github.com/airbytehq/airbyte/pull/18228) | Update the `PaymentIntents` stream schema | +| 0.1.39 | 2022-09-28 | [17304](https://github.com/airbytehq/airbyte/pull/17304) | Migrate to per-stream states. 
| +| 0.1.38 | 2022-09-09 | [16537](https://github.com/airbytehq/airbyte/pull/16537) | Fix `redeem_by` field type for `customers` stream | +| 0.1.37 | 2022-08-16 | [15686](https://github.com/airbytehq/airbyte/pull/15686) | Fix the bug when the stream couldn't be fetched due to limited permission set, if so - it should be skipped | +| 0.1.36 | 2022-08-04 | [15292](https://github.com/airbytehq/airbyte/pull/15292) | Implement slicing | +| 0.1.35 | 2022-07-21 | [14924](https://github.com/airbytehq/airbyte/pull/14924) | Remove `additionalProperties` field from spec and schema | +| 0.1.34 | 2022-07-01 | [14357](https://github.com/airbytehq/airbyte/pull/14357) | Add external account streams - | +| 0.1.33 | 2022-06-06 | [13449](https://github.com/airbytehq/airbyte/pull/13449) | Add semi-incremental support for CheckoutSessions and CheckoutSessionsLineItems streams, fixed big in StripeSubStream, added unittests, updated docs | +| 0.1.32 | 2022-04-30 | [12500](https://github.com/airbytehq/airbyte/pull/12500) | Improve input configuration copy | +| 0.1.31 | 2022-04-20 | [12230](https://github.com/airbytehq/airbyte/pull/12230) | Update connector to use a `spec.yaml` | +| 0.1.30 | 2022-03-21 | [11286](https://github.com/airbytehq/airbyte/pull/11286) | Minor corrections to documentation and connector specification | +| 0.1.29 | 2022-03-08 | [10359](https://github.com/airbytehq/airbyte/pull/10359) | Improved performance for streams with substreams: invoice_line_items, subscription_items, bank_accounts | +| 0.1.28 | 2022-02-08 | [10165](https://github.com/airbytehq/airbyte/pull/10165) | Improve 404 handling for `CheckoutSessionsLineItems` stream | +| 0.1.27 | 2021-12-28 | [9148](https://github.com/airbytehq/airbyte/pull/9148) | Fix `date`, `arrival\_date` fields | +| 0.1.26 | 2021-12-21 | [8992](https://github.com/airbytehq/airbyte/pull/8992) | Fix type `events.request` in schema | +| 0.1.25 | 2021-11-25 | [8250](https://github.com/airbytehq/airbyte/pull/8250) | Rearrange setup fields | +| 0.1.24 | 2021-11-08 | [7729](https://github.com/airbytehq/airbyte/pull/7729) | Include tax data in `checkout_sessions_line_items` stream | +| 0.1.23 | 2021-11-08 | [7729](https://github.com/airbytehq/airbyte/pull/7729) | Correct `payment_intents` schema | +| 0.1.22 | 2021-11-05 | [7345](https://github.com/airbytehq/airbyte/pull/7345) | Add 3 new streams | +| 0.1.21 | 2021-10-07 | [6841](https://github.com/airbytehq/airbyte/pull/6841) | Fix missing `start_date` argument + update json files for SAT | +| 0.1.20 | 2021-09-30 | [6017](https://github.com/airbytehq/airbyte/pull/6017) | Add lookback_window_days parameter | +| 0.1.19 | 2021-09-27 | [6466](https://github.com/airbytehq/airbyte/pull/6466) | Use `start_date` parameter in incremental streams | +| 0.1.18 | 2021-09-14 | [6004](https://github.com/airbytehq/airbyte/pull/6004) | Fix coupons and subscriptions stream schemas by removing incorrect timestamp formatting | +| 0.1.17 | 2021-09-14 | [6004](https://github.com/airbytehq/airbyte/pull/6004) | Add `PaymentIntents` stream | +| 0.1.16 | 2021-07-28 | [4980](https://github.com/airbytehq/airbyte/pull/4980) | Remove Updated field from schemas | +| 0.1.15 | 2021-07-21 | [4878](https://github.com/airbytehq/airbyte/pull/4878) | Fix incorrect percent_off and discounts data filed types | +| 0.1.14 | 2021-07-09 | [4669](https://github.com/airbytehq/airbyte/pull/4669) | Subscriptions Stream now returns all kinds of subscriptions \(including expired and canceled\) | +| 0.1.13 | 2021-07-03 | 
[4528](https://github.com/airbytehq/airbyte/pull/4528) | Remove regex for acc validation | +| 0.1.12 | 2021-06-08 | [3973](https://github.com/airbytehq/airbyte/pull/3973) | Add `AIRBYTE_ENTRYPOINT` for Kubernetes support | +| 0.1.11 | 2021-05-30 | [3744](https://github.com/airbytehq/airbyte/pull/3744) | Fix types in schema | +| 0.1.10 | 2021-05-28 | [3728](https://github.com/airbytehq/airbyte/pull/3728) | Update data types to be number instead of int | +| 0.1.9 | 2021-05-13 | [3367](https://github.com/airbytehq/airbyte/pull/3367) | Add acceptance tests for connected accounts | +| 0.1.8 | 2021-05-11 | [3566](https://github.com/airbytehq/airbyte/pull/3368) | Bump CDK connectors | + diff --git a/docs/integrations/sources/sugar-crm.md b/docs/integrations/sources/sugar-crm.md index 27e2960ff10..40ccdd17739 100644 --- a/docs/integrations/sources/sugar-crm.md +++ b/docs/integrations/sources/sugar-crm.md @@ -12,10 +12,10 @@ You will only be able to connect to a self-hosted instance of Sugar CRM using th Sugar CRM can run on the MySQL, MSSQL, Oracle, or Db2 databases. You can use Airbyte to sync your Sugar CRM instance by connecting to the underlying database using the appropriate Airbyte connector: -* [DB2](db2.md) -* [MySQL](mysql.md) -* [MSSQL](mssql.md) -* [Oracle](oracle.md) +- [DB2](db2.md) +- [MySQL](mysql.md) +- [MSSQL](mssql.md) +- [Oracle](oracle.md) :::info @@ -32,4 +32,3 @@ Reach out to your service representative or system admin to find the parameters ### Output schema To understand your Sugar CRM database schema, see the [VarDefs](https://support.sugarcrm.com/Documentation/Sugar_Developer/Sugar_Developer_Guide_11.0/Data_Framework/Vardefs/) documentation. Otherwise, the schema will be loaded according to the rules of the underlying database's connector. - diff --git a/docs/integrations/sources/survey-sparrow.md b/docs/integrations/sources/survey-sparrow.md index ddf36873b0c..cfe21585e33 100644 --- a/docs/integrations/sources/survey-sparrow.md +++ b/docs/integrations/sources/survey-sparrow.md @@ -5,10 +5,13 @@ This page guides you through the process of setting up the SurveySparrow source ## Prerequisites ### For Airbyte Open Source: -* Access Token + +- Access Token ## Setup guide + ### Step 1: Set up SurveySparrow + Please read this [docs](https://developers.surveysparrow.com/rest-apis). 
In order to get access token, follow these steps: @@ -33,21 +36,21 @@ In order to get access token, follow these steps: ## Supported streams and sync modes -* [Contacts](https://developers.surveysparrow.com/rest-apis/contacts#getV3Contacts) -* [ContactLists](https://developers.surveysparrow.com/rest-apis/contact_lists#getV3Contact_lists) -* [Questions](https://developers.surveysparrow.com/rest-apis/questions#getV3Questions) -* [Responses](https://developers.surveysparrow.com/rest-apis/response#getV3Responses) -* [Roles](https://developers.surveysparrow.com/rest-apis/roles#getV3Roles) -* [Surveys](https://developers.surveysparrow.com/rest-apis/survey#getV3Surveys) -* [SurveyFolders](https://developers.surveysparrow.com/rest-apis/survey_folder#getV3Survey_folders) -* [Users](https://developers.surveysparrow.com/rest-apis/users#getV3Users) +- [Contacts](https://developers.surveysparrow.com/rest-apis/contacts#getV3Contacts) +- [ContactLists](https://developers.surveysparrow.com/rest-apis/contact_lists#getV3Contact_lists) +- [Questions](https://developers.surveysparrow.com/rest-apis/questions#getV3Questions) +- [Responses](https://developers.surveysparrow.com/rest-apis/response#getV3Responses) +- [Roles](https://developers.surveysparrow.com/rest-apis/roles#getV3Roles) +- [Surveys](https://developers.surveysparrow.com/rest-apis/survey#getV3Surveys) +- [SurveyFolders](https://developers.surveysparrow.com/rest-apis/survey_folder#getV3Survey_folders) +- [Users](https://developers.surveysparrow.com/rest-apis/users#getV3Users) ## Changelog -| Version | Date | Pull Request | Subject | -| :------ | :--------- | :------------------------------------------------------- | :--------------------------------------------------------------------- | -| 0.2.3 | 2024-04-19 | [37267](https://github.com/airbytehq/airbyte/pull/37267) | Upgrade to CDK 0.80.0 and manage dependencies with Poetry. | -| 0.2.2 | 2024-04-15 | [37267](https://github.com/airbytehq/airbyte/pull/37267) | Base image migration: remove Dockerfile and use the python-connector-base image | -| 0.2.1 | 2024-04-12 | [37267](https://github.com/airbytehq/airbyte/pull/37267) | schema descriptions | -| 0.2.0 | 2022-11-18 | [19143](https://github.com/airbytehq/airbyte/pull/19143) | Allow users to change base_url based on account's location | -| 0.1.0 | 2022-11-03 | [18395](https://github.com/airbytehq/airbyte/pull/18395) | Initial Release | +| Version | Date | Pull Request | Subject | +| :------ | :--------- | :------------------------------------------------------- | :------------------------------------------------------------------------------ | +| 0.2.3 | 2024-04-19 | [37267](https://github.com/airbytehq/airbyte/pull/37267) | Upgrade to CDK 0.80.0 and manage dependencies with Poetry. 
| +| 0.2.2 | 2024-04-15 | [37267](https://github.com/airbytehq/airbyte/pull/37267) | Base image migration: remove Dockerfile and use the python-connector-base image | +| 0.2.1 | 2024-04-12 | [37267](https://github.com/airbytehq/airbyte/pull/37267) | schema descriptions | +| 0.2.0 | 2022-11-18 | [19143](https://github.com/airbytehq/airbyte/pull/19143) | Allow users to change base_url based on account's location | +| 0.1.0 | 2022-11-03 | [18395](https://github.com/airbytehq/airbyte/pull/18395) | Initial Release | diff --git a/docs/integrations/sources/surveycto.md b/docs/integrations/sources/surveycto.md index cc680b3b2b4..ccd31674fe9 100644 --- a/docs/integrations/sources/surveycto.md +++ b/docs/integrations/sources/surveycto.md @@ -47,8 +47,8 @@ The SurveyCTO source connector supports the following streams: ## Changelog -| Version | Date | Pull Request | Subject | -|---------|------|--------------|---------| -| 0.1.2 | 2023-07-27 | [28512](https://github.com/airbytehq/airbyte/pull/28512) | Added Check Connection | -| 0.1.1 | 2023-04-25 | [24784](https://github.com/airbytehq/airbyte/pull/24784) | Fix incremental sync | -| 0.1.0 | 2022-11-16 | [19371](https://github.com/airbytehq/airbyte/pull/19371) | SurveyCTO Source Connector | +| Version | Date | Pull Request | Subject | +| ------- | ---------- | -------------------------------------------------------- | -------------------------- | +| 0.1.2 | 2023-07-27 | [28512](https://github.com/airbytehq/airbyte/pull/28512) | Added Check Connection | +| 0.1.1 | 2023-04-25 | [24784](https://github.com/airbytehq/airbyte/pull/24784) | Fix incremental sync | +| 0.1.0 | 2022-11-16 | [19371](https://github.com/airbytehq/airbyte/pull/19371) | SurveyCTO Source Connector | diff --git a/docs/integrations/sources/surveymonkey.md b/docs/integrations/sources/surveymonkey.md index 2d3f2f180d7..a1ea0a37e4d 100644 --- a/docs/integrations/sources/surveymonkey.md +++ b/docs/integrations/sources/surveymonkey.md @@ -9,20 +9,24 @@ OAuth for Survey Monkey is officially supported only for the US. We are testing ::: + ## Prerequisites **For Airbyte Open Source:** -* Access Token +- Access Token ## Setup guide + ### Step 1: Set up SurveyMonkey + Please read this [docs](https://developer.surveymonkey.com/api/v3/#getting-started). Register your application [here](https://developer.surveymonkey.com/apps/) Then go to Settings and copy your access token ### Step 2: Set up the source connector in Airbyte + **For Airbyte Cloud:** 1. [Log into your Airbyte Cloud](https://cloud.airbyte.com/workspaces) account. @@ -35,6 +39,7 @@ Please read this [docs](https://developer.surveymonkey.com/api/v3/#getting-start + **For Airbyte Open Source:** 1. Go to local Airbyte page. 
@@ -47,26 +52,26 @@ Please read this [docs](https://developer.surveymonkey.com/api/v3/#getting-start ## Supported streams and sync modes -* [Surveys](https://api.surveymonkey.com/v3/docs?shell#api-endpoints-get-surveys) \(Incremental\) -* [SurveyPages](https://api.surveymonkey.com/v3/docs?shell#api-endpoints-get-surveys-survey_id-pages) -* [SurveyQuestions](https://api.surveymonkey.com/v3/docs?shell#api-endpoints-get-surveys-survey_id-pages-page_id-questions) -* [SurveyResponses](https://api.surveymonkey.com/v3/docs?shell#api-endpoints-get-surveys-id-responses-bulk) \(Incremental\) -* [SurveyCollectors](https://api.surveymonkey.com/v3/docs?shell#api-endpoints-get-surveys-survey_id-collectors) -* [Collectors](https://api.surveymonkey.com/v3/docs?shell#api-endpoints-get-collectors-collector_id-) +- [Surveys](https://api.surveymonkey.com/v3/docs?shell#api-endpoints-get-surveys) \(Incremental\) +- [SurveyPages](https://api.surveymonkey.com/v3/docs?shell#api-endpoints-get-surveys-survey_id-pages) +- [SurveyQuestions](https://api.surveymonkey.com/v3/docs?shell#api-endpoints-get-surveys-survey_id-pages-page_id-questions) +- [SurveyResponses](https://api.surveymonkey.com/v3/docs?shell#api-endpoints-get-surveys-id-responses-bulk) \(Incremental\) +- [SurveyCollectors](https://api.surveymonkey.com/v3/docs?shell#api-endpoints-get-surveys-survey_id-collectors) +- [Collectors](https://api.surveymonkey.com/v3/docs?shell#api-endpoints-get-collectors-collector_id-) ### Performance considerations The SurveyMonkey API applies heavy API quotas for default private apps, which have the following limits: -* 125 requests per minute -* 500 requests per day +- 125 requests per minute +- 500 requests per day To cover more data from this source we use caching. ## Changelog | Version | Date | Pull Request | Subject | -|:--------|:-----------|:---------------------------------------------------------|:---------------------------------------------------------------------------------| +| :------ | :--------- | :------------------------------------------------------- | :------------------------------------------------------------------------------- | | 0.3.1 | 2024-04-24 | [36664](https://github.com/airbytehq/airbyte/pull/36664) | Schema descriptions and CDK 0.80.0 | | 0.3.0 | 2024-02-22 | [35561](https://github.com/airbytehq/airbyte/pull/35561) | Migrate connector to low-code | | 0.2.4 | 2024-02-12 | [35168](https://github.com/airbytehq/airbyte/pull/35168) | Manage dependencies with Poetry | diff --git a/docs/integrations/sources/talkdesk-explore.md b/docs/integrations/sources/talkdesk-explore.md index dd40b48455f..7f3cffbee03 100644 --- a/docs/integrations/sources/talkdesk-explore.md +++ b/docs/integrations/sources/talkdesk-explore.md @@ -4,7 +4,7 @@ ## Deprecation Notice -The Talkdesk Explore source connector is scheduled for deprecation on March 5th, 2024 due to incompatibility with upcoming platform updates as we prepare to launch Airbyte 1.0. This means it will no longer be supported or available for use in Airbyte. +The Talkdesk Explore source connector is scheduled for deprecation on March 5th, 2024 due to incompatibility with upcoming platform updates as we prepare to launch Airbyte 1.0. This means it will no longer be supported or available for use in Airbyte. This connector does not support new per-stream features which are vital for ensuring data integrity in Airbyte's synchronization processes. 
Without these capabilities, we cannot enforce our standards of reliability and correctness for data syncing operations. @@ -14,7 +14,6 @@ Users who still wish to sync data from this connector are advised to explore cre ::: - ## Overview Talkdesk is a software for contact center operations. @@ -25,11 +24,11 @@ The Talkdesk Explore connector uses the [Talkdesk Explore API](https://docs.talk The connector supports both Full Refresh and Incremental on the following streams: -* [Calls Report](https://docs.talkdesk.com/docs/calls-report) -* [User Status Report](https://docs.talkdesk.com/docs/user-status-explore) -* [Studio Flow Execution Report](https://docs.talkdesk.com/docs/studio-flow-execution-report) -* [Contacts Report](https://docs.talkdesk.com/docs/contacts-report) -* [Ring Attempts Report](https://docs.talkdesk.com/docs/ring-attempts-report) +- [Calls Report](https://docs.talkdesk.com/docs/calls-report) +- [User Status Report](https://docs.talkdesk.com/docs/user-status-explore) +- [Studio Flow Execution Report](https://docs.talkdesk.com/docs/studio-flow-execution-report) +- [Contacts Report](https://docs.talkdesk.com/docs/contacts-report) +- [Ring Attempts Report](https://docs.talkdesk.com/docs/ring-attempts-report) ### Note on report generation @@ -39,12 +38,12 @@ This process is further explained here: [Executing a Report](https://docs.talkde ### Features -| Feature | Supported? | -| :--- | :--- | -| Full Refresh Sync | Yes | -| Incremental - Append Sync | Yes | -| Incremental - Dedupe Sync | No | -| SSL connection | Yes | +| Feature | Supported? | +| :------------------------ | :--------- | +| Full Refresh Sync | Yes | +| Incremental - Append Sync | Yes | +| Incremental - Dedupe Sync | No | +| SSL connection | Yes | ### Performance considerations @@ -54,8 +53,8 @@ The Explore API has an account-based quota limit of 15 simultaneous reports (exe ### Requirements -* Talkdesk account -* Talkdesk API key (`Client Credentials` auth method) +- Talkdesk account +- Talkdesk API key (`Client Credentials` auth method) ### Setup guide @@ -63,7 +62,7 @@ Please refer to the [getting started with the API](https://docs.talkdesk.com/doc ## Changelog -| Version | Date | Pull Request | Subject | -|---------|------|--------------|---------| -| 0.1.0 | 2022-02-07 | | New Source: Talkdesk Explore -| :--- | :--- | :--- | :--- | +| Version | Date | Pull Request | Subject | +| ------- | ---------- | ------------ | ---------------------------- | +| 0.1.0 | 2022-02-07 | | New Source: Talkdesk Explore | +| :--- | :--- | :--- | :--- | diff --git a/docs/integrations/sources/teradata.md b/docs/integrations/sources/teradata.md index 39dc85b0ad9..32d6e914d92 100644 --- a/docs/integrations/sources/teradata.md +++ b/docs/integrations/sources/teradata.md @@ -6,19 +6,19 @@ This page guides you through the process of setting up the Teradata source conne To use the Teradata source connector, you'll need: -* Access to a Teradata Vantage instance +- Access to a Teradata Vantage instance **Note:** If you need a new instance of Vantage, you can install a free version called Vantage Express in the cloud on [Google Cloud](https://quickstarts.teradata.com/vantage.express.gcp.html), [Azure](https://quickstarts.teradata.com/run-vantage-express-on-microsoft-azure.html), and [AWS](https://quickstarts.teradata.com/run-vantage-express-on-aws.html). 
You can also run Vantage Express on your local machine using [VMware](https://quickstarts.teradata.com/getting.started.vmware.html), [VirtualBox](https://quickstarts.teradata.com/getting.started.vbox.html), or [UTM](https://quickstarts.teradata.com/getting.started.utm.html). You'll need the following information to configure the Teradata source: -* **Host** - The host name of the Teradata Vantage instance. -* **Username** -* **Password** -* **Database** - Specify the database (equivalent to schema in some databases i.e. **database_name.table_name** when performing queries). -* **JDBC URL Params** (optional) -* **SSL Connection** (optional) -* **SSL Modes** (optional) +- **Host** - The host name of the Teradata Vantage instance. +- **Username** +- **Password** +- **Database** - Specify the database (equivalent to schema in some databases i.e. **database_name.table_name** when performing queries). +- **JDBC URL Params** (optional) +- **SSL Connection** (optional) +- **SSL Modes** (optional) [Refer to this guide for more details](https://downloads.teradata.com/doc/connectivity/jdbc/reference/current/jdbcug_chapter_2.html#BGBHDDGB) @@ -27,7 +27,7 @@ You'll need the following information to configure the Teradata source: The Teradata source connector supports the following [sync modes](https://docs.airbyte.com/cloud/core-concepts#connection-sync-modes): | Feature | Supported? | -|:-----------------------------------------------|:-----------------------------------------------------------| +| :--------------------------------------------- | :--------------------------------------------------------- | | Full Refresh Sync | Yes | | Incremental Sync | Yes | | Replicate Incremental Deletes | No | @@ -61,9 +61,9 @@ You need a Teradata user which has read permissions on the database ## CHANGELOG -| Version | Date | Pull Request | Subject | -|:--------|:-----------|:------------------------------------------------|:----------------------------| -| 0.2.2 | 2024-02-13 | [35219](https://github.com/airbytehq/airbyte/pull/35219) | Adopt CDK 0.20.4 | +| Version | Date | Pull Request | Subject | +| :------ | :--------- | :------------------------------------------------------- | :-------------------------- | +| 0.2.2 | 2024-02-13 | [35219](https://github.com/airbytehq/airbyte/pull/35219) | Adopt CDK 0.20.4 | | 0.2.1 | 2024-01-24 | [34453](https://github.com/airbytehq/airbyte/pull/34453) | bump CDK version | -| 0.2.0 | 2023-12-18 | https://github.com/airbytehq/airbyte/pull/33485 | Remove LEGACY state | -| 0.1.0 | 2022-03-27 | https://github.com/airbytehq/airbyte/pull/24221 | New Source Teradata Vantage | +| 0.2.0 | 2023-12-18 | https://github.com/airbytehq/airbyte/pull/33485 | Remove LEGACY state | +| 0.1.0 | 2022-03-27 | https://github.com/airbytehq/airbyte/pull/24221 | New Source Teradata Vantage | diff --git a/docs/integrations/sources/tidb.md b/docs/integrations/sources/tidb.md index 197673c5e8c..fe424199521 100644 --- a/docs/integrations/sources/tidb.md +++ b/docs/integrations/sources/tidb.md @@ -67,8 +67,8 @@ Using this feature requires additional configuration, when creating the source. 1. Configure all fields for the source as you normally would, except `SSH Tunnel Method`. 2. `SSH Tunnel Method` defaults to `No Tunnel` \(meaning a direct connection\). If you want to use an SSH Tunnel choose `SSH Key Authentication` or `Password Authentication`. - 1. 
Choose `Key Authentication` if you will be using an RSA private key as your secret for establishing the SSH Tunnel \(see below for more information on generating this key\). - 2. Choose `Password Authentication` if you will be using a password as your secret for establishing the SSH Tunnel. + 1. Choose `Key Authentication` if you will be using an RSA private key as your secret for establishing the SSH Tunnel \(see below for more information on generating this key\). + 2. Choose `Password Authentication` if you will be using a password as your secret for establishing the SSH Tunnel. 3. `SSH Tunnel Jump Server Host` refers to the intermediate \(bastion\) server that Airbyte will connect to. This should be a hostname or an IP Address. 4. `SSH Connection Port` is the port on the bastion server with which to make the SSH connection. The default port for SSH connections is `22`, so unless you have explicitly changed something, go with the default. 5. `SSH Login Username` is the username that Airbyte should use when connection to the bastion server. This is NOT the TiDB username. @@ -79,42 +79,41 @@ Using this feature requires additional configuration, when creating the source. [TiDB data types](https://docs.pingcap.com/tidb/stable/data-type-overview) are mapped to the following data types when synchronizing data: -| TiDB Type | Resulting Type | Notes | -| :---------------------------------------- |:-----------------------| :----------------------------------------------------------- | -| `bit(1)` | boolean | | -| `bit(>1)` | base64 binary string | | -| `boolean` | boolean | | -| `tinyint(1)` | boolean | | -| `tinyint` | number | | -| `smallint` | number | | -| `mediumint` | number | | -| `int` | number | | -| `bigint` | number | | -| `float` | number | | -| `double` | number | | -| `decimal` | number | | -| `binary` | base64 binary string | | -| `blob` | base64 binary string | | -| `date` | string | ISO 8601 date string. ZERO-DATE value will be converted to NULL. If column is mandatory, convert to EPOCH. | +| TiDB Type | Resulting Type | Notes | +| :---------------------------------------- | :--------------------- | :------------------------------------------------------------------------------------------------------------- | +| `bit(1)` | boolean | | +| `bit(>1)` | base64 binary string | | +| `boolean` | boolean | | +| `tinyint(1)` | boolean | | +| `tinyint` | number | | +| `smallint` | number | | +| `mediumint` | number | | +| `int` | number | | +| `bigint` | number | | +| `float` | number | | +| `double` | number | | +| `decimal` | number | | +| `binary` | base64 binary string | | +| `blob` | base64 binary string | | +| `date` | string | ISO 8601 date string. ZERO-DATE value will be converted to NULL. If column is mandatory, convert to EPOCH. | | `datetime`, `timestamp` | string | ISO 8601 datetime string. ZERO-DATE value will be converted to NULL. If column is mandatory, convert to EPOCH. | -| `time` | string | ISO 8601 time string. Values are in range between 00:00:00 and 23:59:59. 
| -| `year` | year string | [Doc](https://docs.pingcap.com/tidb/stable/data-type-date-and-time#year-type) | -| `char`, `varchar` with non-binary charset | string | | -| `char`, `varchar` with binary charset | base64 binary string | | -| `tinyblob` | base64 binary string | | -| `blob` | base64 binary string | | -| `mediumblob` | base64 binary string | | -| `longblob` | base64 binary string | | -| `binary` | base64 binary string | | -| `varbinary` | base64 binary string | | -| `tinytext` | string | | -| `text` | string | | -| `mediumtext` | string | | -| `longtext` | string | | -| `json` | serialized json string | E.g. `{"a": 10, "b": 15}` | -| `enum` | string | | -| `set` | string | E.g. `blue,green,yellow` | - +| `time` | string | ISO 8601 time string. Values are in range between 00:00:00 and 23:59:59. | +| `year` | year string | [Doc](https://docs.pingcap.com/tidb/stable/data-type-date-and-time#year-type) | +| `char`, `varchar` with non-binary charset | string | | +| `char`, `varchar` with binary charset | base64 binary string | | +| `tinyblob` | base64 binary string | | +| `blob` | base64 binary string | | +| `mediumblob` | base64 binary string | | +| `longblob` | base64 binary string | | +| `binary` | base64 binary string | | +| `varbinary` | base64 binary string | | +| `tinytext` | string | | +| `text` | string | | +| `mediumtext` | string | | +| `longtext` | string | | +| `json` | serialized json string | E.g. `{"a": 10, "b": 15}` | +| `enum` | string | | +| `set` | string | E.g. `blue,green,yellow` | **Note:** arrays for all the above types as well as custom types are supported, although they may be de-nested depending on the destination. @@ -126,15 +125,15 @@ Now that you have set up the TiDB source connector, check out the following TiDB ## Changelog -| Version | Date | Pull Request | Subject | -|:--------|:-----------| :----------- |-------------------------------------------------------------------------------------------------------------------------------------------| -| 0.3.2 | 2024-02-13 | [35218](https://github.com/airbytehq/airbyte/pull/35218) | Adopt CDK 0.20.4 | +| Version | Date | Pull Request | Subject | +| :------ | :--------- | :------------------------------------------------------- | ----------------------------------------------------------------------------------------------------------------------------------------- | +| 0.3.2 | 2024-02-13 | [35218](https://github.com/airbytehq/airbyte/pull/35218) | Adopt CDK 0.20.4 | | 0.3.1 | 2024-01-24 | [34453](https://github.com/airbytehq/airbyte/pull/34453) | bump CDK version | | 0.3.0 | 2023-12-18 | [33485](https://github.com/airbytehq/airbyte/pull/33485) | Remove LEGACY state | | 0.2.5 | 2023-06-20 | [27212](https://github.com/airbytehq/airbyte/pull/27212) | Fix silent exception swallowing in StreamingJdbcDatabase | | 0.2.4 | 2023-03-22 | [20760](https://github.com/airbytehq/airbyte/pull/20760) | Removed redundant date-time datatypes formatting | -| 0.2.3 | 2023-03-06 | [23455](https://github.com/airbytehq/airbyte/pull/23455) | For network isolation, source connector accepts a list of hosts it is allowed to connect to | -| 0.2.2 | 2022-12-14 | [20436](https://github.com/airbytehq/airbyte/pull/20346) | Consolidate date/time values mapping for JDBC sources | +| 0.2.3 | 2023-03-06 | [23455](https://github.com/airbytehq/airbyte/pull/23455) | For network isolation, source connector accepts a list of hosts it is allowed to connect to | +| 0.2.2 | 2022-12-14 | [20436](https://github.com/airbytehq/airbyte/pull/20346) | 
Consolidate date/time values mapping for JDBC sources | | | 2022-10-13 | [15535](https://github.com/airbytehq/airbyte/pull/16238) | Update incremental query to avoid data missing when new data is inserted at the same time as a sync starts under non-CDC incremental mode | | 0.2.1 | 2022-09-01 | [16238](https://github.com/airbytehq/airbyte/pull/16238) | Emit state messages more frequently | | 0.2.0 | 2022-07-26 | [14362](https://github.com/airbytehq/airbyte/pull/14362) | Integral columns are now discovered as int64 fields. | diff --git a/docs/integrations/sources/tiktok-marketing.md b/docs/integrations/sources/tiktok-marketing.md index b5700b76247..26faa3dfe8d 100644 --- a/docs/integrations/sources/tiktok-marketing.md +++ b/docs/integrations/sources/tiktok-marketing.md @@ -66,7 +66,7 @@ To access the Sandbox environment: ## Supported streams and sync modes | Stream | Environment | Key | Incremental | -|:------------------------------------------|:-------------|:-------------------------------------------|:------------| +| :---------------------------------------- | :----------- | :----------------------------------------- | :---------- | | Advertisers | Prod,Sandbox | advertiser_id | No | | AdGroups | Prod,Sandbox | adgroup_id | Yes | | Ads | Prod,Sandbox | ad_id | Yes | @@ -122,7 +122,7 @@ The connector is restricted by [requests limitation](https://business-api.tiktok ## Changelog | Version | Date | Pull Request | Subject | -|:--------|:-----------|:---------------------------------------------------------|:------------------------------------------------------------------------------------------------------------| +| :------ | :--------- | :------------------------------------------------------- | :---------------------------------------------------------------------------------------------------------- | | 3.9.6 | 2024-04-19 | [36665](https://github.com/airbytehq/airbyte/pull/36665) | Updating to 0.80.0 CDK | | 3.9.5 | 2024-04-12 | [36665](https://github.com/airbytehq/airbyte/pull/36665) | Schema descriptions | | 3.9.4 | 2024-03-20 | [36302](https://github.com/airbytehq/airbyte/pull/36302) | Don't extract state from the latest record if stream doesn't have a cursor_field | diff --git a/docs/integrations/sources/timely.md b/docs/integrations/sources/timely.md index 2d0272fea9a..2bfcc925e2f 100644 --- a/docs/integrations/sources/timely.md +++ b/docs/integrations/sources/timely.md @@ -9,6 +9,7 @@ This page contains the setup guide and reference information for the Timely sour 3. Get a start-date to your events. Dateformat `YYYY-MM-DD`. ## Setup guide + ## Step 1: Set up the Timely connector in Airbyte ### For Airbyte OSS: @@ -31,12 +32,12 @@ The Timely source connector supports the following [sync modes](https://docs.air ## Changelog -| Version | Date | Pull Request | Subject | -| :------ | :--------- | :------------------------------------------------------- | :-------------- | -| 0.3.4 | 2024-04-19 | [37270](https://github.com/airbytehq/airbyte/pull/37270) | Updating to 0.80.0 CDK | -| 0.3.3 | 2024-04-18 | [37270](https://github.com/airbytehq/airbyte/pull/37270) | Manage dependencies with Poetry. 
| -| 0.3.2 | 2024-04-15 | [37270](https://github.com/airbytehq/airbyte/pull/37270) | Base image migration: remove Dockerfile and use the python-connector-base image | -| 0.3.1 | 2024-04-12 | [37270](https://github.com/airbytehq/airbyte/pull/37270) | schema descriptions | -| 0.3.0 | 2023-10-25 | [31002](https://github.com/airbytehq/airbyte/pull/31002) | Migrate to low-code framework | -| 0.2.0 | 2023-10-23 | [31745](https://github.com/airbytehq/airbyte/pull/31745) | Fix schemas | -| 0.1.0 | 2022-06-22 | [13617](https://github.com/airbytehq/airbyte/pull/13617) | Initial release | +| Version | Date | Pull Request | Subject | +| :------ | :--------- | :------------------------------------------------------- | :------------------------------------------------------------------------------ | +| 0.3.4 | 2024-04-19 | [37270](https://github.com/airbytehq/airbyte/pull/37270) | Updating to 0.80.0 CDK | +| 0.3.3 | 2024-04-18 | [37270](https://github.com/airbytehq/airbyte/pull/37270) | Manage dependencies with Poetry. | +| 0.3.2 | 2024-04-15 | [37270](https://github.com/airbytehq/airbyte/pull/37270) | Base image migration: remove Dockerfile and use the python-connector-base image | +| 0.3.1 | 2024-04-12 | [37270](https://github.com/airbytehq/airbyte/pull/37270) | schema descriptions | +| 0.3.0 | 2023-10-25 | [31002](https://github.com/airbytehq/airbyte/pull/31002) | Migrate to low-code framework | +| 0.2.0 | 2023-10-23 | [31745](https://github.com/airbytehq/airbyte/pull/31745) | Fix schemas | +| 0.1.0 | 2022-06-22 | [13617](https://github.com/airbytehq/airbyte/pull/13617) | Initial release | diff --git a/docs/integrations/sources/tmdb.md b/docs/integrations/sources/tmdb.md index 472fdaca839..b84dbeab49c 100644 --- a/docs/integrations/sources/tmdb.md +++ b/docs/integrations/sources/tmdb.md @@ -29,9 +29,9 @@ Just pass the generated API key and Movie ID for establishing the connection. 1. Navigate to the Airbyte Open Source dashboard. 2. Set the name for your source. -4. Enter your `api_key`. -5. Enter params `movie_id, query, language` (if needed). -6. Click **Set up source**. +3. Enter your `api_key`. +4. Enter params `movie_id, query, language` (if needed). +5. Click **Set up source**. ## Supported sync modes @@ -81,7 +81,6 @@ The Google-webfonts source connector supports the following [sync modes](https:/ - Search_people - Search_tv_shows - ## API method example GET https://api.themoviedb.org/3/movie/{movie_id}/alternative_titles?api_key={api_key} @@ -92,6 +91,6 @@ TMDb's [API reference](https://developers.themoviedb.org/3/getting-started/intro ## Changelog -| Version | Date | Pull Request | Subject | -| :------ | :--------- | :----------------------------------------------------- | :------------- | -| 0.1.0 | 2022-10-27 | [Init](https://github.com/airbytehq/airbyte/pull/18561)| Initial commit | +| Version | Date | Pull Request | Subject | +| :------ | :--------- | :------------------------------------------------------ | :------------- | +| 0.1.0 | 2022-10-27 | [Init](https://github.com/airbytehq/airbyte/pull/18561) | Initial commit | diff --git a/docs/integrations/sources/todoist.md b/docs/integrations/sources/todoist.md index f4da08459ce..85fa51e003d 100644 --- a/docs/integrations/sources/todoist.md +++ b/docs/integrations/sources/todoist.md @@ -11,7 +11,7 @@ Two output streams are available from this source. A list of these streams can b ### Features | Feature | Supported? 
| -|:------------------|:-----------| +| :---------------- | :--------- | | Full Refresh Sync | Yes | | Incremental Sync | No | @@ -19,11 +19,10 @@ Two output streams are available from this source. A list of these streams can b ### Requirements -* Todoist API token +- Todoist API token You can find your personal token in the [integrations settings view](https://todoist.com/prefs/integrations) of the Todoist web app and replace the token value in the samples. - ### Set up the Todoist connector in Airbyte 1. [Log into your Airbyte Cloud](https://cloud.airbyte.io/workspaces) account or navigate to the Airbyte Open Source dashboard. @@ -37,14 +36,14 @@ You can find your personal token in the [integrations settings view](https://tod List of available streams: -* [Tasks](https://developer.todoist.com/rest/v2/#tasks) -* [Projects](https://developer.todoist.com/rest/v2/#projects) +- [Tasks](https://developer.todoist.com/rest/v2/#tasks) +- [Projects](https://developer.todoist.com/rest/v2/#projects) ## Changelog -| Version | Date | Pull Request | Subject | -|:--------|:-----------|:-----------------------------------------------------------|:------------------------------------------------| -| 0.2.2 | 2024-04-19 | [37272](https://github.com/airbytehq/airbyte/pull/37272) | Upgrade to CDK 0.80.0 and manage dependencies with Poetry. | -| 0.2.1 | 2024-04-12 | [37272](https://github.com/airbytehq/airbyte/pull/37272) | schema descriptions | -| 0.2.0 | 2023-12-19 | [32690](https://github.com/airbytehq/airbyte/pull/32690) | Migrate to low-code | -| 0.1.0 | 2022-12-03 | [20046](https://github.com/airbytehq/airbyte/pull/20046) | 🎉 New Source: todoist | +| Version | Date | Pull Request | Subject | +| :------ | :--------- | :------------------------------------------------------- | :--------------------------------------------------------- | +| 0.2.2 | 2024-04-19 | [37272](https://github.com/airbytehq/airbyte/pull/37272) | Upgrade to CDK 0.80.0 and manage dependencies with Poetry. 
| +| 0.2.1 | 2024-04-12 | [37272](https://github.com/airbytehq/airbyte/pull/37272) | schema descriptions | +| 0.2.0 | 2023-12-19 | [32690](https://github.com/airbytehq/airbyte/pull/32690) | Migrate to low-code | +| 0.1.0 | 2022-12-03 | [20046](https://github.com/airbytehq/airbyte/pull/20046) | 🎉 New Source: todoist | diff --git a/docs/integrations/sources/toggl.md b/docs/integrations/sources/toggl.md index 29839bba7b8..ba8f67941ee 100644 --- a/docs/integrations/sources/toggl.md +++ b/docs/integrations/sources/toggl.md @@ -6,20 +6,20 @@ This source can sync data from the [Toggl API](https://developers.track.toggl.co ## This Source Supports the Following Streams -* time_entries -* organizations -* organizations_users -* organizations_groups -* workspace -* workspace_clients -* workspace_tasks +- time_entries +- organizations +- organizations_users +- organizations_groups +- workspace +- workspace_clients +- workspace_tasks ### Features -| Feature | Supported?\(Yes/No\) | Notes | -| :--- | :--- | :--- | -| Full Refresh Sync | Yes | | -| Incremental Sync | No | | +| Feature | Supported?\(Yes/No\) | Notes | +| :---------------- | :------------------- | :---- | +| Full Refresh Sync | Yes | | +| Incremental Sync | No | | ### Performance considerations @@ -29,10 +29,10 @@ Toggl APIs are under rate limits for the number of API calls allowed per API key ### Requirements -* API token +- API token ## Changelog -| Version | Date | Pull Request | Subject | -| :------ | :--------- | :-------------------------------------------------------- | :----------------------------------------- | -| 0.1.0 | 2022-10-28 | [#18507](https://github.com/airbytehq/airbyte/pull/18507) | 🎉 New Source: Toggl API [low-code CDK] | \ No newline at end of file +| Version | Date | Pull Request | Subject | +| :------ | :--------- | :-------------------------------------------------------- | :-------------------------------------- | +| 0.1.0 | 2022-10-28 | [#18507](https://github.com/airbytehq/airbyte/pull/18507) | 🎉 New Source: Toggl API [low-code CDK] | diff --git a/docs/integrations/sources/trello.md b/docs/integrations/sources/trello.md index 3ce67d3a8bb..735fb47d59a 100644 --- a/docs/integrations/sources/trello.md +++ b/docs/integrations/sources/trello.md @@ -8,12 +8,14 @@ This page contains the setup guide and reference information for the Trello sour - Trello Board IDs (Optional) + **For Airbyte Cloud:** - OAuth 1.0 + **For Airbyte Open Source:** - API Key (see [Authorizing A Client](https://developer.atlassian.com/cloud/trello/guides/rest-api/authorization/#authorizing-a-client)) @@ -27,6 +29,7 @@ This page contains the setup guide and reference information for the Trello sour Create a [Trello Account](https://trello.com). + ### Step 2: Set up the Trello connector in Airbyte **For Airbyte Cloud:** @@ -37,10 +40,11 @@ Create a [Trello Account](https://trello.com). 4. Click `Authenticate your Trello account`. 5. Log in and `Allow` access. 6. **Start date** - The date from which you'd like to replicate data for streams. -8. **Trello Board IDs (Optional)** - IDs of the boards to replicate data from. If left empty, data from all boards to which you have access will be replicated. +7. **Trello Board IDs (Optional)** - IDs of the boards to replicate data from. If left empty, data from all boards to which you have access will be replicated. + **For Airbyte Open Source:** 1. Authenticate with **API Key** and **API Token** pair. @@ -50,21 +54,21 @@ Create a [Trello Account](https://trello.com). 
The Trello source connector supports the following [sync modes](https://docs.airbyte.com/cloud/core-concepts#connection-sync-modes): -* [Full Refresh - Overwrite](https://docs.airbyte.com/understanding-airbyte/connections/full-refresh-overwrite/) -* [Full Refresh - Append](https://docs.airbyte.com/understanding-airbyte/connections/full-refresh-append) -* [Incremental - Append](https://docs.airbyte.com/understanding-airbyte/connections/incremental-append) +- [Full Refresh - Overwrite](https://docs.airbyte.com/understanding-airbyte/connections/full-refresh-overwrite/) +- [Full Refresh - Append](https://docs.airbyte.com/understanding-airbyte/connections/full-refresh-append) +- [Incremental - Append](https://docs.airbyte.com/understanding-airbyte/connections/incremental-append) ## Supported Streams This connector outputs the following streams: -* [Boards](https://developer.atlassian.com/cloud/trello/rest/api-group-members/#api-members-id-boards-get) \(Full Refresh\) - * [Actions](https://developer.atlassian.com/cloud/trello/rest/api-group-boards/#api-boards-boardid-actions-get) \(Incremental\) - * [Cards](https://developer.atlassian.com/cloud/trello/rest/api-group-boards/#api-boards-id-cards-get) \(Full Refresh\) - * [Checklists](https://developer.atlassian.com/cloud/trello/rest/api-group-boards/#api-boards-id-checklists-get) \(Full Refresh\) - * [Lists](https://developer.atlassian.com/cloud/trello/rest/api-group-boards/#api-boards-id-lists-get) \(Full Refresh\) - * [Users](https://developer.atlassian.com/cloud/trello/rest/api-group-boards/#api-boards-id-members-get) \(Full Refresh\) - * [Organizations](https://developer.atlassian.com/cloud/trello/rest/api-group-members/#api-members-id-organizations-get) \(Full Refresh\) +- [Boards](https://developer.atlassian.com/cloud/trello/rest/api-group-members/#api-members-id-boards-get) \(Full Refresh\) + - [Actions](https://developer.atlassian.com/cloud/trello/rest/api-group-boards/#api-boards-boardid-actions-get) \(Incremental\) + - [Cards](https://developer.atlassian.com/cloud/trello/rest/api-group-boards/#api-boards-id-cards-get) \(Full Refresh\) + - [Checklists](https://developer.atlassian.com/cloud/trello/rest/api-group-boards/#api-boards-id-checklists-get) \(Full Refresh\) + - [Lists](https://developer.atlassian.com/cloud/trello/rest/api-group-boards/#api-boards-id-lists-get) \(Full Refresh\) + - [Users](https://developer.atlassian.com/cloud/trello/rest/api-group-boards/#api-boards-id-members-get) \(Full Refresh\) + - [Organizations](https://developer.atlassian.com/cloud/trello/rest/api-group-members/#api-members-id-organizations-get) \(Full Refresh\) ### Performance considerations @@ -75,7 +79,7 @@ The Trello connector should not run into Trello API limitations under normal usa ## Changelog | Version | Date | Pull Request | Subject | -|:--------|:-----------|:---------------------------------------------------------|:-----------------------------------------------------------------------------------| +| :------ | :--------- | :------------------------------------------------------- | :--------------------------------------------------------------------------------- | | 1.0.3 | 2024-04-30 | [37598](https://github.com/airbytehq/airbyte/pull/37598) | Changed last records to last record | | 1.0.2 | 2023-10-13 | [31205](https://github.com/airbytehq/airbyte/pull/31205) | Improve spec description for board ids | | 1.0.1 | 2023-10-13 | [31168](https://github.com/airbytehq/airbyte/pull/31168) | Fix `cards` schema | diff --git 
a/docs/integrations/sources/trustpilot.md b/docs/integrations/sources/trustpilot.md index 0c030096657..7c09124ed37 100644 --- a/docs/integrations/sources/trustpilot.md +++ b/docs/integrations/sources/trustpilot.md @@ -2,8 +2,8 @@ ## Prerequisites -* Trustpilot API Token or Zendesk OAuth 2.0 redentials -* Trustpilot Business Unit URLs +- Trustpilot API Token or Zendesk OAuth 2.0 credentials +- Trustpilot Business Unit URLs ## Authentication methods @@ -21,7 +21,7 @@ Enter the API key in the Airbyte source configuration "API key". In case you wan Request the OAuth 2.0 request token by sending the following HTTP request: -``` http +```http GET https://api.trustpilot.com/v1/oauth/oauth-business-users-for-applications/accesstoken Authorization: Basic base64(apikey:secret) Content-Type: application/x-www-form-urlencoded @@ -36,16 +36,17 @@ Fill now the missing configuration fields in the Airbyte source configuration. A ## Supported sync modes The **Trustpilot** source connector supports the following [sync modes](https://docs.airbyte.com/cloud/core-concepts#connection-sync-modes): -* Full Refresh -* Incremental Sync + +- Full Refresh +- Incremental Sync ## Supported Streams This Source is capable of syncing the following Streams: -* [Configured Business Units](https://documentation-apidocumentation.trustpilot.com/business-units-api-(public)#find-a-business-unit) - loads business units defined in the configuration -* [Business Units](https://documentation-apidocumentation.trustpilot.com/business-units-api-(public)#get-a-list-of-all-business-units) - loads **all** business units -* [Private Reviews](https://documentation-apidocumentation.trustpilot.com/business-units-api#business-unit-private-reviews) \(Incremental sync\) +- [Configured Business Units](https://documentation-apidocumentation.trustpilot.com/business-units-api-(public)#find-a-business-unit) - loads business units defined in the configuration +- [Business Units](https://documentation-apidocumentation.trustpilot.com/business-units-api-(public)#get-a-list-of-all-business-units) - loads **all** business units +- [Private Reviews](https://documentation-apidocumentation.trustpilot.com/business-units-api#business-unit-private-reviews) \(Incremental sync\) ## Performance considerations @@ -53,11 +54,8 @@ The connector is restricted by Trustpilot [rate limit guidelines](https://docume The Trustpilot connector should not run into any limits under normal usage. Please [create an issue](https://github.com/airbytehq/airbyte/issues) if you see any rate limit issues that are not automatically retried successfully. - ## Changelog - -| Version | Date | Pull Request | Subject | -|:--------|:-----------| :----- |:----------------------------------| +| Version | Date | Pull Request | Subject | +| :------ | :--------- | :-------------------------------------------------------- | :-------------- | | `0.1.0` | 2023-03-16 | [24009](https://github.com/airbytehq/airbyte/pull/24009) | Initial version | - diff --git a/docs/integrations/sources/tvmaze-schedule.md b/docs/integrations/sources/tvmaze-schedule.md index a46aa435bb0..ae3d2a3e454 100644 --- a/docs/integrations/sources/tvmaze-schedule.md +++ b/docs/integrations/sources/tvmaze-schedule.md @@ -5,19 +5,18 @@ This source retrieves historical and future TV scheduling data using the [TVMaze](https://www.tvmaze.com/) schedule API. - ### Output schema This source is capable of syncing the following streams: -* `domestic` -* `web` -* `future` +- `domestic` +- `web` +- `future` ### Features | Feature | Supported? 
\(Yes/No\) | Notes | -|:------------------|:----------------------|:------| +| :---------------- | :-------------------- | :---- | | Full Refresh Sync | Yes | | | Incremental Sync | No | | @@ -48,5 +47,5 @@ The following fields are required fields for the connector to work: ## Changelog | Version | Date | Pull Request | Subject | -|:--------|:-----------|:---------------------------------------------------------|:-----------| +| :------ | :--------- | :------------------------------------------------------- | :--------- | | 0.1.0 | 2022-10-22 | [18333](https://github.com/airbytehq/airbyte/pull/18333) | New source | diff --git a/docs/integrations/sources/twilio-taskrouter.md b/docs/integrations/sources/twilio-taskrouter.md index 83a510f3d6e..6751fd6061c 100644 --- a/docs/integrations/sources/twilio-taskrouter.md +++ b/docs/integrations/sources/twilio-taskrouter.md @@ -56,9 +56,9 @@ For more information, see [the Twilio docs for rate limitations](https://support ## Changelog -| Version | Date | Pull Request | Subject | -| :------ | :--------- | :------------------------------------------------------- | :-------------------------------------------------- | -| 0.1.3 | 2024-04-19 | [37278](https://github.com/airbytehq/airbyte/pull/37278) | Upgrade to CDK 0.80.0 and manage dependencies with Poetry. | -| 0.1.2 | 2024-04-15 | [37278](https://github.com/airbytehq/airbyte/pull/37278) | Base image migration: remove Dockerfile and use the python-connector-base image | -| 0.1.1 | 2024-04-12 | [37278](https://github.com/airbytehq/airbyte/pull/37278) | schema descriptions | -| 0.1.0 | 2022-11-18 | [18685](https://github.com/airbytehq/airbyte/pull/18685) | 🎉 New Source: Twilio Taskrouter API [low-code cdk] | +| Version | Date | Pull Request | Subject | +| :------ | :--------- | :------------------------------------------------------- | :------------------------------------------------------------------------------ | +| 0.1.3 | 2024-04-19 | [37278](https://github.com/airbytehq/airbyte/pull/37278) | Upgrade to CDK 0.80.0 and manage dependencies with Poetry. | +| 0.1.2 | 2024-04-15 | [37278](https://github.com/airbytehq/airbyte/pull/37278) | Base image migration: remove Dockerfile and use the python-connector-base image | +| 0.1.1 | 2024-04-12 | [37278](https://github.com/airbytehq/airbyte/pull/37278) | schema descriptions | +| 0.1.0 | 2022-11-18 | [18685](https://github.com/airbytehq/airbyte/pull/18685) | 🎉 New Source: Twilio Taskrouter API [low-code cdk] | diff --git a/docs/integrations/sources/twilio.md b/docs/integrations/sources/twilio.md index 97db9d00dfa..4369d0f7688 100644 --- a/docs/integrations/sources/twilio.md +++ b/docs/integrations/sources/twilio.md @@ -13,6 +13,7 @@ See [docs](https://www.twilio.com/docs/iam/api) for more details. ## Setup guide + **For Airbyte Cloud:** 1. [Log into your Airbyte Cloud](https://cloud.airbyte.com/workspaces) account. @@ -26,6 +27,7 @@ See [docs](https://www.twilio.com/docs/iam/api) for more details. + **For Airbyte Open Source:** 1. Navigate to the Airbyte Open Source dashboard. @@ -42,7 +44,7 @@ See [docs](https://www.twilio.com/docs/iam/api) for more details. The Twilio source connector supports the following [sync modes](https://docs.airbyte.com/cloud/core-concepts#connection-sync-modes): | Feature | Supported? 
| -|:------------------------------|:-----------| +| :---------------------------- | :--------- | | Full Refresh Sync | Yes | | Incremental Sync | Yes | | Replicate Incremental Deletes | No | @@ -51,40 +53,40 @@ The Twilio source connector supports the following [sync modes](https://docs.air ## Supported Streams -* [Accounts](https://www.twilio.com/docs/usage/api/account#read-multiple-account-resources) -* [Addresses](https://www.twilio.com/docs/usage/api/address#read-multiple-address-resources) -* [Alerts](https://www.twilio.com/docs/usage/monitor-alert#read-multiple-alert-resources) \(Incremental\) -* [Applications](https://www.twilio.com/docs/usage/api/applications#read-multiple-application-resources) -* [Available Phone Number Countries](https://www.twilio.com/docs/phone-numbers/api/availablephonenumber-resource#read-a-list-of-countries) \(Incremental\) -* [Available Phone Numbers Local](https://www.twilio.com/docs/phone-numbers/api/availablephonenumberlocal-resource#read-multiple-availablephonenumberlocal-resources) \(Incremental\) -* [Available Phone Numbers Mobile](https://www.twilio.com/docs/phone-numbers/api/availablephonenumber-mobile-resource#read-multiple-availablephonenumbermobile-resources) \(Incremental\) -* [Available Phone Numbers Toll Free](https://www.twilio.com/docs/phone-numbers/api/availablephonenumber-tollfree-resource#read-multiple-availablephonenumbertollfree-resources) \(Incremental\) -* [Calls](https://www.twilio.com/docs/voice/api/call-resource#create-a-call-resource) \(Incremental\) -* [Conference Participants](https://www.twilio.com/docs/voice/api/conference-participant-resource#read-multiple-participant-resources) \(Incremental\) -* [Conferences](https://www.twilio.com/docs/voice/api/conference-resource#read-multiple-conference-resources) \(Incremental\) -* [Conversations](https://www.twilio.com/docs/conversations/api/conversation-resource#read-multiple-conversation-resources) -* [Conversation Messages](https://www.twilio.com/docs/conversations/api/conversation-message-resource#list-all-conversation-messages) -* [Conversation Participants](https://www.twilio.com/docs/conversations/api/conversation-participant-resource) -* [Dependent Phone Numbers](https://www.twilio.com/docs/usage/api/address?code-sample=code-list-dependent-pns-subresources&code-language=curl&code-sdk-version=json#instance-subresources) \(Incremental\) -* [Executions](https://www.twilio.com/docs/phone-numbers/api/incomingphonenumber-resource#read-multiple-incomingphonenumber-resources) \(Incremental\) -* [Incoming Phone Numbers](https://www.twilio.com/docs/phone-numbers/api/incomingphonenumber-resource#read-multiple-incomingphonenumber-resources) \(Incremental\) -* [Flows](https://www.twilio.com/docs/studio/rest-api/flow#read-a-list-of-flows) -* [Keys](https://www.twilio.com/docs/usage/api/keys#read-a-key-resource) -* [Message Media](https://www.twilio.com/docs/sms/api/media-resource#read-multiple-media-resources) \(Incremental\) -* [Messages](https://www.twilio.com/docs/sms/api/message-resource#read-multiple-message-resources) \(Incremental\) -* [Outgoing Caller Ids](https://www.twilio.com/docs/voice/api/outgoing-caller-ids#outgoingcallerids-list-resource) -* [Queues](https://www.twilio.com/docs/voice/api/queue-resource#read-multiple-queue-resources) -* [Recordings](https://www.twilio.com/docs/voice/api/recording#read-multiple-recording-resources) \(Incremental\) -* [Services](https://www.twilio.com/docs/chat/rest/service-resource#read-multiple-service-resources) -* 
[Step](https://www.twilio.com/docs/studio/rest-api/v2/step#read-a-list-of-step-resources) -* [Roles](https://www.twilio.com/docs/chat/rest/role-resource#read-multiple-role-resources) -* [Transcriptions](https://www.twilio.com/docs/voice/api/recording-transcription?code-sample=code-read-list-all-transcriptions&code-language=curl&code-sdk-version=json#read-multiple-transcription-resources) -* [Trunks](https://www.twilio.com/docs/sip-trunking/api/trunk-resource#trunk-properties) -* [Usage Records](https://www.twilio.com/docs/usage/api/usage-record#read-multiple-usagerecord-resources) \(Incremental\) -* [Usage Triggers](https://www.twilio.com/docs/usage/api/usage-trigger#read-multiple-usagetrigger-resources) -* [Users](https://www.twilio.com/docs/conversations/api/user-resource) -* [UserConversations](https://www.twilio.com/docs/conversations/api/user-conversation-resource#list-all-of-a-users-conversations) -* [VerifyServices](https://www.twilio.com/docs/verify/api/service#maincontent) +- [Accounts](https://www.twilio.com/docs/usage/api/account#read-multiple-account-resources) +- [Addresses](https://www.twilio.com/docs/usage/api/address#read-multiple-address-resources) +- [Alerts](https://www.twilio.com/docs/usage/monitor-alert#read-multiple-alert-resources) \(Incremental\) +- [Applications](https://www.twilio.com/docs/usage/api/applications#read-multiple-application-resources) +- [Available Phone Number Countries](https://www.twilio.com/docs/phone-numbers/api/availablephonenumber-resource#read-a-list-of-countries) \(Incremental\) +- [Available Phone Numbers Local](https://www.twilio.com/docs/phone-numbers/api/availablephonenumberlocal-resource#read-multiple-availablephonenumberlocal-resources) \(Incremental\) +- [Available Phone Numbers Mobile](https://www.twilio.com/docs/phone-numbers/api/availablephonenumber-mobile-resource#read-multiple-availablephonenumbermobile-resources) \(Incremental\) +- [Available Phone Numbers Toll Free](https://www.twilio.com/docs/phone-numbers/api/availablephonenumber-tollfree-resource#read-multiple-availablephonenumbertollfree-resources) \(Incremental\) +- [Calls](https://www.twilio.com/docs/voice/api/call-resource#create-a-call-resource) \(Incremental\) +- [Conference Participants](https://www.twilio.com/docs/voice/api/conference-participant-resource#read-multiple-participant-resources) \(Incremental\) +- [Conferences](https://www.twilio.com/docs/voice/api/conference-resource#read-multiple-conference-resources) \(Incremental\) +- [Conversations](https://www.twilio.com/docs/conversations/api/conversation-resource#read-multiple-conversation-resources) +- [Conversation Messages](https://www.twilio.com/docs/conversations/api/conversation-message-resource#list-all-conversation-messages) +- [Conversation Participants](https://www.twilio.com/docs/conversations/api/conversation-participant-resource) +- [Dependent Phone Numbers](https://www.twilio.com/docs/usage/api/address?code-sample=code-list-dependent-pns-subresources&code-language=curl&code-sdk-version=json#instance-subresources) \(Incremental\) +- [Executions](https://www.twilio.com/docs/phone-numbers/api/incomingphonenumber-resource#read-multiple-incomingphonenumber-resources) \(Incremental\) +- [Incoming Phone Numbers](https://www.twilio.com/docs/phone-numbers/api/incomingphonenumber-resource#read-multiple-incomingphonenumber-resources) \(Incremental\) +- [Flows](https://www.twilio.com/docs/studio/rest-api/flow#read-a-list-of-flows) +- [Keys](https://www.twilio.com/docs/usage/api/keys#read-a-key-resource) +- 
[Message Media](https://www.twilio.com/docs/sms/api/media-resource#read-multiple-media-resources) \(Incremental\) +- [Messages](https://www.twilio.com/docs/sms/api/message-resource#read-multiple-message-resources) \(Incremental\) +- [Outgoing Caller Ids](https://www.twilio.com/docs/voice/api/outgoing-caller-ids#outgoingcallerids-list-resource) +- [Queues](https://www.twilio.com/docs/voice/api/queue-resource#read-multiple-queue-resources) +- [Recordings](https://www.twilio.com/docs/voice/api/recording#read-multiple-recording-resources) \(Incremental\) +- [Services](https://www.twilio.com/docs/chat/rest/service-resource#read-multiple-service-resources) +- [Step](https://www.twilio.com/docs/studio/rest-api/v2/step#read-a-list-of-step-resources) +- [Roles](https://www.twilio.com/docs/chat/rest/role-resource#read-multiple-role-resources) +- [Transcriptions](https://www.twilio.com/docs/voice/api/recording-transcription?code-sample=code-read-list-all-transcriptions&code-language=curl&code-sdk-version=json#read-multiple-transcription-resources) +- [Trunks](https://www.twilio.com/docs/sip-trunking/api/trunk-resource#trunk-properties) +- [Usage Records](https://www.twilio.com/docs/usage/api/usage-record#read-multiple-usagerecord-resources) \(Incremental\) +- [Usage Triggers](https://www.twilio.com/docs/usage/api/usage-trigger#read-multiple-usagetrigger-resources) +- [Users](https://www.twilio.com/docs/conversations/api/user-resource) +- [UserConversations](https://www.twilio.com/docs/conversations/api/user-conversation-resource#list-all-of-a-users-conversations) +- [VerifyServices](https://www.twilio.com/docs/verify/api/service#maincontent) ## Performance considerations @@ -93,36 +95,36 @@ For more information, see [the Twilio docs for rate limitations](https://support ## Changelog -| Version | Date | Pull Request | Subject | -|:---------|:-----------|:---------------------------------------------------------|:--------------------------------------------------------------------------------------------------------| -| 0.11.2 | 2024-04-19 | [36666](https://github.com/airbytehq/airbyte/pull/36666) | Updating to 0.80.0 CDK | -| 0.11.1 | 2024-04-12 | [36666](https://github.com/airbytehq/airbyte/pull/36666) | Schema descriptions | -| 0.11.0 | 2024-03-19 | [36267](https://github.com/airbytehq/airbyte/pull/36267) | Pin airbyte-cdk version to `^0` | -| 0.10.2 | 2024-02-12 | [35153](https://github.com/airbytehq/airbyte/pull/35153) | Manage dependencies with Poetry | -| 0.10.1 | 2023-11-21 | [32718](https://github.com/airbytehq/airbyte/pull/32718) | Base image migration: remove Dockerfile and use the python-connector-base image | -| 0.10.0 | 2023-07-28 | [27323](https://github.com/airbytehq/airbyte/pull/27323) | Add new stream `Step` | -| 0.9.0 | 2023-06-27 | [27221](https://github.com/airbytehq/airbyte/pull/27221) | Add new stream `UserConversations` with parent `Users` | -| 0.8.1 | 2023-07-12 | [28216](https://github.com/airbytehq/airbyte/pull/28216) | Add property `channel_metadata` to `ConversationMessages` schema | -| 0.8.0 | 2023-06-11 | [27231](https://github.com/airbytehq/airbyte/pull/27231) | Add new stream `VerifyServices` | -| 0.7.0 | 2023-05-03 | [25781](https://github.com/airbytehq/airbyte/pull/25781) | Add new stream `Trunks` | -| 0.6.0 | 2023-05-03 | [25783](https://github.com/airbytehq/airbyte/pull/25783) | Add new stream `Roles` with parent `Services` | -| 0.5.0 | 2023-03-21 | [23995](https://github.com/airbytehq/airbyte/pull/23995) | Add new stream `Conversation Participants` | -| 0.4.0 | 
2023-03-18 | [23995](https://github.com/airbytehq/airbyte/pull/23995) | Add new stream `Conversation Messages` | -| 0.3.0 | 2023-03-18 | [22874](https://github.com/airbytehq/airbyte/pull/22874) | Add new stream `Executions` with parent `Flows` | -| 0.2.0 | 2023-03-16 | [24114](https://github.com/airbytehq/airbyte/pull/24114) | Add `Conversations` stream | -| 0.1.16 | 2023-02-10 | [22825](https://github.com/airbytehq/airbyte/pull/22825) | Specified date formatting in specification | -| 0.1.15 | 2023-01-27 | [22025](https://github.com/airbytehq/airbyte/pull/22025) | Set `AvailabilityStrategy` for streams explicitly to `None` | -| 0.1.14 | 2022-11-16 | [19479](https://github.com/airbytehq/airbyte/pull/19479) | Fix date range slicing | -| 0.1.13 | 2022-10-25 | [18423](https://github.com/airbytehq/airbyte/pull/18423) | Implement datetime slicing for streams supporting incremental syncs | -| 0.1.11 | 2022-09-30 | [17478](https://github.com/airbytehq/airbyte/pull/17478) | Add lookback_window parameters | -| 0.1.10 | 2022-09-29 | [17410](https://github.com/airbytehq/airbyte/pull/17410) | Migrate to per-stream states | -| 0.1.9 | 2022-09-26 | [17134](https://github.com/airbytehq/airbyte/pull/17134) | Add test data for Message Media and Conferences | -| 0.1.8 | 2022-08-29 | [16110](https://github.com/airbytehq/airbyte/pull/16110) | Add state checkpoint interval | -| 0.1.7 | 2022-08-26 | [15972](https://github.com/airbytehq/airbyte/pull/15972) | Shift start date for stream if it exceeds 400 days | -| 0.1.6 | 2022-06-22 | [14000](https://github.com/airbytehq/airbyte/pull/14000) | Update Records stream schema and align tests with connectors' best practices | -| 0.1.5 | 2022-06-22 | [13896](https://github.com/airbytehq/airbyte/pull/13896) | Add lookback window parameters to fetch messages with a rolling window and catch status updates | -| 0.1.4 | 2022-04-22 | [12157](https://github.com/airbytehq/airbyte/pull/12157) | Use Retry-After header for backoff | -| 0.1.3 | 2022-04-20 | [12183](https://github.com/airbytehq/airbyte/pull/12183) | Add new subresource on the call stream + declare a valid primary key for conference_participants stream | -| 0.1.2 | 2021-12-23 | [9092](https://github.com/airbytehq/airbyte/pull/9092) | Correct specification doc URL | -| 0.1.1 | 2021-10-18 | [7034](https://github.com/airbytehq/airbyte/pull/7034) | Update schemas and transform data types according to the API schema | -| 0.1.0 | 2021-07-02 | [4070](https://github.com/airbytehq/airbyte/pull/4070) | Native Twilio connector implemented | +| Version | Date | Pull Request | Subject | +| :------ | :--------- | :------------------------------------------------------- | :------------------------------------------------------------------------------------------------------ | +| 0.11.2 | 2024-04-19 | [36666](https://github.com/airbytehq/airbyte/pull/36666) | Updating to 0.80.0 CDK | +| 0.11.1 | 2024-04-12 | [36666](https://github.com/airbytehq/airbyte/pull/36666) | Schema descriptions | +| 0.11.0 | 2024-03-19 | [36267](https://github.com/airbytehq/airbyte/pull/36267) | Pin airbyte-cdk version to `^0` | +| 0.10.2 | 2024-02-12 | [35153](https://github.com/airbytehq/airbyte/pull/35153) | Manage dependencies with Poetry | +| 0.10.1 | 2023-11-21 | [32718](https://github.com/airbytehq/airbyte/pull/32718) | Base image migration: remove Dockerfile and use the python-connector-base image | +| 0.10.0 | 2023-07-28 | [27323](https://github.com/airbytehq/airbyte/pull/27323) | Add new stream `Step` | +| 0.9.0 | 2023-06-27 | 
[27221](https://github.com/airbytehq/airbyte/pull/27221) | Add new stream `UserConversations` with parent `Users` | +| 0.8.1 | 2023-07-12 | [28216](https://github.com/airbytehq/airbyte/pull/28216) | Add property `channel_metadata` to `ConversationMessages` schema | +| 0.8.0 | 2023-06-11 | [27231](https://github.com/airbytehq/airbyte/pull/27231) | Add new stream `VerifyServices` | +| 0.7.0 | 2023-05-03 | [25781](https://github.com/airbytehq/airbyte/pull/25781) | Add new stream `Trunks` | +| 0.6.0 | 2023-05-03 | [25783](https://github.com/airbytehq/airbyte/pull/25783) | Add new stream `Roles` with parent `Services` | +| 0.5.0 | 2023-03-21 | [23995](https://github.com/airbytehq/airbyte/pull/23995) | Add new stream `Conversation Participants` | +| 0.4.0 | 2023-03-18 | [23995](https://github.com/airbytehq/airbyte/pull/23995) | Add new stream `Conversation Messages` | +| 0.3.0 | 2023-03-18 | [22874](https://github.com/airbytehq/airbyte/pull/22874) | Add new stream `Executions` with parent `Flows` | +| 0.2.0 | 2023-03-16 | [24114](https://github.com/airbytehq/airbyte/pull/24114) | Add `Conversations` stream | +| 0.1.16 | 2023-02-10 | [22825](https://github.com/airbytehq/airbyte/pull/22825) | Specified date formatting in specification | +| 0.1.15 | 2023-01-27 | [22025](https://github.com/airbytehq/airbyte/pull/22025) | Set `AvailabilityStrategy` for streams explicitly to `None` | +| 0.1.14 | 2022-11-16 | [19479](https://github.com/airbytehq/airbyte/pull/19479) | Fix date range slicing | +| 0.1.13 | 2022-10-25 | [18423](https://github.com/airbytehq/airbyte/pull/18423) | Implement datetime slicing for streams supporting incremental syncs | +| 0.1.11 | 2022-09-30 | [17478](https://github.com/airbytehq/airbyte/pull/17478) | Add lookback_window parameters | +| 0.1.10 | 2022-09-29 | [17410](https://github.com/airbytehq/airbyte/pull/17410) | Migrate to per-stream states | +| 0.1.9 | 2022-09-26 | [17134](https://github.com/airbytehq/airbyte/pull/17134) | Add test data for Message Media and Conferences | +| 0.1.8 | 2022-08-29 | [16110](https://github.com/airbytehq/airbyte/pull/16110) | Add state checkpoint interval | +| 0.1.7 | 2022-08-26 | [15972](https://github.com/airbytehq/airbyte/pull/15972) | Shift start date for stream if it exceeds 400 days | +| 0.1.6 | 2022-06-22 | [14000](https://github.com/airbytehq/airbyte/pull/14000) | Update Records stream schema and align tests with connectors' best practices | +| 0.1.5 | 2022-06-22 | [13896](https://github.com/airbytehq/airbyte/pull/13896) | Add lookback window parameters to fetch messages with a rolling window and catch status updates | +| 0.1.4 | 2022-04-22 | [12157](https://github.com/airbytehq/airbyte/pull/12157) | Use Retry-After header for backoff | +| 0.1.3 | 2022-04-20 | [12183](https://github.com/airbytehq/airbyte/pull/12183) | Add new subresource on the call stream + declare a valid primary key for conference_participants stream | +| 0.1.2 | 2021-12-23 | [9092](https://github.com/airbytehq/airbyte/pull/9092) | Correct specification doc URL | +| 0.1.1 | 2021-10-18 | [7034](https://github.com/airbytehq/airbyte/pull/7034) | Update schemas and transform data types according to the API schema | +| 0.1.0 | 2021-07-02 | [4070](https://github.com/airbytehq/airbyte/pull/4070) | Native Twilio connector implemented | diff --git a/docs/integrations/sources/twitter.md b/docs/integrations/sources/twitter.md index 747475763fd..223eb22c095 100644 --- a/docs/integrations/sources/twitter.md +++ b/docs/integrations/sources/twitter.md @@ -22,13 +22,13 @@ To set up 
the Twitter source connector, you'll need the [App only Bearer Token]( The Twitter source connector supports the following [sync modes](https://docs.airbyte.com/cloud/core-concepts#connection-sync-modes): -* [Full Refresh - Overwrite](https://docs.airbyte.com/understanding-airbyte/connections/full-refresh-overwrite/) -* [Full Refresh - Append](https://docs.airbyte.com/understanding-airbyte/connections/full-refresh-append) -* [Incremental - Append](https://docs.airbyte.com/understanding-airbyte/connections/incremental-append) +- [Full Refresh - Overwrite](https://docs.airbyte.com/understanding-airbyte/connections/full-refresh-overwrite/) +- [Full Refresh - Append](https://docs.airbyte.com/understanding-airbyte/connections/full-refresh-append) +- [Incremental - Append](https://docs.airbyte.com/understanding-airbyte/connections/incremental-append) ## Supported Streams -* [Tweets](https://developer.twitter.com/en/docs/twitter-api/tweets/search/api-reference/get-tweets-search-recent) +- [Tweets](https://developer.twitter.com/en/docs/twitter-api/tweets/search/api-reference/get-tweets-search-recent) ## Performance considerations @@ -37,7 +37,7 @@ Rate limiting is mentioned in the API [documentation](https://developer.twitter. ## Changelog | Version | Date | Pull Request | Subject | -|:--------|:-----------|:---------------------------------------------------------|:--------------------------------------------------| +| :------ | :--------- | :------------------------------------------------------- | :------------------------------------------------ | | 0.1.2 | 2023-03-06 | [23749](https://github.com/airbytehq/airbyte/pull/23749) | Spec and docs are improved for beta certification | | 0.1.1 | 2023-03-03 | [23661](https://github.com/airbytehq/airbyte/pull/23661) | Incremental added for the "tweets" stream | | 0.1.0 | 2022-11-01 | [18883](https://github.com/airbytehq/airbyte/pull/18858) | 🎉 New Source: Twitter | diff --git a/docs/integrations/sources/tyntec-sms.md b/docs/integrations/sources/tyntec-sms.md index 1bccf5fa4ab..6ff050b5cc3 100644 --- a/docs/integrations/sources/tyntec-sms.md +++ b/docs/integrations/sources/tyntec-sms.md @@ -12,7 +12,7 @@ A Tyntec SMS API Key and SMS message request ID are required for this connector ### Step 1: Set up a Tyntec SMS connection -1. Create a new Tyntec account [here](https://www.tyntec.com/create-account). +1. Create a new Tyntec account [here](https://www.tyntec.com/create-account). 2. In the left navigation bar, click **API Settings** and navigate to **API Keys** to access your API key. ### Step 2: Set up a Tyntec SMS connector in Airbyte @@ -39,7 +39,7 @@ A Tyntec SMS API Key and SMS message request ID are required for this connector The Tyntec SMS source connector supports the following [sync modes](https://docs.airbyte.com/cloud/core-concepts#connection-sync-modes): | Feature | Supported? | -|:------------------|:-----------| +| :---------------- | :--------- | | Full Refresh Sync | Yes | | Incremental Sync | No | @@ -60,6 +60,6 @@ The Tyntec SMS connector should not run into limitations under normal usage. 
Ple ## Changelog -| Version | Date | Pull Request | Subject | -|:--------|:-----------|:---------------------------------------------------------|:------------------------------------------------| -| 0.1.0 | 2022-11-02 | [18883](https://github.com/airbytehq/airbyte/pull/18883) | 🎉 New Source: Tyntec SMS | +| Version | Date | Pull Request | Subject | +| :------ | :--------- | :------------------------------------------------------- | :------------------------ | +| 0.1.0 | 2022-11-02 | [18883](https://github.com/airbytehq/airbyte/pull/18883) | 🎉 New Source: Tyntec SMS | diff --git a/docs/integrations/sources/typeform-migrations.md b/docs/integrations/sources/typeform-migrations.md index 9f9726cb55b..e426dd846c5 100644 --- a/docs/integrations/sources/typeform-migrations.md +++ b/docs/integrations/sources/typeform-migrations.md @@ -4,4 +4,4 @@ This version upgrades the connector to the low-code framework for better maintainability. This migration includes a breaking change to the state format of the `responses` stream. -Any connection using the `responses` stream in `incremental` mode will need to be reset after the upgrade to avoid sync failures. \ No newline at end of file +Any connection using the `responses` stream in `incremental` mode will need to be reset after the upgrade to avoid sync failures. diff --git a/docs/integrations/sources/typeform.md b/docs/integrations/sources/typeform.md index 49e142130f6..5d6388c5019 100644 --- a/docs/integrations/sources/typeform.md +++ b/docs/integrations/sources/typeform.md @@ -6,13 +6,15 @@ This page guides you through the process of setting up the Typeform source conne - [Typeform Account](https://www.typeform.com/) - Form IDs (Optional) - If you want to sync data for specific forms, you'll need to have the IDs of those forms. If you want to sync data for all forms in your account you don't need any IDs. Form IDs can be found in the URLs to the forms in Typeform Admin Panel (for example, for URL `https://admin.typeform.com/form/12345/` a `12345` part would your Form ID) - -**For Airbyte Cloud:** + + + **For Airbyte Cloud:** - OAuth + **For Airbyte Open Source:** - Personal Access Token (see [personal access token](https://www.typeform.com/developers/get-started/personal-access-token/)) @@ -23,25 +25,30 @@ This page guides you through the process of setting up the Typeform source conne ### Step 1: Obtain an API token + **For Airbyte Open Source:** To get the API token for your application follow this [steps](https://developer.typeform.com/get-started/personal-access-token/) -* Log in to your account at Typeform. -* In the upper-right corner, in the drop-down menu next to your profile photo, click My Account. -* In the left menu, click Personal tokens. -* Click Generate a new token. -* In the Token name field, type a name for the token to help you identify it. -* Choose needed scopes \(API actions this token can perform - or permissions it has\). See [here](https://www.typeform.com/developers/get-started/scopes/) for more details on scopes. -* Click Generate token. + +- Log in to your account at Typeform. +- In the upper-right corner, in the drop-down menu next to your profile photo, click My Account. +- In the left menu, click Personal tokens. +- Click Generate a new token. +- In the Token name field, type a name for the token to help you identify it. +- Choose needed scopes \(API actions this token can perform - or permissions it has\). See [here](https://www.typeform.com/developers/get-started/scopes/) for more details on scopes. 
+- Click Generate token. + **For Airbyte Cloud:** This step is not needed in Airbyte Cloud. Skip to the next step. + ### Step 2: Set up the source connector in Airbyte + **For Airbyte Cloud:** 1. [Log into your Airbyte Cloud](https://cloud.airbyte.com/workspaces) account. @@ -55,6 +62,7 @@ This step is not needed in Airbyte Cloud. Skip to the next step. + **For Airbyte Open Source:** 1. Go to local Airbyte page. @@ -67,7 +75,7 @@ This step is not needed in Airbyte Cloud. Skip to the next step. ## Supported streams and sync modes | Stream | Key | Incremental | API Link | -|:-----------|-------------|:------------|-----------------------------------------------------------------------------| +| :--------- | ----------- | :---------- | --------------------------------------------------------------------------- | | Forms | id | No | https://developer.typeform.com/create/reference/retrieve-form/ | | Responses | response_id | Yes | https://developer.typeform.com/responses/reference/retrieve-responses | | Webhooks | id | No | https://developer.typeform.com/webhooks/reference/retrieve-webhooks/ | @@ -79,8 +87,8 @@ This step is not needed in Airbyte Cloud. Skip to the next step. Typeform API page size limit per source: -* Forms - 200 -* Responses - 1000 +- Forms - 200 +- Responses - 1000 Connector performs additional API call to fetch all possible `form ids` on an account using [retrieve forms endpoint](https://developer.typeform.com/create/reference/retrieve-forms/) @@ -89,7 +97,7 @@ API rate limits \(2 requests per second\): [https://developer.typeform.com/get-s ## Changelog | Version | Date | Pull Request | Subject | -|:--------|:-----------|:---------------------------------------------------------|:------------------------------------------------------------------------------------------------| +| :------ | :--------- | :------------------------------------------------------- | :---------------------------------------------------------------------------------------------- | | 1.2.8 | 2024-05-02 | [36667](https://github.com/airbytehq/airbyte/pull/36667) | Schema descriptions | | 1.2.7 | 2024-04-30 | [37599](https://github.com/airbytehq/airbyte/pull/37599) | Changed last_records to last_record | | 1.2.6 | 2024-03-13 | [36164](https://github.com/airbytehq/airbyte/pull/36164) | Unpin CDK version | @@ -109,7 +117,7 @@ API rate limits \(2 requests per second\): [https://developer.typeform.com/get-s | 0.1.11 | 2023-02-20 | [23248](https://github.com/airbytehq/airbyte/pull/23248) | Store cursor value as a string | | 0.1.10 | 2023-01-07 | [16125](https://github.com/airbytehq/airbyte/pull/16125) | Certification to Beta | | 0.1.9 | 2022-08-30 | [16125](https://github.com/airbytehq/airbyte/pull/16125) | Improve `metadata.referer` url parsing | -| 0.1.8 | 2022-08-09 | [15435](https://github.com/airbytehq/airbyte/pull/15435) | Update Forms stream schema | +| 0.1.8 | 2022-08-09 | [15435](https://github.com/airbytehq/airbyte/pull/15435) | Update Forms stream schema | | 0.1.7 | 2022-06-20 | [13935](https://github.com/airbytehq/airbyte/pull/13935) | Update Responses stream schema | | 0.1.6 | 2022-05-23 | [12280](https://github.com/airbytehq/airbyte/pull/12280) | Full Stream Coverage | | 0.1.4 | 2021-12-08 | [8425](https://github.com/airbytehq/airbyte/pull/8425) | Update title, description fields in spec | diff --git a/docs/integrations/sources/unleash.md b/docs/integrations/sources/unleash.md index c696980bec3..eef093d18f5 100644 --- a/docs/integrations/sources/unleash.md +++ 
b/docs/integrations/sources/unleash.md @@ -10,7 +10,7 @@ To access the API, you will need to sign up for an API token, which should be se ## This Source Supports the Following Streams -* features +- features ## Output schema @@ -34,10 +34,10 @@ For more information around the returned payload, [see that page](https://docs.g ## Features -| Feature | Supported?\(Yes/No\) | Notes | -| :--- | :--- | :--- | -| Full Refresh Sync | Yes | | -| Incremental Sync | No | | +| Feature | Supported?\(Yes/No\) | Notes | +| :---------------- | :------------------- | :---- | +| Full Refresh Sync | Yes | | +| Incremental Sync | No | | ## Getting started @@ -53,6 +53,6 @@ The API key that you are assigned is rate-limited. ## Changelog -| Version | Date | Pull Request | Subject | -|:--------|:-----------| :----------- |:-----------------------------------------------------------| +| Version | Date | Pull Request | Subject | +| :------ | :--------- | :-------------------------------------------------------- | :------------------------------------ | | 0.1.0 | 2022-11-30 | [#19923](https://github.com/airbytehq/airbyte/pull/19923) | 🎉 New source: Unleash [low-code CDK] | diff --git a/docs/integrations/sources/us-census.md b/docs/integrations/sources/us-census.md index 374c9ac16a1..d1c2c5e38c8 100644 --- a/docs/integrations/sources/us-census.md +++ b/docs/integrations/sources/us-census.md @@ -3,7 +3,9 @@ ## Overview This connector syncs data from the [US Census API](https://www.census.gov/data/developers/guidance/api-user-guide.Example_API_Queries.html) + + ### Output schema This source always outputs a single stream, `us_census_stream`. The output of the stream depends on the configuration of the connector. @@ -16,7 +18,9 @@ This source always outputs a single stream, `us_census_stream`. The output of th | Incremental Sync | No | | SSL connection | Yes | | Namespaces | No | + + ## Getting started ### Requirements diff --git a/docs/integrations/sources/vantage.md b/docs/integrations/sources/vantage.md index 997e065413b..d23d78aded4 100644 --- a/docs/integrations/sources/vantage.md +++ b/docs/integrations/sources/vantage.md @@ -6,17 +6,17 @@ This source can sync data from the [Vantage API](https://vantage.readme.io/refer ## This Source Supports the Following Streams -* Providers: Providers are the highest level API Primitive. A Provider represents either cloud infrastructure provider or a cloud service provider. Some examples of Providers include AWS, GCP or Azure. Providers offer many Services, which is documented below. -* Services: Services are what Providers offer to their customers. A Service is always tied to a Provider. Some examples of Services are EC2 or S3 from a Provider of AWS. A Service has one or more Products offered, which is documented below. -* Products: Products are what Services ultimately price on. Using the example of a Provider of 'AWS' and a Service of 'EC2', Products would be the individual EC2 Instance Types available such as 'm5d.16xlarge' or 'c5.xlarge'. A Product has one or more Prices, which is documented below. -* Reports +- Providers: Providers are the highest level API Primitive. A Provider represents either cloud infrastructure provider or a cloud service provider. Some examples of Providers include AWS, GCP or Azure. Providers offer many Services, which is documented below. +- Services: Services are what Providers offer to their customers. A Service is always tied to a Provider. Some examples of Services are EC2 or S3 from a Provider of AWS. 
A Service has one or more Products offered, which is documented below. +- Products: Products are what Services ultimately price on. Using the example of a Provider of 'AWS' and a Service of 'EC2', Products would be the individual EC2 Instance Types available such as 'm5d.16xlarge' or 'c5.xlarge'. A Product has one or more Prices, which is documented below. +- Reports ### Features -| Feature | Supported?\(Yes/No\) | Notes | -| :--- | :--- | :--- | -| Full Refresh Sync | Yes | | -| Incremental Sync | No | | +| Feature | Supported?\(Yes/No\) | Notes | +| :---------------- | :------------------- | :---- | +| Full Refresh Sync | Yes | | +| Incremental Sync | No | | ### Performance considerations @@ -26,10 +26,10 @@ Vantage APIs are under rate limits for the number of API calls allowed per API k ### Requirements -* Vantage Access token +- Vantage Access token ## Changelog -| Version | Date | Pull Request | Subject | -| :------ | :--------- | :-------------------------------------------------------- | :----------------------------------------- | -| 0.1.0 | 2022-10-30 | [#18665](https://github.com/airbytehq/airbyte/pull/18665) | 🎉 New Source: Vantage API [low-code CDK] | \ No newline at end of file +| Version | Date | Pull Request | Subject | +| :------ | :--------- | :-------------------------------------------------------- | :---------------------------------------- | +| 0.1.0 | 2022-10-30 | [#18665](https://github.com/airbytehq/airbyte/pull/18665) | 🎉 New Source: Vantage API [low-code CDK] | diff --git a/docs/integrations/sources/victorops.md b/docs/integrations/sources/victorops.md index 8d3df982b85..96b452854c1 100644 --- a/docs/integrations/sources/victorops.md +++ b/docs/integrations/sources/victorops.md @@ -13,27 +13,27 @@ the tables and columns you set up for replication, every time a sync is run. Several output streams are available from this source: -* [Incidents](https://portal.victorops.com/public/api-docs.html#!/Reporting/get_api_reporting_v2_incidents) \(Incremental\) -* [Teams](https://portal.victorops.com/public/api-docs.html#!/Teams/get_api_public_v1_team) -* [Users](https://portal.victorops.com/public/api-docs.html#!/Users/get_api_public_v1_user) +- [Incidents](https://portal.victorops.com/public/api-docs.html#!/Reporting/get_api_reporting_v2_incidents) \(Incremental\) +- [Teams](https://portal.victorops.com/public/api-docs.html#!/Teams/get_api_public_v1_team) +- [Users](https://portal.victorops.com/public/api-docs.html#!/Users/get_api_public_v1_user) If there are more endpoints you'd like Faros AI to support, please [create an issue.](https://github.com/faros-ai/airbyte-connectors/issues/new) ### Features -| Feature | Supported? | -| :--- | :--- | -| Full Refresh Sync | Yes | -| Incremental Sync | Yes | -| SSL connection | Yes | -| Namespaces | No | +| Feature | Supported? | +| :---------------- | :--------- | +| Full Refresh Sync | Yes | +| Incremental Sync | Yes | +| SSL connection | Yes | +| Namespaces | No | ### Performance considerations The VictorOps source should not run into VictorOps API limitations under normal usage, however your VictorOps account may be limited to a total number of API -calls per month. Please [create an +calls per month. Please [create an issue](https://github.com/faros-ai/airbyte-connectors/issues/new) if you see any rate limit issues that are not automatically retried successfully. @@ -41,14 +41,14 @@ rate limit issues that are not automatically retried successfully. 
### Requirements -* VictorOps API ID -* VictorOps API Key +- VictorOps API ID +- VictorOps API Key Please follow the [their documentation for generating a VictorOps API Key](https://help.victorops.com/knowledge-base/api/). ## Changelog -| Version | Date | Pull Request | Subject | -| :--- | :--- | :--- | :--- | -| 0.1.23 | 2021-11-17 | [150](https://github.com/faros-ai/airbyte-connectors/pull/150) | Add VictorOps source and Faros destination's conterter | +| Version | Date | Pull Request | Subject | +| :------ | :--------- | :------------------------------------------------------------- | :----------------------------------------------------- | +| 0.1.23 | 2021-11-17 | [150](https://github.com/faros-ai/airbyte-connectors/pull/150) | Add VictorOps source and Faros destination's converter | diff --git a/docs/integrations/sources/visma-economic.md b/docs/integrations/sources/visma-economic.md index b6eb66ca923..77f6e8426ce 100644 --- a/docs/integrations/sources/visma-economic.md +++ b/docs/integrations/sources/visma-economic.md @@ -1,19 +1,21 @@ -# Visma e-conomic +# Visma e-conomic ## Sync overview + This source collects data from [Visma e-conomic](https://developer.visma.com/api/e-conomic/). At the moment the source only implements full refresh, meaning you will sync all records with every new sync. ## Prerequisites -* Your Visma e-conomic Agreement Grant Token -* Your Visma e-conomic App Secret Token +- Your Visma e-conomic Agreement Grant Token +- Your Visma e-conomic App Secret Token [This page](https://www.e-conomic.com/developer/connect) guides you through the different ways of connecting to the api. -In sort your options are: -* Developer agreement -* Create a free [sandbox account](https://www.e-conomic.dk/regnskabsprogram/demo-alle), valid for 14 days. -* Demo tokens: ``app_secret_token=demo`` and ``agreement_grant_token=demo`` +In short, your options are: + +- Developer agreement +- Create a free [sandbox account](https://www.e-conomic.dk/regnskabsprogram/demo-alle), valid for 14 days. +- Demo tokens: `app_secret_token=demo` and `agreement_grant_token=demo` ## Set up the Visma e-conomic source connector @@ -24,36 +26,32 @@ In sort your options are: 5. Enter **Agreement Grant Token**. 6. Enter **Secret Key**. - - ## This Source Supports the Following Streams -* [accounts](https://restdocs.e-conomic.com/#get-accounts) -* [customers](https://restdocs.e-conomic.com/#get-customers) -* [invoices booked](https://restdocs.e-conomic.com/#get-invoices-booked) -* [invoices booked document](https://restdocs.e-conomic.com/#get-invoices-booked-bookedinvoicenumber) -* [invoices paid](https://restdocs.e-conomic.com/#get-invoices-paid) -* [invoices total](https://restdocs.e-conomic.com/#get-invoices-totals) -* [products](https://restdocs.e-conomic.com/#get-products) +- [accounts](https://restdocs.e-conomic.com/#get-accounts) +- [customers](https://restdocs.e-conomic.com/#get-customers) +- [invoices booked](https://restdocs.e-conomic.com/#get-invoices-booked) +- [invoices booked document](https://restdocs.e-conomic.com/#get-invoices-booked-bookedinvoicenumber) +- [invoices paid](https://restdocs.e-conomic.com/#get-invoices-paid) +- [invoices total](https://restdocs.e-conomic.com/#get-invoices-totals) +- [products](https://restdocs.e-conomic.com/#get-products) For more information about the api see the [E-conomic REST API Documentation](https://restdocs.e-conomic.com/#tl-dr).
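As a quick way to sanity-check credentials outside Airbyte, the sketch below requests the `accounts` endpoint with the demo tokens mentioned above. It is illustrative only and not part of the connector; the base URL, the `X-AppSecretToken`/`X-AgreementGrantToken` header names, and the `collection` response key are assumptions taken from the e-conomic REST documentation and should be verified there.

```python
# Illustrative sketch, not connector code: list accounts using the demo tokens.
import requests

headers = {
    "X-AppSecretToken": "demo",       # app secret token (demo value from the docs above)
    "X-AgreementGrantToken": "demo",  # agreement grant token (demo value)
    "Content-Type": "application/json",
}

response = requests.get("https://restapi.e-conomic.com/accounts", headers=headers, timeout=30)
response.raise_for_status()
for account in response.json().get("collection", []):
    print(account.get("accountNumber"), account.get("name"))
```

Swapping `accounts` for any of the other endpoints listed above (for example `customers` or `products`) should work the same way.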
### [Sync models](https://docs.airbyte.com/cloud/core-concepts/#connection-sync-modes) -| Feature | Supported?\(Yes/No\) | Notes | -| :--- | :--- | :--- | -| Full Refresh Sync | Yes | | -| Incremental Sync | No | | - - +| Feature | Supported?\(Yes/No\) | Notes | +| :---------------- | :------------------- | :---- | +| Full Refresh Sync | Yes | | +| Incremental Sync | No | | ## Changelog -| Version | Date | Pull Request | Subject | -| :------ |:-----------|:----------------------------------------------------|:-----------------------------------| -| 0.2.4 | 2024-04-19 | [37283](https://github.com/airbytehq/airbyte/pull/37283) | Updating to 0.80.0 CDK | -| 0.2.3 | 2024-04-18 | [37283](https://github.com/airbytehq/airbyte/pull/37283) | Manage dependencies with Poetry. | -| 0.2.2 | 2024-04-15 | [37283](https://github.com/airbytehq/airbyte/pull/37283) | Base image migration: remove Dockerfile and use the python-connector-base image | -| 0.2.1 | 2024-04-12 | [37283](https://github.com/airbytehq/airbyte/pull/37283) | schema descriptions | -| 0.2.0 | 2023-10-20 | [30991](https://github.com/airbytehq/airbyte/pull/30991) | Migrate to Low-code Framework | -| 0.1.0 | 2022-11-08 | [18595](https://github.com/airbytehq/airbyte/pull/18595) | Adding Visma e-conomic as a source | +| Version | Date | Pull Request | Subject | +| :------ | :--------- | :------------------------------------------------------- | :------------------------------------------------------------------------------ | +| 0.2.4 | 2024-04-19 | [37283](https://github.com/airbytehq/airbyte/pull/37283) | Updating to 0.80.0 CDK | +| 0.2.3 | 2024-04-18 | [37283](https://github.com/airbytehq/airbyte/pull/37283) | Manage dependencies with Poetry. | +| 0.2.2 | 2024-04-15 | [37283](https://github.com/airbytehq/airbyte/pull/37283) | Base image migration: remove Dockerfile and use the python-connector-base image | +| 0.2.1 | 2024-04-12 | [37283](https://github.com/airbytehq/airbyte/pull/37283) | schema descriptions | +| 0.2.0 | 2023-10-20 | [30991](https://github.com/airbytehq/airbyte/pull/30991) | Migrate to Low-code Framework | +| 0.1.0 | 2022-11-08 | [18595](https://github.com/airbytehq/airbyte/pull/18595) | Adding Visma e-conomic as a source | diff --git a/docs/integrations/sources/waiteraid.md b/docs/integrations/sources/waiteraid.md index 9e568f63406..7b338d14040 100644 --- a/docs/integrations/sources/waiteraid.md +++ b/docs/integrations/sources/waiteraid.md @@ -7,6 +7,7 @@ This page contains the setup guide and reference information for the Waiteraid s You can find or create authentication tokens within [Waiteraid](https://app.waiteraid.com/api-docs/index.html#auth_call). ## Setup guide + ## Step 1: Set up the Waiteraid connector in Airbyte ### For Airbyte Cloud: @@ -15,15 +16,16 @@ You can find or create authentication tokens within [Waiteraid](https://app.wait 2. In the left navigation bar, click **Sources**. In the top-right corner, click **+new source**. 3. On the Set up the source page, enter the name for the Waiteraid connector and select **Waiteraid** from the Source type dropdown. 4. Enter your `auth_token` - Waiteraid Authentication Token. -5. Enter your `restaurant ID` - The Waiteraid ID of the Restaurant you wanto sync. +5. Enter your `restaurant ID` - The Waiteraid ID of the Restaurant you want to sync. 6. Click **Set up source**. + ### For Airbyte OSS: 1. Navigate to the Airbyte Open Source dashboard. -2. Set the name for your source. 3. Enter your `auth_token` - Waiteraid Authentication Token. -4. 
Enter your `restaurant ID` - The Waiteraid ID of the Restaurant you wanto sync. +4. Enter your `restaurant ID` - The Waiteraid ID of the Restaurant you want to sync. 5. Click **Set up source**. ## Supported sync modes @@ -36,10 +38,12 @@ The Waiteraid source connector supports the following [sync modes](https://docs. | Incremental Sync | No | | SSL connection | No | | Namespaces | No | + + ## Supported Streams -* [Bookings](https://app.waiteraid.com/api-docs/index.html#api_get_bookings) +- [Bookings](https://app.waiteraid.com/api-docs/index.html#api_get_bookings) ## Data type map @@ -52,6 +56,6 @@ The Waiteraid source connector supports the following [sync modes](https://docs. ## Changelog -| Version | Date | Pull Request | Subject | -|:--------|:-----------|:---------------------------------------------------------|:--------------------------------------------------| -| 0.1.0 | 2022-10-QQ | [QQQQ](https://github.com/airbytehq/airbyte/pull/QQQQ) | New Source: Waiteraid | +| Version | Date | Pull Request | Subject | +| :------ | :--------- | :----------------------------------------------------- | :-------------------- | +| 0.1.0 | 2022-10-QQ | [QQQQ](https://github.com/airbytehq/airbyte/pull/QQQQ) | New Source: Waiteraid | diff --git a/docs/integrations/sources/weatherstack.md b/docs/integrations/sources/weatherstack.md index 8556ac0a492..14603ab9016 100644 --- a/docs/integrations/sources/weatherstack.md +++ b/docs/integrations/sources/weatherstack.md @@ -10,30 +10,30 @@ This source currently has four streams: `current`, `historical`, `forecast`, and ### Features -| Feature | Supported? | -| :--- | :--- | -| Full Refresh Sync - (append only) | Yes | -| Incremental - Append Sync | Yes | -| Namespaces | No | +| Feature | Supported? | +| :-------------------------------- | :--------- | +| Full Refresh Sync - (append only) | Yes | +| Incremental - Append Sync | Yes | +| Namespaces | No | ## Getting started ### Requirements -* An Weatherstack API key -* A city or zip code location for which you want to get weather data -* A historical date to enable the api stream to gather data for a specific date +- A Weatherstack API key +- A city or zip code location for which you want to get weather data +- A historical date to enable the api stream to gather data for a specific date ### Setup guide Visit the [Wetherstack](https://weatherstack.com/) to create a user account and obtain an API key. The current and forecast streams are available with the free plan. ## Rate limiting + The free plan allows 250 calls per month, you won't get beyond these limits with existing Airbyte's sync frequencies.
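For reference, a single call to the `current` endpoint looks roughly like the sketch below. It only illustrates the inputs the source asks for (an API key and a city or zip code query); the endpoint URL and the `access_key`/`query` parameter names are assumptions based on the Weatherstack documentation, and each such call counts against the monthly quota.

```python
# Illustrative sketch, not connector code: fetch current weather for one location.
import requests

params = {
    "access_key": "your_weatherstack_api_key",  # placeholder API key
    "query": "New York",                        # city or zip code configured in the source
}

response = requests.get("http://api.weatherstack.com/current", params=params, timeout=30)
data = response.json()
if "error" in data:  # Weatherstack reports errors in the response body rather than via HTTP status
    raise RuntimeError(data["error"])
print(data["location"]["name"], data["current"]["temperature"])
```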
## Changelog -| Version | Date | Pull Request | Subject | -| :--- | :--- | :--- | :--- | -| 0.1.0 | 2022-09-08 | [16473](https://github.com/airbytehq/airbyte/pull/16473) | Initial release | - +| Version | Date | Pull Request | Subject | +| :------ | :--------- | :------------------------------------------------------- | :-------------- | +| 0.1.0 | 2022-09-08 | [16473](https://github.com/airbytehq/airbyte/pull/16473) | Initial release | diff --git a/docs/integrations/sources/webflow.md b/docs/integrations/sources/webflow.md index 12b971cade3..426e51dc970 100644 --- a/docs/integrations/sources/webflow.md +++ b/docs/integrations/sources/webflow.md @@ -36,9 +36,9 @@ If you are interested in learning more about the Webflow API and implementation ## Changelog -| Version | Date | Pull Request | Subject | -| :------ | :--------- | :------------------------------------------------------- | :---------------------------- | -| 0.1.3 | 2022-12-11 | [33315](https://github.com/airbytehq/airbyte/pull/33315) | Updates CDK to latest version and adds additional properties to schema | -| 0.1.2 | 2022-07-14 | [14689](https://github.com/airbytehq/airbyte/pull/14689) | Webflow added IDs to streams | -| 0.1.1 | 2022-06-22 | [13617](https://github.com/airbytehq/airbyte/pull/13617) | Updates Spec Documentation URL | -| 0.1.0 | 2022-06-22 | [13617](https://github.com/airbytehq/airbyte/pull/13617) | Initial release | \ No newline at end of file +| Version | Date | Pull Request | Subject | +| :------ | :--------- | :------------------------------------------------------- | :--------------------------------------------------------------------- | +| 0.1.3 | 2022-12-11 | [33315](https://github.com/airbytehq/airbyte/pull/33315) | Updates CDK to latest version and adds additional properties to schema | +| 0.1.2 | 2022-07-14 | [14689](https://github.com/airbytehq/airbyte/pull/14689) | Webflow added IDs to streams | +| 0.1.1 | 2022-06-22 | [13617](https://github.com/airbytehq/airbyte/pull/13617) | Updates Spec Documentation URL | +| 0.1.0 | 2022-06-22 | [13617](https://github.com/airbytehq/airbyte/pull/13617) | Initial release | diff --git a/docs/integrations/sources/whisky-hunter.md b/docs/integrations/sources/whisky-hunter.md index cc999a84156..2337791fe71 100644 --- a/docs/integrations/sources/whisky-hunter.md +++ b/docs/integrations/sources/whisky-hunter.md @@ -7,20 +7,21 @@ The Whisky Hunter source can sync data from the [Whisky Hunter API](https://whis #### Output schema This source is capable of syncing the following streams: -* `auctions_data` - * Provides stats about specific auctions. -* `auctions_info` - * Provides information and metadata about recurring and one-off auctions. -* `distilleries_info` - * Provides information about distilleries. + +- `auctions_data` + - Provides stats about specific auctions. +- `auctions_info` + - Provides information and metadata about recurring and one-off auctions. +- `distilleries_info` + - Provides information about distilleries. #### Features -| Feature | Supported? | -| :--- | :--- | -| Full Refresh Sync | Yes | -| Incremental - Append Sync | No | -| Namespaces | No | +| Feature | Supported? | +| :------------------------ | :--------- | +| Full Refresh Sync | Yes | +| Incremental - Append Sync | No | +| Namespaces | No | ### Requirements / Setup Guide @@ -32,6 +33,6 @@ There is no published rate limit. 
However, since this data updates infrequently, ## Changelog -| Version | Date | Pull Request | Subject | -| :--- | :--- | :--- | :--- | -| 0.1.0 | 2022-10-12 | [17918](https://github.com/airbytehq/airbyte/pull/17918) | Initial release supporting the Whisky Hunter API | +| Version | Date | Pull Request | Subject | +| :------ | :--------- | :------------------------------------------------------- | :----------------------------------------------- | +| 0.1.0 | 2022-10-12 | [17918](https://github.com/airbytehq/airbyte/pull/17918) | Initial release supporting the Whisky Hunter API | diff --git a/docs/integrations/sources/wikipedia-pageviews.md b/docs/integrations/sources/wikipedia-pageviews.md index 7a47e8d8790..dc90c46882e 100644 --- a/docs/integrations/sources/wikipedia-pageviews.md +++ b/docs/integrations/sources/wikipedia-pageviews.md @@ -48,6 +48,6 @@ The Wikipedia Pageviews source connector supports the following [sync modes](htt ## Changelog -| Version | Date | Pull Request | Subject | -| :------ | :--------- | :----------------------------------------------------- | :------------- | +| Version | Date | Pull Request | Subject | +| :------ | :--------- | :-------------------------------------------------------- | :------------- | | 0.1.0 | 2022-10-31 | [#18343](https://github.com/airbytehq/airbyte/pull/18343) | Initial commit | diff --git a/docs/integrations/sources/wordpress.md b/docs/integrations/sources/wordpress.md index 8d70e8fbfe6..7e2e67c3b24 100644 --- a/docs/integrations/sources/wordpress.md +++ b/docs/integrations/sources/wordpress.md @@ -15,4 +15,3 @@ Reach out to your service representative or system admin to find the parameters ### Output schema The output schema is the same as that of the [Wordpress Database](https://codex.wordpress.org/Database_Description) described here. - diff --git a/docs/integrations/sources/workable.md b/docs/integrations/sources/workable.md index fdcc877d8fe..f114c01fda1 100644 --- a/docs/integrations/sources/workable.md +++ b/docs/integrations/sources/workable.md @@ -23,11 +23,11 @@ You can find or create a Workable access token within the [Workable Integrations ### For Airbyte OSS: 1. Navigate to the Airbyte Open Source dashboard. -2. Set the name for your source. -4. Enter your `api_token` - Workable Access Token. -5. Enter your `account_subdomain` - Sub-domain for your organization on Workable, e.g. https://YOUR_ACCOUNT_SUBDOMAIN.workable.com. -6. Enter your `created_after_date` - The earliest created at date from which you want to sync your Workable data. -7. Click **Set up source**. +2. Set the name for your source. +3. Enter your `api_token` - Workable Access Token. +4. Enter your `account_subdomain` - Sub-domain for your organization on Workable, e.g. https://YOUR_ACCOUNT_SUBDOMAIN.workable.com. +5. Enter your `created_after_date` - The earliest created at date from which you want to sync your Workable data. +6. Click **Set up source**. 
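To confirm the token and subdomain are valid before running a sync, a request along the lines of the sketch below should return your jobs. This is a hedged illustration rather than the connector's implementation; the `/spi/v3/` path, the bearer-token header, and the `jobs` response key are assumptions based on Workable's API documentation and should be checked there.

```python
# Illustrative sketch, not connector code: list jobs for the configured Workable account.
import requests

SUBDOMAIN = "your_account_subdomain"      # placeholder: the X in https://X.workable.com
API_TOKEN = "your_workable_access_token"  # placeholder access token

response = requests.get(
    f"https://{SUBDOMAIN}.workable.com/spi/v3/jobs",
    headers={"Authorization": f"Bearer {API_TOKEN}"},
    timeout=30,
)
response.raise_for_status()
for job in response.json().get("jobs", []):
    print(job.get("shortcode"), job.get("title"))
```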
## Supported sync modes @@ -42,14 +42,13 @@ The Workable source connector supports the following [sync modes](https://docs.a ## Supported Streams -* [Jobs](https://workable.readme.io/reference/jobs) -* [Candidates](https://workable.readme.io/reference/job-candidates-index) -* [Stages](https://workable.readme.io/reference/stages) -* [Recruiters](https://workable.readme.io/reference/recruiters) +- [Jobs](https://workable.readme.io/reference/jobs) +- [Candidates](https://workable.readme.io/reference/job-candidates-index) +- [Stages](https://workable.readme.io/reference/stages) +- [Recruiters](https://workable.readme.io/reference/recruiters) ## Changelog -| Version | Date | Pull Request | Subject | -|:--------|:-----------|:---------------------------------------------------------|:--------------------------------------------------| -| 0.1.0 | 2022-10-15 | [18033](https://github.com/airbytehq/airbyte/pull/18033) | New Source: Workable | - +| Version | Date | Pull Request | Subject | +| :------ | :--------- | :------------------------------------------------------- | :------------------- | +| 0.1.0 | 2022-10-15 | [18033](https://github.com/airbytehq/airbyte/pull/18033) | New Source: Workable | diff --git a/docs/integrations/sources/wrike.md b/docs/integrations/sources/wrike.md index 264d92558bb..732a29f2276 100644 --- a/docs/integrations/sources/wrike.md +++ b/docs/integrations/sources/wrike.md @@ -1,20 +1,20 @@ -# Wrike +# Wrike This page guides you through the process of setting up the Wrike source connector. -## Prerequisites +## Prerequisites -* Your [Wrike `Permanent Access Token`](https://help.wrike.com/hc/en-us/community/posts/211849065-Get-Started-with-Wrike-s-API) +- Your [Wrike `Permanent Access Token`](https://help.wrike.com/hc/en-us/community/posts/211849065-Get-Started-with-Wrike-s-API) -## Set up the Wrike source connector +## Set up the Wrike source connector 1. Log into your [Airbyte Cloud](https://cloud.airbyte.com/workspaces) or Airbyte OSS account. -2. Click **Sources** and then click **+ New source**. +2. Click **Sources** and then click **+ New source**. 3. On the Set up the source page, select **Wrike** from the Source type dropdown. 4. Enter a name for your source. 5. For **Permanent Access Token**, enter your [Wrike `Permanent Access Token`](https://help.wrike.com/hc/en-us/community/posts/211849065-Get-Started-with-Wrike-s-API). - - Permissions granted to the permanent token are equal to the permissions of the user who generates the token. + + Permissions granted to the permanent token are equal to the permissions of the user who generates the token. 6. For **Wrike Instance (hostname)**, add the hostname of the Wrike instance you are currently using. This could be `www.wrike.com`, `app-us2.wrike.com`, or anything similar. 7. For **Start date for comments**, enter the date in `YYYY-MM-DDTHH:mm:ssZ` format. The comments added on and after this date will be replicated. If this field is blank, Airbyte will replicate comments from the last seven days. @@ -28,12 +28,12 @@ The Wrike source connector supports on full sync refresh. 
The Wrike source connector supports the following streams: -* [Tasks](https://developers.wrike.com/api/v4/tasks/)\(Full Refresh\) -* [Customfields](https://developers.wrike.com/api/v4/custom-fields/)\(Full Refresh\) -* [Comments](https://developers.wrike.com/api/v4/comments/)\(Full Refresh\) -* [Contacts](https://developers.wrike.com/api/v4/contacts/)\(Full Refresh\) -* [Folders](https://developers.wrike.com/api/v4/folders-projects/)\(Full Refresh\) -* [Workflows](https://developers.wrike.com/api/v4/workflows/)\(Full Refresh\) +- [Tasks](https://developers.wrike.com/api/v4/tasks/)\(Full Refresh\) +- [Customfields](https://developers.wrike.com/api/v4/custom-fields/)\(Full Refresh\) +- [Comments](https://developers.wrike.com/api/v4/comments/)\(Full Refresh\) +- [Contacts](https://developers.wrike.com/api/v4/contacts/)\(Full Refresh\) +- [Folders](https://developers.wrike.com/api/v4/folders-projects/)\(Full Refresh\) +- [Workflows](https://developers.wrike.com/api/v4/workflows/)\(Full Refresh\) ### Data type mapping @@ -46,8 +46,7 @@ The Wrike connector should not run into Wrike API limitations under normal usage ## Changelog | Version | Date | Pull Request | Subject | -|:--------|:-----------|:---------------------------------------------------------|:-----------------------------------------------------------------------| +| :------ | :--------- | :------------------------------------------------------- | :--------------------------------------------------------------------- | | 0.2.1 | 2024-04-30 | [31058](https://github.com/airbytehq/airbyte/pull/31058) | Changed last_records to last_record. Fix schema for stream `workflows` | | 0.2.0 | 2023-10-10 | [31058](https://github.com/airbytehq/airbyte/pull/31058) | Migrate to low code. | | 0.1.0 | 2022-08-16 | [15638](https://github.com/airbytehq/airbyte/pull/15638) | Initial version/release of the connector. | - diff --git a/docs/integrations/sources/xero.md b/docs/integrations/sources/xero.md index 1e049713fcf..3bdf13fa9f6 100644 --- a/docs/integrations/sources/xero.md +++ b/docs/integrations/sources/xero.md @@ -8,6 +8,7 @@ This page contains the setup guide and reference information for the Xero source - Start Date **Required list of scopes to sync all streams:** + - accounting.attachments.read - accounting.budgets.read - accounting.contacts.read @@ -20,15 +21,18 @@ This page contains the setup guide and reference information for the Xero source - offline_access + **For Airbyte Cloud:** - OAuth 2.0 + **For Airbyte Open Source:** Please follow [instruction](https://developer.xero.com/documentation/guides/oauth2/auth-flow/) to obtain all requirements: + - Client ID - Client Secret - Refresh Token @@ -41,6 +45,7 @@ Please follow [instruction](https://developer.xero.com/documentation/guides/oaut ### Step 1: Set up Xero + ### Step 2: Set up the Xero connector in Airbyte **For Airbyte Cloud:** @@ -55,6 +60,7 @@ Please follow [instruction](https://developer.xero.com/documentation/guides/oaut + **For Airbyte Open Source:** 1. Create an application in [Xero development center](https://developer.xero.com/app/manage/). 
@@ -64,9 +70,9 @@ Please follow [instruction](https://developer.xero.com/documentation/guides/oaut The Xero source connector supports the following [sync modes](https://docs.airbyte.com/cloud/core-concepts#connection-sync-modes): -* [Full Refresh - Overwrite](https://docs.airbyte.com/understanding-airbyte/connections/full-refresh-overwrite/) -* [Full Refresh - Append](https://docs.airbyte.com/understanding-airbyte/connections/full-refresh-append) -* [Incremental - Append](https://docs.airbyte.com/understanding-airbyte/connections/incremental-append) +- [Full Refresh - Overwrite](https://docs.airbyte.com/understanding-airbyte/connections/full-refresh-overwrite/) +- [Full Refresh - Append](https://docs.airbyte.com/understanding-airbyte/connections/full-refresh-append) +- [Incremental - Append](https://docs.airbyte.com/understanding-airbyte/connections/incremental-append) ## Supported streams @@ -103,8 +109,8 @@ The connector is restricted by Xero [API rate limits](https://developer.xero.com ## Changelog | Version | Date | Pull Request | Subject | -|:--------|:-----------|:---------------------------------------------------------|:----------------------------------| -| 0.2.5 | 2024-01-11 | [34154](https://github.com/airbytehq/airbyte/pull/34154) | prepare for airbyte-lib | +| :------ | :--------- | :------------------------------------------------------- | :-------------------------------- | +| 0.2.5 | 2024-01-11 | [34154](https://github.com/airbytehq/airbyte/pull/34154) | prepare for airbyte-lib | | 0.2.4 | 2023-11-24 | [32837](https://github.com/airbytehq/airbyte/pull/32837) | Handle 403 error | | 0.2.3 | 2023-06-19 | [27471](https://github.com/airbytehq/airbyte/pull/27471) | Update CDK to 0.40 | | 0.2.2 | 2023-06-06 | [27007](https://github.com/airbytehq/airbyte/pull/27007) | Update CDK | diff --git a/docs/integrations/sources/yahoo-finance-price.md b/docs/integrations/sources/yahoo-finance-price.md index 47b26b571f8..7d7986cb60a 100644 --- a/docs/integrations/sources/yahoo-finance-price.md +++ b/docs/integrations/sources/yahoo-finance-price.md @@ -4,11 +4,11 @@ The Airbyte Source for [Yahoo Finance Price](https://finance.yahoo.com/) ## Changelog -| Version | Date | Pull Request | Subject | -| :------ | :--------- | :------------------------------------------------------- | :---------------------------- | -| 0.2.4 | 2024-04-19 | [37295](https://github.com/airbytehq/airbyte/pull/37295) | Updating to 0.80.0 CDK | -| 0.2.3 | 2024-04-18 | [37295](https://github.com/airbytehq/airbyte/pull/37295) | Manage dependencies with Poetry. | -| 0.2.2 | 2024-04-15 | [37295](https://github.com/airbytehq/airbyte/pull/37295) | Base image migration: remove Dockerfile and use the python-connector-base image | -| 0.2.1 | 2024-04-12 | [37295](https://github.com/airbytehq/airbyte/pull/37295) | schema descriptions | -| 0.2.0 | 2023-08-22 | [29355](https://github.com/airbytehq/airbyte/pull/29355) | Migrate to no-code framework | -| 0.1.3 | 2022-03-23 | [10563](https://github.com/airbytehq/airbyte/pull/10563) | 🎉 Source Yahoo Finance Price | +| Version | Date | Pull Request | Subject | +| :------ | :--------- | :------------------------------------------------------- | :------------------------------------------------------------------------------ | +| 0.2.4 | 2024-04-19 | [37295](https://github.com/airbytehq/airbyte/pull/37295) | Updating to 0.80.0 CDK | +| 0.2.3 | 2024-04-18 | [37295](https://github.com/airbytehq/airbyte/pull/37295) | Manage dependencies with Poetry. 
| +| 0.2.2 | 2024-04-15 | [37295](https://github.com/airbytehq/airbyte/pull/37295) | Base image migration: remove Dockerfile and use the python-connector-base image | +| 0.2.1 | 2024-04-12 | [37295](https://github.com/airbytehq/airbyte/pull/37295) | schema descriptions | +| 0.2.0 | 2023-08-22 | [29355](https://github.com/airbytehq/airbyte/pull/29355) | Migrate to no-code framework | +| 0.1.3 | 2022-03-23 | [10563](https://github.com/airbytehq/airbyte/pull/10563) | 🎉 Source Yahoo Finance Price | diff --git a/docs/integrations/sources/yandex-metrica.md b/docs/integrations/sources/yandex-metrica.md index f0db3dc99a0..031c4fd6c28 100644 --- a/docs/integrations/sources/yandex-metrica.md +++ b/docs/integrations/sources/yandex-metrica.md @@ -87,11 +87,11 @@ Because of the way API works some syncs may take a long time to finish. Timeout ## Changelog -| Version | Date | Pull Request | Subject | -| :------ | :--------- | :------------------------------------------------------- | :-------------------------------------- | -| 1.0.4 | 2024-04-19 | [37296](https://github.com/airbytehq/airbyte/pull/37296) | Updating to 0.80.0 CDK | -| 1.0.3 | 2024-04-18 | [37296](https://github.com/airbytehq/airbyte/pull/37296) | Manage dependencies with Poetry. | -| 1.0.2 | 2024-04-15 | [37296](https://github.com/airbytehq/airbyte/pull/37296) | Base image migration: remove Dockerfile and use the python-connector-base image | -| 1.0.1 | 2024-04-12 | [37296](https://github.com/airbytehq/airbyte/pull/37296) | schema descriptions | -| 1.0.0 | 2023-03-20 | [24188](https://github.com/airbytehq/airbyte/pull/24188) | Migrate to Beta; Change state structure | -| 0.1.0 | 2022-09-09 | [15061](https://github.com/airbytehq/airbyte/pull/15061) | 🎉 New Source: Yandex metrica | +| Version | Date | Pull Request | Subject | +| :------ | :--------- | :------------------------------------------------------- | :------------------------------------------------------------------------------ | +| 1.0.4 | 2024-04-19 | [37296](https://github.com/airbytehq/airbyte/pull/37296) | Updating to 0.80.0 CDK | +| 1.0.3 | 2024-04-18 | [37296](https://github.com/airbytehq/airbyte/pull/37296) | Manage dependencies with Poetry. | +| 1.0.2 | 2024-04-15 | [37296](https://github.com/airbytehq/airbyte/pull/37296) | Base image migration: remove Dockerfile and use the python-connector-base image | +| 1.0.1 | 2024-04-12 | [37296](https://github.com/airbytehq/airbyte/pull/37296) | schema descriptions | +| 1.0.0 | 2023-03-20 | [24188](https://github.com/airbytehq/airbyte/pull/24188) | Migrate to Beta; Change state structure | +| 0.1.0 | 2022-09-09 | [15061](https://github.com/airbytehq/airbyte/pull/15061) | 🎉 New Source: Yandex metrica | diff --git a/docs/integrations/sources/yotpo.md b/docs/integrations/sources/yotpo.md index c419b4b4442..6b288d49e87 100644 --- a/docs/integrations/sources/yotpo.md +++ b/docs/integrations/sources/yotpo.md @@ -4,7 +4,7 @@ This page contains the setup guide and reference information for the [Yotpo](htt ## Prerequisites -Access Token (which acts as bearer token) is mandate for this connector to work, It could be generated from the auth token call (ref - https://apidocs.yotpo.com/reference/yotpo-authentication). +Access Token (which acts as bearer token) is mandate for this connector to work, It could be generated from the auth token call (ref - https://apidocs.yotpo.com/reference/yotpo-authentication). 
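A token-generation call along the lines of the sketch below is what the referenced authentication page describes. Treat the URL and field names as assumptions to verify against that page; this is not part of the connector itself, and the resulting token is simply the value the source expects.

```python
# Illustrative sketch, not connector code: exchange Yotpo app credentials for an access token.
import requests

payload = {
    "client_id": "your_yotpo_app_key",         # placeholder: app key from Yotpo settings
    "client_secret": "your_yotpo_api_secret",  # placeholder: API secret from Yotpo settings
    "grant_type": "client_credentials",
}

response = requests.post("https://api.yotpo.com/oauth/token", json=payload, timeout=30)
response.raise_for_status()
print(response.json()["access_token"])  # use this as `access_token` when configuring the source
```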
## Setup guide @@ -13,10 +13,10 @@ Access Token (which acts as bearer token) is mandate for this connector to work, - Generate an Yotpo access token via auth endpoint (ref - https://apidocs.yotpo.com/reference/yotpo-authentication) - Setup params (All params are required) - Available params - - access_token: The generated access token - - app_key: Seen at the yotpo settings (ref - https://settings.yotpo.com/#/general_settings) - - start_date: Date filter for eligible streams, enter - - email: Registered email address + - access_token: The generated access token + - app_key: Seen at the yotpo settings (ref - https://settings.yotpo.com/#/general_settings) + - start_date: Date filter for eligible streams, enter + - email: Registered email address ## Step 2: Set up the Yotpo connector in Airbyte @@ -33,7 +33,7 @@ Access Token (which acts as bearer token) is mandate for this connector to work, 1. Navigate to the Airbyte Open Source dashboard. 2. Set the name for your source. 3. Enter your `access_token, app_key, start_date and email`. -5. Click **Set up source**. +4. Click **Set up source**. ## Supported sync modes @@ -66,6 +66,6 @@ Yotpo [API reference](https://api.yotpo.com/v1/) has v1 at present. The connecto ## Changelog -| Version | Date | Pull Request | Subject | -| :------ | :--------- | :----------------------------------------------------- | :------------- | -| 0.1.0 | 2023-04-14 | [Init](https://github.com/airbytehq/airbyte/pull/25532)| Initial commit | +| Version | Date | Pull Request | Subject | +| :------ | :--------- | :------------------------------------------------------ | :------------- | +| 0.1.0 | 2023-04-14 | [Init](https://github.com/airbytehq/airbyte/pull/25532) | Initial commit | diff --git a/docs/integrations/sources/younium.md b/docs/integrations/sources/younium.md index f65e0ecc682..9bf93b1e186 100644 --- a/docs/integrations/sources/younium.md +++ b/docs/integrations/sources/younium.md @@ -41,10 +41,10 @@ The Younium source connector supports the following [sync modes](https://docs.ai ## Changelog -| Version | Date | Pull Request | Subject | -| :------ | :--------- | :------------------------------------------------------- |:---------------------------------------------------| -| 0.3.2 | 2024-04-19 | [37298](https://github.com/airbytehq/airbyte/pull/37298) | Upgrade to CDK 0.80.0 and manage dependencies with Poetry. | -| 0.3.1 | 2024-04-12 | [37298](https://github.com/airbytehq/airbyte/pull/37298) | schema descriptions | -| 0.3.0 | 2023-10-25 | [31690](https://github.com/airbytehq/airbyte/pull/31690) | Migrate to low-code framework | -| 0.2.0 | 2023-03-29 | [24655](https://github.com/airbytehq/airbyte/pull/24655) | Source Younium: Adding Booking and Account streams | -| 0.1.0 | 2022-11-09 | [18758](https://github.com/airbytehq/airbyte/pull/18758) | 🎉 New Source: Younium [python cdk] | +| Version | Date | Pull Request | Subject | +| :------ | :--------- | :------------------------------------------------------- | :--------------------------------------------------------- | +| 0.3.2 | 2024-04-19 | [37298](https://github.com/airbytehq/airbyte/pull/37298) | Upgrade to CDK 0.80.0 and manage dependencies with Poetry. 
| +| 0.3.1 | 2024-04-12 | [37298](https://github.com/airbytehq/airbyte/pull/37298) | schema descriptions | +| 0.3.0 | 2023-10-25 | [31690](https://github.com/airbytehq/airbyte/pull/31690) | Migrate to low-code framework | +| 0.2.0 | 2023-03-29 | [24655](https://github.com/airbytehq/airbyte/pull/24655) | Source Younium: Adding Booking and Account streams | +| 0.1.0 | 2022-11-09 | [18758](https://github.com/airbytehq/airbyte/pull/18758) | 🎉 New Source: Younium [python cdk] | diff --git a/docs/integrations/sources/youtube-analytics.md b/docs/integrations/sources/youtube-analytics.md index b3eaa441609..3cc59ebf389 100644 --- a/docs/integrations/sources/youtube-analytics.md +++ b/docs/integrations/sources/youtube-analytics.md @@ -4,19 +4,19 @@ This page contains the setup guide and reference information for the YouTube Ana ## Prerequisites -YouTube does not start to generate a report until you create a [reporting job](https://developers.google.com/youtube/reporting/v1/reports#step-3:-create-a-reporting-job) for that report. -Airbyte creates a reporting job for your report or uses current reporting job if it's already exists. -The report will be available within 48 hours of creating the reporting job and will be for the day that the job was scheduled. -For example, if you schedule a job on September 1, 2015, then the report for September 1, 2015, will be ready on September 3, 2015. -The report for September 2, 2015, will be posted on September 4, 2015, and so forth. +YouTube does not start to generate a report until you create a [reporting job](https://developers.google.com/youtube/reporting/v1/reports#step-3:-create-a-reporting-job) for that report. +Airbyte creates a reporting job for your report or uses current reporting job if it's already exists. +The report will be available within 48 hours of creating the reporting job and will be for the day that the job was scheduled. +For example, if you schedule a job on September 1, 2015, then the report for September 1, 2015, will be ready on September 3, 2015. +The report for September 2, 2015, will be posted on September 4, 2015, and so forth. Youtube also generates historical data reports covering the 30-day period prior to when you created the job. Airbyte syncs all available historical data too. ## Setup guide + ### Step 1: Set up YouTube Analytics - -* Go to the [YouTube Reporting API dashboard](https://console.cloud.google.com/apis/api/youtubereporting.googleapis.com/overview) in the project for your service user. Enable the API for your account. -* Use your Google account and authorize over Google's OAuth 2.0 on connection setup. Please make sure to grant the following [authorization scope](https://developers.google.com/youtube/reporting/v1/reports#step-1:-retrieve-authorization-credentials): `https://www.googleapis.com/auth/yt-analytics.readonly`. +- Go to the [YouTube Reporting API dashboard](https://console.cloud.google.com/apis/api/youtubereporting.googleapis.com/overview) in the project for your service user. Enable the API for your account. +- Use your Google account and authorize over Google's OAuth 2.0 on connection setup. Please make sure to grant the following [authorization scope](https://developers.google.com/youtube/reporting/v1/reports#step-1:-retrieve-authorization-credentials): `https://www.googleapis.com/auth/yt-analytics.readonly`. ## Step 2: Set up the YouTube Analytics connector in Airbyte @@ -29,61 +29,62 @@ Youtube also generates historical data reports covering the 30-day period prior 5. 
Log in and Authorize to the Instagram account and click `Set up source`. ### For Airbyte OSS: + 2. In the left navigation bar, click **Sources**. In the top-right corner, click **+new source**. 3. On the Set up the source page, enter the name for the YouTube Analytics connector and select **YouTube Analytics** from the Source type dropdown. 4. Select `client_id` -4. Select `client_secret` -4. Select `refresh_token` -5. Click `Set up source`. +5. Select `client_secret` +6. Select `refresh_token` +7. Click `Set up source`. ## Supported sync modes The YouTube Analytics source connector supports the following [sync modes](https://docs.airbyte.com/cloud/core-concepts#connection-sync-modes): -| Feature | Supported? | -| :--- | :--- | -| Full Refresh Sync | Yes | -| Incremental Sync | Yes | -| SSL connection | Yes | -| Channel Reports | Yes | +| Feature | Supported? | +| :-------------------- | :---------- | +| Full Refresh Sync | Yes | +| Incremental Sync | Yes | +| SSL connection | Yes | +| Channel Reports | Yes | | Content Owner Reports | Coming soon | -| YouTube Data API | Coming soon | +| YouTube Data API | Coming soon | ## Supported Streams -* [channel_annotations_a1](https://developers.google.com/youtube/reporting/v1/reports/channel_reports#video-annotations) -* [channel_basic_a2](https://developers.google.com/youtube/reporting/v1/reports/channel_reports#video-user-activity) -* [channel_cards_a1](https://developers.google.com/youtube/reporting/v1/reports/channel_reports#video-cards) -* [channel_combined_a2](https://developers.google.com/youtube/reporting/v1/reports/channel_reports#video-combined) -* [channel_demographics_a1](https://developers.google.com/youtube/reporting/v1/reports/channel_reports#video-viewer-demographics) -* [channel_device_os_a2](https://developers.google.com/youtube/reporting/v1/reports/channel_reports#video-device-type-and-operating-system) -* [channel_end_screens_a1](https://developers.google.com/youtube/reporting/v1/reports/channel_reports#video-end-screens) -* [channel_playback_location_a2](https://developers.google.com/youtube/reporting/v1/reports/channel_reports#video-playback-locations) -* [channel_province_a2](https://developers.google.com/youtube/reporting/v1/reports/channel_reports#video-province) -* [channel_sharing_service_a1](https://developers.google.com/youtube/reporting/v1/reports/channel_reports#video-content-sharing) -* [channel_subtitles_a2](https://developers.google.com/youtube/reporting/v1/reports/channel_reports#video-subtitles) -* [channel_traffic_source_a2](https://developers.google.com/youtube/reporting/v1/reports/channel_reports#video-traffic-sources) -* [playlist_basic_a1](https://developers.google.com/youtube/reporting/v1/reports/channel_reports#playlist-user-activity) -* [playlist_combined_a1](https://developers.google.com/youtube/reporting/v1/reports/channel_reports#playlist-combined) -* [playlist_device_os_a1](https://developers.google.com/youtube/reporting/v1/reports/channel_reports#playlist-device-type-and-operating-system) -* [playlist_playback_location_a1](https://developers.google.com/youtube/reporting/v1/reports/channel_reports#playlist-playback-locations) -* [playlist_province_a1](https://developers.google.com/youtube/reporting/v1/reports/channel_reports#playlist-province) -* [playlist_traffic_source_a1](https://developers.google.com/youtube/reporting/v1/reports/channel_reports#playlist-traffic-sources) +- [channel_annotations_a1](https://developers.google.com/youtube/reporting/v1/reports/channel_reports#video-annotations) +- 
[channel_basic_a2](https://developers.google.com/youtube/reporting/v1/reports/channel_reports#video-user-activity) +- [channel_cards_a1](https://developers.google.com/youtube/reporting/v1/reports/channel_reports#video-cards) +- [channel_combined_a2](https://developers.google.com/youtube/reporting/v1/reports/channel_reports#video-combined) +- [channel_demographics_a1](https://developers.google.com/youtube/reporting/v1/reports/channel_reports#video-viewer-demographics) +- [channel_device_os_a2](https://developers.google.com/youtube/reporting/v1/reports/channel_reports#video-device-type-and-operating-system) +- [channel_end_screens_a1](https://developers.google.com/youtube/reporting/v1/reports/channel_reports#video-end-screens) +- [channel_playback_location_a2](https://developers.google.com/youtube/reporting/v1/reports/channel_reports#video-playback-locations) +- [channel_province_a2](https://developers.google.com/youtube/reporting/v1/reports/channel_reports#video-province) +- [channel_sharing_service_a1](https://developers.google.com/youtube/reporting/v1/reports/channel_reports#video-content-sharing) +- [channel_subtitles_a2](https://developers.google.com/youtube/reporting/v1/reports/channel_reports#video-subtitles) +- [channel_traffic_source_a2](https://developers.google.com/youtube/reporting/v1/reports/channel_reports#video-traffic-sources) +- [playlist_basic_a1](https://developers.google.com/youtube/reporting/v1/reports/channel_reports#playlist-user-activity) +- [playlist_combined_a1](https://developers.google.com/youtube/reporting/v1/reports/channel_reports#playlist-combined) +- [playlist_device_os_a1](https://developers.google.com/youtube/reporting/v1/reports/channel_reports#playlist-device-type-and-operating-system) +- [playlist_playback_location_a1](https://developers.google.com/youtube/reporting/v1/reports/channel_reports#playlist-playback-locations) +- [playlist_province_a1](https://developers.google.com/youtube/reporting/v1/reports/channel_reports#playlist-province) +- [playlist_traffic_source_a1](https://developers.google.com/youtube/reporting/v1/reports/channel_reports#playlist-traffic-sources) ## Performance considerations -* Free requests per day: 20,000 -* Free requests per 100 seconds: 100 -* Free requests per minute: 60 +- Free requests per day: 20,000 +- Free requests per 100 seconds: 100 +- Free requests per minute: 60 Quota usage is not an issue because data is retrieved once and then filtered, sorted, and queried within the application. 
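For context on the reporting jobs described in the Prerequisites, the sketch below shows roughly how a job for one report type can be reused or created with the Google API client, using the same `client_id`/`client_secret`/`refresh_token` values the connector is configured with. It is an illustration under stated assumptions (the library, the method names, and the `channel_basic_a2` report type are taken from the YouTube Reporting API documentation), not the connector's actual implementation.

```python
# Illustrative sketch, not connector code: reuse or create a YouTube Reporting API job.
from google.oauth2.credentials import Credentials
from googleapiclient.discovery import build

creds = Credentials(
    token=None,
    refresh_token="your_refresh_token",   # placeholders for the OAuth values above
    client_id="your_client_id",
    client_secret="your_client_secret",
    token_uri="https://oauth2.googleapis.com/token",
)
reporting = build("youtubereporting", "v1", credentials=creds)

report_type_id = "channel_basic_a2"  # any report type from the stream list above
jobs = reporting.jobs().list().execute().get("jobs", [])
job = next((j for j in jobs if j.get("reportTypeId") == report_type_id), None)
if job is None:  # reports only start generating once a job exists (see Prerequisites)
    body = {"reportTypeId": report_type_id, "name": f"airbyte-{report_type_id}"}
    job = reporting.jobs().create(body=body).execute()
print(job["id"], job["reportTypeId"])
```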
## Changelog | Version | Date | Pull Request | Subject | -|:--------|:-----------|:---------------------------------------------------------|:-----------------------------------------------| -| 0.1.4 | 2023-05-22 | [26420](https://github.com/airbytehq/airbyte/pull/26420) | Migrate to advancedAuth | -| 0.1.3 | 2022-09-30 | [17454](https://github.com/airbytehq/airbyte/pull/17454) | Added custom backoff logic | +| :------ | :--------- | :------------------------------------------------------- | :--------------------------------------------- | +| 0.1.4 | 2023-05-22 | [26420](https://github.com/airbytehq/airbyte/pull/26420) | Migrate to advancedAuth | +| 0.1.3 | 2022-09-30 | [17454](https://github.com/airbytehq/airbyte/pull/17454) | Added custom backoff logic | | 0.1.2 | 2022-09-29 | [17399](https://github.com/airbytehq/airbyte/pull/17399) | Fixed `403` error while `check connection` | | 0.1.1 | 2022-08-18 | [15744](https://github.com/airbytehq/airbyte/pull/15744) | Fix `channel_basic_a2` schema fields data type | | 0.1.0 | 2021-11-01 | [7407](https://github.com/airbytehq/airbyte/pull/7407) | Initial Release | diff --git a/docs/integrations/sources/zapier-supported-storage.md b/docs/integrations/sources/zapier-supported-storage.md index 9659bc93293..c1138567e33 100644 --- a/docs/integrations/sources/zapier-supported-storage.md +++ b/docs/integrations/sources/zapier-supported-storage.md @@ -7,7 +7,7 @@ The Zapier Supported Storage Connector can be used to sync your [Zapier](https:/ #### Data type mapping | Integration Type | Airbyte Type | Notes | -|:-----------------|:-------------|:------| +| :--------------- | :----------- | :---- | | `string` | `string` | | | `integer` | `integer` | | | `array` | `array` | | @@ -16,14 +16,13 @@ The Zapier Supported Storage Connector can be used to sync your [Zapier](https:/ ### Requirements -* secret - The Storage by Zapier secret. +- secret - The Storage by Zapier secret. ## Changelog -| Version | Date | Pull Request | Subject | -|:--------|:-----------|:---------------------------------------------------------| | +| Version | Date | Pull Request | Subject | +|:--------|:-----------|:---------------------------------------------------------| | | 0.1.3 | 2024-04-19 | [37300](https://github.com/airbytehq/airbyte/pull/37300) | Upgrade to CDK 0.80.0 and manage dependencies with Poetry. | | 0.1.2 | 2024-04-15 | [37300](https://github.com/airbytehq/airbyte/pull/37300) | Base image migration: remove Dockerfile and use the python-connector-base image | | 0.1.1 | 2024-04-12 | [37300](https://github.com/airbytehq/airbyte/pull/37300) | schema descriptions | | 0.1.0 | 2022-10-25 | [18442](https://github.com/airbytehq/airbyte/pull/18442) | Initial release | - diff --git a/docs/integrations/sources/zencart.md b/docs/integrations/sources/zencart.md index 2002f25cb07..f9c6ea81bc3 100644 --- a/docs/integrations/sources/zencart.md +++ b/docs/integrations/sources/zencart.md @@ -15,4 +15,3 @@ Reach out to your service representative or system admin to find the parameters ### Output schema The output schema is the same as that of the [Zencart Database](https://docs.zen-cart.com/dev/schema/) described here. 
- diff --git a/docs/integrations/sources/zendesk-chat.md b/docs/integrations/sources/zendesk-chat.md index ef641f77cdf..9ec646f07af 100644 --- a/docs/integrations/sources/zendesk-chat.md +++ b/docs/integrations/sources/zendesk-chat.md @@ -80,8 +80,8 @@ The connector is restricted by Zendesk's [requests limitation](https://developer | Version | Date | Pull Request | Subject | | :------ | :--------- | :------------------------------------------------------- | :--------------------------------------------------------------------------------------------------------------- | -| 0.3.0 | 2024-03-07 | [35867](https://github.com/airbytehq/airbyte/pull/35867) | Migrated to `YamlDeclarativeSource (Low-code)` Airbyte CDK | -| 0.2.2 | 2024-02-12 | [35185](https://github.com/airbytehq/airbyte/pull/35185) | Manage dependencies with Poetry. | +| 0.3.0 | 2024-03-07 | [35867](https://github.com/airbytehq/airbyte/pull/35867) | Migrated to `YamlDeclarativeSource (Low-code)` Airbyte CDK | +| 0.2.2 | 2024-02-12 | [35185](https://github.com/airbytehq/airbyte/pull/35185) | Manage dependencies with Poetry. | | 0.2.1 | 2023-10-20 | [31643](https://github.com/airbytehq/airbyte/pull/31643) | Upgrade base image to airbyte/python-connector-base:1.1.0 | | 0.2.0 | 2023-10-11 | [30526](https://github.com/airbytehq/airbyte/pull/30526) | Use the python connector base image, remove dockerfile and implement build_customization.py | | 0.1.14 | 2023-02-10 | [24190](https://github.com/airbytehq/airbyte/pull/24190) | Fix remove too high min/max from account stream | diff --git a/docs/integrations/sources/zendesk-sell.md b/docs/integrations/sources/zendesk-sell.md index edd602fab4d..5f91904a6c3 100644 --- a/docs/integrations/sources/zendesk-sell.md +++ b/docs/integrations/sources/zendesk-sell.md @@ -10,45 +10,45 @@ This source can sync data for the [Zendesk Sell API](https://developer.zendesk.c This Source is capable of syncing the following core Streams: -* Call Outcomes -* Calls -* Collaborations -* Contacts -* Deal Sources -* Deal Unqualified Reason -* Deals -* Lead Conversions -* Lead Sources -* Lead Unqualified Reason -* Leads -* Loss Reasons -* Notes -* Orders -* Pipelines -* Products -* Stages -* Tags -* Tasks -* Text Messages -* Users -* Visit Outcomes -* Visits +- Call Outcomes +- Calls +- Collaborations +- Contacts +- Deal Sources +- Deal Unqualified Reason +- Deals +- Lead Conversions +- Lead Sources +- Lead Unqualified Reason +- Leads +- Loss Reasons +- Notes +- Orders +- Pipelines +- Products +- Stages +- Tags +- Tasks +- Text Messages +- Users +- Visit Outcomes +- Visits ### Data type mapping | Integration Type | Airbyte Type | Notes | -| :--- | :--- | :--- | -| `string` | `string` | | -| `number` | `number` | | -| `array` | `array` | | -| `object` | `object` | | +| :--------------- | :----------- | :---- | +| `string` | `string` | | +| `number` | `number` | | +| `array` | `array` | | +| `object` | `object` | | ### Features -| Feature | Supported?\(Yes/No\) | Notes | -| :--- | :--- | :--- | -| Full Refresh Sync | Yes | | -| Incremental Sync | Yes | | +| Feature | Supported?\(Yes/No\) | Notes | +| :---------------- | :------------------- | :---- | +| Full Refresh Sync | Yes | | +| Incremental Sync | Yes | | ### Performance considerations @@ -60,8 +60,7 @@ The Zendesk connector should not run into Zendesk API limitations under normal u ### Requirements -* Zendesk Sell API Token - +- Zendesk Sell API Token ### Setup guide @@ -73,10 +72,8 @@ We recommend creating a restricted, read-only key specifically for Airbyte acces 
## Changelog -| Version | Date | Pull Request | Subject | -| :--- | :--- | :--- | :--- | -| 0.1.0 | 2022-10-27 | [17888](https://github.com/airbytehq/airbyte/pull/17888) | Initial Release | -| 0.1.1 | 2023-08-30 | [29830](https://github.com/airbytehq/airbyte/pull/29830) | Change phone_number in Calls to string (bug in zendesk sell api documentation) | -| 0.2.0 | 2023-10-23 | [31016](https://github.com/airbytehq/airbyte/pull/31016) | Migrated to Low Code CDK | - - +| Version | Date | Pull Request | Subject | +| :------ | :--------- | :------------------------------------------------------- | :----------------------------------------------------------------------------- | +| 0.1.0 | 2022-10-27 | [17888](https://github.com/airbytehq/airbyte/pull/17888) | Initial Release | +| 0.1.1 | 2023-08-30 | [29830](https://github.com/airbytehq/airbyte/pull/29830) | Change phone_number in Calls to string (bug in zendesk sell api documentation) | +| 0.2.0 | 2023-10-23 | [31016](https://github.com/airbytehq/airbyte/pull/31016) | Migrated to Low Code CDK | diff --git a/docs/integrations/sources/zendesk-sunshine.md b/docs/integrations/sources/zendesk-sunshine.md index dba78a7dadf..8ebf051e5ca 100644 --- a/docs/integrations/sources/zendesk-sunshine.md +++ b/docs/integrations/sources/zendesk-sunshine.md @@ -10,32 +10,32 @@ This source can sync data for the [Zendesk Sunshine API](https://developer.zende This Source is capable of syncing the following core Streams: -* [ObjectTypes](https://developer.zendesk.com/api-reference/custom-data/custom-objects-api/resource_types/) -* [ObjectRecords](https://developer.zendesk.com/api-reference/custom-data/custom-objects-api/resources/) -* [RelationshipTypes](https://developer.zendesk.com/api-reference/custom-data/custom-objects-api/relationship_types/) -* [RelationshipRecords](https://developer.zendesk.com/api-reference/custom-data/custom-objects-api/relationships/) -* [ObjectTypePolicies](https://developer.zendesk.com/api-reference/custom-data/custom-objects-api/permissions/) -* [Jobs](https://developer.zendesk.com/api-reference/custom-data/custom-objects-api/jobs/) +- [ObjectTypes](https://developer.zendesk.com/api-reference/custom-data/custom-objects-api/resource_types/) +- [ObjectRecords](https://developer.zendesk.com/api-reference/custom-data/custom-objects-api/resources/) +- [RelationshipTypes](https://developer.zendesk.com/api-reference/custom-data/custom-objects-api/relationship_types/) +- [RelationshipRecords](https://developer.zendesk.com/api-reference/custom-data/custom-objects-api/relationships/) +- [ObjectTypePolicies](https://developer.zendesk.com/api-reference/custom-data/custom-objects-api/permissions/) +- [Jobs](https://developer.zendesk.com/api-reference/custom-data/custom-objects-api/jobs/) This stream is currently not available because it stores data temporary. 
-* [Limits](https://developer.zendesk.com/api-reference/custom-data/custom-objects-api/limits/) +- [Limits](https://developer.zendesk.com/api-reference/custom-data/custom-objects-api/limits/) ### Data type mapping | Integration Type | Airbyte Type | Notes | -| :--- | :--- | :--- | -| `string` | `string` | | -| `number` | `number` | | -| `array` | `array` | | -| `object` | `object` | | +| :--------------- | :----------- | :---- | +| `string` | `string` | | +| `number` | `number` | | +| `array` | `array` | | +| `object` | `object` | | ### Features -| Feature | Supported?\(Yes/No\) | Notes | -| :--- | :--- | :--- | -| Full Refresh Sync | Yes | | -| Incremental Sync | Yes | | +| Feature | Supported?\(Yes/No\) | Notes | +| :---------------- | :------------------- | :---- | +| Full Refresh Sync | Yes | | +| Incremental Sync | Yes | | ### Performance considerations @@ -47,10 +47,11 @@ The Zendesk connector should not run into Zendesk API limitations under normal u ### Requirements -* Zendesk Sunshine API Token +- Zendesk Sunshine API Token OR -* Zendesk Sunshine oauth2.0 application (client_id, client_secret, access_token) + +- Zendesk Sunshine oauth2.0 application (client_id, client_secret, access_token) ### Setup guide @@ -62,14 +63,13 @@ We recommend creating a restricted, read-only key specifically for Airbyte acces ## Changelog -| Version | Date | Pull Request | Subject | -| :--- | :--- | :--- | :--- | -| 0.2.4 | 2024-04-19 | [37302](https://github.com/airbytehq/airbyte/pull/37302) | Updating to 0.80.0 CDK | -| 0.2.3 | 2024-04-18 | [37302](https://github.com/airbytehq/airbyte/pull/37302) | Manage dependencies with Poetry. | -| 0.2.2 | 2024-04-15 | [37302](https://github.com/airbytehq/airbyte/pull/37302) | Base image migration: remove Dockerfile and use the python-connector-base image | -| 0.2.1 | 2024-04-12 | [37302](https://github.com/airbytehq/airbyte/pull/37302) | schema descriptions | -| 0.2.0 | 2023-08-22 | [29310](https://github.com/airbytehq/airbyte/pull/29310) | Migrate Python CDK to Low Code | -| 0.1.2 | 2023-08-15 | [7976](https://github.com/airbytehq/airbyte/pull/7976) | Fix schemas and tests | -| 0.1.1 | 2021-11-15 | [7976](https://github.com/airbytehq/airbyte/pull/7976) | Add oauth2.0 support | -| 0.1.0 | 2021-07-08 | [4359](https://github.com/airbytehq/airbyte/pull/4359) | Initial Release | - +| Version | Date | Pull Request | Subject | +| :------ | :--------- | :------------------------------------------------------- | :------------------------------------------------------------------------------ | +| 0.2.4 | 2024-04-19 | [37302](https://github.com/airbytehq/airbyte/pull/37302) | Updating to 0.80.0 CDK | +| 0.2.3 | 2024-04-18 | [37302](https://github.com/airbytehq/airbyte/pull/37302) | Manage dependencies with Poetry. 
| +| 0.2.2 | 2024-04-15 | [37302](https://github.com/airbytehq/airbyte/pull/37302) | Base image migration: remove Dockerfile and use the python-connector-base image | +| 0.2.1 | 2024-04-12 | [37302](https://github.com/airbytehq/airbyte/pull/37302) | schema descriptions | +| 0.2.0 | 2023-08-22 | [29310](https://github.com/airbytehq/airbyte/pull/29310) | Migrate Python CDK to Low Code | +| 0.1.2 | 2023-08-15 | [7976](https://github.com/airbytehq/airbyte/pull/7976) | Fix schemas and tests | +| 0.1.1 | 2021-11-15 | [7976](https://github.com/airbytehq/airbyte/pull/7976) | Add oauth2.0 support | +| 0.1.0 | 2021-07-08 | [4359](https://github.com/airbytehq/airbyte/pull/4359) | Initial Release | diff --git a/docs/integrations/sources/zendesk-support-migrations.md b/docs/integrations/sources/zendesk-support-migrations.md index c157e43f6ba..bfc55cb67a3 100644 --- a/docs/integrations/sources/zendesk-support-migrations.md +++ b/docs/integrations/sources/zendesk-support-migrations.md @@ -7,4 +7,4 @@ Stream `Deleted Tickets` is removed. You may need to refresh the connection sche ## Upgrading to 1.0.0 `cursor_field` for `Tickets` stream is changed to `generated_timestamp`. -For a smooth migration, data reset and schema refresh are needed. \ No newline at end of file +For a smooth migration, data reset and schema refresh are needed. diff --git a/docs/integrations/sources/zendesk-support.md b/docs/integrations/sources/zendesk-support.md index 85e9ea4e306..4c4d7598f91 100644 --- a/docs/integrations/sources/zendesk-support.md +++ b/docs/integrations/sources/zendesk-support.md @@ -18,12 +18,15 @@ The Zendesk Support source connector supports two authentication methods: - API token + **For Airbyte Cloud:** We highly recommend using OAuth to authenticate your Zendesk Support account, as it simplifies the setup process and allows you to authenticate [directly from the Airbyte UI](#set-up-the-zendesk-support-source-connector). + + **For Airbyte Open Source:** We recommend using an API token to authenticate your Zendesk Support account. Please follow the steps below to generate this key. @@ -55,11 +58,13 @@ If you prefer to authenticate with OAuth for **Airbyte Open Source**, you can fo 4. For **Source name**, enter a name to help you identify this source. 5. You can use OAuth or an API token to authenticate your Zendesk Support account. + - **For Airbyte Cloud**: To authenticate using OAuth, select **OAuth 2.0** from the Authentication dropdown, then click **Authenticate your Zendesk Support account** to sign in with Zendesk Support and authorize your account. - - + + - **For Airbyte Open Source**: To authenticate using an API key, select **API Token** from the Authentication dropdown and enter the API token you generated, as well as the email address associated with your Zendesk Support account. + 6. For **Subdomain**, enter your Zendesk subdomain. This is the subdomain found in your account URL. For example, if your account URL is `https://MY_SUBDOMAIN.zendesk.com/`, then `MY_SUBDOMAIN` is your subdomain. 7. (Optional) For **Start Date**, use the provided datepicker or enter a UTC date and time programmatically in the format `YYYY-MM-DDTHH:mm:ssZ`. The data added on and after this date will be replicated. If this field is left blank, Airbyte will replicate the data for the last two years by default. 8. Click **Set up source** and wait for the tests to complete. @@ -81,7 +86,7 @@ There are two types of incremental sync: 1. 
Incremental (standard server-side, where API returns only the data updated or generated since the last sync). 2. Client-Side Incremental (API returns all available data and connector filters out only new records). -::: + ::: ## Supported streams @@ -123,10 +128,11 @@ The Zendesk Support source connector supports the following streams: - [UserFields](https://developer.zendesk.com/api-reference/ticketing/users/user_fields/#list-user-fields) ### Deleted Records Support + The Zendesk Support connector fetches deleted records in the following streams: | Stream | Deletion indicator field | -|:-------------------------|:-------------------------| +| :----------------------- | :----------------------- | | **Brands** | `is_deleted` | | **Groups** | `deleted` | | **Organizations** | `deleted_at` | @@ -150,14 +156,14 @@ The Zendesk connector ideally should not run into Zendesk API limitations under ### Troubleshooting -* Check out common troubleshooting issues for the Zendesk Support source connector on our [Airbyte Forum](https://github.com/airbytehq/airbyte/discussions). +- Check out common troubleshooting issues for the Zendesk Support source connector on our [Airbyte Forum](https://github.com/airbytehq/airbyte/discussions). ## Changelog | Version | Date | Pull Request | Subject | -|:--------|:-----------|:---------------------------------------------------------|:-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------| +| :------ | :--------- | :------------------------------------------------------- | :--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------- | | 2.6.3 | 2024-05-02 | [36669](https://github.com/airbytehq/airbyte/pull/36669) | Schema descriptions | | 2.6.2 | 2024-02-05 | [37761](https://github.com/airbytehq/airbyte/pull/37761) | Add stop condition for `Ticket Audits` when recieved old records; Ignore 403 and 404 status codes. | | 2.6.1 | 2024-04-30 | [37723](https://github.com/airbytehq/airbyte/pull/37723) | Add %Y-%m-%dT%H:%M:%S%z to cursor_datetime_formats | diff --git a/docs/integrations/sources/zendesk-talk.md b/docs/integrations/sources/zendesk-talk.md index 0cecfd7e434..bb0e6aa3d38 100644 --- a/docs/integrations/sources/zendesk-talk.md +++ b/docs/integrations/sources/zendesk-talk.md @@ -2,9 +2,9 @@ ## Prerequisites -* Zendesk API Token or Zendesk OAuth Client -* Zendesk Email (For API Token authentication) -* Zendesk Subdomain +- Zendesk API Token or Zendesk OAuth Client +- Zendesk Email (For API Token authentication) +- Zendesk Subdomain ## Setup guide @@ -17,6 +17,7 @@ We recommend creating a restricted, read-only key specifically for Airbyte acces Another option is to use OAuth2.0 for authentication. See [Zendesk docs](https://support.zendesk.com/hc/en-us/articles/4408845965210-Using-OAuth-authentication-with-your-application) for details. + ### Step 2: Set up the Zendesk Talk connector in Airbyte **For Airbyte Cloud:** @@ -25,35 +26,36 @@ Another option is to use OAuth2.0 for authentication. See [Zendesk docs](https:/ 2. In the left navigation bar, click **Sources**. In the top-right corner, click **+new source**. 3. 
On the Set up the source page, enter the name for the Zendesk Talk connector and select **Zendesk Talk** from the Source type dropdown. 4. Fill in the rest of the fields: - - *Subdomain* - - *Authentication (API Token / OAuth2.0)* - - *Start Date* + - _Subdomain_ + - _Authentication (API Token / OAuth2.0)_ + - _Start Date_ 5. Click **Set up source** ## Supported sync modes The **Zendesk Talk** source connector supports the following [sync modes](https://docs.airbyte.com/cloud/core-concepts#connection-sync-modes): -* Full Refresh -* Incremental Sync + +- Full Refresh +- Incremental Sync ## Supported Streams This Source is capable of syncing the following core Streams: -* [Account Overview](https://developer.zendesk.com/rest_api/docs/voice-api/stats#show-account-overview) -* [Addresses](https://developer.zendesk.com/rest_api/docs/voice-api/phone_numbers#list-phone-numbers) -* [Agents Activity](https://developer.zendesk.com/rest_api/docs/voice-api/stats#list-agents-activity) -* [Agents Overview](https://developer.zendesk.com/rest_api/docs/voice-api/stats#show-agents-overview) -* [Calls](https://developer.zendesk.com/rest_api/docs/voice-api/incremental_exports#incremental-calls-export) \(Incremental sync\) -* [Call Legs](https://developer.zendesk.com/rest_api/docs/voice-api/incremental_exports#incremental-call-legs-export) \(Incremental sync\) -* [Current Queue Activity](https://developer.zendesk.com/rest_api/docs/voice-api/stats#show-current-queue-activity) -* [Greeting Categories](https://developer.zendesk.com/rest_api/docs/voice-api/greetings#list-greeting-categories) -* [Greetings](https://developer.zendesk.com/rest_api/docs/voice-api/greetings#list-greetings) -* [IVRs](https://developer.zendesk.com/rest_api/docs/voice-api/ivrs#list-ivrs) -* [IVR Menus](https://developer.zendesk.com/rest_api/docs/voice-api/ivrs#list-ivrs) -* [IVR Routes](https://developer.zendesk.com/rest_api/docs/voice-api/ivr_routes#list-ivr-routes) -* [Phone Numbers](https://developer.zendesk.com/rest_api/docs/voice-api/phone_numbers#list-phone-numbers) +- [Account Overview](https://developer.zendesk.com/rest_api/docs/voice-api/stats#show-account-overview) +- [Addresses](https://developer.zendesk.com/rest_api/docs/voice-api/phone_numbers#list-phone-numbers) +- [Agents Activity](https://developer.zendesk.com/rest_api/docs/voice-api/stats#list-agents-activity) +- [Agents Overview](https://developer.zendesk.com/rest_api/docs/voice-api/stats#show-agents-overview) +- [Calls](https://developer.zendesk.com/rest_api/docs/voice-api/incremental_exports#incremental-calls-export) \(Incremental sync\) +- [Call Legs](https://developer.zendesk.com/rest_api/docs/voice-api/incremental_exports#incremental-call-legs-export) \(Incremental sync\) +- [Current Queue Activity](https://developer.zendesk.com/rest_api/docs/voice-api/stats#show-current-queue-activity) +- [Greeting Categories](https://developer.zendesk.com/rest_api/docs/voice-api/greetings#list-greeting-categories) +- [Greetings](https://developer.zendesk.com/rest_api/docs/voice-api/greetings#list-greetings) +- [IVRs](https://developer.zendesk.com/rest_api/docs/voice-api/ivrs#list-ivrs) +- [IVR Menus](https://developer.zendesk.com/rest_api/docs/voice-api/ivrs#list-ivrs) +- [IVR Routes](https://developer.zendesk.com/rest_api/docs/voice-api/ivr_routes#list-ivr-routes) +- [Phone Numbers](https://developer.zendesk.com/rest_api/docs/voice-api/phone_numbers#list-phone-numbers) ## Performance considerations @@ -64,7 +66,7 @@ The Zendesk connector should not run into Zendesk API 
limitations under normal u ## Data type map | Integration Type | Airbyte Type | Notes | -|:-----------------|:-------------|:------| +| :--------------- | :----------- | :---- | | `string` | `string` | | | `number` | `number` | | | `array` | `array` | | @@ -73,7 +75,7 @@ The Zendesk connector should not run into Zendesk API limitations under normal u ## Changelog | Version | Date | Pull Request | Subject | -|:--------|:-----------|:---------------------------------------------------------|:----------------------------------------------------------------------------| +| :------ | :--------- | :------------------------------------------------------- | :-------------------------------------------------------------------------- | | 0.2.1 | 2024-05-02 | [36625](https://github.com/airbytehq/airbyte/pull/36625) | Schema descriptions and CDK 0.80.0 | | 0.2.0 | 2024-03-25 | [36459](https://github.com/airbytehq/airbyte/pull/36459) | Unpin CDK version, add record counts in state messages | | 0.1.13 | 2024-03-04 | [35783](https://github.com/airbytehq/airbyte/pull/35783) | Change order of authentication methods in spec | diff --git a/docs/integrations/sources/zenefits.md b/docs/integrations/sources/zenefits.md index 903ca69236d..b32f03751a1 100644 --- a/docs/integrations/sources/zenefits.md +++ b/docs/integrations/sources/zenefits.md @@ -51,11 +51,11 @@ You can replicate the following tables using the Zenefits connector: ## Changelog -| Version | Date | Pull Request | Subject | -|:--------|:-----------|:---------------------------------------------------------|:--------------------| -| 0.2.4 | 2024-04-19 | [37303](https://github.com/airbytehq/airbyte/pull/37303) | Updating to 0.80.0 CDK | -| 0.2.3 | 2024-04-18 | [37303](https://github.com/airbytehq/airbyte/pull/37303) | Manage dependencies with Poetry. | -| 0.2.2 | 2024-04-15 | [37303](https://github.com/airbytehq/airbyte/pull/37303) | Base image migration: remove Dockerfile and use the python-connector-base image | -| 0.2.1 | 2024-04-12 | [37303](https://github.com/airbytehq/airbyte/pull/37303) | schema descriptions | -| `0.2.0` | 2023-10-29 | [31946](https://github.com/airbytehq/airbyte/pull/31946) | Migrate to Low Code | -| `0.1.0` | 2022-08-24 | [14809](https://github.com/airbytehq/airbyte/pull/14809) | Initial Release | +| Version | Date | Pull Request | Subject | +| :------ | :--------- | :------------------------------------------------------- | :------------------------------------------------------------------------------ | +| 0.2.4 | 2024-04-19 | [37303](https://github.com/airbytehq/airbyte/pull/37303) | Updating to 0.80.0 CDK | +| 0.2.3 | 2024-04-18 | [37303](https://github.com/airbytehq/airbyte/pull/37303) | Manage dependencies with Poetry. | +| 0.2.2 | 2024-04-15 | [37303](https://github.com/airbytehq/airbyte/pull/37303) | Base image migration: remove Dockerfile and use the python-connector-base image | +| 0.2.1 | 2024-04-12 | [37303](https://github.com/airbytehq/airbyte/pull/37303) | schema descriptions | +| `0.2.0` | 2023-10-29 | [31946](https://github.com/airbytehq/airbyte/pull/31946) | Migrate to Low Code | +| `0.1.0` | 2022-08-24 | [14809](https://github.com/airbytehq/airbyte/pull/14809) | Initial Release | diff --git a/docs/integrations/sources/zenloop.md b/docs/integrations/sources/zenloop.md index ba8907872fd..f73a926fe60 100644 --- a/docs/integrations/sources/zenloop.md +++ b/docs/integrations/sources/zenloop.md @@ -3,7 +3,9 @@ This page contains the setup guide and reference information for the Zenloop source connector. 
## Prerequisites + + **For Airbyte Cloud:** 1. [Log into your Airbyte Cloud](https://cloud.airbyte.com/workspaces). @@ -11,13 +13,14 @@ This page contains the setup guide and reference information for the Zenloop sou 3. On the Set up the source page, select **Zenloop** from the Source type dropdown. 4. Enter the name for the Zenloop connector. 5. Enter your **API token** -6. For **Date from**, enter the date in YYYY-MM-DDTHH:mm:ssZ format. The data added on and after this date will be replicated. +6. For **Date from**, enter the date in YYYY-MM-DDTHH:mm:ssZ format. The data added on and after this date will be replicated. 7. Enter your **Survey ID**. Zenloop Survey ID. Can be found here. Leave empty to pull answers from all surveys. (Optional) 8. Enter your **Survey Group ID**. Zenloop Survey Group ID. Can be found by pulling All Survey Groups via SurveyGroups stream. Leave empty to pull answers from all survey groups. (Optional) 9. Click **Set up source**. + **For Airbyte Open Source:** 1. Navigate to the Airbyte Open Source dashboard. @@ -25,7 +28,7 @@ This page contains the setup guide and reference information for the Zenloop sou 3. On the Set up the source page, select **Zenloop** from the Source type dropdown. 4. Enter the name for the Zenloop connector. 5. Enter your **API token** -6. For **Date from**, enter the date in YYYY-MM-DDTHH:mm:ssZ format. The data added on and after this date will be replicated. +6. For **Date from**, enter the date in YYYY-MM-DDTHH:mm:ssZ format. The data added on and after this date will be replicated. 7. Enter your **Survey ID**. Zenloop Survey ID. Can be found here. Leave empty to pull answers from all surveys. (Optional) 8. Enter your **Survey Group ID**. Zenloop Survey Group ID. Can be found by pulling All Survey Groups via SurveyGroups stream. Leave empty to pull answers from all survey groups. (Optional) 9. Click **Set up source**. @@ -39,17 +42,17 @@ The Zenloop source connector supports the following [sync modes](https://docs.ai | :---------------- | :------------------- | | Full Refresh Sync | Yes | | Incremental Sync | Yes | -| Namespaces | No | +| Namespaces | No | ## Supported Streams This Source is capable of syncing the following core Streams: -* [Answers](https://docs.zenloop.com/reference#get-answers) \(Incremental\) -* [Surveys](https://docs.zenloop.com/reference#get-list-of-surveys) -* [AnswersSurveyGroup](https://docs.zenloop.com/reference#get-answers-for-survey-group) \(Incremental\) -* [SurveyGroups](https://docs.zenloop.com/reference#get-list-of-survey-groups) -* [Properties](https://docs.zenloop.com/reference#get-list-of-properties) +- [Answers](https://docs.zenloop.com/reference#get-answers) \(Incremental\) +- [Surveys](https://docs.zenloop.com/reference#get-list-of-surveys) +- [AnswersSurveyGroup](https://docs.zenloop.com/reference#get-answers-for-survey-group) \(Incremental\) +- [SurveyGroups](https://docs.zenloop.com/reference#get-list-of-survey-groups) +- [Properties](https://docs.zenloop.com/reference#get-list-of-properties) The `Answers`, `AnswersSurveyGroup` and `Properties` stream respectively have an optional survey_id parameter that can be set by filling the `public_hash_id` field of the connector configuration. If not provided answers for all surveys (groups) will be pulled. 
@@ -69,20 +72,20 @@ The Zenloop connector should not run into Zenloop API limitations under normal u ## Changelog -| Version | Date | Pull Request | Subject | -|:--------|:-----------| :------------------------------------------------------- |:--------------------------------------------------------------------| -| 0.1.14 | 2024-04-19 | [37304](https://github.com/airbytehq/airbyte/pull/37304) | Updating to 0.80.0 CDK | -| 0.1.13 | 2024-04-18 | [37304](https://github.com/airbytehq/airbyte/pull/37304) | Manage dependencies with Poetry. | -| 0.1.12 | 2024-04-15 | [37304](https://github.com/airbytehq/airbyte/pull/37304) | Base image migration: remove Dockerfile and use the python-connector-base image | -| 0.1.11 | 2024-04-12 | [37304](https://github.com/airbytehq/airbyte/pull/37304) | schema descriptions | -| 0.1.10 | 2023-06-29 | [27838](https://github.com/airbytehq/airbyte/pull/27838) | Update CDK version to avoid bug introduced during data feed release | -| 0.1.9 | 2023-06-28 | [27761](https://github.com/airbytehq/airbyte/pull/27761) | Update following state breaking changes | -| 0.1.8 | 2023-06-22 | [27243](https://github.com/airbytehq/airbyte/pull/27243) | Improving error message on state discrepancy | -| 0.1.7 | 2023-06-22 | [27243](https://github.com/airbytehq/airbyte/pull/27243) | State per partition (breaking change - require reset) | -| 0.1.6 | 2023-03-06 | [23231](https://github.com/airbytehq/airbyte/pull/23231) | Publish using low-code CDK Beta version | -| 0.1.5 | 2023-02-08 | [0](https://github.com/airbytehq/airbyte/pull/0) | Fix unhashable type in ZenloopSubstreamSlicer component | -| 0.1.4 | 2022-11-18 | [19624](https://github.com/airbytehq/airbyte/pull/19624) | Migrate to low code | -| 0.1.3 | 2022-09-28 | [17304](https://github.com/airbytehq/airbyte/pull/17304) | Migrate to per-stream states | -| 0.1.2 | 2022-08-22 | [15843](https://github.com/airbytehq/airbyte/pull/15843) | Adds Properties stream | -| 0.1.1 | 2021-10-26 | [8299](https://github.com/airbytehq/airbyte/pull/8299) | Fix missing seed files | -| 0.1.0 | 2021-10-26 | [7380](https://github.com/airbytehq/airbyte/pull/7380) | Initial Release | +| Version | Date | Pull Request | Subject | +| :------ | :--------- | :------------------------------------------------------- | :------------------------------------------------------------------------------ | +| 0.1.14 | 2024-04-19 | [37304](https://github.com/airbytehq/airbyte/pull/37304) | Updating to 0.80.0 CDK | +| 0.1.13 | 2024-04-18 | [37304](https://github.com/airbytehq/airbyte/pull/37304) | Manage dependencies with Poetry. 
| +| 0.1.12 | 2024-04-15 | [37304](https://github.com/airbytehq/airbyte/pull/37304) | Base image migration: remove Dockerfile and use the python-connector-base image | +| 0.1.11 | 2024-04-12 | [37304](https://github.com/airbytehq/airbyte/pull/37304) | schema descriptions | +| 0.1.10 | 2023-06-29 | [27838](https://github.com/airbytehq/airbyte/pull/27838) | Update CDK version to avoid bug introduced during data feed release | +| 0.1.9 | 2023-06-28 | [27761](https://github.com/airbytehq/airbyte/pull/27761) | Update following state breaking changes | +| 0.1.8 | 2023-06-22 | [27243](https://github.com/airbytehq/airbyte/pull/27243) | Improving error message on state discrepancy | +| 0.1.7 | 2023-06-22 | [27243](https://github.com/airbytehq/airbyte/pull/27243) | State per partition (breaking change - require reset) | +| 0.1.6 | 2023-03-06 | [23231](https://github.com/airbytehq/airbyte/pull/23231) | Publish using low-code CDK Beta version | +| 0.1.5 | 2023-02-08 | [0](https://github.com/airbytehq/airbyte/pull/0) | Fix unhashable type in ZenloopSubstreamSlicer component | +| 0.1.4 | 2022-11-18 | [19624](https://github.com/airbytehq/airbyte/pull/19624) | Migrate to low code | +| 0.1.3 | 2022-09-28 | [17304](https://github.com/airbytehq/airbyte/pull/17304) | Migrate to per-stream states | +| 0.1.2 | 2022-08-22 | [15843](https://github.com/airbytehq/airbyte/pull/15843) | Adds Properties stream | +| 0.1.1 | 2021-10-26 | [8299](https://github.com/airbytehq/airbyte/pull/8299) | Fix missing seed files | +| 0.1.0 | 2021-10-26 | [7380](https://github.com/airbytehq/airbyte/pull/7380) | Initial Release | diff --git a/docs/integrations/sources/zoho-crm.md b/docs/integrations/sources/zoho-crm.md index 11cfdb994e4..3690b5ecf96 100644 --- a/docs/integrations/sources/zoho-crm.md +++ b/docs/integrations/sources/zoho-crm.md @@ -10,9 +10,9 @@ Airbyte uses [REST API](https://www.zoho.com/crm/developer/docs/api/v2/modules-a This Source is capable of syncing: -* standard modules available in Zoho CRM account -* custom modules manually added by user, available in Zoho CRM account -* custom fields in both standard and custom modules, available in Zoho CRM account +- standard modules available in Zoho CRM account +- custom modules manually added by user, available in Zoho CRM account +- custom fields in both standard and custom modules, available in Zoho CRM account The discovering of Zoho CRM module schema is made dynamically based on Metadata API and should generally take no longer than 10 to 30 seconds. @@ -21,12 +21,12 @@ The discovering of Zoho CRM module schema is made dynamically based on Metadata Some of Zoho CRM Modules may not be available for sync due to limitations of Zoho CRM Edition or permissions scope. For details refer to the [Scopes](https://www.zoho.com/crm/developer/docs/api/v2/scopes.html) section in the Zoho CRM documentation. Connector streams and schemas are built dynamically on top of Metadata that is available from the REST API - please see [Modules API](https://www.zoho.com/crm/developer/docs/api/v2/modules-api.html), [Modules Metadata API](https://www.zoho.com/crm/developer/docs/api/v2/module-meta.html), [Fields Metadata API](https://www.zoho.com/crm/developer/docs/api/v2/field-meta.html). -The list of available streams is the list of Modules as long as Module Metadata is available for each of them from the Zoho CRM API, and Fields Metadata is available for each of the fields. 
If a module you want to sync is not available from this connector, it's because the Zoho CRM API does not make it available. +The list of available streams is the list of Modules as long as Module Metadata is available for each of them from the Zoho CRM API, and Fields Metadata is available for each of the fields. If a module you want to sync is not available from this connector, it's because the Zoho CRM API does not make it available. ### Data type mapping | Integration Type | Airbyte Type | Notes | -|:----------------------|:-------------|:--------------------------| +| :-------------------- | :----------- | :------------------------ | | `boolean` | `boolean` | | | `double` | `number` | | | `currency` | `number` | | @@ -56,7 +56,7 @@ Any other data type not listed in the table above will be treated as `string`. ### Features | Feature | Supported? \(Yes/No\) | -|:------------------------------------------|:----------------------| +| :---------------------------------------- | :-------------------- | | Full Refresh Overwrite Sync | Yes | | Full Refresh Append Sync | Yes | | Incremental - Append Sync | Yes | @@ -68,7 +68,7 @@ Any other data type not listed in the table above will be treated as `string`. ### Production | Environment | Base URL | -|:------------|:------------------------| +| :---------- | :---------------------- | | US | https://zohoapis.com | | AU | https://zohoapis.com.au | | EU | https://zohoapis.eu | @@ -79,7 +79,7 @@ Any other data type not listed in the table above will be treated as `string`. ### Sandbox | Environment | Endpoint | -|:------------|:--------------------------------| +| :---------- | :------------------------------ | | US | https://sandbox.zohoapis.com | | AU | https://sandbox.zohoapis.com.au | | EU | https://sandbox.zohoapis.eu | @@ -89,14 +89,14 @@ Any other data type not listed in the table above will be treated as `string`. ### Developer -| Environment | Endpoint | -|:------------|:-----------------------------------| -| US | https://developer.zohoapis.com | -| AU | https://developer.zohoapis.com.au | -| EU | https://developer.zohoapis.eu | -| IN | https://developer.zohoapis.in | -| CN | https://developer.zohoapis.com.cn | -| JP | https://developer.zohoapis.jp | +| Environment | Endpoint | +| :---------- | :-------------------------------- | +| US | https://developer.zohoapis.com | +| AU | https://developer.zohoapis.com.au | +| EU | https://developer.zohoapis.eu | +| IN | https://developer.zohoapis.in | +| CN | https://developer.zohoapis.com.cn | +| JP | https://developer.zohoapis.jp | For more information about available environments, please visit [this page](https://www.zoho.com/crm/developer/sandbox.html?src=dev-hub) @@ -124,12 +124,12 @@ To set up a connection with a Zoho CRM source, you will need to choose start syn ### Create Refresh Token For generating the refresh token, please refer to [this page](https://www.zoho.com/crm/developer/docs/api/v2/access-refresh.html). -Make sure to complete the auth flow quickly, as the initial token granted by Zoho CRM is only live for a few minutes before it can no longer be used to generate a refresh token. +Make sure to complete the auth flow quickly, as the initial token granted by Zoho CRM is only live for a few minutes before it can no longer be used to generate a refresh token. 
## Changelog | Version | Date | Pull Request | Subject | -|:--------|:-----------|:---------------------------------------------------------|:-----------------------------------------------------------------------------------| +| :------ | :--------- | :------------------------------------------------------- | :--------------------------------------------------------------------------------- | | 0.1.2 | 2023-03-09 | [23906](https://github.com/airbytehq/airbyte/pull/23906) | added support for the latest CDK, fixed SAT | | 0.1.1 | 2023-03-13 | [23818](https://github.com/airbytehq/airbyte/pull/23818) | Set airbyte type to string for zoho autonumbers when they include prefix or suffix | | 0.1.0 | 2022-03-30 | [11193](https://github.com/airbytehq/airbyte/pull/11193) | Initial release | diff --git a/docs/integrations/sources/zoom-migrations.md b/docs/integrations/sources/zoom-migrations.md index e334aefa166..0f26ad5c202 100644 --- a/docs/integrations/sources/zoom-migrations.md +++ b/docs/integrations/sources/zoom-migrations.md @@ -32,23 +32,23 @@ The type of the 'meeting_id' field in Meeting Registration Questions stream has #### Refresh affected schemas and reset data 1. Select **Connections** in the main nav bar. - 1. Select the connection affected by the update. + 1. Select the connection affected by the update. 2. Select the **Replication** tab. - 1. Select **Refresh source schema**. - 2. Select **OK**. + 1. Select **Refresh source schema**. + 2. Select **OK**. :::note Any detected schema changes will be listed for your review. ::: 3. Select **Save changes** at the bottom of the page. - 1. Ensure the **Reset affected streams** option is checked. + 1. Ensure the **Reset affected streams** option is checked. :::note Depending on destination type you may not be prompted to reset your data. ::: -4. Select **Save connection**. +4. Select **Save connection**. :::note This will reset the data in your destination and initiate a fresh sync. diff --git a/docs/integrations/sources/zoom.md b/docs/integrations/sources/zoom.md index dcff151dedd..53a10b12a0f 100644 --- a/docs/integrations/sources/zoom.md +++ b/docs/integrations/sources/zoom.md @@ -2,7 +2,6 @@ ## Overview - The following connector allows airbyte users to fetch various meetings & webinar data points from the [Zoom](https://zoom.us) source. This connector is built entirely using the [low-code CDK](https://docs.airbyte.com/connector-development/config-based/low-code-cdk-overview/). Please note that currently, it only supports Full Refresh syncs. That is, every time a sync is run, Airbyte will copy all rows in the tables and columns you set up for replication into the destination in a new table. @@ -11,37 +10,37 @@ Please note that currently, it only supports Full Refresh syncs. 
That is, every Currently this source supports the following output streams/endpoints from Zoom: -* [Users](https://marketplace.zoom.us/docs/api-reference/zoom-api/users/users) -* [Meetings](https://marketplace.zoom.us/docs/api-reference/zoom-api/meetings/meetings) - * [Meeting Registrants](https://marketplace.zoom.us/docs/api-reference/zoom-api/meetings/meetingregistrants) - * [Meeting Polls](https://marketplace.zoom.us/docs/api-reference/zoom-api/meetings/meetingpolls) - * [Meeting Poll Results](https://marketplace.zoom.us/docs/api-reference/zoom-api/meetings/listpastmeetingpolls) - * [Meeting Questions](https://marketplace.zoom.us/docs/api-reference/zoom-api/meetings/meetingregistrantsquestionsget) -* [Webinars](https://marketplace.zoom.us/docs/api-reference/zoom-api/webinars/webinars) - * [Webinar Panelists](https://marketplace.zoom.us/docs/api-reference/zoom-api/webinars/webinarpanelists) - * [Webinar Registrants](https://marketplace.zoom.us/docs/api-reference/zoom-api/webinars/webinarregistrants) - * [Webinar Absentees](https://marketplace.zoom.us/docs/api-reference/zoom-api/webinars/webinarabsentees) - * [Webinar Polls](https://marketplace.zoom.us/docs/api-reference/zoom-api/webinars/webinarpolls) - * [Webinar Poll Results](https://marketplace.zoom.us/docs/api-reference/zoom-api/webinars/listpastwebinarpollresults) - * [Webinar Questions](https://marketplace.zoom.us/docs/api-reference/zoom-api/webinars/webinarregistrantsquestionsget) - * [Webinar Tracking Sources](https://marketplace.zoom.us/docs/api-reference/zoom-api/webinars/gettrackingsources) - * [Webinar Q&A Results](https://marketplace.zoom.us/docs/api-reference/zoom-api/webinars/listpastwebinarqa) -* [Report Meetings](https://marketplace.zoom.us/docs/api-reference/zoom-api/reports/reportmeetingdetails) -* [Report Meeting Participants](https://marketplace.zoom.us/docs/api-reference/zoom-api/reports/reportmeetingparticipants) -* [Report Webinars](https://marketplace.zoom.us/docs/api-reference/zoom-api/reports/reportwebinardetails) -* [Report Webinar Participants](https://marketplace.zoom.us/docs/api-reference/zoom-api/reports/reportwebinarparticipants) +- [Users](https://marketplace.zoom.us/docs/api-reference/zoom-api/users/users) +- [Meetings](https://marketplace.zoom.us/docs/api-reference/zoom-api/meetings/meetings) + - [Meeting Registrants](https://marketplace.zoom.us/docs/api-reference/zoom-api/meetings/meetingregistrants) + - [Meeting Polls](https://marketplace.zoom.us/docs/api-reference/zoom-api/meetings/meetingpolls) + - [Meeting Poll Results](https://marketplace.zoom.us/docs/api-reference/zoom-api/meetings/listpastmeetingpolls) + - [Meeting Questions](https://marketplace.zoom.us/docs/api-reference/zoom-api/meetings/meetingregistrantsquestionsget) +- [Webinars](https://marketplace.zoom.us/docs/api-reference/zoom-api/webinars/webinars) + - [Webinar Panelists](https://marketplace.zoom.us/docs/api-reference/zoom-api/webinars/webinarpanelists) + - [Webinar Registrants](https://marketplace.zoom.us/docs/api-reference/zoom-api/webinars/webinarregistrants) + - [Webinar Absentees](https://marketplace.zoom.us/docs/api-reference/zoom-api/webinars/webinarabsentees) + - [Webinar Polls](https://marketplace.zoom.us/docs/api-reference/zoom-api/webinars/webinarpolls) + - [Webinar Poll Results](https://marketplace.zoom.us/docs/api-reference/zoom-api/webinars/listpastwebinarpollresults) + - [Webinar Questions](https://marketplace.zoom.us/docs/api-reference/zoom-api/webinars/webinarregistrantsquestionsget) + - [Webinar Tracking 
Sources](https://marketplace.zoom.us/docs/api-reference/zoom-api/webinars/gettrackingsources) + - [Webinar Q&A Results](https://marketplace.zoom.us/docs/api-reference/zoom-api/webinars/listpastwebinarqa) +- [Report Meetings](https://marketplace.zoom.us/docs/api-reference/zoom-api/reports/reportmeetingdetails) +- [Report Meeting Participants](https://marketplace.zoom.us/docs/api-reference/zoom-api/reports/reportmeetingparticipants) +- [Report Webinars](https://marketplace.zoom.us/docs/api-reference/zoom-api/reports/reportwebinardetails) +- [Report Webinar Participants](https://marketplace.zoom.us/docs/api-reference/zoom-api/reports/reportwebinarparticipants) If there are more endpoints you'd like Airbyte to support, please [create an issue.](https://github.com/airbytehq/airbyte/issues/new/choose) ### Features -| Feature | Supported? | -| :--- | :--- | -| Full Refresh Sync | Yes | -| Incremental Sync | Coming soon | +| Feature | Supported? | +| :---------------------------- | :---------- | +| Full Refresh Sync | Yes | +| Incremental Sync | Coming soon | | Replicate Incremental Deletes | Coming soon | -| SSL connection | Yes | -| Namespaces | No | +| SSL connection | Yes | +| Namespaces | No | ### Performance considerations @@ -53,9 +52,10 @@ Please [create an issue](https://github.com/airbytehq/airbyte/issues) if you see ### Requirements -* Zoom Server-to-Server Oauth App +- Zoom Server-to-Server Oauth App ### Setup guide + Please read [How to generate your Server-to-Server OAuth app ](https://developers.zoom.us/docs/internal-apps/s2s-oauth/). :::info @@ -66,9 +66,9 @@ JWT Tokens are deprecated, only Server-to-Server works now. [link to Zoom](https ## Changelog -| Version | Date | Pull Request | Subject | -|:--------|:-----------|:---------------------------------------------------------| :-----------------------------------------------------| -| 1.1.0 | 2024-02-22 | [35369](https://github.com/airbytehq/airbyte/pull/35369) | Publish S2S Oauth connector with fixed authenticator | -| 1.0.0 | 2023-7-28 | [25308](https://github.com/airbytehq/airbyte/pull/25308) | Replace JWT Auth methods with server-to-server Oauth | -| 0.1.1 | 2022-11-30 | [19939](https://github.com/airbytehq/airbyte/pull/19939) | Upgrade CDK version to fix bugs with SubStreamSlicer | -| 0.1.0 | 2022-10-25 | [18179](https://github.com/airbytehq/airbyte/pull/18179) | Initial Release | +| Version | Date | Pull Request | Subject | +| :------ | :--------- | :------------------------------------------------------- | :--------------------------------------------------- | +| 1.1.0 | 2024-02-22 | [35369](https://github.com/airbytehq/airbyte/pull/35369) | Publish S2S Oauth connector with fixed authenticator | +| 1.0.0 | 2023-7-28 | [25308](https://github.com/airbytehq/airbyte/pull/25308) | Replace JWT Auth methods with server-to-server Oauth | +| 0.1.1 | 2022-11-30 | [19939](https://github.com/airbytehq/airbyte/pull/19939) | Upgrade CDK version to fix bugs with SubStreamSlicer | +| 0.1.0 | 2022-10-25 | [18179](https://github.com/airbytehq/airbyte/pull/18179) | Initial Release | diff --git a/docs/integrations/sources/zuora.md b/docs/integrations/sources/zuora.md index b0c5f019d96..5e4f4d9b06e 100644 --- a/docs/integrations/sources/zuora.md +++ b/docs/integrations/sources/zuora.md @@ -24,9 +24,9 @@ Airbyte uses [REST API](https://www.zuora.com/developer/api-reference/#section/I This Source is capable of syncing: -* standard objects available in Zuora account -* custom objects manually added by user, available in Zuora Account -* custom 
fields in both standard and custom objects, available in Zuora Account +- standard objects available in Zuora account +- custom objects manually added by user, available in Zuora Account +- custom fields in both standard and custom objects, available in Zuora Account The discovering of Zuora Account objects schema may take a while, if you add the connection for the first time, and/or you need to refresh your list of available streams. Please take your time to wait and don't cancel this operation, usually it takes up to 5-10 min, depending on number of objects available in Zuora Account. @@ -36,83 +36,83 @@ Some of the Zuora Objects may not be available for sync due to limitations of Zu ### Data type mapping -| Integration Type | Airbyte Type | Notes | -| :--- | :--- | :--- | -| `decimal(22,9)` | `number` | float number | -| `decimal` | `number` | float number | -| `float` | `number` | float number | -| `double` | `number` | float number | -| `integer` | `number` | | -| `int` | `number` | | -| `bigint` | `number` | | -| `smallint` | `number` | | -| `timestamp` | `number` | number representation of the unix timestamp | -| `date` | `string` | | -| `datetime` | `string` | | -| `timestamp with time zone` | `string` | | -| `picklist` | `string` | | -| `text` | `string` | | -| `varchar` | `string` | | -| `zoql` | `object` | | -| `binary` | `object` | | -| `json` | `object` | | -| `xml` | `object` | | -| `blob` | `object` | | -| `list` | `array` | | -| `array` | `array` | | -| `boolean` | `boolean` | | -| `bool` | `boolean` | | +| Integration Type | Airbyte Type | Notes | +| :------------------------- | :----------- | :------------------------------------------ | +| `decimal(22,9)` | `number` | float number | +| `decimal` | `number` | float number | +| `float` | `number` | float number | +| `double` | `number` | float number | +| `integer` | `number` | | +| `int` | `number` | | +| `bigint` | `number` | | +| `smallint` | `number` | | +| `timestamp` | `number` | number representation of the unix timestamp | +| `date` | `string` | | +| `datetime` | `string` | | +| `timestamp with time zone` | `string` | | +| `picklist` | `string` | | +| `text` | `string` | | +| `varchar` | `string` | | +| `zoql` | `object` | | +| `binary` | `object` | | +| `json` | `object` | | +| `xml` | `object` | | +| `blob` | `object` | | +| `list` | `array` | | +| `array` | `array` | | +| `boolean` | `boolean` | | +| `bool` | `boolean` | | Any other data type not listed in the table above will be treated as `string`. 
### Features -| Feature | Supported?\(Yes/No\) | Notes | -| :--- | :--- | :--- | -| Full Refresh Overwrite Sync | Yes | | -| Full Refresh Append Sync | Yes | | -| Incremental - Append Sync | Yes | | -| Incremental - Append + Deduplication Sync | Yes | | -| Namespaces | No | | +| Feature | Supported?\(Yes/No\) | Notes | +| :---------------------------------------- | :------------------- | :---- | +| Full Refresh Overwrite Sync | Yes | | +| Full Refresh Append Sync | Yes | | +| Incremental - Append Sync | Yes | | +| Incremental - Append + Deduplication Sync | Yes | | +| Namespaces | No | | ## Supported Environments for Zuora -| Environment | Supported?\(Yes/No\) | Notes | -| :--- | :--- | :--- | -| Production | Yes | Select from exising options while setup | -| Sandbox | Yes | Select from exising options while setup | +| Environment | Supported?\(Yes/No\) | Notes | +| :---------- | :------------------- | :-------------------------------------- | +| Production | Yes | Select from exising options while setup | +| Sandbox | Yes | Select from exising options while setup | ## Supported Data Query options -| Option | Supported?\(Yes/No\) | Notes | -| :--- | :--- | :--- | -| LIVE | Yes | Run data queries against Zuora live transactional databases | -| UNLIMITED | Yes | Run data queries against an optimized, replicated database at 12 hours freshness for high volume extraction use cases (Early Adoption, additionall access required, contact [Zuora Support](http://support.zuora.com/hc/en-us) in order to request this feature enabled for your account beforehand.) | +| Option | Supported?\(Yes/No\) | Notes | +| :-------- | :------------------- | :---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------- | +| LIVE | Yes | Run data queries against Zuora live transactional databases | +| UNLIMITED | Yes | Run data queries against an optimized, replicated database at 12 hours freshness for high volume extraction use cases (Early Adoption, additionall access required, contact [Zuora Support](http://support.zuora.com/hc/en-us) in order to request this feature enabled for your account beforehand.) 
| ## List of Supported Environments for Zuora ### Production -| Environment | Endpoint | -| :--- | :--- | -| US Production | rest.zuora.com | +| Environment | Endpoint | +| :------------------ | :---------------- | +| US Production | rest.zuora.com | | US Cloud Production | rest.na.zuora.com | -| EU Production | rest.eu.zuora.com | +| EU Production | rest.eu.zuora.com | ### Sandbox -| Environment | Endpoint | -| :--- | :--- | -| US API Sandbox | rest.apisandbox.zuora.com | +| Environment | Endpoint | +| :------------------- | :------------------------ | +| US API Sandbox | rest.apisandbox.zuora.com | | US Cloud API Sandbox | rest.sandbox.na.zuora.com | -| US Central Sandbox | rest.test.zuora.com | -| EU API Sandbox | rest.sandbox.eu.zuora.com | -| EU Central Sandbox | rest.test.eu.zuora.com | +| US Central Sandbox | rest.test.zuora.com | +| EU API Sandbox | rest.sandbox.eu.zuora.com | +| EU Central Sandbox | rest.test.eu.zuora.com | ### Other -| Environment | Endpoint | -| :--- | :--- | +| Environment | Endpoint | +| :------------------ | :----------------- | | US Performance Test | rest.pt1.zuora.com | For more information about available environments, please visit [this page](https://knowledgecenter.zuora.com/BB_Introducing_Z_Business/D_Zuora_Environments) @@ -121,8 +121,8 @@ For more information about available environments, please visit [this page](http If you experience the long time for sync operation, please consider: -* to increase the `window_in_days` parameter inside Zuora source configuration -* use the smaller date range by tuning `start_date` parameter. +- to increase the `window_in_days` parameter inside Zuora source configuration +- use the smaller date range by tuning `start_date` parameter. ### Note @@ -159,10 +159,9 @@ Usually, the very first sync operation for all of the objects inside Zuora accou ## Changelog -| Version | Date | Pull Request | Subject | -| :--- | :--- | :--- | :--- | -| 0.1.3 | 2021-10-16 | [7053](https://github.com/airbytehq/airbyte/pull/7093) | Added support of `Unlimited` option for `Data Query` | -| 0.1.2 | 2021-10-11 | [6960](https://github.com/airbytehq/airbyte/pull/6960) | Change minimum value for `Window_in_days` to 1, instead of 30 | -| 0.1.1 | 2021-10-01 | [6575](https://github.com/airbytehq/airbyte/pull/6575) | Added OAuth support for Airbyte Cloud | -| 0.1.0 | 2021-08-01 | [4661](https://github.com/airbytehq/airbyte/pull/4661) | Initial release of Native Zuora connector for Airbyte | - +| Version | Date | Pull Request | Subject | +| :------ | :--------- | :----------------------------------------------------- | :------------------------------------------------------------ | +| 0.1.3 | 2021-10-16 | [7053](https://github.com/airbytehq/airbyte/pull/7093) | Added support of `Unlimited` option for `Data Query` | +| 0.1.2 | 2021-10-11 | [6960](https://github.com/airbytehq/airbyte/pull/6960) | Change minimum value for `Window_in_days` to 1, instead of 30 | +| 0.1.1 | 2021-10-01 | [6575](https://github.com/airbytehq/airbyte/pull/6575) | Added OAuth support for Airbyte Cloud | +| 0.1.0 | 2021-08-01 | [4661](https://github.com/airbytehq/airbyte/pull/4661) | Initial release of Native Zuora connector for Airbyte | diff --git a/docs/operating-airbyte/security.md b/docs/operating-airbyte/security.md index ae224b3ad75..1d761fd271c 100644 --- a/docs/operating-airbyte/security.md +++ b/docs/operating-airbyte/security.md @@ -50,9 +50,9 @@ You can secure access to Airbyte using the following methods: listen 443 ssl; server_name airbyte..com; 
client_max_body_size 200M; # required for Airbyte API - ssl_certificate .crt.pem; + ssl_certificate .crt.pem; ssl_certificate_key .key.pem; - + location / { proxy_pass http://127.0.0.1:8000; proxy_set_header Cookie $http_cookie; # if you use Airbytes basic auth @@ -60,7 +60,7 @@ You can secure access to Airbyte using the following methods: } } ``` -- *Only for docker compose deployments:* Change the default username and password in your environment's `.env` file: +- _Only for docker compose deployments:_ Change the default username and password in your environment's `.env` file: ``` # Proxy Configuration # Set to empty values, e.g. "" to disable basic auth @@ -105,22 +105,23 @@ Depending on your [data residency](https://docs.airbyte.com/cloud/managing-airby #### United States and Airbyte Default GCP region: us-west3 -* 34.106.109.131 -* 34.106.196.165 -* 34.106.60.246 -* 34.106.229.69 -* 34.106.127.139 -* 34.106.218.58 -* 34.106.115.240 -* 34.106.225.141 + +- 34.106.109.131 +- 34.106.196.165 +- 34.106.60.246 +- 34.106.229.69 +- 34.106.127.139 +- 34.106.218.58 +- 34.106.115.240 +- 34.106.225.141 #### European Union AWS region: eu-west-3 -* 13.37.4.46 -* 13.37.142.60 -* 35.181.124.238 +- 13.37.4.46 +- 13.37.142.60 +- 35.181.124.238 ### Credential management @@ -145,7 +146,7 @@ Airbyte Cloud supports [user management](/using-airbyte/workspaces.md#add-users- Our compliance efforts for Airbyte Cloud include: - SOC 2 Type II assessment: An independent third-party completed a SOC2 Type II assessment and found effective operational controls in place. Independent third-party audits will continue at a regular cadence, and the most recent report is available upon request. -- ISO 27001 certification: We received our ISO 27001 certification in November 2022. A copy of the certificate is available upon request. +- ISO 27001 certification: We received our ISO 27001 certification in November 2022. A copy of the certificate is available upon request. - Assessments and penetration tests: We use tools provided by the Cloud platforms as well as third-party assessments and penetration tests. ## Reporting Vulnerabilities​ diff --git a/docs/operator-guides/browsing-output-logs.md b/docs/operator-guides/browsing-output-logs.md index d4afd258c22..a57009974e2 100644 --- a/docs/operator-guides/browsing-output-logs.md +++ b/docs/operator-guides/browsing-output-logs.md @@ -4,35 +4,38 @@ products: all # Browsing logs -Airbyte records the full logs as a part of each sync. These logs can be used to understand the underlying operations Airbyte performs to read data from the source and write to the destination as a part of the [Airbyte Protocol](/understanding-airbyte/airbyte-protocol.md). The logs includes many details, including any errors that can be helpful when troubleshooting sync errors. +Airbyte records the full logs as a part of each sync. These logs can be used to understand the underlying operations Airbyte performs to read data from the source and write to the destination as a part of the [Airbyte Protocol](/understanding-airbyte/airbyte-protocol.md). The logs includes many details, including any errors that can be helpful when troubleshooting sync errors. :::info When using Airbyte Open Source, you can also access additional logs outside of the UI. This is useful if you need to browse the Docker volumes where extra output files of Airbyte server and workers are stored. ::: -To find the logs for a connection, navigate to a connection's `Job History` tab to see the latest syncs. 
+To find the logs for a connection, navigate to a connection's `Job History` tab to see the latest syncs. ## View the logs in the UI + To open the logs in the UI, select the three grey dots next to a sync and select `View logs`. This will open our full screen in-app log viewer. :::tip -If you are troubleshooting a sync error, you can search for `Error`, `Exception`, or `Fail` to find common errors. +If you are troubleshooting a sync error, you can search for `Error`, `Exception`, or `Fail` to find common errors. ::: The in-app log viewer will only search for instances of the search term within that attempt. To search across all attempts, download the logs locally. ## Link to a sync job + To help others quickly find your job, copy the link to the logs to your clipboard, select the three grey dots next to a sync and select `Copy link to job`. You can also access the link to a sync job from the in-app log viewer. ## Download the logs + To download a copy of the logs locally, select the three grey dots next to a sync and select `Download logs`. -You can also access the download log button from the in-app log viewer. +You can also access the download log button from the in-app log viewer. :::note -If a sync was completed across multiple attempts, downloading the logs will union all the logs for all attempts for that job. +If a sync was completed across multiple attempts, downloading the logs will union all the logs for all attempts for that job. ::: ## Exploring Local Logs @@ -57,7 +60,7 @@ Following [Docker Volume documentation](https://docs.docker.com/storage/volumes/ ### Opening a Unix shell prompt to browse the Docker volume -For example, we can run any docker container/image to browse the content of this named volume by mounting it similarly. In the example below, the [busybox](https://hub.docker.com/_/busybox) image is used. +For example, we can run any docker container/image to browse the content of this named volume by mounting it similarly. In the example below, the [busybox](https://hub.docker.com/_/busybox) image is used. ```text docker run -it --rm --volume airbyte_workspace:/data busybox @@ -122,6 +125,7 @@ cat catalog.json If you are running on Kubernetes, use the following commands instead to browsing and copy the files to your local. To browse, identify the pod you are interested in and exec into it. You will be presented with a terminal that will accept normal linux commands e.g ls. + ```bash kubectl exec -it -n -c main bash e.g. @@ -131,6 +135,7 @@ FINISHED_UPLOADING destination_catalog.json destination_config.json ``` To copy the file on to your local in order to preserve it's contents: + ```bash kubectl cp /:/config/destination_catalog.json ./catalog.json e.g. @@ -138,7 +143,6 @@ kubectl cp jobs/normalization-worker-3605-0-sxtox:/config/destination_catalog.js cat ./catalog.json ``` - ## CSV or JSON local Destinations: Check local data folder If you setup a pipeline using one of the local File based destinations \(CSV or JSON\), Airbyte is writing the resulting files containing the data in the special `/local/` directory in the container. By default, this volume is mounted from `/tmp/airbyte_local` on the host machine. So you need to navigate to this [local folder](file:///tmp/airbyte_local/) on the filesystem of the machine running the Airbyte deployment to retrieve the local data files. 
@@ -184,8 +188,8 @@ Note that Docker for Mac is not a real Docker host, now it actually runs a virtu Here are some related links as references on accessing Docker Volumes: -* on macOS [Using Docker containers in 2019](https://stackoverflow.com/a/55648186) -* official doc [Use Volume](https://docs.docker.com/storage/volumes/#backup-restore-or-migrate-data-volumes) +- on macOS [Using Docker containers in 2019](https://stackoverflow.com/a/55648186) +- official doc [Use Volume](https://docs.docker.com/storage/volumes/#backup-restore-or-migrate-data-volumes) From these discussions, we've been using on macOS either: @@ -199,4 +203,3 @@ docker volume inspect ``` Then look at the `Mountpoint` value, this is where the volume is actually stored in the host filesystem and you can directly retrieve files directly from that folder. - diff --git a/docs/operator-guides/collecting-metrics.md b/docs/operator-guides/collecting-metrics.md index a1203fc5191..ccf8a7bf762 100644 --- a/docs/operator-guides/collecting-metrics.md +++ b/docs/operator-guides/collecting-metrics.md @@ -4,19 +4,18 @@ products: oss-* # Monitoring Airbyte - Airbyte offers you various ways to monitor your ELT pipelines. These options range from using open-source tools to integrating with enterprise-grade SaaS platforms. Here's a quick overview: -* Connection Logging: All Airbyte instances provide extensive logs for each connector, giving detailed reports on the data synchronization process. This is available across all Airbyte offerings. -* [Airbyte Datadog Integration](#airbyte-datadog-integration): Airbyte customers can leverage our integration with Datadog. This lets you monitor and analyze your data pipelines right within your Datadog dashboards at no additional cost. -* [Airbyte OpenTelemetry (OTEL) Integration](#airbyte-opentelemetry-integration): This allows you to push metrics to your self-hosted monitoring solution using OpenTelemetry. + +- Connection Logging: All Airbyte instances provide extensive logs for each connector, giving detailed reports on the data synchronization process. This is available across all Airbyte offerings. +- [Airbyte Datadog Integration](#airbyte-datadog-integration): Airbyte customers can leverage our integration with Datadog. This lets you monitor and analyze your data pipelines right within your Datadog dashboards at no additional cost. +- [Airbyte OpenTelemetry (OTEL) Integration](#airbyte-opentelemetry-integration): This allows you to push metrics to your self-hosted monitoring solution using OpenTelemetry. Please browse the sections below for more details on each option and how to set it up. ## Airbyte Datadog Integration - :::info Monitoring your Airbyte instance using Datadog is an early preview feature and still in development. Expect changes to this feature and the configuration to happen in the future. This feature will be @@ -32,7 +31,6 @@ This integration brings forth new `airbyte.*` metrics along with new dashboards. Setting up this integration for Airbyte instances deployed with Docker involves five straightforward steps: - 1. **Set Datadog Airbyte Config**: Create or configure the `datadog.yaml` file with the contents below: ```yaml @@ -95,7 +93,7 @@ dogstatsd_mapper_profiles: name: "airbyte.cron.jobs_run" ``` -2. **Add Datadog Agent and Mount Config:** If the Datadog Agent is not yet deployed to your instances running Airbyte, you can modify the provided `docker-compose.yaml` file in the Airbyte repository to include the Datadog Agent. 
For the Datadog agent to submit metrics, you will need to add an [API key](https://docs.datadoghq.com/account_management/api-app-keys/#add-an-api-key-or-client-token). Then, be sure to properly mount your `datadog.yaml` file as a Docker volume: +2. **Add Datadog Agent and Mount Config:** If the Datadog Agent is not yet deployed to your instances running Airbyte, you can modify the provided `docker-compose.yaml` file in the Airbyte repository to include the Datadog Agent. For the Datadog agent to submit metrics, you will need to add an [API key](https://docs.datadoghq.com/account_management/api-app-keys/#add-an-api-key-or-client-token). Then, be sure to properly mount your `datadog.yaml` file as a Docker volume: ```yaml dd-agent: @@ -119,19 +117,19 @@ dogstatsd_mapper_profiles: 3. **Update Docker Compose Configuration**: Modify your `docker-compose.yaml` file in the Airbyte repository to include the `metrics-reporter` container. This submits Airbyte metrics to the Datadog Agent: ```yaml - metric-reporter: - image: airbyte/metrics-reporter:${VERSION} - container_name: metric-reporter - networks: - - airbyte_internal - environment: - - DATABASE_PASSWORD=${DATABASE_PASSWORD} - - DATABASE_URL=${DATABASE_URL} - - DATABASE_USER=${DATABASE_USER} - - DD_AGENT_HOST=${DD_AGENT_HOST} - - DD_DOGSTATSD_PORT=${DD_DOGSTATSD_PORT} - - METRIC_CLIENT=${METRIC_CLIENT} - - PUBLISH_METRICS=${PUBLISH_METRICS} +metric-reporter: + image: airbyte/metrics-reporter:${VERSION} + container_name: metric-reporter + networks: + - airbyte_internal + environment: + - DATABASE_PASSWORD=${DATABASE_PASSWORD} + - DATABASE_URL=${DATABASE_URL} + - DATABASE_USER=${DATABASE_USER} + - DD_AGENT_HOST=${DD_AGENT_HOST} + - DD_DOGSTATSD_PORT=${DD_DOGSTATSD_PORT} + - METRIC_CLIENT=${METRIC_CLIENT} + - PUBLISH_METRICS=${PUBLISH_METRICS} ``` 4. **Set Environment Variables**: Amend your `.env` file with the correct values needed by `docker-compose.yaml`: @@ -145,46 +143,43 @@ DD_DOGSTATSD_PORT=8125 5. **Re-deploy Airbyte and the Datadog Agent**: With the updated configurations, you're ready to deploy your Airbyte application by running `docker compose up`. - ## Airbyte OpenTelemetry Integration - ### Docker Compose Setup Instructions Setting up this integration for Airbyte instances deployed with Docker Compose involves four straightforward steps: - 1. **Deploy an OpenTelemetry Collector**: Follow the official [Docker Compose Getting Started documentation](https://opentelemetry.io/docs/collector/getting-started/#docker-compose). ```yaml - otel-collector: - image: otel/opentelemetry-collector-contrib - volumes: - - ./otel-collector-config.yaml:/etc/otelcol-contrib/config.yaml - ports: - - 1888:1888 # pprof extension - - 8888:8888 # Prometheus metrics exposed by the collector - - 8889:8889 # Prometheus exporter metrics - - 13133:13133 # health_check extension - - 4317:4317 # OTLP gRPC receiver - - 4318:4318 # OTLP http receiver - - 55679:55679 # zpages extension +otel-collector: + image: otel/opentelemetry-collector-contrib + volumes: + - ./otel-collector-config.yaml:/etc/otelcol-contrib/config.yaml + ports: + - 1888:1888 # pprof extension + - 8888:8888 # Prometheus metrics exposed by the collector + - 8889:8889 # Prometheus exporter metrics + - 13133:13133 # health_check extension + - 4317:4317 # OTLP gRPC receiver + - 4318:4318 # OTLP http receiver + - 55679:55679 # zpages extension ``` 2. **Update Docker Compose Configuration**: Modify your `docker-compose.yaml` file in the Airbyte repository to include the `metrics-reporter` container. 
This submits Airbyte metrics to the OpenTelemetry collector: ```yaml - metric-reporter: - image: airbyte/metrics-reporter:${VERSION} - container_name: metric-reporter - networks: - - airbyte_internal - environment: - - DATABASE_PASSWORD=${DATABASE_PASSWORD} - - DATABASE_URL=${DATABASE_URL} - - DATABASE_USER=${DATABASE_USER} - - METRIC_CLIENT=${METRIC_CLIENT} - - OTEL_COLLECTOR_ENDPOINT=${OTEL_COLLECTOR_ENDPOINT} +metric-reporter: + image: airbyte/metrics-reporter:${VERSION} + container_name: metric-reporter + networks: + - airbyte_internal + environment: + - DATABASE_PASSWORD=${DATABASE_PASSWORD} + - DATABASE_URL=${DATABASE_URL} + - DATABASE_USER=${DATABASE_USER} + - METRIC_CLIENT=${METRIC_CLIENT} + - OTEL_COLLECTOR_ENDPOINT=${OTEL_COLLECTOR_ENDPOINT} ``` 3. **Set Environment Variables**: Amend your `.env` file with the correct values needed by `docker-compose.yaml`: diff --git a/docs/operator-guides/configuring-airbyte-db.md b/docs/operator-guides/configuring-airbyte-db.md index 9adc4c881b5..26f027dfb28 100644 --- a/docs/operator-guides/configuring-airbyte-db.md +++ b/docs/operator-guides/configuring-airbyte-db.md @@ -6,18 +6,18 @@ products: oss-* Airbyte uses different objects to store internal state and metadata. This data is stored and manipulated by the various Airbyte components, but you have the ability to manage the deployment of this database in the following two ways: -* Using the default Postgres database that Airbyte spins-up as part of the Docker service described in the `docker-compose.yml` file: `airbyte/db`. -* Through a dedicated custom Postgres instance \(the `airbyte/db` is in this case unused, and can therefore be removed or de-activated from the `docker-compose.yml` file\). It's not a good practice to deploy mission-critical databases on Docker or Kubernetes. -Using a dedicated instance will provide more reliability to your Airbyte deployment. -Moreover, using a Cloud-managed Postgres instance (such as AWS RDS our GCP Cloud SQL), you will benefit from automatic backup and fine-grained sizing. You can start with a pretty small instance, but according to your Airbyte usage, the job database might grow and require more storage if you are not truncating the job history. +- Using the default Postgres database that Airbyte spins-up as part of the Docker service described in the `docker-compose.yml` file: `airbyte/db`. +- Through a dedicated custom Postgres instance \(the `airbyte/db` is in this case unused, and can therefore be removed or de-activated from the `docker-compose.yml` file\). It's not a good practice to deploy mission-critical databases on Docker or Kubernetes. + Using a dedicated instance will provide more reliability to your Airbyte deployment. + Moreover, using a Cloud-managed Postgres instance (such as AWS RDS our GCP Cloud SQL), you will benefit from automatic backup and fine-grained sizing. You can start with a pretty small instance, but according to your Airbyte usage, the job database might grow and require more storage if you are not truncating the job history. The various entities are persisted in two internal databases: -* Job database - * Data about executions of Airbyte Jobs and various runtime metadata. - * Data about the internal orchestrator used by Airbyte, Temporal.io \(Tasks, Workflow data, Events, and visibility data\). -* Config database - * Connectors, Sync Connections and various Airbyte configuration objects. +- Job database + - Data about executions of Airbyte Jobs and various runtime metadata. 
+ - Data about the internal orchestrator used by Airbyte, Temporal.io \(Tasks, Workflow data, Events, and visibility data\). +- Config database + - Connectors, Sync Connections and various Airbyte configuration objects. Note that no actual data from the source \(or destination\) connectors ever transits or is retained in this internal database. @@ -74,10 +74,10 @@ This step is only required when you setup Airbyte with a custom database for the If you provide an empty database to Airbyte and start Airbyte up for the first time, the server will automatically create the relevant tables in your database, and copy the data. Please make sure: -* The database exists in the server. -* The user has both read and write permissions to the database. -* The database is empty. - * If the database is not empty, and has a table that shares the same name as one of the Airbyte tables, the server will assume that the database has been initialized, and will not copy the data over, resulting in server failure. If you run into this issue, just wipe out the database, and launch the server again. +- The database exists in the server. +- The user has both read and write permissions to the database. +- The database is empty. + - If the database is not empty, and has a table that shares the same name as one of the Airbyte tables, the server will assume that the database has been initialized, and will not copy the data over, resulting in server failure. If you run into this issue, just wipe out the database, and launch the server again. ## Accessing the default database located in docker airbyte-db @@ -99,13 +99,13 @@ The following command will allow you to access the database instance using `psql docker exec -ti airbyte-db psql -U docker -d airbyte ``` -Following tables are created +Following tables are created + 1. `workspace` : Contains workspace information such as name, notification configuration, etc. -2. `actor_definition` : Contains the source and destination connector definitions. +2. `actor_definition` : Contains the source and destination connector definitions. 3. `actor` : Contains source and destination connectors information. 4. `actor_oauth_parameter` : Contains source and destination oauth parameters. 5. `operation` : Contains dbt and custom normalization operations. 6. `connection` : Contains connection configuration such as catalog details, source, destination, etc. 7. `connection_operation` : Contains the operations configured for a given connection. 8. `state`. Contains the last saved state for a connection. - diff --git a/docs/operator-guides/configuring-airbyte.md b/docs/operator-guides/configuring-airbyte.md index e8ed355ec52..cb51f8f5ca8 100644 --- a/docs/operator-guides/configuring-airbyte.md +++ b/docs/operator-guides/configuring-airbyte.md @@ -21,7 +21,7 @@ If you want to manage your own docker files, please refer to Airbyte's docker fi The recommended way to run an [Airbyte Kubernetes deployment](../deploying-airbyte/on-kubernetes-via-helm.md) is via the `Helm Charts`. -To configure the Airbyte Kubernetes deployment you need to modify the `values.yaml` file, more [info here](../deploying-airbyte/on-kubernetes-via-helm.md#custom-deployment). +To configure the Airbyte Kubernetes deployment you need to modify the `values.yaml` file, more [info here](../deploying-airbyte/on-kubernetes-via-helm.md#custom-deployment). Each application will consume the appropriate values from that file. If you want to manage your own Kube manifests, please refer to the `Helm Chart`. 
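For example, a minimal workflow for applying a customized `values.yaml` could look like the sketch below. The `airbyte` repository alias, release name, and namespace are assumptions — substitute the values used in your own deployment:

```bash
# Sketch only: export the chart defaults, edit them, then apply the changes.
helm repo add airbyte https://airbytehq.github.io/helm-charts   # repo URL may differ in your setup
helm show values airbyte/airbyte > values.yaml                  # dump the chart's default values
# ... edit values.yaml for your deployment ...
helm upgrade --install airbyte airbyte/airbyte --namespace airbyte --values values.yaml
```

Exporting the defaults first also makes it easy to diff your overrides against the chart version you plan to upgrade to.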
@@ -80,7 +80,7 @@ The following variables are relevant to both Docker and Kubernetes. #### Jobs -1. `SYNC_JOB_MAX_ATTEMPTS` - Defines the number of attempts a sync will attempt before failing. *Legacy - this is superseded by the values below* +1. `SYNC_JOB_MAX_ATTEMPTS` - Defines the number of attempts a sync will attempt before failing. _Legacy - this is superseded by the values below_ 2. `SYNC_JOB_RETRIES_COMPLETE_FAILURES_MAX_SUCCESSIVE` - Defines the max number of successive attempts in which no data was synchronized before failing the job. 3. `SYNC_JOB_RETRIES_COMPLETE_FAILURES_MAX_TOTAL` - Defines the max number of attempts in which no data was synchronized before failing the job. 4. `SYNC_JOB_RETRIES_COMPLETE_FAILURES_BACKOFF_MIN_INTERVAL_S` - Defines the minimum backoff interval in seconds between failed attempts in which no data was synchronized. diff --git a/docs/operator-guides/configuring-connector-resources.md b/docs/operator-guides/configuring-connector-resources.md index 20c03a8dc9b..9fffb77f92f 100644 --- a/docs/operator-guides/configuring-connector-resources.md +++ b/docs/operator-guides/configuring-connector-resources.md @@ -9,6 +9,7 @@ As noted in [Workers & Jobs](../understanding-airbyte/jobs.md), there are four d Although it is possible to configure resources for all four jobs, we focus on Sync jobs as it is the most frequently run job. There are three different ways to configure connector resource requirements for a Sync: + 1. Instance-wide - applies to all containers in a Sync. 2. Connector-specific - applies to all containers of that connector type in a Sync. 3. Connection-specific - applies to all containers of that connection in a Sync. @@ -16,6 +17,7 @@ There are three different ways to configure connector resource requirements for In general, **the narrower scope the requirement, the higher the precedence**. In decreasing order of precedence: + 1. Connection-specific - Highest precedence. Overrides all other configuration. We recommend using this on a case-by-case basis. 2. Connector-specific - Second-highest precedence. Overrides instance-wide configuration. Mostly for internal Airbyte-use. We recommend staying away from this. 3. Instance-wide - Lowest precedence. Overridden by all other configuration. Intended to be a default. We recommend setting this as a baseline. @@ -23,7 +25,8 @@ In decreasing order of precedence: ## Configuring Instance-Wide Requirements Instance-wide requirements are the simplest requirement to configure. All that is needed is to set the following env vars: -1. `JOB_MAIN_CONTAINER_CPU_REQUEST` - Define the job container's minimum CPU usage. Units follow either Docker or Kubernetes, depending on the deployment. Defaults to none. + +1. `JOB_MAIN_CONTAINER_CPU_REQUEST` - Define the job container's minimum CPU usage. Units follow either Docker or Kubernetes, depending on the deployment. Defaults to none. 2. `JOB_MAIN_CONTAINER_CPU_LIMIT` - Define the job container's maximum CPU usage. Units follow either Docker or Kubernetes, depending on the deployment. Defaults to none. 3. `JOB_MAIN_CONTAINER_MEMORY_REQUEST` - Define the job container's minimum RAM usage. Units follow either Docker or Kubernetes, depending on the deployment. Defaults to none. 4. `JOB_MAIN_CONTAINER_MEMORY_LIMIT` - Define the job container's maximum RAM usage. Units follow either Docker or Kubernetes, depending on the deployment. Defaults to none. @@ -31,10 +34,13 @@ Instance-wide requirements are the simplest requirement to configure. 
All that i ## Configuring Connector-Specific Requirements 1. Connect to the database and run the following query with the image name replaced to find the connector definition id. + ```sql select * from actor_definition where actor_definition.docker_repository like '%'; ``` + 2. Run the following commend with the resource requirements and the connection definition id filled in. + ```sql update actor_definition set resource_requirements = '{"jobSpecific": [{"jobType": "sync", "resourceRequirements": {"cpu_limit": "0.5", "cpu_request": "0.5", "memory_limit": "500Mi", "memory_request": "500Mi"}}]}' where id = ''; ``` @@ -46,6 +52,7 @@ update actor_definition set resource_requirements = '{"jobSpecific": [{"jobType" If the url is `localhost:8000/workspaces/92ad8c0e-d204-4bb4-9c9e-30fe25614eee/connections/5432b428-b04a-4562-a12b-21c7b9e8b63a/status`, the connection id is `5432b428-b04a-4562-a12b-21c7b9e8b63a`. 2. Connect to the database and run the following command with the connection id and resource requirements filled in. + ```sql // SQL command with example update connection set resource_requirements = '{"cpu_limit": "0.5", "cpu_request": "0.5", "memory_limit": "500Mi", "memory_request": "500Mi"}' where id = ''; @@ -58,11 +65,13 @@ Airbyte logs the resource requirements as part of the job logs as containers are If a job is running out-of-memory, simply navigate to the Job in the UI, and look for the log to confirm the right configuration is being detected. On Docker, the log will look something like this: + ``` Creating docker container = destination-e2e-test-write-39-0-vnqtl with resources io.airbyte.config.ResourceRequirements@1d86d7c9[cpuRequest=,cpuLimit=,memoryRequest=200Mi,memoryLimit=200Mi] ``` On Kubernetes, the log will look something like this: + ``` 2022-08-12 01:22:20 INFO i.a.w.p.KubeProcessFactory(create):100 - Attempting to start pod = source-intercom-check-480195-0-abvnr for airbyte/source-intercom:0.1.24 with resources io.airbyte.config.ResourceRequirements@11cc9fb9[cpuRequest=2,cpuLimit=2,memoryRequest=200Mi,memoryLimit=200Mi] ``` diff --git a/docs/operator-guides/reset.md b/docs/operator-guides/reset.md index 565c4357131..dcb3f4e48d3 100644 --- a/docs/operator-guides/reset.md +++ b/docs/operator-guides/reset.md @@ -4,36 +4,38 @@ products: all # Clearing your data -From time-to-time, you may want to erase all of the data that Airbyte has created in your destination. This can be accomplished by clearing your data. In order to backfill all historical data, a sync should be initiated after your clear succeeds. +From time-to-time, you may want to erase all of the data that Airbyte has created in your destination. This can be accomplished by clearing your data. In order to backfill all historical data, a sync should be initiated after your clear succeeds. Note that there is no way to recover from a clear sync, so please be certain that you wish to erase all the data in your destination. :::warning -Not all sources keep their history forever. If you clear your data, and your source does not retain all of its records, this may lead to data loss. +Not all sources keep their history forever. If you clear your data, and your source does not retain all of its records, this may lead to data loss. ::: -A Clear can be triggered either from the UI or Airbyte API. Airbyte allows you to clear all streams in the connection or only a single stream through the UI. You may also be prompted to clear some streams when making configuration changes that apply to multiple streams. 
Airbyte additionally supports the clearing of multiple streams through the API. +A Clear can be triggered either from the UI or Airbyte API. Airbyte allows you to clear all streams in the connection or only a single stream through the UI. You may also be prompted to clear some streams when making configuration changes that apply to multiple streams. Airbyte additionally supports the clearing of multiple streams through the API. ## Steps to Clear Data + To perform a full removal of the data for all your streams, navigate to a connection's `Settings` tab and click "Clear data". Confirm the selection to remove all previously synced data from the destination for that connection. -To clear data for a single stream, navigate to a Connection's status page, click the three grey dots next to any stream, and select "Clear data". This will clear the data for just that stream. You will then need to sync the connection again in order to reload data for that stream. +To clear data for a single stream, navigate to a Connection's status page, click the three grey dots next to any stream, and select "Clear data". This will clear the data for just that stream. You will then need to sync the connection again in order to reload data for that stream. -You will also automatically be prompted to clear affected streams if you edit any stream settings or approve any non-breaking schema changes. To ensure data continues to sync accurately, Airbyte recommends doing a clear of those streams as your streams could sync incorrectly if a clear is not performed. +You will also automatically be prompted to clear affected streams if you edit any stream settings or approve any non-breaking schema changes. To ensure data continues to sync accurately, Airbyte recommends doing a clear of those streams as your streams could sync incorrectly if a clear is not performed. -Similarly to a sync, a clear can be completed as successful, failed, or cancelled. To resolve a failed clearing of data, you should manually drop the tables in the destination so that Airbyte can continue syncing accurately into the destination. +Similarly to a sync, a clear can be completed as successful, failed, or cancelled. To resolve a failed clearing of data, you should manually drop the tables in the destination so that Airbyte can continue syncing accurately into the destination. In order to backfill all historical data, a sync should be initiated after your clear succeeds. :::note -A single stream clear will sync all enabled streams on the next sync. +A single stream clear will sync all enabled streams on the next sync. ::: ## Clear behavior + When clearing data is successfully completed, all the records are deleted from your destination tables (and files, if using local JSON or local CSV as the destination). The tables or files are not removed, they will only be emptied. -Clearing your data causes data downtime, meaning that your final tables will be empty once the Clear is complete. Clearing your data also blocks the running of regularly-scheduled syncs until they are complete. If you choose to clear your data while another sync is running, it will enqueue, and start at the end of the currently running sync. +Clearing your data causes data downtime, meaning that your final tables will be empty once the Clear is complete. Clearing your data also blocks the running of regularly-scheduled syncs until they are complete. If you choose to clear your data while another sync is running, it will enqueue, and start at the end of the currently running sync. 
:::tip If you have any orphaned tables or files that are no longer being synced to, they should be cleaned up separately, as Airbyte will not clean them up for you. This can occur when the `Destination Namespace` or `Stream Prefix` connection configuration is changed for an existing connection. -::: \ No newline at end of file +::: diff --git a/docs/operator-guides/scaling-airbyte.md b/docs/operator-guides/scaling-airbyte.md index 9c80cdbff37..b7b7ad58e67 100644 --- a/docs/operator-guides/scaling-airbyte.md +++ b/docs/operator-guides/scaling-airbyte.md @@ -35,7 +35,6 @@ You may want to customize this by setting `JOB_MAIN_CONTAINER_MEMORY_REQUEST` an Note that all Source database connectors are Java connectors. This means that users currently need to over-specify memory resource for Java connectors. - ### Disk Space Airbyte uses backpressure to try to read the minimal amount of logs required. In the past, disk space was a large concern, but we've since deprecated the expensive on-disk queue approach. diff --git a/docs/operator-guides/telemetry.md b/docs/operator-guides/telemetry.md index 813cedca9ed..d96341430ef 100644 --- a/docs/operator-guides/telemetry.md +++ b/docs/operator-guides/telemetry.md @@ -18,6 +18,7 @@ Also check our [privacy policy](https://airbyte.com/privacy-policy) for more det ``` TRACKING_STRATEGY=logging ``` + When visiting the webapp or our homepage the first time, you'll be asked for your consent to @@ -26,6 +27,7 @@ Also check our [privacy policy](https://airbyte.com/privacy-policy) for more det To change this later go to **Settings** > **User Settings** > **Cookie Preferences** or **Cookie Preferences** in the footer of our [homepage](https://airbyte.com). Server side telemetry collection can't be changed using Airbyte Cloud. + When running [PyAirbyte](https://docs.airbyte.com/pyairbyte) for the first time on a new machine, you'll be informed that anonymous @@ -42,5 +44,6 @@ Also check our [privacy policy](https://airbyte.com/privacy-policy) for more det You can opt-out of anonymous usage reporting by setting the environment variable `DO_NOT_TRACK` to any value. + diff --git a/docs/operator-guides/transformation-and-normalization/README.md b/docs/operator-guides/transformation-and-normalization/README.md index ba728b732b1..191b9077179 100644 --- a/docs/operator-guides/transformation-and-normalization/README.md +++ b/docs/operator-guides/transformation-and-normalization/README.md @@ -1,2 +1 @@ # Transformations and Normalization - diff --git a/docs/operator-guides/transformation-and-normalization/transformations-with-airbyte.md b/docs/operator-guides/transformation-and-normalization/transformations-with-airbyte.md index 30fa2c4051e..37401a5a99a 100644 --- a/docs/operator-guides/transformation-and-normalization/transformations-with-airbyte.md +++ b/docs/operator-guides/transformation-and-normalization/transformations-with-airbyte.md @@ -5,7 +5,7 @@ products: oss-* # Transformations with Airbyte (Part 3/3) :::warning -Normalization and Custom Transformation are deprecated features. +Normalization and Custom Transformation are deprecated features. Destinations using Normalization will be replaced by [Typing and Deduping](/using-airbyte/core-concepts/typing-deduping.md). Custom Transformation will be removed on March 31. For more information, visit [here](https://github.com/airbytehq/airbyte/discussions/34860). 
::: @@ -40,9 +40,9 @@ Now, let's connect my mono-repo Business Intelligence project stored in a privat Note that if you need to connect to a private git repository, the recommended way to do so is to generate a `Personal Access Token` that can be used instead of a password. Then, you'll be able to include the credentials in the git repository url: -* [GitHub - Personal Access Tokens](https://docs.github.com/en/github/authenticating-to-github/keeping-your-account-and-data-secure/creating-a-personal-access-token) -* [Gitlab - Personal Access Tokens](https://docs.gitlab.com/ee/user/profile/personal_access_tokens.html) -* [Azure DevOps - Personal Access Tokens](https://docs.microsoft.com/en-us/azure/devops/organizations/accounts/use-personal-access-tokens-to-authenticate) +- [GitHub - Personal Access Tokens](https://docs.github.com/en/github/authenticating-to-github/keeping-your-account-and-data-secure/creating-a-personal-access-token) +- [Gitlab - Personal Access Tokens](https://docs.gitlab.com/ee/user/profile/personal_access_tokens.html) +- [Azure DevOps - Personal Access Tokens](https://docs.microsoft.com/en-us/azure/devops/organizations/accounts/use-personal-access-tokens-to-authenticate) And then use it for cloning: @@ -75,19 +75,18 @@ According to the dbt documentation, I can configure the [packages folder](https: ```yaml # dbt_project.yml -packages-install-path: '../dbt_packages' +packages-install-path: "../dbt_packages" ``` > If I want to chain **dbt deps** and **dbt run**, I may use **[dbt build](https://docs.getdbt.com/reference/commands/build)** instead, which is not equivalent to the two previous commands, but will remove the need to alter the configuration of dbt. - ### Refresh models partially Since I am using a mono-repo from my organization, other team members or departments may also contribute their dbt models to this centralized location. This will give us many dbt models and sources to build our complete data warehouse... The whole warehouse is scheduled for full refresh on a different orchestration tool, or as part of the git repository CI. However, here, I want to partially refresh some small relevant tables when attaching this operation to a specific Airbyte sync, in this case, the Covid dataset. -Therefore, I can restrict the execution of models to a particular tag or folder by specifying in the dbt cli arguments, in this case whatever is related to "covid\_api": +Therefore, I can restrict the execution of models to a particular tag or folder by specifying in the dbt cli arguments, in this case whatever is related to "covid_api": ```text run --models tag:covid_api opendata.base.* @@ -107,4 +106,4 @@ This string must have no space. There is a [Github issue](https://github.com/air ### DBT Profile -There is no need to specify `--profiles-dir`. By default AirByte based on the destination type. For example, if you're using Postgres as your destination, Airbyte will create a profile configuration based on that destination. This means you don't need to specify the credentials. If you specify a custom `profile` file, you are responsible for securely managing the credentials. Currently, we don't have a way to manage and pass secrets and it's recommended you let Airbyte pass this to dbt. +There is no need to specify `--profiles-dir`. By default, Airbyte generates a profiles configuration based on the destination type. For example, if you're using Postgres as your destination, Airbyte will create a profile configuration based on that destination. This means you don't need to specify the credentials.
If you specify a custom `profile` file, you are responsible for securely managing the credentials. Currently, we don't have a way to manage and pass secrets and it's recommended you let Airbyte pass this to dbt. diff --git a/docs/operator-guides/transformation-and-normalization/transformations-with-dbt.md b/docs/operator-guides/transformation-and-normalization/transformations-with-dbt.md index bbb7987d0b1..a2dad71bd7a 100644 --- a/docs/operator-guides/transformation-and-normalization/transformations-with-dbt.md +++ b/docs/operator-guides/transformation-and-normalization/transformations-with-dbt.md @@ -5,7 +5,7 @@ products: oss-* # Transformations with dbt (Part 2/3) :::warning -Normalization and Custom Transformation are deprecated features. +Normalization and Custom Transformation are deprecated features. Destinations using Normalization will be replaced by [Typing and Deduping](/using-airbyte/core-concepts/typing-deduping.md). Custom Transformation will be removed on March 31. For more information, visit [here](https://github.com/airbytehq/airbyte/discussions/34860). ::: @@ -192,7 +192,7 @@ from {{ ref('covid_epidemiology_ab3_558') }} If you have [dbt installed](https://docs.getdbt.com/dbt-cli/installation/) locally on your machine, you can then view, edit, version, customize, and run the dbt models in your project outside Airbyte syncs. ```bash -#!/usr/bin/env bash +#!/usr/bin/env bash dbt deps --profiles-dir=$NORMALIZE_DIR --project-dir=$NORMALIZE_DIR dbt run --profiles-dir=$NORMALIZE_DIR --project-dir=$NORMALIZE_DIR --full-refresh @@ -223,4 +223,3 @@ Done. PASS=1 WARN=0 ERROR=0 SKIP=0 TOTAL=1 Now, that you've exported the generated normalization models, you can edit and tweak them as necessary. If you want to know how to push your modifications back to Airbyte and use your updated dbt project during Airbyte syncs, you can continue with the following [tutorial on importing transformations into Airbyte](transformations-with-airbyte.md)... - diff --git a/docs/operator-guides/transformation-and-normalization/transformations-with-sql.md b/docs/operator-guides/transformation-and-normalization/transformations-with-sql.md index 361b26c657a..f0ba3fcf6f2 100644 --- a/docs/operator-guides/transformation-and-normalization/transformations-with-sql.md +++ b/docs/operator-guides/transformation-and-normalization/transformations-with-sql.md @@ -5,7 +5,7 @@ products: oss-* # Transformations with SQL (Part 1/3) :::warning -Normalization and Custom Transformation are deprecated features. +Normalization and Custom Transformation are deprecated features. Destinations using Normalization will be replaced by [Typing and Deduping](/using-airbyte/core-concepts/typing-deduping.md). Custom Transformation will be removed on March 31. For more information, visit [here](https://github.com/airbytehq/airbyte/discussions/34860). ::: @@ -34,7 +34,7 @@ Anyway, it is possible to short-circuit this process \(no vendor lock-in\) and h This could be useful if: -1. You have a use-case not related to analytics that could be handled with data in its raw JSON format. +1. You have a use-case not related to analytics that could be handled with data in its raw JSON format. 2. You can implement your own transformer. For example, you could write them in a different language, create them in an analytics engine like Spark, or use a transformation tool such as dbt or Dataform. 3. You want to customize and change how the data is normalized with your own queries. 
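As an illustration of the first point, a minimal sketch of working with the raw JSON directly is shown below. It assumes a Postgres destination and the raw table used later in this guide, with the JSON payload stored in an `_airbyte_data` column — adjust the names to match your destination:

```sql
-- Sketch only: read fields straight out of the raw JSON record, with no normalization step.
-- The table and column names are assumptions based on the examples in this guide.
select
  _airbyte_data ->> 'key'                     as "key",
  _airbyte_data ->> 'date'                    as "date",
  (_airbyte_data ->> 'new_confirmed')::float  as new_confirmed,
  _airbyte_emitted_at
from "postgres".quarantine._airbyte_raw_covid_epidemiology;
```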
@@ -144,34 +144,34 @@ from "postgres".quarantine._airbyte_raw_covid_epidemiology -- SQL model to cast each column to its adequate SQL type converted from the JSON schema type select - cast("key" as + cast("key" as varchar ) as "key", - cast("date" as + cast("date" as varchar ) as "date", - cast(new_tested as + cast(new_tested as float ) as new_tested, - cast(new_deceased as + cast(new_deceased as float ) as new_deceased, - cast(total_tested as + cast(total_tested as float ) as total_tested, - cast(new_confirmed as + cast(new_confirmed as float ) as new_confirmed, - cast(new_recovered as + cast(new_recovered as float ) as new_recovered, - cast(total_deceased as + cast(total_deceased as float ) as total_deceased, - cast(total_confirmed as + cast(total_confirmed as float ) as total_confirmed, - cast(total_recovered as + cast(total_recovered as float ) as total_recovered, _airbyte_emitted_at @@ -184,29 +184,29 @@ select *, md5(cast( - coalesce(cast("key" as + coalesce(cast("key" as varchar -), '') || '-' || coalesce(cast("date" as +), '') || '-' || coalesce(cast("date" as varchar -), '') || '-' || coalesce(cast(new_tested as +), '') || '-' || coalesce(cast(new_tested as varchar -), '') || '-' || coalesce(cast(new_deceased as +), '') || '-' || coalesce(cast(new_deceased as varchar -), '') || '-' || coalesce(cast(total_tested as +), '') || '-' || coalesce(cast(total_tested as varchar -), '') || '-' || coalesce(cast(new_confirmed as +), '') || '-' || coalesce(cast(new_confirmed as varchar -), '') || '-' || coalesce(cast(new_recovered as +), '') || '-' || coalesce(cast(new_recovered as varchar -), '') || '-' || coalesce(cast(total_deceased as +), '') || '-' || coalesce(cast(total_deceased as varchar -), '') || '-' || coalesce(cast(total_confirmed as +), '') || '-' || coalesce(cast(total_confirmed as varchar -), '') || '-' || coalesce(cast(total_recovered as +), '') || '-' || coalesce(cast(total_recovered as varchar ), '') - as + as varchar )) as _airbyte_covid_epidemiology_hashid from __dbt__CTE__covid_epidemiology_ab2_558 @@ -261,18 +261,20 @@ as ( Feel free to: -* Rename the columns as you desire - * avoiding using keywords such as `"key"` or `"date"` -* You can tweak the column data type if the ones generated by Airbyte are not the ones you favor - * For example, let's use `Integer` instead of `Float` for the number of Covid cases... -* Add deduplicating logic - * if you can identify which columns to use as Primary Keys +- Rename the columns as you desire + - avoiding using keywords such as `"key"` or `"date"` +- You can tweak the column data type if the ones generated by Airbyte are not the ones you favor + - For example, let's use `Integer` instead of `Float` for the number of Covid cases... +- Add deduplicating logic + + - if you can identify which columns to use as Primary Keys \(since airbyte isn't able to detect those automatically yet...\) - * \(Note: actually I am not even sure if I can tell the proper primary key in this dataset...\) -* Create a View \(or materialized views\) instead of a Table. -* etc + - \(Note: actually I am not even sure if I can tell the proper primary key in this dataset...\) + +- Create a View \(or materialized views\) instead of a Table. +- etc ```sql create view "postgres"."public"."covid_epidemiology" as ( @@ -322,4 +324,3 @@ create view "postgres"."public"."covid_epidemiology" as ( Then you can run in your preferred SQL editor or tool! 
If you are familiar with dbt or want to learn more about it, you can continue with the following [tutorial using dbt](transformations-with-dbt.md)... - diff --git a/docs/operator-guides/upgrading-airbyte.md b/docs/operator-guides/upgrading-airbyte.md index 5a4da98d990..0493f79590e 100644 --- a/docs/operator-guides/upgrading-airbyte.md +++ b/docs/operator-guides/upgrading-airbyte.md @@ -92,14 +92,20 @@ The instructions below are for users using custom deployment and have a `values. 2. You can click in `Default Values` and compare the value file between the new version and version you're running. You can run `helm list -n ` to check the CHART version you're using. 3. Update your `values.yaml` file if necessary. 4. Upgrade the Helm app running: + ```bash helm upgrade --install airbyte/airbyte --values --version ``` After 2-5 minutes, Helm will print a message showing how to port-forward Airbyte. This may take longer on Kubernetes clusters with slow internet connections. In general the message is the following: + ```bash - export POD_NAME=$(kubectl get pods -l "app.kubernetes.io/name=webapp" -o jsonpath="{.items[0].metadata.name}") - export CONTAINER_PORT=$(kubectl get pod $POD_NAME -o jsonpath="{.spec.containers[0].ports[0].containerPort}") + export POD_NAME=$(kubectl get pods -l "app.kubernetes.io/name=webapp" -o jsonpath="{.items[0].metadata.name}") + export CONTAINER_PORT=$(kubectl get pod $POD_NAME -o jsonpath="{.spec.containers[0].ports[0].containerPort}") echo "Visit http://127.0.0.1:8080 to use your application" kubectl port-forward $POD_NAME 8080:$CONTAINER_PORT - ``` + ``` + +``` + +``` diff --git a/docs/operator-guides/using-custom-connectors.md b/docs/operator-guides/using-custom-connectors.md index 6597dc7ad88..5c236c252f2 100644 --- a/docs/operator-guides/using-custom-connectors.md +++ b/docs/operator-guides/using-custom-connectors.md @@ -1,76 +1,89 @@ --- products: oss-* -sidebar_label: Uploading custom connectors ---- - +sidebar_label: Uploading custom connectors +--- + # Uploading Docker-based custom connectors :::info This guide walks through the setup of a Docker-based custom connector. To understand how to use our low-code connector builder, read our guide [here](/connector-development/connector-builder-ui/overview.md). ::: -If our connector catalog does not fulfill your needs, you can build your own Airbyte connectors! You can either use our [low-code connector builder](/connector-development/connector-builder-ui/overview.md) or upload a Docker-based custom connector. +If our connector catalog does not fulfill your needs, you can build your own Airbyte connectors! You can either use our [low-code connector builder](/connector-development/connector-builder-ui/overview.md) or upload a Docker-based custom connector. This page walks through the process to upload a **Docker-based custom connector**. This is an ideal route for connectors that have an **internal** use case like a private API with a specific fit for your organization. This guide for using Docker-based custom connectors assumes the following: -* You followed our other guides and tutorials about [connector development](/connector-development/connector-builder-ui/overview.md) -* You finished your connector development and have it running locally on an Airbyte development instance. -* You want to deploy this connector to a production Airbyte instance running on a VM with docker-compose or on a Kubernetes cluster. 
+ +- You followed our other guides and tutorials about [connector development](/connector-development/connector-builder-ui/overview.md) +- You finished your connector development and have it running locally on an Airbyte development instance. +- You want to deploy this connector to a production Airbyte instance running on a VM with docker-compose or on a Kubernetes cluster. If you prefer video tutorials, we recorded a demo on how to upload [connectors images to a GCP Artifact Registry](https://www.youtube.com/watch?v=4YF20PODv30&ab_channel=Airbyte). ## 1. Create a private Docker registry + Airbyte needs to pull its Docker images from a remote Docker registry to consume a connector. -You should host your custom connectors image on a private Docker registry. +You should host your custom connectors image on a private Docker registry. Here are some resources to create a private Docker registry, in case your organization does not already have one: -| Cloud provider | Service name | Documentation | -|----------------|-----------------------------|--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------| -| Google Cloud | Artifact Registry | [Quickstart](https://cloud.google.com/artifact-registry/docs/docker/quickstart)| -| AWS | Amazon ECR | [Getting started with Amazon ECR](https://docs.aws.amazon.com/AmazonECR/latest/userguide/getting-started-console.html)| -| Azure | Container Registry | [Quickstart](https://docs.microsoft.com/en-us/azure/container-registry/container-registry-get-started-portal#:~:text=Azure%20Container%20Registry%20is%20a,container%20images%20and%20related%20artifacts.&text=Then%2C%20use%20Docker%20commands%20to,the%20image%20from%20your%20registry.)| -| DockerHub | Repositories | [DockerHub Quickstart](https://docs.docker.com/docker-hub/)| -| Self hosted | Open-source Docker Registry | [Deploy a registry server](https://docs.docker.com/registry/deploying/)| +| Cloud provider | Service name | Documentation | +| -------------- | --------------------------- | --------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------- | +| Google Cloud | Artifact Registry | [Quickstart](https://cloud.google.com/artifact-registry/docs/docker/quickstart) | +| AWS | Amazon ECR | [Getting started with Amazon ECR](https://docs.aws.amazon.com/AmazonECR/latest/userguide/getting-started-console.html) | +| Azure | Container Registry | [Quickstart](https://docs.microsoft.com/en-us/azure/container-registry/container-registry-get-started-portal#:~:text=Azure%20Container%20Registry%20is%20a,container%20images%20and%20related%20artifacts.&text=Then%2C%20use%20Docker%20commands%20to,the%20image%20from%20your%20registry.) | +| DockerHub | Repositories | [DockerHub Quickstart](https://docs.docker.com/docker-hub/) | +| Self hosted | Open-source Docker Registry | [Deploy a registry server](https://docs.docker.com/registry/deploying/) | ## 2. 
Authenticate to your private Docker registry + To push and pull images to your private Docker registry, you need to authenticate to it: -* Your local or CI environment (where you build your connector image) must be able to **push** images to your registry. -* Your Airbyte instance must be able to **pull** images from your registry. + +- Your local or CI environment (where you build your connector image) must be able to **push** images to your registry. +- Your Airbyte instance must be able to **pull** images from your registry. ### For Docker-compose Airbyte deployments + #### On GCP - Artifact Registry: + GCP offers the `gcloud` credential helper to log in to your Artifact registry. Please run the command detailed [here](https://cloud.google.com/artifact-registry/docs/docker/quickstart#auth) to authenticate your local environment/CI environment to your Artifact registry. Run the same authentication flow on your Compute Engine instance. If you do not want to use `gcloud`, GCP offers other authentication methods detailed [here](https://cloud.google.com/artifact-registry/docs/docker/authentication). #### On AWS - Amazon ECR: + You can authenticate to an ECR private registry using the `aws` CLI: `aws ecr get-login-password --region region | docker login --username AWS --password-stdin aws_account_id.dkr.ecr.region.amazonaws.com` You can find details about this command and other available authentication methods [here](https://docs.aws.amazon.com/AmazonECR/latest/userguide/registry_auth.html). You will have to authenticate your local/CI environment (where you build your image) **and** your EC2 instance where your Airbyte instance is running. #### On Azure - Container Registry: + You can authenticate to an Azure Container Registry using the `az` CLI: `az acr login --name ` You can find details about this command [here](https://docs.microsoft.com/en-us/azure/container-registry/container-registry-get-started-portal#:~:text=Azure%20Container%20Registry%20is%20a,container%20images%20and%20related%20artifacts.&text=Then,%20use%20Docker%20commands%20to,the%20image%20from%20your%20registry.) You will have to authenticate both your local/CI environment/ environment (where your image is built) **and** your Azure Virtual Machine instance where the Airbyte instance is running. #### On DockerHub - Repositories: + You can use Docker Desktop to authenticate your local machine to your DockerHub registry by signing in on the desktop application using your DockerID. You need to use a [service account](https://docs.docker.com/docker-hub/service-accounts/) to authenticate your Airbyte instance to your DockerHub registry. #### Self hosted - Open source Docker Registry: + It would be best to set up auth on your Docker registry to make it private. Available authentication options for an open-source Docker registry are listed [here](https://docs.docker.com/registry/configuration/#auth). To authenticate your local/CI environment and Airbyte instance you can use the [`docker login`](https://docs.docker.com/engine/reference/commandline/login/) command. ### For Kubernetes Airbyte deployments + You can use the previous section's authentication flow to authenticate your local/CI to your private Docker registry. If you provisioned your Kubernetes cluster using AWS EKS, GCP GKE, or Azure AKS: it is very likely that you already allowed your cluster to pull images from the respective container registry service of your cloud provider. 
If you want Airbyte to pull images from another private Docker registry, you will have to do the following: + 1. Create a `Secret` in Kubernetes that will host your authentication credentials. [This Kubernetes documentation](https://kubernetes.io/docs/tasks/configure-pod-container/pull-image-private-registry/) explains how to proceed. 2. Set the `JOB_KUBE_MAIN_CONTAINER_IMAGE_PULL_SECRET` environment variable on the `airbyte-worker` pod. The value must be **the name of your previously created Kubernetes Secret**. ## 3. Push your connector image to your private Docker registry + 1. Build and tag your connector image locally, e.g.: `docker build . -t my-custom-connectors/source-custom:0.1.0` 2. Create your image tag with `docker tag` command. The structure of the remote tag depends on your cloud provider's container registry service. Please check their online documentation linked at the top. 3. Use `docker push :` to push the image to your private Docker registry. @@ -78,20 +91,22 @@ If you want Airbyte to pull images from another private Docker registry, you wil You should run all the above commands from your local/CI environment, where your connector source code is available. ## 4. Use your custom Docker connector in Airbyte + At this step, you should have: -* A private Docker registry hosting your custom connector image. -* Authenticated your Airbyte instance to your private Docker registry. + +- A private Docker registry hosting your custom connector image. +- Authenticated your Airbyte instance to your private Docker registry. You can pull your connector image from your private registry to validate the previous steps. On your Airbyte instance: run `docker pull :` if you are using our `docker-compose` deployment, or start a pod that is using the connector image. 1. Click on `Settings` in the left-hand sidebar. Navigate to `Sources` or `Destinations` depending on your connector. Click on `Add a new Docker connector`. -2. Name your custom connector in `Connector display name`. This is just the display name used for your workspace. +2. Name your custom connector in `Connector display name`. This is just the display name used for your workspace. 3. Fill in the Docker `Docker full image name` and `Docker image tag`. 4. (Optional) Add a link to connector's documentation in `Connector documentation URL` -You can optionally fill this with any value if you do not have online documentation for your connector. -This documentation will be linked in your connector setting's page. + You can optionally fill this with any value if you do not have online documentation for your connector. + This documentation will be linked in your connector setting's page. -5. `Add` the connector to save the configuration. You can now select your new connector when setting up a new connection! \ No newline at end of file +5. `Add` the connector to save the configuration. You can now select your new connector when setting up a new connection! diff --git a/docs/operator-guides/using-dagster-integration.md b/docs/operator-guides/using-dagster-integration.md index 03dd051118d..3df656746bc 100644 --- a/docs/operator-guides/using-dagster-integration.md +++ b/docs/operator-guides/using-dagster-integration.md @@ -3,7 +3,7 @@ description: Start triggering Airbyte jobs with Dagster in minutes products: oss-* --- -# Using the Dagster Integration +# Using the Dagster Integration Airbyte is an official integration in the Dagster project. 
The Airbyte Integration allows you to trigger synchronization jobs in Airbyte, and this tutorial will walk through configuring your Dagster Ops to do so. @@ -49,16 +49,17 @@ def my_simple_airbyte_job(): The Airbyte Dagster Resource accepts the following parameters: -* `host`: The host URL to your Airbyte instance. -* `port`: The port value you have selected for your Airbyte instance. -* `use_https`: If your server use secure HTTP connection. -* `request_max_retries`: The maximum number of times requests to the Airbyte API should be retried before failing. -* `request_retry_delay`: Time in seconds to wait between each request retry. +- `host`: The host URL to your Airbyte instance. +- `port`: The port value you have selected for your Airbyte instance. +- `use_https`: Whether your server uses a secure HTTP connection. +- `request_max_retries`: The maximum number of times requests to the Airbyte API should be retried before failing. +- `request_retry_delay`: Time in seconds to wait between each request retry. The Airbyte Dagster Op accepts the following parameters: -* `connection_id`: The Connection UUID you want to trigger -* `poll_interval`: The time in seconds that will be waited between successive polls. -* `poll_timeout`: he maximum time that will waited before this operation is timed out. + +- `connection_id`: The Connection UUID you want to trigger +- `poll_interval`: The time in seconds that will be waited between successive polls. +- `poll_timeout`: The maximum time that will be waited before this operation is timed out. After running the file, `dagster job execute -f airbyte_dagster.py ` this will trigger the job with Dagster. @@ -69,6 +70,7 @@ Don't be fooled by our simple example of only one Dagster Flow. Airbyte is a pow We love to hear any questions or feedback on our [Slack](https://slack.airbyte.io/). We're still in alpha, so if you see any rough edges or want to request a connector, feel free to create an issue on our [Github](https://github.com/airbytehq/airbyte) or thumbs up an existing issue. ## Related articles and guides + For additional information about using Dagster and Airbyte together, see the following: - [Build an e-commerce analytics stack with Airbyte, dbt, Dagster and BigQuery](https://github.com/airbytehq/quickstarts/tree/main/ecommerce_analytics_bigquery) diff --git a/docs/operator-guides/using-kestra-plugin.md b/docs/operator-guides/using-kestra-plugin.md index 0a8da24761a..d835b92a144 100644 --- a/docs/operator-guides/using-kestra-plugin.md +++ b/docs/operator-guides/using-kestra-plugin.md @@ -5,17 +5,17 @@ products: oss-* # Using the Kestra Plugin -Kestra has an official plugin for Airbyte, including support for self-hosted Airbyte and Airbyte Cloud. This plugin allows you to trigger data replication jobs (`Syncs`) and wait for their completion before proceeding with any downstream tasks. Alternatively, you may also run those syncs in a fire-and-forget way by setting the `wait` argument to `false`. +Kestra has an official plugin for Airbyte, including support for self-hosted Airbyte and Airbyte Cloud. This plugin allows you to trigger data replication jobs (`Syncs`) and wait for their completion before proceeding with any downstream tasks. Alternatively, you may also run those syncs in a fire-and-forget way by setting the `wait` argument to `false`. -After Airbyte tasks successfully ingest raw data, you can easily start running downstream data transformations with dbt, Python, SQL, Spark, and many more, using a variety of available plugins.
Check the [plugin documentation](https://kestra.io/plugins/) for a list of all supported integrations. +After Airbyte tasks successfully ingest raw data, you can easily start running downstream data transformations with dbt, Python, SQL, Spark, and many more, using a variety of available plugins. Check the [plugin documentation](https://kestra.io/plugins/) for a list of all supported integrations. ## Available tasks These are the two main tasks to orchestrate Airbyte syncs: -1) The `io.kestra.plugin.airbyte.connections.Sync` task will sync connections for a self-hosted Airbyte instance +1. The `io.kestra.plugin.airbyte.connections.Sync` task will sync connections for a self-hosted Airbyte instance -2) The `io.kestra.plugin.airbyte.cloud.jobs.Sync` task will sync connections for Airbyte Cloud +2. The `io.kestra.plugin.airbyte.cloud.jobs.Sync` task will sync connections for Airbyte Cloud ## **1. Set up the tools** @@ -37,10 +37,9 @@ Then, run `docker compose up -d` and [navigate to the UI](http://localhost:80 ![airbyte_kestra_CLI](../.gitbook/assets/airbyte_kestra_1.gif) - ## 2. Create a flow from the UI -Kestra UI provides a wide range of Blueprints to help you get started. +Kestra UI provides a wide range of Blueprints to help you get started. Navigate to Blueprints. Then type "Airbyte" in the search bar to find the desired integration. This way, you can easily accomplish fairly standardized data orchestration tasks, such as the following: @@ -56,12 +55,11 @@ Select a blueprint matching your use case and click "Use". ![airbyte_kestra_UI](../.gitbook/assets/airbyte_kestra_2.gif) - -Then, within the editor, adjust the connection ID and task names and click "Save". Finally, trigger your flow. +Then, within the editor, adjust the connection ID and task names and click "Save". Finally, trigger your flow. ## 3. Simple demo -Here is an example flow that triggers multiple Airbyte connections in parallel to sync data for multiple **Pokémon**. +Here is an example flow that triggers multiple Airbyte connections in parallel to sync data for multiple **Pokémon**. ```yaml id: airbyteSyncs @@ -92,7 +90,7 @@ taskDefaults: triggers: - id: everyMinute type: io.kestra.core.models.triggers.types.Schedule - cron: "*/1 * * * *" + cron: "*/1 * * * *" ``` ## Next steps diff --git a/docs/operator-guides/using-prefect-task.md b/docs/operator-guides/using-prefect-task.md index c7339306356..4b65e58972a 100644 --- a/docs/operator-guides/using-prefect-task.md +++ b/docs/operator-guides/using-prefect-task.md @@ -55,7 +55,7 @@ airbyte_conn = AirbyteConnectionTask( ) with Flow("first-airbyte-task") as flow: - flow.add_task(airbyte_conn) + flow.add_task(airbyte_conn) # Register the flow under the "airbyte" project flow.register(project_name="airbyte") @@ -63,10 +63,10 @@ flow.register(project_name="airbyte") The Airbyte Prefect Task accepts the following parameters: -* `airbyte_server_host`: The host URL to your Airbyte instance. -* `airbyte_server_post`: The port value you have selected for your Airbyte instance. -* `airbyte_api_version`: default value is `v1`. -* `connection_id`: The ID of the Airbyte Connection to be triggered by Prefect. +- `airbyte_server_host`: The host URL to your Airbyte instance. +- `airbyte_server_post`: The port value you have selected for your Airbyte instance. +- `airbyte_api_version`: default value is `v1`. +- `connection_id`: The ID of the Airbyte Connection to be triggered by Prefect. After running the file, `python3 airbyte_prefect_flow.py` this will register the Flow in Prefect Server. 
@@ -92,6 +92,7 @@ Don't be fooled by our simple example of only one Prefect Flow. Airbyte is a pow We love to hear any questions or feedback on our [Slack](https://slack.airbyte.io/). We're still in alpha, so if you see any rough edges or want to request a connector, feel free to create an issue on our [Github](https://github.com/airbytehq/airbyte) or thumbs up an existing issue. ## Related articles and guides + For additional information about using Prefect and Airbyte together, see the following: - [Build an e-commerce analytics stack with Airbyte, dbt, Prefect and BigQuery](https://github.com/airbytehq/quickstarts/tree/main/airbyte_dbt_prefect_bigquery) diff --git a/docs/operator-guides/using-the-airflow-airbyte-operator.md b/docs/operator-guides/using-the-airflow-airbyte-operator.md index 84831527f01..6b8886ab748 100644 --- a/docs/operator-guides/using-the-airflow-airbyte-operator.md +++ b/docs/operator-guides/using-the-airflow-airbyte-operator.md @@ -5,7 +5,7 @@ products: oss-* # Using the Airbyte Operator to orchestrate Airbyte OSS -Airbyte is an official community provider for the Apache Airflow project. The Airbyte operator allows you to trigger Airbyte OSS synchronization jobs from Apache Airflow, and this article will walk through configuring your Airflow DAG to do so. +Airbyte is an official community provider for the Apache Airflow project. The Airbyte operator allows you to trigger Airbyte OSS synchronization jobs from Apache Airflow, and this article will walk through configuring your Airflow DAG to do so. :::note @@ -84,11 +84,11 @@ with DAG(dag_id='trigger_airbyte_job_example', The Airbyte Airflow Operator accepts the following parameters: -* `airbyte_conn_id`: Name of the Airflow HTTP Connection pointing at the Airbyte API. Tells Airflow where the Airbyte API is located. -* `connection_id`: The ID of the Airbyte Connection to be triggered by Airflow. -* `asynchronous`: Determines how the Airbyte Operator executes. When true, Airflow will monitor the Airbyte Job using an **AirbyteJobSensor**. Default value is `false`. -* `timeout`: Maximum time Airflow will wait for the Airbyte job to complete. Only valid when `asynchronous=False`. Default value is `3600` seconds. -* `wait_seconds`: The amount of time to wait between checks. Only valid when `asynchronous=False`. Default value is `3` seconds. +- `airbyte_conn_id`: Name of the Airflow HTTP Connection pointing at the Airbyte API. Tells Airflow where the Airbyte API is located. +- `connection_id`: The ID of the Airbyte Connection to be triggered by Airflow. +- `asynchronous`: Determines how the Airbyte Operator executes. When true, Airflow will monitor the Airbyte Job using an **AirbyteJobSensor**. Default value is `false`. +- `timeout`: Maximum time Airflow will wait for the Airbyte job to complete. Only valid when `asynchronous=False`. Default value is `3600` seconds. +- `wait_seconds`: The amount of time to wait between checks. Only valid when `asynchronous=False`. Default value is `3` seconds. 
This code will produce the following simple DAG in the Airbyte UI: @@ -108,7 +108,7 @@ If your Airflow instance has limited resources and/or is under load, setting the from airflow import DAG from airflow.utils.dates import days_ago from airflow.providers.airbyte.operators.airbyte import AirbyteTriggerSyncOperator -from airflow.providers.airbyte.sensors.airbyte import AirbyteJobSensor +from airflow.providers.airbyte.sensors.airbyte import AirbyteJobSensor with DAG(dag_id='airbyte_trigger_job_example_async', default_args={'owner': 'airflow'}, @@ -139,6 +139,7 @@ Don't be fooled by our simple example of only one Airflow task. Airbyte is a pow We love to hear any questions or feedback on our [Slack](https://slack.airbyte.io/). We're still in alpha, so if you see any rough edges or want to request a connector, feel free to create an issue on our [Github](https://github.com/airbytehq/airbyte) or thumbs up an existing issue. ## Related articles and guides + For additional information about using the Airflow and Airbyte together, see the following: - [Using the new Airbyte API to orchestrate Airbyte Cloud with Airflow](https://airbyte.com/blog/orchestrating-airbyte-api-airbyte-cloud-airflow) diff --git a/docs/readme.md b/docs/readme.md index 82a9d79d6cc..17fe85ddc83 100644 --- a/docs/readme.md +++ b/docs/readme.md @@ -1,18 +1,20 @@ --- displayed_sidebar: docs --- + # Welcome to Airbyte Docs + ## What is Airbyte? -Airbyte is an open-source data movement infrastructure for building extract and load (EL) data pipelines. It is designed for versatility, scalability, and ease-of-use.  +Airbyte is an open-source data movement infrastructure for building extract and load (EL) data pipelines. It is designed for versatility, scalability, and ease-of-use. -There are three major components to know in Airbyte:  +There are three major components to know in Airbyte: 1. **The connector catalog** - * **350+ pre-built connectors**: Airbyte’s connector catalog comes “out-of-the-box” with over 350 pre-built connectors. These connectors can be used to start replicating data from a source to a destination in just a few minutes.  - * **No-Code Connector Builder**: You can easily extend Airbyte’s functionality to support your custom use cases through tools like the [No-Code Connector Builder](https://docs.airbyte.com/connector-development/connector-builder-ui/overview).  + - **350+ pre-built connectors**: Airbyte’s connector catalog comes “out-of-the-box” with over 350 pre-built connectors. These connectors can be used to start replicating data from a source to a destination in just a few minutes. + - **No-Code Connector Builder**: You can easily extend Airbyte’s functionality to support your custom use cases through tools like the [No-Code Connector Builder](https://docs.airbyte.com/connector-development/connector-builder-ui/overview). 2. **The platform:** Airbyte’s platform provides all the horizontal services required to configure and scale data movement operations, available as [cloud-managed](https://airbyte.com/product/airbyte-cloud) or [self-managed](https://airbyte.com/product/airbyte-enterprise). -3. **The user interface:** Airbyte features a UI, [**PyAirbyte**](https://docs.airbyte.com/pyairbyte) (Python library), [**API**](https://docs.airbyte.com/api-documentation), and [**Terraform Provider**](https://docs.airbyte.com/terraform-documentation) to integrate with your preferred tooling and approach to infrastructure management.  +3. 
**The user interface:** Airbyte features a UI, [**PyAirbyte**](https://docs.airbyte.com/pyairbyte) (Python library), [**API**](https://docs.airbyte.com/api-documentation), and [**Terraform Provider**](https://docs.airbyte.com/terraform-documentation) to integrate with your preferred tooling and approach to infrastructure management. Airbyte is suitable for a wide range of data integration use cases, including AI data infrastructure and EL(T) workloads. Airbyte is also [embeddable](https://airbyte.com/product/powered-by-airbyte) within your own application or platform to power your product. @@ -28,7 +30,7 @@ Next, check out the [step-by-step tutorial](/using-airbyte/getting-started) to s Browse the [connector catalog](/integrations/) to find the connector you want. If the connector is not yet supported on Airbyte Open Source, [build your own connector](/connector-development/). -Next, check out the [Airbyte Open Source QuickStart](/deploying-airbyte/local-deployment). Then learn how to [deploy](/deploying-airbyte/local-deployment) and [manage](/operator-guides/upgrading-airbyte) Airbyte Open Source in your cloud infrastructure. +Next, check out the [Airbyte Open Source QuickStart](/deploying-airbyte/local-deployment). Then learn how to [deploy](/deploying-airbyte/local-deployment) and [manage](/operator-guides/upgrading-airbyte) Airbyte Open Source in your cloud infrastructure. ## For Airbyte contributors diff --git a/docs/reference/README.md b/docs/reference/README.md index 4a938e09d06..cf5aa074165 100644 --- a/docs/reference/README.md +++ b/docs/reference/README.md @@ -1 +1 @@ -# Reference \ No newline at end of file +# Reference diff --git a/docs/reference/api/README.md b/docs/reference/api/README.md index 3600010fc1e..438a512bd9a 100644 --- a/docs/reference/api/README.md +++ b/docs/reference/api/README.md @@ -1,8 +1,8 @@ # API Documentation Folder -* `generated-api-html`: Plain HTML file automatically generated from the Airbyte OAS spec as part of the build. -* `api-documentation.md`: Markdown for API documentation Gitbook [page](https://docs.airbyte.com/api-documentation). -* `rapidoc-api-docs.html`: HTML for actual API Spec Documentation and linked to in the above Gitbook page. This is a S3 static website hosted out of +- `generated-api-html`: Plain HTML file automatically generated from the Airbyte OAS spec as part of the build. +- `api-documentation.md`: Markdown for API documentation Gitbook [page](https://docs.airbyte.com/api-documentation). +- `rapidoc-api-docs.html`: HTML for actual API Spec Documentation and linked to in the above Gitbook page. This is a S3 static website hosted out of the [`airbyte-public-api-docs bucket`](https://s3.console.aws.amazon.com/s3/buckets/airbyte-public-api-docs?region=us-east-2&tab=objects) with a [Cloudfront Distribution](https://console.aws.amazon.com/cloudfront/home?#distribution-settings:E35VD0IIC8YUEW) for SSL. This file points to the Airbyte OAS spec on Master and will automatically mirror spec changes. This file will need to be uploaded to the `airbyte-public-api-docs` bucket for any file changes to propagate. 
diff --git a/docs/release_notes/april_2023.md b/docs/release_notes/april_2023.md index 4a7dbf35312..4d5a302bbb9 100644 --- a/docs/release_notes/april_2023.md +++ b/docs/release_notes/april_2023.md @@ -1,4 +1,5 @@ # April 2023 + ## [airbyte v0.43.0](https://github.com/airbytehq/airbyte/releases/tag/v0.43.0) to [v0.44.3](https://github.com/airbytehq/airbyte/releases/tag/v0.44.3) This page includes new features and improvements to the Airbyte Cloud and Airbyte Open Source platforms. @@ -6,28 +7,29 @@ This page includes new features and improvements to the Airbyte Cloud and Airbyt ## **✨ New and improved features** - **New Sources and Promotions** - - 🎉 New Destination: SelectDB ([#20881](https://github.com/airbytehq/airbyte/pull/20881)) - - 🎉 Source Intercom: migrate from Python CDK to Declarative YAML (Low Code) ([#23013](https://github.com/airbytehq/airbyte/pull/23013)) - - 🎉 New Source: Azure Blob Storage (publish) ([#24767](https://github.com/airbytehq/airbyte/pull/24767)) + + - 🎉 New Destination: SelectDB ([#20881](https://github.com/airbytehq/airbyte/pull/20881)) + - 🎉 Source Intercom: migrate from Python CDK to Declarative YAML (Low Code) ([#23013](https://github.com/airbytehq/airbyte/pull/23013)) + - 🎉 New Source: Azure Blob Storage (publish) ([#24767](https://github.com/airbytehq/airbyte/pull/24767)) - **New Features for Existing Connectors** - - 🎉 Source TikTok Marketing - Add country_code and platform audience reports ([#22134](https://github.com/airbytehq/airbyte/pull/22134)) - - 🎉 Source Orb: Add invoices incremental stream ([#24737](https://github.com/airbytehq/airbyte/pull/24737)) - - 🎉 Source Sentry: add stream `releases` ([#24768](https://github.com/airbytehq/airbyte/pull/24768)) - - Source Klaviyo: adds stream Templates ([#23236](https://github.com/airbytehq/airbyte/pull/23236)) - - Source Hubspot: new stream Email Subscriptions ([#22910](https://github.com/airbytehq/airbyte/pull/22910)) + - 🎉 Source TikTok Marketing - Add country_code and platform audience reports ([#22134](https://github.com/airbytehq/airbyte/pull/22134)) + - 🎉 Source Orb: Add invoices incremental stream ([#24737](https://github.com/airbytehq/airbyte/pull/24737)) + - 🎉 Source Sentry: add stream `releases` ([#24768](https://github.com/airbytehq/airbyte/pull/24768)) + - Source Klaviyo: adds stream Templates ([#23236](https://github.com/airbytehq/airbyte/pull/23236)) + - Source Hubspot: new stream Email Subscriptions ([#22910](https://github.com/airbytehq/airbyte/pull/22910)) - **New Features in Airbyte Platform** - - 🎉 Connector builder: Add transformations (#5630) - - 🎉 Display per-stream error messages on stream-centric status page (#5793) - - 🎉 Validate security of OSS installations on setup (#5583) - - 🎉 Connector builder: Set default schema (#5813) - - 🎉 Connector builder error handler (#5637) - - 🎉 Connector builder: Create user input in new stream modal (#5812) - - 🎉 Connector builder: Better UI for cursor pagination (#6083) - - 🎉 Connector builder: User configurable list for list partition router (#6076) - - 🎉 Stream status page updates (#6099) - - 🎉 Connector builder: Better form for incremental sync (#6003) - - 🎉 Connector builder: Allow importing manifests with parameters in authenticator (#6213) + - 🎉 Connector builder: Add transformations (#5630) + - 🎉 Display per-stream error messages on stream-centric status page (#5793) + - 🎉 Validate security of OSS installations on setup (#5583) + - 🎉 Connector builder: Set default schema (#5813) + - 🎉 Connector builder error handler (#5637) + - 🎉 Connector 
builder: Create user input in new stream modal (#5812) + - 🎉 Connector builder: Better UI for cursor pagination (#6083) + - 🎉 Connector builder: User configurable list for list partition router (#6076) + - 🎉 Stream status page updates (#6099) + - 🎉 Connector builder: Better form for incremental sync (#6003) + - 🎉 Connector builder: Allow importing manifests with parameters in authenticator (#6213) ## **🐛 Bug fixes** @@ -38,4 +40,4 @@ This page includes new features and improvements to the Airbyte Cloud and Airbyt - 🐛 Fix query parameters in APIs (#5882) - 🐛 Date picker: Avoid time column text overflow (#6210) - 🐛 Connector Builder: avoid crash when loading builder if there is already data (#6155) -- 🐛 Connector builder: Allow changing user input key (#6167) \ No newline at end of file +- 🐛 Connector builder: Allow changing user input key (#6167) diff --git a/docs/release_notes/august_2022.md b/docs/release_notes/august_2022.md index 2a4325d7f5f..2cb8232bfdc 100644 --- a/docs/release_notes/august_2022.md +++ b/docs/release_notes/august_2022.md @@ -1,49 +1,53 @@ # August 2022 + ## Airbyte [v0.39.42-alpha](https://github.com/airbytehq/airbyte/releases/tag/v0.39.42-alpha) to [v0.40.3](https://github.com/airbytehq/airbyte/releases/tag/v0.40.3) This page includes new features and improvements to the Airbyte Cloud and Airbyte Open Source platforms. ### New features -* Added reserved keywords for schema names by fixing the quotation logic in normalization. [#14683](https://github.com/airbytehq/airbyte/pull/14683) -* Added [documentation](https://docs.airbyte.com/cloud/managing-airbyte-cloud/review-sync-summary) about the data displayed in sync log summaries. [#15181](https://github.com/airbytehq/airbyte/pull/15181) +- Added reserved keywords for schema names by fixing the quotation logic in normalization. [#14683](https://github.com/airbytehq/airbyte/pull/14683) -* Added OAuth login to Airbyte Cloud, which allows you to sign in using your Google login credentials. [#15414](https://github.com/airbytehq/airbyte/pull/15414) +- Added [documentation](https://docs.airbyte.com/cloud/managing-airbyte-cloud/review-sync-summary) about the data displayed in sync log summaries. [#15181](https://github.com/airbytehq/airbyte/pull/15181) - * You can use your Google login credentials to sign in to your Airbyte account if they share the same email address. +- Added OAuth login to Airbyte Cloud, which allows you to sign in using your Google login credentials. [#15414](https://github.com/airbytehq/airbyte/pull/15414) - * You can create a new Airbyte account with OAuth using your Google login credentials. + - You can use your Google login credentials to sign in to your Airbyte account if they share the same email address. - * You cannot use OAuth to log in if you are invited to join a workspace. + - You can create a new Airbyte account with OAuth using your Google login credentials. + + - You cannot use OAuth to log in if you are invited to join a workspace. ### Improvements -* Improved the Airbyte version naming conventions by removing the `-alpha` tag. The Airbyte platform is used successfully by thousands of users, so the `-alpha` tag is no longer necessary. [#15766](https://github.com/airbytehq/airbyte/pull/15766) -* Improved the `loadBalancerIP` in the web app by making it configurable. [#14992](https://github.com/airbytehq/airbyte/pull/14992) +- Improved the Airbyte version naming conventions by removing the `-alpha` tag. 
The Airbyte platform is used successfully by thousands of users, so the `-alpha` tag is no longer necessary. [#15766](https://github.com/airbytehq/airbyte/pull/15766) -* Datadog: +- Improved the `loadBalancerIP` in the web app by making it configurable. [#14992](https://github.com/airbytehq/airbyte/pull/14992) - * Improved the Airbyte platform by supporting StatsD, which sends Temporal metrics to Datadog. [#14842](https://github.com/airbytehq/airbyte/pull/14842) +- Datadog: - * Added Datadog tags to help you identify metrics between Airbyte instances. [#15213](https://github.com/airbytehq/airbyte/pull/15213) + - Improved the Airbyte platform by supporting StatsD, which sends Temporal metrics to Datadog. [#14842](https://github.com/airbytehq/airbyte/pull/14842) - * Added metric client tracking to record schema validation errors. [#13393](https://github.com/airbytehq/airbyte/pull/13393) + - Added Datadog tags to help you identify metrics between Airbyte instances. [#15213](https://github.com/airbytehq/airbyte/pull/15213) + + - Added metric client tracking to record schema validation errors. [#13393](https://github.com/airbytehq/airbyte/pull/13393) ### Bugs -* Fixed an issue where data types did not display correctly in the UI. The correct data types are now displayed in the streams of your connections. [#15558](https://github.com/airbytehq/airbyte/pull/15558) -* Fixed an issue where requests would fail during a release by adding a shutdown hook to the Airbyte server. This ensures the requests will be gracefully terminated before they can fail. [#15934](https://github.com/airbytehq/airbyte/pull/15934) +- Fixed an issue where data types did not display correctly in the UI. The correct data types are now displayed in the streams of your connections. [#15558](https://github.com/airbytehq/airbyte/pull/15558) -* Helm charts: +- Fixed an issue where requests would fail during a release by adding a shutdown hook to the Airbyte server. This ensures the requests will be gracefully terminated before they can fail. [#15934](https://github.com/airbytehq/airbyte/pull/15934) - * Fixed the deployment problems of the Helm chart with FluxCD by removing unconditional resource assignment in the chart for Temporal. [#15374](https://github.com/airbytehq/airbyte/pull/15374) +- Helm charts: - * Fixed the following issues in [#15199](https://github.com/airbytehq/airbyte/pull/15199): + - Fixed the deployment problems of the Helm chart with FluxCD by removing unconditional resource assignment in the chart for Temporal. [#15374](https://github.com/airbytehq/airbyte/pull/15374) - * Fixed an issue where `toyaml` was being used instead of `toYaml`, which caused Helm chart installation to fail. + - Fixed the following issues in [#15199](https://github.com/airbytehq/airbyte/pull/15199): - * Fixed incorrect `extraContainers` indentation, which caused Helm chart installation to fail if the value was supplied. + - Fixed an issue where `toyaml` was being used instead of `toYaml`, which caused Helm chart installation to fail. - * Fixed incorrect Postgres secret reference and made it more user friendly. + - Fixed incorrect `extraContainers` indentation, which caused Helm chart installation to fail if the value was supplied. - * Updated the method of looking up secrets and included an override feature to protect users from common mistakes. + - Fixed incorrect Postgres secret reference and made it more user friendly. 
+ + - Updated the method of looking up secrets and included an override feature to protect users from common mistakes. diff --git a/docs/release_notes/december_2022.md b/docs/release_notes/december_2022.md index 66d06e42903..c9f7cb2bbeb 100644 --- a/docs/release_notes/december_2022.md +++ b/docs/release_notes/december_2022.md @@ -1,30 +1,34 @@ # December 2022 + ## Airbyte [v0.40.24](https://github.com/airbytehq/airbyte/releases/tag/v0.40.24) to [v0.40.26](https://github.com/airbytehq/airbyte/releases/tag/v0.40.26) -This page includes new features and improvements to the Airbyte Cloud and Airbyte Open Source platforms. +This page includes new features and improvements to the Airbyte Cloud and Airbyte Open Source platforms. ### New features -* Added throughput metrics and a progress bar to the Connection Sync History UI for Airbyte Open Source. These provide real-time information on data transfer rates and sync progress. [#19193](https://github.com/airbytehq/airbyte/pull/19193) -* Added the custom connector UI in alpha to Airbyte Cloud, which allows you to create and update custom connectors. [#20483](https://github.com/airbytehq/airbyte/pull/20483) -* Added the stream details panel to the Connection Replication UI, which allows you to display and configure streams in your connection. [#19219](https://github.com/airbytehq/airbyte/pull/19219) - * Added source-defined **Cursor** and **Primary key** fields to the stream details panel. [#20366](https://github.com/airbytehq/airbyte/pull/20366) -* Added the UX flow for auto-detect schema changes. [#19226](https://github.com/airbytehq/airbyte/pull/19226) -* Added the auto-detect schema changes option to the Connection Replication UI, which allows you to choose whether Airbyte ignores or disables the connection when it detects a non-breaking schema change in the source. [#19734](https://github.com/airbytehq/airbyte/pull/19734) -* Added stream table configuration windows for Destination namespace and Stream name, which allow you to choose how the data is stored and edit the names and prefixes of tables in the destination. [#19713](https://github.com/airbytehq/airbyte/pull/19713) -* Added the AWS Secret Manager to Airbyte Open Source as an option for storing secrets. [#19690](https://github.com/airbytehq/airbyte/pull/19690) -* Added the [Airbyte Cloud API](http://reference.airbyte.com/) in alpha, which allows you to programmatically control Airbyte Cloud through an API. + +- Added throughput metrics and a progress bar to the Connection Sync History UI for Airbyte Open Source. These provide real-time information on data transfer rates and sync progress. [#19193](https://github.com/airbytehq/airbyte/pull/19193) +- Added the custom connector UI in alpha to Airbyte Cloud, which allows you to create and update custom connectors. [#20483](https://github.com/airbytehq/airbyte/pull/20483) +- Added the stream details panel to the Connection Replication UI, which allows you to display and configure streams in your connection. [#19219](https://github.com/airbytehq/airbyte/pull/19219) + - Added source-defined **Cursor** and **Primary key** fields to the stream details panel. [#20366](https://github.com/airbytehq/airbyte/pull/20366) +- Added the UX flow for auto-detect schema changes. [#19226](https://github.com/airbytehq/airbyte/pull/19226) +- Added the auto-detect schema changes option to the Connection Replication UI, which allows you to choose whether Airbyte ignores or disables the connection when it detects a non-breaking schema change in the source. 
[#19734](https://github.com/airbytehq/airbyte/pull/19734) +- Added stream table configuration windows for Destination namespace and Stream name, which allow you to choose how the data is stored and edit the names and prefixes of tables in the destination. [#19713](https://github.com/airbytehq/airbyte/pull/19713) +- Added the AWS Secret Manager to Airbyte Open Source as an option for storing secrets. [#19690](https://github.com/airbytehq/airbyte/pull/19690) +- Added the [Airbyte Cloud API](http://reference.airbyte.com/) in alpha, which allows you to programmatically control Airbyte Cloud through an API. ### Improvements -* Improved the Connection UX by preventing users from modifying an existing connection if there is a breaking change in the source schema. Now users must review changes before modifying the connection. [#20276](https://github.com/airbytehq/airbyte/pull/20276) -* Improved the stream catalog index by defining `stream`. This precaution keeps all streams matching correctly and data organized consistently. [#20443](https://github.com/airbytehq/airbyte/pull/20443) -* Updated the API to support column selection configuration in Airbyte Cloud. [#20259](https://github.com/airbytehq/airbyte/pull/20259) -* Ongoing improvements to Low-code CDK in alpha: - * Added `SessionTokenAuthenticator` for authentication management. [#19716](https://github.com/airbytehq/airbyte/pull/19716) - * Added the first iteration of the Configuration UI, which allows you to build connectors using forms instead of writing a YAML file. [#20008](https://github.com/airbytehq/airbyte/pull/20008) - * Added request options component to streams. You can now choose request options for streams in the connector builder. [#20497](https://github.com/airbytehq/airbyte/pull/20497) - * Fixed an issue where errors were not indicated properly by omitting individually touched fields in `useBuilderErrors`. [#20463](https://github.com/airbytehq/airbyte/pull/20463) - * Updated UI to match the current design, including UI text changes and the addition of the stream delete button. [#20464](https://github.com/airbytehq/airbyte/pull/20464) - * Upgraded Orval and updated the connector builder OpenAPI to pull the connector manifest schema directly into the API. [#20166](https://github.com/airbytehq/airbyte/pull/20166) + +- Improved the Connection UX by preventing users from modifying an existing connection if there is a breaking change in the source schema. Now users must review changes before modifying the connection. [#20276](https://github.com/airbytehq/airbyte/pull/20276) +- Improved the stream catalog index by defining `stream`. This precaution keeps all streams matching correctly and data organized consistently. [#20443](https://github.com/airbytehq/airbyte/pull/20443) +- Updated the API to support column selection configuration in Airbyte Cloud. [#20259](https://github.com/airbytehq/airbyte/pull/20259) +- Ongoing improvements to Low-code CDK in alpha: + - Added `SessionTokenAuthenticator` for authentication management. [#19716](https://github.com/airbytehq/airbyte/pull/19716) + - Added the first iteration of the Configuration UI, which allows you to build connectors using forms instead of writing a YAML file. [#20008](https://github.com/airbytehq/airbyte/pull/20008) + - Added request options component to streams. You can now choose request options for streams in the connector builder. 
[#20497](https://github.com/airbytehq/airbyte/pull/20497) + - Fixed an issue where errors were not indicated properly by omitting individually touched fields in `useBuilderErrors`. [#20463](https://github.com/airbytehq/airbyte/pull/20463) + - Updated UI to match the current design, including UI text changes and the addition of the stream delete button. [#20464](https://github.com/airbytehq/airbyte/pull/20464) + - Upgraded Orval and updated the connector builder OpenAPI to pull the connector manifest schema directly into the API. [#20166](https://github.com/airbytehq/airbyte/pull/20166) ## Bugs -* Fixed an issue where Airbyte Cloud would not properly load the values of normalization fields into the database by updating destination definitions. [#20573](https://github.com/airbytehq/airbyte/pull/20573) + +- Fixed an issue where Airbyte Cloud would not properly load the values of normalization fields into the database by updating destination definitions. [#20573](https://github.com/airbytehq/airbyte/pull/20573) diff --git a/docs/release_notes/december_2023.md b/docs/release_notes/december_2023.md index 3aa0ba4df2c..bfec7674ca7 100644 --- a/docs/release_notes/december_2023.md +++ b/docs/release_notes/december_2023.md @@ -1,4 +1,5 @@ # December 2023 + ## airbyte v0.50.36 to v0.50.40 This page includes new features and improvements to the Airbyte Cloud and Airbyte Open Source platforms. @@ -11,9 +12,8 @@ Airbyte introduced a new schemaless mode for our MongoDB source connector to imp In addition to our schemaless mode for MongoDB, we have also: - - Enhanced our [Bing Ads](https://github.com/airbytehq/airbyte/pull/33095) source by allowing for account-specific filtering and improved error handling. - - Enabled per-stream state for [MS SQL](https://github.com/airbytehq/airbyte/pull/33018) source to increase resiliency to stream changes. - - Published a new [OneDrive](https://github.com/airbytehq/airbyte/pull/32655) source connector to support additional unstructured data in files. - - Added streams for our [Hubspot](https://github.com/airbytehq/airbyte/pull/33266) source to add `property_history` for Companies and Deals. We also added incremental syncing for all property history streams for increased sync reliability. - - Improved our [Klaviyo](https://github.com/airbytehq/airbyte/pull/33099) source connector to account for rate-limiting and gracefully handle stream-specific errors to continue syncing other streams - +- Enhanced our [Bing Ads](https://github.com/airbytehq/airbyte/pull/33095) source by allowing for account-specific filtering and improved error handling. +- Enabled per-stream state for [MS SQL](https://github.com/airbytehq/airbyte/pull/33018) source to increase resiliency to stream changes. +- Published a new [OneDrive](https://github.com/airbytehq/airbyte/pull/32655) source connector to support additional unstructured data in files. +- Added streams for our [Hubspot](https://github.com/airbytehq/airbyte/pull/33266) source to add `property_history` for Companies and Deals. We also added incremental syncing for all property history streams for increased sync reliability. 
+- Improved our [Klaviyo](https://github.com/airbytehq/airbyte/pull/33099) source connector to account for rate-limiting and gracefully handle stream-specific errors to continue syncing other streams diff --git a/docs/release_notes/february_2023.md b/docs/release_notes/february_2023.md index c180f16dd4e..1427773e844 100644 --- a/docs/release_notes/february_2023.md +++ b/docs/release_notes/february_2023.md @@ -1,26 +1,28 @@ # February 2023 + ## [airbyte v0.41.0](https://github.com/airbytehq/airbyte/releases/tag/v0.41.0) and [airbyte-platform v0.41.0](https://github.com/airbytehq/airbyte-platform/releases/tag/v0.41.0) This page includes new features and improvements to the Airbyte Cloud and Airbyte Open Source platforms. ### Improvements -* Improved the Airbyte GitHub repository structure and processes by splitting the current repo into two repos, `airbytehq/airbyte` for connectors and `airbytehq/airbyte-platform` for platform code. - * Allows for isolated changes and improvements to the development workflow. - * Simplifies the deployment process both internally and externally. + +- Improved the Airbyte GitHub repository structure and processes by splitting the current repo into two repos, `airbytehq/airbyte` for connectors and `airbytehq/airbyte-platform` for platform code. + - Allows for isolated changes and improvements to the development workflow. + - Simplifies the deployment process both internally and externally. :::note -If you want to contribute to the Airbyte Open Source platform, you will need to switch to `airbytehq/airbyte-platform`. If you want to contribute to Airbyte connectors, continue using `airbytehq/airbyte`. +If you want to contribute to the Airbyte Open Source platform, you will need to switch to `airbytehq/airbyte-platform`. If you want to contribute to Airbyte connectors, continue using `airbytehq/airbyte`. ::: -* Improved low-code CDK to meet the quality and functionality requirements to be promoted to beta. [#22853](https://github.com/airbytehq/airbyte/pull/22853) -* Improved the [Airbyte API](https://api.airbyte.com/) by adding new endpoints: - * Create sources - * Create connections - * Create destinations - * List jobs (+ job status) - * Cancel jobs +- Improved low-code CDK to meet the quality and functionality requirements to be promoted to beta. [#22853](https://github.com/airbytehq/airbyte/pull/22853) +- Improved the [Airbyte API](https://api.airbyte.com/) by adding new endpoints: + - Create sources + - Create connections + - Create destinations + - List jobs (+ job status) + - Cancel jobs :::note @@ -28,4 +30,4 @@ The Airbyte API is now in beta. If you are interested in joining the beta progra ::: -* Improved Airbyte’s [cost estimator](https://cost.airbyte.com/) UI by redesigning the layout and enhancing the cost visualization for a better user experience. +- Improved Airbyte’s [cost estimator](https://cost.airbyte.com/) UI by redesigning the layout and enhancing the cost visualization for a better user experience. diff --git a/docs/release_notes/february_2024.md b/docs/release_notes/february_2024.md index 6c490d4c0f0..7083a48c22b 100644 --- a/docs/release_notes/february_2024.md +++ b/docs/release_notes/february_2024.md @@ -1,11 +1,12 @@ # February 2024 + ## airbyte v0.50.46 to v0.50.54 This page includes new features and improvements to the Airbyte Cloud and Airbyte Open Source platforms. 
## ✨ Highlights -Airbyte migrated our [Postgres destination](https://github.com/airbytehq/airbyte/pull/35042) to the [Destinations V2](./upgrading_to_destinations_v2) framework. This enables you to map tables one-to-one with your source, experience better error handling, and deliver data incrementally. +Airbyte migrated our [Postgres destination](https://github.com/airbytehq/airbyte/pull/35042) to the [Destinations V2](./upgrading_to_destinations_v2) framework. This enables you to map tables one-to-one with your source, experience better error handling, and deliver data incrementally. ## Platform Releases @@ -17,6 +18,6 @@ Airbyte migrated our [Postgres destination](https://github.com/airbytehq/airbyte In addition to our Postgres V2 destination, we also released a few notable Connector improvements: - - Our [Paypal source](https://github.com/airbytehq/airbyte/pull/34510) has been rigorously tested for bugs and now syncs new streams `Catalog Products`, `Disputes`, `Invoicing`, `Orders`, `Payments` and `Subscriptions`. - - [Chargebee](https://github.com/airbytehq/airbyte/pull/34053) source now syncs incrementally for `unbilled-charge`, `gift`, and `site_migration_detail` - - We launched [PyAirbyte](/using-airbyte/pyairbyte/getting-started.mdx), a new interface to use Airbyte connectors with for Python developers. +- Our [Paypal source](https://github.com/airbytehq/airbyte/pull/34510) has been rigorously tested for bugs and now syncs new streams `Catalog Products`, `Disputes`, `Invoicing`, `Orders`, `Payments` and `Subscriptions`. +- [Chargebee](https://github.com/airbytehq/airbyte/pull/34053) source now syncs incrementally for `unbilled-charge`, `gift`, and `site_migration_detail` +- We launched [PyAirbyte](/using-airbyte/pyairbyte/getting-started.mdx), a new interface to use Airbyte connectors with for Python developers. diff --git a/docs/release_notes/january_2023.md b/docs/release_notes/january_2023.md index b3d8122a73f..a6a0ae22d82 100644 --- a/docs/release_notes/january_2023.md +++ b/docs/release_notes/january_2023.md @@ -1,23 +1,27 @@ # January 2023 + ## Airbyte [v0.40.27](https://github.com/airbytehq/airbyte/releases/tag/v0.40.27) to [v0.40.32](https://github.com/airbytehq/airbyte/releases/tag/v0.40.32) -This page includes new features and improvements to the Airbyte Cloud and Airbyte Open Source platforms. +This page includes new features and improvements to the Airbyte Cloud and Airbyte Open Source platforms. ### New features -* Added the [Free Connector Program](https://docs.airbyte.com/cloud/managing-airbyte-cloud/manage-credits#enroll-in-the-free-connector-program) to Airbyte Cloud, allowing you to sync connections with alpha or beta connectors for free. + +- Added the [Free Connector Program](https://docs.airbyte.com/cloud/managing-airbyte-cloud/manage-credits#enroll-in-the-free-connector-program) to Airbyte Cloud, allowing you to sync connections with alpha or beta connectors for free. ### Improvements -* Improved Airbyte Open Source by integrating [Docker Compose V2](https://docs.docker.com/compose/compose-v2/). You must have Docker Compose V2 [installed](https://docs.docker.com/compose/install/) before upgrading to Airbyte version 0.42.0 or later. [#19321](https://github.com/airbytehq/airbyte/pull/19321) -* Improved the Airbyte Cloud UI by displaying the **Credits** label in the sidebar and low-credit alerts on the Credits page. 
[#20595](https://github.com/airbytehq/airbyte/pull/20595) -* Improved the Airbyte CI workflow by adding support to pull requests and limiting the CI runs to only occur on pushes to the master branch. This enhances collaboration with external contributors and reduces unnecessary runs. [#21266](https://github.com/airbytehq/airbyte/pull/21266) -* Improved the connector form by using proper validation in the array section. [#20725](https://github.com/airbytehq/airbyte/pull/20725) -* Ongoing improvements to the [Connector Builder UI](https://docs.airbyte.com/connector-development/config-based/connector-builder-ui/?_ga=2.261393869.1948366377.1675105348-1616004530.1663010260) in alpha: - * Added support for substream slicers and cartesian slicers, allowing the Connector Builder to create substreams and new streams from multiple existing streams. [#20861](https://github.com/airbytehq/airbyte/pull/20861) - * Added support for in-schema specification and validation, including a manual schema option. [#20862](https://github.com/airbytehq/airbyte/pull/20862) - * Added user inputs, request options, authentication, pagination, and slicing to the Connector Builder UI. [#20809](https://github.com/airbytehq/airbyte/pull/20809) - * Added ability to convert from YAML manifest to UI form values. [#21142](https://github.com/airbytehq/airbyte/pull/21142) - * Improved the Connector Builder’s conversion of YAML manifest to UI form values by resolving references and options in the manifest. The Connector Builder Server API has been updated with a new endpoint for resolving the manifest, which is now utilized by the conversion function. [#21898](https://github.com/airbytehq/airbyte/pull/21898) + +- Improved Airbyte Open Source by integrating [Docker Compose V2](https://docs.docker.com/compose/compose-v2/). You must have Docker Compose V2 [installed](https://docs.docker.com/compose/install/) before upgrading to Airbyte version 0.42.0 or later. [#19321](https://github.com/airbytehq/airbyte/pull/19321) +- Improved the Airbyte Cloud UI by displaying the **Credits** label in the sidebar and low-credit alerts on the Credits page. [#20595](https://github.com/airbytehq/airbyte/pull/20595) +- Improved the Airbyte CI workflow by adding support to pull requests and limiting the CI runs to only occur on pushes to the master branch. This enhances collaboration with external contributors and reduces unnecessary runs. [#21266](https://github.com/airbytehq/airbyte/pull/21266) +- Improved the connector form by using proper validation in the array section. [#20725](https://github.com/airbytehq/airbyte/pull/20725) +- Ongoing improvements to the [Connector Builder UI](https://docs.airbyte.com/connector-development/config-based/connector-builder-ui/?_ga=2.261393869.1948366377.1675105348-1616004530.1663010260) in alpha: + - Added support for substream slicers and cartesian slicers, allowing the Connector Builder to create substreams and new streams from multiple existing streams. [#20861](https://github.com/airbytehq/airbyte/pull/20861) + - Added support for in-schema specification and validation, including a manual schema option. [#20862](https://github.com/airbytehq/airbyte/pull/20862) + - Added user inputs, request options, authentication, pagination, and slicing to the Connector Builder UI. [#20809](https://github.com/airbytehq/airbyte/pull/20809) + - Added ability to convert from YAML manifest to UI form values. 
[#21142](https://github.com/airbytehq/airbyte/pull/21142) + - Improved the Connector Builder’s conversion of YAML manifest to UI form values by resolving references and options in the manifest. The Connector Builder Server API has been updated with a new endpoint for resolving the manifest, which is now utilized by the conversion function. [#21898](https://github.com/airbytehq/airbyte/pull/21898) # Bugs -* Fixed an issue where the checkboxes in the stream table would collapse and updated icons to match the new design. [#21108](https://github.com/airbytehq/airbyte/pull/21108) -* Fixed issues with non-breaking schema changes by adding an i18n string, ensuring supported options are rendered, and fixing a custom styling issue when resizing. [#20625](https://github.com/airbytehq/airbyte/pull/20625) + +- Fixed an issue where the checkboxes in the stream table would collapse and updated icons to match the new design. [#21108](https://github.com/airbytehq/airbyte/pull/21108) +- Fixed issues with non-breaking schema changes by adding an i18n string, ensuring supported options are rendered, and fixing a custom styling issue when resizing. [#20625](https://github.com/airbytehq/airbyte/pull/20625) diff --git a/docs/release_notes/january_2024.md b/docs/release_notes/january_2024.md index 95be1d2b5c7..86b3ff563d1 100644 --- a/docs/release_notes/january_2024.md +++ b/docs/release_notes/january_2024.md @@ -1,18 +1,18 @@ # January 2024 + ## airbyte v0.50.41 to v0.50.45 This page includes new features and improvements to the Airbyte Cloud and Airbyte Open Source platforms. ## ✨ Highlights -Airbyte migrated our [Redshift destination](https://github.com/airbytehq/airbyte/pull/34077) on the [Destinations V2](./upgrading_to_destinations_v2) framework. This enables you to map tables one-to-one with your source, experience better error handling, and deliver data incrementally. +Airbyte migrated our [Redshift destination](https://github.com/airbytehq/airbyte/pull/34077) on the [Destinations V2](./upgrading_to_destinations_v2) framework. This enables you to map tables one-to-one with your source, experience better error handling, and deliver data incrementally. ## Connector Improvements In addition to our Redshift V2 destination, we also released a few notable Connector improvements: - - Our S3 Source now supports [IAM role-based authentication](https://github.com/airbytehq/airbyte/pull/33818), allowing users to utilize IAM roles for more granular control over permissions and to eliminate the need for managing static access keys. - - Our [Salesforce](https://github.com/airbytehq/airbyte/issues/30819) source now supports syncing the object ContentDocumentLink, which enables reporting for files within Content Documents. - - [OneDrive](https://docs.airbyte.com/integrations/sources/microsoft-onedrive) and [Sharepoint](https://github.com/airbytehq/airbyte/pull/33537) are now offered as a source from which to connect your files. - - Stripe and Salesforce are enabled to run [concurrently](https://github.com/airbytehq/airbyte/pull/34454) with full refresh with 4x speed - +- Our S3 Source now supports [IAM role-based authentication](https://github.com/airbytehq/airbyte/pull/33818), allowing users to utilize IAM roles for more granular control over permissions and to eliminate the need for managing static access keys. +- Our [Salesforce](https://github.com/airbytehq/airbyte/issues/30819) source now supports syncing the object ContentDocumentLink, which enables reporting for files within Content Documents. 
+- [OneDrive](https://docs.airbyte.com/integrations/sources/microsoft-onedrive) and [Sharepoint](https://github.com/airbytehq/airbyte/pull/33537) are now offered as a source from which to connect your files. +- Stripe and Salesforce are enabled to run [concurrently](https://github.com/airbytehq/airbyte/pull/34454) with full refresh with 4x speed diff --git a/docs/release_notes/july_2022.md b/docs/release_notes/july_2022.md index c3a4c8240b2..1ae9d2a7c9c 100644 --- a/docs/release_notes/july_2022.md +++ b/docs/release_notes/july_2022.md @@ -1,49 +1,52 @@ # July 2022 -## Airbyte [v0.39.27-alpha](https://github.com/airbytehq/airbyte/releases/tag/v0.39.27-alpha) to [v0.39.41-alpha](https://github.com/airbytehq/airbyte/releases/tag/v0.39.41-alpha) -This page includes new features and improvements to the Airbyte Cloud and Airbyte Open Source platforms. +## Airbyte [v0.39.27-alpha](https://github.com/airbytehq/airbyte/releases/tag/v0.39.27-alpha) to [v0.39.41-alpha](https://github.com/airbytehq/airbyte/releases/tag/v0.39.41-alpha) + +This page includes new features and improvements to the Airbyte Cloud and Airbyte Open Source platforms. ### New features -* Added per-stream state to the Airbyte Cloud and OSS platforms. Per-stream state currently includes per-stream resets and connection states, and it lays the groundwork for auto-detecting schema changes, parallel syncs, and more. - * The [new flow](https://docs.airbyte.com/cloud/managing-airbyte-cloud/edit-stream-configuration) gives you the option to refresh streams when saving changes to a connection. [#14634](https://github.com/airbytehq/airbyte/pull/14634) +- Added per-stream state to the Airbyte Cloud and OSS platforms. Per-stream state currently includes per-stream resets and connection states, and it lays the groundwork for auto-detecting schema changes, parallel syncs, and more. - * Per-stream reset functionality is now available for connections with a Postgres source. Per-stream resets allow you to reset only the affected streams when saving an edited connection, instead of resetting all streams in a connection. [#14634](https://github.com/airbytehq/airbyte/pull/14634) + - The [new flow](https://docs.airbyte.com/cloud/managing-airbyte-cloud/edit-stream-configuration) gives you the option to refresh streams when saving changes to a connection. [#14634](https://github.com/airbytehq/airbyte/pull/14634) - * For connections with a Postgres source, the state of the connection to the source is displayed in the Connection State. [#15020](https://github.com/airbytehq/airbyte/pull/15020) + - Per-stream reset functionality is now available for connections with a Postgres source. Per-stream resets allow you to reset only the affected streams when saving an edited connection, instead of resetting all streams in a connection. [#14634](https://github.com/airbytehq/airbyte/pull/14634) - * For Airbyte Open Source users: - * If you are using the [Postgres](https://docs.airbyte.com/integrations/sources/postgres) source connector, upgrade your Airbyte platform to version v0.40.0-alpha or newer and [upgrade](https://docs.airbyte.com/operator-guides/upgrading-airbyte/) your AzureBlobStorage connector to version 0.1.6 or newer. [#15008](https://github.com/airbytehq/airbyte/pull/15008) + - For connections with a Postgres source, the state of the connection to the source is displayed in the Connection State. [#15020](https://github.com/airbytehq/airbyte/pull/15020) -* Added `airbyte_type` to normalization. 
This displays whether `timestamp` and `time` have an associated time zone. [#13591](https://github.com/airbytehq/airbyte/pull/13591) + - For Airbyte Open Source users: + - If you are using the [Postgres](https://docs.airbyte.com/integrations/sources/postgres) source connector, upgrade your Airbyte platform to version v0.40.0-alpha or newer and [upgrade](https://docs.airbyte.com/operator-guides/upgrading-airbyte/) your AzureBlobStorage connector to version 0.1.6 or newer. [#15008](https://github.com/airbytehq/airbyte/pull/15008) -* Airbyte is currently developing a low-code connector builder, which allows you to easily create new source and destination connectors in your workspace. [#14402](https://github.com/airbytehq/airbyte/pull/14402) [#14317](https://github.com/airbytehq/airbyte/pull/14317) [#14288](https://github.com/airbytehq/airbyte/pull/14288) [#14004](https://github.com/airbytehq/airbyte/pull/14004) +- Added `airbyte_type` to normalization. This displays whether `timestamp` and `time` have an associated time zone. [#13591](https://github.com/airbytehq/airbyte/pull/13591) -* Added [documentation](/using-airbyte/workspaces.md#single-workspace-vs-multiple-workspaces) about the benefits and considerations of having a single workspace vs. multiple workspaces in Airbyte Cloud. [#14608](https://github.com/airbytehq/airbyte/pull/14608) +- Airbyte is currently developing a low-code connector builder, which allows you to easily create new source and destination connectors in your workspace. [#14402](https://github.com/airbytehq/airbyte/pull/14402) [#14317](https://github.com/airbytehq/airbyte/pull/14317) [#14288](https://github.com/airbytehq/airbyte/pull/14288) [#14004](https://github.com/airbytehq/airbyte/pull/14004) + +- Added [documentation](/using-airbyte/workspaces.md#single-workspace-vs-multiple-workspaces) about the benefits and considerations of having a single workspace vs. multiple workspaces in Airbyte Cloud. [#14608](https://github.com/airbytehq/airbyte/pull/14608) ### Improvements -* Improved platform security by using Docker images from the latest version of OpenJDK (openjdk:19-slim-bullseye). [#14971](https://github.com/airbytehq/airbyte/pull/14971) -* Improved Airbyte Open Source self-hosting by refactoring and publishing Helm charts according to best practices as we prepare to formally support Helm deployments. [#14794](https://github.com/airbytehq/airbyte/pull/14794) +- Improved platform security by using Docker images from the latest version of OpenJDK (openjdk:19-slim-bullseye). [#14971](https://github.com/airbytehq/airbyte/pull/14971) -* Improved Airbyte Open Source by supporting the OpenTelemetry (OTEL) Collector. Airbyte Open Source now sends telemetry data to the OTEL collector, and we included a set of [recommended metrics](https://docs.airbyte.com/operator-guides/scaling-airbyte/#metrics) to export to OTEL when running Airbyte Open Source at scale. [#12908](https://github.com/airbytehq/airbyte/issues/12908) +- Improved Airbyte Open Source self-hosting by refactoring and publishing Helm charts according to best practices as we prepare to formally support Helm deployments. [#14794](https://github.com/airbytehq/airbyte/pull/14794) -* Improved the [Airbyte Connector Development Kit (CDK)](https://airbyte.com/connector-development-kit) by enabling detailed bug logs from the command line. In addition to the preset CDK debug logs, you can also create custom debug statements and display custom debug logs in the command line. 
[#14521](https://github.com/airbytehq/airbyte/pull/14521) +- Improved Airbyte Open Source by supporting the OpenTelemetry (OTEL) Collector. Airbyte Open Source now sends telemetry data to the OTEL collector, and we included a set of [recommended metrics](https://docs.airbyte.com/operator-guides/scaling-airbyte/#metrics) to export to OTEL when running Airbyte Open Source at scale. [#12908](https://github.com/airbytehq/airbyte/issues/12908) -* Improved CDK by supporting a schema generator tool. [#13518](https://github.com/airbytehq/airbyte/pull/13518) +- Improved the [Airbyte Connector Development Kit (CDK)](https://airbyte.com/connector-development-kit) by enabling detailed bug logs from the command line. In addition to the preset CDK debug logs, you can also create custom debug statements and display custom debug logs in the command line. [#14521](https://github.com/airbytehq/airbyte/pull/14521) -* Improved [documentation](https://docs.airbyte.com/contributing-to-airbyte/developing-locally#connector) about contributing locally by adding information on formatting connectors. [#14661](https://github.com/airbytehq/airbyte/pull/14661) +- Improved CDK by supporting a schema generator tool. [#13518](https://github.com/airbytehq/airbyte/pull/13518) -* Improved [Octavia CLI](https://github.com/airbytehq/airbyte/tree/master/octavia-cli#-octavia-cli) so you can now: +- Improved [documentation](https://docs.airbyte.com/contributing-to-airbyte/developing-locally#connector) about contributing locally by adding information on formatting connectors. [#14661](https://github.com/airbytehq/airbyte/pull/14661) - * Switch between Airbyte instances and deploy the same configurations on multiple instances. [#13070](https://github.com/airbytehq/airbyte/pull/13070) [#13748](https://github.com/airbytehq/airbyte/issues/13748) +- Improved [Octavia CLI](https://github.com/airbytehq/airbyte/tree/master/octavia-cli#-octavia-cli) so you can now: - * Enable normalization or custom DBT transformation from YAML configurations. [#10973](https://github.com/airbytehq/airbyte/issues/10973) + - Switch between Airbyte instances and deploy the same configurations on multiple instances. [#13070](https://github.com/airbytehq/airbyte/pull/13070) [#13748](https://github.com/airbytehq/airbyte/issues/13748) - * Set custom HTTP headers on requests made to the Airbyte server. You can use CLI If you have instances secured with basic access authentication or identity-aware proxy (IAP). This lays the groundwork for making the CLI compatible with Airbyte Cloud once we release the public API. [#13770](https://github.com/airbytehq/airbyte/issues/13770) + - Enable normalization or custom DBT transformation from YAML configurations. [#10973](https://github.com/airbytehq/airbyte/issues/10973) - * Import existing remote resources to a local Octavia project with `octavia import`. [#14291](https://github.com/airbytehq/airbyte/issues/14291) + - Set custom HTTP headers on requests made to the Airbyte server. You can use CLI If you have instances secured with basic access authentication or identity-aware proxy (IAP). This lays the groundwork for making the CLI compatible with Airbyte Cloud once we release the public API. [#13770](https://github.com/airbytehq/airbyte/issues/13770) - * Use the `get` command to get existing configurations for sources, destinations, and connections. [#13254](https://github.com/airbytehq/airbyte/pull/13254) + - Import existing remote resources to a local Octavia project with `octavia import`. 
[#14291](https://github.com/airbytehq/airbyte/issues/14291) - * Retrieve the JSON configuration using `octavia get`, which is useful for some scripting and orchestration use cases. [#13254](https://github.com/airbytehq/airbyte/pull/13254) + - Use the `get` command to get existing configurations for sources, destinations, and connections. [#13254](https://github.com/airbytehq/airbyte/pull/13254) + + - Retrieve the JSON configuration using `octavia get`, which is useful for some scripting and orchestration use cases. [#13254](https://github.com/airbytehq/airbyte/pull/13254) diff --git a/docs/release_notes/july_2023.md b/docs/release_notes/july_2023.md index f4f80e83539..de3fe514d74 100644 --- a/docs/release_notes/july_2023.md +++ b/docs/release_notes/july_2023.md @@ -1,4 +1,5 @@ # July 2023 + ## airbyte v0.50.6 to v0.50.11 This page includes new features and improvements to the Airbyte Cloud and Airbyte Open Source platforms. @@ -15,4 +16,4 @@ This page includes new features and improvements to the Airbyte Cloud and Airbyt ## **🐛 Bug fixes** -- Our commitment to delivering a bug-free experience is unwavering. July saw us addressing a myriad of issues across our platform. We've rectified the **[custom connector creation flow](https://chat.openai.com/c/e3dcdfa7-a2d3-46b5-9976-2bb866e1bb2a#8018)**, ensuring a smoother user experience. Several sources, including **[Square](https://github.com/airbytehq/airbyte/pull/27762)** and **[Greenhouse](https://github.com/airbytehq/airbyte/pull/27773)**, have been updated following state management changes in the CDK. We've also tackled specific issues in connectors like **[Google Ads](https://github.com/airbytehq/airbyte/pull/27711)** and **[Datadog](https://github.com/airbytehq/airbyte/pull/27784)**, ensuring they function optimally. \ No newline at end of file +- Our commitment to delivering a bug-free experience is unwavering. July saw us addressing a myriad of issues across our platform. We've rectified the **[custom connector creation flow](https://chat.openai.com/c/e3dcdfa7-a2d3-46b5-9976-2bb866e1bb2a#8018)**, ensuring a smoother user experience. Several sources, including **[Square](https://github.com/airbytehq/airbyte/pull/27762)** and **[Greenhouse](https://github.com/airbytehq/airbyte/pull/27773)**, have been updated following state management changes in the CDK. We've also tackled specific issues in connectors like **[Google Ads](https://github.com/airbytehq/airbyte/pull/27711)** and **[Datadog](https://github.com/airbytehq/airbyte/pull/27784)**, ensuring they function optimally. diff --git a/docs/release_notes/june_2023.md b/docs/release_notes/june_2023.md index 99ca0ca7ac6..e0c92ff4eda 100644 --- a/docs/release_notes/june_2023.md +++ b/docs/release_notes/june_2023.md @@ -1,4 +1,5 @@ # June 2023 + ## airbyte v0.44.12 to v0.50.5 This page includes new features and improvements to the Airbyte Cloud and Airbyte Open Source platforms. @@ -20,4 +21,4 @@ We've addressed various bugs for smoother user experience: - Fixed **`data_state`** config typo in **[Source Google Search Console](https://github.com/airbytehq/airbyte/pull/27307)** - Addressed issues with **[Source Amazon Seller Partner](https://github.com/airbytehq/airbyte/pull/27110)**, **[Facebook Marketing](https://github.com/airbytehq/airbyte/pull/27201)**, **[Quickbooks](https://github.com/airbytehq/airbyte/pull/27148)**, **[Smartsheets](https://github.com/airbytehq/airbyte/pull/27096)**, and others. 
-We've also made significant improvements to our connector builder, including reloading diff view on stream change (**[#6974](https://github.com/airbytehq/airbyte/pull/6974)**) \ No newline at end of file +We've also made significant improvements to our connector builder, including reloading diff view on stream change (**[#6974](https://github.com/airbytehq/airbyte/pull/6974)**) diff --git a/docs/release_notes/march_2023.md b/docs/release_notes/march_2023.md index fc89d8b715e..3b605f06428 100644 --- a/docs/release_notes/march_2023.md +++ b/docs/release_notes/march_2023.md @@ -1,4 +1,5 @@ # March 2023 + ## [airbyte v0.42.0](https://github.com/airbytehq/airbyte/releases/tag/v0.42.0) to [v0.42.1](https://github.com/airbytehq/airbyte/releases/tag/v0.42.1) This page includes new features and improvements to the Airbyte Cloud and Airbyte Open Source platforms. @@ -6,47 +7,49 @@ This page includes new features and improvements to the Airbyte Cloud and Airbyt ## **✨ New and improved features** - **New Sources and Promotions** - - 🎉 New Source: [Unleash](https://docs.airbyte.com/integrations/sources/unleash) [low-code CDK] ([#19923](https://github.com/airbytehq/airbyte/pull/19923)) - - 🎉 Source [Twitter](https://docs.airbyte.com/integrations/sources/twitter): to Alpha and in Cloud ([#23832](https://github.com/airbytehq/airbyte/pull/23832)) - - 🎉 Source [Confluence](https://docs.airbyte.com/integrations/sources/confluence): Enabled in cloud and now in Beta ([#23775](https://github.com/airbytehq/airbyte/pull/23775)) - - 🎉 Source [Airtable](https://docs.airbyte.com/integrations/sources/airtable): to GA ([#23763](https://github.com/airbytehq/airbyte/pull/23763)) - - 🎉 Source [Paystack](https://docs.airbyte.com/integrations/sources/paystack): in Cloud - - 🎉 Source [Google Analytics 4](https://docs.airbyte.com/integrations/sources/google-analytics-data-api): to GA - - 🎉 Source [Strava](https://docs.airbyte.com/integrations/sources/strava): to Beta - - 🎉 Source [GCS](https://docs.airbyte.com/integrations/sources/gcs): in Cloud - - 🎉 Source [ZohoCRM](https://docs.airbyte.com/integrations/sources/zoho-crm): to Alpha and in Cloud - - 🎉 Source [Yandex Metrica](https://docs.airbyte.com/integrations/sources/yandex-metrica): to Beta and in Cloud - - 🎉 Source [Salesloft](https://docs.airbyte.com/integrations/sources/salesloft/): to Alpha and in Cloud - - 🎉 Source [Xero](https://docs.airbyte.com/integrations/sources/xero/): to Beta and in Cloud - - 🎉 Source [Trello](https://docs.airbyte.com/integrations/sources/trello/): to Beta - - 🎉 Source [Paystack](https://docs.airbyte.com/integrations/sources/paystack/): to Beta and in Cloud - - 🎉 Source Trustpilot: in Cloud - - 🎉 Source [LinkedIn Pages](https://docs.airbyte.com/integrations/sources/linkedin-pages): in Cloud - - 🎉 Source [Pipedrive](https://docs.airbyte.com/integrations/sources/pipedrive): to Beta and in Cloud ([#23539](https://github.com/airbytehq/airbyte/pull/23539)) - - 🎉 Source [Chargebee](https://docs.airbyte.com/integrations/sources/chargebee): Migrate to YAML ([#21688](https://github.com/airbytehq/airbyte/pull/21688)) + + - 🎉 New Source: [Unleash](https://docs.airbyte.com/integrations/sources/unleash) [low-code CDK] ([#19923](https://github.com/airbytehq/airbyte/pull/19923)) + - 🎉 Source [Twitter](https://docs.airbyte.com/integrations/sources/twitter): to Alpha and in Cloud ([#23832](https://github.com/airbytehq/airbyte/pull/23832)) + - 🎉 Source [Confluence](https://docs.airbyte.com/integrations/sources/confluence): Enabled in cloud and now in Beta 
([#23775](https://github.com/airbytehq/airbyte/pull/23775)) + - 🎉 Source [Airtable](https://docs.airbyte.com/integrations/sources/airtable): to GA ([#23763](https://github.com/airbytehq/airbyte/pull/23763)) + - 🎉 Source [Paystack](https://docs.airbyte.com/integrations/sources/paystack): in Cloud + - 🎉 Source [Google Analytics 4](https://docs.airbyte.com/integrations/sources/google-analytics-data-api): to GA + - 🎉 Source [Strava](https://docs.airbyte.com/integrations/sources/strava): to Beta + - 🎉 Source [GCS](https://docs.airbyte.com/integrations/sources/gcs): in Cloud + - 🎉 Source [ZohoCRM](https://docs.airbyte.com/integrations/sources/zoho-crm): to Alpha and in Cloud + - 🎉 Source [Yandex Metrica](https://docs.airbyte.com/integrations/sources/yandex-metrica): to Beta and in Cloud + - 🎉 Source [Salesloft](https://docs.airbyte.com/integrations/sources/salesloft/): to Alpha and in Cloud + - 🎉 Source [Xero](https://docs.airbyte.com/integrations/sources/xero/): to Beta and in Cloud + - 🎉 Source [Trello](https://docs.airbyte.com/integrations/sources/trello/): to Beta + - 🎉 Source [Paystack](https://docs.airbyte.com/integrations/sources/paystack/): to Beta and in Cloud + - 🎉 Source Trustpilot: in Cloud + - 🎉 Source [LinkedIn Pages](https://docs.airbyte.com/integrations/sources/linkedin-pages): in Cloud + - 🎉 Source [Pipedrive](https://docs.airbyte.com/integrations/sources/pipedrive): to Beta and in Cloud ([#23539](https://github.com/airbytehq/airbyte/pull/23539)) + - 🎉 Source [Chargebee](https://docs.airbyte.com/integrations/sources/chargebee): Migrate to YAML ([#21688](https://github.com/airbytehq/airbyte/pull/21688)) - **New Features for Existing Connectors** - - Redshift Destination: Add SSH Tunnelling Config Option ([#23523](https://github.com/airbytehq/airbyte/pull/23523)) - - 🎉 Source Amazon Seller Partner - Implement reportOptions for all missing reports ([#23606](https://github.com/airbytehq/airbyte/pull/23606)) - - Source Tiktok: allow to filter advertiser in reports ([#23377](https://github.com/airbytehq/airbyte/pull/23377)) - - 🎉 Source Github - added user friendly messages, added AirbyteTracedException config_error ([#23467](https://github.com/airbytehq/airbyte/pull/23467)) - - 🎉 Destination Weaviate: Support any string based ID and fix issues with additionalProperties ([#22527](https://github.com/airbytehq/airbyte/pull/22527)) + + - Redshift Destination: Add SSH Tunnelling Config Option ([#23523](https://github.com/airbytehq/airbyte/pull/23523)) + - 🎉 Source Amazon Seller Partner - Implement reportOptions for all missing reports ([#23606](https://github.com/airbytehq/airbyte/pull/23606)) + - Source Tiktok: allow to filter advertiser in reports ([#23377](https://github.com/airbytehq/airbyte/pull/23377)) + - 🎉 Source Github - added user friendly messages, added AirbyteTracedException config_error ([#23467](https://github.com/airbytehq/airbyte/pull/23467)) + - 🎉 Destination Weaviate: Support any string based ID and fix issues with additionalProperties ([#22527](https://github.com/airbytehq/airbyte/pull/22527)) - **New Features in Airbyte Platform** - - 🎉 octavia-cli: add pypi package workflow ([#22654](https://github.com/airbytehq/airbyte/pull/22654)) - - 🪟🎉 Connector builder projects UI (#4774) - - 🎉 Add stream syncing or resetting state to rows (#5364) + - 🎉 octavia-cli: add pypi package workflow ([#22654](https://github.com/airbytehq/airbyte/pull/22654)) + - 🪟🎉 Connector builder projects UI (#4774) + - 🎉 Add stream syncing or resetting state to rows (#5364) ## **🐛 Bug fixes** - 🐛 
Source Delighted: fix `Date Since` - date-format bug in UI ([#23909](https://github.com/airbytehq/airbyte/pull/23909)) +  date-format bug in UI ([#23909](https://github.com/airbytehq/airbyte/pull/23909)) - 🐛 Source Iterable: add retry for 500 - Generic Error, increase `reduce slice max attempts` - ([#23821](https://github.com/airbytehq/airbyte/pull/23821)) +  ([#23821](https://github.com/airbytehq/airbyte/pull/23821)) - 🐛 Source S3: Make `Advanced Reader Options`and `Advanced Options`truly `Optional`([#23669](https://github.com/airbytehq/airbyte/pull/23669)) - Source Jira: Small fix in the board stream ([#21524](https://github.com/airbytehq/airbyte/pull/21524)) - 🐛 Source Sentry: fix `None` state_value + other bad `state_values` ([#23619](https://github.com/airbytehq/airbyte/pull/23619)) - 🐛 Source Pinterest: fix for `HTTP - 400 Bad Request` - when requesting data >= 90 days. ([#23649](https://github.com/airbytehq/airbyte/pull/23649)) +  when requesting data >= 90 days. ([#23649](https://github.com/airbytehq/airbyte/pull/23649)) - 🐛 Source Fauna: fix bug during discover step ([#23583](https://github.com/airbytehq/airbyte/pull/23583)) -- 🐛 Prevent crash on copying malformed manifest into yaml editor (#5391) \ No newline at end of file +- 🐛 Prevent crash on copying malformed manifest into yaml editor (#5391) diff --git a/docs/release_notes/march_2024.md b/docs/release_notes/march_2024.md index abf2486e399..55eb7c5e4a9 100644 --- a/docs/release_notes/march_2024.md +++ b/docs/release_notes/march_2024.md @@ -1,4 +1,5 @@ # March 2024 + ## airbyte v0.51.0 to v0.56.0 This page includes new features and improvements to the Airbyte Cloud and Airbyte Open Source platforms. @@ -9,12 +10,11 @@ Airbyte now supports **OpenID Connect (OIDC) SSO** for Airbyte Enterprise and Ai Airbyte certified our [Microsoft SQL Server source](/integrations/sources/mssql) to support terabyte-sized tables, expanded datetime data types, and reliability improvements. -Airbyte migrated our [Redshift destination](https://github.com/airbytehq/airbyte/pull/36255) to the [Destinations V2](./upgrading_to_destinations_v2) framework. This enables you to map tables one-to-one with your source, experience better error handling (particularly with large records), and deliver data incrementally. - +Airbyte migrated our [Redshift destination](https://github.com/airbytehq/airbyte/pull/36255) to the [Destinations V2](./upgrading_to_destinations_v2) framework. This enables you to map tables one-to-one with your source, experience better error handling (particularly with large records), and deliver data incrementally. ## Platform Releases -In addition to our OpenID Connect support, we also released: +In addition to our OpenID Connect support, we also released: - A major upgrade to our Docker and Helm deployments, which simplifies how external logs are configured. Learn more about the specific changes in our [migration guide](/deploying-airbyte/on-kubernetes-via-helm#migrate-from-old-chart-to-airbyte-v0520-and-latest-chart-version). @@ -28,5 +28,5 @@ In addition to our MS-SQL certification, we also released a few notable Connecto - We released several connector builder enhancements, including support for raw YAML blocks, modification the start date when testing, and added the ability to adjust page/slice/record limits. We also resolved bugs in page size and interpolation inputs, improved the switching time between YAML and UI, and fixed several layout issues. 
- Our [Bing source](https://github.com/airbytehq/airbyte/pull/35812) includes the following new streams: `Audience Performance Report`, `Goals And Funnels Report`, `Product Dimension Performance Report` -- Our [JIRA source](https://github.com/airbytehq/airbyte/pull/35656) now contains more fields to the following streams: `board_issues`,`filter_sharing`,`filters`,`issues`, `permission_schemes`, `sprint_issues`,`users_groups_detailed` and `workflows` +- Our [JIRA source](https://github.com/airbytehq/airbyte/pull/35656) now contains more fields to the following streams: `board_issues`,`filter_sharing`,`filters`,`issues`, `permission_schemes`, `sprint_issues`,`users_groups_detailed` and `workflows` - Our [Snapchat Source](https://github.com/airbytehq/airbyte/pull/35660) now contains additional fields in the `ads`, `adsquads`, `creatives`, and `media` streams. diff --git a/docs/release_notes/may_2023.md b/docs/release_notes/may_2023.md index 7754174f230..c84798579b7 100644 --- a/docs/release_notes/may_2023.md +++ b/docs/release_notes/may_2023.md @@ -1,4 +1,5 @@ # May 2023 + ## [airbyte v0.44.5](https://github.com/airbytehq/airbyte-platform/releases/tag/v0.44.5) to [v0.44.6](https://github.com/airbytehq/airbyte-platform/releases/tag/v0.44.6) This page includes new features and improvements to the Airbyte Cloud and Airbyte Open Source platforms. @@ -6,27 +7,27 @@ This page includes new features and improvements to the Airbyte Cloud and Airbyt ## **✨ New and improved features** - **New Sources and Promotions** - - 🎉 New Source: FullStory [Low code CDK] ([#25465](https://github.com/airbytehq/airbyte/pull/25465)) - - 🎉 New Source: Yotpo [Low code CDK] ([#25532](https://github.com/airbytehq/airbyte/pull/25532)) - - 🎉 New Source: Merge [Low code CDK] ([#25342](https://github.com/airbytehq/airbyte/pull/25342)) + + - 🎉 New Source: FullStory [Low code CDK] ([#25465](https://github.com/airbytehq/airbyte/pull/25465)) + - 🎉 New Source: Yotpo [Low code CDK] ([#25532](https://github.com/airbytehq/airbyte/pull/25532)) + - 🎉 New Source: Merge [Low code CDK] ([#25342](https://github.com/airbytehq/airbyte/pull/25342)) - **New Features for Existing Connectors** - - Source Marketo: New Stream Segmentation ([#23956](https://github.com/airbytehq/airbyte/pull/23956)) - - 🎉Categorized Config Errors Accurately for Google Analytics 4 (GA4) and Google Ads ([#25987](https://github.com/airbytehq/airbyte/pull/25987)) - - 🎉 Source Amplitude: added missing attrs in events schema, enabled default availability strategy ([#25842](https://github.com/airbytehq/airbyte/pull/25842)) - - 🎉 Source Bind Ads: add campaignlabels col ([#24223](https://github.com/airbytehq/airbyte/pull/24223)) - - ✨ Source Amazon Ads: add availability strategy for basic streams ([#25792](https://github.com/airbytehq/airbyte/pull/25792)) - - 🎉 Source Bing Ads: added undeclared fields to schemas ([#25668](https://github.com/airbytehq/airbyte/pull/25668)) - - 🎉Source Hubspot: Add oauth scope for goals and custom objects stream (#5820) - + - Source Marketo: New Stream Segmentation ([#23956](https://github.com/airbytehq/airbyte/pull/23956)) + - 🎉Categorized Config Errors Accurately for Google Analytics 4 (GA4) and Google Ads ([#25987](https://github.com/airbytehq/airbyte/pull/25987)) + - 🎉 Source Amplitude: added missing attrs in events schema, enabled default availability strategy ([#25842](https://github.com/airbytehq/airbyte/pull/25842)) + - 🎉 Source Bind Ads: add campaignlabels col ([#24223](https://github.com/airbytehq/airbyte/pull/24223)) + - ✨ Source 
Amazon Ads: add availability strategy for basic streams ([#25792](https://github.com/airbytehq/airbyte/pull/25792)) + - 🎉 Source Bing Ads: added undeclared fields to schemas ([#25668](https://github.com/airbytehq/airbyte/pull/25668)) + - 🎉Source Hubspot: Add oauth scope for goals and custom objects stream (#5820) - **New Features in Airbyte Platform** - - Normalization: Better handling for CDC transactional updates ([#25993](https://github.com/airbytehq/airbyte/pull/25993)) - - 🎉 Connector builder: Keep testing values around when leaving connector builder (#6336) - - 🎉 Connector builder: Copy from new stream modal (#6582) - - 🎉 Schema auto-propagation UI (#6700) - - 🎉 Connector builder: Client credentials flow for oauth authenticator (#6555) - - 🎉 Add support for source/destination LD contexts in UI (#6586) - - 🎉 Workspaces can be opened in a new tab (#6565) + - Normalization: Better handling for CDC transactional updates ([#25993](https://github.com/airbytehq/airbyte/pull/25993)) + - 🎉 Connector builder: Keep testing values around when leaving connector builder (#6336) + - 🎉 Connector builder: Copy from new stream modal (#6582) + - 🎉 Schema auto-propagation UI (#6700) + - 🎉 Connector builder: Client credentials flow for oauth authenticator (#6555) + - 🎉 Add support for source/destination LD contexts in UI (#6586) + - 🎉 Workspaces can be opened in a new tab (#6565) ## **🚨 Security & Breaking changes** @@ -57,4 +58,4 @@ This page includes new features and improvements to the Airbyte Cloud and Airbyt - 🐛 Connector builder: Always save yaml based manifest (#6486) - 🐛 Allow users to cancel a sync on a disabled connection (#6496) - 🐛 Asynchronously fetch connector update notifications (#6396) -- 🐛 Don't show connector builder prompt in destinations (#6321) \ No newline at end of file +- 🐛 Don't show connector builder prompt in destinations (#6321) diff --git a/docs/release_notes/november_2022.md b/docs/release_notes/november_2022.md index 8c31dd14699..12433e60fd6 100644 --- a/docs/release_notes/november_2022.md +++ b/docs/release_notes/november_2022.md @@ -1,20 +1,23 @@ # November 2022 + ## Airbyte [v0.40.18](https://github.com/airbytehq/airbyte/releases/tag/v0.40.18) to [v0.40.23](https://github.com/airbytehq/airbyte/releases/tag/v0.40.23) -This page includes new features and improvements to the Airbyte Cloud and Airbyte Open Source platforms. +This page includes new features and improvements to the Airbyte Cloud and Airbyte Open Source platforms. ### New features -* Added multi-region Cloud architecture, which allows for better [data protection](https://airbyte.com/blog/why-airbytes-eu-launch-is-a-milestone-for-our-data-protection-roadmap) and for Airbyte Cloud to [launch in Europe](https://airbyte.com/blog/airbyte-cloud-is-now-available-in-europe). -* Added the [low-code connector builder](https://www.loom.com/share/acf899938ef74dec8dd61ba012bc872f) UI to Airbyte Open Source. Run Airbyte v0.40.19 or higher and visit `localhost:8000/connector-builder` to start building low-code connectors. -* Added a Helm chart for deploying `airbyte-cron`. New installations of Airbyte Open Source will now deploy `airbyte-cron` by default. To disable cron, use `--set cron.enabled=false` when running a `helm install`. [#18542](https://github.com/airbytehq/airbyte/pull/18542) -* Added a progress bar estimate to syncs in Airbyte Cloud. 
[#19814](https://github.com/airbytehq/airbyte/pull/19814) + +- Added multi-region Cloud architecture, which allows for better [data protection](https://airbyte.com/blog/why-airbytes-eu-launch-is-a-milestone-for-our-data-protection-roadmap) and for Airbyte Cloud to [launch in Europe](https://airbyte.com/blog/airbyte-cloud-is-now-available-in-europe). +- Added the [low-code connector builder](https://www.loom.com/share/acf899938ef74dec8dd61ba012bc872f) UI to Airbyte Open Source. Run Airbyte v0.40.19 or higher and visit `localhost:8000/connector-builder` to start building low-code connectors. +- Added a Helm chart for deploying `airbyte-cron`. New installations of Airbyte Open Source will now deploy `airbyte-cron` by default. To disable cron, use `--set cron.enabled=false` when running a `helm install`. [#18542](https://github.com/airbytehq/airbyte/pull/18542) +- Added a progress bar estimate to syncs in Airbyte Cloud. [#19814](https://github.com/airbytehq/airbyte/pull/19814) ### Improvements -* Improved the Airbyte Protocol by introducing Airbyte Protocol v1 [#19846](https://github.com/airbytehq/airbyte/pull/19846), which defines a set of [well-known data types](https://github.com/airbytehq/airbyte/blob/5813700927cfc690d2bffcec28f5286e59ac0122/docs/understanding-airbyte/supported-data-types.md). [#17486](https://github.com/airbytehq/airbyte/pull/17486) - * These replace existing JSON Schema primitive types. - * They provide out-of-the-box validation and enforce specific formatting on some data types, like timestamps. - * Non-primitive types, like `object`, `array`, and ` oneOf`, still use raw JSON Schema types. - * These well-known types mostly correspond with the existing Airbyte data types, aside from a few differences: - * `BinaryData` is the only new type, which is used in places that previously produced a `Base64` string. - * `TimestampWithTimezone`, `TimestampWithoutTimezone`, `TimeWithTimezone`, and `TimeWithoutTimezone` have been in use for some time, so we made them official. - * The `big_integer` and `big_number` types have been retired because they were not being used. + +- Improved the Airbyte Protocol by introducing Airbyte Protocol v1 [#19846](https://github.com/airbytehq/airbyte/pull/19846), which defines a set of [well-known data types](https://github.com/airbytehq/airbyte/blob/5813700927cfc690d2bffcec28f5286e59ac0122/docs/understanding-airbyte/supported-data-types.md). [#17486](https://github.com/airbytehq/airbyte/pull/17486) + - These replace existing JSON Schema primitive types. + - They provide out-of-the-box validation and enforce specific formatting on some data types, like timestamps. + - Non-primitive types, like `object`, `array`, and ` oneOf`, still use raw JSON Schema types. + - These well-known types mostly correspond with the existing Airbyte data types, aside from a few differences: + - `BinaryData` is the only new type, which is used in places that previously produced a `Base64` string. + - `TimestampWithTimezone`, `TimestampWithoutTimezone`, `TimeWithTimezone`, and `TimeWithoutTimezone` have been in use for some time, so we made them official. + - The `big_integer` and `big_number` types have been retired because they were not being used. 
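As a concrete (and hypothetical) illustration of the well-known types change above, a timestamp property in a stream's JSON schema could be declared either as a plain JSON Schema primitive with an `airbyte_type` annotation (the existing style described in the linked supported-data-types page) or as a Protocol v1 well-known type reference. This is a sketch only — the property names and the exact `WellKnownTypes.json` reference path are assumptions for illustration:

```
{
  "updated_at_primitive_style": {
    "type": "string",
    "format": "date-time",
    "airbyte_type": "timestamp_with_timezone"
  },
  "updated_at_well_known_style": {
    "$ref": "WellKnownTypes.json#/definitions/TimestampWithTimezone"
  }
}
```

The well-known form is what provides the out-of-the-box validation and timestamp formatting mentioned above, while non-primitive types (`object`, `array`, `oneOf`) continue to use raw JSON Schema.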
diff --git a/docs/release_notes/november_2023.md b/docs/release_notes/november_2023.md index 67323252e8c..e5ea5e83900 100644 --- a/docs/release_notes/november_2023.md +++ b/docs/release_notes/november_2023.md @@ -1,4 +1,5 @@ # November 2023 + ## airbyte v0.50.34 to v0.50.35 This page includes new features and improvements to the Airbyte Cloud and Airbyte Open Source platforms. @@ -10,15 +11,15 @@ Airbyte now supports extracting text content from PDF, Docx, and Pptx files from SSO and RBAC (admin roles only) are now available in Airbyte Cloud! Read more below. ## Platform Releases + - **SSO and RBAC** You can now use SSO in Airbyte Cloud to administer permissions in Airbyte. This is currently only available through Okta, with plans to support Active Directory next. We also now offer **RBAC** (admin roles only) to ensure a high level of security when managing you workspace. For access to this feature, reach out to our [Sales team](https://www.airbyte.com/company/talk-to-sales). -- **Continuous heartbeat checks** We're continually monitoring syncs to verify they continue making progress, and have added functionality in the background to ensure that we continue receiving updated ["heartbeat" messages](/understanding-airbyte/heartbeats.md) from our connectors. This will ensure that we continue delivering data and avoid any timeouts. +- **Continuous heartbeat checks** We're continually monitoring syncs to verify they continue making progress, and have added functionality in the background to ensure that we continue receiving updated ["heartbeat" messages](/understanding-airbyte/heartbeats.md) from our connectors. This will ensure that we continue delivering data and avoid any timeouts. ## Connector Improvements In addition to being able to extract text content from unstructured data sources, we have also: - - Revamped core Marketing connectors Pinterest, Instagram and Klaviyo to significantly improve the setup experience and ensure resiliency and reliability. - - [Added incremenetal sync](https://github.com/airbytehq/airbyte/pull/32473) functionality for Hubspot's stream `property_history`, which improves sync time and reliability. - - [Added new streams](https://github.com/airbytehq/airbyte/pull/32738) for Amazon Seller Partner: `get_vendor_net_pure_product_margin_report`,`get_vendor_readl_time_inventory_report`, and `get_vendor_traffic_report` to enable additional reporting. - - Released our first connector, Stripe, that can perform [concurrent syncs](https://github.com/airbytehq/airbyte/pull/32473) where streams sync in parallel when syncing in Full Refresh mode. - +- Revamped core Marketing connectors Pinterest, Instagram and Klaviyo to significantly improve the setup experience and ensure resiliency and reliability. +- [Added incremenetal sync](https://github.com/airbytehq/airbyte/pull/32473) functionality for Hubspot's stream `property_history`, which improves sync time and reliability. +- [Added new streams](https://github.com/airbytehq/airbyte/pull/32738) for Amazon Seller Partner: `get_vendor_net_pure_product_margin_report`,`get_vendor_readl_time_inventory_report`, and `get_vendor_traffic_report` to enable additional reporting. +- Released our first connector, Stripe, that can perform [concurrent syncs](https://github.com/airbytehq/airbyte/pull/32473) where streams sync in parallel when syncing in Full Refresh mode. 
diff --git a/docs/release_notes/october_2022.md b/docs/release_notes/october_2022.md index a75430e4133..b205dbb884d 100644 --- a/docs/release_notes/october_2022.md +++ b/docs/release_notes/october_2022.md @@ -1,19 +1,22 @@ # October 2022 + ## Airbyte [v0.40.13](https://github.com/airbytehq/airbyte/releases/tag/v0.40.13) to [v0.40.17](https://github.com/airbytehq/airbyte/releases/tag/v0.40.17) -This page includes new features and improvements to the Airbyte Cloud and Airbyte Open Source platforms. +This page includes new features and improvements to the Airbyte Cloud and Airbyte Open Source platforms. ### New features -* Added the low-code connector builder UI to Airbyte OSS. It includes an embedded YAML editor and significantly reduces the time and complexity of building and maintaining connectors. [#17482](https://github.com/airbytehq/airbyte/pull/17482) -* Added Datadog Real User Monitoring (RUM) support to the webapp, which helps us monitor frontend performance in Airbyte Cloud. [#17821](https://github.com/airbytehq/airbyte/pull/17821) -* Added Nginx and Basic Auth to ensure security when using Airbyte Open Source. [#17694](https://github.com/airbytehq/airbyte/pull/17694) - * Now when you start the Airbyte server and go to localhost:8000, you’ll be prompted to log in before accessing your Airbyte workspace. - * You should change the default username (airbyte) and password (password) before you deploy Airbyte. If you do not want a username or password, you can remove them by setting `BASIC_AUTH_USERNAME` and `BASIC_AUTH_PASSWORD` to empty values (" ") in your `.env` file. - * Our [CLI](https://github.com/airbytehq/airbyte/pull/17982) and [docs](https://docs.airbyte.com/deploying-airbyte/local-deployment) have been updated to reflect this change. + +- Added the low-code connector builder UI to Airbyte OSS. It includes an embedded YAML editor and significantly reduces the time and complexity of building and maintaining connectors. [#17482](https://github.com/airbytehq/airbyte/pull/17482) +- Added Datadog Real User Monitoring (RUM) support to the webapp, which helps us monitor frontend performance in Airbyte Cloud. [#17821](https://github.com/airbytehq/airbyte/pull/17821) +- Added Nginx and Basic Auth to ensure security when using Airbyte Open Source. [#17694](https://github.com/airbytehq/airbyte/pull/17694) + - Now when you start the Airbyte server and go to localhost:8000, you’ll be prompted to log in before accessing your Airbyte workspace. + - You should change the default username (airbyte) and password (password) before you deploy Airbyte. If you do not want a username or password, you can remove them by setting `BASIC_AUTH_USERNAME` and `BASIC_AUTH_PASSWORD` to empty values (" ") in your `.env` file. + - Our [CLI](https://github.com/airbytehq/airbyte/pull/17982) and [docs](https://docs.airbyte.com/deploying-airbyte/local-deployment) have been updated to reflect this change. ### Improvements -* Since adding Basic Auth to Airbyte Open Source, we improved the `load_test` script to reflect this change. Now when the `load_test` script sources the `.env` file, it includes `BASIC_AUTH_USERNAME` and `BASIC_AUTH_PASSWORD` when calling the API. [#18273](https://github.com/airbytehq/airbyte/pull/18273) -* Improved the Airbyte platform by updating the Apache Commons Text from 1.9 to 1.10.0 because version 1.9 was affected by [CVE 2022-42889](https://nvd.nist.gov/vuln/detail/CVE-2022-42889) (Text4Shell). 
[#18273](https://github.com/airbytehq/airbyte/pull/18273) - * We do not intend to update older versions of Airbyte because we were not affected by the vulnerable behavior: - * Our direct usages of `commons-text` either do not use the vulnerable class or are pinned to an unaffected version. - * Almost all of our transitive dependencies on `commons-text` are limited to test code. Runtime code has no vulnerable transitive dependencies on `commons-text`. + +- Since adding Basic Auth to Airbyte Open Source, we improved the `load_test` script to reflect this change. Now when the `load_test` script sources the `.env` file, it includes `BASIC_AUTH_USERNAME` and `BASIC_AUTH_PASSWORD` when calling the API. [#18273](https://github.com/airbytehq/airbyte/pull/18273) +- Improved the Airbyte platform by updating the Apache Commons Text from 1.9 to 1.10.0 because version 1.9 was affected by [CVE 2022-42889](https://nvd.nist.gov/vuln/detail/CVE-2022-42889) (Text4Shell). [#18273](https://github.com/airbytehq/airbyte/pull/18273) + - We do not intend to update older versions of Airbyte because we were not affected by the vulnerable behavior: + - Our direct usages of `commons-text` either do not use the vulnerable class or are pinned to an unaffected version. + - Almost all of our transitive dependencies on `commons-text` are limited to test code. Runtime code has no vulnerable transitive dependencies on `commons-text`. diff --git a/docs/release_notes/october_2023.md b/docs/release_notes/october_2023.md index 79126acbadd..7c2f9be2bbb 100644 --- a/docs/release_notes/october_2023.md +++ b/docs/release_notes/october_2023.md @@ -1,4 +1,5 @@ # October 2023 + ## airbyte v0.50.31 to v0.50.33 This page includes new features and improvements to the Airbyte Cloud and Airbyte Open Source platforms. @@ -14,6 +15,7 @@ We're also always learning and listening to user feedback. We no longer [dedupli This month, we also held our annual Hacktoberfest, from which we have already merged 51 PRs and welcomed 3 new contributors to our community! ## Platform Releases + - **Enhanced payment options:** Cloud customers can now sign up for [auto-recharging of their balance](https://docs.airbyte.com/cloud/managing-airbyte-cloud/manage-credits#automatic-reload-of-credits-beta) and can purchase up to 6,000 credits within our application. - **Free historical syncs:** Cloud customers can have more predictability around pricing with free historical syncs for any new connector. Reach out to our Sales team if interested. - **Email Notification Recipient** Cloud customers can now designate the recipient of important email notifications about their connectors and syncs. @@ -22,15 +24,16 @@ This month, we also held our annual Hacktoberfest, from which we have already me Many of our enhancements came from our Community this month as a part of our Hacktoberfest. Notably, we enhanced the connector experience by: -- [**GitLab**](https://github.com/airbytehq/airbyte/pull/31492) now gracefully handles the expiration of access tokens +- [**GitLab**](https://github.com/airbytehq/airbyte/pull/31492) now gracefully handles the expiration of access tokens - [**Orbit**](https://github.com/airbytehq/airbyte/pull/30138) and [**Qualaroo**](https://github.com/airbytehq/airbyte/pull/30138) were migrated to low-code, which improves the maintainability of the connector (thanks to community member Aviraj Gour!) - [**Pipdrive**](https://github.com/airbytehq/airbyte/pull/30138): optimized custom fields, which are commonly found in this connector. 
Additionally, we added new streams for several connectors to ensure users have access to all their data, including: + - [**Chargify**](https://github.com/airbytehq/airbyte/pull/31116): Coupons, Transactions, and Invoices - [**Mailchimp**](https://github.com/airbytehq/airbyte/pull/31922): Segment and Unsubscribes - [**Pipedrive**](https://github.com/airbytehq/airbyte/pull/31885): Mails (thanks to community member Tope Folorunso!) and Goals - [**Asana**](https://github.com/airbytehq/airbyte/pull/31634): Events, Attachments, OrganizationExports (thanks to Tope again!) - [**Tiktok Ads**](https://github.com/airbytehq/airbyte/pull/31610): Audiences, Images, Music, Portfolios, Videos, Ad Audiences Report by Province - [**Square**](https://github.com/airbytehq/airbyte/pull/30138): Bank Accounts (thanks community member Aviraj Gour) and Cash Drawers -- [**Notion**](https://github.com/airbytehq/airbyte/pull/30324): Blocks, Pages and Comments \ No newline at end of file +- [**Notion**](https://github.com/airbytehq/airbyte/pull/30324): Blocks, Pages and Comments diff --git a/docs/release_notes/september_2022.md b/docs/release_notes/september_2022.md index abb383f2e15..e1dcf7939fe 100644 --- a/docs/release_notes/september_2022.md +++ b/docs/release_notes/september_2022.md @@ -1,21 +1,25 @@ # September 2022 + ## Airbyte [v0.40.4](https://github.com/airbytehq/airbyte/releases/tag/v0.40.4) to [v0.40.6](https://github.com/airbytehq/airbyte/releases/tag/v0.40.6) -This page includes new features and improvements to the Airbyte Cloud and Airbyte Open Source platforms. +This page includes new features and improvements to the Airbyte Cloud and Airbyte Open Source platforms. ### New features -* Added the low-code connector development kit (early access). This low-code framework is a declarative approach based on YAML with the goal of significantly reducing the time and complexity of building and maintaining connectors. [#11582](https://github.com/airbytehq/airbyte/issues/11582) - * Added a [guide](https://docs.airbyte.com/connector-development/config-based/low-code-cdk-overview/) for using the low-code framework. [#17534](https://github.com/airbytehq/airbyte/pull/17534) -* Added support for large schema discovery. [#17394](https://github.com/airbytehq/airbyte/pull/17394) + +- Added the low-code connector development kit (early access). This low-code framework is a declarative approach based on YAML with the goal of significantly reducing the time and complexity of building and maintaining connectors. [#11582](https://github.com/airbytehq/airbyte/issues/11582) + - Added a [guide](https://docs.airbyte.com/connector-development/config-based/low-code-cdk-overview/) for using the low-code framework. [#17534](https://github.com/airbytehq/airbyte/pull/17534) +- Added support for large schema discovery. [#17394](https://github.com/airbytehq/airbyte/pull/17394) ### Improvements -* Improved `airbyte-metrics` support in the Helm chart. [#16166](https://github.com/airbytehq/airbyte/pull/16166) -* Improved the visibility button behavior for the password input field. This ensures that passwords are always submitted as sensitive fields. [#16011](https://github.com/airbytehq/airbyte/pull/16011) -* Improved Sync History page performance by adding the **Load more** button, which you can click to display previous syncs. [#15938](https://github.com/airbytehq/airbyte/pull/15938) -* Improved the validation error that displays when submitting an incomplete ServiceForm. 
[#15625](https://github.com/airbytehq/airbyte/pull/15625) -* Improved the source-defined cursor and primary key by adding a tooltip, which displays the full cursor or primary key when you hover over them. [#16116](https://github.com/airbytehq/airbyte/pull/16116) -* Improved Airbyte Cloud’s method of updating source and destination definitions by using `airbyte-cron` to schedule updates. This allows us to keep connectors updated as the catalog changes. [#16438](https://github.com/airbytehq/airbyte/pull/16438) -* Improved the speed that workspace connections are listed. [#17004](https://github.com/airbytehq/airbyte/pull/17004) + +- Improved `airbyte-metrics` support in the Helm chart. [#16166](https://github.com/airbytehq/airbyte/pull/16166) +- Improved the visibility button behavior for the password input field. This ensures that passwords are always submitted as sensitive fields. [#16011](https://github.com/airbytehq/airbyte/pull/16011) +- Improved Sync History page performance by adding the **Load more** button, which you can click to display previous syncs. [#15938](https://github.com/airbytehq/airbyte/pull/15938) +- Improved the validation error that displays when submitting an incomplete ServiceForm. [#15625](https://github.com/airbytehq/airbyte/pull/15625) +- Improved the source-defined cursor and primary key by adding a tooltip, which displays the full cursor or primary key when you hover over them. [#16116](https://github.com/airbytehq/airbyte/pull/16116) +- Improved Airbyte Cloud’s method of updating source and destination definitions by using `airbyte-cron` to schedule updates. This allows us to keep connectors updated as the catalog changes. [#16438](https://github.com/airbytehq/airbyte/pull/16438) +- Improved the speed that workspace connections are listed. [#17004](https://github.com/airbytehq/airbyte/pull/17004) ## Bugs -* Fixed an issue where the Helm chart templates did not correctly render `extraContainers` values. [#17084](https://github.com/airbytehq/airbyte/pull/17084) + +- Fixed an issue where the Helm chart templates did not correctly render `extraContainers` values. [#17084](https://github.com/airbytehq/airbyte/pull/17084) diff --git a/docs/release_notes/september_2023.md b/docs/release_notes/september_2023.md index 4c21c4384d3..33cd2fb1364 100644 --- a/docs/release_notes/september_2023.md +++ b/docs/release_notes/september_2023.md @@ -1,4 +1,5 @@ # September 2023 + ## airbyte v0.50.24 to v0.50.31 This page includes new features and improvements to the Airbyte Cloud and Airbyte Open Source platforms. @@ -6,6 +7,7 @@ This page includes new features and improvements to the Airbyte Cloud and Airbyt ## ✨ Highlights This month, we brought 4 new destinations to Airbyte focused on AI. This enables users to seamlessly flow data from 100s of our sources into large language models. Those four destinations are: + - [Qdrant](https://github.com/airbytehq/airbyte/pull/30332) - [Choroma](https://github.com/airbytehq/airbyte/pull/30023) - [Milvus](https://github.com/airbytehq/airbyte/pull/30023) @@ -22,7 +24,8 @@ We've also worked on several connector enhancements and additions. 
To name a few - [**Google Ads**](https://github.com/airbytehq/airbyte/pull/28970) now uses the change status to implement an improved incremental sync for Ad Groups and Campaign Criterion streams Additionally, we added new streams for several connectors to bring in newly available API endpoints and adapt to user feedback, including: -- [**Github**](https://github.com/airbytehq/airbyte/pull/30823): Issue Timeline and Contributor Activity + +- [**Github**](https://github.com/airbytehq/airbyte/pull/30823): Issue Timeline and Contributor Activity - [**JIRA**](https://github.com/airbytehq/airbyte/pull/30755): Issue Types, Project Roles, and Issue Transitions -- [**Outreach**](https://github.com/airbytehq/airbyte/pull/30639): Call Purposes and Call Dispositions -- [**Zendesk**](https://github.com/airbytehq/airbyte/pull/30138): Articles \ No newline at end of file +- [**Outreach**](https://github.com/airbytehq/airbyte/pull/30639): Call Purposes and Call Dispositions +- [**Zendesk**](https://github.com/airbytehq/airbyte/pull/30138): Articles diff --git a/docs/release_notes/upgrading_to_destinations_v2.md b/docs/release_notes/upgrading_to_destinations_v2.md index 77fe4b878a5..d245fdeb5c1 100644 --- a/docs/release_notes/upgrading_to_destinations_v2.md +++ b/docs/release_notes/upgrading_to_destinations_v2.md @@ -55,11 +55,11 @@ Whenever possible, we've taken this opportunity to use the best data type for st :::caution Upgrade Warning -* The upgrading process entails hydrating the v2 format raw table by querying the v1 raw table through a standard query, such as "INSERT INTO v2_raw_table SELECT * FROM v1_raw_table." -The duration of this process can vary significantly based on the data size and may encounter failures contingent on the Destination's capacity to execute the query. -In some cases, creating a new Airbyte connection, rather than migrating your existing connection, may be faster. Note that in these cases, all data will be re-imported. -* Following the successful migration of v1 raw tables to v2, the v1 raw tables will be dropped. However, it is essential to note that if there are any derived objects (materialized views) or referential -constraints (foreign keys) linked to the old raw table, this operation may encounter failure, resulting in an unsuccessful upgrade or broken derived objects (like materialized views etc). +- The upgrading process entails hydrating the v2 format raw table by querying the v1 raw table through a standard query, such as "INSERT INTO v2_raw_table SELECT \* FROM v1_raw_table." + The duration of this process can vary significantly based on the data size and may encounter failures contingent on the Destination's capacity to execute the query. + In some cases, creating a new Airbyte connection, rather than migrating your existing connection, may be faster. Note that in these cases, all data will be re-imported. +- Following the successful migration of v1 raw tables to v2, the v1 raw tables will be dropped. However, it is essential to note that if there are any derived objects (materialized views) or referential + constraints (foreign keys) linked to the old raw table, this operation may encounter failure, resulting in an unsuccessful upgrade or broken derived objects (like materialized views etc). 
If any of the above concerns are applicable to your existing setup, we recommend [Upgrading Connections One by One with Dual-Writing](#upgrading-connections-one-by-one-with-dual-writing) for a more controlled upgrade process ::: @@ -160,7 +160,7 @@ As a user previously not running Normalization, Upgrading to Destinations V2 wil For each [CDC-supported](https://docs.airbyte.com/understanding-airbyte/cdc) source connector, we recommend the following: | CDC Source | Recommendation | Notes | -|------------|--------------------------------------------------------------|------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------| +| ---------- | ------------------------------------------------------------ | ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------- | | Postgres | [Upgrade connection in place](#quick-start-to-upgrading) | You can optionally dual write, but this requires resyncing historical data from the source. You must create a new Postgres source with a different replication slot than your existing source to preserve the integrity of your existing connection. | | MySQL | [All above upgrade paths supported](#advanced-upgrade-paths) | You can upgrade the connection in place, or dual write. When dual writing, Airbyte can leverage the state of an existing, active connection to ensure historical data is not re-replicated from MySQL. | @@ -169,7 +169,7 @@ For each [CDC-supported](https://docs.airbyte.com/understanding-airbyte/cdc) sou For each destination connector, Destinations V2 is effective as of the following versions: | Destination Connector | Safe Rollback Version | Destinations V2 Compatible | Upgrade Deadline | -|-----------------------|-----------------------|----------------------------|--------------------------| +| --------------------- | --------------------- | -------------------------- | ------------------------ | | BigQuery | 1.10.2 | 2.0.6+ | November 7, 2023 | | Snowflake | 2.1.7 | 3.1.0+ | November 7, 2023 | | Redshift | 0.8.0 | 2.0.0+ | March 15, 2024 | @@ -201,12 +201,14 @@ In addition to the changes which apply for all destinations described above, the ### BigQuery #### [Object and array properties](https://docs.airbyte.com/understanding-airbyte/supported-data-types/#the-types) are properly stored as JSON columns + Previously, we had used TEXT, which made querying sub-properties more difficult. In certain cases, numbers within sub-properties with long decimal values will need to be converted to float representations due to a _quirk_ of Bigquery. Learn more [here](https://github.com/airbytehq/airbyte/issues/29594). ### Snowflake #### Explicitly uppercase column names in Final Tables + Snowflake will implicitly uppercase column names if they are not quoted. Airbyte needs to quote the column names because a variety of sources have column/field names which contain special characters that require quoting in Snowflake. However, when you quote a column name in Snowflake, it also preserves lowercase naming. During the Snowflake V2 beta, most customers found this behavior unexpected and expected column selection to be case-insensitive for columns without special characters. 
As a result of this feedback, we decided to explicitly uppercase column names in the final tables, which does mean that columns which previous required quoting, now also require you to convert to the upper case version. @@ -236,8 +238,8 @@ SELECT "MY COLUMN" from my_table; #### Preserving mixed case column names in Final Tables -Postgres will implicitly lower case column names with mixed case characters when using unquoted identifiers. Based on feedback, we chose to replace any special -characters like spaces with underscores and use quoted identifiers to preserve mixed case column names. +Postgres will implicitly lower case column names with mixed case characters when using unquoted identifiers. Based on feedback, we chose to replace any special +characters like spaces with underscores and use quoted identifiers to preserve mixed case column names. ## Updating Downstream Transformations diff --git a/docs/snowflake-native-apps/event-sharing.md b/docs/snowflake-native-apps/event-sharing.md index afade3e69b1..ef93a8aa53b 100644 --- a/docs/snowflake-native-apps/event-sharing.md +++ b/docs/snowflake-native-apps/event-sharing.md @@ -5,16 +5,21 @@ Sharing the events is important to ensure that in case of issue, our team can in In order to share the events, you can refer to the [Snowflake documentation](https://other-docs.snowflake.com/en/native-apps/consumer-enable-logging#label-nativeapps-consumer-logging-enabling). As of 2023-10-02, you have to: 1. Create the event table. This table is global to an account so all applications share the same event table. We recommend using: + ``` CREATE DATABASE event_database; CREATE SCHEMA event_schema; CREATE EVENT TABLE event_database.event_schema.event_table; ``` + 2. Make the table active for your account, + ``` ALTER ACCOUNT SET EVENT_TABLE=event_database.event_schema.event_table; ``` + 3. Allow the application to share the logs. + ``` ALTER APPLICATION SET SHARE_EVENTS_WITH_PROVIDER = TRUE`; ``` diff --git a/docs/snowflake-native-apps/facebook-marketing.md b/docs/snowflake-native-apps/facebook-marketing.md index 1b4a458e2e2..45523ea35ca 100644 --- a/docs/snowflake-native-apps/facebook-marketing.md +++ b/docs/snowflake-native-apps/facebook-marketing.md @@ -9,6 +9,7 @@ The Snowflake Native Apps platform is new and rapidly evolving. The Facebook Mar # Getting started ## Prerequisites + A Facebook Marketing account with permission to access data from accounts you want to sync. ## Installing the App @@ -21,16 +22,17 @@ Do not refresh the Apps page while the application is being installed. This may 2. On the left sidebar, click `Marketplace`. 3. Search for `Facebook Marketing Connector` by Airbyte or navigate to https://app.snowflake.com/marketplace/listing/GZTYZ9BCRTG/airbyte-facebook-marketing-connector 4. Click `Get`. This will open a pop-up where you can specify install options. Expand `Options`. - 1. You can rename the application or leave the default. This is how you will reference the application from a worksheet. - 2. Specify the warehouse that the application will be installed to. + 1. You can rename the application or leave the default. This is how you will reference the application from a worksheet. + 2. Specify the warehouse that the application will be installed to. 5. Wait for the application to install. Once complete, the pop-up window should automatically close. 6. On the left sidebar, click `Apps`. ![](./facebook-marketing-app-install.png) -7. 
Once your installation is complete, under the `Installed Apps` section, you should see the `Facebook Marketing Connector` by Airbyte. +7. Once your installation is complete, under the `Installed Apps` section, you should see the `Facebook Marketing Connector` by Airbyte. ## Facebook Marketing Account + In order for the Facebook Marketing Connector by Airbyte to query Facebook's APIs, you will need an account with the right permissions. Please follow the [Facebook Marketing authentication guide](https://docs.airbyte.com/integrations/sources/facebook-marketing#for-airbyte-open-source-generate-an-access-token-and-request-a-rate-limit-increase) for further information. ## Snowflake Native App Authorizations @@ -40,15 +42,18 @@ By default the app will be installed using the name `FACEBOOK_MARKETING_CONNECTO ::: ### Adding Credentials and Configuring External API Access + Before using the application, you will need to perform a few prerequisite steps to prepare the application to make outbound API requests and use your authentication credentials. From a SQL worksheet, you will need to run a series of commands. 1. Create the database where the app will access the authorization. + ``` CREATE DATABASE AIRBYTE_FACEBOOK_MARKETING_DB; USE AIRBYTE_FACEBOOK_MARKETING_DB; ``` 2. You will need to allow outgoing network traffic based on the domain of the source. In the case of Facebook Marketing, simply run: + ``` CREATE OR REPLACE NETWORK RULE FACEBOOK_MARKETING_APIS_NETWORK_RULE MODE = EGRESS @@ -61,6 +66,7 @@ As of 2023-09-13, the [Snowflake documentation](https://docs.snowflake.com/en/sq ::: 3. Once you have external access configured, you need define your authorization/authentication. Provide the credentials to the app as such: + ``` CREATE OR REPLACE SECRET AIRBYTE_APP_SECRET TYPE = GENERIC_STRING @@ -68,9 +74,11 @@ CREATE OR REPLACE SECRET AIRBYTE_APP_SECRET "access_token": "" }'; ``` + ... where `client_id`, `client_secret` and `refresh_token` are strings. For more information, see the [Facebook Marketing authentication guide](https://docs.airbyte.com/integrations/sources/facebook-marketing#for-airbyte-open-source-generate-an-access-token-and-request-a-rate-limit-increase). 4. Once the network rule and the secret are defined in Snowflake, you need to make them available to the app by using an external access integration. + ``` CREATE OR REPLACE EXTERNAL ACCESS INTEGRATION AIRBYTE_APP_INTEGRATION ALLOWED_NETWORK_RULES = (facebook_marketing_apis_network_rule) @@ -79,11 +87,13 @@ CREATE OR REPLACE EXTERNAL ACCESS INTEGRATION AIRBYTE_APP_INTEGRATION ``` 5. Grant permission for the app to access the integration. + ``` GRANT USAGE ON INTEGRATION AIRBYTE_APP_INTEGRATION TO APPLICATION FACEBOOK_MARKETING_CONNECTOR; ``` 6. Grant permissions for the app to access the database that houses the secret and read the secret. + ``` GRANT USAGE ON DATABASE AIRBYTE_FACEBOOK_MARKETING_DB TO APPLICATION FACEBOOK_MARKETING_CONNECTOR; GRANT USAGE ON SCHEMA PUBLIC TO APPLICATION FACEBOOK_MARKETING_CONNECTOR; @@ -91,7 +101,8 @@ GRANT READ ON SECRET AIRBYTE_APP_SECRET TO APPLICATION FACEBOOK_MARKETING_CONNEC ``` ### Granting Account Privileges -Once you have completed the prerequisite SQL setup steps, you will need to grant privileges to allow the application to create databases, create warehouses, and execute tasks. + +Once you have completed the prerequisite SQL setup steps, you will need to grant privileges to allow the application to create databases, create warehouses, and execute tasks. 
All of these privileges are required for the application to extract data into Snowflake database successfully. 1. Start by going in the `Apps` section and selecting `Facebook Marketing Connector`. You will have to accept the Anaconda terms in order to use Streamlit. @@ -109,21 +120,22 @@ All of these privileges are required for the application to extract data into Sn You are now ready to begin syncing your data. ## Configuring a Connection + Navigate back to the application by clicking `STREAMLIT` in the top left corner. Select `New Connection` and fill the following fields: ---- +--- `account_id` The Facebook Ad account ID to use when pulling data from the Facebook Marketing API. The Ad account ID number is in the account dropdown menu or in your browser's address bar of your [Meta Ads Manager](https://adsmanager.facebook.com/adsmanager/). ---- +--- `start_date` UTC date in the format YYYY-MM-DDTHH:mm:ssZ (e.g. 2021-09-29T12:13:14Z). Any data before this date will not be replicated. ---- +--- `end_date` @@ -169,15 +181,15 @@ The database where the records will be saved. Snowflake's database [naming conve `Output Schema` -The table where the schema will be saved. Snowflake's table [naming convention](https://docs.snowflake.com/en/sql-reference/identifiers-syntax) applies here. +The table where the schema will be saved. Snowflake's table [naming convention](https://docs.snowflake.com/en/sql-reference/identifiers-syntax) applies here. ---- +--- `Connection Name` How the connection will be referred in the Streamlit app. ---- +--- `Replication Frequency` @@ -186,13 +198,16 @@ The sync schedule that determines how often your data will be synced to the targ --- ## Enabling Logging and Event Sharing for an Application + Sharing the logging and telemetry data of your installed application helps us improve the application and can allow us to better triage problems that your run into. To configure your application for logging and telemetry data please refer to the documentation for [Enabling Logging and Event Sharing](event-sharing.md). ## Syncing Your Facebook Marketing Data + Once a connection is configured, go in `Connections List` to view all of your connections. From here for each connection you can view the configuration settings, start a sync, and view the prior sync history. ### Scheduled Syncs + While creating a connection, you can specify a "Replication Frequency" which will dictate how often your data will be extracted from Facebook Marketing and loaded into your Snowflake database. This process is started automatically according to your schedule and does not require that you manually trigger syncs. For example, if you create a connection at 10:15 AM and set your replication frequency to @@ -200,27 +215,32 @@ hourly, then a sync will be started immediately. The next sync will start at 11: time. In the event that your sync runs longer than one hour, a new sync will start at the next available time. ### Manual Syncs + In addition to scheduled syncs, you can also configure a connection to only sync data on-demand by setting "Replication Frequency" to `MANUAL`. After creating a connection, from the `Connections List` page, you can use the "Sync Now" button to trigger a sync of your API data to your Snowflake database. You can also use this button to manually trigger connections that sync according to a -schedule. If there is already a sync in progress, this button will be disabled. +schedule. If there is already a sync in progress, this button will be disabled. 
### Sync History + From the `Connections List` page, you can view information about past syncs for each connection to determine when your data is done syncing and whether the operation was successful. Once the sync is completed successfully, you should be able to validate that the records have been stored in `.`. ## Supported Streams + As of now, all supported streams perform a full refresh. Incremental syncs are not yet supported. Here are the list of supported streams: -* Activities -* Ad Account -* Ad Creatives -* Ad Insights -* Ad Sets -* Ads -* Campaigns -* Custom Audiences -* Custom Conversions + +- Activities +- Ad Account +- Ad Creatives +- Ad Insights +- Ad Sets +- Ads +- Campaigns +- Custom Audiences +- Custom Conversions # Contact Us + snowflake-native-apps@airbyte.io diff --git a/docs/snowflake-native-apps/linkedin-ads.md b/docs/snowflake-native-apps/linkedin-ads.md index bd34a7ffa56..326034a6f6b 100644 --- a/docs/snowflake-native-apps/linkedin-ads.md +++ b/docs/snowflake-native-apps/linkedin-ads.md @@ -9,6 +9,7 @@ The Snowflake Native Apps platform is new and rapidly evolving. The LinkedIn Ads # Getting started ## Prerequisites + A LinkedIn Ads account with permission to access data from accounts you want to sync. ## Installing the App @@ -21,16 +22,17 @@ Do not refresh the Apps page while the application is being installed. This may 2. On the left sidebar, click `Marketplace`. 3. Search for `LinkedIn Ads Connector` by Airbyte or navigate to https://app.snowflake.com/marketplace/listing/GZTYZ9BCRTW/airbyte-linkedin-ads-connector 4. Click `Get`. This will open a pop-up where you can specify install options. Expand `Options`. - 1. You can rename the application or leave the default. This is how you will reference the application from a worksheet. - 2. Specify the warehouse that the application will be installed to. + 1. You can rename the application or leave the default. This is how you will reference the application from a worksheet. + 2. Specify the warehouse that the application will be installed to. 5. Wait for the application to install. Once complete, the pop-up window should automatically close. 6. On the left sidebar, click `Apps`. ![](./linkedin-ads-app-install.png) -7. Once your installation is complete, under the `Installed Apps` section, you should see the `LinkedIn Ads Connector` by Airbyte. +7. Once your installation is complete, under the `Installed Apps` section, you should see the `LinkedIn Ads Connector` by Airbyte. ## LinkedIn Ads Account + In order for the LinkedIn Ads Connector by Airbyte to query LinkedIn, you will need an account with the right permissions. Please follow the [LinkedIn Ads authentication guide](https://docs.airbyte.com/integrations/sources/linkedin-ads/#set-up-linkedin-ads-authentication-airbyte-open-source) for further information. ## Snowflake Native App Authorizations @@ -40,15 +42,18 @@ By default the app will be installed using the name `LINKEDIN_ADS_CONNECTOR`, bu ::: ### Adding Credentials and Configuring External API Access + Before using the application, you will need to perform a few prerequisite steps to prepare the application to make outbound API requests and use your authentication credentials. From a SQL worksheet, you will need to run a series of commands. 1. Create the database where the app will access the authorization. + ``` CREATE DATABASE AIRBYTE_LINKEDIN_ADS_DB; USE AIRBYTE_LINKEDIN_ADS_DB; ``` 2. You will need to allow outgoing network traffic based on the domain of the source. 
In the case of LinkedIn Ads, simply run: + ``` CREATE OR REPLACE NETWORK RULE LINKEDIN_APIS_NETWORK_RULE MODE = EGRESS @@ -61,6 +66,7 @@ As of 2023-09-13, the [Snowflake documentation](https://docs.snowflake.com/en/sq ::: 3. Once you have external access configured, you need define your authorization/authentication. Provide the credentials to the app as such: + ``` CREATE OR REPLACE SECRET AIRBYTE_APP_SECRET TYPE = GENERIC_STRING @@ -71,9 +77,11 @@ CREATE OR REPLACE SECRET AIRBYTE_APP_SECRET "refresh_token": }'; ``` + ... where `client_id`, `client_secret` and `refresh_token` are strings. For more information, see the [LinkedIn Ads authentication guide](https://docs.airbyte.com/integrations/sources/linkedin-ads/#set-up-linkedin-ads-authentication-airbyte-open-source). 4. Once the network rule and the secret are defined in Snowflake, you need to make them available to the app by using an external access integration. + ``` CREATE OR REPLACE EXTERNAL ACCESS INTEGRATION AIRBYTE_APP_INTEGRATION ALLOWED_NETWORK_RULES = (LINKEDIN_APIS_NETWORK_RULE) @@ -82,11 +90,13 @@ CREATE OR REPLACE EXTERNAL ACCESS INTEGRATION AIRBYTE_APP_INTEGRATION ``` 5. Grant permission for the app to access the integration. + ``` GRANT USAGE ON INTEGRATION AIRBYTE_APP_INTEGRATION TO APPLICATION LINKEDIN_ADS_CONNECTOR; ``` 6. Grant permissions for the app to access the database that houses the secret and read the secret. + ``` GRANT USAGE ON DATABASE AIRBYTE_LINKEDIN_ADS_DB TO APPLICATION LINKEDIN_ADS_CONNECTOR; GRANT USAGE ON SCHEMA PUBLIC TO APPLICATION LINKEDIN_ADS_CONNECTOR; @@ -94,7 +104,8 @@ GRANT READ ON SECRET AIRBYTE_APP_SECRET TO APPLICATION LINKEDIN_ADS_CONNECTOR; ``` ### Granting Account Privileges -Once you have completed the prerequisite SQL setup steps, you will need to grant privileges to allow the application to create databases, create warehouses, and execute tasks. + +Once you have completed the prerequisite SQL setup steps, you will need to grant privileges to allow the application to create databases, create warehouses, and execute tasks. All of these privileges are required for the application to extract data into Snowflake database successfully. 1. Start by going in the `Apps` section and selecting `LinkedIn Ads Connector`. You will have to accept the Anaconda terms in order to use Streamlit. @@ -112,13 +123,14 @@ All of these privileges are required for the application to extract data into Sn You are now ready to begin syncing your data. ## Configuring a Connection + Navigate back to the application by clicking `STREAMLIT` in the top left corner. Select `New Connection` and fill the following fields: ---- +--- `start_date` -UTC date in the format YYYY-MM-DD (e.g. 2020-09-17). Any data before this date will not be replicated. +UTC date in the format YYYY-MM-DD (e.g. 2020-09-17). Any data before this date will not be replicated. --- @@ -136,15 +148,15 @@ The database where the records will be saved. Snowflake's database [naming conve `Output Schema` -The table where the schema will be saved. Snowflake's table [naming convention](https://docs.snowflake.com/en/sql-reference/identifiers-syntax) applies here. +The table where the schema will be saved. Snowflake's table [naming convention](https://docs.snowflake.com/en/sql-reference/identifiers-syntax) applies here. ---- +--- `Connection Name` How the connection will be referred in the Streamlit app. 
---- +--- `Replication Frequency` @@ -153,13 +165,16 @@ The sync schedule that determines how often your data will be synced to the targ --- ## Enabling Logging and Event Sharing for an Application + Sharing the logging and telemetry data of your installed application helps us improve the application and can allow us to better triage problems that your run into. To configure your application for logging and telemetry data please refer to the documentation for [Enabling Logging and Event Sharing](event-sharing.md). ## Syncing Your LinkedIn Ads Data + Once a connection is configured, go in `Connections List` to view all of your connections. From here for each connection you can view the configuration settings, start a sync, and view the prior sync history. ### Scheduled Syncs + While creating a connection, you can specify a "Replication Frequency" which will dictate how often your data will be extracted from LinkedIn Ads and loaded into your Snowflake database. This process is started automatically according to your schedule and does not require that you manually trigger syncs. For example, if you create a connection at 10:15 AM and set your replication frequency to @@ -167,25 +182,30 @@ hourly, then a sync will be started immediately. The next sync will start at 11: time. In the event that your sync runs longer than one hour, a new sync will start at the next available time. ### Manual Syncs + In addition to scheduled syncs, you can also configure a connection to only sync data on-demand by setting "Replication Frequency" to `MANUAL`. After creating a connection, from the `Connections List` page, you can use the "Sync Now" button to trigger a sync of your API data to your Snowflake database. You can also use this button to manually trigger connections that sync according to a -schedule. If there is already a sync in progress, this button will be disabled. +schedule. If there is already a sync in progress, this button will be disabled. ### Sync History + From the `Connections List` page, you can view information about past syncs for each connection to determine when your data is done syncing and whether the operation was successful. Once the sync is completed successfully, you should be able to validate that the records have been stored in `.`. ## Supported Streams + As of now, all supported streams perform a full refresh. Incremental syncs are not yet supported. Here are the list of supported streams: -* Accounts -* Account Users -* Ad Analytics by Campaign -* Ad Analytics by Creative -* Campaigns -* Campaign Groups -* Creatives + +- Accounts +- Account Users +- Ad Analytics by Campaign +- Ad Analytics by Creative +- Campaigns +- Campaign Groups +- Creatives # Contact Us + snowflake-native-apps@airbyte.io diff --git a/docs/terraform-documentation.md b/docs/terraform-documentation.md index dc4b4058884..94bee1f3235 100644 --- a/docs/terraform-documentation.md +++ b/docs/terraform-documentation.md @@ -4,10 +4,10 @@ products: all # Terraform Documentation -Airbyte's Terraform provider enables you to automate & version-control your Airbyte configuration as code. Save time managing Airbyte and collaborate on Airbyte configuration changes with your teammates. Airbyte's Terraform provider is built off our [Airbyte API](https://api.airbyte.com). +Airbyte's Terraform provider enables you to automate & version-control your Airbyte configuration as code. Save time managing Airbyte and collaborate on Airbyte configuration changes with your teammates. 
Airbyte's Terraform provider is built off our [Airbyte API](https://api.airbyte.com). -The Terraform provider is available for users on Airbyte Cloud, OSS & Self-Managed Enterprise. +The Terraform provider is available for users on Airbyte Cloud, OSS & Self-Managed Enterprise. Check out our guide for [getting started with Airbyte's Terraform provider](https://reference.airbyte.com/reference/using-the-terraform-provider). -Additionally, you can find examples of data stacks using the Terraform provider in our [quickstarts repository](https://github.com/airbytehq/quickstarts). +Additionally, you can find examples of data stacks using the Terraform provider in our [quickstarts repository](https://github.com/airbytehq/quickstarts). diff --git a/docs/understanding-airbyte/README.md b/docs/understanding-airbyte/README.md index 19657b56c7e..2e87f0a7a1a 100644 --- a/docs/understanding-airbyte/README.md +++ b/docs/understanding-airbyte/README.md @@ -1,2 +1 @@ # Understanding Airbyte - diff --git a/docs/understanding-airbyte/airbyte-protocol-docker.md b/docs/understanding-airbyte/airbyte-protocol-docker.md index 8b630f2c7aa..68f317357c7 100644 --- a/docs/understanding-airbyte/airbyte-protocol-docker.md +++ b/docs/understanding-airbyte/airbyte-protocol-docker.md @@ -1,8 +1,8 @@ # Airbyte Protocol Docker Interface ## Summary -The [Airbyte Protocol](airbyte-protocol.md) describes a series of structs and interfaces for building data pipelines. The Protocol article describes those interfaces in language agnostic pseudocode, this article transcribes those into docker commands. Airbyte's implementation of the protocol is all done in docker. Thus, this reference is helpful for getting a more concrete look at how the Protocol is used. It can also be used as a reference for interacting with Airbyte's implementation of the Protocol. +The [Airbyte Protocol](airbyte-protocol.md) describes a series of structs and interfaces for building data pipelines. The Protocol article describes those interfaces in language agnostic pseudocode, this article transcribes those into docker commands. Airbyte's implementation of the protocol is all done in docker. Thus, this reference is helpful for getting a more concrete look at how the Protocol is used. It can also be used as a reference for interacting with Airbyte's implementation of the Protocol. ## Source @@ -16,6 +16,7 @@ read(Config, ConfiguredAirbyteCatalog, State) -> Stream spec docker run --rm -i check --config @@ -28,6 +29,7 @@ The `read` command will emit a stream records to STDOUT. ## Destination ### Pseudocode: + ``` spec() -> ConnectorSpecification check(Config) -> AirbyteConnectionStatus @@ -35,6 +37,7 @@ write(Config, AirbyteCatalog, Stream(stdin)) -> Stream spec docker run --rm -i check --config @@ -44,6 +47,7 @@ cat <&0 | docker run --rm -i write --config ..` scheme for the Protocol Versioning. (see [SemVer](https://semver.org/)). We increment the -* MAJOR version when you make incompatible protocol changes -* MINOR version when you add functionality in a backwards compatible manner -* PATCH version when you make backwards compatible bug fixes + +- MAJOR version when you make incompatible protocol changes +- MINOR version when you add functionality in a backwards compatible manner +- PATCH version when you make backwards compatible bug fixes ## Development Guidelines 1. We will continue to do our best effort to avoid introducing breaking changes to the Airbyte Protocol. 2. 
When introducing a new minor version of the Airbyte Protocol, new fields must come with sensible defaults for backward compatibility within the same major version, or be entirely optional. -3. When introducing a new major version of the Airbyte Protocol, all connectors from the previous major version will continue to work. This requires the ability to “translate” messages between 1 major version of the Airbyte Protocol. +3. When introducing a new major version of the Airbyte Protocol, all connectors from the previous major version will continue to work. This requires the ability to “translate” messages between 1 major version of the Airbyte Protocol. ## Safeguards @@ -35,4 +36,4 @@ If any connector fails this check, we abort the upgrade and the `airbyte-bootloa ### When upgrading a Connector -When upgrading a Connector from the UI, we will verify that the Protocol Version is supported before finalizing the Connector upgrade. \ No newline at end of file +When upgrading a Connector from the UI, we will verify that the Protocol Version is supported before finalizing the Connector upgrade. diff --git a/docs/understanding-airbyte/airbyte-protocol.md b/docs/understanding-airbyte/airbyte-protocol.md index 19f59a160b2..3c3120113b7 100644 --- a/docs/understanding-airbyte/airbyte-protocol.md +++ b/docs/understanding-airbyte/airbyte-protocol.md @@ -27,7 +27,7 @@ Each of these concepts is described in greater depth in their respective section The Airbyte Protocol is versioned independently of the Airbyte Platform, and the version number is used to determine the compatibility between connectors and the Airbyte Platform. | Version | Date of Change | Pull Request(s) | Subject | -|:---------|:---------------|:--------------------------------------------------------------------------------------------------------------------------|:----------------------------------------------------------------------------------| +| :------- | :------------- | :------------------------------------------------------------------------------------------------------------------------ | :-------------------------------------------------------------------------------- | | `v0.5.2` | 2023-12-26 | [58](https://github.com/airbytehq/airbyte-protocol/pull/58) | Remove unused V1. | | `v0.5.1` | 2023-04-12 | [53](https://github.com/airbytehq/airbyte-protocol/pull/53) | Modify various helper libraries. | | `v0.5.0` | 2023-11-13 | [49](https://github.com/airbytehq/airbyte-protocol/pull/49) | `AirbyteStateStatsMessage` added. | @@ -109,9 +109,10 @@ check(Config) -> AirbyteConnectionStatus ``` The `check` command validates that, given a configuration, that the Actor is able to connect and access all resources that it needs in order to operate. e.g. Given some Postgres credentials, it determines whether it can connect to the Postgres database. The output will be as follows: -- If it can, the `check` command will return a success response. + +- If it can, the `check` command will return a success response. - If `check` fails because of a configuration issue (perhaps the password is incorrect), it will return a failed response and (when possible) a helpful error message. A failed response will be considered as a config error, i.e. user error. Outputting a trace message detailing the config error is optional, but allows for more detailed debugging of the error. -- If it fails because of a connector issue, the `check` command should output a trace message detailing the failure. 
It is not expected to receive an `AirbyteConnectionStatus` in this failure case. +- If it fails because of a connector issue, the `check` command should output a trace message detailing the failure. It is not expected to receive an `AirbyteConnectionStatus` in this failure case. If an actor's `check` command succeeds, it is expected that all subsequent methods in the sync will also succeed. @@ -494,6 +495,7 @@ The normal success case (T3, not depicted) would be that all the records would m -- [link](https://whimsical.com/state-TYX5bSCVtVF4BU1JbUwfpZ) to source image ### State Types + In addition to allowing a Source to checkpoint data replication, the state object allows for the ability to configure and reset streams in isolation from each other. For example, if adding or removing a stream, it is possible to do so without affecting the state of any other stream in the Source. There are 3 types of state: Stream, Global, and Legacy. @@ -515,6 +517,7 @@ This table breaks down attributes of these state types. - **Single state message describes full state for Source** means that any state message contains the full state information for a Source. Stream does not meet this condition because each state message is scoped by stream. This means that in order to build a full picture of the state for the Source, the state messages for each configured stream must be gathered. ### State Principles + The following are principles Airbyte recommends Sources/Destinations adhere to with State. Airbyte enforces these principles via our CDK. These principles are intended to produce simple overall system behavior, and move Airbyte towards a world of shorter-lived jobs. The goal is reliable data movement with minimal data loss windows on errors. @@ -527,6 +530,7 @@ These principles are intended to produce simple overall system behavior, and mov This simplifies how the Platform treats jobs and means all Syncs are resumable. This also enables checkpointing on full refreshes in the future. This rule does not apply to Sources that do not support cursors. However: + 1. If the source stream has no records, an empty state should still be emitted. This supports state-based counts/checksums. It is recommended for the emitted state to have unique and non-null content. 2. If the stream is unsorted, and therefore non-resumable, it is recommended to still send a state message, even with bogus resumability, to indicate progress in the sync. @@ -544,11 +548,10 @@ These principles are intended to produce simple overall system behavior, and mov 6. **Destinations return state in the order it was received.** - Order is used by the Platform to determine if a State message was dropped. Out-of-order State messages throw errors, as do skipped state messages. Every state message the destination recieved must be returned back to the platform, in order. + Order is used by the Platform to determine if a State message was dropped. Out-of-order State messages throw errors, as do skipped state messages. Every state message the destination received must be returned back to the platform, in order. Order-ness is determined by the type of State message. Per-stream state messages require order per-stream. Global state messages require global ordering.
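To make the state-handling principles above concrete, here is a minimal sketch in Python of a per-stream state message and the destination-side echo. The stream name, namespace, and cursor value are illustrative placeholders, not part of the protocol itself:

```python
import copy
import json

# Illustrative STREAM-scoped state message; the `stream_state` payload is
# connector-specific and opaque to the platform.
source_state_message = {
    "type": "STATE",
    "state": {
        "type": "STREAM",
        "stream": {
            "stream_descriptor": {"name": "users", "namespace": "public"},
            "stream_state": {"updated_at": "2024-05-07T00:00:00Z"},
        },
    },
}


def destination_ack(state_message: dict) -> None:
    # After committing every record received before this state message,
    # the destination emits the same message back, unchanged and in order.
    print(json.dumps(state_message))


destination_ack(copy.deepcopy(source_state_message))
```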
- ## Messages ### Common diff --git a/docs/understanding-airbyte/beginners-guide-to-catalog.md b/docs/understanding-airbyte/beginners-guide-to-catalog.md index 1953b1681c8..1ebc825b711 100644 --- a/docs/understanding-airbyte/beginners-guide-to-catalog.md +++ b/docs/understanding-airbyte/beginners-guide-to-catalog.md @@ -10,11 +10,11 @@ The goal of the `AirbyteCatalog` is to describe _what_ data is available in a so This article will illustrate how to use `AirbyteCatalog` via a series of examples. We recommend reading the [Database Example](#database-example) first. The other examples, will refer to knowledge described in that section. After that, jump around to whichever example is most pertinent to your inquiry. -* [Postgres Example](#database-example) -* [API Example](#api-examples) - * [Static Streams Example](#static-streams-example) - * [Dynamic Streams Example](#dynamic-streams-example) -* [Nested Schema Example](#nested-schema-example) +- [Postgres Example](#database-example) +- [API Example](#api-examples) + - [Static Streams Example](#static-streams-example) + - [Dynamic Streams Example](#dynamic-streams-example) +- [Nested Schema Example](#nested-schema-example) In order to understand in depth how to configure incremental data replication, head over to the [incremental replication docs](/using-airbyte/core-concepts/sync-modes/incremental-append.md). @@ -91,10 +91,10 @@ The catalog is structured as a list of `AirbyteStream`. In the case of a databas Let's walk through what each field in a stream means. -* `name` - The name of the stream. -* `supported_sync_modes` - This field lists the type of data replication that this source supports. The possible values in this array include `FULL_REFRESH` \([docs](/using-airbyte/core-concepts/sync-modes/full-refresh-overwrite.md)\) and `INCREMENTAL` \([docs](/using-airbyte/core-concepts/sync-modes/incremental-append.md)\). -* `source_defined_cursor` - If the stream supports `INCREMENTAL` replication, then this field signals whether the source can figure out how to detect new records on its own or not. -* `json_schema` - This field is a [JsonSchema](https://json-schema.org/understanding-json-schema) object that describes the structure of the data. Notice that each key in the `properties` object corresponds to a column name in our database table. +- `name` - The name of the stream. +- `supported_sync_modes` - This field lists the type of data replication that this source supports. The possible values in this array include `FULL_REFRESH` \([docs](/using-airbyte/core-concepts/sync-modes/full-refresh-overwrite.md)\) and `INCREMENTAL` \([docs](/using-airbyte/core-concepts/sync-modes/incremental-append.md)\). +- `source_defined_cursor` - If the stream supports `INCREMENTAL` replication, then this field signals whether the source can figure out how to detect new records on its own or not. +- `json_schema` - This field is a [JsonSchema](https://json-schema.org/understanding-json-schema) object that describes the structure of the data. Notice that each key in the `properties` object corresponds to a column name in our database table. Now we understand _what_ data is available from this source. Next we will configure _how_ we want to replicate that data. 
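As a rough sketch (illustrative only, using a hypothetical `users` table with `id` and `name` columns), the discovered catalog for such a source could be represented like this:

```python
import json

# Sketch of an AirbyteCatalog with a single stream; field values are
# placeholders and sync-mode spellings follow the lowercase JSON form.
catalog = {
    "streams": [
        {
            "name": "users",
            "supported_sync_modes": ["full_refresh", "incremental"],
            "source_defined_cursor": False,
            "json_schema": {
                "type": "object",
                "properties": {
                    "id": {"type": "number"},
                    "name": {"type": "string"},
                },
            },
        }
    ]
}

print(json.dumps(catalog, indent=2))
```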
@@ -135,9 +135,9 @@ Just as with the `AirbyteCatalog` the `ConfiguredAirbyteCatalog` contains a list Let's walk through each field in the `ConfiguredAirbyteStream`: -* `sync_mode` - This field must be one of the values that was in `supported_sync_modes` in the `AirbyteStream` - Configures which sync mode will be used when data is replicated. -* `stream` - Hopefully this one looks familiar! This field contains an `AirbyteStream`. It should be _identical_ to the one we saw in the `AirbyteCatalog`. -* `cursor_field` - When `sync_mode` is `INCREMENTAL` and `source_defined_cursor = false`, this field configures which field in the stream will be used to determine if a record should be replicated or not. Read more about this concept in our [documentation of incremental replication](/using-airbyte/core-concepts/sync-modes/incremental-append.md). +- `sync_mode` - This field must be one of the values that was in `supported_sync_modes` in the `AirbyteStream` - Configures which sync mode will be used when data is replicated. +- `stream` - Hopefully this one looks familiar! This field contains an `AirbyteStream`. It should be _identical_ to the one we saw in the `AirbyteCatalog`. +- `cursor_field` - When `sync_mode` is `INCREMENTAL` and `source_defined_cursor = false`, this field configures which field in the stream will be used to determine if a record should be replicated or not. Read more about this concept in our [documentation of incremental replication](/using-airbyte/core-concepts/sync-modes/incremental-append.md). ### Summary of the Postgres Example @@ -324,4 +324,3 @@ The `AirbyteCatalog` would look like this: ``` Because Airbyte uses JsonSchema to model the schema of streams, it is able to handle arbitrary nesting of data in a way that a table / column based model cannot. - diff --git a/docs/understanding-airbyte/cdc.md b/docs/understanding-airbyte/cdc.md index 92a9629174e..13e460b9256 100644 --- a/docs/understanding-airbyte/cdc.md +++ b/docs/understanding-airbyte/cdc.md @@ -12,37 +12,35 @@ A single sync might have some tables configured for Full Refresh replication and The Airbyte Protocol outputs records from sources. Records from `UPDATE` statements appear the same way as records from `INSERT` statements. We support different options for how to sync this data into destinations using primary keys, so you can choose to append this data, delete in place, etc. -We add some metadata columns for CDC sources which all begin with the `_ab_cdc_` prefix. The actual columns syced will vary per srouce, but might look like: +We add some metadata columns for CDC sources which all begin with the `_ab_cdc_` prefix. 
The actual columns synced will vary per source, but might look like: -* `_ab_cdc_lsn` of `_ab_cdc_cursor` the point in the log where the record was retrieved -* `_ab_cdc_log_file` & `_ab_cdc_log_pos` \(specific to mysql source\) is the file name and position in the file where the record was retrieved -* `_ab_cdc_updated_at` is the timestamp for the database transaction that resulted in this record change and is present for records from `DELETE`/`INSERT`/`UPDATE` statements -* `_ab_cdc_deleted_at` is the timestamp for the database transaction that resulted in this record change and is only present for records from `DELETE` statements +- `_ab_cdc_lsn` or `_ab_cdc_cursor` is the point in the log where the record was retrieved +- `_ab_cdc_log_file` & `_ab_cdc_log_pos` \(specific to mysql source\) is the file name and position in the file where the record was retrieved +- `_ab_cdc_updated_at` is the timestamp for the database transaction that resulted in this record change and is present for records from `DELETE`/`INSERT`/`UPDATE` statements +- `_ab_cdc_deleted_at` is the timestamp for the database transaction that resulted in this record change and is only present for records from `DELETE` statements ## Limitations -* CDC incremental is only supported for tables with primary keys for most sources. A CDC source can still choose to replicate tables without primary keys as Full Refresh or a non-CDC source can be configured for the same database to replicate the tables without primary keys using standard incremental replication. -* Data must be in tables, not views. -* The modifications you are trying to capture must be made using `DELETE`/`INSERT`/`UPDATE`. For example, changes made from `TRUNCATE`/`ALTER` won't appear in logs and therefore in your destination. -* There are database-specific limitations. See the documentation pages for individual connectors for more information. -* The records produced by `DELETE` statements only contain primary keys. All other data fields are unset. +- CDC incremental is only supported for tables with primary keys for most sources. A CDC source can still choose to replicate tables without primary keys as Full Refresh or a non-CDC source can be configured for the same database to replicate the tables without primary keys using standard incremental replication. +- Data must be in tables, not views. +- The modifications you are trying to capture must be made using `DELETE`/`INSERT`/`UPDATE`. For example, changes made from `TRUNCATE`/`ALTER` won't appear in logs and therefore in your destination. +- There are database-specific limitations. See the documentation pages for individual connectors for more information. +- The records produced by `DELETE` statements only contain primary keys. All other data fields are unset.
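For illustration, here is a small Python sketch (hypothetical table, hypothetical values) of how records produced by an `UPDATE` and a `DELETE` on the same row might look once the `_ab_cdc_` metadata columns are attached:

```python
# Hypothetical `users` row captured via CDC; all values are illustrative.
update_record = {
    "id": 42,
    "name": "Ada",
    "_ab_cdc_lsn": 23011872,
    "_ab_cdc_updated_at": "2024-05-07T08:19:33Z",
    "_ab_cdc_deleted_at": None,  # only populated for DELETEs
}

# A DELETE only carries the primary key; the other data fields are unset.
delete_record = {
    "id": 42,
    "name": None,
    "_ab_cdc_lsn": 23014096,
    "_ab_cdc_updated_at": "2024-05-07T09:00:00Z",
    "_ab_cdc_deleted_at": "2024-05-07T09:00:00Z",
}

print(update_record, delete_record, sep="\n")
```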
## Current Support -* [Postgres](../integrations/sources/postgres.md) \(For a quick video overview of CDC on Postgres, click [here](https://www.youtube.com/watch?v=NMODvLgZvuE&ab_channel=Airbyte)\) -* [MySQL](../integrations/sources/mysql.md) -* [Microsoft SQL Server / MSSQL](../integrations/sources/mssql.md) -* [MongoDB](../integrations/sources/mongodb-v2.md) - +- [Postgres](../integrations/sources/postgres.md) \(For a quick video overview of CDC on Postgres, click [here](https://www.youtube.com/watch?v=NMODvLgZvuE&ab_channel=Airbyte)\) +- [MySQL](../integrations/sources/mysql.md) +- [Microsoft SQL Server / MSSQL](../integrations/sources/mssql.md) +- [MongoDB](../integrations/sources/mongodb-v2.md) ## Coming Soon -* Oracle DB -* Please [create a ticket](https://github.com/airbytehq/airbyte/issues/new/choose) if you need CDC support on another database! +- Oracle DB +- Please [create a ticket](https://github.com/airbytehq/airbyte/issues/new/choose) if you need CDC support on another database! ## Additional information -* [An overview of Airbyte’s replication modes](https://airbyte.com/blog/understanding-data-replication-modes). -* [Understanding Change Data Capture (CDC): Definition, Methods and Benefits](https://airbyte.com/blog/change-data-capture-definition-methods-and-benefits) -* [Explore Airbyte's Change Data Capture (CDC) synchronization](https://airbyte.com/tutorials/incremental-change-data-capture-cdc-replication) - +- [An overview of Airbyte’s replication modes](https://airbyte.com/blog/understanding-data-replication-modes). +- [Understanding Change Data Capture (CDC): Definition, Methods and Benefits](https://airbyte.com/blog/change-data-capture-definition-methods-and-benefits) +- [Explore Airbyte's Change Data Capture (CDC) synchronization](https://airbyte.com/tutorials/incremental-change-data-capture-cdc-replication) diff --git a/docs/understanding-airbyte/database-data-catalog.md b/docs/understanding-airbyte/database-data-catalog.md index fa0b7dfd3dc..a9152131011 100644 --- a/docs/understanding-airbyte/database-data-catalog.md +++ b/docs/understanding-airbyte/database-data-catalog.md @@ -1,97 +1,99 @@ # Airbyte Databases Data Catalog ## Config Database -* `workspace` - * Each record represents a logical workspace for an Airbyte user. In the open-source version of the product, only one workspace is allowed. -* `actor_definition` - * Each record represents a connector that Airbyte supports, e.g. Postgres. This table represents all the connectors that is supported by the current running platform. - * The `actor_type` column tells us whether the record represents a Source or a Destination. - * The `spec` column is a JSON blob. The schema of this JSON blob matches the [spec](airbyte-protocol.md#actor-specification) model in the Airbyte Protocol. Because the protocol object is JSON, this has to be a JSON blob. - * The `support_level` describes the support level of the connector (e.g. community, certified). - * The `docker_repository` field is the name of the docker image associated with the connector definition. `docker_image_tag` is the tag of the docker image and the version of the connector definition. - * The `source_type` field is only used for Sources, and represents the category of the connector definition (e.g. API, Database). - * The `resource_requirements` field sets a default resource requirement for any connector of this type. This overrides the default we set for all connector definitions, and it can be overridden by a connection-specific resource requirement. 
The column is a JSON blob with the schema defined in [ActorDefinitionResourceRequirements.yaml](https://github.com/airbytehq/airbyte/blob/master/airbyte-config-oss/config-models-oss/src/main/resources/types/ActorDefinitionResourceRequirements.yaml) - * The `public` boolean column, describes if a connector is available to all workspaces or not. For non, `public` connector definitions, they can be provisioned to a workspace using the `actor_definition_workspace_grant` table. `custom` means that the connector is written by a user of the platform (and not packaged into the Airbyte product). - * Each record contains additional metadata and display data about a connector (e.g. `name` and `icon`), and we should add additional metadata here over time. -* `actor_definition_workspace_grant` - * Each record represents provisioning a non `public` connector definition to a workspace. - * todo (cgardens) - should this table have a `created_at` column? -* `actor` - * Each record represents a configured connector. e.g. A Postgres connector configured to pull data from my database. - * The `actor_type` column tells us whether the record represents a Source or a Destination. - * The `actor_definition_id` column is a foreign key to the connector definition that this record is implementing. - * The `configuration` column is a JSON blob. The schema of this JSON blob matches the schema specified in the `spec` column in the `connectionSpecification` field of the JSON blob. Keep in mind this schema is specific to each connector (e.g. the schema of Postgres and Salesforce are different), which is why this column has to be a JSON blob. -* `actor_catalog` - * Each record contains a catalog for an actor. The records in this table are meant to be immutable. - * The `catalog` column is a JSON blob. The schema of this JSON blob matches the [catalog](airbyte-protocol.md#catalog) model in the Airbyte Protocol. Because the protocol object is JSON, this has to be a JSON blob. The `catalog_hash` column is a 32-bit murmur3 hash ( x86 variant) of the `catalog` field to make comparisons easier. - * todo (cgardens) - should we remove the `modified_at` column? These records should be immutable. -* `actor_catalog_fetch_event` - * Each record represents an attempt to fetch the catalog for an actor. The records in this table are meant to be immutable. - * The `actor_id` column represents the actor that the catalog is being fetched for. The `config_hash` represents a hash (32-bit murmur3 hash - x86 variant) of the `configuration` column of that actor, at the time the attempt to fetch occurred. - * The `catalog_id` is a foreign key to the `actor_catalog` table. It represents the catalog fetched by this attempt. We use the foreign key, because the catalogs are often large and often multiple fetch events result in retrieving the same catalog. Also understanding how often the same catalog is fetched is interesting from a product analytics point of view. - * The `actor_version` column represents the `actor_definition` version that was in use when the fetch event happened. This column is needed, because while we can infer the `actor_definition` from the foreign key relationship with the `actor` table, we cannot do the same for the version, as that can change over time. - * todo (cgardens) - should we remove the `modified_at` column? These records should be immutable. -* `connection` - * Each record in this table configures a connection (`source_id`, `destination_id`, and relevant configuration). 
- * The `resource_requirements` field sets a default resource requirement for the connection. This overrides the default we set for all connector definitions and the default set for the connector definitions. The column is a JSON blob with the schema defined in [ResourceRequirements.yaml](https://github.com/airbytehq/airbyte/blob/master/airbyte-config-oss/config-models-oss/src/main/resources/types/ResourceRequirements.yaml). - * The `source_catalog_id` column is a foreign key that refers to `id` column in `actor_catalog` table and represents the catalog that was used to configure the connection. This should not be confused with the `catalog` column which contains the [ConfiguredCatalog](airbyte-protocol.md#catalog) for the connection. - * The `schedule_type` column defines what type of schedule is being used. If the `type` is manual, then `schedule_data` will be null. Otherwise, `schedule_data` column is a JSON blob with the schema of [StandardSync#scheduleData](https://github.com/airbytehq/airbyte/blob/master/airbyte-config-oss/config-models-oss/src/main/resources/types/StandardSync.yaml#L74) that defines the actual schedule. The columns `manual` and `schedule` are deprecated and should be ignored (they will be dropped soon). - * The `namespace_type` column configures whether the namespace for the connection should use that defined by the source, the destination, or a user-defined format (`custom`). If `custom` the `namespace_format` column defines the string that will be used as the namespace. - * The `status` column describes the activity level of the connector: `active` - current schedule is respected, `inactive` - current schedule is ignored (the connection does not run) but it could be switched back to active, and `deprecated` - the connection is permanently off (cannot be moved to active or inactive). -* `state` - * The `state` table represents the current (last) state for a connection. For a connection with `stream` state, there will be a record per stream. For a connection with `global` state, there will be a record per stream and an additional record to store the shared (global) state. For a connection with `legacy` state, there will be one record per connection. - * In the `stream` and `global` state cases, the `stream_name` and `namespace` columns contains the name of the stream whose state is represented by that record. For the shared state in global `stream_name` and `namespace` will be null. - * The `state` column contains the state JSON blob. Depending on the type of the connection, the schema of the blob will be different. - * `stream` - for this type, this column is a JSON blob that is a blackbox to the platform and known only to the connector that generated it. - * `global` - for this type, this column is a JSON blob that is a blackbox to the platform and known only to the connector that generated it. This is true for both the states for each stream and the shared state. - * `legacy` - for this type, this column is a JSON blob with a top-level key called `state`. Within that `state` is a blackbox to the platform and known only to the connector that generated it. - * The `type` column describes the type of the state of the row. type can be `STREAM`, `GLOBAL` or `LEGACY`. - * The connection_id is a foreign key to the connection for which we are tracking state. -* `stream_reset` - * Each record in this table represents a stream in a connection that is enqueued to be reset or is currently being reset. It can be thought of as a queue. 
Once the stream is reset, the record is removed from the table. -* `operation` - * The `operation` table transformations for a connection beyond the raw output produced by the destination. The two options are: `normalization`, which outputs Airbyte's basic normalization. The second is `dbt`, which allows a user to configure their own custom dbt transformation. A connection can have multiple operations (e.g. it can do `normalization` and `dbt`). - * If the `operation` is `dbt`, then the `operator_dbt` column will be populated with a JSON blob with the schema from [OperatorDbt](https://github.com/airbytehq/airbyte/blob/master/airbyte-config-oss/config-models-oss/src/main/resources/types/OperatorDbt.yaml). - * If the `operation` is `normalization`, then the `operator_dbt` column will be populated with a JSON blob with the scehma from [OperatorNormalization](https://github.com/airbytehq/airbyte/blob/master/airbyte-config-oss/config-models-oss/src/main/resources/types/OperatorNormalization.yaml). - * Operations are scoped by workspace, using the `workspace_id` column. -* `connection_operation` - * This table joins the `operation` table to the `connection` for which it is configured. -* `workspace_service_account` - * This table is a WIP for an unfinished feature. -* `actor_oauth_parameter` - * The name of this table is misleading. It refers to parameters to be used for any instance of an `actor_definition` (not an `actor`) within a given workspace. For OAuth, the model is that a user is provisioning access to their data to a third party tool (in this case the Airbyte Platform). Each record represents information (e.g. client id, client secret) for that third party that is getting access. - * These parameters can be scoped by workspace. If `workspace_id` is not present, then the scope of the parameters is to the whole deployment of the platform (e.g. all workspaces). - * The `actor_type` column tells us whether the record represents a Source or a Destination. - * The `configuration` column is a JSON blob. The schema of this JSON blob matches the schema specified in the `spec` column in the `advanced_auth` field of the JSON blob. Keep in mind this schema is specific to each connector (e.g. the schema of Hubspot and Salesforce are different), which is why this column has to be a JSON blob. -* `secrets` - * This table is used to store secrets in open-source versions of the platform that have not set some other secrets store. This table allows us to use the same code path for secrets handling regardless of whether an external secrets store is set or not. This table is used by default for the open-source product. -* `airbyte_configs_migrations` is metadata table used by Flyway (our database migration tool). It is not used for any application use cases. -* `airbyte_configs` - * Legacy table for config storage. Should be dropped. + +- `workspace` + - Each record represents a logical workspace for an Airbyte user. In the open-source version of the product, only one workspace is allowed. +- `actor_definition` + - Each record represents a connector that Airbyte supports, e.g. Postgres. This table represents all the connectors that is supported by the current running platform. + - The `actor_type` column tells us whether the record represents a Source or a Destination. + - The `spec` column is a JSON blob. The schema of this JSON blob matches the [spec](airbyte-protocol.md#actor-specification) model in the Airbyte Protocol. Because the protocol object is JSON, this has to be a JSON blob. 
+ - The `support_level` describes the support level of the connector (e.g. community, certified). + - The `docker_repository` field is the name of the docker image associated with the connector definition. `docker_image_tag` is the tag of the docker image and the version of the connector definition. + - The `source_type` field is only used for Sources, and represents the category of the connector definition (e.g. API, Database). + - The `resource_requirements` field sets a default resource requirement for any connector of this type. This overrides the default we set for all connector definitions, and it can be overridden by a connection-specific resource requirement. The column is a JSON blob with the schema defined in [ActorDefinitionResourceRequirements.yaml](https://github.com/airbytehq/airbyte/blob/master/airbyte-config-oss/config-models-oss/src/main/resources/types/ActorDefinitionResourceRequirements.yaml) + - The `public` boolean column, describes if a connector is available to all workspaces or not. For non, `public` connector definitions, they can be provisioned to a workspace using the `actor_definition_workspace_grant` table. `custom` means that the connector is written by a user of the platform (and not packaged into the Airbyte product). + - Each record contains additional metadata and display data about a connector (e.g. `name` and `icon`), and we should add additional metadata here over time. +- `actor_definition_workspace_grant` + - Each record represents provisioning a non `public` connector definition to a workspace. + - todo (cgardens) - should this table have a `created_at` column? +- `actor` + - Each record represents a configured connector. e.g. A Postgres connector configured to pull data from my database. + - The `actor_type` column tells us whether the record represents a Source or a Destination. + - The `actor_definition_id` column is a foreign key to the connector definition that this record is implementing. + - The `configuration` column is a JSON blob. The schema of this JSON blob matches the schema specified in the `spec` column in the `connectionSpecification` field of the JSON blob. Keep in mind this schema is specific to each connector (e.g. the schema of Postgres and Salesforce are different), which is why this column has to be a JSON blob. +- `actor_catalog` + - Each record contains a catalog for an actor. The records in this table are meant to be immutable. + - The `catalog` column is a JSON blob. The schema of this JSON blob matches the [catalog](airbyte-protocol.md#catalog) model in the Airbyte Protocol. Because the protocol object is JSON, this has to be a JSON blob. The `catalog_hash` column is a 32-bit murmur3 hash ( x86 variant) of the `catalog` field to make comparisons easier. + - todo (cgardens) - should we remove the `modified_at` column? These records should be immutable. +- `actor_catalog_fetch_event` + - Each record represents an attempt to fetch the catalog for an actor. The records in this table are meant to be immutable. + - The `actor_id` column represents the actor that the catalog is being fetched for. The `config_hash` represents a hash (32-bit murmur3 hash - x86 variant) of the `configuration` column of that actor, at the time the attempt to fetch occurred. + - The `catalog_id` is a foreign key to the `actor_catalog` table. It represents the catalog fetched by this attempt. We use the foreign key, because the catalogs are often large and often multiple fetch events result in retrieving the same catalog. 
Also understanding how often the same catalog is fetched is interesting from a product analytics point of view. + - The `actor_version` column represents the `actor_definition` version that was in use when the fetch event happened. This column is needed, because while we can infer the `actor_definition` from the foreign key relationship with the `actor` table, we cannot do the same for the version, as that can change over time. + - todo (cgardens) - should we remove the `modified_at` column? These records should be immutable. +- `connection` + - Each record in this table configures a connection (`source_id`, `destination_id`, and relevant configuration). + - The `resource_requirements` field sets a default resource requirement for the connection. This overrides the default we set for all connector definitions and the default set for the connector definitions. The column is a JSON blob with the schema defined in [ResourceRequirements.yaml](https://github.com/airbytehq/airbyte/blob/master/airbyte-config-oss/config-models-oss/src/main/resources/types/ResourceRequirements.yaml). + - The `source_catalog_id` column is a foreign key that refers to `id` column in `actor_catalog` table and represents the catalog that was used to configure the connection. This should not be confused with the `catalog` column which contains the [ConfiguredCatalog](airbyte-protocol.md#catalog) for the connection. + - The `schedule_type` column defines what type of schedule is being used. If the `type` is manual, then `schedule_data` will be null. Otherwise, `schedule_data` column is a JSON blob with the schema of [StandardSync#scheduleData](https://github.com/airbytehq/airbyte/blob/master/airbyte-config-oss/config-models-oss/src/main/resources/types/StandardSync.yaml#L74) that defines the actual schedule. The columns `manual` and `schedule` are deprecated and should be ignored (they will be dropped soon). + - The `namespace_type` column configures whether the namespace for the connection should use that defined by the source, the destination, or a user-defined format (`custom`). If `custom` the `namespace_format` column defines the string that will be used as the namespace. + - The `status` column describes the activity level of the connector: `active` - current schedule is respected, `inactive` - current schedule is ignored (the connection does not run) but it could be switched back to active, and `deprecated` - the connection is permanently off (cannot be moved to active or inactive). +- `state` + - The `state` table represents the current (last) state for a connection. For a connection with `stream` state, there will be a record per stream. For a connection with `global` state, there will be a record per stream and an additional record to store the shared (global) state. For a connection with `legacy` state, there will be one record per connection. + - In the `stream` and `global` state cases, the `stream_name` and `namespace` columns contains the name of the stream whose state is represented by that record. For the shared state in global `stream_name` and `namespace` will be null. + - The `state` column contains the state JSON blob. Depending on the type of the connection, the schema of the blob will be different. + - `stream` - for this type, this column is a JSON blob that is a blackbox to the platform and known only to the connector that generated it. + - `global` - for this type, this column is a JSON blob that is a blackbox to the platform and known only to the connector that generated it. 
This is true for both the states for each stream and the shared state. + - `legacy` - for this type, this column is a JSON blob with a top-level key called `state`. Within that `state` is a blackbox to the platform and known only to the connector that generated it. + - The `type` column describes the type of the state of the row. type can be `STREAM`, `GLOBAL` or `LEGACY`. + - The connection_id is a foreign key to the connection for which we are tracking state. +- `stream_reset` + - Each record in this table represents a stream in a connection that is enqueued to be reset or is currently being reset. It can be thought of as a queue. Once the stream is reset, the record is removed from the table. +- `operation` + - The `operation` table stores transformations for a connection beyond the raw output produced by the destination. The two options are: `normalization`, which outputs Airbyte's basic normalization. The second is `dbt`, which allows a user to configure their own custom dbt transformation. A connection can have multiple operations (e.g. it can do `normalization` and `dbt`). + - If the `operation` is `dbt`, then the `operator_dbt` column will be populated with a JSON blob with the schema from [OperatorDbt](https://github.com/airbytehq/airbyte/blob/master/airbyte-config-oss/config-models-oss/src/main/resources/types/OperatorDbt.yaml). + - If the `operation` is `normalization`, then the `operator_dbt` column will be populated with a JSON blob with the schema from [OperatorNormalization](https://github.com/airbytehq/airbyte/blob/master/airbyte-config-oss/config-models-oss/src/main/resources/types/OperatorNormalization.yaml). + - Operations are scoped by workspace, using the `workspace_id` column. +- `connection_operation` + - This table joins the `operation` table to the `connection` for which it is configured. +- `workspace_service_account` + - This table is a WIP for an unfinished feature. +- `actor_oauth_parameter` + - The name of this table is misleading. It refers to parameters to be used for any instance of an `actor_definition` (not an `actor`) within a given workspace. For OAuth, the model is that a user is provisioning access to their data to a third party tool (in this case the Airbyte Platform). Each record represents information (e.g. client id, client secret) for that third party that is getting access. + - These parameters can be scoped by workspace. If `workspace_id` is not present, then the scope of the parameters is to the whole deployment of the platform (e.g. all workspaces). + - The `actor_type` column tells us whether the record represents a Source or a Destination. + - The `configuration` column is a JSON blob. The schema of this JSON blob matches the schema specified in the `spec` column in the `advanced_auth` field of the JSON blob. Keep in mind this schema is specific to each connector (e.g. the schema of Hubspot and Salesforce are different), which is why this column has to be a JSON blob. +- `secrets` + - This table is used to store secrets in open-source versions of the platform that have not set some other secrets store. This table allows us to use the same code path for secrets handling regardless of whether an external secrets store is set or not. This table is used by default for the open-source product. +- `airbyte_configs_migrations` is a metadata table used by Flyway (our database migration tool). It is not used for any application use cases. +- `airbyte_configs` + - Legacy table for config storage. Should be dropped.
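For readers exploring a local deployment, the sketch below shows one way to query the tables described above. It assumes a default open-source Postgres config database and uses only columns mentioned in this catalog; credentials and names are illustrative and should be adjusted to your deployment:

```python
import psycopg2  # assumes a reachable Postgres config database

# Connection details are illustrative; use your deployment's values.
conn = psycopg2.connect(
    host="localhost", port=5432, dbname="airbyte", user="docker", password="docker"
)
with conn, conn.cursor() as cur:
    # List connections that are currently `active`, per the `status` column above.
    cur.execute(
        "SELECT id, source_id, destination_id, schedule_type "
        "FROM connection WHERE status = 'active';"
    )
    for row in cur.fetchall():
        print(row)
```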
## Jobs Database -* `jobs` - * Each record in this table represents a job. - * The `config_type` column captures the type of job. We only make jobs for `sync` and `reset` (we do not use them for `spec`, `check`, `discover`). - * A job represents an attempt to use a connector (or a pair of connectors). The goal of this model is to capture the input of that run. A job can have multiple attempts (see the `attempts` table). The guarantee across all attempts is that the input into each attempt will be the same. - * That input is captured in the `config` column. This column is a JSON Blob with the schema of a [JobConfig](https://github.com/airbytehq/airbyte/blob/master/airbyte-config-oss/config-models-oss/src/main/resources/types/JobConfig.yaml). Only `sync` and `resetConnection` are ever used in that model. - * The other top-level fields are vestigial from when `spec`, `check`, `discover` were used in this model (we will eventually remove them). - * The `scope` column contains the `connection_id` for the relevant connection of the job. - * Context: It is called `scope` and not `connection_id`, because, this table was originally used for `spec`, `check`, and `discover`, and in those cases the `scope` referred to the relevant actor or actor definition. At this point the scope is always a `connection_id`. - * The `status` column contains the job status. The lifecycle of a job is explained in detail in the [Jobs & Workers documentation](jobs.md#job-state-machine). -* `attempts` - * Each record in this table represents an attempt. - * Each attempt belongs to a job--this is captured by the `job_id` column. All attempts for a job will run on the same input. - * The `id` column is a unique id across all attempts while the `attempt_number` is an ascending number of the attempts for a job. - * The output of each attempt, however, can be different. The `output` column is a JSON blob with the schema of a [JobOutput](ahttps://github.com/airbytehq/airbyte/blob/master/airbyte-config-oss/config-models-oss/src/main/resources/types/StandardSyncOutput.yaml). Only `sync` is used in that model. Reset jobs will also use the `sync` field, because under the hood `reset` jobs end up just doing a `sync` with special inputs. This object contains all the output info for a sync including stats on how much data was moved. - * The other top-level fields are vestigial from when `spec`, `check`, `discover` were used in this model (we will eventually remove them). - * The `status` column contains the attempt status. The lifecycle of a job / attempt is explained in detail in the [Jobs & Workers documentation](jobs.md#job-state-machine). - * If the attempt fails, the `failure_summary` column will be populated. The column is a JSON blob with the schema of [AttemptFailureReason](https://github.com/airbytehq/airbyte/blob/master/airbyte-config-oss/config-models-oss/src/main/resources/types/AttemptFailureSummary.yaml). - * The `log_path` column captures where logs for the attempt will be written. - * `created_at`, `started_at`, and `ended_at` track the run time. - * The `temporal_workflow_id` column keeps track of what temporal execution is associated with the attempt. -* `airbyte_metadata` - * This table is a key-value store for various metadata about the platform. It is used to track information about what version the platform is currently on as well as tracking the upgrade history. - * Logically it does not make a lot of sense that it is in the jobs db. It would make sense if it were either in its own dbs or in the config dbs. 
- * The only two columns are `key` and `value`. It is truly just a key-value store. -* `airbyte_jobs_migrations` is metadata table used by Flyway (our database migration tool). It is not used for any application use cases. + +- `jobs` + - Each record in this table represents a job. + - The `config_type` column captures the type of job. We only make jobs for `sync` and `reset` (we do not use them for `spec`, `check`, `discover`). + - A job represents an attempt to use a connector (or a pair of connectors). The goal of this model is to capture the input of that run. A job can have multiple attempts (see the `attempts` table). The guarantee across all attempts is that the input into each attempt will be the same. + - That input is captured in the `config` column. This column is a JSON Blob with the schema of a [JobConfig](https://github.com/airbytehq/airbyte/blob/master/airbyte-config-oss/config-models-oss/src/main/resources/types/JobConfig.yaml). Only `sync` and `resetConnection` are ever used in that model. + - The other top-level fields are vestigial from when `spec`, `check`, `discover` were used in this model (we will eventually remove them). + - The `scope` column contains the `connection_id` for the relevant connection of the job. + - Context: It is called `scope` and not `connection_id`, because, this table was originally used for `spec`, `check`, and `discover`, and in those cases the `scope` referred to the relevant actor or actor definition. At this point the scope is always a `connection_id`. + - The `status` column contains the job status. The lifecycle of a job is explained in detail in the [Jobs & Workers documentation](jobs.md#job-state-machine). +- `attempts` + - Each record in this table represents an attempt. + - Each attempt belongs to a job--this is captured by the `job_id` column. All attempts for a job will run on the same input. + - The `id` column is a unique id across all attempts while the `attempt_number` is an ascending number of the attempts for a job. + - The output of each attempt, however, can be different. The `output` column is a JSON blob with the schema of a [JobOutput](ahttps://github.com/airbytehq/airbyte/blob/master/airbyte-config-oss/config-models-oss/src/main/resources/types/StandardSyncOutput.yaml). Only `sync` is used in that model. Reset jobs will also use the `sync` field, because under the hood `reset` jobs end up just doing a `sync` with special inputs. This object contains all the output info for a sync including stats on how much data was moved. + - The other top-level fields are vestigial from when `spec`, `check`, `discover` were used in this model (we will eventually remove them). + - The `status` column contains the attempt status. The lifecycle of a job / attempt is explained in detail in the [Jobs & Workers documentation](jobs.md#job-state-machine). + - If the attempt fails, the `failure_summary` column will be populated. The column is a JSON blob with the schema of [AttemptFailureReason](https://github.com/airbytehq/airbyte/blob/master/airbyte-config-oss/config-models-oss/src/main/resources/types/AttemptFailureSummary.yaml). + - The `log_path` column captures where logs for the attempt will be written. + - `created_at`, `started_at`, and `ended_at` track the run time. + - The `temporal_workflow_id` column keeps track of what temporal execution is associated with the attempt. +- `airbyte_metadata` + - This table is a key-value store for various metadata about the platform. 
It is used to track information about what version the platform is currently on as well as tracking the upgrade history. + - Logically it does not make a lot of sense that it is in the jobs db. It would make sense if it were either in its own dbs or in the config dbs. + - The only two columns are `key` and `value`. It is truly just a key-value store. +- `airbyte_jobs_migrations` is metadata table used by Flyway (our database migration tool). It is not used for any application use cases. diff --git a/docs/understanding-airbyte/heartbeats.md b/docs/understanding-airbyte/heartbeats.md index ce2f3499f5a..159b17e0927 100644 --- a/docs/understanding-airbyte/heartbeats.md +++ b/docs/understanding-airbyte/heartbeats.md @@ -13,6 +13,7 @@ In these cases, Airbyte takes the more conservative approach. Airbyte restarts t ## Known Heartbeat Error Causes Possible reasons for a heartbeat error: + 1. Certain API sources take an unknown amount of time to generate asynchronous responses (e.g., Salesforce, Facebook, Amplitude). No workaround currently exists. 2. Certain API sources can be rate-limited for a time period longer than their configured threshold. Although Airbyte tries its best to handle this on a per-connector basis, rate limits are not always predictable. 3. Database sources can be slow to respond to a query. This can be due to a variety of reasons, including the size of the database, the complexity of the query, and the number of other queries being made to the database at the same time. @@ -21,33 +22,38 @@ Possible reasons for a heartbeat error: 1. The most common reason we see here is destination resource availability vis-a-vis data volumes. In general, -* **Database Sources and Destination errors are extremely rare**. Any issues are likely to be indicative of actual issues and need to be investigated. -* **API Sources errors are uncommon but not unexpected**. This is especially true if an API source generates asynchronous responses or has rate limits. + +- **Database Sources and Destination errors are extremely rare**. Any issues are likely to be indicative of actual issues and need to be investigated. +- **API Sources errors are uncommon but not unexpected**. This is especially true if an API source generates asynchronous responses or has rate limits. ## Airbyte Cloud -Airbyte Cloud has identical heartbeat monitoring and alerting as Airbyte Open Source. + +Airbyte Cloud has identical heartbeat monitoring and alerting as Airbyte Open Source. If these issues show up on Airbyte Cloud, + 1. Please read [Known Causes](#known-causes). In many cases, the issue is with the source, the destination or the connection set up, and not with Airbyte. 2. Reach out to Airbyte Support for help. ## Technical Details ### Source + #### Heartbeating logic The platform considers both `RECORD` and `STATE` messages emitted by the source as source heartbeats. The Airbyte platform has a process which monitors when the last beat was send and if it reaches a threshold, -the synchronization attempt will be failed. It fails with a cause being the source an message saying +the synchronization attempt will be failed. It fails with a cause being the source an message saying `The source is unresponsive`. Internal the error has a heartbeat timeout type, which is not display in the UI. #### Configuration The heartbeat can be configured using the file flags.yaml through 2 entries: -* `hseartbeat-max-seconds-between-messages`: this configures the maximum time allowed between 2 messages. -The default is 3 hours. 
-* `heartbeat.failSync`: Setting this to true will make the syncs to fail if a missed heartbeat is detected. -If false no sync will be failed because of a missed heartbeat. The default value is true. + +- `heartbeat-max-seconds-between-messages`: this configures the maximum time allowed between 2 messages. + The default is 3 hours. +- `heartbeat.failSync`: Setting this to true will make syncs fail if a missed heartbeat is detected. + If false, no sync will be failed because of a missed heartbeat. The default value is true. ### Destination @@ -56,6 +62,8 @@ If false no sync will be failed because of a missed heartbeat. The default value Adding a heartbeat to the destination similar to the one at the source is not straightforward since there isn't a constant stream of messages from the destination to the platform. Instead, we have implemented something that is more akin to a timeout. The platform monitors whether there has been a call to the destination that has taken more than a specified amount of time. If such a delay occurs, the platform considers the destination to have timed out. #### Configuration + The timeout can be configured using the file `flags.yaml` through 2 entries: -* `destination-timeout-max-seconds`: If the platform detects a call to the destination exceeding the duration specified in this entry, it will consider the destination to have timed out. The default timeout value is 24 hours. -* `destination-timeout.failSync`: If enabled (true by default), a detected destination timeout will cause the platform to fail the sync. If not, the platform will log a message and allow the sync to continue. When the platform fails a sync due to a destination timeout, the UI will display the message: `The destination is unresponsive`. + +- `destination-timeout-max-seconds`: If the platform detects a call to the destination exceeding the duration specified in this entry, it will consider the destination to have timed out. The default timeout value is 24 hours. +- `destination-timeout.failSync`: If enabled (true by default), a detected destination timeout will cause the platform to fail the sync. If not, the platform will log a message and allow the sync to continue. When the platform fails a sync due to a destination timeout, the UI will display the message: `The destination is unresponsive`. diff --git a/docs/understanding-airbyte/high-level-view.md b/docs/understanding-airbyte/high-level-view.md index 19cb5291da7..2d5ae4abe8a 100644 --- a/docs/understanding-airbyte/high-level-view.md +++ b/docs/understanding-airbyte/high-level-view.md @@ -4,13 +4,14 @@ description: A high level view of Airbyte's components. # Architecture overview -Airbyte is conceptually composed of two parts: platform and connectors. +Airbyte is conceptually composed of two parts: platform and connectors. -The platform provides all the horizontal services required to configure and run data movement operations e.g: the UI, configuration API, job scheduling, logging, alerting, etc. and is structured as a set of microservices. +The platform provides all the horizontal services required to configure and run data movement operations e.g: the UI, configuration API, job scheduling, logging, alerting, etc. and is structured as a set of microservices. -Connectors are independent modules which push/pull data to/from sources and destinations.
Connectors are built in accordance with the [Airbyte Specification](./airbyte-protocol.md), which describes the interface with which data can be moved between a source and a destination using Airbyte. Connectors are packaged as Docker images, which allows total flexibility over the technologies used to implement them.
+Connectors are independent modules which push/pull data to/from sources and destinations. Connectors are built in accordance with the [Airbyte Specification](./airbyte-protocol.md), which describes the interface with which data can be moved between a source and a destination using Airbyte. Connectors are packaged as Docker images, which allows total flexibility over the technologies used to implement them.
A more concrete diagram can be seen below:
+
```mermaid
---
title: Architecture Overview
@@ -32,14 +33,15 @@ flowchart LR
W2 -->|launches| Destination
```
-* **Web App/UI** [`airbyte-webapp`, `airbyte-proxy`]: An easy-to-use graphical interface for interacting with the Airbyte API.
-* **Server/Config API** [`airbyte-server`, `airbyte-server-api`]: Handles connection between UI and API. Airbyte's main control plane. All operations in Airbyte such as creating sources, destinations, connections, managing configurations, etc.. are configured and invoked from the API.
-* **Database Config & Jobs** [`airbyte-db`]: Stores all the connections information \(credentials, frequency...\).
-* **Temporal Service** [`airbyte-temporal`]: Manages the task queue and workflows.
-* **Worker** [`airbyte-worker`]: The worker connects to a source connector, pulls the data and writes it to a destination.
+- **Web App/UI** [`airbyte-webapp`, `airbyte-proxy`]: An easy-to-use graphical interface for interacting with the Airbyte API.
+- **Server/Config API** [`airbyte-server`, `airbyte-server-api`]: Handles the connection between the UI and the API. Airbyte's main control plane. All operations in Airbyte, such as creating sources, destinations, and connections and managing configurations, are configured and invoked from the API.
+- **Database Config & Jobs** [`airbyte-db`]: Stores all the connection information \(credentials, frequency...\).
+- **Temporal Service** [`airbyte-temporal`]: Manages the task queue and workflows.
+- **Worker** [`airbyte-worker`]: The worker connects to a source connector, pulls the data and writes it to a destination.
The diagram shows the steady-state operation of Airbyte; there are also components not described above that you'll see in your deployment:
-* **Cron** [`airbyte-cron`]: Clean the server and sync logs (when using local logs)
-* **Bootloader** [`airbyte-bootloader`]: Upgrade and Migrate the Database tables and confirm the enviroment is ready to work.
+
+- **Cron** [`airbyte-cron`]: Cleans the server and sync logs (when using local logs)
+- **Bootloader** [`airbyte-bootloader`]: Upgrades and migrates the database tables and confirms the environment is ready to work.
This is a holistic high-level description of each component. For Airbyte deployed in Kubernetes, the structure is very similar, with a few changes.
diff --git a/docs/understanding-airbyte/jobs.md b/docs/understanding-airbyte/jobs.md
index c9b56ee6056..5ffd4a33fa8 100644
--- a/docs/understanding-airbyte/jobs.md
+++ b/docs/understanding-airbyte/jobs.md
@@ -2,10 +2,10 @@
In Airbyte, all interactions with connectors are run as jobs performed by a Worker.
Each job has a corresponding worker: -* Spec worker: retrieves the specification of a connector \(the inputs needed to run this connector\) -* Check connection worker: verifies that the inputs to a connector are valid and can be used to run a sync -* Discovery worker: retrieves the schema of the source underlying a connector -* Sync worker, used to sync data between a source and destination +- Spec worker: retrieves the specification of a connector \(the inputs needed to run this connector\) +- Check connection worker: verifies that the inputs to a connector are valid and can be used to run a sync +- Discovery worker: retrieves the schema of the source underlying a connector +- Sync worker, used to sync data between a source and destination Thus, there are generally 4 types of workers. @@ -34,10 +34,10 @@ state NonTerminal { When an attempt fails, the job status is transitioned to incomplete. If this is the final attempt, then the job is transitioned to failed. Otherwise it is transitioned back to running upon new attempt creation. - + end note } -note left of NonSuccess +note left of NonSuccess All Non Terminal Statuses can be transitioned to cancelled or failed end note @@ -52,7 +52,6 @@ state NonSuccess { NonTerminal --> NonSuccess ``` - ```mermaid --- title: Attempt Status State Machine @@ -63,7 +62,6 @@ stateDiagram-v2 running --> failed ``` - ### Attempts and Retries In the event of a failure, the Airbyte platform will retry the pipeline. Each of these sub-invocations of a job is called an attempt. @@ -72,9 +70,9 @@ In the event of a failure, the Airbyte platform will retry the pipeline. Each of Based on the outcome of previous attempts, the number of permitted attempts per job changes. By default, Airbyte is configured to allow the following: -* 5 subsequent attempts where no data was synchronized -* 10 total attempts where no data was synchronized -* 10 total attempts where some data was synchronized +- 5 subsequent attempts where no data was synchronized +- 10 total attempts where no data was synchronized +- 10 total attempts where some data was synchronized For oss users, these values are configurable. See [Configuring Airbyte](../operator-guides/configuring-airbyte.md#jobs) for more details. @@ -83,10 +81,11 @@ For oss users, these values are configurable. See [Configuring Airbyte](../opera After an attempt where no data was synchronized, we implement a short backoff period before starting a new attempt. This will increase with each successive complete failure—a partially successful attempt will reset this value. By default, Airbyte is configured to backoff with the following values: -* 10 seconds after the first complete failure -* 30 seconds after the second -* 90 seconds after the third -* 4 minutes and 30 seconds after the fourth + +- 10 seconds after the first complete failure +- 30 seconds after the second +- 90 seconds after the third +- 4 minutes and 30 seconds after the fourth For oss users, these values are configurable. See [Configuring Airbyte](../operator-guides/configuring-airbyte.md#jobs) for more details. @@ -94,7 +93,7 @@ The duration of expected backoff between attempts can be viewed in the logs acce ### Retry examples -To help illustrate what is possible, below are a couple examples of how the retry rules may play out under more elaborate circumstances. +To help illustrate what is possible, below are a couple examples of how the retry rules may play out under more elaborate circumstances. 
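Before those worked examples, a minimal, illustrative Python sketch of the documented backoff defaults may help; the function name and the lookup-table structure are assumptions made purely for illustration and are not Airbyte's actual implementation.

```python
# Illustrative sketch of the defaults documented on this page; not Airbyte's code.
# Backoff after complete failures: 10s, 30s, 90s, 4m30s.
DEFAULT_BACKOFF_SECONDS = [10, 30, 90, 270]


def backoff_seconds(consecutive_complete_failures: int) -> int:
    """Wait time before the next attempt after N consecutive complete failures."""
    if consecutive_complete_failures <= 0:
        return 0  # a partially successful attempt resets the backoff
    index = min(consecutive_complete_failures, len(DEFAULT_BACKOFF_SECONDS)) - 1
    return DEFAULT_BACKOFF_SECONDS[index]


assert [backoff_seconds(n) for n in range(1, 5)] == [10, 30, 90, 270]
```

Under the default limits described above, retries stop entirely once a job reaches 5 subsequent attempts (or 10 total attempts) with no data synchronized.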
@@ -238,11 +237,11 @@ Conceptually, **workers contain the complexity of all non-connector-related job ### Worker Types -There are 2 flavors of workers: +There are 2 flavors of workers: 1. **Synchronous Job Worker** - Workers that interact with a single connector \(e.g. spec, check, discover\). - The worker extracts data from the connector and reports it to the scheduler. It does this by listening to the connector's STDOUT. + The worker extracts data from the connector and reports it to the scheduler. It does this by listening to the connector's STDOUT. These jobs are synchronous as they are part of the configuration process and need to be immediately run to provide a good user experience. These are also all lightweight operations. 2. **Asynchronous Job Worker** - Workers that interact with 2 connectors \(e.g. sync, reset\) @@ -269,7 +268,6 @@ sequenceDiagram Worker->>Result: json output ``` - See the [architecture overview](high-level-view.md) for more information about workers. ## Deployment Types @@ -287,6 +285,7 @@ Airbyte offers two deployment types. The underlying process implementations diff Workers being responsible for all non-connector-related job operations means multiple jobs are operationally dependent on a single worker process. There are two downsides to this: + 1. Any issues to the parent worker process affects all job processes launched by the worker. 2. Unnecessary complexity of vertically scaling the worker process to deal with IO and processing requirements from multiple jobs. @@ -295,6 +294,7 @@ This gives us a potentially brittle system component that can be operationally t The Container Orchestrator was introduced to solve this. #### Container Orchestrator + When enabled, workers launch the Container Orchestrator process. The worker process delegates the [above listed responsibilities](#worker-responsibilities) to the orchestrator process. @@ -302,6 +302,7 @@ The worker process delegates the [above listed responsibilities](#worker-respons This decoupling introduces a new need for workers to track the orchestrator's, and the job's, state. This is done via a shared Cloud Storage store. Brief description of how this works, + 1. Workers constantly poll the Cloud Storage location for job state. 2. As an Orchestrator process executes, it writes status marker files to the Cloud Storage location i.e. `NOT_STARTED`, `INITIALIZING`, `RUNNING`, `SUCCESS`, `FAILURE`. 3. If the Orchestrator process runs into issues at any point, it writes a `FAILURE`. @@ -311,7 +312,6 @@ The Cloud Storage store is treated as the source-of-truth of execution state. The Container Orchestrator is only available for Airbyte Kubernetes today and automatically enabled when running the Airbyte Helm Charts deploys. - ```mermaid --- title: Start a new Sync @@ -337,7 +337,6 @@ sequenceDiagram PersistA->>Temporal: Return output ``` - Users running Airbyte Docker should be aware of the above pitfalls. ## Configuring Jobs & Workers @@ -345,11 +344,13 @@ Users running Airbyte Docker should be aware of the above pitfalls. Details on configuring jobs & workers can be found [here](../operator-guides/configuring-airbyte.md). ### Worker Parallization -Airbyte exposes the following environment variable to change the maximum number of each type of worker allowed to run in parallel. -Tweaking these values might help you run more jobs in parallel and increase the workload of your Airbyte instance: -* `MAX_SPEC_WORKERS`: Maximum number of *Spec* workers allowed to run in parallel. 
-* `MAX_CHECK_WORKERS`: Maximum number of *Check connection* workers allowed to run in parallel. -* `MAX_DISCOVERY_WORKERS`: Maximum number of *Discovery* workers allowed to run in parallel. -* `MAX_SYNC_WORKERS`: Maximum number of *Sync* workers allowed to run in parallel. + +Airbyte exposes the following environment variable to change the maximum number of each type of worker allowed to run in parallel. +Tweaking these values might help you run more jobs in parallel and increase the workload of your Airbyte instance: + +- `MAX_SPEC_WORKERS`: Maximum number of _Spec_ workers allowed to run in parallel. +- `MAX_CHECK_WORKERS`: Maximum number of _Check connection_ workers allowed to run in parallel. +- `MAX_DISCOVERY_WORKERS`: Maximum number of _Discovery_ workers allowed to run in parallel. +- `MAX_SYNC_WORKERS`: Maximum number of _Sync_ workers allowed to run in parallel. The current default value for these environment variables is currently set to **5**. diff --git a/docs/understanding-airbyte/json-avro-conversion.md b/docs/understanding-airbyte/json-avro-conversion.md index 54648af5421..e2abde02918 100644 --- a/docs/understanding-airbyte/json-avro-conversion.md +++ b/docs/understanding-airbyte/json-avro-conversion.md @@ -9,14 +9,14 @@ When an Airbyte data stream is synced to the Avro or Parquet format (e.g. Parque Json schema types are mapped to Avro types as follows: | Json Data Type | Avro Data Type | -| :---: | :---: | -| string | string | -| number | double | -| integer | int | -| boolean | boolean | -| null | null | -| object | record | -| array | array | +| :------------: | :------------: | +| string | string | +| number | double | +| integer | int | +| boolean | boolean | +| null | null | +| object | record | +| array | array | ### Nullable Fields @@ -26,11 +26,11 @@ All fields are nullable. For example, a `string` Json field will be typed as `[" The following built-in Json formats will be mapped to Avro logical types. -| Json Type | Json Built-in Format | Avro Type | Avro Logical Type | Meaning | -| --- | --- | --- | --- | --- | -| `string` | `date` | `int` | `date` | Number of epoch days from 1970-01-01 ([reference](https://avro.apache.org/docs/current/spec.html#Date)). | -| `string` | `time` | `long` | `time-micros` | Number of microseconds after midnight ([reference](https://avro.apache.org/docs/current/spec.html#Time+%28microsecond+precision%29)). | -| `string` | `date-time` | `long` | `timestamp-micros` | Number of microseconds from `1970-01-01T00:00:00Z` ([reference](https://avro.apache.org/docs/current/spec.html#Timestamp+%28microsecond+precision%29)). | +| Json Type | Json Built-in Format | Avro Type | Avro Logical Type | Meaning | +| --------- | -------------------- | --------- | ------------------ | ------------------------------------------------------------------------------------------------------------------------------------------------------- | +| `string` | `date` | `int` | `date` | Number of epoch days from 1970-01-01 ([reference](https://avro.apache.org/docs/current/spec.html#Date)). | +| `string` | `time` | `long` | `time-micros` | Number of microseconds after midnight ([reference](https://avro.apache.org/docs/current/spec.html#Time+%28microsecond+precision%29)). | +| `string` | `date-time` | `long` | `timestamp-micros` | Number of microseconds from `1970-01-01T00:00:00Z` ([reference](https://avro.apache.org/docs/current/spec.html#Timestamp+%28microsecond+precision%29)). 
| In the final Avro schema, these Avro logical type fields will be a union of the logical type and string. The rationale is that the incoming Json objects may contain invalid Json built-in formats. If that's the case, and the conversion from the Json built-in format to Avro built-in format fails, the field will fall back to a string. The extra string type can cause problem for some users in the destination. We may re-evaluate this conversion rule in the future. This issue is tracked [here](https://github.com/airbytehq/airbyte/issues/17011). @@ -151,10 +151,7 @@ Combined restrictions \(`allOf`, `anyOf`, and `oneOf`\) will be converted to typ ```json { - "oneOf": [ - {"type": "string"}, - {"type": "integer"} - ] + "oneOf": [{ "type": "string" }, { "type": "integer" }] } ``` @@ -184,10 +181,7 @@ For array fields in Json schema, when the `items` property is an array, it means { "array_field": { "type": "array", - "items": [ - {"type": "string"}, - {"type": "number"} - ] + "items": [{ "type": "string" }, { "type": "number" }] } } ``` @@ -259,7 +253,8 @@ Json object: "id_part_1": 1000, "id_part_2": "abcde" } - }, { + }, + { "id": { "id_part_1": "wxyz", "id_part_2": 2000 @@ -370,7 +365,7 @@ For example, given the following Json schema and object: ```json { - "identifier": ["151", 152, true, {"id": 153}, null] + "identifier": ["151", 152, true, { "id": 153 }, null] } ``` @@ -407,11 +402,11 @@ Note that every non-null element inside the `identifier` array field is converte Three Airbyte specific fields will be added to each Avro record: -| Field | Schema | Document | -| :--- | :--- | :---: | -| `_airbyte_ab_id` | `uuid` | [link](http://avro.apache.org/docs/current/spec.html#UUID) | -| `_airbyte_emitted_at` | `timestamp-millis` | [link](http://avro.apache.org/docs/current/spec.html#Timestamp+%28millisecond+precision%29) | -| `_airbyte_additional_properties` | `map` of `string` | See explanation below. | +| Field | Schema | Document | +| :------------------------------- | :----------------- | :-----------------------------------------------------------------------------------------: | +| `_airbyte_ab_id` | `uuid` | [link](http://avro.apache.org/docs/current/spec.html#UUID) | +| `_airbyte_emitted_at` | `timestamp-millis` | [link](http://avro.apache.org/docs/current/spec.html#Timestamp+%28millisecond+precision%29) | +| `_airbyte_additional_properties` | `map` of `string` | See explanation below. 
| ### Additional Properties @@ -420,7 +415,7 @@ A Json object can have additional properties of unknown types, which is not comp ```json { "name": "_airbyte_additional_properties", - "type": ["null", {"type": "map", "values": "string"}], + "type": ["null", { "type": "map", "values": "string" }], "default": null } ``` @@ -498,7 +493,7 @@ the corresponding Avro schema and record will be: "fields": [ { "name": "_airbyte_additional_properties", - "type": ["null", {"type": "map", "values": "string"}], + "type": ["null", { "type": "map", "values": "string" }], "default": null } ] @@ -597,7 +592,7 @@ Its corresponding Avro schema will be: }, { "name": "_airbyte_additional_properties", - "type": ["null", {"type": "map", "values": "string"}], + "type": ["null", { "type": "map", "values": "string" }], "default": null } ] @@ -609,14 +604,14 @@ Its corresponding Avro schema will be: "name": "created_at", "type": [ "null", - {"type": "long", "logicalType": "timestamp-micros"}, + { "type": "long", "logicalType": "timestamp-micros" }, "string" ], "default": null }, { "name": "_airbyte_additional_properties", - "type": ["null", {"type": "map", "values": "string"}], + "type": ["null", { "type": "map", "values": "string" }], "default": null } ] @@ -626,5 +621,6 @@ Its corresponding Avro schema will be: More examples can be found in the Json to Avro conversion [test cases](https://github.com/airbytehq/airbyte/tree/master/airbyte-integrations/bases/base-java-s3/src/test/resources/parquet/json_schema_converter). ## Implementation + - Schema conversion: [JsonToAvroSchemaConverter](https://github.com/airbytehq/airbyte/blob/master/airbyte-integrations/bases/base-java-s3/src/main/java/io/airbyte/integrations/destination/s3/avro/JsonToAvroSchemaConverter.java) - Object conversion: [airbytehq/json-avro-converter](https://github.com/airbytehq/json-avro-converter) (forked and modified from [allegro/json-avro-converter](https://github.com/allegro/json-avro-converter)). diff --git a/docs/understanding-airbyte/operations.md b/docs/understanding-airbyte/operations.md index b21a087651b..50ce80e2c2c 100644 --- a/docs/understanding-airbyte/operations.md +++ b/docs/understanding-airbyte/operations.md @@ -2,10 +2,10 @@ Airbyte [connections](/using-airbyte/core-concepts/sync-modes/) support configuring additional transformations that execute after the sync. Useful applications could be: -* Customized normalization to better fit the requirements of your own business context. -* Business transformations from a technical data representation into a more logical and business oriented data structure. This can facilitate usage by end-users, non-technical operators, and executives looking to generate Business Intelligence dashboards and reports. -* Data Quality, performance optimization, alerting and monitoring, etc. -* Integration with other tools from your data stack \(orchestration, data visualization, etc.\) +- Customized normalization to better fit the requirements of your own business context. +- Business transformations from a technical data representation into a more logical and business oriented data structure. This can facilitate usage by end-users, non-technical operators, and executives looking to generate Business Intelligence dashboards and reports. +- Data Quality, performance optimization, alerting and monitoring, etc. 
+- Integration with other tools from your data stack \(orchestration, data visualization, etc.\) ## Supported Operations @@ -17,8 +17,8 @@ A url to a git repository to \(shallow\) clone the latest dbt project code from. The project versioned in the repository is expected to: -* be a valid dbt package with a `dbt_project.yml` file at its root. -* have a `dbt_project.yml` with a "profile" name declared as described [here](https://docs.getdbt.com/dbt-cli/configure-your-profile). +- be a valid dbt package with a `dbt_project.yml` file at its root. +- have a `dbt_project.yml` with a "profile" name declared as described [here](https://docs.getdbt.com/dbt-cli/configure-your-profile). When using the dbt CLI, dbt checks your `profiles.yml` file for a profile with the same name. A profile contains all the details required to connect to your data warehouse. This file generally lives outside of your dbt project to avoid sensitive credentials being checked in to version control. Therefore, a `profiles.yml` will be generated according to the configured destination from the Airbyte UI. @@ -46,11 +46,10 @@ One thing to consider is that dbt allows for vast configuration of the run comma ## Future Operations -* Docker/Script operations: Execute a generic script in a custom Docker container. -* Webhook operations: Trigger API or hooks from other providers. -* Airflow operations: To use a specialized orchestration tool that lets you schedule and manage more advanced/complex sequences of operations in your sync workflow. +- Docker/Script operations: Execute a generic script in a custom Docker container. +- Webhook operations: Trigger API or hooks from other providers. +- Airflow operations: To use a specialized orchestration tool that lets you schedule and manage more advanced/complex sequences of operations in your sync workflow. ## Going Further In the meantime, please feel free to react, comment, and share your thoughts/use cases with us. We would be glad to hear your feedback and ideas as they will help shape the next set of features and our roadmap for the future. You can head to our GitHub and participate in the corresponding issue or discussions. Thank you! - diff --git a/docs/understanding-airbyte/schemaless-sources-and-destinations.md b/docs/understanding-airbyte/schemaless-sources-and-destinations.md index edd4051ce2c..27fe2c4f649 100644 --- a/docs/understanding-airbyte/schemaless-sources-and-destinations.md +++ b/docs/understanding-airbyte/schemaless-sources-and-destinations.md @@ -1,10 +1,12 @@ # "Schemaless" Sources and Destinations + In order to run a sync, Airbyte requires a [catalog](/understanding-airbyte/airbyte-protocol#catalog), which includes a data schema describing the shape of data being emitted by the source. This schema will be used to prepare the destination to populate the data during the sync. -While having a [strongly-typed](/understanding-airbyte/supported-data-types) catalog/schema is possible for most sources, some won't have a reasonably static schema. This document describes the options available for the subset of sources that do not have a strict schema, aka "schemaless sources". +While having a [strongly-typed](/understanding-airbyte/supported-data-types) catalog/schema is possible for most sources, some won't have a reasonably static schema. This document describes the options available for the subset of sources that do not have a strict schema, aka "schemaless sources". ## What is a Schemaless Source? 
+ Schemaless sources are sources for which there is no requirement or expectation that records will conform to a particular pattern. For example, in a MongoDB database, there's no requirement that the fields in one document are the same as the fields in the next, or that the type of value in one field is the same as the type for that field in a separate document. Similarly, for a file-based source such as S3, the files that are present in your source may not all have the same schema. @@ -16,8 +18,9 @@ For these sources, during the [`discover`](/understanding-airbyte/airbyte-protoc 2. A hardcoded "schemaless" schema. ### Dynamic schema inference + If this option is selected, Airbyte will infer the schema dynamically based on the contents of the source. -If your source's content is homogenous, we recommend this option, as the data in your destination will be typed and you can make use of schema evolution features, column selection, and similar Airbyte features which operate against the source's schema. +If your source's content is homogenous, we recommend this option, as the data in your destination will be typed and you can make use of schema evolution features, column selection, and similar Airbyte features which operate against the source's schema. For MongoDB, you can configure the number of documents that will be used for schema inference (from 1,000 to 10,000 documents; by default, this is set to 10,000). Airbyte will read in the requested number of documents (sampled randomly) and infer the schema from them. @@ -30,6 +33,7 @@ The type assigned to each field will be the widest type observed for that field So if we observe that a field has an integer type in one record and a string in another, the schema will identify the field as a string. There are a few drawbacks to be aware of: + - If your dataset is very large, the `discover` process can be very time-consuming. - Because we may not use 100% of the available data to create the schema, your schema may not contain every field present in your records. Airbyte only syncs fields that are in the schema, so you may end up with incomplete data in the destination. @@ -41,6 +45,7 @@ If your data is uniform across all or most records, you can set this to a lower If your data varies but you cannot use the Schemaless option, you can set it to a larger value to ensure that as many fields as possible are accounted for._ ### Schemaless schema + If this option is selected, the schema will always be `{"data": object}`, regardless of the contents of the data. During the sync, we "wrap" each record behind a key named `data`. This means that the destination receives the data with one top-level field only, and the value of the field is the entire record. @@ -49,14 +54,17 @@ This option avoids a time-consuming or inaccurate `discover` phase and guarantee ## Future Enhancements ### File-based Sources: configurable amount of data read for schema inference + Currently, Airbyte chooses the amount of data that we'll use to infer the schema for file-based sources. We will be surfacing a config option for users to choose how much data to read to infer the schema. This option is already available for the MongoDB source. ### Unwrapping the data at schemaless Destinations + MongoDB and file storage systems also don't require a schema at the destination. 
For this reason, if you are syncing data from a schemaless source to a schemaless destination and chose the "schemaless" schema option, Airbyte will offer the ability to "unwrap" the data at the destination so that it is not nested under the "data" key. ### Column exclusion for schemaless schemas + We are planning to offer a way to exclude fields from being synced when the schemaless option is selected, as column selection is not applicable. diff --git a/docs/understanding-airbyte/supported-data-types.md b/docs/understanding-airbyte/supported-data-types.md index 3080f5186eb..bb3a640fd5e 100644 --- a/docs/understanding-airbyte/supported-data-types.md +++ b/docs/understanding-airbyte/supported-data-types.md @@ -11,7 +11,7 @@ This type system does not constrain values. However, destinations may not fully This table summarizes the available types. See the [Specific Types](#specific-types) section for explanation of optional parameters. | Airbyte type | JSON Schema | Examples | -|----------------------------|-----------------------------------------------------------------------------------------------------|-------------------------------------------------------------------| +| -------------------------- | --------------------------------------------------------------------------------------------------- | ----------------------------------------------------------------- | | String | `{"type": "string"}` | `"foo bar"` | | Boolean | `{"type": "boolean"}` | `true` or `false` | | Date | `{"type": "string", "format": "date"}` | `"2021-01-23"`, `"2021-01-23 BC"` | @@ -26,9 +26,11 @@ This table summarizes the available types. See the [Specific Types](#specific-ty | Union | `{"oneOf": [...]}` | | ### Record structure + As a reminder, sources expose a `discover` command, which returns a list of [`AirbyteStreams`](https://github.com/airbytehq/airbyte/blob/111131a193359027d0081de1290eb4bb846662ef/airbyte-protocol/models/src/main/resources/airbyte_protocol/airbyte_protocol.yaml#L122), and a `read` method, which emits a series of [`AirbyteRecordMessages`](https://github.com/airbytehq/airbyte/blob/111131a193359027d0081de1290eb4bb846662ef/airbyte-protocol/models/src/main/resources/airbyte_protocol/airbyte_protocol.yaml#L46-L66). The type system determines what a valid `json_schema` is for an `AirbyteStream`, which in turn dictates what messages `read` is allowed to emit. For example, a source could produce this `AirbyteStream` (remember that the `json_schema` must declare `"type": "object"` at the top level): + ```json { "name": "users", @@ -53,7 +55,9 @@ For example, a source could produce this `AirbyteStream` (remember that the `jso } } ``` + Along with this `AirbyteRecordMessage` (observe that the `data` field conforms to the `json_schema` from the stream): + ```json { "stream": "users", @@ -69,10 +73,13 @@ Along with this `AirbyteRecordMessage` (observe that the `data` field conforms t The top-level `object` must conform to the type system. This [means](#objects) that all of the fields must also conform to the type system. #### Nulls -Many sources cannot guarantee that all fields are present on all records. In these cases, sources should not list them as `required` fields, and add that the property can be null in the jsonSchema, e.g. `[null, string]`. If a null property is found for a non-nullable schema, a validation error may occur in the platform or the destination may have trouble storing the record. + +Many sources cannot guarantee that all fields are present on all records. 
In these cases, sources should not list them as `required` fields, and add that the property can be null in the jsonSchema, e.g. `[null, string]`. If a null property is found for a non-nullable schema, a validation error may occur in the platform or the destination may have trouble storing the record. #### Unsupported types + Destinations must have handling for all types, but they are free to cast types to a convenient representation. For example, let's say a source discovers a stream with this schema: + ```json { "type": "object", @@ -88,12 +95,15 @@ Destinations must have handling for all types, but they are free to cast types t } } ``` + Along with records which contain data that looks like this: + ```json -{"appointments": ["2021-11-22T01:23:45+00:00", "2022-01-22T14:00:00+00:00"]} +{ "appointments": ["2021-11-22T01:23:45+00:00", "2022-01-22T14:00:00+00:00"] } ``` The user then connects this source to a destination that cannot natively handle `array` fields. The destination connector is free to simply JSON-serialize the array back to a string when pushing data into the end platform. In other words, the destination connector could behave as though the source declared this schema: + ```json { "type": "object", @@ -104,23 +114,31 @@ The user then connects this source to a destination that cannot natively handle } } ``` + And emitted this record: + ```json -{"appointments": "[\"2021-11-22T01:23:45+00:00\", \"2022-01-22T14:00:00+00:00\"]"} +{ + "appointments": "[\"2021-11-22T01:23:45+00:00\", \"2022-01-22T14:00:00+00:00\"]" +} ``` Of course, destinations are free to choose the most convenient/reasonable representation for any given value. JSON serialization is just one possible strategy. For example, many SQL destinations will fall back to a native JSON type (e.g. Postgres' JSONB type, or Snowflake's VARIANT). ### Specific types + These sections explain how each specific type should be used. #### Boolean + Boolean values are represented as native JSON booleans (i.e. `true` or `false`, case-sensitive). Note that "truthy" and "falsy" values are _not_ acceptable: `"true"`, `"false"`, `1`, and `0` are not valid booleans. #### Dates and timestamps + Airbyte has five temporal types: `date`, `timestamp_with_timezone`, `timestamp_without_timezone`, `time_with_timezone`, and `time_without_timezone`. These are represented as strings with specific `format` (either `date` or `date-time`). However, JSON schema does not have a built-in way to indicate whether a field includes timezone information. For example, given this JsonSchema: + ```json { "type": "object", @@ -132,6 +150,7 @@ However, JSON schema does not have a built-in way to indicate whether a field in } } ``` + Both `{"created_at": "2021-11-22T01:23:45+00:00"}` and `{"created_at": "2021-11-22T01:23:45"}` are valid records. The `airbyte_type` field resolves this ambiguity; sources producing timestamp-ish fields should choose either `timestamp_with_timezone` or `timestamp_without_timezone` (or time with/without timezone). @@ -141,19 +160,23 @@ Many sources (which were written before this system was formalized) do not speci All of these must be represented as RFC 3339§5.6 strings, extended with BC era support. See the type definition descriptions for specifics. #### Numeric values + The number and integer types can accept any value, without constraint on range. 
However, this is still subject to compatibility with the destination: the destination (or normalization) _may_ throw an error if it attempts to write a value outside the range supported by the destination warehouse / storage medium. Airbyte does not currently support infinity/NaN values. #### Arrays + Arrays contain 0 or more items, which must have a defined type. These types should also conform to the type system. Arrays may require that all of their elements be the same type (`"items": {whatever type...}`). They may instead require each element to conform to one of a list of types (`"items": [{first type...}, {second type...}, ... , {Nth type...}]`). Note that Airbyte's usage of the `items` field is slightly different than JSON schema's usage, in which an `"items": [...]` actually constrains the element correpsonding to the index of that item (AKA tuple-typing). This is becase destinations may have a difficult time supporting tuple-typed arrays without very specific handling, and as such are permitted to somewhat loosen their requirements. #### Objects + As with arrays, objects may declare `properties`, each of which should have a type which conforms to the type system. #### Unions + Sources may want to mix different types in a single field, e.g. `"type": ["string", "object"]`. Destinations must handle this case, either using a native union type, or by finding a native type that can accept all of the source's types (this frequently will be `string` or `json`). -In some cases, sources may want to use multiple types for the same field. For example, a user might have a property which holds one of two object schemas. This is supported with JSON schema's `oneOf` type. Note that many destinations do not currently support these types, and may not behave as expected. +In some cases, sources may want to use multiple types for the same field. For example, a user might have a property which holds one of two object schemas. This is supported with JSON schema's `oneOf` type. Note that many destinations do not currently support these types, and may not behave as expected. diff --git a/docs/understanding-airbyte/tech-stack.md b/docs/understanding-airbyte/tech-stack.md index 4bbb07010bb..2efc1357977 100644 --- a/docs/understanding-airbyte/tech-stack.md +++ b/docs/understanding-airbyte/tech-stack.md @@ -2,33 +2,33 @@ ## Airbyte Core Backend -* [Java 21](https://jdk.java.net/archive/) -* Framework: [Micronaut](https://micronaut.io/) -* API: [OAS3](https://www.openapis.org/) -* Databases: [PostgreSQL](https://www.postgresql.org/) -* Unit & E2E testing: [JUnit 5](https://junit.org/junit5) -* Orchestration: [Temporal](https://temporal.io) +- [Java 21](https://jdk.java.net/archive/) +- Framework: [Micronaut](https://micronaut.io/) +- API: [OAS3](https://www.openapis.org/) +- Databases: [PostgreSQL](https://www.postgresql.org/) +- Unit & E2E testing: [JUnit 5](https://junit.org/junit5) +- Orchestration: [Temporal](https://temporal.io) ## Connectors Connectors can be written in any language. 
However the most common languages are: -* Python 3.9 or higher -* [Java 21](https://jdk.java.net/archive/) +- Python 3.9 or higher +- [Java 21](https://jdk.java.net/archive/) ## **Frontend** -* [Node.js](https://nodejs.org/en/) -* [TypeScript](https://www.typescriptlang.org/) -* Web Framework/Library: [React](https://reactjs.org/) +- [Node.js](https://nodejs.org/en/) +- [TypeScript](https://www.typescriptlang.org/) +- Web Framework/Library: [React](https://reactjs.org/) ## Additional Tools -* CI/CD: [GitHub Actions](https://github.com/features/actions) -* Containerization: [Docker](https://www.docker.com/) and [Docker Compose](https://docs.docker.com/compose/) -* Linter \(Frontend\): [ESLint](https://eslint.org/) -* Formatter \(Frontend & Backend\): [Prettier](https://prettier.io/) -* Formatter \(Backend\): [Spotless](https://github.com/diffplug/spotless) +- CI/CD: [GitHub Actions](https://github.com/features/actions) +- Containerization: [Docker](https://www.docker.com/) and [Docker Compose](https://docs.docker.com/compose/) +- Linter \(Frontend\): [ESLint](https://eslint.org/) +- Formatter \(Frontend & Backend\): [Prettier](https://prettier.io/) +- Formatter \(Backend\): [Spotless](https://github.com/diffplug/spotless) ## FAQ @@ -47,4 +47,3 @@ Simply put, the team has more experience writing production Java code. ### _Why do we use_ [_Temporal_](https://temporal.io) _for orchestration?_ Temporal solves the two major hurdles that exist in orchestrating hundreds to thousands of jobs simultaneously: scaling state management and proper queue management. Temporal solves this by offering primitives that allow serialising the jobs' current runtime memory into a DB. Since a job's entire state is stored, it's trivial to recover from failures, and it's easy to determine if a job was assigned correctly. - diff --git a/docs/using-airbyte/core-concepts/basic-normalization.md b/docs/using-airbyte/core-concepts/basic-normalization.md index 16de09002ec..eb0446c2565 100644 --- a/docs/using-airbyte/core-concepts/basic-normalization.md +++ b/docs/using-airbyte/core-concepts/basic-normalization.md @@ -18,7 +18,7 @@ The high-level overview contains all the information you need to use Basic Norma ::: -For every connection, you can choose between two options: +For every connection, you can choose between two options: - Basic Normalization: Airbyte converts the raw JSON blob version of your data to the format of your destination. _Note: Not all destinations support normalization._ - Raw data (no normalization): Airbyte places the JSON blob version of your data in a table called `_airbyte_raw_` @@ -140,14 +140,14 @@ Airbyte tracks types using JsonSchema's primitive types. Here is how these types Airbyte uses the types described in the catalog to determine the correct type for each column. It does not try to use the values themselves to infer the type. 
-| JsonSchema Type | Resulting Type | Notes | -| :------------------------------------- | :---------------------- | :-------------------------------------------- | -| `number` | float | | -| `integer` | integer | | -| `string` | string | | -| `bit` | boolean | | -| `boolean` | boolean | | -| `string` with format label `date-time` | timestamp with timezone | | +| JsonSchema Type | Resulting Type | Notes | +| :------------------------------------- | :---------------------- | :---------------------- | +| `number` | float | | +| `integer` | integer | | +| `string` | string | | +| `bit` | boolean | | +| `boolean` | boolean | | +| `string` with format label `date-time` | timestamp with timezone | | | `array` | new table | see [nesting](#Nesting) | | `object` | new table | see [nesting](#Nesting) | diff --git a/docs/using-airbyte/core-concepts/namespaces.md b/docs/using-airbyte/core-concepts/namespaces.md index 0595b2571da..cdc2cf37347 100644 --- a/docs/using-airbyte/core-concepts/namespaces.md +++ b/docs/using-airbyte/core-concepts/namespaces.md @@ -8,12 +8,11 @@ Namespaces are used to generally organize data, separate tests and production da As a part of connection setup, you select where in the destination you want to write your data. Note: The default configuration is **Destination-defined**. -| Destination Namespace | Description | -| ---------------------------------------------------------------------------------------------------------------------------------------- | ------------------------------------------------------------------------------------------------------- | -| Custom | All streams will be replicated to a single user-defined namespace. See Custom format for more details | -| Destination-defined | All streams will be replicated to the single default namespace defined in the Destination's settings. | -| Source-defined | Some sources (for example, databases) provide namespace information for a stream. If a source provides namespace information, the destination will mirror the same namespace when this configuration is set. For sources or streams where the source namespace is not known, the behavior will default to the "Destination default" option. | - +| Destination Namespace | Description | +| --------------------- | ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------- | +| Custom | All streams will be replicated to a single user-defined namespace. See Custom format for more details | +| Destination-defined | All streams will be replicated to the single default namespace defined in the Destination's settings. | +| Source-defined | Some sources (for example, databases) provide namespace information for a stream. If a source provides namespace information, the destination will mirror the same namespace when this configuration is set. For sources or streams where the source namespace is not known, the behavior will default to the "Destination default" option. | Most of our destinations support this feature. To learn if your connector supports this, head to the individual connector page to learn more. If your desired destination doesn't support it, you can ignore this feature. 
@@ -23,18 +22,18 @@ Systems often group their underlying data into namespaces with each namespace's An example of a namespace is the RDMS's `schema` concept. Some common use cases for schemas are enforcing permissions, segregating test and production data and general data organisation. -In a source, the namespace is the location from where the data is replicated to the destination. In a destination, the namespace is the location where the replicated data is stored in the destination. +In a source, the namespace is the location from where the data is replicated to the destination. In a destination, the namespace is the location where the replicated data is stored in the destination. Airbyte supports namespaces and allows Sources to define namespaces, and Destinations to write to various namespaces. In Airbyte, the following options are available and are set on each individual connection. ### Custom -When replicating multiple sources into the same destination, you may create table conflicts where tables are overwritten by different syncs. This is where using a custom namespace will ensure data is synced accurately. +When replicating multiple sources into the same destination, you may create table conflicts where tables are overwritten by different syncs. This is where using a custom namespace will ensure data is synced accurately. For example, a Github source can be replicated into a `github` schema. However, you may have multiple connections writing from different GitHub repositories \(common in multi-tenant scenarios\). :::tip -To write more than 1 table with the same name to your destination, Airbyte recommends writing the connections to unique namespaces to avoid mixing data from the different GitHub repositories. +To write more than 1 table with the same name to your destination, Airbyte recommends writing the connections to unique namespaces to avoid mixing data from the different GitHub repositories. ::: You can enter plain text (most common) or additionally add a dynamic parameter `${SOURCE_NAMESPACE}`, which uses the namespace provided by the source if available. @@ -44,18 +43,18 @@ You can enter plain text (most common) or additionally add a dynamic parameter ` All streams will be replicated and stored in the default namespace defined on the destination settings page, which is typically defined when the destination was set up. Depending on your destination, the namespace refers to: | Destination Connector | Namespace setting | -| :--- | :--- | -| BigQuery | dataset | -| MSSQL | schema | -| MySql | database | -| Oracle DB | schema | -| Postgres | schema | -| Redshift | schema | -| Snowflake | schema | -| S3 | path prefix | +| :-------------------- | :---------------- | +| BigQuery | dataset | +| MSSQL | schema | +| MySql | database | +| Oracle DB | schema | +| Postgres | schema | +| Redshift | schema | +| Snowflake | schema | +| S3 | path prefix | :::tip -If you prefer to replicate multiple sources into the same namespace, use the `Stream Prefix` configuration to differentiate data from these sources to ensure no streams collide when writing to the destination. +If you prefer to replicate multiple sources into the same namespace, use the `Stream Prefix` configuration to differentiate data from these sources to ensure no streams collide when writing to the destination. 
::: ### Source-Defined @@ -68,18 +67,18 @@ Some sources \(such as databases based on JDBC\) provide namespace information f If the Source does not support namespaces, the data will be replicated into the Destination's default namespace. If the Destination does not support namespaces, any preference set in the connection is ignored. ::: -The following table summarises how this works. In this example, we're looking at the replication configuration between a Postgres Source and Snowflake Destination \(with settings of schema = "my\_schema"\): +The following table summarises how this works. In this example, we're looking at the replication configuration between a Postgres Source and Snowflake Destination \(with settings of schema = "my_schema"\): -| Namespace Configuration | Source Namespace | Source Table Name | Destination Namespace | Destination Table Name | -| :--- | :--- | :--- | :--- | :--- | -| Destination default | public | my\_table | my\_schema | my\_table | -| Destination default | | my\_table | my\_schema | my\_table | -| Mirror source structure | public | my\_table | public | my\_table | -| Mirror source structure | | my\_table | my\_schema | my\_table | -| Custom format = "custom" | public | my\_table | custom | my\_table | -| Custom format = `"${SOURCE\_NAMESPACE}"` | public | my\_table | public | my\_table | -| Custom format = `"my\_${SOURCE\_NAMESPACE}\_schema"` | public | my\_table | my\_public\_schema | my\_table | -| Custom format = " " | public | my\_table | my\_schema | my\_table | +| Namespace Configuration | Source Namespace | Source Table Name | Destination Namespace | Destination Table Name | +| :--------------------------------------------------- | :--------------- | :---------------- | :-------------------- | :--------------------- | +| Destination default | public | my_table | my_schema | my_table | +| Destination default | | my_table | my_schema | my_table | +| Mirror source structure | public | my_table | public | my_table | +| Mirror source structure | | my_table | my_schema | my_table | +| Custom format = "custom" | public | my_table | custom | my_table | +| Custom format = `"${SOURCE\_NAMESPACE}"` | public | my_table | public | my_table | +| Custom format = `"my\_${SOURCE\_NAMESPACE}\_schema"` | public | my_table | my_public_schema | my_table | +| Custom format = " " | public | my_table | my_schema | my_table | ## Using Namespaces with Basic Normalization @@ -93,7 +92,6 @@ Note custom transformation outputs are not affected by the namespace settings fr ## Requirements -* Both Source and Destination connectors need to support namespaces. -* Relevant Source and Destination connectors need to be at least version `0.3.0` or later. -* Airbyte version `0.21.0-alpha` or later. - +- Both Source and Destination connectors need to support namespaces. +- Relevant Source and Destination connectors need to be at least version `0.3.0` or later. +- Airbyte version `0.21.0-alpha` or later. diff --git a/docs/using-airbyte/core-concepts/readme.md b/docs/using-airbyte/core-concepts/readme.md index c5f015cbf77..d5a16cf7380 100644 --- a/docs/using-airbyte/core-concepts/readme.md +++ b/docs/using-airbyte/core-concepts/readme.md @@ -24,13 +24,13 @@ An Airbyte component which pulls data from a source or pushes data to a destinat A connection is an automated data pipeline that replicates data from a source to a destination. 
Setting up a connection enables configuration of the following parameters: -| Concept | Description | -|-----------------------------------------------------------------------------------------------------------------|--------------------------------------------------------------------| +| Concept | Description | +| ------------------------------------------------------------------------------------------------------------------------ | ------------------------------------------------------------------ | | [Stream and Field Selection](/cloud/managing-airbyte-cloud/configuring-connections.md#modify-streams-in-your-connection) | What data should be replicated from the source to the destination? | -| [Sync Mode](/using-airbyte/core-concepts/sync-modes/README.md) | How should the streams be replicated (read and written)? | -| [Sync Schedule](/using-airbyte/core-concepts/sync-schedules.md) | When should a data sync be triggered? | -| [Destination Namespace and Stream Prefix](/using-airbyte/core-concepts/namespaces.md) | Where should the replicated data be written? | -| [Schema Propagation](/cloud/managing-airbyte-cloud/manage-schema-changes.md) | How should Airbyte handle schema drift in sources? | +| [Sync Mode](/using-airbyte/core-concepts/sync-modes/README.md) | How should the streams be replicated (read and written)? | +| [Sync Schedule](/using-airbyte/core-concepts/sync-schedules.md) | When should a data sync be triggered? | +| [Destination Namespace and Stream Prefix](/using-airbyte/core-concepts/namespaces.md) | Where should the replicated data be written? | +| [Schema Propagation](/cloud/managing-airbyte-cloud/manage-schema-changes.md) | How should Airbyte handle schema drift in sources? | ## Stream @@ -46,7 +46,7 @@ Examples of streams: A record is a single entry or unit of data. This is commonly known as a "row". A record is usually unique and contains information related to a particular entity, like a customer or transaction. -Examples of records: +Examples of records: - A row in the table in a relational database - A line in a file @@ -54,15 +54,13 @@ Examples of records: ## Field -A field is an attribute of a record in a stream. +A field is an attribute of a record in a stream. Examples of fields: - A column in the table in a relational database - A field in an API response - - ## Sync Schedule There are three options for scheduling a sync to run: @@ -95,7 +93,7 @@ Typing and deduping ensures the data emitted from sources is written into the co - BigQuery :::info -Typing and Deduping is the default method of transforming datasets within data warehouse and database destinations after they've been replicated. We are retaining documentation about normalization to support legacy destinations. +Typing and Deduping is the default method of transforming datasets within data warehouse and database destinations after they've been replicated. We are retaining documentation about normalization to support legacy destinations. ::: For more details, see our [Typing & Deduping documentation](/using-airbyte/core-concepts/typing-deduping). 
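To make the stream, record, and field vocabulary defined above concrete, here is a small, purely illustrative example; the stream name and field names are invented for this sketch and are not taken from any real connector.

```python
# Hypothetical record from a "users" stream; each key is a field of the record.
record = {
    "id": 123,                     # field
    "email": "jane@example.com",   # field
    "signed_up_at": "2021-01-23",  # field
}

print(sorted(record))  # the fields present on this record
```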
diff --git a/docs/using-airbyte/core-concepts/sync-modes/full-refresh-append.md b/docs/using-airbyte/core-concepts/sync-modes/full-refresh-append.md index 1bdd03f8dde..04949809891 100644 --- a/docs/using-airbyte/core-concepts/sync-modes/full-refresh-append.md +++ b/docs/using-airbyte/core-concepts/sync-modes/full-refresh-append.md @@ -19,48 +19,48 @@ On the nth sync of a full refresh connection: data in the destination _before_ the nth sync: | Languages | -| :--- | -| Python | -| Java | +| :-------- | +| Python | +| Java | new data: | Languages | -| :--- | -| Python | -| Java | -| Ruby | +| :-------- | +| Python | +| Java | +| Ruby | data in the destination _after_ the nth sync: | Languages | -| :--- | -| Python | -| Java | -| Python | -| Java | -| Ruby | +| :-------- | +| Python | +| Java | +| Python | +| Java | +| Ruby | This could be useful when we are interested to know about deletion of data in the source. This is possible if we also consider the date, or the batch id from which the data was written to the destination: new data at the n+1th sync: | Languages | -| :--- | -| Python | -| Ruby | +| :-------- | +| Python | +| Ruby | data in the destination _after_ the n+1th sync: | Languages | batch id | -| :--- | :--- | -| Python | 1 | -| Java | 1 | -| Python | 2 | -| Java | 2 | -| Ruby | 2 | -| Python | 3 | -| Ruby | 3 | +| :-------- | :------- | +| Python | 1 | +| Java | 1 | +| Python | 2 | +| Java | 2 | +| Ruby | 2 | +| Python | 3 | +| Ruby | 3 | ## In the future diff --git a/docs/using-airbyte/core-concepts/sync-modes/full-refresh-overwrite.md b/docs/using-airbyte/core-concepts/sync-modes/full-refresh-overwrite.md index 0e19bca28a2..f918cf62235 100644 --- a/docs/using-airbyte/core-concepts/sync-modes/full-refresh-overwrite.md +++ b/docs/using-airbyte/core-concepts/sync-modes/full-refresh-overwrite.md @@ -8,7 +8,7 @@ products: all The **Full Refresh** modes are the simplest methods that Airbyte uses to sync data, as they always retrieve all available information requested from the source, regardless of whether it has been synced before. This contrasts with [**Incremental sync**](./incremental-append.md), which does not sync data that has already been synced before. -In the **Overwrite** variant, new syncs will destroy all data in the existing destination table and then pull the new data in. Therefore, data that has been removed from the source after an old sync will be deleted in the destination table. +In the **Overwrite** variant, new syncs will destroy all data in the existing destination table and then pull the new data in. Therefore, data that has been removed from the source after an old sync will be deleted in the destination table. ## Example Behavior @@ -19,30 +19,30 @@ On the nth sync of a full refresh connection: data in the destination _before_ the sync: | Languages | -| :--- | -| Python | -| Java | -| Bash| +| :-------- | +| Python | +| Java | +| Bash | new data in the source: | Languages | -| :--- | -| Python | -| Java | -| Ruby | +| :-------- | +| Python | +| Java | +| Ruby | data in the destination _after_ the sync (note how the old value of "bash" is no longer present): | Languages | -| :--- | -| Python | -| Java | -| Ruby | +| :-------- | +| Python | +| Java | +| Ruby | ## Destination-specific mechanism for full refresh -The mechanism by which a destination connector acomplishes the full refresh will vary wildly from destination to destinaton. For our certified database and data warehouse destinations, we will be recreating the final table each sync. 
This allows us leave the previous sync's data viewable by writing to a "final-table-tmp" location as the sync is running, and at the end dropping the olf "final" table, and renaming the new one into place. That said, this may not possible for all destinations, and we may need to erase the existing data at the start of each full-refresh sync.
+The mechanism by which a destination connector accomplishes the full refresh will vary widely from destination to destination. For our certified database and data warehouse destinations, we recreate the final table on each sync. This allows us to leave the previous sync's data viewable by writing to a "final-table-tmp" location while the sync is running, then, at the end, dropping the old "final" table and renaming the new one into place. That said, this may not be possible for all destinations, and we may need to erase the existing data at the start of each full-refresh sync.
## Related information
diff --git a/docs/using-airbyte/core-concepts/sync-schedules.md b/docs/using-airbyte/core-concepts/sync-schedules.md
index 1a0d091a1c2..6c87983a6de 100644
--- a/docs/using-airbyte/core-concepts/sync-schedules.md
+++ b/docs/using-airbyte/core-concepts/sync-schedules.md
@@ -12,16 +12,18 @@ For each connection, you can select between three options that allow a sync to r
## Sync Considerations
-* Only one sync per connection can run at a time.
-* If a sync is scheduled to run before the previous sync finishes, the scheduled sync will start after the completion of the previous sync.
-* Syncs can run at most every 60 minutes in Airbyte Cloud. Reach out to [Sales](https://airbyte.com/company/talk-to-sales) if you require replication more frequently than once per hour.
+- Only one sync per connection can run at a time.
+- If a sync is scheduled to run before the previous sync finishes, the scheduled sync will start after the completion of the previous sync.
+- Syncs can run at most every 60 minutes in Airbyte Cloud. Reach out to [Sales](https://airbyte.com/company/talk-to-sales) if you require replication more frequently than once per hour.
:::note
For Scheduled or cron scheduled syncs, Airbyte guarantees syncs will initiate with a schedule accuracy of +/- 30 minutes.
:::
## Scheduled syncs
+
+You can choose between the following scheduled options:
+
- Every 24 hours (most common)
- Every 12 hours
- Every 8 hours
@@ -40,21 +42,23 @@ When a scheduled connection is first created, a sync is executed immediately aft
- **October 3rd, 5:01pm:** It has been more than 24 hours since the last sync, so a sync is run
## Cron Syncs
+
If you prefer more precision in scheduling your sync, you can also use CRON scheduling to set a specific time of day or month.
-Airbyte uses the CRON scheduler from [Quartz](http://www.quartz-scheduler.org/documentation/quartz-2.3.0/tutorials/crontrigger.html). We recommend reading their [documentation](http://www.quartz-scheduler.org/documentation/quartz-2.3.0/tutorials/crontrigger.html) to understand the required formatting. You can also refer to these examples:
+Airbyte uses the CRON scheduler from [Quartz](http://www.quartz-scheduler.org/documentation/quartz-2.3.0/tutorials/crontrigger.html). We recommend reading their [documentation](http://www.quartz-scheduler.org/documentation/quartz-2.3.0/tutorials/crontrigger.html) to understand the required formatting.
You can also refer to these examples: + +| Cron string | Sync Timing | +| -------------------- | ------------------------------------------------------ | +| 0 0 \* \* \* ? | Every hour, at 0 minutes past the hour | +| 0 0 15 \* \* ? | At 15:00 every day | +| 0 0 15 \* \* MON,TUE | At 15:00, only on Monday and Tuesday | +| 0 0 0,2,4,6 \* \* ? | At 12:00 AM, 02:00 AM, 04:00 AM and 06:00 AM every day | +| 0 0 \*/15 \* \* ? | At 0 minutes past the hour, every 15 hours | -| Cron string | Sync Timing| -| - | - | -| 0 0 * * * ? | Every hour, at 0 minutes past the hour | -| 0 0 15 * * ? | At 15:00 every day | -| 0 0 15 * * MON,TUE | At 15:00, only on Monday and Tuesday | -| 0 0 0,2,4,6 * * ? | At 12:00 AM, 02:00 AM, 04:00 AM and 06:00 AM every day | -| 0 0 */15 * * ? | At 0 minutes past the hour, every 15 hours | - When setting up the cron expression, you will also be asked to choose a time zone the sync will run in. ## Manual Syncs -When the connection is set to replicate with `Manual` frequency, the sync will not automatically run. -It can be triggered by clicking the "Sync Now" button at any time through the UI or be triggered through the API. \ No newline at end of file +When the connection is set to replicate with `Manual` frequency, the sync will not automatically run. + +It can be triggered by clicking the "Sync Now" button at any time through the UI or through the API. diff --git a/docs/using-airbyte/core-concepts/typing-deduping.md b/docs/using-airbyte/core-concepts/typing-deduping.md index c0c6c57906b..f5ed6ed5745 100644 --- a/docs/using-airbyte/core-concepts/typing-deduping.md +++ b/docs/using-airbyte/core-concepts/typing-deduping.md @@ -127,6 +127,7 @@ recommend altering the final tables (e.g. adding constraints) as it may cause is In some cases, you need to manually run a soft reset - for example, if you accidentally delete some records from the final table and want to repopulate them from the raw data. This can be done by: + 1. Dropping the final table entirely (`DROP TABLE `) 1. Unsetting the raw table's `_airbyte_loaded_at` column (`UPDATE airbyte_internal. SET _airbyte_loaded_at = NULL`) diff --git a/docs/using-airbyte/getting-started/add-a-destination.md b/docs/using-airbyte/getting-started/add-a-destination.md index fe0786fa2b4..637c7bcde6e 100644 --- a/docs/using-airbyte/getting-started/add-a-destination.md +++ b/docs/using-airbyte/getting-started/add-a-destination.md @@ -25,7 +25,7 @@ You can filter the list of destinations by support level. Airbyte connectors are ![Destination Page](./assets/getting-started-google-sheets-destination.png) -:::info +:::info Google Sheets imposes rate limits and hard limits on the amount of data it can receive. Only use Google Sheets as a destination for small, non-production use cases, as it is not designed for handling large-scale data operations. Read more about the [specific limitations](/integrations/destinations/google-sheets.md#limitations) in our Google Sheets documentation. @@ -34,13 +34,15 @@ The left half of the page contains a set of fields that you will have to fill out. In the **Destination name** field, you can enter a name of your choosing to help you identify this instance of the connector. By default, this will be set to the name of the destination (i.e., `Google Sheets`). Authenticate into your Google account by clicking "Sign in with Google" and granting permissions to Airbyte.
Because this is a simple Google Sheets destination, there is only one more required field, **Spreadsheet Link**. This is the path to your spreadsheet that can be copied directly from your browser. + As an example, we'll be setting up a simple JSON file that will be saved on our local system as the destination. Select **Local JSON** from the list of destinations. This will take you to the destination setup page. The left half of the page contains a set of fields that you will have to fill out. In the **Destination name** field, you can enter a name of your choosing to help you identify this instance of the connector. By default, this will be set to the name of the destination (i.e., `Local JSON`). - Because this is a simple JSON file, there is only one more required field, **Destination Path**. This is the path in your local filesystem where the JSON file containing your data will be saved. In our example, if we set the path to `/my_first_destination`, the file will be saved in `/tmp/airbyte_local/my_first_destination`. + Because this is a simple JSON file, there is only one more required field, **Destination Path**. This is the path in your local filesystem where the JSON file containing your data will be saved. In our example, if we set the path to `/my_first_destination`, the file will be saved in `/tmp/airbyte_local/my_first_destination`. + diff --git a/docs/using-airbyte/getting-started/add-a-source.md b/docs/using-airbyte/getting-started/add-a-source.md index 145d6152887..4f21f706134 100644 --- a/docs/using-airbyte/getting-started/add-a-source.md +++ b/docs/using-airbyte/getting-started/add-a-source.md @@ -23,4 +23,3 @@ Some sources will have an **Optional Fields** tab. You can open this tab to view Once you've filled out all the required fields, click on the **Set up source** button and Airbyte will run a check to verify the connection. Happy replicating! Can't find the connectors that you want? Try your hand at easily building one yourself using our [Connector Builder](../../connector-development/connector-builder-ui/overview.md)! - diff --git a/docs/using-airbyte/getting-started/readme.md b/docs/using-airbyte/getting-started/readme.md index 7b43f108ed9..0616d1120bf 100644 --- a/docs/using-airbyte/getting-started/readme.md +++ b/docs/using-airbyte/getting-started/readme.md @@ -20,7 +20,6 @@ Airbyte Cloud offers a 14-day free trial that begins after your first successful To start setting up a data pipeline, see how to [set up a source](./add-a-source.md). - ## Deploy Airbyte (Self-Managed) When self-managing Airbyte, your data never leaves your premises. Get started immediately by deploying locally using Docker. @@ -41,6 +40,7 @@ With Airbyte Self-Managed Community (Open Source), you can use one of the follow - [On AWS ECS](/deploying-airbyte/on-aws-ecs.md) (Spoiler alert: it doesn't work) ### Self-Managed Enterprise + Airbyte Self-Managed Enterprise is the best way to run Airbyte yourself. You get all 300+ pre-built connectors, data never leaves your environment, and Airbyte becomes self-serve in your organization with new tools to manage multiple users, and multiple teams using Airbyte all in one place. To start with Self-Managed Enterprise, navigate to our [Enterprise setup guide](/enterprise-setup/README.md). 
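A quick way to sanity-check the Local JSON example above is to peek inside the `/tmp/airbyte_local` mount after a sync completes. The sketch below is illustrative only: it assumes the **Destination Path** was set to `/my_first_destination` and that a stream named `users` was synced; the exact file name the connector writes may differ.

```bash
# Illustrative sketch: inspect the Local JSON destination output after a sync.
# Assumes Destination Path is /my_first_destination and a stream named "users";
# the actual file name written by the connector may differ.
ls /tmp/airbyte_local/my_first_destination/

# Pretty-print the records; each line is a JSON object carrying the record ID,
# the emitted-at timestamp, and the extracted data.
jq '.' /tmp/airbyte_local/my_first_destination/_airbyte_raw_users.jsonl
```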
diff --git a/docs/using-airbyte/getting-started/set-up-a-connection.md b/docs/using-airbyte/getting-started/set-up-a-connection.md index 7acc58028f9..3b2ff061802 100644 --- a/docs/using-airbyte/getting-started/set-up-a-connection.md +++ b/docs/using-airbyte/getting-started/set-up-a-connection.md @@ -9,7 +9,7 @@ import TabItem from "@theme/TabItem"; Now that you've learned how to set up your first [source](./add-a-source) and [destination](./add-a-destination), it's time to finish the setup by creating your very first connection! -On the left side of your main Airbyte dashboard, select **Connections**. You will be prompted to choose which source and destination to use for this connection. For this example, we'll use the **Google Sheets** source and the destination you previously set up, either **Local JSON** or **Google Sheets**. +On the left side of your main Airbyte dashboard, select **Connections**. You will be prompted to choose which source and destination to use for this connection. For this example, we'll use the **Google Sheets** source and the destination you previously set up, either **Local JSON** or **Google Sheets**. ## Configure the connection @@ -19,7 +19,7 @@ Most users select "Mirror Source", which will simply copy the data from the sour -Next, you can toggle which streams you want to replicate. Our test data consists of three streams, which we've enabled and set to `Incremental - Append + Deduped` sync mode. +Next, you can toggle which streams you want to replicate. Our test data consists of three streams, which we've enabled and set to `Incremental - Append + Deduped` sync mode. ![Setup streams](./assets/getting-started-select-streams.png) @@ -50,7 +50,7 @@ Here's a basic overview of the tabs and their use: 2. The **Job History** tab allows you to check the logs for each sync. If you encounter any errors or unexpected behaviors during a sync, checking the logs is always a good first step to finding the cause and solution. 3. The **Schema** tab allows you to modify the streams you chose during the connection setup. 4. The **Transformation** tab allows you to set up a custom post-sync transformations using dbt. -4. The **Settings** tab contains the connection settings, and the option to delete the connection if you no longer wish to use it. +5. The **Settings** tab contains the connection settings, and the option to delete the connection if you no longer wish to use it. ### Check the data from your first sync @@ -70,7 +70,7 @@ Once the first sync has completed, you can verify the sync has completed by chec You should see a list of JSON objects, each containing a unique `airbyte_ab_id`, an `emitted_at` timestamp, and `airbyte_data` containing the extracted record. -:::tip +:::tip If you are using Airbyte on Windows with WSL2 and Docker, refer to [this guide](/integrations/locating-files-local-destination.md) to locate the replicated folder and file. ::: diff --git a/docs/using-airbyte/workspaces.md b/docs/using-airbyte/workspaces.md index 099c7044b12..72bdd2a458f 100644 --- a/docs/using-airbyte/workspaces.md +++ b/docs/using-airbyte/workspaces.md @@ -4,7 +4,7 @@ products: cloud, oss-enterprise # Manage your workspace -A workspace in Airbyte allows you to collaborate with other users and manage connections together. +A workspace in Airbyte allows you to collaborate with other users and manage connections together. ## Add users to your workspace @@ -13,7 +13,7 @@ A workspace in Airbyte allows you to collaborate with other users and manage con 2. 
On the **Add new member** dialog, enter the email address of the user you want to invite to your workspace. Click **Add new member**. :::info -The user will have access to only the workspace you invited them to. They will be added with a role of `Workspace Admin`, which has the ability to add or delete other users and make changes to connections and connectors in the workspace. +The user will have access to only the workspace you invited them to. They will be added with a role of `Workspace Admin`, which has the ability to add or delete other users and make changes to connections and connectors in the workspace. ::: ## Remove users from your workspace​ @@ -35,13 +35,13 @@ To rename a workspace, go to the **Settings** via the side navigation in Airbyte To delete a workspace, go to the **Settings** via the side navigation in Airbyte. Navigate to **Workspace** > **General**. In the **Danger!** section, click **Delete your workspace**. ## Managing multiple workspaces - + You can have access to one or multiple workspaces with Airbyte Cloud, which gives you flexibility in managing user access and billing. Workspaces can also be linked through an organization, which allows you to collaborate with team members and share workspaces across your team. :::info Organizations are only available in Airbyte Cloud through Cloud Teams. [Get in touch](https://airbyte.com/company/talk-to-sales) with us if you would like to take advantage of organization features. ::: - + ### Billing across multiple workspaces Airbyte [credits](https://airbyte.com/pricing) are by default assigned per workspace and cannot be transferred between workspaces. [Get in touch](https://airbyte.com/company/talk-to-sales) with us if you would like to centralize billing across workspaces. @@ -50,13 +50,12 @@ Airbyte [credits](https://airbyte.com/pricing) are by default assigned per works Airbyte offers multiple user roles to enable teams to securely access workspaces or organizations. Some roles are only available to certain products. -| Role | Cloud | Cloud Teams | Enterprise | -|---|------|------|------| -|**Organization Admin:** Administer the whole organization, create workspaces in it, and manage organization permissions| |✅|✅| -|**Workspace Admin:** Administer the workspace, create workspace permissions|✅| | | -|**Workspace Reader:** View information within a workspace, cannot modify anything within a workspace| |✅|✅| +| Role | Cloud | Cloud Teams | Enterprise | +| ----------------------------------------------------------------------------------------------------------------------- | ----- | ----------- | ---------- | +| **Organization Admin:** Administer the whole organization, create workspaces in it, and manage organization permissions | | ✅ | ✅ | +| **Workspace Admin:** Administer the workspace, create workspace permissions | ✅ | | | +| **Workspace Reader:** View information within a workspace, cannot modify anything within a workspace | | ✅ | ✅ | ## Switch between multiple workspaces To switch between workspaces, click the current workspace name under the Airbyte logo in the navigation bar. Search for the workspace or click the name of the workspace you want to switch to. 
- diff --git a/tools/internal/README.md b/tools/internal/README.md index 88369410244..918325bc1d8 100644 --- a/tools/internal/README.md +++ b/tools/internal/README.md @@ -1,6 +1,7 @@ Scripts in this directory are for Airbyte's employees # `demo.sh` + This script helps maintain Airbyte's demo instance: ```shell @@ -9,6 +10,7 @@ This script helps maintain Airbyte's demo instance: ``` # `compare_versions.sh` + This script compare records output for two given connector versions ## Usage @@ -21,19 +23,17 @@ Config, configured catalog and state files should be saved in `config_files` fol config - `/config_files/secrets/config.json` -catalog - `/config_files/configured_catalog.json` +catalog - `/config_files/configured_catalog.json` state - `/config_files/state.json` (only if you want start sync with state is required) - - Enter connector name: [source-twitter] - Enter first connector version: [0.1.1] - Enter second connector version: [0.1.2] - Start sync with state (y/n)? [y/n] -Depend on choose sync will be started with state or without. -State should be present in `/config_files/state.json` to start sync with state. -After 3 wrong tries process will be finished with 1. - + Depending on your choice, the sync will start with or without state. + State should be present in `/config_files/state.json` to start a sync with state. + After 3 wrong tries, the process exits with code 1. If comparing successful and script didn't find difference you get `Records output equal.` Otherwise you get difference and `Records output not equal.` diff --git a/tools/openapi2jsonschema/README.md b/tools/openapi2jsonschema/README.md index c6f4be634f8..34fa1a14143 100644 --- a/tools/openapi2jsonschema/README.md +++ b/tools/openapi2jsonschema/README.md @@ -1,13 +1,17 @@ # openapi2jsonschema + Util for generating catalog schema from OpenAPI definition file. Forked from [openapi2jsonschema](https://github.com/instrumenta/openapi2jsonschema) util with fixes for generating standlone schemas e.g. ones that don't contain reference to other files/resources. ## Usage + ```bash $ tools/openapi2jsonschema/run.sh ``` -It would generate set of JSONSchema files based on components described on OpenAPI's definition and place it in "**schemas**" folder in the current working directory. - Support OpenAPI v2.0, v3.0 and v3.1. Works with both JSON and Yaml OpenAPI formats. +It will generate a set of JSONSchema files based on the components described in the OpenAPI definition and place them in a "**schemas**" folder in the current working directory. + +Supports OpenAPI v2.0, v3.0 and v3.1. Works with both JSON and YAML OpenAPI formats. ### Examples + You can try to run this tool on the sample OpenApi definition files located in [examples](./examples) directory. There are some OpenAPI files taken from APIs-guru repo [from github](https://github.com/APIs-guru). diff --git a/tools/site/README.md b/tools/site/README.md index 2c46dc0cbf2..ae7e0d57014 100644 --- a/tools/site/README.md +++ b/tools/site/README.md @@ -1,17 +1,21 @@ # Link Checker + Used to detect broken links in a domain. To check docs: + ```shell script ./tools/site/link_checker.sh check_docs ``` To run BLC: + ```shell script ./tools/site/link_checker.sh run --help ``` To update the image: + ```shell script ./tools/site/link_checker.sh publish -``` \ No newline at end of file +```
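To make the openapi2jsonschema usage above more concrete, here is a hedged sketch of one possible invocation against a spec from the bundled examples directory. The argument to `run.sh` and the `petstore.yaml` file name are assumptions for illustration, not something this patch confirms.

```bash
# Illustrative sketch: generate JSONSchema files from an OpenAPI definition.
# Assumes run.sh takes the spec path (or URL) as its first argument and that a
# petstore.yaml spec exists under the examples directory; adjust names as needed.
tools/openapi2jsonschema/run.sh tools/openapi2jsonschema/examples/petstore.yaml

# Generated schema files land in a "schemas" folder in the current working directory.
ls schemas/
```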