11059 multi-instance, versioned docs (#58095)
Co-authored-by: devin-ai-integration[bot] <158243242+devin-ai-integration[bot]@users.noreply.github.com>
@@ -1,49 +0,0 @@
---
products: all
---

# Data Transfer Options

A connection links a source to a destination and defines how your data will sync. After you have created a connection, you can modify any of the configuration settings or stream settings.

## Configure Connection Settings

Configuring the connection settings allows you to manage various aspects of the sync, such as how often data syncs and where data is written.

To configure these settings:

1. In the Airbyte UI, click **Connections** and then click the connection you want to change.
2. Click the **Settings** tab.
3. Click the **Advanced settings** dropdown to display all settings.

:::note
These settings apply to all streams in the connection.
:::

You can configure the following settings:

| Connection Setting | Description |
| ------------------ | ----------- |
| Connection Name | A custom name for your connection |
| [Schedule Type](/using-airbyte/core-concepts/sync-schedules.md) | Configure how often data syncs (can be scheduled, cron, or manually triggered) |
| [Destination Namespace](/using-airbyte/core-concepts/namespaces.md) | Determines where the replicated data is written in the destination |
| [Destination Stream Prefix](/using-airbyte/configuring-schema.md) | (Optional) Adds a prefix to each table name in the destination |
| [Detect and propagate schema changes](/using-airbyte/schema-change-management.md) | Set how Airbyte handles schema changes in the source |
| [Connection Data Residency](/cloud/managing-airbyte-cloud/manage-data-residency.md) | Determines where data will be processed (Cloud only) |

## Configure Stream Settings

In addition to connection settings, you can apply the following settings to each individual stream. This allows for greater flexibility in how your data syncs.

| Stream Setting | Description |
| -------------- | ----------- |
| [Stream selection](/using-airbyte/configuring-schema.md) | Determines whether the stream syncs to your destination |
| [Sync mode](/using-airbyte/core-concepts/sync-modes/README.md) | Configure how Airbyte reads data from the source and writes it |
| [Cursor selection](/using-airbyte/configuring-schema.md) | Select which field the stream uses to incrementally read from the source |
| [Primary key selection](/using-airbyte/configuring-schema.md) | Select which field the stream uses to determine the uniqueness of a record |
| [Field selection](/using-airbyte/configuring-schema.md) | (Optional) Deselect fields that Airbyte should not sync to the destination |
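The cursor and primary key settings above can be illustrated with a small conceptual sketch. This is not Airbyte's implementation; the row shape and field names are illustrative only:

```python
# Conceptual sketch of cursor-based incremental reads and primary-key
# deduplication. Illustrative only; not Airbyte's implementation.
source_rows = [
    {"id": 1, "updated_at": "2024-01-01"},
    {"id": 2, "updated_at": "2024-01-05"},
    {"id": 2, "updated_at": "2024-01-09"},  # newer version of record 2
]

def incremental_read(rows, cursor_field, last_cursor):
    # Only rows whose cursor value is past the saved checkpoint are read
    # on the next sync.
    return [r for r in rows if r[cursor_field] > last_cursor]

def dedupe(rows, primary_key, cursor_field):
    # The primary key decides uniqueness; the latest cursor value wins.
    latest = {}
    for row in sorted(rows, key=lambda r: r[cursor_field]):
        latest[row[primary_key]] = row
    return list(latest.values())

new_rows = incremental_read(source_rows, "updated_at", "2024-01-04")
unique_rows = dedupe(new_rows, "id", "updated_at")
```

In an "Incremental | Append + Deduped" sync mode, both ideas combine: only rows past the cursor are read, and the primary key keeps one row per record.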

33
docs/home/readme.md
Normal file
@@ -0,0 +1,33 @@
# Airbyte documentation

Airbyte is open source data movement infrastructure for building extract and load (EL) data pipelines. It is designed for versatility, scalability, and ease of use.

## Get started

<Grid columns="3">

<CardWithIcon title="Platform help" description="Deploy Airbyte locally, to cloud providers, or use Airbyte Cloud. Create connections, build custom connectors, and start syncing data in minutes." ctaText="Let's go!" ctaLink="/platform/" icon="enterprise" />

<CardWithIcon title="Connector catalog" description="Browse Airbyte's catalog of over 600 sources and destinations, and learn to set them up in Airbyte." ctaText="Browse connectors" ctaLink="./integrations/" icon="oss" ctaVariant="secondary" />

<CardWithIcon title="Release notes" description="See what's new. Airbyte releases new Self-Managed versions regularly. Airbyte Cloud customers always have the latest enhancements." ctaText="What's new" ctaLink="/release_notes/" icon="cloud" ctaVariant="secondary" />

</Grid>

<Arcade id="8UUaeQOILatZ38Rjh8cs" title="Airbyte Demo: Get Started Creating Connections" paddingBottom="calc(61.416666666666664% + 41px)" />
## Why Airbyte?

Teams and organizations need efficient and timely data access to an ever-growing list of data sources. In-house data pipelines are brittle and costly to build and maintain. Airbyte's unique open source approach enables your data stack to adapt as your data needs evolve.

- **Wide connector availability:** Airbyte’s connector catalog comes “out-of-the-box” with over 600 pre-built connectors. These connectors can be used to start replicating data from a source to a destination in just a few minutes.
- **Long-tail connector coverage:** You can easily extend Airbyte’s capability to support your custom use cases through Airbyte's [No-Code Connector Builder](/platform/connector-development/connector-builder-ui/overview).
- **Robust platform:** Provides the horizontal scaling required for large-scale data movement operations, available as [Cloud-managed](https://airbyte.com/product/airbyte-cloud) or [Self-managed](https://airbyte.com/product/airbyte-enterprise).
- **Accessible user interfaces:** Work through the UI, [**PyAirbyte**](/platform/using-airbyte/pyairbyte/getting-started) (Python library), [**API**](/platform/api-documentation), and [**Terraform Provider**](/platform/terraform-documentation) to integrate with your preferred tooling and approach to infrastructure management.

Airbyte is suitable for a wide range of data integration use cases, including AI data infrastructure and EL(T) workloads. Airbyte is also [embeddable](https://airbyte.com/product/powered-by-airbyte) within your own app or platform to power your product.
@@ -1,6 +1,6 @@
import ConnectorRegistry from '@site/src/components/ConnectorRegistry';

# Connector Catalog
# Connectors

## Introduction to Connectors

@@ -8,13 +8,13 @@ Each source or destination is a connector. A source is an API, file, database, o

By browsing the catalog, you can see useful links to documentation, source code, and issues related to each connector. You'll also be able to see whether a connector is supported on our Open Source Software (OSS), our Cloud platform, or both.

As an open source project, Airbyte's catalog of connectors is continually growing thanks to community contributions as well as development by the Airbyte team. Airbyte enables you to [build new connectors](/connector-development/). We encourage you to consider contributing enhancements, bug fixes, or features to existing connectors or to submit entirely new connectors you've built for inclusion in the connector catalog. That said, you always have the option to publish connectors privately, to your own workspaces.
As an open source project, Airbyte's catalog of connectors is continually growing thanks to community contributions as well as development by the Airbyte team. Airbyte enables you to [build new connectors](../platform/connector-development/). We encourage you to consider contributing enhancements, bug fixes, or features to existing connectors or to submit entirely new connectors you've built for inclusion in the connector catalog. That said, you always have the option to publish connectors privately, to your own workspaces.

Learn more about contributing to Airbyte [here](/contributing-to-airbyte/).
Learn more about contributing to Airbyte [here](../platform/contributing-to-airbyte/).

## Connector Support Levels

Airbyte uses a tiered system for connectors to help you understand what to expect from a connector. In short, there are three tiers: Airbyte Connectors, Marketplace Connectors, and Custom Connectors. Review the documentation on [connector support levels](./connector-support-levels.md) for details on each tier.
Airbyte uses a tiered system for connectors to help you understand what to expect from a connector. In short, there are three tiers: Airbyte Connectors, Marketplace Connectors, and Custom Connectors. Review the documentation on [connector support levels](connector-support-levels) for details on each tier.

_[View the connector registries in full](https://connectors.airbyte.com/files/generated_reports/connector_registry_report.html)_

@@ -67,6 +67,6 @@ GitHub.

If you wish to take over the maintenance of an archived connector, please open a GitHub Discussion.
For API Sources (python), updating the connector to the latest version of the
[CDK](/connector-development/cdk-python/) and ensuring that the connector successfully passes the
[Connector Acceptance Tests](/connector-development/testing-connectors/connector-acceptance-tests-reference)
[CDK](/platform/connector-development/cdk-python/) and ensuring that the connector successfully passes the
[Connector Acceptance Tests](/platform/connector-development/testing-connectors/connector-acceptance-tests-reference)
is the start to the un-archiving process.

@@ -6,7 +6,7 @@ description: Missing a connector?

If you'd like to **ask for a new connector,** you can request it directly [here](https://github.com/airbytehq/airbyte/discussions/new?category=new-connector-request).

If you'd like to build new connectors and **make them part of the pool of pre-built connectors on Airbyte,** first a big thank you. We invite you to check our [contributing guide on building connectors](../contributing-to-airbyte/README.md).
If you'd like to build new connectors and **make them part of the pool of pre-built connectors on Airbyte,** first a big thank you. We invite you to check our [contributing guide on building connectors](../platform/contributing-to-airbyte/).

If you'd like to build new connectors, or update existing ones, **for your own usage,** without contributing to the Airbyte codebase, read along.

@@ -20,7 +20,7 @@ You should only build and deploy your own connector in code (using Python or Jav

### Really need to build your own connector from scratch?

It's easy to build your own connectors for Airbyte. You can learn how to build new connectors using either our Connector Builder or our connector CDKs [here](../connector-development/README.md).
It's easy to build your own connectors for Airbyte. You can learn how to build new connectors using either our Connector Builder or our connector CDKs [here](/platform/connector-development/).

While the guides in the link above are specific to the languages used most frequently to write integrations, **Airbyte connectors can be written in any language**. Please reach out to us if you'd like help developing connectors in other languages.

@@ -29,10 +29,10 @@ If you don't use one of the official development options, remember to set the `A
Otherwise, your connector will not run correctly.
:::

Learn how to upload Docker-based custom connectors [here](../../operator-guides/using-custom-connectors/)
Learn how to upload Docker-based custom connectors [here](../../platform/operator-guides/using-custom-connectors/)

## Upgrading a connector

To upgrade your connector version, go to Settings on the left-hand side of the UI and navigate to either Sources or Destinations. Find your connector in the list, and input the latest connector version.

@@ -4,7 +4,7 @@

### Airbyte field names

This version updates the Azure Blob Storage destination connector to use the [DV2 Airbyte metadata field names](../../understanding-airbyte/airbyte-metadata-fields). You should update any downstream consumers to reference the new field names. Specifically, these two fields have been renamed:
This version updates the Azure Blob Storage destination connector to use the [DV2 Airbyte metadata field names](../../platform/understanding-airbyte/airbyte-metadata-fields). You should update any downstream consumers to reference the new field names. Specifically, these two fields have been renamed:

| Old field name | New field name |
| --------------------- | -------------------------- |

@@ -31,7 +31,7 @@ as `<stream_namespace>/<stream_name>/yyyy_mm_dd_<unix_epoch>_<part_number>.<file

### CSV

Like most other Airbyte destination connectors, the output contains your data, along with some [metadata fields](/understanding-airbyte/airbyte-metadata-fields).
Like most other Airbyte destination connectors, the output contains your data, along with some [metadata fields](/platform/understanding-airbyte/airbyte-metadata-fields).
If you select the "root level flattening" option, your data will be promoted to additional columns; if you select "no flattening", your data
will be left as a JSON blob inside the `_airbyte_data` column.
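The difference between the two options can be sketched with a small Python example. The record payload and metadata values here are hypothetical; only the field names follow the metadata fields described above:

```python
import json

# A hypothetical record as emitted by a source, wrapped with Airbyte
# metadata fields.
record = {
    "_airbyte_raw_id": "some-uuid",
    "_airbyte_extracted_at": "2024-01-01T00:00:00Z",
    "_airbyte_data": {"id": 1, "name": "alice"},
}

def no_flattening(rec):
    # Data stays as a JSON blob inside the _airbyte_data column.
    out = {k: v for k, v in rec.items() if k != "_airbyte_data"}
    out["_airbyte_data"] = json.dumps(rec["_airbyte_data"])
    return out

def root_level_flattening(rec):
    # Top-level keys of the data are promoted to their own columns.
    out = {k: v for k, v in rec.items() if k != "_airbyte_data"}
    out.update(rec["_airbyte_data"])
    return out

flat = root_level_flattening(record)
blob = no_flattening(record)
```

With root level flattening, `id` and `name` become columns of their own; with no flattening, they stay inside the serialized `_airbyte_data` value.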

@@ -62,7 +62,7 @@ With root level flattening, the output CSV is:
### JSON Lines \(JSONL\)

[JSON Lines](https://jsonlines.org/) is a text format with one JSON per line. As with the [CSV](#csv) format, this connector will write your data along
with some [metadata fields](/understanding-airbyte/airbyte-metadata-fields). You can enable "root level flattening" to promote your data to the root
with some [metadata fields](/platform/understanding-airbyte/airbyte-metadata-fields). You can enable "root level flattening" to promote your data to the root
of the JSON object, or use "no flattening" to leave your data inside the `_airbyte_data` object.

For example, given the following two JSON objects from a source:

@@ -11,4 +11,4 @@ Worthy of specific mention, this version includes:
- Removal of sub-tables for nested properties
- Removal of SCD tables

Learn more about what's new in Destinations V2 [here](/using-airbyte/core-concepts/typing-deduping).
Learn more about what's new in Destinations V2 [here](/platform/using-airbyte/core-concepts/typing-deduping).

@@ -129,7 +129,7 @@ The raw table contains these fields:
- `_airbyte_meta`
- `_airbyte_data`

`_airbyte_data` is a JSON blob with the event data. See [here](/understanding-airbyte/airbyte-metadata-fields)
`_airbyte_data` is a JSON blob with the event data. See [here](/platform/understanding-airbyte/airbyte-metadata-fields)
for more information about the other fields.
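As a sketch of what a downstream consumer does with a raw-table row, the JSON blob in `_airbyte_data` can be parsed to reach individual event fields. The row below is hypothetical:

```python
import json

# Hypothetical raw-table row: _airbyte_data holds the event payload as a
# JSON blob, alongside metadata fields such as _airbyte_meta.
raw_row = {
    "_airbyte_meta": '{"changes": []}',
    "_airbyte_data": '{"user_id": 42, "email": "a@example.com"}',
}

# Parse the blob to reach individual event fields.
event = json.loads(raw_row["_airbyte_data"])
user_id = event["user_id"]
```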

**Note:** Although the contents of the `_airbyte_data` are fairly stable, the schema of the raw table
@@ -143,7 +143,7 @@ The final table contains these fields, in addition to the columns declared in yo
- `airbyte_extracted_at`
- `_airbyte_meta`

Again, see [here](/understanding-airbyte/airbyte-metadata-fields) for more information about these fields.
Again, see [here](/platform/understanding-airbyte/airbyte-metadata-fields) for more information about these fields.

The output tables in BigQuery are partitioned by the Time-unit column `airbyte_extracted_at` at a
daily granularity and clustered by `airbyte_extracted_at` and the table Primary Keys. Partitions
@@ -230,7 +230,7 @@ tutorials:
| 2.8.1 | 2024-06-25 | [39379](https://github.com/airbytehq/airbyte/pull/39379) | Removing requirement of a redundant permission bigquery.datasets.create permission |
| 2.8.0 | 2024-06-21 | [39904](https://github.com/airbytehq/airbyte/pull/39904) | Convert all production code to kotlin |
| 2.7.1 | 2024-06-17 | [39526](https://github.com/airbytehq/airbyte/pull/39526) | Internal code change for improved error reporting in case of source/platform failure (`INCOMPLETE` stream status / empty ConfiguredCatalog). |
| 2.7.0 | 2024-06-17 | [38713](https://github.com/airbytehq/airbyte/pull/38713) | Support for [refreshes](../../operator-guides/refreshes.md) and resumable full refresh. WARNING: You must upgrade to platform 0.63.7 before upgrading to this connector version. |
| 2.7.0 | 2024-06-17 | [38713](https://github.com/airbytehq/airbyte/pull/38713) | Support for [refreshes](/platform/operator-guides/refreshes) and resumable full refresh. WARNING: You must upgrade to platform 0.63.7 before upgrading to this connector version. |
| 2.6.3 | 2024-06-10 | [38331](https://github.com/airbytehq/airbyte/pull/38331) | Internal code changes in preparation for future feature release |
| 2.6.2 | 2024-06-07 | [38764](https://github.com/airbytehq/airbyte/pull/38764) | Increase message length limit to 50MiB |
| 2.6.1 | 2024-05-29 | [38770](https://github.com/airbytehq/airbyte/pull/38770) | Internal code change (switch to CDK artifact) |

@@ -69,7 +69,7 @@ You can also copy the output file to your host machine, the following command wi
docker cp airbyte-server:/tmp/airbyte_local/{destination_path}/(unknown).csv .
```

Note: If you are running Airbyte on Windows with Docker backed by WSL2, you have to use a similar step as above or refer to this [link](/integrations/locating-files-local-destination.md) for an alternative approach.
Note: If you are running Airbyte on Windows with Docker backed by WSL2, you have to use a similar step as above or refer to this [link](/integrations/locating-files-local-destination) for an alternative approach.

## Changelog

@@ -24,4 +24,4 @@ Worthy of specific mention, this version includes:
- Removal of sub-tables for nested properties
- Removal of SCD tables

Learn more about what's new in Destinations V2 [here](/using-airbyte/core-concepts/typing-deduping).
Learn more about what's new in Destinations V2 [here](/platform/using-airbyte/core-concepts/typing-deduping).

@@ -30,11 +30,11 @@ When setting up a Databricks destination, you need these pieces of information:
1. Open the workspace console.
2. Open your SQL warehouse.
3. Open the Connection Details tab.
4. Finally, you'll need to provide the `Databricks Unity Catalog Path`, which is the path to the database you wish to use within the Unity Catalog. This is often the same as the workspace name.

@@ -50,11 +50,11 @@ to generate a client ID and secret.
1. Open your workspace console.
2. Click on your icon in the top-right corner, and head to `settings`, then `developer`, then `manage` under `access tokens`.
3. Enter a description for the token and how long it will be valid for (or leave blank for a permanent token).

### Other Options

@@ -79,7 +79,7 @@ Each table will have the following columns, in addition to whatever columns
| `_airbyte_raw_id` | string | A random UUID. |
| `_airbyte_extracted_at` | timestamp | Timestamp when the source read the record. |
| `_airbyte_loaded_at` | timestamp | Timestamp when the record was written to the destination. |
| `_airbyte_generation_id` | bigint | See the [refreshes](../../operator-guides/refreshes.md) documentation. |
| `_airbyte_generation_id` | bigint | See the [refreshes](../../platform/operator-guides/refreshes) documentation. |

Airbyte will also produce "raw tables" (by default in the `airbyte_internal` schema). We do not recommend directly interacting
with the raw tables, and their format is subject to change without notice.
@@ -103,7 +103,7 @@ with the raw tables, and their format is subject to change without notice.
| 3.2.2 | 2024-08-22 | [#44941](https://github.com/airbytehq/airbyte/pull/44941) | Clarify Unity Catalog Path option. |
| 3.2.1 | 2024-08-22 | [#44506](https://github.com/airbytehq/airbyte/pull/44506) | Handle uppercase/mixed-case stream name/namespaces |
| 3.2.0 | 2024-08-12 | [#40712](https://github.com/airbytehq/airbyte/pull/40712) | Rely solely on PAT, instead of also needing a user/pass |
| 3.1.0 | 2024-07-22 | [#40692](https://github.com/airbytehq/airbyte/pull/40692) | Support for [refreshes](../../operator-guides/refreshes.md) and resumable full refresh. WARNING: You must upgrade to platform 0.63.7 before upgrading to this connector version. |
| 3.1.0 | 2024-07-22 | [#40692](https://github.com/airbytehq/airbyte/pull/40692) | Support for [refreshes](../../platform/operator-guides/refreshes) and resumable full refresh. WARNING: You must upgrade to platform 0.63.7 before upgrading to this connector version. |
| 3.0.0 | 2024-07-12 | [#40689](https://github.com/airbytehq/airbyte/pull/40689) | (Private release, not to be used for production) Add `_airbyte_generation_id` column, and `sync_id` entry in `_airbyte_meta` |
| 2.0.0 | 2024-05-17 | [#37613](https://github.com/airbytehq/airbyte/pull/37613) | (Private release, not to be used for production) Alpha release of the connector to use Unity Catalog |
| 1.1.2 | 2024-04-04 | [#36846](https://github.com/airbytehq/airbyte/pull/36846) | (incompatible with CDK, do not use) Remove duplicate S3 Region |

@@ -98,7 +98,7 @@ You can also copy the output file to your host machine, the following command wi
docker cp airbyte-server:/tmp/airbyte_local/{destination_path} .
```

Note: If you are running Airbyte on Windows with Docker backed by WSL2, you have to use a similar step as above or refer to this [link](/integrations/locating-files-local-destination.md) for an alternative approach.
Note: If you are running Airbyte on Windows with Docker backed by WSL2, you have to use a similar step as above or refer to this [link](/integrations/locating-files-local-destination) for an alternative approach.

<!-- /env:oss -->

@@ -107,7 +107,7 @@ Note: If you are running Airbyte on Windows with Docker backed by WSL2, you have
### Error message `Request failed: (UNAVAILABLE, RPC 'GET_WELCOME_PACK')`

This error may indicate that you are connecting with a `0.10.x` DuckDB client (as per DuckDB Destination connector versions `>=0.4.0`) and your database has not yet been upgraded to a version `>=0.10.x`. To resolve this, you'll need to manually upgrade your database or revert to a previous version of the DuckDB Destination connector.
For information about migrating between different versions of DuckDB, please see the [DuckDB Migration Guide](./duckdb-migrations.md).
For information about migrating between different versions of DuckDB, please see the [DuckDB Migration Guide](./duckdb-migrations).

@@ -42,7 +42,7 @@ The Airbyte GCS destination allows you to sync data to cloud storage buckets. Ea
| Feature | Support | Notes |
| :----------------------------- | :-----: | :---- |
| Full Refresh Sync | ✅ | Warning: this mode deletes all previously synced data in the configured bucket path. |
| Incremental - Append Sync | ✅ | Warning: Airbyte provides at-least-once delivery. Depending on your source, you may see duplicated data. Learn more [here](/using-airbyte/core-concepts/sync-modes/incremental-append#inclusive-cursors) |
| Incremental - Append Sync | ✅ | Warning: Airbyte provides at-least-once delivery. Depending on your source, you may see duplicated data. Learn more [here](/platform/using-airbyte/core-concepts/sync-modes/incremental-append#inclusive-cursors) |
| Incremental - Append + Deduped | ❌ | |
| Namespaces | ❌ | Setting a specific bucket path is equivalent to having separate namespaces. |
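Because Incremental - Append is at-least-once and this destination has no built-in deduplication, a downstream consumer can drop re-delivered records itself. A minimal sketch, with hypothetical record payloads, that keeps the first occurrence of each identical payload:

```python
import json

# Appended records from an at-least-once sync; the last one is a
# re-delivered duplicate of record 1 (hypothetical payloads).
records = [
    {"id": 1, "name": "alice"},
    {"id": 2, "name": "bob"},
    {"id": 1, "name": "alice"},
]

seen = set()
unique = []
for rec in records:
    # Serialize with sorted keys so identical payloads compare equal.
    key = json.dumps(rec, sort_keys=True)
    if key not in seen:
        seen.add(key)
        unique.append(rec)
```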

@@ -69,7 +69,7 @@ You can also copy the output file to your host machine, the following command wi
docker cp airbyte-server:/tmp/airbyte_local/{destination_path}/(unknown).jsonl .
```

Note: If you are running Airbyte on Windows with Docker backed by WSL2, you have to use a similar step as above or refer to this [link](/integrations/locating-files-local-destination.md) for an alternative approach.
Note: If you are running Airbyte on Windows with Docker backed by WSL2, you have to use a similar step as above or refer to this [link](/integrations/locating-files-local-destination) for an alternative approach.

## Changelog

@@ -10,7 +10,7 @@ For file-based DBs, data is written to `/tmp/airbyte_local` by default. To chang

This destination implements [Destinations V2](/release_notes/upgrading_to_destinations_v2/#what-is-destinations-v2), which provides improved final table structures. It's a new version of the existing DuckDB destination and works both with DuckDB and MotherDuck.

Learn more about what's new in Destinations V2 [here](/using-airbyte/core-concepts/typing-deduping).
Learn more about what's new in Destinations V2 [here](/platform/using-airbyte/core-concepts/typing-deduping).

## Use with MotherDuck

@@ -53,9 +53,9 @@ In addition, columns specified in the [JSON schema](https://docs.airbyte.com/con
| Full Refresh Sync | Yes | |
| Incremental - Append Sync | Yes | |
| Incremental - Append + Deduped | Yes | |
| [Typing and Deduplication](/using-airbyte/core-concepts/typing-deduping) | Yes | |
| [Namespaces](/using-airbyte/core-concepts/namespaces) | No | |
| [Data Generations](/operator-guides/refreshes#data-generations) | No | |
| [Typing and Deduplication](/platform/using-airbyte/core-concepts/typing-deduping) | Yes | |
| [Namespaces](/platform/using-airbyte/core-concepts/namespaces) | No | |
| [Data Generations](/platform/operator-guides/refreshes#data-generations) | No | |

#### Performance consideration

@@ -18,7 +18,7 @@ Each stream will be output into its own table in SQL Server. Each table will con
- `_airbyte_meta`: Additional information about the record. The column type in SQL Server is `TEXT`.
- `_airbyte_generation_id`: Incremented each time a [refresh](https://docs.airbyte.com/operator-guides/refreshes) is executed. The column type in SQL Server is `TEXT`.

See [here](../../understanding-airbyte/airbyte-metadata-fields) for more information about these fields.
See [here](../../platform/understanding-airbyte/airbyte-metadata-fields) for more information about these fields.

## Getting Started

@@ -11,4 +11,4 @@ Worthy of specific mention, this version includes:
- Removal of sub-tables for nested properties
- Removal of SCD tables

Learn more about what's new in Destinations V2 [here](/using-airbyte/core-concepts/typing-deduping).
Learn more about what's new in Destinations V2 [here](/platform/using-airbyte/core-concepts/typing-deduping).

@@ -12,4 +12,4 @@ Worthy of specific mention, this version includes:
- Removal of SCD tables
- Preserving [upper case column names](https://docs.airbyte.com/release_notes/upgrading_to_destinations_v2/#destinations-v2-implementation-differences)

Learn more about what's new in Destinations V2 [here](/using-airbyte/core-concepts/typing-deduping).
Learn more about what's new in Destinations V2 [here](/platform/using-airbyte/core-concepts/typing-deduping).

@@ -274,11 +274,11 @@ _where_ it is deployed.
| 2.4.0 | 2024-08-18 | [\#45434](https://github.com/airbytehq/airbyte/pull/45434) | upgrade all dependencies. |
| 2.3.2 | 2024-08-07 | [\#43331](https://github.com/airbytehq/airbyte/pull/43331) | bump java CDK. |
| 2.3.1 | 2024-08-07 | [\#43363](https://github.com/airbytehq/airbyte/pull/43363) | Adopt latest CDK. |
| 2.3.0 | 2024-07-22 | [\#41954](https://github.com/airbytehq/airbyte/pull/41954) | Support for [refreshes](../../operator-guides/refreshes.md) and resumable full refresh. WARNING: You must upgrade to platform 0.63.7 before upgrading to this connector version. |
| 2.3.0 | 2024-07-22 | [\#41954](https://github.com/airbytehq/airbyte/pull/41954) | Support for [refreshes](../../platform/operator-guides/refreshes) and resumable full refresh. WARNING: You must upgrade to platform 0.63.7 before upgrading to this connector version. |
| 2.2.1 | 2024-07-22 | [\#42423](https://github.com/airbytehq/airbyte/pull/42423) | no-op. Bumping to a clean image |
| 2.2.0 | 2024-07-22 | [\#42423](https://github.com/airbytehq/airbyte/pull/42423) | Revert refreshes support |
| 2.1.1 | 2024-07-22 | [\#42415](https://github.com/airbytehq/airbyte/pull/42415) | fixing PostgresSqlOperations.isOtherGenerationIdInTable to close the streams coming from JdbcDatabase.unsafeQuery |
| 2.1.0 | 2024-07-22 | [\#41954](https://github.com/airbytehq/airbyte/pull/41954) | Support for [refreshes](../../operator-guides/refreshes.md) and resumable full refresh. WARNING: You must upgrade to platform 0.63.7 before upgrading to this connector version. |
| 2.1.0 | 2024-07-22 | [\#41954](https://github.com/airbytehq/airbyte/pull/41954) | Support for [refreshes](../../platform/operator-guides/refreshes) and resumable full refresh. WARNING: You must upgrade to platform 0.63.7 before upgrading to this connector version. |
| 2.0.15 | 2024-06-26 | [\#40554](https://github.com/airbytehq/airbyte/pull/40554) | Convert all strict-encrypt prod code to kotlin. |
| 2.0.14 | 2024-06-26 | [\#40563](https://github.com/airbytehq/airbyte/pull/40563) | Convert all test code to kotlin. |
| 2.0.13 | 2024-06-13 | [\#40159](https://github.com/airbytehq/airbyte/pull/40159) | Config error on drop failure when cascade is disabled |

@@ -12,7 +12,7 @@ Postgres, while an excellent relational database, is not a data warehouse. Pleas

1. Postgres is likely to perform poorly with large data volumes. Even postgres-compatible
destinations (e.g. AWS Aurora) are not immune to slowdowns when dealing with large writes or
updates over ~100GB. Especially when using [typing and deduplication](/using-airbyte/core-concepts/typing-deduping) with `destination-postgres`, be sure to
updates over ~100GB. Especially when using [typing and deduplication](/platform/using-airbyte/core-concepts/typing-deduping) with `destination-postgres`, be sure to
monitor your database's memory and CPU usage during your syncs. It is possible for your
destination to 'lock up', and incur high usage costs with large sync volumes.
2. When attempting to scale a postgres database to handle larger data volumes, scaling IOPS (disk throughput) is as important as increasing memory and compute capacity.

@@ -23,4 +23,4 @@ Worthy of specific mention, this version includes:
|
||||
* Removal of sub-tables for nested properties
|
||||
* Removal of SCD tables
|
||||
|
||||
Learn more about what's new in Destinations V2 [here](/using-airbyte/core-concepts/typing-deduping).
|
||||
Learn more about what's new in Destinations V2 [here](/platform/using-airbyte/core-concepts/typing-deduping).
|
||||
|
||||
@@ -232,7 +232,7 @@ Each stream will be output into its own raw table in Redshift. Each table will c

| 3.4.1 | 2024-08-13 | [xxx](https://github.com/airbytehq/airbyte/pull/xxx) | Simplify Redshift Options |
| 3.4.0 | 2024-07-23 | [42445](https://github.com/airbytehq/airbyte/pull/42445) | Respect the `drop cascade` option on raw tables |
| 3.3.1 | 2024-07-15 | [41968](https://github.com/airbytehq/airbyte/pull/41968) | Don't hang forever on empty stream list; shorten error message on INCOMPLETE stream status |
-| 3.3.0 | 2024-07-02 | [40567](https://github.com/airbytehq/airbyte/pull/40567) | Support for [refreshes](../../operator-guides/refreshes.md) and resumable full refresh. WARNING: You must upgrade to platform 0.63.7 before upgrading to this connector version. |
+| 3.3.0 | 2024-07-02 | [40567](https://github.com/airbytehq/airbyte/pull/40567) | Support for [refreshes](../../platform/operator-guides/refreshes) and resumable full refresh. WARNING: You must upgrade to platform 0.63.7 before upgrading to this connector version. |
| 3.2.0 | 2024-07-02 | [40201](https://github.com/airbytehq/airbyte/pull/40201) | Add `_airbyte_generation_id` column, and add `sync_id` to `_airbyte_meta` column |
| 3.1.1 | 2024-06-26 | [39008](https://github.com/airbytehq/airbyte/pull/39008) | Internal code changes |
| 3.1.0 | 2024-06-26 | [39141](https://github.com/airbytehq/airbyte/pull/39141) | Remove nonfunctional "encrypted staging" option |
@@ -236,12 +236,12 @@ This connector never rewrites existing Iceberg data files. This means Airbyte ca

You have the following options to manage schema evolution.

-- To handle unsupported schema changes automatically, use [Full Refresh - Overwrite](../../using-airbyte/core-concepts/sync-modes/full-refresh-overwrite) as your [sync mode](../../using-airbyte/core-concepts/sync-modes).
+- To handle unsupported schema changes automatically, use [Full Refresh - Overwrite](../../platform/using-airbyte/core-concepts/sync-modes/full-refresh-overwrite) as your [sync mode](../../platform/using-airbyte/core-concepts/sync-modes).
- To handle unsupported schema changes as they occur, wait for a sync to fail, then take action to restore it. Either:

  - Manually edit your table schema in Iceberg directly.
-  - [Refresh](../../operator-guides/refreshes) your connection in Airbyte.
-  - [Clear](../../operator-guides/clear) your connection in Airbyte.
+  - [Refresh](../../platform/operator-guides/refreshes) your connection in Airbyte.
+  - [Clear](../../platform/operator-guides/clear) your connection in Airbyte.

## Deduplication

@@ -257,7 +257,7 @@ The S3 Data Lake connector assumes that one of two things is true:

- The source never emits the same primary key twice in a single sync attempt.
- If the source emits the same primary key multiple times in a single attempt, it always emits those records in cursor order from oldest to newest.

-If these conditions aren't met, you may see inaccurate data in Iceberg in the form of older records taking precedence over newer records. If this happens, use append or overwrite as your [sync modes](../../using-airbyte/core-concepts/sync-modes/).
+If these conditions aren't met, you may see inaccurate data in Iceberg in the form of older records taking precedence over newer records. If this happens, use append or overwrite as your [sync modes](../../platform/using-airbyte/core-concepts/sync-modes/).

An unknown number of API sources have streams that don't meet these conditions. Airbyte knows [Stripe](../sources/stripe) and [Monday](../sources/monday) don't, but there are probably others.
@@ -285,7 +285,7 @@ For example, the following table contains three versions of the 'Alice' record.

| 1 | Alice | 2024-03-02 12:00 | 2024-03-02 12:10 |
| 1 | Alice | 2024-03-03 14:00 | 2024-03-03 14:10 |

-To mitigate this, generate a flag to detect outdated records. Airbyte generates an `_airbyte_extracted_at` [metadata field](../../understanding-airbyte/airbyte-metadata-fields.md) that assists with this.
+To mitigate this, generate a flag to detect outdated records. Airbyte generates an `_airbyte_extracted_at` [metadata field](../../platform/understanding-airbyte/airbyte-metadata-fields) that assists with this.

```sql
(row_number() over (partition by {primary_key} order by {cursor}, _airbyte_extracted_at)) != 1 OR _ab_cdc_deleted_at IS NOT NULL as is_outdated;
```
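As a hedged illustration of this flag, the windowing logic can be sketched in plain Python (the records are hypothetical, and the sketch assumes the newest row per key, by cursor then extraction time, is the current one):

```python
from itertools import groupby

# Hypothetical records: (primary_key, cursor, _airbyte_extracted_at, _ab_cdc_deleted_at)
rows = [
    (1, "2024-03-01 11:00", "2024-03-01 11:10", None),
    (1, "2024-03-02 12:00", "2024-03-02 12:10", None),
    (1, "2024-03-03 14:00", "2024-03-03 14:10", None),
    (2, "2024-03-01 09:00", "2024-03-01 09:05", "2024-03-02 00:00"),
]

def flag_outdated(records):
    # Within each primary key, rank rows newest-first by (cursor, extracted_at);
    # any row that is not rank 1, or that carries a CDC deletion, is outdated.
    out = []
    for _, group in groupby(sorted(records), key=lambda r: r[0]):
        ranked = sorted(group, key=lambda r: (r[1], r[2]), reverse=True)
        for rank, row in enumerate(ranked, start=1):
            out.append((*row, rank != 1 or row[3] is not None))
    return out
```

Only the newest 'Alice' row survives with the flag unset; the CDC-deleted row is always flagged.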
@@ -314,7 +314,7 @@ sync may create multiple files as the output files can be partitioned by size (t

| Feature | Support | Notes |
| :----------------------------- | :-----: | :------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------- |
| FullRefresh - Overwrite Sync | ✅ | Warning: this mode deletes all previously synced data in the configured bucket path. |
-| Incremental - Append Sync | ✅ | Warning: Airbyte provides at-least-once delivery. Depending on your source, you may see duplicated data. Learn more [here](/using-airbyte/core-concepts/sync-modes/incremental-append#inclusive-cursors) |
+| Incremental - Append Sync | ✅ | Warning: Airbyte provides at-least-once delivery. Depending on your source, you may see duplicated data. Learn more [here](/platform/using-airbyte/core-concepts/sync-modes/incremental-append#inclusive-cursors) |
| Incremental - Append + Deduped | ❌ | |
| Namespaces | ❌ | Setting a specific bucket path is equivalent to having separate namespaces. |

@@ -11,7 +11,7 @@ Worthy of specific mention, this version includes:

- Removal of sub-tables for nested properties
- Removal of SCD tables

-Learn more about what's new in Destinations V2 [here](/using-airbyte/core-concepts/typing-deduping).
+Learn more about what's new in Destinations V2 [here](/platform/using-airbyte/core-concepts/typing-deduping).

## Upgrading to 2.0.0
@@ -205,7 +205,7 @@ The raw table contains these fields:

- `_airbyte_meta`
- `_airbyte_data`

-`_airbyte_data` is a JSON blob with the event data. See [here](/understanding-airbyte/airbyte-metadata-fields)
+`_airbyte_data` is a JSON blob with the event data. See [here](/platform/understanding-airbyte/airbyte-metadata-fields)
for more information about the other fields.
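For instance, a raw-table row's `_airbyte_data` cell might hold a blob like the following (the field names and values are purely illustrative, not a real stream schema):

```python
import json

# Illustrative event payload as it would be stored in the _airbyte_data column
_airbyte_data = json.dumps({"id": 42, "email": "alice@example.com"})

# Downstream consumers parse the blob back into structured fields
event = json.loads(_airbyte_data)
```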

**Note:** Although the contents of `_airbyte_data` are fairly stable, the schema of the raw table
@@ -219,7 +219,7 @@ The final table contains these fields, in addition to the columns declared in yo

- `_airbyte_extracted_at`
- `_airbyte_meta`

-Again, see [here](/understanding-airbyte/airbyte-metadata-fields) for more information about these fields.
+Again, see [here](/platform/understanding-airbyte/airbyte-metadata-fields) for more information about these fields.

## Data type map
@@ -301,7 +301,7 @@ desired namespace.

| 3.11.3 | 2024-07-15 | [\#41968](https://github.com/airbytehq/airbyte/pull/41968) | Don't hang forever on empty stream list; shorten error message on INCOMPLETE stream status |
| 3.11.2 | 2024-07-12 | [\#41674](https://github.com/airbytehq/airbyte/pull/41674) | Upgrade to latest CDK |
| 3.11.1 | 2024-07-08 | [\#41041](https://github.com/airbytehq/airbyte/pull/41041) | Fix resume logic in truncate refreshes to prevent data loss |
-| 3.11.0 | 2024-06-25 | [\#39473](https://github.com/airbytehq/airbyte/pull/39473) | Support for [refreshes](../../operator-guides/refreshes.md) and resumable full refresh. WARNING: You must upgrade to platform 0.63.7 before upgrading to this connector version. |
+| 3.11.0 | 2024-06-25 | [\#39473](https://github.com/airbytehq/airbyte/pull/39473) | Support for [refreshes](../../platform/operator-guides/refreshes) and resumable full refresh. WARNING: You must upgrade to platform 0.63.7 before upgrading to this connector version. |
| 3.10.1 | 2024-06-11 | [\#39399](https://github.com/airbytehq/airbyte/pull/39399) | Bug fix for \_airbyte_meta not migrated in OVERWRITE mode |
| 3.10.0 | 2024-06-10 | [\#39107](https://github.com/airbytehq/airbyte/pull/39107) | \_airbyte_meta and \_airbyte_generation_id in Raw tables and final tables |
| 3.9.1 | 2024-06-05 | [\#39135](https://github.com/airbytehq/airbyte/pull/39135) | Improved error handling for Staging files |
@@ -68,7 +68,7 @@ You can also copy the output file to your host machine, the following command wi

docker cp airbyte-server:/tmp/airbyte_local/{destination_path} .
```

-Note: If you are running Airbyte on Windows with Docker backed by WSL2, you have to use a similar step as above, or refer to this [link](/integrations/locating-files-local-destination.md) for an alternative approach.
+Note: If you are running Airbyte on Windows with Docker backed by WSL2, you have to use a similar step as above, or refer to this [link](/integrations/locating-files-local-destination) for an alternative approach.

## Changelog
@@ -2,6 +2,6 @@

Airbyte Enterprise Connectors are a selection of premium connectors available exclusively to Airbyte Self-Managed Enterprise and Airbyte Teams customers. These connectors, built and maintained by the Airbyte team, provide enhanced capabilities and support for critical enterprise systems that are not available to users of Airbyte Open Source and Airbyte Cloud. Key benefits of these connectors include support for larger data sets, parallelism for faster data transfers, and coverage under Airbyte Support SLAs.

-
+

To learn more about enterprise connectors, please [talk to our sales team](https://airbyte.com/company/talk-to-sales).
@@ -1,7 +1,3 @@
----
-displayed_sidebar: docs
----
-
# Windows - Browsing Local File Output

## Overview
@@ -18,7 +14,7 @@ While running Airbyte's Docker image on Windows with WSL2, you can access your t

2. Type `\\wsl$` in the address bar
3. The folders below will be displayed

-
+

4. You can start digging here, but it's easiest to search for the folder name you used for your local files. The folder address should be similar to `\\wsl$\docker-desktop\tmp\docker-desktop-root\containers\services\docker\rootfs\tmp\airbyte_local`
5. You should be able to locate your local destination CSV or JSON files in this folder.
@@ -23,17 +23,17 @@ This page contains the setup guide and reference information for the [Airtable](

#### For Airbyte Open Source:

1. Go to https://airtable.com/create/tokens to create a new token.
-
+
2. Add the following scopes:

- `data.records:read`
- `data.recordComments:read`
- `schema.bases:read`

-
+

3. Select the required bases or allow access to all available bases, and press the `Create Token` button.
-
+
4. Save the token from the popup window.
<!-- /env:oss -->
@@ -82,7 +82,7 @@ Depending on destination type you may not be prompted to reset your data.

This will reset the data in your destination and initiate a fresh sync.
```

-For more information on resetting your data in Airbyte, see [this page](/operator-guides/clear).
+For more information on resetting your data in Airbyte, see [this page](/platform/operator-guides/clear).

## Upgrading to 5.0.0

@@ -128,7 +128,7 @@ Depending on destination type you may not be prompted to reset your data.

This will reset the data in your destination and initiate a fresh sync.
```

-For more information on resetting your data in Airbyte, see [this page](/operator-guides/clear).
+For more information on resetting your data in Airbyte, see [this page](/platform/operator-guides/clear).

## Upgrading to 4.0.0

@@ -159,7 +159,7 @@ Depending on destination type you may not be prompted to reset your data.

This will reset the data in your destination and initiate a fresh sync.
```

-For more information on resetting your data in Airbyte, see [this page](/operator-guides/clear).
+For more information on resetting your data in Airbyte, see [this page](/platform/operator-guides/clear).

## Upgrading to 3.0.0

@@ -31,7 +31,7 @@ Depending on destination type you may not be prompted to reset your data.

This will reset the data in your destination and initiate a fresh sync.
```

-For more information on resetting your data in Airbyte, see [this page](/operator-guides/clear).
+For more information on resetting your data in Airbyte, see [this page](/platform/operator-guides/clear).

## Upgrading to 3.0.0

@@ -78,7 +78,7 @@ Depending on destination type you may not be prompted to reset your data.

This will reset the data in your destination and initiate a fresh sync.
```

-For more information on resetting your data in Airbyte, see [this page](/operator-guides/clear).
+For more information on resetting your data in Airbyte, see [this page](/platform/operator-guides/clear).

## Upgrading to 2.0.0
@@ -22,7 +22,7 @@ You can find your personal API token in the Apify Console in the [Settings -> In

When your Apify job (aka [Actor run](https://docs.apify.com/platform/actors/running)) finishes, it can trigger an Airbyte sync by calling the Airbyte [API](https://airbyte-public-api-docs.s3.us-east-2.amazonaws.com/rapidoc-api-docs.html#post-/v1/connections/sync) manual connection trigger (`POST /v1/connections/sync`). The API can be called from an Apify [webhook](https://docs.apify.com/platform/integrations/webhooks), which is executed when your Apify run finishes.

-
+
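A webhook handler that fires this trigger could be sketched as follows; the base URL and connection ID below are placeholders and authentication headers are omitted, with only the `POST /v1/connections/sync` endpoint taken from the description above:

```python
import json
from urllib.request import Request

AIRBYTE_API = "http://localhost:8000/api/v1"  # assumption: a local Airbyte deployment

def build_sync_trigger(connection_id: str) -> Request:
    # Manual connection trigger: POST /v1/connections/sync with the connection ID
    return Request(
        url=f"{AIRBYTE_API}/connections/sync",
        data=json.dumps({"connectionId": connection_id}).encode(),
        headers={"Content-Type": "application/json"},
        method="POST",
    )

req = build_sync_trigger("11111111-2222-3333-4444-555555555555")
```

In a real Apify webhook you would send this request with `urllib.request.urlopen(req)` (or any HTTP client) when the Actor run completes.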

### Features
@@ -54,5 +54,5 @@ Each instance of the connector must be updated separately. If you have created m

This will reset the data in your destination and initiate a fresh sync.
:::

-For more information on resetting your data in Airbyte, see [this page](/operator-guides/clear).
+For more information on resetting your data in Airbyte, see [this page](/platform/operator-guides/clear).
@@ -53,10 +53,10 @@ to use role [Storage Blob Data Reader](https://learn.microsoft.com/en-gb/azure/s
Follow these steps to set up an IAM role:
</summary>

-1. Go to Azure portal, select the Storage (or Container) you'd like to sync from and get to Access Control(IAM) -> Role Assignment 
-2. Click on `Add` and select `Add role assignment` from the dropdown list 
-3. Search by role name `Storage Blob Data Reader` in search box, Select role from the list and click `Next` 
-4. Select `User, Group, or service principal`, click on `members` and select member(s) so they appear in table and click `Next` 
+1. Go to Azure portal, select the Storage (or Container) you'd like to sync from and get to Access Control(IAM) -> Role Assignment 
+2. Click on `Add` and select `Add role assignment` from the dropdown list 
+3. Search by role name `Storage Blob Data Reader` in search box, Select role from the list and click `Next` 
+4. Select `User, Group, or service principal`, click on `members` and select member(s) so they appear in table and click `Next` 
5. (Optional) Add Conditions to restrict the role assignments a user can create.
6. Click `Review + Assign`
</details>
@@ -12,10 +12,10 @@ You will only be able to connect to a self-hosted instance of Drupal using these

Drupal can run on MySQL, Percona, MariaDB, MSSQL, MongoDB, Postgres, or SQLite. If you're not using SQLite, you can use Airbyte to sync your Drupal instance by connecting to the underlying database using the appropriate Airbyte connector:

-- [MySQL/Percona/MariaDB](mysql.md)
-- [MSSQL](mssql.md)
-- [Mongo](mongodb-v2.md)
-- [Postgres](postgres.md)
+- [MySQL/Percona/MariaDB](mysql)
+- [MSSQL](mssql)
+- [Mongo](mongodb-v2)
+- [Postgres](postgres)

:::info
@@ -2,7 +2,7 @@

## Overview

-This is a mock source for testing the Airbyte pipeline. It can generate arbitrary data streams. It is a subset of what is in [End-to-End Testing Source](e2e-test.md) in Open Source to avoid Airbyte Cloud users accidentally incurring a huge bill.
+This is a mock source for testing the Airbyte pipeline. It can generate arbitrary data streams. It is a subset of what is in [End-to-End Testing Source](e2e-test) in Open Source to avoid Airbyte Cloud users accidentally incurring a huge bill.

## Mode
@@ -38,4 +38,4 @@ The OSS and Cloud variants have the same version number. The Cloud variant was i

| 2.1.4 | 2023-03-01 | [23656](https://github.com/airbytehq/airbyte/pull/23656) | Fix inheritance between e2e-test and e2e-test-cloud |
| 0.1.0 | 2021-07-23 | [9720](https://github.com/airbytehq/airbyte/pull/9720) | Initial release. |

</details>
</details>
@@ -58,7 +58,7 @@ Custom Insights Reports now have updated schema for following breakdowns:

3. Navigate to a connection's **Settings** tab and click **Clear data** to clear all streams. This action will clear data for all streams in the connection. To clear data for a single stream navigate to the **Status** tab, click the **three grey dots** next to the affected stream, and select **Clear data**. Do this for all affected streams in the connection.

-For more information on clearing your data in Airbyte, see [this page](/operator-guides/clear).
+For more information on clearing your data in Airbyte, see [this page](/platform/operator-guides/clear).

## Upgrading to 2.0.0

@@ -98,4 +98,4 @@ Any detected schema changes will be listed for your review.

This will reset the data in your destination and initiate a fresh sync.
:::

-For more information on resetting your data in Airbyte, see [this page](/operator-guides/clear).
+For more information on resetting your data in Airbyte, see [this page](/platform/operator-guides/clear).

@@ -39,4 +39,4 @@ Depending on destination type you may not be prompted to reset your data.

This will reset the data in your destination and initiate a fresh sync.
:::

-For more information on resetting your data in Airbyte, see [this page](/operator-guides/clear)
+For more information on resetting your data in Airbyte, see [this page](/platform/operator-guides/clear)
@@ -29,7 +29,7 @@ The Facebook Pages source connector is currently only compatible with v15 of the

After all the steps, it should look something like this:

-
+

5. [Generate](https://developers.facebook.com/docs/facebook-login/guides/access-tokens/get-long-lived#get-a-long-lived-user-access-token) a Long-Lived User Access Token.
6. [Generate](https://developers.facebook.com/docs/facebook-login/guides/access-tokens/get-long-lived#long-lived-page-token) a Long-Lived Page Token.
@@ -285,7 +285,7 @@ Please see (or add) more at `airbyte-integrations/connectors/source-file/integra

In order to read large files from a remote location, this connector uses the [smart_open](https://pypi.org/project/smart-open/) library. However, it is possible to switch to either the [GCSFS](https://gcsfs.readthedocs.io/en/latest/) or [S3FS](https://s3fs.readthedocs.io/en/latest/) implementation, as both are natively supported by the `pandas` library. This choice is made possible through the optional `reader_impl` parameter.

-- Note that for the local filesystem, the file probably has to be stored somewhere in the `/tmp/airbyte_local` folder with the same limitations as the [CSV Destination](../destinations/csv.md), so the `URL` should also start with `/local/`.
+- Note that for the local filesystem, the file probably has to be stored somewhere in the `/tmp/airbyte_local` folder with the same limitations as the [CSV Destination](../destinations/csv), so the `URL` should also start with `/local/`.
- Please make sure that Docker Desktop has access to `/tmp` (and `/private` on macOS, as `/tmp` has a symlink that points to `/private`; it will not work otherwise). You can allow it under "File sharing" in `Settings -> Resources -> File sharing -> add the one or two above folders`, then hit the "Apply & restart" button.
- The JSON implementation needs to be tweaked in order to produce a more complex catalog and is still in an experimental state: simple JSON schemas should work at this point but may not be well handled when there are multiple layers of nesting.
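The `/local/` URL convention described above can be sketched as a small path-mapping helper (the function name is ours; only the `/local/` → `/tmp/airbyte_local` mapping comes from the text):

```python
# Maps a /local/ URL onto the shared /tmp/airbyte_local folder, as described above.
LOCAL_ROOT = "/tmp/airbyte_local"

def resolve_local_url(url: str) -> str:
    prefix = "/local/"
    if not url.startswith(prefix):
        raise ValueError("local file URLs must start with /local/")
    return f"{LOCAL_ROOT}/{url[len(prefix):]}"
```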
@@ -55,7 +55,7 @@ Each instance of the connector must be updated separately. If you have created m

This will reset the data in your destination and initiate a fresh sync.
:::

-For more information on resetting your data in Airbyte, see [this page](/operator-guides/clear).
+For more information on resetting your data in Airbyte, see [this page](/platform/operator-guides/clear).

## Upgrading to 3.0.0

@@ -107,7 +107,7 @@ Each instance of the connector must be updated separately. If you have created m

This will reset the data in your destination and initiate a fresh sync.
:::

-For more information on resetting your data in Airbyte, see [this page](/operator-guides/clear).
+For more information on resetting your data in Airbyte, see [this page](/platform/operator-guides/clear).

## Upgrading to 2.0.0

@@ -45,7 +45,7 @@ Depending on destination type you may not be prompted to reset your data.

This will reset the data in your destination and initiate a fresh sync.
```

-For more information on resetting your data in Airbyte, see [this page](/operator-guides/clear).
+For more information on resetting your data in Airbyte, see [this page](/platform/operator-guides/clear).

## Upgrading to 2.0.0
@@ -29,4 +29,4 @@ To clear your data for the impacted streams, follow the steps below:

2. Select the **Status** tab.
1. In the **Enabled streams** list, click the three dots on the right side of the stream and select **Clear Data**.

-After the clear succeeds, trigger a sync by clicking **Sync Now**. For more information on clearing your data in Airbyte, see [this page](/operator-guides/clear).
+After the clear succeeds, trigger a sync by clicking **Sync Now**. For more information on clearing your data in Airbyte, see [this page](/platform/operator-guides/clear).
@@ -2,13 +2,13 @@

:::caution

-This connector is graveyarded and will not be receiving any updates from the Airbyte team. Its functionalities have been replaced by the [Airbyte CDK](../../connector-development/cdk-python/README.md), which allows you to create source connectors for any HTTP API.
+This connector is graveyarded and will not be receiving any updates from the Airbyte team. Its functionalities have been replaced by the [Airbyte CDK](../../platform/connector-development/cdk-python/), which allows you to create source connectors for any HTTP API.

:::

## Overview

-This connector allows you to generally connect to any HTTP API. In order to use this connector, you must manually bring it in as a custom connector. The steps to do this can be found [here](../../connector-development/tutorials/custom-python-connector/0-getting-started.md).
+This connector allows you to generally connect to any HTTP API. In order to use this connector, you must manually bring it in as a custom connector. The steps to do this can be found [here](/platform/connector-development/tutorials/custom-python-connector/getting-started).

## Where do I find the Docker image?
@@ -38,7 +38,7 @@ Depending on destination type you may not be prompted to reset your data.

This will reset the data in your destination and initiate a fresh sync.
:::

-For more information on resetting your data in Airbyte, see [this page](/operator-guides/clear)
+For more information on resetting your data in Airbyte, see [this page](/platform/operator-guides/clear)

## Upgrading to 3.0.0

@@ -78,7 +78,7 @@ Depending on destination type you may not be prompted to reset your data.

This will reset the data in your destination and initiate a fresh sync.
:::

-For more information on resetting your data in Airbyte, see [this page](/operator-guides/clear)
+For more information on resetting your data in Airbyte, see [this page](/platform/operator-guides/clear)

## Upgrading to 2.0.0

@@ -112,4 +112,4 @@ Depending on destination type you may not be prompted to reset your data

This will reset the data in your destination and initiate a fresh sync.
:::

-For more information on resetting your data in Airbyte, see [this page](/operator-guides/clear).
+For more information on resetting your data in Airbyte, see [this page](/platform/operator-guides/clear).
@@ -36,7 +36,7 @@ To ensure uninterrupted syncs, users should follow the instructions below to mig

This will reset the data in your destination and initiate a fresh sync.
```

-For more information on resetting your data in Airbyte, see [this page](/operator-guides/clear).
+For more information on resetting your data in Airbyte, see [this page](/platform/operator-guides/clear).

## Upgrading to 3.0.0

@@ -99,7 +99,7 @@ Please follow the instructions below to migrate to version 3.0.0:

This will reset the data in your destination and initiate a fresh sync.
```

-For more information on resetting your data in Airbyte, see [this page](/operator-guides/clear).
+For more information on resetting your data in Airbyte, see [this page](/platform/operator-guides/clear).

## Upgrading to 2.0.0
@@ -18,7 +18,7 @@ To gracefully handle these changes for your existing connections, we highly reco

8. Check all your streams.
9. Select **Sync now** to sync your data

-For more information on resetting your data in Airbyte, see [this page](/operator-guides/clear).
+For more information on resetting your data in Airbyte, see [this page](/platform/operator-guides/clear).

## Upgrading to 2.0.0

@@ -37,7 +37,7 @@ To gracefully handle these changes for your existing connections, we highly reco

8. Check all your streams.
9. Select **Sync now** to sync your data

-For more information on resetting your data in Airbyte, see [this page](/operator-guides/clear).
+For more information on resetting your data in Airbyte, see [this page](/platform/operator-guides/clear).

## Upgrading to 1.0.0

@@ -63,4 +63,4 @@ This is a breaking change because Stream State for `Boards Issues` will be chang

This will reset the data in your destination and initiate a fresh sync.
:::

-For more information on resetting your data in Airbyte, see [this page](/operator-guides/clear).
+For more information on resetting your data in Airbyte, see [this page](/platform/operator-guides/clear).
@@ -33,7 +33,7 @@ for the affected streams, follow the steps below:

4. Select **Save connection**.

This will clear the data in your destination for the subset of streams with schema changes. After the clear succeeds,
-trigger a sync by clicking **Sync Now**. For more information on clearing your data in Airbyte, see [this page](/operator-guides/clear).
+trigger a sync by clicking **Sync Now**. For more information on clearing your data in Airbyte, see [this page](/platform/operator-guides/clear).

## Upgrading to 4.0.0

@@ -117,7 +117,7 @@ for the affected streams, follow the steps below:

4. Select **Save connection**.

This will clear the data in your destination for the subset of streams with schema changes. After the clear succeeds,
-trigger a sync by clicking **Sync Now**. For more information on clearing your data in Airbyte, see [this page](/operator-guides/clear).
+trigger a sync by clicking **Sync Now**. For more information on clearing your data in Airbyte, see [this page](/platform/operator-guides/clear).

## Upgrading to 1.0.0

@@ -149,4 +149,4 @@ for the affected streams, follow the steps below:

4. Select **Save connection**.

This will clear the data in your destination for the subset of streams with schema changes. After the clear succeeds,
-trigger a sync by clicking **Sync Now**. For more information on clearing your data in Airbyte, see [this page](/operator-guides/clear).
+trigger a sync by clicking **Sync Now**. For more information on clearing your data in Airbyte, see [this page](/platform/operator-guides/clear).
@@ -4,7 +4,7 @@

## Sync overview

-Magento runs on MySQL. You can use Airbyte to sync your Magento instance by connecting to the underlying database using the [MySQL connector](mysql.md).
+Magento runs on MySQL. You can use Airbyte to sync your Magento instance by connecting to the underlying database using the [MySQL connector](mysql).

:::info

@@ -14,4 +14,4 @@ Reach out to your service representative or system admin to find the parameters

### Output schema

-The output schema is described in the [Magento docs](https://docs.magento.com/mbi/data-analyst/importing-data/integrations/magento-data.html). See the [MySQL connector](mysql.md) for more info on general rules followed by the MySQL connector when moving data.
+The output schema is described in the [Magento docs](https://docs.magento.com/mbi/data-analyst/importing-data/integrations/magento-data.html). See the [MySQL connector](mysql) for more info on general rules followed by the MySQL connector when moving data.
@@ -23,7 +23,7 @@ Version 2.0.0 introduces changes in primary key for streams `Segment Members` an

This will reset the data in your destination and initiate a fresh sync.
:::

-For more information on resetting your data in Airbyte, see [this page](/operator-guides/clear).
+For more information on resetting your data in Airbyte, see [this page](/platform/operator-guides/clear).

## Upgrading to 1.0.0

@@ -49,4 +49,4 @@ Each instance of the connector must be updated separately. If you have created m

This will reset the data in your destination and initiate a fresh sync.
:::

-For more information on resetting your data in Airbyte, see [this page](/operator-guides/clear).
+For more information on resetting your data in Airbyte, see [this page](/platform/operator-guides/clear).
@@ -4,7 +4,7 @@

 ## Sync overview

-MS Dynamics AX runs on the MSSQL database. You can use the [MSSQL connector](mssql.md) to sync your MS Dynamics AX instance by connecting to the underlying database.
+MS Dynamics AX runs on the MSSQL database. You can use the [MSSQL connector](mssql) to sync your MS Dynamics AX instance by connecting to the underlying database.

 ### Output schema

@@ -4,7 +4,7 @@

 ## Sync overview

-MS Dynamics Customer Engagement runs on [MSSQL](https://docs.microsoft.com/en-us/dynamics365/customerengagement/on-premises/deploy/system-requirements-required-technologies?view=op-9-1) database. You can use the [MSSQL connector](mssql.md) to sync your MS Dynamics Customer Engagement instance by connecting to the underlying database.
+MS Dynamics Customer Engagement runs on [MSSQL](https://docs.microsoft.com/en-us/dynamics365/customerengagement/on-premises/deploy/system-requirements-required-technologies?view=op-9-1) database. You can use the [MSSQL connector](mssql) to sync your MS Dynamics Customer Engagement instance by connecting to the underlying database.

 :::info

@@ -4,7 +4,7 @@

 ## Sync overview

-MS Dynamics GP runs on the [MSSQL](https://docs.microsoft.com/en-us/dynamics-gp/installation/installing-on-first-computer) database. You can use the [MSSQL connector](mssql.md) to sync your MS Dynamics GP instance by connecting to the underlying database.
+MS Dynamics GP runs on the [MSSQL](https://docs.microsoft.com/en-us/dynamics-gp/installation/installing-on-first-computer) database. You can use the [MSSQL connector](mssql) to sync your MS Dynamics GP instance by connecting to the underlying database.

 :::info

@@ -4,7 +4,7 @@

 ## Sync overview

-MS Dynamics NAV runs on the [MSSQL](https://docs.microsoft.com/en-us/dynamics-nav/installation-considerations-for-microsoft-sql-server) database. You can use the [MSSQL connector](mssql.md) to sync your MS Dynamics NAV instance by connecting to the underlying database.
+MS Dynamics NAV runs on the [MSSQL](https://docs.microsoft.com/en-us/dynamics-nav/installation-considerations-for-microsoft-sql-server) database. You can use the [MSSQL connector](mssql) to sync your MS Dynamics NAV instance by connecting to the underlying database.

 :::info
@@ -22,7 +22,7 @@ Depending on your destination, you may be prompted to **Reset all streams**. Alt

 5. Select **Save connection**. This will reset the data in your destination (if selected) and initiate a fresh sync.

-For more information on resetting your data in Airbyte, see [this page](/operator-guides/clear).
+For more information on resetting your data in Airbyte, see [this page](/platform/operator-guides/clear).

 ### Changes in 1.0.0

@@ -58,7 +58,7 @@ Each instance of the connector must be updated separately. If you have created m
 This will reset the data in your destination and initiate a fresh sync.
 :::

-For more information on resetting your data in Airbyte, see [this page](/operator-guides/clear).
+For more information on resetting your data in Airbyte, see [this page](/platform/operator-guides/clear).

 ## Upgrading to 2.0.0

@@ -47,7 +47,7 @@ Each instance of the connector must be updated separately. If you have created m
 This will reset the data in your destination and initiate a fresh sync.
 :::

-For more information on resetting your data in Airbyte, see [this page](/operator-guides/clear).
+For more information on resetting your data in Airbyte, see [this page](/platform/operator-guides/clear).

 ### Refresh affected schemas and reset data

@@ -66,4 +66,4 @@ For more information on resetting your data in Airbyte, see [this page](/operato
 This will reset the data in your destination and initiate a fresh sync.
 :::

-For more information on resetting your data in Airbyte, see [this page](/operator-guides/clear).
+For more information on resetting your data in Airbyte, see [this page](/platform/operator-guides/clear).
@@ -3,7 +3,7 @@

 ## Upgrading to 1.0.0

 This version introduces a general availability version of the MongoDB V2 source connector, which leverages
-[Change Data Capture (CDC)](/understanding-airbyte/cdc) to improve the performance and
+[Change Data Capture (CDC)](/platform/understanding-airbyte/cdc) to improve the performance and
 reliability of syncs. This version provides better error handling, incremental delivery of data and improved
 reliability of large syncs via frequent checkpointing.
@@ -32,31 +32,31 @@ access to the database.
 1. Log in to the MongoDB Atlas dashboard.
 2. From the dashboard, click on "Database Access" under "Security"
 3. Click on the "+ ADD NEW DATABASE USER" button.
 4. On the "Add new Database User" modal dialog, choose "Password" for the "Authentication Method".
 5. In the "Password Authentication" section, set the username to `READ_ONLY_USER` in the first text box and set a password in the second text box.
 6. Under "Database User Privileges", click on "Select one built-in role for this user" under "Built-in Role" and choose "Only read any database".
 7. Enable "Restrict Access to Specific Clusters/Federated Database instances" and enable only those clusters/database that you wish to replicate.
 8. Click on "Add User" at the bottom to save the user.

 ##### Self Hosted
@@ -112,15 +112,15 @@ the connection configuration for a MongoDB Atlas-hosted replica set cluster:
 1. Log in to the [MongoDB Atlas dashboard](https://cloud.mongodb.com/).
 2. From the dashboard, click on the "Connect" button of the source cluster.
 3. On the "Connect to <cluster name>" modal dialog, select "Shell" under the "Access your data through tools" section.
 4. Copy the connection string from the entry labeled "2. Run your connection string in your command line" on the modal dialog, removing/avoiding the quotation marks.

 ##### Self Hosted Cluster
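Step 4 above hands you a quoted connection string. As a small illustration (the hostname below is made up, and this helper is not part of Airbyte), stripping the surrounding quotation marks programmatically looks like:

```python
def clean_connection_string(raw: str) -> str:
    # Remove surrounding whitespace and the quotation marks the Atlas modal includes.
    cleaned = raw.strip().strip('"').strip("'")
    # The Airbyte source expects a bare mongodb:// or mongodb+srv:// URL.
    if not cleaned.startswith(("mongodb://", "mongodb+srv://")):
        raise ValueError(f"not a MongoDB connection string: {cleaned!r}")
    return cleaned

print(clean_connection_string('"mongodb+srv://cluster0.example.mongodb.net/"'))
```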
@@ -69,7 +69,7 @@ CDC-enabled tables.
 Some extra setup requiring at least _db_owner_ permissions on the database\(s\) you intend to sync
 from will be required \(detailed [below](mssql.md#setting-up-cdc-for-mssql)\).

-Please read the [CDC docs](../../understanding-airbyte/cdc.md) for an overview of how Airbyte
+Please read the [CDC docs](../../platform/understanding-airbyte/cdc) for an overview of how Airbyte
 approaches CDC.

 ### Should I use CDC for MSSQL?

@@ -86,7 +86,7 @@ approaches CDC.

 #### CDC Limitations

-- Make sure to read our [CDC docs](../../understanding-airbyte/cdc.md) to see limitations that
+- Make sure to read our [CDC docs](../../platform/understanding-airbyte/cdc) to see limitations that
   impact all databases using CDC replication.
 - `hierarchyid` and `sql_variant` types are not processed in CDC migration type (not supported by
   Debezium). For more details please check

@@ -90,7 +90,7 @@ To fill out the required information:

 #### Step 4: (Airbyte Cloud Only) Allow inbound traffic from Airbyte IPs.

 If you are on Airbyte Cloud, you will always need to modify your database configuration to allow inbound traffic from Airbyte IPs. You can find a list of all IPs that need to be allowlisted in
-our [Airbyte Security docs](../../operating-airbyte/security#network-security-1).
+our [Airbyte Security docs](../../platform/operating-airbyte/security#network-security-1).

 Now, click `Set up source` in the Airbyte UI. Airbyte will now test connecting to your database. Once this succeeds, you've configured an Airbyte MySQL source!
@@ -8,7 +8,7 @@

 ### CDC Requirements

-- Make sure to read our [CDC docs](../../../understanding-airbyte/cdc.md) to see limitations that impact all databases using CDC replication.
+- Make sure to read our [CDC docs](/platform/understanding-airbyte/cdc) to see limitations that impact all databases using CDC replication.
 - Our CDC implementation uses at least once delivery for all change records.
 - To enable CDC with incremental sync, ensure the table has at least one primary key.
   Tables without primary keys can still be replicated by CDC but only in Full Refresh mode.
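Because delivery is at-least-once, the same change record may arrive more than once. A minimal sketch of deduplicating downstream by primary key (the record shape here is hypothetical, not Airbyte's internal format):

```python
def dedupe_by_pk(records, pk="id"):
    # Keep the most recent record per primary key; dicts preserve
    # first-insertion order, so the original stream order is retained.
    latest = {}
    for rec in records:
        latest[rec[pk]] = rec
    return list(latest.values())

changes = [
    {"id": 1, "status": "new"},
    {"id": 2, "status": "new"},
    {"id": 1, "status": "new"},  # duplicate delivery of record 1
]
print(dedupe_by_pk(changes))
```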
@@ -17,7 +17,7 @@ Data for the `Comments` stream will need to cleared to ensure your syncs continu
 2. Select the **Status** tab.
 1. In the **Enabled streams** list, click the three dots on the right side of the **Comments** stream and select **Clear data**.

-After the clear succeeds, trigger a sync for the `Comments` stream by clicking "Sync Now". For more information on clearing your data in Airbyte, see [this page](/operator-guides/clear).
+After the clear succeeds, trigger a sync for the `Comments` stream by clicking "Sync Now". For more information on clearing your data in Airbyte, see [this page](/platform/operator-guides/clear).

 ## Upgrading to 2.0.0
@@ -6,9 +6,9 @@

 Oracle PeopleSoft can run on the [Oracle, MSSQL, or IBM DB2](https://docs.oracle.com/en/applications/peoplesoft/peopletools/index.html) databases. You can use Airbyte to sync your Oracle PeopleSoft instance by connecting to the underlying database using the appropriate Airbyte connector:

-- [DB2](db2.md)
-- [MSSQL](mssql.md)
-- [Oracle](oracle.md)
+- [DB2](db2)
+- [MSSQL](mssql)
+- [Oracle](oracle)

 :::info

@@ -6,9 +6,9 @@

 Oracle Siebel CRM can run on the [Oracle, MSSQL, or IBM DB2](https://docs.oracle.com/cd/E88140_01/books/DevDep/installing-and-configuring-siebel-crm.html#PrerequisiteSoftware) databases. You can use Airbyte to sync your Oracle Siebel CRM instance by connecting to the underlying database using the appropriate Airbyte connector:

-- [DB2](db2.md)
-- [MSSQL](mssql.md)
-- [Oracle](oracle.md)
+- [DB2](db2)
+- [MSSQL](mssql)
+- [Oracle](oracle)

 :::info
@@ -62,7 +62,7 @@ Each instance of the connector must be updated separately. If you have created m
 8. Check all your streams.
 9. Select **Sync now** to sync your data

-For more information on resetting your data in Airbyte, see [this page](/operator-guides/clear).
+For more information on resetting your data in Airbyte, see [this page](/platform/operator-guides/clear).

 ## Upgrading to 1.0.0
@@ -4,8 +4,8 @@

 The PokéAPI is primarily used as a tutorial and educational resource, as it requires zero dependencies. Learn how Airbyte and this connector works with these tutorials:

-- [Airbyte Quickstart: An Introduction to Deploying and Syncing](../../using-airbyte/getting-started/readme.md)
-- [Using Connector Builder and the low-code CDK](../../connector-development/connector-builder-ui/overview.md)
+- [Airbyte Quickstart: An Introduction to Deploying and Syncing](/platform/using-airbyte/getting-started/oss-quickstart)
+- [Using Connector Builder and the low-code CDK](/platform/connector-development/connector-builder-ui/overview)
 - [How to Build ETL Sources in Under 30 Minutes: A Video Tutorial](https://www.youtube.com/watch?v=kJ3hLoNfz_E&t=13s&ab_channel=Airbyte)

 ## Features
@@ -69,7 +69,7 @@ To fill out the required information:

 ### Step 3: (Airbyte Cloud Only) Allow inbound traffic from Airbyte IPs.

 If you are on Airbyte Cloud, you will always need to modify your database configuration to allow inbound traffic from Airbyte IPs. You can find a list of all IPs that need to be allowlisted in
-our [Airbyte Security docs](../../operating-airbyte/security#network-security-1).
+our [Airbyte Security docs](../../platform/operating-airbyte/security#network-security-1).

 Now, click `Set up source` in the Airbyte UI. Airbyte will now test connecting to your database. Once this succeeds, you've configured an Airbyte Postgres source!
@@ -2,11 +2,11 @@

 Airbyte's certified Postgres connector offers the following features:

-- Multiple methods of keeping your data fresh, including [Change Data Capture (CDC)](https://docs.airbyte.com/understanding-airbyte/cdc) and replication using the [xmin system column](https://docs.airbyte.com/integrations/sources/postgres#xmin).
-- All available [sync modes](https://docs.airbyte.com/cloud/core-concepts#connection-sync-modes), providing flexibility in how data is delivered to your destination.
-- Reliable replication at any table size with [checkpointing](https://docs.airbyte.com/understanding-airbyte/airbyte-protocol/#state--checkpointing) and chunking of database reads.
+- Multiple methods of keeping your data fresh, including [Change Data Capture (CDC)](/platform/understanding-airbyte/cdc) and replication using the [xmin system column](/integrations/sources/postgres#xmin).
+- All available [sync modes](/platform/using-airbyte/core-concepts/sync-modes), providing flexibility in how data is delivered to your destination.
+- Reliable replication at any table size with [checkpointing](/platform/understanding-airbyte/airbyte-protocol/#state--checkpointing) and chunking of database reads.

 ## Quick Start
@@ -63,7 +63,7 @@ If you are on Airbyte Cloud, you will always need to modify your database config

-2. Add a new network, and enter the Airbyte's IPs, which you can find in our [Airbyte Security documentation](../../../operating-airbyte/security#network-security-1).
+2. Add a new network, and enter the Airbyte's IPs, which you can find in our [Airbyte Security documentation](../../../platform/operating-airbyte/security#network-security-1).

 Now, click `Set up source` in the Airbyte UI. Airbyte will now test connecting to your database. Once this succeeds, you've configured an Airbyte Postgres source!
@@ -18,4 +18,4 @@ To gracefully handle these changes for your existing connections, we highly reco
 8. Check all your streams.
 9. Select **Sync now** to sync your data

-For more information on resetting your data in Airbyte, see [this page](/operator-guides/clear).
+For more information on resetting your data in Airbyte, see [this page](/platform/operator-guides/clear).

@@ -54,4 +54,4 @@ Any detected schema changes will be listed for your review.
 This will reset the data in your destination and initiate a fresh sync.
 :::

-For more information on resetting your data in Airbyte, see [this page](/operator-guides/clear).
+For more information on resetting your data in Airbyte, see [this page](/platform/operator-guides/clear).

@@ -19,4 +19,4 @@ Clearing your data is required for the affected streams in order to continue syn
 1. Ensure the **Clear affected streams** option is checked to ensure your streams continue syncing successfully with the new schema.
 4. Select **Save connection**.

-This will clear the data in your destination for the subset of streams with schema changes. After the clear succeeds, trigger a sync by clicking **Sync Now**. For more information on clearing your data in Airbyte, see [this page](/operator-guides/clear).
+This will clear the data in your destination for the subset of streams with schema changes. After the clear succeeds, trigger a sync by clicking **Sync Now**. For more information on clearing your data in Airbyte, see [this page](/platform/operator-guides/clear).
@@ -121,7 +121,7 @@ S3 authentication using an IAM role member must be enabled by a member of the Ai
 2. Click Sources and then click + New source.
 3. On the Set up the source page, select S3 from the Source type dropdown.
 4. Enter a name for the S3 connector.
-5. Choose a [delivery method](../../using-airbyte/delivery-methods) for your data.
+5. Choose a [delivery method](../../platform/using-airbyte/delivery-methods) for your data.
 6. Enter the name of the **Bucket** containing your files to replicate.
 7. Add a stream
    1. Choose the **File Format**

@@ -149,7 +149,7 @@ All other fields are optional and can be left empty. Refer to the [S3 Provider S

 <FieldAnchor field="delivery_method.delivery_type">

-Choose a [delivery method](../../using-airbyte/delivery-methods) for your data.
+Choose a [delivery method](../../platform/using-airbyte/delivery-methods) for your data.

 </FieldAnchor>
@@ -4,7 +4,7 @@

 ## Sync overview

-SAP Business One can run on the MSSQL or SAP HANA databases. If your instance is deployed on MSSQL, you can use Airbyte to sync your SAP Business One instance by using the [MSSQL connector](mssql.md).
+SAP Business One can run on the MSSQL or SAP HANA databases. If your instance is deployed on MSSQL, you can use Airbyte to sync your SAP Business One instance by using the [MSSQL connector](mssql).

 :::info
@@ -54,4 +54,4 @@ To clear your data for the affected streams, follow the steps below:
 1. Ensure the **Clear affected streams** option is checked to ensure your streams continue syncing successfully with the new schema.
 4. Select **Save connection**.

-This will clear the data in your destination for the subset of streams with schema changes. After the clear succeeds, trigger a sync by clicking **Sync Now**. For more information on clearing your data in Airbyte, see [this page](/operator-guides/clear).
+This will clear the data in your destination for the subset of streams with schema changes. After the clear succeeds, trigger a sync by clicking **Sync Now**. For more information on clearing your data in Airbyte, see [this page](/platform/operator-guides/clear).

@@ -23,4 +23,4 @@ Clearing your data is required for the affected streams in order to continue syn
 1. Ensure the **Clear affected streams** option is checked to ensure your streams continue syncing successfully with the new schema.
 4. Select **Save connection**.

-This will clear the data in your destination for the subset of streams with schema changes. After the clear succeeds, trigger a sync by clicking **Sync Now**. For more information on clearing your data in Airbyte, see [this page](/operator-guides/clear).
+This will clear the data in your destination for the subset of streams with schema changes. After the clear succeeds, trigger a sync by clicking **Sync Now**. For more information on clearing your data in Airbyte, see [this page](/platform/operator-guides/clear).
@@ -66,7 +66,7 @@ For more information on SSH key pair authentication, please refer to the
 2. Click Sources and then click + New source.
 3. On the Set up the source page, select SFTP Bulk from the Source type dropdown.
 4. Enter a name for the SFTP Bulk connector.
-5. Choose a [delivery method](../../using-airbyte/delivery-methods) for your data.
+5. Choose a [delivery method](../../platform/using-airbyte/delivery-methods) for your data.
 6. Enter the **Host Address**.
 7. Enter your **Username**
 8. Enter your authentication credentials for the SFTP server (**Password** or **Private Key**). If you are authenticating with a private key, you can upload the file containing the private key (usually named `rsa_id`) using the Upload file button.

@@ -108,7 +108,7 @@ This pattern will filter for files that match the format `log-YYYYMMDD`, where `

 <FieldAnchor field="delivery_method.delivery_type">

-Choose a [delivery method](../../using-airbyte/delivery-methods) for your data.
+Choose a [delivery method](../../platform/using-airbyte/delivery-methods) for your data.

 </FieldAnchor>
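The `log-YYYYMMDD` filename format mentioned in the hunk above can be checked with a simple regular expression. This snippet is an illustration only; the connector's actual glob syntax may differ:

```python
import re

# Matches names like log-20240131: "log-" followed by exactly eight digits.
LOG_NAME = re.compile(r"^log-\d{8}$")

files = ["log-20240131", "log-2024", "audit-20240131"]
print([f for f in files if LOG_NAME.match(f)])
```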
@@ -66,7 +66,7 @@ To clear your data for the impacted streams, follow the steps below:
 2. Select the **Status** tab.
 1. In the **Enabled streams** list, click the three dots on the right side of the stream and select **Clear Data**.

-After the clear succeeds, trigger a sync by clicking **Sync Now**. For more information on clearing your data in Airbyte, see [this page](/operator-guides/clear).
+After the clear succeeds, trigger a sync by clicking **Sync Now**. For more information on clearing your data in Airbyte, see [this page](/platform/operator-guides/clear).

 ## Upgrading to 2.6.1

@@ -15,4 +15,4 @@ Clearing your data is required in order to continue syncing `Channel Messages` s
 2. Select the **Status** tab.
 1. In the **Enabled streams** list, click the three dots on the right side of the `Channel Messages` and select **Clear Data**.

-After the clear succeeds, trigger a sync by clicking **Sync Now**. For more information on clearing your data in Airbyte, see [this page](/operator-guides/clear).
+After the clear succeeds, trigger a sync by clicking **Sync Now**. For more information on clearing your data in Airbyte, see [this page](/platform/operator-guides/clear).
@@ -93,7 +93,7 @@ Additionally, specific metadata fields related to the sheet or row can be includ

 The Smartsheet Source is written to pull data from a single Smartsheet spreadsheet. Unlike Google Sheets, Smartsheets only allows one sheet per Smartsheet - so a given Airbyte connector instance can sync only one sheet at a time. To replicate multiple spreadsheets, you can create multiple instances of the Smartsheet Source in Airbyte, reusing the API token for all your sheets that you need to sync.

-**Note: Column headers must contain only alphanumeric characters or `_`, as specified in the** [**Airbyte Protocol**](../../understanding-airbyte/airbyte-protocol.md).
+**Note: Column headers must contain only alphanumeric characters or `_`, as specified in the** [**Airbyte Protocol**](../../platform/understanding-airbyte/airbyte-protocol).

 ## Data type map
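The column-header rule in the Smartsheet note above (alphanumerics or `_` only) can be checked before syncing. A hedged sketch, not part of the connector itself:

```python
import re

HEADER_OK = re.compile(r"^[A-Za-z0-9_]+$")

def invalid_headers(headers):
    # Return the headers that violate the alphanumeric-or-underscore rule.
    return [h for h in headers if not HEADER_OK.match(h)]

print(invalid_headers(["order_id", "Order Total", "sku2"]))
```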
@@ -24,4 +24,4 @@ To gracefully handle these changes for your existing connections, we highly reco
 8. Check all your streams.
 9. Select **Sync now** to sync your data

-For more information on resetting your data in Airbyte, see [this page](/operator-guides/clear).
+For more information on resetting your data in Airbyte, see [this page](/platform/operator-guides/clear).

@@ -6,8 +6,8 @@

 Spree Commerce can run on the MySQL or Postgres databases. You can use Airbyte to sync your Spree Commerce instance by connecting to the underlying database using the appropriate Airbyte connector:

-- [MySQL](mysql.md)
-- [Postgres](postgres.md)
+- [MySQL](mysql)
+- [Postgres](postgres)

 :::info
@@ -37,7 +37,7 @@ Depending on destination type you may not be prompted to reset your data.
 This will reset the data in your destination and initiate a fresh sync.
 ```

-For more information on resetting your data in Airbyte, see [this page](/operator-guides/clear).
+For more information on resetting your data in Airbyte, see [this page](/platform/operator-guides/clear).
@@ -68,7 +68,7 @@ The stream `Refunds` will need to be synced historically again to ensure the con
 6. Confirm the modal to save the connection and initiate a `Refresh`. This will start to pull in all historical data for the stream.

 :::note
-If you are using a destination that does not support the `Refresh` feature, you will need to [Clear](/operator-guides/clear) your stream. This will remove the data from the destination for just that stream. You will then need to sync the connection again in order to sync all data again for that stream.
+If you are using a destination that does not support the `Refresh` feature, you will need to [Clear](/platform/operator-guides/clear) your stream. This will remove the data from the destination for just that stream. You will then need to sync the connection again in order to sync all data again for that stream.
 :::

 ## Upgrading to 5.0.0
@@ -12,10 +12,10 @@ You will only be able to connect to a self-hosted instance of Sugar CRM using th

 Sugar CRM can run on the MySQL, MSSQL, Oracle, or Db2 databases. You can use Airbyte to sync your Sugar CRM instance by connecting to the underlying database using the appropriate Airbyte connector:

-- [DB2](db2.md)
-- [MySQL](mysql.md)
-- [MSSQL](mssql.md)
-- [Oracle](oracle.md)
+- [DB2](db2)
+- [MySQL](mysql)
+- [MSSQL](mssql)
+- [Oracle](oracle)

 :::info
@@ -65,7 +65,7 @@ You may want to review SurveyMonkey's [API docs](https://developer.surveymonkey.

 ## Supported streams and sync modes

-You can stream the following data from SurveyMonkey using the [sync modes](/using-airbyte/core-concepts/sync-modes/) indicated.
+You can stream the following data from SurveyMonkey using the [sync modes](/platform/using-airbyte/core-concepts/sync-modes/) indicated.

 | Stream | Sync mode |
 | :------ | :--------- |
@@ -4,7 +4,7 @@

 ## Sync overview

-Zencart runs on a MySQL database. You can use Airbyte to sync your Zencart instance by connecting to the underlying MySQL database and leveraging the [MySQL](mysql.md) connector.
+Zencart runs on a MySQL database. You can use Airbyte to sync your Zencart instance by connecting to the underlying MySQL database and leveraging the [MySQL](mysql) connector.

 :::info
@@ -56,4 +56,4 @@ Each instance of the connector must be updated separately. If you have created m
 This will reset the data in your destination and initiate a fresh sync.
 :::

-For more information on resetting your data in Airbyte, see [this page](/operator-guides/clear).
+For more information on resetting your data in Airbyte, see [this page](/platform/operator-guides/clear).

@@ -54,4 +54,4 @@ Depending on destination type you may not be prompted to reset your data.
 This will reset the data in your destination and initiate a fresh sync.
 :::

-For more information on resetting your data in Airbyte, see [this page](/operator-guides/clear)
+For more information on resetting your data in Airbyte, see [this page](/platform/operator-guides/clear)
@@ -108,7 +108,7 @@ Once your Microsoft Entra ID app is set up, you're ready to deploy Airbyte Self-
 * Client ID: You'll find this in the **Essentials** section on the **Overview** page of the application you created.
 * Client Secret: The client secret you copied in the previous step.

-Use this information to configure the auth details of your `airbyte.yml` for your Self-Managed Enterprise deployment. To learn more on deploying Self-Managed Enterprise, see our [implementation guide](/enterprise-setup/implementation-guide).
+Use this information to configure the auth details of your `airbyte.yml` for your Self-Managed Enterprise deployment. To learn more on deploying Self-Managed Enterprise, see our [implementation guide](/platform/enterprise-setup/implementation-guide).

 </TabItem>
 </Tabs>
@@ -109,7 +109,7 @@ On the following screen you'll need to configure all parameters for your Okta ap
 * Client ID
 * Client Secret

-Visit the [implementation guide](/enterprise-setup/implementation-guide.md) for instructions on how to deploy Airbyte Enterprise using `kubernetes`, `kubectl` and `helm`.
+Visit the [implementation guide](../../enterprise-setup/implementation-guide.md) for instructions on how to deploy Airbyte Enterprise using `kubernetes`, `kubectl` and `helm`.

 </TabItem>
 </Tabs>
@@ -13,7 +13,7 @@ Our API is a reliable, easy-to-use interface for programmatically controlling th

 ## Configuring API Access

-View our documentation [here](./using-airbyte/configuring-api-access.md) to learn how to start using the Airbyte API.
+View our documentation [here](using-airbyte/configuring-api-access.md) to learn how to start using the Airbyte API.

 ## Using the Airbyte API
Before Width: | Height: | Size: 254 KiB After Width: | Height: | Size: 254 KiB |
|
Before Width: | Height: | Size: 103 KiB After Width: | Height: | Size: 103 KiB |
Some files were not shown because too many files have changed in this diff Show More
Reference in New Issue
Block a user