
Documentation - Consolidation of documentation around building new/custom connectors + consistency on vocabulary (#868)

Author: John Lafleur
Date: 2020-11-11 09:46:39 +11:00
Committed by: GitHub
Parent: ce40a0852b
Commit: 458a91a19e
18 changed files with 99 additions and 88 deletions


@@ -15,8 +15,8 @@ The new open-source standard to sync data from applications, APIs & databases to
 Airbyte is on a mission to make data integration pipelines a commodity.
 * **Maintenance-free connectors you can use in minutes**. Just authenticate your sources and warehouse, and get connectors that adapt to schema and API changes for you.
-* **Building new integrations made trivial.** We make it very easy to add new integrations that you need, using the language of your choice, by offering scheduling and orchestration.
-* Designed to **cover the long tail of integrations and needs**. Benefit from the community's battle-tested connectors and adapt them to your specific needs.
+* **Building new connectors made trivial.** We make it very easy to add new connectors that you need, using the language of your choice, by offering scheduling and orchestration.
+* Designed to **cover the long tail of connectors and needs**. Benefit from the community's battle-tested connectors and adapt them to your specific needs.
 * **Your data stays in your cloud**. Have full control over your data, and the costs of your data transfers.
 * **No more security compliance process** to go through as Airbyte is self-hosted.
 * **No more pricing indexed on volume**, as cloud-based solutions offer.
@@ -35,7 +35,7 @@ Here is a [step-by-step guide](docs/getting-started-tutorial.md) showing you how
 ## Features
-* **Built for extensibility**: Adapt an existing integration to your needs or build a new one with ease.
+* **Built for extensibility**: Adapt an existing connector to your needs or build a new one with ease.
 * **Optional normalized schemas**: Entirely customizable, start with raw data or from some suggestion of normalized data.
 * **Full-grade scheduler**: Automate your replications with the frequency you need.
 * **Real-time monitoring**: We log all errors in full detail to help you understand.
@@ -51,7 +51,7 @@ We love contributions to Airbyte, big or small.
 See our [Contributing guide](docs/contributing-to-airbyte/) on how to get started. Not sure where to start? We've listed some [good first issues](https://github.com/airbytehq/airbyte/labels/good%20first%20issue) to start with. You can also [book a free, no-pressure pairing session](https://drift.me/micheltricot/meeting) with one of our core contributors.
-**Note that you are able to create integrations using the language you want, as Airbyte connections run as Docker containers.**
+**Note that you are able to create connectors using the language you want, as Airbyte connections run as Docker containers.**
 ## Community support


@@ -4,10 +4,10 @@
 * [Getting Started](getting-started-tutorial.md)
 * [Changelog](changelog.md)
 * [Deploying Airbyte](deploying-airbyte/README.md)
-* [On your workstation](deploying-airbyte/on-your-workstation.md)
+* [On Your Workstation](deploying-airbyte/on-your-workstation.md)
 * [On AWS \(EC2\)](deploying-airbyte/on-aws-ec2.md)
 * [On GCP \(Compute Engine\)](deploying-airbyte/on-gcp-compute-engine.md)
-* [Integrations](integrations/README.md)
+* [Connectors](integrations/README.md)
 * [Sources](integrations/sources/README.md)
 * [exchangeratesapi.io](integrations/sources/exchangeratesapi-io.md)
 * [Facebook Marketing API](integrations/sources/facebook-marketing-api.md)
@@ -28,9 +28,8 @@
 * [Local CSV](integrations/destinations/local-csv.md)
 * [Postgres](integrations/destinations/postgres.md)
 * [Snowflake](integrations/destinations/snowflake.md)
-* [Missing an Integration?](integrations/missing-an-integration.md)
-* [Custom Connectors](integrations/custom-connectors.md)
-* [Integrations Changelog](integrations/integrations-changelog.md)
+* [Custom or New Connector](integrations/custom-connectors.md)
+* [Connector Changelog](integrations/integrations-changelog.md)
 * [Architecture](architecture/README.md)
 * [High-level View](architecture/high-level-view.md)
 * [Airbyte Specification](architecture/airbyte-specification.md)
@@ -39,13 +38,13 @@
 * [Contributing to Airbyte](contributing-to-airbyte/README.md)
 * [Code of Conduct](contributing-to-airbyte/code-of-conduct.md)
 * [Developing Locally](contributing-to-airbyte/developing-locally.md)
-* [Building new connectors](contributing-to-airbyte/building-new-connector/README.md)
+* [Building New Connectors](contributing-to-airbyte/building-new-connector/README.md)
 * [Python Connectors](contributing-to-airbyte/building-new-connector/python-connectors.md)
 * [Java Connectors](contributing-to-airbyte/building-new-connector/java-connectors.md)
 * [Code Style](contributing-to-airbyte/code-style.md)
 * [Updating Documentation](contributing-to-airbyte/updating-documentation.md)
 * [Templates](contributing-to-airbyte/templates/README.md)
-* [Integration Doc Template](contributing-to-airbyte/templates/integration-documentation-template.md)
+* [Connector Doc Template](contributing-to-airbyte/templates/integration-documentation-template.md)
 * [Company Handbook](company-handbook/README.md)
 * [Story](company-handbook/future-milestones.md)
 * [Culture and Values](company-handbook/culture-and-values.md)


@@ -2,7 +2,7 @@
 ## Key Takeaways
-* The specification is Docker-based; this allows a developer to write an integration in any language they want. All they have to do is put that code in a Docker container that adheres to the interface and protocol described below.
+* The specification is Docker-based; this allows a developer to write a connector in any language they want. All they have to do is put that code in a Docker container that adheres to the interface and protocol described below.
 * We currently provide templates to make this even easier for those who prefer to work in python or java. These templates allow the developer to skip any Docker setup so that they can just implement code against well-defined interfaces in their language of choice.
 * The specification is designed to work as a CLI. The Airbyte app is built on top of this CLI.
 * The specification defines a standard interface for implementing data integrations: Sources and Destinations.
@@ -12,10 +12,10 @@
 #### Contents:
 1. [General information about the specification](airbyte-specification.md#general)
-2. [Integration primitives](airbyte-specification.md#primitives)
-3. [Details of the protocol to pass information between integrations](airbyte-specification.md#the-airbyte-protocol)
+2. [Connector primitives](airbyte-specification.md#primitives)
+3. [Details of the protocol to pass information between connectors](airbyte-specification.md#the-airbyte-protocol)
-This document is focused on the interfaces and primitives around integrations. You can better understand how that fits into the bigger picture by checking out the [Airbyte Architecture]().
+This document is focused on the interfaces and primitives around connectors. You can better understand how that fits into the bigger picture by checking out the [Airbyte Architecture](airbyte-specification.md).
 ## General
@@ -25,11 +25,11 @@ This document is focused on the interfaces and primitives around integrations. Y
 ### Definitions
 * **Airbyte Worker** - This is a core piece of the Airbyte stack that is responsible for 1\) initializing a Source and a Destination and 2\) passing data from Source to Destination.
-* Someone implementing an integration need not ever touch this code, but in this article we mention it to contextualize how data is flowing through Airbyte.
-* **Integration** - An integration is code that allows Airbyte to interact with a specific underlying data source \(e.g. Postgres\). In Airbyte, an integration is either a Source or a Destination.
-* **Source** - An integration that _pulls_ data from an underlying data source. \(e.g. A Postgres Source reads data from a Postgres database. A Stripe Source reads data from the Stripe API\)
-* **Destination** - An integration that _pushes_ data to an underlying data source. \(e.g. A Postgres Destination writes data to a Postgres database\)
-* **AirbyteSpecification** - the specification that describes how to implement integrations using a standard interface.
+* Someone implementing a connector need not ever touch this code, but in this article we mention it to contextualize how data is flowing through Airbyte.
+* **Connector** - A connector is code that allows Airbyte to interact with a specific underlying data source \(e.g. Postgres\). In Airbyte, a connector is either a Source or a Destination.
+* **Source** - A connector that _pulls_ data from an underlying data source. \(e.g. A Postgres Source reads data from a Postgres database. A Stripe Source reads data from the Stripe API\)
+* **Destination** - A connector that _pushes_ data to an underlying data source. \(e.g. A Postgres Destination writes data to a Postgres database\)
+* **AirbyteSpecification** - the specification that describes how to implement connectors using a standard interface.
 * **AirbyteProtocol** - the protocol used for inter-process communication.
 * **Integration Commands** - the commands that an integration container implements \(e.g. `spec`, `check`, `discover`, `read`/`write`\). We describe these commands in more detail below.
 * **Sync** - the act of moving data from a Source to a Destination.
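To make the Integration Commands above concrete, here is a hedged sketch of invoking each one against a hypothetical source image; the image name, the mounted `/data` volume, and the config/catalog file names are illustrative assumptions, not part of the specification text:

```bash
# Each command is an entrypoint argument to the connector's Docker image (illustrative only).
docker run --rm airbyte/source-example spec
docker run --rm -v $(pwd):/data airbyte/source-example check --config /data/config.json
docker run --rm -v $(pwd):/data airbyte/source-example discover --config /data/config.json
docker run --rm -v $(pwd):/data airbyte/source-example read --config /data/config.json --catalog /data/catalog.json
```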
@@ -70,7 +70,7 @@ read(Config, AirbyteCatalog, State) -> Stream<AirbyteMessage>
 1. `spec` - a [ConnectorSpecification](https://github.com/airbytehq/airbyte/blob/master/airbyte-protocol/models/src/main/resources/airbyte_protocol/airbyte_message.yaml#L133-L149) wrapped in an `AirbyteMessage` of type `spec`.
 * The objective of the spec command is to pull information about how to use a source. The `ConnectorSpecification` contains this information.
 * The `connectionSpecification` of the `ConnectorSpecification` must be valid JsonSchema. It describes what inputs are needed in order for the source to interact with the underlying data source.
-* e.g. If using a Postgres source, the `ConnectorSpecification` would specify that a `hostname`, `port`, and `password` are required in order for the integration to function.
+* e.g. If using a Postgres source, the `ConnectorSpecification` would specify that a `hostname`, `port`, and `password` are required in order for the connector to function.
 * The UI reads the JsonSchema in this field in order to render the input fields for a user to fill in.
 * This JsonSchema is also used to validate that the provided inputs are valid. e.g. If `port` is one of the fields and the JsonSchema in the `connectorSpecification` specifies that this field should be a number, if a user inputs "airbyte", they will receive an error. Airbyte adheres to JsonSchema validation rules.
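As a sketch of the Postgres example above, a `connectionSpecification` could be JsonSchema along these lines; the exact field set is an assumption, and the JSON is written to a file here only so the block is runnable:

```bash
# Hypothetical connectionSpecification excerpt for a Postgres-like source.
cat > connection_spec_example.json <<'EOF'
{
  "connectionSpecification": {
    "type": "object",
    "required": ["hostname", "port", "password"],
    "properties": {
      "hostname": {"type": "string"},
      "port": {"type": "integer"},
      "password": {"type": "string", "description": "Password for the database user."}
    }
  }
}
EOF
```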
@@ -171,7 +171,7 @@ read(Config, AirbyteCatalog, State) -> Stream<AirbyteMessage>
 1. `message stream` - A stream of `AirbyteRecordMessage`s and `AirbyteStateMessage`s piped to stdout.
 * This command reads data from the underlying data source and converts it into `AirbyteRecordMessage`.
 * Outputting `AirbyteStateMessages` is optional. It can be used to track how much of the data source has been synced.
-* The integration ideally will only pull the data described in the `catalog` argument. It is permissible for the integration, however, to ignore the `catalog` and pull data from any stream it can find. If it follows this second behavior, the extra data will be pruned in the worker. We prefer the former behavior because it reduces the amount of data that is transferred and allows control over not sending sensitive data. There are some sources for which this is simply not possible.
+* The connector ideally will only pull the data described in the `catalog` argument. It is permissible for the connector, however, to ignore the `catalog` and pull data from any stream it can find. If it follows this second behavior, the extra data will be pruned in the worker. We prefer the former behavior because it reduces the amount of data that is transferred and allows control over not sending sensitive data. There are some sources for which this is simply not possible.
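For illustration, the `message stream` on stdout is newline-delimited JSON, one `AirbyteMessage` per line; for a hypothetical `users` stream it might look like this (the stream name, fields, and state shape are made up):

```bash
# Two records followed by an optional state message, one AirbyteMessage per line.
cat <<'EOF'
{"type": "RECORD", "record": {"stream": "users", "data": {"id": 1, "name": "ada"}, "emitted_at": 1605000000000}}
{"type": "RECORD", "record": {"stream": "users", "data": {"id": 2, "name": "grace"}, "emitted_at": 1605000001000}}
{"type": "STATE", "state": {"data": {"users_cursor": 2}}}
EOF
```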
 ### Destination
@@ -212,11 +212,11 @@ For the sake of brevity, we will not re-describe `spec` and `check`. They are ex
 ## The Airbyte Protocol
-* All messages passed to and from integrations must be wrapped in an `AirbyteMessage` envelope and serialized as JSON. The JsonSchema specification for these messages can be found [here](https://github.com/airbytehq/airbyte/blob/master/airbyte-protocol/models/src/main/resources/airbyte_protocol/airbyte_message.yaml).
+* All messages passed to and from connectors must be wrapped in an `AirbyteMessage` envelope and serialized as JSON. The JsonSchema specification for these messages can be found [here](https://github.com/airbytehq/airbyte/blob/master/airbyte-protocol/models/src/main/resources/airbyte_protocol/airbyte_message.yaml).
 * Even if a record is wrapped in an `AirbyteMessage` it will only be processed if it is appropriate for the given command. e.g. If a source `read` action includes AirbyteMessages of type Catalog in its stream, for instance, these messages will be ignored as the `read` interface only expects `AirbyteRecordMessage`s and `AirbyteStateMessage`s. The appropriate `AirbyteMessage` types have been described in each command above.
-* **ALL** actions are allowed to return `AirbyteLogMessage`s on stdout. For brevity, we have not mentioned these log messages in the description of each action, but they are always allowed. An `AirbyteLogMessage` wraps any useful logging that the integration wants to provide. These logs will be written to Airbyte's log files and output to the console.
+* **ALL** actions are allowed to return `AirbyteLogMessage`s on stdout. For brevity, we have not mentioned these log messages in the description of each action, but they are always allowed. An `AirbyteLogMessage` wraps any useful logging that the connector wants to provide. These logs will be written to Airbyte's log files and output to the console.
 * I/O:
-* Integrations receive arguments on the command line via JSON files. `e.g. --catalog catalog.json`
+* Connectors receive arguments on the command line via JSON files. `e.g. --catalog catalog.json`
 * They read `AirbyteMessage`s from stdin. The destination `write` action is the only command that consumes `AirbyteMessage`s.
 * They emit `AirbyteMessage`s on stdout. All commands that output messages use this approach \(even `write` emits `AirbyteLogMessage`s\). e.g. `discover` outputs the `catalog` wrapped in an AirbyteMessage on stdout.
 * Messages not wrapped in the `AirbyteMessage` will be ignored.
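Putting the I/O rules together, a minimal sketch of a sync run by hand rather than by the Airbyte worker; the image names and file paths are assumptions, and a real worker additionally prunes streams not in the catalog and handles `AirbyteLogMessage`s:

```bash
# The source emits AirbyteMessages on stdout; the destination consumes them on stdin.
docker run --rm -v $(pwd):/data airbyte/source-example read \
    --config /data/source_config.json --catalog /data/catalog.json \
  | docker run --rm -i -v $(pwd):/data airbyte/destination-example write \
      --config /data/destination_config.json --catalog /data/catalog.json
```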


@@ -1,6 +1,6 @@
 # Full Refresh
-This readme describes Airbyte conventions around the "full refresh" concept. Out in the world, there are many ways to define this term. We want the behavior of Airbyte integrations to be predictable, so we are adopting a preferred definition. This readme also describes what behavior to fall back on if the preferred convention cannot be used.
+This readme describes Airbyte conventions around the "full refresh" concept. Out in the world, there are many ways to define this term. We want the behavior of Airbyte connectors to be predictable, so we are adopting a preferred definition. This readme also describes what behavior to fall back on if the preferred convention cannot be used.
 On the nth sync of a full refresh connection:
@@ -73,5 +73,5 @@ Not all data warehouses will necessarily be able to adhere to either of these co
 ## In the future
-We will consider making other flavors of full refresh configurable as first-class citizens in Airbyte. e.g. On new data, copy old data to a new table with a timestamp, and then replace the original table with the new data. As always, we will focus on adding these options in such a way that the behavior of each integration is both well documented and predictable.
+We will consider making other flavors of full refresh configurable as first-class citizens in Airbyte. e.g. On new data, copy old data to a new table with a timestamp, and then replace the original table with the new data. As always, we will focus on adding these options in such a way that the behavior of each connector is both well documented and predictable.
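As a toy illustration of the overwrite flavor of full refresh (assuming the preferred convention replaces previously synced data; this is plain shell, not Airbyte code): each sync replaces what the previous sync wrote, so the destination never accumulates stale rows:

```bash
# '>' overwrites the destination wholesale; '>>' would be an append flavor instead.
sync_full_refresh() { cat "$1" > destination_table.jsonl; }
sync_full_refresh records_from_sync_1.jsonl
sync_full_refresh records_from_sync_2.jsonl   # only sync-2 rows remain afterwards
```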


@@ -11,7 +11,7 @@ description: A high level view of Airbyte's components.
 * `Scheduler Store`: Stores statuses and job information for the scheduler bookkeeping.
 * `Config API`: Allows the UI to read and update connection information.
 * `Scheduler API`: Allows the UI to read and control jobs \(schema discovery, connection testing, logs...\).
-* `Scheduler`: The scheduler orchestrates all the data syncing from the source integration to the destination. It is responsible for tracking success/failure and for triggering syncs based on the configured frequency.
+* `Scheduler`: The scheduler orchestrates all the data syncing from the source connector to the destination one. It is responsible for tracking success/failure and for triggering syncs based on the configured frequency.
 * `Worker`: The worker connects to the source system, pulls the data and writes it to the destination system.
 * `Temporary Storage`: A storage that workers can use whenever they need to spill data on a disk.


@@ -4,7 +4,7 @@ description: Be sure to not miss out on new features and improvements!
 # Changelog
-If you're interested in our changelog on which new connectors are available out of the box on Airbyte, please check our [integrations changelog](integrations/integrations-changelog.md).
+This is the changelog for Airbyte core. For our connector changelog, please visit our [Connector Changelog](integrations/integrations-changelog.md) page.
 For a non-exhaustive list of the features we will support in the next few months, please visit our [roadmap overview](../#roadmap).
@@ -14,8 +14,8 @@ If you're interested in our progress on the Airbyte platform, please read below!
 Here is what we have in mind:
-* Support multiple destinations
-* **New destination:** our own Redshift warehouse integration
+* Support **multiple destinations**
+* **New destination:** our own Redshift warehouse connector
 * **New sources:** 10 additional source connectors, including MSSQL, GCS CSV, S3 CSV, SFTP CSV
 * As a bonus if we can, update the onboarding experience with pre-filled demo data for the users who just want to see how Airbyte works with the least effort.
@@ -23,7 +23,7 @@ Here is what we have in mind:
 Here is what we are working on right now:
-* **New destination**: our own **Snowflake** warehouse integration
+* **New destination**: our own **Snowflake** warehouse connector
 * **New sources:** Facebook Ads, Google Ads.
 ## 0.3.0 - delivered on 10/30/2020
@@ -36,8 +36,8 @@ Here is what we are working on right now:
 * **a new Admin section** to enable users to add their own connectors, in addition to upgrading the ones they currently use
 * improve the developer experience \(DX\) for **contributing new connectors** with additional documentation and a connector protocol
-* our own **BigQuery** warehouse integration
-* our own **Postgres** warehouse integration
+* our own **BigQuery** warehouse connector
+* our own **Postgres** warehouse connector
 * simplify the process of supporting new Singer taps, ideally make it a 1-day process
 ## 0.1.0 - delivered on 09/23/2020


@@ -24,13 +24,13 @@ Here is a list of easy [good first issues](https://github.com/airbytehq/airbyte/
 ## Areas for contributing
-### **New integrations**
+### **New connectors**
-It's easy to add your own integrations to Airbyte! **Since Airbyte connectors are encapsulated within Docker containers, you can use any language you like.** Here are some links on how to add sources and destinations. We haven't built the documentation for all languages yet, so don't hesitate to reach out to us if you'd like help developing integrations in other languages.
+It's easy to add your own connector to Airbyte! **Since Airbyte connectors are encapsulated within Docker containers, you can use any language you like.** Here are some links on how to add sources and destinations. We haven't built the documentation for all languages yet, so don't hesitate to reach out to us if you'd like help developing connectors in other languages.
 #### **Contributing sources & destinations:**
-You can build a Source and a Destination in any language. Since we frequently build integrations in Python, on top of Singer or in Java, we've created generator libraries to get you started quickly. See [Building new connectors](building-new-connector/) to get started.
+You can build a Source and a Destination in any language. Since we frequently build connectors in Python, on top of Singer or in Java, we've created generator libraries to get you started quickly. See [Building new connectors](building-new-connector/) to get started.
 ### **Documentation**
@@ -56,7 +56,7 @@ Feel free to submit a pull request in this repo, if you have something to add ev
 ## Ways you can contribute
-### **Adding to the codebase for an integration or issue**
+### **Adding to the codebase for a connector or issue**
 First, a big thank you! A few things to keep in mind when contributing code:
@@ -66,9 +66,9 @@ First, a big thank you! A few things to keep in mind when contributing code:
 Here are some details about [our review process](./#review-process).
-### **Upvoting issues, feature and integration requests**
+### **Upvoting issues, feature and connector requests**
-You are welcome to add your own reactions to the existing issues. We will take them into consideration in our prioritization efforts, especially for integrations.
+You are welcome to add your own reactions to the existing issues. We will take them into consideration in our prioritization efforts, especially for connectors.
 ❤️ means that this task is CRITICAL to you.
 👍 means it is important to you.
@@ -83,9 +83,9 @@ To see what has already been proposed by the community, you can look [here](http
 Watch out for duplicates! If you are creating a new issue, please check [existing open](https://github.com/airbytehq/airbyte/issues) or [recently closed](https://github.com/airbytehq/airbyte/issues?utf8=%E2%9C%93&q=is%3Aissue%20is%3Aclosed%20) issues. Having a single upvoted issue is far easier for us to prioritize.
-### **Requesting new integrations**
+### **Requesting new connectors**
-This is very similar to requesting new features. The template will change a bit and all integration requests will be tagged with the “**community\_new**” and “**area/integration**” labels.
+This is very similar to requesting new features. The template will change a bit and all connector requests will be tagged with the “**community\_new**” and “**area/integration**” labels.
 To see what has already been proposed by the community, you can look [here](https://github.com/airbytehq/airbyte/labels/area%2Fintegration). Again, watch out for duplicates!
@@ -101,7 +101,7 @@ Please do not create a public GitHub issue. If you've found a security issue, pl
 ## **Review process**
-If you are considering adding to the codebase or contributing a new integration: a big thank you! We sincerely appreciate your help.
+If you are considering adding to the codebase or contributing a new connector: a big thank you! We sincerely appreciate your help.
 As soon as you are done with your development, just put up a PR. You're also always welcome to reach out during or before development. When we review we look at:


@@ -1,19 +1,15 @@
-# Building new connectors
+# Building New Connectors
 A connector takes the form of a Docker image which follows the [Airbyte specification](../../architecture/airbyte-specification.md).
-We support 2 types of connectors:
-* Sources
-* Destinations
+We support 2 types of connectors: Sources and Destinations.
 To build a new connector, we provide templates so you don't need to start everything from scratch.
 ## The Airbyte specification
 Before you can start building your own connector, you need to understand [Airbyte's data protocol specification](../../architecture/airbyte-specification.md).
-## Creating a new Integration
+## Creating a new connector
 First, make sure you built the project by running
@@ -31,21 +27,21 @@ npm run generate
 and follow the interactive prompt.
-This will generate a new integration in the `airbyte-integrations/connectors/<your-integration>` directory.
+This will generate a new connector in the `airbyte-integrations/connectors/<your-connector>` directory.
-Follow the instructions generated in the `CHECKLIST.md` file to bootstrap the integration.
+Follow the instructions generated in the `CHECKLIST.md` file to bootstrap the connector.
 The generated `README.md` will also contain instructions on how to iterate.
-## Updating an Integration
+## Updating a connector
-Once you've finished iterating on the changes to a connector as specified in its `README.md`, follow these instructions to tell Airbyte to use the latest version of your integration.
+Once you've finished iterating on the changes to a connector as specified in its `README.md`, follow these instructions to tell Airbyte to use the latest version of your connector.
-1. Bump the version in the `Dockerfile` of the integration \(`LABEL io.airbyte.version=X.X.X`\).
+1. Bump the version in the `Dockerfile` of the connector \(`LABEL io.airbyte.version=X.X.X`\).
 2. Update the connector version in:
 * `STANDARD_SOURCE_DEFINITION` if it is a source
 * `STANDARD_DESTINATION_DEFINITION` if it is a destination.
-3. Build the integration with the semantic version tag locally:
+3. Build the connector with the semantic version tag locally:
 ```text
 ./tools/integrations/manage.sh build airbyte-integrations/connectors/<connector-name>
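A condensed, hedged sketch of the generate-and-update flow described above; the connector name is a placeholder, and only the commands quoted on this page (`./gradlew build`, `npm run generate`, the `manage.sh` build line) are assumed to exist:

```bash
./gradlew build        # build the project first, as noted above
npm run generate       # interactive prompt; scaffolds airbyte-integrations/connectors/<your-connector>
cat airbyte-integrations/connectors/source-example/CHECKLIST.md   # bootstrap steps (name is hypothetical)

# After iterating: bump `LABEL io.airbyte.version=X.X.X` in the connector's Dockerfile,
# update STANDARD_SOURCE_DEFINITION (or STANDARD_DESTINATION_DEFINITION), then:
./tools/integrations/manage.sh build airbyte-integrations/connectors/source-example
```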


@@ -6,7 +6,7 @@ In order to provide the best developer experience, here are some instructions on
 ## Python Connector Development
-Before working with integrations written in python, we recommend running `./gradlew build` from the root project directory. This will create a `virtualenv` for every integration and helper project and install dependencies locally.
+Before working with connectors written in python, we recommend running `./gradlew build` from the root project directory. This will create a `virtualenv` for every connector and helper project and install dependencies locally.
 When iterating on a single connector, you will often iterate by running
@@ -24,7 +24,7 @@ This command will:
 3. [Flake8](https://pypi.org/project/flake8/) to check formatting
 4. [MyPy](https://pypi.org/project/mypy/) to check type usage
-At Airbyte, we use IntelliJ IDEA for development. Although it is possible to develop integrations with any IDE, we typically recommend IntelliJ IDEA or PyCharm, since we actively work towards compatibility.
+At Airbyte, we use IntelliJ IDEA for development. Although it is possible to develop connectors with any IDE, we typically recommend IntelliJ IDEA or PyCharm, since we actively work towards compatibility.
 Our typical development flow is to have one IntelliJ project for `java` development with `gradle` and a separate IntelliJ project for python. The following setup steps are written for IntelliJ IDEA but should have similar equivalents for PyCharm:
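For example, a plausible iteration loop for a python connector, assuming the gradle-created `virtualenv` lives inside the connector's directory (the `.venv` path and connector name are assumptions, and Flake8/MyPy are run directly here rather than through gradle):

```bash
./gradlew build                                                   # creates a virtualenv per connector
source airbyte-integrations/connectors/source-example/.venv/bin/activate
flake8 .                                                          # formatting check (tool 3 above)
mypy .                                                            # type check (tool 4 above)
```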


@@ -1,22 +1,22 @@
 ---
 description: >-
 This is the template that should be used when adding documentation for a new
-integration.
+connector.
 ---
-# Integration Doc Template
+# Connector Doc Template
 ## Sync overview
 ### Output schema
-Is the output schema fixed \(e.g: for an API like Stripe\)? If so, point to the integration's schema \(e.g: link to Stripe's documentation\) or describe the schema here directly \(e.g: include a diagram or paragraphs describing the schema\).
+Is the output schema fixed \(e.g: for an API like Stripe\)? If so, point to the connector's schema \(e.g: link to Stripe's documentation\) or describe the schema here directly \(e.g: include a diagram or paragraphs describing the schema\).
-Describe how the integration's schema is mapped to Airbyte concepts. An example description might be: “MagicDB tables become Airbyte Streams and MagicDB columns become Airbyte Fields. In addition, an extracted\_at column is appended to each row being read.”
+Describe how the connector's schema is mapped to Airbyte concepts. An example description might be: “MagicDB tables become Airbyte Streams and MagicDB columns become Airbyte Fields. In addition, an extracted\_at column is appended to each row being read.”
 ### Data type mapping
-This section should contain a table mapping each of the integration's data types to Airbyte types. At the moment, Airbyte uses the same types used by [JSONSchema](https://json-schema.org/understanding-json-schema/reference/index.html). `string`, `date-time`, `object`, `array`, `boolean`, `integer`, and `number` are the most commonly used data types.
+This section should contain a table mapping each of the connector's data types to Airbyte types. At the moment, Airbyte uses the same types used by [JSONSchema](https://json-schema.org/understanding-json-schema/reference/index.html). `string`, `date-time`, `object`, `array`, `boolean`, `integer`, and `number` are the most commonly used data types.
 | Integration Type | Airbyte Type | Notes |
 | :--- | :--- | :--- |
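For instance, a hypothetical Postgres-like source might fill in the mapping table like this (the rows are illustrative only, not part of the template):

| Integration Type | Airbyte Type | Notes |
| :--- | :--- | :--- |
| `varchar` | `string` |  |
| `int4` | `integer` |  |
| `numeric` | `number` |  |
| `timestamptz` | `date-time` |  |
| `jsonb` | `object` |  |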
@@ -44,8 +44,8 @@ Could this connector hurt the user's database/API/etc... or put too much strain
 ### Requirements
-* What versions of this integration does this implementation support? \(e.g: `postgres v3.14 and above`\)
-* What configurations, if any, are required on the integration? \(e.g: `buffer_size > 1024`\)
+* What versions of this connector does this implementation support? \(e.g: `postgres v3.14 and above`\)
+* What configurations, if any, are required on the connector? \(e.g: `buffer_size > 1024`\)
 * Network accessibility requirements
 * Credentials/authentication requirements? \(e.g: A DB user with read permissions on certain tables\)


@@ -1,4 +1,4 @@
-# On your workstation
+# On Your Workstation
 {% hint style="info" %}
 These instructions have been tested on MacOS


@@ -32,7 +32,7 @@ Now you will see a wizard that allows you choose the data you want to send throu
 ![](.gitbook/assets/02_set-up-sources%20%281%29%20%281%29%20%281%29%20%281%29.png)
-As of our alpha launch, we have one database source \(Postgres\) and two API sources \(an exchange rate API and the Stripe API\). We're currently building an integration framework that makes it easy to create sources and destinations, so you should expect many more soon. Please reach out to us if you need a specific integration or would like to help build one.
+As of our alpha launch, we have one database source \(Postgres\) and two API sources \(an exchange rate API and the Stripe API\). We're currently building an integration framework that makes it easy to create sources and destinations, so you should expect many more soon. Please reach out to us if you need a specific connector or would like to help build one.
 For now, we will start out with a Postgres source and destination.
@@ -86,7 +86,7 @@ You should now see a list of sources with the source you just added. Click on it
 ![](.gitbook/assets/04_source-details%20%281%29%20%281%29%20%281%29%20%281%29.png)
-One of the biggest problems we've seen in tools like Fivetran is the lack of visibility when debugging. In Airbyte, allowing full log access and the ability to debug and fix integration problems is one of our highest priorities. We'll be working hard to make these logs accessible and understandable.
+One of the biggest problems we've seen in tools like Fivetran is the lack of visibility when debugging. In Airbyte, allowing full log access and the ability to debug and fix connector problems is one of our highest priorities. We'll be working hard to make these logs accessible and understandable.
 ## 4. Check if the syncing actually worked
@@ -100,7 +100,7 @@ You should see the rows from the source database inside the destination database
 And there you have it. You've taken data from one database and replicated it to another. All of the actual configuration for this replication only took place in the UI.
-That's it! This is just the beginning of Airbyte. If you have any questions at all, please reach out to us on [Slack](https://slack.airbyte.io/). We're still in alpha, so if you see any rough edges or want to request an integration you need, please create an issue on our [Github](https://github.com/airbytehq/airbyte) or leave a thumbs up on an existing issue.
+That's it! This is just the beginning of Airbyte. If you have any questions at all, please reach out to us on [Slack](https://slack.airbyte.io/). We're still in alpha, so if you see any rough edges or want to request a connector you need, please create an issue on our [Github](https://github.com/airbytehq/airbyte) or leave a thumbs up on an existing issue.
 Thank you and we hope you enjoy using Airbyte.


@@ -1,2 +1,2 @@
-# Integrations
+# Connectors


@@ -1,6 +1,20 @@
-# Custom Connectors
-If you want to use connectors that are not part of the official Airbyte distribution, you can add them directly though the UI.
+---
+description: Missing a connector?
+---
+# Custom or New Connector
+If you'd like to **ask for a new connector,** you can request it directly [here](https://github.com/airbytehq/airbyte/issues/new?assignees=&labels=area%2Fintegration%2C+new-integration&template=new-integration-request.md&title=).
+If you'd like to build new connectors and **make them part of the pool of pre-built connectors on Airbyte,** first a big thank you. We invite you to check our [contributing guide on building connectors](../contributing-to-airbyte/building-new-connector/).
+If you'd like to build new connectors, or update existing ones, **for your own usage,** without contributing to the Airbyte codebase, read along.
+## Developing your own connector
+It's easy to code your own connectors on Airbyte. Here are instructions on how to code new sources and destinations: [building new connectors](../contributing-to-airbyte/building-new-connector/)
+While the guides in the link above are specific to the languages used most frequently to write connectors, **Airbyte connectors can be written in any language**. Please reach out to us if you'd like help developing connectors in other languages.
+## Adding your connectors in the UI


@@ -1,5 +1,6 @@
+---
+description: 'Data warehouses, data lakes, databases...'
+---
 # Destinations
 * [Contributing Destinations](../../contributing-to-airbyte/#new-connectors)
 * Please reach out to us for help developing destinations in other languages.


@@ -2,20 +2,20 @@
 description: Do not miss the new connectors we support!
 ---
-# Integrations Changelog
+# Connector Changelog
-**You can request new integrations directly** [**here**](https://github.com/airbytehq/airbyte/issues/new?assignees=&labels=area%2Fintegration%2C+new-integration&template=new-integration-request.md&title=)**.**
+**You can request new connectors directly** [**here**](https://github.com/airbytehq/airbyte/issues/new?assignees=&labels=area%2Fintegration%2C+new-integration&template=new-integration-request.md&title=)**.**
 Note: Airbyte is not built on top of Singer, but is compatible with Singer's protocol. Airbyte's ambitions go beyond what Singer enables, so we are building our own protocol that will keep its compatibility with Singer's.
 ## Currently under construction
-**New sources:** MSSQL, GCS CSV, S3 CSV, SFTP CSV
+**New sources:** MSSQL, GCS CSV, S3 CSV, SFTP CSV, Hive
 **New destinations:** Redshift
 ## 11/04/2020
-**New sources:** [Facebook Ads](sources/facebook-marketing-api.md), [Google Ads](sources/google-adwords.md)
+**New sources:** [Facebook Ads](sources/facebook-marketing-api.md), [Google Ads](sources/google-adwords.md), [Marketo](sources/marketo.md)
 **New destination:** [Snowflake](destinations/snowflake.md)
 ## 10/30/2020


@@ -1,5 +1,6 @@
+---
+description: 'We''re adding new source connectors every day: APIs, applications, databases...'
+---
 # Sources
 * [Contributing Sources](../../contributing-to-airbyte/#new-connectors)
 * Please reach out to us for help developing sources in other languages.