Reorganize BQ docs
@@ -6,15 +6,6 @@ description: >-
# BigQuery

## Uploading options

There are two available options to upload data to BigQuery: `Standard` and `GCS Staging`.

- `Standard` is the option to upload data directly from your source to BigQuery storage. This way is faster and requires fewer resources than the GCS one.
Please be aware that you may see some failures for big datasets and slow sources, e.g. if reading from the source takes more than 10-12 hours.
This is caused by limitations of the Google BigQuery SDK client. For more details, please check https://github.com/airbytehq/airbyte/issues/3549.
- `GCS Uploading (CSV format)`. This approach was implemented to avoid the issue with big datasets mentioned above.
In the first step, all data is uploaded to a GCS bucket, and then it is moved to BigQuery in one shot, stream by stream.
The destination-gcs connector is partially used under the hood here, so you can check its documentation for more details.
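To make the difference between the two options more concrete, here is a minimal sketch of the `Standard` style of upload using the `google-cloud-bigquery` Python client. It is only an illustration, not the connector's actual implementation, and the project, dataset, and table names are placeholders.

```python
# "Standard"-style loading: rows go straight to BigQuery through the
# Google BigQuery SDK, with no intermediate storage. All names are hypothetical.
from google.cloud import bigquery

client = bigquery.Client(project="my-gcp-project")
table_id = "my-gcp-project.airbyte_dataset.users"

rows = [
    {"id": 1, "name": "alice"},
    {"id": 2, "name": "bob"},
]

# load_table_from_json runs a single load job that carries the records to
# BigQuery; for very large or very slow sources this long-running upload is
# where the SDK limitations mentioned above can surface.
job = client.load_table_from_json(rows, table_id)
job.result()  # wait for the load job to finish
print(f"Loaded {client.get_table(table_id).num_rows} rows into {table_id}")
```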
## Overview

The Airbyte BigQuery destination allows you to sync data to BigQuery. BigQuery is a serverless, highly scalable, and cost-effective data warehouse offered by Google Cloud Platform.
@@ -40,8 +31,19 @@ Each stream will be output into its own table in BigQuery. Each table will conta
| Full Refresh Sync | Yes | |
| Incremental - Append Sync | Yes | |
| Incremental - Deduped History | Yes | |
| Bulk loading | Yes | |
| Namespaces | Yes | |

## Uploading options
There are two available options to upload data to BigQuery: `Standard` and `GCS Staging`.

- `Standard` is the option to upload data directly from your source to BigQuery storage. This way is faster and requires fewer resources than the GCS one.
Please be aware that you may see some failures for big datasets and slow sources, e.g. if reading from the source takes more than 10-12 hours.
This is caused by limitations of the Google BigQuery SDK client. For more details, please check https://github.com/airbytehq/airbyte/issues/3549.
- `GCS Uploading (CSV format)`. This approach was implemented to avoid the issue with big datasets mentioned above.
In the first step, all data is uploaded to a GCS bucket, and then it is moved to BigQuery in one shot, stream by stream.
The destination-gcs connector is partially used under the hood here, so you can check its documentation for more details.
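For comparison, here is a minimal sketch of the `GCS Staging` pattern using the `google-cloud-storage` and `google-cloud-bigquery` Python clients. It only illustrates the stage-then-load idea, not the connector's actual code, and the bucket, dataset, and table names are placeholders.

```python
# "GCS Staging"-style loading: records are first written to a CSV object in a
# GCS bucket, then the whole file is loaded into BigQuery in one shot.
# All names are hypothetical.
import csv
import io

from google.cloud import bigquery, storage

bucket_name = "airbyte-staging-bucket"
table_id = "my-gcp-project.airbyte_dataset.users"

# Step 1: stage the records as a CSV object in GCS.
buf = io.StringIO()
writer = csv.writer(buf)
writer.writerow(["id", "name"])
writer.writerows([[1, "alice"], [2, "bob"]])

storage_client = storage.Client()
blob = storage_client.bucket(bucket_name).blob("staging/users.csv")
blob.upload_from_string(buf.getvalue(), content_type="text/csv")

# Step 2: load the staged file into BigQuery with a single load job.
bq_client = bigquery.Client()
job_config = bigquery.LoadJobConfig(
    source_format=bigquery.SourceFormat.CSV,
    skip_leading_rows=1,
    autodetect=True,
)
load_job = bq_client.load_table_from_uri(
    f"gs://{bucket_name}/staging/users.csv", table_id, job_config=job_config
)
load_job.result()  # wait for the load job to finish
```

Because the transfer into BigQuery happens as one load job per staged file, a slow or long-running extraction no longer keeps a direct BigQuery upload open for its whole duration, which is why this option is the safer choice for big datasets.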
## Getting started

### Requirements
@@ -144,7 +146,7 @@ Therefore, Airbyte BigQuery destination will convert any invalid characters into
## CHANGELOG

### destination-bigquery
### bigquery

| Version | Date | Pull Request | Subject |
| :--- | :--- | :--- | :--- |
@@ -156,7 +158,7 @@ Therefore, Airbyte BigQuery destination will convert any invalid characters into
| 0.3.6 | 2021-06-18 | [#3947](https://github.com/airbytehq/airbyte/issues/3947) | Service account credentials are now optional. |
| 0.3.4 | 2021-06-07 | [#3277](https://github.com/airbytehq/airbyte/issues/3277) | Add dataset location option |

### destination-bigquery-denormalized
### bigquery-denormalized

| Version | Date | Pull Request | Subject |
| :--- | :--- | :--- | :--- |