* mssql-source:upgrade debezium version to 1.9.6
* more improvements
* upgrade version
* auto-bump connector version
* fix test
Co-authored-by: Octavia Squidington III <octavia-squidington-iii@users.noreply.github.com>
API changes to support the progress bar.
- The eventual idea is for the save_stats route to be called by the workers during replication. Workers will save stats for a job id and attempt number.
- Make modifications to the /jobs/list and the /jobs/get_debug_info routes to also return estimated bytes/records.
We need both the estimated metadata and the running stats to calculate the progress bar and throughput.
- add the save_stats route. This is the route that will be called by workers. I've done my best to reuse existing openapi bodies to reduce duplication.
- add the estimatedRecords and estimatedBytes fields to the AttemptStats body. This is part of the AttemptRead and AttemptStreamStats objects, which eventually filter up to the jobs/list and jobs/get_debug_info responses. This also adds these fields to all the endpoints that were previously returning stats information. I think the duplicated data is a small issue and don't think it's worth splitting out new api objects, though I will gladly do so if folks feel strongly.
- Minor changes to the AttemptApiController to support the new route.
- I've stubbed out the handlers for now since the backend is not yet implemented.
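Since the handlers are still stubbed, here is a minimal sketch of the payload a worker might send to the save_stats route, based on the fields described above (the exact field names and aggregation are assumptions, not the final API):

```python
# Hypothetical save_stats request body builder. Workers report per-stream
# stats for a job id and attempt number; totals are aggregated here.
def build_save_stats_body(job_id, attempt_number, stream_stats):
    """stream_stats: list of dicts with streamName, recordsEmitted,
    bytesEmitted, estimatedRecords, estimatedBytes (names are assumed)."""
    totals = {
        key: sum(s[key] for s in stream_stats)
        for key in ("recordsEmitted", "bytesEmitted",
                    "estimatedRecords", "estimatedBytes")
    }
    return {
        "jobId": job_id,
        "attemptNumber": attempt_number,
        "stats": totals,          # aggregate across all streams
        "streamStats": stream_stats,  # per-stream breakdown
    }
```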
* Bump version for redshift, bigquery, and snowflake
* auto-bump connector version
* auto-bump connector version
* Log failed refresh token response
* Revert snowflake version bump
Co-authored-by: Octavia Squidington III <octavia-squidington-iii@users.noreply.github.com>
I am going to edit some of the wording in later PRs, but merging this for now.
* added multicloud info
* Update docs/cloud/getting-started-with-airbyte-cloud.md
Co-authored-by: Joey Marshment-Howell <josephkmh@users.noreply.github.com>
* edited for clarity
* incorporated suggestions
Co-authored-by: Joey Marshment-Howell <josephkmh@users.noreply.github.com>
* Add source-gridly
* Correct current_page init value
* Fix the first batch being fetched twice
* Remove `integration_tests/catalog.json` from source-gridly
* Allow selecting any view on a grid to sync records
* Correct documentationUrl for source gridly
* use class property for source endpoint instead of local variable
* Add tests and format code
* Add gridly.md docs file
* add gridly to source def
* auto-bump connector version
Co-authored-by: Tan Ho <th@localizedirect.com>
Co-authored-by: marcosmarxm <marcosmarxm@gmail.com>
Co-authored-by: Octavia Squidington III <octavia-squidington-iii@users.noreply.github.com>
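The `current_page` fix in the Gridly source above is the classic pagination off-by-one; a minimal illustrative sketch (not the actual connector code):

```python
def fetch_all(fetch_page, total_pages):
    """fetch_page(n) returns the list of records on 1-based page n."""
    records = []
    # Bug class being fixed: a wrong initial value (e.g. 0, with the
    # request clamping the page to a minimum of 1) makes page 1 be
    # requested on both of the first two iterations. Correct init:
    current_page = 1
    while current_page <= total_pages:
        records.extend(fetch_page(current_page))
        current_page += 1
    return records
```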
* New Source: Datadog
* Updating doc
* Adding unit tests
* Renaming limit var
* Updating description in spec
* add source def to seed
* add datadog to source def seed
* run format
* auto-bump connector version
Co-authored-by: marcosmarxm <marcosmarxm@gmail.com>
Co-authored-by: Octavia Squidington III <octavia-squidington-iii@users.noreply.github.com>
* Tmp
* Extract the Attempt API from the V1 API
* Add comments
* Move Connection API out of configuration API
* format
* format
* Rename to Controller
* Rename to Controller
* Add values to the factory
* Change the constructor to take the handler instead of the objects needed by the handler
* Update with new tags.
* tmp
* Fix PMD errors
* Extract DB migrator
* Add something that I forgot
* extract destination definition api
* restore destination factory initialization
* extract destination definition specification api
* format
* format
* format
* extract health check api
* extract jobs api
* fix test
* format
* Extract logs api
* Add missing declaration
* Fix build
* Tmp
* format and PR comments
* Extract notification API
* re-organize tags
* Extract all Oauth
* Fix PMD
* init commit
* add docs
* add docs
* Delete logs.txt
* add items
* fix comments
* fix comment
* fix acceptance test
* remove *state.json used for incremental imports test
* Add ignored_fields on listing to make the acceptance test pass
- the crypto market is VERY volatile; the data changes between two full imports when the test is run
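For reference, `ignored_fields` in a standard acceptance-test config looks roughly like this (the field name below is a placeholder, not taken from the actual connector config):

```yaml
tests:
  full_refresh:
    - config_path: "secrets/config.json"
      configured_catalog_path: "integration_tests/configured_catalog.json"
      # fields excluded when diffing two consecutive full reads,
      # because market data changes between runs
      ignored_fields:
        listing: ["quote"]
```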
* manually generate source_specs.yaml for coinmarketcap
Co-authored-by: Yiyang Li <yiyangli2010@gmail.com>
* 🎉 New Destination: Heap Analytics [python cdk]
- implement a heap client to load data via the server-side API: https://developers.heap.io/reference/server-side-apis-overview
- the connector supports a generic data source, and the api_type determines the output schema, which is dynamic
- users pick the columns that will be loaded to the destination
- Consequently, each configured catalog only includes one stream
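A minimal sketch of the body such a client might build for Heap's server-side track endpoint (the field choices here are illustrative; see the API docs linked above for the authoritative shape):

```python
import json

def build_track_body(app_id, identity, event, properties):
    """Build the JSON body for one event sent to Heap's server-side
    track API; `properties` holds the columns the user chose to load."""
    return json.dumps({
        "app_id": app_id,        # Heap project app id
        "identity": identity,    # user identifier
        "event": event,          # event name
        "properties": properties,
    })
```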
* add a bootstrap to illustrate the connector
* add destination dest def
* run format all files
* correct unit test
* auto-bump connector version
Co-authored-by: Vincent Koc <koconder@users.noreply.github.com>
Co-authored-by: marcosmarxm <marcosmarxm@gmail.com>
Co-authored-by: Octavia Squidington III <octavia-squidington-iii@users.noreply.github.com>
* Source postgres: encode database name
* Source postgres, mysql: move encoding in util class, apply for mysql
* Source postgres, mysql: make var final
* Source postgres, mysql: bump version
* Source postgres, mysql: format code
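The encoding fix above can be sketched as follows; this is an illustrative Python equivalent (the real change lives in the Java connectors' shared util class):

```python
from urllib.parse import quote

# Percent-encode the database name before placing it in the JDBC-style
# connection URL, so names with spaces or special characters still
# produce a valid URL.
def jdbc_url(host, port, database, scheme="postgresql"):
    return f"jdbc:{scheme}://{host}:{port}/{quote(database)}"
```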
* auto-bump connector version
* Source mysql: bump version
* auto-bump connector version
* auto-bump connector version
Co-authored-by: Octavia Squidington III <octavia-squidington-iii@users.noreply.github.com>
Removes a duplicate of the paragraph that begins "Note that we're also setting the `stream_cursor_field` in the stream's `$options`..." which followed an outdated schema.
* Source Intercom: change airbyte-cdk version to 0.2
* Source Intercom: update changelog
* Source Intercom: bump version and update changelog
* Source Intercom: Fix PR id in changelog
* auto-bump connector version
Co-authored-by: Octavia Squidington III <octavia-squidington-iii@users.noreply.github.com>