* Update 4-connection-checking.md
As discussed in https://airbytehq-team.slack.com/archives/C02UQ9MJ4GG/p1655312684160609 - this example is misleading, as it does not actually connect to any API to validate connectivity.
* Update 4-connection-checking.md
Added a link to the OneSignal check_connection implementation, and moved the "Note" to below the example
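The point of the fix above is that a `check_connection` should issue a real API request rather than only validating the config shape. A minimal sketch of that idea (hypothetical names; the request is injected as a callable so the example stays self-contained, and this is not the actual OneSignal connector code):

```python
from typing import Callable, Optional, Tuple


def check_connection(fetch_status: Callable[[], int]) -> Tuple[bool, Optional[str]]:
    """Validate connectivity by making a real API call (via fetch_status),
    instead of only checking that config fields are present."""
    try:
        status = fetch_status()
    except Exception as exc:  # network errors, DNS failures, etc.
        return False, f"Unable to reach the API: {exc}"
    if status == 200:
        return True, None
    return False, f"API returned status {status}"
```

In a real connector, `fetch_status` would be something like a `requests.get(...)` against a cheap authenticated endpoint using the supplied credentials.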
* Fix infinite loop when fetching Amplitude data
* Update changelog and version
* Address review
* fix: unit tests were failing
* chore: bump version in source definitions
* chore: update seed file
Co-authored-by: Harshith Mullapudi <harshithmullapudi@gmail.com>
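A common cause of the infinite loop fixed above is a pagination cursor that stops advancing while the client keeps requesting the next page. A defensive pattern (illustrative sketch only, not the actual Amplitude connector code; `get_page` is a hypothetical callable returning `(page, next_cursor)`):

```python
def fetch_pages(get_page, max_pages: int = 1000):
    """Paginate defensively: stop when the cursor is exhausted, repeats,
    or a hard page cap is hit, so a server that keeps returning the same
    cursor cannot loop forever."""
    seen_cursors = set()
    cursor = None
    pages = []
    for _ in range(max_pages):  # hard cap as a last-resort guard
        page, cursor = get_page(cursor)
        pages.append(page)
        if cursor is None or cursor in seen_cursors:
            break  # normal end, or the server stopped advancing the cursor
        seen_cursors.add(cursor)
    return pages
```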
* Update getting-started-with-airbyte-cloud.md
Edited Getting Started with Airbyte Cloud guide to match the updated Cloud UI.
* Update getting-started-with-airbyte-cloud.md
Updated based on Amruta's suggestions.
* Postgres Source: fixed truncated precision when the millisecond or second value is 0
* check CI with 1.15.3 testcontainer
* check CI with 1.15.3 testcontainer
* returned latest version of testcontainer
* fixed checkstyle
* fixed checkstyle
* returned latest testcontainer version
* updated CHANGELOG
* bump version
* auto-bump connector version
Co-authored-by: Octavia Squidington III <octavia-squidington-iii@users.noreply.github.com>
* Create interface, factory for metric client
* remove unused func
* change count val to use long
* PR fix
* otel metric client implementation
* resolve merge conflicts
* build fix
* add a test, moved version into deps catalog
* fix test
* add docs for open telemetry
* fix kube setting for otel, and add doc
* update Helm-related fields for OpenTelemetry
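The "interface, factory for metric client" commits above describe a classic pattern: one abstract client, several backends, and a factory that picks the configured one. A sketch in Python under assumed names (the real implementation is Java, and the OTel backend is stubbed here with an in-memory client so the example runs standalone):

```python
from abc import ABC, abstractmethod


class MetricClient(ABC):
    """Common interface so emitters don't depend on a concrete backend."""

    @abstractmethod
    def count(self, name: str, value: int) -> None:
        """Increment a counter; value is a (long) integer per the change above."""


class NoOpMetricClient(MetricClient):
    """Used when no metrics backend is configured."""

    def count(self, name: str, value: int) -> None:
        pass


class InMemoryMetricClient(MetricClient):
    """Stand-in for an OpenTelemetry-backed client (illustrative only)."""

    def __init__(self) -> None:
        self.counters = {}

    def count(self, name: str, value: int) -> None:
        self.counters[name] = self.counters.get(name, 0) + value


def get_metric_client(kind: str) -> MetricClient:
    """Factory: resolve the configured backend name to a concrete client."""
    if kind == "otel":
        return InMemoryMetricClient()
    return NoOpMetricClient()
```

Callers only ever hold a `MetricClient`, so swapping backends (or disabling metrics) is a config change, not a code change.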
* S3 destination: Update data type processing for Avro/Parquet formats
* S3 destination: handle comparing data types
* S3 destination: clean code
* S3 destination: clean code
* S3 destination: handle case with unexpected json schema type
* S3 destination: clean code
* S3 destination: Extract shared Avro/Parquet logic into a separate parent class
* S3 destination: clean code
* S3 destination: clean code
* GCS destination: Update data type processing for Avro/Parquet formats
* GCS destination: clean redundant code
* S3 destination: handle case with numbers inside array
* S3 destination: clean code
* S3 destination: add unit test
* S3 destination: update unit test cases with number types.
* S3 destination: update unit tests.
* S3 destination: bump version for s3 and gcs
* auto-bump connector version
* auto-bump connector version
Co-authored-by: Octavia Squidington III <octavia-squidington-iii@users.noreply.github.com>
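The S3/GCS commits above deal with mapping JSON schema types (including union lists and mixed numbers inside arrays) onto Avro/Parquet types. A rough sketch of the kind of mapping involved (hypothetical helper; the real converter covers many more cases and lives in the Java destination code):

```python
def to_avro_type(json_type):
    """Map a JSON schema `type` (a string or a union list) to a rough Avro type.
    Illustrative sketch only."""
    mapping = {
        "string": "string",
        "integer": "long",
        "number": "double",
        "boolean": "boolean",
        "null": "null",
    }
    if isinstance(json_type, list):
        # A union such as ["null", "integer", "number"], e.g. from numbers
        # inside an array: widen integer+number to double, keep null as a branch.
        members = {to_avro_type(t) for t in json_type}
        if {"long", "double"} <= members:
            members.discard("long")
        return sorted(members)
    # Unexpected/unknown JSON schema types fall back to string.
    return mapping.get(json_type, "string")
```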
* Fall back to parsing with or without a timezone if parsing a date or time string fails
* auto-bump connector version
Co-authored-by: Octavia Squidington III <octavia-squidington-iii@users.noreply.github.com>
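The fallback described above amounts to trying a list of formats in order, some timezone-aware and some naive, and only failing if none matches. A minimal sketch (format lists are assumptions for illustration, not the connector's actual set):

```python
from datetime import datetime

# Hypothetical format lists; real connectors accept more variants.
FORMATS_WITH_TZ = ["%Y-%m-%dT%H:%M:%S%z"]
FORMATS_WITHOUT_TZ = ["%Y-%m-%dT%H:%M:%S", "%Y-%m-%d"]


def parse_datetime(value: str) -> datetime:
    """Try timezone-aware formats first, then fall back to naive ones."""
    for fmt in FORMATS_WITH_TZ + FORMATS_WITHOUT_TZ:
        try:
            return datetime.strptime(value, fmt)
        except ValueError:
            continue  # wrong format, try the next one
    raise ValueError(f"Unrecognized date/time string: {value!r}")
```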
* initial changes
* Edited Google Sheets doc
* More edits
* edited the intro and prereqs for BigQuery
* edited the data loading section
* more edits
* Grammatical edits
* Formatting edits
* Bump `source-google-ads` to build for both AMD and ARM
* pin protobuf==3.14
* update readme
* #263 oncall: bump google ads version 15.1.1, protobuf 3.20.0
* auto-bump connector version
Co-authored-by: Denys Davydov <davydov.den18@gmail.com>
Co-authored-by: Octavia Squidington III <octavia-squidington-iii@users.noreply.github.com>
* #259 oncall: Source Hubspot - fix for the property_history stream, which did not emit any records
* #259 Source Hubspot: upd changelog
* #259 oncall: hubspot review fixes
* auto-bump connector version
Co-authored-by: Octavia Squidington III <octavia-squidington-iii@users.noreply.github.com>
* Rename Databricks connector
* Rename connector in the seed
* Update changelog with pr id
* auto-bump connector version
Co-authored-by: Octavia Squidington III <octavia-squidington-iii@users.noreply.github.com>