* Add namespace test for snowflake
* Enable namespace test for bigquery
* Format code
* Capitalize test case id
* Update exception message to point to test case file
* Update snowflake name transformer to prepend underscore
* Override convertStreamName instead of getIdentifier (see the sketch below)
* Add missing state message
* Remove unused import
* Disable more namespace test cases
We don't want to introduce changes that will affect existing connections for now.
* Dry method that mutates namespace
* Pass through null
* Normalize namespace
* Fix test case
* Revert consumer factory changes
* Normalize namespace in catalog
* Revert catalog normalization
* Enable namespace test for all snowflake destination tests
* Test namespace for both bigquery destination tests
* Add unit test for bigquery name transformer
* Transform bigquery schema name
* Fix avro name transformer
* Normalize avro namespace
* Standardize namespace in gcs utils
* Bump version for snowflake and bigquery
* Enable namespace test for bigquery denormalized
* Dry bigquery denormalized acceptance test
* Revert some of the variable scope change
* Fix unit test
* Bump version
* Introduce getNamespace method
* Implement getNamespace method for bigquery
* Switch to getNamespace methods
* Update comments
* Fix bigquery denormalized acceptance test
* Format code
* Dry bigquery destination test
* Skip partition test for gcs mode
* Bump version
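The namespace commits above hinge on routing namespace normalization through the same convertStreamName path used for stream names, passing nulls through and prepending an underscore when the result would start with a digit (Snowflake identifiers cannot begin with one). A minimal, self-contained sketch of that idea; the class name and the cleaning regex are illustrative, not the connector's actual code:

```java
// Illustrative sketch only; not the connector's actual implementation.
public class SnowflakeNameTransformerSketch {

  // Namespaces go through convertStreamName (rather than getIdentifier)
  // so that schema names and table names receive the same normalization.
  public String convertStreamName(final String input) {
    if (input == null) {
      return null; // pass nulls through untouched
    }
    final String cleaned = input.replaceAll("[^A-Za-z0-9_]", "_");
    if (cleaned.isEmpty()) {
      return cleaned;
    }
    // Snowflake identifiers cannot start with a digit, so prepend an underscore.
    return Character.isDigit(cleaned.charAt(0)) ? "_" + cleaned : cleaned;
  }

  public static void main(final String[] args) {
    final SnowflakeNameTransformerSketch t = new SnowflakeNameTransformerSketch();
    System.out.println(t.convertStreamName("2021_data"));    // _2021_data
    System.out.println(t.convertStreamName("my-namespace")); // my_namespace
  }
}
```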
* Refactor Snowflake internal staging into a model that shares staging abilities across JDBC destinations (see the interface sketch below)
* Switch the Snowflake copy destination to a staging destination based on internal staging
Co-authored-by: LiRen Tu <tuliren.git@outlook.com>
* Bump version of destination-snowflake
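One way to picture "sharing staging abilities" is a small interface that both the internal-staging and copy-style destinations could implement. This is purely illustrative; all names are hypothetical, not Airbyte's actual API:

```java
// Hypothetical staging contract; not Airbyte's real interface.
import java.nio.file.Path;

public interface StagingOperationsSketch {
  String getStageName(String namespace, String streamName);
  void createStageIfNotExists(String stageName) throws Exception;
  void uploadRecordsToStage(String stageName, Path recordsFile) throws Exception;
  void copyIntoTableFromStage(String stageName, String schemaName, String tableName) throws Exception;
  void dropStageIfExists(String stageName) throws Exception;
}
```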
* typo
* remove hint block
* Move required scopes to top level header
* Revert "remove hint block"
This reverts commit c7d070abbd.
* update
* anchor link
* update other links too
* reset
* anchor link
* Update doc
* com -> io
* Update docs/integrations/sources/hubspot.md
Co-authored-by: Sherif A. Nada <snadalive@gmail.com>
* only update doc
* Update changelog
* anchor link
* Bump version
* Update specs
* Capitalize
* link to where to find credentials
* update header level
* Capitalize
* Capitalize
* not all caps
* API Key
* Update cloud setup instructions
Co-authored-by: Sherif A. Nada <snadalive@gmail.com>
* fix for jdk 17
* fixed bug with missing records during S3 staging
* test
* add CHANGELOG
* add assertion using all staging files
* bump redshift version
Co-authored-by: vmaltsev <vitalii.maltsev@globallogic.com>
* Make SchedulerHandler store schema after fetching it
* Add `disable_cache` parameter to discover_schema API
* Return cached catalog if it already exists (see the sketch below)
* Address code review comments
* Add tests for caching of catalog in SchedulerHandler
* Format fixes
* Fix Acceptance tests
* New code review fixes
- Use upper case for global variable
- Inline definition and assignment of variable
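The catalog-caching commits above describe the flow: store the schema after a discover job finishes, and return the cached catalog unless the new disable_cache parameter is set. A hedged sketch of that logic; class, field, and method names are hypothetical (the real code lives in SchedulerHandler), and the catalog is a plain String to keep the example self-contained:

```java
// Hypothetical cache wrapper; not Airbyte's actual SchedulerHandler code.
import java.util.Map;
import java.util.UUID;
import java.util.concurrent.ConcurrentHashMap;

public class CatalogCacheSketch {

  private final Map<UUID, String> cachedCatalogsBySource = new ConcurrentHashMap<>();

  // discover_schema with a disable_cache parameter: return the stored catalog when
  // one exists and the caller did not ask to bypass the cache; otherwise run the
  // (expensive) discover job and store the result.
  public String discoverSchema(final UUID sourceId, final boolean disableCache) {
    final String cached = cachedCatalogsBySource.get(sourceId);
    if (cached != null && !disableCache) {
      return cached;
    }
    final String freshCatalog = runDiscoverJob(sourceId);
    cachedCatalogsBySource.put(sourceId, freshCatalog); // store the schema after fetching it
    return freshCatalog;
  }

  private String runDiscoverJob(final UUID sourceId) {
    return "{\"streams\": []}"; // placeholder for spinning up the connector's discover
  }
}
```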
* Set partition key on streams with id field
* reset to master
* Update readme with primary key
* This can be incremental
* I think this can also support incremental
* incremental streams
* incremental
* Missing comma
* Everything can be incremental
* set pk
* Add a primary key
* Add missing pk
* format
* Update doc
* Bump version
* Not everything can be incremental
* fix field
* Update pk
* Update source_specs
* Feat: first cut to allow naming for connections
* fix
* fix: migration
* fix: migration
* fix: formatting
* fix: formatting
* fix: tests
* fix: -> is a bit outside of what we do generally
* fix: tests are failing
* fix: tests are failing
* fix: tests are failing
* fix: tests are failing
* fix: tests are failing
* Handled search queries that would output more than 10K records
* Getting CRM search objects in ascending chronological order
* Fixed stream
* Fixed rebase
* Fixed condition
* Added unit test
* Removed unused import
* Started a new query after reaching 10K records (see the sketch below)
* Moved comment
* bump connector docker version
* bump connector version
* correct dockerfile version
* change doc version
Co-authored-by: lgomezm <luis@calixa.io>
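The commits above work around the CRM Search API's cap of 10,000 results per query by reading records in ascending order of their last-modified timestamp and starting a fresh query from the last seen timestamp whenever the cap is hit. The connector itself is Python; the sketch below illustrates the same restart loop in Java with hypothetical names:

```java
// Generic illustration of the restart loop; names are hypothetical.
// Assumes the API returns records oldest-first and caps any single query at 10,000 results.
import java.util.List;
import java.util.function.Consumer;

public class SearchRestartSketch {

  static final int SEARCH_RESULT_CAP = 10_000;

  interface SearchClient {
    // Up to `limit` records modified at or after `sinceMillis`, oldest first.
    List<Record> search(long sinceMillis, int limit);
  }

  record Record(String id, long lastModifiedMillis) {}

  static void readAll(final SearchClient client, final Consumer<Record> emit) {
    long since = 0L;
    while (true) {
      final List<Record> batch = client.search(since, SEARCH_RESULT_CAP);
      batch.forEach(emit);
      if (batch.size() < SEARCH_RESULT_CAP) {
        return; // fewer than the cap means the query was exhausted
      }
      // Cap was hit: start a new query from the newest timestamp seen so far.
      // (A real implementation also needs to dedupe records sharing that timestamp.)
      since = batch.get(batch.size() - 1).lastModifiedMillis();
    }
  }
}
```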
The original API used in the example is no longer free, so while following the tutorial I found a different API and updated the example code to match it. Also made a few small changes.
Update Build-a-connector tutorial
* Update screenshots and instructions to match current UI
* Update code examples to use Polygon.io API
* Add links to language-specific guides
* Consistently use Python (uppercase P) when referring to the language
and python (lowercase p) in code samples
* Consistently do not use "." in lists
* Add an image of Airbyte startup banner
* Add a note for M1 Macs
* Remove unused images
* Make Dockerfile consistent with the tutorial
* Add a test for listObjects permission to destination-s3 connector
* add testIAMUserHasListObjectPermission method to S3Destination
and call it from S3Destination::check. The method throws
an exception if the IAM user does not have the listObjects permission
on the destination bucket (sketched below)
* add a unit test to S3DestinationTest to verify that S3Destination::check
fails if listObjects throws an exception
* add a unit test to S3DestinationTest to verify that S3Destination::check
succeeds if listObjects succeeds
* Add S3DestinationConfigFactory in order to be able to mock S3 client
used in S3Destination::check
* Addressing review comments:
- separate positive and negative unit tests
- fix formatting
- reuse s3 client for both positive and negative tests
* Add information about PR #10856 to the changelog
* Prepare for publishing new version:
* Bump version to 0.2.10 in Dockerfile
* Bump version to 0.2.10 in changelog
* Update destination-s3 version in connector index
* Update seed spec for destination-s3 connector
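The permission probe described above can be sketched with the AWS SDK for Java (v1). testIAMUserHasListObjectPermission is the method name from the commits; the surrounding class is illustrative:

```java
// Illustrative wiring around the probe described in the commits above.
import com.amazonaws.services.s3.AmazonS3;
import com.amazonaws.services.s3.model.ListObjectsRequest;

public class S3PermissionCheckSketch {

  // Throws (e.g. an access-denied AmazonS3Exception) if the configured credentials
  // cannot list objects in the bucket, letting check() fail fast with a clear error.
  public static void testIAMUserHasListObjectPermission(final AmazonS3 s3Client, final String bucketName) {
    final ListObjectsRequest request = new ListObjectsRequest()
        .withBucketName(bucketName)
        .withMaxKeys(1); // a single key is enough to prove the permission exists
    s3Client.listObjects(request);
  }
}
```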
* enhanced performance for streams that run one request for each main item.
* removed unused types
* moved common code to StripeSubStream
* updated docs, updated docker version
* updated connector version in source_specs.yaml