* [faker] decouple stream state
* add PR #
* commit Stream instantiation changes
* fixup expected record
* skip backward test for this version too
* Apply suggestions from code review
Co-authored-by: Augustin <augustin@airbyte.io>
* lint
* Create realistic datasets of 10GB, 100GB, and 1TB in size (#20558)
* Faker CSV Streaming utilities
* readme
* don't do a final pipe to jq or you will run out of RAM
* doc
* Faker gets 250% faster (#20741)
* Faker is 250% faster
* threads in spec + lint
* pass tests
* revert changes to record helper
* cleanup
* update expected_records
* bump default records-per-slice to 1k
* enforce unique email addresses
* cleanup
* more comments
* add `parallelism` option and pass tests
* update expected records
* cleanup notes
* update readme
* update expected records
* auto-bump connector version
Co-authored-by: Augustin <augustin@airbyte.io>
Co-authored-by: Octavia Squidington III <octavia-squidington-iii@users.noreply.github.com>