[tools] prettier rules for .md + formatting cleanup
@@ -1 +1,3 @@
airbyte-integrations/bases/base-normalization/integration_tests/normalization_test_output
airbyte-ci/connectors/pipelines/tests/test_changelog/result_files
airbyte-integrations/bases/connector-acceptance-test/unit_tests/data/docs
@@ -3,8 +3,7 @@
{
  "files": "*.md",
  "options": {
    "printWidth": 100,
    "proseWrap": "always"
    "proseWrap": "preserve"
  }
}
]
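For context, the trailing `]` suggests this block sits inside a Prettier `overrides` array. A hedged sketch of what the full `.prettierrc` plausibly looks like after this change (the surrounding keys are an assumption, not shown in the hunk):

```json
{
  "overrides": [
    {
      "files": "*.md",
      "options": {
        "printWidth": 100,
        "proseWrap": "preserve"
      }
    }
  ]
}
```

With `proseWrap: "preserve"`, Prettier leaves existing line breaks in markdown prose alone instead of re-wrapping paragraphs at `printWidth`.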
@@ -1,2 +1,3 @@
# Code of conduct

View in [docs.airbyte.io](https://docs.airbyte.com/project-overview/code-of-conduct)
@@ -1,2 +1,3 @@
# Contributing

View on [docs.airbyte.io](https://docs.airbyte.io/contributing-to-airbyte)
CONTRIBUTORS.md
@@ -1,421 +1,421 @@
# Contributors

- [69mb](https://github.com/69mb)
- [a-honcharenko](https://github.com/a-honcharenko)
- [aadityasinha-dotcom](https://github.com/aadityasinha-dotcom)
- [aaronsteers](https://github.com/aaronsteers)
- [aazam-gh](https://github.com/aazam-gh)
- [abaerptc](https://github.com/abaerptc)
- [aballiet](https://github.com/aballiet)
- [achaussende](https://github.com/achaussende)
- [ad-m](https://github.com/ad-m)
- [adam-bloom](https://github.com/adam-bloom)
- [adamf](https://github.com/adamf)
- [adamschmidt](https://github.com/adamschmidt)
- [AetherUnbound](https://github.com/AetherUnbound)
- [afranzi](https://github.com/afranzi)
- [agrass](https://github.com/agrass)
- [ahmed-buksh](https://github.com/ahmed-buksh)
- [airbyte-jenny](https://github.com/airbyte-jenny)
- [ajmhatch](https://github.com/ajmhatch)
- [ajzo90](https://github.com/ajzo90)
- [akashkulk](https://github.com/akashkulk)
- [akulgoel96](https://github.com/akulgoel96)
- [alafanechere](https://github.com/alafanechere)
- [alallema](https://github.com/alallema)
- [albert-marrero](https://github.com/albert-marrero)
- [alex-danilin](https://github.com/alex-danilin)
- [alex-gron](https://github.com/alex-gron)
- [alexander-marquardt](https://github.com/alexander-marquardt)
- [AlexanderBatoulis](https://github.com/AlexanderBatoulis)
- [alexandertsukanov](https://github.com/alexandertsukanov)
- [alexandr-shegeda](https://github.com/alexandr-shegeda)
- [alexchouraki](https://github.com/alexchouraki)
- [AlexJameson](https://github.com/AlexJameson)
- [alexnikitchuk](https://github.com/alexnikitchuk)
- [Alihassanc5](https://github.com/Alihassanc5)
- [Allexik](https://github.com/Allexik)
- [alovew](https://github.com/alovew)
- [AM-I-Human](https://github.com/AM-I-Human)
- [amaliaroye](https://github.com/amaliaroye)
- [ambirdsall](https://github.com/ambirdsall)
- [aminamos](https://github.com/aminamos)
- [amitku](https://github.com/amitku)
- [Amruta-Ranade](https://github.com/Amruta-Ranade)
- [anamargaridarl](https://github.com/anamargaridarl)
- [andnig](https://github.com/andnig)
- [andresbravog](https://github.com/andresbravog)
- [andrewlreeve](https://github.com/andrewlreeve)
- [andreyAtBB](https://github.com/andreyAtBB)
- [andriikorotkov](https://github.com/andriikorotkov)
- [andrzejdackiewicz](https://github.com/andrzejdackiewicz)
- [andyjih](https://github.com/andyjih)
- [AndyTwiss](https://github.com/AndyTwiss)
- [animer3009](https://github.com/animer3009)
- [anna-geller](https://github.com/anna-geller)
- [annalvova05](https://github.com/annalvova05)
- [antixar](https://github.com/antixar)
- [antonioneto-hotmart](https://github.com/antonioneto-hotmart)
- [anujgupta0711](https://github.com/anujgupta0711)
- [Anurag870](https://github.com/Anurag870)
- [anushree-agrawal](https://github.com/anushree-agrawal)
- [apostoltego](https://github.com/apostoltego)
- [archangelic](https://github.com/archangelic)
- [arimbr](https://github.com/arimbr)
- [arnaudjnn](https://github.com/arnaudjnn)
- [ArneZsng](https://github.com/ArneZsng)
- [arsenlosenko](https://github.com/arsenlosenko)
- [artem1205](https://github.com/artem1205)
- [artusiep](https://github.com/artusiep)
- [asafepy](https://github.com/asafepy)
- [asyarif93](https://github.com/asyarif93)
- [augan-rymkhan](https://github.com/augan-rymkhan)
- [Auric-Manteo](https://github.com/Auric-Manteo)
- [avaidyanatha](https://github.com/avaidyanatha)
- [avida](https://github.com/avida)
- [avirajsingh7](https://github.com/avirajsingh7)
- [axaysagathiya](https://github.com/axaysagathiya)
- [azhard](https://github.com/azhard)
- [b4stien](https://github.com/b4stien)
- [bala-ceg](https://github.com/bala-ceg)
- [bazarnov](https://github.com/bazarnov)
- [bbugh](https://github.com/bbugh)
- [bcbeidel](https://github.com/bcbeidel)
- [bdashrad](https://github.com/bdashrad)
- [benmoriceau](https://github.com/benmoriceau)
- [BenoitFayolle](https://github.com/BenoitFayolle)
- [BenoitHugonnard](https://github.com/BenoitHugonnard)
- [bgroff](https://github.com/bgroff)
- [Bhupesh-V](https://github.com/Bhupesh-V)
- [BirdboyBolu](https://github.com/BirdboyBolu)
- [bjgbeelen](https://github.com/bjgbeelen)
- [bkrausz](https://github.com/bkrausz)
- [bleonard](https://github.com/bleonard)
- [bnchrch](https://github.com/bnchrch)
- [bobvanluijt](https://github.com/bobvanluijt)
- [brebuanirello-equinix](https://github.com/brebuanirello-equinix)
- [BrentSouza](https://github.com/BrentSouza)
- [brianjlai](https://github.com/brianjlai)
- [brunofaustino](https://github.com/brunofaustino)
- [bstrawson](https://github.com/bstrawson)
- [btkcodedev](https://github.com/btkcodedev)
- [burmecia](https://github.com/burmecia)
- [bzAmin](https://github.com/bzAmin)
- [calebfornari](https://github.com/calebfornari)
- [cameronwtaylor](https://github.com/cameronwtaylor)
- [camro](https://github.com/camro)
- [carlkibler](https://github.com/carlkibler)
- [carlonuccio](https://github.com/carlonuccio)
- [catpineapple](https://github.com/catpineapple)
- [cgardens](https://github.com/cgardens)
- [chadthman](https://github.com/chadthman)
- [chandrasekharan98](https://github.com/chandrasekharan98)
- [ChristoGrab](https://github.com/ChristoGrab)
- [ChristopheDuong](https://github.com/ChristopheDuong)
- [ciancullinan](https://github.com/ciancullinan)
- [cirdes](https://github.com/cirdes)
- [cjwooo](https://github.com/cjwooo)
- [clnoll](https://github.com/clnoll)
- [cobobrien](https://github.com/cobobrien)
- [coetzeevs](https://github.com/coetzeevs)
- [colesnodgrass](https://github.com/colesnodgrass)
- [collinscangarella](https://github.com/collinscangarella)
- [cpdeethree](https://github.com/cpdeethree)
- [CrafterKolyan](https://github.com/CrafterKolyan)
- [cstruct](https://github.com/cstruct)
- [ct-martin](https://github.com/ct-martin)
- [cuyk](https://github.com/cuyk)
- [cynthiaxyin](https://github.com/cynthiaxyin)
- [CyprienBarbault](https://github.com/CyprienBarbault)
- [czuares](https://github.com/czuares)
- [Daemonxiao](https://github.com/Daemonxiao)
- [dainiussa](https://github.com/dainiussa)
- [dalo390](https://github.com/dalo390)
- [damianlegawiec](https://github.com/damianlegawiec)
- [dandpz](https://github.com/dandpz)
- [daniel-cortez-stevenson](https://github.com/daniel-cortez-stevenson)
- [danieldiamond](https://github.com/danieldiamond)
- [Danucas](https://github.com/Danucas)
- [danvass](https://github.com/danvass)
- [darian-heede](https://github.com/darian-heede)
- [darynaishchenko](https://github.com/darynaishchenko)
- [DavidSpek](https://github.com/DavidSpek)
- [davinchia](https://github.com/davinchia)
- [davydov-d](https://github.com/davydov-d)
- [dbyzero](https://github.com/dbyzero)
- [ddoyediran](https://github.com/ddoyediran)
- [deepansh96](https://github.com/deepansh96)
- [delenamalan](https://github.com/delenamalan)
- [denis-sokolov](https://github.com/denis-sokolov)
- [dependabot[bot]](https://github.com/apps/dependabot)
- [dictcp](https://github.com/dictcp)
- [didistars328](https://github.com/didistars328)
- [digambar-t7](https://github.com/digambar-t7)
- [dijonkitchen](https://github.com/dijonkitchen)
- [dizel852](https://github.com/dizel852)
- [dmateusp](https://github.com/dmateusp)
- [domzae](https://github.com/domzae)
- [DoNotPanicUA](https://github.com/DoNotPanicUA)
- [Dracyr](https://github.com/Dracyr)
- [drrest](https://github.com/drrest)
- [dtt101](https://github.com/dtt101)
- [edbizarro](https://github.com/edbizarro)
- [edgao](https://github.com/edgao)
- [edmundito](https://github.com/edmundito)
- [efimmatytsin](https://github.com/efimmatytsin)
- [eliziario](https://github.com/eliziario)
- [elliottrabac](https://github.com/elliottrabac)
- [emmaling27](https://github.com/emmaling27)
- [erica-airbyte](https://github.com/erica-airbyte)
- [erohmensing](https://github.com/erohmensing)
- [etsybaev](https://github.com/etsybaev)
- [eugene-kulak](https://github.com/eugene-kulak)
- [evantahler](https://github.com/evantahler)
- [ffabss](https://github.com/ffabss)
- [flash1293](https://github.com/flash1293)
- [franviera92](https://github.com/franviera92)
- [freimer](https://github.com/freimer)
- [FUT](https://github.com/FUT)
- [gaart](https://github.com/gaart)
- [ganpatagarwal](https://github.com/ganpatagarwal)
- [gargatuma](https://github.com/gargatuma)
- [gergelylendvai](https://github.com/gergelylendvai)
- [girarda](https://github.com/girarda)
- [git-phu](https://github.com/git-phu)
- [github-actions[bot]](https://github.com/apps/github-actions)
- [Gitznik](https://github.com/Gitznik)
- [gordalina](https://github.com/gordalina)
- [gosusnp](https://github.com/gosusnp)
- [grebessi](https://github.com/grebessi)
- [grishick](https://github.com/grishick)
- [grubberr](https://github.com/grubberr)
- [gvillafanetapia](https://github.com/gvillafanetapia)
- [h7kanna](https://github.com/h7kanna)
- [haithem-souala](https://github.com/haithem-souala)
- [haoranyu](https://github.com/haoranyu)
- [harshithmullapudi](https://github.com/harshithmullapudi)
- [heade](https://github.com/heade)
- [hehex9](https://github.com/hehex9)
- [helderco](https://github.com/helderco)
- [henriblancke](https://github.com/henriblancke)
- [Hesperide](https://github.com/Hesperide)
- [hillairet](https://github.com/hillairet)
- [himanshuc3](https://github.com/himanshuc3)
- [hntan](https://github.com/hntan)
- [htrueman](https://github.com/htrueman)
- [hydrosquall](https://github.com/hydrosquall)
- [iberchid](https://github.com/iberchid)
- [igrankova](https://github.com/igrankova)
- [igsaf2](https://github.com/igsaf2)
- [Imbruced](https://github.com/Imbruced)
- [irynakruk](https://github.com/irynakruk)
- [isaacharrisholt](https://github.com/isaacharrisholt)
- [isalikov](https://github.com/isalikov)
- [itaseskii](https://github.com/itaseskii)
- [jacqueskpoty](https://github.com/jacqueskpoty)
- [Jagrutiti](https://github.com/Jagrutiti)
- [jamakase](https://github.com/jamakase)
- [jartek](https://github.com/jartek)
- [jbfbell](https://github.com/jbfbell)
- [jcowanpdx](https://github.com/jcowanpdx)
- [jdclarke5](https://github.com/jdclarke5)
- [jdpgrailsdev](https://github.com/jdpgrailsdev)
- [jeremySrgt](https://github.com/jeremySrgt)
- [jhajajaas](https://github.com/jhajajaas)
- [jhammarstedt](https://github.com/jhammarstedt)
- [jnr0790](https://github.com/jnr0790)
- [joelluijmes](https://github.com/joelluijmes)
- [johnlafleur](https://github.com/johnlafleur)
- [JonsSpaghetti](https://github.com/JonsSpaghetti)
- [jonstacks](https://github.com/jonstacks)
- [jordan-glitch](https://github.com/jordan-glitch)
- [josephkmh](https://github.com/josephkmh)
- [jrhizor](https://github.com/jrhizor)
- [juliachvyrova](https://github.com/juliachvyrova)
- [JulianRommel](https://github.com/JulianRommel)
- [juliatournant](https://github.com/juliatournant)
- [justinbchau](https://github.com/justinbchau)
- [juweins](https://github.com/juweins)
- [jzcruiser](https://github.com/jzcruiser)
- [kaklakariada](https://github.com/kaklakariada)
- [karinakuz](https://github.com/karinakuz)
- [kattos-aws](https://github.com/kattos-aws)
- [KayakinKoder](https://github.com/KayakinKoder)
- [keu](https://github.com/keu)
- [kgrover](https://github.com/kgrover)
- [kimerinn](https://github.com/kimerinn)
- [koconder](https://github.com/koconder)
- [koji-m](https://github.com/koji-m)
- [krishnaglick](https://github.com/krishnaglick)
- [krisjan-oldekamp](https://github.com/krisjan-oldekamp)
- [ksengers](https://github.com/ksengers)
- [kzzzr](https://github.com/kzzzr)
- [lazebnyi](https://github.com/lazebnyi)
- [leo-schick](https://github.com/leo-schick)
- [letiescanciano](https://github.com/letiescanciano)
- [lgomezm](https://github.com/lgomezm)
- [lideke](https://github.com/lideke)
- [lizdeika](https://github.com/lizdeika)
- [lmossman](https://github.com/lmossman)
- [maciej-nedza](https://github.com/maciej-nedza)
- [macmv](https://github.com/macmv)
- [Mainara](https://github.com/Mainara)
- [makalaaneesh](https://github.com/makalaaneesh)
- [makyash](https://github.com/makyash)
- [malikdiarra](https://github.com/malikdiarra)
- [marcelopio](https://github.com/marcelopio)
- [marcosmarxm](https://github.com/marcosmarxm)
- [mariamthiam](https://github.com/mariamthiam)
- [masonwheeler](https://github.com/masonwheeler)
- [masyagin1998](https://github.com/masyagin1998)
- [matter-q](https://github.com/matter-q)
- [maxi297](https://github.com/maxi297)
- [MaxKrog](https://github.com/MaxKrog)
- [mdibaiee](https://github.com/mdibaiee)
- [mfsiega-airbyte](https://github.com/mfsiega-airbyte)
- [michaelnguyen26](https://github.com/michaelnguyen26)
- [michel-tricot](https://github.com/michel-tricot)
- [mickaelandrieu](https://github.com/mickaelandrieu)
- [midavadim](https://github.com/midavadim)
- [mildbyte](https://github.com/mildbyte)
- [misteryeo](https://github.com/misteryeo)
- [mkhokh-33](https://github.com/mkhokh-33)
- [mlavoie-sm360](https://github.com/mlavoie-sm360)
- [mmolimar](https://github.com/mmolimar)
- [mohamagdy](https://github.com/mohamagdy)
- [mohitreddy1996](https://github.com/mohitreddy1996)
- [monai](https://github.com/monai)
- [mrhallak](https://github.com/mrhallak)
- [Muriloo](https://github.com/Muriloo)
- [mustangJaro](https://github.com/mustangJaro)
- [Mykyta-Serbynevskyi](https://github.com/Mykyta-Serbynevskyi)
- [n0rritt](https://github.com/n0rritt)
- [nastra](https://github.com/nastra)
- [nataliekwong](https://github.com/nataliekwong)
- [natalyjazzviolin](https://github.com/natalyjazzviolin)
- [nauxliu](https://github.com/nauxliu)
- [nguyenaiden](https://github.com/nguyenaiden)
- [NipunaPrashan](https://github.com/NipunaPrashan)
- [Nmaxime](https://github.com/Nmaxime)
- [noahkawasaki-airbyte](https://github.com/noahkawasaki-airbyte)
- [noahkawasakigoogle](https://github.com/noahkawasakigoogle)
- [novotl](https://github.com/novotl)
- [ntucker](https://github.com/ntucker)
- [octavia-squidington-iii](https://github.com/octavia-squidington-iii)
- [olivermeyer](https://github.com/olivermeyer)
- [omid](https://github.com/omid)
- [oreopot](https://github.com/oreopot)
- [pabloescoder](https://github.com/pabloescoder)
- [panhavad](https://github.com/panhavad)
- [pecalleja](https://github.com/pecalleja)
- [pedroslopez](https://github.com/pedroslopez)
- [perangel](https://github.com/perangel)
- [peter279k](https://github.com/peter279k)
- [PhilipCorr](https://github.com/PhilipCorr)
- [philippeboyd](https://github.com/philippeboyd)
- [Phlair](https://github.com/Phlair)
- [pmossman](https://github.com/pmossman)
- [po3na4skld](https://github.com/po3na4skld)
- [PoCTo](https://github.com/PoCTo)
- [postamar](https://github.com/postamar)
- [prasrvenkat](https://github.com/prasrvenkat)
- [prateekmukhedkar](https://github.com/prateekmukhedkar)
- [proprefenetre](https://github.com/proprefenetre)
- [Pwaldi](https://github.com/Pwaldi)
- [rach-r](https://github.com/rach-r)
- [ramonvermeulen](https://github.com/ramonvermeulen)
- [ReptilianBrain](https://github.com/ReptilianBrain)
- [rileybrook](https://github.com/rileybrook)
- [RobertoBonnet](https://github.com/RobertoBonnet)
- [robgleason](https://github.com/robgleason)
- [RobLucchi](https://github.com/RobLucchi)
- [rodireich](https://github.com/rodireich)
- [roisinbolt](https://github.com/roisinbolt)
- [roman-romanov-o](https://github.com/roman-romanov-o)
- [roman-yermilov-gl](https://github.com/roman-yermilov-gl)
- [ron-damon](https://github.com/ron-damon)
- [rparrapy](https://github.com/rparrapy)
- [ryankfu](https://github.com/ryankfu)
- [sajarin](https://github.com/sajarin)
- [samos123](https://github.com/samos123)
- [sarafonseca-123](https://github.com/sarafonseca-123)
- [sashaNeshcheret](https://github.com/sashaNeshcheret)
- [SatishChGit](https://github.com/SatishChGit)
- [sbjorn](https://github.com/sbjorn)
- [schlattk](https://github.com/schlattk)
- [scottleechua](https://github.com/scottleechua)
- [sdairs](https://github.com/sdairs)
- [sergei-solonitcyn](https://github.com/sergei-solonitcyn)
- [sergio-ropero](https://github.com/sergio-ropero)
- [sh4sh](https://github.com/sh4sh)
- [shadabshaukat](https://github.com/shadabshaukat)
- [sherifnada](https://github.com/sherifnada)
- [Shishir-rmv](https://github.com/Shishir-rmv)
- [shrodingers](https://github.com/shrodingers)
- [shyngysnurzhan](https://github.com/shyngysnurzhan)
- [siddhant3030](https://github.com/siddhant3030)
- [sivankumar86](https://github.com/sivankumar86)
- [snyk-bot](https://github.com/snyk-bot)
- [SofiiaZaitseva](https://github.com/SofiiaZaitseva)
- [sophia-wiley](https://github.com/sophia-wiley)
- [SPTKL](https://github.com/SPTKL)
- [subhamX](https://github.com/subhamX)
- [subodh1810](https://github.com/subodh1810)
- [suhomud](https://github.com/suhomud)
- [supertopher](https://github.com/supertopher)
- [swyxio](https://github.com/swyxio)
- [tbcdns](https://github.com/tbcdns)
- [tealjulia](https://github.com/tealjulia)
- [terencecho](https://github.com/terencecho)
- [thanhlmm](https://github.com/thanhlmm)
- [thomas-vl](https://github.com/thomas-vl)
- [timroes](https://github.com/timroes)
- [tirth7777777](https://github.com/tirth7777777)
- [tjirab](https://github.com/tjirab)
- [tkorenko](https://github.com/tkorenko)
- [tolik0](https://github.com/tolik0)
- [topefolorunso](https://github.com/topefolorunso)
- [trowacat](https://github.com/trowacat)
- [tryangul](https://github.com/tryangul)
- [TSkrebe](https://github.com/TSkrebe)
- [tuanchris](https://github.com/tuanchris)
- [tuliren](https://github.com/tuliren)
- [tyagi-data-wizard](https://github.com/tyagi-data-wizard)
- [tybernstein](https://github.com/tybernstein)
- [TymoshokDmytro](https://github.com/TymoshokDmytro)
- [tyschroed](https://github.com/tyschroed)
- [ufou](https://github.com/ufou)
- [Upmitt](https://github.com/Upmitt)
- [VitaliiMaltsev](https://github.com/VitaliiMaltsev)
- [vitaliizazmic](https://github.com/vitaliizazmic)
- [vladimir-remar](https://github.com/vladimir-remar)
- [vovavovavovavova](https://github.com/vovavovavovavova)
- [wallies](https://github.com/wallies)
- [winar-jin](https://github.com/winar-jin)
- [wissevrowl](https://github.com/wissevrowl)
- [Wittiest](https://github.com/Wittiest)
- [wjwatkinson](https://github.com/wjwatkinson)
- [Xabilahu](https://github.com/Xabilahu)
- [xiaohansong](https://github.com/xiaohansong)
- [xpuska513](https://github.com/xpuska513)
- [yahu98](https://github.com/yahu98)
- [yannibenoit](https://github.com/yannibenoit)
- [yaroslav-dudar](https://github.com/yaroslav-dudar)
- [yaroslav-hrytsaienko](https://github.com/yaroslav-hrytsaienko)
- [YatsukBogdan1](https://github.com/YatsukBogdan1)
- [ycherniaiev](https://github.com/ycherniaiev)
- [yevhenii-ldv](https://github.com/yevhenii-ldv)
- [YiyangLi](https://github.com/YiyangLi)
- [YowanR](https://github.com/YowanR)
- [yuhuishi-convect](https://github.com/yuhuishi-convect)
- [yurii-bidiuk](https://github.com/yurii-bidiuk)
- [Zawar92](https://github.com/Zawar92)
- [zestyping](https://github.com/zestyping)
- [Zirochkaa](https://github.com/Zirochkaa)
- [zkid18](https://github.com/zkid18)
- [zuc](https://github.com/zuc)
- [zzstoatzz](https://github.com/zzstoatzz)
- [zzztimbo](https://github.com/zzztimbo)

```shell
p=1;
```
README.md
@@ -34,11 +34,12 @@ We believe that only an **open-source solution to data movement** can cover the
_Screenshot taken from [Airbyte Cloud](https://cloud.airbyte.com/signup)_.

### Getting Started

* [Deploy Airbyte Open Source](https://docs.airbyte.com/quickstart/deploy-airbyte) or set up [Airbyte Cloud](https://docs.airbyte.com/cloud/getting-started-with-airbyte-cloud) to start centralizing your data.
* Create connectors in minutes with our [no-code Connector Builder](https://docs.airbyte.com/connector-development/connector-builder-ui/overview) or [low-code CDK](https://docs.airbyte.com/connector-development/config-based/low-code-cdk-overview).
* Explore popular use cases in our [tutorials](https://airbyte.com/tutorials).
* Orchestrate Airbyte syncs with [Airflow](https://docs.airbyte.com/operator-guides/using-the-airflow-airbyte-operator), [Prefect](https://docs.airbyte.com/operator-guides/using-prefect-task), [Dagster](https://docs.airbyte.com/operator-guides/using-dagster-integration), [Kestra](https://docs.airbyte.com/operator-guides/using-kestra-plugin) or the [Airbyte API](https://reference.airbyte.com/reference/start).
* Easily transform loaded data with [SQL](https://docs.airbyte.com/operator-guides/transformation-and-normalization/transformations-with-sql) or [dbt](https://docs.airbyte.com/operator-guides/transformation-and-normalization/transformations-with-dbt).

- [Deploy Airbyte Open Source](https://docs.airbyte.com/quickstart/deploy-airbyte) or set up [Airbyte Cloud](https://docs.airbyte.com/cloud/getting-started-with-airbyte-cloud) to start centralizing your data.
- Create connectors in minutes with our [no-code Connector Builder](https://docs.airbyte.com/connector-development/connector-builder-ui/overview) or [low-code CDK](https://docs.airbyte.com/connector-development/config-based/low-code-cdk-overview).
- Explore popular use cases in our [tutorials](https://airbyte.com/tutorials).
- Orchestrate Airbyte syncs with [Airflow](https://docs.airbyte.com/operator-guides/using-the-airflow-airbyte-operator), [Prefect](https://docs.airbyte.com/operator-guides/using-prefect-task), [Dagster](https://docs.airbyte.com/operator-guides/using-dagster-integration), [Kestra](https://docs.airbyte.com/operator-guides/using-kestra-plugin) or the [Airbyte API](https://reference.airbyte.com/reference/start).
- Easily transform loaded data with [SQL](https://docs.airbyte.com/operator-guides/transformation-and-normalization/transformations-with-sql) or [dbt](https://docs.airbyte.com/operator-guides/transformation-and-normalization/transformations-with-dbt).

Try it out yourself with our [demo app](https://demo.airbyte.io/), visit our [full documentation](https://docs.airbyte.com/) and learn more about [recent announcements](https://airbyte.com/blog-categories/company-updates). See our [registry](https://connectors.airbyte.com/files/generated_reports/connector_registry_report.html) for a full list of connectors already available in Airbyte or Airbyte Cloud.
@@ -173,7 +173,7 @@ corresponds to that version.
### Java CDK

| Version | Date | Pull Request | Subject |
|:--------| :--------- | :--------------------------------------------------------- |:---------------------------------------------------------------------------------------------------------------------------------------------------------------|
| :------ | :--------- | :--------------------------------------------------------- | :------------------------------------------------------------------------------------------------------------------------------------------------------------- |
| 0.33.1 | 2024-05-03 | [\#37824](https://github.com/airbytehq/airbyte/pull/37824) | Add a unit test for cursor based sync |
| 0.33.0 | 2024-05-03 | [\#36935](https://github.com/airbytehq/airbyte/pull/36935) | Destinations: Enable non-safe-casting DV2 tests |
| 0.32.0 | 2024-05-03 | [\#36929](https://github.com/airbytehq/airbyte/pull/36929) | Destinations: Assorted DV2 changes for mysql |
@@ -1,10 +1,13 @@
# Developing an SSH Connector

## Goal

Easy development of any connector that needs the ability to connect to a resource via an SSH tunnel.

## Overview

Our SSH connector support is designed to be easy to plug into any existing connector. There are a few major pieces to consider:

1. Add SSH Configuration to the Spec - for SSH, we need to take in additional configuration, so we need to inject extra fields into the connector configuration.
2. Add SSH Logic to the Connector - before the connector code begins to execute we need to start an SSH tunnel. This library provides logic to create that tunnel (and clean it up).
3. Acceptance Testing - it is a good practice to include acceptance testing for the SSH version of a connector for at least one of the SSH types (password or SSH key). While unit testing for the SSH functionality exists in this package (coming soon), high-level acceptance testing to make sure this feature works with the individual connector belongs in the connector.
@@ -12,40 +15,47 @@ Our SSH connector support is designed to be easy to plug into any existing conne
## How To

### Add SSH Configuration to the Spec

1. The `SshHelpers` class provides 2 helper functions that inject the SSH configuration objects into a spec JsonSchema for an existing connector. Usually the `spec()` method for a connector looks like `Jsons.deserialize(MoreResources.readResource("spec.json"), ConnectorSpecification.class);`. These helpers just inject the ssh spec (`ssh-tunnel-spec.json`) into that spec (see the sketch after this list).
2. You may need to update tests to reflect that new fields have been added to the spec. Usually updating the tests just requires using these helpers in the tests.
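As a rough illustration of step 1, a minimal sketch of a `spec()` method that injects the SSH options. The helper name `injectSshIntoSpec` follows this doc's description, and the import paths are assumptions; the real `SshHelpers` API may differ:

```java
import io.airbyte.commons.json.Jsons;
import io.airbyte.commons.resources.MoreResources;
import io.airbyte.integrations.base.ssh.SshHelpers; // package path assumed
import io.airbyte.protocol.models.ConnectorSpecification;

public class MySource {
  // Sketch only: load the connector's own spec.json, then let SshHelpers merge
  // the ssh-tunnel-spec.json fields into it before returning it.
  public ConnectorSpecification spec() throws Exception {
    final ConnectorSpecification original =
        Jsons.deserialize(MoreResources.readResource("spec.json"), ConnectorSpecification.class);
    return SshHelpers.injectSshIntoSpec(original); // helper name is an assumption
  }
}
```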
### Add SSH Logic to the Connector

1. This package provides a Source-decorating class to make it easy to add SSH logic to an existing source. Simply pass the source you want to wrap into the constructor of the `SshWrappedSource`. That class also requires two other fields: `hostKey` and `portKey`. Both of these fields are pointers to fields in the connector specification. The `hostKey` is a pointer to the field that holds the host of the resource you want to connect to, and `portKey` is the same for the port. In a simple case, where the host name for a connector is just defined in the top-level `host` field, `hostKey` would simply be `["host"]`. If that field is nested, however, then it might be `["database", "configuration", "host"]`. A sketch follows below.
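A hedged sketch of wiring this up in a connector entrypoint, in the style of Airbyte's Java connectors. `MySource` is a placeholder, and the `["host"]`/`["port"]` keys are the simple top-level case described above:

```java
import io.airbyte.integrations.base.IntegrationRunner;
import io.airbyte.integrations.base.Source;
import io.airbyte.integrations.base.ssh.SshWrappedSource;
import java.util.List;

public class MySourceRunner {
  public static void main(final String[] args) throws Exception {
    // Wrap the real source; hostKey/portKey point at the spec fields holding the
    // resource's host and port. Nested paths also work, e.g.
    // List.of("database", "configuration", "host").
    final Source source = new SshWrappedSource(new MySource(), List.of("host"), List.of("port"));
    new IntegrationRunner(source).run(args);
  }
}
```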
### Acceptance Testing

1. The only difference between existing acceptance testing and acceptance testing with SSH is that the configuration that is used for testing needs to contain additional fields. You can see the `Postgres Source ssh key creds` in lastpass to see an example of what that might look like. Those credentials leverage an existing bastion host in our test infrastructure. (As future work, we want to get rid of the need to use a static bastion server and instead do it in docker so we can run it all locally.)
## Misc

### How to wrap the protocol in an SSH Tunnel

For `spec()`, `check()`, and `discover()`, wrapping the connector in an SSH tunnel is easier to think about, because when they return all work is done and the tunnel can be closed. Thus, each of these methods can simply be wrapped in a try-with-resources of the SSH tunnel.

For `read()` and `write()`, they return an iterator and a consumer respectively that perform work that must happen within the SSH tunnel after the method has returned. Therefore, the `close` function on the iterator and the consumer has to handle closing the SSH tunnel; the methods themselves cannot just be wrapped in a try-with-resources. This is handled for you by the `SshWrappedSource`, but if you need to implement any of this manually you must take it into account. A sketch of the simple case is shown below.
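For the simple (`check`-style) case, a minimal sketch of the idea via `SshTunnel#sshWrap`, which this doc names; the exact signature used here is an assumption:

```java
import com.fasterxml.jackson.databind.JsonNode;
import io.airbyte.integrations.base.ssh.SshTunnel; // package path assumed
import io.airbyte.protocol.models.AirbyteConnectionStatus;
import java.util.List;

public class SshCheckExample {
  // Sketch only: sshWrap opens the tunnel, hands the lambda a config whose
  // host/port have been rewritten to localhost plus the forwarded port, and
  // closes the tunnel when the lambda returns.
  public AirbyteConnectionStatus check(final MySource delegate, final JsonNode config) throws Exception {
    return SshTunnel.sshWrap(config, List.of("host"), List.of("port"),
        localConfig -> delegate.check(localConfig));
  }
}
```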
### Name Mangling

One of the least intuitive pieces of the SSH setup to follow is the replacement of host names and ports. The reason `SshWrappedSource` needs to know how to get the hostname and port of the database you are trying to connect to is that when it builds the SSH tunnel that forwards to the database, it needs to know the hostname and port so that the tunnel forwards requests to the right place. After the SSH tunnel is established and forwarding to the database, the connector code itself runs.

There's a trick here though! The connector should NOT try to connect to the hostname and port of the database. Instead, it should be trying to connect to `localhost` and whatever port we are forwarding to the database. The `SshTunnel#sshWrap` removes the original host and port from the configuration for the connector and replaces them with `localhost` and the correct port, so from the connector code's point of view it is just operating on localhost.

There is a tradeoff here.

* (Good) The way we have structured this allows users to configure a connector in the UI in a way that is intuitive to them. They put in the host and port they normally use to refer to the database (they don't need to worry about any of the localhost versions).
* (Good) The connector code does not need to know anything about SSH; it can just operate on the host and port it gets (and we let the SSH tunnel handle swapping the names for us), which makes writing a connector easier.
* (Bad) The downside is that the `SshTunnel` logic is more complicated because it is absorbing all of this name swapping so that neither the user nor the connector developer needs to worry about it. In our estimation, the good outweighs the extra complexity incurred here.

- (Good) The way we have structured this allows users to configure a connector in the UI in a way that is intuitive to them. They put in the host and port they normally use to refer to the database (they don't need to worry about any of the localhost versions).
- (Good) The connector code does not need to know anything about SSH; it can just operate on the host and port it gets (and we let the SSH tunnel handle swapping the names for us), which makes writing a connector easier.
- (Bad) The downside is that the `SshTunnel` logic is more complicated because it is absorbing all of this name swapping so that neither the user nor the connector developer needs to worry about it. In our estimation, the good outweighs the extra complexity incurred here.
### Acceptance Testing via ssh tunnel using SshBastion and JdbcDatabaseContainer in Docker

1. The `SshBastion` class provides 3 helper functions (used in the sketch below):
   - `initAndStartBastion()` initializes and starts an SSH bastion server in a Docker test container and creates a new `Network` for the bastion and the JDBC container under test.
   - `getTunnelConfig()` returns a `JsonNode` with all the configuration necessary to establish the ssh tunnel. Connection configuration for integration tests is now taken directly from container settings and does not require a real database connection.
   - `stopAndCloseContainers()` stops and closes the `SshBastion` and `JdbcDatabaseContainer` at the end of the test.
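A hedged sketch of how a test might combine these three helpers with a Testcontainers database. Only the three helper names come from this doc; everything else (constructor shapes, `getNetwork()`, the argument to `stopAndCloseContainers`) is an assumption:

```java
import com.fasterxml.jackson.databind.JsonNode;
import org.testcontainers.containers.PostgreSQLContainer;

public class SshBastionAcceptanceTestSketch {
  private final SshBastion bastion = new SshBastion(); // constructor shape assumed
  private PostgreSQLContainer<?> db;

  public void setup() {
    // Start the bastion; per this doc it also creates the shared Docker network,
    // which the database under test then joins (getNetwork() is assumed).
    bastion.initAndStartBastion();
    db = new PostgreSQLContainer<>("postgres:13-alpine").withNetwork(bastion.getNetwork());
    db.start();
  }

  public JsonNode config() {
    // Tunnel and connection settings come straight from the containers,
    // so no real external database is required.
    return bastion.getTunnelConfig();
  }

  public void teardown() {
    bastion.stopAndCloseContainers(db); // argument assumed
  }
}
```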
## Future Work

* Add unit / integration testing for the `ssh` package.
* Restructure the spec so that instead of having `SSH Key Authentication` or `Password Authentication` options for `tunnel_method`, we just have an `SSH` option and then within that `SSH` option have a `oneOf` for password or key. This is blocked because we cannot use `oneOf`s nested in `oneOf`s.
* Improve the process of acceptance testing by allowing it to use a bastion running in a docker container instead of having to use dedicated infrastructure and a static database.

- Add unit / integration testing for the `ssh` package.
- Restructure the spec so that instead of having `SSH Key Authentication` or `Password Authentication` options for `tunnel_method`, we just have an `SSH` option and then within that `SSH` option have a `oneOf` for password or key. This is blocked because we cannot use `oneOf`s nested in `oneOf`s.
- Improve the process of acceptance testing by allowing it to use a bastion running in a docker container instead of having to use dedicated infrastructure and a static database.
@@ -11,12 +11,12 @@ To use these helpers, install the CDK with the `vector-db-based` extra:

pip install airbyte-cdk[vector-db-based]
```
The helpers can be used in the following way:

- Add the config models to the spec of the connector
- Implement the `Indexer` interface for your specific database
- In the check implementation of the destination, initialize the indexer and the embedder and call `check` on them
- In the write implementation of the destination, initialize the indexer and the embedder and pass them to a new instance of the writer. Then call the writer's `write` method with the iterable of incoming messages
If there are no connector-specific embedders, the `airbyte_cdk.destinations.vector_db_based.embedder.create_from_config` function can be used to get an embedder instance from the config.
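Putting those steps together, a write implementation might be wired roughly like this (a sketch only: `indexer` must be your database-specific `Indexer` implementation, and the exact `Writer` / `create_from_config` signatures are assumptions to check against your CDK version):

```python
# Hedged sketch of the flow described above, not a verbatim CDK API.
from airbyte_cdk.destinations.vector_db_based.embedder import create_from_config
from airbyte_cdk.destinations.vector_db_based.writer import Writer  # assumed module path

def write_records(config, indexer, configured_catalog, input_messages):
    # Assumption: the config carries "embedding" and "processing" sections,
    # matching the config models added to the spec.
    embedder = create_from_config(config.embedding, config.processing)
    writer = Writer(config.processing, indexer, embedder)  # assumed signature
    # The writer consumes the iterable of incoming messages and yields
    # state/record messages back to the platform.
    yield from writer.write(configured_catalog, input_messages)
```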
@@ -1,20 +1,24 @@

## Behavior

The Airbyte protocol defines the actions `spec`, `discover`, `check` and `read` for a source to be compliant. Here is the high-level description of the flow for a file-based source:

- spec: calls AbstractFileBasedSpec.documentation_url and AbstractFileBasedSpec.schema to return a ConnectorSpecification.
- discover: calls Source.streams, and subsequently Stream.get_json_schema; this uses Source.open_file to open files during schema discovery.
- check: Source.check_connection is called from the entrypoint code (in the main CDK).
- read: Stream.read_records calls Stream.list_files which calls Source.list_matching_files, and then also uses Source.open_file to parse records from the file handle.
## How to Implement Your Own

To create a file-based source a user must extend three classes – AbstractFileBasedSource, AbstractFileBasedSpec, and AbstractStreamReader – to create an implementation for the connector's specific storage system. They then initialize a FileBasedSource with the instance of AbstractStreamReader specific to their storage system.

The abstract classes house the vast majority of the logic required by file-based sources. For example, when extending AbstractStreamReader, users only have to implement three methods:

- list_matching_files: lists files matching the glob pattern(s) provided in the config.
- open_file: returns a file handle for reading.
- config property setter: concrete implementations of AbstractFileBasedStreamReader's config setter should assert that `value` is of the correct config type for that type of StreamReader.
The result is that an implementation of a source might look like this:

```
class CustomStreamReader(AbstractStreamReader):
    def open_file(self, remote_file: RemoteFile) -> FileHandler:
@@ -47,41 +51,50 @@ For more information, feel free to check the docstrings of each classes or check

## Supported File Types

### Avro

Avro is a serialization format developed by [Apache](https://avro.apache.org/docs/). Avro configuration options for the file-based CDK:

- `double_as_string`: Whether to convert double fields to strings. This is recommended if you have decimal numbers with a high degree of precision, because there can be a loss of precision when handling floating point numbers.
### CSV

CSV is a format loosely described by [RFC 4180](https://www.rfc-editor.org/rfc/rfc4180). The format is quite flexible, which leads to a ton of options to consider:

- `delimiter`: The character delimiting individual cells in the CSV data. By name, CSV is comma separated, so the default value is `,`.
- `quote_char`: When quoted fields are used, it is possible for a field to span multiple lines, even when line breaks appear within such a field. The default quote character is `"`.
- `escape_char`: The character used for escaping special characters.
- `encoding`: The character encoding of the file. By default, `UTF-8`.
- `double_quote`: Whether two quotes in a quoted CSV value denote a single quote in the data.
- `quoting_behavior`: The quoting behavior determines when a value in a row should have quote marks added around it.
- `skip_rows_before_header`: The number of rows to skip before the header row. For example, if the header row is on the 3rd row, enter 2 in this field.
- `skip_rows_after_header`: The number of rows to skip after the header row.
- `autogenerate_column_names`: If your CSV does not have a header row, the file-based CDK will need this enabled to generate column names.
- `null_values`: As CSV does not explicitly define a value for null values, the user can specify a set of case-sensitive strings that should be interpreted as null values.
- `true_values`: As CSV does not explicitly define a value for positive booleans, the user can specify a set of case-sensitive strings that should be interpreted as true values.
- `false_values`: As CSV does not explicitly define a value for negative booleans, the user can specify a set of case-sensitive strings that should be interpreted as false values.
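For illustration, a CSV format configuration using several of these options could look like the following (a hedged sketch shown as a Python dict; the exact nesting of the format section depends on the connector's spec):

```python
# Illustrative values only, set against the defaults described above.
csv_format = {
    "delimiter": ";",                 # semicolon-separated instead of ","
    "quote_char": '"',
    "encoding": "utf-8",
    "double_quote": True,
    "skip_rows_before_header": 2,     # header row is on the 3rd line
    "null_values": ["NULL", "N/A"],
    "true_values": ["yes", "1"],
    "false_values": ["no", "0"],
}
```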
### JSONL

[JSONL](https://jsonlines.org/) (or JSON Lines) is a format where each row is a JSON object. There are no configuration options for this format. For backward compatibility reasons, the JSONL parser currently supports multiline objects even though this is not part of the JSONL standard. Following some data gathering, we reserve the right to remove support for this. For files with multiline JSON objects, performance will be slow.

### Parquet

Parquet is a file format defined by [Apache](https://parquet.apache.org/). Configuration options are:

- `decimal_as_float`: Whether to convert decimal fields to floats. There is a loss of precision when converting decimals to floats, so this is not recommended.
### Document file types (PDF, DOCX, Markdown)

For file share source connectors, the `unstructured` parser can be used to parse document file types. The textual content of the whole file will be parsed as a single record with a `content` field containing the text encoded as Markdown.

To use the unstructured parser, the libraries `poppler` and `tesseract` need to be installed on the system running the connector. For example, on Ubuntu, you can install them with the following command:

```
apt-get install -y tesseract-ocr poppler-utils
```

On Mac, you can install these via brew:

```
brew install poppler
brew install tesseract
@@ -92,32 +105,35 @@ brew install tesseract

Having a schema allows the file-based CDK to take action when there is a discrepancy between a record and the expected types of the record's fields.

The schema can be either inferred or user-provided.

- If the user defines a schema using JSON types, inference will not apply. Input schemas are key/value pairs of strings describing column name and data type. Supported types are `["string", "number", "integer", "object", "array", "boolean", "null"]`. For example, `{"col1": "string", "col2": "boolean"}`.
- If the user enables schemaless sync, the schema will be `{"data": "object"}` and therefore emitted records will look like `{"data": {"col1": val1, …}}`. This is recommended if the contents between files in the stream vary significantly, and/or if data is very nested.
- Else, the file-based CDK will infer the schema depending on the file type. Some file formats define the schema as part of their metadata (like Parquet), some do so on the record level (like Avro) and some don't have any explicit typing (like JSON or CSV). Note that all CSV values are inferred as strings except where we are supporting legacy configurations. Any file format that does not define its schema on the metadata level will require the file-based CDK to iterate over a number of records. There is a limit on the number of bytes that will be consumed in order to infer the schema.
### Validation Policies

Users will be required to select one of 3 different options, in the event that records are encountered that don't conform to the schema.

- Skip nonconforming records: check each record to see if it conforms to the user-input or inferred schema; skip the record if it doesn't conform. We keep a count of the number of records in each file that do and do not conform and emit a log message with these counts once we're done reading the file.
- Emit all records: emit all records, even if they do not conform to the user-provided or inferred schema. Columns that don't exist in the configured catalog probably won't be available in the destination's table since that's the current behavior. Only error if there are conflicting field types or malformed rows.
- Stop the sync and wait for schema re-discovery: if a record is encountered that does not conform to the configured catalog's schema, we log a message and stop the whole sync. Note: this option is not recommended if the files have very different columns or datatypes, because the inferred schema may vary significantly at discover time.
When `schemaless` is enabled, validation will be skipped.
## Breaking Changes (compared to previous S3 implementation)

- [CSV] Mapping of types `array` and `object`: before, they were mapped as `large_string` and hence cast as strings. Given the new changes, if `array` or `object` is specified, the value will be cast as `array` or `object` respectively.
- [CSV] Before, a string value would not be considered as matching `null_values` if the column type was a string. We will now start to cast string columns with values matching `null_values` to null.
- [CSV] The `decimal_point` option is deprecated: it is no longer possible to use a character other than `.` to separate the integer part from the non-integer part. If a float is formatted with another character, it will be considered a string.
- [Parquet] The `columns` option is deprecated: you can use Airbyte column selection in order to have the same behavior. We don't expect it, but this could have an impact on performance as the payload could be bigger.
## Incremental syncs

The file-based connectors support the following [sync modes](https://docs.airbyte.com/cloud/core-concepts#connection-sync-modes):

| Feature                       | Supported? |
| :---------------------------- | :--------- |
| Full Refresh Sync             | Yes        |
| Incremental Sync              | Yes        |
| Replicate Incremental Deletes | No         |
@@ -128,6 +144,7 @@ The file-based connectors supports the following [sync modes](https://docs.airby

We recommend you do not manually modify files that are already synced. The connector has file-level granularity, which means adding or modifying a row in a CSV file will trigger a re-sync of the content of that file.

### Incremental sync

After the initial sync, the connector only pulls files that were modified since the last sync.

The connector checkpoints the connection state when it is done syncing all files for a given timestamp. The connection's state only keeps track of the last 10,000 files synced. If more than 10,000 files are synced, the connector won't be able to rely on the connection state to deduplicate files. In this case, the connector will initialize its cursor to the earlier of the oldest file in the history and 3 days ago.
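As a rough illustration of that fallback (the names here are ours, not the CDK's):

```python
# Hedged sketch of the cursor fallback when the file history is full.
from datetime import datetime, timedelta, timezone

def initial_cursor(files_in_history: dict[str, datetime]) -> datetime:
    three_days_ago = datetime.now(timezone.utc) - timedelta(days=3)
    if not files_in_history:
        return three_days_ago
    oldest_in_history = min(files_in_history.values())
    # Take whichever bound is earlier, per the behavior described above.
    return min(oldest_in_history, three_days_ago)
```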
@@ -1,7 +1,7 @@

## Breaking Changes & Limitations

- [bigger scope than Concurrent CDK] Checkpointing state was acting on the number of records per slice. This has been changed to consider the number of records per sync.
- `Source.read_state` and `Source._emit_legacy_state_format` are now classmethods to allow developers to have access to the state before instantiating the source.
- send_per_stream_state is always True for the Concurrent CDK.
- Using stream_state during read_records: the concern is that today, stream_instance.get_updated_state is called on every record and read_records on every slice. The implication is that the argument stream_state passed to read_records will have the value after the last stream_instance.get_updated_state of the previous slice. For the Concurrent CDK, this is not possible as slices are processed in an unordered way.
- Cursor fields can only be date-time values formatted as epoch. Eventually, we want to move to ISO 8601 as it provides more flexibility, but for the first iteration on Stripe it was easier to use the same format that was already used.
@@ -16,10 +16,12 @@ docs page and relation to other sections.

Each time a new module is added to the `airbyte-cdk/python/airbyte_cdk` module, you'll need to update the Sphinx rst schema.

Let's dive in using an example:

- Assuming we're going to add a new package `airbyte_cdk/new_package`;
- Let this package contain a few modules: `airbyte_cdk/new_package/module1.py` and `airbyte_cdk/new_package/module2.py`;
- The above structure should be in the `rst` config as follows:
- Add this block directly into `index.rst`:
```
.. toctree::
   :maxdepth: 2

@@ -27,7 +29,9 @@ Let's dive into using an example:

   api/airbyte_cdk.new_package
```

- Add a new file `api/airbyte_cdk.new_package.rst` with the following content:

```
Submodules
----------
@@ -67,7 +71,6 @@ To generate the docs,

run `python generate_rst_schema.py -o _source/api ../../python/airbyte_cdk -f -t _source/templates`
from the `airbyte-cdk/python/reference_docs` root.

## Building the docs locally

After the `rst` files are created to correctly represent the current project structure, you may build the docs locally.

@@ -77,7 +80,6 @@ This build could be useful on each `airbyte-cdk` update, especially if the packa

- Run `make html` from the `airbyte-cdk/python/reference_docs` root;
- Check out `airbyte-cdk/python/reference_docs/_build` for the newly built documentation.

## Publishing to Read the Docs

Our current Sphinx docs setup is meant to be published to [readthedocs](https://readthedocs.org/).
@@ -6,18 +6,17 @@ Our connector build pipeline ([`airbyte-ci`](https://github.com/airbytehq/airbyt

Our base images are declared in code, using the [Dagger Python SDK](https://dagger-io.readthedocs.io/en/sdk-python-v0.6.4/).

- [Python base image code declaration](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/base_images/base_images/python/bases.py)
- ~Java base image code declaration~ _TODO_

## Where are the Dockerfiles?

Our base images are not declared using Dockerfiles.
They are declared in code using the [Dagger Python SDK](https://dagger-io.readthedocs.io/en/sdk-python-v0.6.4/).
We prefer this approach because it allows us to interact with base image containers as code: we can use Python to declare the base images and use the full power of the language to build and test them.
However, we do artificially generate Dockerfiles for debugging and documentation purposes.
### Example for `airbyte/python-connector-base`:

```dockerfile
FROM docker.io/python:3.9.18-slim-bookworm@sha256:44b7f161ed03f85e96d423b9916cdc8cb0509fb970fd643bdbc9896d49e1cad0
RUN ln -snf /usr/share/zoneinfo/Etc/UTC /etc/localtime

@@ -31,28 +30,26 @@ RUN sh -c apt-get update && apt-get install -y tesseract-ocr=5.3.0-2 poppler-uti

RUN mkdir /usr/share/nltk_data
```
## Base images

### `airbyte/python-connector-base`

| Version | Published | Docker Image Address                                                                                                   | Changelog                                                                                            |
| ------- | --------- | ---------------------------------------------------------------------------------------------------------------------- | ------------------------------------------------------------------------------------------------------ |
| 1.2.0   | ✅        | docker.io/airbyte/python-connector-base:1.2.0@sha256:c22a9d97464b69d6ef01898edf3f8612dc11614f05a84984451dde195f337db9 | Add CDK system dependencies: nltk data, tesseract, poppler.                                          |
| 1.1.0   | ✅        | docker.io/airbyte/python-connector-base:1.1.0@sha256:bd98f6505c6764b1b5f99d3aedc23dfc9e9af631a62533f60eb32b1d3dbab20c | Install socat                                                                                        |
| 1.0.0   | ✅        | docker.io/airbyte/python-connector-base:1.0.0@sha256:dd17e347fbda94f7c3abff539be298a65af2d7fc27a307d89297df1081a45c27 | Initial release: based on Python 3.9.18, on slim-bookworm system, with pip==23.2.1 and poetry==1.6.1 |
## How to release a new base image version (example for Python)

### Requirements

- [Docker](https://docs.docker.com/get-docker/)
- [Poetry](https://python-poetry.org/docs/#installation)
- DockerHub credentials
### Steps

1. `poetry install`
2. Open `base_images/python/bases.py`.
3. Make changes to the `AirbytePythonConnectorBaseImage`; you're likely going to change the `get_container` method to change the base image.

@@ -61,25 +58,28 @@ RUN mkdir /usr/share/nltk_data

6. Cut a new base image version by running `poetry run generate-release`. You'll need your DockerHub credentials.
It will:

- Prompt you to pick which base image you'd like to publish.
- Prompt you for a major/minor/patch/pre-release version bump.
- Prompt you for a changelog message.
- Run the sanity checks on the new version.
- Optional: Publish the new version to DockerHub.
- Regenerate the docs and the registry json file.
7. Commit and push your changes.
8. Create a PR and ask for a review from the Connector Operations team.

**Please note that if you don't publish your image while cutting the new version, you can publish it later with `poetry run publish <repository> <version>`.**
No connector will use the new base image version until its metadata is updated to use it.
If you're not fully confident with the new base image version, please:

- publish it as a pre-release version
- try out the new version on a couple of connectors
- cut a new version with a major/minor/patch bump and publish it
- These steps can happen in different PRs.
## Running tests locally

```bash
poetry run pytest
# Static typing checks
@@ -1,6 +1,7 @@

# CI Credentials

CLI tooling to read and manage GSM secrets:

- `write-to-storage`: downloads a connector's secrets locally into the connector's `secrets` folder
- `update-secrets`: uploads new connector secret versions that were locally updated.
@@ -43,26 +44,31 @@ pipx install git+https://github.com/airbytehq/airbyte.git#subdirectory=airbyte-c

This command installs `ci_credentials` and makes it globally available in your terminal.

> [!Note]
>
> - `--force` is required to ensure updates are applied on subsequent installs.
> - `--python=python3.10` is required to ensure the correct Python version is used.
## Get GSM access

Download a service account JSON key that has access to Google Secrets Manager.
`ci_credentials` expects `GCP_GSM_CREDENTIALS` to be set in the environment to be able to access secrets.

### Create Service Account

- Go to https://console.cloud.google.com/iam-admin/serviceaccounts/create?project=dataline-integration-testing
- In step #1 `Service account details`, set a name and a relevant description
- In step #2 `Grant this service account access to project`, select role `Owner` (a more narrowly scoped role exists, but this choice follows the other `<user>-testing` service accounts)
### Create Service Account Token

- Go to https://console.cloud.google.com/iam-admin/serviceaccounts?project=dataline-integration-testing
- Find your service account and click on it
- Go to the "KEYS" tab
- Click on "ADD KEY -> Create new key" and select JSON. This will download a file to your computer
### Setup ci_credentials

- In your .zshrc, add: `export GCP_GSM_CREDENTIALS=$(cat <path to JSON file>)`
## Development

@@ -75,9 +81,11 @@ pipx install --editable airbyte-ci/connectors/ci_credentials/

This is useful when you are making changes to the package and want to test them in real-time.

> [!Note]
>
> - The package name is `ci_credentials`, not `airbyte-ci`. You will need this when uninstalling or reinstalling.
## Usage

After installation, you can use the `ci_credentials` command in your terminal.

## Run it

@@ -101,6 +109,7 @@ VERSION=dev ci_credentials all write-to-storage

```

### Update secrets

To upload to GSM newly updated configurations from `airbyte-integrations/connectors/source-bings-ads/secrets/updated_configurations`:

```bash
@@ -3,5 +3,6 @@

`common_utils` is a Python package that provides common utilities that are used in other `airbyte-ci` tools, such as `ci_credentials` and `base_images`.

Currently:

- Logger
- GCS API client
@@ -105,6 +105,7 @@ poe type_check

```bash
poe lint
```

## Changelog

### 1.3.1

@@ -120,6 +121,7 @@ Added `CheckConnectorMaxSecondsBetweenMessagesValue` check that verifies presenc

Added `ValidateBreakingChangesDeadlines` check that verifies minimal compliance with the breaking change rollout deadline.

### 1.1.0

Introduced the `Check.run_on_released_connectors` flag.

### 1.0.4

@@ -141,4 +143,5 @@ Fix access to connector types: it should be accessed from the `Connector.connect

- Make `CheckPublishToPyPiIsEnabled` run on source connectors only.

### 1.0.0

Initial release of the `connectors-qa` package.
@@ -5,15 +5,15 @@ These checks are running in our CI/CD pipeline and are used to ensure a connecto

Meeting these standards means that the connector can be safely integrated into the Airbyte platform and released to registries (DockerHub, PyPI etc.).
You can consider these checks as a set of guidelines to follow when developing a connector.
They are by no means a replacement for a manual review of the connector codebase and the implementation of good test suites.

{% for category, checks in checks_by_category.items() %}

## {{ category.value }}

{% for check in checks %}

### {{ check.name }}

_Applies to the following connector types: {{ ', '.join(check.applies_to_connector_types) }}_
_Applies to the following connector languages: {{ ', '.join(check.applies_to_connector_languages) }}_
_Applies to connectors with {{ ', '.join(check.applies_to_connector_support_levels) if check.applies_to_connector_support_levels else 'any' }} support level_

{{ check.description }}
{%- endfor %}
{%- endfor %}
@@ -3,12 +3,14 @@

This project contains utilities for running connector tests against live data.

## Requirements

- `docker`
- `Python ^3.10`
- `pipx`
- `poetry`

## Install

```bash
# From airbyte-ci/connectors/live-tests
poetry install
@@ -39,19 +41,22 @@ Options:

This command is made to run any of the following connector commands against one or multiple connector images.

**Available connector commands:**

- `spec`
- `check`
- `discover`
- `read` or `read_with_state` (requires a `--state-path` to be passed)
It will write artifacts to an output directory:

- `stdout.log`: The collected standard output following the command execution
- `stderr.log`: The collected standard error following the command execution
- `http_dump.txt`: An `mitmproxy` http stream log. Can be consumed with `mitmweb` (version `9.0.1`) for debugging.
- `airbyte_messages.db`: A DuckDB database containing the messages produced by the connector.
- `airbyte_messages`: A directory containing `.jsonl` files for each message type (logs, records, traces, controls, states etc.) produced by the connector.
#### Example

Let's run `debug` to check the output of `read` on two different versions of the same connector:

```bash
@@ -99,22 +104,27 @@ poetry run live-tests debug read \
```
##### Consuming `http_dump.mitm`

You can install [`mitmproxy`](https://mitmproxy.org/):

```bash
pipx install mitmproxy
```

And run:

```bash
mitmweb --rfile=http_dump.mitm
```
## Regression tests

We created a regression test suite to run tests to compare the outputs of connector commands on different versions of the same connector.

## Tutorial(s)

- [Loom Walkthrough (Airbyte Only)](https://www.loom.com/share/97c49d7818664b119cff6911a8a211a2?sid=4570a5b6-9c81-4db3-ba33-c74dc5845c3c)
- [Internal Docs (Airbyte Only)](https://docs.google.com/document/d/1pzTxJTsooc9iQDlALjvOWtnq6yRTvzVtbkJxY4R36_I/edit)
### How to Use

@@ -123,6 +133,7 @@ We created a regression test suite to run tests to compare the outputs of connec

You can run the existing test suites with the following command:

#### With local connection objects (`config.json`, `catalog.json`, `state.json`)

```bash
poetry run pytest src/live_tests/regression_tests \
    --connector-image=airbyte/source-faker \
@@ -134,6 +145,7 @@ poetry run pytest src/live_tests/regression_tests \
```
#### Using a live connection

The live connection objects will be fetched.

```bash
@@ -142,27 +154,28 @@ The live connection objects will be fetched.
    --target-version=dev \
    --control-version=latest \
    --pr-url=<PR-URL> # The URL of the PR you are testing
```
You can also pass local connection object paths to override the live connection objects with `--config-path`, `--state-path` or `--catalog-path`.

#### Test artifacts

The test suite run will produce test artifacts in the `/tmp/regression_tests_artifacts/` folder.
**They will get cleared after each test run on prompt exit. Please do not copy them elsewhere in your filesystem, as they contain sensitive data that is not meant to be stored outside of your debugging session!**
##### Artifacts types

- `report.html`: A report of the test run.
- `stdout.log`: The collected standard output following the command execution
- `stderr.log`: The collected standard error following the command execution
- `http_dump.mitm`: An `mitmproxy` http stream log. Can be consumed with `mitmweb` (version `>=10`) for debugging.
- `http_dump.har`: An `mitmproxy` http stream log in HAR format (a JSON encoded version of the mitm dump).
- `airbyte_messages`: A directory containing `.jsonl` files for each message type (logs, records, traces, controls, states etc.) produced by the connector.
- `duck.db`: A DuckDB database containing the messages produced by the connector.
- `dagger.log`: The log of the Dagger session, useful for debugging errors unrelated to the tests.
**Tests can also write specific artifacts like diffs under a directory named after the test function.**

```
/tmp/regression_tests_artifacts
└── session_1710754231
@@ -235,16 +248,17 @@ The test suite run will produce test artifacts in the `/tmp/regression_tests_art
    │   ├── stderr.log
    │   └── stdout.log
    └── dagger.log
```
#### HTTP Proxy and caching

We use a containerized `mitmproxy` to capture the HTTP traffic between the connector and the source. Connector command runs produce `http_dump.mitm` (which can be consumed with `mitmproxy` (version `>=10`) for debugging) and `http_dump.har` (a JSON encoded version of the mitm dump) artifacts.
The traffic recorded on the control connector is passed to the target connector proxy to cache the responses for requests with the same URL. This is useful to avoid hitting the source API multiple times when running the same command on different versions of the connector.
### Custom CLI Arguments

| Argument | Description | Required/Optional |
| -------------------------- | ---------------------------------------------------------------------------------------------------------------- | ----------------- |
| `--connector-image` | Docker image name of the connector to debug (e.g., `airbyte/source-faker:latest`, `airbyte/source-faker:dev`). | Required |
| `--control-version` | Version of the control connector for regression testing. | Required |
| `--target-version` | Version of the connector being tested. (Defaults to dev) | Optional |
@@ -259,86 +273,112 @@ The traffic recorded on the control connector is passed to the target connector
| `--stream` | Name of the stream to test. Can be specified multiple times to test multiple streams. | Optional |
| `--should-read-with-state` | Specify whether to read with state. If not provided, a prompt will appear to choose. | Optional |
## Changelog

### 0.17.0

Enable running in GitHub actions.

### 0.16.0

Enable running with airbyte-ci.

### 0.15.0

Automatic retrieval of connection objects for regression tests. The connection id is not required anymore.

### 0.14.2

Fix KeyError when target & control streams differ.

### 0.14.1

Improve performance when reading records per stream.

### 0.14.0

Track usage via Segment.

### 0.13.0

Show test docstring in the test report.

### 0.12.0

Implement a test to compare schema inferred on both control and target version.

### 0.11.0

Create a global duckdb instance to store messages produced by the connector in target and control version.

### 0.10.0

Show record count per stream in report and list untested streams.

### 0.9.0

Make the regressions tests suite better at handling large connector outputs.

### 0.8.1

Improve diff output.

### 0.8.0

Regression tests: add an HTML report.

### 0.7.0

Improve the proxy workflow and caching logic + generate HAR files.

### 0.6.6

Exit pytest if connection can't be retrieved.

### 0.6.6

Cleanup debug files when prompt is closed.

### 0.6.5

Improve ConnectorRunner logging.

### 0.6.4

Add more data integrity checks to the regression tests suite.

### 0.6.3

Make catalog diffs more readable.

### 0.6.2

Clean up regression test artifacts on any exception.

### 0.6.1

Modify diff output for `discover` and `read` tests.

### 0.5.1

Handle connector command execution errors.

### 0.5.0

Add new tests and confirmation prompts.

### 0.4.0

Introduce DuckDB to store the messages produced by the connector.

### 0.3.0

Pass connection id to the regression tests suite.

### 0.2.0

Declare the regression tests suite.

### 0.1.0

Implement initial primitives and a `debug` command to run connector commands and persist the outputs to local storage.
@@ -10,7 +10,6 @@ To use this submodule, it is recommended that you use Poetry to manage dependenc

poetry install
```

## Generating Models

This submodule includes a tool for generating Python models from JSON Schema specifications. To generate the models, we use the library [datamodel-code-generator](https://github.com/koxudaxi/datamodel-code-generator). The generated models are stored in `models/generated`.

@@ -24,13 +23,14 @@ poetry run poe generate-models

This will read the JSON Schema specifications in `models/src` and generate Python models in `models/generated`.

## Running Tests

```bash
poetry run pytest
```
## Validating Metadata Files

To be considered valid, a connector must have a metadata.yaml file, which must conform to the [ConnectorMetadataDefinitionV0](./metadata_service/models/src/ConnectorMetadataDefinitionV0.yaml) schema, and a documentation file.

The paths to both files must be passed to the validate command.

@@ -42,6 +42,7 @@ poetry run metadata_service validate tests/fixtures/metadata_validate/valid/meta

## Useful Commands

### Replicate Production Data in your Development Bucket

This will replicate all the production data to your development bucket. This is useful for testing the metadata service with real, up-to-date data.

_💡 Note: A prerequisite is you have [gsutil](https://cloud.google.com/storage/docs/gsutil) installed and have run `gsutil auth login`_

@@ -53,6 +54,7 @@ TARGET_BUCKET=<YOUR-DEV_BUCKET> poetry poe replicate-prod

```

### Copy specific connector version to your Development Bucket

This will copy the specified connector version to your development bucket. This is useful for testing the metadata service with a specific version of a connector.

_💡 Note: A prerequisite is you have [gsutil](https://cloud.google.com/storage/docs/gsutil) installed and have run `gsutil auth login`_

@@ -62,6 +64,7 @@ TARGET_BUCKET=<YOUR-DEV_BUCKET> CONNECTOR="airbyte/source-stripe" VERSION="3.17.

```

### Promote Connector Version to Latest

This will promote the specified connector version to the latest version in the registry. This is useful for creating a mocked registry in which a prerelease connector is treated as if it was already published.

_💡 Note: A prerequisite is you have [gsutil](https://cloud.google.com/storage/docs/gsutil) installed and have run `gsutil auth login`_
@@ -6,7 +6,6 @@ Url: {{ last_action_url }}

Run time: {{ last_action_run_time }}

CONNECTORS: total: {{ total_connectors }}

Sources: total: {{ source_stats["total"] }} / tested: {{ source_stats["tested"] }} / success: {{ source_stats["success"] }} ({{ source_stats["success_percent"] }}%)

@@ -17,7 +16,6 @@ Destinations: total: {{ destination_stats["total"] }} / tested: {{ destination_s

{{ failed_last_build_only }}

**FAILED TWO LAST BUILDS - {{ failed_last_build_two_builds_count }} connectors:**

{{ failed_last_build_two_builds }}
@@ -1,26 +1,25 @@

## What is `airbyte-ci`?

`airbyte-ci` is a CLI written as a Python package which is made to execute CI operations on the `airbyte` repo. It makes heavy use of the [Dagger](https://dagger.cloud/) library to build and orchestrate Docker containers programmatically. It enables a centralized and programmatic approach to executing CI logic which can seamlessly run both locally and in remote CI environments.

You can read more about why we are using Dagger and the benefits it has provided in this [blog post](https://dagger.io/blog/airbyte-use-case).

## When is a contribution to `airbyte-ci` a good fit for your use case?

- When you want to make global changes to connector artifacts and build logic.
- When you want to execute something made to run both in CI and for local development. As airbyte-ci logic relies on container orchestration, you get reproducible environments and execution both locally and in a remote CI environment.
- When you want to orchestrate the tests and release of an internal package in CI.
## Who can I ask for help?

The tool has been maintained by multiple Airbyters.
Our top contributors, who can help you figure out the best approach to implement your use case, are:

- [@alafanechere](https://github.com/alafanechere)
- [@postamar](https://github.com/postamar)
- [@erohmensing](https://github.com/erohmensing)
- [@bnchrch](https://github.com/bnchrch)
- [@stephane-airbyte](https://github.com/stephane-airbyte)
## Where is the code?

@@ -53,26 +52,32 @@ There are multiple way to have dev install of the tool. Feel free to grab the on

**Please note that all the install modes lead to an editable install. There's no need to re-install the tool following a code change**.

### System requirements

- `Python` > 3.10
- [`Poetry`](https://python-poetry.org/) or [`pipx`](https://github.com/pypa/pipx)
### Installation options

There are many ways to install Python tools / packages.

For most users we recommend `make`, but `pipx` and `poetry` are also viable options.

#### With `make`

```bash
# From airbyte repo root:
make tools.airbyte-ci-dev.install
```
#### With `pipx`

```bash
# From airbyte-ci/connectors/pipelines:
pipx install --editable --force .
```

#### With `poetry`

⚠️ This places you in a Python environment specific to airbyte-ci. This can be a problem if you are developing airbyte-ci and testing/using your changes in another Python project.

```bash
@@ -81,17 +86,19 @@ poetry install
poetry shell
```
## Main libraries used in the tool

### [Click](https://click.palletsprojects.com/en/8.1.x/)

This is a lightweight Python CLI framework we use to declare entrypoints. You'll interact with it if you have to deal with commands, command groups, options, arguments etc.
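For instance, a minimal Click sketch of the pattern (illustrative only, not an actual airbyte-ci command):

```python
import click

@click.group()
def cli() -> None:
    """Root command group, analogous to the airbyte-ci entrypoint."""

@cli.command()
@click.option("--name", default="connector", help="Who to greet.")
def hello(name: str) -> None:
    """A toy subcommand showing options and arguments."""
    click.echo(f"Hello {name}!")

if __name__ == "__main__":
    cli()
```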
### [Dagger](https://dagger-io.readthedocs.io/en/sdk-python-v0.9.6/)

This is an SDK to build, execute and interact with Docker containers in Python. It's basically a nice API on top of [BuildKit](https://docs.docker.com/build/buildkit/). We use containers to wrap the majority of `airbyte-ci` operations as it allows us to:

- Execute language-agnostic operations: you can execute bash commands, gradle tasks, etc. in containers with Python. Pure magic!
- Benefit from caching by default. You can consider a Dagger operation a "line in a Dockerfile". Each operation is cached by BuildKit if the inputs of the operation did not change.
- Implement concurrent logic easily, as Dagger exposes async APIs. This is great for performance.
**Please note that we are currently using v0.9.6 of Dagger. The library is under active development so please refer to [this specific version documentation](https://dagger-io.readthedocs.io/en/sdk-python-v0.9.6/) if you want an accurate view of the available APIs.**
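For a feel of the API, here is a minimal hedged sketch using the Dagger Python SDK v0.9.x-style connection (not an actual airbyte-ci pipeline):

```python
import anyio
import dagger

async def main() -> None:
    async with dagger.Connection(dagger.Config()) as client:
        out = await (
            client.container()
            .from_("python:3.10-slim")           # cached like a Dockerfile line
            .with_exec(["python", "--version"])  # language-agnostic command
            .stdout()
        )
        print(out)

anyio.run(main)
```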
@@ -102,9 +109,9 @@ As Dagger exposes async APIs we use `anyio` (and the `asyncer` wrapper sometimes

## Design principles

_The principles set out below are ideals, but the first iterations on the project did not always respect them. Don't be surprised if you see code that contradicts what we're about to say (tech debt...)._

### `airbyte-ci` is _just_ an orchestrator
Ideally the steps declared in airbyte-ci pipelines do not contain any business logic themselves. They call external projects, within containers, which contain the business logic.

@@ -113,8 +120,9 @@ Following this principles will help in decoupling airbyte-ci from other project

Maintaining business logic in smaller projects also increases velocity, as introducing new logic does not require changing airbyte-ci, which is already a big project in terms of lines of code.
#### Good examples of this principle

- `connectors-qa`: We want to run specific static checks on all our connectors: we introduced a specific python package ([`connectors-qa`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/connectors_qa/README.md#L1)) which declares and runs the checks on connectors. We orchestrate the run of this package inside the [QaChecks](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/pipelines/airbyte_ci/connectors/test/steps/common.py#L122) step. This class is just aware of the tool location, its entry point, and what has to be mounted to the container for the command to run.
- Internal package testing: We expose an `airbyte-ci test` command which can run a CI pipeline on an internal poetry package. The pipeline logic is declared at the package level with `poe` tasks in the package `pyproject.toml`. `airbyte-ci` is made aware of what it has to run by parsing the content of the `[tool.airbyte_ci]` section of the `pyproject.toml` file ([example](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/base_images/pyproject.toml#L39)). A parsing sketch follows this list.
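
As a sketch of the kind of lookup this involves (standard library only; we assume a `poe_tasks` list like in the linked example, but the exact schema is owned by `airbyte-ci`):

```python
import tomllib  # Python 3.11+; use the `tomli` package on older interpreters


def get_airbyte_ci_tasks(pyproject_path: str = "pyproject.toml") -> list[str]:
    # Read the [tool.airbyte_ci] section of the package's pyproject.toml.
    with open(pyproject_path, "rb") as f:
        pyproject = tomllib.load(f)
    return pyproject.get("tool", {}).get("airbyte_ci", {}).get("poe_tasks", [])
```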

### No command or pipeline should be language specific

@@ -125,30 +133,36 @@ We oftentimes have to introduce new flows for connectors / CDK. Even if the need

The `airbyte-ci connectors build` command can build multiple connectors of different languages in a single execution.

The higher level [`run_connector_build_pipeline` function](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/pipelines/airbyte_ci/connectors/build_image/steps/__init__.py#L36) is language agnostic and calls language-specific sub-pipelines according to the connector language.

We have per-language submodules in which language specific `BuildConnectorImages` classes are implemented (a dispatch sketch follows this list):

- [`python_connectors.py`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/pipelines/airbyte_ci/connectors/build_image/steps/python_connectors.py)
- [`java_connectors.py`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/pipelines/airbyte_ci/connectors/build_image/steps/java_connectors.py#L14)
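
A simplified sketch of that dispatch (the submodule imports follow the links above; the string keys and surrounding glue are illustrative, not the exact implementation):

```python
from pipelines.airbyte_ci.connectors.build_image.steps import java_connectors, python_connectors

# Map each connector language to its language-specific build step class.
LANGUAGE_BUILD_STEPS = {
    "python": python_connectors.BuildConnectorImages,
    "java": java_connectors.BuildConnectorImages,
}


async def run_connector_build(context):
    # Pick the step class matching the connector language and run it.
    build_step_class = LANGUAGE_BUILD_STEPS[context.connector.language]
    return await build_step_class(context).run()
```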

### Pipelines are functions, steps are classes

A pipeline is a function:

- instantiating and running steps
- collecting step results and acting according to step results
- returning a report

A step is a class inheriting from the `Step` base class:

- Can be instantiated with parameters
- Has a `_run` method which:
  - Performs one or multiple operations according to input parameters and context values
  - Returns a `StepResult` which can have a `succeeded`, `failed` or `skipped` `StepStatus`

**Steps should ideally not call other steps, and the DAG of steps can be understood by reading the pipeline function.**
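
For illustration, a minimal hypothetical step following this contract (the attribute and argument names mirror the description above and are indicative only; the real base class carries more machinery):

```python
class HelloWorldStep(Step):
    """Toy step: runs a single container operation and reports its output."""

    title = "Hello world"

    async def _run(self) -> StepResult:
        # Perform one operation with the context's dagger client...
        stdout = await (
            self.context.dagger_client.container()
            .from_("alpine:3.19")
            .with_exec(["echo", "hello"])
            .stdout()
        )
        # ...and characterize it with a StepResult.
        return StepResult(step=self, status=StepStatus.SUCCESS, stdout=stdout)
```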
#### Step examples:

- [`PytestStep`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/pipelines/airbyte_ci/connectors/test/steps/python_connectors.py#L29)
- [`GradleTask`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/pipelines/airbyte_ci/steps/gradle.py#L21)
#### Pipeline examples:

- [`run_connector_publish_pipeline`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/pipelines/airbyte_ci/connectors/publish/pipeline.py#L296)
- [`run_connector_test_pipeline`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/pipelines/airbyte_ci/connectors/test/pipeline.py#L48)

## Main classes

@@ -157,12 +171,13 @@ A step is a class which inheriting from the `Step` base class:

Pipeline contexts are instantiated on each command execution and populated according to the CLI inputs. We populate this class with global configuration, helpers and attributes that are accessed during pipeline and step execution.
It has, for instance, the following attributes:

- The dagger client
- The list of modified files on the branch
- A `connector` attribute
- A `get_connector_dir` method to interact with the connector
- Global secrets to connect to protected resources
- An `is_ci` attribute to know if the current execution is a local or CI one.

We use `PipelineContext` with context managers so that we can easily handle setup and teardown logic of the context (like producing a `Report`).
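
Schematically, the setup/teardown flow looks like this (a runnable toy, not the real class):

```python
import anyio


class DummyPipelineContext:
    """Stand-in showing the context manager pattern described above."""

    async def __aenter__(self) -> "DummyPipelineContext":
        print("setup: create dagger client, fetch secrets, start logging")
        return self

    async def __aexit__(self, exc_type, exc, tb) -> bool:
        print("teardown: build the Report and persist it")
        return False  # do not swallow exceptions


async def main() -> None:
    async with DummyPipelineContext() as context:
        print("run pipeline steps with", context)


anyio.run(main)
```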

@@ -171,25 +186,26 @@ We use `PipelineContext` with context managers so that we can easily handle setu

`Step` is an abstract class. It is meant to be inherited for implementation of pipeline steps which are use case specific. `Step` exposes a public `run` method which calls a private `_run` method wrapped with a progress logger and a retry mechanism.

When declaring a `Step` child class you are expected to:

- declare a `title` attribute or `property`
- implement the `_run` method which should return a `StepResult` object. You are free to override the `Step` methods if needed.

### [`Result` / `StepResult`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/pipelines/models/steps.py#L86)

The `Result` class (and its subclasses) is meant to characterize the result of a `Step` execution.
`Result` objects are built with the following (a toy re-creation follows the list):

- `StepStatus` (success/failure/skipped)
- `stderr`: The standard error of the operation execution
- `stdout`: The standard output of the operation execution
- `excinfo`: An Exception instance if you want to handle an operation error
- `output`: Any object you'd like to attach to the result for reuse in other Steps
- `artifacts`: Any object produced by the Step that you'd like to attach to the `Report`
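
To make the shape concrete, here is a toy re-creation of these classes (illustration only; the real definitions live in `pipelines/models/steps.py`):

```python
from dataclasses import dataclass, field
from enum import Enum, auto
from typing import Any, List, Optional


class StepStatus(Enum):
    SUCCESS = auto()
    FAILURE = auto()
    SKIPPED = auto()


@dataclass
class StepResult:
    status: StepStatus
    stdout: Optional[str] = None
    stderr: Optional[str] = None
    excinfo: Optional[Exception] = None
    output: Any = None
    artifacts: List[Any] = field(default_factory=list)


result = StepResult(status=StepStatus.SUCCESS, stdout="all checks passed", output={"image": "source-faker:dev"})
print(result.status, result.output)
```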

### [`Report`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/pipelines/models/reports.py#L34)

A `Report` object is instantiated on `PipelineContext` teardown with a collection of step results. It is meant to persist execution results as json / html locally and in remote storage to share them with users or other automated processes.

## GitHub Actions orchestration

A benefit of declaring CI logic in a centralized python package is that our CI logic can be agnostic of the CI platform it runs on. We are currently using GitHub Actions. This section explains how we run `airbyte-ci` in GitHub Actions.

@@ -197,16 +213,18 @@ A benefit of declaring CI logic in a centralized python package is that our CI l

### Multiple workflows re-using the same actions

Each CI use case has its own GitHub Actions workflow:

- [Connector testing](https://github.com/airbytehq/airbyte/blob/master/.github/workflows/connectors_tests.yml#L1)
- [Connector publish](https://github.com/airbytehq/airbyte/blob/master/.github/workflows/publish_connectors.yml#L1)
- [Internal package testing](https://github.com/airbytehq/airbyte/blob/master/.github/workflows/airbyte-ci-tests.yml#L1)
- etc.

They all use the [`run-airbyte-ci` re-usable action](https://github.com/airbytehq/airbyte/blob/master/.github/actions/run-airbyte-ci/action.yml#L1) to which they provide the `airbyte-ci` command the workflow should run and other environment specific options.
The `run-airbyte-ci` action does the following:

- [Pull Dagger image and install airbyte-ci from binary (or sources if the tool was changed on the branch)](https://github.com/airbytehq/airbyte/blob/master/.github/actions/run-airbyte-ci/action.yml#L105)
- [Run the airbyte-ci command passed as an input with other options also passed as inputs](https://github.com/airbytehq/airbyte/blob/main/.github/actions/run-airbyte-ci/action.yml#L111)

## A full example: breaking down the execution flow of a connector test pipeline

@@ -215,12 +233,14 @@ Let's describe and follow what happens when we run:

**This command is meant to run tests on connectors that were modified on the branch.**
Let's assume I modified the `source-faker` connector.

### 1. The `airbyte-ci` command group

On command execution the [`airbyte-ci` command group](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/pipelines/cli/airbyte_ci.py#L186) acts as the main entrypoint. It is:

- Provisioning the click context object with option values that can be accessed in downstream commands.
- Checking if the local docker configuration is correct
- Wrapping the command execution with `dagger run` to get its nice terminal UI (unless `--disable-dagger-run` is passed)

### 2. The `connectors` command subgroup

@@ -229,13 +249,15 @@ It continues to populate the click context with other connectors specific option

**It also computes the list of modified files on the branch and attaches this list to the click context.** The `get_modified_files` function basically performs a `git diff` between the current branch and the `--diffed-branch`.
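
Conceptually this is equivalent to the following standalone sketch (the real implementation lives in `airbyte-ci` and handles more edge cases):

```python
import subprocess


def get_modified_files(diffed_branch: str = "master") -> list[str]:
    # List the files that differ between the diffed branch and the current HEAD.
    diff = subprocess.run(
        ["git", "diff", "--name-only", f"{diffed_branch}...HEAD"],
        capture_output=True,
        text=True,
        check=True,
    )
    return [path for path in diff.stdout.splitlines() if path]


print(get_modified_files())
```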

### 3. Reaching the `test` command

After going through the command groups we finally reach the actual command the user wants to execute: the [`test` command](https://github.com/airbytehq/airbyte/blob/main/airbyte-ci/connectors/pipelines/pipelines/airbyte_ci/connectors/test/commands.py#L72).

This function:

- Sends a pending commit status check to GitHub when we are running in CI
- Determines which steps should be skipped or kept according to user inputs (by building a `RunStepOptions` object)
- Instantiates one `ConnectorContext` per connector under test: we only modified `source-faker` so we'll have a single `ConnectorContext` to work with.
- Calls `run_connectors_pipelines` with the `ConnectorContext`s and the pipeline callable to run

#### 4. Globally dispatching pipeline logic in `run_connectors_pipeline`

@@ -243,17 +265,19 @@ This function:

`run_connectors_pipeline` takes a pipeline callable, so it contains no pipeline-specific logic itself.

This function:

- Instantiates the dagger client
- Creates a task group to concurrently run the pipeline callable: we'd concurrently run test pipelines on multiple connectors if multiple connectors were modified.
- The concurrency of the pipelines is controlled via a semaphore object (a runnable sketch of this pattern follows the list).
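
The concurrency pattern boils down to the following runnable toy (the real function also wires up the dagger client and reporting):

```python
import anyio


async def run_pipeline(connector: str, semaphore: anyio.Semaphore) -> None:
    async with semaphore:  # at most `concurrency` pipelines run at once
        print(f"running test pipeline for {connector}")
        await anyio.sleep(0.1)


async def run_connectors_pipelines(connectors: list[str], concurrency: int = 2) -> None:
    semaphore = anyio.Semaphore(concurrency)
    async with anyio.create_task_group() as tg:
        for connector in connectors:
            tg.start_soon(run_pipeline, connector, semaphore)


anyio.run(run_connectors_pipelines, ["source-faker", "source-pokeapi", "source-github"])
```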

#### 5. Actually running the pipeline in [`run_connector_test_pipeline`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/pipelines/airbyte_ci/connectors/test/pipeline.py#L48)

_Reminder: this function is called for each connector selected for testing. It takes a `ConnectorContext` and a `Semaphore` as inputs._

The specific steps to run in the pipeline for a connector are determined by the output of the [`get_test_steps`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/pipelines/airbyte_ci/connectors/test/pipeline.py#L32) function, which builds a step tree according to the connector language.

**You can, for instance, check the declared step tree for python connectors [here](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/pipelines/airbyte_ci/connectors/test/steps/python_connectors.py#L249):**

```python
def get_test_steps(context: ConnectorContext) -> STEP_TREE:
    """
@@ -292,7 +316,7 @@ def get_test_steps(context: ConnectorContext) -> STEP_TREE:
    ]
```

After creating the step tree (a.k.a. a _DAG_) it enters the `Semaphore` and `PipelineContext` context manager to execute the steps to run with `run_steps`. `run_steps` executes steps concurrently according to their dependencies.
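
One way to picture "concurrently according to their dependencies" is the following toy scheduler, where each step waits on the steps it declares as dependencies (the step ids here are invented for the example):

```python
import anyio

# Each step id maps to the ids it depends on.
STEP_DEPENDENCIES = {
    "build": [],
    "unit": ["build"],
    "integration": ["build"],
    "acceptance": ["build", "unit"],
}


async def run_step(step_id: str, done: dict[str, anyio.Event]) -> None:
    for dependency in STEP_DEPENDENCIES[step_id]:
        await done[dependency].wait()  # block until every dependency finished
    print(f"running {step_id}")
    await anyio.sleep(0.05)
    done[step_id].set()


async def run_steps() -> None:
    done = {step_id: anyio.Event() for step_id in STEP_DEPENDENCIES}
    async with anyio.create_task_group() as tg:
        for step_id in STEP_DEPENDENCIES:
            tg.start_soon(run_step, step_id, done)


anyio.run(run_steps)
```
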
Once the steps are executed we get step results. We can build a `ConnectorReport` from these results. The report is finally attached to the `context` so that it gets persisted on `context` teardown.

@@ -329,12 +353,14 @@ async def run_connector_test_pipeline(context: ConnectorContext, semaphore: anyi

#### 6. `ConnectorContext` teardown

Once the context manager is exited (when we exit the `async with context` block) the [`ConnectorContext.__aexit__` function is executed](https://github.com/airbytehq/airbyte/blob/main/airbyte-ci/connectors/pipelines/pipelines/airbyte_ci/connectors/context.py#L237).

This function:

- Determines the global success or failure state of the pipeline according to the StepResults
- Uploads connector secrets back to GSM if they got updated
- Persists the report to disk
- Prints the report to the console
- Uploads the report to remote storage if we're in CI
- Updates the per connector commit status check

@@ -298,14 +298,14 @@ flowchart TD

#### Options

| Option | Multiple | Default value | Description |
| ------------------------------------------------------- | -------- | ------------- | --------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------- |
| `--skip-step/-x` | True | | Skip steps by id e.g. `-x unit -x acceptance` |
| `--only-step/-k` | True | | Only run specific steps by id e.g. `-k unit -k acceptance` |
| `--fail-fast` | False | False | Abort after any tests fail, rather than continuing to run additional tests. Use this setting to confirm a known bug is fixed (or not), or when you only require a pass/fail result. |
| `--code-tests-only` | True | False | Skip any tests not directly related to code updates. For instance, metadata checks, version bump checks, changelog verification, etc. Use this setting to help focus on code quality during development. |
| `--concurrent-cat` | False | False | Make CAT tests run concurrently using pytest-xdist. Be careful about source or destination API rate limits. |
| `--<step-id>.<extra-parameter>=<extra-parameter-value>` | True | | You can pass extra parameters for specific test steps. More details in the extra parameters section below |
| `--ci-requirements` | False | | Output the CI requirements as a JSON payload. It is used to determine the CI runner to use. |

Note:

@@ -465,8 +465,8 @@ Meant to be run on a cron script.

Actions:

- Upgrades dependencies to the current versions
- Can make a pull request and bump version, changelog

```
Usage: airbyte-ci connectors up_to_date [OPTIONS]
```

@@ -484,19 +484,19 @@ Options:
Get source-openweather up to date. If there are changes, bump the version and add to changelog:

- `airbyte-ci connectors --name=source-openweather up_to_date`: upgrades main dependencies
- `airbyte-ci connectors --name=source-openweather up_to_date --dev`: forces update if there are only dev changes
- `airbyte-ci connectors --name=source-openweather up_to_date --dep pytest@^8.10 --dep airbyte-cdk@0.80.0`: allows update to toml files as well
- `airbyte-ci connectors --name=source-openweather up_to_date --pull`: make a pull request for it

### Other things it could do

- upgrade it to the latest base image
- make sure it's the newest version of pytest
- do a `poetry update` to update everything else
- make the pull requests on a well known branch, replacing the last one if still open
- bump the toml and metadata and changelog
- also bump the manifest version of the CDK

### <a id="connectors-bump_version"></a>`connectors bump_version` command
@@ -35,13 +35,13 @@ class ChangelogEntry:

    def __eq__(self, other: object) -> bool:
        if not isinstance(other, ChangelogEntry):
            return False
        entry_matches = (
            self.date == other.date
            and self.version == other.version
            and self.pr_number == other.pr_number
            and self.comment == other.comment
        )
        return entry_matches

    def __ne__(self, other: object) -> bool:
        return not (self.__eq__(other))

@@ -103,6 +103,10 @@ class Changelog:

        self.new_entries.add(ChangelogEntry(date, version, pull_request_number, comment))

    def to_markdown(self) -> str:
        """
        Generates the complete markdown content for the changelog,
        including both original and new entries, sorted by version, date, pull request number, and comment.
        """
        all_entries = set(self.original_entries.union(self.new_entries))
        sorted_entries = sorted(
            sorted(

@@ -16,6 +16,11 @@ pytestmark = [

    PATH_TO_INITIAL_FILES = Path("airbyte-ci/connectors/pipelines/tests/test_changelog/initial_files")
    PATH_TO_RESULT_FILES = Path("airbyte-ci/connectors/pipelines/tests/test_changelog/result_files")

    # When WRITE_TO_RESULT_FILE is set to True, all tests below will generate the resulting markdown
    # and write it back to the fixture files.
    # This is useful when you changed the source files and need to regenerate the fixtures.
    # The comparison against target will still fail, but it will succeed on the subsequent test run.
    WRITE_TO_RESULT_FILE = False

@@ -1,10 +1,11 @@

# Postgres

Airbyte's certified Postgres connector offers the following features:

- Replicate data from tables, views and materialized views. Other data objects won't be replicated to the destination like indexes, permissions.

| Version | Date | Pull Request | Subject |
| ------- | ---------- | -------------------------------------------------------- | ----------------------------------------------- |
| 3.3.3 | 2024-01-26 | [34573](https://github.com/airbytehq/airbyte/pull/34573) | Adopt CDK v0.16.0 |
| 3.3.2 | 2024-01-24 | [34465](https://github.com/airbytehq/airbyte/pull/34465) | Check xmin only if user selects xmin sync mode. |
| 3.3.1 | 2024-01-10 | [34119](https://github.com/airbytehq/airbyte/pull/34119) | Adopt java CDK version 0.11.5. |
| 3.3.0 | 2023-12-19 | [33437](https://github.com/airbytehq/airbyte/pull/33437) | Remove LEGACY state flag |

@@ -1,10 +1,11 @@

# Postgres

Airbyte's certified Postgres connector offers the following features:

- Replicate data from tables, views and materialized views. Other data objects won't be replicated to the destination like indexes, permissions.

| Version | Date | Pull Request | Subject |
| 3.3.3 | 2024-01-26 | [34573](https://github.com/airbytehq/airbyte/pull/34573) | Adopt CDK v0.16.0 |
| 3.3.2 | 2024-01-24 | [34465](https://github.com/airbytehq/airbyte/pull/34465) | Check xmin only if user selects xmin sync mode. |
| 3.3.1 | 2024-01-10 | [34119](https://github.com/airbytehq/airbyte/pull/34119) | Adopt java CDK version 0.11.5. |
| 3.3.0 | 2023-12-19 | [33437](https://github.com/airbytehq/airbyte/pull/33437) | Remove LEGACY state flag |

@@ -1,10 +1,11 @@

# Postgres

Airbyte's certified Postgres connector offers the following features:

- Replicate data from tables, views and materialized views. Other data objects won't be replicated to the destination like indexes, permissions.

| ------- | ---------- | -------------------------------------------------------- | ----------------------------------------------- |
| 3.3.3 | 2024-01-26 | [34573](https://github.com/airbytehq/airbyte/pull/34573) | Adopt CDK v0.16.0 |
| 3.3.2 | 2024-01-24 | [34465](https://github.com/airbytehq/airbyte/pull/34465) | Check xmin only if user selects xmin sync mode. |
| 3.3.1 | 2024-01-10 | [34119](https://github.com/airbytehq/airbyte/pull/34119) | Adopt java CDK version 0.11.5. |
| 3.3.0 | 2023-12-19 | [33437](https://github.com/airbytehq/airbyte/pull/33437) | Remove LEGACY state flag |

@@ -1,11 +1,12 @@

# Postgres

Airbyte's certified Postgres connector offers the following features:

- Replicate data from tables, views and materialized views. Other data objects won't be replicated to the destination like indexes, permissions.

| Version | Date | Pull Request | Subject |
| ------- | ---------- | -------------------------------------------------------- | ----------------------------------------------- |
| 3.3.3 | 2024-01-26 | [34573](https://github.com/airbytehq/airbyte/pull/34573) | Adopt CDK v0.16.0 |
| 3.3.2 | 2024-01-24 | [34465](https://github.com/airbytehq/airbyte/pull/34465) | Check xmin only if user selects xmin sync mode. |
| 3.3.1 | 2024-01-10 | [34119](https://github.com/airbytehq/airbyte/pull/34119) | Adopt java CDK version 0.11.5. |
| 3.3.0 | 2023-12-19 | [33437](https://github.com/airbytehq/airbyte/pull/33437) | Remove LEGACY state flag |

@@ -1,12 +1,14 @@

# Postgres

Airbyte's certified Postgres connector offers the following features:

- Replicate data from tables, views and materialized views. Other data objects won't be replicated to the destination like indexes, permissions.

| Version | Date | Pull Request | Subject |
| ------- | ---------- | -------------------------------------------------------- | ----------------------------------------------- |
| 3.3.3 | 2024-01-26 | [34573](https://github.com/airbytehq/airbyte/pull/34573) | Adopt CDK v0.16.0 |
| 3.3.2 | 2024-01-24 | [34465](https://github.com/airbytehq/airbyte/pull/34465) | Check xmin only if user selects xmin sync mode. |
| 3.3.1 | 2024-01-10 | [34119](https://github.com/airbytehq/airbyte/pull/34119) | Adopt java CDK version 0.11.5. |
| 3.3.0 | 2023-12-19 | [33437](https://github.com/airbytehq/airbyte/pull/33437) | Remove LEGACY state flag |

Laurem Ipsum blah blah

@@ -1,13 +1,14 @@

# Postgres

Airbyte's certified Postgres connector offers the following features:

- Replicate data from tables, views and materialized views. Other data objects won't be replicated to the destination like indexes, permissions.

| Version | Date | Pull Request | Subject |
| ------- | ---------- | -------------------------------------------------------- | ----------------------------------------------- |
| 3.4.0 | 2024-03-01 | [123457](https://github.com/airbytehq/airbyte/pull/123457) | test2 |
| 3.4.0 | 2024-03-01 | [123456](https://github.com/airbytehq/airbyte/pull/123456) | test1 |
| 3.3.3 | 2024-01-26 | [34573](https://github.com/airbytehq/airbyte/pull/34573) | Adopt CDK v0.16.0 |
| 3.3.2 | 2024-01-24 | [34465](https://github.com/airbytehq/airbyte/pull/34465) | Check xmin only if user selects xmin sync mode. |
| 3.3.1 | 2024-01-10 | [34119](https://github.com/airbytehq/airbyte/pull/34119) | Adopt java CDK version 0.11.5. |
| 3.3.0 | 2023-12-19 | [33437](https://github.com/airbytehq/airbyte/pull/33437) | Remove LEGACY state flag |

@@ -1,14 +1,16 @@

# Postgres

Airbyte's certified Postgres connector offers the following features:

- Replicate data from tables, views and materialized views. Other data objects won't be replicated to the destination like indexes, permissions.

| Version | Date | Pull Request | Subject |
| ------- | ---------- | -------------------------------------------------------- | ----------------------------------------------- |
| 3.4.0 | 2024-03-01 | [123457](https://github.com/airbytehq/airbyte/pull/123457) | test2 |
| 3.4.0 | 2024-03-01 | [123456](https://github.com/airbytehq/airbyte/pull/123456) | test1 |
| 3.3.3 | 2024-01-26 | [34573](https://github.com/airbytehq/airbyte/pull/34573) | Adopt CDK v0.16.0 |
| 3.3.2 | 2024-01-24 | [34465](https://github.com/airbytehq/airbyte/pull/34465) | Check xmin only if user selects xmin sync mode. |
| 3.3.1 | 2024-01-10 | [34119](https://github.com/airbytehq/airbyte/pull/34119) | Adopt java CDK version 0.11.5. |
| 3.3.0 | 2023-12-19 | [33437](https://github.com/airbytehq/airbyte/pull/33437) | Remove LEGACY state flag |

Laurem Ipsum blah blah

@@ -1,13 +1,14 @@

# Postgres

Airbyte's certified Postgres connector offers the following features:

- Replicate data from tables, views and materialized views. Other data objects won't be replicated to the destination like indexes, permissions.

| Version | Date | Pull Request | Subject |
| ------- | ---------- | -------------------------------------------------------- | ----------------------------------------------- |
| 3.4.0 | 2024-03-02 | [123457](https://github.com/airbytehq/airbyte/pull/123457) | test2 |
| 3.4.0 | 2024-03-01 | [123456](https://github.com/airbytehq/airbyte/pull/123456) | test1 |
| 3.3.3 | 2024-01-26 | [34573](https://github.com/airbytehq/airbyte/pull/34573) | Adopt CDK v0.16.0 |
| 3.3.2 | 2024-01-24 | [34465](https://github.com/airbytehq/airbyte/pull/34465) | Check xmin only if user selects xmin sync mode. |
| 3.3.1 | 2024-01-10 | [34119](https://github.com/airbytehq/airbyte/pull/34119) | Adopt java CDK version 0.11.5. |
| 3.3.0 | 2023-12-19 | [33437](https://github.com/airbytehq/airbyte/pull/33437) | Remove LEGACY state flag |

@@ -1,14 +1,16 @@

# Postgres

Airbyte's certified Postgres connector offers the following features:

- Replicate data from tables, views and materialized views. Other data objects won't be replicated to the destination like indexes, permissions.

| Version | Date | Pull Request | Subject |
| ------- | ---------- | -------------------------------------------------------- | ----------------------------------------------- |
| 3.4.0 | 2024-03-02 | [123457](https://github.com/airbytehq/airbyte/pull/123457) | test2 |
| 3.4.0 | 2024-03-01 | [123456](https://github.com/airbytehq/airbyte/pull/123456) | test1 |
| 3.3.3 | 2024-01-26 | [34573](https://github.com/airbytehq/airbyte/pull/34573) | Adopt CDK v0.16.0 |
| 3.3.2 | 2024-01-24 | [34465](https://github.com/airbytehq/airbyte/pull/34465) | Check xmin only if user selects xmin sync mode. |
| 3.3.1 | 2024-01-10 | [34119](https://github.com/airbytehq/airbyte/pull/34119) | Adopt java CDK version 0.11.5. |
| 3.3.0 | 2023-12-19 | [33437](https://github.com/airbytehq/airbyte/pull/33437) | Remove LEGACY state flag |

Laurem Ipsum blah blah

@@ -1,12 +1,13 @@

# Postgres

Airbyte's certified Postgres connector offers the following features:

- Replicate data from tables, views and materialized views. Other data objects won't be replicated to the destination like indexes, permissions.

| Version | Date | Pull Request | Subject |
| ------- | ---------- | -------------------------------------------------------- | ----------------------------------------------- |
| 3.4.0 | 2024-03-01 | [123456](https://github.com/airbytehq/airbyte/pull/123456) | test |
| 3.3.3 | 2024-01-26 | [34573](https://github.com/airbytehq/airbyte/pull/34573) | Adopt CDK v0.16.0 |
| 3.3.2 | 2024-01-24 | [34465](https://github.com/airbytehq/airbyte/pull/34465) | Check xmin only if user selects xmin sync mode. |
| 3.3.1 | 2024-01-10 | [34119](https://github.com/airbytehq/airbyte/pull/34119) | Adopt java CDK version 0.11.5. |
| 3.3.0 | 2023-12-19 | [33437](https://github.com/airbytehq/airbyte/pull/33437) | Remove LEGACY state flag |

@@ -1,13 +1,15 @@

# Postgres

Airbyte's certified Postgres connector offers the following features:

- Replicate data from tables, views and materialized views. Other data objects won't be replicated to the destination like indexes, permissions.

| Version | Date | Pull Request | Subject |
| ------- | ---------- | -------------------------------------------------------- | ----------------------------------------------- |
| 3.4.0 | 2024-03-01 | [123456](https://github.com/airbytehq/airbyte/pull/123456) | test |
| 3.3.3 | 2024-01-26 | [34573](https://github.com/airbytehq/airbyte/pull/34573) | Adopt CDK v0.16.0 |
| 3.3.2 | 2024-01-24 | [34465](https://github.com/airbytehq/airbyte/pull/34465) | Check xmin only if user selects xmin sync mode. |
| 3.3.1 | 2024-01-10 | [34119](https://github.com/airbytehq/airbyte/pull/34119) | Adopt java CDK version 0.11.5. |
| 3.3.0 | 2023-12-19 | [33437](https://github.com/airbytehq/airbyte/pull/33437) | Remove LEGACY state flag |

Laurem Ipsum blah blah

@@ -1,11 +1,12 @@

# Postgres

Airbyte's certified Postgres connector offers the following features:

- Replicate data from tables, views and materialized views. Other data objects won't be replicated to the destination like indexes, permissions.

| Version | Date | Pull Request | Subject |
| ------- | ---------- | -------------------------------------------------------- | ----------------------------------------------- |
| 3.3.3 | 2024-01-26 | [34573](https://github.com/airbytehq/airbyte/pull/34573) | Adopt CDK v0.16.0 |
| 3.3.2 | 2024-01-24 | [34465](https://github.com/airbytehq/airbyte/pull/34465) | Check xmin only if user selects xmin sync mode. |
| 3.3.1 | 2024-01-10 | [34119](https://github.com/airbytehq/airbyte/pull/34119) | Adopt java CDK version 0.11.5. |
| 3.3.0 | 2023-12-19 | [33437](https://github.com/airbytehq/airbyte/pull/33437) | Remove LEGACY state flag |

@@ -1,12 +1,14 @@

# Postgres

Airbyte's certified Postgres connector offers the following features:

- Replicate data from tables, views and materialized views. Other data objects won't be replicated to the destination like indexes, permissions.

| Version | Date | Pull Request | Subject |
| ------- | ---------- | -------------------------------------------------------- | ----------------------------------------------- |
| 3.3.3 | 2024-01-26 | [34573](https://github.com/airbytehq/airbyte/pull/34573) | Adopt CDK v0.16.0 |
| 3.3.2 | 2024-01-24 | [34465](https://github.com/airbytehq/airbyte/pull/34465) | Check xmin only if user selects xmin sync mode. |
| 3.3.1 | 2024-01-10 | [34119](https://github.com/airbytehq/airbyte/pull/34119) | Adopt java CDK version 0.11.5. |
| 3.3.0 | 2023-12-19 | [33437](https://github.com/airbytehq/airbyte/pull/33437) | Remove LEGACY state flag |

Laurem Ipsum blah blah

@@ -1,12 +1,13 @@

# Postgres

Airbyte's certified Postgres connector offers the following features:

- Replicate data from tables, views and materialized views. Other data objects won't be replicated to the destination like indexes, permissions.

| Version | Date | Pull Request | Subject |
| ------- | ---------- | -------------------------------------------------------- | ----------------------------------------------- |
| 3.4.0 | 2024-03-01 | [123456](https://github.com/airbytehq/airbyte/pull/123456) | test |
| 3.3.3 | 2024-01-26 | [34573](https://github.com/airbytehq/airbyte/pull/34573) | Adopt CDK v0.16.0 |
| 3.3.2 | 2024-01-24 | [34465](https://github.com/airbytehq/airbyte/pull/34465) | Check xmin only if user selects xmin sync mode. |
| 3.3.1 | 2024-01-10 | [34119](https://github.com/airbytehq/airbyte/pull/34119) | Adopt java CDK version 0.11.5. |
| 3.3.0 | 2023-12-19 | [33437](https://github.com/airbytehq/airbyte/pull/33437) | Remove LEGACY state flag |

@@ -1,13 +1,15 @@

# Postgres

Airbyte's certified Postgres connector offers the following features:

- Replicate data from tables, views and materialized views. Other data objects won't be replicated to the destination like indexes, permissions.

| Version | Date | Pull Request | Subject |
| ------- | ---------- | -------------------------------------------------------- | ----------------------------------------------- |
| 3.4.0 | 2024-03-01 | [123456](https://github.com/airbytehq/airbyte/pull/123456) | test |
| 3.3.3 | 2024-01-26 | [34573](https://github.com/airbytehq/airbyte/pull/34573) | Adopt CDK v0.16.0 |
| 3.3.2 | 2024-01-24 | [34465](https://github.com/airbytehq/airbyte/pull/34465) | Check xmin only if user selects xmin sync mode. |
| 3.3.1 | 2024-01-10 | [34119](https://github.com/airbytehq/airbyte/pull/34119) | Adopt java CDK version 0.11.5. |
| 3.3.0 | 2023-12-19 | [33437](https://github.com/airbytehq/airbyte/pull/33437) | Remove LEGACY state flag |

Laurem Ipsum blah blah

@@ -4,6 +4,7 @@

This test suite focuses on testing a simple stream (non-nested) of data similar to `source-exchangerates` using two different
`destination_sync_modes`:

- `incremental` + `overwrite` with stream `exchange_rate`
- `incremental` + `append_dedup` with stream `dedup_exchange_rate`

@@ -25,6 +25,7 @@ CREATE SCHEMA INTEGRATION_TEST_NORMALIZATION.TEST_SCHEMA;

If you ever need to start over, use this:

```sql
DROP DATABASE IF EXISTS INTEGRATION_TEST_NORMALIZATION;
DROP USER IF EXISTS INTEGRATION_TEST_USER_NORMALIZATION;
```

@@ -1,42 +1,55 @@

# Changelog

## 3.7.0

Add `validate_state_messages` to TestBasicRead.test_read:: Validate that all states contain neither legacy state emissions nor missing source stats in the state message.

## 3.6.0

Relaxing CATs validation when a stream has a primary key defined.

## 3.5.0

Add `validate_stream_statuses` to TestBasicRead.test_read:: Validate all statuses for all streams in the catalogs were emitted in correct order.

## 3.4.0

Add TestConnectorDocumentation suite for validating connectors documentation structure and content.

## 3.3.3

Fix `NoAdditionalPropertiesValidator` if no type found in `items`

## 3.3.2

Fix TestBasicRead.test_read.validate_schema: set `additionalProperties` to False recursively for objects.

## 3.3.1

Fix TestSpec.test_oauth_is_default_method to skip connectors that don't have a predicate_key object.

## 3.3.0

Add `test_certified_connector_has_allowed_hosts` and `test_certified_connector_has_suggested_streams` tests to the `connector_attribute` test suite

## 3.2.0

Add TestBasicRead.test_all_supported_file_types_present, which validates that all supported file types are present in the sandbox account for certified file-based connectors.

## 3.1.0

Add TestSpec.test_oauth_is_default_method test with OAuth is default option validation.

## 3.0.1

Upgrade to Dagger 0.9.6

## 3.0.0

Upgrade to Dagger 0.9.5

## 2.2.0

Add connector_attribute test suite and stream primary key validation

## 2.1.4

@@ -62,343 +75,455 @@ Support loading it from its Dagger container id for better performance.

Install pytest-xdist to support running tests in parallel.

## 2.0.2

Make `test_two_sequential_reads` handle namespace property in stream descriptor.

## 2.0.1

Changing `format` or `airbyte_type` in a field definition of a schema or specification is now a breaking change.

## 2.0.0

Update test_incremental.test_two_sequential_reads to be unaware of the contents of the state message. This is to support connectors that have a custom implementation of a cursor.

## 1.0.4

Fix edge case in skip_backward_compatibility_tests_fixture on discovery: if the current config structure is not compatible with the previous connector version, the discovery command would fail and the previous connector version catalog could not be retrieved.

## 1.0.3

Add tests for display_type property

## 1.0.2

Fix bug in skip_backward_compatibility_tests_fixture: the previous connector version could not be retrieved.

## 1.0.1

Pin airbyte-protocol-model to <1.0.0.

## 1.0.0

Bump to Python 3.10, use dagger instead of docker-py in the ConnectorRunner.

## 0.11.5

Changing test output and adding diff to test_read

## 0.11.4

Relax checking of `oneOf` common property and allow optional `default` keyword additional to `const` keyword.

## 0.11.3

Refactor test_oauth_flow_parameters to validate advanced_auth instead of the deprecated authSpecification

## 0.11.2

Do not enforce spec.json/spec.yaml

## 0.11.1

Test connector image labels and make sure they are set correctly and match metadata.yaml.

## 0.11.0

Add backward_compatibility.check_if_field_removed test to check if a field has been removed from the catalog.

## 0.10.8

Increase the connection timeout of the Docker client to 2 minutes ([context](https://github.com/airbytehq/airbyte/issues/27401))

## 0.10.7

Fix on supporting arrays in the state (ensure strings are parsed as strings and not ints)

## 0.10.6

Supporting arrays in the state by allowing ints in cursor_paths

## 0.10.5

Skipping test_catalog_has_supported_data_types as it is failing on too many connectors. Will first address globally the type/format problems at scale and then re-enable it.

## 0.10.4

Fixing bug: test_catalog_has_supported_data_types should support stream properties having `/` in it.

## 0.10.3

Fixing bug: test_catalog_has_supported_data_types, integer is a supported airbyte type.

## 0.10.2

Fixing bug: test_catalog_has_supported_data_types was failing when a connector stream property is named 'type'.

## 0.10.1

Reverting to 0.9.0 as the latest version. 0.10.0 was released with a bug failing CAT on a couple of connectors.

## 0.10.0

Discovery test: add validation that fails if the declared types/format/airbyte_types in the connector's streams properties are not [supported data types](https://docs.airbyte.com/understanding-airbyte/supported-data-types/) or if their combination is invalid.

## 0.9.0

Basic read test: add validation that fails if undeclared columns are present in records. Add `fail_on_extra_fields` input parameter to ignore this failure if desired.

## 0.8.0

Spec tests: Make sure grouping and ordering properties are used in a consistent way.

## 0.7.2

TestConnection: assert that a check with `exception` status emits a trace message.

## 0.7.1

Discovery backward compatibility tests: handle errors on previous connectors catalog retrieval. Return None when the discovery failed. It should unblock the situation when tests fail even if you bypassed backward compatibility tests.

## 0.7.0

Basic read test: add `ignored_fields`, change configuration format by adding optional `bypass_reason` [#22996](https://github.com/airbytehq/airbyte/pull/22996)

## 0.6.1

Fix docker API - "Error" is optional. [#22987](https://github.com/airbytehq/airbyte/pull/22987)

## 0.6.0

Allow passing custom environment variables to the connector under test. [#22937](https://github.com/airbytehq/airbyte/pull/22937).

## 0.5.3

Spec tests: Make `oneOf` checks work for nested `oneOf`s. [#22395](https://github.com/airbytehq/airbyte/pull/22395)

## 0.5.2

Check that `emitted_at` increases during subsequent reads. [#22291](https://github.com/airbytehq/airbyte/pull/22291)

## 0.5.1

Fix discovered catalog caching for different configs. [#22301](https://github.com/airbytehq/airbyte/pull/22301)

## 0.5.0

Re-release of 0.3.0 [#21451](https://github.com/airbytehq/airbyte/pull/21451)

# Renamed image from `airbyte/source-acceptance-test` to `airbyte/connector-acceptance-test` - Older versions are only available under the old name

## 0.4.0

Revert 0.3.0

## 0.3.0

(Broken) Add various stricter checks for specs (see PR for details). [#21451](https://github.com/airbytehq/airbyte/pull/21451)

## 0.2.26

Check `future_state` only for incremental streams. [#21248](https://github.com/airbytehq/airbyte/pull/21248)

## 0.2.25

Enable bypass reason for future state test config. [#20549](https://github.com/airbytehq/airbyte/pull/20549)

## 0.2.24

Check for nullity of docker runner in `previous_discovered_catalog_fixture`. [#20899](https://github.com/airbytehq/airbyte/pull/20899)

## 0.2.23

Skip backward compatibility tests on specifications if actual and previous specifications and discovered catalogs are identical. [#20435](https://github.com/airbytehq/airbyte/pull/20435)

## 0.2.22

Capture control messages to store and use updated configurations. [#19979](https://github.com/airbytehq/airbyte/pull/19979).

## 0.2.21

Optionally disable discovered catalog caching. [#19806](https://github.com/airbytehq/airbyte/pull/19806).

## 0.2.20

Stricter integer field schema validation. [#19820](https://github.com/airbytehq/airbyte/pull/19820).

## 0.2.19

Test for exposed secrets: const values can not hold secrets. [#19465](https://github.com/airbytehq/airbyte/pull/19465).

## 0.2.18

Test connector specification against exposed secret fields. [#19124](https://github.com/airbytehq/airbyte/pull/19124).

## 0.2.17

Make `incremental.future_state` mandatory in `high` `test_strictness_level`. [#19085](https://github.com/airbytehq/airbyte/pull/19085/).

## 0.2.16

Run `basic_read` on the discovered catalog in `high` `test_strictness_level`. [#18937](https://github.com/airbytehq/airbyte/pull/18937).

## 0.2.15

Make `expect_records` mandatory in `high` `test_strictness_level`. [#18497](https://github.com/airbytehq/airbyte/pull/18497/).

## 0.2.14

Fail basic read in `high` `test_strictness_level` if no `bypass_reason` is set on empty_streams. [#18425](https://github.com/airbytehq/airbyte/pull/18425/).

## 0.2.13

Fail tests in `high` `test_strictness_level` if all tests are not configured. [#18414](https://github.com/airbytehq/airbyte/pull/18414/).

## 0.2.12

Declare `bypass_reason` field in test configuration. [#18364](https://github.com/airbytehq/airbyte/pull/18364).

## 0.2.11

Declare `test_strictness_level` field in test configuration. [#18218](https://github.com/airbytehq/airbyte/pull/18218).

## 0.2.10

Bump `airbyte-cdk~=0.2.0`

## 0.2.9

Update tests after protocol change making `supported_sync_modes` a required property of `AirbyteStream` [#15591](https://github.com/airbytehq/airbyte/pull/15591/)

## 0.2.8

Make full refresh tests tolerant to new records in a sequential read. [#17660](https://github.com/airbytehq/airbyte/pull/17660/)

## 0.2.7

Fix a bug where a state is evaluated once before being used in a loop of `test_read_sequential_slices` [#17757](https://github.com/airbytehq/airbyte/pull/17757/)

## 0.2.6

Backward compatibility hypothesis testing: disable "filtering too much" health check. [#17871](https://github.com/airbytehq/airbyte/pull/17871)
|
||||
|
||||
## 0.2.5
|
||||
|
||||
Unit test `test_state_with_abnormally_large_values` to check state emission testing is working. [#17791](https://github.com/airbytehq/airbyte/pull/17791)
|
||||
|
||||
## 0.2.4
|
||||
|
||||
Make incremental tests compatible with per stream states.[#16686](https://github.com/airbytehq/airbyte/pull/16686/)
|
||||
|
||||
## 0.2.3
|
||||
|
||||
Backward compatibility tests: improve `check_if_type_of_type_field_changed` to make it less radical when validating specs and allow `'str' -> ['str', '<another_type>']` type changes.[#16429](https://github.com/airbytehq/airbyte/pull/16429/)
|
||||
|
||||
## 0.2.2
|
||||
|
||||
Backward compatibility tests: improve `check_if_cursor_field_was_changed` to make it less radical and allow stream addition to catalog.[#15835](https://github.com/airbytehq/airbyte/pull/15835/)
|
||||
|
||||
## 0.2.1
|
||||
|
||||
Don't fail on updating `additionalProperties`: fix IndexError [#15532](https://github.com/airbytehq/airbyte/pull/15532/)
|
||||
|
||||
## 0.2.0
|
||||
|
||||
Finish backward compatibility syntactic tests implementation: check that cursor fields were not changed. [#15520](https://github.com/airbytehq/airbyte/pull/15520/)
|
||||
|
||||
## 0.1.62
|
||||
|
||||
Backward compatibility tests: add syntactic validation of catalogs [#15486](https://github.com/airbytehq/airbyte/pull/15486/)
|
||||
|
||||
## 0.1.61

Add unit test coverage computation [#15443](https://github.com/airbytehq/airbyte/pull/15443/).

## 0.1.60

Backward compatibility tests: validate fake previous config against current connector specification. [#15367](https://github.com/airbytehq/airbyte/pull/15367)

## 0.1.59

Backward compatibility tests: add syntactic validation of specs [#15194](https://github.com/airbytehq/airbyte/pull/15194/).

## 0.1.58

Bootstrap spec backward compatibility tests. Add fixtures to retrieve a previous connector version's spec [#14954](https://github.com/airbytehq/airbyte/pull/14954/).

## 0.1.57

Run connector from its image's `working_dir` instead of from `/data`.

## 0.1.56

Add test case in `TestDiscovery` and `TestConnection` to assert that `additionalProperties` fields are set to true if they are declared [#14878](https://github.com/airbytehq/airbyte/pull/14878/).

## 0.1.55

Add test case in `TestDiscovery` to assert that the `supported_sync_modes` stream field in the catalog is set and not empty.

## 0.1.54

Fixed `AirbyteTraceMessage` test case to make connectors fail more reliably.

## 0.1.53

Add more granular incremental testing that walks through syncs and verifies records according to cursor value.

## 0.1.52

Add test case for `AirbyteTraceMessage` emission on connector failure: [#12796](https://github.com/airbytehq/airbyte/pull/12796/).

## 0.1.51

- Add `threshold_days` option for lookback window support in incremental tests.
- Update CDK to prevent warnings when encountering new `AirbyteTraceMessage`s.

## 0.1.50

Added support for passing a `.yaml` file as `spec_path`.

## 0.1.49

Fixed schema parsing when a JSONSchema `type` was not present - we now assume `object` if the `type` is not present.

## 0.1.48

Add a check that a `oneOf` common property has only the `const` keyword, and no `default` or `enum` keywords: [#11704](https://github.com/airbytehq/airbyte/pull/11704)

## 0.1.47

Added a local test success message containing the git hash: [#11497](https://github.com/airbytehq/airbyte/pull/11497)

## 0.1.46

Fix `test_oneof_usage` test: [#9861](https://github.com/airbytehq/airbyte/pull/9861)

## 0.1.45

Check for disallowed keywords `allOf` and `not` in connector schemas: [#9851](https://github.com/airbytehq/airbyte/pull/9851)

## 0.1.44

Fix incorrect name of the `primary_keys` attribute: [#9768](https://github.com/airbytehq/airbyte/pull/9768)

## 0.1.43

The `TestFullRefresh` test can compare records using PKs: [#9768](https://github.com/airbytehq/airbyte/pull/9768)

## 0.1.36

Add an assert that the `spec.json` file does not have any `$ref` in it: [#8842](https://github.com/airbytehq/airbyte/pull/8842)

## 0.1.32

Add info about skipped failed tests in the `/test` command message on GitHub: [#8691](https://github.com/airbytehq/airbyte/pull/8691)

## 0.1.31

Take `ConfiguredAirbyteCatalog` from the discover command by default

## 0.1.30

Validate that each field in a stream has appeared at least once in some record.

## 0.1.29

Add an assert that the output catalog does not have any `$ref` in it

## 0.1.28

Print the stream name when incremental sync tests fail

## 0.1.27

Add ignored fields for the full refresh test (unit tests)

## 0.1.26

Add ignored fields for the full refresh test

## 0.1.25

Fix incorrect comparison of nested structures.

## 0.1.24

Improve the message about errors in a stream's schema: [#6934](https://github.com/airbytehq/airbyte/pull/6934)

## 0.1.23

Fix incorrect auth init flow check defect.

## 0.1.22

Fix checking schemas with a root `$ref` keyword

## 0.1.21

Fix rootObject oauth init parameter check

## 0.1.20

Add oauth init flow parameter verification for spec.

## 0.1.19

Assert a non-empty overlap between the fields present in the record and the declared JSON schema.

## 0.1.18

Fix checking date-time format against nullable fields.

## 0.1.17

Fix serialize function for acceptance tests: [#5738](https://github.com/airbytehq/airbyte/pull/5738)

## 0.1.16

Fix for flake8 check for acceptance tests: [#5785](https://github.com/airbytehq/airbyte/pull/5785)

## 0.1.15

Add detailed logging for acceptance tests: [#5392](https://github.com/airbytehq/airbyte/pull/5392)

## 0.1.14

Fix for NULL datetime in MySQL format (i.e. `0000-00-00`): [#4465](https://github.com/airbytehq/airbyte/pull/4465)

## 0.1.13

Replace `validate_output_from_all_streams` with the `empty_streams` param: [#4897](https://github.com/airbytehq/airbyte/pull/4897)

## 0.1.12

Improve the error message when data mismatches the schema: [#4753](https://github.com/airbytehq/airbyte/pull/4753)

## 0.1.11

Fix an error in the naming of method `test_match_expected` for class `TestSpec`.

## 0.1.10

Add validation of the input config.json against spec.json.

## 0.1.9

Add configurable validation of schema for all records in the BasicRead test: [#4345](https://github.com/airbytehq/airbyte/pull/4345)

The validation is ON by default.
To disable validation for the source you need to set `validate_schema: off` in the config file.
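As an illustration only, in a legacy-format config file of that era the flag might sit under the basic read test like the sketch below; the paths and surrounding structure are assumptions, not taken from the changelog:

```yaml
# Illustrative legacy-format sketch; paths are placeholders.
tests:
  basic_read:
    - config_path: "secrets/config.json"
      configured_catalog_path: "integration_tests/configured_catalog.json"
      validate_schema: off
```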
## 0.1.8

Fix `cursor_path` to support nested and absolute paths: [#4552](https://github.com/airbytehq/airbyte/pull/4552)

## 0.1.7

Add: `test_spec` additionally checks if the Dockerfile has `ENV AIRBYTE_ENTRYPOINT` defined and equal to the space-joined `ENTRYPOINT`

## 0.1.6

Add a test of whether PKs are present and not None if `source_defined_primary_key` is defined: [#4140](https://github.com/airbytehq/airbyte/pull/4140)

## 0.1.5

Add configurable timeout for the acceptance tests: [#4296](https://github.com/airbytehq/airbyte/pull/4296)
@@ -1,17 +1,21 @@

# Connector Acceptance Tests (CAT)

This package gathers multiple test suites to assess the sanity of any Airbyte connector.
It is shipped as a [pytest](https://docs.pytest.org/en/7.1.x/) plugin and relies on pytest to discover, configure and execute tests.
Test-specific documentation can be found [here](https://docs.airbyte.com/connector-development/testing-connectors/connector-acceptance-tests-reference/).

## Configuration

The acceptance tests are configured via the `acceptance-test-config.yml` YAML file, which is passed to the plugin via the `--acceptance-test-config` option.
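For orientation, a minimal `acceptance-test-config.yml` might be shaped like the sketch below. The full set of suites and fields is defined in the reference documentation linked above; the connector image and paths here are illustrative assumptions:

```yaml
# Illustrative sketch - consult the acceptance test reference for the full schema.
connector_image: airbyte/source-pokeapi:dev
acceptance_tests:
  spec:
    tests:
      - spec_path: "source_pokeapi/spec.json"
  connection:
    tests:
      - config_path: "secrets/config.json"
        status: "succeed"
```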
## Running the acceptance tests locally

Note there are MANY ways to do this at this time, but we are working on consolidating them.
Which method you choose depends on the context you are in.

Pre-requisites:

- Setting up a Service Account for Google Secrets Manager (GSM) access. See [here](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/ci_credentials/README.md)
- Ensuring that you have the `GCP_GSM_CREDENTIALS` environment variable set to the contents of your GSM service account key file (see the sketch after this list).
- [Poetry](https://python-poetry.org/docs/#installation) installed
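As a rough sketch, setting that variable might look like this; the key file path is a hypothetical placeholder:

```bash
# Hypothetical path - point this at your own GSM service account key file.
export GCP_GSM_CREDENTIALS="$(cat ~/secrets/gsm_service_account_key.json)"
```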
@@ -22,6 +26,7 @@ Pre-requisites:

_Note: Install instructions for airbyte-ci are [here](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md)_

**This runs connector acceptance and other tests that run in our CI**

```bash
airbyte-ci connectors --name=<connector-name> test
```
@@ -66,15 +71,15 @@ poetry install
poetry run pytest -p connector_acceptance_test.plugin --acceptance-test-config=../../connectors/source-faker --pdb
```

### Manually

1. `cd` into your connector project (e.g. `airbyte-integrations/connectors/source-pokeapi`)
2. Edit `acceptance-test-config.yml` according to your needs. Please refer to our [Connector Acceptance Test Reference](https://docs.airbyte.com/connector-development/testing-connectors/connector-acceptance-tests-reference/) if you need details about the available options.
3. Build the connector docker image (e.g. `airbyte-ci connectors --name=source-pokeapi build`)
4. Use one of the following ways to run tests (**from your connector project directory**)

## Developing on the acceptance tests

You may want to iterate on the acceptance test project itself: adding new tests, fixing a bug, etc.
These iterations are more conveniently achieved by remaining in the current directory.

@@ -82,14 +87,14 @@ These iterations are more conveniently achieved by remaining in the current dire
2. Run the unit tests on the acceptance tests themselves: `poetry run pytest unit_tests` (add the `--pdb` option if you want to enable the debugger on test failure)
3. To run specific unit test(s), add `-k` to the above command, e.g. `poetry run python -m pytest unit_tests -k 'test_property_can_store_secret'`. You can use wildcards `*` here as well.
4. Make the changes you want:
   - Global pytest fixtures are defined in `./connector_acceptance_test/conftest.py`
   - Existing test modules are defined in `./connector_acceptance_test/tests`
   - `acceptance-test-config.yaml` structure is defined in `./connector_acceptance_test/config.py`
5. Unit test your changes by adding tests to `./unit_tests`
6. Run the unit tests on the acceptance tests again: `poetry run pytest unit_tests`; make sure the coverage did not decrease. You can bypass slow tests by using the `slow` marker: `poetry run pytest unit_tests -m "not slow"`.
7. Manually test the changes you made by running acceptance tests on a specific connector:
   - First build the connector to ensure your local image is up-to-date: `airbyte-ci connectors --name=source-pokeapi build`
   - Then run the acceptance tests on the connector: `poetry run pytest -p connector_acceptance_test.plugin --acceptance-test-config=../../connectors/source-pokeapi`
8. Make sure you updated `docs/connector-development/testing-connectors/connector-acceptance-tests-reference.md` according to your changes
9. Update the project changelog `airbyte-integrations/bases/connector-acceptance-test/CHANGELOG.md`
10. Open a PR on our GitHub repository

@@ -98,8 +103,9 @@ These iterations are more conveniently achieved by remaining in the current dire
13. Merge your PR

## Migrating `acceptance-test-config.yml` to latest configuration format

We introduced changes in the structure of `acceptance-test-config.yml` files in version 0.2.12.
The _legacy_ configuration format is still supported but should be deprecated soon.
To migrate a legacy configuration to the latest configuration format please run:

```bash
@@ -12,6 +12,7 @@ npm run generate

### Using Docker

If you don't want to install `npm`, you can run the generator using Docker:

```
@@ -21,6 +22,7 @@ If you don't want to install `npm` you can run the generator using Docker:

## Contributions

### Testing connector templates

To test that the templates generate valid code, we follow a slightly non-obvious strategy. Since the templates
themselves do not contain valid Java/Python/etc. syntax, we can't build them directly.
At the same time, due to the way Gradle works (where phase 1 is "discovering" all the projects that need to be
@@ -1,4 +1,5 @@
# TODO: Define your stream schemas

Your connector must describe the schema of each stream it can output using [JSONSchema](https://json-schema.org).

The simplest way to do this is to describe the schema of your streams using one `.json` file per stream. You can also dynamically generate the schema of your stream in code, or you can combine both approaches: start with a `.json` file and dynamically add properties to it.

@@ -6,15 +7,19 @@ The simplest way to do this is to describe the schema of your streams using one
The schema of a stream is the return value of `Stream.get_json_schema`.

## Static schemas

By default, `Stream.get_json_schema` reads a `.json` file in the `schemas/` directory whose name is equal to the value of the `Stream.name` property. In turn, `Stream.name` by default returns the name of the class in snake case. Therefore, if you have a class `class EmployeeBenefits(HttpStream)`, the default behavior will look for a file called `schemas/employee_benefits.json`. You can override any of these behaviors as you need.

Important note: any objects referenced via `$ref` should be placed in the `shared/` directory in their own `.json` files.
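To make the convention concrete, here is a minimal sketch of what `schemas/employee_benefits.json` could contain; the property names are hypothetical:

```json
{
  "$schema": "http://json-schema.org/draft-07/schema#",
  "type": "object",
  "properties": {
    "employee_id": { "type": "string" },
    "benefit_name": { "type": "string" },
    "start_date": { "type": "string", "format": "date" }
  }
}
```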
## Dynamic schemas

If you'd rather define your schema in code, override `Stream.get_json_schema` in your stream class to return a `dict` describing the schema using [JSONSchema](https://json-schema.org).

## Dynamically modifying static schemas

Override `Stream.get_json_schema` to run the default behavior, edit the returned value, then return the edited value:

```
def get_json_schema(self):
    schema = super().get_json_schema()
```
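The snippet above is truncated by the diff. A complete override, assuming a hypothetical `start_date` property you want to add on top of the static schema, might look like:

```python
def get_json_schema(self):
    # Run the default behavior: load the static schema from schemas/<stream_name>.json.
    schema = super().get_json_schema()
    # Hypothetical example edit: declare one extra property before returning.
    schema["properties"]["start_date"] = {"type": "string", "format": "date"}
    return schema
```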
@@ -6,6 +6,7 @@ This component is used by the `/connector-performance` GitHub action and is used
destination connectors on a number of datasets.

Associated files are:

- Main.java - the main entrypoint for the harness
- PerformanceTest.java - sets up the destination connector, sends records to it, and measures throughput
- run-harness-process.yaml - kubernetes file that processes dynamic arguments and runs the harness
@@ -6,22 +6,27 @@ For information about how to use this connector within Airbyte, see [the documen
## Local development

### Prerequisites

**To iterate on this connector, make sure to complete this prerequisites section.**

#### Minimum Python version required `= 3.7.0`

#### Build & Activate Virtual Environment and install dependencies

From this connector directory, create a virtual environment:

```
python -m venv .venv
```

This will generate a virtualenv for this module in `.venv/`. Make sure this venv is active in your
development environment of choice. To activate it from the terminal, run:

```
source .venv/bin/activate
pip install -r requirements.txt
```

If you are in an IDE, follow your IDE's instructions to activate the virtualenv.

Note that while we are installing dependencies from `requirements.txt`, you should only edit `setup.py` for your dependencies. `requirements.txt` is

@@ -30,6 +35,7 @@ If this is mumbo jumbo to you, don't worry about it, just put your deps in `setu
should work as you expect.

#### Create credentials

**If you are a community contributor**, follow the instructions in the [documentation](https://docs.airbyte.io/integrations/destinations/amazon-sqs)
to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `destination_amazon_sqs/spec.json` file.
Note that the `secrets` directory is gitignored by default, so there is no danger of accidentally checking in sensitive information.

@@ -39,6 +45,7 @@ See `integration_tests/sample_config.json` for a sample config file.
and place them into `secrets/config.json`.
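As a rough illustration, a `secrets/config.json` for this destination might be shaped like the sketch below. The authoritative field list lives in `destination_amazon_sqs/spec.json` and `integration_tests/sample_config.json`, so treat every key here as an assumption:

```json
{
  "queue_url": "https://sqs.eu-west-1.amazonaws.com/123456789012/example-queue",
  "region": "eu-west-1",
  "access_key": "<your-aws-access-key-id>",
  "secret_key": "<your-aws-secret-access-key>"
}
```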
### Locally running the connector

```
python main.py spec
python main.py check --config secrets/config.json
@@ -48,9 +55,10 @@ python main.py read --config secrets/config.json --catalog integration_tests/con

### Locally running the connector docker image

#### Build

**Via [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md) (recommended):**

```bash
airbyte-ci connectors --name=destination-amazon-sqs build
```
@@ -58,12 +66,15 @@ airbyte-ci connectors --name=destination-amazon-sqs build

An image will be built with the tag `airbyte/destination-amazon-sqs:dev`.

**Via `docker build`:**

```bash
docker build -t airbyte/destination-amazon-sqs:dev .
```

#### Run

Then run any of the connector commands as follows:

```
docker run --rm airbyte/destination-amazon-sqs:dev spec
docker run --rm -v $(pwd)/secrets:/secrets airbyte/destination-amazon-sqs:dev check --config /secrets/config.json
@@ -72,23 +83,30 @@ cat messages.jsonl | docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integr
```

## Testing

You can run our full test suite locally using [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md):

```bash
airbyte-ci connectors --name=destination-amazon-sqs test
```

### Customizing acceptance Tests

Customize the `acceptance-test-config.yml` file to configure tests. See [Connector Acceptance Tests](https://docs.airbyte.com/connector-development/testing-connectors/connector-acceptance-tests-reference) for more information.
If your connector requires creating or destroying resources for use during acceptance tests, create fixtures for them and place them inside `integration_tests/acceptance.py`.

## Dependency Management

All of your dependencies should go in `setup.py`, NOT `requirements.txt`. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development.
We split dependencies between two groups, dependencies that are:

- required for your connector to work go in the `MAIN_REQUIREMENTS` list.
- required for the testing go in the `TEST_REQUIREMENTS` list

### Publishing a new version of the connector

You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what?

1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=destination-amazon-sqs test`
2. Bump the connector version in `metadata.yaml`: increment the `dockerImageTag` value. Please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors).
3. Make sure the `metadata.yaml` content is up to date.
@@ -96,4 +114,3 @@ You've checked out the repo, implemented a million dollar feature, and you're re
5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention).
6. Pat yourself on the back for being an awesome contributor.
7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master.
@@ -1,20 +1,25 @@

# Amazon SQS Destination

## What

This is a connector for producing messages to an [Amazon SQS Queue](https://docs.aws.amazon.com/AWSSimpleQueueService/latest/SQSDeveloperGuide/welcome.html)

## How

### Sending messages

Amazon SQS allows messages to be sent individually or in batches. Currently, this Destination only supports sending messages individually. This can
have performance implications if sending high volumes of messages.

#### Message Body

By default, the SQS Message body is built using the AirbyteMessageRecord's 'data' property.

If the **message_body_key** config item is set, we use its value as a key within the AirbyteMessageRecord's 'data' property. This could be
improved to handle nested keys by using JSONPath syntax to look up values.

For example, given the input Record:

```
{
  "data":
@@ -28,6 +33,7 @@ For example, given the input Record:
```

With no **message_body_key** set, the output SQS Message body will be

```
{
  "parent_key": {
@@ -38,6 +44,7 @@ With no **message_body_key** set, the output SQS Message body will
```

With **message_body_key** set to `parent_key`, the output SQS Message body will be

```
{
  "nested_key": "nested_value"
@@ -45,15 +52,18 @@ With **message_body_key** set to parent_key, the output SQS Message body will
```
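The diff hunks above truncate the JSON snippets. Pieced together from the surrounding text and the visible fragments, the full example plausibly reads as follows (whitespace is illustrative). The input Record:

```json
{
  "data": {
    "parent_key": {
      "nested_key": "nested_value"
    }
  }
}
```

With no **message_body_key** set, the body is the whole 'data' object:

```json
{
  "parent_key": {
    "nested_key": "nested_value"
  }
}
```

With **message_body_key** set to `parent_key`, the body is reduced to:

```json
{
  "nested_key": "nested_value"
}
```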
#### Message attributes

The `airbyte_emitted_at` timestamp is added to every message as an Attribute by default. This could be improved to allow the user to set Attributes through the UI, or to take keys from the Record as Attributes.
#### FIFO Queues

A Queue URL that ends with '.fifo' **must** be a valid FIFO Queue. When the queue is FIFO, the _message_group_id_ property is required.

Currently, a unique uuid4 is generated as the dedupe ID for every message (see the sketch below). This could be improved to allow the user to specify a path in the Record
to use as a dedupe ID.
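A minimal sketch of that stated behavior (not the connector's actual code, just the approach described above):

```python
import uuid

# One unique deduplication ID per message, as described above.
dedupe_id = str(uuid.uuid4())
```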
### Credentials

Requires an AWS IAM Access Key ID and Secret Key.

This could be improved to add support for configured AWS profiles, env vars etc.
@@ -6,18 +6,21 @@ For information about how to use this connector within Airbyte, see [the documen
## Local development

### Prerequisites

**To iterate on this connector, make sure to complete this prerequisites section.**

#### Minimum Python version required `= 3.9.0`

### Installing the connector

From this connector directory, run:

```bash
poetry install --with dev
```

#### Create credentials

**If you are a community contributor**, follow the instructions in the [documentation](https://docs.airbyte.com/integrations/destinations/astra)
to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `destination_astra/spec.json` file.
Note that the `secrets` directory is gitignored by default, so there is no danger of accidentally checking in sensitive information.

@@ -27,6 +30,7 @@ See `integration_tests/sample_config.json` for a sample config file.
and place them into `secrets/config.json`.

### Locally running the connector

```
python main.py spec
python main.py check --config secrets/config.json
@@ -36,6 +40,7 @@ python main.py write --config secrets/config.json --catalog integration_tests/co

### Locally running the connector docker image

#### Use `airbyte-ci` to build your connector

The Airbyte way of building this connector is to use our `airbyte-ci` tool.
You can follow the install instructions [here](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md#L1).
Then running the following command will build your connector:

@@ -43,15 +48,18 @@ Then running the following command will build your connector:

```bash
airbyte-ci connectors --name destination-astra build
```

Once the command is done, you will find your connector image in your local docker registry: `airbyte/destination-astra:dev`.

##### Customizing our build process

When contributing to our connector you might need to customize the build process to add a system dependency or set an env var.
You can customize our build process by adding a `build_customization.py` module to your connector.
This module should contain a `pre_connector_install` and `post_connector_install` async function that will mutate the base image and the connector container respectively.
It will be imported at runtime by our build process and the functions will be called if they exist.

Here is an example of a `build_customization.py` module:
```python
from __future__ import annotations
@@ -71,6 +79,7 @@ async def post_connector_install(connector_container: Container) -> Container:
```
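The example above is truncated by the diff hunk. Judging from the `post_connector_install` signature visible in the hunk header and the description of the two hooks, a complete module plausibly looks like this sketch; the env var names are made-up placeholders:

```python
from __future__ import annotations

from typing import TYPE_CHECKING

if TYPE_CHECKING:
    # dagger is provided by the airbyte-ci build pipeline at runtime.
    from dagger import Container


async def pre_connector_install(base_image_container: Container) -> Container:
    # Mutate the base image before the connector is installed (placeholder env var).
    return await base_image_container.with_env_variable("MY_PRE_BUILD_ENV_VAR", "my_pre_build_env_var_value")


async def post_connector_install(connector_container: Container) -> Container:
    # Mutate the connector container after installation (placeholder env var).
    return await connector_container.with_env_variable("MY_POST_BUILD_ENV_VAR", "my_post_build_env_var_value")
```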
#### Build your own connector image

This connector is built using our dynamic build process in `airbyte-ci`.
The base image used to build it is defined within the metadata.yaml file under the `connectorBuildOptions`.
The build logic is defined using [Dagger](https://dagger.io/) [here](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/pipelines/builds/python_connectors.py).
@@ -79,6 +88,7 @@ It does not rely on a Dockerfile.

If you would like to patch our connector and build your own, a simple approach would be to:

1. Create your own Dockerfile based on the latest version of the connector image.

```Dockerfile
FROM airbyte/destination-astra:latest
@@ -89,16 +99,21 @@ RUN pip install ./airbyte/integration_code
# ENV AIRBYTE_ENTRYPOINT "python /airbyte/integration_code/main.py"
# ENTRYPOINT ["python", "/airbyte/integration_code/main.py"]
```

Please use this as an example. This is not optimized.

2. Build your image:

```bash
docker build -t airbyte/destination-astra:dev .
# Running the spec command against your patched connector
docker run airbyte/destination-astra:dev spec
```

#### Run

Then run any of the connector commands as follows:

```
docker run --rm airbyte/destination-astra:dev spec
docker run --rm -v $(pwd)/secrets:/secrets airbyte/destination-astra:dev check --config /secrets/config.json
@@ -112,7 +127,9 @@ cat messages.jsonl | docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integr

### Unit Tests

To run unit tests locally, from the connector directory run:

```
poetry run pytest -s unit_tests
```

### Integration Tests

@@ -120,7 +137,9 @@ There are two types of integration tests: Acceptance Tests (Airbyte's test suite

#### Custom Integration tests

Place custom tests inside the `integration_tests/` folder, then, from the connector root, run

```
poetry run pytest -s integration_tests
```

#### Acceptance Tests

Coming soon:

@@ -141,3 +160,4 @@ You've checked out the repo, implemented a million dollar feature, and you're re
1. Create a Pull Request.
1. Pat yourself on the back for being an awesome contributor.
1. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master.
@@ -55,9 +55,10 @@ python main.py read --config secrets/config.json --catalog integration_tests/con

### Locally running the connector docker image

#### Build

**Via [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md) (recommended):**

```bash
airbyte-ci connectors --name=destination-aws-datalake build
```
@@ -65,6 +66,7 @@ airbyte-ci connectors --name=destination-aws-datalake build

An image will be built with the tag `airbyte/destination-aws-datalake:dev`.

**Via `docker build`:**

```bash
docker build -t airbyte/destination-aws-datalake:dev .
```
@@ -80,14 +82,16 @@ docker run --rm -v $(pwd)/secrets:/secrets airbyte/destination-aws-datalake:dev
cat messages.jsonl | docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integration_tests airbyte/destination-aws-datalake:dev write --config /secrets/config.json --catalog /integration_tests/configured_catalog.json
```

## Testing

You can run our full test suite locally using [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md):

```bash
airbyte-ci connectors --name=destination-aws-datalake test
```

### Customizing acceptance Tests

Customize the `acceptance-test-config.yml` file to configure tests. See [Connector Acceptance Tests](https://docs.airbyte.com/connector-development/testing-connectors/connector-acceptance-tests-reference) for more information.
If your connector requires creating or destroying resources for use during acceptance tests, create fixtures for them and place them inside `integration_tests/acceptance.py`.

@@ -97,11 +101,13 @@ All of your dependencies should go in `setup.py`, NOT `requirements.txt`. The re

We split dependencies between two groups, dependencies that are:

- required for your connector to work go in the `MAIN_REQUIREMENTS` list.
- required for the testing go in the `TEST_REQUIREMENTS` list

### Publishing a new version of the connector

You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what?

1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=destination-aws-datalake test`
2. Bump the connector version in `metadata.yaml`: increment the `dockerImageTag` value. Please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors).
3. Make sure the `metadata.yaml` content is up to date.
@@ -109,4 +115,3 @@ You've checked out the repo, implemented a million dollar feature, and you're re
5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention).
6. Pat yourself on the back for being an awesome contributor.
7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master.
@@ -13,21 +13,24 @@ As a community contributor, you will need access to Azure to run the integration

- Feel free to modify the config files with different settings in the acceptance test file (e.g. `AzureBlobStorageJsonlDestinationAcceptanceTest.java`, method `getFormatConfig`), as long as they follow the schema defined in [spec.json](src/main/resources/spec.json).

## Airbyte Employee

- Access the `Azure Blob Storage Account` secrets on Last Pass.
- Replace the `config.json` under `sample_secrets`.
- Rename the directory from `sample_secrets` to `secrets`.

### Infra setup

1. Log in to the [Azure portal](https://portal.azure.com/#home) using the `integration-test@airbyte.io` account
1. Go to [Storage Accounts](https://portal.azure.com/#view/HubsExtension/BrowseResource/resourceType/Microsoft.Storage%2FStorageAccounts)
1. Create a new storage account with a reasonable name (currently `airbyteteststorage`), under the `integration-test-rg` resource group.
1. In the `Redundancy` setting, choose `Locally-redundant storage (LRS)`.
1. Hit `Review` (you can leave all the other settings as the default) and then `Create`.
1. Navigate into that storage account -> `Containers`. Make a new container with a reasonable name (currently `airbytetescontainername`).
1. Then go back up to the storage account -> `Access keys`. This is the `azure_blob_storage_account_key` config field.
1. There are two keys; use the first one. We don't need 100% uptime on our integration tests, so there's no need to alternate between the two keys.

## Add New Output Format

- Add a new enum in `AzureBlobStorageFormat`.
- Modify `spec.json` to specify the configuration of this new format.
- Update `AzureBlobStorageFormatConfigs` to be able to construct a config for this new format.
@@ -1,12 +1,15 @@

## Local development

#### Building via Gradle

From the Airbyte repository root, run:

```
./gradlew :airbyte-integrations:connectors:destination-bigquery:build
```

#### Create credentials

**If you are a community contributor**, generate the necessary credentials and place them in `secrets/config.json` conforming to the spec file in `src/main/resources/spec.json`.
Note that the `secrets` directory is git-ignored by default, so there is no danger of accidentally checking in sensitive information.

@@ -15,16 +18,20 @@ Note that the `secrets` directory is git-ignored by default, so there is no dang

### Locally running the connector docker image

#### Build

Build the connector image via Gradle:

```
./gradlew :airbyte-integrations:connectors:destination-bigquery:buildConnectorImage
```

Once built, the docker image name and tag on your host will be `airbyte/destination-bigquery:dev`.

#### Run

Then run any of the connector commands as follows:

```
docker run --rm airbyte/destination-bigquery:dev spec
docker run --rm -v $(pwd)/secrets:/secrets airbyte/destination-bigquery:dev check --config /secrets/config.json
@@ -33,22 +40,29 @@ docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integrat
```

## Testing

We use `JUnit` for Java tests.

### Unit and Integration Tests

Place unit tests under `src/test/io/airbyte/integrations/destinations/bigquery`.

#### Acceptance Tests

Airbyte has a standard test suite that all destination connectors must pass. Implement the `TODO`s in
`src/test-integration/java/io/airbyte/integrations/destinations/BigQueryDestinationAcceptanceTest.java`.

### Using gradle to run tests

All commands should be run from the Airbyte project root.
To run unit tests:

```
./gradlew :airbyte-integrations:connectors:destination-bigquery:unitTest
```

To run acceptance and custom integration tests:

```
./gradlew :airbyte-integrations:connectors:destination-bigquery:integrationTest
```
@@ -56,7 +70,9 @@ To run acceptance and custom integration tests:

## Dependency Management

### Publishing a new version of the connector

You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what?

1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=destination-bigquery test`
2. Bump the connector version in `metadata.yaml`: increment the `dockerImageTag` value. Please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors).
3. Make sure the `metadata.yaml` content is up to date.
@@ -64,4 +80,3 @@ You've checked out the repo, implemented a million dollar feature, and you're re
5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention).
6. Pat yourself on the back for being an awesome contributor.
7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master.
@@ -6,17 +6,21 @@ For information about how to use this connector within Airbyte, see [the documen
## Local development

### Prerequisites

**To iterate on this connector, make sure to complete this prerequisites section.**

#### Minimum Python version required `= 3.7.0`

### Installing the connector

From this connector directory, run:

```bash
poetry install --with dev
```

#### Create credentials

**If you are a community contributor**, follow the instructions in the [documentation](https://docs.airbyte.com/integrations/destinations/chroma)
to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `destination_chroma/spec.json` file.
Note that the `secrets` directory is gitignored by default, so there is no danger of accidentally checking in sensitive information.

@@ -26,6 +30,7 @@ See `integration_tests/sample_config.json` for a sample config file.
and place them into `secrets/config.json`.

### Locally running the connector

```
python main.py spec
python main.py check --config secrets/config.json
@@ -34,9 +39,10 @@ python main.py write --config secrets/config.json --catalog integration_tests/co

### Locally running the connector docker image

#### Build

**Via [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md) (recommended):**

```bash
airbyte-ci connectors --name=destination-chroma build
```
@@ -44,12 +50,15 @@ airbyte-ci connectors --name=destination-chroma build

An image will be built with the tag `airbyte/destination-chroma:dev`.

**Via `docker build`:**

```bash
docker build -t airbyte/destination-chroma:dev .
```

#### Run

Then run any of the connector commands as follows:

```
docker run --rm airbyte/destination-chroma:dev spec
docker run --rm -v $(pwd)/secrets:/secrets airbyte/destination-chroma:dev check --config /secrets/config.json
@@ -58,35 +67,46 @@ cat messages.jsonl | docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integr
```

## Testing

You can run our full test suite locally using [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md):

```bash
airbyte-ci connectors --name=destination-chroma test
```

### Unit Tests

To run unit tests locally, from the connector directory run:

```
poetry run pytest -s unit_tests
```

### Integration Tests

To run integration tests locally, make sure you have a `secrets/config.json` as explained above, and then run:

```
poetry run pytest -s integration_tests
```

### Customizing acceptance Tests

Customize the `acceptance-test-config.yml` file to configure tests. See [Connector Acceptance Tests](https://docs.airbyte.com/connector-development/testing-connectors/connector-acceptance-tests-reference) for more information.
If your connector requires creating or destroying resources for use during acceptance tests, create fixtures for them and place them inside `integration_tests/acceptance.py`.

## Dependency Management

All of your dependencies should go in `setup.py`, NOT `requirements.txt`. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development.
We split dependencies between two groups, dependencies that are:

- required for your connector to work go in the `MAIN_REQUIREMENTS` list.
- required for the testing go in the `TEST_REQUIREMENTS` list

### Publishing a new version of the connector

You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what?

1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=destination-chroma test`
2. Bump the connector version in `metadata.yaml`: increment the `dockerImageTag` value. Please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors).
3. Make sure the `metadata.yaml` content is up to date.
@@ -94,4 +114,3 @@ You've checked out the repo, implemented a million dollar feature, and you're re
5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention).
6. Pat yourself on the back for being an awesome contributor.
7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master.
@@ -6,12 +6,15 @@ For information about how to use this connector within Airbyte, see [the User Do

## Local development

#### Building via Gradle

From the Airbyte repository root, run:

```
./gradlew :airbyte-integrations:connectors:destination-clickhouse:build
```

#### Create credentials

**If you are a community contributor**, generate the necessary credentials and place them in `secrets/config.json` conforming to the spec file in `src/main/resources/spec.json`.
Note that the `secrets` directory is git-ignored by default, so there is no danger of accidentally checking in sensitive information.

@@ -20,16 +23,20 @@ Note that the `secrets` directory is git-ignored by default, so there is no dang

### Locally running the connector docker image

#### Build

Build the connector image via Gradle:

```
./gradlew :airbyte-integrations:connectors:destination-clickhouse:buildConnectorImage
```

Once built, the docker image name and tag on your host will be `airbyte/destination-clickhouse:dev`.

#### Run

Then run any of the connector commands as follows:

```
docker run --rm airbyte/destination-clickhouse:dev spec
docker run --rm -v $(pwd)/secrets:/secrets airbyte/destination-clickhouse:dev check --config /secrets/config.json
@@ -38,22 +45,29 @@ docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integrat
```

## Testing

We use `JUnit` for Java tests.

### Unit and Integration Tests

Place unit tests under `src/test/io/airbyte/integrations/destinations/clickhouse`.

#### Acceptance Tests

Airbyte has a standard test suite that all destination connectors must pass. Implement the `TODO`s in
`src/test-integration/java/io/airbyte/integrations/destinations/clickhouseDestinationAcceptanceTest.java`.

### Using gradle to run tests

All commands should be run from the Airbyte project root.
To run unit tests:

```
./gradlew :airbyte-integrations:connectors:destination-clickhouse:unitTest
```

To run acceptance and custom integration tests:

```
./gradlew :airbyte-integrations:connectors:destination-clickhouse:integrationTest
```
@@ -61,7 +75,9 @@ To run acceptance and custom integration tests:

## Dependency Management

### Publishing a new version of the connector

You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what?

1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=destination-clickhouse test`
2. Bump the connector version in `metadata.yaml`: increment the `dockerImageTag` value. Please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors).
3. Make sure the `metadata.yaml` content is up to date.
@@ -69,4 +85,3 @@ You've checked out the repo, implemented a million dollar feature, and you're re
5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention).
6. Pat yourself on the back for being an awesome contributor.
7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master.
@@ -15,4 +15,3 @@ This destination connector uses ClickHouse official JDBC driver, which uses HTTP

## API Reference

The ClickHouse reference documents: [https://clickhouse.com/docs/en/](https://clickhouse.com/docs/en/)
@@ -54,9 +54,10 @@ python main.py write --config secrets/config.json --catalog integration_tests/co

### Locally running the connector docker image

#### Build

**Via [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md) (recommended):**

```bash
airbyte-ci connectors --name=destination-convex build
```
@@ -64,6 +65,7 @@ airbyte-ci connectors --name=destination-convex build

An image will be built with the tag `airbyte/destination-convex:dev`.

**Via `docker build`:**

```bash
docker build -t airbyte/destination-convex:dev .
```
@@ -79,14 +81,16 @@ docker run --rm -v $(pwd)/secrets:/secrets airbyte/destination-convex:dev check
cat messages.jsonl | docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integration_tests airbyte/destination-convex:dev write --config /secrets/config.json --catalog /integration_tests/configured_catalog.json
```

## Testing

You can run our full test suite locally using [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md):

```bash
airbyte-ci connectors --name=destination-convex test
```

### Customizing acceptance Tests

Customize the `acceptance-test-config.yml` file to configure tests. See [Connector Acceptance Tests](https://docs.airbyte.com/connector-development/testing-connectors/connector-acceptance-tests-reference) for more information.
If your connector requires creating or destroying resources for use during acceptance tests, create fixtures for them and place them inside `integration_tests/acceptance.py`.

@@ -99,7 +103,9 @@ We split dependencies between two groups, dependencies that are:

- required for the testing go in the `TEST_REQUIREMENTS` list

### Publishing a new version of the connector

You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what?

1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=destination-convex test`
2. Bump the connector version in `metadata.yaml`: increment the `dockerImageTag` value. Please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors).
3. Make sure the `metadata.yaml` content is up to date.
@@ -107,4 +113,3 @@ You've checked out the repo, implemented a million dollar feature, and you're re
5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention).
6. Pat yourself on the back for being an awesome contributor.
7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master.
@@ -6,22 +6,27 @@ For information about how to use this connector within Airbyte, see [the documen
|
||||
## Local development
|
||||
|
||||
### Prerequisites
|
||||
|
||||
**To iterate on this connector, make sure to complete this prerequisites section.**
|
||||
|
||||
#### Minimum Python version required `= 3.7.0`
|
||||
|
||||
#### Build & Activate Virtual Environment and install dependencies

From this connector directory, create a virtual environment:

```
python -m venv .venv
```

This will generate a virtualenv for this module in `.venv/`. Make sure this venv is active in your
development environment of choice. To activate it from the terminal, run:

```
source .venv/bin/activate
pip install -r requirements.txt
```

If you are in an IDE, follow your IDE's instructions to activate the virtualenv.

Note that while we are installing dependencies from `requirements.txt`, you should only edit `setup.py` for your dependencies. `requirements.txt` is
@@ -30,6 +35,7 @@ If this is mumbo jumbo to you, don't worry about it, just put your deps in `setu
should work as you expect.
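
In practice, a connector's `requirements.txt` tends to be little more than an editable install that defers to `setup.py`; a sketch, assuming no extra monorepo-local packages are needed:

```
# requirements.txt — illustrative; the real entries depend on the connector
-e .
```
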
#### Create credentials

**If you are a community contributor**, follow the instructions in the [documentation](https://docs.airbyte.com/integrations/destinations/cumulio)
to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `destination_cumulio/spec.json` file.
Note that the `secrets` directory is gitignored by default, so there is no danger of accidentally checking in sensitive information.
@@ -39,6 +45,7 @@ See `integration_tests/sample_config.json` for a sample config file.
and place them into `secrets/config.json`.

### Locally running the connector

```
python main.py spec
python main.py check --config secrets/config.json
@@ -47,9 +54,10 @@ python main.py write --config secrets/config.json --catalog integration_tests/co

### Locally running the connector docker image

#### Build

**Via [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md) (recommended):**

```bash
airbyte-ci connectors --name=destination-cumulio build
```
@@ -57,12 +65,15 @@ airbyte-ci connectors --name=destination-cumulio build
An image will be built with the tag `airbyte/destination-cumulio:dev`.

**Via `docker build`:**

```bash
docker build -t airbyte/destination-cumulio:dev .
```

#### Run

Then run any of the connector commands as follows:

```
docker run --rm airbyte/destination-cumulio:dev spec
docker run --rm -v $(pwd)/secrets:/secrets airbyte/destination-cumulio:dev check --config /secrets/config.json
@@ -71,23 +82,30 @@ cat messages.jsonl | docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integr
```
## Testing

You can run our full test suite locally using [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md):

```bash
airbyte-ci connectors --name=destination-cumulio test
```

### Customizing Acceptance Tests

Customize the `acceptance-test-config.yml` file to configure tests. See [Connector Acceptance Tests](https://docs.airbyte.com/connector-development/testing-connectors/connector-acceptance-tests-reference) for more information.
If your connector requires creating or destroying resources for use during acceptance tests, create fixtures for them and place them inside `integration_tests/acceptance.py`.
## Dependency Management

All of your dependencies should go in `setup.py`, NOT `requirements.txt`. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development.
We split dependencies between two groups, dependencies that are:

- required for your connector to work, which go in the `MAIN_REQUIREMENTS` list.
- required for testing, which go in the `TEST_REQUIREMENTS` list.

### Publishing a new version of the connector

You've checked out the repo, implemented a million-dollar feature, and you're ready to share your changes with the world. Now what?

1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=destination-cumulio test`
2. Bump the connector version in `metadata.yaml`: increment the `dockerImageTag` value. Please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors).
3. Make sure the `metadata.yaml` content is up to date.
@@ -95,4 +113,3 @@ You've checked out the repo, implemented a million dollar feature, and you're re
5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention).
6. Pat yourself on the back for being an awesome contributor.
7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master.

@@ -6,22 +6,27 @@ For information about how to use this connector within Airbyte, see [the documen
## Local development

### Prerequisites

**To iterate on this connector, make sure to complete this prerequisites section.**

#### Minimum Python version required `= 3.7.0`

#### Build & Activate Virtual Environment and install dependencies

From this connector directory, create a virtual environment:

```
python -m venv .venv
```

This will generate a virtualenv for this module in `.venv/`. Make sure this venv is active in your
development environment of choice. To activate it from the terminal, run:

```
source .venv/bin/activate
pip install -r requirements.txt
```

If you are in an IDE, follow your IDE's instructions to activate the virtualenv.

Note that while we are installing dependencies from `requirements.txt`, you should only edit `setup.py` for your dependencies. `requirements.txt` is
@@ -30,6 +35,7 @@ If this is mumbo jumbo to you, don't worry about it, just put your deps in `setu
should work as you expect.

#### Create credentials

**If you are a community contributor**, follow the instructions in the [documentation](https://docs.airbyte.io/integrations/destinations/databend)
to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `destination_databend/spec.json` file.
Note that the `secrets` directory is gitignored by default, so there is no danger of accidentally checking in sensitive information.
@@ -39,6 +45,7 @@ See `integration_tests/sample_config.json` for a sample config file.
and place them into `secrets/config.json`.

### Locally running the connector

```
python main.py spec
python main.py check --config secrets/config.json
@@ -48,9 +55,10 @@ python main.py read --config secrets/config.json --catalog integration_tests/con

### Locally running the connector docker image

#### Build

**Via [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md) (recommended):**

```bash
airbyte-ci connectors --name=destination-databend build
```
@@ -58,12 +66,15 @@ airbyte-ci connectors --name=destination-databend build
An image will be built with the tag `airbyte/destination-databend:dev`.

**Via `docker build`:**

```bash
docker build -t airbyte/destination-databend:dev .
```

#### Run

Then run any of the connector commands as follows:

```
docker run --rm airbyte/destination-databend:dev spec
docker run --rm -v $(pwd)/secrets:/secrets airbyte/destination-databend:dev check --config /secrets/config.json
@@ -72,23 +83,30 @@ cat messages.jsonl | docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integr
```
## Testing

You can run our full test suite locally using [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md):

```bash
airbyte-ci connectors --name=destination-databend test
```

### Customizing Acceptance Tests

Customize the `acceptance-test-config.yml` file to configure tests. See [Connector Acceptance Tests](https://docs.airbyte.com/connector-development/testing-connectors/connector-acceptance-tests-reference) for more information.
If your connector requires creating or destroying resources for use during acceptance tests, create fixtures for them and place them inside `integration_tests/acceptance.py`.

## Dependency Management

All of your dependencies should go in `setup.py`, NOT `requirements.txt`. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development.
We split dependencies between two groups, dependencies that are:

- required for your connector to work, which go in the `MAIN_REQUIREMENTS` list.
- required for testing, which go in the `TEST_REQUIREMENTS` list.

### Publishing a new version of the connector

You've checked out the repo, implemented a million-dollar feature, and you're ready to share your changes with the world. Now what?

1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=destination-databend test`
2. Bump the connector version in `metadata.yaml`: increment the `dockerImageTag` value. Please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors).
3. Make sure the `metadata.yaml` content is up to date.
@@ -96,4 +114,3 @@ You've checked out the repo, implemented a million dollar feature, and you're re
5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention).
6. Pat yourself on the back for being an awesome contributor.
7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master.

@@ -4,17 +4,21 @@ This is the repository for the Databricks destination connector in Java.
For information about how to use this connector within Airbyte, see [the User Documentation](https://docs.airbyte.io/integrations/destinations/databricks).

## Databricks JDBC Driver

This connector requires a JDBC driver to connect to a Databricks cluster. Before using this connector, you must agree to the [JDBC ODBC driver license](https://databricks.com/jdbc-odbc-driver-license). This means that you can only use this driver to connect third-party applications to Apache Spark SQL within a Databricks offering using the ODBC and/or JDBC protocols.

## Local development

#### Building via Gradle

From the Airbyte repository root, run:

```
./gradlew :airbyte-integrations:connectors:destination-databricks:build
```

#### Create credentials

**If you are a community contributor**, you will need access to AWS S3, Azure blob storage, and a Databricks cluster to run the integration tests:

- Create a Databricks cluster. See [documentation](https://docs.databricks.com/clusters/create.html).
@@ -34,16 +38,20 @@ From the Airbyte repository root, run:
### Locally running the connector docker image

#### Build

Build the connector image via Gradle:

```
./gradlew :airbyte-integrations:connectors:destination-databricks:buildConnectorImage
```

Once built, the docker image name and tag on your host will be `airbyte/destination-databricks:dev`.

#### Run

Then run any of the connector commands as follows:

```
docker run --rm airbyte/destination-databricks:dev spec
docker run --rm -v $(pwd)/secrets:/secrets airbyte/destination-databricks:dev check --config /secrets/config.json
@@ -52,22 +60,29 @@ docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integrat
```
## Testing

We use `JUnit` for Java tests.

### Unit and Integration Tests

Place unit tests under `src/test/io/airbyte/integrations/destinations/databricks`.

#### Acceptance Tests

Airbyte has a standard test suite that all destination connectors must pass. Implement the `TODO`s in
`src/test-integration/java/io/airbyte/integrations/destinations/databricksDestinationAcceptanceTest.java`.

### Using Gradle to run tests

All commands should be run from the Airbyte project root.
To run unit tests:

```
./gradlew :airbyte-integrations:connectors:destination-databricks:unitTest
```

To run acceptance and custom integration tests:

```
./gradlew :airbyte-integrations:connectors:destination-databricks:integrationTest
```
@@ -75,7 +90,9 @@ To run acceptance and custom integration tests:
## Dependency Management

### Publishing a new version of the connector

You've checked out the repo, implemented a million-dollar feature, and you're ready to share your changes with the world. Now what?

1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=destination-databricks test`
2. Bump the connector version in `metadata.yaml`: increment the `dockerImageTag` value. Please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors).
3. Make sure the `metadata.yaml` content is up to date.
@@ -83,4 +100,3 @@ You've checked out the repo, implemented a million dollar feature, and you're re
5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention).
6. Pat yourself on the back for being an awesome contributor.
7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master.

@@ -5,7 +5,9 @@ This destination is a "safe" version of the [E2E Test destination](https://docs.
## Local development

#### Building via Gradle

From the Airbyte repository root, run:

```
./gradlew :airbyte-integrations:connectors:destination-dev-null:build
```
@@ -13,16 +15,20 @@ From the Airbyte repository root, run:
### Locally running the connector docker image

#### Build

Build the connector image via Gradle:

```
./gradlew :airbyte-integrations:connectors:destination-dev-null:buildConnectorImage
```

Once built, the docker image name and tag on your host will be `airbyte/destination-dev-null:dev`.

#### Run

Then run any of the connector commands as follows:

```
docker run --rm airbyte/destination-dev-null:dev spec
docker run --rm -v $(pwd)/secrets:/secrets airbyte/destination-dev-null:dev check --config /secrets/config.json
@@ -31,12 +37,16 @@ docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integrat
```
### Using Gradle to run tests

All commands should be run from the Airbyte project root.
To run unit tests:

```
./gradlew :airbyte-integrations:connectors:destination-dev-null:unitTest
```

To run acceptance and custom integration tests:

```
./gradlew :airbyte-integrations:connectors:destination-dev-null:integrationTest
```
@@ -44,7 +54,9 @@ To run acceptance and custom integration tests:
## Dependency Management

### Publishing a new version of the connector

You've checked out the repo, implemented a million-dollar feature, and you're ready to share your changes with the world. Now what?

1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=destination-dev-null test`
2. Bump the connector version in `metadata.yaml`: increment the `dockerImageTag` value. Please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors).
3. Make sure the `metadata.yaml` content is up to date.
@@ -52,4 +64,3 @@ You've checked out the repo, implemented a million dollar feature, and you're re
5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention).
6. Pat yourself on the back for being an awesome contributor.
7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master.

@@ -6,23 +6,28 @@ For information about how to use this connector within Airbyte, see [the documen
|
||||
## Local development
|
||||
|
||||
### Prerequisites
|
||||
|
||||
**To iterate on this connector, make sure to complete this prerequisites section.**
|
||||
|
||||
#### Minimum Python version required `= 3.7.0`
|
||||
|
||||
#### Build & Activate Virtual Environment and install dependencies
|
||||
|
||||
From this connector directory, create a virtual environment:
|
||||
|
||||
```
|
||||
python -m venv .venv
|
||||
```
|
||||
|
||||
This will generate a virtualenv for this module in `.venv/`. Make sure this venv is active in your
|
||||
development environment of choice. To activate it from the terminal, run:
|
||||
|
||||
```
|
||||
source .venv/bin/activate
|
||||
python -m pip install --upgrade pip
|
||||
pip install -r requirements.txt
|
||||
```
|
||||
|
||||
If you are in an IDE, follow your IDE's instructions to activate the virtualenv.
|
||||
|
||||
Note that while we are installing dependencies from `requirements.txt`, you should only edit `setup.py` for your dependencies. `requirements.txt` is
|
||||
@@ -31,6 +36,7 @@ If this is mumbo jumbo to you, don't worry about it, just put your deps in `setu
|
||||
should work as you expect.
|
||||
|
||||
#### Create credentials

**If you are a community contributor**, follow the instructions in the [documentation](https://docs.airbyte.io/integrations/destinations/duckdb)
to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `destination_duckdb/spec.json` file.
Note that the `secrets` directory is gitignored by default, so there is no danger of accidentally checking in sensitive information.
@@ -40,6 +46,7 @@ See `integration_tests/sample_config.json` for a sample config file.
and place them into `secrets/config.json`.

### Locally running the connector

```
python main.py spec
python main.py check --config integration_tests/config.json
@@ -47,26 +54,28 @@ python main.py discover --config integration_tests/config.json
cat integration_tests/messages.jsonl | python main.py write --config integration_tests/config.json --catalog integration_tests/configured_catalog.json
```
### Locally running the connector docker image

#### Build

**Via [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md) (recommended):**

```bash
airbyte-ci connectors --name=destination-duckdb build [--architecture=...]
```
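
The optional `--architecture` flag selects the target platform of the image; for example (the platform value is illustrative):

```bash
airbyte-ci connectors --name=destination-duckdb build --architecture=linux/amd64
```
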
An image will be built with the tag `airbyte/destination-duckdb:dev`.

**Via `docker build`:**

```bash
docker build -t airbyte/destination-duckdb:dev .
```

#### Run

Then run any of the connector commands as follows:

```
docker run --rm airbyte/destination-duckdb:dev spec
docker run --rm -v $(pwd)/secrets:/secrets airbyte/destination-duckdb:dev check --config /secrets/config.json
@@ -74,25 +83,31 @@ docker run --rm -v $(pwd)/secrets:/secrets airbyte/destination-duckdb:dev check
cat integration_tests/messages.jsonl | docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integration_tests airbyte/destination-duckdb:dev write --config /secrets/config.json --catalog /integration_tests/configured_catalog.json
```
## Testing

You can run our full test suite locally using [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md):

```bash
airbyte-ci connectors --name=destination-duckdb test
```

### Customizing Acceptance Tests

Customize the `acceptance-test-config.yml` file to configure tests. See [Connector Acceptance Tests](https://docs.airbyte.com/connector-development/testing-connectors/connector-acceptance-tests-reference) for more information.
If your connector requires creating or destroying resources for use during acceptance tests, create fixtures for them and place them inside `integration_tests/acceptance.py`.

## Dependency Management

All of your dependencies should go in `setup.py`, NOT `requirements.txt`. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development.
We split dependencies between two groups, dependencies that are:

- required for your connector to work, which go in the `MAIN_REQUIREMENTS` list.
- required for testing, which go in the `TEST_REQUIREMENTS` list.

### Publishing a new version of the connector

You've checked out the repo, implemented a million-dollar feature, and you're ready to share your changes with the world. Now what?

1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=destination-duckdb test`
2. Bump the connector version in `metadata.yaml`: increment the `dockerImageTag` value. Please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors).
3. Make sure the `metadata.yaml` content is up to date.
@@ -100,4 +115,3 @@ You've checked out the repo, implemented a million dollar feature, and you're re
5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention).
6. Pat yourself on the back for being an awesome contributor.
7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master.

@@ -6,12 +6,15 @@ For information about how to use this connector within Airbyte, see [the User Do
## Local development

#### Building via Gradle

From the Airbyte repository root, run:

```
./gradlew :airbyte-integrations:connectors:destination-dynamodb:build
```

#### Create credentials

**If you are a community contributor**, generate the necessary credentials and place them in `secrets/config.json` conforming to the spec file in `src/main/resources/spec.json`.
Note that the `secrets` directory is git-ignored by default, so there is no danger of accidentally checking in sensitive information.
@@ -20,16 +23,20 @@ Note that the `secrets` directory is git-ignored by default, so there is no dang
### Locally running the connector docker image

#### Build

Build the connector image via Gradle:

```
./gradlew :airbyte-integrations:connectors:destination-dynamodb:buildConnectorImage
```

Once built, the docker image name and tag on your host will be `airbyte/destination-dynamodb:dev`.

#### Run

Then run any of the connector commands as follows:

```
docker run --rm airbyte/destination-dynamodb:dev spec
docker run --rm -v $(pwd)/secrets:/secrets airbyte/destination-dynamodb:dev check --config /secrets/config.json
@@ -38,22 +45,29 @@ docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integrat
```

## Testing

We use `JUnit` for Java tests.

### Unit and Integration Tests

Place unit tests under `src/test/io/airbyte/integrations/destinations/dynamodb`.
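
For orientation, a self-contained JUnit 5 unit test has this shape (the class name and assertion are placeholders, not actual connector code):

```java
// Sketch only: replace the trivial assertion with real checks against the destination code.
import static org.junit.jupiter.api.Assertions.assertEquals;

import org.junit.jupiter.api.Test;

class DynamodbDestinationTest {

  @Test
  void exampleUnitTest() {
    assertEquals(4, 2 + 2);
  }
}
```
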
#### Acceptance Tests

Airbyte has a standard test suite that all destination connectors must pass. Implement the `TODO`s in
`src/test-integration/java/io/airbyte/integrations/destinations/dynamodbDestinationAcceptanceTest.java`.

### Using Gradle to run tests

All commands should be run from the Airbyte project root.
To run unit tests:

```
./gradlew :airbyte-integrations:connectors:destination-dynamodb:unitTest
```

To run acceptance and custom integration tests:

```
./gradlew :airbyte-integrations:connectors:destination-dynamodb:integrationTest
```
@@ -61,7 +75,9 @@ To run acceptance and custom integration tests:
## Dependency Management

### Publishing a new version of the connector

You've checked out the repo, implemented a million-dollar feature, and you're ready to share your changes with the world. Now what?

1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=destination-dynamodb test`
2. Bump the connector version in `metadata.yaml`: increment the `dockerImageTag` value. Please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors).
3. Make sure the `metadata.yaml` content is up to date.
@@ -69,4 +85,3 @@ You've checked out the repo, implemented a million dollar feature, and you're re
5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention).
6. Pat yourself on the back for being an awesome contributor.
7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master.

@@ -5,27 +5,34 @@ This is the repository for the Null destination connector in Java. For informati
## Local development

#### Building via Gradle

From the Airbyte repository root, run:

```
./gradlew :airbyte-integrations:connectors:destination-e2e-test:build
```

#### Create credentials

No credential is needed for this connector.

### Locally running the connector docker image

#### Build

Build the connector image via Gradle:

```
./gradlew :airbyte-integrations:connectors:destination-e2e-test:buildConnectorImage
```

Once built, the docker image name and tag on your host will be `airbyte/destination-e2e-test:dev`.
#### Run

Then run any of the connector commands as follows:

```
docker run --rm airbyte/destination-e2e-test:dev spec
docker run --rm -v $(pwd)/secrets:/secrets airbyte/destination-e2e-test:dev check --config /secrets/config.json
@@ -34,25 +41,33 @@ docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integrat
```

#### Cloud variant

The cloud variant of this connector is the Dev Null Destination. It only allows the "silent" mode. If this mode changes, please make sure that the Dev Null Destination is updated and published accordingly as well.

## Testing

We use `JUnit` for Java tests.

### Unit and Integration Tests

Place unit tests under `src/test/io/airbyte/integrations/destinations/e2e-test`.

#### Acceptance Tests

Airbyte has a standard test suite that all destination connectors must pass. See example(s) in
`src/test-integration/java/io/airbyte/integrations/destinations/e2e-test/`.

### Using Gradle to run tests

All commands should be run from the Airbyte project root.
To run unit tests:

```
./gradlew :airbyte-integrations:connectors:destination-e2e-test:unitTest
```

To run acceptance and custom integration tests:

```
./gradlew :airbyte-integrations:connectors:destination-e2e-test:integrationTest
```
@@ -60,7 +75,9 @@ To run acceptance and custom integration tests:
## Dependency Management

### Publishing a new version of the connector

You've checked out the repo, implemented a million-dollar feature, and you're ready to share your changes with the world. Now what?

1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=destination-e2e-test test`
2. Bump the connector version in `metadata.yaml`: increment the `dockerImageTag` value. Please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors).
3. Make sure the `metadata.yaml` content is up to date.
@@ -68,4 +85,3 @@ You've checked out the repo, implemented a million dollar feature, and you're re
5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention).
6. Pat yourself on the back for being an awesome contributor.
7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master.

@@ -6,12 +6,15 @@ For information about how to use this connector within Airbyte, see [the User Do
## Local development

#### Building via Gradle

From the Airbyte repository root, run:

```
./gradlew :airbyte-integrations:connectors:destination-elasticsearch:build
```

#### Create credentials

**If you are a community contributor**, generate the necessary credentials and place them in `secrets/config.json` conforming to the spec file in `src/main/resources/spec.json`.
Note that the `secrets` directory is git-ignored by default, so there is no danger of accidentally checking in sensitive information.
@@ -20,16 +23,20 @@ Note that the `secrets` directory is git-ignored by default, so there is no dang
### Locally running the connector docker image

#### Build

Build the connector image via Gradle:

```
./gradlew :airbyte-integrations:connectors:destination-elasticsearch:buildConnectorImage
```

Once built, the docker image name and tag on your host will be `airbyte/destination-elasticsearch:dev`.

#### Run

Then run any of the connector commands as follows:

```
docker run --rm airbyte/destination-elasticsearch:dev spec
docker run --rm -v $(pwd)/secrets:/secrets airbyte/destination-elasticsearch:dev check --config /secrets/config.json
@@ -38,22 +45,29 @@ docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integrat
```
## Testing

We use `JUnit` for Java tests.

### Unit and Integration Tests

Place unit tests under `src/test/io/airbyte/integrations/destinations/elasticsearch`.

#### Acceptance Tests

Airbyte has a standard test suite that all destination connectors must pass. Implement the `TODO`s in
`src/test-integration/java/io/airbyte/integrations/destinations/elasticsearchDestinationAcceptanceTest.java`.

### Using Gradle to run tests

All commands should be run from the Airbyte project root.
To run unit tests:

```
./gradlew :airbyte-integrations:connectors:destination-elasticsearch:unitTest
```

To run acceptance and custom integration tests:

```
./gradlew :airbyte-integrations:connectors:destination-elasticsearch:integrationTest
```
@@ -61,7 +75,9 @@ To run acceptance and custom integration tests:
## Dependency Management

### Publishing a new version of the connector

You've checked out the repo, implemented a million-dollar feature, and you're ready to share your changes with the world. Now what?

1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=destination-elasticsearch test`
2. Bump the connector version in `metadata.yaml`: increment the `dockerImageTag` value. Please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors).
3. Make sure the `metadata.yaml` content is up to date.
@@ -69,4 +85,3 @@ You've checked out the repo, implemented a million dollar feature, and you're re
5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention).
6. Pat yourself on the back for being an awesome contributor.
7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master.

@@ -28,6 +28,7 @@ You can create an index ahead of time for field type customization.
Basic authentication and API key authentication are supported.
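
As a rough illustration, a basic-auth config might be shaped like this — treat the field names as placeholders and check the connector's `spec.json` for the authoritative schema:

```
{
  "endpoint": "https://my-es-host:9200",
  "authenticationMethod": {
    "method": "basic",
    "username": "<user>",
    "password": "<password>"
  }
}
```
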
## Development

See the Elasticsearch client tests for examples on how to use the library.

[https://github.com/elastic/elasticsearch-java/blob/main/java-client/src/test/java/co/elastic/clients/elasticsearch/end_to_end/RequestTest.java](https://github.com/elastic/elasticsearch-java/blob/main/java-client/src/test/java/co/elastic/clients/elasticsearch/end_to_end/RequestTest.java)
@@ -6,22 +6,27 @@ For information about how to use this connector within Airbyte, see [the documen
## Local development

### Prerequisites

**To iterate on this connector, make sure to complete this prerequisites section.**

#### Minimum Python version required `= 3.7.0`

#### Build & Activate Virtual Environment and install dependencies

From this connector directory, create a virtual environment:

```
python -m venv .venv
```

This will generate a virtualenv for this module in `.venv/`. Make sure this venv is active in your
development environment of choice. To activate it from the terminal, run:

```
source .venv/bin/activate
pip install -r requirements.txt
```

If you are in an IDE, follow your IDE's instructions to activate the virtualenv.

Note that while we are installing dependencies from `requirements.txt`, you should only edit `setup.py` for your dependencies. `requirements.txt` is
@@ -30,6 +35,7 @@ If this is mumbo jumbo to you, don't worry about it, just put your deps in `setu
should work as you expect.

#### Create credentials

**If you are a community contributor**, follow the instructions in the [documentation](https://docs.airbyte.io/integrations/destinations/firebolt)
to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `destination_firebolt/spec.json` file.
Note that the `secrets` directory is gitignored by default, so there is no danger of accidentally checking in sensitive information.
@@ -39,6 +45,7 @@ See `integration_tests/sample_config.json` for a sample config file.
and place them into `secrets/config.json`.

### Locally running the connector

```
python main.py spec
python main.py check --config secrets/config.json
@@ -48,9 +55,10 @@ cat integration_tests/messages.jsonl | python main.py write --config secrets/con

### Locally running the connector docker image

#### Build

**Via [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md) (recommended):**

```bash
airbyte-ci connectors --name=destination-firebolt build
```
@@ -58,12 +66,15 @@ airbyte-ci connectors --name=destination-firebolt build
An image will be built with the tag `airbyte/destination-firebolt:dev`.

**Via `docker build`:**

```bash
docker build -t airbyte/destination-firebolt:dev .
```

#### Run

Then run any of the connector commands as follows:

```
docker run --rm airbyte/destination-firebolt:dev spec
docker run --rm -v $(pwd)/secrets:/secrets airbyte/destination-firebolt:dev check --config /secrets/config.json
@@ -72,23 +83,30 @@ cat integration_tests/messages.jsonl | docker run --rm -v $(pwd)/secrets:/secret
```
## Testing

You can run our full test suite locally using [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md):

```bash
airbyte-ci connectors --name=destination-firebolt test
```

### Customizing Acceptance Tests

Customize the `acceptance-test-config.yml` file to configure tests. See [Connector Acceptance Tests](https://docs.airbyte.com/connector-development/testing-connectors/connector-acceptance-tests-reference) for more information.
If your connector requires creating or destroying resources for use during acceptance tests, create fixtures for them and place them inside `integration_tests/acceptance.py`.

## Dependency Management

All of your dependencies should go in `setup.py`, NOT `requirements.txt`. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development.
We split dependencies between two groups, dependencies that are:

- required for your connector to work, which go in the `MAIN_REQUIREMENTS` list.
- required for testing, which go in the `TEST_REQUIREMENTS` list.

### Publishing a new version of the connector

You've checked out the repo, implemented a million-dollar feature, and you're ready to share your changes with the world. Now what?

1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=destination-firebolt test`
2. Bump the connector version in `metadata.yaml`: increment the `dockerImageTag` value. Please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors).
3. Make sure the `metadata.yaml` content is up to date.
@@ -96,4 +114,3 @@ You've checked out the repo, implemented a million dollar feature, and you're re
5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention).
6. Pat yourself on the back for being an awesome contributor.
7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master.

@@ -18,5 +18,5 @@ This connector uses [firebolt-sdk](https://pypi.org/project/firebolt-sdk/), whic

## Notes

- Integration testing requires the user to have a running engine. Spinning up an engine can take a while, so this ensures faster iteration on the connector.
- S3 is generally the faster writing strategy and should be preferred.

@@ -6,22 +6,27 @@ For information about how to use this connector within Airbyte, see [the documen
## Local development

### Prerequisites

**To iterate on this connector, make sure to complete this prerequisites section.**

#### Minimum Python version required `= 3.7.0`

#### Build & Activate Virtual Environment and install dependencies

From this connector directory, create a virtual environment:

```
python -m venv .venv
```

This will generate a virtualenv for this module in `.venv/`. Make sure this venv is active in your
development environment of choice. To activate it from the terminal, run:

```
source .venv/bin/activate
pip install -r requirements.txt
```

If you are in an IDE, follow your IDE's instructions to activate the virtualenv.

Note that while we are installing dependencies from `requirements.txt`, you should only edit `setup.py` for your dependencies. `requirements.txt` is
@@ -40,6 +45,7 @@ See `integration_tests/sample_config.json` for a sample config file.
and place them into `secrets/config.json`.

### Locally running the connector

```
python main.py spec
python main.py check --config secrets/config.json
@@ -49,9 +55,10 @@ python main.py read --config secrets/config.json --catalog integration_tests/con

### Locally running the connector docker image

#### Build

**Via [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md) (recommended):**

```bash
airbyte-ci connectors --name=destination-firestore build
```
@@ -59,12 +66,15 @@ airbyte-ci connectors --name=destination-firestore build
An image will be built with the tag `airbyte/destination-firestore:dev`.

**Via `docker build`:**

```bash
docker build -t airbyte/destination-firestore:dev .
```

#### Run

Then run any of the connector commands as follows:

```
docker run --rm airbyte/destination-firestore:dev spec
docker run --rm -v $(pwd)/secrets:/secrets airbyte/destination-firestore:dev check --config /secrets/config.json
@@ -73,23 +83,30 @@ cat messages.jsonl | docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integr
```
## Testing

You can run our full test suite locally using [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md):

```bash
airbyte-ci connectors --name=destination-firestore test
```

### Customizing Acceptance Tests

Customize the `acceptance-test-config.yml` file to configure tests. See [Connector Acceptance Tests](https://docs.airbyte.com/connector-development/testing-connectors/connector-acceptance-tests-reference) for more information.
If your connector requires creating or destroying resources for use during acceptance tests, create fixtures for them and place them inside `integration_tests/acceptance.py`.

## Dependency Management

All of your dependencies should go in `setup.py`, NOT `requirements.txt`. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development.
We split dependencies between two groups, dependencies that are:

- required for your connector to work, which go in the `MAIN_REQUIREMENTS` list.
- required for testing, which go in the `TEST_REQUIREMENTS` list.

### Publishing a new version of the connector

You've checked out the repo, implemented a million-dollar feature, and you're ready to share your changes with the world. Now what?

1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=destination-firestore test`
2. Bump the connector version in `metadata.yaml`: increment the `dockerImageTag` value. Please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors).
3. Make sure the `metadata.yaml` content is up to date.
@@ -97,4 +114,3 @@ You've checked out the repo, implemented a million dollar feature, and you're re
5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention).
6. Pat yourself on the back for being an awesome contributor.
7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master.

@@ -15,13 +15,15 @@ As a community contributor, you can follow these steps to run integration tests.
## Airbyte Employee

- Access the `SECRET_DESTINATION-GCS__CREDS` secrets on SecretManager, and put it in `sample_secrets/config.json`.
- Access the `SECRET_DESTINATION-GCS_NO_MULTIPART_ROLE_CREDS` secrets on SecretManager, and put it in `sample_secrets/insufficient_roles_config.json`.
- Rename the directory from `sample_secrets` to `secrets`.

### GCP Service Account for Testing

Two service accounts have been created in our GCP for testing this destination. Both of them have access to Cloud Storage through HMAC keys. The keys are persisted together with the connector integration test credentials in LastPass.

- Account: `gcs-destination-connector-test@dataline-integration-testing.iam.gserviceaccount.com`
  - This account has the required permission to pass the integration test. Note that the uploader needs `storage.multipartUploads` permissions, which may not be intuitive.
  - Role: `GCS Destination User`
  - Permissions:
@@ -48,6 +50,7 @@ Two service accounts have been created in our GCP for testing this destination.
- LastPass entry: `destination gcs creds (no multipart permission)`

## Add New Output Format

- Add a new enum in `S3Format` (see the sketch after this list).
- Modify `spec.json` to specify the configuration of this new format.
- Update `S3FormatConfigs` to be able to construct a config for this new format.
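
As an illustration of that first step — the real enum lives in the destination's source tree and its exact shape may differ, and `MY_FORMAT` is hypothetical:

```java
// Illustrative sketch of adding a new output format value to the format enum.
public enum S3Format {
  AVRO("avro"),
  CSV("csv"),
  JSONL("jsonl"),
  MY_FORMAT("myext"); // hypothetical new format added as a new enum constant

  private final String fileExtension;

  S3Format(final String fileExtension) {
    this.fileExtension = fileExtension;
  }

  public String getFileExtension() {
    return fileExtension;
  }
}
```
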
@@ -6,22 +6,27 @@ For information about how to use this connector within Airbyte, see [the documen
## Local development

### Prerequisites

**To iterate on this connector, make sure to complete this prerequisites section.**

#### Minimum Python version required `= 3.9.0`

#### Build & Activate Virtual Environment and install dependencies

From this connector directory, create a virtual environment:

```
python -m venv .venv
```

This will generate a virtualenv for this module in `.venv/`. Make sure this venv is active in your
development environment of choice. To activate it from the terminal, run:

```
source .venv/bin/activate
pip install -r requirements.txt
```

If you are in an IDE, follow your IDE's instructions to activate the virtualenv.

Note that while we are installing dependencies from `requirements.txt`, you should only edit `setup.py` for your dependencies. `requirements.txt` is
@@ -30,6 +35,7 @@ If this is mumbo jumbo to you, don't worry about it, just put your deps in `setu
should work as you expect.

#### Create credentials

**If you are a community contributor**, follow the instructions in the [documentation](https://docs.airbyte.io/integrations/destinations/google-sheets)
to generate the necessary credentials. Then create a file `secrets/config_oauth.json` conforming to the `destination_google_sheets/spec.json` file.
Note that the `secrets` directory is gitignored by default, so there is no danger of accidentally checking in sensitive information.
@@ -39,6 +45,7 @@ See `integration_tests/sample_config_oauth.json` for a sample config file.
and place them into `secrets/config.json`.
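
As a rough illustration of the config file's shape — the authoritative schema is `destination_google_sheets/spec.json`, and the field names below are placeholders to check against it and the sample file:

```
{
  "spreadsheet_id": "<your-spreadsheet-id>",
  "credentials": {
    "client_id": "<oauth-client-id>",
    "client_secret": "<oauth-client-secret>",
    "refresh_token": "<oauth-refresh-token>"
  }
}
```
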
### Locally running the connector

```
python main.py spec
python main.py check --config secrets/config_oauth.json
@@ -48,9 +55,10 @@ cat integration_tests/test_data/messages.txt | python main.py write --config sec

### Locally running the connector docker image

#### Build

**Via [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md) (recommended):**

```bash
airbyte-ci connectors --name=destination-google-sheets build
```
@@ -58,12 +66,15 @@ airbyte-ci connectors --name=destination-google-sheets build
An image will be built with the tag `airbyte/destination-google-sheets:dev`.

**Via `docker build`:**

```bash
docker build -t airbyte/destination-google-sheets:dev .
```

#### Run

Then run any of the connector commands as follows:

```
docker run --rm airbyte/destination-google-sheets:dev spec
docker run --rm -v $(pwd)/secrets:/secrets airbyte/destination-google-sheets:dev check --config /secrets/config_oauth.json
@@ -72,23 +83,30 @@ cat integration_tests/test_data/messages.txt | docker run --rm -v $(pwd)/secrets
```
## Testing

You can run our full test suite locally using [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md):

```bash
airbyte-ci connectors --name=destination-google-sheets test
```

### Customizing Acceptance Tests

Customize the `acceptance-test-config.yml` file to configure tests. See [Connector Acceptance Tests](https://docs.airbyte.com/connector-development/testing-connectors/connector-acceptance-tests-reference) for more information.
If your connector requires creating or destroying resources for use during acceptance tests, create fixtures for them and place them inside `integration_tests/acceptance.py`.

## Dependency Management

All of your dependencies should go in `setup.py`, NOT `requirements.txt`. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development.
We split dependencies between two groups, dependencies that are:

- required for your connector to work, which go in the `MAIN_REQUIREMENTS` list.
- required for testing, which go in the `TEST_REQUIREMENTS` list.

### Publishing a new version of the connector

You've checked out the repo, implemented a million-dollar feature, and you're ready to share your changes with the world. Now what?

1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=destination-google-sheets test`
2. Bump the connector version in `metadata.yaml`: increment the `dockerImageTag` value. Please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors).
3. Make sure the `metadata.yaml` content is up to date.
@@ -96,4 +114,3 @@ You've checked out the repo, implemented a million dollar feature, and you're re
5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention).
6. Pat yourself on the back for being an awesome contributor.
7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master.

@@ -32,7 +32,6 @@ the [instructions](https://docs.airbyte.io/connector-development#using-credentia
Build the connector image via Gradle:

```
./gradlew :airbyte-integrations:connectors:destination-iceberg:buildConnectorImage
```
@@ -83,7 +82,9 @@ To run acceptance and custom integration tests:
## Dependency Management

### Publishing a new version of the connector

You've checked out the repo, implemented a million-dollar feature, and you're ready to share your changes with the world. Now what?

1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=destination-iceberg test`
2. Bump the connector version in `metadata.yaml`: increment the `dockerImageTag` value. Please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors).
3. Make sure the `metadata.yaml` content is up to date.
@@ -91,4 +92,3 @@ You've checked out the repo, implemented a million dollar feature, and you're re
5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention).
6. Pat yourself on the back for being an awesome contributor.
7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master.

@@ -9,4 +9,3 @@ Spark, Trino, PrestoDB, Flink, Hive and Impala using a high-performance table fo

The Iceberg reference
documents: [https://iceberg.apache.org/docs/latest/api/](https://iceberg.apache.org/docs/latest/api/)

@@ -6,12 +6,15 @@ For information about how to use this connector within Airbyte, see [the User Do
|
||||
## Local development
|
||||
|
||||
#### Building via Gradle
|
||||
|
||||
From the Airbyte repository root, run:
|
||||
|
||||
```
|
||||
./gradlew :airbyte-integrations:connectors:destination-kafka:build
|
||||
```
|
||||
|
||||
#### Create credentials
|
||||
|
||||
**If you are a community contributor**, generate the necessary credentials and place them in `secrets/config.json` conforming to the spec file in `src/main/resources/spec.json`.
|
||||
Note that the `secrets` directory is git-ignored by default, so there is no danger of accidentally checking in sensitive information.
|
||||
|
||||
@@ -20,16 +23,20 @@ Note that the `secrets` directory is git-ignored by default, so there is no dang

### Locally running the connector docker image

#### Build

Build the connector image via Gradle:

```
./gradlew :airbyte-integrations:connectors:destination-kafka:buildConnectorImage
```

Once built, the docker image name and tag on your host will be `airbyte/destination-kafka:dev`.

#### Run

Then run any of the connector commands as follows:

```
docker run --rm airbyte/destination-kafka:dev spec
docker run --rm -v $(pwd)/secrets:/secrets airbyte/destination-kafka:dev check --config /secrets/config.json
@@ -38,22 +45,29 @@ docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integrat
```

## Testing

We use `JUnit` for Java tests.

### Unit and Integration Tests

Place unit tests under `src/test/io/airbyte/integrations/destinations/kafka`.

#### Acceptance Tests

Airbyte has a standard test suite that all destination connectors must pass. Implement the `TODO`s in
`src/test-integration/java/io/airbyte/integrations/destinations/kafkaDestinationAcceptanceTest.java`.

### Using gradle to run tests

All commands should be run from the Airbyte project root.
To run unit tests:

```
./gradlew :airbyte-integrations:connectors:destination-kafka:unitTest
```

To run acceptance and custom integration tests:

```
./gradlew :airbyte-integrations:connectors:destination-kafka:integrationTest
```

@@ -61,7 +75,9 @@ To run acceptance and custom integration tests:

## Dependency Management

### Publishing a new version of the connector

You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what?

1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=destination-kafka test`
2. Bump the connector version in `metadata.yaml`: increment the `dockerImageTag` value. Please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors).
3. Make sure the `metadata.yaml` content is up to date.
@@ -69,4 +85,3 @@ You've checked out the repo, implemented a million dollar feature, and you're re
5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention).
6. Pat yourself on the back for being an awesome contributor.
7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master.

@@ -5,22 +5,27 @@ This is the repository for the [Kvdb](https://kvdb.io) destination connector, wr

## Local development

### Prerequisites

**To iterate on this connector, make sure to complete this prerequisites section.**

#### Minimum Python version required `= 3.7.0`

#### Build & Activate Virtual Environment and install dependencies

From this connector directory, create a virtual environment:

```
python -m venv .venv
```

This will generate a virtualenv for this module in `.venv/`. Make sure this venv is active in your
development environment of choice. To activate it from the terminal, run:

```
source .venv/bin/activate
pip install -r requirements.txt
```

If you are in an IDE, follow your IDE's instructions to activate the virtualenv.

Note that while we are installing dependencies from `requirements.txt`, you should only edit `setup.py` for your dependencies. `requirements.txt` is
@@ -29,12 +34,15 @@ If this is mumbo jumbo to you, don't worry about it, just put your deps in `setu

should work as you expect.

#### Building via Gradle

From the Airbyte repository root, run:

```
./gradlew :airbyte-integrations:connectors:destination-kvdb:build
```

#### Create credentials

**If you are a community contributor**, generate the necessary credentials from [Kvdb](https://kvdb.io/docs/api/), and then create a file `secrets/config.json` conforming to the `destination_kvdb/spec.json` file.
Note that the `secrets` directory is gitignored by default, so there is no danger of accidentally checking in sensitive information.
See `integration_tests/sample_config.json` for a sample config file.
@@ -43,6 +51,7 @@ See `integration_tests/sample_config.json` for a sample config file.
and place them into `secrets/config.json`.

### Locally running the connector

```
python main.py spec
python main.py check --config secrets/config.json
@@ -52,10 +61,10 @@ python main.py read --config secrets/config.json --catalog integration_tests/con

### Locally running the connector docker image

#### Build

**Via [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md) (recommended):**

```bash
airbyte-ci connectors --name=destination-kvdb build
```
@@ -63,51 +72,71 @@ airbyte-ci connectors --name=destination-kvdb build

An image will be built with the tag `airbyte/destination-kvdb:dev`.

**Via `docker build`:**

```bash
docker build -t airbyte/destination-kvdb:dev .
```

#### Run

Then run any of the connector commands as follows:

```
docker run --rm airbyte/destination-kvdb:dev spec
docker run --rm -v $(pwd)/secrets:/secrets airbyte/destination-kvdb:dev check --config /secrets/config.json
# messages.jsonl is a file containing line-separated JSON representing AirbyteMessages
cat messages.jsonl | docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integration_tests airbyte/destination-kvdb:dev write --config /secrets/config.json --catalog /integration_tests/configured_catalog.json
```
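
For reference, each line of `messages.jsonl` is one serialized `AirbyteMessage`. The snippet below prints a single, made-up RECORD message (the stream name and data fields are placeholders, not part of this connector's catalog):

```python
import json
import time

# A single Airbyte RECORD message; "users" and its fields are made-up examples.
record = {
    "type": "RECORD",
    "record": {
        "stream": "users",
        "data": {"id": 1, "name": "Ada"},
        "emitted_at": int(time.time() * 1000),  # epoch milliseconds
    },
}
print(json.dumps(record))  # one JSON document per line in messages.jsonl
```
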

## Testing

Make sure to familiarize yourself with [pytest test discovery](https://docs.pytest.org/en/latest/goodpractices.html#test-discovery) to know how your test files and methods should be named.
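
As a minimal illustration of those conventions (the file and test names here are made up), pytest will only collect files named `test_*.py` and functions prefixed with `test_`:

```python
# unit_tests/test_writer.py -- discovered because the file name starts with "test_"
def test_batching_keeps_all_records():
    # discovered because the function name starts with "test_"
    records = [{"id": i} for i in range(10)]
    batches = [records[i : i + 4] for i in range(0, len(records), 4)]
    assert sum(len(b) for b in batches) == len(records)
```
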
First install test dependencies into your virtual environment:

```
pip install .[tests]
```

### Unit Tests

To run unit tests locally, from the connector directory run:

```
python -m pytest unit_tests
```

### Integration Tests

There are two types of integration tests: Acceptance Tests (Airbyte's test suite for all destination connectors) and custom integration tests (which are specific to this connector).

#### Custom Integration tests

Place custom tests inside the `integration_tests/` folder, then, from the connector root, run

```
python -m pytest integration_tests
```

#### Acceptance Tests

You can run our full test suite locally using [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md):

```bash
airbyte-ci connectors --name=destination-kvdb test
```

## Dependency Management

All of your dependencies should go in `setup.py`, NOT `requirements.txt`. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development.
We split dependencies between two groups (see the sketch after this list); dependencies that are:

* required for your connector to work need to go to `MAIN_REQUIREMENTS` list.
* required for the testing need to go to `TEST_REQUIREMENTS` list

- required for your connector to work need to go to `MAIN_REQUIREMENTS` list.
- required for the testing need to go to `TEST_REQUIREMENTS` list
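
For orientation, here is a minimal sketch of how those two lists are typically wired together in a connector's `setup.py` (the package name and version pins are illustrative, not prescriptive):

```python
from setuptools import find_packages, setup

# Runtime dependencies: the connector cannot work without these.
MAIN_REQUIREMENTS = ["airbyte-cdk"]

# Test-only dependencies: installed via `pip install .[tests]`.
TEST_REQUIREMENTS = ["pytest~=6.2"]

setup(
    name="destination_kvdb",  # illustrative package name
    packages=find_packages(exclude=("unit_tests", "integration_tests")),
    install_requires=MAIN_REQUIREMENTS,
    extras_require={"tests": TEST_REQUIREMENTS},
)
```
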
### Publishing a new version of the connector

You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what?

1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=destination-kvdb test`
2. Bump the connector version in `metadata.yaml`: increment the `dockerImageTag` value. Please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors).
3. Make sure the `metadata.yaml` content is up to date.
@@ -115,4 +144,3 @@ You've checked out the repo, implemented a million dollar feature, and you're re
5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention).
6. Pat yourself on the back for being an awesome contributor.
7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master.

@@ -6,22 +6,27 @@ For information about how to use this connector within Airbyte, see [the documen

## Local development

### Prerequisites

**To iterate on this connector, make sure to complete this prerequisites section.**

#### Minimum Python version required `= 3.10.0`

#### Build & Activate Virtual Environment and install dependencies

From this connector directory, create a virtual environment:

```
python -m venv .venv
```

This will generate a virtualenv for this module in `.venv/`. Make sure this venv is active in your
development environment of choice. To activate it from the terminal, run:

```
source .venv/bin/activate
pip install -r requirements.txt
```

If you are in an IDE, follow your IDE's instructions to activate the virtualenv.

Note that while we are installing dependencies from `requirements.txt`, you should only edit `setup.py` for your dependencies. `requirements.txt` is
@@ -30,6 +35,7 @@ If this is mumbo jumbo to you, don't worry about it, just put your deps in `setu

should work as you expect.

#### Create credentials

**If you are a community contributor**, follow the instructions in the [documentation](https://docs.airbyte.com/integrations/destinations/langchain)
to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `destination_langchain/spec.json` file.
Note that the `secrets` directory is gitignored by default, so there is no danger of accidentally checking in sensitive information.
@@ -39,6 +45,7 @@ See `integration_tests/sample_config.json` for a sample config file.
and place them into `secrets/config.json`.

### Locally running the connector

```
python main.py spec
python main.py check --config secrets/config.json
@@ -48,9 +55,10 @@ python main.py read --config secrets/config.json --catalog integration_tests/con

### Locally running the connector docker image

#### Build

**Via [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md) (recommended):**

```bash
airbyte-ci connectors --name=destination-langchain build
```

@@ -58,12 +66,15 @@ airbyte-ci connectors --name=destination-langchain build

An image will be built with the tag `airbyte/destination-langchain:dev`.

**Via `docker build`:**

```bash
docker build -t airbyte/destination-langchain:dev .
```

#### Run

Then run any of the connector commands as follows:

```
docker run --rm airbyte/destination-langchain:dev spec
docker run --rm -v $(pwd)/secrets:/secrets airbyte/destination-langchain:dev check --config /secrets/config.json
@@ -72,23 +83,30 @@ cat messages.jsonl | docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integr
```

## Testing

You can run our full test suite locally using [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md):

```bash
airbyte-ci connectors --name=destination-langchain test
```

### Customizing acceptance Tests

Customize the `acceptance-test-config.yml` file to configure tests. See [Connector Acceptance Tests](https://docs.airbyte.com/connector-development/testing-connectors/connector-acceptance-tests-reference) for more information.
If your connector requires creating or destroying resources for use during acceptance tests, create fixtures for them and place them inside integration_tests/acceptance.py.
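
As a rough sketch (the fixture name and the setup/teardown steps are placeholders), such a session-scoped pytest fixture could look like:

```python
# integration_tests/acceptance.py
import pytest


@pytest.fixture(scope="session", autouse=True)
def connector_setup():
    """Create resources before the acceptance test session and tear them down afterwards."""
    # ... create any resources the tests need (hypothetical setup step) ...
    yield
    # ... destroy the resources created above (hypothetical teardown step) ...
```
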

## Dependency Management

All of your dependencies should go in `setup.py`, NOT `requirements.txt`. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development.
We split dependencies between two groups, dependencies that are:

* required for your connector to work need to go to `MAIN_REQUIREMENTS` list.
* required for the testing need to go to `TEST_REQUIREMENTS` list

- required for your connector to work need to go to `MAIN_REQUIREMENTS` list.
- required for the testing need to go to `TEST_REQUIREMENTS` list

### Publishing a new version of the connector

You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what?

1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=destination-langchain test`
2. Bump the connector version in `metadata.yaml`: increment the `dockerImageTag` value. Please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors).
3. Make sure the `metadata.yaml` content is up to date.
@@ -96,4 +114,3 @@ You've checked out the repo, implemented a million dollar feature, and you're re
5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention).
6. Pat yourself on the back for being an awesome contributor.
7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master.

@@ -1,9 +1,10 @@

# Langchain Destination Connector Bootstrap

This destination does three things:

* Splits records into chunks and separates metadata from text data
* Embeds text data into an embedding vector
* Stores the metadata and embedding vector in a vector database

- Splits records into chunks and separates metadata from text data
- Embeds text data into an embedding vector
- Stores the metadata and embedding vector in a vector database

The record processing uses the text splitting components from https://python.langchain.com/docs/modules/data_connection/document_transformers/.
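
To give a feel for those components, here is a minimal, hypothetical chunking sketch built on LangChain's text splitters (the parameters are illustrative; the connector exposes its own configuration):

```python
from langchain.text_splitter import RecursiveCharacterTextSplitter

# Illustrative parameters; the connector's actual chunk size and overlap are configurable.
splitter = RecursiveCharacterTextSplitter(chunk_size=1000, chunk_overlap=100)

text = "A long document pulled from an Airbyte record..."
chunks = splitter.split_text(text)  # each chunk is embedded and stored separately
print(len(chunks))
```
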
@@ -6,22 +6,27 @@ For information about how to use this connector within Airbyte, see [the documen

## Local development

### Prerequisites

**To iterate on this connector, make sure to complete this prerequisites section.**

#### Minimum Python version required `= 3.7.0`

#### Build & Activate Virtual Environment and install dependencies

From this connector directory, create a virtual environment:

```
python -m venv .venv
```

This will generate a virtualenv for this module in `.venv/`. Make sure this venv is active in your
development environment of choice. To activate it from the terminal, run:

```
source .venv/bin/activate
pip install -r requirements.txt
```

If you are in an IDE, follow your IDE's instructions to activate the virtualenv.

Note that while we are installing dependencies from `requirements.txt`, you should only edit `setup.py` for your dependencies. `requirements.txt` is
@@ -30,6 +35,7 @@ If this is mumbo jumbo to you, don't worry about it, just put your deps in `setu

should work as you expect.

#### Create credentials

**If you are a community contributor**, follow the instructions in the [documentation](https://docs.airbyte.io/integrations/destinations/meilisearch)
to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `destination_meilisearch/spec.json` file.
Note that the `secrets` directory is gitignored by default, so there is no danger of accidentally checking in sensitive information.
@@ -39,6 +45,7 @@ See `integration_tests/sample_config.json` for a sample config file.
and place them into `secrets/config.json`.

### Locally running the connector

```
python main.py spec
python main.py check --config secrets/config.json
@@ -48,9 +55,10 @@ python main.py read --config secrets/config.json --catalog integration_tests/con

### Locally running the connector docker image

#### Build

**Via [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md) (recommended):**

```bash
airbyte-ci connectors --name=destination-meilisearch build
```

@@ -58,12 +66,15 @@ airbyte-ci connectors --name=destination-meilisearch build

An image will be built with the tag `airbyte/destination-meilisearch:dev`.

**Via `docker build`:**

```bash
docker build -t airbyte/destination-meilisearch:dev .
```

#### Run

Then run any of the connector commands as follows:

```
docker run --rm airbyte/destination-meilisearch:dev spec
docker run --rm -v $(pwd)/secrets:/secrets airbyte/destination-meilisearch:dev check --config /secrets/config.json
@@ -72,23 +83,30 @@ cat messages.jsonl | docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integr
```

## Testing

You can run our full test suite locally using [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md):

```bash
airbyte-ci connectors --name=destination-meilisearch test
```

### Customizing acceptance Tests

Customize the `acceptance-test-config.yml` file to configure tests. See [Connector Acceptance Tests](https://docs.airbyte.com/connector-development/testing-connectors/connector-acceptance-tests-reference) for more information.
If your connector requires creating or destroying resources for use during acceptance tests, create fixtures for them and place them inside integration_tests/acceptance.py.

## Dependency Management

All of your dependencies should go in `setup.py`, NOT `requirements.txt`. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development.
We split dependencies between two groups, dependencies that are:

* required for your connector to work need to go to `MAIN_REQUIREMENTS` list.
* required for the testing need to go to `TEST_REQUIREMENTS` list

- required for your connector to work need to go to `MAIN_REQUIREMENTS` list.
- required for the testing need to go to `TEST_REQUIREMENTS` list

### Publishing a new version of the connector

You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what?

1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=destination-meilisearch test`
2. Bump the connector version in `metadata.yaml`: increment the `dockerImageTag` value. Please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors).
3. Make sure the `metadata.yaml` content is up to date.
@@ -96,4 +114,3 @@ You've checked out the repo, implemented a million dollar feature, and you're re
5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention).
6. Pat yourself on the back for being an awesome contributor.
7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master.

@@ -6,17 +6,21 @@ For information about how to use this connector within Airbyte, see [the documen

## Local development

### Prerequisites

**To iterate on this connector, make sure to complete this prerequisites section.**

#### Minimum Python version required `= 3.9.0`

### Installing the connector

From this connector directory, run:

```bash
poetry install --with dev
```

#### Create credentials

**If you are a community contributor**, follow the instructions in the [documentation](https://docs.airbyte.com/integrations/destinations/milvus)
to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `destination_milvus/spec.json` file.
Note that the `secrets` directory is gitignored by default, so there is no danger of accidentally checking in sensitive information.
@@ -26,6 +30,7 @@ See `integration_tests/sample_config.json` for a sample config file.

and place them into `secrets/config.json`.

### Locally running the connector

```
python main.py spec
python main.py check --config secrets/config.json
@@ -34,8 +39,8 @@ python main.py write --config secrets/config.json --catalog integration_tests/co

### Locally running the connector docker image

#### Use `airbyte-ci` to build your connector

The Airbyte way of building this connector is to use our `airbyte-ci` tool.
You can follow install instructions [here](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md#L1).
Then running the following command will build your connector:
@@ -43,15 +48,18 @@ Then running the following command will build your connector:

```bash
airbyte-ci connectors --name=destination-milvus build
```

Once the command is done, you will find your connector image in your local docker registry: `airbyte/destination-milvus:dev`.

##### Customizing our build process

When contributing to our connector you might need to customize the build process to add a system dependency or set an env var.
You can customize our build process by adding a `build_customization.py` module to your connector.
This module should contain `pre_connector_install` and `post_connector_install` async functions that will mutate the base image and the connector container respectively.
It will be imported at runtime by our build process and the functions will be called if they exist.

Here is an example of a `build_customization.py` module:

```python
from __future__ import annotations
@@ -71,6 +79,7 @@ async def post_connector_install(connector_container: Container) -> Container:
```
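
Since the hunk above elides most of the example, here is a self-contained sketch of what such a module can look like (the environment variables set here are placeholders, not something the build process requires):

```python
from __future__ import annotations

from typing import TYPE_CHECKING

if TYPE_CHECKING:
    # Only imported for type hints; dagger is provided by the build pipeline at runtime.
    from dagger import Container


async def pre_connector_install(base_image_container: Container) -> Container:
    # Mutate the base image before the connector is installed,
    # e.g. set a placeholder env var.
    return await base_image_container.with_env_variable("MY_PRE_BUILD_ENV_VAR", "value")


async def post_connector_install(connector_container: Container) -> Container:
    # Mutate the final connector container, e.g. set another placeholder env var.
    return await connector_container.with_env_variable("MY_POST_BUILD_ENV_VAR", "value")
```
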

#### Build your own connector image

This connector is built using our dynamic build process in `airbyte-ci`.
The base image used to build it is defined within the metadata.yaml file under the `connectorBuildOptions`.
The build logic is defined using [Dagger](https://dagger.io/) [here](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/pipelines/builds/python_connectors.py).
@@ -79,6 +88,7 @@ It does not rely on a Dockerfile.

If you would like to patch our connector and build your own, a simple approach would be to:

1. Create your own Dockerfile based on the latest version of the connector image.

```Dockerfile
FROM airbyte/destination-milvus:latest
@@ -89,16 +99,21 @@ RUN pip install ./airbyte/integration_code
# ENV AIRBYTE_ENTRYPOINT "python /airbyte/integration_code/main.py"
# ENTRYPOINT ["python", "/airbyte/integration_code/main.py"]
```

Please use this as an example. This is not optimized.

2. Build your image:

```bash
docker build -t airbyte/destination-milvus:dev .
# Running the spec command against your patched connector
docker run airbyte/destination-milvus:dev spec
```

#### Run

Then run any of the connector commands as follows:

```
docker run --rm airbyte/destination-milvus:dev spec
docker run --rm -v $(pwd)/secrets:/secrets airbyte/destination-milvus:dev check --config /secrets/config.json
@@ -107,35 +122,46 @@ cat messages.jsonl | docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integr
```

## Testing

You can run our full test suite locally using [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md):

```bash
airbyte-ci connectors --name=destination-milvus test
```

### Unit Tests

To run unit tests locally, from the connector directory run:

```
poetry run pytest -s unit_tests
```

### Integration Tests

To run integration tests locally, make sure you have a secrets/config.json as explained above, and then run:

```
poetry run pytest -s integration_tests
```

### Customizing acceptance Tests

Customize the `acceptance-test-config.yml` file to configure tests. See [Connector Acceptance Tests](https://docs.airbyte.com/connector-development/testing-connectors/connector-acceptance-tests-reference) for more information.
If your connector requires creating or destroying resources for use during acceptance tests, create fixtures for them and place them inside integration_tests/acceptance.py.

## Dependency Management

All of your dependencies should go in `setup.py`, NOT `requirements.txt`. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development.
We split dependencies between two groups, dependencies that are:

* required for your connector to work need to go to `MAIN_REQUIREMENTS` list.
* required for the testing need to go to `TEST_REQUIREMENTS` list

- required for your connector to work need to go to `MAIN_REQUIREMENTS` list.
- required for the testing need to go to `TEST_REQUIREMENTS` list

### Publishing a new version of the connector

You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what?

1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=destination-milvus test`
2. Bump the connector version in `metadata.yaml`: increment the `dockerImageTag` value. Please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors).
3. Make sure the `metadata.yaml` content is up to date.
@@ -143,4 +169,3 @@ You've checked out the repo, implemented a million dollar feature, and you're re
5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention).
6. Pat yourself on the back for being an awesome contributor.
7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master.

@@ -1,9 +1,10 @@

# Milvus Destination Connector Bootstrap

This destination does three things:

* Splits records into chunks and separates metadata from text data
* Embeds text data into an embedding vector
* Stores the metadata and embedding vector in a vector database

- Splits records into chunks and separates metadata from text data
- Embeds text data into an embedding vector
- Stores the metadata and embedding vector in a vector database

The record processing uses the text splitting components from https://python.langchain.com/docs/modules/data_connection/document_transformers/.

@@ -5,17 +5,21 @@ This is the repository for the Pinecone destination connector, written in Python

## Local development

### Prerequisites

**To iterate on this connector, make sure to complete this prerequisites section.**

#### Minimum Python version required `= 3.9.0`

### Installing the connector

From this connector directory, run:

```bash
poetry install --with dev
```

#### Create credentials

**If you are a community contributor**, follow the instructions in the [documentation](https://docs.airbyte.com/integrations/destinations/pinecone)
to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `destination_pinecone/spec.json` file.
Note that the `secrets` directory is gitignored by default, so there is no danger of accidentally checking in sensitive information.
@@ -25,6 +29,7 @@ See `integration_tests/sample_config.json` for a sample config file.

and place them into `secrets/config.json`.

### Locally running the connector

```
python main.py spec
python main.py check --config secrets/config.json
@@ -33,8 +38,8 @@ python main.py write --config secrets/config.json --catalog integration_tests/co

### Locally running the connector docker image

#### Use `airbyte-ci` to build your connector

The Airbyte way of building this connector is to use our `airbyte-ci` tool.
You can follow install instructions [here](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md#L1).
Then running the following command will build your connector:
@@ -42,15 +47,18 @@ Then running the following command will build your connector:

```bash
airbyte-ci connectors --name=destination-pinecone build
```

Once the command is done, you will find your connector image in your local docker registry: `airbyte/destination-pinecone:dev`.

##### Customizing our build process

When contributing to our connector you might need to customize the build process to add a system dependency or set an env var.
You can customize our build process by adding a `build_customization.py` module to your connector.
This module should contain `pre_connector_install` and `post_connector_install` async functions that will mutate the base image and the connector container respectively.
It will be imported at runtime by our build process and the functions will be called if they exist.

Here is an example of a `build_customization.py` module:

```python
from __future__ import annotations
@@ -70,6 +78,7 @@ async def post_connector_install(connector_container: Container) -> Container:
```

#### Build your own connector image

This connector is built using our dynamic build process in `airbyte-ci`.
The base image used to build it is defined within the metadata.yaml file under the `connectorBuildOptions`.
The build logic is defined using [Dagger](https://dagger.io/) [here](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/pipelines/builds/python_connectors.py).
@@ -78,6 +87,7 @@ It does not rely on a Dockerfile.

If you would like to patch our connector and build your own, a simple approach would be to:

1. Create your own Dockerfile based on the latest version of the connector image.

```Dockerfile
FROM airbyte/destination-pinecone:latest
@@ -88,16 +98,21 @@ RUN pip install ./airbyte/integration_code
# ENV AIRBYTE_ENTRYPOINT "python /airbyte/integration_code/main.py"
# ENTRYPOINT ["python", "/airbyte/integration_code/main.py"]
```

Please use this as an example. This is not optimized.

2. Build your image:

```bash
docker build -t airbyte/destination-pinecone:dev .
# Running the spec command against your patched connector
docker run airbyte/destination-pinecone:dev spec
```

#### Run

Then run any of the connector commands as follows:

```
docker run --rm airbyte/destination-pinecone:dev spec
docker run --rm -v $(pwd)/secrets:/secrets airbyte/destination-pinecone:dev check --config /secrets/config.json
@@ -106,35 +121,46 @@ cat messages.jsonl | docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integr
```

## Testing

You can run our full test suite locally using [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md):

```bash
airbyte-ci connectors --name=destination-pinecone test
```

### Unit Tests

To run unit tests locally, from the connector directory run:

```
poetry run pytest -s unit_tests
```

### Integration Tests

To run integration tests locally, make sure you have a secrets/config.json as explained above, and then run:

```
poetry run pytest -s integration_tests
```

### Customizing acceptance Tests

Customize the `acceptance-test-config.yml` file to configure tests. See [Connector Acceptance Tests](https://docs.airbyte.com/connector-development/testing-connectors/connector-acceptance-tests-reference) for more information.
If your connector requires creating or destroying resources for use during acceptance tests, create fixtures for them and place them inside integration_tests/acceptance.py.

## Dependency Management

All of your dependencies should go in `setup.py`, NOT `requirements.txt`. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development.
We split dependencies between two groups, dependencies that are:

* required for your connector to work need to go to `MAIN_REQUIREMENTS` list.
* required for the testing need to go to `TEST_REQUIREMENTS` list

- required for your connector to work need to go to `MAIN_REQUIREMENTS` list.
- required for the testing need to go to `TEST_REQUIREMENTS` list

### Publishing a new version of the connector

You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what?

1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=destination-pinecone test`
2. Bump the connector version in `metadata.yaml`: increment the `dockerImageTag` value. Please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors).
3. Make sure the `metadata.yaml` content is up to date.
@@ -142,4 +168,3 @@ You've checked out the repo, implemented a million dollar feature, and you're re
5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention).
6. Pat yourself on the back for being an awesome contributor.
7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master.

@@ -1,8 +1,9 @@

# Pinecone Destination Connector Bootstrap

This destination does three things:

* Splits records into chunks and separates metadata from text data
* Embeds text data into an embedding vector
* Stores the metadata and embedding vector in Pinecone

- Splits records into chunks and separates metadata from text data
- Embeds text data into an embedding vector
- Stores the metadata and embedding vector in Pinecone

The record processing uses the text splitting components from https://python.langchain.com/docs/modules/data_connection/document_transformers/.
@@ -6,17 +6,21 @@ For information about how to use this connector within Airbyte, see [the documen

## Local development

### Prerequisites

**To iterate on this connector, make sure to complete this prerequisites section.**

#### Minimum Python version required `= 3.10.0`

### Installing the connector

From this connector directory, run:

```bash
poetry install --with dev
```

#### Create credentials

**If you are a community contributor**, follow the instructions in the [documentation](https://docs.airbyte.com/integrations/destinations/qdrant)
to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `destination_qdrant/spec.json` file.
Note that the `secrets` directory is gitignored by default, so there is no danger of accidentally checking in sensitive information.
@@ -26,6 +30,7 @@ See `integration_tests/sample_config.json` for a sample config file.

and place them into `secrets/config.json`.

### Locally running the connector

```
python main.py spec
python main.py check --config secrets/config.json
@@ -34,9 +39,10 @@ python main.py write --config secrets/config.json --catalog integration_tests/co

### Locally running the connector docker image

#### Build

**Via [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md) (recommended):**

```bash
airbyte-ci connectors --name=destination-qdrant build
```

@@ -44,12 +50,15 @@ airbyte-ci connectors --name=destination-qdrant build

An image will be built with the tag `airbyte/destination-qdrant:dev`.

**Via `docker build`:**

```bash
docker build -t airbyte/destination-qdrant:dev .
```

#### Run

Then run any of the connector commands as follows:

```
docker run --rm airbyte/destination-qdrant:dev spec
docker run --rm -v $(pwd)/secrets:/secrets airbyte/destination-qdrant:dev check --config /secrets/config.json
@@ -58,35 +67,46 @@ cat messages.jsonl | docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integr
```

## Testing

You can run our full test suite locally using [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md):

```bash
airbyte-ci connectors --name=destination-qdrant test
```

### Unit Tests

To run unit tests locally, from the connector directory run:

```
poetry run pytest -s unit_tests
```

### Integration Tests

To run integration tests locally, make sure you have a secrets/config.json as explained above, and then run:

```
poetry run pytest -s integration_tests
```

### Customizing acceptance Tests

Customize the `acceptance-test-config.yml` file to configure tests. See [Connector Acceptance Tests](https://docs.airbyte.com/connector-development/testing-connectors/connector-acceptance-tests-reference) for more information.
If your connector requires creating or destroying resources for use during acceptance tests, create fixtures for them and place them inside integration_tests/acceptance.py.

## Dependency Management

All of your dependencies should go in `setup.py`, NOT `requirements.txt`. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development.
We split dependencies between two groups, dependencies that are:

* required for your connector to work need to go to `MAIN_REQUIREMENTS` list.
* required for the testing need to go to `TEST_REQUIREMENTS` list

- required for your connector to work need to go to `MAIN_REQUIREMENTS` list.
- required for the testing need to go to `TEST_REQUIREMENTS` list

### Publishing a new version of the connector

You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what?

1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=destination-qdrant test`
2. Bump the connector version in `metadata.yaml`: increment the `dockerImageTag` value. Please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors).
3. Make sure the `metadata.yaml` content is up to date.
@@ -94,4 +114,3 @@ You've checked out the repo, implemented a million dollar feature, and you're re
5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention).
6. Pat yourself on the back for being an awesome contributor.
7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master.

@@ -6,22 +6,27 @@ For information about how to use this connector within Airbyte, see [the documen

## Local development

### Prerequisites

**To iterate on this connector, make sure to complete this prerequisites section.**

#### Minimum Python version required `= 3.7.0`

#### Build & Activate Virtual Environment and install dependencies

From this connector directory, create a virtual environment:

```
python -m venv .venv
```

This will generate a virtualenv for this module in `.venv/`. Make sure this venv is active in your
development environment of choice. To activate it from the terminal, run:

```
source .venv/bin/activate
pip install -r requirements.txt
```

If you are in an IDE, follow your IDE's instructions to activate the virtualenv.

Note that while we are installing dependencies from `requirements.txt`, you should only edit `setup.py` for your dependencies. `requirements.txt` is
@@ -30,6 +35,7 @@ If this is mumbo jumbo to you, don't worry about it, just put your deps in `setu

should work as you expect.

#### Create credentials

**If you are a community contributor**, follow the instructions in the [documentation](https://docs.airbyte.io/integrations/destinations/rabbitmq)
to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `destination_rabbitmq/spec.json` file.
Note that the `secrets` directory is gitignored by default, so there is no danger of accidentally checking in sensitive information.
@@ -39,6 +45,7 @@ See `integration_tests/sample_config.json` for a sample config file.
and place them into `secrets/config.json`.

### Locally running the connector

```
python main.py spec
python main.py check --config secrets/config.json
@@ -48,9 +55,10 @@ python main.py read --config secrets/config.json --catalog integration_tests/con

### Locally running the connector docker image

#### Build

**Via [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md) (recommended):**

```bash
airbyte-ci connectors --name=destination-rabbitmq build
```

@@ -58,12 +66,15 @@ airbyte-ci connectors --name=destination-rabbitmq build

An image will be built with the tag `airbyte/destination-rabbitmq:dev`.

**Via `docker build`:**

```bash
docker build -t airbyte/destination-rabbitmq:dev .
```

#### Run

Then run any of the connector commands as follows:

```
docker run --rm airbyte/destination-rabbitmq:dev spec
docker run --rm -v $(pwd)/secrets:/secrets airbyte/destination-rabbitmq:dev check --config /secrets/config.json
@@ -72,23 +83,30 @@ cat messages.jsonl | docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integr
```

## Testing

You can run our full test suite locally using [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md):

```bash
airbyte-ci connectors --name=destination-rabbitmq test
```

### Customizing acceptance Tests

Customize the `acceptance-test-config.yml` file to configure tests. See [Connector Acceptance Tests](https://docs.airbyte.com/connector-development/testing-connectors/connector-acceptance-tests-reference) for more information.
If your connector requires creating or destroying resources for use during acceptance tests, create fixtures for them and place them inside integration_tests/acceptance.py.

## Dependency Management

All of your dependencies should go in `setup.py`, NOT `requirements.txt`. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development.
We split dependencies between two groups, dependencies that are:

* required for your connector to work need to go to `MAIN_REQUIREMENTS` list.
* required for the testing need to go to `TEST_REQUIREMENTS` list

- required for your connector to work need to go to `MAIN_REQUIREMENTS` list.
- required for the testing need to go to `TEST_REQUIREMENTS` list

### Publishing a new version of the connector

You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what?

1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=destination-rabbitmq test`
2. Bump the connector version in `metadata.yaml`: increment the `dockerImageTag` value. Please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors).
3. Make sure the `metadata.yaml` content is up to date.
@@ -96,4 +114,3 @@ You've checked out the repo, implemented a million dollar feature, and you're re
5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention).
6. Pat yourself on the back for being an awesome contributor.
7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master.

@@ -6,12 +6,15 @@ For information about how to use this connector within Airbyte, see [the User Do

## Local development

#### Building via Gradle

From the Airbyte repository root, run:

```
./gradlew :airbyte-integrations:connectors:destination-redis:build
```

#### Create credentials

**If you are a community contributor**, generate the necessary credentials and place them in `secrets/config.json` conforming to the spec file in `src/main/resources/spec.json`.
Note that the `secrets` directory is git-ignored by default, so there is no danger of accidentally checking in sensitive information.

@@ -20,16 +23,20 @@ Note that the `secrets` directory is git-ignored by default, so there is no dang

### Locally running the connector docker image

#### Build

Build the connector image via Gradle:

```
./gradlew :airbyte-integrations:connectors:destination-redis:buildConnectorImage
```

Once built, the docker image name and tag on your host will be `airbyte/destination-redis:dev`.

#### Run

Then run any of the connector commands as follows:

```
docker run --rm airbyte/destination-redis:dev spec
docker run --rm -v $(pwd)/secrets:/secrets airbyte/destination-redis:dev check --config /secrets/config.json
@@ -38,22 +45,29 @@ docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integrat
```

## Testing

We use `JUnit` for Java tests.

### Unit and Integration Tests

Place unit tests under `src/test/io/airbyte/integrations/destinations/redis`.

#### Acceptance Tests

Airbyte has a standard test suite that all destination connectors must pass. Implement the `TODO`s in
`src/test-integration/java/io/airbyte/integrations/destinations/redisDestinationAcceptanceTest.java`.

### Using gradle to run tests

All commands should be run from the Airbyte project root.
To run unit tests:

```
./gradlew :airbyte-integrations:connectors:destination-redis:unitTest
```

To run acceptance and custom integration tests:

```
./gradlew :airbyte-integrations:connectors:destination-redis:integrationTest
```

@@ -61,7 +75,9 @@ To run acceptance and custom integration tests:

## Dependency Management

### Publishing a new version of the connector

You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what?

1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=destination-redis test`
2. Bump the connector version in `metadata.yaml`: increment the `dockerImageTag` value. Please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors).
3. Make sure the `metadata.yaml` content is up to date.
@@ -69,4 +85,3 @@ You've checked out the repo, implemented a million dollar feature, and you're re
5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention).
6. Pat yourself on the back for being an awesome contributor.
7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master.

@@ -6,7 +6,6 @@ Redis has built-in replication, Lua scripting, LRU eviction, transactions, and d

To achieve top performance, Redis works with an in-memory dataset. Depending on your use case, you can persist your data either by periodically dumping the dataset to disk or by appending each command to a disk-based log. You can also disable persistence if you just need a feature-rich, networked, in-memory cache.
[Read more about Redis](https://redis.io/)

This connector maps an incoming Airbyte namespace and stream to a different key in the Redis data structure. The connector supports the `append` sync mode by
adding keys to an existing keyset and `overwrite` by deleting the existing ones and replacing them with the new ones.

@@ -22,5 +22,5 @@ Consult the integration test area for Redshift.

The actual secrets for integration tests can be found in Google Cloud Secrets Manager. Search on redshift for the labels:

- SECRET_DESTINATION-REDSHIFT__CREDS - used for Standard tests. (__config.json__)
- SECRET_DESTINATION-REDSHIFT_STAGING__CREDS - used for S3 Staging tests. (__config_staging.json__)
- SECRET_DESTINATION-REDSHIFT**CREDS - used for Standard tests. (**config.json\_\_)
- SECRET_DESTINATION-REDSHIFT_STAGING**CREDS - used for S3 Staging tests. (**config_staging.json\_\_)

@@ -6,12 +6,15 @@ For information about how to use this connector within Airbyte, see [the User Do

## Local development

#### Building via Gradle

From the Airbyte repository root, run:

```
./gradlew :airbyte-integrations:connectors:destination-s3-glue:build
```

#### Create credentials

**If you are a community contributor**, generate the necessary credentials and place them in `secrets/config.json` conforming to the spec file in `src/main/resources/spec.json`.
Note that the `secrets` directory is git-ignored by default, so there is no danger of accidentally checking in sensitive information.

@@ -20,16 +23,20 @@ Note that the `secrets` directory is git-ignored by default, so there is no dang

### Locally running the connector docker image

#### Build

Build the connector image via Gradle:

```
./gradlew :airbyte-integrations:connectors:destination-s3-glue:buildConnectorImage
```

Once built, the docker image name and tag on your host will be `airbyte/destination-s3-glue:dev`.

#### Run

Then run any of the connector commands as follows:

```
docker run --rm airbyte/destination-s3-glue:dev spec
docker run --rm -v $(pwd)/secrets:/secrets airbyte/destination-s3-glue:dev check --config /secrets/config.json
@@ -38,22 +45,29 @@ docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integrat
```
|
||||
|
||||
## Testing
|
||||
|
||||
We use `JUnit` for Java tests.
|
||||
|
||||
### Unit and Integration Tests
|
||||
|
||||
Place unit tests under `src/test/io/airbyte/integrations/destinations/s3_glue`.
|
||||
|
||||
#### Acceptance Tests
|
||||
|
||||
Airbyte has a standard test suite that all destination connectors must pass. Implement the `TODO`s in
|
||||
`src/test-integration/java/io/airbyte/integrations/destinations/s3_glueDestinationAcceptanceTest.java`.
|
||||
|
||||
### Using gradle to run tests
|
||||
|
||||
All commands should be run from airbyte project root.
|
||||
To run unit tests:
|
||||
|
||||
```
|
||||
./gradlew :airbyte-integrations:connectors:destination-s3-glue:unitTest
|
||||
```
|
||||
|
||||
To run acceptance and custom integration tests:
|
||||
|
||||
```
|
||||
./gradlew :airbyte-integrations:connectors:destination-s3-glue:integrationTest
|
||||
```
|
||||
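If you want to iterate on a single test class, a narrower invocation can help (a sketch, assuming `integrationTest` is a standard Gradle `Test` task that honors the `--tests` filter):

```bash
./gradlew :airbyte-integrations:connectors:destination-s3-glue:integrationTest \
  --tests '*DestinationAcceptanceTest'
```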
@@ -61,7 +75,9 @@ To run acceptance and custom integration tests:
## Dependency Management

### Publishing a new version of the connector

You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what?

1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=destination-s3-glue test`
2. Bump the connector version in `metadata.yaml`: increment the `dockerImageTag` value. Please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors).
3. Make sure the `metadata.yaml` content is up to date.
@@ -69,4 +85,3 @@ You've checked out the repo, implemented a million dollar feature, and you're re
5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention).
6. Pat yourself on the back for being an awesome contributor.
7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master.
@@ -19,6 +19,7 @@ As a community contributor, you will need access to AWS to run the integration t
- Rename the directory from `sample_secrets` to `secrets`.

## Add New Output Format

- Add a new enum in `S3Format`.
- Modify `spec.json` to specify the configuration of this new format.
- Update `S3FormatConfigs` to be able to construct a config for this new format (a quick verification sketch follows below).
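One way to sanity-check the new format (a sketch, assuming `jq` is installed and that the `spec` output is the only JSON emitted) is to rebuild the image and confirm the format shows up in the connector spec:

```bash
./gradlew :airbyte-integrations:connectors:destination-s3-glue:buildConnectorImage
docker run --rm airbyte/destination-s3-glue:dev spec \
  | jq '.spec.connectionSpecification.properties.format'
```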
@@ -8,22 +8,27 @@ For information about how to use this connector within Airbyte, see [the documen
## Local development

### Prerequisites

**To iterate on this connector, make sure to complete this prerequisites section.**

#### Minimum Python version required `= 3.7.0`

#### Build & Activate Virtual Environment and install dependencies

From this connector directory, create a virtual environment:

```
python -m venv .venv
```

This will generate a virtualenv for this module in `.venv/`. Make sure this venv is active in your development environment of choice. To activate it from the terminal, run:

```
source .venv/bin/activate
pip install -r requirements.txt
```

If you are in an IDE, follow your IDE's instructions to activate the virtualenv.

Note that while we are installing dependencies from `requirements.txt`, you should only edit `setup.py` for your dependencies. `requirements.txt` is
@@ -32,6 +37,7 @@ If this is mumbo jumbo to you, don't worry about it, just put your deps in `setu
should work as you expect.

#### Create credentials

**If you are a community contributor**, follow the instructions in the [documentation](https://docs.airbyte.io/integrations/destinations/sftp-json) to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `destination_sftp_json/spec.json` file.
Note that the `secrets` directory is gitignored by default, so there is no danger of accidentally checking in sensitive information.
@@ -41,6 +47,7 @@ See `integration_tests/sample_config.json` for a sample config file.
and place them into `secrets/config.json`.

### Locally running the connector

```
python main.py spec
python main.py check --config secrets/config.json
@@ -50,9 +57,10 @@ python main.py read --config secrets/config.json --catalog integration_tests/con
### Locally running the connector docker image

#### Build

**Via [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md) (recommended):**

```bash
airbyte-ci connectors --name=destination-sftp-json build
```
@@ -60,12 +68,15 @@ airbyte-ci connectors --name=destination-sftp-json build
An image will be built with the tag `airbyte/destination-sftp-json:dev`.
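You can confirm the image landed in your local registry with a quick check:

```bash
docker images airbyte/destination-sftp-json
```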
**Via `docker build`:**

```bash
docker build -t airbyte/destination-sftp-json:dev .
```

#### Run

Then run any of the connector commands as follows:

```
docker run --rm airbyte/destination-sftp-json:dev spec
docker run --rm -v $(pwd)/secrets:/secrets airbyte/destination-sftp-json:dev check --config /secrets/config.json
@@ -74,23 +85,30 @@ cat messages.jsonl | docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integr
```

## Testing

You can run our full test suite locally using [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md):

```bash
airbyte-ci connectors --name=destination-sftp-json test
```

### Customizing acceptance Tests

Customize `acceptance-test-config.yml` file to configure tests. See [Connector Acceptance Tests](https://docs.airbyte.com/connector-development/testing-connectors/connector-acceptance-tests-reference) for more information.
If your connector requires creating or destroying resources for use during acceptance tests, create fixtures for them and place them inside `integration_tests/acceptance.py`.

## Dependency Management

All of your dependencies should go in `setup.py`, NOT `requirements.txt`. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development.
We split dependencies between two groups:

- dependencies required for your connector to work, which go in the `MAIN_REQUIREMENTS` list
- dependencies required for testing, which go in the `TEST_REQUIREMENTS` list

### Publishing a new version of the connector

You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what?

1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=destination-sftp-json test`
2. Bump the connector version in `metadata.yaml`: increment the `dockerImageTag` value. Please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors).
3. Make sure the `metadata.yaml` content is up to date.
@@ -98,4 +116,3 @@ You've checked out the repo, implemented a million dollar feature, and you're re
5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention).
6. Pat yourself on the back for being an awesome contributor.
7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master.
@@ -95,7 +95,9 @@ DROP WAREHOUSE IF EXISTS INTEGRATION_TEST_WAREHOUSE_DESTINATION;
```

### Setup for various error-case users:

Log in as the `INTEGRATION_TEST_USER_DESTINATION` user, and run this:

```sql
drop schema if exists INTEGRATION_TEST_DESTINATION.TEXT_SCHEMA;
create schema INTEGRATION_TEST_DESTINATION.TEXT_SCHEMA;
@@ -6,22 +6,27 @@ For information about how to use this connector within Airbyte, see [the documen
## Local development

### Prerequisites

**To iterate on this connector, make sure to complete this prerequisites section.**

#### Minimum Python version required `= 3.7.0`

#### Build & Activate Virtual Environment and install dependencies

From this connector directory, create a virtual environment:

```
python -m venv .venv
```

This will generate a virtualenv for this module in `.venv/`. Make sure this venv is active in your development environment of choice. To activate it from the terminal, run:

```
source .venv/bin/activate
pip install -r requirements.txt
```

If you are in an IDE, follow your IDE's instructions to activate the virtualenv.

Note that while we are installing dependencies from `requirements.txt`, you should only edit `setup.py` for your dependencies. `requirements.txt` is
@@ -30,6 +35,7 @@ If this is mumbo jumbo to you, don't worry about it, just put your deps in `setu
should work as you expect.

#### Create credentials

**If you are a community contributor**, follow the instructions in the [documentation](https://docs.airbyte.io/integrations/destinations/sqlite) to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `destination_sqlite/spec.json` file.
Note that the `secrets` directory is gitignored by default, so there is no danger of accidentally checking in sensitive information.
@@ -39,6 +45,7 @@ See `integration_tests/sample_config.json` for a sample config file.
and place them into `secrets/config.json`.

### Locally running the connector

```
python main.py spec
python main.py check --config secrets/config.json
@@ -48,9 +55,10 @@ python main.py read --config secrets/config.json --catalog integration_tests/con
### Locally running the connector docker image

#### Build

**Via [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md) (recommended):**

```bash
airbyte-ci connectors --name=destination-sqlite build
```
@@ -58,12 +66,15 @@ airbyte-ci connectors --name=destination-sqlite build
An image will be built with the tag `airbyte/destination-sqlite:dev`.

**Via `docker build`:**

```bash
docker build -t airbyte/destination-sqlite:dev .
```

#### Run

Then run any of the connector commands as follows:

```
docker run --rm airbyte/destination-sqlite:dev spec
docker run --rm -v $(pwd)/secrets:/secrets airbyte/destination-sqlite:dev check --config /secrets/config.json
@@ -72,23 +83,30 @@ cat messages.jsonl | docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integr
```

## Testing

You can run our full test suite locally using [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md):

```bash
airbyte-ci connectors --name=destination-sqlite test
```

### Customizing acceptance Tests

Customize `acceptance-test-config.yml` file to configure tests. See [Connector Acceptance Tests](https://docs.airbyte.com/connector-development/testing-connectors/connector-acceptance-tests-reference) for more information.
If your connector requires creating or destroying resources for use during acceptance tests, create fixtures for them and place them inside `integration_tests/acceptance.py`.

## Dependency Management

All of your dependencies should go in `setup.py`, NOT `requirements.txt`. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development.
We split dependencies between two groups:

- dependencies required for your connector to work, which go in the `MAIN_REQUIREMENTS` list
- dependencies required for testing, which go in the `TEST_REQUIREMENTS` list

### Publishing a new version of the connector

You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what?

1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=destination-sqlite test`
2. Bump the connector version in `metadata.yaml`: increment the `dockerImageTag` value. Please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors).
3. Make sure the `metadata.yaml` content is up to date.
@@ -96,4 +114,3 @@ You've checked out the repo, implemented a million dollar feature, and you're re
5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention).
6. Pat yourself on the back for being an awesome contributor.
7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master.
@@ -8,6 +8,7 @@ For information about how to use this connector within Airbyte, see [the user do
#### Build with Gradle

From the Airbyte repository root, run:

```
./gradlew :airbyte-integrations:connectors:destination-starburst-galaxy:build
```
@@ -24,15 +25,18 @@ If you are an Airbyte core member, you must follow the [instructions](https://do
#### Build

Build the connector image with Gradle:

```
./gradlew :airbyte-integrations:connectors:destination-starburst-galaxy:buildConnectorImage
```

When building with Gradle, the Docker image name and tag, respectively, are the values of the `io.airbyte.name` and `io.airbyte.version` labels in the Dockerfile.

#### Run

The following example commands are the Starburst Galaxy-specific versions of the [Airbyte protocol commands](https://docs.airbyte.com/understanding-airbyte/airbyte-protocol):

```
docker run --rm airbyte/destination-starburst-galaxy:dev spec
docker run --rm -v $(pwd)/secrets:/secrets airbyte/destination-starburst-galaxy:dev check --config /secrets/config.json
@@ -44,10 +48,13 @@ docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integrat
All commands should be run from the airbyte project root.

To run unit tests:

```
./gradlew :airbyte-integrations:connectors:destination-starburst-galaxy:unitTest
```

To run acceptance and custom integration tests:

```
./gradlew :airbyte-integrations:connectors:destination-starburst-galaxy:integrationTest
```
@@ -6,12 +6,15 @@ For information about how to use this connector within Airbyte, see [the User Do
## Local development

#### Building via Gradle

From the Airbyte repository root, run:

```
./gradlew :airbyte-integrations:connectors:destination-teradata:build
```

#### Create credentials

**If you are a community contributor**, generate the necessary credentials and place them in `secrets/config.json` conforming to the spec file in `src/main/resources/spec.json`.
Note that the `secrets` directory is git-ignored by default, so there is no danger of accidentally checking in sensitive information.
@@ -20,16 +23,20 @@ Note that the `secrets` directory is git-ignored by default, so there is no dang
### Locally running the connector docker image

#### Build

Build the connector image via Gradle:

```
./gradlew :airbyte-integrations:connectors:destination-teradata:buildConnectorImage
```

Once built, the docker image name and tag on your host will be `airbyte/destination-teradata:dev`.

#### Run

Then run any of the connector commands as follows:

```
docker run --rm airbyte/destination-teradata:dev spec
docker run --rm -v $(pwd)/secrets:/secrets airbyte/destination-teradata:dev check --config /secrets/config.json
@@ -38,22 +45,29 @@ docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integrat
```

## Testing

We use `JUnit` for Java tests.

### Unit and Integration Tests

Place unit tests under `src/test/io/airbyte/integrations/destinations/teradata`.

#### Acceptance Tests

Airbyte has a standard test suite that all destination connectors must pass. Implement the `TODO`s in `src/test-integration/java/io/airbyte/integrations/destinations/teradataDestinationAcceptanceTest.java`.

### Using gradle to run tests

All commands should be run from the airbyte project root.
To run unit tests:

```
./gradlew :airbyte-integrations:connectors:destination-teradata:unitTest
```

To run acceptance and custom integration tests:

```
./gradlew :airbyte-integrations:connectors:destination-teradata:integrationTest
```
@@ -61,7 +75,9 @@ To run acceptance and custom integration tests:
## Dependency Management

### Publishing a new version of the connector

You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what?

1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=destination-teradata test`
2. Bump the connector version in `metadata.yaml`: increment the `dockerImageTag` value. Please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors).
3. Make sure the `metadata.yaml` content is up to date.
@@ -69,4 +85,3 @@ You've checked out the repo, implemented a million dollar feature, and you're re
5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention).
6. Pat yourself on the back for being an awesome contributor.
7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master.
@@ -52,9 +52,10 @@ cat integration_tests/messages.jsonl | python main.py write --config secrets/con
### Locally running the connector docker image

#### Build

**Via [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md) (recommended):**

```bash
airbyte-ci connectors --name=destination-timeplus build
```
@@ -62,6 +63,7 @@ airbyte-ci connectors --name=destination-timeplus build
An image will be built with the tag `airbyte/destination-timeplus:dev`.

**Via `docker build`:**

```bash
docker build -t airbyte/destination-timeplus:dev .
```
@@ -77,14 +79,16 @@ docker run --rm -v $(pwd)/secrets:/secrets airbyte/destination-timeplus:dev chec
cat messages.jsonl | docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integration_tests airbyte/destination-timeplus:dev write --config /secrets/config.json --catalog /integration_tests/configured_catalog.json
```

## Testing

You can run our full test suite locally using [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md):

```bash
airbyte-ci connectors --name=destination-timeplus test
```

### Customizing acceptance Tests

Customize `acceptance-test-config.yml` file to configure tests. See [Connector Acceptance Tests](https://docs.airbyte.com/connector-development/testing-connectors/connector-acceptance-tests-reference) for more information.
If your connector requires creating or destroying resources for use during acceptance tests, create fixtures for them and place them inside `integration_tests/acceptance.py`.
@@ -97,7 +101,9 @@ We split dependencies between two groups, dependencies that are:
- dependencies required for testing, which go in the `TEST_REQUIREMENTS` list

### Publishing a new version of the connector

You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what?

1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=destination-timeplus test`
2. Bump the connector version in `metadata.yaml`: increment the `dockerImageTag` value. Please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors).
3. Make sure the `metadata.yaml` content is up to date.
@@ -105,4 +111,3 @@ You've checked out the repo, implemented a million dollar feature, and you're re
5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention).
6. Pat yourself on the back for being an awesome contributor.
7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master.
@@ -6,22 +6,27 @@ For information about how to use this connector within Airbyte, see [the documen
## Local development

### Prerequisites

**To iterate on this connector, make sure to complete this prerequisites section.**

#### Minimum Python version required `= 3.7.0`

#### Build & Activate Virtual Environment and install dependencies

From this connector directory, create a virtual environment:

```
python -m venv .venv
```

This will generate a virtualenv for this module in `.venv/`. Make sure this venv is active in your development environment of choice. To activate it from the terminal, run:

```
source .venv/bin/activate
pip install -r requirements.txt
```

If you are in an IDE, follow your IDE's instructions to activate the virtualenv.

Note that while we are installing dependencies from `requirements.txt`, you should only edit `setup.py` for your dependencies. `requirements.txt` is
@@ -30,6 +35,7 @@ If this is mumbo jumbo to you, don't worry about it, just put your deps in `setu
should work as you expect.

#### Create credentials

**If you are a community contributor**, follow the instructions in the [documentation](https://docs.airbyte.io/integrations/destinations/typesense) to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `destination_typesense/spec.json` file.
Note that the `secrets` directory is gitignored by default, so there is no danger of accidentally checking in sensitive information.
@@ -39,6 +45,7 @@ See `integration_tests/sample_config.json` for a sample config file.
and place them into `secrets/config.json`.

### Locally running the connector

```
python main.py spec
python main.py check --config secrets/config.json
@@ -48,9 +55,10 @@ python main.py read --config secrets/config.json --catalog integration_tests/con
### Locally running the connector docker image

#### Build

**Via [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md) (recommended):**

```bash
airbyte-ci connectors --name=destination-typesense build
```
@@ -58,12 +66,15 @@ airbyte-ci connectors --name=destination-typesense build
An image will be built with the tag `airbyte/destination-typesense:dev`.

**Via `docker build`:**

```bash
docker build -t airbyte/destination-typesense:dev .
```

#### Run

Then run any of the connector commands as follows:

```
docker run --rm airbyte/destination-typesense:dev spec
docker run --rm -v $(pwd)/secrets:/secrets airbyte/destination-typesense:dev check --config /secrets/config.json
@@ -72,23 +83,30 @@ cat messages.jsonl | docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integr
```

## Testing

You can run our full test suite locally using [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md):

```bash
airbyte-ci connectors --name=destination-typesense test
```

### Customizing acceptance Tests

Customize `acceptance-test-config.yml` file to configure tests. See [Connector Acceptance Tests](https://docs.airbyte.com/connector-development/testing-connectors/connector-acceptance-tests-reference) for more information.
If your connector requires creating or destroying resources for use during acceptance tests, create fixtures for them and place them inside `integration_tests/acceptance.py`.

## Dependency Management

All of your dependencies should go in `setup.py`, NOT `requirements.txt`. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development.
We split dependencies between two groups:

- dependencies required for your connector to work, which go in the `MAIN_REQUIREMENTS` list
- dependencies required for testing, which go in the `TEST_REQUIREMENTS` list

### Publishing a new version of the connector

You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what?

1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=destination-typesense test`
2. Bump the connector version in `metadata.yaml`: increment the `dockerImageTag` value. Please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors).
3. Make sure the `metadata.yaml` content is up to date.
@@ -96,4 +114,3 @@ You've checked out the repo, implemented a million dollar feature, and you're re
5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention).
6. Pat yourself on the back for being an awesome contributor.
7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master.
@@ -6,17 +6,21 @@ For information about how to use this connector within Airbyte, see [the documen
## Local development

### Prerequisites

**To iterate on this connector, make sure to complete this prerequisites section.**

#### Minimum Python version required `= 3.9`

### Installing the connector

From this connector directory, run:

```bash
poetry install --with dev
```

#### Create credentials

**If you are a community contributor**, follow the instructions in the [documentation](https://docs.airbyte.com/integrations/destinations/vectara) to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `destination_vectara/spec.json` file.
Note that the `secrets` directory is gitignored by default, so there is no danger of accidentally checking in sensitive information.
@@ -25,19 +29,18 @@ See `integration_tests/sample_config.json` for a sample config file.
**If you are an Airbyte core member**, copy the credentials in Lastpass under the secret name `destination vectara test creds` and place them into `secrets/config.json`.

### Locally running the connector

```
python main.py spec
python main.py check --config secrets/config.json
python main.py write --config secrets/config.json --catalog integration_tests/configured_catalog.json
```

### Locally running the connector docker image

#### Use `airbyte-ci` to build your connector

The Airbyte way of building this connector is to use our `airbyte-ci` tool. You can follow the install instructions [here](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md#L1). Then running the following command will build your connector:
@@ -45,15 +48,18 @@ Then running the following command will build your connector:
```bash
airbyte-ci connectors --name=destination-vectara build
```

Once the command is done, you will find your connector image in your local docker registry: `airbyte/destination-vectara:dev`.

##### Customizing our build process

When contributing to our connector you might need to customize the build process to add a system dependency or set an env var. You can customize our build process by adding a `build_customization.py` module to your connector. This module should contain `pre_connector_install` and `post_connector_install` async functions that will mutate the base image and the connector container, respectively. It will be imported at runtime by our build process, and the functions will be called if they exist.

Here is an example of a `build_customization.py` module:

```python
from __future__ import annotations
@@ -73,6 +79,7 @@ async def post_connector_install(connector_container: Container) -> Container:
```

#### Build your own connector image

This connector is built using our dynamic build process in `airbyte-ci`. The base image used to build it is defined within the metadata.yaml file under the `connectorBuildOptions`. The build logic is defined using [Dagger](https://dagger.io/) [here](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/pipelines/builds/python_connectors.py).
@@ -81,6 +88,7 @@ It does not rely on a Dockerfile.
If you would like to patch our connector and build your own, a simple approach would be to:

1. Create your own Dockerfile based on the latest version of the connector image.

```Dockerfile
FROM airbyte/destination-vectara:latest
@@ -91,9 +99,11 @@ RUN pip install ./airbyte/integration_code
# ENV AIRBYTE_ENTRYPOINT "python /airbyte/integration_code/main.py"
# ENTRYPOINT ["python", "/airbyte/integration_code/main.py"]
```

Please use this as an example. This is not optimized.

2. Build your image:

```bash
docker build -t airbyte/destination-vectara:dev .
# Running the spec command against your patched connector
@@ -101,7 +111,9 @@ docker run airbyte/destination-vectara:dev spec
```

#### Run

Then run any of the connector commands as follows:

```
docker run --rm airbyte/destination-vectara:dev spec
docker run --rm -v $(pwd)/secrets:/secrets airbyte/destination-vectara:dev check --config /secrets/config.json
@@ -110,39 +122,50 @@ cat messages.jsonl | docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integr
```

## Testing

You can run our full test suite locally using [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md):

```bash
airbyte-ci connectors --name=destination-vectara test
```

### Unit Tests

To run unit tests locally, from the connector directory run:

```
poetry run pytest -s unit_tests
```

### Integration Tests

There are two types of integration tests: Acceptance Tests (Airbyte's test suite for all destination connectors) and custom integration tests (which are specific to this connector).

#### Custom Integration tests

Place custom tests inside the `integration_tests/` folder, then, from the connector root, run:

```
poetry run pytest -s integration_tests
```

#### Acceptance Tests

Coming soon.

## Dependency Management

All of your dependencies should go in `pyproject.toml`.

We split dependencies between two groups:

- dependencies required for your connector to work, which go under `[tool.poetry.dependencies]`
- dependencies required for testing, which go under `[tool.poetry.group.dev.dependencies]`
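For example, with the Poetry CLI (the package names below are only examples):

```bash
# Runtime dependency -> [tool.poetry.dependencies]
poetry add requests
# Test-only dependency -> [tool.poetry.group.dev.dependencies]
poetry add --group dev pytest-mock
```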

### Publishing a new version of the connector

You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what?

1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=destination-vectara test`
2. Bump the connector version in `metadata.yaml`: increment the `dockerImageTag` value. Please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors).
3. Make sure the `metadata.yaml` content is up to date.
@@ -150,4 +173,3 @@ You've checked out the repo, implemented a million dollar feature, and you're re
5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention).
6. Pat yourself on the back for being an awesome contributor.
7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master.
@@ -6,17 +6,21 @@ For information about how to use this connector within Airbyte, see [the documen
## Local development

### Prerequisites

**To iterate on this connector, make sure to complete this prerequisites section.**

#### Minimum Python version required `= 3.7.0`

### Installing the connector

From this connector directory, run:

```bash
poetry install --with dev
```

#### Create credentials

**If you are a community contributor**, follow the instructions in the [documentation](https://docs.airbyte.io/integrations/destinations/weaviate) to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `destination_weaviate/spec.json` file.
Note that the `secrets` directory is gitignored by default, so there is no danger of accidentally checking in sensitive information.
@@ -26,6 +30,7 @@ See `integration_tests/sample_config.json` for a sample config file.
and place them into `secrets/config.json`.

### Locally running the connector

```
python main.py spec
python main.py check --config secrets/config.json
@@ -34,8 +39,8 @@ python main.py write --config secrets/config.json --catalog integration_tests/co
### Locally running the connector docker image

#### Use `airbyte-ci` to build your connector

The Airbyte way of building this connector is to use our `airbyte-ci` tool. You can follow the install instructions [here](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md#L1). Then running the following command will build your connector:
@@ -43,15 +48,18 @@ Then running the following command will build your connector:
```bash
airbyte-ci connectors --name=destination-weaviate build
```

Once the command is done, you will find your connector image in your local docker registry: `airbyte/destination-weaviate:dev`.

##### Customizing our build process

When contributing to our connector you might need to customize the build process to add a system dependency or set an env var. You can customize our build process by adding a `build_customization.py` module to your connector. This module should contain `pre_connector_install` and `post_connector_install` async functions that will mutate the base image and the connector container, respectively. It will be imported at runtime by our build process, and the functions will be called if they exist.

Here is an example of a `build_customization.py` module:

```python
from __future__ import annotations
@@ -71,6 +79,7 @@ async def post_connector_install(connector_container: Container) -> Container:
```

#### Build your own connector image

This connector is built using our dynamic build process in `airbyte-ci`. The base image used to build it is defined within the metadata.yaml file under the `connectorBuildOptions`. The build logic is defined using [Dagger](https://dagger.io/) [here](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/pipelines/builds/python_connectors.py).
@@ -79,6 +88,7 @@ It does not rely on a Dockerfile.
If you would like to patch our connector and build your own, a simple approach would be to:

1. Create your own Dockerfile based on the latest version of the connector image.

```Dockerfile
FROM airbyte/destination-weaviate:latest
@@ -89,16 +99,21 @@ RUN pip install ./airbyte/integration_code
# ENV AIRBYTE_ENTRYPOINT "python /airbyte/integration_code/main.py"
# ENTRYPOINT ["python", "/airbyte/integration_code/main.py"]
```

Please use this as an example. This is not optimized.

2. Build your image:

```bash
docker build -t airbyte/destination-weaviate:dev .
# Running the spec command against your patched connector
docker run airbyte/destination-weaviate:dev spec
```

#### Run

Then run any of the connector commands as follows:

```
docker run --rm airbyte/destination-weaviate:dev spec
docker run --rm -v $(pwd)/secrets:/secrets airbyte/destination-weaviate:dev check --config /secrets/config.json
@@ -107,35 +122,46 @@ cat messages.jsonl | docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integr
```

## Testing

You can run our full test suite locally using [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md):

```bash
airbyte-ci connectors --name=destination-weaviate test
```

### Unit Tests

To run unit tests locally, from the connector directory run:

```
poetry run pytest -s unit_tests
```

### Integration Tests

To run integration tests locally, make sure you create a `secrets/config.json` as explained above, and then run:

```
poetry run pytest -s integration_tests
```

### Customizing acceptance Tests

Customize `acceptance-test-config.yml` file to configure tests. See [Connector Acceptance Tests](https://docs.airbyte.com/connector-development/testing-connectors/connector-acceptance-tests-reference) for more information.
If your connector requires creating or destroying resources for use during acceptance tests, create fixtures for them and place them inside `integration_tests/acceptance.py`.

## Dependency Management

All of your dependencies should go in `setup.py`, NOT `requirements.txt`. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development.
We split dependencies between two groups:

- dependencies required for your connector to work, which go in the `MAIN_REQUIREMENTS` list
- dependencies required for testing, which go in the `TEST_REQUIREMENTS` list

### Publishing a new version of the connector

You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what?

1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=destination-weaviate test`
2. Bump the connector version in `metadata.yaml`: increment the `dockerImageTag` value. Please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors).
3. Make sure the `metadata.yaml` content is up to date.
@@ -143,4 +169,3 @@ You've checked out the repo, implemented a million dollar feature, and you're re
5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention).
6. Pat yourself on the back for being an awesome contributor.
7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master.
@@ -6,22 +6,27 @@ For information about how to use this connector within Airbyte, see [the documen
## Local development

### Prerequisites

**To iterate on this connector, make sure to complete this prerequisites section.**

#### Minimum Python version required `= 3.7.0`

#### Build & Activate Virtual Environment and install dependencies

From this connector directory, create a virtual environment:

```
python -m venv .venv
```

This will generate a virtualenv for this module in `.venv/`. Make sure this venv is active in your development environment of choice. To activate it from the terminal, run:

```
source .venv/bin/activate
pip install -r requirements.txt
```

If you are in an IDE, follow your IDE's instructions to activate the virtualenv.

Note that while we are installing dependencies from `requirements.txt`, you should only edit `setup.py` for your dependencies. `requirements.txt` is
@@ -30,6 +35,7 @@ If this is mumbo jumbo to you, don't worry about it, just put your deps in `setu
should work as you expect.

#### Create credentials

**If you are a community contributor**, follow the instructions in the [documentation](https://docs.airbyte.com/integrations/destinations/xata) to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `destination_xata/spec.json` file.
Note that the `secrets` directory is gitignored by default, so there is no danger of accidentally checking in sensitive information.
@@ -39,6 +45,7 @@ See `integration_tests/sample_config.json` for a sample config file.
and place them into `secrets/config.json`.

### Locally running the connector

```
python main.py spec
python main.py check --config secrets/config.json
@@ -48,9 +55,10 @@ python main.py read --config secrets/config.json --catalog integration_tests/con
### Locally running the connector docker image

#### Build

**Via [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md) (recommended):**

```bash
airbyte-ci connectors --name=destination-xata build
```
@@ -58,12 +66,15 @@ airbyte-ci connectors --name=destination-xata build
An image will be built with the tag `airbyte/destination-xata:dev`.

**Via `docker build`:**

```bash
docker build -t airbyte/destination-xata:dev .
```

#### Run

Then run any of the connector commands as follows:

```
docker run --rm airbyte/destination-xata:dev spec
docker run --rm -v $(pwd)/secrets:/secrets airbyte/destination-xata:dev check --config /secrets/config.json
@@ -72,23 +83,30 @@ cat messages.jsonl | docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integr
```

## Testing

You can run our full test suite locally using [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md):

```bash
airbyte-ci connectors --name=destination-xata test
```

### Customizing acceptance Tests

Customize `acceptance-test-config.yml` file to configure tests. See [Connector Acceptance Tests](https://docs.airbyte.com/connector-development/testing-connectors/connector-acceptance-tests-reference) for more information.
If your connector requires creating or destroying resources for use during acceptance tests, create fixtures for them and place them inside `integration_tests/acceptance.py`.

## Dependency Management

All of your dependencies should go in `setup.py`, NOT `requirements.txt`. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development.
We split dependencies between two groups:

- dependencies required for your connector to work, which go in the `MAIN_REQUIREMENTS` list
- dependencies required for testing, which go in the `TEST_REQUIREMENTS` list

### Publishing a new version of the connector

You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what?

1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=destination-xata test`
2. Bump the connector version in `metadata.yaml`: increment the `dockerImageTag` value. Please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors).
3. Make sure the `metadata.yaml` content is up to date.
@@ -96,4 +114,3 @@ You've checked out the repo, implemented a million dollar feature, and you're re
5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention).
6. Pat yourself on the back for being an awesome contributor.
7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master.
@@ -6,12 +6,15 @@ For information about how to use this connector within Airbyte, see [the User Do
## Local development

#### Building via Gradle

From the Airbyte repository root, run:

```
./gradlew :airbyte-integrations:connectors:destination-yellowbrick:build
```

#### Create credentials

**If you are a community contributor**, generate the necessary credentials and place them in `secrets/config.json` conforming to the spec file in `src/main/resources/spec.json`.
Note that the `secrets` directory is git-ignored by default, so there is no danger of accidentally checking in sensitive information.
@@ -20,15 +23,20 @@ Note that the `secrets` directory is git-ignored by default, so there is no dang
### Locally running the connector docker image

#### Build

Build the connector image via Gradle:

```
./gradlew :airbyte-integrations:connectors:destination-yellowbrick:airbyteDocker
```

When building via Gradle, the docker image name and tag, respectively, are the values of the `io.airbyte.name` and `io.airbyte.version` `LABEL`s in the Dockerfile.

#### Run

Then run any of the connector commands as follows:

```
docker run --rm airbyte/destination-yellowbrick:dev spec
docker run --rm -v $(pwd)/secrets:/secrets airbyte/destination-yellowbrick:dev check --config /secrets/config.json
@@ -37,22 +45,29 @@ docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integrat
```

## Testing

We use `JUnit` for Java tests.

### Unit and Integration Tests

Place unit tests under `src/test/io/airbyte/integrations/destinations/yellowbrick`.

#### Acceptance Tests

Airbyte has a standard test suite that all destination connectors must pass. Implement the `TODO`s in `src/test-integration/java/io/airbyte/integrations/destinations/yellowbrickDestinationAcceptanceTest.java`.

### Using gradle to run tests

All commands should be run from the airbyte project root.
To run unit tests:

```
./gradlew :airbyte-integrations:connectors:destination-yellowbrick:unitTest
```

To run acceptance and custom integration tests:

```
./gradlew :airbyte-integrations:connectors:destination-yellowbrick:integrationTest
```
@@ -60,7 +75,9 @@ To run acceptance and custom integration tests:
## Dependency Management

### Publishing a new version of the connector

You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what?

1. Make sure your changes are passing unit and integration tests.
1. Bump the connector version in `Dockerfile` -- just increment the value of the `LABEL io.airbyte.version` appropriately (we use [SemVer](https://semver.org/); see the sketch below for locating the label).
1. Create a Pull Request.
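To locate the label to bump, a quick check from the repository root (assuming the standard connector path) is:

```bash
grep 'io.airbyte.version' \
  airbyte-integrations/connectors/destination-yellowbrick/Dockerfile
```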