---
description: Start triggering Airbyte jobs with Dagster in minutes
products: oss-*
---
# Using the Dagster Integration
Airbyte is an official integration in the Dagster project. The Airbyte Integration allows you to trigger synchronization jobs in Airbyte, and this tutorial will walk through configuring your Dagster Ops to do so.
The Airbyte Task documentation on the Dagster project can be found here. We also have a tutorial on dynamically configuring Airbyte using dagster-airbyte.
## 1. Set up the tools
First, make sure you have Docker installed. We'll be using the `docker-compose` command, so your install should include `docker-compose`.
### Start Airbyte
If this is your first time using Airbyte, we suggest going through our Basic Tutorial. This tutorial will use the Connection set up in the basic tutorial.
For the purposes of this tutorial, set your Connection's sync frequency to manual. Dagster will be responsible for manually triggering the Airbyte job.
### Install Dagster
If you don't have Dagster installed, we recommend following this guide to set one up.
## 2. Create the Dagster Op to trigger your Airbyte job
Creating a simple Dagster DAG to run an Airbyte Sync Job
Create a new folder called `airbyte_dagster`, and in it create a file named `airbyte_dagster.py` with the following content:
```python
from dagster import job
from dagster_airbyte import airbyte_resource, airbyte_sync_op

my_airbyte_resource = airbyte_resource.configured(
    {
        "host": {"env": "AIRBYTE_HOST"},
        "port": {"env": "AIRBYTE_PORT"},
    }
)

sync_foobar = airbyte_sync_op.configured(
    {"connection_id": "your-connection-uuid"}, name="sync_foobar"
)

@job(resource_defs={"airbyte": my_airbyte_resource})
def my_simple_airbyte_job():
    sync_foobar()
```
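The resource above reads its connection details from environment variables at run time. Before executing the job, export them in your shell; the values below are only an example, assuming a default local Docker deployment:

```shell
# Example values for a local Airbyte deployment; adjust to match yours.
export AIRBYTE_HOST=localhost
export AIRBYTE_PORT=8000
```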
The Airbyte Dagster Resource accepts the following parameters:
- `host`: The host URL of your Airbyte instance.
- `port`: The port value you have selected for your Airbyte instance.
- `use_https`: Whether your server uses a secure HTTPS connection.
- `request_max_retries`: The maximum number of times requests to the Airbyte API should be retried before failing.
- `request_retry_delay`: Time in seconds to wait between each request retry.
The Airbyte Dagster Op accepts the following parameters:
- `connection_id`: The UUID of the Connection you want to trigger.
- `poll_interval`: The time in seconds to wait between successive polls.
- `poll_timeout`: The maximum time in seconds to wait before the operation times out.
After saving the file, run `dagster job execute -f airbyte_dagster.py` to trigger the job with Dagster.
## That's it!
Don't be fooled by our simple example of only one Dagster Flow. Airbyte is a powerful data integration platform supporting many sources and destinations. The Airbyte Dagster Integration means Airbyte can now be easily used with the Dagster ecosystem - give it a shot!
We'd love to hear any questions or feedback on our Slack. If you see any rough edges or want to request a connector, feel free to create an issue on our GitHub or thumbs-up an existing issue.
## Related articles and guides
For additional information about using Dagster and Airbyte together, see the following: