Airbyte Terraform Provider

Getting Started with Airbyte's Terraform Provider

  • The Airbyte Terraform provider can be used with Airbyte OSS, Cloud, and Enterprise.
  • It does not support all OSS connectors; it supports those available in both OSS & Cloud.
  • Troubleshooting questions can be asked in our community Slack channels #ask-ai or #ask-community-troubleshooting.
  • The launch blog post contains context & use-case examples for this Terraform provider.
  • We have a full playlist on YouTube with a step-by-step video guide to help you get started.

1. Downloading the Provider

First, head to the Airbyte Terraform Provider page, then:

  1. Select the button in the top-right corner titled "Use Provider"
  2. Copy and paste the source code into a file named main.tf
  3. Run terraform init
  4. Run terraform plan
  5. Run terraform apply

2. Set Up the Provider

This section describes how to use the Terraform provider to provision resources in Airbyte. First, you will need to include the Airbyte Terraform provider in your required_providers list. If you followed the steps above, you should already have a file named main.tf; if you already have a Terraform file set up, add the following:

terraform {
  required_providers {
    airbyte = {
      source  = "airbytehq/airbyte"
      version = "0.4.1"
    }
  }
}

provider "airbyte" {
  // If running on Airbyte Cloud,
  // generate & save your API key from your Airbyte Cloud settings
  bearer_auth = var.api_key

  // If running locally (Airbyte OSS) with docker-compose using the airbyte-proxy,
  // include the actual password/username you've set up (or use the defaults below)
  password = "password"
  username = "airbyte"

  // If running locally (Airbyte OSS), include the server URL to the airbyte-api-server
  server_url = "http://localhost:8006/v1/"
}

We also suggest you use a file named variables.tf to hold values like your API key. You can then reference these variables in your main.tf file, and the user will be prompted for their values. Adding this will allow you to authenticate to the API. It would look like this:

variable "api_key" {
  type = string
}

variable "workspace_id" {
  type = string
}

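With these variables declared, Terraform will prompt for their values interactively; standard Terraform behavior also lets you supply them via a terraform.tfvars file or TF_VAR_-prefixed environment variables. A minimal sketch (the values below are placeholders):

```hcl
# terraform.tfvars -- loaded automatically by Terraform; values are placeholders
api_key      = "your-airbyte-api-key"
workspace_id = "your-workspace-uuid"
```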
3. Create a Source

To create a new Source, pick the Source resource you want to create. In this example, we are going to use the Stripe source.

resource "airbyte_source_stripe" "my_source_stripe" {
  configuration = {
    source_type          = "stripe"
    account_id           = "acct_123"
    client_secret        = "sk_live_abc"
    start_date           = "2023-07-01T00:00:00Z"
    lookback_window_days = 0
    slice_range          = 365
  }
  name         = "Stripe"
  workspace_id = var.workspace_id
}

Applying this will create a new Source in your Airbyte Workspace.
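If other modules or scripts need the generated ID, a standard Terraform output block can expose it. A sketch, using the same source_id attribute referenced elsewhere in this guide:

```hcl
# expose the generated ID so other modules or scripts can consume it
output "stripe_source_id" {
  value = airbyte_source_stripe.my_source_stripe.source_id
}
```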

For Custom Connectors, provide the definition ID along with a configuration string in JSON format. The configuration can be entered as a string, or using Terraform's jsonencode function may be helpful.

resource "airbyte_source_custom" "custom" {
  name          = "custom source connector"
  workspace_id  = var.workspace_id
  definition_id = "d96b3d38-a35c-4f68-902d-212f4b214ed2"
  configuration = jsonencode({ "configuration_1" = "setting_1" })
}
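For connectors with nested settings, jsonencode keeps the configuration readable and handles quoting and escaping for you. A hedged sketch — the setting names below are hypothetical; use your connector's actual spec:

```hcl
# hypothetical settings for illustration only
configuration = jsonencode({
  api_url   = "https://example.com/api"
  page_size = 100
  auth      = { token = var.api_key }
})
```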

4. Create a Destination

Similar to creating a Source, a Destination is a named resource with a configuration. In this example, we will use the BigQuery Destination. You can also create this BigQuery Destination automatically using our BigQuery Terraform Module.

resource "airbyte_destination_bigquery" "my_destination_bigquery" {
  configuration = {
    big_query_client_buffer_size_mb = 15
    credentials_json                = "...my_credentials_json..."
    dataset_id                      = "...my_dataset_id..."
    dataset_location                = "US"
    destination_type                = "bigquery"
    loading_method = {
      destination_bigquery_loading_method_gcs_staging = {
        credential = {
          destination_bigquery_loading_method_gcs_staging_credential_hmac_key = {
            credential_type    = "HMAC_KEY"
            hmac_key_access_id = "1234567890abcdefghij1234"
            hmac_key_secret    = "1234567890abcdefghij1234567890ABCDEFGHIJ"
          }
        }
        file_buffer_count        = 10
        gcs_bucket_name          = "airbyte_sync"
        gcs_bucket_path          = "data_sync/test"
        keep_files_in_gcs_bucket = "Keep all tmp files in GCS"
        method                   = "GCS Staging"
      }
    }
    project_id              = "...my_project_id..."
    transformation_priority = "batch"
  }
  name         = "BigQuery"
  workspace_id = var.workspace_id
}
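Rather than inlining service-account JSON, you can pass credentials_json through a variable marked sensitive, so Terraform redacts it from plan output. This is standard Terraform behavior; the variable name here is an assumption:

```hcl
variable "bigquery_credentials_json" {
  type      = string
  sensitive = true # redacts the value in `terraform plan` output
}
```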

5. Create a Connection

Now we will show how to create a Connection with the configured Source and Destination.

resource "airbyte_connection" "stripe_bigquery" {
  name           = "Stripe to BigQuery"
  source_id      = airbyte_source_stripe.my_source_stripe.source_id
  destination_id = airbyte_destination_bigquery.my_destination_bigquery.destination_id
  configuration = {
    schedule = {
      schedule_type = "manual"
    }
  }
}
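Beyond manual runs, the connection's schedule block also supports cron-style scheduling. A hedged sketch — Airbyte cron expressions use the Quartz format, and the cron_expression attribute name is assumed from the provider schema:

```hcl
# alternative schedule block for the connection above
configuration = {
  schedule = {
    schedule_type   = "cron"
    cron_expression = "0 0 12 * * ?" # hypothetical: run daily at 12:00 UTC
  }
}
```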