gcp_bigquery_write_api

Streams data into BigQuery using the Storage Write API.

Introduced in version 4.90.0.

Writes messages to a BigQuery table using the Storage Write API. This provides higher throughput and lower latency than the legacy streaming API or load jobs.

Messages can be formatted as JSON (default) or raw protobuf bytes. When using JSON format the component automatically fetches the table schema and converts each message to the corresponding proto representation.

The proto3 JSON mapping encodes int64 and uint64 values as strings, so JSON messages with 64-bit integer fields must supply string values (e.g. "age": "30", not "age": 30); otherwise the write fails with an unmarshalling error.
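When upstream data carries native integers, a Bloblang mapping can stringify the affected fields before they reach this output. A minimal sketch (the field name age and the project/dataset/table values are illustrative):

```yaml
pipeline:
  processors:
    - mapping: |
        root = this
        # INT64 columns must arrive as strings under the proto3 JSON mapping.
        root.age = this.age.string()
output:
  gcp_bigquery_write_api:
    project: my-project
    dataset: my_dataset
    table: my_table
```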

When batching is enabled, the table name is resolved from the first message in each batch, and all messages in that batch are written to the resolved table.

Common

outputs:
  label: ""
  gcp_bigquery_write_api:
    project: ""
    dataset: "" # No default (required)
    table: "" # No default (required)
    message_format: json
    max_in_flight: 64
    batching:
      count: 0
      byte_size: 0
      period: ""
      check: ""
      processors: [] # No default (optional)
    credentials_json: ""
Advanced

outputs:
  label: ""
  gcp_bigquery_write_api:
    project: ""
    dataset: "" # No default (required)
    table: "" # No default (required)
    message_format: json
    max_in_flight: 64
    batching:
      count: 0
      byte_size: 0
      period: ""
      check: ""
      processors: [] # No default (optional)
    credentials_json: ""
    target_principal: ""
    delegates: []
    stream_idle_timeout: 5m
    stream_sweep_interval: 1m
    endpoint:
      http: ""
      grpc: ""

Fields

batching

Allows you to configure a batching policy.

Type: object

# Examples:
batching:
  byte_size: 5000
  count: 0
  period: 1s

# ---

batching:
  count: 10
  period: 1s

# ---

batching:
  check: this.contains("END BATCH")
  count: 0
  period: 1m

batching.byte_size

The number of bytes at which the batch should be flushed. A value of 0 disables size-based batching.

Type: int

Default: 0

batching.check

A Bloblang query that should return a boolean value indicating whether a message should end a batch.

Type: string

Default: ""

# Examples:
check: this.type == "end_of_transaction"

batching.count

The number of messages at which the batch should be flushed. A value of 0 disables count-based batching.

Type: int

Default: 0

batching.period

A period in which an incomplete batch should be flushed regardless of its size.

Type: string

Default: ""

# Examples:
period: 1s

# ---

period: 1m

# ---

period: 500ms

batching.processors[]

A list of processors to apply to a batch as it is flushed. This allows you to aggregate and archive the batch however you see fit. Please note that all resulting messages are flushed as a single batch, therefore splitting the batch into smaller batches using these processors is a no-op.

Type: processor

# Examples:
processors:
  - archive:
      format: concatenate


# ---

processors:
  - archive:
      format: lines


# ---

processors:
  - archive:
      format: json_array

credentials_json

An optional JSON string containing GCP credentials. If empty, credentials are loaded from the environment.

This field contains sensitive information that usually shouldn’t be added to a configuration directly. For more information, see Secrets.

Type: string

Default: ""

dataset

The BigQuery dataset ID.

Type: string

delegates[]

Optional delegation chain for chained service account impersonation. Each service account must be granted roles/iam.serviceAccountTokenCreator on the next in the chain.

Type: array

Default: []

endpoint

Optional endpoint overrides for the BigQuery and Storage Write API clients.

Type: object

endpoint.grpc

Override the BigQuery Storage gRPC endpoint. Useful for local emulators.

Type: string

Default: ""

endpoint.http

Override the BigQuery HTTP endpoint. Useful for local emulators.

Type: string

Default: ""
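For local testing, both endpoints can be pointed at an emulator. A sketch (the host, ports, and project/dataset/table values are illustrative):

```yaml
output:
  gcp_bigquery_write_api:
    project: test-project
    dataset: test_dataset
    table: test_table
    endpoint:
      http: "http://localhost:9050"
      grpc: "localhost:9060"
```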

max_in_flight

The maximum number of messages to have in flight at a given time. Increase this to improve throughput.

Type: int

Default: 64

message_format

The format of input messages. Use 'json' to have the component convert JSON to proto automatically. Use 'protobuf' to supply raw proto-encoded bytes.

Type: string

Default: json

Options: json, protobuf

project

The GCP project ID. If empty, the project is auto-detected from the environment.

Type: string

Default: ""

stream_idle_timeout

How long a cached stream can remain unused before being closed. Relevant when the table field uses interpolation to route to many tables.

Type: string

Default: 5m

stream_sweep_interval

How often to check for idle streams to close.

Type: string

Default: 1m

table

The BigQuery table ID. Supports interpolation functions. When batching, resolved from the first message in each batch.

Type: string
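
As a sketch, the table can be routed per message with an interpolation function (the metadata key target_table and the project/dataset values are illustrative):

```yaml
output:
  gcp_bigquery_write_api:
    project: my-project
    dataset: my_dataset
    table: ${! metadata("target_table") }
```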

target_principal

Service account email to impersonate. When set, the output obtains tokens acting as this service account. Requires the caller to have roles/iam.serviceAccountTokenCreator on the target.

Type: string

Default: ""
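A sketch of chained impersonation, where the ambient credentials impersonate a delegate that in turn mints tokens for the target principal (all service account emails are illustrative):

```yaml
output:
  gcp_bigquery_write_api:
    project: my-project
    dataset: my_dataset
    table: my_table
    # The caller needs roles/iam.serviceAccountTokenCreator on the first delegate,
    # and each delegate needs the same role on the next account in the chain.
    target_principal: writer@my-project.iam.gserviceaccount.com
    delegates:
      - delegate@my-project.iam.gserviceaccount.com
```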