aws_bedrock_embeddings

Beta

Generates vector embeddings from text prompts, using the AWS Bedrock API.

Common

# Common config fields, showing default values
label: ""
aws_bedrock_embeddings:
  model: amazon.titan-embed-text-v1 # No default (required)
  text: "" # No default (optional)

Advanced

# All config fields, showing default values
label: ""
aws_bedrock_embeddings:
  region: ""
  endpoint: ""
  credentials:
    from_ec2_role: false
    role: ""
    role_external_id: ""
  model: amazon.titan-embed-text-v1 # No default (required)
  text: "" # No default (optional)

This processor sends text prompts to your chosen large language model (LLM), which generates vector embeddings for them using the AWS Bedrock API.

For more information, see the AWS Bedrock documentation.
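The following is a minimal sketch of the processor in a pipeline that generates an embedding for each message. The region and model values are placeholders; adjust them to match your own AWS account and model access.

# Example: generate an embedding for each message payload
pipeline:
  processors:
    - aws_bedrock_embeddings:
        region: us-east-1
        model: amazon.titan-embed-text-v1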

Fields

model

The ID of the LLM that you want to use to generate vector embeddings. For a full list, see the AWS Bedrock documentation.

Type: string

# Examples
model: amazon.titan-embed-text-v1
model: amazon.titan-embed-text-v2:0
model: cohere.embed-english-v3
model: cohere.embed-multilingual-v3

text

The prompt you want to generate a vector embedding for. If this field is not set, the processor submits the entire message payload as a string.

Type: string
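If your documents are structured, you can set the payload to the field you want to embed before this processor runs. The following sketch assumes JSON input with a description field, which is extracted with a mapping processor; the field name is illustrative.

# Example: embed the `description` field of a JSON document
pipeline:
  processors:
    - mapping: root = this.description
    - aws_bedrock_embeddings:
        model: amazon.titan-embed-text-v1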

region

The region in which your AWS resources are hosted.

Type: string

Default: ""

endpoint

Specify a custom endpoint for the AWS API.

Type: string

Default: ""

credentials

Manually configure the AWS credentials to use (optional). For more information, see the Amazon Web Services guide.

Type: object

credentials.from_ec2_role

Use the credentials of a host EC2 machine configured to assume an IAM role associated with the instance.

Type: bool

Default: false

credentials.role

The role ARN to assume.

Type: string

Default: ""

credentials.role_external_id

An external ID to use when assuming a role.

Type: string

Default: ""