gcp_vertex_ai_chat
Beta
Generates responses to messages in a chat conversation, using the Vertex AI API.
Introduced in version 4.34.0.
# Common configuration fields, showing default values
label: ""
gcp_vertex_ai_chat:
project: "" # No default (required)
credentials_json: "" # No default (optional)
location: us-central1 # No default (optional)
model: gemini-1.5-pro-001 # No default (required)
prompt: "" # No default (optional)
temperature: 0 # No default (optional)
max_tokens: 0 # No default (optional)
response_format: text
# All configuration fields, showing default values
label: ""
gcp_vertex_ai_chat:
project: "" # No default (required)
credentials_json: "" # No default (optional)
location: us-central1 # No default (optional)
model: gemini-1.5-pro-001 # No default (required)
prompt: "" # No default (optional)
system_prompt: "" # No default (optional)
temperature: 0 # No default (optional)
max_tokens: 0 # No default (optional)
response_format: text
top_p: 0 # No default (optional)
top_k: 0 # No default (optional)
stop: [] # No default (optional)
presence_penalty: 0 # No default (optional)
frequency_penalty: 0 # No default (optional)
This processor sends prompts to your chosen large language model (LLM) and generates text from the responses, using the Vertex AI API.
For more information, see the Vertex AI documentation.
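As a minimal sketch, the processor can be placed in a pipeline as follows; the project ID and prompt are placeholder values:

# Example pipeline (illustrative values)
pipeline:
  processors:
    - gcp_vertex_ai_chat:
        project: my-gcp-project # hypothetical project ID
        model: gemini-1.5-pro-001
        prompt: "Summarize the following text: ${! content() }"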
Fields
credentials_json
An optional field to set a Google Service Account Credentials JSON.
This field contains sensitive information that usually shouldn’t be added to a configuration directly. For more information, see Secrets.
Type: string
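For example, rather than embedding the credentials inline, you can reference an environment variable (the variable name below is illustrative):

# Examples
credentials_json: "${GCP_SERVICE_ACCOUNT_JSON}"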
location
Specify the location of a fine-tuned model. For base models, you can omit this field.
Type: string
# Examples
location: us-central1
model
The name of the LLM to use. For a full list of models, see the Vertex AI Model Garden.
Type: string
# Examples
model: gemini-1.5-pro-001
model: gemini-1.5-flash-001
prompt
The prompt you want to generate a response for. By default, the processor submits the entire payload as a string. This field supports interpolation functions.
Type: string
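For example, to build the prompt from the message contents using an interpolation function (an illustrative sketch):

# Examples
prompt: "Translate the following message into French: ${! content() }"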
system_prompt
The system prompt to submit to the Vertex AI LLM. This field supports interpolation functions.
Type: string
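For example, a system prompt can be used to constrain the style of the generated responses (illustrative value):

# Examples
system_prompt: "You are a helpful assistant. Answer in one short paragraph."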
response_format
The format of the generated response. You must also prompt the model to output the appropriate response type.
Type: string
Default: "text"
Options: text, json.
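For example, to receive JSON you would typically set the format and also instruct the model in the prompt (an illustrative sketch):

# Examples
response_format: json
prompt: 'Return a JSON object with the keys "summary" and "sentiment" for this text: ${! content() }'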
stop
Sets the stop sequences to use. When one of these patterns is encountered, the LLM stops generating text and returns the final response.
Type: array
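For example, to stop generation at a custom end marker or a blank line (illustrative values):

# Examples
stop:
  - "END"
  - "\n\n"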