A Large Language Model (LLM) class that calls Amazon Bedrock, the managed foundation-model service on AWS. It extends the base LLM class and implements the BaseBedrockInput interface. Requests are authenticated with AWS credentials, and the instance can be configured with parameters such as the model ID, the AWS region, and the maximum number of tokens to generate.
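
For illustration, a minimal usage sketch (assuming the class is exported as `Bedrock` from `@langchain/community/llms/bedrock`; adjust the import path and credential handling to match your installation):

```typescript
import { Bedrock } from "@langchain/community/llms/bedrock";

// Configure the model; the field names mirror the properties documented below.
const llm = new Bedrock({
  model: "amazon.titan-tg1-large", // default model
  region: "us-east-1",
  credentials: {
    accessKeyId: process.env.AWS_ACCESS_KEY_ID!,
    secretAccessKey: process.env.AWS_SECRET_ACCESS_KEY!,
  },
  maxTokens: 256,
  temperature: 0.7,
});

const completion = await llm.invoke("Explain Amazon Bedrock in one sentence.");
console.log(completion);
```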

Hierarchy

Constructors

Properties

codec: EventStreamCodec = ...
credentials: CredentialType
fetchFn: {
    (input, init?): Promise<Response>;
    (input, init?): Promise<Response>;
}

Type declaration

    • (input, init?): Promise<Response>
    • Parameters

      • input: string | URL | Request
      • Optional init: RequestInit

      Returns Promise<Response>

    • (input, init?): Promise<Response>
    • Parameters

      • input: RequestInfo
      • Optional init: RequestInit

      Returns Promise<Response>
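
As a sketch of overriding this property (the wrapper below is hypothetical and simply forwards to the global fetch), fetchFn can be replaced to route requests through custom logic such as logging or a proxy:

```typescript
import { Bedrock } from "@langchain/community/llms/bedrock";

// Hypothetical wrapper around the global fetch that logs each outgoing request.
const loggingFetch: typeof fetch = async (input, init) => {
  const url =
    typeof input === "string" ? input : input instanceof URL ? input.href : input.url;
  console.log("Bedrock request:", url);
  return fetch(input, init);
};

const llm = new Bedrock({
  model: "amazon.titan-tg1-large",
  region: "us-east-1",
  fetchFn: loggingFetch, // credentials omitted for brevity; supply them as shown above
});
```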

model: string = "amazon.titan-tg1-large"
region: string
streaming: boolean = false
endpointHost?: string
maxTokens?: number = undefined
modelKwargs?: Record<string, unknown>
stopSequences?: string[]

⚠️ Deprecated ⚠️

This property is deprecated and will be removed in a future release. It is not recommended for use.

temperature?: number = undefined
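
For example, the streaming flag above can be enabled to receive tokens incrementally rather than as a single completion (a sketch assuming the same import as above):

```typescript
import { Bedrock } from "@langchain/community/llms/bedrock";

const llm = new Bedrock({
  model: "amazon.titan-tg1-large",
  region: "us-east-1",
  streaming: true, // emit chunks as they arrive instead of one final string
  // credentials omitted for brevity; supply them as shown in the first example
});

// Each chunk is a string fragment of the generated text.
const stream = await llm.stream("Write a haiku about the cloud.");
for await (const chunk of stream) {
  process.stdout.write(chunk);
}
```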

Methods

  • Parameters

    • Optional options: Omit<BaseLLMCallOptions, never>

    Returns {
        maxTokens: undefined | number;
        model: string;
        modelKwargs: undefined | Record<string, unknown>;
        region: string;
        stop: undefined | string[];
        temperature: undefined | number;
    }

    • maxTokens: undefined | number
    • model: string
    • modelKwargs: undefined | Record<string, unknown>
    • region: string
    • stop: undefined | string[]
    • temperature: undefined | number
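
The returned object reflects the constructor-level configuration (model, region, maxTokens, temperature, modelKwargs) together with a stop value derived from the call options. A hedged sketch of supplying stop sequences at call time:

```typescript
import { Bedrock } from "@langchain/community/llms/bedrock";

const llm = new Bedrock({
  model: "amazon.titan-tg1-large",
  region: "us-east-1",
  // credentials omitted for brevity; supply them as shown in the first example
});

// Stop sequences are provided per call through the call options and surface
// as the `stop` field in the parameters returned above.
const text = await llm.invoke("List three AWS services:", { stop: ["\n\n"] });
console.log(text);
```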
