Class: OpenAIContextAwareAgent

The runner manages task execution and provides a high-level API for the user.

Extends

OpenAIAgent
Constructors

new OpenAIContextAwareAgent()

new OpenAIContextAwareAgent(params): OpenAIContextAwareAgent

Parameters

params: LLMAgentParams & ContextAwareConfig

Returns

OpenAIContextAwareAgent

Overrides

OpenAIAgent.constructor

Defined in

packages/llamaindex/src/agent/openai.ts:32
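The constructor's parameter type `LLMAgentParams & ContextAwareConfig` is an intersection: a single `params` object must satisfy both shapes at once. The sketch below illustrates that pattern with local stand-in types so it runs without the llamaindex package; the field names (`tools`, `contextRetriever`) are illustrative assumptions, not the library's exact definitions.

```typescript
// Stand-ins for LLMAgentParams and ContextAwareConfig (illustrative only).
type StubAgentParams = { tools: string[] };
type StubContextConfig = {
  contextRetriever: { retrieve(query: string): string[] };
};

// Like OpenAIContextAwareAgent, the constructor takes the intersection of
// both param types: one object must carry the agent params AND the
// context-aware config.
class StubContextAwareAgent {
  constructor(readonly params: StubAgentParams & StubContextConfig) {}

  retrieveContext(query: string): string[] {
    return this.params.contextRetriever.retrieve(query);
  }
}

const agent = new StubContextAwareAgent({
  tools: [],
  contextRetriever: { retrieve: (q) => [`context for ${q}`] },
});

console.log(agent.retrieveContext("pricing")[0]);
```

Omitting either half of the intersection (say, `contextRetriever`) is a compile-time error, which is how the type keeps context-aware construction honest.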

Properties

createStore()

createStore: () => object = AgentRunner.defaultCreateStore

Returns

object

Inherited from

OpenAIAgent.createStore

Defined in

packages/llamaindex/src/agent/llm.ts:44


taskHandler

taskHandler: TaskHandler<LLM<object, object>> = AgentRunner.defaultTaskHandler

Inherited from

OpenAIAgent.taskHandler

Defined in

packages/llamaindex/src/agent/llm.ts:45

Accessors

chatHistory

get chatHistory(): ChatMessage<AdditionalMessageOptions>[]

Returns

ChatMessage<AdditionalMessageOptions>[]

Inherited from

OpenAIAgent.chatHistory

Defined in

packages/llamaindex/src/agent/base.ts:277


llm

get llm(): AI

Returns

AI

Inherited from

OpenAIAgent.llm

Defined in

packages/llamaindex/src/agent/base.ts:273


verbose

get verbose(): boolean

Returns

boolean

Inherited from

OpenAIAgent.verbose

Defined in

packages/llamaindex/src/agent/base.ts:281

Methods

chat()

chat(params)

chat(params): Promise<EngineResponse>

Sends the message, along with the class's current chat history, to the LLM.

Parameters

params: ChatEngineParamsNonStreaming

Returns

Promise<EngineResponse>

Inherited from

OpenAIAgent.chat

Defined in

packages/llamaindex/src/agent/base.ts:348

chat(params)

chat(params): Promise<ReadableStream<EngineResponse>>

Sends the message, along with the class's current chat history, to the LLM.

Parameters

params: ChatEngineParamsStreaming

Returns

Promise<ReadableStream<EngineResponse>>

Inherited from

OpenAIAgent.chat

Defined in

packages/llamaindex/src/agent/base.ts:349
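The two overloads differ only in what the promise resolves to: a single response, or a `ReadableStream` of partial responses when streaming is requested. The sketch below mirrors that overload shape with a local stub (`EngineResponse` reduced to a string-bearing object) so it runs without the llamaindex package; the chunking is mocked, not the library's behavior.

```typescript
// Minimal stand-in for EngineResponse.
type StubResponse = { message: string };

// Two overloads, mirroring chat(): plain call -> one response;
// stream: true -> a ReadableStream of partial responses.
function chat(params: { message: string }): Promise<StubResponse>;
function chat(params: {
  message: string;
  stream: true;
}): Promise<ReadableStream<StubResponse>>;
function chat(params: {
  message: string;
  stream?: true;
}): Promise<StubResponse | ReadableStream<StubResponse>> {
  if (!params.stream) {
    return Promise.resolve({ message: `echo: ${params.message}` });
  }
  // Emit the reply in two chunks to mimic token streaming.
  const chunks: StubResponse[] = [
    { message: "echo: " },
    { message: params.message },
  ];
  return Promise.resolve(
    new ReadableStream<StubResponse>({
      start(controller) {
        for (const c of chunks) controller.enqueue(c);
        controller.close();
      },
    }),
  );
}

async function demoChat() {
  // Non-streaming: one resolved response.
  const full = await chat({ message: "hi" });
  console.log(full.message);

  // Streaming: drain the ReadableStream with a reader.
  const stream = await chat({ message: "hi", stream: true });
  const reader = stream.getReader();
  let text = "";
  for (;;) {
    const { done, value } = await reader.read();
    if (done) break;
    text += value.message;
  }
  console.log(text);
}

demoChat();
```

Because the overloads key off the literal type of `stream`, TypeScript picks the right return type at the call site with no casting.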


createTask()

createTask(message, stream, verbose, chatHistory?): ReadableStream<TaskStepOutput<LLM<object, object>, object, object>>

Parameters

message: MessageContent

stream: boolean = false

verbose: undefined | boolean = undefined

chatHistory?: ChatMessage<object>[]

Returns

ReadableStream<TaskStepOutput<LLM<object, object>, object, object>>

Inherited from

OpenAIAgent.createTask

Defined in

packages/llamaindex/src/agent/base.ts:308
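`createTask` returns a `ReadableStream` of step outputs, one per reasoning step, rather than a single answer. A minimal sketch of the consumption pattern, assuming a step shape with `output` and `isLast` fields (a stand-in for `TaskStepOutput`, stubbed locally so the snippet runs without the llamaindex package):

```typescript
// Stand-in for TaskStepOutput: each step carries its output and a flag
// marking the final step of the task.
type StubStepOutput = { output: string; isLast: boolean };

// Mocked createTask: emits intermediate steps, then a final one.
function createTask(message: string): ReadableStream<StubStepOutput> {
  const steps: StubStepOutput[] = [
    { output: `thinking about "${message}"`, isLast: false },
    { output: `answer for "${message}"`, isLast: true },
  ];
  return new ReadableStream<StubStepOutput>({
    start(controller) {
      for (const s of steps) controller.enqueue(s);
      controller.close();
    },
  });
}

// Typical consumption: pull steps until the stream ends, keeping the
// output of the step flagged as last.
async function runTask(message: string): Promise<string> {
  const reader = createTask(message).getReader();
  let finalOutput = "";
  for (;;) {
    const { done, value } = await reader.read();
    if (done) break;
    if (value.isLast) finalOutput = value.output;
  }
  return finalOutput;
}

runTask("2 + 2?").then((answer) => console.log(answer));
```

Intermediate steps can also be logged or surfaced to the user as they arrive, which is the point of getting a stream instead of a promise of the final answer.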


getTools()

getTools(query): BaseToolWithCall[] | Promise<BaseToolWithCall[]>

Parameters

query: MessageContent

Returns

BaseToolWithCall[] | Promise<BaseToolWithCall[]>

Inherited from

OpenAIAgent.getTools

Defined in

packages/llamaindex/src/agent/base.ts:289
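Note the return type is a union: a plain array or a promise of one. `await` accepts both, so callers can normalize the union with a single `await`. A small self-contained sketch (the tool type is a stand-in, not the real `BaseToolWithCall`):

```typescript
// Stand-in for BaseToolWithCall.
type StubTool = { name: string };

// Two providers matching the two arms of the union return type.
function getToolsSync(query: string): StubTool[] {
  return [{ name: `sync-tool-for:${query}` }];
}

async function getToolsAsync(query: string): Promise<StubTool[]> {
  return [{ name: `async-tool-for:${query}` }];
}

// Awaiting a plain array is a no-op; awaiting a promise unwraps it,
// so one await handles both arms of StubTool[] | Promise<StubTool[]>.
async function resolveTools(
  tools: StubTool[] | Promise<StubTool[]>,
): Promise<StubTool[]> {
  return await tools;
}

async function demoTools() {
  const a = await resolveTools(getToolsSync("weather"));
  const b = await resolveTools(getToolsAsync("weather"));
  console.log(a[0].name, b[0].name);
}

demoTools();
```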


reset()

reset(): void

Resets the chat history so that it's empty.

Returns

void

Inherited from

OpenAIAgent.reset

Defined in

packages/llamaindex/src/agent/base.ts:285