LangChain Plugin: LLM
Overview
FlotorchLangChainLLM is a LangChain BaseChatModel backed by the FloTorch Gateway. It supports tool/function bindings and optional structured outputs.
from flotorch.langchain.llm import FlotorchLangChainLLM
API_KEY = "<your_api_key>"
BASE_URL = "https://gateway.flotorch.cloud"
MODEL_ID = "<your_flotorch_model_id>"
llm = FlotorchLangChainLLM(model_id=MODEL_ID, api_key=API_KEY, base_url=BASE_URL)
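Once constructed, the model behaves like any other LangChain chat model. A minimal invocation sketch (the message contents are illustrative):

```python
from langchain_core.messages import HumanMessage, SystemMessage

# Invoke the gateway-backed model with a list of LangChain messages.
messages = [
    SystemMessage(content="You are a concise assistant."),
    HumanMessage(content="What does the FloTorch Gateway do?"),
]
response = llm.invoke(messages)  # returns an AIMessage
print(response.content)
```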
Bind tools
llm_with_tools = llm.bind_tools(tools)
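A short sketch of defining, binding, and calling a tool, assuming the standard BaseChatModel tool-calling interface; get_weather is a hypothetical placeholder tool:

```python
from langchain_core.tools import tool

@tool
def get_weather(city: str) -> str:
    """Return a short weather description for a city."""
    # Placeholder body; a real tool would call a weather API.
    return f"It is sunny in {city}."

tools = [get_weather]
llm_with_tools = llm.bind_tools(tools)

# Tool calls requested by the model are surfaced on the AIMessage.
ai_msg = llm_with_tools.invoke("What is the weather in Paris?")
for call in ai_msg.tool_calls:
    print(call["name"], call["args"])
```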
Bind functions (OpenAI functions agent)
llm_with_functions = llm.bind(functions=functions)
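In the agent flow referenced in the notes below, create_openai_functions_agent performs the bind(functions=...) call itself, so the plain llm instance is passed in. A sketch that reuses the hypothetical get_weather tool from the previous example; the prompt wording is illustrative:

```python
from langchain.agents import AgentExecutor, create_openai_functions_agent
from langchain_core.prompts import ChatPromptTemplate, MessagesPlaceholder

prompt = ChatPromptTemplate.from_messages([
    ("system", "You are a helpful assistant."),
    ("human", "{input}"),
    MessagesPlaceholder("agent_scratchpad"),  # required by the agent
])

agent = create_openai_functions_agent(llm, tools, prompt)
executor = AgentExecutor(agent=agent, tools=tools)

result = executor.invoke({"input": "What is the weather in Paris?"})
print(result["output"])
```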
Structured outputs
structured = llm.with_structured_output(schema)
result = structured.invoke(messages)
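A sketch of structured output using a Pydantic schema, assuming with_structured_output follows the standard LangChain interface; MovieReview is an illustrative model:

```python
from pydantic import BaseModel, Field

class MovieReview(BaseModel):
    """Structured fields to extract from a free-form review."""
    title: str = Field(description="Title of the movie")
    rating: int = Field(description="Rating on a 1-10 scale")

structured = llm.with_structured_output(MovieReview)
review = structured.invoke("Inception was brilliant, easily a 9 out of 10.")
print(review.title, review.rating)
```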
- Uses the FloTorch Gateway endpoint /api/openai/v1/chat/completions
- Works with LangChain's create_openai_functions_agent