LangChain Plugin: LLM

FlotorchLangChainLLM is a LangChain BaseChatModel backed by the FloTorch Gateway. It supports tool/function bindings and optional structured outputs.


from flotorch.langchain.llm import FlotorchLangChainLLM

API_KEY = "<your_api_key>"
BASE_URL = "https://gateway.flotorch.cloud"
MODEL_ID = "<your_flotorch_model_id>"

# Chat model backed by the FloTorch Gateway
llm = FlotorchLangChainLLM(model_id=MODEL_ID, api_key=API_KEY, base_url=BASE_URL)
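
Because FlotorchLangChainLLM implements LangChain's BaseChatModel interface, it can be invoked directly like any other chat model before binding anything. A minimal sketch; the prompt string is illustrative.

# Plain chat completion with no tools or structured output
response = llm.invoke("Say hello in one short sentence.")
print(response.content)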

# Bind LangChain tools or raw OpenAI-style function definitions
llm_with_tools = llm.bind_tools(tools)
llm_with_functions = llm.bind(functions=functions)

# Constrain responses to a schema (Pydantic model or JSON schema)
structured = llm.with_structured_output(schema)
result = structured.invoke(messages)
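
The snippet above leaves tools, schema, and messages undefined. A minimal sketch of what they might look like, using standard LangChain constructs (the @tool decorator and a Pydantic model); get_weather and Answer are illustrative placeholders, not part of the FloTorch plugin.

from langchain_core.tools import tool
from pydantic import BaseModel, Field

@tool
def get_weather(city: str) -> str:
    """Return a short weather summary for a city."""
    return f"It is sunny in {city}."

tools = [get_weather]          # for llm.bind_tools(tools)

class Answer(BaseModel):
    """Structured output schema."""
    answer: str = Field(description="Final answer text")
    confidence: float = Field(description="Confidence between 0 and 1")

schema = Answer                # for llm.with_structured_output(schema)
messages = [("human", "What is the capital of France?")]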

  • Uses the FloTorch Gateway endpoint /api/openai/v1/chat/completions
  • Works with LangChain's create_openai_functions_agent (see the sketch below)
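
As a reference for the second point, a minimal agent sketch, assuming the llm and tools defined above; the prompt wording is illustrative and the agent/executor calls are standard LangChain APIs, not FloTorch-specific.

from langchain.agents import AgentExecutor, create_openai_functions_agent
from langchain_core.prompts import ChatPromptTemplate, MessagesPlaceholder

# The prompt must expose {input} and an agent_scratchpad placeholder
prompt = ChatPromptTemplate.from_messages([
    ("system", "You are a helpful assistant."),
    ("human", "{input}"),
    MessagesPlaceholder("agent_scratchpad"),
])

agent = create_openai_functions_agent(llm, tools, prompt)
executor = AgentExecutor(agent=agent, tools=tools)
result = executor.invoke({"input": "What is the weather in Paris?"})
print(result["output"])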