CrewAI Plugin: LLM

FlotorchCrewAILLM is a CrewAI-compatible LLM wrapper that routes model inference through FloTorch’s Gateway, so it can be dropped into CrewAI agents like any other LLM.


from flotorch.crewai.llm import FlotorchCrewAILLM

API_KEY = "<your_api_key>"
BASE_URL = "https://gateway.flotorch.cloud"
MODEL_ID = "<your_flotorch_model_id>"

Constructor:

FlotorchCrewAILLM(
    model_id: str,
    api_key: str,
    base_url: str,
)

Creates a CrewAI-compatible LLM that wraps FloTorch’s Gateway LLM.

Features:

  • Seamless integration with CrewAI’s agent framework
  • Automatic response parsing and formatting
  • Error handling with fallback responses
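The wrapper’s actual fallback logic is internal, but the pattern behind the last bullet looks roughly like the following. This is an illustrative sketch only, not FloTorch’s implementation; `call_with_fallback`, `broken_model`, and the fallback message are hypothetical names invented for the example:

```python
# Illustrative sketch of "error handling with fallback responses":
# try the model call, and return a safe canned reply if it raises.
# (Hypothetical helper; not the actual FlotorchCrewAILLM internals.)
FALLBACK_RESPONSE = "I'm sorry, I couldn't process that request."

def call_with_fallback(call_model, prompt: str) -> str:
    """Invoke call_model(prompt); on any error, return a fallback reply."""
    try:
        return call_model(prompt)
    except Exception:
        return FALLBACK_RESPONSE

# A model stub that always fails still yields a usable reply.
def broken_model(prompt: str) -> str:
    raise RuntimeError("gateway unreachable")

print(call_with_fallback(broken_model, "Hello"))
```

The point is that agent runs degrade gracefully instead of crashing when the Gateway is unreachable.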

from crewai import Agent

# Create the LLM
llm = FlotorchCrewAILLM(
    model_id=MODEL_ID,
    api_key=API_KEY,
    base_url=BASE_URL,
)

# Use it with a CrewAI agent
agent = Agent(
    role="Customer Support Specialist",
    goal="Help customers with their inquiries",
    backstory="You are a helpful customer support agent",
    llm=llm,
    verbose=True,
)

Notes:

  • Uses FloTorch Gateway’s /api/openai/v1/chat/completions endpoint
  • Compatible with all CrewAI agent configurations
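Since the endpoint follows the OpenAI chat-completions convention, the request the wrapper sends can be sketched as below. This is an assumed request shape for illustration (the `Authorization` header and payload fields follow the standard OpenAI format; the wrapper may build its request differently):

```python
import json

BASE_URL = "https://gateway.flotorch.cloud"
MODEL_ID = "<your_flotorch_model_id>"

# The Gateway endpoint used by the wrapper.
url = f"{BASE_URL}/api/openai/v1/chat/completions"

# Assumed OpenAI-style chat-completions payload.
payload = {
    "model": MODEL_ID,
    "messages": [{"role": "user", "content": "Hello!"}],
}
body = json.dumps(payload)

# A POST with your HTTP client of choice would look like:
# requests.post(url, headers={"Authorization": f"Bearer {API_KEY}"}, data=body)
print(url)
```

This also means any OpenAI-compatible client can hit the same Gateway endpoint directly if you need to debug outside CrewAI.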