CrewAI Plugin: LLM
Overview
FlotorchCrewAILLM is a CrewAI-compatible LLM wrapper that uses FloTorch’s Gateway for model inference. It provides seamless integration with CrewAI’s agent framework.
```python
from flotorch.crewai.llm import FlotorchCrewAILLM

API_KEY = "<your_api_key>"
BASE_URL = "https://gateway.flotorch.cloud"
MODEL_ID = "<your_flotorch_model_id>"
```
FlotorchCrewAILLM
Constructor:
```python
FlotorchCrewAILLM(
    model_id: str,
    api_key: str,
    base_url: str,
)
```
Creates a CrewAI-compatible LLM that wraps FloTorch’s Gateway LLM.
Features
CrewAI Integration
- Seamless integration with CrewAI’s agent framework
- Automatic response parsing and formatting
- Error handling with fallback responses
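The fallback behavior can be pictured as a thin try/except wrapper around the gateway call: if the request fails, the agent receives a canned response instead of an exception. The sketch below is hypothetical and not FloTorch’s actual implementation; `call_with_fallback`, `FALLBACK_RESPONSE`, and `broken_gateway` are illustrative names.

```python
# Hypothetical sketch of error handling with a fallback response.
# None of these names come from the FloTorch SDK.
FALLBACK_RESPONSE = "I'm sorry, I couldn't process that request right now."

def call_with_fallback(call_gateway, prompt: str) -> str:
    """Call the gateway; if anything goes wrong, return a canned fallback."""
    try:
        return call_gateway(prompt)
    except Exception:
        return FALLBACK_RESPONSE

def broken_gateway(prompt: str) -> str:
    """Stand-in for a gateway call that fails."""
    raise RuntimeError("gateway unreachable")

print(call_with_fallback(broken_gateway, "Hello"))
# -> I'm sorry, I couldn't process that request right now.
```

This keeps a long-running crew from crashing on a transient gateway error, at the cost of occasionally surfacing a generic answer.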
Usage Example
```python
from crewai import Agent

# Create LLM
llm = FlotorchCrewAILLM(
    model_id=MODEL_ID,
    api_key=API_KEY,
    base_url=BASE_URL
)

# Use with CrewAI Agent
agent = Agent(
    role="Customer Support Specialist",
    goal="Help customers with their inquiries",
    backstory="You are a helpful customer support agent",
    llm=llm,
    verbose=True
)
```
- Uses FloTorch Gateway’s `/api/openai/v1/chat/completions` endpoint
- Compatible with all CrewAI agent configurations
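For reference, an OpenAI-style chat-completions request to that endpoint looks roughly like the following. This is a sketch only: the bearer-token header and the exact payload fields are assumptions based on the endpoint being OpenAI-compatible, not confirmed FloTorch Gateway behavior.

```python
import json

BASE_URL = "https://gateway.flotorch.cloud"
API_KEY = "<your_api_key>"             # placeholder credential
MODEL_ID = "<your_flotorch_model_id>"  # placeholder model id

# Endpoint the wrapper targets (per the note above)
url = f"{BASE_URL}/api/openai/v1/chat/completions"

# Assumed bearer-token auth; verify against FloTorch's Gateway docs
headers = {
    "Authorization": f"Bearer {API_KEY}",
    "Content-Type": "application/json",
}

# Minimal OpenAI-style chat payload
payload = {
    "model": MODEL_ID,
    "messages": [{"role": "user", "content": "Hello!"}],
}
body = json.dumps(payload)

# To actually send it (requires network access and a real API key):
# import urllib.request
# req = urllib.request.Request(url, data=body.encode(), headers=headers)
# print(json.loads(urllib.request.urlopen(req).read()))
```

In normal use you never build this request yourself; `FlotorchCrewAILLM` handles it, and the snippet is only useful for debugging or curl-style testing.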