Python SDK for WorkflowAI


Official SDK from WorkflowAI for Python.

This SDK is designed for Python teams who prefer code-first development. It provides greater control through direct code integration while still leveraging the full power of the WorkflowAI platform, complementing the web-app experience.

Try in Cursor:

install `workflowai` with `pip install workflowai` and, from https://docs.workflowai.com/python-sdk/agent, build an agent that [add description of the agent you want to build]

Key Features

  • Model-agnostic: Works with all major AI models, including OpenAI (GPT), Anthropic (Claude), Google (Gemini), Mistral, DeepSeek, and Grok, through a unified interface that makes switching between providers seamless. View all supported models.
  • Open-source and flexible deployment: WorkflowAI is fully open-source with flexible deployment options. Run it self-hosted on your own infrastructure for maximum data control, or use the managed WorkflowAI Cloud service for hassle-free updates and automatic scaling.

  • Structured output: Uses Pydantic models to validate and structure AI responses. WorkflowAI ensures your AI responses always match your defined structure, simplifying integrations, reducing parsing errors, and making your data reliable and ready to use. Learn more about structured input and output.

  • Observability integrated: Built-in monitoring and logging capabilities that provide insights into your AI workflows, making debugging and optimization straightforward. Learn more about observability features.
  • Streaming supported: Enables real-time streaming of AI responses for low latency applications, with immediate validation of partial outputs. Learn more about streaming capabilities.
import enum

from pydantic import BaseModel, Field

import workflowai
from workflowai import Model

class ProductInput(BaseModel):
    description: str

class Category(str, enum.Enum):
    ELECTRONICS = "Electronics"
    CLOTHING = "Clothing"
    HOME_GOODS = "Home Goods"
    BEAUTY = "Beauty"
    SPORTS = "Sports"

class ProductAnalysisOutput(BaseModel):
    tags: list[str] = Field(default_factory=list)
    summary: str
    category: Category

@workflowai.agent(id="product-tagger", model=Model.DEEPSEEK_V3_LATEST)
async def product_analyzer(input: ProductInput) -> ProductAnalysisOutput:
    """
    Analyze a product description.
    """
    ...

# Inside an async function:
async for chunk in product_analyzer.stream(ProductInput(description="....")):
    # chunk is a partial ProductAnalysisOutput object. Fields are progressively
    # filled, but the object structure respects the type hint even when incomplete.
    print(chunk.output)
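The partial chunks above depend on the output model tolerating incomplete data. As a rough illustration of the idea (plain Pydantic, not the SDK's internals), `model_construct` builds an instance without requiring every field, falling back to the declared defaults:

```python
from pydantic import BaseModel, Field

class ProductAnalysisOutput(BaseModel):
    tags: list[str] = Field(default_factory=list)
    summary: str = ""

# model_construct skips validation, so a partially filled chunk
# still has the declared structure and the missing fields' defaults.
partial = ProductAnalysisOutput.model_construct(tags=["laptop"])
print(partial.tags)     # ['laptop']
print(partial.summary)  # ''
```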
  • Provider fallback: Automatically switches to alternative AI providers when the primary provider fails, ensuring high availability and reliability for your AI applications. This feature allows you to define fallback strategies that maintain service continuity even during provider outages or rate limiting.

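WorkflowAI applies fallback on the platform side, but the underlying pattern can be sketched in plain Python (the stub provider callables below are hypothetical, not the SDK API):

```python
import asyncio

async def call_with_fallback(providers, prompt: str) -> str:
    """Try each provider in order; return the first successful response."""
    last_error = None
    for provider in providers:
        try:
            return await provider(prompt)
        except Exception as e:  # rate limit, outage, timeout, ...
            last_error = e
    raise RuntimeError("all providers failed") from last_error

# Stub providers for illustration only:
async def primary(prompt: str) -> str:
    raise TimeoutError("provider down")

async def backup(prompt: str) -> str:
    return f"answer to: {prompt}"

print(asyncio.run(call_with_fallback([primary, backup], "hi")))  # answer to: hi
```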

  • Hosted tools: Comes with powerful hosted tools like web search and web browsing capabilities, allowing your agents to access real-time information from the internet. These tools enable your AI applications to retrieve up-to-date data, research topics, and interact with web content without requiring complex integrations. Learn more about hosted tools.
  • Custom tools support: Easily extend your agents' capabilities by creating custom tools tailored to your specific needs. Whether you need to query internal databases, call external APIs, or perform specialized calculations, WorkflowAI's tool framework makes it simple to augment your AI with domain-specific functionality. Learn more about custom tools.
from datetime import datetime
from typing import Annotated
from zoneinfo import ZoneInfo

import httpx

import workflowai
from workflowai import Model

# Sync tool
def get_current_time(timezone: Annotated[str, "The timezone to get the current time in, e.g. Europe/Paris"]) -> str:
    """Return the current time in the given timezone in ISO format."""
    return datetime.now(ZoneInfo(timezone)).isoformat()

# Tools can also be async
async def get_latest_pip_version(package_name: Annotated[str, "The name of the pip package to check"]) -> str:
    """Fetch the latest version of a pip package from PyPI."""
    url = f"https://pypi.org/pypi/{package_name}/json"
    async with httpx.AsyncClient() as client:
        response = await client.get(url)
        response.raise_for_status()
        data = response.json()
        return data["info"]["version"]

# AnswerQuestionInput and AnswerQuestionOutput are Pydantic models,
# defined as in the cost-tracking example below.
@workflowai.agent(
    id="research-helper",
    tools=[get_current_time, get_latest_pip_version],
    model=Model.GPT_4O_LATEST,
)
async def answer_question(_: AnswerQuestionInput) -> AnswerQuestionOutput:
    ...
  • Integrated with WorkflowAI: The SDK seamlessly syncs with the WorkflowAI web application, giving you access to a powerful playground where you can edit prompts and compare models side-by-side. This hybrid approach combines the flexibility of code-first development with the visual tools needed for effective prompt engineering and model evaluation.

  • Multimodality support: Build agents that can handle multiple modalities, such as images, PDFs, documents, and audio. Learn more about multimodal capabilities.

  • Caching support: To save money and improve latency, WorkflowAI supports caching. When enabled, identical requests return cached results instead of making new API calls to AI providers. Learn more about caching capabilities.
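As a rough sketch of the general idea (not the SDK's cache implementation), caching keys a run on a hash of the agent id plus its serialized input, so identical requests skip the provider call:

```python
import hashlib
import json

_cache: dict[str, str] = {}

def cached_run(agent_id: str, payload: dict, run_fn) -> str:
    """Return a cached result for identical (agent, input) pairs."""
    key = hashlib.sha256(
        (agent_id + json.dumps(payload, sort_keys=True)).encode()
    ).hexdigest()
    if key not in _cache:
        _cache[key] = run_fn(payload)  # only hit the provider on a cache miss
    return _cache[key]

# Demonstrate with a fake provider that records its calls:
calls = []
def fake_run(payload):
    calls.append(payload)
    return "result"

cached_run("agent", {"q": "hi"}, fake_run)
cached_run("agent", {"q": "hi"}, fake_run)
print(len(calls))  # 1 -> the second identical request was served from cache
```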

  • Cost tracking: Automatically calculates and tracks the cost of each AI model run, providing transparency and helping you manage your AI budget effectively. Learn more about cost tracking.

from pydantic import BaseModel

import workflowai

class AnswerQuestionInput(BaseModel):
    question: str

class AnswerQuestionOutput(BaseModel):
    answer: str

@workflowai.agent(id="answer-question")
async def answer_question(input: AnswerQuestionInput) -> AnswerQuestionOutput:
    """
    Answer a question.
    """
    ...

# Inside an async function:
run = await answer_question.run(AnswerQuestionInput(question="What is the history of Paris?"))
print(f"Cost: $ {run.cost_usd:.5f}")
print(f"Latency: {run.duration_seconds:.2f}s")

# Cost: $ 0.00745
# Latency: 8.99s

Get Started

workflowai requires Python 3.9 or higher.

pip install workflowai

API Key

To get started quickly, get an API key from WorkflowAI Cloud. For maximum control over your data, you can also use your self-hosted instance, though this requires additional setup time.

Then, set the WORKFLOWAI_API_KEY environment variable:

export WORKFLOWAI_API_KEY="your-api-key"

First Agent

Here's a simple example of a WorkflowAI agent that extracts structured flight information from email content:

import asyncio
from datetime import datetime
from enum import Enum

from pydantic import BaseModel, Field

import workflowai
from workflowai import Model

# Input class
class EmailInput(BaseModel):
    email_content: str

# Output class
class FlightInfo(BaseModel):
    # Enum for standardizing flight status values
    class Status(str, Enum):
        """Possible statuses for a flight booking."""
        CONFIRMED = "Confirmed"
        PENDING = "Pending"
        CANCELLED = "Cancelled"
        DELAYED = "Delayed"
        COMPLETED = "Completed"

    passenger: str
    airline: str
    flight_number: str
    from_airport: str = Field(description="Three-letter IATA airport code for departure")
    to_airport: str = Field(description="Three-letter IATA airport code for arrival")
    departure: datetime
    arrival: datetime
    status: Status

# Agent definition
@workflowai.agent(
    id="flight-info-extractor",
    model=Model.GEMINI_2_0_FLASH_LATEST,
)
async def extract_flight_info(email_input: EmailInput) -> FlightInfo:
    # Agent prompt
    """
    Extract flight information from an email containing booking details.
    """
    ...


async def main():
    email = """
    Dear Jane Smith,

    Your flight booking has been confirmed. Here are your flight details:

    Flight: UA789
    From: SFO
    To: JFK
    Departure: 2024-03-25 9:00 AM
    Arrival: 2024-03-25 5:15 PM
    Booking Reference: XYZ789

    Total Journey Time: 8 hours 15 minutes
    Status: Confirmed

    Thank you for choosing United Airlines!
    """
    run = await extract_flight_info.run(EmailInput(email_content=email))
    print(run)


if __name__ == "__main__":
    asyncio.run(main())


# Output:
# ==================================================
# {
#   "passenger": "Jane Smith",
#   "airline": "United Airlines",
#   "flight_number": "UA789",
#   "from_airport": "SFO",
#   "to_airport": "JFK",
#   "departure": "2024-03-25T09:00:00",
#   "arrival": "2024-03-25T17:15:00",
#   "status": "Confirmed"
# }
# ==================================================
# Cost: $ 0.00009
# Latency: 1.18s
# URL: https://workflowai.com/_/agents/flight-info-extractor/runs/0195ee02-bdc3-72b6-0e0b-671f0b22b3dc

Ready to run! This example works straight out of the box, no tweaking needed.

Agents built with the workflowai SDK can also be run from the WorkflowAI web application.

WorkflowAI Playground

And the runs executed via the SDK are synced with the web application.

WorkflowAI Runs

Documentation

Complete documentation is available at docs.workflowai.com/python-sdk.

Examples

Workflows

For advanced workflow patterns and examples, please refer to the Workflows README for more details.

  • chain.py: Sequential processing where tasks execute in a fixed sequence, ideal for linear processes.
  • routing.py: Directs work based on intermediate results to specialized agents, adapting behavior based on context.
  • parallel_processing.py: Splits work into independent subtasks that run concurrently for faster processing.
  • orchestrator_worker.py: An orchestrator plans work, and multiple worker agents execute parts in parallel.
  • evaluator_optimizer.py: Employs an iterative feedback loop to evaluate and refine output quality.
  • chain_of_agents.py: Processes long documents sequentially across multiple agents, passing findings along the chain.
  • agent_delegation.py: Enables dynamic workflows where one agent invokes other agents through tools based on the task.
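As a rough sketch of the first pattern above, a chain pipes each step's output into the next in a fixed sequence (stub functions stand in for real agents here):

```python
import asyncio

# Stub "agents": each step consumes the previous step's output.
async def summarize(text: str) -> str:
    return f"summary({text})"

async def translate(summary: str) -> str:
    return f"translated({summary})"

async def chain(text: str) -> str:
    """Run the steps in a fixed sequence, piping outputs forward."""
    summary = await summarize(text)
    return await translate(summary)

print(asyncio.run(chain("doc")))  # translated(summary(doc))
```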

Cursor Integration

Building agents is even easier with Cursor by adding WorkflowAI docs as a documentation source:

  1. In Cursor chat, type @docs.
  2. Select "+ Add new doc" (at the bottom of the list).
  3. Add https://docs.workflowai.com/ as a documentation source.
  4. Save the settings.

Now, Cursor will have access to the WorkflowAI docs.

Contributing

See the CONTRIBUTING.md file for more details. Thank you!

Acknowledgments

Thanks to ell for the inspiration! ✨
