LettreAI Documentation

    AI Task System Architecture

    This document provides an overview of the AI Task system architecture, explaining how the various components work together to process content through AI-powered workflows.

    System Components

    The AI Task system consists of several interconnected components:

                        ┌───────────────┐
                        │  ai-partitur  │
                        │  (CLI Entry)  │
                        └───────┬───────┘
                                │
                                ▼
    ┌───────────────┐    ┌───────────────┐    ┌───────────────┐
    │  Partitur     │◄───┤   Partitur    │───►│   Orchestra   │
    │  Files (.yml) │    │   Module      │    │   Class       │
    └───────────────┘    └───────┬───────┘    └───────┬───────┘
                                │                     │
                                ▼                     ▼
                        ┌───────────────┐    ┌───────────────┐
                        │  Instruction  │    │   Function    │
                        │  Templates    │    │   Registry    │
                        └───────────────┘    └───────────────┘

    Key Components

    1. CLI Entry Points:
      • ai-partitur: Command-line interface for running partitur files
      • ai-task: General command-line interface for the system
      • ai-reporter: Specialized interface for reporting features
    2. Core Processing Modules:
      • partitur.py: Core module for processing partitur files
      • orchestra.py: Class for orchestrating complex workflows
      • function registry: Collection of processing functions
    3. Configuration Files:
      • Partitur files (YAML): Define processing workflows
      • Instruction templates (Jinja2): Provide prompts for AI models
    4. Processing Engines:
      • Google Gemini API integration
      • Local function execution

    Module Relationships

    partitur_cli.py and partitur.py

    The partitur_cli.py module provides the command-line interface for the ai-partitur command. It:

    1. Parses command-line arguments
    2. Locates partitur files
    3. Calls the process_partitur function from partitur.py

    The partitur.py module contains the core processing functionality:

    1. find_partitur_file(): Resolves partitur file paths
    2. process_partitur(): Executes the defined workflow
    3. Support functions for specific operations (transcribe, convert, etc.)

    Orchestra Class

    The Orchestra class in orchestra.py provides a more advanced orchestration system:

    1. It can be used programmatically from Python code
    2. It supports reporting and detailed execution tracking
    3. It handles custom function loading and execution
    4. It provides a more flexible environment for complex workflows

    The Orchestra class can process both:

    • steps key (original format)
    • pipe key (newer format used by the partitur module)
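
    The dual-key support can be sketched as a small normalization helper. The function name and the precedence of pipe over steps when both are present are illustrative assumptions, not the class's actual API:

```python
def extract_steps(partitur):
    """Return the step list from a parsed partitur dict.

    Accepts both the original 'steps' key and the newer 'pipe' key
    used by the partitur module; 'pipe' wins if both are present
    (an assumption for this sketch).
    """
    if "pipe" in partitur:
        return partitur["pipe"]
    if "steps" in partitur:
        return partitur["steps"]
    raise ValueError("partitur defines neither 'steps' nor 'pipe'")
```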

    Execution Flow

    1. Command-Line Invocation:

      ai-partitur inqua_full 32101
    2. Partitur Resolution:

      • Search for inqua_full.yml in current directory, local partitur directory, or global directory
      • Load the partitur file content
    3. Workflow Execution:

      • Parse the partitur definition
      • Resolve template and file paths
      • Execute each step in sequence
      • Capture results and handle errors
    4. Output Generation:

      • Create output files as defined in the partitur
      • Generate reports if enabled
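
    The workflow-execution phase above (execute each step in sequence, capture results, handle errors) can be sketched as a simple loop. The step-dict layout and the stop-on-first-failure policy are assumptions for illustration, not the system's guaranteed behavior:

```python
def run_workflow(steps, functions, initial_data=None):
    """Execute partitur steps in sequence, capturing results and errors.

    Each step is a dict whose 'type' names a registered function and whose
    optional 'params' configure it; the output of one step feeds the next.
    """
    data = initial_data
    results = []
    for step in steps:
        func = functions[step["type"]]
        try:
            data = func(data, step.get("params", {}))
            results.append({"step": step["type"], "ok": True})
        except Exception as exc:
            results.append({"step": step["type"], "ok": False, "error": str(exc)})
            break  # stop the pipeline on the first failure
    return data, results
```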

    Key Differences Between partitur.py and Orchestra

    Feature            partitur.py                 Orchestra Class
    ──────────────────────────────────────────────────────────────
    Access             Command-line                Programmatic API
    Path Resolution    Relative to partitur file   Configurable
    Function Loading   Built-in functions          Dynamic loading
    Reporting          Basic console output        Structured reports
    Template Support   Basic                       Advanced with Jinja2
    Error Handling     Basic                       Advanced with recovery

    Using the Right Component

    • For simple workflows: Use the ai-partitur command-line interface
    • For complex applications: Use the Orchestra class programmatically
    • For custom processing: Extend the system with custom functions

    Extending the System

    Adding Custom Functions

    Create a Python module with functions that follow this signature:

    def my_custom_function(pipe_data, params):
        """Process pipe_data using this step's parameters from the partitur file."""
        # pipe_data: content produced by the previous step
        # params: parameters defined for this step
        updated_pipe_data = pipe_data  # replace with real processing
        return updated_pipe_data
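
    A concrete function following that signature might look like this. The function name and the min_length parameter are hypothetical, chosen only to illustrate the contract:

```python
def count_words(pipe_data, params):
    """Hypothetical custom function: annotate text with a word count.

    Expects pipe_data to be a string; 'min_length' in params (illustrative)
    drops short words before counting.
    """
    min_length = params.get("min_length", 1)
    words = [w for w in pipe_data.split() if len(w) >= min_length]
    return {"text": pipe_data, "word_count": len(words)}
```

    Because every function takes and returns pipe data, steps compose freely: the dict returned here becomes the input of the next step.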

    Creating Custom Partitur Files

    1. Create a YAML file with the workflow definition
    2. Define steps with appropriate types and parameters
    3. Place in the partitur directory of your project
    4. Execute with the ai-partitur command
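
    A minimal partitur file following those steps might look like this. The exact key names and step types are illustrative (the step types echo the transcribe/convert support functions mentioned above); consult an existing partitur file in your project for the authoritative layout:

```yaml
# my_workflow.yml -- illustrative structure, using the newer 'pipe' key
pipe:
  - type: transcribe
    params:
      language: de
  - type: convert
    params:
      format: markdown
```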

    Future Development

    The AI Task system architecture is designed to be extensible:

    1. Additional AI Models: Support for more AI models beyond Gemini
    2. Pipeline Optimization: Improved performance and parallelization
    3. Web Interface: Visual editor for partitur files
    4. Plug-in System: Standardized plugin architecture for extensions

    • License: MIT