LettreAI Documentation
Configuration

AI Task provides several configuration options for customizing its behavior. This page explains those options and how to use them.

Task Configuration Files

Task configuration files are YAML files with a .ai extension that define the pipeline for a specific task. These files contain the following sections:

Basic Configuration

name: "Task Name"
description: "Task Description"
template_dir: "path/to/templates"  # Optional, defaults to "templates"

  • name: The name of the task
  • description: A description of what the task does
  • template_dir: The directory where templates are stored (optional)

Pipeline Configuration

The pipe section defines the sequence of tasks that make up the pipeline:

pipe:
  - type: function
    function: aisource
    params:
      file: "input.txt"
  
  - type: llm
    tmpl: "process"
    model: "claude-3-7-sonnet-latest"
  
  - type: function
    function: airesult
    params:
      file: "output.txt"
      format: "text"

Each task in the pipeline has the following properties:

  • type: The type of task (function or llm)
  • For function tasks:
    • function: The name of the function to execute
    • params: Parameters to pass to the function
  • For LLM tasks:
    • tmpl: The name of the template to use
    • model: The LLM model to use
    • params: Additional parameters for the LLM (optional)
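The per-type requirements above can be checked mechanically. The following is a minimal sketch of validating an already-parsed pipe section; `validate_pipe` is an illustrative helper, not part of the AI Task API.

```python
def validate_pipe(pipe):
    """Return a list of error strings for an already-parsed pipe list."""
    # Keys each task type must declare; params is optional for both.
    required = {
        "function": {"function"},
        "llm": {"tmpl", "model"},
    }
    errors = []
    for i, task in enumerate(pipe):
        ttype = task.get("type")
        if ttype not in required:
            errors.append(f"task {i}: unknown type {ttype!r}")
            continue
        missing = required[ttype] - task.keys()
        if missing:
            errors.append(f"task {i}: missing {sorted(missing)}")
    return errors

# The example pipeline from above, as parsed YAML:
pipe = [
    {"type": "function", "function": "aisource", "params": {"file": "input.txt"}},
    {"type": "llm", "tmpl": "process", "model": "claude-3-7-sonnet-latest"},
    {"type": "function", "function": "airesult", "params": {"file": "output.txt"}},
]
print(validate_pipe(pipe))  # → []  (no errors)
```

A task missing its required keys, e.g. `{"type": "llm"}`, would produce `task 0: missing ['model', 'tmpl']`.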

Global Configuration

AI Task also supports global configuration options that apply to all tasks. These options can be set in a configuration file at ~/.ai-task/config.yaml or through environment variables.

API Keys

api_keys:
  anthropic: "your-anthropic-api-key"
  openai: "your-openai-api-key"

Alternatively, you can set environment variables:

export ANTHROPIC_API_KEY="your-anthropic-api-key"
export OPENAI_API_KEY="your-openai-api-key"
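A common resolution order is: key from the config file first, environment variable as the fallback. The sketch below assumes that order and an already-parsed config dict; `resolve_api_key` is an illustrative helper, not part of the AI Task API.

```python
import os

def resolve_api_key(config, provider, env_var):
    """Prefer a key from the api_keys section; fall back to the environment."""
    key = (config.get("api_keys") or {}).get(provider)
    return key or os.environ.get(env_var)

config = {"api_keys": {"anthropic": "key-from-file"}}
print(resolve_api_key(config, "anthropic", "ANTHROPIC_API_KEY"))  # → key-from-file
resolve_api_key({}, "openai", "OPENAI_API_KEY")  # falls back to the environment
```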

Default Settings

defaults:
  template_dir: "path/to/templates"
  model: "claude-3-7-sonnet-latest"
  monitoring: true
  report_dir: "path/to/reports"

  • template_dir: The default directory for templates
  • model: The default LLM model to use
  • monitoring: Whether to enable monitoring by default
  • report_dir: The default directory for reports
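Per-task settings take precedence over these defaults. Assuming both are parsed into dicts, the merge can be sketched as a plain dict merge where task values win:

```python
# Global defaults section, as parsed YAML:
defaults = {
    "template_dir": "templates",
    "model": "claude-3-7-sonnet-latest",
    "monitoring": True,
}

# Per-task config; the model name here is illustrative.
task_config = {"name": "Summarize", "model": "claude-3-5-haiku-latest"}

# Later keys win, so task settings override the defaults.
effective = {**defaults, **task_config}
print(effective["model"])         # → claude-3-5-haiku-latest
print(effective["template_dir"])  # → templates (filled from defaults)
```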

Command-Line Options

When running AI Task from the command line, you can specify several options:

python -m ai_task.run_tasks example.ai [options]

Available options:

  • --monitor: Enable detailed monitoring of the pipeline execution
  • --report: Generate a report of the pipeline execution
  • --report-file FILE: Specify the name of the report file
  • --preview: Preview the report in a web browser after generation
  • --input KEY=VALUE: Provide input values for the pipeline
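The option surface above can be mirrored with `argparse`; this sketch models the documented flags for illustration and is not the actual `ai_task` implementation.

```python
import argparse

parser = argparse.ArgumentParser(prog="ai_task.run_tasks")
parser.add_argument("task_file")
parser.add_argument("--monitor", action="store_true")
parser.add_argument("--report", action="store_true")
parser.add_argument("--report-file", metavar="FILE")
parser.add_argument("--preview", action="store_true")
# --input may repeat, so collect every KEY=VALUE pair into a list.
parser.add_argument("--input", action="append", metavar="KEY=VALUE", default=[])

args = parser.parse_args(
    ["example.ai", "--monitor", "--input", "topic=llms", "--input", "lang=en"]
)
# Split each pair on the first "=" so values may themselves contain "=".
inputs = dict(pair.split("=", 1) for pair in args.input)
print(inputs)  # → {'topic': 'llms', 'lang': 'en'}
```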

Next Steps

Now that you understand how to configure AI Task, you can:

  • Learn about templates
  • Discover built-in functions
  • Understand LLM integration

License: MIT