System Requirements

This document outlines the technical requirements, dependencies, and system specifications for the Atlas framework.

Overview

Atlas is a Python-based framework for building advanced, knowledge-augmented multi-agent systems. This document specifies the requirements for developing, deploying, and using Atlas.

Software Requirements

Python Environment

Atlas requires a Python environment with the following specifications:

| Requirement | Specification | Notes |
| --- | --- | --- |
| Python Version | >=3.13 | Earlier versions are not supported due to type hint features |
| Package Manager | uv (recommended) | pip can be used, but uv provides better dependency management |
| Virtual Environment | Required | Isolated environment to avoid dependency conflicts |
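
As a quick illustration (not part of Atlas itself), the interpreter version can be checked before creating the virtual environment:

```python
# Illustrative pre-flight check: confirm the interpreter meets Atlas's minimum version.
import sys

if sys.version_info < (3, 13):
    raise SystemExit(f"Atlas requires Python >= 3.13; found {sys.version.split()[0]}")
print("Python version OK:", sys.version.split()[0])
```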

Core Dependencies

The following core dependencies are required for Atlas:

| Dependency | Version | Purpose |
| --- | --- | --- |
| langgraph | >=0.4.1 | Multi-agent workflow orchestration |
| chromadb | >=1.0.7 | Vector database for knowledge storage |
| anthropic | >=0.50.0 | Anthropic Claude API client |
| pydantic | >=2.11.4 | Data validation and settings management |
| pathspec | >=0.12.1 | File pattern matching for document ingestion |
| requests | >=2.31.0 | HTTP client for API communication |
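
To confirm these packages are installed, a small sketch using the standard library's importlib.metadata is shown below; the minimum versions simply mirror the table above:

```python
# Illustrative sketch: report installed versions of Atlas's core dependencies.
from importlib.metadata import PackageNotFoundError, version

CORE_DEPENDENCIES = {
    "langgraph": "0.4.1",
    "chromadb": "1.0.7",
    "anthropic": "0.50.0",
    "pydantic": "2.11.4",
    "pathspec": "0.12.1",
    "requests": "2.31.0",
}

for package, minimum in CORE_DEPENDENCIES.items():
    try:
        print(f"{package}: installed {version(package)} (minimum {minimum})")
    except PackageNotFoundError:
        print(f"{package}: NOT INSTALLED (minimum {minimum})")
```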

Optional Dependencies

These dependencies are optional but recommended for specific functionality:

| Dependency | Version | Purpose |
| --- | --- | --- |
| openai | >=1.20.0 | OpenAI API client for GPT models |
| types-requests | >=2.31.0 | Type stubs for the requests package |
| mypy | >=1.8.0 | Static type checking |
| ruff | >=0.3.0 | Linting and code formatting |
| black | >=24.3.0 | Code formatting |
| pytest | >=8.0.0 | Unit and integration testing |

Hardware Requirements

Development Environment

Minimum specifications for a development environment:

| Component | Minimum | Recommended |
| --- | --- | --- |
| CPU | 2 cores | 4+ cores |
| RAM | 4 GB | 8+ GB |
| Disk Space | 2 GB | 5+ GB |
| Operating System | Linux, macOS, Windows | Linux, macOS |

Production Environment

Recommended specifications for a production environment:

| Component | Minimum | Recommended |
| --- | --- | --- |
| CPU | 4 cores | 8+ cores |
| RAM | 8 GB | 16+ GB |
| Disk Space | 10 GB + knowledge base | 20+ GB + knowledge base |
| Network | Reliable internet connection | High-speed, low-latency connection |

Note: The disk space required depends on the size of the knowledge base. Vector databases can grow significantly for large document collections.

API Requirements

Anthropic API

To use the Anthropic provider:

| Requirement | Details |
| --- | --- |
| API Key | Valid Anthropic API key with active subscription |
| Rate Limits | Sufficient rate limits for your usage patterns |
| Token Budget | Adequate token budget for your usage volume |
| Models | Access to Claude models (claude-3-7-sonnet-20250219 recommended) |
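
As a stand-alone sketch (not Atlas's own provider code), the anthropic SDK listed in the core dependencies can be used to confirm that your key and model access work; the client reads ANTHROPIC_API_KEY from the environment by default:

```python
# Minimal connectivity check using the anthropic SDK; assumes ANTHROPIC_API_KEY is set.
import anthropic

client = anthropic.Anthropic()  # reads ANTHROPIC_API_KEY from the environment
response = client.messages.create(
    model="claude-3-7-sonnet-20250219",
    max_tokens=32,
    messages=[{"role": "user", "content": "Reply with the single word: ok"}],
)
print(response.content[0].text)
```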

OpenAI API (Optional)

To use the OpenAI provider:

| Requirement | Details |
| --- | --- |
| API Key | Valid OpenAI API key with active subscription |
| Rate Limits | Sufficient rate limits for your usage patterns |
| Token Budget | Adequate token budget for your usage volume |
| Models | Access to GPT models (gpt-4o recommended) |

Ollama (Optional)

To use the Ollama provider:

| Requirement | Details |
| --- | --- |
| Ollama Installation | Ollama installed and running |
| Models | Required models pulled and available |
| Hardware | Sufficient GPU/CPU for local model inference |
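
A minimal sketch for confirming that a local Ollama server is reachable and which models have been pulled, assuming Ollama's default endpoint on localhost:11434 (see the Network Requirements table below):

```python
# Check that a local Ollama server is running and list pulled models.
import requests

try:
    tags = requests.get("http://localhost:11434/api/tags", timeout=5)
    tags.raise_for_status()
    models = [m["name"] for m in tags.json().get("models", [])]
    print("Ollama is running; available models:", models or "none pulled yet")
except requests.RequestException as exc:
    print("Ollama does not appear to be running:", exc)
```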

Network Requirements

Atlas requires network connectivity for API access:

| Service | Access | Port | Notes |
| --- | --- | --- | --- |
| Anthropic API | api.anthropic.com | 443 (HTTPS) | Required for Anthropic provider |
| OpenAI API | api.openai.com | 443 (HTTPS) | Required for OpenAI provider |
| Ollama API | localhost | 11434 (HTTP) | Required for Ollama provider |

Storage Requirements

ChromaDB Storage

ChromaDB requires storage for the vector database:

| Requirement | Specification | Notes |
| --- | --- | --- |
| Path | User-configurable | Defaults to ~/atlas_chroma_db |
| Permissions | Read/write access | The application needs permission to create and modify files |
| Space | Depends on document volume | Approximately 1-5 KB per document chunk plus embeddings |
| Backup | Recommended | Regular backups of the database directory |
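
As an illustration of how the storage path is typically used (a sketch, not Atlas's internal code), ChromaDB's PersistentClient can be pointed at the configured directory; ATLAS_DB_PATH and ATLAS_COLLECTION_NAME are described under Environment Variables below:

```python
# Open (or create) a ChromaDB store at the configured path.
import os

import chromadb

db_path = os.path.expanduser(os.getenv("ATLAS_DB_PATH", "~/atlas_chroma_db"))
client = chromadb.PersistentClient(path=db_path)
collection = client.get_or_create_collection(
    os.getenv("ATLAS_COLLECTION_NAME", "atlas_knowledge_base")
)
print(f"Collection '{collection.name}' at {db_path}: {collection.count()} chunks")
```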

Document Storage

For document ingestion:

| Requirement | Specification | Notes |
| --- | --- | --- |
| File Types | Text, Markdown, etc. | Support varies by file type |
| Access | Read access to source files | Application needs permission to read files |
| Processing | Temporary space for processing | Used during chunking and embedding generation |
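
The sketch below shows one way the pathspec dependency can be used to select ingestible files while skipping ignored paths; the directory name and patterns are purely illustrative:

```python
# Illustrative use of pathspec to pick files for ingestion while skipping
# common non-document paths. Patterns and the docs/ directory are examples only.
from pathlib import Path

import pathspec

ignore = pathspec.PathSpec.from_lines(
    "gitwildmatch", [".git/", ".venv/", "*.pyc", "node_modules/"]
)

docs_root = Path("docs")  # hypothetical source directory
candidates = [
    p for p in docs_root.rglob("*")
    if p.is_file() and p.suffix in {".md", ".txt"} and not ignore.match_file(str(p))
]
print(f"{len(candidates)} files selected for ingestion")
```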

Memory Requirements

Memory usage depends on several factors:

| Component | Memory Usage | Scaling Factors |
| --- | --- | --- |
| Base Operation | ~200-500 MB | - |
| ChromaDB | ~50-100 MB + collection size | Scales with number of documents |
| Document Processing | ~100-500 MB | Scales with document size |
| Multi-Agent Operations | ~100 MB per active agent | Scales with number of agents |
| Streaming | Minimal overhead | - |

Environment Variables

Atlas uses environment variables for configuration:

Required Variables

| Variable | Purpose | Default |
| --- | --- | --- |
| ANTHROPIC_API_KEY | Anthropic API authentication | None |

Optional Variables

| Variable | Purpose | Default |
| --- | --- | --- |
| OPENAI_API_KEY | OpenAI API authentication | None |
| OPENAI_ORGANIZATION | OpenAI organization ID | None |
| ATLAS_DB_PATH | ChromaDB storage location | ~/atlas_chroma_db |
| ATLAS_COLLECTION_NAME | ChromaDB collection name | atlas_knowledge_base |
| ATLAS_DEFAULT_MODEL | Default model to use | claude-3-7-sonnet-20250219 |
| ATLAS_DEFAULT_PROVIDER | Default provider to use | anthropic |
| ATLAS_MAX_TOKENS | Maximum tokens for responses | 2000 |
| ATLAS_LOG_LEVEL | Logging verbosity | INFO |
| ATLAS_ENABLE_TELEMETRY | Enable telemetry | true |
| SKIP_API_KEY_CHECK | Skip API key validation | false |
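
The following sketch shows how these variables and defaults might be read at startup; it illustrates the documented defaults and is not Atlas's actual configuration loader:

```python
# Read Atlas configuration from the environment, using the documented defaults.
import os

config = {
    "db_path": os.path.expanduser(os.getenv("ATLAS_DB_PATH", "~/atlas_chroma_db")),
    "collection": os.getenv("ATLAS_COLLECTION_NAME", "atlas_knowledge_base"),
    "model": os.getenv("ATLAS_DEFAULT_MODEL", "claude-3-7-sonnet-20250219"),
    "provider": os.getenv("ATLAS_DEFAULT_PROVIDER", "anthropic"),
    "max_tokens": int(os.getenv("ATLAS_MAX_TOKENS", "2000")),
    "log_level": os.getenv("ATLAS_LOG_LEVEL", "INFO"),
}

if not os.getenv("ANTHROPIC_API_KEY") and os.getenv("SKIP_API_KEY_CHECK", "false") != "true":
    raise SystemExit("ANTHROPIC_API_KEY is required unless SKIP_API_KEY_CHECK=true")
print(config)
```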

Installation Process

Basic Installation

```bash
# Create and activate a virtual environment
uv venv
source .venv/bin/activate

# Install Atlas
uv pip install -e .

# Install development tools
uv pip install ruff black mypy pytest
```

Development Setup

```bash
# Clone the repository
git clone <repository-url>
cd atlas

# Create and activate virtual environment
uv venv
source .venv/bin/activate

# Install in development mode
uv pip install -e .

# Install development dependencies
uv pip install ruff black mypy pytest
```

Deployment Considerations

Environment Configuration

  • Use environment variables or a .env file for configuration
  • Set appropriate logging levels (see the sketch after this list)
  • Configure appropriate token limits
  • Set up proper error handling
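
For example, log verbosity and token limits can be derived from the environment variables documented earlier; this is a minimal sketch, not Atlas's own setup code:

```python
# Sketch: configure logging and token limits from the documented environment variables.
import logging
import os

logging.basicConfig(level=os.getenv("ATLAS_LOG_LEVEL", "INFO"))
max_tokens = int(os.getenv("ATLAS_MAX_TOKENS", "2000"))
logging.getLogger(__name__).info("Configured with max_tokens=%d", max_tokens)
```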

Performance Optimization

  • Tune ChromaDB parameters for retrieval performance
  • Configure appropriate token limits to balance quality and cost
  • Consider parallel processing for multi-agent workflows
  • Monitor and adjust worker count based on load

Security Considerations

  • Secure API keys using environment variables or secure storage
  • Implement proper access controls for the knowledge base
  • Consider data privacy when ingesting and storing documents
  • Validate and sanitize user inputs

Monitoring and Maintenance

  • Implement logging for system activity
  • Monitor API usage and costs
  • Regularly back up the vector database
  • Plan for dependency updates

Compatibility Notes

LangGraph Compatibility

Atlas requires LangGraph 0.4.1 or later, which includes several API changes from older versions:

  • MemorySaver import path changed to from langgraph.saver import MemorySaver
  • CheckpointAt is now available in the checkpoint module
  • Graph compilation API has been enhanced with additional options
  • State management has improved type safety

Anthropic API Compatibility

Atlas works with version 0.50.0 or later of the official Anthropic Python SDK:

  • Uses the Messages API format
  • Supports structured content formats
  • Handles streaming through the official SDK
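
A hedged illustration of streaming through the official SDK (independent of Atlas's own code), assuming ANTHROPIC_API_KEY is set:

```python
# Stream a Messages API response via the official anthropic SDK.
import anthropic

client = anthropic.Anthropic()
with client.messages.stream(
    model="claude-3-7-sonnet-20250219",
    max_tokens=256,
    messages=[{"role": "user", "content": "Name three uses of a vector database."}],
) as stream:
    for text in stream.text_stream:
        print(text, end="", flush=True)
print()
```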

ChromaDB Compatibility

Atlas requires ChromaDB 1.0.7 or later:

  • Uses the PersistentClient for storage
  • Implements the latest embedding and retrieval APIs
  • Handles collections using the latest API format

Related Documentation

  • Design Principles - Overview of Atlas design philosophy
  • Environment Variables Reference - Detailed list of configuration options (deprecated)
  • CLI Reference - Command-line interface documentation (deprecated)
  • Testing Guide - Testing approach and instructions (deprecated)

Released under the MIT License.