
Configuration

Configure settings (LLM, FRED, storage, cache, BSM inputs) for the CLI via environment variables / a .env file, or for library use by passing objects into get_container(). Configuration does not change domain math; it selects providers, keys, and paths only.

Configuration Methods

  1. CLI: Environment variables or a .env file.
  2. Library: Pass LLMConfig and optional fred_api_key into get_container(). See Library Integration below.

Environment Variables (CLI)

Create a .env file in your project root (same directory as pyproject.toml):

# .env
COPINANCEOS_GEMINI_API_KEY=your-api-key-here
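As a rough illustration of the naming convention (this is not the project's actual settings loader), COPINANCEOS_-prefixed variables map to setting names by stripping the prefix and lower-casing:

```python
def load_prefixed_settings(environ, prefix="COPINANCEOS_"):
    """Collect settings from environment variables carrying a given prefix."""
    return {
        key[len(prefix):].lower(): value
        for key, value in environ.items()
        if key.startswith(prefix)
    }

# Unprefixed variables (like PATH) are ignored.
env = {"COPINANCEOS_GEMINI_API_KEY": "your-api-key-here", "PATH": "/usr/bin"}
settings = load_prefixed_settings(env)
# settings == {"gemini_api_key": "your-api-key-here"}
```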

LLM Provider Setup

Implemented backends: gemini, openai (Chat Completions), and ollama. The factory rejects unknown provider names (see Using as a Library — LLMConfig).

Gemini (Cloud)

  1. Get an API key from Google AI Studio.
  2. Set in .env: COPINANCEOS_GEMINI_API_KEY=your-key
  3. Verify: copinance analyze equity AAPL --question "What is the current price?"

Model selection (optional):

COPINANCEOS_GEMINI_MODEL=gemini-1.5-pro   # default
COPINANCEOS_GEMINI_MODEL=gemini-2.5-flash # faster

Text streaming (CLI): For question-driven runs, pass --stream on the analyze group (before the subcommand) or on generic research (copinance --stream "…"). --json disables streaming. Output prints to stdout as tokens arrive; run metadata and saved JSON still appear afterwards. See Using as a Library — LLM text streaming for LLMConfig and programmatic use.

OpenAI (Cloud)

  1. Get an API key from OpenAI (or use a base URL for an OpenAI-compatible HTTP API).
  2. Set in .env:
COPINANCEOS_LLM_PROVIDER=openai
COPINANCEOS_OPENAI_API_KEY=sk-...
COPINANCEOS_OPENAI_MODEL=gpt-4o-mini
# Optional — custom / enterprise endpoint:
# COPINANCEOS_OPENAI_BASE_URL=https://api.openai.com/v1
  3. Verify: copinance analyze equity AAPL --question "What is the current price?"

Ollama (Local)

  1. Install Ollama from ollama.ai and pull a model locally with ollama pull (e.g. llama3.2 or llama3.1).
  2. Set in .env:
COPINANCEOS_LLM_PROVIDER=ollama
COPINANCEOS_OLLAMA_BASE_URL=http://localhost:11434
COPINANCEOS_OLLAMA_MODEL=llama3.2
  3. Verify: copinance analyze equity AAPL --question "What is the current price?"

FRED API (Macro Data)

Optional. A FRED key improves macro analysis; without one, a yfinance fallback is used.

  1. Get a free key at FRED API.
  2. Set: COPINANCEOS_FRED_API_KEY=your-fred-api-key
  3. Verify: copinance analyze macro (results should show "source": "fred" where applicable).

SEC EDGAR (edgartools)

Question-driven analysis routes SEC filing metadata and filing body tools to EdgarToolsFundamentalProvider (copinance_os.data.providers.sec.edgartools), built on edgartools (import name edgar). The SEC requires a User-Agent identity (name and email) for programmatic access.

Configure identity (pick one):

  • Environment: EDGAR_IDENTITY — e.g. Your Name you@example.com (the format used in the edgartools docs).
  • Copinance-prefixed: COPINANCEOS_EDGAR_IDENTITY — same format.
  • Default: If unset, settings use a built-in project identity so local runs work; override in production with your own contact string.

Responses are cached under the same cache as other tools (see below), with per-operation TTLs (e.g. filing lists vs. filing text) to limit repeat requests to SEC servers.
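For intuition, per-operation TTLs work like a key-value cache whose entries expire independently. This is a minimal standalone sketch, not the project's cache implementation; the keys and TTL values below are illustrative:

```python
import time

class TTLCache:
    """Minimal time-to-live cache: each entry carries its own expiry."""

    def __init__(self):
        self._store = {}  # key -> (expires_at, value)

    def set(self, key, value, ttl):
        """Store a value that expires `ttl` seconds from now."""
        self._store[key] = (time.monotonic() + ttl, value)

    def get(self, key):
        """Return the cached value, or None if missing or expired."""
        entry = self._store.get(key)
        if entry is None or entry[0] < time.monotonic():
            return None
        return entry[1]

cache = TTLCache()
# Filing lists change often; filing bodies are immutable once published,
# so they can be cached longer (TTL values here are made up):
cache.set("filing_list:AAPL", ["10-K", "10-Q"], ttl=3600)
cache.set("filing_text:0000320193", "…body…", ttl=86400)
print(cache.get("filing_list:AAPL"))  # → ['10-K', '10-Q']
```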

Option Greek estimation (BSM)

Optional. Affects analytic delta/gamma/theta/vega/rho attached to options chains when QuantLib is installed.

# Annual risk-free rate (decimal, e.g. 0.045). Omit to use the built-in default.
COPINANCEOS_OPTION_GREEKS_RISK_FREE_RATE=0.045
 
# Default dividend yield when chain metadata has no `dividend_yield`. Omit for 0.
COPINANCEOS_OPTION_GREEKS_DIVIDEND_YIELD_DEFAULT=0

Reserved OptionsChain.metadata keys and profile preferences are documented in Options chain metadata.
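To see how these two settings enter the math: in the standard Black–Scholes–Merton formulas, the risk-free rate r and dividend yield q appear in the d1 term and the discount factors. The snippet below is a standalone textbook computation of a call delta, for intuition only; it is not the library's QuantLib-backed implementation:

```python
from math import erf, exp, log, sqrt

def norm_cdf(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

def bsm_call_delta(spot, strike, rate, div_yield, vol, t_years):
    """Analytic BSM call delta: exp(-q*T) * N(d1)."""
    d1 = (log(spot / strike) + (rate - div_yield + 0.5 * vol**2) * t_years) \
         / (vol * sqrt(t_years))
    return exp(-div_yield * t_years) * norm_cdf(d1)

# At-the-money call, 1-year expiry, 20% vol, using the rates configured above:
delta = bsm_call_delta(100.0, 100.0, rate=0.045, div_yield=0.0,
                       vol=0.20, t_years=1.0)
print(round(delta, 4))  # ≈ 0.63
```

Changing COPINANCEOS_OPTION_GREEKS_RISK_FREE_RATE shifts d1 and hence every analytic greek attached to the chain.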

Storage and Cache

# Storage (optional)
COPINANCEOS_STORAGE_TYPE=file
COPINANCEOS_STORAGE_PATH=~/.copinance
 
# Cache (optional, default: true)
COPINANCEOS_CACHE_ENABLED=true

  • Library: Use get_container(..., cache_enabled=False) or cache_manager=.... See Using as a Library.

SEC / edgartools: Filing metadata and filing content from EDGAR are stored in the same file cache as tool results (versioned under your persistence/cache paths). Disabling cache or using a custom cache_manager applies to EDGAR-backed calls as well.
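For library use, cache behavior is set at container construction. A minimal sketch using the parameter names mentioned above; the import path is assumed and may differ in your installed version:

```python
from copinance_os import get_container  # import path assumed

# Disable caching entirely; per the docs, this also applies to
# EDGAR-backed calls:
container = get_container(cache_enabled=False)

# Alternatively, supply your own cache implementation:
# container = get_container(cache_manager=my_cache_manager)
```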

Complete .env Examples

Gemini:

COPINANCEOS_LLM_PROVIDER=gemini
COPINANCEOS_GEMINI_API_KEY=AIzaSy...your-key
COPINANCEOS_GEMINI_MODEL=gemini-1.5-pro
COPINANCEOS_FRED_API_KEY=your-fred-key
# Optional — SEC/EDGAR (edgartools); override default identity for production
# EDGAR_IDENTITY="Your Name you@company.com"
COPINANCEOS_STORAGE_TYPE=file
COPINANCEOS_STORAGE_PATH=~/.copinance

OpenAI:

COPINANCEOS_LLM_PROVIDER=openai
COPINANCEOS_OPENAI_API_KEY=sk-...
COPINANCEOS_OPENAI_MODEL=gpt-4o-mini
COPINANCEOS_STORAGE_TYPE=file
COPINANCEOS_STORAGE_PATH=~/.copinance

Ollama:

COPINANCEOS_LLM_PROVIDER=ollama
COPINANCEOS_OLLAMA_BASE_URL=http://localhost:11434
COPINANCEOS_OLLAMA_MODEL=llama3.2
COPINANCEOS_STORAGE_TYPE=file
COPINANCEOS_STORAGE_PATH=~/.copinance

Security

Never commit .env. Add to .gitignore:

.env
.env.local

Troubleshooting

  • “LLM analyzer not configured”: Check the .env location and variable names; restart your terminal.
  • “Gemini not available”: pip install google-genai
  • “Gemini API key is not configured”: Do not put quotes around the key; use the exact name COPINANCEOS_GEMINI_API_KEY.
  • OpenAI errors / “openai package is not installed”: Ensure project deps are installed (pip install -e .); the openai library is declared in pyproject.toml. Check COPINANCEOS_OPENAI_API_KEY when using env-based CLI config.

Library Integration

When using Copinance OS as a library, pass config into get_container(); env vars are for CLI only.

  • LLMConfig: Required for question-driven analysis. Example: get_container(llm_config=LLMConfig(provider="gemini", api_key="...", model="gemini-1.5-pro")).
  • FRED: Optional. get_container(..., fred_api_key="your-key").
  • Storage: To avoid creating a .copinance directory on disk, use get_container(..., storage_type="memory") or set COPINANCEOS_STORAGE_TYPE=memory in env. See Storage and Persistence.
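Putting the bullets above together, a minimal wiring sketch using the names the docs mention (get_container, LLMConfig); the exact import paths are assumptions to check against your installed package:

```python
# Import paths assumed -- verify against your installed version.
from copinance_os import get_container
from copinance_os.llm import LLMConfig

container = get_container(
    llm_config=LLMConfig(
        provider="gemini",         # or "openai" / "ollama"
        api_key="your-api-key",
        model="gemini-1.5-pro",
    ),
    fred_api_key="your-fred-key",  # optional: macro data via FRED
    storage_type="memory",         # optional: avoid creating ~/.copinance on disk
)
```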

Full container options and examples: Using as a Library.