Examples
The repository includes runnable examples under /examples.
Each example is its own Go module, so it shows exactly which packages a downstream
application imports after the multi-module refactor.
Run examples from their own directory:
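For instance, using the chat example (substitute any example directory listed below):

```shell
# Change into the example's directory and run its main package
cd examples/llm/openai-chat
go run .
```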
If you are working inside a local checkout with go.work enabled, either add
the example module to the workspace or disable workspace mode for the command:
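On a Unix shell, workspace mode can be disabled for a single command with the standard GOWORK environment variable:

```shell
# GOWORK=off makes the go command ignore any go.work file for this invocation
GOWORK=off go run .
```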
PowerShell:
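The PowerShell equivalent sets the variable for the current session before running the example:

```powershell
# Disable workspace mode for this session, then run the example
$env:GOWORK = "off"
go run .
```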
The example go.mod files use local replace directives so they build against
the checkout. In your own application, remove those replace directives and
install the modules you import with go get.
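For illustration, a replace directive in an example's go.mod might look like the sketch below. The module paths here are placeholders, not the repository's real import paths:

```
// go.mod of an example module (hypothetical module paths)
module example.com/examples/llm/openai-chat

go 1.22

require example.com/ai/llm/openai v0.0.0

// Build against the local checkout instead of a published version.
// Remove this line in your own application and run `go get` instead.
replace example.com/ai/llm/openai => ../../../llm/openai
```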
Package Examples
- llm/openai-chat — basic chat completion with llm/openai
- llm/anthropic-stream — streaming chat with llm/anthropic
- llm/builtin-tools — server-side built-in tools across anthropic, gemini, openai-responses, groq-compound
- embeddings/voyage — text embeddings with embeddings/voyage
- image/gemini — image generation with image/gemini
- tts/elevenlabs — text-to-speech with tts/elevenlabs
- stt/deepgram — speech-to-text with stt/deepgram
- rerankers/cohere — document reranking with rerankers/cohere
- fim/mistral — fill-in-the-middle code completion with fim/mistral
- batch/concurrent — concurrent batch processing around an LLM client
- agent/basic — a minimal agent using an LLM client
- tokens/truncate — local context truncation without an API key
Provider Switching
Provider-switch examples show the main point of the modality interfaces: the business logic stays typed against the shared interface, while only construction changes per vendor.
- llm/provider-switch — openai, anthropic, gemini
- embeddings/provider-switch — openai, voyage, cohere
- image/provider-switch — openai, gemini
- tts/provider-switch — openai, elevenlabs
- stt/provider-switch — openai, deepgram
- rerankers/provider-switch — cohere, voyage
- fim/provider-switch — mistral, deepseek
- batch/provider-switch — openai, anthropic, gemini
- agent/provider-switch — openai, anthropic, gemini
Set AI_PROVIDER to choose the implementation:
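On a Unix shell, for example, with a provider name from the list above:

```shell
# Select the implementation for this run only
AI_PROVIDER=anthropic go run .
```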
PowerShell:
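The PowerShell equivalent sets the variable for the current session before running the example:

```powershell
# Select the implementation, then run the example
$env:AI_PROVIDER = "anthropic"
go run .
```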
Observability And Pricing
- tracing/otel — configures OpenTelemetry with tracing.New, attaches a stdout span exporter, and runs a traced LLM request.
- model/pricing — estimates chat, embedding, image, and TTS costs from the public model registry fields.
The tracing example prints spans locally by default. For collector-based export,
configure tracing.New with OTLP settings such as OTEL_EXPORTER_OTLP_ENDPOINT.
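For example, assuming a local OTLP collector listening on the default OTLP/gRPC port:

```shell
# Standard OpenTelemetry environment variable; 4317 is the default OTLP/gRPC port
export OTEL_EXPORTER_OTLP_ENDPOINT=http://localhost:4317
go run .
```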
Environment Variables
Set the provider key used by the example you run:
- OPENAI_API_KEY for OpenAI LLM, embedding, image, TTS, STT, batch, agent, and tracing examples
- ANTHROPIC_API_KEY for Anthropic LLM, batch, and agent examples
- VOYAGE_API_KEY for Voyage embedding and reranker examples
- GEMINI_API_KEY for Gemini LLM, image, batch, and agent examples
- GROQ_API_KEY for the groq-compound provider in llm/builtin-tools
- XAI_API_KEY for the xai-responses provider in llm/builtin-tools
- ELEVENLABS_API_KEY for ElevenLabs TTS examples
- DEEPGRAM_API_KEY for Deepgram STT examples
- COHERE_API_KEY for Cohere embedding and reranker examples
- MISTRAL_API_KEY for Mistral FIM examples
- DEEPSEEK_API_KEY for DeepSeek FIM examples
model/pricing and tokens/truncate run locally and do not require credentials.
Generated Files
Audio and image examples may write generated artifacts next to the example program:
- image/gemini writes gemini-image.png
- image/provider-switch writes <provider>-image.png
- tts/elevenlabs writes elevenlabs-speech.mp3
- tts/provider-switch writes <provider>-speech.mp3
- stt/deepgram expects a local audio file path as its only argument
- stt/provider-switch expects a local audio file path as its only argument
Do not commit generated audio or image outputs.