Overview
This guide shows you how to build a simple AI-powered chatbot using LaunchDarkly AI Configs with multiple AI providers, including Anthropic, OpenAI, and Google. Using AI Configs, you can manage models and prompts outside of code, switch providers without redeploying, and monitor performance in real time. You’ll learn how to:
- Create a basic chatbot application
- Configure AI models dynamically without code changes
- Create and manage multiple AI Config variations
- Apply user contexts for personalizing AI behavior
- Switch between different AI providers seamlessly
- Monitor and track AI performance metrics
Prerequisites
Before you begin, you need the following:
Required accounts
Access to the following accounts:
- A LaunchDarkly account: Sign up at app.launchdarkly.com
- At least one AI provider account:
- Anthropic: console.anthropic.com
- OpenAI: platform.openai.com
- Google AI: ai.google.dev
Development environment requirements
A development environment with:
- Python 3.8 or later
- pip package manager
- Basic Python knowledge
- A code editor, such as VS Code or PyCharm
API and SDK keys
The following keys:
- Your LaunchDarkly SDK key
- An API key from at least one AI provider
Before you start
This guide builds a chatbot using completion-based AI Configs in a messages array format. If you use LangGraph or CrewAI, you may want to use agent mode instead. The following sections include best practices to help you avoid common issues and reduce debugging time.
Do not cache configs across users
Reusing configs across users breaks targeting. Instead, fetch a fresh config for each request:
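A minimal sketch of the anti-pattern versus the per-request fetch. `fetch_ai_config` is a hypothetical stand-in for the LaunchDarkly AI SDK evaluation call; here it varies the variation by an assumed `plan` attribute to make the targeting effect visible:

```python
# Hypothetical stand-in for the SDK's AI Config evaluation; in a real app
# this is where targeting rules are applied to the context you pass in.
def fetch_ai_config(context):
    if context.get("plan") == "premium":
        return {"enabled": True, "model": {"name": "gpt-4o"}}
    return {"enabled": True, "model": {"name": "gpt-4o-mini"}}

# BAD: evaluated once at startup, so every user gets the startup context's variation.
CACHED_CONFIG = fetch_ai_config({"key": "startup-user", "plan": "free"})

def handle_request_bad(user_context, message):
    return CACHED_CONFIG  # ignores the current user's targeting

# GOOD: evaluate the config against each request's own context.
def handle_request_good(user_context, message):
    return fetch_ai_config(user_context)
```

With the cached version, a premium user still receives the variation that matched the startup context; with the per-request version, targeting applies to the user actually making the request.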
Provide a fallback config
Provide a fallback so your application does not crash when unexpected issues occur, such as LaunchDarkly being unavailable or API keys being incorrect:
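One way to sketch this pattern, representing configs as plain dicts (the real SDK accepts a default config object when you evaluate; the model name and system prompt below are illustrative, not prescriptive):

```python
# Fallback used whenever evaluation fails, for example because LaunchDarkly
# is unreachable. Keep it conservative: a model you know works, a safe prompt.
FALLBACK_CONFIG = {
    "enabled": True,
    "model": {"name": "claude-3-haiku-20240307"},
    "messages": [{"role": "system", "content": "You are a helpful assistant."}],
}

def get_config(fetch):
    """Call `fetch` (the real config lookup) but never let a failure crash the app."""
    try:
        return fetch()
    except Exception:
        return FALLBACK_CONFIG
```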
Check if the config is enabled
Before using a config, check whether it is enabled, and fall back to a safe default response if it is not:
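A short sketch of the check. `call_model` is a hypothetical provider helper passed in for illustration; the disabled-path message is an assumption you would tailor to your product:

```python
def respond(config, user_message, call_model):
    # A disabled config (for example, a kill switch flipped in LaunchDarkly)
    # should short-circuit before any provider call is made.
    if not config.get("enabled", False):
        return "The assistant is temporarily unavailable."
    return call_model(config["model"]["name"], user_message)
```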
Do not include personally identifiable information (PII) in contexts
Never send PII, such as names or email addresses, to LaunchDarkly. Here’s a bad and a good example:
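A sketch of both patterns using plain dicts to stand in for SDK context objects (the real SDK builds contexts with its own builder API; the names and attributes below are illustrative):

```python
import hashlib

# BAD: email address and full name are PII and would be stored as
# targeting data in LaunchDarkly.
bad_context = {
    "kind": "user",
    "key": "jane.doe@example.com",
    "name": "Jane Doe",
}

# GOOD: a stable, opaque key (here, a hash of an internal ID) plus
# non-identifying attributes you can still target on.
def safe_context(internal_user_id, plan):
    return {
        "kind": "user",
        "key": hashlib.sha256(internal_user_id.encode()).hexdigest(),
        "plan": plan,
    }
```

Hashing an internal ID keeps the key stable across requests, so targeting and experimentation still work, without exposing who the user is.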
Limit conversation history
Your chat history grows with every turn. After 50 exchanges, each request may include thousands of tokens. Here’s how to limit it:
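A minimal sketch of capping history, assuming messages are role/content dicts and that one turn is a user message plus an assistant message (the cap of 10 turns is an arbitrary example):

```python
MAX_TURNS = 10  # keep at most the last 10 user/assistant exchanges

def trim_history(messages, max_turns=MAX_TURNS):
    """Keep the system prompt plus only the most recent turns."""
    system = [m for m in messages if m["role"] == "system"]
    chat = [m for m in messages if m["role"] != "system"]
    # Each turn contributes two messages: one user, one assistant.
    return system + chat[-max_turns * 2:]
```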
Track token usage
Without tracking, it is difficult to understand how token usage affects cost. Your provider methods should return the full response object, not just the generated text, so you can access usage metadata; the code examples in this guide return full responses where tracking is needed. Here’s how to track token usage:
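A sketch that accumulates usage from dict-shaped responses. The field names reflect what the OpenAI Chat Completions API (`prompt_tokens`/`completion_tokens`) and the Anthropic Messages API (`input_tokens`/`output_tokens`) report; forwarding the numbers to LaunchDarkly through the tracker you receive when evaluating an AI Config is left out here:

```python
# Running totals across requests (per-process; illustrative only).
totals = {"input_tokens": 0, "output_tokens": 0}

def extract_usage(response):
    """Normalize usage from a dict-shaped provider response.

    Handles both OpenAI-style (prompt_tokens/completion_tokens) and
    Anthropic-style (input_tokens/output_tokens) field names.
    """
    usage = response.get("usage", {})
    return {
        "input_tokens": usage.get("input_tokens", usage.get("prompt_tokens", 0)),
        "output_tokens": usage.get("output_tokens", usage.get("completion_tokens", 0)),
    }

def track_usage(response):
    u = extract_usage(response)
    totals["input_tokens"] += u["input_tokens"]
    totals["output_tokens"] += u["output_tokens"]
    # In a real app, also report these through the LaunchDarkly tracker so
    # usage shows up in AI Config monitoring.
    return u
```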