## Overview

This topic documents how to get started with the Go AI SDK, and links to reference information on all of the supported features.

The Go AI SDK is designed for use with LaunchDarkly’s AI Configs. It is in a pre-1.0 release, and the API may change based on feedback. Active feature development is ongoing in the Python and Node.js AI SDKs, so this SDK receives new features at a slower pace. You can follow development or contribute on GitHub.

LaunchDarkly’s SDKs are open source. In addition to this reference guide, we provide source, API reference documentation, and sample applications:
| Resource | Location |
|---|---|
| SDK API documentation | SDK API docs |
| GitHub repository | ldai |
| Sample application | |
| Published module | pkg.go.dev |
## Get started

LaunchDarkly AI SDKs interact with AI Configs. AI Configs are the LaunchDarkly resources that manage model configurations and messages for your generative AI applications.

You can use the Go AI SDK to customize your AI Config based on the context that you provide. This means both the messages and the model evaluation in your generative AI application are specific to each end user, at runtime. You can also use the AI SDKs to record metrics from your AI model generation, including duration and tokens.

Follow these instructions to start using the Go AI SDK in your application.

### Install the SDK

First, install the AI SDK as a dependency in your application. How you do this depends on what dependency management system you are using:

- If you are using the standard Go modules system, import the SDK packages in your code and `go build` will automatically download them. The SDK and its dependencies are modules.
- Otherwise, use the `go get` command and specify the SDK version, such as `go get github.com/launchdarkly/go-server-sdk/ldai`.
### Initialize the client

After you install and import the SDK, create a single, shared instance of `LDClient`. Then, use it to initialize the `LDAIClient`. The `LDAIClient` is how you interact with AI Configs. Specify the SDK key to authorize your application to connect to a particular environment within LaunchDarkly.
The Go SDK uses an SDK key. Keys are specific to each project and environment. They are available from Project settings, on the Environments list. To learn more about key types, read Keys.
The rest of the code samples in this topic refer to this shared client instance as `ld`.
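A minimal initialization sketch follows. The import paths are based on the module path mentioned above and may differ by major version, and the AI client constructor name (`ldai.NewClient` here) is an assumption; check the SDK API docs for the exact signature.

```go
package main

import (
	"log"
	"time"

	ldclient "github.com/launchdarkly/go-server-sdk/v7"
	"github.com/launchdarkly/go-server-sdk/ldai"
)

func main() {
	// Create the single, shared LDClient. MakeClient blocks until the client
	// is ready or the timeout elapses. Replace "sdk-key-123abc" with your
	// environment's SDK key.
	ld, err := ldclient.MakeClient("sdk-key-123abc", 5*time.Second)
	if err != nil {
		log.Fatalf("failed to initialize LaunchDarkly client: %v", err)
	}
	defer ld.Close()

	// Wrap the shared client in the AI client used to interact with
	// AI Configs. (NewClient is an assumed constructor name.)
	aiClient := ldai.NewClient(ld)
	_ = aiClient
}
```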
### Configure the context

Next, configure the context that will use the AI Config, that is, the context that will encounter generated AI content in your application. The context attributes determine which variation of the AI Config LaunchDarkly serves to the end user, based on the targeting rules in your AI Config. If you are using template variables in the messages in your AI Config’s variations, the context attributes also fill in values for the template variables. Here’s how:
### Customize an AI Config

Then, use `Config()` to customize the AI Config. Customization means that any variables you include in the messages when you define the AI Config variation have their values set from the context attributes and variables you pass to the `Config()` method.
The customization process within the AI SDK is similar to evaluating flags in one of LaunchDarkly’s client-side, server-side, or edge SDKs, in that the SDK completes the customization without a separate network call. The `Config()` function takes an AI Config key, a context, and a fallback value. It performs the evaluation, then returns a `Config` object with the customized messages and model, and a `Tracker` object to capture performance metrics. If it cannot perform the evaluation or LaunchDarkly is unreachable, it returns the fallback value. For example, you might use an empty, disabled `Config` as a fallback value, or a fully configured default. Either way, you should make sure to check for this case and handle it appropriately in your application.
After you call `Config()`, you can pass the customized messages directly to your AI provider.
Here’s how:
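Based on the description above, the customization call might look like the following sketch. The types, the `Config()` signature, the fallback construction, and the variable map are all assumptions; consult the SDK API docs for the real shapes.

```go
// Sketch only: types and method names from the ldai package are assumptions
// based on the description above; consult the API docs for exact signatures.
func generate(aiClient *ldai.Client, context ldcontext.Context) {
	// A disabled Config used as the fallback if evaluation fails or
	// LaunchDarkly is unreachable.
	fallback := ldai.Config{Enabled: false}

	cfg, tracker := aiClient.Config(
		"ai-config-key-123abc", // your AI Config key
		context,                // the evaluation context built earlier
		fallback,               // returned when evaluation cannot be performed
		map[string]interface{}{"topic": "weather"}, // template variables (illustrative)
	)

	if !cfg.Enabled {
		// The fallback (or a disabled variation) was returned; handle it
		// appropriately, for example by skipping AI generation.
		return
	}

	// The customized messages can be passed directly to your AI provider.
	for _, msg := range cfg.Messages {
		_ = msg.Content // variables are already interpolated
	}
	_ = tracker // used to record metrics from model generation
}
```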
### Call provider, record metrics from AI model generation

Finally, use the `TrackRequest` function to make a request to your generative AI provider and record metrics from your AI model generation. Make sure to check whether the returned `cfg` is enabled, and handle the disabled case appropriately in your application.
Here’s how:
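One possible shape for this step is sketched below, assuming a callback-style `TrackRequest`. The signature, the `ProviderResponse` type, and its field names are assumptions, and `callProvider` is a hypothetical helper standing in for your own provider integration.

```go
// Sketch only: TrackRequest's signature and the ProviderResponse type are
// assumptions; check the SDK API docs for the real interface.
func respond(cfg ldai.Config, tracker *ldai.Tracker) (string, error) {
	if !cfg.Enabled {
		// Handle the disabled case before making any provider calls.
		return "", nil
	}
	// TrackRequest wraps the provider call, recording duration and token
	// usage from the metrics you return.
	result, err := tracker.TrackRequest(func() (ldai.ProviderResponse, error) {
		// callProvider is a hypothetical helper wrapping your generative AI
		// provider's client; it receives the customized messages.
		output, tokens := callProvider(cfg.Messages)
		return ldai.ProviderResponse{
			Output: output,
			Usage:  tokens,
		}, nil
	})
	if err != nil {
		return "", err
	}
	return result.Output, nil
}
```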
Alternatively, if you make the request to your AI provider yourself, you can use the `Track*` functions to record these metrics manually. The `TrackMetric` function expects a complete response, so you may need to record metrics manually if your application requires streaming.
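For the streaming case, the manual path might look like the sketch below. The individual `Track*` names (`TrackDuration`, `TrackTokens`, `TrackSuccess`) and the `TokenUsage` type are assumptions modeled on the Python and Node.js AI SDKs, and `streamFromProvider` is a hypothetical helper.

```go
// Sketch only: these Track* names are assumptions modeled on the other AI
// SDKs; verify them against the Go SDK API docs.
func streamAndTrack(cfg ldai.Config, tracker *ldai.Tracker) {
	start := time.Now()

	// streamFromProvider is a hypothetical helper that streams the response
	// from your provider and returns token counts when it finishes.
	inputTokens, outputTokens := streamFromProvider(cfg.Messages)

	tracker.TrackDuration(time.Since(start))
	tracker.TrackTokens(ldai.TokenUsage{Input: inputTokens, Output: outputTokens})
	tracker.TrackSuccess()
}
```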
To learn more, read Tracking AI metrics.