Datadog, the monitoring and security platform for cloud applications, has announced a new integration that monitors OpenAI API usage patterns, costs and performance across OpenAI models, including GPT-4 and other completion models. Datadog's tracing libraries handle the data collection, so customers can start monitoring their OpenAI usage quickly and with minimal setup.
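As a rough illustration of what that setup can look like, the sketch below instruments a Python service with Datadog's tracing library before calling the OpenAI API. It assumes the ddtrace package, the pre-1.0 openai Python SDK, a running Datadog Agent and standard environment configuration (for example OPENAI_API_KEY); the model choice and prompt are illustrative only.

```python
# Minimal sketch: enable Datadog's OpenAI tracing in a Python service.
import os

from ddtrace import patch

# Instrument the openai client library so API calls emit traces and metrics.
patch(openai=True)

import openai

openai.api_key = os.environ["OPENAI_API_KEY"]

# This call is now traced automatically: latency, errors, model name and
# token counts are captured by the integration without extra code.
response = openai.ChatCompletion.create(
    model="gpt-4",
    messages=[{"role": "user", "content": "Summarize our deployment runbook."}],
)
print(response["choices"][0]["message"]["content"])
```

Alternatively, running the service under ddtrace-run typically achieves the same instrumentation without any code changes.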
With actionable insights into the API's usage, latency and costs, organizations can use AI models more effectively and focus on what matters most: improving day-to-day operations and innovating on products and services.
Datadog’s integration with OpenAI enables organizations to:
- Understand Usage: With an easy setup process, users gain comprehensive insights into overall OpenAI API usage, token consumption and costs, broken down by service, team and API key, through Datadog's out-of-the-box dashboards. Preconfigured alerts also help users stay on top of OpenAI API rate limits (a hedged sketch of defining such an alert programmatically follows this list).
- Optimize Performance: By tracking API error rates, rate limits and response times, Datadog helps users distinguish application performance issues from API performance issues. Users can also view traces and logs containing prompt and completion examples to understand key application bottlenecks and user behaviors.
- Track Costs: Users can review token allocation and analyze the associated costs of OpenAI API calls. Datadog offers insights into token allocation by model, service and organization to help teams manage their expenses more effectively and avoid unexpected bills.
- Cover Multiple AI Models: In addition to the GPT family of large language models, Datadog's integration lets organizations track performance, costs and usage for other OpenAI models, including Ada, Babbage, Curie and Davinci.
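Relating to the rate-limit alerts mentioned in the first item above, the following sketch shows how a similar alert could be created through Datadog's public API using the datadog-api-client Python package. The metric name openai.ratelimit.remaining.requests, the service tag my-llm-app and the thresholds are illustrative assumptions, not documented names; the preconfigured alerts Datadog ships do not require this step.

```python
# Hypothetical sketch: create a rate-limit alert via the Datadog Monitors API.
# Assumes DD_API_KEY and DD_APP_KEY are set in the environment.
from datadog_api_client import ApiClient, Configuration
from datadog_api_client.v1.api.monitors_api import MonitorsApi
from datadog_api_client.v1.model.monitor import Monitor
from datadog_api_client.v1.model.monitor_type import MonitorType

monitor = Monitor(
    name="OpenAI API rate limit nearly exhausted",
    type=MonitorType("metric alert"),
    # Illustrative query: alert when the remaining request quota drops below 50.
    query="avg(last_5m):avg:openai.ratelimit.remaining.requests{service:my-llm-app} < 50",
    message="Remaining OpenAI request quota is low; consider backing off.",
)

with ApiClient(Configuration()) as api_client:
    MonitorsApi(api_client).create_monitor(body=monitor)
```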