Datadog's Platform Expands to Support Monitoring and Troubleshooting of Generative AI Applications
Generative AI-based features such as AI assistants and copilots are quickly becoming an important part of software product roadmaps. While these emerging capabilities hold a lot of promise, deploying them in customer-facing applications brings challenges around cost, availability and accuracy.
The tech stacks used in generative AI are evolving quickly, as new application frameworks, models, vector databases, service chains and supporting technologies see rapid adoption. To keep up, organizations require observability solutions that can adapt and evolve along with their AI stacks.
Today, Datadog announced an expansion of its platform to support monitoring and troubleshooting of generative AI applications.
These capabilities include integrations for the end-to-end AI stack:
- AI infrastructure and compute: NVIDIA, CoreWeave, AWS, Azure and Google Cloud
- Embeddings and data management: Weaviate, Pinecone and Airbyte
- Model serving and deployment: TorchServe, Vertex AI and Amazon SageMaker
- Model layer: OpenAI and Azure OpenAI
- Orchestration framework: LangChain
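The release does not include setup details, but as a rough illustration of how one of the integrations listed above might be wired into an application, the sketch below enables ddtrace's OpenAI integration so that chat completion calls are traced and reported to a Datadog Agent. The service configuration, API key handling and use of the pre-1.0 `openai` client are assumptions for illustration, not part of the announcement.

```python
# Minimal sketch: tracing OpenAI calls with Datadog APM via ddtrace.
# Assumes the ddtrace and openai (pre-1.0 client) packages are installed
# and a Datadog Agent is reachable; names and values are illustrative.
from ddtrace import patch

patch(openai=True)  # enable ddtrace's OpenAI integration before the client is used

import openai

openai.api_key = "sk-..."  # placeholder; load from a secrets manager in practice

# The integration records a span for each request, capturing latency,
# token usage and errors, which then appear in Datadog alongside the
# rest of the stack's telemetry (infrastructure, model serving, etc.).
response = openai.ChatCompletion.create(
    model="gpt-3.5-turbo",
    messages=[{"role": "user", "content": "Summarize the latest deployment errors."}],
)
print(response["choices"][0]["message"]["content"])
```

In many setups the same instrumentation can be enabled without code changes by launching the application under `ddtrace-run`.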
Additionally, Datadog announced new LLM observability capabilities to help teams monitor and troubleshoot applications built on large language models.
LLM observability includes:
- Model catalog: Monitor and alert on model usage, costs and API performance.
- Model performance: Identify model performance issues based on different data characteristics provided out of the box, such as prompt and response lengths, API latencies and token counts.
- Model drift: Categorize prompts and responses into clusters to track performance and detect drift over time.
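The announcement does not specify how this telemetry is emitted. As one hedged illustration, the characteristics mentioned above (token counts, API latencies, prompt and response lengths) could also be reported as custom metrics through DogStatsD using the `datadog` Python client; the metric names and tags below are hypothetical placeholders, not Datadog-defined conventions.

```python
# Minimal sketch: reporting custom LLM metrics to Datadog via DogStatsD.
# Assumes a Datadog Agent with DogStatsD listening on localhost:8125;
# metric names, tags and the helper function are illustrative.
import time
from datadog import initialize, statsd

initialize(statsd_host="localhost", statsd_port=8125)

def record_llm_call(model: str, prompt: str, response_text: str,
                    latency_s: float, tokens: int) -> None:
    tags = [f"model:{model}"]
    statsd.increment("llm.requests", tags=tags)                        # request volume
    statsd.histogram("llm.request.latency", latency_s, tags=tags)      # API latency
    statsd.histogram("llm.response.tokens", tokens, tags=tags)         # token count
    statsd.histogram("llm.prompt.length", len(prompt), tags=tags)      # prompt length
    statsd.histogram("llm.response.length", len(response_text), tags=tags)

# Example usage around a hypothetical model call:
start = time.time()
text, used_tokens = "...", 512  # stand-ins for a real model response
record_llm_call("gpt-3.5-turbo", "Summarize today's incidents.", text,
                time.time() - start, used_tokens)
```

Metrics emitted this way can back dashboards and monitors alongside the traces collected by the platform integrations listed earlier.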
"It's essential for teams to measure the time and resources they are investing in their AI models, especially as tech stacks continue to modernize," said Yrieix Garnier, VP of Product at
About Datadog
Forward-Looking Statements
This press release may include certain "forward-looking statements" within the meaning of Section 27A of the Securities Act of 1933, as amended, or the Securities Act, and Section 21E of the Securities Exchange Act of 1934, as amended, including statements on the benefits of new products and features. These forward-looking statements reflect our current views about our plans, intentions, expectations, strategies and prospects, which are based on the information currently available to us and on assumptions we have made. Actual results may differ materially from those described in the forward-looking statements and are subject to a variety of assumptions, uncertainties, risks and factors that are beyond our control, including those risks detailed under the caption "Risk Factors" and elsewhere in our Securities and Exchange Commission (SEC) filings and reports.
Contact
press@datadoghq.com
SOURCE Datadog, Inc.