# AI Performance

Sentry's AI Performance lets you see critical metrics for your LLM pipelines, like token usage and duration. In addition to this dedicated space for monitoring AI performance, you can also view Sentry's Frontend, Backend, and Mobile Insights to understand the performance of the applications running your pipelines.

## [Learn More](https://docs.sentry.io/product/insights/ai.md#learn-more)

* #### [AI Agents](https://docs.sentry.io/product/insights/ai/agents.md)

  Learn how to use Sentry's AI Agent monitoring tools to trace and debug your AI agent workflows, including agent runs, tool calls, and model interactions.

* #### [MCP](https://docs.sentry.io/product/insights/ai/mcp.md)

  Learn how to use Sentry's MCP monitoring tools to trace and debug your Model Context Protocol implementations, including server connections, resource access, and tool executions.
