We're thrilled to announce the release of the Rootly MCP Server, available on GitHub and licensed under Apache 2.0.
Users can now connect Rootly to any MCP-compatible AI tool, such as Cursor, Claude, and Copilot, giving responders the ability to resolve production incidents in under a minute without leaving their IDE.
What’s MCP?
MCP (Model Context Protocol) is a new standard created by Anthropic for connecting AI assistants to the systems where data lives. It helps you build agents and workflows on top of LLMs.
In the context of Rootly, our MCP Server opens up exciting new possibilities, allowing incident management teams and engineers to leverage incident insights and data directly within their AI workflows. This means less context switching, faster resolution times, and smarter incident handling—all through a standardized, open protocol.
MCP and incident resolution
In the context of incident resolution, MCP can help responders bring the context Rootly provides directly into their favorite coding tools, avoiding context switching and decreasing the time it takes to solve incidents.
Rootly customer Brex is leveraging MCP extensively to enhance developer productivity. “Developers deliver their highest value when focused within their IDE,” says Jarrod Ruhland, Principal Engineer at Brex. “Our hypothesis is that integrating Rootly directly into editors will accelerate incident investigation and resolution and increase developer efficiency.”
In the demo video below, an incident is imported directly from Rootly into Cursor's chat via MCP. From there, we can see the incident's title, severity, and all the other metadata available in Rootly. The engineer then prompts Cursor for a solution; Cursor finds the root cause and suggests a fix. The engineer just has to review the proposed change and save it.
Thanks to the Rootly MCP server, we went from an incident to a solution in under a minute.
How to get started
You can install the package from PyPI or by cloning the project repo. If you want to use it directly from your favorite IDE, a JSON configuration tested with Cursor and Windsurf is provided.
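For illustration, an MCP client configuration for Cursor or Windsurf typically looks like the sketch below. The package name, command, and environment variable here are assumptions; check the repo README for the authoritative version.

```json
{
  "mcpServers": {
    "rootly": {
      "command": "uvx",
      "args": ["rootly-mcp-server"],
      "env": {
        "ROOTLY_API_TOKEN": "<your Rootly API token>"
      }
    }
  }
}
```

Cursor and Windsurf both read a file of this shape (`mcp.json`) to discover and launch local MCP servers.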
We've included several targeted features to enhance the MCP server's functionality. The server dynamically generates MCP resources based on Rootly's OpenAPI (Swagger) specification, ensuring endpoint mappings remain consistently up-to-date.
To maintain optimal context for LLM processing, the server defaults pagination to 10 items per response for incident-related endpoints, preventing context window overflow.
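The two behaviors above can be sketched in a few lines: walk the OpenAPI spec, emit one tool definition per path/method pair, and clamp the default page size. This is a minimal illustration, not the actual Rootly MCP Server code; `make_tools` and `DEFAULT_PAGE_SIZE` are hypothetical names.

```python
# Sketch: deriving MCP-style tool definitions from an OpenAPI (Swagger)
# spec, with a small default page size so incident listings don't
# overflow the LLM's context window.

DEFAULT_PAGE_SIZE = 10  # keep responses small for the model's context

def make_tools(openapi_spec: dict) -> list[dict]:
    """Derive one tool per (path, method) pair found in the spec."""
    tools = []
    for path, methods in openapi_spec.get("paths", {}).items():
        for method, op in methods.items():
            params = {p["name"]: p.get("schema", {})
                      for p in op.get("parameters", [])}
            # Default paginated endpoints to 10 items per response.
            if "page[size]" in params:
                params["page[size]"]["default"] = DEFAULT_PAGE_SIZE
            tools.append({
                "name": op.get("operationId", f"{method}_{path}"),
                "method": method.upper(),
                "path": path,
                "parameters": params,
            })
    return tools

# Usage with a toy one-endpoint spec:
spec = {
    "paths": {
        "/incidents": {
            "get": {
                "operationId": "listIncidents",
                "parameters": [
                    {"name": "page[size]", "schema": {"type": "integer"}}
                ],
            }
        }
    }
}
tools = make_tools(spec)
print(tools[0]["name"], tools[0]["parameters"]["page[size]"]["default"])
# → listIncidents 10
```

Because tools are generated from the spec rather than hand-written, new or changed Rootly endpoints are picked up automatically when the spec updates.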
Additionally, the server deliberately limits the number of exposed API paths, for two reasons. First, context management: Rootly's API exposes many paths, which can overwhelm AI agents and impair their ability to perform straightforward actions. By default, the server only exposes /incidents and /incidents/{incident_id}/alerts. Second, security: restricting exposed paths lets us carefully control which information and actions users can reach via the MCP server. Instructions for changing the available paths are in the project documentation.
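The allowlisting described above amounts to filtering the spec's paths before any tools are generated. A minimal sketch, assuming an `ALLOWED_PATHS` set mirroring the defaults (the names are illustrative, not the server's actual API):

```python
# Sketch: expose only an allowlist of API paths from the OpenAPI spec,
# both to keep the agent's tool set small and to limit what users can
# reach through the MCP server.

ALLOWED_PATHS = {
    "/incidents",
    "/incidents/{incident_id}/alerts",
}

def filter_paths(spec_paths: dict) -> dict:
    """Keep only allowlisted paths from the spec's `paths` object."""
    return {p: ops for p, ops in spec_paths.items() if p in ALLOWED_PATHS}

all_paths = {
    "/incidents": {"get": {}},
    "/incidents/{incident_id}/alerts": {"get": {}},
    "/users": {"get": {}},  # not exposed by default
}
print(sorted(filter_paths(all_paths)))
# → ['/incidents', '/incidents/{incident_id}/alerts']
```

Extending the server's reach is then a one-line change: add a path to the allowlist and the corresponding tools appear.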
MCP future
As the MCP ecosystem expands, this server will let incident responders connect easily to a growing range of AI models and tools, with consistent and reliable integration patterns.
The Rootly MCP Server source code is available on the Rootly AI Labs GitHub, licensed under Apache 2.0. It's a prototype, and we're happily accepting feature requests and pull requests. Let's shape the future of reliability together.