
Using dataopslive MCP server

DataOps.live and Snowflake are the primary platforms for running your data operations. Assist includes built-in MCP (Model Context Protocol) servers for both.

This guide covers the dataopslive MCP server; the next page covers the Snowflake MCP server.

The dataopslive MCP server provides tools that allow Assist to interact with DataOps projects, Vaults (which store encrypted secrets and configuration), and the Knowledge Base (which includes documentation from DataOps.live, Snowflake, and dbt).

Here are some tools it offers:

  • search_knowledge_base – searches DataOps.live, Snowflake, and dbt documentation, as well as Vault configurations.
  • Project and job tools (get_project_id, list_pipelines, list_jobs_of_pipeline, get_job_log) – fetch project, pipeline, and job information from your DataOps.live environment.
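To make the tool surface concrete, here is a minimal sketch of how an MCP-style client might expose these tools as a name-to-function registry. The signatures, return shapes, and values below are assumptions for illustration only; the real server defines its own schemas.

```python
# Illustrative stubs for two dataopslive MCP tools, registered by name.
# Signatures and return values are hypothetical, not the server's real API.

def search_knowledge_base(query: str) -> list[dict]:
    """Stub: search DataOps.live, Snowflake, and dbt docs plus Vault configs."""
    return [{"source": "docs.dataops.live", "title": f"Results for: {query}"}]

def get_project_id(project_path: str) -> int:
    """Stub: resolve a project path to a numeric project ID."""
    return 42  # placeholder value

# Registry mapping tool names to callables, as an MCP client might hold them.
TOOLS = {
    "search_knowledge_base": search_knowledge_base,
    "get_project_id": get_project_id,
}

def call_tool(name: str, **kwargs):
    """Dispatch a tool call by name with keyword arguments."""
    return TOOLS[name](**kwargs)

results = call_tool("search_knowledge_base", query="SOLE database definitions")
```

The registry-plus-dispatch shape mirrors how MCP clients invoke server tools by name with structured arguments.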

Note: The MCP servers are automatically enabled. If you need to configure them, this guide shows how to do that.

How does the dataopslive MCP work?

When you send a prompt to Assist, the server evaluates whether the request requires external context. For example:

  • If the request relates to documentation, it calls search_knowledge_base.
  • If the request involves pipelines or jobs, it calls the project and job tools.

Suppose you ask, “How do I configure SOLE database definitions?” Assist uses the search_knowledge_base tool to query the documentation sources and returns the configuration steps directly in the response.
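The routing described above can be sketched as a tiny dispatcher. The real server's evaluation is model-driven; this keyword heuristic and the tool-group names are purely illustrative.

```python
# Illustrative router: pick a tool (group) based on the prompt's wording.
# The keyword list and return labels are assumptions, not the server's logic.

PIPELINE_KEYWORDS = ("pipeline", "job", "log", "failure")

def route(prompt: str) -> str:
    """Return which tool (group) the prompt would be sent to."""
    p = prompt.lower()
    if any(keyword in p for keyword in PIPELINE_KEYWORDS):
        return "project_and_job_tools"  # e.g. list_pipelines, get_job_log
    return "search_knowledge_base"      # documentation lookups by default

route("How do I configure SOLE database definitions?")  # -> "search_knowledge_base"
route("Show the log of the latest failed job")          # -> "project_and_job_tools"
```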


For simple prompts, the MCP server uses a single tool to return results. For complex prompts, it handles multiple subtasks by calling different tools in sequence.

How does the dataopslive MCP server handle complex prompts?

For complex prompts, the MCP server chains multiple tools to gather and combine information before responding.

Example prompt: Check recent SOLE job failures and provide troubleshooting guidance

The MCP server breaks this request into several smaller tasks and assigns each task to a specific tool.

  • get_project_id – Retrieve project information.
  • list_pipelines – Find pipelines from the last 24 hours with status "failed".
  • list_jobs_of_pipeline – Identify jobs with names containing "SOLE" or "Snowflake Setup".
  • get_job_log – Get logs from SOLE-related failed jobs.
  • search_knowledge_base – Look up troubleshooting steps for "SOLE troubleshooting" and "Snowflake connection issues".

The MCP server executes this chain and returns a consolidated troubleshooting summary.
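The chain above can be sketched end to end. Every tool below is a stub with assumed parameters and canned data; in practice the server supplies live results from your DataOps.live environment.

```python
# Illustrative end-to-end chain for "Check recent SOLE job failures and
# provide troubleshooting guidance". All tool functions are hypothetical
# stubs; parameter names and return shapes are assumptions.

def get_project_id(project_path):
    return 101  # placeholder project ID

def list_pipelines(project_id, status, since_hours):
    return [{"id": 9001, "status": status}]  # one canned failed pipeline

def list_jobs_of_pipeline(pipeline_id, name_contains):
    return [{"id": 7, "name": "SOLE Snowflake Setup", "status": "failed"}]

def get_job_log(job_id):
    return "Error: could not connect to Snowflake"

def search_knowledge_base(query):
    return [f"Guide: {query}"]

def troubleshoot_sole_failures(project_path):
    """Chain the tools: project -> failed pipelines -> SOLE jobs -> logs -> docs."""
    project_id = get_project_id(project_path)
    failed = list_pipelines(project_id, status="failed", since_hours=24)
    summary = []
    for pipeline in failed:
        for job in list_jobs_of_pipeline(pipeline["id"], name_contains="SOLE"):
            summary.append({
                "job": job["name"],
                "log": get_job_log(job["id"]),
                "guidance": search_knowledge_base("SOLE troubleshooting"),
            })
    return summary  # consolidated troubleshooting summary

report = troubleshoot_sole_failures("my-group/my-project")
```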