LangGrant unveils LEDGE MCP Server for safer LLM data access
LangGrant has launched a new server platform that sits between large language models and enterprise databases, aiming to give organisations AI-driven analytics and automation without exposing underlying data to the models.
The product, called LEDGE MCP Server, acts as a database orchestration and governance engine for applications that use large language models (LLMs). It focuses on multi-database reasoning, automated analytics workflows and controlled access to production-like data for AI agent development.
LangGrant, previously known as Windocks, has built its business around database modernisation, cloning and synthetic data. The company said the new server works with existing database systems such as Oracle, SQL Server, Postgres and Snowflake.
"The LEDGE MCP Server removes the friction between LLMs and enterprise data," said Ramesh Parameswaran, CEO, CTO and co-founder, LangGrant. "With this release, enterprises can apply agentic AI directly to existing database environments like Oracle, SQL Server, Postgres, Snowflake - securely, cost-effectively, and with full human oversight."
Security and cost
LangGrant is targeting organisations that want to apply LLMs and AI assistants to operational data but face restrictions on data movement and concerns over cost. Many enterprises restrict direct access to governed databases from external AI systems, and often also limit bulk data exports that could breach policy or regulation.
The company said LEDGE MCP Server keeps raw data inside governed systems. The server exposes metadata, schemas and relationships to the LLM rather than full datasets. Analytics and reasoning processes operate on this structured context instead of transmitting tables or large result sets.
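LangGrant has not published the server's interface, but the metadata-only pattern it describes resembles the following sketch, built with the open-source MCP Python SDK. The server name, tool name and catalogue contents here are illustrative assumptions, not LEDGE internals.

```python
# Illustrative only: a minimal MCP server that exposes schema metadata as a
# tool, with no tool that returns raw rows. The tool name and catalogue are
# placeholders, not LangGrant's implementation.
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("governed-schema-context")

# In a real deployment this catalogue would be built from the governed
# databases; here it is hard-coded so the sketch stays self-contained.
CATALOGUE = {
    "orders": {
        "columns": ["order_id", "customer_id", "region", "amount"],
        "foreign_keys": [{"column": "customer_id", "references": "customers.customer_id"}],
    },
    "customers": {"columns": ["customer_id", "name", "segment"], "foreign_keys": []},
}

@mcp.tool()
def describe_schema() -> dict:
    """Return table and relationship metadata only; row-level data never crosses this boundary."""
    return CATALOGUE

if __name__ == "__main__":
    mcp.run()
```

Because the only exposed tool returns schema descriptions, a connected model can reason about joins and filters while the rows themselves stay inside the governed database.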
LangGrant has paired this with token usage dashboards and budgeting features for LLM calls. These functions track and manage the token consumption associated with analytics and planning tasks. The company said this reduces the volume of data sent to models and lowers API-related costs.
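As an illustration of how per-task token budgeting can work in principle, the sketch below counts prompt tokens with the tiktoken library and refuses calls once a ceiling is reached. The limit, the encoding and the class itself are assumptions for illustration, not LEDGE's dashboard logic.

```python
# Illustrative sketch of per-task token budgeting; thresholds and the
# tiktoken-based counting are assumptions, not LEDGE's dashboard logic.
import tiktoken

class TokenBudget:
    def __init__(self, limit: int):
        self.limit = limit
        self.used = 0
        self.enc = tiktoken.get_encoding("cl100k_base")

    def charge(self, prompt: str) -> None:
        """Count the tokens a prompt would consume and enforce the budget before any API call."""
        cost = len(self.enc.encode(prompt))
        if self.used + cost > self.limit:
            raise RuntimeError(f"budget exceeded: {self.used + cost} > {self.limit}")
        self.used += cost

budget = TokenBudget(limit=50_000)   # hypothetical per-task ceiling
budget.charge("Summarise Q3 revenue by region using the sales schema only.")
print(budget.used, "tokens consumed so far")
```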
Analytics planning
A central feature of the new server is automated generation of multi-step analytics plans. The system uses LLMs to design and orchestrate query workflows. It then executes these workflows against connected databases.
LangGrant said LEDGE MCP Server breaks down complex analytical questions into a sequence of database operations. It then manages query planning, query execution and the integration of results across multiple sources.
The firm said this process remains reviewable and auditable by human teams. Organisations can inspect the proposed plans and resulting queries before or after execution. LangGrant said this approach addresses common concerns about LLM hallucinations in query generation and about opaque automation in data environments.
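The company has not described its plan format, but a reviewable multi-step plan can be as simple as an ordered list of query steps with an approval gate in front of execution. The step fields and the review hook below are hypothetical, sketched only to show the shape of the idea.

```python
# Illustrative only: one way to represent a reviewable multi-step analytics
# plan. Field names and the approval gate are assumptions, not LEDGE's format.
from dataclasses import dataclass, field

@dataclass
class PlanStep:
    step_id: str
    source: str                                   # which connected database runs the step
    sql: str                                      # the generated query, visible before execution
    depends_on: list[str] = field(default_factory=list)

plan = [
    PlanStep("s1", "postgres_sales", "SELECT region, SUM(amount) FROM orders GROUP BY region"),
    PlanStep("s2", "snowflake_finance", "SELECT region, budget FROM regional_budgets"),
    PlanStep("s3", "local_join", "-- join s1 and s2 results on region", depends_on=["s1", "s2"]),
]

def review(steps: list[PlanStep]) -> bool:
    """Human gate: print every proposed query and ask for approval before anything runs."""
    for step in steps:
        print(f"[{step.step_id}] on {step.source}: {step.sql}")
    return input("Approve plan? [y/N] ").strip().lower() == "y"

if review(plan):
    print("executing approved plan...")   # execution engine omitted in this sketch
```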
Agent development
The product also targets developers who are building agentic AI applications. These agents often require access to production-like data structures for testing and tuning, but enterprises are cautious about exposing live systems.
LangGrant said LEDGE MCP Server provisions isolated database clones and containers on demand. These environments mirror production schemas and data characteristics. The company said this avoids direct use of live databases and reduces the spread of unmanaged copies.
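LangGrant's cloning is built on its own Windocks-derived technology, so the following is only a generic illustration of on-demand, isolated database environments, here using the Docker SDK for Python with a placeholder image and credentials.

```python
# Illustrative sketch of an on-demand, throwaway test database using the
# Docker SDK for Python. The image, credentials and teardown flow are
# placeholders, not LangGrant's cloning mechanism.
import docker

client = docker.from_env()

def provision_clone(name: str):
    """Start an isolated Postgres container intended to mirror the production schema."""
    return client.containers.run(
        "postgres:16",
        name=name,
        environment={"POSTGRES_PASSWORD": "dev-only"},
        ports={"5432/tcp": None},      # let Docker pick a free host port
        detach=True,
    )

clone = provision_clone("agent-test-db")
print("clone started:", clone.short_id)
# ...load schema-only or synthetic data here, run the agent tests, then tear down
clone.stop()
clone.remove()
```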
The cloning features draw on LangGrant's earlier work with containerised databases. The company introduced SQL Server containers on Windows a decade ago. It later added containerised Oracle database clones and has worked on synthetic data for testing and development.
Multi-database context
Another element of LEDGE MCP Server is automated context building across heterogeneous databases. The platform scans connected systems and maps schemas, table relationships and metadata into a unified representation that an LLM can interpret.
LangGrant said this gives LLMs a view of the "data landscape" without direct access to the underlying rows and columns. The system is designed for cases where users need to join tables across multiple platforms or run analytics that span several business domains.
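One plausible way to assemble such a metadata-only "data landscape" is schema introspection across every connected engine. The sketch below uses SQLAlchemy inspection; the connection strings and catalogue layout are assumptions for illustration, not LEDGE's representation.

```python
# Illustrative only: building a unified, metadata-only catalogue across
# several engines with SQLAlchemy inspection. Connection strings are
# placeholders; no row-level data is read.
from sqlalchemy import create_engine, inspect

SOURCES = {
    "postgres_sales": "postgresql://readonly@pg.internal/sales",
    "mssql_crm": "mssql+pyodbc://readonly@crm-dsn",
    "snowflake_finance": "snowflake://readonly@acct/finance",
}

def build_landscape(sources: dict[str, str]) -> dict:
    """Collect table, column and foreign-key metadata from every source."""
    landscape = {}
    for name, url in sources.items():
        insp = inspect(create_engine(url))
        landscape[name] = {
            table: {
                "columns": [c["name"] for c in insp.get_columns(table)],
                "foreign_keys": insp.get_foreign_keys(table),
            }
            for table in insp.get_table_names()
        }
    return landscape

# The resulting dictionary is the kind of structured context an LLM could use
# to plan cross-database joins without ever seeing the underlying rows.
print(list(build_landscape(SOURCES)))
```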
The company is positioning the server as vendor-agnostic with respect to AI agents. It said LEDGE MCP Server supports agents from any supplier. The orchestration and governance layer sits between those agents and the organisation's data infrastructure.
Context engineering push
LangGrant is placing the launch in the wider trend of context engineering in AI. Many organisations are experimenting with LLMs in software development, reporting and decision support. They often rely on manual query writing, ad hoc data pipelines and prompt-level context feeding using tools such as AI coding assistants.
The firm argues that these methods do not scale for complex, governed database estates. Automated context construction, governed access and structured analytics plans are presented as requirements for moving from experimental pilots into production environments.
The LEDGE MCP Server is now available for trial use. LangGrant said interested teams can work with the system on automated context engineering, multi-database reasoning and controlled agentic AI development ahead of potential production deployments.
"The LEDGE MCP Server removes the friction between LLMs and enterprise data," said Parameswaran.