May 15, 2025

Partnerships

Mercury dLLM from Inception Labs Powering Microsoft NLWeb

Burzin Patel

VP of Product

At today's Microsoft Build conference keynote, Satya Nadella announced NLWeb, an open project designed to simplify the creation of natural language interfaces for websites, making it easy to turn any website into an AI-powered application. NLWeb’s focus on building RESTful conversational interfaces for semi-structured content, such as web product catalogs, event listings, and recipes, aligns perfectly with Mercury, Inception Labs’ ultra-fast diffusion Large Language Model (dLLM) that supports real-time interactions.
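To make this concrete, here is a minimal sketch of what querying an NLWeb-style conversational endpoint could look like from a client. The endpoint path, parameter names, and response fields below are illustrative assumptions, not the official NLWeb protocol; see the NLWeb project for the actual interface.

```python
import requests

# Hypothetical NLWeb-style endpoint exposed by a site that publishes its
# catalog as semi-structured (Schema.org-like) data. The path, parameter
# names, and response shape are illustrative assumptions only.
NLWEB_ENDPOINT = "https://example-store.com/ask"


def ask(question: str) -> list[dict]:
    """Send a natural-language question and return the structured items
    the site's conversational interface grounds its answer in."""
    response = requests.get(NLWEB_ENDPOINT, params={"query": question}, timeout=10)
    response.raise_for_status()
    payload = response.json()
    # Assume the service returns a list of result items, each carrying the
    # underlying structured record (e.g. a product, event, or recipe).
    return payload.get("results", [])


if __name__ == "__main__":
    for item in ask("waterproof trail running shoes under $120"):
        print(item.get("name"), "-", item.get("url"))
```

The point of this pattern is that every answer is grounded in the site's own structured records, so the response can point back to the exact product, event, or recipe it came from.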

At Inception Labs, we believe that the future of user interaction hinges on speed and accuracy. Our Mercury family of models is the first commercial-scale LLM based on diffusion, the technology that powers tools like Midjourney and Sora, and it delivers real-time LLM performance, enabling low-latency conversational agents for coding assistants, voice applications, and agentic systems. Microsoft’s NLWeb announcement advances this vision in a powerful new way: it provides a structured foundation for grounded natural-language responses that is designed to be hallucination-free, and it gives power back to content websites, unlike search products such as ChatGPT and Perplexity that pull traffic away from the source sites. Combined with Mercury, NLWeb’s architecture enables lightning-fast, natural conversations grounded in real data, as depicted in the plot below.



The plot shows that the Mercury dLLM is significantly faster than GPT-4.1 Mini and Claude 3.5 Haiku.

At Inception Labs, we’re thrilled to partner with Microsoft on this integration, alongside partners such as TripAdvisor, Shopify, and Snowflake. The solution brings together the NLWeb ecosystem and our cutting-edge Mercury dLLMs, combining speed, reliability, and intelligence in one seamless experience.

You can try out the Mercury dLLM on our Playground or via our API. If you are an enterprise customer looking for a private instance, or want to fine-tune the Mercury dLLM on your own data before using it with NLWeb, please reach out to us at sales@inceptionlabs.ai - we have just the right solution for you.
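For developers who want to experiment programmatically, here is a minimal sketch of a chat request, assuming the API follows the familiar OpenAI-compatible chat-completions convention. The base URL, environment variable, and model name are placeholders; check our API documentation for the exact values for your account.

```python
import os

from openai import OpenAI

# Illustrative sketch: the base URL, environment variable, and model name
# below are assumptions, not documented values.
client = OpenAI(
    base_url="https://api.inceptionlabs.ai/v1",  # assumed OpenAI-compatible endpoint
    api_key=os.environ["INCEPTION_API_KEY"],     # assumed name for your API key variable
)

completion = client.chat.completions.create(
    model="mercury",  # assumed identifier for the Mercury dLLM
    messages=[
        {"role": "system", "content": "Answer using only the provided catalog data."},
        {"role": "user", "content": "Which of these hiking boots are waterproof?"},
    ],
)
print(completion.choices[0].message.content)
```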

You can read more about the announcement here and here.
