Today, we’re happy to present the newest addition to the growing ecosystem of data tools that work with Cube, the semantic layer for building data apps. Delphi provides a way to interact with Cube in natural language and delivers insights straight to business users in Slack.
Powered by the latest large language model (LLM) technology from OpenAI, Delphi takes the accessibility of self-service analytics to the next level and enables a compelling use case for semantic layers. See Cube and Delphi in action:
Learn more about Delphi and semantic superiority from David Jayatillake, co-founder and CEO at Delphi Labs:
Natural language interface for data
Our mission for Delphi is to solve the whole workflow with data, from question to answer. Today, that workflow is broken: self-serve BI tools haven’t delivered on this promise, and data teams are bombarded with ad hoc requests at a greater scale than ever as businesses move to operate with data.
When we ask data leaders what percentage of their organization is capable of self-serving with data, the answer is usually between 10% and 25%, with only 10% being very competent. It’s been three years since “dashboards are dead,” but unfortunately dashboards are still the primary way data is served. In my experience, no stakeholder ever wanted a dashboard. They wanted an answer, an explanation, a prediction, data to use elsewhere operationally, or a graph or slide to make a specific point. This is the problem space we aim to solve with Delphi.
Many other entrants are trying to offer a natural language interface to data in this LLM gold rush. However, almost all of them point the LLM at a database schema and get it to generate SQL. We believe this is the wrong approach: every time a query is answered this way, the LLM generates a minimal semantic layer on the fly in order to derive meaning from the data. Since humans created the data structures and their meaning in the first place, why introduce a probabilistic element into the meaning of your data?
We believe that working with semantic layers curated by data folks is the best approach. It draws on the information already baked into the semantic layer, truly enabling safe and accurate answers. It also allows us to say that we can’t answer a question, which is a perfectly normal outcome when asking a data team something. LLMs connected directly to a data store will simply guess, because they have no concept of a world and of what does and doesn’t exist in it, which a semantic layer provides.
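To make the contrast concrete, here is a minimal sketch of what a semantic-layer-backed request looks like. It is illustrative only: the cube and member names, the deployment URL, and the token handling are hypothetical placeholders, not part of Delphi’s implementation. The point is that the natural-language layer only has to map a question onto measures and dimensions a data team has already defined in Cube, and anything outside that vocabulary can be refused rather than guessed.

```typescript
// Sketch: a question such as "How many orders did we get per month last year?"
// is translated into a structured Cube query over curated semantic-layer
// members, instead of freeform SQL over raw warehouse tables.
// Cube and member names below ("orders.count", "orders.created_at") are
// hypothetical examples of members a data team might have defined.

type CubeQuery = {
  measures: string[];
  dimensions?: string[];
  timeDimensions?: {
    dimension: string;
    granularity?: "day" | "week" | "month";
    dateRange?: string | [string, string];
  }[];
};

const query: CubeQuery = {
  measures: ["orders.count"],
  timeDimensions: [
    { dimension: "orders.created_at", granularity: "month", dateRange: "Last year" },
  ],
};

async function runQuery(apiUrl: string, token: string, q: CubeQuery) {
  // Cube's REST API accepts a JSON query on the /cubejs-api/v1/load endpoint.
  const res = await fetch(`${apiUrl}/cubejs-api/v1/load`, {
    method: "POST",
    headers: { "Content-Type": "application/json", Authorization: token },
    body: JSON.stringify({ query: q }),
  });
  if (!res.ok) {
    // If the question references members that don't exist in the semantic
    // layer, the API rejects the query instead of guessing at a meaning.
    throw new Error(`Cube returned ${res.status}: ${await res.text()}`);
  }
  return (await res.json()).data;
}

// Example usage with placeholder credentials:
// runQuery("https://example.cubecloud.dev", process.env.CUBE_API_TOKEN!, query)
//   .then(console.log);
```

Because the query vocabulary is fixed by the semantic layer, the set of answerable questions is explicit, which is exactly what makes “we can’t answer that” a possible and honest response.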
This is why we’re so excited to work with semantic layers like Cube—we’re already blown away by the interest we’ve had from the Cube community and hope to continue to work with you all!
As I’ve written on my Substack, I believe the future of semantic layers is standalone. It’s what the people want.
What’s next?
Here's what an early adopter of Delphi and Cube thinks:
Getting started with Delphi on Cube was a breeze, and we very quickly had a conversational AI interface to our data warehouse. The accuracy was great, and some of the Delphi features really helped explain our data and ensure we knew how we were querying it. I'd highly recommend it for its wow factor and for enabling a move towards a data-driven culture.
Interested? Implement your semantic layer with Cube by getting started with Cube Cloud today or by contacting us. Also, sign up for Delphi and see it in action in the #random-delphi-demo channel at slack.cube.dev: