Superconductive, creator of Great Expectations, raises $40M to launch a commercial version of its open source data quality tool


Data quality — the practice of testing and ensuring that the data and data sets you are using are what you expect them to be — has become a key component in the world of data science. Data may be the “new oil,” but if it’s too crude, you may not be able to use it.

Today, a startup building tools to make it easier to measure and ensure the quality of the data you are using is announcing some funding, a sign of how attention has been shifting to this area.

Superconductive — a startup best known for creating and maintaining the Great Expectations open source data quality tool — has raised $40 million in a Series B round of funding. It will be using the capital both to keep building out its open source product and community, and to ready its first commercial product — a less technical, more accessible version of Great Expectations that can be used by more than just engineers and data scientists — set to launch later this year.

Once the commercial offering is released, it will be named Great Expectations Cloud.

As Abe Gong, the CEO and co-founder of Superconductive, describes it, data quality has long been a priority for engineering and data science teams. But as data usage and access become increasingly democratized in increasingly digitized organizations — thanks in part to low-code and no-code software — data quality becomes a point of consideration (not an “issue” or “challenge,” Gong is quick to point out) for more people. The thinking goes that data quality tools more people can use and understand will help them spot limitations or gaps in the data, and fix them.

“The broader question is, how does everyone in the organization get to a point where they trust what the data does and what it is trying to do,” he said. “The engineering team might trust it but it might not be aligned with other teams. It doesn’t matter if it’s correct, it’s still doubting that data is fit for the purpose I want to use it for.”

Even without a commercial product, Salt Lake City-based Superconductive is getting a lot of attention from high places. Tiger Global is leading the round, with previous backers Index, CRV, and Root Ventures also participating. The company is not disclosing its valuation, but we understand that the dilution is less than 15%, which puts the valuation at over $267 million.

The funding comes less than a year after Superconductive raised a $21 million Series A in May 2021. Part of the reason investors have come knocking so soon after the last round is the strong traction for its open source tools.

Great Expectations is currently seeing over 2.5 million monthly downloads (closer to 3 million, Gong told me), while its community on Slack has now crossed 6,000 members (the downloads are based on machines running Great Expectations, while the Slack users are engineers actively working with the tools). Companies adopting it include Vimeo, Heineken, Calm, and Komodo Health; it also finds its way into use via ecosystem partners such as Databricks, Astronomer, and Prefect.

Great Expectations got its start when Gong and his co-founder James Campbell — both computer scientists with decades of experience between them — were initially building tools to address data quality for organizations working in healthcare. They eventually pivoted the business to tackle the bigger opportunity: the issues healthcare organizations faced were the same as those faced by companies in other verticals.

The crux of the matter is that when engineers build analytics or other tooling to work with data, they may not take into account whether the data being ingested is in the right state to be used correctly (for example, are dates entered in a consistent format, and if not, how should they be normalized?). They may also not have considered the different ways users of the analytics might end up using them. For instance, what happens when an end-of-month analytics dashboard is suddenly looked at in the middle of the month? Will the insights still be consistent, or will they throw people off completely because of how the formulas and processes have been set up?

“By the end of month, the numbers would be correct, you might see a drop in sales in mid-month,” Gong said. “The engineering team might say that it’s correct because the system is still calculating, but from a business perspective a lot might get confused, even if the system is working correctly.”

Great Expectations sets out to “fix” these situations with tools that help set parameters on data to ensure it stays consistent and at the same level of quality. The so-called “expectations” in its repository — some built by Superconductive, many by the community — are declarative statements written to make sense to humans while also being executable by machines, which do the work behind the commands.
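To make the idea concrete, here is a minimal sketch of what such expectations look like in Python, using the Pandas-backed Great Expectations API from around the time of this round (the exact interface has evolved across versions). The file name and column names are hypothetical.

```python
import great_expectations as ge

# Load a hypothetical orders CSV into a Great Expectations-aware DataFrame.
orders = ge.read_csv("orders.csv")

# Declarative checks: readable by humans, executable by the library.
orders.expect_column_values_to_not_be_null("order_id")
orders.expect_column_values_to_match_strftime_format("order_date", "%Y-%m-%d")
orders.expect_column_values_to_be_between("amount", min_value=0)

# Run the accumulated suite and report whether the data met every expectation.
results = orders.validate()
print(results.success)
```

Because each check is a plain statement about the data rather than custom validation code, the suite doubles as human-readable documentation of what “good” data is supposed to look like.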

Superconductive cites figures from Gartner that support the idea of data quality being a growing issue for organizations. The analysts estimate that poor data quality currently costs organizations an average of $12.9 million annually, both because the data hasn’t performed as it should and because of the decisions that poor data has led to. Gartner predicts that this year, 70% of organizations will turn to tracking data quality levels to address this.

That also means Superconductive has competition. Companies such as Microsoft, SAS, and Talend have built data quality tools as a complement to the other data services they provide. Gong also said that a lot of companies build “homegrown” solutions, although these can run into limitations, as internal tools often do. Superconductive believes it has a lot of opportunity in the space for a few different reasons.

First is the fact that it already has a large community using its open source tools, which becomes a funnel for users of the commercial product. Second is that it is dedicated specifically to the task of data quality.

“Others tend to slice it differently,” he said. “Sometimes you hear about data quality in the context of data observability and so it’s focused on engineers and not looking at the wider role. We see ourselves as different, a bottom-up open solution looking at the broader scope of this as our mission, not just an engineering problem.”

Investors, especially those who have experienced the pain points of debugging software firsthand and know that the same issues exist with data, seem to agree.

“The vision was simple, yet ambitious: to create a single place to observe, monitor, and collaborate on the quality of your data, at any level of granularity, on any system,” Bryan Offutt of Index Ventures wrote at the time of their first investment in the company in 2021. “By giving data teams an end-to-end way to monitor quality from pipeline to production, Abe wanted to bring the same ability to pinpoint and resolve issues that exists in traditional software to the world of data. Finally, data teams could catch issues before they made their way to end users. It was as if Abe had read the book on every single problem I had experienced as an engineer working on data pipelines. It felt like the data world had its own DataDog.”
