If you want to achieve an enterprise view of your data, your solution options basically fall into one of two camps:
- Move it and integrate.
- Leave it and virtualize.
Metanautix’s co-founder, Theo Vassilakis, contends that both add unnecessary complexity to enterprise data analytics.
“A lot of the times, that’s where the complexity comes from: Oh hold on, let me do a little Informatica here, let me do a little virtualization here, and let me do a little Teradata there,” Vassilakis said during a recent interview. “So, solving the same business problem, some of the data sets you’ll have to move and some of the data sets you’re not going to be able to move. Additionally, you end up having to do the moving with one system and then the querying with another system.”
Vassilakis spent nearly a decade as a principal engineer and engineering director at Google, where he worked on a wide range of data systems, including Dremel, the company's scalable ad hoc query engine that can access multiple types of data sources. Fellow co-founder Toli Lerios worked as an engineer at Facebook dealing with photo, image and video data.
They realized they were both working on different aspects of the same problem — Big Data. In 2012, they left to start Metanautix with the goal of building an enterprise-class solution.
What makes Metanautix's offering unique is that it's one solution that can adapt to your business needs. Sometimes, silos exist for a reason, Vassilakis points out, whether that's compliance, the cost of adding another license or simply political issues. Their solution addresses that, while still supporting integration or migration, he said.
Vassilakis and Lerios call it a data compute engine, and say it fits between the two edges of “move data, never move data.”
To be clear, this isn't a new analytics tool and it's certainly not storage. It's middleware that can run on-premises, in the cloud, or wherever you want it. "You shouldn't have to move your data to have to adapt to our business model," he adds.
They wanted to take the flexible data capabilities used at Facebook and Google and create a solution that was easy to deploy, manage and run in a traditional enterprise environment. It needed to scale, and they wanted to ensure that enterprises could easily leverage all data, whether it was from a database, tweets, Web pages, photos or video.
“We’re basically, instead of being best of breed in one of the categories or playing just in the virtualization space or just in the ETL space or just in the analytics space, what we’re trying to do is build an admittedly thinner solution, but that has elements of all three categories so that business users can go end to end,” Vassilakis said.
The company is a Tableau partner, so one use case it is targeting is joining data from Hadoop with Teradata, other enterprise applications or relational databases, and then serving it in Tableau, he explained.
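To make that use case concrete, here is a loose sketch in plain pandas of the kind of cross-source join being described. The table and column names are invented stand-ins (Metanautix's actual product and APIs are not shown here), and a real data compute engine would push the query out to Hadoop and Teradata rather than load everything into memory:

```python
# Illustrative sketch only -- not Metanautix's API. Two in-memory tables
# stand in for data that would normally live in separate silos.
import pandas as pd

# Stand-in for clickstream data that might sit in Hadoop.
clicks = pd.DataFrame({
    "customer_id": [1, 2, 2, 3],
    "page": ["home", "pricing", "docs", "home"],
})

# Stand-in for customer records that might sit in a Teradata warehouse.
customers = pd.DataFrame({
    "customer_id": [1, 2, 3],
    "segment": ["enterprise", "smb", "enterprise"],
})

# The join a data compute engine would perform across both sources;
# the combined result could then be served to a BI tool such as Tableau.
joined = clicks.merge(customers, on="customer_id", how="inner")
print(joined)
```

The point of the sketch is the shape of the problem, not the mechanics: two data sets that live in different systems are joined on a shared key and handed to an analytics front end, without first consolidating everything into one store.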
It’s all based on a simple, but for many, profound idea: The data is what matters.
“Google grew so fast — what I got to see in fast forward is many systems were born and died in rapid succession, but the data was always the same data,” he said. “And it moved in all those different systems, because really that was the business. I think that’s happening in the broadest way at organizations out there, but it’s happening maybe at a slower pace and it’s harder to see.”
Loraine Lawson is a veteran technology reporter and blogger. She currently writes the Integration blog for IT Business Edge, which covers all aspects of integration technology, including data governance and best practices. She has also covered IT/Business Alignment and IT Security for IT Business Edge. Before becoming a freelance writer, Lawson worked at TechRepublic as a site editor and writer, covering mobile, IT management, IT security and other technology trends. Previously, she was a webmaster at the Kentucky Transportation Cabinet and a newspaper journalist. Follow Lawson at Google+ and on Twitter.