Data Lakes: 8 Enterprise Data Management Requirements


Metadata and Governance

Two areas where data lake technologies such as Hadoop remain less mature are metadata and governance. Metadata here covers schema as well as the record of updates and access requests. Conventional relational data warehouses provide these capabilities as a matter of course, since updates are more easily tracked and schemas are more constrained.

Open source work on metadata and governance is progressing, but there is no widespread agreement on a particular implementation. For example, Apache Sentry helps enforce role-based authorization for Hadoop data, but it works with some, not all, Hadoop tools.
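To make the role-based model concrete, here is a minimal sketch of Sentry's file-based policy provider, which maps groups to roles and roles to privileges. The group, role, server, database, and table names below are hypothetical, chosen only for illustration:

```ini
; sentry-provider.ini -- illustrative policy file; all names are hypothetical
[groups]
; map an OS/LDAP group to one or more Sentry roles
analysts = analyst_role

[roles]
; grant the role read-only (SELECT) access to a single table
analyst_role = server=server1->db=sales->table=orders->action=select
```

Note that a policy like this is only enforced by engines that consult Sentry; tools that read the underlying HDFS files directly bypass it, which is the coverage gap described above.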

Enterprises looking to better manage metadata and governance currently build custom solutions or simply live with limited functionality. Recently, LinkedIn open sourced an internal tool called WhereHows that may improve the ability to collect, discover, and understand metadata in the data lake. Expect commercial data integration providers to develop new ways to manage metadata and governance in the enterprise data lake as well.

2016 is the year of the data lake. It will surround and, in some cases, drown the data warehouse, and we'll see significant technology innovations, methodologies and reference architectures that turn the promise of broader data access and Big Data insights into reality. But Big Data solutions must mature beyond being primarily developer tools for highly skilled programmers. The enterprise data lake will allow organizations to track, manage and leverage data they've never had access to before. New data management strategies are already enabling more predictive and prescriptive analytics, which in turn drive improved customer-service experiences, cost savings and an overall competitive advantage when aligned with key business initiatives.

So whether your enterprise data warehouse is on life support or moving into maintenance mode, it will most likely continue to do what it's good at for the time being: operational and historical reporting and analysis (a.k.a. rear-view mirror).

As you consider adopting an enterprise data lake strategy to manage more dynamic, poly-structured data, your data integration strategy must also evolve to handle the new requirements. Thinking that you can simply hire more developers to write code, or rely on your legacy rows-and-columns-centric tools, is a recipe for sinking in a data swamp instead of swimming in a data lake. In this slideshow, Craig Stewart, VP of product management at SnapLogic, identifies eight enterprise data management requirements that must be addressed to get maximum value from your Big Data technology investments.
