[Video] Architectural Considerations for Using Hot and Cold Data on HDFS


Using Hot and Cold Data on HDFS

We are excited to bring you a video recording of a recent meetup presentation by Esgyn experts on Hot and Cold Data on HDFS.


Hot data is the data you want to access quickly (mostly for reporting purposes). Cold data is the data you access infrequently (mostly for Business Intelligence (BI) or analytics purposes). Hot data can be identified by data volume (e.g., the latest 100 GB), by a specific time period (e.g., the last day or week), or as a specific set of data.
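As a rough illustration of the time-based criterion, here is a minimal sketch (the 7-day window and record timestamps are hypothetical) that tags a record as hot or cold depending on when it was written; a volume-based or explicit-set policy would follow the same pattern.

```python
from datetime import datetime, timedelta, timezone
from typing import Optional

# Hypothetical policy: records written within the last 7 days count as "hot".
HOT_WINDOW = timedelta(days=7)

def classify(record_ts: datetime, now: Optional[datetime] = None) -> str:
    """Return 'hot' for recently written records, 'cold' for everything else."""
    now = now or datetime.now(timezone.utc)
    return "hot" if now - record_ts <= HOT_WINDOW else "cold"

# A record from two days ago is hot; one from last month is cold.
print(classify(datetime.now(timezone.utc) - timedelta(days=2)))   # -> hot
print(classify(datetime.now(timezone.utc) - timedelta(days=30)))  # -> cold
```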


Enterprises often use different platforms to store and manage hot data and cold data. Using separate databases requires moving or duplicating data from one platform to another. Moving data between databases for different purposes, such as reporting and BI/analytics, delays reporting and creates ongoing maintenance work whenever schemas change.


With the right infrastructure on the Hadoop Distributed File System (HDFS), you can avoid data duplication and movement and store hot and cold data on the same platform.


EsgynDB is built on Hadoop so that you can store both hot and cold data in one place and minimize data movement. The advantages are a smaller server footprint, lower licensing and support costs, and, most importantly, real-time insights across hot and cold data together.
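To make that concrete, here is a minimal sketch, assuming an ODBC data source named esgyndb and a hypothetical time-stamped ORDERS table: because hot and cold data share the same HDFS-backed platform, a single query can aggregate across both tiers with no export/import step in between.

```python
import pyodbc  # any ODBC/DB-API bridge would do; the DSN below is hypothetical

# Hypothetical connection to the SQL-on-Hadoop engine.
conn = pyodbc.connect("DSN=esgyndb;UID=analyst;PWD=secret")
cursor = conn.cursor()

# One statement reads hot (last 7 days) and cold (older) rows together,
# since both live in the same HDFS-backed table -- no data movement required.
cursor.execute("""
    SELECT tier, COUNT(*) AS orders, SUM(order_amount) AS revenue
    FROM (
        SELECT CASE WHEN order_ts >= CURRENT_TIMESTAMP - INTERVAL '7' DAY
                    THEN 'hot' ELSE 'cold' END AS tier,
               order_amount
        FROM orders
    ) t
    GROUP BY tier
""")
for tier, orders, revenue in cursor.fetchall():
    print(tier, orders, revenue)
```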


Esgyn’s Rao Kakarlamudi and Hans Zeller recently spoke on this topic at the Milpitas Big Data meetup, and we are excited to present the video recording.


April 19, 2019

About the Author:

Ken is an experienced software development leader with deep knowledge of building and delivering enterprise-class database products and supporting platforms from the ground up. Prior to Esgyn, Ken held a variety of positions at HP, ranging from Director roles in Product Management, QA, and Development to Chief of Staff. Most recently, Ken played a key role in launching Apache Trafodion as an open source project in collaboration with HP Labs, and has the dubious honor of giving the project its name. Ken earned his Bachelor’s degree in Physics from the University of Wales with a specialty in Computing Physics and has been a lifelong student of technology ever since.