[Webinar Archive] Delivering Hybrid Transactional and Analytical Processing (HTAP) on Hadoop


Hybrid Transactional/Analytical Processing (HTAP) on Hadoop

As businesses become more agile, the need for real-time and near-real-time analysis of transactional data has become more important than ever. For database veterans, transactions and analytics have always lived on two different systems. This siloed approach has resulted in expensive ETL processes, specialized data marts, SLA issues, and, most importantly, analytics on stale data.


The latest architectural trends increasingly point toward enabling both transactions and analytics on the same data store. Gartner calls this capability of delivering transactions, analytics, and mixed workloads on a single data store Hybrid Transactional/Analytical Processing (HTAP).

Per Gartner, there are two types of HTAP: in-process HTAP and point-of-decision HTAP. The business demand for HTAP has always existed, but technology limitations have so far prevented IT from delivering it.
  • How can we change this scenario and help businesses become real-time?
  • As the adoption of Hadoop and Big data continues, is there a way to leverage that infrastructure to achieve this database nirvana?
  • Is it necessary to go all the way to in-memory computing (IMC) or can we leverage intermediate steps such as caching? 

You will learn:

  • What is Hybrid Transactional/Analytical Processing?
  • How does HTAP differ from existing data platforms and architectures?
  • How do in-process HTAP and point-of-decision HTAP differ?
  • Why does Hadoop open new opportunities for HTAP?
  • What business benefits can one expect from HTAP?
  • What approaches are required to implement HTAP?
  • Which workloads are pertinent for HTAP?
  • How can HTAP be extended to structured, semi-structured, and unstructured data?

Who should watch?

  • CIOs, Data Officers
  • IT managers
  • Big Data and solution architects

Expert Panel

Rohit Jain, CTO 

Rohit is a database industry veteran with over three decades of experience. He is a co-founder and CTO of Esgyn Corporation, where he drives the technology and architectural vision for EsgynDB, which is based on the open source Apache Trafodion project. Rohit is an industry expert in HTAP and the author of Database Nirvana – Delivering Hybrid Transactional/Analytical Processing, published by O’Reilly Media. Before co-founding Esgyn, Rohit held senior technical leadership roles at Hewlett-Packard (HP) and Tandem, delivering innovative database products used by global enterprises such as Walmart and Wells Fargo.
Rohit holds a Master of Business Administration from the University of Michigan, Ann Arbor, and lives in Austin, TX.

Rao Kakarlamudi, Head of Pre-sales and Principal Architect

Rao Kakarlamudi is Head of Pre-sales and Principal Architect at Esgyn, responsible for pre-sales and Proof-of-Concept (POC) programs. Rao has been with Esgyn since its founding. He began his career with Tandem Computers in the Tandem NonStop SQL product group, which eventually became responsible for delivering Enterprise Data Warehouse (EDW) products within Hewlett-Packard (HP). Rao has over two decades of experience designing, architecting, and building scalable database platforms for both on-premises and cloud environments.
Rao holds a Master’s degree in Computer Science from the University of Nevada and has spent his career building highly scalable, mission-critical transaction systems used by global enterprises.

About the Author:

Rohit Jain is Esgyn's Chief Technology Officer. Rohit has worn many hats in his career, including solutions architect, database consultant, developer, development manager, and product manager. Prior to joining Esgyn, Rohit was a Chief Technologist at Hewlett-Packard for SeaQuest and Trafodion. In his 39 years in applications and databases, Rohit has driven pioneering efforts in massively parallel processing and distributed computing solutions for both operational and analytical workloads.