Overview

Hosted By: Tony Hajdari, Technical Program Manager, Professional Services, Kinetica

Mastering stream processing allows organizations to react quickly to current conditions, which means getting data into your analytics platform quickly. Many organizations choose Confluent or Apache Kafka to support their event streaming pipeline. Adding Kinetica as the destination lets you perform high-scale, in-the-moment analytics on that streaming data with the entire context of your business.

IN THIS TALK WE’LL SHOW YOU HOW TO:

  • Ingest data into Kinetica
  • Set up Confluent or Apache Kafka
  • Build the Kinetica Kafka connector
  • Configure the connector
  • Tune the connector to further increase throughput
  • Write events from Kinetica to a Kafka topic
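As a rough illustration of the configuration step above, the Kinetica Kafka connector is set up like any other Kafka Connect sink, via a properties file. The property names and class name below follow the open-source Kinetica Kafka connector project but may differ by version, so treat this as a sketch and verify against the connector build you produce:

```properties
# Hypothetical Kafka Connect sink config for Kinetica (names are assumptions;
# check the connector README for your build).
name=kinetica-sink
connector.class=com.kinetica.kafka.KineticaSinkConnector
topics=orders
tasks.max=4                  # raising parallel tasks is a common throughput lever
kinetica.url=http://localhost:9191
kinetica.username=admin
kinetica.password=admin
kinetica.collection_name=kafka_ingest
kinetica.batch_size=10000    # larger batches generally improve ingest throughput
```

Tuning for throughput typically involves the batch size and task count shown here, alongside standard Kafka Connect worker settings.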

Watch Recording