In this session of our Azure Databricks Training series, we'll teach you how to connect directly to streaming data sources such as TCP/IP sockets and the Kafka messaging system, transform and output the data, and finally create compelling, continuously updated visualizations to drive greater impact for your teams.

During this session we will cover:

  • Connecting to TCP/IP sockets and Kafka as streaming sources
  • Using the DataFrame API to transform streaming data
  • Writing the results to various sinks
  • Using Databricks' visualization features to create a continuously updated view of the processed streaming data
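The overall flow covered above can be sketched with Spark Structured Streaming's DataFrame API. The snippet below is a minimal illustration, not the session's exact notebook code: the host/port for the socket source, the Kafka broker address, and the topic name `events` are placeholder assumptions you would replace with your own values.

```python
def socket_wordcount_stream():
    """Sketch: read lines from a TCP socket, count words, write to a sink.

    Assumes a text server is listening on localhost:9999 (e.g. `nc -lk 9999`).
    """
    from pyspark.sql import SparkSession
    from pyspark.sql.functions import explode, split

    spark = SparkSession.builder.appName("StreamingDemo").getOrCreate()

    # Source: each socket line arrives as a row with a single `value` column.
    lines = (spark.readStream
                  .format("socket")
                  .option("host", "localhost")
                  .option("port", 9999)
                  .load())

    # Transform: split lines into words and aggregate counts with the DataFrame API.
    words = lines.select(explode(split(lines.value, " ")).alias("word"))
    counts = words.groupBy("word").count()

    # Sink: the in-memory sink keeps results queryable (and chartable) from the
    # notebook; swap `format` for "kafka", "parquet", etc. for other sinks.
    query = (counts.writeStream
                   .outputMode("complete")
                   .format("memory")
                   .queryName("word_counts")
                   .start())
    return query


def kafka_source_stream():
    """Sketch: the same pipeline shape with Kafka as the source.

    Broker address and topic name ("events") are illustrative placeholders.
    """
    from pyspark.sql import SparkSession

    spark = SparkSession.builder.appName("StreamingDemo").getOrCreate()

    # Kafka rows expose binary `key`/`value` columns; cast `value` to a string.
    events = (spark.readStream
                   .format("kafka")
                   .option("kafka.bootstrap.servers", "localhost:9092")
                   .option("subscribe", "events")
                   .load())
    return events.selectExpr("CAST(value AS STRING) AS value")
```

In a Databricks notebook, running `display()` on the streaming DataFrame (or querying the `word_counts` in-memory table) renders a chart that refreshes as new micro-batches arrive.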