
Structured Streaming in Fire


Overview

We are excited to announce that Fire now supports Structured Streaming. You can connect to Kafka, apply transforms and analytics, and store the results in another data store.

As the number of IoT devices and other streaming data sources has increased, Stream Processing and Analytics are coming up in more and more use cases. However, building and maintaining Streaming Jobs is hard and messy.

Fire makes it seamless to build Streaming Jobs and have them running in minutes.

Structured Streaming Workflow

Below is a Structured Streaming Workflow in Fire. It does the following:

  • Reads streaming data from Kafka

  • Splits the incoming text with a separator to create more columns

  • Saves the data to HDFS/S3 as Parquet files
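To make the middle step concrete, here is a minimal sketch in plain Python of what the split step does: take a raw text record from Kafka and break it into named columns using a separator. This is an illustration only, not Fire's implementation; the function name, column names, and sample record are hypothetical.

```python
# Hypothetical illustration of Fire's "split text" processor:
# one raw text line in, a dict of named columns out.
def split_to_columns(line, separator=",", column_names=None):
    """Split one raw text record into a dict of named columns."""
    parts = line.split(separator)
    if column_names is None:
        # Fall back to generated column names: col0, col1, ...
        column_names = [f"col{i}" for i in range(len(parts))]
    return dict(zip(column_names, parts))

# Example: a comma-separated sensor reading (sample data, made up)
record = split_to_columns("sensor42,23.5,2021-06-01",
                          column_names=["device", "temp", "ts"])
print(record)  # {'device': 'sensor42', 'temp': '23.5', 'ts': '2021-06-01'}
```

In the actual workflow this transform runs on each micro-batch of the stream, and the resulting columns are what get written out as Parquet files.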


Structured Streaming Processors

Below is the initial list of Structured Streaming Processors for reading and writing streaming data.


Structured Streaming Job Execution

Structured Streaming Jobs are executed like any other workflow in Fire. The only difference is that they continue to run until you stop or kill them.
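The "run until stopped" behavior can be sketched in plain Python: the job sits in a loop consuming records until a stop signal arrives. This is a loose analogy, not Fire's or Spark's actual execution model; the function and stop token are made up for illustration.

```python
import queue

# Hypothetical sketch of a long-running streaming job: keep processing
# incoming records until an explicit stop signal is received.
def run_streaming_job(source, process, stop_token="STOP"):
    """Consume records until the stop token is seen; return count processed."""
    processed = 0
    while True:
        record = source.get()
        if record == stop_token:
            break  # this is the "stop or kill" moment
        process(record)
        processed += 1
    return processed

# Feed a small in-memory queue as a stand-in for a live stream.
q = queue.Queue()
for r in ["a,b", "c,d", "STOP"]:
    q.put(r)
results = []
print(run_streaming_job(q, results.append))  # prints 2
```

A batch workflow, by contrast, would exit on its own once its input is exhausted.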

Below is a list of a few executions of a Structured Streaming Job.


Conclusion

With the latest release of Fire, you can have your Streaming Jobs running in minutes. Most of the 200+ processors in Fire work with Structured Streaming as well.

