One of the strong points of Sparkflows is that its Nodes/Building Blocks can be used in both batch and streaming jobs.
This means users can easily turn a streaming workflow into a batch one and vice versa. With over 100 Building Blocks available, this becomes very powerful, since each block can be applied to both batch and streaming data.
Sparkflows achieves this by having every Node process data as DataFrames. During streaming analytics, the incoming DStream is converted into DataFrames, which are then run through the workflow execution engine.
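The core idea above can be illustrated with a small, language-agnostic sketch. This is not Sparkflows code: the node functions, `run_batch`, and `run_streaming` below are hypothetical names, and plain lists of dicts stand in for DataFrames. The point it demonstrates is that the same pipeline of building blocks runs unchanged over a full batch or over a sequence of streaming micro-batches.

```python
from typing import Callable, Dict, Iterable, Iterator, List

# A "DataFrame" is mocked here as a list of rows; a Node is any
# transformation from one such table to another.
Row = Dict[str, int]
Node = Callable[[List[Row]], List[Row]]

def filter_positive(rows: List[Row]) -> List[Row]:
    """Example building block: keep only rows with a positive value."""
    return [r for r in rows if r["value"] > 0]

def double(rows: List[Row]) -> List[Row]:
    """Example building block: double each row's value."""
    return [{**r, "value": r["value"] * 2} for r in rows]

def run_pipeline(nodes: List[Node], table: List[Row]) -> List[Row]:
    """Push one table through every node in order."""
    for node in nodes:
        table = node(table)
    return table

def run_batch(nodes: List[Node], dataset: List[Row]) -> List[Row]:
    """Batch mode: the whole dataset is one table."""
    return run_pipeline(nodes, dataset)

def run_streaming(nodes: List[Node],
                  micro_batches: Iterable[List[Row]]) -> Iterator[List[Row]]:
    """Streaming mode: each incoming micro-batch (the DStream's chunks,
    in Spark terms) is converted to a table and sent through the SAME
    pipeline used for batch execution."""
    for mb in micro_batches:
        yield run_pipeline(nodes, mb)

pipeline = [filter_positive, double]

batch_result = run_batch(pipeline, [{"value": 1}, {"value": -2}, {"value": 3}])
stream_result = list(run_streaming(
    pipeline, [[{"value": 1}], [{"value": -2}, {"value": 3}]]))
```

Because both modes funnel into the same `run_pipeline`, any building block written once is automatically usable in either context, which is the design property the paragraph above describes.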