Ignite Discussions : Ask Questions, Find Answers, Share Expertise about Sparkflows
In Sparkflows, we can use the “Apache Logs” processor to read and process log data. It reads a log file and loads it as a DataFrame; the DataFrame can then be used for further analysis.
To use the “Apache Logs” Processor:
Browse and select a log file from a location in the ‘Path’ field. The file at the specified path will be read and converted to a DataFrame for further processing.
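Conceptually, the processor turns each log line into a row of named fields. As a rough illustration (not Sparkflows internals), the sketch below parses a line in the Apache Common Log Format with plain Python; the regex and field names are illustrative assumptions:

```python
import re

# Common Log Format (CLF): host ident user [timestamp] "request" status size.
# This pattern and its field names are illustrative, not Sparkflows internals.
CLF_PATTERN = re.compile(
    r'(?P<host>\S+) (?P<ident>\S+) (?P<user>\S+) '
    r'\[(?P<timestamp>[^\]]+)\] "(?P<request>[^"]*)" '
    r'(?P<status>\d{3}) (?P<size>\S+)'
)

def parse_clf_line(line):
    """Parse one Common Log Format line into a dict of fields,
    or return None if the line does not match."""
    match = CLF_PATTERN.match(line)
    if match is None:
        return None
    record = match.groupdict()
    record["status"] = int(record["status"])
    # A size of "-" means no response body was sent.
    record["size"] = 0 if record["size"] == "-" else int(record["size"])
    return record

sample = ('127.0.0.1 - frank [10/Oct/2000:13:55:36 -0700] '
          '"GET /apache_pb.gif HTTP/1.0" 200 2326')
print(parse_clf_line(sample))
```

In Sparkflows itself no parsing code is needed: the processor does this per line and exposes the result as DataFrame columns.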
For more information, read the Sparkflows documentation here: