Programming Language Support
In addition to its low-code and no-code approach, Sparkflows also allows professional developers to apply their programming expertise.
Developers can write in the language of their choice, including Python, Scala, SQL, and Jython. For an organization, this provides a double advantage: it can utilize its current development talent while also putting development-like capabilities in the hands of business users.
Using programming nodes, Sparkflows lets organizations reuse existing code and cut down migration effort, so project teams can focus their time on newer AI projects.
Use Custom Code
Sparkflows allows programmers to read custom files, images, and videos and build the data preparation logic to be fed into a custom state-of-the-art machine learning model, such as a UNet or a diffusion model. All of this is possible because Sparkflows provides the user with an interface to write code for complex research use cases. This custom logic can later be added as a node in Sparkflows and used as a reusable artifact in Self-Service mode.
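As a minimal sketch of the kind of custom data-preparation logic such a node might hold, the example below center-crops an image array and scales its pixel values before handing it to a downstream model. The function name, crop size, and normalization are illustrative assumptions, not part of Sparkflows.

```python
import numpy as np

def prepare_image(img: np.ndarray, size: int = 224) -> np.ndarray:
    """Center-crop to size x size and scale pixel values to [0, 1].

    Illustrative preprocessing only; a real pipeline would match the
    target model's expected input shape and normalization scheme.
    """
    h, w = img.shape[:2]
    top = max((h - size) // 2, 0)
    left = max((w - size) // 2, 0)
    cropped = img[top:top + size, left:left + size]
    return cropped.astype(np.float32) / 255.0

# Example: a synthetic 256x256 RGB image in place of a real input file
image = np.random.randint(0, 256, (256, 256, 3), dtype=np.uint8)
prepared = prepare_image(image)
print(prepared.shape)  # (224, 224, 3)
```

Packaged as a node, logic like this becomes a reusable preprocessing step that business users can drop into flows without touching the code.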
Reuse Existing Code
Programmers can leverage the Scala/PySpark node to reuse or write business logic that will run in a distributed manner. This is especially important when the data transformation needed is very specific to a use case, whether in a data pipeline or for building features while developing a machine learning pipeline.
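A minimal sketch of such use-case-specific business logic: a plain Python function that a PySpark node could register as a UDF and apply across a distributed DataFrame. The column name, tier labels, and thresholds are hypothetical.

```python
def spend_tier(total_spend: float) -> str:
    """Map a customer's total spend to a discount tier (hypothetical rule)."""
    if total_spend >= 10_000:
        return "gold"
    if total_spend >= 1_000:
        return "silver"
    return "standard"

# Inside a PySpark node, this logic could be applied in a distributed
# manner along these lines (sketch, assuming a DataFrame `df` with a
# `total_spend` column):
#
#   from pyspark.sql.functions import udf
#   from pyspark.sql.types import StringType
#   df = df.withColumn("tier", udf(spend_tier, StringType())(df["total_spend"]))

print(spend_tier(2_500))  # silver
```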
Build Using SDK
Using the Sparkflows SDK, a programmer can extend Sparkflows' capabilities and experience by encapsulating an organization's specific business rule or logic inside a custom node.
This custom node is then available for reuse by the organization's business users via the Sparkflows interface.
With Sparkflows, organizations can now build custom nodes and have an efficient way to make them available as Self-Service nodes while still leveraging the scalability and performance of Sparkflows' standard nodes.
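To illustrate the general shape of encapsulating a business rule in a reusable node, the sketch below wraps a configurable rule in a class with a single execution entry point. This is purely hypothetical and does not reflect the actual Sparkflows SDK API; the class name, constructor parameter, and method signature are all assumptions for illustration.

```python
class CustomRuleNode:
    """Hypothetical node-like wrapper for an organization-specific rule.

    A real Sparkflows custom node would follow the SDK's own interfaces;
    this only shows the encapsulation idea.
    """

    def __init__(self, threshold: float):
        # In a real node, this would be configurable from the node's UI form.
        self.threshold = threshold

    def execute(self, rows):
        """Apply the rule to each input row, adding a boolean flag."""
        return [dict(row, flagged=row["amount"] > self.threshold) for row in rows]

node = CustomRuleNode(threshold=100.0)
out = node.execute([{"amount": 50.0}, {"amount": 250.0}])
print(out[1]["flagged"])  # True
```

Once registered through the SDK, a node like this appears in the Sparkflows palette, so business users can apply the rule without seeing the code.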