Spring Cloud Data Flow 2.2 offers better task execution control

Spring Cloud Data Flow (SCDF), a tool suite for streaming and batch data processing, is now generally available in version 2.2.

The new version lets users stop and delete task executions from the SCDF dashboard and the integrated shell. For tasks composed of batch jobs, only the parent task can be deleted, and the corresponding execution data is removed along with it.
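
For those who prefer scripting such operations, the snippet below sketches how the stop and cleanup calls might look against the Data Flow server's REST API using Spring's RestTemplate. The localhost:9393 address, the execution id, and the exact endpoint paths and action parameter are assumptions based on the SCDF REST API documentation, not code shipped with the release.

```java
import org.springframework.web.client.RestTemplate;

public class TaskExecutionAdmin {

    // Assumed default address of a locally running Data Flow server.
    private static final String SERVER = "http://localhost:9393";

    public static void main(String[] args) {
        RestTemplate rest = new RestTemplate();

        // Ask the server to stop the (hypothetical) task execution with id 42.
        rest.postForEntity(SERVER + "/tasks/executions/{id}", null, Void.class, 42);

        // Delete the execution and remove its persisted execution data
        // (the action parameter is an assumption based on the REST API docs).
        rest.delete(SERVER + "/tasks/executions/{id}?action=CLEANUP,REMOVE_DATA", 42);
    }
}
```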

The delete feature also serves to clean up task executions, so the UI has gained a corresponding cleanup action. In addition, the task execution detail view now includes a section that displays the logging output of an execution on the Local, Kubernetes, and Cloud Foundry platforms, which should help when troubleshooting or hunting for bottlenecks.
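
Outside the dashboard, the same execution details can be fetched from the server's REST API. A minimal sketch, again assuming a local server and a made-up execution id:

```java
import org.springframework.web.client.RestTemplate;

public class TaskExecutionDetails {

    public static void main(String[] args) {
        RestTemplate rest = new RestTemplate();

        // Fetch the details of task execution 42 as raw JSON
        // (endpoint path assumed from the SCDF REST API documentation).
        String json = rest.getForObject(
                "http://localhost:9393/tasks/executions/{id}", String.class, 42);
        System.out.println(json);
    }
}
```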

If you’d like to use Apache Kafka, RabbitMQ, or Amazon Kinesis in your setup but don’t know where to start, the Data Flow team has added recipes for configuring and managing stream applications with those tools. Additional recipes for Kubernetes and Cloud Foundry are meant to help with file ingestion and ETL (extract, transform, load) processing. A set of new acceptance tests is intended to improve stability across the supported platforms.
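
As a rough illustration of managing a stream against the server, the following sketch defines and deploys a simple http | log pipeline, which runs over whichever binder (Kafka, RabbitMQ, or Kinesis) the deployed applications are configured for. The stream name and the form parameter names are assumptions based on the SCDF REST API documentation and should be checked against your server version.

```java
import org.springframework.http.HttpEntity;
import org.springframework.util.LinkedMultiValueMap;
import org.springframework.util.MultiValueMap;
import org.springframework.web.client.RestTemplate;

public class StreamSetup {

    public static void main(String[] args) {
        RestTemplate rest = new RestTemplate();

        // Define a stream named "http-ingest" that pipes an http source into a
        // log sink, and deploy it immediately (parameter names assumed from the docs).
        MultiValueMap<String, String> form = new LinkedMultiValueMap<>();
        form.add("name", "http-ingest");
        form.add("definition", "http | log");
        form.add("deploy", "true");

        rest.postForEntity("http://localhost:9393/streams/definitions",
                new HttpEntity<>(form), String.class);
    }
}
```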

Spring Cloud Data Flow is open source and licensed under the Apache License 2.0. It is made up of a core domain module that includes the concepts of streams and tasks, an app registry, a shell, and a Data Flow server that provides a REST API and UI. The processing pipelines built with SCDF consist of Spring Boot apps that use the Spring Cloud Stream or Spring Cloud Task frameworks.
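
The task side of such a pipeline can be as small as a Spring Boot application annotated with @EnableTask from Spring Cloud Task. A minimal example of what such an app can look like:

```java
import org.springframework.boot.CommandLineRunner;
import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.cloud.task.configuration.EnableTask;
import org.springframework.context.annotation.Bean;

// @EnableTask makes Spring Cloud Task record this app's start, end, and exit
// code as a task execution, which SCDF can then launch, track, stop, and delete.
@SpringBootApplication
@EnableTask
public class HelloTaskApplication {

    @Bean
    public CommandLineRunner runner() {
        // The actual work of the task: a single short-lived action.
        return args -> System.out.println("Hello from a Spring Cloud Task");
    }

    public static void main(String[] args) {
        SpringApplication.run(HelloTaskApplication.class, args);
    }
}
```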