Develop data-heavy workflows
Develop data-intensive, cloud-native applications in the environment they are designed to run in: your cluster, in the cloud.
Applications may exhibit unexpected behavior when dealing with large volumes of data, so it is essential to validate your code against extensive data sets before deploying it to production. In realistic settings, you can assess data consistency and the impact of data changes or substantial data sets on your microservices. This is particularly important if your microservice depends on databases, caches, or external data sources such as S3 buckets, Postgres databases, and queues (Kafka, RabbitMQ, etc.), as these systems can introduce challenges that cannot be reproduced with mocks.
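To illustrate why mocks fall short, here is a minimal, hypothetical sketch: a function that passes its test against hand-crafted mock rows, but breaks on the kind of data a real database actually returns (the function name, table, and data are invented for the example; `unittest.mock` is from the Python standard library):

```python
from unittest.mock import MagicMock

# Hypothetical function under test: computes the average order value
# from rows returned by a database cursor.
def average_order_value(cursor):
    cursor.execute("SELECT total FROM orders")
    totals = [row[0] for row in cursor.fetchall()]
    return sum(totals) / len(totals)

# Against a mock with tidy, hand-crafted rows, the happy path passes.
mock_cursor = MagicMock()
mock_cursor.fetchall.return_value = [(10.0,), (20.0,)]
assert average_order_value(mock_cursor) == 15.0

# Real production data often contains NULL totals, which arrive as
# None and break the arithmetic -- an edge case the mock never had.
mock_cursor.fetchall.return_value = [(10.0,), (None,), (20.0,)]
try:
    average_order_value(mock_cursor)
    survived = True
except TypeError:
    survived = False  # sum() cannot add float and None

print("passes on mock data, fails on realistic data:", not survived)
```

The point is not that mocks are useless, but that they only contain the cases you thought of; validating against real cluster data surfaces the cases you didn't.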
Most developers have three options:
- Connect to a remote database and deal with latency.
- Download large datasets to the local machine and keep the local data schemas synced with the latest ones from the remote environment.
- Deploy code changes through the pipeline to a remote environment that provides the necessary data, a process that takes a long time.
Whichever option they choose, developers end up with slow feedback loops.
With Velocity (Fast feedback loop):
Developers can validate code against real cluster data quickly, without slow, large data transfers. Testing occurs against live datasets for fast feedback, and any schema changes made in the existing environment automatically apply to the service under development.
From a security standpoint, data remains within the organization's cloud account; no personally identifiable information (PII) is transferred to local laptops.
Getting Started with Argo Workflows
Argo Workflows is a Kubernetes-native workflow engine that runs all kinds of workflows on Kubernetes by leveraging native resources, such as Pods, to execute the individual steps of a workflow. Workflows can be kicked off in many different ways, which – along with the enormous range of customizations possible – is what makes Argo Workflows such a versatile tool for running cloud-native workflows.
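As a concrete illustration, the manifest below is a minimal single-step Workflow in the style of the Argo Workflows hello-world quick-start example: one template runs one container, which Argo executes as a Kubernetes Pod (the names and image follow the upstream example and are illustrative):

```yaml
apiVersion: argoproj.io/v1alpha1
kind: Workflow
metadata:
  generateName: hello-world-    # Argo appends a random suffix per run
spec:
  entrypoint: whalesay          # the template to start with
  templates:
  - name: whalesay
    container:                  # this step runs as a Kubernetes Pod
      image: docker/whalesay
      command: [cowsay]
      args: ["hello world"]
```

Submitting it with the Argo CLI, e.g. `argo submit --watch hello-world.yaml`, creates a Pod for the step and streams the workflow's status until it completes.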