Session
From 1TB to 1PB: Scaling GDPR-Native Pipelines with Argo Workflows
Scaling data processing from terabytes to petabytes is a technical challenge; doing it while strictly adhering to GDPR is an architectural one. In this session, I will share the evolution of our sovereign data platform, detailing how we transitioned from simple batch jobs to a complex, event-driven architecture that processes petabytes of data without leaving European soil. We will dive into the design of our "Compliance Car Wash", a mandatory ingestion pattern where every data stream is automatically scanned, masked, and anonymised by Argo Workflows before hitting persistent storage. Learn the hard-won lessons of scaling storage, optimising Argo for high throughput, and baking compliance directly into the infrastructure code, ensuring that as your data grows, your risk profile doesn't.
Arya Soni
DevOps & SRE | Kubernetes & Multi-Cloud Architect (AWS/GCP) | Reduced Cloud Costs by 40% | Infrastructure as Code (Terraform) | CI/CD | MLOps
Gurugram, India