Event Driven Data Streams | Connect

Helvetia Insurance Switzerland
Event-Driven Data Integration | Insurance

As part of the "Event Driven Data Streams | Connect" project, b-nova supported Helvetia Insurance Switzerland in building and evolving event-driven data streams between distributed systems. The focus was on reliably integrating heterogeneous applications via Apache Kafka, modeling topics and data flows with Avro schemas, and implementing robust interfaces for asynchronous communication based on Quarkus and Java. In addition, we took care of schema evolution, error handling, testing, monitoring, and GitOps-based deployment, and continuously improved the integration landscape through to production.
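
A minimal sketch of the core integration pattern, assuming Quarkus with SmallRye Reactive Messaging and Avro-generated record classes; the channel and type names (policy-events, PolicyEvent, PolicyNotification) are illustrative, not the actual Helvetia topics:

    package example.integration;

    import jakarta.enterprise.context.ApplicationScoped;
    import org.eclipse.microprofile.reactive.messaging.Incoming;
    import org.eclipse.microprofile.reactive.messaging.Outgoing;

    // PolicyEvent and PolicyNotification stand in for Avro-generated classes;
    // serializer/deserializer and schema-registry settings live in application.properties.
    @ApplicationScoped
    public class PolicyEventProcessor {

        // Consumes Avro-decoded events from the "policy-events" channel and
        // publishes a derived event on the "policy-notifications" channel.
        @Incoming("policy-events")
        @Outgoing("policy-notifications")
        public PolicyNotification transform(PolicyEvent event) {
            return PolicyNotification.newBuilder()
                    .setPolicyId(event.getPolicyId())
                    .setStatus(event.getStatus())
                    .build();
        }
    }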

Biggest challenge

Reliable integration of distributed systems through asynchronous data streams with consistent schema evolution and clearly defined integration boundaries

What we did

Design and implementation of event-driven integration logic with Kafka, Avro, and Quarkus, including GitOps deployment, monitoring, and operational support

Main tools we used

Kafka, Avro, Quarkus, Java, OpenAPI, GitHub Actions, ArgoCD, OpenShift

Tasks

Analysis of business requirements and close coordination with stakeholders to define integration goals and data flows
Identification and evaluation of source and target systems for event-driven integrations
Design of the topic architecture and partitioning strategy, and modeling of the data flows (see the topic sketch after this list)
Design and management of Avro schemas including schema registry and controlled schema evolution
Implementation of asynchronous integration logic and Kafka-based producer and consumer services with Quarkus
Definition and maintenance of API contracts using OpenAPI for synchronous interface connections
Setup of robust error handling with dead-letter queues, retry mechanisms, and traceability across data flows (see the error-handling sketch after this list)
Implementation of authentication and authorization using OIDC for secure service-to-service communication (see the OIDC sketch after this list)
Connection to existing backend and third-party systems via REST APIs for hybrid integration scenarios
Systematic testing of integration logic including contract tests and end-to-end validation (see the test sketch after this list)
Containerization and deployment of services on OpenShift with Kustomize for environment-specific configuration
Setup and maintenance of CI/CD pipelines with GitHub Actions and GitOps-based deployment via ArgoCD
Implementation of monitoring, logging, and alerting to track Kafka cluster health and data flow performance
Iterative optimization of the integration landscape based on operational experience and stakeholder feedback
Comprehensive documentation of architecture, operational processes, and knowledge transfer to internal teams
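
The partitioning and topic-architecture decisions can be illustrated with the Kafka Admin API; topic name, partition count, replication factor, and retention below are placeholder values, and in a GitOps setup such definitions would typically be kept in declarative manifests instead:

    import java.util.List;
    import java.util.Map;
    import java.util.Properties;
    import org.apache.kafka.clients.admin.AdminClient;
    import org.apache.kafka.clients.admin.AdminClientConfig;
    import org.apache.kafka.clients.admin.NewTopic;

    public class TopicSetup {
        public static void main(String[] args) throws Exception {
            Properties props = new Properties();
            props.put(AdminClientConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");

            try (AdminClient admin = AdminClient.create(props)) {
                // The partition count bounds consumer parallelism per consumer group;
                // replication factor and retention are operational decisions.
                NewTopic topic = new NewTopic("policy-events", 6, (short) 3)
                        .configs(Map.of("retention.ms", "604800000")); // 7 days
                admin.createTopics(List.of(topic)).all().get();
            }
        }
    }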
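
One possible shape of the dead-letter handling, as a sketch: a consumer that parks failing records on a separate topic so the main flow is not blocked. Channel names are illustrative; retries and the dead-letter strategy can also be configured directly on the SmallRye Kafka connector rather than in code:

    package example.integration;

    import java.util.concurrent.CompletionStage;
    import jakarta.enterprise.context.ApplicationScoped;
    import jakarta.inject.Inject;
    import org.eclipse.microprofile.reactive.messaging.Channel;
    import org.eclipse.microprofile.reactive.messaging.Emitter;
    import org.eclipse.microprofile.reactive.messaging.Incoming;
    import org.eclipse.microprofile.reactive.messaging.Message;

    @ApplicationScoped
    public class ClaimEventConsumer {

        // Emitter backed by a dedicated dead-letter channel (illustrative name).
        @Inject
        @Channel("claim-events-dlq")
        Emitter<String> deadLetter;

        @Incoming("claim-events")
        public CompletionStage<Void> consume(Message<String> message) {
            try {
                process(message.getPayload());
                return message.ack();
            } catch (Exception e) {
                // Park the payload on the DLQ and acknowledge the original record
                // so the partition keeps moving; the DLQ can then be monitored
                // and replayed later.
                deadLetter.send(message.getPayload());
                return message.ack();
            }
        }

        void process(String payload) {
            // Domain-specific validation, mapping, and forwarding would go here.
        }
    }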
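
For the OIDC-secured service-to-service communication, a minimal sketch of the receiving side, assuming the quarkus-oidc extension validates incoming bearer tokens; path and role name are placeholders:

    package example.integration;

    import jakarta.annotation.security.RolesAllowed;
    import jakarta.ws.rs.GET;
    import jakarta.ws.rs.Path;
    import jakarta.ws.rs.Produces;
    import jakarta.ws.rs.core.MediaType;

    // With quarkus-oidc configured, bearer tokens are validated against the
    // identity provider; only callers holding the (illustrative) role
    // "integration-service" reach this endpoint.
    @Path("/internal/policies")
    public class PolicyResource {

        @GET
        @RolesAllowed("integration-service")
        @Produces(MediaType.APPLICATION_JSON)
        public String list() {
            return "[]"; // a real implementation would return domain data
        }
    }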
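
And a sketch of the test side: a @QuarkusTest with REST Assured exercising the hypothetical endpoint above end to end, with the caller's identity mocked via @TestSecurity; in practice this kind of test is complemented by contract tests, for example against the OpenAPI contracts:

    package example.integration;

    import static io.restassured.RestAssured.given;
    import static org.hamcrest.Matchers.notNullValue;

    import io.quarkus.test.junit.QuarkusTest;
    import io.quarkus.test.security.TestSecurity;
    import org.junit.jupiter.api.Test;

    @QuarkusTest
    class PolicyResourceTest {

        // Mocks an authenticated caller with the illustrative role so the
        // OIDC-protected endpoint can be exercised in the test profile.
        @Test
        @TestSecurity(user = "test-service", roles = "integration-service")
        void listEndpointRespondsWithJson() {
            given()
                .when().get("/internal/policies")
                .then()
                .statusCode(200)
                .body(notNullValue());
        }
    }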

Technologies

Kafka, Avro, Quarkus, Java, OIDC, OpenAPI, GitHub Actions, ArgoCD, Kustomize, OpenShift