A large multinational enterprise typically comprises thousands of applications, with data inputs spanning hybrid/multi-cloud environments, IoT devices, mobile platforms, and distributed operations.
These data events occur in real time – a customer places an online order, a supplier updates inventory, a passenger scans a boarding pass, or a sensor detects a sudden temperature change. These events are asynchronous, each triggering follow-on actions that ripple across various departments and operations within an enterprise.
The vast amount of critical data involved in day-to-day operations is driving a dramatic shift in how increasingly globalised business systems are integrated.
Old integration methods can’t meet the demands of today’s real-time business world
Traditional solutions such as integration platform as a service (iPaaS) and the enterprise service bus (ESB) are still being used to “knit” together data across this complex web of disparate systems. However, they struggle to keep pace with the demands of today’s data-driven landscape. Synchronous, point-to-point solutions simply aren’t built to handle the growing volumes of real-time data flowing through modern enterprises, nor the ever-increasing number of applications.
These traditional methods result in a complex web of fragile connectivity that is difficult to manage, fails to handle bursts of data, and lacks robustness against failures and outages. The result is slower application response times and, ultimately, a degraded user experience.
To simplify integration, operate in real time, and become better connected, organisations need to embrace an “event-driven” approach.
Event-driven data needs event-driven integration
Event-driven integration offers a more efficient approach to connecting systems through the instantaneous sharing of real-time events.
At its core, whenever an event occurs within a system, a message is published to a central hub called an event broker. Other systems subscribe to that broker, or to a network of brokers known as an event mesh, receiving messages in real time and reacting accordingly. This on-demand, “data as a service” approach enables flexibility and scalability, offering a more dynamic and responsive alternative to traditional methods.
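To make this concrete, here is a minimal, self-contained sketch of topic-based publish/subscribe. The broker is a toy in-memory stand-in, and the topic and field names (“orders/created”, order_id) are hypothetical rather than drawn from any particular product:

```python
from collections import defaultdict
from typing import Callable

class EventBroker:
    """Toy in-memory event broker: routes published events to topic subscribers."""

    def __init__(self) -> None:
        self._subscribers: dict[str, list[Callable[[dict], None]]] = defaultdict(list)

    def subscribe(self, topic: str, handler: Callable[[dict], None]) -> None:
        self._subscribers[topic].append(handler)

    def publish(self, topic: str, event: dict) -> None:
        # Every subscriber to the topic receives the event; the publisher
        # neither knows nor cares who consumes it (loose coupling).
        for handler in self._subscribers[topic]:
            handler(event)

broker = EventBroker()

# Downstream systems react independently to the same event.
broker.subscribe("orders/created", lambda e: print(f"Inventory reserves stock for {e['order_id']}"))
broker.subscribe("orders/created", lambda e: print(f"Shipping schedules delivery for {e['order_id']}"))

# A customer places an online order: one event, many follow-on actions.
broker.publish("orders/created", {"order_id": "A-1001", "qty": 3})
```

In a real deployment the broker, or the event mesh it belongs to, runs as separate infrastructure and delivers events over the network, but the decoupling principle is the same: the publisher never references its consumers.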
The analyst community is already there
Leading analysts recognise event-driven integration as key to optimising the real-time movement of business-critical data. Gartner identified this trend in the evolution of IT systems architecture, marking a shift from viewing IT systems as “data-centric custodians” to seeing them as the “nervous system” of an enterprise. More specifically, “data in motion” is now being viewed as the real source of effective decision-making, rather than “data at rest.”
It’s worth noting that event-driven integration works seamlessly with an organisation’s existing iPaaS, complementing it with an event-driven platform. By using both together, IT teams can incrementally migrate appropriate information flows to become event-driven, enabling a phased implementation of key business processes over time.
IDC recently noted in its “IDC Market Glance – Connectivity Automation, 2Q24” report by Shari Lava and Andrew Gens, June 2024: “Another trend that is helping fuel the growth of event-driven architecture (EDA) is the potential to utilise EDA in conjunction with iPaaS to split queuing, avoid bottlenecks, and manage workload and data traffic asynchronously. With an event broker layer, organisations can also gain visibility into the state of queued messages even in cases of sending errors.”
Turning integration inside out
An “event-driven” approach means re-thinking and re-architecting integration.
For the past 20 years, integration has been fairly centralised, with monolithic applications and numerous dependencies. Current integration approaches place the components that handle connectivity and transformation directly in the data path, at the core, whether the flows are real time or batch. The challenge with centralised integration is that it tightly couples connectors, transformations, mappings, and potentially transactional contexts into one deployable runtime, through which all data must flow synchronously. This can create bottlenecks, similar to the issues seen in monolithic applications.
Event-driven integration turns this approach inside out.
It moves integrations and connectors to the edge and places events at the centre, enabling decentralised, real-time data flow. The result is an architecture that is more agile, scalable, robust, and capable of supporting real-time operations, much like event-driven microservices.
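The shift can be sketched in a few lines of Python. Everything here is hypothetical for illustration: the function names, the order fields, and a broker assumed to expose a publish method like the toy broker above:

```python
import json

# Centralised style: connectivity, transformation, and delivery are coupled
# into one synchronous path. Every target is called in turn, so one slow or
# failed downstream system stalls the entire flow.
def centralised_flow(raw_order: str, targets: list) -> None:
    order = json.loads(raw_order)                        # connect + parse
    order["total"] = order["qty"] * order["unit_price"]  # transform
    for target in targets:
        target(order)                                    # deliver, synchronously

# Inside-out style: the connector at the edge does only local work
# (parse + transform), then publishes an event. Downstream systems
# subscribe and consume at their own pace; the connector has no
# knowledge of them.
def edge_connector(raw_order: str, broker) -> None:
    order = json.loads(raw_order)
    order["total"] = order["qty"] * order["unit_price"]
    broker.publish("orders/created", order)
```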
Today’s use cases demand an architecture capable of handling both traffic bursts and slow or offline consumers without impairing performance. It must scale to accommodate increases in consumers, producers, and data volume, while ensuring the delivery of data even to temporarily unavailable consumers. The architecture must also allow for easy integration of new processing components and distribution mechanisms, without compromising the overall design, and be adaptable to emerging technologies like generative AI agents and large language models (LLMs).
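As one illustration of delivering data to temporarily unavailable consumers, here is a minimal sketch of a durable subscription, assuming an in-memory queue stands in for the persistent storage a production broker would provide:

```python
from collections import deque

class DurableSubscription:
    """Buffers events for a consumer so nothing is lost while it is offline."""

    def __init__(self) -> None:
        self._queue: deque = deque()

    def deliver(self, event: dict) -> None:
        self._queue.append(event)  # always enqueue, even if the consumer is away

    def reconnect(self) -> list:
        # On reconnect, the consumer drains everything published in its absence.
        missed = list(self._queue)
        self._queue.clear()
        return missed

sub = DurableSubscription()  # the consumer is offline at this point
for i in range(3):
    sub.deliver({"seq": i, "event": "inventory.updated"})

print(sub.reconnect())  # -> the three buffered events, in order
```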
Going inside out unlocks business insights
Arriving at an optimal event-driven integration implementation does not happen overnight. It’s an evolutionary process that can be measured in four key milestones:
- Breaking down data silos: Traditional systems can result in data silos, making it difficult to access and share information. Event-driven integration helps improve the availability of data, ensuring that relevant information is accessible when needed.
- Managing unexpected bursts in traffic: The decoupling of components serves as a “shock absorber” that helps manage unexpected traffic spikes, such as sudden increases in demand, without disrupting operations (see the sketch after this list). This decoupling also strengthens infrastructure, making it more resilient to failures and outages.
- Facilitating smoother integration of new systems: Event-driven integration can make it easier to integrate new applications and services, enabling organisations to adapt to new technologies with less disruption.
- Providing real-time visibility: Real-time data integration gives users a more consistent and up-to-date view of information, which can help create a smoother and more efficient experience in day-to-day operations.
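The “shock absorber” effect described in the second milestone can be simulated in a few lines: a queue between producer and consumer absorbs a burst, and the consumer drains the backlog at its own steady rate. The rates and event names below are invented for illustration:

```python
from collections import deque

buffer: deque = deque()  # the broker's queue acts as the shock absorber
CONSUME_RATE = 2         # the consumer can process two events per tick

def tick(produced: int) -> None:
    for _ in range(produced):                        # burst arrives
        buffer.append({"event": "order.created"})
    for _ in range(min(CONSUME_RATE, len(buffer))):  # steady consumption
        buffer.popleft()
    print(f"produced={produced:2d}  backlog={len(buffer)}")

# A sudden spike in demand, then traffic returns to normal. The backlog
# peaks during the burst and drains afterwards; no work is ever dropped.
for produced in [10, 1, 0, 0, 0, 0]:
    tick(produced)
```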
Event-driven integration in the real world: Organisations thinking outside traditional integration models
Across industries such as financial services, manufacturing, and retail, organisations are embracing event-driven use cases.
Dutch beer manufacturer Heineken has implemented an event-driven system in which production line events automatically update inventory and trigger order fulfilment for distributors. This system spans its global and local beer and cider brands, which are sold in over 190 countries.
Similarly, German grocery chain EDEKA is using an event-driven approach to modernise its supply chain and merchandise management. By switching from batch updates between siloed systems to real-time data sharing, EDEKA aims to improve operational efficiency and coordination across its operations.
According to a recent IDC Infobrief, 82% of survey respondents have either implemented or plan to implement two to three event-driven use cases in the near future.
Rethinking integration architecture to meet today’s needs
Growing data volumes and increasing connectivity, alongside shifting consumption models and customer expectations, are prompting large organisations to rethink how they structure the flow of critical information. Traditional integration approaches may not meet the demands of businesses that need to align with customer, employee, and supplier requirements in real time.
Event-driven integration provides a potential solution for multinational organisations looking to modernise their integration strategies. It can help businesses become more adaptable and scalable, as the demand for real-time responsiveness continues to rise in increasingly digitised environments.