(S1 - E1): How to integrate SAP into an Event-Driven Architecture?

Want to learn how to integrate events into SAP from the one and only Perry Krol?

In our first podcast episode of 'Talking Event-Driven', Perry emphasizes the importance of standard, SAP-supported API connectors to streamline integration. It's a must-watch for anyone navigating the complexities of Event-Driven Architecture!

Key takeaway #1

Avoid the Database Trap with SAP: Going directly to the SAP database for integrations is a common pitfall. For inbound data, it bypasses critical business logic. For outbound, it exposes a cryptic data model that is difficult for data engineers to interpret.

Key takeaway #2

Leverage Application-Level Event Triggers: A much better approach is to tap into SAP's native, application-level event triggers. This allows you to capture events with rich business context directly from the source.

Key takeaway #3

Shift Left: Empower the SAP Team to Create Data Products: Instead of exposing table-level data and leaving the interpretation to downstream consumers, empower the SAP team to act as data product owners.

Unlocking your core: modern event-driven patterns for integrating with SAP

For many enterprises, SAP is the operational heart of the business, a powerful but often monolithic system of record. Integrating with it in a modern, event-driven way is one of the most critical challenges organizations face today, especially with the massive transformation initiatives around the move to S/4HANA.

How do you bridge the gap between this traditional ERP world and the data-centric world of modern analytics and microservices? We sat down with Perry Krol, Head of Solutions Engineering for SAP at Confluent, to get his insights on our very first episode of Talking Event-Driven.

The classic pitfall: the lure of the database

When teams first approach integrating SAP with Kafka, the most common anti-pattern is to go straight to the database.

"Getting data out in an event-driven way from the database... that kind of works," Perry explains. "But the data model of the SAP system is very cryptic... and most tables and columns are abbreviations in German. It's very hard to understand the business semantics."

This approach pushes a massive burden onto downstream data engineers, who are unfamiliar with the SAP data model and are forced to reassemble raw, cryptic table data into meaningful business objects like a sales order. For inbound data, it's even worse: bypassing the application layer means you skip all the critical business logic and validation rules, which Perry describes as "signing your death sentence."
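
To make "cryptic" concrete, here is an illustrative raw record from VBAK, the sales order header table. The values are invented, but the field names are the real German abbreviations a downstream data engineer would face:

```python
# Illustrative raw CDC record from VBAK (sales order header). Values are
# invented; the field names are genuine German abbreviations.
raw_vbak_row = {
    "VBELN": "0000012345",  # Verkaufsbelegnummer: sales document number
    "ERDAT": "20240115",    # Erstellungsdatum: creation date
    "KUNNR": "0000100042",  # Kundennummer: sold-to customer number
    "NETWR": "1499.00",     # Nettowert: net order value
    "WAERK": "EUR",         # Waehrung: document currency
}
# Rebuilding a complete sales order also means joining VBAP (items),
# VBEP (schedule lines), and others -- logic that already lives inside SAP.
```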

A better way: application-level eventing

A far more robust pattern is to tap into the event triggers that exist within the SAP application layer itself. SAP has long used internal eventing mechanisms to drive its own business workflows (e.g., a manager approval being required before a requisition becomes a purchase order).

Modern approaches leverage these triggers:

  • For Outbound Data: Instead of database CDC, use application-level triggers like Change Pointers or the newer, more granular Business Events available in S/4HANA. These triggers are based on business object changes and carry much richer context than a simple table update.
  • For Inbound Data: Instead of writing to the database, use stable, supported integration APIs such as IDocs or OData REST APIs. This ensures all data passes through the proper validation and business logic within SAP (a minimal sketch follows this list).
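
To illustrate the inbound side, the sketch below creates a sales order through an OData API rather than writing to the database. The host and credentials are placeholders, and the service and field names follow the shape of S/4HANA's public sales order API; adapt them to your own landscape:

```python
import requests

# Minimal sketch of the inbound pattern: create a sales order through an
# OData REST API so SAP's own validation and business logic run. Host and
# credentials are placeholders; the service and field names mirror the
# shape of S/4HANA's API_SALES_ORDER_SRV.
BASE = "https://my-s4hana-host/sap/opu/odata/sap/API_SALES_ORDER_SRV"

session = requests.Session()
session.auth = ("INTEGRATION_USER", "********")  # placeholder credentials

# OData write calls require a CSRF token, obtained with a priming GET.
head = session.get(f"{BASE}/A_SalesOrder", headers={"x-csrf-token": "fetch"})
session.headers["x-csrf-token"] = head.headers["x-csrf-token"]

payload = {
    "SalesOrderType": "OR",
    "SoldToParty": "100042",
    "to_Item": [{"Material": "MAT-001", "RequestedQuantity": "10"}],
}
resp = session.post(f"{BASE}/A_SalesOrder", json=payload)
resp.raise_for_status()  # SAP rejects the request if business rules fail
print("Created sales order:", resp.json()["d"]["SalesOrder"])
```

Because the request goes through the application layer, SAP applies the same validation and business rules it would for an order entered by a user.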

Shifting left: from raw data to high-quality data products

The most powerful evolution in this space is the "shift-left" movement for data quality. Instead of extracting raw data and leaving the complex task of cleaning, correlating, and interpreting it to downstream data teams, we are seeing a push to create high-quality data products at the source.

"Who would be a better person to expose the data in a semantically better representation... than the application?" Perry asks.

In this model, the SAP team becomes a data product owner. Using modern tools, they can:

  1. Listen for a granular business event trigger within SAP (e.g., DeliveryAddressChanged on a sales order).
  2. Use their deep business knowledge to enrich this event with the full, relevant payload directly within the SAP environment.
  3. Publish a clean, complete, and semantically rich business object (e.g., the full sales order) to a Kafka topic, as in the sketch below.
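
As a rough sketch of step 3 (the topic name and payload are illustrative assumptions), the enriched business object could be published with a standard Kafka producer:

```python
import json
from confluent_kafka import Producer  # Confluent's Python Kafka client

# Sketch of step 3: publish the enriched, semantically complete sales order
# as a data product. Topic name and payload are illustrative; in practice
# the SAP team assembles the full payload inside the SAP environment.
producer = Producer({"bootstrap.servers": "localhost:9092"})

sales_order_event = {
    "eventType": "DeliveryAddressChanged",
    "salesOrder": {
        "id": "0000012345",
        "soldToParty": "100042",
        "deliveryAddress": {"city": "Utrecht", "country": "NL"},
        "items": [{"material": "MAT-001", "quantity": 10}],
    },
}

# Key by sales order ID so all events for one order stay in one partition.
producer.produce(
    "sap.sales-order.events",
    key=sales_order_event["salesOrder"]["id"],
    value=json.dumps(sales_order_event),
)
producer.flush()  # block until the broker acknowledges delivery
```

Keying by the sales order ID keeps every event for a given order in a single partition, so downstream consumers see them in order.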

This approach provides a massive advantage: downstream consumers, whether they are operational microservices or analytical platforms like Snowflake and Databricks, receive a high-quality, ready-to-use data product. It eliminates redundant data engineering work and ensures that the business context is preserved from the very beginning.

Conclusion: bridging the two worlds

For too long, the operational world of ERP and the analytical world of data have existed side by side, speaking different languages. By embracing application-level eventing and a "shift-left" data product mindset, organizations can finally bridge this gap.

This allows you to not only improve the quality and agility of your integrations but also to unlock new value by contextualizing data from different domains. Imagine enriching real-time IoT data from a factory floor with business process data from SAP: suddenly, you don't just know that a machine is underperforming; you know which customer order it's impacting. A rough sketch of that enrichment follows.
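
In this back-of-the-envelope sketch, the topic names, fields, and machine-to-order mapping are invented for illustration; a production setup would use a stream processor (Kafka Streams, Flink, ksqlDB) rather than this simple in-memory join:

```python
import json
from confluent_kafka import Consumer

# Rough sketch: enrich factory telemetry with SAP order context. All names
# and fields are invented for illustration.
consumer = Consumer({
    "bootstrap.servers": "localhost:9092",
    "group.id": "iot-enricher",
    "auto.offset.reset": "earliest",
})
consumer.subscribe(["factory.telemetry", "sap.production-orders"])

orders_by_machine = {}  # latest SAP order context, keyed by machine ID

while True:
    msg = consumer.poll(1.0)
    if msg is None or msg.error():
        continue
    event = json.loads(msg.value())
    if msg.topic() == "sap.production-orders":
        # Remember which customer order each machine is currently working on.
        orders_by_machine[event["machineId"]] = event["customerOrderId"]
    elif event.get("status") == "underperforming":
        order = orders_by_machine.get(event["machineId"], "unknown")
        print(f"Machine {event['machineId']} is underperforming; "
              f"impacted customer order: {order}")
```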

This is the future of SAP integration: moving beyond simply transferring bits and bytes to creating a rich, contextual, and real-time stream of business value.

If you're embarking on your S/4HANA transformation and need to build a modern, event-driven integration strategy, we have the expertise to help.

Let's Modernize Your SAP Integration
