This blog contains experience gained over the years of implementing (and de-implementing) large scale IT applications/software.

SAP Sapphire 2024 Virtual Session List

Like many of you, I didn’t go to SAP Sapphire 2024 :’-(
Instead I decided to watch as many sessions as my teeny, tiny brain would let me ingest and turn my findings into a YouTube video.
The video link is below and below that is my complete session list of 40 sessions, including the links to the sessions directly (thank me later 🙂 ).

Complete Session List

Main SAP Sapphire 2024 virtual event front page:
https://www.sap.com/events/sapphire/virtual.html

Transforming business: SAP Signavio and LeanIX solutions – ERP902v
https://www.sap.com/events/sapphire/virtual/flow/sap/s24/catalog/page/catalog/session/1714350995982001Z03t

Drive value along your transformation journey with SAP Signavio and LeanIX – ERP170
https://www.sap.com/events/sapphire/virtual/flow/sap/s24/catalog/page/catalog/session/1713874513924001HJzu

Bringing you the future now with SAP BTP – BTP188
https://www.sap.com/events/sapphire/virtual/flow/sap/s24/catalog/page/catalog/session/1713860798094001MK9P

SAP Business Technology Platform – BTP900v
https://www.sap.com/events/sapphire/virtual/flow/sap/s24/catalog/page/catalog/session/1714350995530001ZfN7

Unlocking business potential with GROW with SAP – ERP186
https://www.sap.com/events/sapphire/virtual/flow/sap/s24/catalog/page/catalog/session/1713860796860001MPN2

SAP Preferred Success: Adoption road map for cloud solutions from SAP – SER130
https://www.sap.com/events/sapphire/virtual/flow/sap/s24/catalog/page/catalog/session/1713861889962001k6W0

Discover recent and upcoming innovations in Joule – BAI207
https://www.sap.com/events/sapphire/virtual/flow/sap/s24/catalog/page/catalog/session/1713861889606001kSEw

End-to-end carbon management: The path to a green ledger – SUS217v
https://www.sap.com/events/sapphire/virtual/flow/sap/s24/catalog/page/catalog/session/1713861891746001kZle

Clean core strategy for an optimized IT landscape – TRE308
https://www.sap.com/events/sapphire/virtual/flow/sap/s24/catalog/page/catalog/session/1713860798146001MAdD

Extend SAP S/4HANA with SAP Build and SAP Build Code – BTP224v
https://www.sap.com/events/sapphire/virtual/flow/sap/s24/catalog/page/catalog/session/1713860797748001MEB0

Streamline your journey to the cloud with RISE with SAP Methodology – SER241
https://www.sap.com/events/sapphire/virtual/flow/sap/s24/catalog/page/catalog/session/1713861891061001kmqO

Navigating the AI Era with SAP: Insights from the Executive Suite – BAI220v
https://www.sap.com/events/sapphire/virtual/flow/sap/s24/catalog/page/catalog/session/1713861892194001kC9K

Megathemes are transforming business models across all industries – TRE317v
https://www.sap.com/events/sapphire/virtual/flow/sap/s24/catalog/page/catalog/session/1713894437789001Yt3E

Data in the age of AI – BTP189
https://www.sap.com/events/sapphire/virtual/flow/sap/s24/catalog/page/catalog/session/1713860798456001MXvw

Unlock business agility with cloud ERP – ERP169
https://www.sap.com/events/sapphire/virtual/flow/sap/s24/catalog/page/catalog/session/1713860798871001Mp0U

Discover recent and upcoming AI innovations for business functions – BAI202
https://www.sap.com/events/sapphire/virtual/flow/sap/s24/catalog/page/catalog/session/1713861889545001kWzS

Drive a sustainable, clean-core extension strategy with SAP BTP – BTP185
https://www.sap.com/events/sapphire/virtual/flow/sap/s24/catalog/page/catalog/session/1713894437832001YFkS

Strengthen the value of cloud ERP – ERP189
https://www.sap.com/events/sapphire/virtual/flow/sap/s24/catalog/page/catalog/session/1713860797995001M03l

Supercharge your business and get innovation-ready with RISE with SAP – ERP168
https://www.sap.com/events/sapphire/virtual/flow/sap/s24/catalog/page/catalog/session/1713894437868001Yhuk

SAP S/4HANA Cloud Public Edition: Find out what’s next – ERP127
https://www.sap.com/events/sapphire/virtual/flow/sap/s24/catalog/page/catalog/session/1713860797612001MOwz

Learn about SAP’s responsible AI practices – BAI206v
https://www.sap.com/events/sapphire/virtual/flow/sap/s24/catalog/page/catalog/session/1713861892069001kRf2

The new ERP mindset – TRE321
https://www.sap.com/events/sapphire/virtual/flow/sap/s24/catalog/page/catalog/session/1713860798676001MSu6

Achieve real-world results with SAP Business AI solutions – BAI200
https://www.sap.com/events/sapphire/virtual/flow/sap/s24/catalog/page/catalog/session/1713861890303001kLQl

Future of cloud ERP with AI-driven transformation and fast-track adoption – ERP900v
https://www.sap.com/events/sapphire/virtual/flow/sap/s24/catalog/page/catalog/session/1714350995724001Zlgs

Looking forward: SAP Datasphere – BTP113
https://www.sap.com/events/sapphire/virtual/flow/sap/s24/catalog/page/catalog/session/1713860797383001MZtd

Managing your AI risk: Responsible AI – TRE312
https://www.sap.com/events/sapphire/virtual/flow/sap/s24/catalog/page/catalog/session/1714350995775001Zrj4

See Joule in action and empower your business – BAI208v
https://www.sap.com/events/sapphire/virtual/flow/sap/s24/catalog/page/catalog/session/1713861892259001kEmO

Business data fabric: The future of data and AI – BTP151v
https://www.sap.com/events/sapphire/virtual/flow/sap/s24/catalog/page/catalog/session/1713860799084001Ma7x

Experience SAP Business AI in action – BAI900v
https://www.sap.com/events/sapphire/virtual/flow/sap/s24/catalog/page/catalog/session/1714095582869001wto9

Enterprise automation – BTP901v
https://www.sap.com/events/sapphire/virtual/flow/sap/s24/catalog/page/catalog/session/1714095582998001wbuA

Charting your ERP transformation and future SAP software landscape – ERP106v
https://www.sap.com/events/sapphire/virtual/flow/sap/s24/catalog/page/catalog/session/1713874182668001YDO5

The power of sustainability, ERP, and AI – SUS900v
https://www.sap.com/events/sapphire/virtual/flow/sap/s24/catalog/page/catalog/session/1714095583131001wkWQ

Unleash the power of automation: Your path to enterprise success – BTP187
https://www.sap.com/events/sapphire/virtual/flow/sap/s24/catalog/page/catalog/session/1713860798402001Mzcx

SAP S/4HANA Cloud Private Edition: Discover planned innovations – ERP126
https://www.sap.com/events/sapphire/virtual/flow/sap/s24/catalog/page/catalog/session/1713860797552001MDH8

SAP runs SAP: Leading the AI revolution in customer support – SER242
https://www.sap.com/events/sapphire/virtual/flow/sap/s24/catalog/page/catalog/session/1713861890934001k3KR

How to manage AI risk – TRE312v
https://www.sap.com/events/sapphire/virtual/flow/sap/s24/catalog/page/catalog/session/1713861890875001ksC9

Explore the road map for SAP Digital Manufacturing – SCM100
https://www.sap.com/events/sapphire/virtual/flow/sap/s24/catalog/page/catalog/session/1713883127034001kqNC

SAP Green Ledger and an ERP-centric approach to reinvent carbon accounting – ERP199v
https://www.sap.com/events/sapphire/virtual/flow/sap/s24/catalog/page/catalog/session/1713860797852001MKzW

Explore the road map for SAP Business Technology Platform – BTP100
https://www.sap.com/events/sapphire/virtual/flow/sap/s24/catalog/page/catalog/session/1713860797243001MUHd

Optimizing innovation: road map for SAP Cloud ALM – SER227
https://www.sap.com/events/sapphire/virtual/flow/sap/s24/catalog/page/catalog/session/1713861889840001kYQg

Bringing AI from prototype to production – TRE307
https://www.sap.com/events/sapphire/virtual/flow/sap/s24/catalog/page/catalog/session/1713861890399001k1Mx

Hey, you made it to the bottom!
Which ones did you watch, and what did you learn?

SAP’s Deeper Partnership with Red Hat

An announcement back in February 2023 from Walldorf tells us of a “deepening” partnership between SAP and the Enterprise Linux operating system vendor Red Hat.

They already have a long history together, with the SAP LinuxLab encompassing the Red Hat tech team to ensure SAP on Red Hat Linux works and performs as it should.

Here are the lines of significance from the SAP news article: https://news.sap.com/2023/02/red-hat-and-sap-deepen-partnership/

…SAP is boosting support for the RISE with SAP solution using Red Hat Enterprise Linux as the preferred operating system for net new business for RISE with SAP solution deployments.

The platform builds on this trust by offering a consistent, reliable foundation for SAP software deployments, providing a standard Linux backbone to support SAP customers across hybrid and multi-cloud environments.

…building on Red Hat’s scalable, flexible, open hybrid cloud infrastructure.

…SAP’s internal IT environments and SAP Enterprise Cloud Services adopting Red Hat Enterprise Linux can gain greater flexibility to address modern and future technology requirements.

“…Red Hat Enterprise Linux offers enhanced performance capabilities to support RISE with SAP solution deployments across cloud environments…

There are a lot of points to cover and, as always, a little history is useful.
Grab a bagel (that’s what Americans eat, right?), put some Obatzda cheese on it (it’s German; I’m trying to equate eating with the subject of this article) and settle in for a read.

Who is Red Hat?

You can read all about Red Hat on Wikipedia here: https://en.wikipedia.org/wiki/Red_Hat , but suffice to say:

  • It has been owned by IBM since 2019.
  • It owns Ansible.
  • It owns Red Hat Enterprise Linux CoreOS (RHCOS), the production Linux operating system beneath the container platform OpenShift.  RHCOS uses the same kernel as Red Hat Enterprise Linux (RHEL).

What is RISE with SAP?

There are many views on why “RISE with SAP” came to fruition and who it benefits, but the official line is that RISE with SAP is a solution designed to support the needs of the customer’s business in any industry. SAP is responsible for the holistic service level agreement (SLA), cloud operations, and technical support, while the partner (insert any global SI) provides sales, consulting and application managed services (AMS).

…SAP is boosting support for the RISE with SAP solution using Red Hat Enterprise Linux as the preferred operating system for net new business for RISE with SAP solution deployments.

When the article talks about “net new”, that just means brand-new RISE subscriptions.

Notice that one of the significant lines I pulled out of the article says:

…providing a standard Linux backbone to support SAP customers across hybrid and multi-cloud environments.

Since SAP are doing the hosting, the “multi-cloud” part is probably referring to SAP’s hybrid and multi-cloud, i.e. SAP’s own datacentres and also the hyperscalers.

An enticing option that comes as part of the RISE deal (depending on the customer spend) is SAP Business Technology Platform (BTP).
SAP BTP is a PaaS solution under a subscription model, in which SAP customers can combine and deploy curated SAP services from SAP or third-parties, or use services to code their own solutions in a variety of languages including SAP’s proprietary ABAP language.

The SAP BTP environments are hybrid and multi-cloud: they are hosted in Cloud Foundry (the newest) or Neo (currently sun-setting), with Cloud Foundry run from a combination of SAP’s own datacentres and/or the main hyperscalers.  There are two other environments: Kyma, a micro-services runtime based on Kubernetes, and the ABAP environment, which is hosted in Cloud Foundry.

To conclude this section, I suggest that the described “net new business” is actually internal business inside of SAP and not directly the hosting of customers’ S/4HANA systems.  In fact, S/4HANA is only very loosely mentioned in the article, which leads me to believe that this announcement is purely for BTP and other surrounding services.

SAP HANA and Compute Power

In one of the statements from SAP on this “deepening” partnership, we see:

“…Red Hat Enterprise Linux offers enhanced performance capabilities to support RISE with SAP solution deployments across cloud environments…”

I can’t see anything specifically mentioned about how Red Hat’s Linux operating system is more performant than SUSE’s, other than an article from 2019 in which a SAP Business Warehouse (BW) on HANA system (possibly BW/4HANA; it’s difficult to tell) holds a world record.

See here for more:  https://www.redhat.com/en/resources/red-hat-enterprise-linux-for-sap-solutions-datasheet   which links to here:  https://www.redhat.com/en/blog/red-hat-enterprise-linux-intels-newest-xeon-processors-posts-record-performance-results-across-wide-range-industry-benchmarks?source=blogchannel

The things to note about those claims are:

  • This was based on a 2nd Gen Intel Xeon (3rd Gen is already available).
  • The CPU used Intel Advanced Vector Extensions 512 (AVX-512) instruction set, which Intel says arrived in 3rd Gen chips (is the Red Hat article quoting the wrong chip generation?).
  • Generally, we run HANA on hyperscalers on the Skylake or Cascade Lake generations of Intel Xeon CPU.  Only HANA on bare metal may allow the latest Xeon CPUs.
  • The Red Hat Enterprise Linux version was 7.2 for the world record, but 7.9 is the latest minor release of RHEL 7 and 9.0 is out now.  Also, 7.2 is now only supported for older versions of HANA 2.0 (up to SPS03).
  • Intel Optane DC (Intel’s non-volatile memory persistence technology) was used in the world record, but in 2022 it was announced as defunct (superseded by another initiative).
  • 2019 was the year that the IBM acquisition of Red Hat concluded.  Coincidence?

My summary of this section is that I don’t believe performance is the reason for any switch by SAP from (mainly) SUSE to Red Hat.  The one article of relevance that I can find seems just too old and outdated.

What I think is that the announcement from SAP is referring to something other than the Linux operating system alone.

Red Hat’s Scalable, Flexible, Open Hybrid Cloud Infrastructure

Maybe we need to look past the Red Hat Linux operating system and at the infrastructure ecosystem that the operating system is part of.

…building on Red Hat’s scalable, flexible, open hybrid cloud infrastructure.

When the article talks about “open” we are inclined to think about Open Source, freely available or even open APIs (sometimes just having APIs can make something “open”).

In my mind, something that can run seamlessly almost anywhere on hybrid cloud would involve containers.  Containers provide scalability (scale-out) and flexibility (multiple environments offered).

Let me introduce you to OpenShift.  Yeah, it’s got “open” in the name.

See here for a wiki article:  https://en.wikipedia.org/wiki/OpenShift

As a summary of OpenShift: Red Hat Enterprise Linux CoreOS (RHCOS) underpins the OpenShift hybrid cloud platform, and RHCOS uses the same kernel as Red Hat Enterprise Linux.

The orchestration of OpenShift containers is done using Kubernetes and Red Hat is the second largest contributor to Kubernetes after Google (Red Hat is a platinum member: https://www.cncf.io/about/members/).

I think you might be able to see where we are heading in this section.

Could SAP be adopting OpenShift internally for its future container hosting platform strategy?

IBM Cloud deprecated support for Cloud Foundry in mid-2022.  As suspected, Red Hat OpenShift is one of the touted solutions to replace it: https://cloud.ibm.com/docs/cloud-foundry-public?topic=cloud-foundry-public-deprecation#dep_nextsteps

Need greater efficiency and revolutionary delivery? Red Hat OpenShift on IBM Cloud might be your solution.

The above quote on the IBM Cloud site does provide some hint that operating Cloud Foundry platform services at scale could be less efficient and less innovative compared to Red Hat OpenShift.


Maybe this is something that, internally, SAP have also concluded?

What Does SUSE Offer to Compete with Red Hat and its OpenShift Offering?

The SUSE Linux Enterprise Server (SLES) operating system has been a solid foundation for running SAP systems.

Similar to Red Hat, SUSE has a varied portfolio of products in the Linux container technology space.
SUSE owns Rancher (gained through its acquisition of Rancher Labs), an open-source container management platform similar to Red Hat’s OpenShift. It makes Kubernetes easier to manage, especially once the quantity of containers accelerates.

SUSE is also a contributor to Kubernetes (it is a silver member).

The SUSE Rancher product is open-armed, in that it embraces many different operating systems and a number of licence options, whereas Red Hat OpenShift supports only Red Hat CoreOS and requires a Red Hat subscription.

While being open is a good thing, it also adds complexity. Red Hat’s CoreOS is a purpose-built operating system with all required features, which would appear to make it simpler to deploy and maintain.

It’s possible that SAP’s announcement comes after some internal evaluation of the two products, with Red Hat’s being favoured.

Conclusions

We’ve looked at the article from the SAP site where the new “deeper” partnership with Red Hat was announced.

I think I ruled out performance as a reason for the Operating System change.  The article just didn’t have enough depth for my liking.

I have speculated on how this SAP and Red Hat partnership could be about the internal SAP hosting of PaaS and maybe SaaS related systems and not directly related to hosting of customer’s S/4HANA systems.

What we could be looking at, is the next generation of hosting platform for SAP BTP or possibly SAP S/4HANA Cloud public edition.
Red Hat’s OpenShift platform, underpinned with the Red Hat CoreOS and the Red Hat tools to monitor, automate and orchestrate, could all combine to provide a solid accompaniment to solve SAP’s internal strategic issues.

It’s one of the platforms chosen by IBM Cloud (a no brainer for them really), with the justification that Cloud Foundry was no longer the strategic platform.

The announcement has no impact on the certification of SUSE for running S/4HANA and therefore should not affect any customer decisions during their RISE with SAP journey for their S/4HANA systems.

Resources:

https://news.sap.com/2023/02/red-hat-and-sap-deepen-partnership/
https://blogs.sap.com/2019/07/15/evolution-of-sap-cloud-platform-retirement-of-sap-managed-backing-services/
https://blogs.sap.com/2023/06/14/farewell-neo-sap-btp-multi-cloud-environment-the-deployment-environment-of-choice/
https://me.sap.com/notes/2235581
https://learn.microsoft.com/en-us/azure/virtual-machines/mv2-series
https://learn.microsoft.com/en-us/azure/virtual-machines/sizes-compute
https://www.intel.com/content/www/us/en/architecture-and-technology/avx-512-solution-brief.html
https://www.redhat.com/en/resources/red-hat-enterprise-linux-for-sap-solutions-datasheet
https://www.redhat.com/en/blog/red-hat-enterprise-linux-intels-newest-xeon-processors-posts-record-performance-results-across-wide-range-industry-benchmarks
https://docs.openshift.com/container-platform/4.8/architecture/architecture-rhcos.html#rhcos-key-features_architecture-rhcos
https://www.anandtech.com/show/14146/intel-xeon-scalable-cascade-lake-deep-dive-now-with-optane
https://www.sap.com/products/erp/s4hana.html
https://en.wikipedia.org/wiki/Red_Hat
https://en.wikipedia.org/wiki/Rancher_Labs
https://en.wikipedia.org/wiki/OpenStack
https://en.wikipedia.org/wiki/OpenShift
https://en.wikipedia.org/wiki/Cloud_Foundry
https://en.wikipedia.org/wiki/3D_XPoint
https://www.ibm.com/support/pages/sap-s4hana-red-hat-openshift-container-platform-business-perspective-cloud-hosting-provider
https://cloud.ibm.com/docs/cloud-foundry-public?topic=cloud-foundry-public-deprecation
https://www.cncf.io/about/members/

Power BI Desktop Single-Sign-On to SAP HANA

This post is all about single sign-on (SSO) from Microsoft Power BI Desktop to the SAP HANA 2.0 database.
When you initially get the task to design and/or set this up, there are a few questions that need to be asked before you start setting up SSO.
In this post I will compare the two methods for single sign-on, Kerberos and SAML, plus the use of the Power BI Gateway (also known as the “On-Premises Data Gateway”).

Questions to Ask

Before setting up SSO from Power BI Desktop to SAP HANA, you should ask these questions:

  1. Define what the Power BI Desktop end-user will be doing:
    Are the end users creating reports or are they consuming already published reports?

    End users that are creating new reports will need a direct Power BI Desktop to HANA connection with SSO (a.k.a SSO2DB). This will need to use Kerberos because SAML is not supported.
    End users that are consuming already published reports can use the On-Premises Data Gateway with SSO to access and execute the reports from Power BI Desktop. The On-Premises Data Gateway can use Kerberos or SAML.

  2. Define where Power BI Desktop will be running:
    Do end-users all have Windows accounts in the same domain?

    For direct to HANA connections with SSO, Kerberos is used and requires the end-user to be signed into Windows with a Windows account on the machine where Power BI Desktop is running.
    If the end-user does not have a Windows account (or they sign into Windows with a different, un-trusted domain), they can enter Windows credentials into the login box inside Power BI Desktop (not quite so seamless), but they will still need an AD account, and one that is federated with the domain that SAP HANA has been added to (SAP HANA gets its own service account).

  3. If using On-Premises Data Gateway, define how many HANA systems will be connected to it:
    Is the On-Prem Data Gateway needing to connect to multiple HANA systems?

    When connecting the On-Premises Data Gateway to HANA using SAML for SSO, there is a one-to-many relationship between the SAML key and certificate generated for the On-Premises Data Gateway and the HANA systems. The On-Premises Data Gateway can only use one certificate, and that certificate has to be deployed and trusted on all the HANA systems that it will be connecting to. Therefore, you really need an On-Premises Data Gateway for each HANA environment (or at least one for Production and one for Non-Production) to allow proper testing and better security for production.
  4. If planning to use Kerberos for SSO, identify corporate security policies & settings for Active Directory (AD) service accounts:
    Do AD service accounts require AES256-bit encryption?
    What are the server names and domains of the required domain Key Distribution Centre (KDC) server(s)?
    What will be the full UPN of the user account when using Kerberos?

    When AD service accounts have AES256-bit encryption, the process for setting up the keytab file that is placed onto the SAP HANA server changes.
    The KDC and domain information will be needed for the configuration of the HANA server’s krb5_hdb.conf file.
    The AD administrators should be asked for the above information.
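
Pulling those answers together, the krb5_hdb.conf on the HANA server follows the standard krb5.conf format. Here is a minimal sketch, assuming a hypothetical realm CORP.NET, a KDC of kdc01.corp.net, and AES256-only AD service accounts (all names are placeholders; substitute what your AD administrators give you):

```ini
[libdefaults]
    default_realm = CORP.NET
    # Only needed when the AD service account enforces AES256:
    default_tkt_enctypes = aes256-cts-hmac-sha1-96
    default_tgs_enctypes = aes256-cts-hmac-sha1-96

[realms]
    CORP.NET = {
        kdc = kdc01.corp.net
        admin_server = kdc01.corp.net
    }

[domain_realm]
    .corp.net = CORP.NET
    corp.net = CORP.NET
```

Once the AD administrators have confirmed the realm and KDC hosts, the file can be smoke-tested at the Linux level with kinit before any HANA-side work begins.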

On-Premises Data Gateway or no On-Premises Data Gateway

You can use the On-Premises Data Gateway (Power BI Gateway) for accessing the data in “on-premise” systems. This includes HANA databases. The gateway acts as a kind of reverse proxy because it connects out to Microsoft from inside the customer’s network (where it is hosted).

The Gateway provides a distribution (publishing) framework where reports can be curated and published for access by many users.
End-users can connect from their Power BI Desktop (installed on their local computer) to the On-Premises Data Gateway *over the internet*.

Without the On-Premises Data Gateway, each Power BI Desktop end-user will need a direct connection to the SAP HANA database. It is recommended that this is performed over a VPN connection, or for the end-user to be physically in a corporate office on the LAN/WAN. In the future the Azure “v-Net” connection option may support SAP HANA connections if you happen to host your SAP HANA in Microsoft Azure.
NOTE: In the below, we could be using Azure AD or classic Active Directory Domain Services.

HANA Integration Point

Before we continue we need to highlight that the Power BI Desktop and On-Premises Data Gateway connect to the SAP HANA database indexserver for SSO via both Kerberos and SAML.
Changes are not required to the HANA XSA (extended application services) layer for these integrations. This is not the same integration that you may read about in other guides (especially guides related to HANA’s analytical capabilities).

Kerberos or SAML

Whether to use Kerberos or SAML is really up to your organisation’s preferences and capabilities.
Microsoft and SAP recommend SAML as a modern approach to Single-Sign-On.

SAML de-couples the SAP HANA system from the identity provider and is simpler to use, with potentially fewer firewall changes.
Be aware, the On-Premises Data Gateway can only use one certificate for SAML for all the HANA databases it talks to.
When using SAML, the On-Premises Data Gateway connection to HANA needs securing with TLS, otherwise the SAML assertion (kind of like a certificate) would be sent unencrypted.

On the other hand, Kerberos provides a centralised identity management approach and is much more rigid in design with a few more steps involved in the setup. It is also a much older protocol with its own set of vulnerabilities, but it comes without the requirement to setup TLS (although it is still recommended).

If you need to have Power BI Desktop connecting directly to SAP HANA (a.k.a SSO2DB), then as of writing this can only use Kerberos for single-sign-on. Kerberos delegation is not needed in this scenario.
For connection from Power BI Desktop via the On-Premises Data Gateway to SAP HANA, then both Kerberos or SAML can be used.
When using the On-Premises Data Gateway with SAML, the On-Premises Data Gateway becomes the SAML identity provider (IdP).
When using the On-Premises Data Gateway with Kerberos, the On-Premises Data Gateway will use Kerberos delegation on behalf of the end-user.

Power BI Direct to HANA via Kerberos (SSO2DB)

The first thing to note about connecting Power BI directly to SAP HANA using Kerberos for single sign-on is that your BASIS team will need to work with the Microsoft Active Directory (AD) team.
It is possible that the AD team can delegate a proportion of the work to the BASIS team by creating a separate (dedicated) organisational unit (OU) and applying permissions that allow the BASIS team to use their Windows accounts to manage the AD entities created in this new OU.

Here is how the architecture will look for a direct connection from Power BI to SAP HANA via Kerberos:

Process Flow:

  1. User opens Power BI (or Excel).
  2. User connects to SAP HANA database using a Windows authentication account (authenticates via Azure AD in this example).
  3. Kerberos authentication token (ticket) is forwarded to SAP HANA during the HANA logon process.
  4. HANA decrypts the token using the keytab file, which contains the key for the stored service principal name (SPN), and maps the decrypted Windows account name (UPN) to the HANA DB account.

There is no requirement for Kerberos delegation in this setup.

For the above setup to work, there are some required steps and some optional steps:

  • Required: Install SAP HANA client
    The main requirement is that the SAP HANA client must be installed onto the end-user’s computer (where Power BI Desktop is running). For SAP administrators, you will note that this HANA client will also need to be included in your frequent patching & maintenance routines to ensure it is aligned with the version of SAP HANA in use.
  • Recommended: Install SAPCRYPTOLIB
    As well as the requirement for the SAP HANA client, it is recommended that you secure the connection to SAP HANA using TLS.
    For this, you will also need SAPCRYPTOLIB installed into the HANA client location on the end-user’s machine.
    This set of libraries allows TLS to be used to encrypt the connection, which is part of your “data-in-transit” security posture.
    You will therefore also need a SAP Personal Security Environment (PSE) file placed onto the end-user’s machine along with SAPCRYPTOLIB.
    These libraries will also need to be included in your frequent patching & maintenance routines to ensure it is aligned with the version of SAPCRYPTOLIB in use on the SAP HANA server.
  • Required: Define Env Variable SECUDIR
    So that the HANA client knows where the SAPCRYPTOLIB libraries (DLLs) have been deployed (if they are being deployed), you should set a SYSTEM environment variable called “SECUDIR” to point to the location of the SAPCRYPTOLIB files.
  • Optional: Enable “Server Validation”
    An optional step is to enable “Server Validation” on the connection properties. It is recommended to enable this, because without server validation it is not possible to know whether the target SAP HANA server can be trusted with the Kerberos ticket that will be sent during logon.
    This also serves as a method of helping to restrict who can connect to which servers, by un-trusting specific servers (maybe old sandbox ones).
    For “Server Validation” to work, the PSE file which is located in the HANA client directory on the end-user’s computer, will need to be populated with the public TLS certificate(s) of the SAP HANA system(s) the end-user will be connecting to and these certificates will need to contain the FQDN that has been used to initiate the connection (e.g. my-virtual-db-hostname.corp.net).
  • Required: Configure Kerberos on HANA server
    The krb5_hdb.conf is configured on the HANA server, according to your AD domain setup and whether AES256 is needed for the AD service account.
    Once krb5_hdb.conf is configured, the AD service account can be tested at the Linux level using the required kinit and ktutil tools.
    The Kerberos keytab can only be created once the AD service account has been created and the required SPN(s) mapped. The method of creating this changes depending on whether AES256 encryption is needed on the service account.
    When using AES256-bit encryption, you cannot simply rotate the key in the keytab; you will need to take an interruption to SSO connectivity while you update the password in AD, then generate a new keytab key and update the keytab on the HANA system.
    The SAP documentation speaks of not needing to restart HANA, but this was not the case on all systems for whatever reason. Be prepared for HANA restarts, or place the files into the /etc folder (changing names and permissions accordingly) until a restart can be done.
    An important point is host name resolution. When you set up the Kerberos keytab, the SPNs you are told to create are prefixed with “hdb/server-host”. When authentication tracing is enabled on HANA at “debug” level, you can see the hostname detection in the trace files: HANA finds its hostname, resolves every canonical name it can from DNS, then looks for matching entries in the keytab file. There is an order to this, but from what I’ve seen it can match on any canonical name, even if the entry in DNS is uppercase and the keytab entry is lowercase.
  • Required: Map HANA User to UPN
    In the HANA system, the database user account(s) need their “External ID” set to the UPN that is passed in the Kerberos ticket. The UPN may not be what you imagine; you might expect “user.name@corp.net”, but in actual fact it may use the realm name, “user.name@REALM”. Testing and tracing in the HANA system with the auth trace turned on will reveal the UPN to you.
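
The mapping itself is a one-line SQL change per database user. A hedged sketch, in which POWERBI_USER and the UPN value are hypothetical placeholders (trace the real UPN first, since it may be the realm form):

```sql
-- Map the UPN from the Kerberos ticket to an existing HANA DB user.
-- 'user.name@CORP.NET' is a placeholder; it may need to be the realm
-- form 'user.name@REALM' - check the authentication trace to be sure.
ALTER USER POWERBI_USER ADD IDENTITY 'user.name@CORP.NET' FOR KERBEROS;

-- Verify the mapping took effect:
SELECT USER_NAME, EXTERNAL_IDENTITY FROM SYS.USERS
 WHERE USER_NAME = 'POWERBI_USER';
```

If the logon still fails after this, the trace files mentioned above will show the exact external identity HANA attempted to match.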

All of the above software and files can be packaged up and distributed to the end-user’s computer using orchestration tools such as SCCM.

Power BI via On-Premises Data Gateway to HANA using Kerberos

Connecting Power BI via the On-Premises Data Gateway to SAP HANA using Kerberos for single sign-on will need something called Kerberos delegation. This delegation technique allows the On-Premises Data Gateway to impersonate the source user account when accessing the target SAP HANA system. It is similar to lending your credit card to your partner (not your PIN, just your card, allowing them to make contactless payments up to a predefined value).
Again, the AD team will need to be involved in a similar way to the “direct to HANA via Kerberos” method.
In this setup, the On-Premise Data Gateway must be running as a domain service user (for delegation to be allowed).

As well as the AD team, you will also need to involve the Power BI administrators (or someone to configure the On-Premise Data Gateway) as some specific changes will need to be made on the gateway machine.

Here is how the architecture will look for a connection from Power BI via the On-Premise Data Gateway to SAP HANA using Kerberos for SSO:

Process Flow:

  1. User logs in to Power BI Desktop.
  2. Authentication via Azure AD (in this example).
  3. User accesses query/connection for SAP HANA configured and published from the On-prem Data Gateway.
  4. On-prem Data Gateway receives the UPN and switches context to impersonate the end-user (account delegation), obtaining a token from AD and sending it on to the HANA system.
  5. HANA decrypts the token using the keytab file, which contains the key for the stored SPN, and maps the decrypted Windows account name (UPN) to the HANA DB account.

For the above setup to work, there are some required steps and some optional steps:

  • Required: Install SAP HANA client
    The main requirement is that the SAP HANA client must be installed on the On-Premise Data Gateway machine. SAP administrators will note that this client also needs to be included in your regular patching and maintenance routines, to keep it aligned with the version of SAP HANA in use.
  • Recommended: Install SAPCRYPTOLIB
    As well as the SAP HANA client, it is recommended that you secure the connection to SAP HANA using TLS.
    For this, you will also need to install SAPCRYPTOLIB into the HANA client location on the On-Premise Data Gateway machine.
    This set of libraries allows TLS to be used to encrypt the connection, which is part of your “data-in-transit” security posture.
    You will therefore also need an SAP Personal Security Environment (PSE) file placed onto the On-Premise Data Gateway machine along with SAPCRYPTOLIB.
    These libraries also need to be included in your regular patching and maintenance routines, to keep them aligned with the version of SAPCRYPTOLIB in use on the SAP HANA server.
  • Required: Define Env Variable SECUDIR
    So that the HANA client knows where the SAPCRYPTOLIB libraries (DLLs) have been deployed (if they are being deployed), you should set a SYSTEM environment variable called “SECUDIR” to point to the location of the SAPCRYPTOLIB files.
  • Optional: Enable “Server Validation”
    An optional step is to enable “Server Validation” in the connection properties. It is recommended, because without server validation it is not possible to know whether the target SAP HANA server can be trusted with the Kerberos ticket that is sent during logon.
    For “Server Validation” to work, the PSE file located in the HANA client directory on the On-Premise Data Gateway machine needs to be populated with the public TLS certificate(s) of the SAP HANA system(s) being connected to, and these certificates need to contain the FQDN used to initiate the connection (e.g. my-virtual-db-hostname.corp.net).
    Although this is optional, I suspect there is a bug in the On-Premise Data Gateway software, since it does not seem possible to use the “test connection” facility without enabling “Server Validation”.
  • Required: Configure Kerberos on HANA server
    The krb5_hdb.conf file is configured according to your AD domain setup and whether AES256 is needed for the AD service account.
    Once krb5_hdb.conf is configured, the AD service account can be tested at the Linux level using the kinit and ktutil tools.
    The Kerberos keytab can only be created once the AD service account has been created and the required SPN(s) mapped. The method of creating it changes depending on whether AES256 encryption is needed on the service account.
  • Required: Map HANA User to UPN
    In the HANA system, the database user account(s) need their “External ID” set to the UPN that is passed in the Kerberos ticket. The UPN may not be what you imagine: you might expect “user.name@corp.net”, but in fact it may use the Kerberos realm, i.e. “user.name@REALM”. Testing in the HANA system with the authentication trace turned on will reveal the UPN to you.
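
The Kerberos configuration step above can be sketched as follows. Everything in this example is an assumption to be confirmed with your AD administrators: the realm, KDC host names, keytab file name, and SPN are placeholders.

```shell
# Hypothetical krb5_hdb.conf sketch; CORP.NET, dc01/dc02 and the SPN are
# example values only.
cat > krb5_hdb.conf <<'EOF'
[libdefaults]
    default_realm = CORP.NET
    default_tkt_enctypes = aes256-cts-hmac-sha1-96
    default_tgs_enctypes = aes256-cts-hmac-sha1-96

[realms]
    CORP.NET = {
        kdc = dc01.corp.net
        kdc = dc02.corp.net
    }
EOF
# Validate the service account and keytab at the Linux level (these need
# a live KDC, so they are shown commented out):
# kinit -k -t hdb.keytab hdb/hana-host.corp.net@CORP.NET
# kvno hdb/hana-host.corp.net@CORP.NET
# klist -ke hdb.keytab
cat krb5_hdb.conf
```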

In a High Availability cluster with 2 nodes for the On-Premise Data Gateway, both nodes will need the same files and config.

Power BI via On-Premise Data Gateway to HANA using SAML

Connecting Power BI via the On-Premise Data Gateway to SAP HANA using SAML is the simplest setup, because it de-couples the HANA system from Azure AD, with the On-Premise Data Gateway becoming the Identity Provider (IdP) in this scenario.
There is no need to make changes to AD, and in this setup the On-Premise Data Gateway service can run as a local computer account.
You will need to involve the Power BI administrators (or someone to configure the On-Premise Data Gateway) as some specific changes will need to be made on the gateway machine.

One important point to note about this setup: the On-Premise Data Gateway can only use one certificate to connect to HANA. This means if you have more than one HANA system, they will all need to trust the same On-Premise Data Gateway certificate.
This is a limitation in the configuration file of the On-Premise Data Gateway.

Here is how the architecture will look for a connection from Power BI via the On-Premise Data Gateway to SAP HANA using SAML for SSO:

Process Flow:

  1. User logs in to Power BI Desktop.
  2. Authentication via Azure AD.
  3. User accesses query/connection for SAP HANA configured and published from the On-prem Data Gateway.
  4. On-prem Data Gateway receives UPN and generates a SAML assertion.
  5. The gateway signs the SAML assertion, including the target user account details, using the IdP key and sends it to the HANA DB server over TLS. HANA validates the signature using the IdP public key, then maps the target user to a DB user ID and performs the query work.

For the above setup to work, there are some required steps and some optional steps:

  • Required: Install SAP HANA client
    The main requirement is that the SAP HANA client must be installed on the On-Premise Data Gateway machine. SAP administrators will note that this client also needs to be included in your regular patching and maintenance routines, to keep it aligned with the version of SAP HANA in use.
  • Recommended: Install SAPCRYPTOLIB
    As well as the SAP HANA client, it is recommended that you secure the connection to SAP HANA using TLS.
    For this, you will also need to install SAPCRYPTOLIB into the HANA client location on the On-Premise Data Gateway machine.
    This set of libraries allows TLS to be used to encrypt the connection, which is part of your “data-in-transit” security posture.
    You will therefore also need an SAP Personal Security Environment (PSE) file placed onto the On-Premise Data Gateway machine along with SAPCRYPTOLIB.
    These libraries also need to be included in your regular patching and maintenance routines, to keep them aligned with the version of SAPCRYPTOLIB in use on the SAP HANA server.
  • Required: Define Env Variable SECUDIR
    So that the HANA client knows where the SAPCRYPTOLIB libraries (DLLs) have been deployed (if they are being deployed), you should set a SYSTEM environment variable called “SECUDIR” to point to the location of the SAPCRYPTOLIB files.
  • Optional: Enable “Server Validation”
    An optional step is to enable “Server Validation” in the connection properties. It is recommended, because without server validation it is not possible to know whether the target SAP HANA server can be trusted with the SAML assertion that is sent during logon.
    For “Server Validation” to work, the PSE file located in the HANA client directory on the On-Premise Data Gateway machine needs to be populated with the public TLS certificate(s) of the SAP HANA system(s) being connected to, and these certificates need to contain the FQDN used to initiate the connection (e.g. my-virtual-db-hostname.corp.net).
    Although this is optional, I suspect there is a bug in the On-Premise Data Gateway software, since it does not seem possible to use the “test connection” facility without enabling “Server Validation”.
  • Required: Create HANA SAML Provider
    In the HANA system, a new SAML provider needs to be created and assigned the IdP certificate that is to be trusted.
  • Required: Map HANA User to UPN
    In the HANA system, the database user account(s) need SAML authentication enabled and a mapping to an allowed IdP.
    The UPN may not be the same as the UPN used for any Kerberos setup. You might expect “user.name@REALM”, but in fact it may be the actual domain name, “user.name@corp.net”. Testing in the HANA system with the authentication trace turned on will reveal the UPN to you.
    Once a provider is mapped, the user’s HANA account needs updating with their external ID (the UPN).
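
The HANA-side SAML steps above can be sketched in SQL. This is a hypothetical example: the provider name, certificate subject/issuer, user name, and UPN are all placeholders, and the exact syntax should be verified against the SAP HANA SQL reference.

```shell
# Hypothetical sketch of the HANA-side SAML setup; PBI_GATEWAY,
# PBI_REPORT_USER, the subject/issuer and the UPN are example values.
cat > saml_setup.sql <<'EOF'
-- Trust the gateway's IdP certificate via a new SAML provider
CREATE SAML PROVIDER PBI_GATEWAY
    WITH SUBJECT 'CN=pbi-gateway-idp'
    ISSUER 'CN=pbi-gateway-idp';
-- Enable SAML for the DB user and map the external identity (the UPN)
ALTER USER PBI_REPORT_USER ENABLE SAML;
ALTER USER PBI_REPORT_USER ADD IDENTITY 'user.name@corp.net'
    FOR SAML PROVIDER PBI_GATEWAY;
EOF
# Execute with hdbsql or the SQL console on a live system, e.g.:
# hdbsql -n hana-host:30015 -u SYSTEM -p "$PASS" -I saml_setup.sql
cat saml_setup.sql
```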

In a High Availability cluster with 2 nodes for the On-Premise Data Gateway, both nodes will need the same files.

A new private key and certificate will need to be generated for the On-Premise Data Gateway. Whilst the Microsoft documentation for SAML setup shows using OpenSSL to create the certificate, it is entirely possible to do this in PowerShell (see my other post here which will save you much hassle 😉 ).

Another step in the Microsoft documentation is to create a Certificate Authority key, then create a signing request for a new non-CA key. This is just not required with SAML: a certificate chain is not needed and HANA does not verify the chain.
Instead, just create a CA key and certificate (again, see my other post here). If you use my linked PowerShell method you don’t even need to manually transfer keys around; just create and import into the Microsoft Certificate Store (for the local computer).
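
For reference, the OpenSSL route can be sketched as below. The common name and validity period are example values; the second command prints the thumbprint in the lowercase, colon-free form that reportedly works in the gateway configuration file.

```shell
# Sketch: create a self-signed IdP key and certificate with OpenSSL
# (the PowerShell route in my other post avoids this); values are examples.
openssl req -x509 -newkey rsa:2048 -nodes \
  -keyout gateway_idp.key -out gateway_idp.crt \
  -days 730 -subj "/CN=pbi-gateway-idp"
# Print the SHA1 thumbprint in lowercase with colons removed, for the
# SapHanaSAMLCertThumbprint setting in the gateway config file:
openssl x509 -in gateway_idp.crt -fingerprint -sha1 -noout \
  | cut -d= -f2 | tr -d ':' | tr 'A-F' 'a-f'
```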

In the Microsoft documentation there are a couple of additional errors/omissions that may catch you out:

  • The On-Premise Data Gateway configuration file is prefixed with “Microsoft.”. This was missing in the documentation.
    It should be: Microsoft.PowerBI.DataMovement.Pipeline.GatewayCore.dll.config
  • The thumbprint of the certificate should be in lowercase. It is not known if this is actually required, but ad-hoc Google searching revealed some customers were not able to get it to work with an uppercase certificate thumbprint.
  • When adding the certificate thumbprint to the Gateway config file, the file is XML format.
    This means you need to change the closing tag of the “setting” element and add a child “value” element.
    Overall it should look like this for the thumbprint:

    <setting name="SapHanaSAMLCertThumbprint" serializeAs="String">
        <value>the-thumbprint-here</value>
    </setting>

Troubleshooting

During the setup process, do not expect it to be straightforward.
From experience, the following areas will cause issues:

  • Knowledge of the Active Directory KDC servers.
    Setting up the Kerberos configuration on the HANA server will need answers from the AD administrators.
  • Knowledge of the AD domain federation.
    Setting up the Kerberos configuration on the HANA server will need answers from the AD administrators.
  • Knowledge of public key cryptography.
    Creation of the IdP SAML keys is tricky and the documentation shows a convoluted method with added confusion.
  • Lack of accurate documentation.
    Some of the Microsoft documentation is not correct or accurate enough.
  • On-Premise Data Gateway trace log files
    These are difficult to get at as they have to be downloaded and un-zipped each time.
  • HANA system fails to find the Kerberos config and keytab, with the only resolution being to place them in /etc or perform a full HANA system restart.

My best advice is:

  1. Test the Kerberos setup on the HANA server using kinit and the other tools. If you are using Kerberos this must work, and the kvno (key version number) test will report “valid”.
  2. Use the On-Premise Data Gateway trace logs (if using On-Premise Data Gateway).
    Once you are sure that it has selected the IdP certificate and is trying to talk to HANA, then switch to the HANA traces.
  3. Use the HANA authentication trace at the “debug” level, then check the traces.
    This is useful once you know that the On-Premise Data Gateway is actually trying to talk to HANA (if you are using the gateway); if you are using SS2DB, use these traces straight away.
    These traces will tell you the decoded UPN and whether HANA has found an appropriate user account mapping (or SAML provider, if using SAML).
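
Switching on the HANA authentication trace can be sketched as follows; the ini-file section and key are as I recall them from the SAP documentation, so verify them for your HANA version, and remember to set the level back afterwards.

```shell
# Sketch: SQL to raise the HANA authentication trace to "debug"
# (set it back to the default once troubleshooting is complete).
cat > enable_auth_trace.sql <<'EOF'
ALTER SYSTEM ALTER CONFIGURATION ('indexserver.ini', 'SYSTEM')
    SET ('trace', 'authentication') = 'debug' WITH RECONFIGURE;
EOF
# hdbsql -n hana-host:30015 -u SYSTEM -p "$PASS" -I enable_auth_trace.sql
cat enable_auth_trace.sql
```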

Thanks for reading and good luck!

References:

https://learn.microsoft.com/en-us/data-integration/gateway/service-gateway-onprem-indepth

https://learn.microsoft.com/en-us/power-bi/guidance/whitepaper-powerbi-security#vnet-connectivity-preview—coming-soon

SAP Note 2093286 – Migration from OpenSSL to CommonCryptoLib

SAP Note 2303807 – SAP HANA Smart Data Access: SSO with Kerberos and Microsoft Windows Active Directory

SAP Note 1837331 – HowTo configure Kerberos SSO to SAP HANA DB using Microsoft Windows Active Directory

https://learn.microsoft.com/en-us/power-bi/connect-data/service-gateway-sso-kerberos-sap-hana

https://learn.microsoft.com/en-us/power-bi/connect-data/service-gateway-sso-saml

https://en.wikipedia.org/wiki/Kerberos_(protocol)

https://en.wikipedia.org/wiki/Security_Assertion_Markup_Language

https://help.sap.com/docs/SAP_HANA_PLATFORM/b3ee5778bc2e4a089d3299b82ec762a7/1885fad82df943c2a1974f5da0eed66d.html?version=2.0.03&locale=1885fad82df943c2a1974f5da0eed66d.html

https://help.sap.com/docs/SAP_HANA_PLATFORM/6b94445c94ae495c83a19646e7c3fd56/c786f2cfd976101493dfdf14cf9bcfb1.html?version=2.0.03

https://help.sap.com/docs/SAP_HANA_PLATFORM/b3ee5778bc2e4a089d3299b82ec762a7/db6db355bb571014b56eb25057daec5f.html?version=2.0.03&locale=1885fad82df943c2a1974f5da0eed66d.html

https://social.technet.microsoft.com/wiki/contents/articles/36470.active-directory-using-kerberos-keytabs-to-integrate-non-windows-systems.aspx

Using Single Sign-on with the Power BI Gateway

https://blogs.sap.com/2020/03/22/sap-bi-platform-saml-sso-to-hana-database/

FREE SAP Extended Maintenance

Did you know…

SAP S/4HANA On-Premise customers on legacy versions get free extended maintenance when they move to a RISE with SAP subscription.

That’s correct “legacy SAP S/4HANA“, we’re already at that point where S/4HANA is starting to lose versions.
We are talking specifically about customers on S/4HANA versions 1709, 1809, and 1909.

Customers subscribed to RISE with SAP, or to legacy subscription models (for example, a subscription for SAP HANA Enterprise Cloud), can take advantage of extended maintenance being included at no additional fee.

Here are the significant dates for those mentioned On-Premise releases:

Release   End of Mainstream Maint   End of Extended Maint
1709      31.12.2022                31.12.2025
1809      31.12.2023                31.12.2025
1909      31.12.2024                31.12.2025
2020      31.12.2025                ?
2021      31.12.2026                ?
2022      31.12.2027                ?
2023      ?

Wow, 2025 is going to be a busy year for SAP. Have a guess what release those customers will be moving to (assuming they choose S/4HANA Private Cloud Edition) if they have not already started? See my previous post here for my thinking on that one.

Reference link: https://news.sap.com/2022/09/new-sap-s4hana-release-maintenance-strategy/

SAP’s Private Innovations Made Public

In July 2023, SAP announced that future innovations and capabilities will only be available in SAP public cloud and SAP private cloud with a RISE subscription.

The audio version is available to listen to on the SAP Press site, but my favourite go-to for quoted interviews is the diginomica site:

https://diginomica.com/cloud-revenue-growth-misses-sap-q2-future-bright-ai-according-ceo-klein

…It’s also very important to emphasize that SAP’s newest innovations and capabilities will only be delivered in SAP public cloud and SAP private cloud using RISE with SAP as the enabler. This is how we will deliver these innovations with speed, agility, quality and efficiency. Our new innovations will not be available for on-premise or hosted on-premise ERP customers on hyperscalers.

What can we take away from this statement?

Firstly, we should note the absence of the product name “S/4HANA”.  It’s not like SAP to miss the opportunity to include the product name in a discussion, yet not once but twice the opportunity to insert “S/4HANA” into the conversation went unused.
What the quote is saying is exactly that:  “…SAP’s newest innovations and capabilities will only be delivered in SAP public cloud and SAP private cloud using RISE with SAP…”.

Short version: if you buy RISE, you get the newest innovations and capabilities.  This is not explicitly saying they will be included in the S/4HANA product.

This is because in SAP’s own marketing material, RISE with SAP is a solution that includes:

  • SAP S/4HANA Cloud (one of the two “cloud” editions).
  • Business Process Transformation
  • Business Platform and Analytics
  • SAP Business Network
  • Outcome Driven Services and Tools

There are plenty of places for innovation to happen in that list, and it doesn’t necessarily mean S/4HANA specifically.

Also, we have to consider this fact: for SAP to branch the S/4HANA code base between S/4HANA Private Cloud Edition and S/4HANA On-Premise Edition would create a lot of development and support effort from now until around 2040.

What if an “on-premise” S/4HANA customer, already at a recent S/4HANA version, decided to buy RISE with SAP?  If the code base was massively different it would be a system migration to lift it into a comparable system in RISE.

Instead, by providing these new innovations in some form of BTP hosted service which would only be accessible via RISE with SAP, the S/4HANA code can remain as it is (clean core lovers would like this), albeit with some special user-exits or extensibility points or even an Addon; then the new innovations would be provided by future-proof, containerised BTP services.

This would also allow SAP to leave open the option of eventually offering these innovations, at a much later point in time, to non-RISE customers at a premium.  Especially if they are truly innovative: who would give up the option to earn more money by simply un-restricting access for the wider customer base to what would, by then, be old innovations?

The second point we note is that these services may not be included in RISE with SAP for free.

It is, after all, a subscription based service.

“…using RISE with SAP as the enabler.”

Access is provided/enabled through the RISE subscription, but it sounds to me like this will be another request ticket with some contractual costs or additional consumption credits.

Why Restrict Innovations?

Another line of questioning has to be: why? – Why restrict these new innovations from on-premise customers?

Apart from the obvious suggestion that it simply adds pressure for customers to take a RISE with SAP subscription, there is another idea and it adds to the thinking that the innovation is not being delivered directly in S/4HANA.


These new innovations may not be easily integrated with an on-premise solution.

For “on-premise” we have to bear in mind that it includes both systems hosted physically on-premise (in a customer’s own datacentre) in geographical locations far from any SAP Cloud entry/exit points, and also those systems hosted in hyperscalers.  They are one and the same version of SAP S/4HANA On-Premise.

SAP BTP hosted services need the SAP Cloud Connector as integration between BTP and an on-premise solution.

The SAP Cloud Connector is a secure one-way TLS tunnel, over which bi-directional application comms can flow.
It is not built for very large datasets and definitely not for precise real-time integration.

For customers hosted under a RISE with SAP subscription, maybe there is some new connectivity solution that can be deployed by SAP that allows a more secure, lower latency connectivity with true bi-directional flow between the SAP system and SAP BTP services.  This is what would be needed to provide innovations that require true real-time AI interaction with large data sets.

Maybe this is the reason why on-premise customers will not get these new innovations outside of RISE with SAP?