Clouds, Data Models, and Experiences – Three Entities, One Topic

After having covered some press releases about new releases and having commented on some interesting organizational changes, it is time to have a look at another topic – the need for consistency in a suite of cloud products.

Consistency not only in the most obvious part of a family of products and solutions – the user interface – but also in the more important aspect: the data model. If you wonder how this relates to customer experience, I invite you to read on.

This post is actually spurred by a brief conversation that I had with Jon Reed of Diginomica about this very topic during one of the recent CRM Playaz episodes. By the way, if you do not yet listen in to the LinkedIn conversations of the CRM Playaz, Paul Greenberg and Brent Leary, who discuss important developments and current events in the world of CRM – then you should.

Really!

But I digress.

Back to the topic.

The question is whether it is necessary to have a unique data model or not. And this question might be answered differently, depending on the definition of ‘data model’.

There is no doubt that a unique data model across applications is very helpful, actually a necessity. Where there is doubt is whether this data model needs to be defined at the database level in order to be really helpful.

My point of view is that it does not need to be defined at the database level.

This point of view might contradict some ‘common sense’ wisdom and the strategy that some very successful companies are pursuing, including Oracle – as it seems – and Zoho. In the good old days before the advent of the ‘New Dimension’ products, SAP had one, too. On top of it sat R/3.

Just to be sure: Having a common ‘data model’ across applications is a huge advantage. There is no doubt about this.

But let’s dig into the two main possibilities for how to achieve and implement one.

One possibility is to model and fix it at the database level: to define and model it in a way that every attribute and relation has its one-to-one representation in the database.

This model most certainly has some advantages. It offers one consistent and unique model for describing what is important for and about organizations and (business) transactions. It gives utmost control and precision over semantics, and it makes it very easy to understand what a business concept is about. It also performs well – if not normalized too far. This is the winning model, if it is correct and thought through – and can be kept stable. As I said above, it is the concept that Oracle and Zoho are pursuing. And I am not the one to say that either of these example companies has not thought through this approach of defining and implementing an enterprise data model.
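To make this first option a bit more tangible, here is a minimal sketch of what fixing the model at the database level can look like, written in Python with SQLAlchemy. The entities, attributes and the in-memory SQLite database are illustrative assumptions of mine, not any vendor's actual schema; the point is simply that every application in the suite works against the very same physical tables.

```python
# Minimal sketch: the common data model is fixed at the database level,
# so every application reads and writes the same tables and each attribute
# and relation has exactly one physical representation.
# Entity and column names are illustrative only.
from sqlalchemy import Column, ForeignKey, Integer, String, create_engine
from sqlalchemy.orm import declarative_base, relationship

Base = declarative_base()

class BusinessPartner(Base):
    __tablename__ = "business_partner"            # shared by CRM, billing, service, ...
    id = Column(Integer, primary_key=True)
    name = Column(String(120), nullable=False)
    country = Column(String(2))                   # ISO 3166-1 alpha-2 code

class SalesOrder(Base):
    __tablename__ = "sales_order"
    id = Column(Integer, primary_key=True)
    partner_id = Column(ForeignKey("business_partner.id"), nullable=False)
    status = Column(String(20), default="open")
    partner = relationship(BusinessPartner)       # the relation is fixed in the schema

if __name__ == "__main__":
    engine = create_engine("sqlite:///:memory:")  # stand-in for the one shared database
    Base.metadata.create_all(engine)              # one schema, used by all applications
```

The strength of this approach is also its constraint: changing the shared schema affects every application at once, which is why it only wins if the model is correct and can be kept stable.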

In fact I am very sure that they did!

And they did even more. They did something that other companies, including SAP and, as far as I can see, Salesforce, omitted to do for too long after cloud – and therefore siloed – solutions emerged.

The advantage of cloud solutions and best-of-breed solutions is that they focus on solving few problems, but solve these very well; and customers get just what they want, without the necessity to buy much functionality that they neither want nor need.

However, with this comes a challenge: the challenge of diverging data models. Each of the applications, even within the same family of cloud applications, often has a different data model. This is due to the fact that they are optimized for different tasks, so it can be viewed as quite natural.

Just that it isn’t. It is the easy way.

And it doesn’t work in a platform economy.

Not at all.

As I have written before, a platform consists of four pillars:

  • The technology platform
  • Tools that enable and provide insight
  • Productivity tools
  • And an ecosystem

The latter three pillars suffer if the first does not provide a unique, yet extensible, data model with well-defined semantics.

If the latter three suffer, so will business applications built on top of the platform.

As ecosystem players provide their own applications and extensions to existing applications, it is necessary to have a common language that describes what business entities look like, how they relate to each other, and how they are governed.

In times when make-or-buy decisions are frequently decided in favour of buy, the right way to go is to offer a business metadata model that fulfils three main conditions (a small sketch follows the list):

  • It provides a definition of the main business objects from a business point of view.
  • It is extensible.
  • It allows for centralized maintenance across applications within an ecosystem.
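To illustrate what such a business metadata model could look like, here is a small, hypothetical Python sketch. The class and attribute names are my own assumptions, not a description of any vendor's metadata repository; it merely shows the three conditions in code: business-level object definitions, extensibility, and a central registry maintained across the ecosystem.

```python
# Hypothetical sketch of a business metadata model meeting the three conditions:
# business-level definitions, extensibility, and centralized maintenance.
from dataclasses import dataclass, field
from typing import Dict

@dataclass
class Attribute:
    name: str
    type: str                     # business-level type, e.g. "Text", "Amount"
    description: str = ""         # the semantics, not the storage format

@dataclass
class BusinessObject:
    name: str                     # e.g. "Customer", "SalesOrder"
    attributes: Dict[str, Attribute] = field(default_factory=dict)

    def extend(self, attr: Attribute) -> None:
        """Condition 2: partners and customers may add attributes without
        touching the underlying applications' database schemas."""
        self.attributes[attr.name] = attr

class MetadataRegistry:
    """Condition 3: one central place where all applications and ecosystem
    partners look up and maintain the shared business language."""
    def __init__(self) -> None:
        self._objects: Dict[str, BusinessObject] = {}

    def register(self, obj: BusinessObject) -> None:
        self._objects[obj.name] = obj

    def get(self, name: str) -> BusinessObject:
        return self._objects[name]

# Condition 1: the main business objects, defined from a business point of view.
registry = MetadataRegistry()
customer = BusinessObject("Customer", {
    "name": Attribute("name", "Text", "Legal name of the customer"),
})
registry.register(customer)
# An ecosystem partner extends the shared definition centrally:
registry.get("Customer").extend(Attribute("loyaltyTier", "Code", "Partner-defined tier"))
```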

Now, it should be documented as well, but that is another story …

At SAP, admittedly in ancient times, this was a job done by the data dictionary (minus the documentation); partly done, to be honest. The data dictionary was an abstraction of the physical data model to describe business entities. It was just more geared towards abstracting from the database, as opposed to defining a business language.

There are different ways to implement this business metadata model in a cloud-first world.

Microsoft developed the Common Data Model, which enables no- and low-code development across its ecosystem. As it also provides the development tools and its own environments, Microsoft is essentially leading the pack.

Salesforce promotes its own Canonical Data Model with industry flavors. Salesforce’s challenge is that it is a CRM company and does not cover the full value chain. And there is another one, which I’ll mention a bit later.

Zoho has gone forward similarly, staying in full control of its own destiny by not having acquired a single vendor so far (which makes for an admirable strategy and success story). The company builds its apps around the concept of what it calls data pillars, which are controlled by a few apps that act as a database. Other apps use this database. Within its ecosystem, these apps can be enhanced by means that stretch from no-code to professional coding. One of Zoho’s challenges is that the ecosystem still needs to be strengthened to be really on eye level with the big four.

SAP is currently working on SAP Graph, which is a wrapper around the APIs of SAP’s existing products, creating a harmonized, business-oriented API layer that can and should be used by application developers. They are coming a bit late, but with a good and important approach. Additionally, SAP is working on SCP-based microservices that manage the access to and usage of business objects across applications. Done right, these services could also have the ability to extend the business objects of the underlying and connected applications. One challenge is to keep these services in sync with SAP Graph. Ideally, they are the same.
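To give an idea of what such a harmonized API layer means for an application developer, here is an illustrative Python sketch. The host, path, entity and field names are placeholders I made up, not SAP Graph’s actual API; what matters is that one business-oriented call shields the developer from knowing which underlying product stores the data.

```python
# Illustrative sketch of consuming a harmonized, business-oriented API layer.
# The base URL, path and field names are placeholders, NOT SAP Graph's real API;
# the point is that one call returns a unified business object regardless of
# which underlying system (ERP, CRM, ...) actually owns the data.
import requests

GRAPH_BASE_URL = "https://graph.example.com/api"   # placeholder, not a real endpoint
ACCESS_TOKEN = "<oauth-token>"                     # obtained via the platform's auth flow

def get_customer(customer_id: str) -> dict:
    """Fetch a customer as one harmonized business object."""
    response = requests.get(
        f"{GRAPH_BASE_URL}/Customers/{customer_id}",
        headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
        timeout=10,
    )
    response.raise_for_status()
    return response.json()

if __name__ == "__main__":
    customer = get_customer("4711")
    # The caller never needs to know which backend system owns which field.
    print(customer.get("name"), customer.get("salesOrders"))
```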

The combination of SAP Graph and the microservices can be a real winner if the services not only allow the management of data access but also the customer- or partner-specific extension of the data model and, with it, of the corresponding web services.

It cannot be overestimated: with the help of a common data model and semantics, customers, vendors, and partners can easily and consistently extend application families to serve their customers and users. This is the foundation for any attempt at providing positive and lasting experiences.

On top of their own models, and jointly, Microsoft, Adobe, and SAP, together with a growing number of additional partners, are working on the Open Data Initiative (ODI), which is ‘a common data model, and a common data lake’ that helps avoid data silos and the effort of integrating them.

Having this data lake, based upon well-defined semantics, and a well-defined API as given by a single data model across all applications of an ecosystem, is what enables the creation of engagements that can result in memorable experiences. Everything, and I mean everything, that creates insight and enables corresponding action powering engagements and experiences depends on a data model like this. The power of ODI cannot be overestimated.

The strength of ODI lies in its being cross-ecosystem, as it spans at least two major ones, thereby bringing the concept of a unified data model to a whole new level. Its weakness lies in not covering some other important ecosystems. But then, this post is not about the deficiencies of an initiative. It is about the importance of having and offering a joint data model and API for ecosystems.

The importance of this cannot be overestimated either. And decision makers need to take a hard look at where platforms are moving with regard to this topic.

Republished with author's permission from original post.

Thomas Wieberneit

Thomas helps organisations of different industries and sizes to unlock their potential through digital transformation initiatives using a Think Big - Act Small approach. He is a long-standing CRM practitioner, covering sales, marketing, service, collaboration, customer engagement and experience. Coming from the technology side, Thomas has the ability to translate business needs into technology solutions that add value. In his successful leadership positions and consulting engagements he has initiated, designed and implemented transformational change and delivered mission-critical systems.
