How to Navigate the 5 Stages of Organizational Maturity in Digital Transformation



In contemplating digital transformation program investments, executives ask two things: What is the current state costing us, and does it make economic sense to fix it?

Transformations seek to improve some aspect of the customer experience or the backend processes that enable that experience. I like to say you cannot have acts of heroics upstream and expect a seamless customer experience downstream. If the people who are creating value for customers cannot do their work effectively, there is a downstream impact. This means that a transformation that focuses only on the customer experience, without looking at upstream enabling processes and efficiencies, will not achieve the desired outcome. Since digital transformation represents a profound change in how an organization operates, it is not surprising that organizational maturity is a key ingredient for success. Yet many organizations are unable to measure maturity as it relates to the development of the specific capabilities that matter to the program.

One challenge is deciding where to begin tactically. A good starting point is to assess current capabilities in the context of organizational maturity and the desired future state. That exercise will help decision-makers determine whether their organization can get from where it is to where it wants to be.

Multiple Dimensions to Maturity

Organizational maturity can be measured across multiple dimensions, including culture, people, process, and technology. Each of these dimensions contains other elements. For example, technology can include information architecture, infrastructure, integration, quality of deployment, data quality, and more. The level of maturity may not be equivalent across these areas – frequently there are pockets of more mature capability in some parts of the organization and less mature capability in others. Feedback mechanisms and measurements that validate decisions are part of any mature process; metrics-driven (or data-driven) decision-making is therefore one of the indicators of maturity. Less mature processes generally do not have metrics that are aligned with organizational goals in a way that allows for meaningful course corrections.

Maturity can vary across business units, functions, departments, or even processes, and a digital transformation is difficult to carry out if they are wildly divergent – especially when considering that a typical customer experience will cut across multiple parts of the enterprise. Weakness in product information processes, data architecture, or quality will have a downstream impact on sales, service, support, and e-commerce.

Picture a dysfunctional culture – one where politics and hidden agendas drive decisions – attempting to implement a complex process with multiple technologies. One reason that organizational maturity is unlikely to be consistent across an entire enterprise is that not all processes are critical to all functional areas; some will be more advanced than others simply due to the requirements of the business. Across all dimensions and departments, organizational maturity is marked by the ability to execute, learn from experience, and adapt to change.

Maturity models originated with the software Capability Maturity Model (CMM) developed in the 1980s, which was later updated to the Capability Maturity Model Integration (CMMI) to incorporate agile approaches and to focus on objectives rather than process for its own sake[1].

Generally, maturity models articulate the characteristics of five stages of maturity, from initial (least mature) to optimized (most mature). In the framework that my company (EIS) has developed, the stages are:

  1. Unpredictable – little or no formal processes; inconsistent outcomes, haphazard methodologies, one-off solutions
  2. Aware – the initial stage of attempting to standardize approaches, but lacking consistency and repeatability
  3. Competent – greater levels of sophistication and repeatability
  4. Synchronized – increasing coordination across departments, functions, or applications
  5. Choreographed – finely tuned processes that integrate various elements to orchestrate an experience

Those categories are necessarily abstract since they apply to many different aspects of the organization’s information ecosystem. The categories become more specific and concrete when applied to a specific component of that ecosystem. 
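As a rough sketch of how such an assessment can be recorded, the five stages can be treated as an ordinal scale, with a current and a target level per dimension. The dimension names and scores below are hypothetical and purely illustrative; they are not part of the EIS framework itself.

```python
# Illustrative sketch: record a maturity assessment per dimension on
# the five-stage ordinal scale described above. All scores are made up.
STAGES = ["Unpredictable", "Aware", "Competent", "Synchronized", "Choreographed"]

def stage_name(level: int) -> str:
    """Map a 1-5 maturity level to its stage name."""
    return STAGES[level - 1]

# Hypothetical assessment: (current level, target level) per dimension.
assessment = {
    "Content Management": (2, 4),
    "Customer Data": (1, 4),
    "Governance & Compliance": (3, 4),
}

for dimension, (current, target) in assessment.items():
    print(f"{dimension}: {stage_name(current)} -> {stage_name(target)} "
          f"(gap: {target - current} levels)")
```

The gap column is the point of the exercise: a wide gap in a foundational dimension signals where the transformation plan needs the most resources.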

For example, to generate a truly personalized experience (whether for an employee or a customer), many pieces need to be orchestrated: Product information, content, customer information, and knowledge – tied together with governance and decision-making processes based on measurable outcomes. 

The following is an example maturity model that articulates each of the components and stages for personalization.

Content Management

  1. Unpredictable – information sprawl: chaotic, inconsistent, and unguided
  2. Aware – stable and labor-intensive rudimentary lifecycle; siloed activities with department-level incentives; one-off experiments; low reuse
  3. Competent – comprehensive content lifecycle; rudimentary sharing; broad audience segmentation
  4. Synchronized – adaptive content repurposed across applications and channels; content effectiveness reporting
  5. Choreographed – automation-supported production enables meaningful personalization; accountability for content effectiveness

Content Component Architecture

  1. Unpredictable – static, monolithic documents; tool- and device-specific content structures
  2. Aware – templates and copies with burdensome tagging requirements
  3. Competent – case-driven component reuse; rules-based tagging; limited customization
  4. Synchronized – multi-dimensional modeling of content in selected channels for a few defined customer groups
  5. Choreographed – high-fidelity multi-dimensional content model across multiple content asset types and channels

Offer Architecture

  1. Unpredictable – one size fits all
  2. Aware – basic audience segmentation and understanding of personalization drivers
  3. Competent – multi-dimensional audience segmentation applied to limited, manual offer combinations
  4. Synchronized – dynamic audience segmentation based on explicit and hidden dimensions (LDA); foundational messaging framework
  5. Choreographed – offer components dynamically recombine for a range of business outcomes (registration, conversion, retention)

Audience Architecture

  1. Unpredictable – characteristic generalizations based on assumptions
  2. Aware – broad, silo-specific segment characterizations; reasonable but unsubstantiated customer journeys
  3. Competent – clusters defined by attributes; simplistic journeys supported by data evidence
  4. Synchronized – predefined rules for associating customer and context attributes; accurate journey stage recognition
  5. Choreographed – individual-in-the-moment; predictable customer behaviors leading to proactive engagement and decision-making

Customer Data Maturity

  1. Unpredictable – data captured in inaccessible formats; identity hard to determine; unknowable data quality
  2. Aware – multiple disconnected systems; uneven, form-based data collection practices
  3. Competent – custom multi-system querying; non-normalized data; linked offline/online identities
  4. Synchronized – 360° omnichannel profiles; preprogrammed responses to behavioral signals
  5. Choreographed – insight-driven practices; real-time machine learning behavior response

Product Information Management

  1. Unpredictable – unreliable manual processes and data; navigation-only taxonomy; few useful attributes
  2. Aware – out-of-sync systems; few data quality processes; onboarding templates; classification taxonomy with category-specific attributes
  3. Competent – content workflows triggered by product launch; normalized and controlled attribute values
  4. Synchronized – attribute-driven cross-sell and merchandising; vendor data collaboration
  5. Choreographed – seamless cross-channel launch; responsive, personalized product content

Knowledge Management

  1. Unpredictable – haphazard; application-limited; no collaborative tools
  2. Aware – intentional but ineffective; out-of-box collaborative tools; knowledge oligarchies; isolated search experiences
  3. Competent – formal harvesting and distribution; expertise networks; taxonomy to classify knowledge generation/inflow
  4. Synchronized – intentional knowledge operations integrated into business processes; community management; reliable e-discovery
  5. Choreographed – seamless, habitual collaboration; business value added at each touchpoint; ongoing tuning and enhancement

Governance & Compliance

  1. Unpredictable – non-existent, punctuated with overreactive swings in involvement
  2. Aware – departmental expectations; territorialism; individualistic compliance by passionate SMEs
  3. Competent – funded, centralized oversight and policy; repeatable processes; assigned responsibilities; analytics reporting
  4. Synchronized – specialized stewardship service teams; culture of personal responsibility; cross-functional decision-making
  5. Choreographed – integrated operations driven by business value; unconscious competency; proactive agenda-driven leadership

Information Architecture

  1. Unpredictable – idiosyncratic hoarding; enterprise blindness; one-trick technologies
  2. Aware – application-specific data stores with single-project collaboration; tool customizations at capacity
  3. Competent – enterprise taxonomy; coordinated product and content workflows; accessible analytics; systematic information sharing
  4. Synchronized – enterprise ontology; scalable services with some automation; data-driven decision making; workflow synchronization; non-destructive data streams
  5. Choreographed – comprehensive unified attribute models; personalization and 360° views as the norm; proactive response

How Far You Need to Go Depends on Where You Begin

This example of a maturity model identifies requirements for true personalization at scale. The model does not imply that an organization needs to be at level 5 across every department and process. The required level is determined in part by customer expectations and business needs. It can take an organization a year to move from one maturity level to the next, and certain elements require foundational capabilities to be in place before that transition can happen. If higher levels of customer support are needed (not indicated in this model), one of the prerequisites is good knowledge management capability. Good knowledge management frequently requires good search functionality. Good search requires a certain level of content management capability, and so on.
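That prerequisite chain – content management underpins search, which underpins knowledge management, which underpins higher levels of customer support – can be pictured as a dependency graph whose topological order gives a sensible build sequence. A minimal sketch, with the graph structure taken from the example above but otherwise illustrative:

```python
# Illustrative sketch: order capability investments by prerequisite.
# The chain mirrors the example in the text: content management underpins
# search, which underpins knowledge management, which underpins support.
from graphlib import TopologicalSorter  # Python 3.9+

# Each capability maps to the set of capabilities it depends on.
prerequisites = {
    "search": {"content management"},
    "knowledge management": {"search"},
    "customer support": {"knowledge management"},
}

order = list(TopologicalSorter(prerequisites).static_order())
print(order)
# Prerequisites always come before the capabilities that need them.
```

With a richer (non-chain) graph, the same ordering exposes which foundational capabilities unblock the most downstream work.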

A transformation program needs to break down the components required to achieve the future state. If those elements are missing, the transformation will fall short of the expected outcomes or fail outright. It is therefore critical to understand both the elements that support the future state and the current level of maturity. The plan has to answer not only “How far do we need to go?” but also “Where are we starting from?”

One objective of a current state assessment is to harvest the mature processes and understand how they can be brought into the less mature parts of the organization, whether they are departments or functional areas. But it’s also important to understand what is realistic given current processes, technologies, data quality, governance, change management, and infrastructure.

The chief complaint against maturity models is that they are not actionable – it's the “So what?” question. Why do we need this? What does it do for us? The key benefit is that a maturity model shows the current state as well as the conditions that the future state will require. If the gap between the current conditions and the future required conditions is very wide, it will take more resources to close. In any case, this analysis can provide clarity on what needs to be done.

Maturity of Collaboration – Information Retrieval

Digital transformations are, in essence, a new way of using corporate information, and all the elements must work together. The greater the level of maturity at which an organization operates, the easier it will be to shift some of the actions to digital implementation.

Much value creation comes from informal collaboration; however, some parts of the organization are tasked with the direct application of formal knowledge. In many organizations, people struggle with the inability to find the information they need. They may have to contact other people to get questions answered or send repeated emails requesting information. Often, the same or similar information gets shared again and again through one-to-one or one-to-many communications.

Problem solving generally happens in small groups and teams. Improving the efficiency of groups and teams comes from providing tools to interact and share knowledge and insights as well as to access and build upon prior work. Users can ask their colleagues, but they should also be able to ask the systems directly. They need to be able to access those systems easily and get reliable information.

As an example, consider how search is used to find information needed to keep a process moving. Perhaps the need is to find information for a customer while they are on the phone, looking up reference material or answers in different locations – from file shares, an intranet, or ideally a well-designed knowledge base. If an organization has not structured its information properly, the agent will not be able to find the answer – and neither will a chatbot or other virtual assistant.

Another challenge in finding answers is that people use different words to say the same things, and they use the same words to mean different things. Language is ambiguous. If content is not tagged with exactly the terms users search for, those terms must be mapped to the preferred terms. Some search tools handle a portion of this using standard language models (common synonyms for typical use cases). But with specialized terminology, these relationships need to be defined explicitly. This area of maturity is foundational to almost all others – the information architecture of the enterprise.
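At its simplest, that mapping is a synonym ring that normalizes variant terms to a single preferred term. A minimal sketch, assuming a hand-built lookup table for specialized terminology (the entries below are illustrative):

```python
# Illustrative synonym ring: map variant query terms to a preferred
# term so a search for any variant matches content tagged with the
# preferred form. The table is hand-built and hypothetical.
PREFERRED_TERMS = {
    "sow": "statement of work",
    "proposal": "statement of work",
    "statement of work": "statement of work",
}

def normalize(term: str) -> str:
    """Return the preferred term for a query term, or the term unchanged."""
    key = term.strip().lower()
    return PREFERRED_TERMS.get(key, key)

print(normalize("SOW"))  # -> statement of work
```

In practice this table is one small slice of a managed thesaurus; the point is that the relationships for specialized vocabulary must be curated rather than inferred.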

Maturity of Information Structures – from Taxonomy to Ontology

Most people in the information business are familiar with taxonomies, which organize information into categories that are often hierarchical. If the taxonomy describes physical goods, metadata provides further information on each item in the taxonomy; this may include price, size, brand, color, and other attributes. However, there is never a single taxonomy in the organization. There are multiple vocabularies that describe the things that are important to the business. These multiple taxonomies comprise an ontology, which is the next level of maturity for enterprise information management.

The concept of ontology is more complex. An ontology consists of all of the taxonomies and thesauri that describe the domain. For insurance, the ontology might include risks, regions, products, services, policy types, etc. For life sciences, it includes brand drugs, generic names, chemical compounds, mechanisms of action, drug targets, and biochemical pathways.

Ontologies include a thesaurus that describes the different ways of asking for something (statement of work is the same as SOW or proposal). But it can also include phrases and different ways of saying the same thing. When building a knowledge retrieval bot, classifying the thing that someone says (the “utterance”) to the thing they want (the “intent”) is similar to the role of a thesaurus. 

The power of ontology comes from its ability to define relationships among concepts (risks in a region, mechanism of action for a drug target). In many cases an existing ontology can be used to improve certain data efficiencies (standards for communicating with suppliers, for example). However, while standardization enables efficiency, the things that make an organization unique live at a more nuanced level. Businesses that do the same thing have different processes and practices, and their personality, character, and culture are based on the different mental models of the people who work there. Some of these differences are captured at the level of language, terminology, and conceptual approach, which are in essence articulated by the ontology.

The ontology contains the relationships between different domains of information and forms the knowledge scaffolding of the enterprise. It provides context for data, knowledge, and information so that ambiguity can be reduced, and it allows meaningful access to information across multiple sources. The ontology, in essence, becomes the “soul” of the business, describing how it is unique in the marketplace and differentiated from its competitors.
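To make “relationships among concepts” concrete, here is a toy sketch that stores an ontology fragment as subject–relation–object triples, echoing the insurance examples above. The specific triples and relation names are invented for illustration:

```python
# Toy ontology fragment as (subject, relation, object) triples.
# The insurance concepts echo the examples in the text; the specific
# triples and relation names are hypothetical.
triples = [
    ("flood", "is_a", "risk"),
    ("wildfire", "is_a", "risk"),
    ("flood", "occurs_in", "coastal region"),
    ("wildfire", "occurs_in", "western region"),
]

def subjects(relation: str, obj: str) -> list[str]:
    """All subjects linked to an object by the given relation."""
    return [s for s, r, o in triples if r == relation and o == obj]

# "Risks in a region" becomes a simple relationship query.
print(subjects("occurs_in", "coastal region"))  # -> ['flood']
```

Production ontologies use richer machinery (RDF stores, inference), but the core idea is the same: once relationships are explicit, questions that span vocabularies become queryable.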

Whatever the source, the information needs to be consistent so that no matter where it comes from, users can access the correct result. Frequently information presented online through one channel is different from that in another channel. A sales rep may provide different information than the website or may be unaware of a promotion that was launched but not synched with the sales organization. Information systems in a more mature organization not only understand the information that people need, but can anticipate what they need more precisely and make it available exactly when and where they need it while they do their jobs.

Building on a Common Foundation

Another element of maturity is the documentation of how systems need to function and how users interact with them, done through the establishment of libraries of use cases. When reviewing use cases and the content associated with them, it is important to work closely with stakeholders to help them understand how the approach is leveraged across systems and processes. At one organization, executives talked about building a knowledge base and designing a new intranet in parallel to a bot framework that was being developed. At the time, they had not understood that the bot framework used much of the same content and was driven by many of the same use cases as a knowledge base accessible to humans.

In some cases the content was the same; in others it was slightly different or needed to be in a different format. But the same content and content source can serve many cases. Whether it is a search interface, a chat interface, an intranet interface, or a specialized knowledge base, they all run on content – and if we componentize that content, it will be consistent for all downstream consuming systems and readily available to users across devices, contexts, and applications.
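As a sketch of that componentization, the same content record can be assembled differently for each consuming system. The field names, channel names, and rendering rules below are all hypothetical:

```python
# Sketch: one componentized content record, rendered differently per
# consuming system (chatbot, search result, intranet page). Field and
# channel names are hypothetical.
article = {
    "title": "Resetting your password",
    "short_answer": "Use the self-service portal to reset your password.",
    "body": "Step-by-step instructions ...",
}

def render(content: dict, channel: str) -> str:
    """Assemble the same components into a channel-appropriate form."""
    if channel == "chatbot":
        return content["short_answer"]
    if channel == "search":
        return f"{content['title']}: {content['short_answer']}"
    return f"{content['title']}\n\n{content['body']}"  # intranet page

print(render(article, "chatbot"))
```

Because every channel pulls from the one record, updating the source once keeps the bot, search results, and intranet page consistent.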

It also means that when executives are not well educated about how content is used, they may resource projects incorrectly or not understand dependencies. Determining where the organization is on a maturity continuum is an important element for understanding those dependencies and helps to inform a realistic road map.

Customer Data Maturity and the Impact on Revenue

One organization we recently worked with was trying to improve subscription conversions from free to fee-based. The problem was that, while there were plenty of sales enablement, accelerator, and business development programs and efforts, the data required to track progress through the funnel and to understand conversion events was not available. The subscription model and the customer data model did not include enough detail to identify the best prospects ready to convert or to understand the motivations and catalysts that trigger conversion.

Without the ability to track conversions effectively and to monitor customer behaviors before and after conversion, the other programs would not be effective. Success also required alignment of content and outreach efforts: calling scripts, white papers, case studies, and other conversion assets inserted at the right place in the customer journey based on the characteristics of that customer – their technical ability, infrastructure, the products they own, their problem space, their industry, and so on. Although this organization was quite mature in many areas, the gap in this specific area – customer attributes and content models – limited its ability to personalize the experience in a way that increased conversions.
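To picture the missing capability: if the customer model carried the right attributes, identifying conversion-ready prospects could be a straightforward filter over customer records. All attribute names and thresholds below are invented for illustration:

```python
# Hypothetical sketch: with richer customer attributes, finding
# conversion-ready free subscribers becomes a simple filter.
# All field names and thresholds are invented.
customers = [
    {"id": 1, "plan": "free", "logins_30d": 25, "feature_limit_hits": 4},
    {"id": 2, "plan": "free", "logins_30d": 2, "feature_limit_hits": 0},
    {"id": 3, "plan": "paid", "logins_30d": 30, "feature_limit_hits": 0},
]

def conversion_ready(c: dict) -> bool:
    """Flag free subscribers whose usage suggests readiness to convert."""
    return (c["plan"] == "free"
            and c["logins_30d"] >= 10
            and c["feature_limit_hits"] > 0)

prospects = [c["id"] for c in customers if conversion_ready(c)]
print(prospects)  # -> [1]
```

The point is not the filter itself but the prerequisite: without those attributes captured in the customer data model, no amount of downstream sales tooling can target the right prospects.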

Realistic Plans Using Maturity Assessments 

These examples provide convincing evidence that without a certain level of maturity with respect to culture, people, process, and technology, a digital transformation will not perform as expected. As a first step, find out where your organization is along the maturity spectrum for the dimensions associated with the key pain points it is experiencing, whether that is difficulty finding answers in a knowledge base or bottlenecks in customer processes. Compare the current state to where the organization needs to be based on customer needs, the target process, or competitive pressures. Understanding that gap will guide the organization in taking the next steps toward a digital transformation: correctly resourcing the program, identifying less visible supporting factors, and developing a realistic, achievable plan that does not exceed the enterprise's ability to execute and absorb change.



