2022 Analytics Trends

5 Key Strategies to Adopt Today to Get Ahead


Five leading voices in the analytics industry detail trends likely to make an impact—and recommend key ways to stay ahead.

 

1 Actionable Analytics:
A Priority for 2022

Analytics that don’t translate to action are a waste of resources. If you’re frustrated with unexpected after-effects from self-service BI, Doug Henschen has an action plan.

Doug Henschen
VP & Principal Analyst
Constellation Research

2 Dawn of the Devs: Developers are 2022’s Change Agents

Your secret weapons for 2022 are the coders and builders on your technology teams. Hugh Owen details why a developer-centric strategy could make all the difference for your business.

Hugh Owen
Executive Vice President & Chief Marketing Officer
MicroStrategy Inc.

3 The Procrastination is Over: Clean Up Your Reports

Your thousands of reports could be costing you millions. Robert Tischler breaks down clear guidelines and best practices to help your organization stay productive in 2022.

Robert Tischler
Managing Director
BARC Austria

4 Trust but Verify: How to Avoid Business Bias in 2022

Are biased insights worse than no insights at all? Roxane Edjlali lays out the case for an adaptive analytical governance approach built on a ‘trust but verify’ methodology.

Roxane Edjlali
Senior Director, Solutions Management
MicroStrategy Inc.

5 The Reality of Cloud Analytics Trends in 2022

Get your head out of the sand and into the cloud. If you haven’t embraced a cloud mentality, you’re not alone. But containerization could change that, according to David Menninger.

David Menninger
SVP and Research Director
Ventana Research

Actionable Analytics: A Priority for 2022

Doug Henschen
VP & Principal Analyst, Constellation Research
 

In the year ahead we’ll see a continuation of the rising demand for actionable analytics. What’s behind this demand?

In short, the “democratization” of BI and analytics through self-service was only a partial success. In fact, when self-service initiatives were not backed by a degree of centralized management—in the form of unified data models and agreed-upon definitions and measures—the profusion of reports and dashboards sometimes led to conflicting and overlapping versions of the truth. These conflicts created a new source of frustration with BI and even distrust of reports and dashboards.

Business acceleration—together with the challenges described above—has fueled demand for more actionable analytics. There’s less time to navigate and interpret reports and dashboards, so there’s more interest in delivering concise, to-the-point insights at decision points within applications and workflows.

 

“…there’s more interest in delivering concise, to-the-point insights at decision points within applications and workflows.”

The move toward actionable analytics is being enabled by four trends:

Next-generation embedding. Concise, targeted insights are increasingly delivered at the point of decision making within applications. This trend is being enabled by microservices architectures, rich application programming interfaces, software development kits, and prebuilt integrations both within popular enterprise applications (e.g., ERP, CRM, HCM) and within collaborative tools (e.g., email, Teams, Slack, etc.).
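One way to picture this kind of embedding is a concise insight pushed into a collaboration tool at the point of decision. The sketch below builds a generic incoming-webhook payload; the metric name, figures, and webhook URL are hypothetical, and the exact payload schema varies by tool.

```python
# Sketch of next-generation embedding: rather than sending users to a
# separate dashboard, push a concise, decision-ready insight into a
# collaboration tool (Slack, Teams, etc.) via an incoming webhook.
# The metric name, figures, and webhook URL here are hypothetical.

import json
from urllib import request

def build_insight_message(metric: str, value: float, change_pct: float) -> dict:
    """Format one metric as a short, actionable message body."""
    direction = "up" if change_pct >= 0 else "down"
    return {
        "text": f"{metric}: {value:,.0f} ({direction} {abs(change_pct):.1f}% vs. last week)"
    }

def post_insight(webhook_url: str, message: dict) -> None:
    """POST the message to a generic incoming-webhook endpoint."""
    req = request.Request(
        webhook_url,
        data=json.dumps(message).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    request.urlopen(req)  # needs a real webhook URL to succeed

msg = build_insight_message("weekly_active_users", 48210, -3.2)
print(msg["text"])  # weekly_active_users: 48,210 (down 3.2% vs. last week)
```

The point of the pattern is that the user never leaves the tool they already work in; the analytics platform formats the insight and delivers it to where the decision happens.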

Natural language query. NLQ isn’t just a feature for querying from within dashboards and other analytical interfaces. Application vendors are adding NLQ features within their apps, and BI and analytics vendors are promoting this trend by turning NLQ into a utility embeddable anywhere, so users don’t have to navigate to separate reports and dashboards. Users can simply type in natural-language questions from within their transactional apps, or they can ask Siri or Google and voice-to-text integrations will translate their questions for mobile app interaction.

Low-code/no-code development options. LC/NC options are popping up everywhere, offered by application vendors, BI and analytics vendors, database vendors, and more. The trend is rising because the need for software is vast while developers are in short supply. LC/NC options enable systems administrators, power users, analysts, and domain experts to develop custom apps and to extend off-the-shelf applications. Analytics are invariably a part of these LC/NC offerings, enabling data-driven decision-making and action.

Workflow and automation. During the pandemic, we’ve seen “the great resignation” and employers struggling to hire enough workers. No wonder there’s huge demand to automate wherever possible. Why saddle remaining employees with repetitive manual work steps? BI and analytics platforms have had alerting capabilities for years. So, where thresholds and exceptions are tied to predictable outcomes and there is confidence about what actions to take, why not automate? Organizations are increasingly using bots, workflow, and robotic process automation capabilities with events and analytic thresholds triggering actions.
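The pattern described here—thresholds tied to predictable outcomes triggering known actions—can be sketched in a few lines. The metrics, thresholds, and remedial actions below are illustrative assumptions, not any particular product’s API.

```python
# Minimal sketch of analytics-driven automation: when a monitored metric
# crosses a known threshold and the remedial action is predictable,
# trigger the action automatically instead of only alerting a human.
# The metrics, thresholds, and actions here are hypothetical examples.

from dataclasses import dataclass
from typing import Callable

@dataclass
class Rule:
    metric: str
    threshold: float
    below: bool              # True: fire when the value drops below threshold
    action: Callable[[str, float], str]

def reorder_stock(metric: str, value: float) -> str:
    return f"reorder triggered for {metric} (level={value})"

def escalate(metric: str, value: float) -> str:
    return f"ticket opened: {metric} at {value} needs review"

RULES = [
    Rule("warehouse_inventory", 100.0, below=True, action=reorder_stock),
    Rule("checkout_error_rate", 0.05, below=False, action=escalate),
]

def evaluate(readings: dict[str, float]) -> list[str]:
    """Run every rule against the latest readings; return the actions taken."""
    actions = []
    for rule in RULES:
        value = readings.get(rule.metric)
        if value is None:
            continue
        fired = value < rule.threshold if rule.below else value > rule.threshold
        if fired:
            actions.append(rule.action(rule.metric, value))
    return actions

print(evaluate({"warehouse_inventory": 80.0, "checkout_error_rate": 0.01}))
```

In practice the actions would call a bot, workflow, or RPA endpoint rather than return a string, but the core loop—evaluate thresholds, act on confident matches—is the same.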

Want faster time-to-insights? Inject key info directly into the apps where your people work to do their jobs.

Dawn of the Devs: Developers are 2022’s Change Agents

Hugh Owen
EVP & CMO, MicroStrategy Inc.

There’s a chasm growing in the global economy. It’s between companies that understand the power that can be unleashed from the waves of data their businesses generate each and every second, and those that are struggling to keep up.


The widening divide is indicative of a data-driven mentality—where information is not simply captured for after-action reports, but is constantly used to optimize next steps and forward strategies.

Successful organizations over the last year have leveraged data and analytics as a vital strategic differentiator, using it to safeguard and empower their employees in the midst of a volatile market, delight their customers with increased personalization and targeted customization, and seamlessly navigate supply chains in a constantly evolving business landscape.

These organizations and industry leaders understand that the key to unlocking their data is to uniquely empower a corps of skilled developers with impactful tools to build amazing applications and solutions. This commitment to a data-driven mentality puts developers front and center in their respective organizations to act as change agents, weaving information into action for thousands of their colleagues, by building solutions that are rich with insight, action-oriented, and accessible to anyone.

What does being a developer in the modern world entail? Today’s required skillsets may surprise you. The arrival of low-code/no-code app development has changed the game—and gives business experts the means to develop high-impact enterprise applications.

This new breed of application developers can create new realities and experiences for thousands of others—faster than ever before.

For example, we all remember what happened in the early days of the COVID crisis. Stores were closed, factories were shutting down; people were scared. There was legitimate concern that supply chains would be broken, and that food insecurity could manifest as a serious problem…

One organization, a prolific MicroStrategy customer and retailer serving millions in the UK, put its developers to work to operate its business with the highest degree of available intelligence.

In just one week’s time, its developers built a COVID-19 response app. Within two weeks, the app was fully tested and deployed to more than a thousand store locations.

What exactly did the application do? With an unpredictable workforce due to quarantine recommendations, this app allowed the company to effectively staff stores based on available resources—redirecting people and products to the high-traffic locations that required inventory or additional assistance. This helped keep the shelves stocked with food, medicine, and other products their customers required during an especially difficult period.

This is just one example of what can go right—in a time when so much could easily go wrong. Whatever the challenges 2022 has in store for us all, equip, trust, and empower your developers. You’ll likely be thanking them by year’s end.



 

“Analysts investigate the world. Developers change it.”

Empowering your developers is easier than you think. Learn how the industry-leading platform makes it all possible.

The Procrastination is Over: Clean Up Your Reports

Robert Tischler
Managing Director, BARC Austria
 

2022 is the year in which you must get your reporting in order. Many companies now find themselves in a mess of reports, and the forces that make this mess worse are often stronger than those pushing for consolidation.

Reports and dashboards have often been proclaimed dead. But we see that they are very much alive. Sure, they are by no means the silver bullet for all information needs in a company, but they are the undisputed backbone of corporate information distribution. Reports and dashboards done right help quickly make sense of data and drive determined action to create value.

According to our worldwide BI & Analytics Survey, 84% of companies use standard reports, and 78% use dashboards or interactive analytical apps. These figures depict the use of reports and dashboards within a single tool, while companies on average have more than three analytics and BI tools in use.

Information presented in a standardized form offers a low entry barrier for all users. It helps to satisfy repeated information needs. There is a substantial need for serving information to business users in a standardized way.

Now, while we clearly rely on reports and dashboards to direct our actions, corporate reporting is often in a terrible state. Many companies suffer from having too many reports in and outside their analytics and BI systems. Your company probably suffers from that overabundance too. Having too many reports seriously undermines trust and diminishes analytics and BI efficiency. Furthermore, it undermines your efforts to get value out of your data.

“Having too many reports seriously undermines trust and diminishes analytics and BI efficiency.”

In practice, it is always easier to request or build new reports than to have them deleted or decommissioned. This force is a major cause of the reporting mess in companies.

Changes in business, the competitive landscape, or customer behavior, as well as changes in management or corporate structure, all magnify requirements for new information and reports. Further pressure is exerted by more data sources delivering more data to more users in more departments through more analytics and BI tools to support a growing variety of use cases.

While these forces are manifold and powerful, there is little evidence to suggest their pressure will diminish. They clearly show that analytics and BI must deliver information quickly to support these new requirements. The required time-to-insight can only be achieved when allowing and supporting self-service, which poses further challenges to the reporting environment.

The bottom line is: Whatever happens, more reports get created. And the more reports a company has, the lower the share of what is used. It is not too uncommon to find that 80% of the reports available have not been run in the last twelve months, especially in self-service environments lacking the required conditions. This is a huge waste of resources and a huge cause of distraction for decision-makers and employees at all levels.

The reporting mess is a significant cause of dissatisfaction with corporate reporting. Discontent arises from the high effort in creating and maintaining these reports, especially if a large proportion are only ever used once or twice. Too many reports result in inconsistencies, and these inconsistencies are the gas that gets lit by sparks of distrust in the information presented. An overabundance of reports also draws the attention of decision-makers away from what matters most and blocks the resources of front-line workers from serving their customers. While the effort to create and maintain a massive number of reports is huge, it often does not pay off. So, you must clean up the reporting mess, or better yet, prevent it in the first place.

It is difficult to decommission reports. So, the first goal must be to prevent unnecessary reports from being created.

  • Establish clear guidelines for when a report should be created and when an analysis is sufficient. Is the topic of recurring interest to users? Does it inform decisions? If not, do not build a report.
  • Give every new report an expiry date. By default, all reports beyond that date are moved into an archive and no longer supported.
  • The usage of reports should be monitored closely. Periodic reviews ensure that there is a constant discussion about which reports are still relevant and which can be archived.
  • Not delivering self-service capabilities does not help; it just results in reports being created in Excel. So, curate data in a way that is easy to understand and deliver versatile analysis tools such as visual analysis or notebooks. These give users huge flexibility to collaborate and answer urgent questions quickly.
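The expiry-date and usage-review guidelines above can be combined into a simple periodic triage job. The sketch below is a minimal illustration; the report fields, the default twelve-month staleness window, and the archive/review/keep buckets are assumptions for the example.

```python
# Sketch of the cleanup rules above: archive reports past their expiry
# date, and flag reports not run in the last twelve months for review.
# Field names and the 365-day window are illustrative assumptions.

from dataclasses import dataclass
from datetime import date, timedelta
from typing import Optional

@dataclass
class Report:
    name: str
    expires: date
    last_run: Optional[date]  # None: never run

def triage(reports: list[Report], today: date) -> dict[str, list[str]]:
    stale_cutoff = today - timedelta(days=365)
    result = {"archive": [], "review": [], "keep": []}
    for r in reports:
        if r.expires < today:
            result["archive"].append(r.name)    # past expiry: archive by default
        elif r.last_run is None or r.last_run < stale_cutoff:
            result["review"].append(r.name)     # unused for a year: discuss in periodic review
        else:
            result["keep"].append(r.name)
    return result

reports = [
    Report("q3-sales", expires=date(2021, 12, 31), last_run=date(2021, 11, 1)),
    Report("daily-ops", expires=date(2023, 1, 1), last_run=date(2022, 1, 2)),
    Report("one-off-analysis", expires=date(2022, 6, 30), last_run=None),
]
print(triage(reports, today=date(2022, 1, 3)))
```

Running a job like this monthly keeps the “which reports are still relevant?” discussion grounded in actual usage data rather than opinion.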

If you want your reporting to be neatly structured and well organized, these are the starting points to an efficient way of delivering trusted information that informs action.

There’s a fast way to simplify your reporting, workflows, and operations. See how Library™ helps turn data into action.

Trust but Verify: How to Avoid Business Bias in 2022

Roxane Edjlali
Senior Director, Solution Management, MicroStrategy Inc.

Constant change in a COVID economy has companies thinking differently about their data. New analytics use cases continue to emerge. Digitalization has accelerated. And data exchange continues to expand—among customers, suppliers, and partners.

Driven by the need for flexibility and agility in delivering new projects, along with the growing number of cloud-native data sources in the enterprise’s ecosystem, data and analytics have continued to move to the cloud. As a result, we have seen the emergence of new architectural designs such as data mesh, data fabric, and data lakehouses. These concepts all attempt to reconcile the growing complexity of cloud data collection, where raw data is collected without any consideration for how it relates to other data, or whether it can be used or trusted.


“Maintain flexibility and agility in an increasingly complex data landscape with a ‘trust but verify’ approach.”
 

All these architectural designs have their merits, as they highlight the growing complexity of the data landscape. Nonetheless, this complexity is challenging business users and analysts who were hoping that self-service alone could address their daily analytical needs. Even for savvy data users, making meaningful use of the data has become more difficult. As a result, having trusted, shareable, and reusable data in support of analytics has become even more important.

Even with increased investment in activities such as data catalogs and augmented data integration capabilities, and in dedicated resources like data engineers, the data management practice alone cannot address the full spectrum of analytical use cases with the flexibility and agility organizations need.

As a result, organizations need to adopt an adaptive analytical governance approach built on a ‘trust but verify’ methodology to overcome data complexity. To do so, organizations need to articulate various rings of analytics governance that fit the business requirements, skills, and needs of the various users across the business:

  • Inner ring for the centrally governed analytics representation of data.
  • Middle ring for department or BU-specific analytical governance.
  • Local ring for users who will need to augment analytics from central or departmental governance.

Analytical governance differs from data governance: the former is based not only on data but also includes the definitions of calculations, metrics, and AI/ML models. Basically, it is all of the analytical content, plus the metadata relating to that content, such as its freshness and author. Analytical governance is powered by a rich underlying metadata layer organized in the form of a semantic graph.

Key findings:
  • Becoming analytics-driven is reliant on people’s ability to make day-to-day decisions based on data
  • Data literacy is not progressing as fast as data complexity
  • Emerging architectural designs such as data fabric, data mesh, or data lakehouses promise to tame data complexity but aren’t a panacea
Recommendations:
  • Data and analytics leaders must drive alignment by delivering contextual access to analytics to users in their day-to-day digital workplace and applications
  • Data and analytics leaders must implement strategies to measure the business value of analytics to users
  • Data and analytics leaders need to invest in a semantic graph that participates in the overall data fabric but represents the agreed upon business representation of data and calculations for analytics
Notes:
  • Data Fabric: “A data fabric is a design concept that serves as an integrated layer (fabric) of data and connecting processes. The fabric presents an enterprise-wide coverage of data across applications that is not constrained by any single platform or tool restrictions.” Source: Gartner
  • Data mesh: “A data mesh places emphasis on originating sources and use cases in which data assets are designed and captured to then produce differing combinatorial data products relative to business context from those assets.” Source: Gartner 
  • Data lakehouse: “A data lakehouse is a new, open data management architecture that combines the flexibility, cost-efficiency, and scale of data lakes with the data management and ACID transactions of data warehouses, enabling business intelligence (BI) and machine learning (ML) on all data.” Source: Databricks

Powered by this rich metadata, data and analytics leaders can expect to have a better view of the value of analytics to the organization by tracking, for instance, analytics usage across various groups in the organization, frequency of access, and its impact on enterprise objectives such as customer satisfaction and supply chain efficiency.


By adopting an adaptive analytics governance approach, enterprises empower all users to have access to analytics regardless of their data proficiency. This approach, combined with a strategic view of analytics valuation, can increase the efficacy and value of analytics to enterprises.

The Schwarz Group is setting a new standard for an enterprise-wide data strategy—as shown at MicroStrategy World. See how it empowers their brands, including Lidl and Kaufland.

The Reality of Cloud Analytics Trends in 2022

David Menninger
SVP and Research Director, Ventana Research

The trend toward cloud analytics is clear. One-half of the participants in our research currently use cloud computing products for their analytics and data processes, and three-quarters expect to use the cloud for these processes eventually.

Our research has also shown that 86% of organizations expect the majority of their data to be in the cloud eventually. Part of what is driving the adoption of cloud analytics is that many organizations are already operating in the cloud. More than one-half (59%) of the participants are analyzing data from cloud applications. They have experienced some of the benefits of cloud applications and it’s natural to extend those benefits to their analytics.

However, it’s not all smooth sailing. Organizations still report a slight preference for on-premises analytics at a rate of 39% compared with 34% that prefer cloud-based analytics. The research shows that more than one-half (57%) of those who are not using the cloud are concerned about security issues. Clearly, analytics software vendors and hyperscalers need to continue to address these issues by providing secure and trusted platforms, ideally with certifications to meet different organizations’ requirements.


“Containerization combined with a microservices architecture creates the scalability needed to support large numbers of users and large volumes of data.”

Interestingly, those organizations that prefer on-premises deployments report higher levels of satisfaction than those preferring cloud deployments. Those preferring on-premises deployments also report that their analytics technology has had a greater impact on improving their activities and processes. Perhaps these differences result from the fact that some on-premises offerings have not been fully migrated to the cloud yet, so these cloud offerings lack some features compared to their on-premises predecessors.

The research also shows two-thirds of organizations (65%) that prefer on-premises deployments have a hybrid of on-premises and cloud infrastructure. Perhaps some of the reduced satisfaction results from challenges interoperating between on-premises and cloud deployments. Further, the “cloud” is not one thing. Almost one-half (42%) of the organizations in our research are using at least two different hyperscalers. Consequently, organizations should strive for hybrid analytic architectures that can span these on-premises and multi-cloud configurations.

Containerization provides the means to achieve this deployment flexibility. Containerized applications enable the same code to be deployed on-premises and on the various hyperscalers’ platforms. Containerization combined with a microservices architecture creates the scalability needed to support large numbers of users and large volumes of data.

The other reality of making cloud analytics work is avoiding cloud cost surprises. Organizations prefer predictability in the costs associated with analytics software. Subscription-based pricing models vary from vendor to vendor; some are priced on simple metrics such as the number of users, while others use more complicated metrics based on usage parameters such as the amount of data processed or the number of reports run. If the pricing metric is not easily understood and predictable from one billing period to the next, it can lead to some very unpleasant surprises when the bill arrives. Vendors can also make cloud costs more predictable by offering tools to monitor and manage usage throughout the organization.
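The difference in predictability is easy to see with a little arithmetic. The sketch below compares a per-user model against a usage-based model over three months; the rates, headcounts, and data volumes are made-up numbers chosen only to illustrate the contrast.

```python
# Sketch of why pricing metrics matter for cost predictability.
# All rates and usage figures below are hypothetical.

def per_user_cost(users: int, rate_per_user: float) -> float:
    """Simple metric: bill scales with headcount, which changes slowly."""
    return users * rate_per_user

def usage_based_cost(gb_processed: float, rate_per_gb: float) -> float:
    """Usage metric: bill scales with workload, which can swing widely."""
    return gb_processed * rate_per_gb

months_users = [200, 205, 210]       # headcount drifts gently
months_gb = [1200, 3400, 900]        # data processed swings with workloads

user_bills = [per_user_cost(u, 50.0) for u in months_users]
usage_bills = [usage_based_cost(g, 2.0) for g in months_gb]

print(user_bills)   # [10000.0, 10250.0, 10500.0] - easy to budget
print(usage_bills)  # [2400.0, 6800.0, 1800.0]    - hard to predict
```

Neither model is inherently better; the point is that a metric tied to volatile usage demands the monitoring tools mentioned above, while a headcount-based metric is predictable by construction.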

Notwithstanding the potential issues associated with cloud analytics, we expect to see their popularity continue to grow toward eventually eclipsing on-premises deployments. But it’s wise for organizations to have their eyes wide open so they can understand and manage their cloud deployments. The cloud offers many benefits and organizations should seek to maximize their impact with a careful and thoughtful selection of their cloud analytics vendors.

Migrate your analytics applications to the cloud now to spend more time on what matters—delivering intelligence to your business.