Explore these resources to get ready:
Analytics that don’t translate to action are a waste of resources. If you’re frustrated with unexpected after-effects from self-service BI, Doug Henschen has an action plan.
VP & Principal Analyst
Your secret weapons for 2022 are the coders and builders on your technology teams. Hugh Owen details why a developer-centric strategy could make all the difference for your business.
Executive Vice President & Chief Marketing Officer
Your thousands of reports could be costing you millions. Robert Tischler breaks down clear guidelines and best practices to help your organization stay productive in 2022.
Are biased insights worse than no insights at all? Roxane Edjlali lays out the case for an adaptive analytical governance approach built on a ‘trust but verify’ methodology.
Senior Director, Solutions Management
Get your head out of the sand and into the cloud. If you haven’t embraced a cloud mentality, you’re not alone. But containerization could change that according to David Menninger.
SVP and Research Director
VP & Principal Analyst, Constellation Research
In short, the “democratization” of BI and analytics through self-service was only a partial success. In fact, when self-service initiatives were not backed by a degree of centralized management—in the form of unified data models and agreed-upon definitions and measures—the profusion of reports and dashboards sometimes led to conflicting and overlapping versions of the truth. These conflicts created a new source of frustration with BI and even distrust of reports and dashboards.
Business acceleration—together with the challenges described above—has fueled demand for more actionable analytics. There’s less time to navigate and interpret reports and dashboards, so there’s more interest in delivering concise, to-the-point insights at decision points within applications and workflows. The move toward actionable analytics is being enabled by four trends:
Next-generation embedding. Concise, targeted insights are increasingly delivered at the point of decision making within applications. This trend is being enabled by microservices architectures, rich application programming interfaces, software development kits, and prebuilt integrations both within popular enterprise applications (e.g., ERP, CRM, HCM) and within collaborative tools (e.g., email, Teams, Slack, etc.).
Natural language query. NLQ isn’t just a feature for querying from within dashboards and other analytical interfaces. Application vendors are adding NLQ features within their apps, and BI and analytics vendors are promoting this trend by turning NLQ into a utility embeddable anywhere, so users don’t have to navigate to separate reports and dashboards. Users can simply type in natural-language questions from within their transactional apps, or they can ask Siri or Google and voice-to-text integrations will translate their questions for mobile app interaction.
Low-code/no-code development options. LC/NC options are popping up everywhere, offered by application vendors, BI and analytics vendors, database vendors, and more. The trend is rising because the need for software is vast while developers are in short supply. LC/NC options enable systems administrators, power users, analysts, and domain experts to develop custom apps and to extend off-the-shelf applications. Analytics are invariably a part of these LC/NC offerings, enabling data-driven decision-making and action.
Workflow and automation. During the pandemic, we’ve seen “the great resignation” and employers struggling to hire enough workers. No wonder there’s huge demand to automate wherever possible. Why saddle remaining employees with repetitive manual work steps? BI and analytics platforms have had alerting capabilities for years. So, where thresholds and exceptions are tied to predictable outcomes and there is confidence about what actions to take, why not automate? Organizations are increasingly using bots, workflow, and robotic process automation capabilities with events and analytic thresholds triggering actions.
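The automation pattern described above can be sketched in a few lines. This is a minimal, illustrative example, not tied to any specific BI platform; all names, thresholds, and values are invented. When a monitored metric crosses a threshold with a predictable outcome and a known corrective action, the action can fire automatically instead of waiting for someone to read a dashboard:

```python
# Minimal sketch of threshold-triggered automation: an analytic
# threshold is wired directly to an automated action (here, a reorder
# bot). All names and values are illustrative.

def check_and_act(metric_name, value, threshold, action):
    """Trigger `action` when `value` falls below `threshold`."""
    if value < threshold:
        return action(metric_name, value)
    return None

def reorder_stock(metric_name, value):
    # In practice this would call a workflow or RPA endpoint.
    return f"Reorder triggered: {metric_name} at {value}"

result = check_and_act("store_042_inventory", 118, 150, reorder_stock)
print(result)
```

In a real deployment, the action would be an API call into a workflow or robotic process automation tool, and the threshold check would run on a schedule or on event arrival.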
EVP & CMO, MicroStrategy Inc.
See how MicroStrategy is empowering developers.
The widening divide is indicative of a data-driven mentality, where information is not simply captured for after-action reports but is constantly used to optimize next steps and shape forward strategy.
Over the last year, successful organizations have leveraged data and analytics as a vital strategic differentiator: to safeguard and empower their employees in the midst of a volatile market; to delight their customers with increased personalization and targeted customization; and to seamlessly navigate supply chains in a constantly evolving business landscape.
These organizations and industry leaders understand that the key to unlocking their data is to uniquely empower a corps of skilled developers with impactful tools to build amazing applications and solutions. This commitment to a data-driven mentality puts developers front and center in their respective organizations to act as change agents, weaving information into action for thousands of their colleagues, by building solutions that are rich with insight, action-oriented, and accessible to anyone.
What does being a developer in the modern world entail? Today’s required skillsets may surprise you. The arrival of low-code/no-code app development has changed the game—and gives business experts the means to develop high-impact enterprise applications.
This new breed of application developers can create new realities and experiences for thousands of others—faster than ever before.
For example, we all remember what happened in the early days of the COVID crisis. Stores were closed, factories were shutting down; people were scared. There was legitimate concern that supply chains would be broken, and that food insecurity could manifest as a serious problem…
One organization, a prolific MicroStrategy customer and retailer serving millions in the UK, put its developers to work to operate its business with the highest degree of available intelligence.
In just one week’s time, its developers built a COVID-19 response app. Within two weeks, the app was fully tested and deployed to more than a thousand store locations.
What exactly did the application do? With an unpredictable workforce due to quarantine recommendations, this app allowed the company to effectively staff stores based on available resources—redirecting people and products to the high-traffic locations that required inventory or additional assistance. This helped keep the shelves stocked with food, medicine, and other products their customers required during an especially difficult period.
This is just one example of what can go right—in a time when so much could easily go wrong. Whatever the challenges 2022 has in store for us all, equip, trust, and empower your developers. You’ll likely be thanking them by year’s end.
Managing Director BARC Austria
Reports and dashboards have often been proclaimed dead. But we see that they are very much alive. Sure, they are by no means the silver bullet for all information needs in a company, but they are the undisputed backbone of corporate information distribution. Reports and dashboards done right help quickly make sense of data and drive determined action to create value.
According to our worldwide BI & Analytics Survey, 84% of companies use standard reports and 78% use dashboards or interactive analytical apps. These figures reflect usage within a single tool, yet companies have, on average, more than three analytics and BI tools in use.
Information presented in a standardized form offers a low entry barrier for all users and satisfies recurring information needs. There is a substantial need to serve information to business users in this standardized way.
Now, while we clearly rely on reports and dashboards to direct our actions, corporate reporting is often in a terrible state. Many companies suffer from having too many reports in and outside their analytics and BI systems. Your company probably suffers from that overabundance too. Having too many reports seriously undermines trust and diminishes analytics and BI efficiency. Furthermore, it undermines your efforts to get value out of your data.
In practice, it is always easier to request or build new reports than to have them deleted or decommissioned. This force is a major cause of the reporting mess in companies.
Changes in the business, the competitive landscape, or customer behavior, as well as changes in management or corporate structure, all magnify requirements for new information and reports. Further pressure comes from more data sources delivering more data to more users in more departments through more analytics and BI tools, supporting an ever-growing variety of use cases.
These forces are manifold and powerful, and there is little evidence to suggest their pressure will diminish. They make clear that analytics and BI must deliver information quickly to support new requirements. The required time-to-insight can only be achieved by allowing and supporting self-service, which poses further challenges for the reporting environment.
The bottom line is: Whatever happens, more reports get created. And the more reports a company has, the lower the share that is actually used. It is not uncommon to find that 80% of available reports have not been run in the last twelve months, especially in self-service environments lacking the required conditions. This is a huge waste of resources and a huge source of distraction for decision-makers and employees at all levels.
The reporting mess is a significant cause of dissatisfaction with corporate reporting. Discontent arises from the high effort of creating and maintaining these reports, especially if a large proportion are only ever used once or twice. Too many reports also breed inconsistencies, and those inconsistencies are the fuel that sparks of distrust in the information can ignite. An overabundance of reports also draws the attention of decision-makers away from what matters most and ties up front-line workers who should be serving their customers. The effort to create and maintain a massive number of reports is huge, and it often does not pay off. So, you must clean up the reporting mess, or better yet, prevent it in the first place.
If you want your reporting to be neatly structured and well organized, these are the starting points to an efficient way of delivering trusted information that informs action.
Senior Director, Solution Management, MicroStrategy Inc.
Driven by the need for flexibility and agility in delivering new projects, along with the growing number of cloud-native data sources in enterprise ecosystems, data and analytics have continued to move to the cloud. As a result, we have seen the emergence of new architectural designs such as data mesh, data fabric, and data lakehouses. These concepts all attempt to reconcile the growing complexity of cloud data collection, where raw data is collected without any consideration for how it relates to other data, or whether it can be used or trusted.
All these architectural designs have their merits, as they highlight the growing complexity of the data landscape. Nonetheless, this complexity is challenging business users and analysts who had hoped that self-service alone could address their daily analytical needs. Even for savvy data users, making meaningful use of the data has become more difficult. As a result, having trusted, shareable, and reusable data in support of analytics has become even more important.
Even with increased investment in activities such as data catalogs and augmented data integration capabilities, and in dedicated resources like data engineers, the data management practice alone cannot address the full spectrum of analytical use cases with the flexibility and agility organizations need.
As a result, organizations need to adopt an adaptive analytical governance approach built on a ‘trust but verify’ methodology to overcome data complexity. To do so, organizations need to articulate various rings of analytics governance that fit the business requirements, skills, and needs of the various users across the business:
Analytical governance differs from data governance in that it covers not only data but also the definitions of calculations, metrics, and AI/ML models. In essence, it spans all analytical content plus the metadata describing that content, such as its freshness and author. Analytical governance is powered by a rich underlying metadata layer organized in the form of a semantic graph.
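As a small illustrative sketch (the schema here is hypothetical, not MicroStrategy's actual metadata model), such a metadata layer can be pictured as a graph whose nodes are analytical assets carrying governance metadata, and whose edges capture lineage between them:

```python
# Hypothetical sketch of an analytical-governance metadata layer as a
# semantic graph: nodes are analytical assets (datasets, metrics,
# dashboards) carrying governance metadata such as author and freshness;
# edges capture relationships like lineage and placement.

from datetime import date

nodes = {
    "sales_dataset":  {"type": "dataset",   "author": "data_eng", "refreshed": date(2022, 1, 10)},
    "net_revenue":    {"type": "metric",    "author": "finance",  "refreshed": date(2022, 1, 11)},
    "exec_dashboard": {"type": "dashboard", "author": "bi_team",  "refreshed": date(2022, 1, 12)},
}

edges = [
    ("net_revenue", "derived_from", "sales_dataset"),
    ("net_revenue", "displayed_in", "exec_dashboard"),
]

def upstream(asset):
    """Return the assets a given asset is derived from (a lineage query)."""
    return [src for (a, rel, src) in edges if a == asset and rel == "derived_from"]

print(upstream("net_revenue"))  # which data feeds this metric?
```

Queries over such a graph are what make the 'trust but verify' approach practical: a user can check where a metric comes from, who owns it, and how fresh it is before acting on it.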
Powered by this rich metadata, data and analytics leaders can expect a better view of the value of analytics to the organization by tracking, for instance, analytics usage across various groups, frequency of access, and impact on enterprise objectives such as customer satisfaction and supply chain efficiency.
By adopting an adaptive analytics governance approach, enterprises empower all users to access analytics regardless of their data proficiency. Combined with a strategic view of analytics valuation, this approach can increase the efficacy and value of analytics to enterprises.
SVP and Research Director, Ventana Research
Our research has also shown that 86% of organizations expect the majority of their data to be in the cloud eventually. Part of what is driving the adoption of cloud analytics is that many organizations are already operating in the cloud. More than one-half (59%) of the participants are analyzing data from cloud applications. They have experienced some of the benefits of cloud applications and it’s natural to extend those benefits to their analytics.
However, it’s not all smooth sailing. Organizations still report a slight preference for on-premises analytics at a rate of 39% compared with 34% that prefer cloud-based analytics. The research shows that more than one-half (57%) of those who are not using the cloud are concerned about security issues. Clearly, analytics software vendors and hyperscalers need to continue to address these issues by providing secure and trusted platforms, ideally with certifications to meet different organizations’ requirements.
Interestingly, those organizations that prefer on-premises deployments report higher levels of satisfaction than those preferring cloud deployments. Those preferring on-premises deployments also report that their analytics technology has had a greater impact on improving their activities and processes. Perhaps these differences result from the fact that some on-premises offerings have not been fully migrated to the cloud yet.
Therefore, these particular cloud offerings are lacking some features compared to their on-premises predecessors. The research also shows two-thirds of organizations (65%) that prefer on-premises deployments have a hybrid of on-premises and cloud infrastructure. Perhaps some of the reduced satisfaction results from challenges interoperating between on-premises and cloud deployments. Further, the “cloud” is not one thing. Almost one-half (42%) of the organizations in our research are using at least two different hyperscalers. Consequently, organizations should strive for hybrid analytic architectures that can span these on-premises and multi-cloud configurations.
Containerization provides the means to achieve this deployment flexibility. Containerized applications enable the same code to be deployed on-premises and on the various hyperscalers’ platforms. Containerization combined with a microservices architecture creates the scalability needed to support large numbers of users and large volumes of data.
The other reality of making cloud analytics work is avoiding cloud cost surprises. Organizations prefer predictability in the costs associated with analytics software. Subscription-based pricing models vary from vendor to vendor; some are priced on simple metrics such as the number of users, while others use more complicated metrics based on usage parameters such as the amount of data processed or the number of reports run. If the pricing metric is not easily understood and predictable from one billing period to the next, it can lead to very unpleasant surprises when the bill arrives. Vendors can also make cloud costs more predictable by offering tools to monitor and manage usage throughout the organization.
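To make the predictability point concrete, here is a rough sketch with invented figures comparing a flat per-user subscription to usage-based pricing across a few billing periods; a single usage spike is all it takes to produce bill shock:

```python
# Illustrative comparison (invented figures) of two cloud pricing models.
# Per-user pricing is flat month to month; usage-based pricing swings
# with the volume of data processed.

users, price_per_user = 200, 30.0          # flat per-user subscription
tb_processed = [5, 6, 22, 7]               # monthly usage, with one spike
price_per_tb = 400.0                       # usage-based rate

per_user_bills = [users * price_per_user for _ in tb_processed]
usage_bills = [tb * price_per_tb for tb in tb_processed]

print(per_user_bills)   # identical every month
print(usage_bills)      # the month-3 spike more than triples the bill
```

The flat model yields the same bill every period, while under the usage-based model the spike month costs several times the surrounding months, which is exactly the surprise a monitoring tool would help catch early.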
Notwithstanding the potential issues associated with cloud analytics, we expect to see their popularity continue to grow toward eventually eclipsing on-premises deployments. But it’s wise for organizations to have their eyes wide open so they can understand and manage their cloud deployments. The cloud offers many benefits and organizations should seek to maximize their impact with a careful and thoughtful selection of their cloud analytics vendors.