In a recent study by Gartner, titled “The Future of Decisions”, two statistics stood out as compelling indicators of the current state of play. Of the respondents surveyed, 65% agreed that the decisions they make today are more complex than those they made two years ago. Further, 53% agreed that there is now a higher expectation for them to be able to explain or justify their decisions.
It is therefore not surprising that organisations’ interest in adopting modern platforms that help them become more data-driven has surged. Boards around the world now demand that decisions be backed by data, and when 1% can make the difference between profit and loss, company survival increasingly depends on it.
Perhaps more surprising is the number of organisations operating today with a heavy reliance on the age-old decision-maker’s crutch: the Excel spreadsheet. The requirement might be to sift through a data set exported from a key source system looking for anomalies, or to cut a budget for the year ahead, revise it with a monthly forecast, and hope that the flow of actuals lines up and delivers a positive variance.
These basic functionalities are all well and good on the surface, and may appear to offer a degree of sophistication for organisations that have never looked at their data (believe us, they are out there). However, they don’t help put the story behind the data into the right context.
They don’t offer diagnostic capabilities to understand the ‘why’, and they certainly don’t offer the ability to pivot from a rearward-looking view to a forward-looking, predictive view of the world to understand what is likely to happen next. The more sophisticated platforms can now suggest to the decision maker which path is the right one, and they do so with clear explainability, a level of quantification that goes beyond having a human correctly interpret a set of bar charts on a business intelligence dashboard, an ability to handle bias appropriately, and levers that help tune the recommendation to rapidly changing business conditions, as we saw with COVID.
If your organisation still features Excel spreadsheets in its key decision-making value chains, you may have fallen behind the pack, but it is not too late to make your move. And you need to make that move now, before the post-COVID bounce arrives.
For more mature organisations, the situation is a little different. Many have had Decision Support systems and Business Intelligence tools in place for some time, but they were designed through the lens of legacy capability. The net result is that these ecosystems became huge report factories, where tens or hundreds of reports churned through their schedules, often overnight, and delivered their output to users’ inboxes in the hope users would find what they were looking for.
Frequently, such reports were not prepared for the next set of questions a user was likely to ask. Users then demanded a copy of the data extract, only to go off and recreate their own analyses, and, you guessed it, in Excel. Or they purchased a copy of Tableau on the company credit card, much to the ire of the central IT team. Today, report factories are ‘old hat’, and their licence constructs are expensive and out of date.
Decision Support systems for the future need to be designed and implemented with a high degree of self-serviceability in mind, and they need to be built on trusted data foundations that contain enough history to help predict the future.
According to Gartner, modern business intelligence platforms offer easy-to-use functionality that supports a full analytic workflow, from data preparation to visual analysis, with an emphasis on self-service usage and augmented user assistance.
These platforms are no longer differentiated by the kinds of data visualisations they can create; these are now commoditised. Interactive key performance indicator (KPI) dashboards using common charts and drawing on a wide range of data sources are standard fare. Mapping capabilities are also now available across the board. Where these platforms now differ is in how well they augment traditional analytics with other value-adds.
Capabilities like machine learning (ML)- and artificial intelligence (AI)-assisted data preparation, insight generation, and insight explanation help businesspeople and data analysts determine the real story behind the data more effectively than they could by building their own bespoke dashboards. As a result, the definition of self-service is shifting to feature more of these augmented capabilities.
Gartner goes on to define twelve critical capability areas, which would make a good basis for your scoring sheet when assessing the options.
Security: In a world seeing an increasing number of ransomware and related cyberattacks, capabilities that enable platform security, administration of users, and auditing of platform access and authentication should be top of the list.
Manageability: Capabilities that track usage of the platform and manage how information is shared (and by whom). This should include information relating to cost consumption where licensing operates under such a construct.
Cloud Capability: The ability to support building, deployment and management of analytics in the cloud, based on data stored both in the cloud and on-premises, and doing so securely.
Data source connectivity: Capabilities that enable users to connect to, query and ingest data while maximising the performance benefits of the underlying system (e.g. caching, in-memory optimisation, parallel processing).
Data preparation: Support for drag-and-drop, user-driven combination of data from different sources, and the creation of analytic models (such as user-defined measures, sets, groups, and hierarchies).
Cataloguing: The ability to automatically generate and curate a searchable catalogue of analytic content, making it easier for analytic consumers to know what content is available.
Automated insights: A core attribute of augmented analytics, this is the application of machine learning and other data science techniques to automatically generate findings for end users, for example by identifying the most important attributes in a dataset (see the sketch after this list).
Reporting: The ability to create and distribute (or “burst”) pixel-perfect, grid-layout, multipage reports to users on a scheduled basis.
Data visualisation: Support for highly interactive dashboards and exploration of data through manipulation of chart images.
Data storytelling: The ability to combine interactive data visualisation with narrative techniques to package and deliver analytic content in a compelling, easily understood form for presentation to decision makers. Increasingly, such capabilities are replacing or automating the creation of PowerPoint presentations.
Natural language query (NLQ): This enables users to ask questions and query data and analytic content using terms that are either typed into a search box or spoken.
Natural language generation (NLG): The automatic creation of linguistically rich descriptions of answers, data and analytic content. Within the analytics context, as the user interacts with data, the narrative changes dynamically to explain key findings or the meaning of charts or dashboards.
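To make the automated insights capability concrete, here is a minimal Python sketch of the underlying technique: fit a simple model and surface the attributes that most influence an outcome. It assumes scikit-learn and pandas are available, and the churn-style feature names are hypothetical placeholders; commercial platforms implement this in their own proprietary ways.

```python
# Minimal sketch of "automated insights": rank the attributes that most
# influence an outcome, the kind of finding a BI platform might surface
# automatically. Data is synthetic; feature names are hypothetical.
import pandas as pd
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier

# Stand-in for a business dataset, e.g. customer churn records.
X, y = make_classification(n_samples=1_000, n_features=6, n_informative=3,
                           random_state=42)
features = ["tenure", "monthly_spend", "support_calls",
            "discount_pct", "region_code", "channel_code"]
df = pd.DataFrame(X, columns=features)

# Fit a simple model purely to extract attribute importances.
model = RandomForestClassifier(n_estimators=200, random_state=42).fit(df, y)

# Present the top drivers as a plain-language "insight".
ranked = sorted(zip(features, model.feature_importances_),
                key=lambda pair: pair[1], reverse=True)
for name, score in ranked[:3]:
    print(f"'{name}' is a key driver of the outcome (importance {score:.2f})")
```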
Most vendors will claim that their product can do everything, and the leading offerings increasingly jockey for position with each release as they respond and play catch-up with one another. However, only once you have used them in anger are the platforms’ respective strengths and weaknesses revealed. Here we provide a run-through of the dominant players based on real-world experience.
More recently, experience and feedback on Microsoft Power BI across multiple projects has been highly positive, with features such as Paginated Reporting, Deployment Pipelines, and Dataflows all contributing to high-value client outcomes and a sense of enterprise readiness and solidity. Its augmented analytics features are only a few clicks away, and the platform feels made for the cloud.
Furthermore, its seamless integration into Microsoft Teams makes it an ideal accompaniment for data-driven decision making as part of the now ubiquitous online meeting. Its low-cost entry point, via Power BI Desktop and Power BI Pro licensing, makes it a compelling offering that should see automatic entry in any BI product comparison process.
Once you’ve had the opportunity to review the pros and cons of the different business intelligence tools, here are some best practices to follow before making your final selection.
If you are one of those organisations with a legacy platform in place, it is vital that you take the time to gain a firm understanding of what you have today. Faced with your latest licence renewal and the pressure to cough up the dollars, it is tempting to make a bold call, such as turning the system off to see if the IT Support phone rings, or simply deciding to switch to an option such as Power BI under the assumption that the new platform will be able to do everything the old one does.
Experience has shown us that the 80/20 rule rings true: the new platform will do 80% of what the old one did, but will struggle with the 20%, at which point you are stuck running both systems in parallel and incurring both costs into the future. Or there will be that one report sent to the regulator that no longer runs because the old system was decommissioned in a rush. The best advice is to run a short but effective Application Review to take stock of the use case library delivered by your current system.
For those cases where systems are undocumented, business intelligence tools are available to help automate the harvesting of important metadata that describes what has been built and the interdependencies between components. Motio and AVT BSP MetaManager are two examples of these useful tools.
Whether you have a cloud strategy, and which vendors you are considering as part of it, will very much have a bearing on which Business Intelligence tools score strongest in your specific assessment.
For example, Microsoft Power BI might score strongest on factors relating to usability and price, but if your cloud strategy ends up heading towards Google, it may no longer be the right choice, thanks to the volume discounting and bundling that may be on offer for Looker.
For those organisations preferring to defer the move to cloud for regulatory reasons or infrastructure availability, you may be surprised to find that some of the less popular offerings are a better fit, thanks to their on-premises origins.
If any Business Intelligence vendor tells you that you don’t need to construct a Data Foundation layer in your organisation to get the best out of their platform, proceed with caution. Whilst it is true that the majority of Business Intelligence and Analytics tools can ingest data directly, it is very rarely the best approach to take.
The best implementations, those that help an organisation become truly data-driven, feature a holistic data architecture and ETL framework that nurtures the vast historic record of data an organisation holds, and exposes to the BI layer only what is required, keeping that layer lightweight in the process.
Locking yourself into an analytics tool that isn’t a clean fit within your data architecture sets up development hurdles before the process even begins. Consider data foundation technologies from Snowflake, Microsoft (Azure stack), Amazon (Redshift), Google (BigQuery), or IBM as part of your cloud or non-cloud strategy. We often say at Tridant that you ‘date’ your BI tool but ‘marry’ your data foundation. It is important to get your data story right.
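As a minimal illustration of that principle, the following Python sketch keeps the full history in the data foundation and publishes only a slim, pre-aggregated extract for the BI layer to consume. The file, table, and column names are hypothetical stand-ins; in practice the foundation would be a warehouse such as those named above.

```python
# Minimal sketch of keeping the BI layer lightweight: the data foundation
# holds the full history; only a curated slice is published for dashboards.
import pandas as pd

# Full historic record in the data foundation (CSV stand-in; hypothetical names).
history = pd.read_csv("sales_history.csv", parse_dates=["order_date"])

# Publish only what the BI layer actually needs: recent periods, pre-aggregated
# to monthly grain per region.
recent = history[history["order_date"] >= "2020-01-01"]
bi_extract = (
    recent.groupby([pd.Grouper(key="order_date", freq="MS"), "region"])
          .agg(revenue=("amount", "sum"), orders=("order_id", "count"))
          .reset_index()
)

# The slim extract, not the full history, is what the dashboards connect to.
bi_extract.to_csv("monthly_sales_extract.csv", index=False)
```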
Never put full trust in the vendor demo. These are typically curated scenarios, run by skilled presenters, that can be made to appear to do exactly what you want. It is vital that you see the technology working in your environment, against your data, being used by your people.
Devise a scoring methodology that lists out your mandatory and non-mandatory requirements, add weightings for good measure, and run your shortlisted business intelligence tools side by side for a short period. Hit the vendors with your questions, and make sure they share their product roadmaps with you.
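To illustrate, here is a minimal Python sketch of such a weighted scoring sheet; the capability areas, weights, and trial scores are hypothetical placeholders for your own requirements and results.

```python
# Minimal sketch of a weighted scoring sheet for comparing shortlisted BI
# tools. Weights and scores below are illustrative placeholders.
WEIGHTS = {                  # relative importance of each capability area
    "security": 5,
    "cloud_capability": 4,
    "automated_insights": 3,
    "data_visualisation": 3,
    "nlq": 2,
}

# Raw scores out of 10 gathered during the side-by-side trial (hypothetical).
SCORES = {
    "Tool A": {"security": 8, "cloud_capability": 9, "automated_insights": 6,
               "data_visualisation": 8, "nlq": 5},
    "Tool B": {"security": 7, "cloud_capability": 6, "automated_insights": 8,
               "data_visualisation": 9, "nlq": 7},
}

def weighted_total(scores: dict[str, int]) -> float:
    """Sum of score x weight, normalised back to a 0-10 scale."""
    total = sum(scores[area] * w for area, w in WEIGHTS.items())
    return total / sum(WEIGHTS.values())

# Rank the tools from highest to lowest weighted total.
for tool, scores in sorted(SCORES.items(), key=lambda kv: -weighted_total(kv[1])):
    print(f"{tool}: {weighted_total(scores):.1f} / 10")
```

Normalising the totals back to a familiar 0-10 scale keeps the comparison easy for stakeholders to read at a glance.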
For the ultimate safety net, involve an implementation partner with skills across each of the offerings you are considering, to help keep the vendors honest during their presentations and to shorten the time needed to complete your Business Intelligence tools comparison.
Most large technology vendors strive to meet revenue targets across the quarters of their financial year. This means their motivation to close a deal will be strongest at key points in the year: namely quarter end, and certainly the end of the financial year.
It is also useful to maintain a degree of competitive tension between your top two preferred technologies to achieve the best pricing. Your shortlisted business intelligence tools may be line ball from a capability standpoint, but it could be the commercial deal that sways you one way or the other.