For organisations just starting out, selecting the right analytics and business intelligence (ABI) tools to meet the decision-making needs across your user base can be daunting, given the array of options and the speed at which the market is moving.
In a recent study by Gartner, entitled “The Future of Decisions”, two statistics stood out as compelling indicators of the current state of play. Of the respondents surveyed, 65% agreed that the decisions they make today are more complex than they were two years ago, and 53% agreed that there is now a higher expectation for them to be able to explain or justify their decisions.
It is therefore not surprising that organisations’ interest in adopting modern platforms that help them become more data-driven has risen sharply. Boards worldwide are now demanding that decisions be backed by data, and when 1% can make the difference between profit and loss, company survival increasingly depends on it.
Perhaps more surprising is the number of organisations operating today that still rely heavily on the age-old decision-maker’s crutch – the Excel spreadsheet. Whether the requirement is to sift through a data set exported from a key source system looking for anomalies, or to cut a budget for the year ahead, revise it with a monthly forecast, and hope that the flow of actuals lines up and delivers a positive variance, the spreadsheet remains the default.
These basic functionalities are all well and good on the surface and may appear to offer a degree of sophistication for those organisations that have never looked at their data (believe us, they are out there). However, they don’t help put the story behind the data into the right context.
They don’t offer diagnostic capabilities to understand the ‘why’, and they certainly don’t offer the ability to pivot from a rearward-looking view to a forward-looking, predictive view of the world to understand what is likely to happen next. The more sophisticated platforms can now suggest to the decision-maker which path is the right one.
They do so with clear explainability, a level of quantification that goes beyond having a human correctly interpret a set of bar charts on a business intelligence dashboard, an ability to handle bias appropriately, and levers that help tune the recommendation as business conditions change rapidly, as they did during COVID.
If your organisation still features Excel spreadsheets in key decision-making value chains, you may have already fallen behind the pack, but it is not too late to make your move. And you need to make that move now, before the post-COVID bounce arrives.
For more mature organisations, the situation is a little different. Many have had Decision Support systems and Business Intelligence tools in place for some time, but these were designed through the lens of legacy capability. The net result is that these ecosystems became huge report factories, where tens or hundreds of reports churned through their schedules, often overnight, and fired the output into users’ inboxes in the hope they would find what they were looking for.
Frequently, such reports were not prepared for the next set of questions the user was likely to ask. What happened next was predictable: users demanded a copy of the data extract, only to go off and recreate their own analyses in (you guessed it) Excel. Or they purchased a copy of Tableau on the company credit card, much to the ire of the central IT team. Today, report factories are ‘old hat’, and their licence constructs are expensive and out of date.
Decision Support systems for the future need to be designed and implemented with a high degree of self-serviceability in mind. They need to be built on trusted data foundations that contain enough history to help predict the future.
According to Gartner, modern business intelligence platforms offer easy-to-use functionality that supports a full analytic workflow, from data preparation to visual analysis, emphasising self-service usage and augmented user assistance.
These platforms are no longer differentiated by the kinds of data visualisations they can create, which are now commoditised. Interactive key performance indicator (KPI) dashboards using common charts and drawing on a wide range of data sources are standard fare, and mapping capabilities are available across the board. The platforms now differ in how well they augment traditional analytics with other value-adds.
Capabilities like machine learning (ML) and artificial intelligence-assisted data preparation, insight generation, and insight explanation help businesspeople and data analysts determine the real story behind the data more effectively than they could by building their own bespoke dashboards. As a result, the definition of self-service is shifting to feature more of these augmented capabilities.
Gartner defines twelve critical capability areas, which would make a good basis for your scoring sheet when assessing the options.
Security: In a world seeing an increasing number of ransomware and related cyberattacks, capabilities that enable platform security, administration of users, and auditing of platform access and authentication should be top of the list.
Manageability: Capabilities that track platform usage and manage how information is shared (and by whom). This should include information relating to cost consumption where licensing operates under such a construct.
Cloud Capability: The ability to support building, deployment and management of analytics in the cloud, based on data stored both in the cloud and on-premises, and doing so securely.
Data source connectivity: Capabilities that enable users to connect to, query and ingest data, and maximise the performance benefits of the underlying system while doing so (e.g. caching, in-memory optimisations, parallel processing).
Data preparation: Support for drag-and-drop, user-driven combination of data from different sources, and the creation of analytic models (such as user-defined measures, sets, groups, and hierarchies).
Cataloguing: The ability to automatically generate and curate a searchable catalogue of analytic content, thus making it easier for analytic consumers to know what content is available.
Automated insights: A core attribute of augmented analytics, this is the application of machine learning and other Data Science techniques to automatically generate findings for end users (for example, by identifying the most important attributes in a dataset); a minimal sketch of this idea follows this list.
Reporting: The ability to create and distribute (or “burst”) pixel-perfect, grid-layout, multipage reports to users on a scheduled basis.
Data visualisation: Support for highly interactive dashboards and exploration of data through manipulation of chart images.
Data storytelling: The ability to combine interactive data visualisation with narrative techniques in order to package and deliver analytic content in a compelling, easily understood form for presentation to decision-makers. Increasingly, such capabilities are either replacing or automating the creation of PowerPoint presentations.
Natural language query (NLQ): This enables users to ask questions and query data and analytic content using terms that are either typed into a search box or spoken.
Natural language generation (NLG): The automatic creation of linguistically rich descriptions of answers, data and analytic content. Within the analytics context, as the user interacts with data, the narrative changes dynamically to explain key findings or the meaning of charts or dashboards.
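To make the automated insights capability concrete, the sketch below shows the kind of analysis an augmented platform runs behind the scenes: fit a simple model to a dataset and rank the attributes that most influence an outcome. It is a minimal illustration only; the synthetic data, column names, and model choice are our own assumptions, not any vendor’s implementation.

```python
# Minimal sketch of an "automated insight": rank which attributes in a
# dataset most influence an outcome. Data and names are illustrative.
import pandas as pd
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier

# Stand-in for a business dataset, e.g. "did the customer churn?"
X, y = make_classification(n_samples=1_000, n_features=5,
                           n_informative=3, random_state=42)
features = ["tenure_months", "monthly_spend", "support_tickets",
            "discount_pct", "region_code"]  # hypothetical attribute names
df = pd.DataFrame(X, columns=features)

# Fit a simple model and rank attributes by importance -- the kind of
# finding an augmented analytics platform would surface automatically.
model = RandomForestClassifier(n_estimators=200, random_state=42).fit(df, y)
ranking = pd.Series(model.feature_importances_, index=features)
print(ranking.sort_values(ascending=False))
```

In a real platform, a ranking like this is generated without the user asking for it and is narrated back in plain language, which is where the NLG capability above comes in.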
Most vendors will claim that their product can do everything, and the leading offerings increasingly jockey for position with each release as they play catch-up with one another. However, only once you have used them in anger are the platforms’ respective strengths and weaknesses revealed. Here we provide a run-through of the dominant players, based on real-world experience.
Pros: Power BI’s popularity has surged in recent years. Four years ago, when the Tridant team reviewed its capabilities for Enterprise Readiness, it fell some way short, and our advice to clients at the time was to ‘handle with care’.
More recently, experience and feedback across multiple projects have been highly positive, with features such as Paginated Reporting, Deployment Pipelines, and Dataflows all contributing to high-value client outcomes and a sense of enterprise readiness and solidity. Its augmented analytics features are only a few clicks away, and the platform feels made for the cloud.
Furthermore, its seamless integration with Microsoft Teams makes it an ideal accompaniment for data-driven decision-making in the now ubiquitous online meeting. Its low-cost entry point via Power BI Desktop and Power BI Pro licensing makes it a compelling offering that should earn automatic entry in any BI product comparison.
Cons: Despite its improvements in Enterprise Readiness, there are still signs throughout the product that it is maturing. Several of the newer features have bugs, and the problem with dealing with an organisation as vast as Microsoft is that the customer support experience lacks the intimacy and responsiveness that some of the more focused vendors provide.
On one particular support call, the representative was unaware of some of the latest features. And for those organisations that want to retain the comfort of an on-premises implementation, Power BI Report Server offers a reduced feature set compared to the cloud deployment.
Bad habits are also forming in the industry, with new implementations driven by unseasoned practitioners who are seemingly unaware of the role a good data foundation plays, thanks to a product that allows you to do almost everything in the reporting layer. Just because you can doesn’t mean you should, and much time is now being spent undoing poor implementations that concentrate too much of the workload, data integration, and business rules near the end user.
Pros: You only needed to attend a global Tableau conference in the days before COVID to see first-hand the profound impact this platform has had on its end-user community. People from Sales, Marketing, and other non-technical backgrounds were empowered to experience their own “aha” moments, bringing their data sets and stories to life and looking good in front of the boss in the process.
To this day, Tableau remains the industry leader in speed-of-thought analysis, self-serviceability, front-end aesthetics, and user experience. Tableau also has the best online community for supporting its users, which means happier users and fewer tickets logged with IT support.
And whilst it is still a “Cessna” to the Alteryx “Boeing 747”, Tableau Prep has helped bridge the gap in the tool’s ETL and data preparation capabilities, an area where other tools have typically played a complementary role. Tableau is still the safest choice if delighting your users is the primary goal.
Cons: Tableau has never fully broken through to become the de facto choice as a singular Enterprise BI tool, and for that reason it is frequently seen alongside legacy platforms such as Cognos BI or Oracle OBIEE; organisations should take comfort in the number of such coexistence implementations working well.
However, Tableau is starting to look expensive alongside some of its rivals where user counts are on the higher side, and it is a platform that needs a reasonable amount of work done upstream, in the shape of a Data Warehouse or some other data fabric, to scale effectively.
Pros: Cognos Analytics is a bit of a wildcard entry, and a slightly unusual inclusion given its low score in the most recent Gartner Magic Quadrant. However, what has become apparent in recent times is the surprising number of organisations that still retain large legacy implementations of Cognos BI, presenting a headache for their Analytics Managers.
Also rather interesting is that those large legacy implementations often have very satisfied end-user communities, thanks to years of investment in mission-critical reporting ecosystems that churn out trusted reporting day in and day out, even if they are typically unsexy aesthetics-wise. With classic features such as pixel-perfect reporting, advanced scheduling, integration with diverse data sources (ranging from TM1 cubes to Google BigQuery and SAP HANA databases), integration with IBM’s leading Data Science stack, and almost bulletproof, bug-free capability on-premises and in the cloud, Cognos is one of those platforms with definite staying power. As they say, the “good stuff sticks”.
Cons: Perhaps the biggest issue with Cognos Analytics is not the product itself but the IBM machine around it. Ever since the acquisition back in 2008, things haven’t been quite the same, and very few of the best Cognos experts remain in the organisation. Product strategy and direction have been questionable. In particular, user-friendliness has taken time to catch up to the market leaders, meaning the barrier to entry has been rather steep, though this has improved more recently.
The number of executives who would rather not deal with IBM ever again is growing, usually due to pricing difficulties, a negative experience from a licence audit, or both. Unfortunately for Cognos Analytics, it is simply no longer a sexy net-new option alongside its more modern foes, even though the foundations are there for it to be a formidable platform.
Pros: If your organisation is in the retail or manufacturing space and you’ve never seen Qlik’s associative data model in action (ask for the grey/white/green viz), you need to see it before making any major decisions. We have seen that feature alone win Qlik the nod, with no further questions asked.
In terms of time to value when building an application from the ground up to production, Qlik is a definite market leader. Qlik’s recent addition of the Attunity product family also gives the platform a very powerful data integration story that others cannot match.
Cons: Users of Qlik tend to indicate that the delineation between the legacy QlikView platform and the newer SaaS offering, Qlik Sense, could have been handled better and communicated more clearly in recent years. Many organisations are now stuck in a ‘no man’s land’ with both platforms in use and a clouded roadmap for cleaning things up. Pricing also continues to be an issue, with Qlik in some cases being double or triple the cost of an equivalent Microsoft Power BI construct. The requirement to use NPrinting for certain pixel-perfect reporting needs also feels a little old school today.
Pros: Our first glimpse of Looker was at an Exasol conference in Berlin a few years ago, and at that time it didn’t blow our hair back. Today, its status as a challenger in the Magic Quadrant is justified, and its Google backing adds serious weight. In contrast to Power BI’s preference for loading and storing data locally, Looker encourages the use of the underlying database, which will resonate with purists. The LookML semantic layer continues to be a focal point, offering good Enterprise-grade capabilities around the management of business rules and calculative logic.
The fact that the team is content to open up LookML assets to other analytics platforms shows a level of openness that other vendors don’t seem to match. For organisations that possess good developer skills and capacity, and/or are considering a cloud direction on GCP, Looker is definitely worth a look!
Cons: Looker’s commercial approach to the market is an interesting one. The company is not that interested in selling one or two licences so an organisation can experiment before deciding; it looks for a more substantial commitment from the outset, which won’t suit organisations that like to run proofs of concept. This is also a platform for technical practitioners who know what they are doing, rather than for a user community that wants the simple, drag-and-drop experience offered by the likes of Tableau. Local support is still ramping up for buyers in the Asia-Pacific region, so be prepared for the occasional call to the US for support.
Once you’ve had the opportunity to review the pros and cons of the different business intelligence tools, here are some best practices to follow before making your final selection.
If you are one of those organisations with a legacy platform in place, it is vital that you take the time to gain a firm understanding of what you have today. Faced with your latest licence renewal and the pressure to cough up the dollars, it is tempting to make a bold call, such as turning the system off and seeing if the IT Support phone rings, or simply deciding to switch to an option such as Power BI on the assumption that the new platform will do everything the old one does.
Experience has shown us that the 80/20 rule rings true: the new platform will do 80% of what the old one did but will struggle with the 20%, at which point you are stuck running both systems in parallel and paying for both into the future. Or there will be that one report sent to the regulator that no longer runs because the system was decommissioned in a rush. The best advice is to run a short but effective Application Review to take stock of the use case library delivered by your current system.
For those cases where systems are undocumented, business intelligence tools are available to help automate the harvesting of important metadata that describes what has been built, and the interdependencies between components. Motio and AVT BSP MetaManager are two examples of these useful tools.
Whether you have a cloud strategy, and which vendors you are considering as part of it, will very much have a bearing on which Business Intelligence tools score strongest in your specific assessment.
For example, Microsoft Power BI might score strongest on factors relating to usability and price, but if your cloud strategy takes a direction towards Google, it may no longer be the right choice for you thanks to the volume discounting and bundling that may be on offer for Looker.
For those organisations preferring to defer the move to the cloud for regulatory reasons or infrastructure availability, you may be surprised to find that some of the less popular offerings are a better fit, thanks to their on-premises origins.
If any Business Intelligence vendor tells you that you don’t need to construct a Data Foundation layer in your organisation to get the best out of their platform, proceed with caution. Whilst it is true that the majority of Business Intelligence and Analytics tools can ingest data directly, it is very rarely the best approach to take.
The best implementations, the ones that help an organisation become truly data-driven, feature a holistic data architecture and ETL framework that nurtures the organisation’s vast historical record of data and exposes only what is required to the BI layer, keeping that layer lightweight in the process.
Locking yourself into an analytics tool that isn’t a clean fit within your data architecture sets up development hurdles before the process even begins. Consider data foundation technologies from Snowflake, Microsoft (Azure stack), Amazon (Redshift), Google (BigQuery), or IBM as part of your cloud or non-cloud strategy. We often say at Tridant that you ‘date’ your BI tool but ‘marry’ your data foundation. It is important to get your data story right.
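As a toy illustration of that principle, the sketch below keeps the full historical record in the data foundation and publishes only a slim, curated view for the BI tool to connect to. DuckDB merely stands in for the warehouse here, and the table and column names are hypothetical.

```python
# A minimal sketch of "expose only what the BI layer needs": the wide,
# full-history table stays in the data foundation; the BI tool connects
# to a lightweight, pre-aggregated view. Names and data are hypothetical.
import duckdb

con = duckdb.connect()  # in-memory stand-in for the warehouse

# The full historical record lives in the data foundation.
con.execute("""
    CREATE TABLE sales_history AS
    SELECT * FROM (VALUES
        (DATE '2023-01-15', 'AU', 'Widgets', 120.0),
        (DATE '2024-03-02', 'NZ', 'Gadgets',  80.5),
        (DATE '2025-06-20', 'AU', 'Widgets', 210.0)
    ) AS t(sale_date, region, product, revenue)
""")

# The BI layer sees only a curated view; business rules and aggregation
# live here, not in each individual dashboard.
con.execute("""
    CREATE VIEW v_sales_for_bi AS
    SELECT date_trunc('month', sale_date) AS month, region,
           SUM(revenue) AS revenue
    FROM sales_history
    GROUP BY 1, 2
""")

print(con.execute("SELECT * FROM v_sales_for_bi ORDER BY month").fetchdf())
```

The point is the shape, not the technology: because the rules sit in the foundation, every dashboard downstream inherits the same trusted numbers.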
Never put full trust in the vendor demo. These are typically curated scenarios run by skilled presenters that can be made to appear as if they are doing exactly what you want. It is vital that you see the technology working in your environment, against your data, being used by your people.
Devise a scoring methodology that lists out your mandatory and non-mandatory requirements, add some weightings for good measure, and run your shortlisted business intelligence tools side by side for a short period of time. Hit the vendor with your questions and ensure they share the product roadmap with you.
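To illustrate, a scoring sheet need not be more complicated than a small script. The sketch below computes a weighted total per vendor; the capability areas echo the Gartner list above, while the weights and scores are hypothetical placeholders for your own assessment, not our rating of any product.

```python
# A minimal weighted scoring sheet. Weights reflect how much each
# capability matters to you; scores (out of 10) come from your trial.
# All numbers below are hypothetical placeholders.
WEIGHTS = {
    "security": 5, "manageability": 3, "cloud": 4, "connectivity": 4,
    "data_prep": 3, "automated_insights": 2, "reporting": 5, "data_viz": 4,
}

VENDOR_SCORES = {
    "Vendor A": {"security": 8, "manageability": 7, "cloud": 9,
                 "connectivity": 8, "data_prep": 7, "automated_insights": 8,
                 "reporting": 6, "data_viz": 9},
    "Vendor B": {"security": 9, "manageability": 8, "cloud": 7,
                 "connectivity": 9, "data_prep": 8, "automated_insights": 6,
                 "reporting": 9, "data_viz": 7},
}

def weighted_total(scores: dict) -> float:
    """Weighted average of capability scores, normalised to a 0-10 scale."""
    return sum(scores[c] * w for c, w in WEIGHTS.items()) / sum(WEIGHTS.values())

# Print vendors from strongest to weakest weighted total.
for vendor, scores in sorted(VENDOR_SCORES.items(),
                             key=lambda kv: -weighted_total(kv[1])):
    print(f"{vendor}: {weighted_total(scores):.1f} / 10")
```

Splitting the weights between mandatory and non-mandatory requirements, as described above, is a straightforward extension of the same idea.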
For the ultimate safety net, involve an implementation partner with skills across each of the offerings you are considering, to help keep the vendors honest during their presentations and to accelerate the time to complete your Business Intelligence tools comparison.
Most large technology vendors strive to meet revenue targets across the quarters of their financial year. This means their motivation to close a deal will be strongest at key points in the year, namely quarter end and, above all, the end of the financial year.
It is also useful to maintain a degree of competitive tension between your top two preferred technologies to achieve the best pricing. Your shortlisted business intelligence tools may be line-ball from a capability standpoint, but it could be the commercial deal that sways you one way or the other.
Copyright © Tridant Pty Ltd.