The 2014 Business Intelligence Report from Gleanster Research showcases business intelligence tools ranked by four criteria: ease of use, ease of deployment, features and functions, and overall value. We asked a minimum of 8 users to provide feedback in each of these categories for vendors they have experience using. The data reveals two things: (1) whether customers are comfortable providing feedback on the solutions and (2) how well each solution actually fits the user's needs. Put differently, did the solution providers sell the tool to the right customers? Do customers actually value the capabilities they have access to, regardless of their role?
The intent of this analysis is not to compare features and functions; frankly, it wouldn't be appropriate to compare vendors that way. The Business Intelligence landscape has changed dramatically over the last ten years. What was traditionally a robust offering with features designed for highly skilled analysts and IT has expanded into a landscape of technology providers that address very different business and functional requirements. While BI was always intended to be consumed by any business function (Sales, Marketing, Operations, Finance, IT, etc.), it was really only of value if the right questions were asked AND the data was available AND there was sufficient time to do the analysis before a decision had to be made. The trick these days is to strike a balance between the robustness of capabilities and the business requirements.
Core themes in this year's data include self-service BI (which is causing considerable buzz around the concept of "agile BI"), mobility, and dashboards. Over the years, we have learned that the value of a solution lies not in the robustness of its features, but in the organization's ability to act on the insights the tool produces. So it's no wonder 89% of business leaders estimate they regularly use only about 30% of the data that exists in their organization for informed, data-driven business decisions. Organizations struggle to translate available data into strategic insights in a timely and efficient way.
Coming back to the features-and-functions discussion, it's really about bridging the gap between analysis and the ability of business users to consume it and translate it into strategy. For some organizations, the best way to do this is to give decision makers access to self-service BI capabilities, making it easier for them to "own" the analysis and learn from the process. This is driving a new wave of innovation in BI and a significant shift to cloud delivery. Cloud-based BI offers rapid implementation, easy data consumption and connectivity, and rapidly configurable mobile analytics for business leaders. For additional insights on these and other trends in the space, check out the full Gleansight Benchmark report, which can be downloaded here.
- Gleanster Vendor Rankings FAQs -
What is Gleanster’s methodology for capturing vendor rankings data?
Vendor rankings are crowd-sourced from end users in Gleanster surveys. Respondents are asked to rank their current or past experience with relevant vendors. A minimum of 8 user reviews is required for a vendor to show up on the chart. This is not a statistically valid sample size, but it's quite difficult to get in front of actual users. Gleanster promotes this survey independently AND allows vendors to promote the survey link prior to publication to drive customer participation. The 8 highest survey responses are taken into account in the rankings. All vendors have an equal ability to be covered on the rankings charts. Vendors do not pay Gleanster to be covered and cannot influence placement with an analyst relationship.
How do I interpret the data on this chart?
Eight users with current or past experience with one or more solutions from a given vendor gave that vendor an average score of "x" based on the criteria of this chart. This information should (1) be taken with a grain of salt given the sample size and (2) be combined with other sources of rankings data available in the market research industry.
If a vendor isn’t ranked as “Best,” what does that mean?
The Good, Better, Best rankings are a way to segment user feedback into easy-to-digest buckets. Any vendor on the chart has a technology offering that is successfully addressing the needs of a satisfied customer base. Regardless of the score, placement on the vendor rankings charts is a good thing: it means you get insight into user perception from a tiny subset of the vendor's users who were willing to provide feedback. But don't assume the score is indicative of ALL customers, whether it's a top ranking score or a lower one. Again, use the data as one of many pieces of information that may influence your decision. Our goal is to help buyers, not bias buyers.
If a vendor isn’t ranked at all, what does that mean?
Not showing up on the vendor rankings merely indicates that Gleanster did not capture enough user reviews for inclusion in the report. Sometimes the magic works, and sometimes it doesn't. But we're determined to keep trying so you can make informed decisions on technology spend. You will, however, notice that these vendors are covered in our Gleansights and usually have a Gleanster Skinny covering their solutions.
Why does Gleanster provide rankings in this way?
You have access to an abundance of data from analysts who provide context about vendors based on the solutions offered and market presence. It's far more difficult to capture user feedback based on the criteria buyers actually consider when investing in technology solutions. Our rankings are based on end-user feedback and should be used as a directionally relevant data point in your decision, one of many. The data may not be statistically valid, but it's better than a sharp stick in the eye. It's up to you to determine whether it merits any weight in your decision process.
Can the data be biased?
Vendors have the ability to promote the survey link prior to publication. Technically, they could encourage 8 customers to bias the data. However, survey responses are anonymous, and users are generally quite honest, which ultimately impacts the average score for vendors. Also, buyers are savvy. Gleanster does capture personally identifiable data on survey respondents to verify the accuracy of user feedback, but analysis is always done at an aggregate level. Personal data from respondents is put in a special lock box that Al Gore keeps under his bed.