Self-service analytics: data-driven challenges and solutions
By Bill Saltmarsh on December 1, 2015
Self-service analytics is an ideal that is widely shared and promoted in the modern workplace, especially by organizations that pride themselves on being data-driven. It can be described as an environment where business users are enabled and encouraged to access data directly, in order to derive insight from business information as quickly and efficiently as possible.
However, the path to true self-service is fraught with obstacles and unforeseen complexities. If I were to characterize the majority of these problems, I’d say that we, as data professionals, underestimate how difficult it is to transform a data-curious business user into someone who can competently access, analyze and consume that data.
Overcoming the challenges
From where I sit, there are two primary options for overcoming these difficulties. The first is to spend an extraordinary amount of time studying the behavior of your business users, understanding their needs, and then training them at both an abstract and a contextual level, so that their skills improve enough for them to become truly self-sufficient in their roles.
The second is to acquire and/or build tools that drastically simplify the process of acquiring, cleaning and analyzing data. As someone who continually works to make data products more accessible, I constantly ask myself: Is this easy to use? Unfortunately, tools that truly are easy to use are few and far between.
Using Tableau
For example, Tableau prides itself on its ability to convert users into analysts. However, once you get past the initial inspiration provoked by learning the potential of Tableau (or any other visualization app), you’re left with the difficult and technical work of creating actual meaning from disparate data. This is not easy work, even if it is easy to click and drag a field to make a bar chart.
Speaking specifically about Tableau, one common question from new users is: How do I compare the performance of a single dimension member against the performance of the larger category it falls within? This is such a common and logical question in data analysis, yet getting to the answer is not easy. Constructing a level of detail (LOD) expression, which is one solution to this question, is not a simple task. More than that, the concept is even harder to grasp than the mechanics, especially for someone with minimal data skills and experience.
Here’s an example of an LOD expression in Tableau. This allows you to see the % of Category total for each of the Sub-Categories, without actually having Category in your view:
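The original screenshot isn’t reproduced here, but a calculated field of this kind — using the [Sales], [Category] and [Sub-Category] field names from Tableau’s Superstore sample data, assumed for illustration — looks roughly like this:

```
// % of Category Total (field names assumed from Tableau's Superstore sample)
// The FIXED LOD computes each Category's total Sales independently of the view,
// so Sub-Category can sit alone on the shelf while each bar is still divided
// by its parent Category's total. MAX collapses the replicated row-level value.
SUM([Sales]) / MAX({ FIXED [Category] : SUM([Sales]) })
```

Format the result as a percentage and drop it on a shelf next to [Sub-Category], and each sub-category shows its share of its parent category — no [Category] pill required.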
…not exactly intuitive.
Adding Blockspring
On the whole, Tableau does make things easier for business users. Another tool that does so is Blockspring. It was first developed as an add-on for Excel and Google Sheets that allowed anyone to connect to web data and import it without leaving the friendly confines of a spreadsheet, and without writing any code at all.
One of the impediments for the data-curious is the complex nature of acquiring data. The ability to connect to a raw data source and have a tool make sense of that data is a huge step forward in the world of self-service analytics. Not only does Blockspring provide tools for spreadsheet-based analyses, but it has also branched out into Tableau integration. For example, here’s how easy it is to connect to a web page, convert that page’s content into data and then bring it into Tableau Desktop:
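Blockspring does this step with no code at all. To make concrete what “convert that page’s content into data” involves under the hood, here is a minimal Python sketch — not Blockspring’s actual API, just the general idea — that parses an HTML table into rows and emits CSV, a format Tableau Desktop reads natively. The page content is a toy inline snippet; a real run would download the HTML first.

```python
import csv
import io
from html.parser import HTMLParser

# A toy stand-in for a fetched web page (a real run would download this HTML).
PAGE = """
<table>
  <tr><th>Country</th><th>Population</th></tr>
  <tr><td>France</td><td>66000000</td></tr>
  <tr><td>Japan</td><td>127000000</td></tr>
</table>
"""

class TableParser(HTMLParser):
    """Collects the cells of an HTML table into a list of rows."""
    def __init__(self):
        super().__init__()
        self.rows, self._row, self._in_cell = [], [], False

    def handle_starttag(self, tag, attrs):
        if tag == "tr":
            self._row = []
        elif tag in ("td", "th"):
            self._in_cell = True

    def handle_endtag(self, tag):
        if tag == "tr" and self._row:
            self.rows.append(self._row)
        elif tag in ("td", "th"):
            self._in_cell = False

    def handle_data(self, data):
        if self._in_cell:
            self._row.append(data.strip())

parser = TableParser()
parser.feed(PAGE)

# Write the extracted rows as CSV -- 'clean' data Tableau can open directly.
out = io.StringIO()
csv.writer(out).writerows(parser.rows)
print(out.getvalue().strip())
```

The value of a tool like Blockspring is precisely that a business user never has to see (or debug) this layer.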
Finding a solution
There you have it. In just over two minutes, I made a connection, converted website content to ‘clean’ data, imported it into Tableau Desktop (version 9.1 or greater is required) and analyzed it to find outliers based on multiple measures. As far as data analysis projects go, this is simple stuff. However, the complexity that these tools (Blockspring and Tableau) remove cannot be overstated. Tools like these open up possibilities to non-technical business users in ways that were previously unattainable.
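The post doesn’t show the outlier calculation itself, but one common way to flag “outliers based on multiple measures” is a z-score per measure: a record is flagged when it sits unusually far from the mean on every measure at once. A small Python sketch with made-up numbers (not the dataset from the demo):

```python
import statistics

# Hypothetical records with two measures (illustrative data only).
records = [
    {"name": "A", "sales": 100, "growth": 0.05},
    {"name": "B", "sales": 110, "growth": 0.04},
    {"name": "C", "sales": 95,  "growth": 0.06},
    {"name": "D", "sales": 400, "growth": 0.55},  # unusual on both measures
]

def outliers(rows, measures, threshold=1.2):
    """Flag rows whose |z-score| exceeds the threshold on every measure."""
    stats = {
        m: (statistics.mean(r[m] for r in rows),
            statistics.stdev(r[m] for r in rows))
        for m in measures
    }
    flagged = []
    for r in rows:
        if all(abs(r[m] - stats[m][0]) / stats[m][1] > threshold
               for m in measures):
            flagged.append(r["name"])
    return flagged

# With only four rows, z-scores stay small, hence the low threshold;
# a real dataset would use something like 2 or 3.
print(outliers(records, ["sales", "growth"]))  # prints ['D']
```

In a visual tool, the same idea is a scatter plot of the two measures with reference bands: the outlier is the point outside both bands.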
That being said, there is still a vast amount of potential for improvement. In the example above, the data import was clean and minimal. However, possible pitfalls abound in the steps from URL to dashboard. The challenge for data engineers, analysts and stewards is to continue to perfect the tools that enable data democratization. Never stop asking: Is this easy to use?