Business Intelligence / Management Information user perceptions of quality and reliability
We conducted user research on behalf of the BI/MI Service to understand how perceived data quality influenced users of BI tools when producing or accessing reports.
The Service Management team in Information Services are responsible for supporting BI Tools, including BI Suite, the main BI tool used across the University.
Business intelligence (BI) tools are application software that collects and processes large amounts of unstructured data; they help prepare data for analysis so that reports, dashboards and data visualisations can be created to support decision making across the University.
The BI Service was aware of issues arising from the large number of datasets and reports available to users of their service, generated by the devolved user community over a long period of time.
From service experience and anecdotal feedback, they suspected that a lack of trust in some datasets, and difficulty finding reliable reports, were having a detrimental effect on the quality of their service and on user satisfaction.
The BI Service wanted to:
- find out what their users' goals were when looking for data or reports
- understand what could give users greater confidence when seeking and interacting with datasets or reports
- establish whether their idea of a quality seal - similar to the UK’s Quality Standards Kitemark - would be something users would value
- work out how user self-service processes might support scaling up the service
- include the users in the design and development process
What we did
We used the user research conducted during 2017/18 as our starting point for this work.
We worked with the BI/MI Service to better understand their perceptions of the quality issue and the formal or informal sources of insight they were drawing on as the basis for their hypothesis.
We prepared a round of in-depth interviews with users who matched the profiles of the personas developed earlier in 2018. The interviews were semi-structured, following a script prepared with the BI/MI Service Management team, but allowing the interviewer to improvise around any interesting insight, adding or adapting questions as needed.
Interviewing users for approximately an hour in this way allowed us to discover new insights across a broad spectrum of topics with rich context, while keeping the sessions from being biased towards one specific issue or solution before the main problems had been identified.
Having collected and classified all the interviewees’ insights, we identified new behaviours and attitudes. We wanted to understand how these new insights aligned with what had previously been uncovered and summarised in the personas.
To do this, we mapped the original personas and the interviewed cases against two axes: from technical to non-technical, and from strategic to non-strategic profiles.
Turning information into a story that the team could share
Our aim was to translate the findings into plainer, shared language: a story that anyone, from any role or background, could use to discuss such a complex series of problems.
During our mapping exercise, we naturally began thinking through the problem together using an analogy: the steep BI learning curve is like steep terrain shown on a contour map.
As we placed the different roles, personas and interviewed cases on this terrain, we asked ourselves what other factors could be represented on a contour-like map.
Unveiling mythical monsters
We then thought about how we could deal with the unknown areas and strange phenomena we discovered during research. Until relatively recently, maps also had to represent the blurred edges of the knowledge of their time. We transformed our own mysterious phenomenon, the lack of trust in data, into a mythical monster on the map.
This analogy helped us work with the BI/MI Service Management team and think about the situation without losing context. What had been a very technical discussion turned into a more strategic conversation, conducted in terms that non-specialists can understand.
Aligning expectations of quality criteria and prototype features
After we presented our map to the BI/MI Service Management team, the group generated a list of potential variables affecting perceptions of the quality of reports and datasets.
We designed a small exercise to build consensus on the relative importance of each of these factors to quality. With team consensus achieved, we repeated the exercise with a small group of users to identify where perceptions aligned and where they differed.
This led to a prototyping exercise in which the BI/MI Service Management team worked in pairs to generate potential features and tools, which were then tested with users in moderated sessions. Our goal here was to help the team better understand how they might address the user problems already validated through the work to date.
Through this, we gathered insight into which elements of the team’s prototype design ideas were most comprehensible and valued by target users, without progressing beyond pencil-and-paper sketch fidelity. The dialogue generated between the team and target users through this process provided further insight into how staff use BI tools, what they value and what frustrates them. By observing the moderated sessions, the BI/MI Service Management team were able to see first-hand where potential feature value lay, and which of their ideas were unlikely to be worth pursuing.
Outcomes and benefits
The user research, design facilitation and review activities undertaken in collaboration with the User Experience Service have provided:
- additional richness of user understanding in the areas the BI/MI Service Management team identified as highest priority
- new models around which the BI/MI Service Management team can collaborate and communicate when working towards delivering more useful and usable tools and services to the University
- easily replicable techniques for sense-checking the team's ideas with users at an early stage, before significant cost and commitment
UX techniques used
Design thinking workshops
The UX Service provided us with a solid, structured way of moving forward and identified blind-spots in what we knew about our user-base. This process has brought unexpected insights and a deeper understanding of our users.
We were in a situation where we believed we had a solution for the issue of users’ trust in data. The UX Service helped us to appraise our anticipated approach, avoid wasting effort on developments that didn’t align with users’ needs and move forward once we better understood the real problem.