Learning from users through discovery research
Through in-depth research into how public service analysts gather and use data, the DTA has learned how to better support them in their work.
The Digital Transformation Agency’s (DTA) Observatory team uses data science to learn how people interact with government and to manage several whole-of-government data services.
Part of the team’s mission is to empower data practitioners to improve government services. We wanted to move past assumptions and make our own data-informed decisions about where we could provide more value for users. So we asked the users what they need.
We began this mission by providing Google Analytics 360 subscriptions, training sessions and analytics.service.gov.au; however, we wanted to better understand the realities of what being an Australian Public Service analyst is like. We wanted to understand how our users’ worlds work, what their professional pain points are, and whether there are opportunities to enhance their practice so they can deliver more value to their organisations.
Research started with analysts who subscribe to the Observatory and use our services daily. We spoke with 18 users across 13 agencies during our first phase of discovery interviews. They told us about their experiences interacting with our service and using analytics in their organisations.
Developing our line of inquiry and conducting interviews
Our main aim is to empower data analysts across government, so we designed our line of inquiry to match. We explored:
- how our users engage with analytics in their day-to-day work
- what kinds of questions our users are trying to answer through analytics
- what barriers and opportunities our users see in reaching their desired goals.
We designed our interviews to be semi-structured — we followed a line of inquiry but allowed participants to guide the conversation with the stories they wanted to share. This approach allowed us to organically explore our research objectives without leading conversations too strongly. This helped uncover new and unexpected insights.
Synthesising and analysing our findings
Two members of the Observatory team attended each interview with the analyst. We then synthesised what we heard using a collaborative online whiteboard. We captured key observations, quotes and ideas. The board became the space where we could group and theme similar ideas that came out of all the interviews.
In user research we say, “It only takes one person to notice the missing step on the staircase”. Even if only one subscriber gave an insight, we treated it as having analytical importance to make sure we heard our users.
Once we had themed our interview findings, we analysed themes together to see what they told us about the user experience. This is how we uncovered insights.
What we learned
We learned that we have primary and secondary users. We had assumed the research cohort was obvious: data analysts are our primary users. From our conversations we also learned there are relevant stakeholders on the fringe of data analysis — those whom analysts report to, or those needing data to explore or validate assumptions and hypotheses.
Through our research with primary users, we learned that these secondary users are also important to the Observatory’s mission as they represent the end users of analytics work. In future, we aim to further research the stakeholder experience to improve the focus of our work.
Types of barriers our users face
Barriers for users include:
- data literacy
- user perspective.
These barriers occur at different rates and in different forms across agencies and represent a challenge to our mission. By exploring them, we will better understand how to navigate the challenges faced by our users as they try to improve agencies’ services.
Pain points and opportunities
It can be difficult for our users to establish an ‘analytics service’ within their organisation. This is due to the ad hoc nature of how they receive and answer requests. A lack of awareness among stakeholders about the power of data leads to infrequent and less-strategic use of data.
We also found that users would like to have a greater sense of purpose behind the use of Google Analytics and know how to define and measure success. Some also face barriers to adopting user-centred design principles, which affects their ability to work with Google Analytics.
Responding to user needs
The Observatory can respond to user needs in three ways:
- analytics leadership through best practice guidelines, standards and advice
- the creation of practical tools and templates that support users and align with best practice
- facilitation of sharing and learning relationships across government.
A deeper exploration of these insights can be found in our Discovery Insights Report.
We are beginning to refine, design and test possible solutions, adopting a dual-track, agile approach to working. This means we will run a parallel discovery process to explore hypotheses and test prototypes, which will also improve our ability to deliver viable solutions.
If you would like to help with future research, we’d love to hear from you! Contact the Observatory team at email@example.com.
Do you want to help your agency enhance its data practice? Sign up for updates from gov.au Observatory to learn about our work, get access to insights we uncover, and hear early announcements about the launch of new tools and initiatives.