

Making it easier to interpret analytics reports
Background: Gwella stakeholders have access to data reports that monitor the performance of the Gwella-hosted webpages they manage, but these reports go largely unused.
Goal: Identify what's stopping stakeholders from engaging with the dashboards, and redesign them so that users with low digital literacy can confidently read and act on their data.
Solution: A redesign focused on simplified visuals, plain-language explanations, benchmarking, and prioritised key metrics, helping stakeholders quickly understand webpage performance and make more informed, confident decisions.
Product
Role
Tools
A note on process: This project ran under tight time constraints, which meant a leaner approach than my other case studies. I've included it because the design challenge (making data interpretable for people with low data literacy) is one I found genuinely interesting, and the constraints shaped some decisions worth reflecting on.
Define: The gap between data and understanding
Conversations with stakeholders revealed that the Looker Studio dashboards were rarely used: stakeholders found them difficult to interpret and struggled to use the data to inform their decision-making.
To better understand how stakeholders experienced the existing dashboards, I gathered both informal and structured feedback.
Example of original report for ‘Supervision Hub’
“The report feels like data rather than usable information.”
Stakeholder feedback
Key Insights
Prototype: Turning data into clear, actionable insights
I explored new dashboard layouts with a strong focus on clarity and ease of interpretation, treating low digital and data literacy as a core design consideration.
This involved:
Example of final report for ‘Supervision Hub’
Test: Validating the dashboards with real users
To validate the redesigned dashboards, I gathered informal feedback from stakeholders by sharing the updated reports and discussing their experience using them.
Both stakeholders consulted reported that the dashboards were easier to understand and navigate, and said they felt more confident interpreting the data. No major pain points were identified. Minor discrepancies in the data were flagged during testing and quickly resolved.
While lightweight, this validation confirmed that the changes improved clarity and usability, and that stakeholders could engage with the data more confidently.
Report as viewed by stakeholder of ‘Supervision Hub’
Outcome
Lessons learned
What worked
What didn’t go as expected
What I learned
What I’d do differently next time
Previous case study

