Renaissance’s Star computer-adaptive assessments generate nationally norm-referenced scores that help screen students, track progress, measure growth, and inform a variety of instructional decisions. Although the Star reports and dashboards provide a wealth of information, one need of high interest to educators remains unmet: the ability to compare achievement and growth to demographically similar schools.
A team of four others and I felt that the scope of the problem and the tone of the client warranted a holistic approach. It quickly became clear that research insights would be more valuable than any specific deliverable, so we prioritized gathering information and delivering insights.
Our generative research, conducted from January to May 2019, consisted of a literature review, competitive analysis, data mining, and structured interviews with data coordinators around the country. We aimed to identify data coordinators’ pain points, understand their data analysis process, and validate their need for demographic comparison.
We transcribed our interviews and rotated interpretation among group members to reduce bias. We then used affinity diagramming to consolidate our interpretations and see how opinions clustered. The affinity diagram generated many insights, which we consolidated into seven main findings. These findings informed the design of our co-design sessions and surveys, as well as our model building.
We arrived at the following insights:
Data Coordinators would like to know how their school is doing compared to schools with similar demographic characteristics
Data Coordinators would like to see whether students are on track, both across sub-groups and over time
Data Coordinators have difficulty communicating data with teachers and administrators because there is little shared understanding of data interpretation
Data Coordinators don’t have a smooth way of creating reports
Data Coordinators want to use data to better inform educational decision-making
Most of these were expected and confirmed by our client, who had heard similar issues. However, one additional insight stood out:
Data Coordinators are open to, but lack convenient ways to find and connect with similar schools
Our client did not expect Data Coordinators to want to connect with each other to collaborate on similar problems.
We conducted two co-design sessions with data coordinators to understand:
1. What information they would need given a certain scenario of analyzing or presenting data.
2. Their preferred way of visualizing the information in the given scenario.
3. How they would use this information and communicate the data with others.
We used Figma, InVision, and at times simply Google Slides to prototype, develop, test, and refine our design. Below is the sequence of iterations.
I have always liked this graphic, which shows how many iterations we went through on the central premise of our design. Showing "Similarity," "Performance," "Growth," and other details on the same card for each school was critical to our work.
Our final product (pictured below) was an interactive prototype built with mock data. The designs, code, and all research were turned over to the client, along with a report on our final validation testing.
Our validation testing showed that the product was easy to use, that it met the identified needs, and that Data Coordinators were willing to use the connect feature we had discovered.