Testing the Design of a Quality Report By Getting User Feedback
User testing with people who represent your intended audience is the best way to ensure that your report design is clear and effective.
Use Interviews To Collect Feedback From Users
For most purposes, you will learn much more from users if you test the material through individual interviews rather than focus groups. Interviews give you the privacy and flexibility you need to observe people's behavior, ask them to share their thoughts aloud, and have them use the material to perform tasks (such as identifying the best and worst performers).
Moreover, user testing can be done efficiently and cost-effectively. You can get a great deal of valuable feedback from a fairly small number of interviews. To learn more about the advantages of interviews as a method for user testing, see Writing Tip 7, Test the Report With Your Audience.
Start User Testing Early
To help guide the development of your design, request feedback early in the process, while there is still time to make adjustments. What you learn can save time and money in the long run.
- Show people a draft version of parts of the report. If you are doing a web report, you can show a paper prototype.
- Show different choices for the overall look of the report, including color schemes and visuals, and ask people what they prefer.
- Show a sample comparison chart with fictional names and information. Check on how hard or easy it is for people to understand the chart and make appropriate comparisons.
Do User Testing To Verify the Effectiveness of Your Report
When your report is more fully developed, you can do more user testing to find out how well the report is working. At this stage, you might want to focus on collecting feedback from users that will help you assess ease of comprehension, navigation, and usability.
It's especially critical to verify that users are interpreting your data displays correctly. Because you are familiar and comfortable with comparison charts and bar graphs, it can be hard to keep in mind that many people are not.
Moreover, research shows that even people with strong literacy skills can misinterpret tables and charts. For example, one study showed that college-educated people with strong quantitative and analytical experience made numerous mistakes when interpreting bar graphs and line graphs.1
The most effective way to check on accuracy of interpretation is to show users the comparison chart or bar graph and get their reactions. How you ask for their feedback is important.
- Don't ask users directly whether the chart is easy to understand; they may not realize when they are misinterpreting it.
- Instead, to get the most meaningful feedback, ask your readers to explain in their own words what the chart or bar graph is about.
- Give them tasks that will demonstrate whether they can understand and apply the information it presents, such as picking out the best and worst performers. Then ask them why they chose those particular ones; sometimes people get the "right" answer for the wrong reasons.
Learn More About Methods for Testing Written Materials
- Toolkit for Making Written Material Clear and Effective, Part 6, How to collect and use feedback from readers. Written by Jeanne McGee for the Centers for Medicare & Medicaid Services. This guide offers step-by-step guidance on how to design and conduct interviews to get feedback from readers. It is written in a nontechnical style for people without a research background.
- A Practical Guide to Usability Testing, Revised edition, by Joseph S. Dumas and Janice C. Redish, Portland, OR: Intellect Books, 1999.
- www.usability.gov. This website is the primary government resource on usability and accessibility, including the latest usability research and training opportunities.
- Blessing C, Bradsher-Fredrick H, Miller H, et al. Cognitive testing of statistical graphs: methodology and results. Presented at the 2003 Federal Committee on Statistical Methodology Conference. 2003. Energy Information Administration, Department of Energy, 1000 Independence Ave., S.W., Washington, D.C., 20085. Download report at https://nces.ed.gov/FCSM/pdf/2003FCSM_BlessingBradsher.pdf (PDF, 295 KB).