Visualizing COVID-19 Survey Data

In the age of misinformation and a constant flood of news (and let’s face it, it’s mostly been bad), it is more important than ever that we get better at critical thinking. Data literacy is a skill everyone needs as our world continues to be powered by data. Many of us no longer feel we can blindly trust the news, with skepticism building around agenda-pushing and ratings-driven reporting.

At Acorn Analytics, we’re passionate about data education, and we want everyone to be as skeptical and thoughtful about interpreting data as our data science team. But how keen an eye does the average person have? To take a stab at that question, we did what any good scientist would do: we started collecting data.

A few weeks back, we sent out a survey asking people to interpret various graphs showing information about COVID-19 cases in California during the first two weeks of April. The data came from California’s Health & Human Services Agency. We asked some basic questions that involved comparing various line and bar graphs, and here’s a bit of what we saw.

Here’s one of the survey questions. What do you think the answer is?

We had 117 participants total, from 20 different US states, plus Washington, D.C., and several international poll-takers. Most were from California and Colorado, which did not surprise us, given that Acorn has communities in each! Participants ranged from 17 to 77 years old, with a median age of 36.5.

On average, people answered 79% of the questions confidently, but only got 60% correct! This suggests to us that everyone could use some tips to sharpen their data skill set. That said, we’d like to acknowledge that these questions were harder than they looked!

Results

Here are our survey results:

  - We had respondents of all ages
  - Most people felt confident about their answers to all 4 questions
  - However, very few respondents actually got all 4 questions correct

Tips and Tricks

With these results in mind, here are some quick things to check when looking at and comparing graphs:

  1. Compare: Do the axes show the same range, or are you comparing trends over different scales? In general, when comparing graphs, it’s best to do so on the same scale. (Apples to apples, or apples to planets?)
  2. Check: Do the horizontal and vertical scales both start at zero? If not, you can’t actually read the value of any point that falls outside the visible range! (If you can’t see it, you don’t know if it’s there or not!) See the sketch after this list for how axis choices can change a chart’s story.
  3. Consider: Is the graph squashed or stretched in any way? Sometimes that’s a sign the graph’s designer was trying to enhance or hide a certain pattern. Just because it’s “data” doesn’t mean there can’t be bias! (Stay skeptical, friends!)
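To make tips 1 and 2 concrete, here’s a minimal sketch in Python (using matplotlib, with made-up case counts rather than the actual California figures) that plots the same two weeks of data twice: once with the y-axis starting at zero, and once clipped tightly around the data. The trend is identical in both panels, but the clipped version looks far more dramatic.

```python
# A minimal sketch with hypothetical numbers (not the actual CA COVID-19 data),
# showing how axis choices change how the same values read.
import matplotlib.pyplot as plt

# Hypothetical daily case counts for two weeks
days = list(range(1, 15))
cases = [1200, 1250, 1230, 1300, 1340, 1310, 1380,
         1400, 1390, 1450, 1470, 1440, 1500, 1520]

fig, (ax_zero, ax_zoom) = plt.subplots(1, 2, figsize=(10, 4), sharex=True)

# Left panel: y-axis starts at zero, so day-to-day changes look modest.
ax_zero.plot(days, cases)
ax_zero.set_ylim(bottom=0)
ax_zero.set_title("Y-axis starts at 0")

# Right panel: y-axis is clipped to the data range, so the same trend looks dramatic.
ax_zoom.plot(days, cases)
ax_zoom.set_ylim(1150, 1550)
ax_zoom.set_title("Y-axis clipped to the data")

for ax in (ax_zero, ax_zoom):
    ax.set_xlabel("Day")
    ax.set_ylabel("Cases")

plt.tight_layout()
plt.show()
```

The same idea applies when comparing two different graphs: put them on a shared scale first (matplotlib’s `sharey=True` option does this for side-by-side panels) before deciding which trend is steeper.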

If you’re curious to learn more about how to improve your analytics eye, feel free to look around our blog for more pointers. In the meantime, stay skeptical, ask questions, and embrace a lifetime of learning. We don’t get anywhere if we pretend to know it all when we’re unsure! 

Our team would love to see any questionable graphs you come across in the news or on other platforms. If you find any, please send them to us at info@acornanalytics.com.


A big thanks goes to Jai Bansal for providing editing and graphics support on this post!
