Weekly Review: Data Visualization with Jeremy Harper – December 28, 2020
When making a sword, quenching hardens the steel, while tempering removes a controlled amount of that hardness to make the blade more flexible and resilient. COVID-19 has been the tempering process for data visualization and analytics. We have spent decades building capabilities for data visualization and analytics, and frequently those solutions were not flexible. That effort helped us be ready to take the next step in our evolution as a nation. Over these last months we began rolling out new tools that make it easier than ever before for you, an individual, to explore and visualize what is happening in the nation around you. You have been able to see these visualization tools in action.

Analytics is the engine that drives visualization, and it has been an urgent need during this pandemic. You’ve heard the term “Big Data” before – but what exactly is big data, and why should we care? To get to big data, you first have to start with small data. The early months of the pandemic had us dealing with small data. We saw early study after early study contradicted by follow-up studies: data signals we thought were strong turned out to be non-existent because we were relying on data from only 50 to 100 patients.

Now that we are adding more than 200,000 cases each day, we are no longer dealing with small data; it has become big data. That means more challenges in preparing the information so we can analyze it. With 100 patients we can go through each chart and pull out individual data elements; with 10,000 we have to rely on automated processes, which will miss information.

We are proud of this progress, but there is always room for improvement. Many of the tools built since February are single-use visualizations. While we as a nation can now see the impact of our actions at the local and national level, that doesn’t mean we are seeing a consistent story, let alone translating it into a consistent reaction. One of the key scientific principles we strive for is not to dictate action through what and how we visualize the data, but instead to focus on exploring the underlying truth of the situation.

That is not always the principle found in other circles. Sometimes it means the data are messy, sometimes it means we give conflicting advice, and sometimes it just means that you won’t be able to identify an answer at a glance but will instead need to think more deeply about the topic. Analytics gives us the ability to predict and quantify the situation, and what the analytics tell us is what we use to drive recommendations for action.

Humans have a propensity to project forward and believe they can analyze the future. The easiest way to convince yourself that we are bad at precognition is to look at the data on stock picking and the extensive body of evidence that understanding crowd behavior is not as simple as we believe. We delude ourselves into believing we knew where something was headed; when we test and measure people’s memory of their predictions, their recollections migrate toward their successes rather than the times they were dramatically wrong.

Data visualization today has amazing toolsets that allow us all to do deep dives into how many people are living in our neighborhoods and how many are unemployed or underemployed in our area. We can see where crime is going up and down, and, most importantly to the discussion today, we can see the rates at which COVID-19 is spreading. We are exposed to information we didn’t even know we needed 12 months ago. The fact that we are tracking statewide ICU bed availability as a matter of public discourse is phenomenal. The raw ability to look at this information in near-real time allows us all to make better decisions about our lives. Only in 2020 would we have been able to deliver data at this scale with this speed.

As we think about where we are going in 2021, it’s important to consider what we want to accomplish. We are going to need to track and trace vaccinations. How we visualize communities and areas that have been successfully vaccinated is a current challenge for our visualization work. The analytics we deliver will face skewing as portions of the population become inoculated. On the other hand, we also need to start tracking all individuals who are given vaccinations to ensure we don’t have issues that are not immediately evident. We all remember the story of Vioxx, which was pulled from the market because of the thousands of fatal heart attacks associated with the drug.

All this leads to the question of which technologies will help us achieve our goals. We apply machine learning extensively in the analysis of big datasets, but we haven’t seen it come to the forefront with COVID-19 because our datasets were not large enough. Visualizing and modeling the output of machine learning is often challenging: machine learning and artificial intelligence give us answers, but it takes a team to dig into how the process generated the decisions and recommendations it puts forward. In 2021 we will likely see these tools start to be used to predict outcomes and help us allocate resources to impact the most lives.

2021 will be a proving ground for determining which data we will continue to visualize and which will go back behind closed doors. We haven’t seen a rush to put other disease states, opioid rates, or other cohorts into the public domain. That may indicate that we are all too busy, or that without the financial considerations of a pandemic these visualizations are too costly to maintain. We will have to debate how we deliver this information in the future and how to communicate it to the public in the most coherent and useful manner possible. We will leave 2021 with an expectation of how we will tackle future pandemics and what will be required of the system to address these issues. 2021 will be an exciting year that helps us all align.


Jeremy Harper, FAMIA, served as the Chief Research Information Officer for the Indiana CTSI and Regenstrief Institute from 2017 to 2020. He is a recognized expert in data visualization and electronic medical records. He is now an independent consultant; you can find out more at https://owlhealthworks.com and connect with him at https://www.linkedin.com/in/jeremyharper1/.


About the Author: WISE Indiana
WISE Indiana (Wellbeing Informed by Science and Evidence in Indiana) is a partnership between the Indiana Clinical and Translational Sciences Institute’s Monon Collaborative and the Indiana Family and Social Services Administration to engage Indiana’s nationally-recognized academic experts to evaluate and inform Indiana practices, programs and policies. This partnership aligns with and furthers the visions of both organizations by facilitating timely, high-quality evidence-informed research, evaluation and analysis to the benefit of all Hoosiers.
