Government Cheese, Jack White, and Data Quality
September 28, 2020

By Scott Keith - Senior Data Architect

Government Cheese 

For those of you not familiar with government cheese, I will explain. In the early 1980s, the U.S. was in a deep recession and times were tough for a lot of people. The government felt that the appropriate response to the situation was to distribute a tremendous amount of free cheese every month – this was known as government cheese. I can’t tell you what the thought process was behind that decision, but that is what happened. The cheese was not the best cheese in the world. Nevertheless, families made the best of it. If you added enough ingredients to it, you could mask the stale, dry, bland taste. For example, when making macaroni and cheese you would add salt, pepper, butter, milk, and more butter. In the end, and with a little creativity, the macaroni and cheese was pretty good. 

Jack White 

In 2008, an amazing documentary called "It Might Get Loud" was released. The film highlighted the careers and differing styles of famous guitarists Jimmy Page (Led Zeppelin), The Edge (U2), and Jack White (The White Stripes). If you love music and have not watched this film, it is a must-see. One segment of the film that stuck with me was about Jack White growing up. He could not afford an electric guitar, so he simply constructed his own from whatever was available to him. You can imagine how crude these guitars were. Even so, he pointed out that no matter how bad the instrument was, you could make it sound good if you were a good musician. If you were a great musician, you could make it sound great. Passion combined with talent can overcome the limitations of a bad instrument. 

Data Quality 

So, how do these stories relate to the data quality conundrum facing healthcare today?  

The previous examples demonstrate how people can turn something of low quality into something valuable with a little ingenuity and skill. I believe it is human nature: when someone is given something of low quality, they either throw it aside or make the best of it. From that perspective, it makes sense that someone given "bad data" to work with will either discard it or attempt to make it usable. Manipulating bad data to smooth out gaps, excluding incomplete records, masking unmatched data, and so on can hide the poor quality of the source data. With modern visualization tools, every presentation can make the data look great. It is human nature – we make the best of what we have.  

This is good news for data, right? Wrong! Bad data is bad data. You can mask data. You can make it pretty in a visualization. You can obscure it through statistics. But in the end, bad data is bad data, so we need to ask ourselves: Are we telling the right – accurate and complete – story?  

In the case of government cheese or Jack White, the response to the problem was isolated. What people did to their cheese in the '80s was their business. What Jack White did to make music was his business. Data, however, is an important organizational asset that should be managed and protected like anything else of value. The real danger of bad data is that, precisely because data professionals are good at their jobs, decision makers end up making decisions based on poor or incorrect input without ever realizing it. No one wins in this scenario. The data professionals have wasted their time smoothing out the answer, and management has drawn the wrong conclusions. At its worst, nobody is even aware that there are data problems in the organization. 

It is too late for government cheese, and Jack White is probably doing alright now with his guitars, so let us focus on assessing bad data. Data quality does not have to be a major project consuming a lot of time and money. Some basic inquiries can give you a good idea of the state of your data assets. The following are steps to take to get started: 

  1. Assess your current state: Cross-referencing master data from various source systems can highlight how out of sync or disparate your systems are. Pulling a list of account numbers for customers, suppliers, products, locations, etc., from each of your systems and comparing them will highlight discrepancies both between systems and within them. Often, you will find that different parts of your organization have created different accounts for the same locations or suppliers. How many account numbers are being used for your top suppliers? How are these account numbers related to the master supplier record? Who is responsible for account consolidation? 
  2. Validate, validate, validate: If the three most important things in real estate are "location, location, location," then the three most important things in data are validation, validation, validation. Running similar reports out of different systems can highlight discrepancies in financial and operational metrics. Who generates the reports? What is the source of the data? What business rules were applied to produce the results? You might find that different departments use different rules to produce the same reports. How do department-level reports compare to each other and contribute to overall corporate results? 
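As a rough sketch of what step 1 might look like in practice, the cross-reference can often be done with simple set operations over account exports. The supplier names, account numbers, and system labels below are all invented for illustration:

```python
# Hypothetical sketch: cross-referencing supplier account numbers
# exported from two source systems. All names and numbers are made up.

# Accounts keyed by a normalized supplier name, as they might be
# exported from an ERP and a procurement system.
erp_accounts = {
    "ACME MEDICAL": {"A-1001", "A-1042"},   # two accounts for one supplier
    "BEST SUPPLY": {"A-2001"},
    "CARE PRODUCTS": {"A-3001"},
}
procurement_accounts = {
    "ACME MEDICAL": {"P-9001"},
    "BEST SUPPLY": {"P-9002"},
    "DELTA DEVICES": {"P-9003"},            # missing from the ERP
}

# Suppliers present in one system but not the other.
only_in_erp = erp_accounts.keys() - procurement_accounts.keys()
only_in_proc = procurement_accounts.keys() - erp_accounts.keys()

# Suppliers with more than one account in a single system --
# candidates for account consolidation.
duplicates = {name for name, accts in erp_accounts.items() if len(accts) > 1}

print("Only in ERP:", sorted(only_in_erp))
print("Only in procurement:", sorted(only_in_proc))
print("Needs consolidation:", sorted(duplicates))
```

Even a toy comparison like this surfaces the questions in the list: suppliers missing from one system, and suppliers carrying multiple account numbers that someone needs to own and consolidate. In a real assessment, the hard part is normalizing supplier names and keys before the comparison, not the comparison itself.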


Making the most out of data is not just an innate drive to make the best of what we are given; too often it is driven by necessity. When a department or facility creates an Access database or an elaborate Excel repository to produce reports, it is most likely because of poor data quality or a lack of data availability in source systems. When these offline reporting systems are discovered, it is important to ask why they were created. The answers will help you identify the gaps across your systems.  

Centralizing data management and governance at a corporate level frees up resources spread across departments and facilities. It will streamline operations while providing consistency to reporting. These are the value drivers for Master Data Management (MDM) and Data Governance. If your organization does not have an MDM program, the organization is likely relying on a decentralized, patchwork approach to business intelligence. An assessment will show if decentralization is working for you. My guess is that it is not working that well. Do not let your teams Jack White your data or smooth it over with government cheese. It is not fair to the data analyst or to the organization. 

About Prodigo Solutions 

Prodigo Solutions is a healthcare technology company that improves providers’ financial control and reduces supply chain cost. Prodigo Solutions’ technology was purpose-built for healthcare by supply chain experts to deliver tangible results across a continuum of care. Customers who use our systems purchase more than $23 billion annually for the more than 700 hospitals they operate. 

For additional information please contact:

Prodigo Solutions’ Marketing Department

Scott Keith
Senior Data Architect