By Michelle Covey, Vice President of Advisory Services, GS1 US
In the 41 years since the first scan of a barcode at a small grocery store in Ohio, consumer paths to purchase have changed dramatically. We’ve moved from simply buying what was available at the store in that moment to today’s constant consultation and validation of product purchases with our smartphones. All of this is possible because of the abundance of product data.
But data is moving so fast — 90 percent of global data was created in the past two years, according to IBM — that its quality and reliability have suffered. This has become a challenge for many different industries, including the food industry, as it seeks to leverage product data in innovative strategies to improve on-shelf availability, satisfy consumers, and reduce inefficiencies.
To keep pace with today’s demands, food manufacturers are prioritizing data quality improvement as the need to seamlessly share product information with trading partners grows and as consumer expectations evolve. Before data-driven insight can live up to its fullest potential and create the optimal customer experience, supply chain stakeholders need to strengthen foundational supply chain data and the processes that govern it.
The best way to approach data quality improvement is to understand that it is a journey, not a race. Data quality programs are continual, and organizations should strive for fluid business processes that can keep up with the pace at which new data is constantly created and shared. Data quality is the outcome of good data governance, knowledge, and training that support sound product data management, so the integrity of product data is maintained through every step of the supply chain.