Data Management: Garbage In, Garbage Out
Donna Unitt
February 4, 2018
Data management is a prerequisite for the success of an IT implementation, but it also needs to be extended beyond the project and embedded into daily business life.
In a previous blog post we looked at the importance of the human element in ensuring an IT implementation is successful. ‘Power to the People’ explains why it’s critical to get the team on board so that the organisation benefits from its new technology.
People are an essential ingredient, even in our tech-focused world. But other reasons why projects fail also need to be identified and addressed.
Poor data management comes high on that list. As with change management, where the onus to deliver a system implementation usually falls on the IT department, data is more often than not seen as IT’s responsibility alone. And that is part of the problem: today’s enterprises are powered by data, making the business and IT its joint custodians.
The adage ‘Garbage in, garbage out’ is as true as it ever was. However new and shiny an IT system, using bad data means the organisation will not reap the rewards of its technology investment.
But cleansing data is a daunting task (if it weren’t, many more organisations would undertake it as a matter of course). Databases develop and proliferate over a long period of time, and it’s easy to assume that repairing the situation will take just as long. As a result, many enterprises live with their bad data and accept that reports run against it will need to be manually corrected to be accurate.
Pivotal to good data management is identifying the master data source.
Due to the number of systems that are potentially in use at an organisation, and the interaction between these systems and additional external ones, master data can be held in various, disparate locations. This creates silos that hinder transparency and performance; a ‘lead’ system needs to be agreed to drive informed and effective business decisions.
Making any change to a business requires the starting point and the desired end position to be stated if the objectives of a project are to be met. (‘Where are we now and where do we want to be?’) For example, agreement on the current warehouse productivity level is needed before embarking on a project to increase it.
Even this starting point can be a topic of great debate, as different departments use different data from different systems to drive their own key performance indicators. The quality of the master data can also be an issue, often due to the growth in the volume stored. For example, product master data may contain details of items that have not been available for several years; failing to archive this data compromises the information that is current and the system using it.
Businesses know that data is their lifeblood, but this belief can lead to ‘hoarding’ every piece of information acquired without considering its business value. For example, it may not be necessary to keep every detail of every transaction undertaken; order lines can be summarised into the items ordered on a specific date.
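As a simple illustration of that kind of summarisation, the sketch below rolls detailed order lines up into one record per item per day. The field names and sample data are hypothetical, not drawn from any particular system.

```python
from collections import defaultdict
from datetime import date

# Hypothetical order lines: (order_id, item_code, order_date, quantity).
order_lines = [
    ("SO-1001", "WIDGET-A", date(2018, 1, 15), 6),
    ("SO-1002", "WIDGET-A", date(2018, 1, 15), 12),
    ("SO-1003", "WIDGET-B", date(2018, 1, 16), 4),
]

# Roll the detailed transactions up into one summary record per item per day,
# discarding order-level detail that carries no further business value.
summary = defaultdict(int)
for _, item, ordered_on, qty in order_lines:
    summary[(item, ordered_on)] += qty

for (item, ordered_on), qty in sorted(summary.items()):
    print(f"{ordered_on} {item}: {qty} units ordered")
```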
Storing data with inconsistent elements such as units of measure can also lead to issues and inefficiencies. For example, the same product might be packed in cases of six and twelve, which can cause discrepancies when calculating how many fit on a pallet, as well as during picking when it may not be clear whether ‘1’ refers to one item, one case of six or one case of twelve. Besides causing confusion in the warehouse, this has knock-on effects for the back office when staff interpret stock and accuracy reports.
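One common remedy is to hold and report quantities in a single base unit (single items, or ‘eaches’) and to convert pack quantities explicitly. The sketch below assumes hypothetical pack codes for one product; real unit-of-measure handling in an ERP or warehouse system is more involved.

```python
# Hypothetical pack definitions for the same product: items per handling unit.
PACK_SIZES = {
    "EACH": 1,     # a single item
    "CASE6": 6,    # case of six
    "CASE12": 12,  # case of twelve
}

def to_base_units(quantity: int, unit_of_measure: str) -> int:
    """Convert a quantity in any pack size into single items (the base unit)."""
    return quantity * PACK_SIZES[unit_of_measure]

# '1' is no longer ambiguous once the unit of measure travels with the quantity.
picked = [(1, "CASE12"), (1, "CASE6"), (3, "EACH")]
total_items = sum(to_base_units(qty, uom) for qty, uom in picked)
print(f"Items picked for the pallet: {total_items}")  # 12 + 6 + 3 = 21
```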
Similarly, different system formats can mean that address details are not compatible across databases, while multiple entries for the same item inflate numbers and introduce inaccuracies.
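A first pass at spotting such duplicates might normalise the fields before comparing them. The matching rules and sample records below are illustrative only; real master data matching and address validation are considerably more sophisticated.

```python
import re

def normalise(value: str) -> str:
    """Lower-case, trim and collapse whitespace so formatting alone never differs."""
    return re.sub(r"\s+", " ", value.strip().lower())

customers = [
    {"id": 1, "name": "Acme Ltd", "postcode": "AB1 2CD"},
    {"id": 2, "name": "ACME LTD", "postcode": "ab1  2cd"},
]

# Build a simple match key from normalised fields; records sharing a key
# are candidate duplicates to be reviewed and merged, not deleted blindly.
seen = {}
for record in customers:
    key = (normalise(record["name"]), normalise(record["postcode"]).replace(" ", ""))
    if key in seen:
        print(f"Record {record['id']} looks like a duplicate of record {seen[key]}")
    else:
        seen[key] = record["id"]
```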
An IT implementation project is often an opportunity to review the organisation’s data management to ensure that the new system delivers its promised benefits. How to migrate master data must also be factored into the schedule, with decisions made on whether all data needs to be moved across, what needs to be kept from a legal perspective, what needs to be archived and how this will be accessed in the future. New data from other sources may be needed, particularly if an enterprise has altered its operations or introduced a new offering.
Data management is a topic that is often not given the consideration it warrants and, even when it is, can be overlooked once a project is completed. It is also a complex process, requiring proactive decision-making and permanent, dedicated resources if an organisation is to maintain clean databases that empower its business processes.
Rocket Consulting has extensive experience of delivering supply chain projects in which data management was critical to success. To speak to Rocket for more insight, please contact us.
• Data management is not just the responsibility of the IT department.
• To realise the benefits of new technology implementations, projects also require budget for data cleansing.
• Data management needs to extend far beyond the life of a project; it should be embedded into an organisation’s day-to-day activities.