Yet despite a string of complex and expensive technology implementations, most C-level and supply chain executives admit that they still have little idea of what is happening across their extended supply chains until long after events have taken place. It is nearly impossible for their organizations to sense an issue and modify or optimize their response in a timely manner.
As a result, today’s executives are frustrated—they know their companies are sitting on extremely valuable information assets, yet they are unable to leverage them for the benefit of the organization. The truth is that most tools simply cannot scale to meet the new demands of big data.
Just how much data are we talking about? Big data may be a buzzword, but there is truth in the label. To give a sense of scale: 20 petabytes is the total amount of hard disk space manufactured in 1995; it is also the amount of data Google processes every day; and today it could easily represent the size of a single manufacturer’s ERP databases.
Making that data visible is the first step to making it useful. But current supply chain visibility methods are quite crude. Essentially, companies are executing transactions, storing the results in a data warehouse, pushing the data to portals and/or business intelligence tools, running analytics on what has happened, and just trying to do better next time.
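To make that pattern concrete, here is a minimal Python sketch of the batch cycle just described. Every name in it (load_transactions, kpi_report, the sample rows) is hypothetical and invented purely for illustration; the point is that the analytics arrive only after the events they describe.

```python
# Minimal sketch of the batch-oriented visibility cycle described above.
# All names and data are hypothetical, chosen only to illustrate the
# after-the-fact nature of the approach.
from datetime import date, timedelta

def load_transactions(day: date) -> list[dict]:
    """Pull the day's completed transactions from operational systems."""
    return [
        {"sku": "A-100", "qty_ordered": 500, "qty_shipped": 420, "day": day},
        {"sku": "B-200", "qty_ordered": 300, "qty_shipped": 300, "day": day},
    ]

def kpi_report(rows: list[dict]) -> dict[str, float]:
    """Compute fill rate per SKU: analytics on what has already happened."""
    return {r["sku"]: r["qty_shipped"] / r["qty_ordered"] for r in rows}

# The nightly cycle: by the time the report exists, the event is long past.
yesterday = date.today() - timedelta(days=1)
print(kpi_report(load_transactions(yesterday)))
# e.g. {'A-100': 0.84, 'B-200': 1.0}
```

However well tuned, a cycle like this can only explain yesterday; it cannot change what happens today.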
Given these limitations, it is no wonder that Gartner Research estimates that fewer than 20 percent of companies will be able to provide end-to-end supply chain visibility by 2016. The truth is that most companies are essentially flying blind. And consider the scale of the problem: if a single manufacturer can house 20 petabytes of data, how much more must the extended supply chain as a whole contain?
Forward-thinking companies understand that a reliance on analytics is the only scalable approach to analyzing and gaining insight from the deluge of big data. Much like a chess grandmaster, they must become expert at recognizing different patterns within their supply chain. A grandmaster wins matches with a repertoire of actionable strategies, adjusted in real time to the moves an opponent selects. In similar fashion, companies must establish a set of protocol strategies that can be deployed in real time as the pieces on the board shift daily and weekly with supply conditions, consumer decisions, available tradeoffs, or some combination of these factors.
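As a rough illustration of what such protocol strategies might look like in software, consider the toy Python sketch below. The condition fields, thresholds, and strategy names are all assumptions made for the example, not a description of any particular product; the point is that the responses are pre-established and selected as conditions change, rather than improvised after the fact.

```python
# Toy illustration of "protocol strategies" selected as conditions change.
# Condition fields, thresholds, and strategies are invented for the example.
from dataclasses import dataclass
from typing import Callable

@dataclass
class Conditions:
    supplier_delay_days: int
    demand_spike_pct: float

def expedite_alternate_supplier(c: Conditions) -> str:
    return "expedite from alternate supplier"

def reallocate_inventory(c: Conditions) -> str:
    return "reallocate inventory to high-demand regions"

def hold_course(c: Conditions) -> str:
    return "no action needed"

# An ordered playbook of (pattern predicate, strategy) pairs, like the
# openings a grandmaster recognizes and already knows how to answer.
PLAYBOOK: list[tuple[Callable[[Conditions], bool], Callable[[Conditions], str]]] = [
    (lambda c: c.supplier_delay_days > 3, expedite_alternate_supplier),
    (lambda c: c.demand_spike_pct > 0.25, reallocate_inventory),
]

def choose_strategy(c: Conditions) -> str:
    for matches, strategy in PLAYBOOK:
        if matches(c):
            return strategy(c)
    return hold_course(c)

print(choose_strategy(Conditions(supplier_delay_days=5, demand_spike_pct=0.1)))
# -> "expedite from alternate supplier"
```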
But analytics by themselves are only part of the story—yes, they can create intelligence by helping you understand gaps in performance, but you still need to act on that intelligence as you move your pieces on the chessboard. This is where process automation, automated exception handling, and collaborative exception management enter the picture. The only way to deal successfully with the huge volume of exceptions a big data architecture produces is to automate your responses to supply chain conditions. Predictive analytics are a dead end without actionable prescriptions and feedback loops that learn in real time and automatically make better decisions in the future.
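Here is a hedged sketch of that feedback loop, again in Python, with invented exception types, actions, and a deliberately simple learning rule: the handler tries pre-defined responses, records whether each one worked, and favors the responses that have succeeded before.

```python
# Sketch of automated exception handling with a learning feedback loop.
# Exception types, actions, and the scoring rule are simplified placeholders;
# a production system would use far richer models.
from collections import defaultdict

class ExceptionHandler:
    def __init__(self):
        # Running outcome counts per (exception_type, action) pair.
        self.outcomes = defaultdict(lambda: {"tried": 0, "succeeded": 0})

    def best_action(self, exc_type: str, candidates: list[str]) -> str:
        """Pick the action with the best observed success rate so far."""
        def score(action: str) -> float:
            o = self.outcomes[(exc_type, action)]
            return o["succeeded"] / o["tried"] if o["tried"] else 0.5  # prior
        return max(candidates, key=score)

    def record(self, exc_type: str, action: str, success: bool) -> None:
        """Feed the outcome back so future decisions improve automatically."""
        o = self.outcomes[(exc_type, action)]
        o["tried"] += 1
        o["succeeded"] += int(success)

handler = ExceptionHandler()
handler.record("late_shipment", "expedite", True)
handler.record("late_shipment", "reroute", False)
print(handler.best_action("late_shipment", ["expedite", "reroute"]))
# -> "expedite", because recorded outcomes favor it
```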
Achieving this will not be easy. It requires a wide range of data from across the internal supply chain, the trading-partner network, and macroeconomic conditions. It also requires a technology architecture that can support both advanced analytics and automated decision-making.
So what are today’s executives to do? A complete answer would require more space than this column allows, but I will say that any new supply chain technology tools they invest in should, at a minimum, be able to represent big data, make that data accessible, glean intelligent insights, and automatically adjust to changing conditions.
The bad news is that most supply chain technology tools are nowhere close to delivering these kinds of capabilities. They struggle with capturing, housing, and analyzing data, much less recognizing demand and supply patterns. In fact, Gartner predicts that many analytics-based supply chain decision-support tools will become obsolete because they cannot handle big data, conduct analysis within the decision’s required time cycle, or automate decision-making.
To go back to my chess analogy, expecting to transition to tomorrow’s supply chain with today’s technology tools (built on yesterday’s platforms) is like trying to play chess on a backgammon board—it makes no sense. This is a fact that market and industry analysts are only beginning to understand.
Source: One Network Enterprises