“Moneyball” and Your Business

 

It’s MLB playoff time, and my team (the Tribe) is there, again.  (Pregnant pause to enjoy the moment.)

A while back, the film “Moneyball” showed us how the Oakland A’s built a super-competitive sports franchise on analytics, essentially “competing on analytics” within the business constraints of a major league baseball franchise.  The “Moneyball” saga and other examples of premier organizations competing on analytics were featured in the January 2006 Harvard Business Review article, “Competing on Analytics” (reprint R0601H) by Thomas Davenport, who also authored the book by the same name.

The noted German doctor, pathologist, biologist, and politician Rudolf Virchow called the task of science “to stake out the limits of the knowable.”  We might paraphrase Virchow and say that the task of analytics is to enable you to stake out everything that you can possibly know from your data.

So, what do these thoughts by Davenport and Virchow have in common?

In your business, you strive to make the highest quality decisions today about how to run your business tomorrow, despite the uncertainty that tomorrow brings.  That means you have to know everything you possibly can know today.  In an effort to do this, many companies have invested, or are considering an investment, in supply chain intelligence or various analytics software packages.  Yet, many companies who have made huge investments know only a fraction of what they should know from their ERP and other systems.  Their executives seem anxious to explore “predictive” analytics or “AI” because it sounds good.  But investing in software tools without understanding what you need to do and how is akin to attempting surgery with a wide assortment of specialized tools, but without having gone to medical school.

Are you competing on analytics?

Are you making use of all of the data available to support better decisions in less time?

Can you instantly see what’s inhibiting your revenue, margin and working capital goals across the entire business, in context?

Do you leverage analytics in the “cloud” for computing at scale and information that is always on and always current?

I appreciate everyone who stops by for a quick read.  I hope you found this both helpful and thought-provoking.

As we enter this weekend, I leave you with one more thought that relates to “business intelligence” — this time, attributed to Socrates:

“The wisest man is he who knows his own ignorance.”

Do you know what you don’t know?  Do I?

Have a wonderful weekend!


A Digital Value Network Needs an Accelerated “3-D” Cycle


The strength of any chain is defined by its weakest link.  A supply chain, or as I prefer to say, a value network, is similarly constrained.  By orchestrating the flow of material, information and cash through your value network, you can prevent negative business impact from weak links by detecting anomalies, diagnosing their causes, and directing the next best action before there is a serious business impact.  Do you need some kind of self-aware artificial intelligence to make this work?  Let’s think about that for a minute.

There is a lot of buzz about the “autonomous” supply chain these days.  The subject came up at a conference I attended where the theme was the supply chain of 2030.  But, before we turn out the lights and lock the door to a fully automated, self-aware, supply chain “Skynet”, let’s take a moment and put this idea into some perspective.

The Driverless Car Analogy

I’ve heard the driverless vehicle used as an analogy for the autonomous supply chain.  However, orchestrating the value network where goods, information and currency pulse freely, fast, and securely between facilities, organizations, and even consumers, following the path of least resistance (aka the digital supply chain), may prove to be even more complex than driving a vehicle.  Digital technologies, such as additive manufacturing, blockchain, and more secure IoT infrastructure, advance the freedom, speed and security of these flows.  As these technologies make more automation possible, as well as a kind of “autonomy”, guiding these flows becomes both more difficult and more crucial.

 

Most sixteen-year-old adolescents can successfully drive a car, but you may not want to entrust your global value network to them.

 

 

Before you can have an autonomous supply chain, you need to accelerate the Detect, Diagnose, Direct Cycle – let’s call it the 3-D Cycle, for short, not just because it’s alliterative, but because each “D” is one of three key dimensions of orchestrating your value network.  In fact, as you accelerate the 3-D Cycle, you will learn just how much automation and autonomy makes sense.

 

Figure 1

Detect, Diagnose, Direct

The work of managing the value network has always been to make the best plan, monitor issues, and respond effectively and efficiently.  However, since reality begins to diverge almost immediately from even the best plans, perhaps the most vital challenges in orchestrating a value network are monitoring and responding.

 

In fact, every plan is really just a response to the latest challenges and their causes.

 

So, if we focus on monitoring and responding, we are covering all the bases of what planners and executives do all day . . . every day.

 

Monitoring involves detecting and diagnosing those issues which require a response.  Responding is really directing the next best action.  That’s why we can think in terms of the “Detect, Diagnose, Direct Cycle”:

 

  1. Detect (and/or anticipate) market requirements and the challenges in meeting them
  2. Diagnose the causes of the challenges, both incidental and systemic
  3. Direct the next best action within the constraints of time and cost
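To make the cycle concrete, here is a toy Python sketch of one pass through detect, diagnose, and direct for late shipments.  Every field name, date, and rule here is an invented illustration, not a reference implementation:

```python
from dataclasses import dataclass
from datetime import date

# Hypothetical shipment records; the fields are illustrative only.
@dataclass
class Shipment:
    order_id: str
    promised: date    # date promised to the customer
    projected: date   # current projected arrival
    cause: str        # e.g. "carrier delay", "vendor shortage"

def detect(shipments):
    """Detect: flag shipments projected to miss their promise date."""
    return [s for s in shipments if s.projected > s.promised]

def diagnose(exceptions):
    """Diagnose: tally exceptions by cause, separating incidental from systemic."""
    counts = {}
    for s in exceptions:
        counts[s.cause] = counts.get(s.cause, 0) + 1
    return counts

def direct(causes):
    """Direct: pick the next best action for the most frequent cause (toy rule)."""
    if not causes:
        return "no action required"
    worst = max(causes, key=causes.get)
    return f"escalate: {worst}"

shipments = [
    Shipment("A1", date(2018, 6, 1), date(2018, 6, 3), "carrier delay"),
    Shipment("A2", date(2018, 6, 1), date(2018, 5, 30), "n/a"),
    Shipment("A3", date(2018, 6, 2), date(2018, 6, 6), "carrier delay"),
]
print(direct(diagnose(detect(shipments))))  # escalate: carrier delay
```

In practice, detection would run continuously against a synchronized data set, and the “direct” step would weigh time and cost constraints rather than a single frequency rule.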

 

The 3-D Cycle used to take a month, in cases where it was even possible.  Digitization – increased computing power, more analytical software, the availability of data – has made it possible in a week.  Routine, narrowly defined, short-term changes are now addressed even more quickly under a steady state – and in that case, a lot of controlled automation is not only possible but practically obligatory (think robotic process automation, or RPA).  However, no business remains in a steady state, and departures from that state require critical decisions which add or destroy significant value.

 

You will need to excel at managing and accelerating the 3-D Cycle, if you want to win in the digital economy.

 

There is no industry where mastering this Cycle is more challenging than in retail, but the principles apply across most industries.

Data Is the Double-edged Sword

The universe of data is exploding exponentially from growing connections among organizations, people and things, creating the need for an ever-accelerating 3-D Cycle.  This is especially relevant for retailers, and it presents both a challenge and an opportunity for competing with your digital value network in the global digital economy.

 

For example, redesigned retail supply chains, enabled with analytics and augmented reality (AR), are not only meeting, but raising, consumer expectations.

 

Figure 2

Amazon’s re-imagination of retail means that competitors must now think in terms of many-to-many flows of information, product, and cash along the path of least resistance for the consumer (and not just to and from their own locations).  This kind of value network strategy goes beyond determining where to put a warehouse and to which stores it should ship.  Competing in today’s multi-channel world can mean inventing new ways to do business, even in the challenging fashion space – and if it is happening in fashion, it would be naive to think rising consumer expectations can be ignored in other retail segments, or even other industries.  Consider a few retail examples:

Zara leverages advanced analytics, not only to sense trends, but also to optimize pricing and operations in their vertically integrated supply chain.

Stitch Fix is changing the shopping model completely, providing more service with less infrastructure.

Zalando has been so successful in creating a rapid response supply chain that they are now providing services to other retailers.

Nordstrom, of all organizations, is opening “inventoryless” stores.

Walmart has been on an incredible acquisition and partnership spree, recently buying Flipkart and, as early as two years ago, partnering with JD.com.  And, then, there is the success of Walmart.com.

Target is redesigning the way their DCs work, creating a flow-through operation with smaller replenishment quantities.

 

Yet, many companies are choking on their own ERP data, as they struggle to make decisions on incomplete, incorrect and disparate data.  So, while the need for the 3-D Cycle to keep pace grows more intense, some organizations struggle to do anything but watch.  The winners will be those who can capitalize on the opportunities that the data explosion affords by making better decisions faster through advanced analytics (see Figure 2).

 

The time required just to collect, clean, transform and synchronize data for analysis remains a fundamental barrier to better detection, diagnosis and decisions in the value network.  A data store that connects to source systems, and on which data can be consolidated, programmatically “wrangled”, and updated into a supra data set, forms a solid foundation on which to build better detection, diagnosis, and decision logic that can execute in “relevant time”.  This can seem like an almost insurmountable challenge, but it is not only doable with today’s technology, it’s becoming imperative.  And, it’s now possible to work off of a virtual supra data set, but that’s a discussion for another day.
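As a toy illustration of that “wrangling” step, here is a Python sketch that merges inventory records from two hypothetical source systems, normalizes their keys, and derives a synchronized field.  The system names, fields, and records are all invented for illustration:

```python
# Records as they might arrive from two "source systems" (hypothetical).
erp_rows = [{"sku": "A-100", "on_hand": 40}, {"sku": "B-200", "on_hand": 15}]
wms_rows = [{"SKU": "a-100", "allocated": 12}, {"SKU": "b-200", "allocated": 5}]

def wrangle(erp, wms):
    """Consolidate both feeds into one keyed 'supra' record set."""
    supra = {}
    for r in erp:
        # normalize the key so both systems agree
        supra[r["sku"].upper()] = {"on_hand": r["on_hand"], "allocated": 0}
    for r in wms:
        key = r["SKU"].upper()
        supra.setdefault(key, {"on_hand": 0, "allocated": 0})
        supra[key]["allocated"] = r["allocated"]
    # derive a synchronized field: available-to-promise
    for v in supra.values():
        v["atp"] = v["on_hand"] - v["allocated"]
    return supra

supra = wrangle(erp_rows, wms_rows)
print(supra["A-100"]["atp"])  # 28
```

A real implementation would, of course, deal with many more systems, defects, and update cadences; the point is only that the cleaning and synchronization logic can be made programmatic and repeatable.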

Detect, Diagnose and Direct with Speed, Precision & Advanced Analytics

Detection of incidental challenges (e.g. demand is surging or falling based on local demographics, a shipment is about to arrive late, a vendor is behind on production, etc.) in your value network can be significantly automated to take place in almost real time, or at least in relevant time.   Detection of systemic challenges will be a bit more gradual and is based on the metrics that matter to your business, such as customer service, days of supply, etc., but it is the speed, and therefore the scope, now possible that drives better visibility through detection.

 

Diagnosing the causes of incidental problems is only limited by the organization and detail of your transactional data.  Diagnosing systemic challenges requires a hierarchy of metrics with respect to cause and effect (such as the SCOR® model).  Certainly, diagnosis can now happen with new speed, but it is the combination of speed and precision that makes a new level of understanding possible through diagnosis.

 

With a clean, complete, synchronized data set that is always available and always current, as well as a proactive view of what is happening and why, you need to direct the next best action while it still matters.  You need to optimize your trade-offs and perform scenario and sensitivity analysis.
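As a minimal sketch of what scenario and sensitivity analysis over one such trade-off might look like, consider sweeping candidate safety-stock levels against a handful of demand scenarios.  The demand figures and cost rates below are purely illustrative assumptions:

```python
# Toy trade-off: inventory holding cost vs. the cost of disservice (stockouts).
HOLDING_COST_PER_UNIT = 2.0    # per period (assumed)
STOCKOUT_COST_PER_UNIT = 9.0   # lost margin + goodwill, per unit short (assumed)

demand_scenarios = [80, 100, 120, 150]  # equally likely demand outcomes

def expected_cost(safety_stock, base_stock=100):
    """Average cost across scenarios for a given safety-stock level."""
    stock = base_stock + safety_stock
    total = 0.0
    for d in demand_scenarios:
        held = max(stock - d, 0)   # units carried past the period
        short = max(d - stock, 0)  # units we failed to serve
        total += held * HOLDING_COST_PER_UNIT + short * STOCKOUT_COST_PER_UNIT
    return total / len(demand_scenarios)

# Sensitivity sweep: which candidate level minimizes expected cost?
best = min(range(0, 60, 10), key=expected_cost)
print(best, expected_cost(best))  # 50 75.0
```

With disservice assumed this expensive relative to holding cost, the sweep pushes toward more safety stock; change the rates and the answer moves, which is exactly what sensitivity analysis is for.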

Figure 3, below, shows both incidental/operational and systemic/strategic examples for all three dimensions of the 3-D Cycle.

Figure 3

 

 

Speed in detection, speed and precision in diagnosis, and the culmination of speed, precision and advanced analytics in decision-making give you the power to lift the performance of your value network to levels not previously possible.  Much of the entire 3-D Cycle and the prerequisite data synchronization can be, and will be, automated by industry leaders.  Just how “autonomous” those decisions become remains to be seen.

 

Fortunately, you don’t need “Skynet”, but a faster and better 3-D Cycle is fundamental to your journey toward the digital transformation of your value network.

 

The basic ideas of detecting, diagnosing and directing are not novel to supply chain professionals and other business executives.   However, the level of transparency, speed, precision and advanced analytics that are now available mandate a new approach and promise dramatic results.  Some will gradually evolve toward a better, faster 3-D cycle.  The greatest rewards will accrue to enterprises that climb each hill with a vision of the pinnacle, adjusting as they learn.  These organizations will attract more revenue and investment.  Companies that don’t capitalize on the possibilities will be relegated to hoping for acquisition by those that do.

 

Admittedly, I’m pretty bad at communicating graphically, but I’ve attempted to draft a couple of rudimentary visuals of what the architecture to support a state-of-the-art 3-D Cycle could look like (Figure 4, below), as a vehicle for facilitating discussion.  I do realize that the divisions I’m showing between Cloud, IoT, Extended Apps, and ERP are somewhat arbitrary and definitely fluid.

 

Figure 4


So, I imagine that I’m at least partly wrong, and could be completely wrong-headed . . . but, then again, maybe not.  I will say this:  The convergence of cloud business intelligence (BI) technology and traditional advanced planning solutions supports my point, and that is definitely happening.  Cloud BI solutions (e.g. Aera, Birst, Board) incorporate at least some machine learning (ML) algorithms for prediction, while Oracle, Microsoft, IBM, and SAP are all making ML available in their portfolios, adjacent to their BI applications.  Many advanced planning vendors are pitching “control towers”, which are really an attempt to move toward combining BI capabilities and planning.  Logility recently purchased Halo, which embeds ML.  Even Oracle and SAP have built their cloud supply chain planning solutions with embedded BI, really making an effort toward a faster, better 3-D Cycle.

So, the future would appear to be now.  If that’s true, you have to ask yourself whether your current paradigm for value network planning will guide you to competitive advantage or leave you hoping that someone else will ask you to the dance.

Incorta, a company you will start to hear about, has innovated a disruptive BI technology and embraced the perspective I advocate in this post.  Their technology and DNA enable the accelerated 3-D Cycle more than any other I know.  That’s why I have joined them as Director of Supply Chain Innovation.

I’ll leave you with this thought for the weekend:  I know more now than I once did, especially about how much I still don’t know that I don’t know.

Have a wonderful weekend!

 

Analytics vs. Humalytics

I have a background in operations research and analysis so, as you might expect, I am biased toward optimization and other types of analytical models for supply chain planning and operational decision-making.   Of course, you know the obvious and running challenges that users of these models face:

  1. The data inputs for such a model are never free of defects
  2. The data model that serves as the basis for a decision model is always deficient as a representation of reality
  3. As soon as a model is run, the constantly evolving reality increasingly deviates from the basis of the model

Still, models and tools that help decision-makers integrate many complex, interrelated trade-offs can enable significantly better decisions.

But, what if we could outperform very large, complex, periodic decision models through a sort of “existential optimization” or, as a former colleague of mine put it, “humalytics”?

Here is the question expressed more fully:

If decision-makers within procurement, manufacturing and distribution and sales had the “right time” information about tradeoffs and how their individual contributions were affecting their performance and that of the enterprise, could they collectively outperform a comprehensive optimization/decision model that is run periodically (e.g. monthly/quarterly) in the same way that market-based economies easily outperform centrally planned economies?

I would call this approach “humalytics” (borrowed from a former colleague, Russell Halper, but please don’t blame him for the content of this post!):  leveraging a network of the most powerful analytical engines – human brains – empowered with quantified analytical inputs that are updated in “real time”, or as close to that as required.  In this way, managers can combine these analytics with their experience and knowledge of the business, including factors that might not be included in a decision model, to constantly make the best decisions with regard to replenishment and fulfillment, constantly increasing the value of the organization.

In other words, decision-makers would have instant, always-on access to both performance metrics and the tradeoffs that affect them.  For example, a customer service manager might see a useful visualization of actual total cost of fulfillment (cost of inventory and cost of disservice) and the key drivers such as actual fill rates and inventory turns as they are happening, summarized in the most meaningful way, so that the responsible human can make the most informed “humalytical” decisions.
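As a hedged sketch of the kind of “right time” metric that manager might watch, here is a toy total-cost-of-fulfillment calculation.  The cost rates and figures are purely illustrative assumptions, not benchmarks:

```python
def total_cost_of_fulfillment(avg_inventory_value, carrying_rate,
                              units_short, cost_per_unit_short):
    """Toy metric: inventory carrying cost plus cost of disservice."""
    carrying = avg_inventory_value * carrying_rate      # cost of inventory
    disservice = units_short * cost_per_unit_short      # cost of not serving
    return carrying + disservice

# e.g. $2M average inventory at an assumed 20% annual carrying rate,
# plus 1,500 units short at an assumed $50 of lost margin/goodwill each
print(total_cost_of_fulfillment(2_000_000, 0.20, 1_500, 50))  # 475000.0
```

The value of the idea is not the arithmetic, which is trivial, but that the inputs would be refreshed continuously from live data rather than assembled once a month.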

Up until now, the answer has been negative for at least two reasons:

A. Established corporate norms and culture in which middle management (and maybe sometimes even senior management) strive diligently for the status quo.

B. Lack of timely and complete information and analytics that would enable decision-makers to act as responsible, accountable agents within an organization, the same way that entrepreneurs act within a market economy.

With your indulgence, I’m going to deal with these in reverse order.

A few software companies have been hacking away at obstacle B, and we may be approaching a tipping point where accurate, transparent information and relevant, timely analytics can be delivered in near real-time, even on mobile devices, allowing human decision-makers to constantly adjust their actions to deliver continuously improved performance.  This is what I am calling “humalytics”.

But the network of human decision-makers with descriptive metrics is not enough.  Critical insights into tradeoffs and metrics come through analytical models, leveraging capabilities like machine learning, optimization, and RPA, perhaps in the form of “mini-app” models that operate on a curated supra data set that is always on and always current.  So, at least two things are necessary:

1. Faster optimization and other analytical modeling techniques from which the essential information is delivered in “right time” to each decision-maker

2. An empowered network of (human) decision-makers who understand the quantitative analytics that are delivered to them and who have a solid understanding of the business and their part in it

In current robotics research there is a vast body of work on algorithms and control methods for groups of decentralized cooperating robots, called a swarm or collective. (ftp://ftp.deas.harvard.edu/techreports/tr-06-11.pdf)  Maybe we don’t need a swarm of robots, after all.  Maybe we just need empowered decision-makers who not only engage in Sales and Operations Planning (or, if you will, Integrated Business Planning), but in integrated business thinking and acting on an hourly (or right-time) basis.

What think you?

If you think this might make sense for your business, or if you are working on implementing this approach, I’d be very interested to learn your perspective and how you are moving forward.

I leave you with these words from Leo Tolstoy, “There is no greatness where there is no simplicity, goodness, and truth.”

Have a wonderful weekend!

Metrics, Symptoms and Cash Flow

Metrics can tell us if we are moving in the right or wrong direction and that, in itself, is useful.  However, metrics by themselves do not help us assess our competitive position or aid us in prioritizing our efforts to improve.

To understand our competitive position, metrics need to be benchmarked against comparable peers. Benchmarking studies are available, some of them free.  They tell us where we stand relative to others in the industry, provided the study in question has sufficient other data points from your industry (or sub-industry segment).
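As a small illustration of what “benchmarked against comparable peers” can mean computationally, here is a toy percentile-rank calculation against an invented peer distribution (the peer values below are made up, not survey data):

```python
def percentile_rank(value, peers):
    """Share of peers (as a percentage) that score below our value."""
    below = sum(1 for p in peers if p < value)
    return 100 * below / len(peers)

# Hypothetical peer inventory turns from a benchmarking study
peer_inventory_turns = [4.1, 5.3, 6.0, 6.8, 7.5, 8.2, 9.0, 10.4]

print(percentile_rank(6.5, peer_inventory_turns))  # 37.5
```

A real study would segment peers by industry and size, but the mechanics of locating yourself in the distribution are no more complicated than this.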

Many times, getting relevant benchmarks proves challenging.  But once we have the benchmarks, then what?

Does it matter if we do not perform as well as the benchmark of a particular metric?  If that metric affects revenue growth, margins, return on assets, or available capital, it may matter significantly.

But, we are left to determine how to improve the metrics and with which metrics to start.  

Consider an alternative path.  Begin with the undesirable business symptoms that keep you up at night and give you that bad feeling in the pit of your stomach.

Relate business processes to symptoms and map potential root causes within each business process to undesirable business symptoms.

Multiple root causes in multiple business processes can relate to a single symptom.  On the other hand, a single root cause may be causing multiple undesirable symptoms.  Consequently, we must quantify and prioritize the root causes.
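Here is a toy Python sketch of that many-to-many quantification.  The symptoms, causes, dollar impacts, and contribution shares are all invented for illustration; in practice these estimates come from the diagnosis work itself:

```python
# Annualized impact of each undesirable symptom ($K) -- assumed figures
symptom_cost = {
    "late deliveries": 800,
    "excess inventory": 500,
    "expedite fees": 300,
}

# Estimated share of each symptom attributable to each root cause
cause_contribution = {
    "inaccurate lead times": {"late deliveries": 0.5, "excess inventory": 0.4},
    "poor forecast process": {"excess inventory": 0.5, "expedite fees": 0.6},
    "carrier performance":   {"late deliveries": 0.3, "expedite fees": 0.2},
}

def prioritize(costs, contributions):
    """Score each root cause by the symptom impact it drives, highest first."""
    scores = {
        cause: sum(costs[s] * share for s, share in shares.items())
        for cause, shares in contributions.items()
    }
    return sorted(scores.items(), key=lambda kv: -kv[1])

for cause, score in prioritize(symptom_cost, cause_contribution):
    print(f"{cause}: ${score:.0f}K")
```

Even with rough estimates, a ranking like this tells you which process fix to fund first, which is the whole point of quantifying root causes rather than chasing symptoms.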

“Finding the Value in Your Value Network” outlines a straightforward, systematic approach to prioritizing and accelerating process improvements.  I hope you will take a look at that article and let me know your thoughts.

Thanks for having a read.  Remember that “You cannot do a kindness too soon, for you never know how soon it will be too late.”

Have a wonderful weekend!

Make Analytics Useful, Meaningful and Actionable

Last week, I identified reasons for the organizational malady of failing to fully leverage analytics to make higher quality decisions in less time.  As promised, this week, I want to share a remedy.

For the analyst, I recommend the following:

  1. Put yourself in the shoes of the decision-maker.  Try to step back from the details of your analysis for a moment and ask yourself the questions he or she will ask.
  2. Engage your decision-maker in the process.  Gather their perspective as an input.  Don’t make any assumptions.  Ask lots of questions.  They probably know things that you don’t know about the question you are trying to answer.  Draw them out.  Schedule updates with the decision-maker, but keep them brief and focused on essentials.  Ask for their insight and guidance.  It may prove more valuable than you think.
  3. Take time to know, explore and communicate the “Why?” of your analysis – Why is the analysis important?  Why are the results the way they are?  To what factors are the results most sensitive and why?  Why are the results not 100% conclusive?  What are the risks and why do they exist?  What are the options? 
  4. Make sure you schedule time to explain your approach and the “Why?”  Your decision-maker needs to know beforehand that this is what you are planning to do.  You will need to put the “Why”? in the context of the goals and concerns of your decision-maker.
  5. Consider the possible incentives for your decision-maker to ignore your recommendations and give him or her reasons to act on your recommendations that are also consistent with their own interest.
  6. “A picture is worth a thousand words.”  Make the analysis visual, even interactive, if possible.
  7. Consider delivering the results in Excel (leveraging Visual Basic, for example), not just in a PowerPoint presentation or a Word document.  In the hands of a skilled programmer and analyst, amazing analysis and pictures can be developed and displayed through Visual Basic and Excel.  Every executive already has a license for Excel, and this puts him or her face-to-face with the data (hopefully in graphical form as well as tabular).  You may be required to create a PowerPoint presentation, but keep it minimal and try to complement it with Excel or another tool that actually contains the data and the results of your analysis.

Frustration with your decision-making audience will not help them, you, or the organization.  Addressing them where they are by intelligently and carefully managing the “soft” side of analytics will often determine whether you make a difference or contribute to a pile of wasted analytical effort. 

Thanks again for stopping by.  I hope that these suggestions will improve the usefulness of your analysis.  As a final thought for the weekend, consider these words from Booker T. Washington, “There is no power on earth that can neutralize the influence of a high, pure, useful and simple life.” 

Have a wonderful weekend!

Why the Soft Side of Analytics Is So Hard to Manage

I’m borrowing both inspiration and content from two good friends and long-time supply chain professionals, Scott Sykes and Mike Okey.  They deserve the credit for the seminal thoughts.  Any blame for muddling the ideas or poorly articulating them is all mine.

If you are an analyst, operations researcher or quantitative consultant, you probably enjoy the “hard” side of analytics.  What we often struggle with as analysts is what you might call the “soft” side of analytics which is always more challenging than the “hard” stuff.  Here are a few of the reasons why.

Many times, the problem is not insufficient data, defective data, inadequate data models, or even incompetent analysis.  Often, the reason that better decisions are not made in less time is that many companies of all sizes have some, if not many, managers and leaders who struggle to make decisions with facts and evidence . . . even when it is spoon-fed to them.  One reason is that regardless of functional or organizational orientation, some executives tend not to be analytically competent or even interested in analysis.  As a result, they tend to mistrust any and all data and analyses, regardless of source.

In other situations, organizations still discount robust analysis because the resulting implications require decisions that conflict or contrast with “tribal knowledge”, institutional customs, their previous decisions, or ideas that they or their management have stated for the record.  Keep in mind that, where it is analytically supportable, at least some of your analysis may need to affirm the audience’s current thinking and direction if you want them to listen to the parts of your analysis that challenge that thinking and direction.

Understanding the context or the “Why?” of analysis is fundamental to benefiting from it.  However, there are times when the results of an analysis can be conflicting or ambiguous.  When the results of analysis don’t lead to a clear, unarguable conclusion, then managers or executives without the patience to ask and understand “Why?” may assume that the data is bad or, more commonly, that the analyst is incompetent.

Perhaps the most difficult challenge an organization must overcome in order to raise the level of its analytical capability is the natural hubris of senior managers who believe that their organizational rank defines their level of unaided analytical insight.  Hopefully, as we grow older, we also grow wiser.  The wiser we are, the slower we are to conclude and the quicker we are to learn.  The same ought to be true as we progress up the ranks of our organization, but sometimes it isn’t.

So, if these are the reasons for the organizational malady of failing to fully leverage analytics to make higher quality decisions in less time, what is the remedy?

The remedy for this is the subject of next week’s post, so please “stay tuned”!

Thanks for having a read.  Whether you are an executive decision-maker, a manager, or an analyst, I hope these ideas have made you stop and think about how you can help your organization make higher quality decisions in less time.

A final thought comes from T.S. Eliot, “The only wisdom we can hope to acquire is the wisdom of humility—humility is endless.”

Have a wonderful weekend!
