What Are “Analytics”?

“Analytics” is one of those business buzzwords formed by transforming an adjective into a noun.

So forceful and habitual is such misuse of language that one might call it a compulsion among business analysts and writers.

The term “analytics” commonly refers to software tools used to organize, report, and sometimes visualize data in an attempt to lend it meaning for decision-makers.  These capabilities have advanced in recent years, so that many types of graphical displays can readily be employed to present data and distill information from it.  “Analytics” has been applied to such a broad array of software applications, which numerous industry analysts have attempted to segment in various ways, that it is useful to establish some broad categories.

A simple, though imperfect, scheme is the following set of four categories, in which the potential value that can be achieved increases from #1 through #4:

  1. Reports – repetitively run displays of pre-aggregated and sorted information with limited or no user interactivity.

  2. Dashboards – frequently updated displays of performance metrics, often graphical and ideally tailored to the needs of a given role.  Dashboards support the measurement of performance, based on pre-aggregated data, with some user selection and drill-down capability.  Hierarchies of metrics have been created that attempt to link responsibility with performance indicators; the most common such model is the Supply Chain Operations Reference (SCOR) model, created and maintained by the Supply Chain Council.

  3. Data Analysis Tools – interactive software applications that enable data analysts to dynamically aggregate, sort, plot, and otherwise explore data, based on metadata.  Significant advancements in recent years have dramatically expanded the options for visualizing data and accelerated the speed at which these tools generate results.

  4. Decision Support/Management Science Tools – simulation, optimization, and other approaches to multi-criteria decisions that require the application of statistics and mathematical modeling and solving.
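To make category #4 concrete, here is a minimal sketch of the kind of model these tools solve: a two-product mix problem, with invented margins and capacities, formulated as a linear program and solved with SciPy.  Real decision-support models are far richer, but the core idea, maximizing an objective subject to constraints, is the same.

```python
# Minimal product-mix optimization sketch (all numbers invented).
# Choose production quantities that maximize margin within capacity limits.
from scipy.optimize import linprog

margins = [40.0, 30.0]      # margin per unit for hypothetical products A and B

usage = [[2.0, 1.0],        # machine hours per unit of A and B
         [1.0, 3.0]]        # labor hours per unit of A and B
capacity = [100.0, 90.0]    # available machine and labor hours

# linprog minimizes, so negate the margins to maximize total margin.
result = linprog(c=[-m for m in margins],
                 A_ub=usage, b_ub=capacity,
                 bounds=[(0, None), (0, None)])

print("optimal mix (units of A, B):", result.x)   # about [42, 16]
print("maximum total margin:", -result.fun)       # about 2160
```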

Let’s focus on Decision Support/Management Science Tools, the category with the most potential for adding sustained value to strategic (high-value) decision-making.

So, then, if that is what analytics are, do they enable higher quality decisions in less time, and if so, to what extent are those better, faster decisions driving cash flow and value for the business?  These are critically important questions, because improved, integrated decision-making that is based on facts and adjusted for risk drives the bottom line.

Execution is good, but operational execution under a poor decision set is like going fast in the wrong direction.  Weak execution is bad, but perhaps not immediately fatal; poor decisions, on the other hand, will put a business under very quickly.

Enabling higher quality decisions in less time depends on the decision-maker, but it can also depend on the tools employed and the skills of the analysts using the tools. 

The main activities in using these tools involve the following:

  1. Sifting through the oceans of data that exist in today’s corporate information systems
  2. Synthesizing the relevant data into information (a thoughtful data model within an analytical application is helpful, but not sufficient)
  3. Presenting it in such a way that a responsible manager can combine it with experience and quickly know how to make a better decision

Obtaining a valuable result requires careful preparation and skilled interaction, asking the right questions initially and throughout the above activities.

Some of the questions that need to be asked before the data can be synthesized into useful information include the following:

  1. What is the business goal?
  2. What decisions are required to reach the goal?
  3. What are the upper and lower bounds of each decision? (Which outcomes are unlivable?)
  4. How sensitive is one decision to the outcome of other, interdependent decisions?
  5. What risks are associated with a given decision outcome?
  6. Will a given decision today impact the options for the same decision tomorrow?
  7. What assumptions are implicitly driven by insufficient data?
  8. How reliable is the data upon which the decision is based?
    • Is it accurate?
    • How much of the data has been driven by one-time events that are not repeatable?
    • What data is missing?
    • Is the data at the right level of detail?
    • How might the real environment in which the decision is to be implemented be different from that implied by the data and model (i.e. an abstraction of reality)?
    • How can the differences between reality and its abstraction be reconciled so that the results of the model are useful?

Ask the right questions.

Know the relative importance of each.

Understand which techniques to apply in order to prioritize, analyze and synthesize the data into useful information that enables faster, better decisions.

We often think of change when a new calendar year rolls around.  Since this is my first post of the new year, I’ll leave you with one of my favorite quotes on change, from Leo Tolstoy:  “Everybody thinks of changing humanity, and nobody thinks of changing himself.”

Have a wonderful weekend!


Resilience Versus Agility

Just a short thought as we move into this weekend . . .

Simple definitions of resiliency and agility as they relate to your value network might be as follows:

Resiliency:  The quality of your decisions and plans whose value is not significantly degraded by variability in demand and/or changes in your competitive and economic environment.

Agility:  The ability to adjust your plans and execution for maximum value in response to variability in demand and/or changes in your competitive and economic environment.

You can take an analytical approach that will make your plans and decisions resilient and also give you insights into what you need to do in order to be agile.

You need to know the appropriate analytical techniques and how to use them for these ends.

A capable and usable analytical platform can mean the difference between knowing what you should do and actually getting it done.

For example, scenario-based analysis is invaluable for understanding agility, while range-based optimization is crucial for resiliency.
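To sketch the distinction (the demand scenarios and economics below are invented), the following evaluates one fixed plan across several demand scenarios.  How little the fixed plan’s value degrades across scenarios is a rough measure of resilience; the gap between the fixed plan and the best re-planned result in each scenario approximates the value of agility.

```python
# Scenario-based analysis sketch (hypothetical numbers throughout).
# A "plan" is simply a stocking quantity; profit is margin on units sold
# minus holding cost on leftovers (a simple newsvendor-style evaluation).
MARGIN, HOLDING = 8.0, 3.0
scenarios = {"low": 80, "base": 100, "high": 140}   # demand per scenario

def profit(plan_qty, demand):
    sold = min(plan_qty, demand)
    leftover = max(plan_qty - demand, 0)
    return MARGIN * sold - HOLDING * leftover

fixed_plan = 100
for name, demand in scenarios.items():
    fixed = profit(fixed_plan, demand)
    best = max(profit(q, demand) for q in range(201))  # re-plan per scenario
    print(f"{name:>5}: fixed = {fixed:7.1f}  agile re-plan = {best:7.1f}  "
          f"value of agility = {best - fixed:6.1f}")
```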

Do you know how to apply these techniques?

Do you have the tools to do it continuously?

Can you create user- and manager-ready applications to support resiliency and agility?

Finally, I leave you with this thought from Curtis Jones:  “Life is our capital and we spend it every day.  The question is, what are we getting in return?”

Thanks for stopping by.  Have a wonderful weekend!

What’s Best for Advanced Analytics: A Package or a Platform?

The Dilemma

Many organizations struggle with the dilemma of choosing between a commercially available, prepackaged, off-the-shelf software application to address a particular decision or planning challenge and creating something completely from scratch.

There is an increasingly viable third option – leveraging a platform to rapidly develop and support a secure, scalable, enterprise-class application for business users that embeds powerful analytical techniques like optimization and machine learning.  The accelerating maturity of such platforms, especially in the cloud, has made this third option one that really can’t be overlooked.

In most, if not all, cases, the availability of a platform means that building your own app for strategically differentiating decisions is not only doable, but imperative.

The Key Questions

The rather obvious question is how to compare apples with bowling balls, so to speak.

Alternatives for an off-the-shelf, prepackaged, software application approach can’t be evaluated in the same way as alternatives for a platform approach.

So, I offer you the following questions to determine which approach is best for a given use case:

  1. Is the financial opportunity strategically significant (i.e., will it make a difference in the value of the enterprise)?
  2. Do you need to capture as much of the opportunity as possible (you can’t settle for 70% or 80%)?
  3. Because of #1, #2 and other reasons, do you need to make these decisions to achieve this opportunity in a way that is different from the competition and that can’t be copied by them?
  4. Do you intend to expand the use of advanced analytics and continuously improve the decision models you intend to build, thereby continuing to differentiate yourself in the market?

If the answers to all four of these questions are “yes”, then you should be focusing your attention on a “platform”, not on off-the-shelf packages.  If the answer to either of the first two is “no”, then you should probably look for a prepackaged application.

The right platform enables you to load, organize, clean, synchronize, share and analyze data from multiple sources so you can solve complex challenges rapidly, all with state-of-the-art technology.  

Thanks for stopping by.  Have a wonderful weekend, for which I leave you this thought from Ralph Waldo Emerson:

“Though we travel the world over to find the beautiful, we must carry it within us, or we find it not.”


Analytics vs. Humalytics

I have a background in operations research and analysis so, as you might expect, I am biased toward optimization and other types of analytical models for supply chain planning and operational decision-making.  Of course, you know the obvious, recurring challenges that users of these models face:

  1. The data inputs for such a model are never free of defects
  2. The data model that serves as the basis for a decision model is always deficient as a representation of reality
  3. As soon as a model is run, the constantly evolving reality increasingly deviates from the basis of the model

Still, models and tools that help decision-makers integrate many complex, interrelated trade-offs can enable significantly better decisions.

But, what if we could outperform very large, complex, periodic decision models through a sort of “existential optimization” or, as a former colleague of mine put it, “humalytics”?

Here is the question expressed more fully:

If decision-makers within procurement, manufacturing, distribution, and sales had “right time” information about tradeoffs and about how their individual contributions were affecting their own performance and that of the enterprise, could they collectively outperform a comprehensive optimization/decision model that is run periodically (e.g. monthly or quarterly), in the same way that market-based economies easily outperform centrally planned economies?

I would call this approach “humalytics” (a term borrowed from a former colleague, Russell Halper, but please don’t blame him for the content of this post!): leveraging a network of the most powerful analytical engines, human brains, empowered with quantified analytical inputs that are updated in “real time”, or as close to that as required.  In this way, managers can combine those analytics with factors drawn from their experience and knowledge of the business that might not be included in a decision model, constantly making the best replenishment and fulfillment decisions and steadily increasing the value of the organization.

In other words, decision-makers would have instant, always-on access to both performance metrics and the tradeoffs that affect them.  For example, a customer service manager might see a useful visualization of the actual total cost of fulfillment (cost of inventory plus cost of disservice) and of key drivers such as actual fill rates and inventory turns as they are happening, summarized in the most meaningful way, so that the responsible human can make the most informed “humalytical” decisions.
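For illustration only, here is a minimal sketch of how such always-on fulfillment metrics might be computed; the transactions, holding rate, and disservice penalty are all invented.

```python
# "Humalytics"-style fulfillment metrics sketch (all numbers invented).
# Each order is (quantity requested, quantity shipped on time).
orders = [(100, 100), (80, 60), (120, 120), (50, 40)]

requested = sum(req for req, _ in orders)
shipped = sum(shp for _, shp in orders)
fill_rate = shipped / requested

avg_inventory_value = 250_000.0   # average on-hand inventory at cost
annual_cogs = 1_500_000.0         # annual cost of goods sold
inventory_turns = annual_cogs / avg_inventory_value

HOLDING_RATE = 0.20               # annual holding cost as a fraction of value
SHORTAGE_PENALTY = 25.0           # assumed cost of disservice per unit short

cost_of_inventory = HOLDING_RATE * avg_inventory_value
cost_of_disservice = SHORTAGE_PENALTY * (requested - shipped)
total_cost_of_fulfillment = cost_of_inventory + cost_of_disservice

print(f"fill rate: {fill_rate:.1%}, inventory turns: {inventory_turns:.1f}")
print(f"total cost of fulfillment: {total_cost_of_fulfillment:,.0f}")
```

Refreshed continuously, metrics like these would give the customer service manager the quantified tradeoffs to combine with his or her own business judgment.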

Up until now, the answer has been negative for at least two reasons:

A. Established corporate norms and culture in which middle management (and maybe sometimes even senior management) strive diligently for the status quo.

B. Lack of timely and complete information and analytics that would enable decision-makers to act as responsible, accountable agents within an organization, the same way that entrepreneurs act within a market economy.

With your indulgence, I’m going to deal with these in reverse order.

A few software companies have been hacking away at obstacle B, and we may be approaching a tipping point where accurate, transparent information and relevant, timely analytics can be delivered in near real time, even on mobile devices, allowing human decision-makers to constantly adjust their actions to deliver continuously improved performance.  This is what I am calling “humalytics”.

But a network of human decision-makers with descriptive metrics is not enough.  Critical insights into tradeoffs and metrics come through analytical models, leveraging capabilities like machine learning, optimization, and RPA, perhaps in the form of “mini-app” models that operate on a curated superset of data that is always on and always current.  So, at least two things are necessary:

1. Faster optimization and other analytical modeling techniques from which the essential information is delivered in “right time” to each decision-maker

2. An empowered network of (human) decision-makers who understand the quantitative analytics that are delivered to them and who have a solid understanding of the business and their part in it

In current robotics research there is a vast body of work on algorithms and control methods for groups of decentralized cooperating robots, called a swarm or collective (ftp://ftp.deas.harvard.edu/techreports/tr-06-11.pdf).  Maybe we don’t need a swarm of robots after all.  Maybe we just need empowered decision-makers who engage not only in Sales and Operations Planning (or, if you will, Integrated Business Planning), but in integrated business thinking and acting on an hourly (or right-time) basis.

What think you?

If you think this might make sense for your business, or if you are working on implementing this approach, I’d be very interested to learn your perspective and how you are moving forward.

I leave you with these words from Leo Tolstoy, “There is no greatness where there is no simplicity, goodness, and truth.”

Have a wonderful weekend!

Why the Soft Side of Analytics Is So Hard to Manage

No real post this week, but I’ll point you to an article of mine that was published this month in Analytics.  You may recognize it as a combination and enhancement of two of my previous blog posts.

Answering Questions that Your ERP and APS Can’t

I have worked for some large software companies.  I loved many aspects of those experiences.  But, do you want to know the toughest part of those jobs?  It was meeting someone from one of their customers and getting a reaction like, “Oh, you are the enemy!”  Yes, that’s literally what one woman said to me, verbatim.

Now, of course, she did not stop to consider all the things that were much easier for her company to do and to keep straight with an integrated, enterprise suite of software applications spanning accounting through manufacturing to procurement.

What flashed to her mind were the things that she and her colleagues could not do with the software.  That’s the way it is with software.  The first things we notice are what we can’t do, not what we can now do that was impossible before.

What we cannot do with our enterprise software systems, however, is a real problem.  To make matters worse, your knowledge workers can easily out-think a software application vendor’s development cycle.  There are some fairly legitimate reasons for this, of course, but the fact remains that ERP and APS vendors have no shot at supporting the need for ongoing innovation on the part of you and your colleagues, who must constantly make faster, better decisions.

Of course, that explains the popularity of Microsoft desktop applications like Excel and Access.

In the meantime, business managers who are not paid to be statisticians, data scientists, algorithm engineers, or programming experts struggle to build and constantly recreate the tools they need to do their work.

They are paid to ask important questions and find alternative answers, but the limitations of their enterprise resource planning (ERP) and advanced planning systems (APS) keep them wrestling just to find and format data in order to answer the really challenging analytical and/or strategic questions.

While it is possible to hire (internally or externally) the talent that combines deep business domain knowledge with data analysis, decision-modeling, and programming expertise to build customized spreadsheets in Microsoft Excel™, faster, more comprehensive, and ubiquitous cloud solutions are emerging.  What’s needed in this approach is the following:

  1. A hyper-fast, super-secure cloud of transaction-level data where like data sources are blended, dissimilar data sources are correlated, and most of the hundreds of basic calculations are performed.  This needs to be a single repository for all data of any type from any source.
  2. A diagnostic layer where the calculations are related to each other in cause-and-effect relationships (sketched just after this list)
  3. A continuous stream of decision-support models (e.g. econometric forecasts, optimization models, simulation, etc.)
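As a rough sketch of the diagnostic layer in item 2 (the metric names and relationships are invented for illustration), calculations can be wired into a small cause-and-effect graph so that any metric can be traced back to its base drivers:

```python
# Diagnostic-layer sketch: metrics wired into a cause-and-effect graph.
# Metric names and relationships are invented for illustration.
drivers = {
    "revenue": ["units_sold", "avg_price"],
    "units_sold": ["fill_rate", "demand"],
    "fill_rate": ["inventory_position", "lead_time"],
}

def root_causes(metric, graph):
    """Walk the graph downward to the base drivers of a metric."""
    children = graph.get(metric, [])
    if not children:
        return [metric]
    causes = []
    for child in children:
        causes.extend(root_causes(child, graph))
    return causes

print(root_causes("revenue", drivers))
# -> ['inventory_position', 'lead_time', 'demand', 'avg_price']
```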

If you ever need to make better decisions than your competition (Duh!), then this kind of framework may speed your time to value and result in a more secure, scalable, and collaborative solution than desktop tools or point software solutions can provide.   

Such a platform would allow you to see what is happening in business context and why it is happening, and to receive a recommendation for your next best action.

It also provides a way to build decision “apps” for your business.  You know what apps on your phone have done for you.  Imagine what apps for your enterprise could do . . . and all the data is already there or could be there, regardless of data type or source.

I will leave you with these words from William Pollard:  “Learning and innovation go hand-in-hand.  The arrogance of success is to think that what you did yesterday will be sufficient for tomorrow.”  (http://www.thinkexist.com)

Have a wonderful weekend!

Make Analytics Useful, Meaningful and Actionable

Last week, I identified reasons for the organizational malady of failing to fully leverage analytics to make higher quality decisions in less time.  As promised, this week, I want to share a remedy.

For the analyst, I recommend the following:

  1. Put yourself in the shoes of the decision-maker.  Try to step back from the details of your analysis for a moment and ask yourself the questions he or she will ask.
  2. Engage your decision-maker in the process.  Gather their perspective as an input.  Don’t make any assumptions.  Ask lots of questions.  They probably know things that you don’t know about the question you are trying to answer.  Draw them out.  Schedule updates with the decision-maker, but keep them brief and focused on essentials.  Ask for their insight and guidance.  It may prove more valuable than you think.
  3. Take time to know, explore and communicate the “Why?” of your analysis – Why is the analysis important?  Why are the results the way they are?  To what factors are the results most sensitive and why?  Why are the results not 100% conclusive?  What are the risks and why do they exist?  What are the options? 
  4. Make sure you schedule time to explain your approach and the “Why?”  Your decision-maker needs to know beforehand that this is what you are planning to do.  You will need to put the “Why?” in the context of the goals and concerns of your decision-maker.
  5. Consider the possible incentives for your decision-maker to ignore your recommendations and give him or her reasons to act on your recommendations that are also consistent with their own interest.
  6. “A picture is worth a thousand words.”  Make the analysis visual, even interactive, if possible.
  7. Consider delivering the results in Excel (leveraging Visual Basic, for example), not just in a PowerPoint presentation or a Word document.  In the hands of a skilled programmer and analyst, amazing analysis and pictures can be developed and displayed through Visual Basic and Excel.  Every executive already has a license for Excel, and this puts him or her face-to-face with the data (hopefully in graphical form as well as tabular).  You may be required to create a PowerPoint presentation, but keep it minimal and try to complement it with Excel or another tool that actually contains the data and the results of your analysis.

Frustration with your decision-making audience will not help them, you, or the organization.  Addressing them where they are by intelligently and carefully managing the “soft” side of analytics will often determine whether you make a difference or contribute to a pile of wasted analytical effort. 

Thanks again for stopping by.  I hope that these suggestions will improve the usefulness of your analysis.  As a final thought for the weekend, consider these words from Booker T. Washington, “There is no power on earth that can neutralize the influence of a high, pure, useful and simple life.” 

Have a wonderful weekend!

Why the Soft Side of Analytics Is So Hard to Manage

I’m borrowing both inspiration and content from two good friends and long-time supply chain professionals, Scott Sykes and Mike Okey.  They deserve the credit for the seminal thoughts.  Any blame for muddling the ideas or poorly articulating them is all mine.

If you are an analyst, operations researcher, or quantitative consultant, you probably enjoy the “hard” side of analytics.  What we often struggle with as analysts is what you might call the “soft” side of analytics, which is always more challenging than the “hard” stuff.  Here are a few of the reasons why.

Many times, the problem is not insufficient data, defective data, inadequate data models, or even incompetent analysis.  Often, the reason that better decisions are not made in less time is that many companies of all sizes have some, if not many, managers and leaders who struggle to make decisions with facts and evidence . . . even when they are spoon-fed to them.  One reason is that, regardless of functional or organizational orientation, some executives tend not to be analytically competent or even interested in analysis.  As a result, they tend to mistrust any and all data and analyses, regardless of source.

In other situations, organizations discount robust analysis because the resulting implications require decisions that conflict with “tribal knowledge”, institutional customs, previous decisions, or ideas that they or their management have stated for the record.  Keep in mind that at least some of your analysis may need to support the audience’s current thinking and direction, where that is analytically supportable, if you want the audience to listen to the part of your analysis that challenges it.

Understanding the context or the “Why?” of analysis is fundamental to benefiting from it.  However, there are times when the results of an analysis can be conflicting or ambiguous.  When the results of analysis don’t lead to a clear, unarguable conclusion, managers or executives without the patience to ask and understand “Why?” may assume that the data is bad or, more commonly, that the analyst is incompetent.

Perhaps the most difficult challenge an organization must overcome in order to raise the level of its analytical capability is the natural hubris of senior managers who believe that their organizational rank defines their level of unaided analytical insight.  Hopefully, as we grow older, we also grow wiser.  The wiser we are, the slower we are to conclude and the quicker we are to learn.  The same ought to be true as we progress up the ranks of our organization, but sometimes it isn’t.

So, if these are the reasons for the organizational malady of failing to fully leverage analytics to make higher quality decisions in less time, what is the remedy?

The remedy for this is the subject of next week’s post, so please “stay tuned”!

Thanks for having a read.  Whether you are an executive decision-maker, a manager, or an analyst, I hope these ideas have made you stop and think about how you can help your organization make higher quality decisions in less time.

A final thought comes from T.S. Eliot, “The only wisdom we can hope to acquire is the wisdom of humility—humility is endless.”

Have a wonderful weekend!

Profiling Your Profitability

I want to expand on the thought of a profitability profile.  A profitability profile is the result of analyzing which transactions have been profitable, and to what degree.

This analysis doesn’t need to be down to the penny with 100% accuracy to be of value.  In fact, that’s simply impossible in most cases, because some costs just aren’t captured at the transaction level, such as inventory holding and obsolescence costs, warranty costs, R&D, tooling, capacity investment, and so forth, all of which are exacerbated by demand variability and forecast error.  However, meaningful and useful approximations are feasible.

Once you have analyzed profitability by transaction, you need to segment the business.  You want to understand which customer, channel, and product combinations have been more or less profitable.  This requires you to segment the business by customer and product attributes.
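A minimal sketch of such a segmentation, using pandas, appears below; the transactions, attributes, and cost figures are invented, and a real analysis would also blend in the allocated costs discussed above.

```python
# Profitability-profile sketch: margin by customer/product segment.
# Data, column names, and costs are invented for illustration.
import pandas as pd

tx = pd.DataFrame({
    "customer_tier": ["A", "A", "B", "B", "C"],
    "product_line":  ["core", "niche", "core", "niche", "niche"],
    "revenue":       [1000.0, 400.0, 900.0, 150.0, 120.0],
    "direct_cost":   [600.0, 350.0, 500.0, 160.0, 110.0],
})
tx["margin"] = tx["revenue"] - tx["direct_cost"]

profile = (tx.groupby(["customer_tier", "product_line"])
             .agg(revenue=("revenue", "sum"), margin=("margin", "sum")))
profile["margin_pct"] = profile["margin"] / profile["revenue"]

# Least profitable segments first; these are the candidates for action.
print(profile.sort_values("margin_pct"))
```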

What is likely to emerge is an understanding of overlapping product offerings and whether some simplification is possible.  This data should also form the foundation for an analysis of pricing power and risk, as well as margin leakage.

There will be some obvious immediate actions that can be executed: raising prices on, or foregoing, unprofitable transactions; eliminating low-volume, unprofitable SKUs; shifting some customers to lower-cost channels.

There will also be longer term, more fundamental questions that need to be answered:

  1. What is the optimal product mix?
  2. How can we build tighter linkages with our most profitable customers?
  3. How does the price book need to be altered?
  4. How does the price approval process need to be configured?
  5. How can we align business functions for the most profitable mix of offerings and customers?  (e.g. If we want to orient our offerings around platforms, how does manufacturing need to change?  What do we need from our vendors?  Will the distribution channel need to adjust?  Does this affect the way we market and sell or process warranty work and returns?)

Next time, I’ll spend a few words on how you need to follow the profitability profile with a detailed analysis of the decisions that drove the unprofitable or marginally profitable transactions in the first place.

As a final thought to ponder over the weekend, I thought I would try my hand at a slightly different twist on the definition of supply chain (with apologies to Jonathan Byrnes from whose words I compiled it):  “Working intensively with counterparts across organizational boundaries to monitor, co-manage and maximize the profitable productivity of assets over the long run and jointly create important new value in the process.”

Thanks once more for spending a moment with me here at Supply Chain Action.

Have a wonderful weekend!

Applying Analytics and Supply Chain Tools to Healthcare

To say that understanding and managing a large, disaggregated system such as healthcare delivery, with its multitude of individual parts (patients with various medical conditions, physicians, clinics, hospitals, pharmacies, rehabilitation services, home nurses, and many more), is a daunting task would be an understatement.  As in other service or manufacturing systems, different stakeholders have different performance measures.

Patients want safe, effective care with low insurance premiums. 

Payers, usually not the patient, want low cost. 

Health care providers want efficiency.

The Institute of Medicine has identified six quality aims for the twenty-first-century healthcare system:  safety, effectiveness, timeliness, patient-centeredness, efficiency, and equity.  Achieving these goals in a complex system will require a holistic understanding of the needs and performance measures of all stakeholders, and simultaneously optimizing the tradeoffs among them.

This, in turn, cannot be achieved without leveraging the tools that have been developed in other industries.  These are summarized in the table below.

While the bulk of the work and the benefits derived from the application of these tools will lie at the organization level, these tools are well-developed concepts that can be applied directly to healthcare systems, beginning at the environmental level and working down to the patient, where indicated by the check marks.

Industrial engineers and operations researchers use systems analysis tools to understand how complex systems operate, how well they perform, and how they can be improved.  For example, mathematical analyses of system operations include queuing theory, which can be used to understand the flow of patients through a system, the average time patients spend in a system, or bottlenecks in the system.  Discrete event simulation is another tool that can aid in a more detailed examination of system characteristics and sensitivity to inputs and changes in the system.  Economic and econometric models, based on data-driven analysis, help identify causal relationships among system variables.  Supply chain management tools help forecast demand for services and relate that demand to available resources.  Longer-term mismatches can be minimized through sales and operations planning, while short-term challenges are addressed with capacity planning and scheduling.  A few examples of specific challenges that can be addressed through systems analysis tools include the following (a small queuing example follows the list):

  1. Staff scheduling
  2. Improving patient flow through rooms and other resources and elimination of wait time and waste in work flow
  3. Capacity management in hospitals
  4. Evaluation of blood supply networks
  5. Distributing and balancing consumable supplies
  6. Ensuring the availability of medical device kits
  7. Optimal allocation of funding
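As a small worked example of the queuing-theory point above (the arrival and service rates are invented), the classic M/M/1 formulas estimate clinic utilization and the average time a patient spends in the system:

```python
# M/M/1 queuing sketch for patient flow (rates invented for illustration).
lam, mu = 4.0, 5.0   # patients arriving per hour, patients served per hour
assert lam < mu, "the queue is unstable if arrivals outpace service"

rho = lam / mu       # utilization of the clinic
L = rho / (1 - rho)  # average number of patients in the system
W = 1 / (mu - lam)   # average time in system (hours); Little's law: L = lam * W

print(f"utilization: {rho:.0%}, average patients present: {L:.1f}, "
      f"average visit time: {W * 60:.0f} minutes")
```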

Systems analysis techniques have been developed over many years and are based on a large body of knowledge.  These types of analytical approaches, while very powerful, require appropriate tools and expertise in order to apply them efficiently and effectively.  Many healthcare delivery organizations are beginning to build staff who have at least some familiarity with a few of these tools, particularly Lean thinking in process design and six-sigma in supply chain management.  There are also instances where some of the tools under “Optimizing Results” are being applied.  But it is clear that much more remains to be done, and many healthcare providers will initially need to depend on resources external to their own organizations in order to leverage many of these tools.

Two notes of caution as we move forward:

  1. In our efforts to consider end-to-end processes and their inherent tradeoffs, we must ensure that we do not enforce a complex structure to the detriment of disruptive innovations that will lead to more efficient and effective health care, as described by Christensen et al. in The Innovator’s Prescription.
  2. We must also take care not to base our data analysis and decision models on faulty cost data or inadequate outcome data.  In most cases, neither reimbursements nor charges reflect costs and the measurement of outcomes is significantly underdeveloped.  Some of the tools outlined above will be helpful in addressing these challenges.

Thanks again for stopping by Supply Chain Action.  I leave you with a thought from Mother Teresa, “We can do no great things – only small things with great love.”

Have a wonderful weekend!
