Control Towers - Not another business process?

The volume is being turned up on the Control Tower approach to running a business; I have recently been introduced to logistics control towers, supply chain control towers and operations control towers just for starters.  I’m sure there must be at least a half dozen more out there – pick a noun, place it out front and voila, your very own control tower du jour.

By the time I got to the third one, I realized that these people were serious, and my initial reaction upon considering the implications was – Oh no, not another business process!  Does it replace something?  Does it consolidate multiple somethings into one something?  Does it provide me with a new capability?  If it doesn't do at least one of these, why would I bother?

Perhaps in the end the control towers will prevail, I will have been shown to have overreacted, and the answer to complexity might indeed be more complexity.  But I’m not going down without putting up at least some token resistance, so what follows are a few critical factors I think you should evaluate before taking the control tower plunge.  Keep in mind that by borrowing a pre-existing label / framework / metaphor – the airport control tower – we are also building on pre-existing concepts that will shape our expectations.

  • OPERATIONAL:  Control towers are for operations; they are tactical, not strategic.  Control towers are about execution, not planning or simulation.  I have seen presentations where everything but the kitchen sink is thrown in for good measure – a dashboard, some analytics, some alerts, some simulation, some reporting, some optimization.  Good grief, the end result would more resemble NASA Mission Control than an airport control tower. 
  • INTEGRATION:  I have likewise seen presentations where, in the name of “visibility” – a worthy goal indeed – the objective becomes creating an end-to-end control tower, from the tier-n supplier to the end consumer and everything in between.  This is not how real-world airport control towers work.  Most airport operations divide their control tower functionality into three more manageable segments: ground control (gate-side to taxiway, including ground vehicles), local or “tower” control (active runways and close-range airspace), and regional airspace control, with carefully orchestrated handoffs between each. 

Likewise, I think we would be better off focusing on improving our data integration and better coordinating our own internal handoffs than on building something unmanageable in practice.  The end-to-end visibility is still there, just in manageable chunks.  While it’s always good to have a noble, motivating high-end vision in mind, it is often more productive to simply set our sights on meaningful, achievable, incremental improvement.  If you are at stage 1 or 2 on a five-level maturity scale, getting to stage 3 and staying there can be much more important than aspiring to a level 5 goal that never gets any closer.

  • AUTHORITY:  Are you ready to give your control tower complete authority over the cross-functional processes it governs?  Because if not, you are just wasting your time.  An airport control tower works precisely because it has complete authority; the air traffic controllers are the gods of their domains, superseding even the airplane captains.  Can you invest that kind of authority in your control tower personnel, over and above that of the functional domains, the department managers and division directors they are meant to be coordinating?  Personally, I do think this is where business needs to evolve (see my argument in favor of senior VPs, reporting to the CEO, in charge of each cross-functional “Value Discipline”, in the post “The Sound and the Fury of enterprise-wide process management”), but measured against that goal, I think control towers are half-way measures doomed to failure.
  • BUSINESS RULES:  If control towers are about operations and execution, then their value derives from their ability to quickly identify and respond to issues as they arise in real time.  The ideal front-end for a control tower would be an event stream processing or decision management application – triage for the incoming data – with a visual BI/analytics tool as the ideal platform.  Some problems could be dispatched automatically, with no human intervention required.  Others may simply need all inputs displayed for human evaluation, with a decision made based on comparisons, trade-offs, triggers and priorities.  The most difficult situations might require real-time inputs to be married with supporting static data (e.g., customer, product, inventory, operational capacity) so that a more informed decision can be arrived at.
  • METRICS:  Lastly, if a control tower is going to fulfill its role and promise, then it needs its own operationally-oriented set of metrics, not the metrics emerging from the overall planning process.  If an event has come to the attention of the control tower via an alert, then something has already gone “wrong”; there has already been a deviation from plan.  The control tower’s job at that stage is to make the best of a bad situation.  The control tower is not optimizing at this point – the plan was already optimized – it is simply trying to keep deviations from plan as small and as inconsequential as possible.  You don’t hold the control tower to achievement-of-plan metrics – that’s for the rest of the organization – you hold it instead to metrics regarding how well it managed the disruption. 
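The triage logic described under BUSINESS RULES can be sketched in a few lines of Python. This is a minimal illustration of the three-tier split (automatic dispatch, human review, enriched review), not any vendor's implementation; the event fields and thresholds are invented for the example:

```python
from dataclasses import dataclass

@dataclass
class Event:
    kind: str             # e.g. "late_shipment", "stockout_risk" (hypothetical types)
    severity: int         # 1 (minor) through 5 (critical)
    value_at_risk: float  # estimated financial exposure

def triage(event, auto_threshold=2, escalation_value=100_000):
    """Route an incoming event to one of three handling tiers."""
    if event.severity <= auto_threshold:
        return "auto"     # dispatched automatically by a standing business rule
    if event.value_at_risk >= escalation_value:
        return "enrich"   # marry with static customer/product/capacity data first
    return "review"       # display all inputs for a human decision
```

A real event-stream front end would apply rules like these continuously against the incoming feed, but the three possible outcomes mirror the split described above.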

Despite my initial misgivings, I am trying to keep an open mind on this topic, and will be closely watching its evolution and maturity.  Control towers may very well have their operational place in an organization, but I am skeptical that they will have a strategic role to play – more on that next time.


The future of shopping

“Within ten to fifteen years, the typical US mall, unless it is completely reinvented, will be a historical anachronism—a sixty-year aberration that no longer meets the public’s needs, the retailers’ needs, or the community’s needs.”  So proclaimed Rick Caruso, founder and CEO of Caruso Affiliated, a retail/commercial real estate development firm, at NRF earlier this year.

Is his concern properly placed?  How bad is it?

Bad.  So bad that there is even a web site called “Dead Malls”.  I counted over 400 listed in the U.S. alone.  In some cases the buildings have been converted into educational or other commercial use, in others the parking lot was saved but the mall demolished to be replaced by a big box store.  Many, many others are simply boarded up, closed off to the public, awaiting disposal.

Ecommerce of course accounts for a portion of the impact to the traditional mall, but even now, in the middle of the second decade of the 21st century,  online shopping represents only 6% of total retail commerce, growing at a rate slightly in excess of 15% a year. 

But the internet’s impact has been far greater than just that 6%; it has changed the entire shopping experience, birthing multichannel marketing and the omnichannel consumer.  I don’t think I can singlehandedly save the mall with the remainder of my 900 words today, but I do want to address in broad terms some potential scenarios.

  • Manufacturers as retailers.  If you make a consumer good, this transition is likely inevitable and has been underway for some time already. While you might like to hold out with your traditional distribution model for as long as you can, the demise of the mall may force your hand.  It won’t be pretty, neither the channel conflict nor your noticeable lack of retailing expertise as you attempt to sell direct to the consumer.  There will be no trading off the retailer's brand anymore– for better or for worse, it’s 100% your brand that matters.
  • Retailing reinvented.  As with my defense of Boeing’s Dreamliner outsourcing strategy, in which it outsourced substantial portions of the R&D effort (despite the initial complications, I believe the concept was, at base, sound, just poorly implemented, and that it will resurface again), I likewise defend Ron Johnson’s attempt to turn JC Penney around.  In retrospect, JC Penney was probably a poor target for this sort of transformation, but once again, I believe a similar approach will resurface more successfully with some stronger retailers, perhaps those not stuck with anchor stores in already failing malls.  Johnson saw the warning signs and attempted to reinvent the department store as a city street / square more in tune with the changing expectations of the consumer’s shopping experience.
  • Reinvented malls.  It is telling that Victor Gruen, the father of the enclosed mall, was appalled at what the mall became – stranded amidst acres and acres of blacktop parking lot, a fortress with an asphalt moat.  His vision then, and that of Rick Caruso today, was/is of a more integrated outdoor shopping experience, perhaps the mall as the high street, the social center of a community that includes housing, schools and libraries, even a medical center.  Such a reinvention likely entails the concomitant reinvention of your distribution strategy, with fewer big box and department-type stores and more specialty or category killers.
  • The triumph of the Big Box.  Then again, maybe the reinvented mall never catches on, and in its place rises the one-size-fits-all / carries-all distribution center: a showroom to test and compare, followed by anytime-online ordering, then back to the big box for pick-up.  Bleak, yes, but for all except the higher-end goods, who needs an “experience” when it usually comes down to price and convenience for the bulk of our consumer purchases?
  • Online, all the time.  Five years ago I would not have made this stark a prediction, but since then, the smartphone has changed everything.  Five years ago, if you had claimed that online shopping would come to dominate retail commerce, I would have raised as my first objection the fact that so many people still lacked the broadband internet capability at home necessary to make the transition.  But now the lack of a PC or laptop at home is no longer an obstacle – everyone has, or will have, a smartphone, and we’ll soon be wondering, ‘smarter than what?’  Everything will be smart – your sneakers, your sweatshirt, your refrigerator, your car, your community.  What will get reinvented is not the mall or even the shopping experience but the social experience as a whole.  Shopping?  Was that something people once did after they got the horses fed?

Which of these, or which combination, if any, will come to pass?  After my bracket-busting disaster in this year’s March Madness (I picked the ONLY twelfth seed not to win (I had no choice - all three of my kids go there), and NONE of the other twelfth seed upsets that did happen) I am loath to prognosticate further. But one thing that is certain is that no matter which scenario comes to dominate the retail space, change is on the way, and you are going to have to get closer to your customer.  You are going to have to know more about them, their changing buying and channel habits, and the type of shopping experience they prefer.  Customer analytics will come to drive your business strategy in recognition of the fact that it has always been the consumer that ultimately decides whether that business strategy is a success or a failure.


Agile strategy, Agile operations

With the increasing emphasis on responsiveness, resiliency, flexibility and agility, I suppose it was only a matter of time before the “agile” concept caught up with strategy itself.  While I may have hinted at this idea four years ago in two of my earliest posts for the Value Alley, “Strategy as a Hypothesis” and “Strategy as a Set of Options”, it would appear I was markedly too cautious compared to how strategy theory has matured.  This recent article / interview from Strategy and Business with Rita Gunther McGrath, professor at the Columbia Business School, says it all in its subtitle: “The era of sustainable competitive advantage is being replaced by an age of flexibility”.

Quoting from the article, “McGrath thinks it’s time for most companies to give up their quest to attain strategy’s holy grail: sustainable competitive advantage. Neither theory nor practice of strategy has kept pace with the realities of today’s relatively boundaryless and barrier-free markets. As a result, the traditional approach of building a business around a competitive advantage and then hunkering down to defend it and milk it for profits no longer makes sense.  This is the core argument in McGrath’s most recent book, The End of Competitive Advantage: How to Keep Your Strategy Moving as Fast as Your Business (Harvard Business Review Press, 2013).”

McGrath makes a number of key observations pertinent to the objective of this post:

  • Organizational structure becomes important, as you will need to organize for continual flexibility and change.
  • Diversification isn’t enough, because ALL of your businesses are continually subject to losing their competitive advantage.
  • Innovation (“Innovating for the Numerator”) becomes paramount, because there is no lasting competitive advantage in anything.
  • Resource allocation is a “powerful lever for shifting the center of gravity”.   [I am particularly partial to this last resource allocation argument - see my previous post here, “What’s a Budget For?” on what I think is the proper role and function of a “budget”.]

There are a number of connections between McGrath’s argument at the level of strategy and the more pedestrian issues we face operationally, where the trade literature is filled with talk of the flexible factory and the data-driven factory.  IDC’s Manufacturing Insights and Predictions for 2014, covering manufacturing, supply chain and PLM, features a strong emphasis on resiliency and responsiveness, driven in turn by the imperative to get closer to the customer.  Getting closer to the customer entails greater visibility both upstream into your supply chain and downstream into your distribution channels, at a finer level of granularity than was previously acceptable.

Such responsiveness is predicated on speed.  Speed requires accuracy, as in accurate data, quality data -  all of which is built on a foundation of data integration.

You can hardly emphasize it enough: speed requires integration, flexibility calls for integration, responsiveness depends on data integration.  (McGrath doesn’t address it in this brief interview, but as part of her focus on the ramifications for organizational structure mentioned above, I could see her also including an agile approach to IT as a key part of that structure.)

Looping back from agile operations to agile strategy, the obvious connection would seem to be that in order for strategy to be responsive to changing markets, it too will need to be built on a solid foundation of integrated data management, but in this case that data will include a significant external component of industry, government, market and third party data sources.  The internal component will require a strategy management layer to organize the changing business, link the objectives to relevant (and again, changing) metrics, and clearly communicate the current strategy to a workforce that is likely to be more confused and insecure than ever before.

Personally, I have not yet rendered judgment on McGrath’s thesis regarding the impending demise of competitive advantage – there is much here to digest and ponder, and I suspect I will peruse a copy of her book before I draw any firm conclusions.  My own current bias is that this will all play out more in line with what I discussed in “Analytics for the Value Disciplines”, that firms will still have opportunities for competitive advantage through focus and development of core competencies in a particular value discipline, that firms of the future will be successful by becoming agile experts within their chosen discipline – flexible low-cost producers, responsive customer relationship managers, or agile innovators.  But that’s why they call it a market – every idea and variation will undoubtedly be attempted, and the market will provide the criteria for success.


Announcing SAS for ‘Demand Signal Analytics’

From Gartner to IDC to the trade press, the watchwords in the supply chain for rest of this decade appear to be “resiliency” and “responsiveness”. It’s not going to be about promotion-based pull-through, and it’s most definitely not going to be about channel incentive-based push-through.  What it’s going to be about is demand sensing, and then effectively responding to those demand signals.

It is of course a cliché to talk about change and the increasing pace of change and how this or that role or function is changing, but the combination of globalization and the internet has in fact dramatically affected the pace of change in the consumer market over the past decade.  It’s no cliché to point out that replenishment-only-based supply chain strategies are a thing of the past.

Three significant factors come to mind when dissecting this increase in the rate of change.  The first is the rate of change on the supply side.  The speed of innovation has picked up.  New ideas sprout up more often and come to market more quickly than previously.  New competitors can enter the market more rapidly with fewer of the traditional barriers to entry impeding them.  The increasing adoption of 3D printing will only serve to accelerate this process further.

Secondly, the internet has provided both more supply channels and more consumer channels.  Consumers are doing more product and price research online, affecting not just traditional supply-chain-oriented segments of the economy like consumer goods, but also areas like health care delivery (confess – who hasn’t gone online to make sure that that funny feeling wasn’t a sign of cancer or some deadly, infectious disease?).  Consumers are arbitraging the bricks against the clicks, while your competitors’ products, and sometimes even your own, are appearing in the most varied of online channel outlets.

Lastly, consumer trends emerge, morph and spread faster than ever before, which is saying something.  While social media gets most of the credit/blame, other media are becoming all-pervasive too, with no escape from advertising anywhere.  Consider content-marketing-based mobile shopping: a text with an individualized 2-for-1 offer arriving precisely when you are standing in front of that product in aisle 10, enabled by the GPS on your smartphone.  Or product placements in television and the movies, advertising in our cash-strapped public schools, naming rights not just for stadiums or even players but for individual player activities (‘… and that Barry Bonds Blast was brought to you by …’).  How soon before we start seeing paid advertising on Google Glass?  (I would think it likely that it’s already part of the business model.)

If you are basing your forecast on shipment data alone, you don’t stand a chance.

Which is why the focus is increasingly shifting towards the analysis of early demand signals that can be translated back into production and supply chain actions that better sync up with a moving demand target that has lately found itself another gear.

Yesterday’s announcement of SAS for Demand Signal Analytics at the IBF Supply Chain Planning & Forecasting Conference in Scottsdale, AZ, is SAS’ response to your need to react faster to market changes.  Its foundation is a robust demand signal repository (DSR) upon which is layered the user-friendly analytical forecasting you are already familiar with from SAS, coupled with SAS® Visual Analytics, which in this offering incorporates the custom-built capabilities needed to address demand signal analysis.

Higher revenues and fewer stock-outs, close-outs and inventory write-downs come from better forecasting.  And better forecasting doesn’t come from doing what you’ve always done – relying solely on your own shipment data.  Better forecasting comes from utilizing downstream, closer-to-the-customer data: syndicated scanner data from your intermediate distribution channels and your retailers’ POS systems.  Building your operational plans on consumer buying behavior as it occurs allows you to search for those early but weak demand signals that portend a new trend, increased competition, or the immediate effects of promotional efforts.
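One generic way to surface an early-but-weak demand signal from noisy POS data is a CUSUM change detector, a standard statistical technique. The Python sketch below is purely illustrative (the data, slack, and threshold values are invented) and is not a description of the SAS offering:

```python
def cusum_shift(series, target, slack, threshold):
    """One-sided CUSUM: return the index at which an upward shift away from
    `target` is first flagged, or None if the series never drifts."""
    s = 0.0
    for i, x in enumerate(series):
        s = max(0.0, s + (x - target - slack))  # accumulate only persistent excess
        if s > threshold:
            return i
    return None

# Weekly POS units hover around 100, then quietly step up to ~104 in week 9:
pos = [100, 101, 99, 100, 100, 101, 99, 100, 104, 105, 103]
first_alert = cusum_shift(pos, target=100, slack=1.0, threshold=5.0)  # flags index 9
```

Shipment data, which lags consumer purchases by weeks, would show the same shift correspondingly later; that lag is the whole argument for downstream data.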

What if you could watch and analyze the effect of pricing or trade promotions as they occur at the retail level rather than waiting in arrears for quarter-end reports?  What if you could manage rather than just measure the effectiveness of your operational and supply chain planning? What if you could identify what really influences sales performance and recognize even subtle market shifts months earlier than if you had relied on after-the-fact shipment data alone?

Ultimately, a manufacturer with better vision into the true demand for its products can manage its suppliers and channels more efficiently and effectively. Lower operational, logistic and inventory costs, greater revenue from fewer missed opportunities, and happier consumers who got what they wanted, when and where they wanted it, are all part of the benefits of synching supply with demand by increasing your focus on the demand signal side of the equation.


Agile risk management – What might that look like?

I had the opportunity to moderate a roundtable discussion on risk management at the International Institute for Analytics’ (IIA) winter symposium in Orlando earlier this month.  I set the stage for the session with a brief overview of my favorite risk approach, “Competing on Value”, by Mack Hannan and Peter Karp, with their three-part framework of: How much, How soon and How certain.

As I describe in more detail in one of my early Value Alley posts (linked above), the basic framework is:

  • How much?  Nearly everyone can instinctively evaluate this first criterion.  How big is your paycheck, your bank balance, the sticker price on that car, how much is that doggy in the window?  Even a small child knows when his older brother took the bigger “half”.
  • How soon?  The domain of finance; the time value of money, discounted cash flow, internal rate of return, net present value, a dollar today is worth more than a dollar tomorrow.
  • How certain?  We know the price to the penny, and the IRR to three decimal places, but what about the uncertainty around those numbers?  The risk?
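The “How soon” criterion is just the time value of money; a short discounted cash flow function makes it concrete (the cash flows and the 10% rate below are illustrative only):

```python
def npv(rate, cash_flows):
    """Net present value: cash_flows[0] occurs today, cash_flows[t] at end of year t."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cash_flows))

# A $1,000 outlay returning $600 in each of the next two years, discounted at 10%,
# is worth roughly +$41 today, so it clears the discount rate:
project_value = npv(0.10, [-1000, 600, 600])
```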

Or as I put it in my follow-up post (“How Certain is that Number in the Window”), while you may know that the expected IRR is 39%, is that 39% sitting on top of a gentle hill, so that if some disturbance comes along it only gets knocked down a dozen points or so, still well above your 20% hurdle rate?  Or is it sitting precariously at 28,000 feet atop K2, such that any deviation from plan causes it to avalanche downwards into single-digit or negative territory?

Lastly, you need a risk management and mitigation plan with which to evaluate and monitor the investment decision.  Here I shared some ideas from the third and last of this series – “Making the “How Certain” Decision”.  My own approach involved evaluating the uncertainty around five key variables: cost and revenue cash flows, margins, working capital and time-to-market.  Looking back at my initial attempts, how I wish I had had some of SAS’ risk analytic capabilities at my disposal, such as sound statistical cash flow distributions, confidence intervals, and Monte Carlo simulation – not available via spreadsheets but a piece of cake for SAS analytics.
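To illustrate the “How certain” dimension, here is a bare-bones Monte Carlo sketch in Python. The base cash flows, the 15% volatility assumption, and the normal distribution are all invented for the example; a production tool would fit distributions to data rather than assume them:

```python
import random
import statistics

def simulate_npv(base_flows, rate=0.10, volatility=0.15, n_trials=10_000, seed=42):
    """Perturb each inflow by a random factor and collect the resulting NPV distribution."""
    random.seed(seed)
    results = []
    for _ in range(n_trials):
        flows = [base_flows[0]] + [cf * random.gauss(1.0, volatility) for cf in base_flows[1:]]
        results.append(sum(cf / (1 + rate) ** t for t, cf in enumerate(flows)))
    return results

npvs = sorted(simulate_npv([-1000, 400, 450, 500]))
low, high = npvs[250], npvs[-251]   # a rough 95% confidence interval on NPV
```

The gentle-hill-or-K2 question is answered by the width of that interval, not by the point estimate.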

With the stage now set, the roundtable discussion began in earnest.  The first point to become clear was confirmation of the obvious: very few companies are actually engaging in even this minimal level of operational risk analysis and management.  Outside of credit risk, and outside of the market risk analysis that dominates the financial services industry, adequate operational risk management seems to be confined to a relative minority of the Fortune 1000, perhaps 20% tops.  It’s not just that the risk / uncertainty element is missing on the front end; the basic follow-up and post-mortems aren’t there either.  There is little tracking and monitoring of even the How much and How soon components, no follow-up to see if the actual outcome was anywhere close to the 39% promise.

When we talk about swinging the finance function from being 80/20 transaction weighted to an 80/20 analysis focus, risk management and investment post-mortems would be one of my top priorities for that new-found, hard-earned free time.

But the topic that really captured our attention for the remainder of the session was: what role does risk management play in an agile environment?  What might it look like?  How might it be different from today’s risk management approach, if at all?

The group consensus was that “agile” does in fact change the game.  In today’s environment, it is highly unlikely that your typical two-to-four year new product / new market / IT system initiative/project/investment will end up looking much like it was originally drawn up. Therefore, does every project get re-planned as the business actively responds to changes in the market with changes at the strategic level?  If you were to conduct a post-mortem, what should the final success measurement criteria be – the original, or the last version left standing?

My own opinion is that ‘agile’ translated into ‘risk’ means scenario planning.  Just as I discussed scenario planning at the enterprise level in this post (“Rolling forecasts, or Who ordered that?”), with its Plan B, Plan C and Plan V (for volatility), if a project is large enough, important enough, or risky enough to warrant a full-blown, quantitative business plan and analysis, then that project plan needs to incorporate optimistic and pessimistic, best case and worst case scenarios (at a minimum) as part of its business plan.  When the enterprise as a whole moves from Plan A to Plan B, the projects move from their respective Plan A’ to Plan B’.
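In code terms, the idea is simply that each sufficiently large project maintains parallel plans keyed to the enterprise-level scenario in force. The plan labels and numbers below are invented for illustration:

```python
# Hypothetical project scenarios, keyed to the enterprise plan in force.
project_scenarios = {
    "A": {"capex": 1_000_000, "year1_revenue": 600_000},  # base plan
    "B": {"capex":   800_000, "year1_revenue": 450_000},  # contingency: scaled back
    "V": {"capex":   500_000, "year1_revenue": 250_000},  # volatility: minimum viable
}

def active_project_plan(enterprise_plan):
    """When the enterprise moves from Plan A to Plan B, the project moves to B' in lockstep."""
    if enterprise_plan not in project_scenarios:
        # No prepared scenario: the project gets revisited and re-analyzed.
        raise LookupError(f"no project scenario for enterprise plan {enterprise_plan!r}")
    return project_scenarios[enterprise_plan]
```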

And if the business shifts its strategy sufficiently such that none of the project scenarios are applicable, then yes, I think the project and the investment decision get revisited and re-analyzed.

But I don’t think this approach mollified the group’s concerns.  Can you even be agile with these relatively cumbersome risk management processes in place?  The consensus seemed to be that these operational risk management processes as currently construed cannot survive.  The risk management processes themselves need to become agile.

But at this point the conversation stalled.  What an agile risk management process might look like was beyond the scope of our mere sixty minutes, but hopefully not beyond the experience and insight of my audience.  I’d like to hear from you in the COMMENTS section:  What do you think about this relatively recent intersection of agile and operational risk management?  Can they be made complementary?  Can we expect them to play well together, or does one of them have to go?  How can we make risk management as agile as the rest of the business?


Analytics for integrating the Value Disciplines

For the B2C business of the future, catering broadly to the middle class consumer, there will be nowhere to hide.  Such a business will have to compete on all three Value Disciplines simultaneously:  Customer intimacy, Low-cost producer, and Innovation.  Or, perhaps not so much the single “business”, but the entire supply chain / value chain / business network.  This value chain of the future will bring all three of these value disciplines together to the market.

For starters, low cost will be a given, achieved as it is today either through low-wage labor arbitrage, or automation/productivity, or a combination of the two.  Your products will move through the product lifecycle into the majority and laggard phases faster than you can say, “What happened to my premium pricing strategy?”  Copycats and secondary entrants to your market will be basing their entire strategy on underpricing you, the market leader.  Getting cost out of the product will not be a project for a later date – even for the most innovative of new products, low production cost will be built in from the start.

What this means is that the production aspect will become specialized around low cost.  What business are you in?  I’m a low-cost producer.  A low-cost producer of what?  Anything and everything!  We make stuff to spec (i.e., to quality standards) as cheaply as possible.  We don’t design, we don’t sell – we manufacture.

The next value discipline, the innovators, specializing in the R&D that creates and markets new ideas, likely won’t be able to compete effectively in this low-cost production phase, won’t be building any of this cool stuff themselves. Competing with the rest of the Innovator discipline world on feature, function, ergonomics, design, novelty, design for manufacturability, and design for serviceability will take everything they’ve got when it comes to innovation. The innovators may be the most likely value chain segment to eventually play the role of value chain integrator, but that’s not a given.

On the other end of this pipeline are the Customer Intimacy specialists, the retailers and the rest of the distribution channel.  They have always been less vertically integrated than the innovators and producers, and therefore have been comfortable as specialists in their customer relationship role for some time now.  It is in this customer facing arena that they will have to sharpen their skills and tools even further, because every retailer will have access to the same products, all at the same low cost.  Customer service may mean different things to different sub-segments of the consumer market, but within each sub-segment the customer service will have to be optimal as appropriate.  There will be no substitute for knowing your customer and then executing on meeting their needs.

The future business environment won’t be one vertically integrated business choosing a single value discipline as its focus, but instead will consist of three separate businesses, even within the same “holding company”, focusing on their respective value disciplines.  While the innovator is the most likely actor to form the “holding company” for the follow-on production and distribution activities (if only to protect and control their brand), I think it equally likely that value/network integrator businesses (“The Value is in the Network”) will emerge to coordinate the separate value discipline actors.  This would be especially true when it involves secondary innovator market entrants, where protecting a market leading brand image is of less importance.

The conundrum I highlighted in this earlier post (“Hybrid Strategy Management”) is that it just won’t be good enough to build a sustainable business strategy around a primary focus on only one of the three value disciplines.  The consumer has become too demanding – on cost, on quality, on service, on functionality.  And it will be the rare vertically integrated business that can excel at all three of these disciplines simultaneously; the disciplines differ culturally in significant ways.  But they can and will be separately integrated into a successful value network.  Such success will not come from simple variations on old themes, but from a more disciplined, analytic and data-driven approach to managing the value chain.  What specific components will each actor need to leverage?

The Low Cost Producer will need to:

The Innovator will need to:

The Retailer/distributor will need to:

  • Understand their customer segments and individual customer needs
  • Effectively and efficiently target market each segment and individual
  • React to customer needs with the ‘next best offer’
  • Monitor customer satisfaction and the customer relationship

The Integrator will need:

The challenge is daunting.  While I don’t think this assessment applies at the very low and very high ends of the market, I do think this is the upcoming business reality for the growing, global middle class consumer market. Even if your industry segment has no players operating at this integrated three-discipline level yet, you can certainly already feel the niche specialists nipping at your heels.  You’ve got programs and strategic initiatives in place to move forward with your chosen value specialty while also attempting to meet the entire spectrum of balanced business needs, but still you feel as if you will never get out of fire-fighting mode, never get out of being merely reactive to the competitive threats of the other market players.  It’s like Whack-A-Mole out there, and that’s not an illusion – it’s because every single aspect and function of your business is competing against a niche specialist.


Are you smarter than your sweatshirt?

Product as a service platform.  Design for service, not just for serviceability, not just for manufacturing and maintenance.  It’s the new trend in manufacturing that neatly ties together the product lifecycle with innovation.

What is your response to the commodification of your product, to its inevitable journey over the top of the product lifecycle and onto the late majority/laggard downslope?  There are several knee-jerk responses available, depending on how many knees you have.  The most obvious response is to join the low-cost producer battle.  Drive costs out of your product design, out of your manufacturing processes, out of your supply chain.

You likely knew this was coming, though, from even before you launched your beta version to the trial market.  But unless your overarching business model is in fact low-cost production/operational efficiency, this response will have two primary drawbacks.  First, if product innovation, quality, functionality or customer service is your chosen value discipline, marketing to the low-end/low-cost segment of the market likely hurts your brand image.  You can’t be all things to all consumers.  ‘Luxury subcompact’ is pretty close to being an oxymoron and can’t help but detract from your quality message.  Second, while it might help you avoid losing, it’s not a winning game for your chosen business model; it’s not differentiating.  The best you can hope for is “me too” among a host of other low-cost copycats.  The imitation may be flattering, but it's also costly.

With your other knee, you could double down on quality and functionality.  Not a bad reaction in many cases.  As I discussed in this previous post (“Innovating for the Numerator”), this is the second of Clayton Christensen’s innovation categories, ‘sustaining innovation’ (the first category, alluded to in the previous paragraph, being ‘innovation for efficiency’): going from ‘good’ to ‘better’, the product extended with new features, functions and improved performance and quality.

This approach has the benefit of protecting your brand reputation, and as I pointed out in “Coolhunting”, it also matches the quality expectations of the late majority.  It does little, however, to directly address the cost issue, that kicked-in-the-teeth feeling you get when one of your long term customers invites you to participate in a reverse auction (as I previously discussed in “Hybrid Strategy Management”).  Furthermore, higher quality and increased functionality have become part of the same “Dealing with Darwin” phenomenon as low cost production - every product has its own Moore’s Law expectation, just with varied exponents for the doubling ratio.  Unless quality is the foundation for a conscious customer-intimacy value discipline, it can also end up as just another “me too” strategy.

Which leads to a third approach – the product as a service platform. As with my primary thesis, this service theme can likewise be broken down into three basic components.

The first approach to baking services into the product would be with up-front implementation services.  I won’t dwell on this too much, as it seems to ignore if not exacerbate the fundamental reason we got into this mess in the first place – product commodification and cost.  But properly done in conjunction with a quality or customer service focus, it could serve as a profitable differentiator.  There are any number of products that I can find cheaper on the internet, but without a plumber’s or electrician’s license I’d be hard pressed to install them successfully.

The second approach would be after-market services.  For many companies this is already the driving business model, where after-market services drive 30-40% of the revenue and 40-50% of the profit.  Sectors such as aerospace or heavy industrial equipment can experience a 3X after-market revenue factor, where the expensive airframe or equipment frame outlasts the almost-as-expensive engines or hydraulics by several multiples.

But the approach that is getting all the buzz is neither the “before” nor the “after” market, but the ongoing services: the realm of the Internet of Things (IoT), of sensor data, connected devices and smart devices.  The feature/functionality/quality of the product/device becomes secondary to “What services can I get on this thing?”, right now and in the future.  Smart phones and mobile devices are just the beginning of what is likely to become the IoT decade.

There is already technology available that distinguishes the separate electricity profiles of each connected electrical device in your home, which of course can be harnessed for both lower energy costs and improved living convenience.  Who knew your electric meter could be such a smarty pants?
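The core idea behind those device-level electricity profiles can be sketched in a few lines: match step changes in the home's aggregate power reading against known appliance signatures.  This is a deliberately naive illustration, not any vendor's actual algorithm; the appliance names and wattages are invented for the example.

```python
# Illustrative sketch of naive load disaggregation: attribute each jump
# in aggregate power to the appliance whose wattage best matches it.

# Hypothetical appliance signatures, in watts
SIGNATURES = {"refrigerator": 150, "microwave": 1100, "dryer": 3000}

def detect_events(readings, tolerance=50):
    """Match each significant jump in aggregate power to the closest signature."""
    events = []
    for prev, curr in zip(readings, readings[1:]):
        delta = curr - prev
        if abs(delta) < tolerance:
            continue  # small fluctuation: treat as noise, no appliance switched
        # find the appliance whose wattage best matches the size of the jump
        name, watts = min(SIGNATURES.items(), key=lambda kv: abs(kv[1] - abs(delta)))
        if abs(watts - abs(delta)) <= tolerance:
            events.append((name, "on" if delta > 0 else "off"))
    return events

readings = [200, 200, 1300, 1300, 200, 3250]
print(detect_events(readings))
# → [('microwave', 'on'), ('microwave', 'off'), ('dryer', 'on')]
```

Real systems work with much richer features than a single wattage number (harmonics, transient shapes, time of day), but the principle is the same: the aggregate signal carries enough structure to identify the individual devices behind it.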

While the technological aspects of smart devices expand exponentially, the business models are playing catch-up, just as they did during the growth of the internet.  “What is your internet strategy?” is being replaced with “What is your smart device strategy?”  What is your product-as-a-service-platform strategy?  We can create the value, but how do we monetize it?  And just as the explosion of the internet was interrupted by the dot-com bust, I fully expect similar false starts when it comes to building sustainable business models around the IoT.

On the other hand, I don’t doubt that someday in the not too far future your facial tissue will be able to indicate if what you have is merely a cold, the flu, or strep.  I have joked, with some pride in my children, that I am the fifth smartest in a family of five, but now with the advent of smart devices and textiles I fear falling to sixth, seventh, or even further.  My fallback plan was to get a dog, but I’ve been warned that beagles can be pretty smart too, so it might come down to a hamster or some goldfish.


Analytics for your varied team member styles

This being my 100th Value Alley blog post, I thought I’d focus on the general subject matter that seems to have generated the most interest over these past four years: culture, strategy and communications - the “soft” issues that turn out to be harder than one might initially think.

Constructing or selecting a team is not the same as team building. The latter focuses on team cohesion and cooperation, whereas the former, by definition, precedes this exercise in camaraderie.

An effective team requires a balance of skills and team member styles.  The problem with most departmental teams, and even executive teams, is that certain team member styles tend to be over-represented in particular functions.  You end up with nearly everyone in the team exhibiting one particular style and therefore competing with each other for that one team member role, while other styles and roles go begging.

A model I was introduced to many years ago delineates eight distinct team member styles:


  • SHAPER – A task leader who makes things happen and brings drive to the team.
  • INNOVATOR – The imaginative, creative brainstormer who brings ideas to the team (Yellow Hat).
  • RESOURCER – This person, like Morgan Freeman’s character in “Shawshank Redemption” (or Radar O’Reilly from MASH), gets things and can improvise when necessary.
  • COORDINATOR – Leads through respect, is focused on goals and defining roles.
  • MONITOR – Keeps the team on track, focused on metrics and risks, and offers critical analysis (Blue Hat).
  • IMPLEMENTER – Results oriented.  Turns goals and strategies into actionable, manageable tasks.
  • COMPLETER/FINISHER – Worries about the details, sees things through.
  • HARMONIZER – A good listener who builds on the ideas of others and promotes team harmony.

I have been on many finance department teams and can state with certainty that we have more than our share of Completer/Finishers, as you would expect from a profession that balances the books to the penny every 30 days (a more impressive task back in the days before automated intercompany transactions, when accounting systems permitted one-sided entries and the Trial Balance could actually be out of balance, hence its name).  On the other hand, we likely have a dearth of Innovators and a relative shortage of Resourcers and Harmonizers.

You can imagine other stereotypical functions, such as engineering, QA or HR, having an overabundance of Innovators, Monitors or Harmonizers respectively, but coming up short on Completers, Resourcers or Shapers.

Cross-functional teams tend to provide a ready-made solution to this conundrum, with built-in team style diversity coming naturally as part of the cross-functional package.  Within a department or function, however, the problem can be a bit more difficult to resolve unless management has been wise enough to hire for team style diversity to begin with.  This of course is not easy to do, as the primary hiring objective is ‘fit for role or task’, followed typically by conformity with the prevailing team style so that new hires will “fit in” with the functional culture.

A team member style assessment is a good place to start: discover everyone’s predominant primary (and secondary/tertiary) team styles, find out where the gaps are, and identify those rare departmental members who can fill them.
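The gap-finding step of such an assessment is simple enough to sketch.  The following is a hypothetical illustration (the team roster is invented, and real assessments score secondary and tertiary strengths too, not just a single primary style):

```python
# Tally a team's primary styles from assessment results and flag
# the roles that go begging (no member claims them as a strength).
from collections import Counter

ALL_STYLES = ["Shaper", "Innovator", "Resourcer", "Coordinator",
              "Monitor", "Implementer", "Completer/Finisher", "Harmonizer"]

def find_gaps(primary_styles):
    """Return the styles no team member claims as a primary strength."""
    counts = Counter(primary_styles)
    return [s for s in ALL_STYLES if counts[s] == 0]

# A stereotypical finance team, heavy on Completer/Finishers
team = ["Completer/Finisher", "Completer/Finisher", "Implementer",
        "Completer/Finisher", "Monitor"]
print(find_gaps(team))
# → ['Shaper', 'Innovator', 'Resourcer', 'Coordinator', 'Harmonizer']
```

The output is the list of under-represented roles the manager would then try to fill, ideally with members whose secondary scores make them flexible enough to step in.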

One approach that has been available to me personally has been flexibility.  I can’t count the dozens of Myers-Briggs and similar assessments that have been done on me over the years.  Some have additionally come with “Z scores”, measures of adaptability (where I tend to score high) in addition to the X and Y scores that place you in your proper box on the 2x2 grid.  I remember one proprietary classification that tended to score 90% of the people somewhere along an upwardly sloping 45 degree line, with the rest scattered about the grid, but when the counsellor came to plotting my point among the others, he paused for a moment, then drew the orbit of Jupiter around the entire matrix, which he said represented my extreme flexibility.

My own team member style profile of course ranked me high as a Completer/Finisher along with my fellow financiers (with Implementer as a secondary strength – scores of 8-10 on a 10 point scale), but unlike the typical profile, I did not drop off into the 2’s and 3’s for everything else, but profiled with 4’s, 5’s and 6’s for the other style attributes.  So when I find myself on a team, instead of rushing to volunteer for a task utilizing my strength(s), for which there are usually several others just like me, I often wait to discern if there are any gaps, any missing skills, styles or roles on our team, and will instead volunteer to fill that shorthanded role in support of a better functioning, more effective team outcome.

An effective manager or team leader needs to recognize this would NOT likely be the normal response of most team members - the path of least resistance and most security is to migrate toward your strengths, and it will likely take some attention and persuasion to get people to volunteer to fill those open but crucial team roles.

As you can imagine, different team member styles, just like different Myers-Briggs personality types, acquire and utilize information in different ways, which is one of the primary drivers behind SAS’ structuring of our Visual Analytics offering: a highly visual, intuitive and robust BI platform that the entire team (or enterprise) can use for consistency and communication, but with plenty of flexibility and variety in its reporting and analytics to suit anyone’s style or type.

Whereas the Innovators might like to use it as a visual sandbox in order to “tell me something I don’t know”, Monitors can apply it as a metrics/KPI monitoring tool, Resourcers for its drill-down capability, Coordinators for its promise of consistent, enterprise-wide communication, and Completers as the corporate source for “one version of the truth”.

And if you are flexible and adaptable, why not try out the entire range of capabilities?  Analytics - Visual Analytics - for the non-data scientist; analytics for everyone, no matter your style, score, or label on the 2x2 management matrix.


Exit strategies aren’t just for entrepreneurs

A standard joke among venture capitalists is that often the only well-developed section of the many business proposals they receive is the exit strategy.  “We’re going to be acquired for $10-20 million by a Fortune 500 company within 6-8 years.”  Exactly how they are going to achieve that valuation is, however, often a bit sketchy to say the least, with a lot riding on a vaguely defined expansion into emerging markets once they’ve finished granting exclusive distribution rights in the domestic sphere.

I wrote about exit strategies at the product level in this post from last year, “Know when to hold ‘em, Know when to fold ‘em”, where I covered in some detail the issues of:

  • "Selling high” rather than getting caught holding too long, and ending up with a “buy low / sell low” zombie-product outcome.
  • Product retirements and the best owner of your assets.
  • Geoffrey Moore’s “Dealing with Darwin” - managing a product through its lifecycle, especially the final “offload” phase, and how great companies make this part look easy.

The ability to avoid hanging on to a declining product past its sell-by date depends largely on up-front planning and risk management that identifies trigger points and actions with the same rigor and discipline with which the initial investment decision was made.

But for this post I want to get above the product level and into the strategic realm of business models that in today’s rapidly changing economy can become as obsolete as any product.

[Undated photo, right: U.S. Steel Corporation's Fairless Works on the Delaware River]

A discussion I had last week with a journalist about analytics in the metals and mining industries brought personal memories flooding back: memories of my father’s 26 years at U.S. Steel’s Fairless Works on the Delaware River just south of Trenton, New Jersey; 26 years that might have been 40 had the entire industry not collapsed in this country.

I remember the business articles of the 1970’s and 80’s discussing America’s economic nemesis of the time, Japan’s steel and auto industry.  Japan – a mountainous country that owes its existence to the Eurasian-Pacific plate subduction zone on which it rests – has little iron or coal of its own to speak of.  And yet against all odds it became an export powerhouse in these markets.

Consider the absurdity of the situation.  Japan had to import nearly all of its coal and iron ore from the U.S. and Canada, ship it across the Pacific Ocean, process it, turn it into automobiles, and then ship it back, and still manage to sell cars well below standard American prices at the time.  How could this have even been possible?

Three words:  the ‘basic oxygen furnace’.

The basic oxygen furnace is a technology invented in 1948 and commercialized in the early 50’s.  It can produce a 350 ton charge of steel in under an hour, compared with eight to twelve hours for a comparable ‘heat’ from the already established open hearth technology.  Rebuilding following the war, Japan was quick to adopt this new technology in the late 1950’s.

The big American steel manufacturers, U.S. and Bethlehem Steel, however, faced an economic conundrum.  Most of America’s steel-making capacity was put in place during the war, all of it open hearth, basic oxygen not coming on the scene until after the war.  The problem the U.S. faced after the war was excess capacity (no further need for massive quantities of steel for tanks and shipbuilding) and excess labor, with 12 million soldiers returning home from the Atlantic and Pacific theaters.  Pent up demand did significantly stimulate the post-war economy, but one thing the country most definitely did not need was additional steel-making capacity.

So the United States was quite late to the basic oxygen game.  Why would a steel executive in his right mind choose to scrap and replace already paid for, perfectly good open hearth furnaces, with an operational life of at least 40-50 years, after just ten years, with new capital investment?  More money for even more excess capacity?  You’d be nuts to do it, right?

But as it turned out, it was nuts not to. It was like the fable of the boiling frog - too comfortable in the slowly warming water to bother to jump out until it was too late.  Likewise, the competitive impact of basic oxygen was initially barely perceptible, yet ultimately fatal.  When U.S. Steel finally closed the hot end of Fairless Works in 1991 and laid my father off, it still operated with the same nine open hearth furnaces that it had started with.  It wasn’t labor costs that did in the U.S. steel industry, it was the 10-15X productivity advantage of basic oxygen (and additionally, electric arc) over open hearth.
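The arithmetic behind that productivity gap follows directly from the cycle times cited above.  A back-of-the-envelope comparison, assuming for illustration a comparable 350-ton heat for both furnace types and a representative mid-range open hearth cycle:

```python
# Back-of-the-envelope per-furnace throughput comparison.
charge_tons = 350          # tons of steel per heat, both furnace types
bof_hours = 1.0            # basic oxygen: under an hour per heat
open_hearth_hours = 10.0   # open hearth: eight to twelve hours per heat

bof_rate = charge_tons / bof_hours                # tons per hour
open_hearth_rate = charge_tons / open_hearth_hours
print(round(bof_rate / open_hearth_rate))         # → 10
```

The raw cycle-time ratio alone yields roughly 8-12x per furnace; labor, fuel and refractory economics pushed the overall advantage toward the 10-15X range cited above.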

I noted in a recent post (“Innovating for the Numerator”) and in a more distant one (“Surfing the Disturbance”) how disruptive business models can be just that: disruptive (think of Kodak and Polaroid in film, and what Apple's iTunes did to the music industry).  Right now you likely have some competitors being deliberately disruptive, betting that you won’t react because of paralysis, the status quo and sunk costs, and other competitors employing disruptive approaches simply as the leap-frogging path of least resistance, just like the post-war Japanese steel industry.

While the need for product-level exit strategies is imperative, and whereas an enterprise exit strategy is optional (your strategy may of course be to run your business as a going concern), the need to consider a ‘business model’ exit strategy is becoming more important as the pace of economic, social and technological change continues to accelerate.  Long term viability as a going concern means agility if it means anything, even at the fundamental level of the basic business model.

A business model is just a specific strategy for how to create and extract value, and like a product that no longer fulfills a market need, business models can also outlive their usefulness as market factors change and evolve.  The problem is often that a company's business model can become so familiar as to be almost unrecognizable as a separate aspect of strategy, as unnoticed as the air we breathe.

The average tenure of a company on the S&P 500 was under 20 years in 2010, half of what it was 30 years ago and a third of what it was 50 years ago.  It's not necessarily that these companies disappeared entirely, but the current prediction is that by 2027 the pace of change will be such that 75% of today's S&P 500 companies will have exited the list, some voluntarily, with a business plan to gracefully and profitably manage the transition, and others involuntarily and not so gracefully or profitably.



The data-driven factory, and economy, of the future

“The factory of the future will have only two employees, a man and a dog. The man will be there to feed the dog. The dog will be there to keep the man from touching the equipment.”  ~ Warren G. Bennis

The promise of digital manufacturing is drawing closer to fulfillment as many of its separate components simultaneously evolve and mature.  It’s not just about CAD/CAM or PLM any longer but now includes innovations in 3D printing, robotics, M2M sensor data and 3D simulation.

You can see these changes in how industrial engineering is now taught at university, where it’s often called ‘Industrial and Systems Engineering’.  The core production engineering components (facility location, scheduling, inventory, layout, material handling, warehousing, and logistics) are now just a subset of a wider program that includes manufacturing systems (CAD/CAM, CIM, robotics, automation, mechatronics and concurrent engineering), systems optimization (entropy optimization, linear and nonlinear programming, queuing theory, computer simulation and fuzzy logic) as well as industrial ergonomics, biomechanics, cognitive ergonomics and biomedical applications in bioinformatics and physiological signal processing.  A young graduate who took required courses in industrial database applications is much more likely to find themselves working on the IT aspects of an ERP or CAD/CAM/CIM implementation than designing an assembly line.

A quick run-down of some of these cutting-edge applications and their payoffs:

  • The Flexible Factory and predictive asset analytics:  Sensor data and analytics combine with digital manufacturing to anticipate equipment failure on the factory floor before it happens, and then reroute the workflow around the potentially defective machine to keep production flowing and avoid the consequences of unplanned downtime.
  • Rapid 3D printing of tooling:  Most of the focus of 3D printing has been on rapid prototyping and the benefits of a faster time-to-market for the initial design.  But the impact on what it might do for tooling has been overlooked.  Product life cycles are shorter, consumer demand is more fickle and the odds of redesign are high, all of which shorten the useful life of custom tooling.  In many cases companies will opt to continue manufacturing the product without change (and with continued defects or shortcomings) because the cost and time of standard retooling outweigh the advantages of redesign.  Direct digital manufacturing / 3D printing provides the freedom to economically redesign for quality or feature/functionality.
  • 3D Factory simulation:  3D isn’t just for printing - 3D simulation is becoming the norm for larger manufacturers.  With simulation powered by high performance computing (HPC), commissioning becomes quicker and more reliable.  With predictive simulation, every material, design, eventuality and variable is analyzed and assessed for potential outcomes.  The result is innovation at the process level, bringing new products to market in months rather than years.
  • The Cloud:  The flexible factory isn’t just about robotics and automation.  The concept of flexible plug-and-play factories, where manufacturing operations can be custom configured for production of in-house or third party components, is enabled by supply chain and production management, scheduling and even ERP solutions shared and run from the cloud.  Simulation combined with cloud deployments allows IT to get out ahead and be ready to commence production the moment the design is finalized.  Imagine connecting a new plant to headquarters in just a few weeks, rather than having to go through a yearlong on-premise ERP implementation. The cloud makes it possible, and manufacturers setting up shop in emerging economies will make the most of it.
  • Quality analytics and simulation:  Combined to support six-sigma and lean initiatives, providing a graphical environment to analyze dimensional variation, identify the root cause(s), and quickly reconfigure the process to eliminate the defective element before it becomes a field maintenance issue or affects your brand's reputation.
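The predictive asset analytics idea in the first bullet can be illustrated with a deliberately tiny sketch: a rolling-average drift check on a machine's vibration sensor.  Real implementations use far richer statistical and machine learning models; the readings, window and threshold here are invented for illustration.

```python
# Toy predictive-maintenance check: flag a machine when its recent
# vibration readings drift well above its historical baseline.

def needs_inspection(readings, window=5, threshold=1.25):
    """True if the mean of the last `window` readings exceeds
    the baseline mean by more than `threshold` times."""
    if len(readings) < 2 * window:
        return False  # not enough history to judge
    baseline = sum(readings[:-window]) / len(readings[:-window])
    recent = sum(readings[-window:]) / window
    return recent > baseline * threshold

healthy = [1.0, 1.1, 0.9, 1.0, 1.05, 1.0, 0.95, 1.1, 1.0, 1.0]
failing = healthy[:5] + [1.4, 1.5, 1.6, 1.7, 1.8]
print(needs_inspection(healthy), needs_inspection(failing))
# → False True
```

When the check fires, the digital-manufacturing layer would reroute work around the suspect machine and schedule an inspection, rather than waiting for unplanned downtime.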

The most interesting aspect of the drive to digital manufacturing is the effect it will have on manufacturing business models that go digital themselves.  Design and simulation can be managed centrally, allocated out for production to the specific flexible factory that has current available capacity.  Or dispersed and allocated for production closer to the customer in smaller factories that trade off shorter production runs for lower materials and logistics costs and greater market responsiveness.

Digital manufacturing will transform our understanding of what a manufacturer is.  Imagine a world in which anyone can make a discrete part just by putting raw materials into a 3D printer, laser cutter or CNC machine.  A world where manufacturing will increasingly be about creating and selling digital code.  The manufacturing industry may in time come to resemble the music industry in this way.  Rather than centrally manufacturing, shipping and warehousing a part, your machinist of the near future will simply purchase and download the blueprint and ‘print’ or machine it locally.

In all of these scenarios, information is the driver.  And as with any modern innovation project, data management, data integration and data quality will be at its heart and foundation.  The factory of the future will run on data to the same extent that today it runs on materials and labor.  The long-standing economic model includes just the standard four factors of production we learned in Econ 101: land (raw materials), labor, capital and entrepreneurship (management).  The economy of the future will include a fifth factor: information.  Information by itself will come to be seen as a peer, not merely a subset, of capital and labor.
