Customer Relations by Walking Around

Perhaps nowhere is the saying “time is money” more true than in the construction industry.  There is no better indicator of project cost and budget over/underrun than the number of days on-site.  Reducing that number has a near 1:1 relationship with cost cutting, so it’s no wonder that days on-site is the most watched project metric.

Further complicating matters, the construction industry is well behind the 3D adoption curve.  It still relies primarily on 2D blueprints when most other manufacturers have long since moved to 3D CAD/CAM design and production systems, despite the obvious benefits of applying 3D systems to the construction of 3D physical structures.

Stepping into this breach is Nancy Novak, Vice President of Operations for Balfour Beatty Construction Services and a speaker at the IE Group's Manufacturing Analytics Summit earlier this year.  Nancy specializes in applying off-site manufacturing (OSM) techniques to large commercial and industrial projects – one of the most innovative processes to emerge recently in this industry.

Or maybe I shouldn’t call it a process, since OSM’s intent is to productize the construction industry – to allow it to standardize and reap the benefits of common manufacturing techniques and processes that have been around for decades.  The benefits of OSM include:

  • Faster – Fewer days on-site with a more predictable schedule (i.e. lower cost)
  • Safer – Less on-site labor and better site logistics
  • Better – Higher quality with a more predictable product

This is the story that Nancy brings to her potential clients.  With each project, she explores with the construction team the possibilities for modular systems that can be assembled off-site and then integrated into the larger structure on-site, whole and in working order, such as bathrooms, kitchen facilities, elevators and staircases, office space, HVAC, interior and exterior walls, and even entire living suites for apartment complexes.

Perhaps the most surprising aspect of her work is how often the client informs her that she is the first person who ever proposed such an approach to them, how often she is the first person to suggest that they take a walk through a current project to assess what improvements might be able to be incorporated into the next one.  Not so much management-by-walking-around as sales, or customer relationships, by walking around.

This is an easy lesson to apply to your own highly competitive manufacturing business.  Are you tired of the price wars?  Are you looking for a differentiator other than features, functions and performance in a largely mature market?  Are you interested in taking need-based, consultative selling to the next logical level?  Then instead of making the focus of your next customer visit your own products and services, simply ask for the opportunity to walk around their business environment and ask “what if?”

Many of your customers will of course have well-defined problems with straightforward solutions, where the only obstacle is budget, but it’s more likely that their needs and problems are much more nebulous or even completely hidden.  As Henry Ford once famously quipped, “If I had asked people what they wanted, they would have said faster horses.”  Often they are looking for you to be the expert – or, as we often say here in the world of SAS analytics, to “tell me something I don’t know”.  To get to the answer, first you need the insight.

I couldn’t possibly list here all the insights you and your customer might uncover, but just to give you a flavor for the types of questions to ask:

  • What can we do regarding custom packaging / logistics that would better suit how you use our product?
  • What services might better be provided on-site or mid-stream rather than all before or after delivery?
  • What if we could manufacture the product in multiple components (or singularly) for easier installation / service?
  • What integration could we be doing with your other suppliers before our product ever gets to you?

If all goes well, this inevitably leads to a discussion around where BOTH parties are making changes to their products and processes to reduce the total overall cost and/or to otherwise make the total end product more competitive. Not just, “What can I do for you?”, but “What can we do together?” The proverbial yet rarely seen "win/win".  Getting to this level of conversation is the best differentiator you could ever have.  You are no longer just a vendor, nor even a ‘strategic supplier’, but a real business partner.  You are no longer competing on price against a dozen other contenders, but are now critical to making your client more competitive in THEIR market.

So what are you waiting for?  Go for a walk – it will be good for you, … and your customer, …and their customer.


Analogies, mind-mapping and New Product Forecasting

There are two ways you can react to a “Hey – that was my idea” situation.  The first would be to throw a pity party and lament about how unfair life is – if only the car hadn’t broken down and I didn’t have grass to mow and laundry to do I could have filed a patent and been a millionaire by now.  The other is to recognize that you were never going to do anything of the sort under any conditions anyhow and simply take the experience as confirmation bias of how brilliant you are.

When I came across Pulitzer Prize-winning author Douglas Hofstadter’s latest work, “Surfaces and Essences: Analogy as the Fuel and Fire of Thinking”, I chose the latter course.  His core theme is that analogies lie at the heart of how we develop concepts, how we construct language, how we understand the world, how we think – something I not only heartily agree with, but a concept I had arrived at myself decades before Hofstadter’s book.

Among other things, I fancy myself an amateur student of language.  You see, as a parent, by necessity you become an amateur student of a whole host of subjects that previously may have never interested you.  For example, I find that parents are three times more likely than non-parents to know that Michael Crichton got it wrong: T-Rex is from the Cretaceous, not the Jurassic, and I have the short video, “Cretaceous Park”, to prove it, created by my then five year-old son, who produced it in order to set the record straight among his kindergarten classmates.

Likewise, as a parent you also quickly become an expert in the field of linguistics as you watch in amazement the literal explosion of language once your children have mastered a basic vocabulary.  They start with what they have in their linguistic toolkit and build on it, making telltale mistakes along the way that show how their minds are working, such as undressing a banana, or cooking water, or breaking their book, before they’ve learned the verbs peel, boil and tear.

Metaphors come next and get incorporated into the very meanings of words: tables have legs, bellies have buttons, and airplanes get tails.  The analogies get more complex over time, as we encounter windows of opportunity, haunting melodies and watertight reasoning.  These later develop into idioms where sometimes the analogy is still clear, as in ‘bend over backwards’, ‘between a rock and a hard place’ or ‘stacked the deck’, and others where the root is barely discernible, such as ‘kick the bucket’, ‘egg someone on’ or ‘give someone short shrift’.

One thing I instinctively knew about myself at a young age was that my preferred learning style was by analogy and via storytelling.  Rather than feverishly trying to scribe every single detail into my notes as the teacher or professor spoke, I saved those for later (an especially useful approach in today's Google-age) and focused on relating the main and secondary concepts with each other and with what I already knew, working them into my existing knowledge framework and creating a new, expanded or more complex story about the subject for myself.  I was mind mapping, or concept mapping as I thought of it, way before it became a thing.

This concept of analogies is what lies behind SAS’ New Product Forecasting solution.  New product forecasting (NPF) can be a recurring challenge for consumer goods and other manufacturers and retailers.  The lack of product history or an uncertain product life cycle can dampen the hopes of getting an accurate statistically based forecast.  Here are some of the NPF situations you might encounter:

  • Entirely new types of products.
  • New markets for existing products (such as expanding a regional brand nationally or globally).
  • Refinements of existing products (such as new and improved versions or packaging changes).

SAS’ patent-pending structured judgment methodology helps you automate the evaluation and selection of candidate analogous products, facilitates the review and clustering of previous new product introductions, and generates statistical forecasts. This structured judgment approach uses product attributes from prior and new products, along with historical sales, to create analogies.

The use of analogies is a common NPF practice. You can see it, for example, in the real estate market, where an agent will prepare a list of “comps” – similar houses in the area that are on the market or have recently sold – and use this to suggest a selling price.

The structured analogy approach requires two types of data – product attributes (for prior and new products) and historical sales (for prior products). Product attributes can include:

  • Product type (toy, music, clothing, shirts, etc.).
  • Season of introduction (summer item, winter item, etc.).
  • Financial (target price, competitor price, etc.).
  • Target market demographic (gender, age, income, postal code, etc.).
  • Physical characteristics (style, color, size, etc.).

The statistical forecast is then built using a structured process based on defining and selecting candidate surrogate products and models.  Furthermore, you can combine this with data visualization to study previous new product introductions to gain a better sense of the associated risks and uncertainties.
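SAS’ structured-judgment method itself is proprietary, but the general mechanics of forecasting by analogy can be sketched in a few lines.  The products, attributes and sales histories below are entirely invented for illustration: score candidate surrogates by attribute similarity, keep the closest matches, and average their launch curves.

```python
# Toy forecast-by-analogy sketch.  All products, attributes, and sales
# histories are hypothetical, not drawn from any real data set.

# Prior products: attributes plus first-four-period launch sales.
prior = {
    "ToyA":   {"attrs": {"type": "toy",      "season": "winter", "price": 20},
               "sales": [100, 140, 120, 90]},
    "ToyB":   {"attrs": {"type": "toy",      "season": "winter", "price": 25},
               "sales": [80, 110, 100, 70]},
    "ShirtC": {"attrs": {"type": "clothing", "season": "summer", "price": 15},
               "sales": [200, 180, 150, 120]},
}

def similarity(a, b):
    """Crude attribute-match score: one point per matching categorical
    attribute, plus partial credit for price proximity."""
    score = 0.0
    score += a["type"] == b["type"]
    score += a["season"] == b["season"]
    score += 1.0 / (1.0 + abs(a["price"] - b["price"]))
    return score

def forecast_by_analogy(new_attrs, k=2):
    # Rank prior products by similarity and keep the top k surrogates.
    ranked = sorted(prior.values(),
                    key=lambda p: similarity(new_attrs, p["attrs"]),
                    reverse=True)
    surrogates = ranked[:k]
    # Average the surrogates' sales period by period.
    periods = len(surrogates[0]["sales"])
    return [sum(p["sales"][t] for p in surrogates) / k for t in range(periods)]

new_product = {"type": "toy", "season": "winter", "price": 22}
print(forecast_by_analogy(new_product))  # → [90.0, 125.0, 110.0, 80.0]
```

A production system would weight the attributes, cluster the candidates, and let a forecaster review the surrogate list before trusting the result – but the analogy machinery underneath is the same.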


Using analogies to improve your forecasting should not seem at all foreign – you’ve been using analogies since you were a toddler to expand your knowledge base by connecting and building on what you already know.  To find out more, check out this white paper, “Combining Analytics and Structured Judgment: A Step-By-Step Guide for New Product Forecasting”, and learn the details of getting from A to B, of getting from the product history you know to the new product forecast you don’t.  You might call it mind-mapping for your new product forecast.  See – analogies are everywhere!


CaaS – Crime-as-a-Service: Murder on the Internet of Things

Europol, the law enforcement agency of the European Union, in its recently released 2014 Internet Organized Crime Threat Assessment (iOCTA), cited a report by U.S. security firm IID that predicts that the first “online murder” will occur by year end, based on the number of computer security system flaws discovered by hackers.

While there have been no reported cases of hacking-related deaths so far, former Vice President Dick Cheney has had the wireless function on his implanted defibrillator disabled in order to prevent potential hackers from remotely accessing his device.  Just such a scenario was played out fictionally in the political TV thriller Homeland, in which Cheney’s fictional counterpart was murdered by terrorists who hacked into the vice president’s pacemaker.  In an interview last year, Cheney said, “I was aware of the danger that existed and found it credible – [the scene in Homeland] was an accurate portrayal of what was possible.”

Cheney’s fears are not unfounded in the least.  2012 research from security vendor IOActive into the security shortcomings of the 4.5+ million pacemakers sold between 2006 and 2011 in the U.S. turned up the following:

  • Until recently, pacemakers were reprogrammed by medical staff using a wand that had to pass within a couple of meters of a patient, flipping a software switch that allowed the device to accept new instructions.
  • But the trend is to go wireless.  Several medical manufacturers now sell bedside transmitters that replace the wand, with a range of up to 30 to 50 feet.  With such a range, remote attacks become more feasible.  For example, devices have been found to cough up their serial and model numbers in response to a special command, making it possible to reprogram the firmware of a pacemaker in a person’s body.  The devices often contain personal data about patients, such as their name and their doctor’s, and they also have “backdoors” – ways that programmers can get access to them without the standard authentication, backdoors available for more nefarious uses.
  • Just as your laptop scans the local environment searching for available WiFi networks, there is software out there that allows a user to scan for medical devices within range. A list will appear, and a user can select a device, such as a pacemaker, which can then be shut off or configured to deliver a shock if direct access can be obtained.
  • As if this wasn't bad enough, it is possible to upload specially-crafted firmware to a company's servers that would infect multiple pacemakers, spreading through their systems like a real virus - we are potentially looking at a worm with the ability to commit mass murder.

Can it get worse?  By now you’ve heard of SaaS (software-as-a-service) and PaaS (platform-as-a-service), but how about CaaS – Crime-as-a-Service?  From the Europol report: “A service-based criminal industry is developing, in which specialists in the virtual underground economy develop products and services for use by other criminals. This 'Crime-as-a-Service' business model drives innovation and sophistication, and provides access to a wide range of services that facilitate almost any type of cybercrime. As a consequence, entry barriers into cybercrime are being lowered, allowing those lacking technical expertise - including traditional organized crime groups - to venture into cybercrime by purchasing the skills and tools they lack.”

Just take a moment to let that sink in: lowering barriers to entry, criminal innovation, CaaS as a business model.  It really shouldn’t surprise us, though – criminal enterprises have been adapting the principles of sound business management since the early days of organized crime.  Did you know that the illegal drug market is a $2.5 trillion industry?  Not merely a billion-dollar industry, a TRILLION-dollar industry, employing standard business school tactics such as quality control, freemium pricing models, upselling, risk management and branding, not to mention the ever-changing supply chain and logistics challenges.

Cyber criminals are at least, if not more, sophisticated than the typical drug trade.

Lest you think that cybersecurity is primarily the province of the big banks and retailers, consider how your products will integrate with the Internet of Things (IoT) – it should make you think twice.

I went over twenty years with the same credit card number - now I have to get a new one pretty much every year because someone got hacked, and I’m guessing that your experience hasn’t been much different.  And remember, these breaches are occurring at large enterprises already employing a significantly sized staff of cybersecurity experts.

If your device is going to be on the internet, security will need to be baked into the design from the very beginning.  Again, from the Europol report:  “The Internet of Things represents a whole new attack vector that we believe criminals will already be looking for ways to exploit. The IoT is inevitable. We must expect a rapidly growing number of devices to be rendered 'smart' and thence to become interconnected. Unfortunately, we feel that it is equally inevitable that many of these devices will leave vulnerabilities via which access to networks can be gained by criminals.”

Rod Rasmussen, the president of IID - the source of the murder prediction mentioned at the beginning of this post - had this to say: "There's already this huge quasi-underground market where you can buy and sell vulnerabilities that have been discovered. Although the first ever reported internet murder is yet to happen, ‘death by internet’ is already a reality as seen from a number of suicides linked to personally-targeted online attacks.”

While it’s unlikely that anyone will die from a stolen credit card number, that’s not going to be the case for many of the tens of billions of devices attached to the internet, from medical devices to wearables to the connected car.  As a manufacturer of current and potential IoT devices, you may not be aware of SAS’ dominant presence in the fraud detection/prevention and cybersecurity field.  When you get a call from your bank freezing your credit card and questioning that $3,500 purchase at a shopping mall in Altoona, it was likely SAS analytics behind the scenes that identified and flagged the fraud.
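SAS’ fraud models are of course far more sophisticated than anything that fits in a blog post, but the kernel of the idea – flag a transaction that sits far outside a card’s historical spending pattern – can be sketched with a simple z-score test.  All of the figures below are hypothetical.

```python
import statistics

def is_suspicious(history, amount, threshold=3.0):
    """Flag a transaction whose amount is more than `threshold`
    standard deviations above the card's historical mean spend."""
    mean = statistics.mean(history)
    stdev = statistics.stdev(history)
    z = (amount - mean) / stdev
    return z > threshold

# Hypothetical purchase history for one card, in dollars.
history = [42.50, 18.00, 67.25, 23.10, 55.00, 31.40, 48.90, 26.75]

print(is_suspicious(history, 3500.00))  # the mall splurge: True
print(is_suspicious(history, 60.00))    # routine spending: False
```

Real fraud detection layers in merchant category, geography, velocity and network effects, but the principle is the same: learn what normal looks like, then flag the outliers in real time.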

If CaaS is going to be part of the criminal elements’ business model, cybersecurity will need to be part of your product design and IoT business model, and SAS can help.  While your brand may survive a variety of production quality problems, it won't survive a murder on the IoT.

 

[You can also learn more about cybersecurity from Ray Boisvert, CEO and founder of I-Sec Integrated Strategies, at his presentation, “The Threat Landscape: Cyber Tools and Methods Transforming the Business Environment,” at SAS' Premier Business Leadership Series, Wednesday, Oct. 22, from 2:15 to 3 p.m. Boisvert sees cybersecurity as a task for analytics that can help organizations tease out the proverbial signal from the massive internet “noise” around serious threats. The challenge is to identify the right threat vector related to the most valued elements an organization holds dear. The organization will only be successful if it has technology to quickly digest huge streams of data, in real time, so that it may begin to see patterns that can thwart further attacks.]


East meets North: Integrated Business Planning for both efficiency and alignment

Sales and Operations Planning (S&OP) started out with big aspirations.  As initially conceived, S&OP was to cover the entire domain now called Integrated Business Planning (IBP).  As S&OP process implementations rolled out during the 1980s, this broad scope turned out to be a bit much to attempt in one bite.  S&OP instead settled into a more focused and limited role, and it would be another decade before a new attempt, under a new name, IBP, re-emerged to tackle the larger picture.

So what is the difference between the two, and does it matter?

Briefly, S&OP is the balancing act between supply and demand.  It sets the production plan for the upcoming period based on the unconstrained sales/demand forecast but informed and adjusted for other supply chain constraints such as capacity, material supply, inventory levels, logistics, and customer lead-time requirements.
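At its simplest, that balancing act takes the unconstrained demand forecast and trims it to what the supply side can actually deliver.  A minimal sketch, with hypothetical numbers and a single capacity constraint:

```python
def production_plan(demand_forecast, capacity, opening_inventory=0):
    """Constrain an unconstrained demand forecast by per-period
    production capacity, drawing down any available inventory."""
    plan, inventory = [], opening_inventory
    for demand in demand_forecast:
        # Build what we need net of inventory, up to the capacity cap.
        production = min(max(demand - inventory, 0), capacity)
        plan.append(production)
        inventory = max(inventory + production - demand, 0)  # unmet demand is lost
    return plan

# Hypothetical monthly demand against a flat capacity of 100 units.
print(production_plan([80, 120, 150, 90], capacity=100))  # → [80, 100, 100, 90]
```

The shortfall in periods two and three (demand of 120 and 150 against a capacity of 100) is precisely the gap an S&OP meeting exists to resolve – by pre-building inventory, adding shifts, or reshaping the demand plan.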


S&OP is a process focused on the EFFICIENCY of the production process.  In terms of Treacy and Wiersema’s three Value Disciplines, S&OP is concerned with the efficiency and effectiveness of the horizontal OPERATIONAL EXCELLENCE value discipline.   Note that while the prescription of the Value Disciplines is that organizations should focus on achieving excellence in primarily only one of the three, all three are always present, so even if your chosen value discipline is Customer Intimacy or Innovation, you still have an operational aspect to your business that needs to be optimized.

What does IBP bring to the party that S&OP lacks?  Alignment.  Financial and strategic alignment.

A couple of years ago I wrote this post (“The Nine-Foot Aviator”) about what first steps to take when attempting to institute an IBP process.  I had to admit then that, despite the brilliant description of east-west versus north-south processes by Gartner’s Noha Tohamy, I and much of the audience that heard her presentation still seemed confused over definitions and boundaries and roles and functions when it came to differentiating S&OP from IBP.  It was all a bit fuzzy, although you could see hints of the resolution in the IBP calendar shared at the IE Group conference by Verso Paper’s Michael Partridge – review sessions that included finance, risk and general management in addition to the usual production, demand and supply suspects.

However, if you approach the two processes from the perspective of the Value Disciplines, the distinction becomes obvious (see "The Sound and the Fury" for the connection between the value disciplines and your value-creation business processes).  As I mentioned above, S&OP operates primarily within the horizontal, east-west, operational discipline, with the aim of improving the efficiency and effectiveness of that discipline.  IBP, however, takes a broader, north-south perspective – the ALIGNMENT of S&OP and the Operational Excellence discipline with the organization’s financial and strategic objectives.  What is optimal for the horizontal production and supply chain process might not be optimal for the business as a whole.  Examples include:

  • Does the S&OP plan meet cash flow, earnings, revenue and margin objectives?
  • Does it meet the company’s risk appetite / profile, and are the identified risk mitigation plans acceptable?
  • Does it comply with safety and sustainability requirements?
  • Does it appropriately support marketing, new product/territory expansion and commercial initiatives?
  • Does it align with other strategic objectives, such as quality or customer retention?

In other words, IBP does two things that S&OP does not:

  • It aligns one value discipline, operations, with the other two – innovation and customer relationship management.
  • It aligns the Operational Excellence value discipline with the broader, high-level strategic objectives of the organization as a whole.

In order to fulfill the promise of IBP, the key takeaway would be to move beyond just the alignment of S&OP and Operations and generalize the intent and scope of IBP to include all three value disciplines.   In most companies the product development and customer relationship value disciplines have their own internal efficiency and effectiveness processes – maybe not as complex as S&OP but every bit as critical, especially if one of them is the organization’s chosen strategic focus.  It should not be sacrilegious to expect R&D to regularly check its alignment with the company’s marketing direction or customer service performance, nor for marketing to likewise understand product development roadmaps, and for sales to proactively be aware of the ever-changing array of production and development constraints that could impact client relations.

IBP as a concept has wider applications than just policing S&OP, and you can use the process as applied broadly to manage your business holistically, complementing horizontal business-process efficiency with vertical and strategic alignment.  Via S&OP, East has already met West – with IBP, East gets introduced to North and South as well.


The Cloud and other forces – Climate change, or just the weather?

I’ve been having trouble getting a handle on the relationships among the nexus-of-forces / third-platform themes of social media, mobility, big data, analytics, and the cloud, and it made me feel better that someone like Geoffrey Moore, world-renowned author of “Crossing the Chasm”, seems to be in the same boat.  If you haven’t run across it yet, Geoffrey Moore is an official LinkedIn “Influencer”, and you can follow him on LinkedIn and read up on some of his recent insights on his author page.   Moore has been tackling these topics for several years now, and you can watch especially how his opinion of the Cloud has evolved over time.

In sorting these themes out, I wanted to assess them outside the influence of those parties who have a stake in how the subject gets framed, and also from the point of view of, “So what? – What’s in it for me?”

“Mobility” seems to be the easiest to classify, but the repercussions are going to be monumental.  No need for a ‘third platform’ – mobility fits neatly into the client-server model, or maybe it should now be called ‘device-server’.  Or rather, there is a range of devices in the client role, running the continuum from thick client to sensor.  In between are smart devices of all flavors, such as phones, in-car GPS and home appliances.  Far from being just an IT problem (i.e. BYOD), mobility impacts everyone from marketing to operations, as everything from people to products goes mobile.

‘Big Data’ is of course a relative term, but when I think ‘big data’ one of the following three data categories seems to be in play:

  • High transaction volumes: Millions of customers, billions of transactions (i.e. ATMs or POS), or tens of thousands of SKUs crossed with other attributes such as retail locations, cost and/or service levels.
  • Temporally dense:  Sensor data, audio.
  • Spatially dense:  Video, satellite imagery.

The business issue becomes – what do you want to do with all this data?  Is it just a matter of storage, in which case Hadoop might be called for, or does the value come from real-time event stream processing, or does the data serve as the foundation for the further application of analytics and the extraction of metadata?  But does Big Data constitute a fundamental building block for a new computing platform?  By itself, I don’t think so – evolutionary rather than revolutionary.

‘Analytics’ always has the potential for revolution, because it is in the unique position of being able to respond to the requirement, “Tell me something I don’t know”.  How much risk is in that forecast?  What’s the optimal product mix given certain production constraints?  What’s the next best offer to make to that customer in the store or on the web?  What are our customers saying about us on social media?  Insights like that are more than a system, platform or architecture.

The Cloud.  Is it just an outsourcing model / just another business model, or is it going to be as disruptive as its proponents advocate?  For the time being, the answer seems to depend on which side of the equation you find yourself.  If you are in the IT business, the potential for disruption, either by yourself or your competitors, is keeping you awake at night.  Those on the receiving end, however, currently seem to view the Cloud primarily from a cost basis, from the motivation to cut the costs and risks of hardware acquisition, maintenance, and software migrations.

While this might be a good foot in the door (or sticking your head in the cloud?), it would be best not to dismiss too quickly what the cloud visionaries have envisioned.  Two potentially important cloud applications to keep in mind are:

  • SaaS as a way to acquire capabilities you could never support in-house, especially niche applications that could add considerable value but don’t currently generate enough internal critical mass.  Watch this space – SaaS as a business model will enable a plethora of new applications that were previously barely imaginable.
  • PaaS as an internal IT business model.  You won’t be outsourcing everything, there will still be mission critical enterprise apps that you manage in-house, but in order to meet your internal business client needs you are likely going to have to be more cost competitive and more flexible with your IT resources.  As an effective business model, PaaS needn’t remain the sole domain of the big boys.

That leaves us with Social Media.  Honestly, I don’t know how to classify it.  It’s a game changer, most certainly.  As I’ve mentioned in a previous post, I’m quite the fan of Marshall McLuhan and his observations on the media.  If “the medium is the message”, what is the message of Social Media?  The implications of “social” as both a medium and a message are likely to be both subtle and far reaching.  This isn’t about “digital” at all – this is about brand reputations and networks and experiences and influencers and chaos.  Your current struggles with big data or with BYOD security are just a taste of what’s to come in the social arena.

If I had to put a label on them, Big Data feels like weather, like a cold front passing through.  Mobility is the storm itself.  For now, the Cloud is like the seasons – winter if you are in one hemisphere but summer if you are in the other.  Analytics is a longer trend, an El Niño with global consequences.  And social media? Climate change for sure – but whether it’s a runaway greenhouse or the next ice age remains to be seen.


The tragedy of overcapacity

Capital investment in production capability is the weakest link in the business value chain.  It always has been and likely always will be.  It’s the driving force behind the tendency towards cartels, collusion and monopolies.  While it can make the first entrant into a brand new market, in the long run it usually breaks everyone, playing no favorites between the initial innovators and the late comers.  It’s the problem of supply and demand in its purest form.

To illustrate the conundrum, imagine a market where each player has 20% of the total market production capacity.  The first into the market enjoys the premium pricing benefits of a green field with no competition.  Others soon join the party.  With just four participants, everything is still peachy – only 80% of the demand can be met, so there is still plenty of margin to go around.  Things tighten up with the entry of the fifth player – margins are still acceptable, but growth must now come at the cost of someone else’s market share.

But it’s that sixth entrant that ruins the fun for everyone.  The industry is now over capacity.  Prices get driven downward to the level of variable costs, but the elasticity of demand never seems to respond proportionally – after all, I only need one refrigerator and one washing machine, and I can only make use of so many TVs or even automobiles.  It’s essentially Garrett Hardin’s famous “Tragedy of the Commons”, where everything is fine with 99 cattle on the pasture, but where one cow too many leads to overgrazing and certain environmental/economic collapse.
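The arithmetic of that illustration is easy to put into code.  Assume market demand of 100 units, 20 units of capacity per entrant, a healthy margin while supply trails demand, and margins collapsing toward variable cost once the market is oversupplied – all of the numbers are hypothetical:

```python
def industry_margin(entrants, demand=100, capacity_each=20,
                    healthy_margin=0.30, commodity_margin=0.02):
    """Toy model: margins hold while total capacity <= demand,
    then collapse toward variable cost once the market is oversupplied."""
    capacity = entrants * capacity_each
    return healthy_margin if capacity <= demand else commodity_margin

for n in range(1, 7):
    print(n, f"{industry_margin(n):.0%}")  # entrants 1-5 earn 30%; the 6th drops everyone to 2%
```

With five players the industry sits exactly at 100% of demand and margins hold; the sixth tips total capacity to 120 and, in this toy model, the margin collapses for everybody at once.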

If you are an executive in manufacturing, you have seen this effect many times over, and it has become the bane of your existence. The opposite, lack of capacity leading to pricing discretion, is typically a euphoric but short-lived phenomenon that might happen once in your career if you are lucky.

Excluding bankruptcy or exiting the market altogether, there are seven principal ways in which most businesses attempt to address this iron law of economics.

  • Driving cost out of the production process and out of the supply chain.  As most efficiency gains are readily copied and adapted across the industry, this is seldom a long term differentiator, but it does become table stakes – if you can’t quickly improve on costs and efficiency, you’ll be the first to fold.
  • Increase the value throughput.  This approach gives rise to the familiar feature / function / performance arms race.  But unless one of the market players cannot keep up with the innovation, this doesn’t do much for the capitalist other than set the variable cost bar higher.
  • A focus on quality.  History and economics tend to look favorably on this strategy.  Although at first glance an investment and focus on quality would seem to be no different than investment in feature / function / performance, the famous Faster-Cheaper-Better triad shows that "better" (i.e. quality) has long been differentiated from "faster", with its own value proposition, especially as it relates to brand loyalty and equity.
  • De-risking via outsourcing of production.  Huge segments of the manufacturing industry have spent the last couple of decades doing just that.  If your core competency and primary differentiator is product innovation, this might make sense.  Why bother with the risk of building a factory if production efficiency is not your strength?  What’s lost in the bargain is operating leverage and some control over your product quality and availability.  Should your product become the biggest thing since sliced bread, without the operating leverage you make the same profit on your billionth shipment as on your first.
  • Consolidation, coopetition, and secondary markets.  Excess and unproductive capacity is often the mother of invention.  Getting your facilities back up to 100% utilization can require some creative thinking on your part, from running unrelated products through your plant to developing secondary markets for your primary products.
  • The flexible factory.  Now we’re talking.  You are no longer stuck with being able to run only one product through the production line.  Your industrial engineers set you up for long term success with a flexible factory floor design, underpinned by robotics, programmable or soft automation, analytics, and a flexible approach to IT and data management.  The key to making this work, however, is “knowing when to fold them”, having the discipline to regularly get out of the commoditized, low margin products in favor of the innovations your design team is bringing forth.
  • A focus on the customer, and the customer experience.  While you can’t control how much excess production capacity comes onto the market, you can do a lot about your downstream access to the customer – another approach that seems to be highly rewarded by the market.  It’s one thing to build it, or even to build it cheaply, and quite another to be able to match that capacity with a paying customer on the other end.  This in turn requires knowing your customers, the market, and the distribution channels via customer analytics, understanding the broader market via demand forecasting, and paying attention to the service and aftermarket aspects of your total offering.

Regardless of the strategic or value-discipline approach of any particular firm, in the end, if product is going to be made, then the capacity has to be built and the capital investment committed.  We can’t outsource manufacturing capacity off the planet.  Someone with a core competency in production is going to take the risk, and then take the necessary steps to manage and mitigate that risk.  This will likely result in higher outsourced versus insourced product costs (once we get past this current phase of global labor arbitrage).  While this may be an acceptable outcome for those businesses specializing only in product design / innovation, most of the manufacturing industry will find itself concentrating on some combination of those other key differentiating characteristics mentioned earlier:  flexibility, quality, service, information management, analytics and the customer.


Why analytic forecasting?

Because you are already halfway there and you should want the entire process to be data-driven, not just the historical reporting and analysis.  You are making decisions and using data to support those decisions, but you are leaving value on the table if the analytics don't carry through to forecasting.  In the parlance of the domain, don't stop with just the descriptive analytics while neglecting the power of predictive and prescriptive analytics.

Descriptive analytics relies on the reporting and analysis of historical data to answer questions up until a particular moment in time.  Using basic statistics such as mean, frequency and standard deviation, it can tell you what happened, how many, how often, and where.  With the application of additional statistical techniques such as classification, correlation and clustering, you end up with an explanatory power that can sometimes even tell you ‘Why’.
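To make the "what happened, how many, how often" idea concrete, here is a minimal sketch of descriptive analytics on a demand history, using only Python's standard library. The demand numbers are invented for illustration.

```python
# Descriptive analytics in miniature: mean, variability, and frequency
# on a (hypothetical) weekly demand history.
from statistics import mean, stdev
from collections import Counter

weekly_demand = [120, 135, 98, 110, 160, 145, 102, 130]  # illustrative units sold

print("what happened (total units):", sum(weekly_demand))
print("how many, on average:", mean(weekly_demand))
print("how much it varied (std dev):", round(stdev(weekly_demand), 1))

# "how often": frequency of demand falling into coarse 25-unit buckets
buckets = Counter((d // 25) * 25 for d in weekly_demand)
for low in sorted(buckets):
    print(f"{low}-{low + 24} units: {buckets[low]} weeks")
```

This is the backward-looking half of the picture; everything it reports stops at the last observed week.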

In the terminology I proposed in this earlier post, “The Skeptical CFO”, descriptive analytics covers the first two of my four points:  “Where am I right now”, and “What is my ability to execute”, the latter typically surfaced through a BI capability that computes and displays the historical data in the form of metrics for ease of standardization, comparison and visualization.

[Figure: FS model vs. FAW model]
But why stop there?  Why stop your data-driven approach to decision making at the halfway point, at the vertical bar in the above graphic?

Your decisions are always about the future – what direction to take, where to invest, what course corrections to make, what markets to expand into, what and how much to produce, who to hire and where to put them.  In other words, a forecast, the third of my four points.  The fourth, and perhaps the most important of the lot, is a confidence level or uncertainty measurement about that forecast.  These last two come from the realm of predictive analytics.

Even if you’re not comfortable using the statistical forecast straight out of the box, don’t you at least want to know what it indicates?  What data-driven trends and seasonality it has on offer?  And wouldn’t you appreciate having a ballpark estimate of the risk and the variability that is likely inherent in any forecasting decision?

Getting back to that “straight out of the box” issue, the truth is, for roughly 80% of your detailed forecasting needs (depending of course on the quality of the data and the inherent forecastability of the item in question – see: “The beatings will continue until forecast accuracy improves”), the machine is going to be at least as accurate as you, and much, MUCH faster at it.  The forecast analyst workbench can generate incredibly high-volume forecasts at the detailed level (i.e. SKU, size, color, style, packaging, store, expense line item, cost center …) in short order, leaving the forecast analyst free to spend the bulk of their time improving on those hard-to-forecast exceptions.
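What does a "straight out of the box" statistical forecast look like at its simplest? The toy sketch below applies simple exponential smoothing to one SKU's history. A real forecasting workbench would fit the smoothing parameter and choose among many candidate models per item; here alpha is fixed and the data is invented, purely to illustrate the mechanics.

```python
# Simple exponential smoothing: a one-step-ahead baseline forecast.
# Not any vendor's actual implementation -- a minimal illustration.
def exp_smooth_forecast(history, alpha=0.3):
    """Return a one-step-ahead forecast via simple exponential smoothing."""
    level = history[0]
    for demand in history[1:]:
        # blend the newest observation with the running level
        level = alpha * demand + (1 - alpha) * level
    return level

sku_history = [100, 104, 98, 110, 107, 103]  # hypothetical SKU-level demand
print(round(exp_smooth_forecast(sku_history), 1))
```

Run at scale, a loop like this over every SKU / store combination is exactly the kind of high-volume, low-touch forecasting that frees the analyst to work the exceptions.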

Lest you doubt the veracity of my 80% (+/-) claim above, the collaborative planning workbench (below), in addition to facilitating the consensus forecast you would expect from its name, also includes a Forecast Value Add capability to identify and eliminate those touch points that are not adding value.  You would be surprised at how many reviewers, approvers, adjustments, tweaks and overrides actually make the forecast worse instead of better (then again, maybe you wouldn’t).
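The Forecast Value Add idea can be sketched in a few lines: score each touch point's error against a naive baseline, and any step with a negative value add is making the forecast worse. The data and the touch-point names below are invented for illustration.

```python
# Forecast Value Add (FVA) in miniature: compare each touch point's
# MAPE against a naive (last-period) baseline. Numbers are hypothetical.
def mape(actuals, forecasts):
    """Mean absolute percentage error, in percent."""
    return sum(abs(a - f) / a for a, f in zip(actuals, forecasts)) / len(actuals) * 100

actuals  = [100, 110, 95, 105]
naive    = [98, 100, 110, 95]    # last period's actual carried forward
stat_fc  = [101, 108, 97, 104]   # the statistical forecast
override = [110, 120, 105, 115]  # after an optimistic management override

base = mape(actuals, naive)
for name, fc in [("statistical", stat_fc), ("override", override)]:
    # positive FVA = this step improved on naive; negative = it hurt
    print(f"{name}: FVA vs naive = {base - mape(actuals, fc):.1f} points")
```

In this contrived example the statistical forecast adds value over naive while the override subtracts it, which is precisely the pattern an FVA analysis is designed to expose.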

Can you be data-driven when it comes to new product forecasting?  If you’ve got the structured judgment / analogy capability of the new product forecasting workbench then the answer is yes.  It uses statistically determined candidate analogies or existing surrogate products with similar attributes to provide an objective basis for predicting new product demand.
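The structured-analogy approach can be sketched as attribute matching: score existing products by similarity to the new item, then borrow the launch history of the closest surrogate. The attributes, products, and volumes below are all invented, and a real workbench would use far richer similarity statistics than this toy matching score.

```python
# Forecasting by analogy, in miniature: rank candidate surrogate
# products by attribute similarity to a new item. All data is invented.
def similarity(a, b):
    """Fraction of attributes on which two products match."""
    return sum(a[k] == b[k] for k in a) / len(a)

new_item = {"category": "snack", "price_band": "mid", "pack": "single"}
catalog = {
    "granola_bar":    ({"category": "snack", "price_band": "mid", "pack": "single"}, 500),
    "energy_drink":   ({"category": "beverage", "price_band": "mid", "pack": "single"}, 800),
    "chip_multipack": ({"category": "snack", "price_band": "low", "pack": "multi"}, 300),
}

# pick the closest analogy and borrow its first-month launch volume
best = max(catalog.items(), key=lambda kv: similarity(new_item, kv[1][0]))
print("surrogate:", best[0], "-> launch estimate:", best[1][1])
```

The point is the objectivity: the surrogate is chosen by a stated, repeatable rule rather than by whoever argues loudest in the launch meeting.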

Beyond predictive analytics, which provides answers to 'What if these trends continue', and 'What will happen next', lies prescriptive analytics – what SHOULD I do; what’s the best, or optimal, outcome?  The inventory optimization workbench optimizes inventory levels across a multiechelon distribution chain based on constraining factors such as lead times, costs, and/or service levels.  And just as with the forecasting component, 80% of the optimization can be automated, again leaving the inventory analyst free to focus on hard-to-plan or incomplete orders.
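One of the constraints an inventory optimizer juggles can be shown back-of-the-envelope style: safety stock derived from a target service level, demand variability, and lead time, via the standard square-root-of-lead-time formula. A real multiechelon optimization solves this jointly across every node of the distribution chain; the single-node numbers here are illustrative.

```python
# Single-item safety stock: z * sigma_demand * sqrt(lead time).
# A textbook single-echelon formula, not a full multiechelon optimizer.
from math import sqrt

def safety_stock(z, demand_std, lead_time_weeks):
    return z * demand_std * sqrt(lead_time_weeks)

# z = 1.65 ~ a 95% cycle service level; weekly demand std dev of 40 units
ss = safety_stock(z=1.65, demand_std=40, lead_time_weeks=4)
print(round(ss), "units held over and above the forecast")
```

Notice how the forecast feeds directly into this: the demand standard deviation is really forecast-error variability, which is why a better forecast shrinks the inventory you must carry.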

When you hear the word “optimization” in this context, think of two elements: a forecast, and a corresponding set of constraints.  Knowing that context, you can see why SAS has taken this integrated workbench approach to demand-driven planning.  A common foundation and data repository enables the consensus forecast, as well as collaboration between the forecast and inventory analysts.  Even the purely descriptive component, the demand signal analytics workbench, is completely integrated with the same demand signal repository that will eventually build the forecast and the inventory plan.


When it comes to decision support, don’t settle for halfway: half of the value-add lies to the right of that vertical line. It all starts with the forecast, which drives the integrated business planning (IBP) process and is its largest source of variation and uncertainty. Improving the forecast will affect everything downstream, and it can have a multiplier effect as it travels along the IBP process. Even slight forecasting improvements can have a larger proportional effect on revenue, costs, profit, customer satisfaction and working capital than any other factor – financial, supply-oriented, or otherwise.

Get the forecast right, and good things will follow.


Impending Crisis: Analytics for the top-line

I’m sitting here staring at a book on my shelf entitled, “Impending Crisis”.  Even knowing the copyright date, 2003, it could still be about any one of several possible crises: healthcare, financial, energy, education, environment.  But no, in this case the impending crisis in question is provided by the subtitle: “Too many jobs, Too few people.”  A perfect storm of demographics, education and technology that was supposed to hit the Western economies by the end of the decade, a crisis ultimately stillborn, upstaged and derailed by its antithesis – The Great Recession, with its concomitant double-digit unemployment.

But still, it was there on my bookshelf for a reason.  If the derivative-driven economic implosion of 2008-’09 had never happened, the book’s thesis represented a most likely case.  At the time, the US Bureau of Labor Statistics was predicting an overall workforce shortage in the US of about 10 million workers by 2010.  A decade has now passed since its initial publication, so besides the Great Recession, what else has changed?

The demographics are what they are, but with everyone now placing an additional ten candles on their birthday cake.  The state of education may be even worse, with No Child Left Behind turning into No Child Left Untested.  The cost of higher education is the fastest increasing segment in the national economy, outpacing even healthcare, as the ratio of full-time faculty to management and staff declined from about 2:1 twenty years ago to roughly parity today.

Technology seems to be the big unknown.  For a thorough perspective, I highly recommend this study from the Pew Research Center: “AI, Robotics and the Future of Jobs”.  To illustrate what a challenge this subject is, the nearly 2,000 respondents were roughly evenly divided on the question of the future of jobs, with 52% taking the non-Luddite view that there is nothing as constant as change and that in the end more jobs will be created than lost.  The other 48% would likely find themselves more in agreement with Bruce Springsteen, who wrote in “My Hometown”: “These jobs are going boys and they ain't coming back”, their main supporting point being, ‘You want evidence?  Just look around, it’s happening now, it’s happening everywhere, it’s been happening since at least 1990 if not before.’

One more datum to add to the mix:  $6 trillion.  Or make that $35 trillion if you are thinking globally.  That’s the annual labor cost in the US / World respectively – representing 40% of US GDP, 50% at the global level.  So when you impact labor productivity by more than a few percentage points, you’re likewise talking trillions (for comparison, energy costs run about 10% of GDP).

Stepping into this ill-defined, undiscovered country from which perhaps no job returns, is strategic human capital expert Jac Fitz-enz and his co-author, John Mattox, with their new book, “Predictive Analytics for Human Resources”.  What makes this such a worthwhile read for anyone interested in applied analytics is the authors’ broad general business experience.  If you want, you can take their analytic approach completely out of its HR context and drop it wherever you are facing an analytic need.  Whether it’s the chapter on ‘Getting Started’, or ‘Data Issues’, or ‘Analytics in Action’, or my nomination for best-in-show, ‘Developing an Analytic Culture’, this is the analytics primer you’ve been searching for, no matter whether your business problem is quality, customers, process or people.

While the obvious application of analytics to human capital (see my prior post, “Strategic Workforce Planning”) is the cost impact, hence the $6 trillion reference above and its prominence throughout the book, I want to direct your attention to the issue from the other direction, the top-line versus the bottom line, and the mixed realities of the post-recession employment picture.  Moreover, I want to tie all of this into another important business paradigm – Treacy and Wiersema’s ‘Value Disciplines’.

Official unemployment in the US currently stands at a tad over 6%, with the unofficial rate, which counts those who have stopped looking for work, at 14%.  The comparable figure for the EU is slightly over 10%, with the extremes running from 5% in Germany to over 25% for Spain.  With that many people out of work, who needs workforce analytics?  Just run the ads and take the lowest bidder, right?

Not so fast.  If your chosen Value Discipline is Operational Efficiency, then you most likely aren’t hiring in the Western economies anyhow, you moved those jobs offshore long ago.  On the other hand, if your Value Discipline is Innovation or Customer Intimacy, cost is not your primary concern (a truism whether your specific business problem is workforce, or something else like quality, innovation, service or retention, and a truism your approach to analytics should reflect).

What is of concern is the shortage of STEM and skilled workers – the lingering high unemployment rate being a rather asymmetrical affair, primarily affecting the lower-skilled job classifications. Besides, you’re not looking for the cheapest engineer, scientist, cyber-security specialist, nurse or marketer.  There are multiple stories making the rounds of manufacturers in rural, low-wage regions of the country with 100 applicants for each shop floor position, but unable to find and attract the design and manufacturing engineers and the management to run the place.  Silicon Valley’s recently uncovered anti-poaching cartel is certain proof of the reality and seriousness of the issue.

The benefits of using an analytical approach to addressing STEM and skilled workforce management issues will show up in the revenues, not just on the bottom line, of those companies that depend on innovation, quality and customer service as the foundation of their business model, and who need the right people, not just the least expensive, to make that business model work.  As the saying goes, you can’t just save your way to prosperity, eventually you need to put the emphasis on growth.

Lest you think this STEM shortage is fairly straightforward and one-dimensional, let me scare you to death with this series of posts on LinkedIn by Heather McGowan – “Jobs are Over: The Future is Income Generation” (the link is to Part 2 of the four-part series, the part where I became truly frightened for my children’s future).  And I won’t even get into the unnerving picture that Fitz-enz paints at the end of Chapter 7 of his book – I’ll leave that for you to discover; just don’t be taking any three-day weekends.

Here’s McGowan in her own words: “The era of using education to get a job, to build a pension, and to then retire is over. Not only is the world flat, but this is the end of employment as we once knew it. The future is one of life-long learning, serial short-term employment engagements, and the creation of a portfolio of passive and active income generation through monetization of excess capacity and marketable talents.”

Let that sink in for a bit.  A future that some might call entrepreneurship, but others might label ‘gigs’, everyone a temporary contract worker, no benefits, competing to create monetized portfolios (how would you have fared in your twenties or thirties, trying to start a family, under such conditions?).  Is your business ready to address a workforce strictly defined by contractual short-term gigs and monetized marketable talents, whatever that might mean?   While you might initially think you’ve got the upper hand when it comes to employment negotiations with such relatively insecure ‘income generation seekers’, focusing on cost, as I mentioned before, would be missing the point for most organizations.

I’m not saying McGowan is right (and I’m hoping she’s wrong), but I do have to admit that the trends she identifies are all already here. The ‘market economy’ is becoming the ‘market society’, with little indication that this socio-economic movement is slowing down, let alone running into obstacles that might halt it. In such an environment, and without a far-sighted, disciplined and analytic approach to workforce planning and management, you’ll end up with a top-line going nowhere fast, and a bottom-line spelled I-R-R-E-L-E-V-A-N-T.


Cloud Encounters of the Fifth kind

CSETI, the Center for the Study of Extraterrestrial Intelligence, defines a “Close Encounter of the Fifth kind” as an event that involves direct communication between aliens and humans (a “Close Encounter of the Third kind” would be one in which an animated creature is present).  So I think Spielberg misnamed his movie by two whole steps.  We most definitely had some direct communication going on there, beginning with the iconic five tone sequence of B flat, C, A flat, (octave lower) A flat, E flat, progressing to the point where the technician announces, “We have a translation interlock on their audio signal – We’re taking over this conversation, … NOW!”, and the computers and the keyboard and the mothership go about their business without any further human involvement.

While it has been a couple of years since we passed the point where more than half of all Web traffic became non-human, mostly search engines, bots and spam, when it comes to the internet as a whole, video and media / gaming still holds sway at 50%+ of the transmitted bits.  The peer-to-peer segment (P2P) currently comprises about 20% of the total, largely dominated by file sharing and financial trading, but masked within it is the fastest growing component:  machine-to-machine (M2M), expected to grow by more than an order of magnitude within just the next five years.  When it comes to the Internet of Things (IOT), the future clearly belongs to the Things.

There is a well-known saying that a stopped clock is right twice a day.  I once heard the philosopher of communications Marshall McLuhan described as a clock that was only right once in a hundred years, but when he was right he was dead on (i.e. “The medium is the message” and the “global village” from “Understanding Media” and “The Gutenberg Galaxy” respectively). Ken Olsen, the founder and former CEO of Digital Equipment Corporation would probably fall somewhere in between.

Olsen is today most infamously remembered for his “snake oil” comments regarding UNIX, and, when taken somewhat out of context, his dissing of the personal computer.  I worked for Olsen for eight years, and what gets left out of his PC comments was his vision for the future of information – “information as a utility” he called it.  Combined with the standardization of Ethernet in the 1980’s, he foresaw people plugging into the wall for information just as they might plug into an electric socket or connect a home to water and gas utilities.  Thus the ubiquitous VT220 terminal of the times, the smartest dumb terminal ever made.

Olsen’s vision was spot on, but his timing left a little to be desired.  That time is drawing closer, step-by-step, piece-by-piece.  We’re seeing components, such as the Cloud, the Web, the IOT and Analytics at the Edge, along with mobile and social technologies, grow, mature, connect and overlap.  However, as the author William Gibson noted, “The future is already here — it's just not very evenly distributed”.

The Cloud is one of those currently unevenly distributed elements.  I think the word itself will go out of non-meteorological use within a dozen years or so as information becomes more of a utility and cloud computing becomes a commonplace.  Mentioning the Cloud will one day date you just as talk of rotary phones and punch cards (and VT220’s) dates you now.

Four years ago when I started chairing financial conferences for the IE Group, the primary concern when it came to the Cloud was data security.  Four years later, CFOs seem to have become more comfortable with the security issue, and now the reluctance comes from a different quarter.  Half of the effort and cost for any large system or process re-engineering initiative occurs on the front-end: the data, documentation, admin and policy and procedure clean-up.  You just can’t throw your mess over the wall and into the Cloud and expect it to work, and neither will your Cloud partner accept it.  What I hear from these same CFOs today is that once they’ve cleaned up their act, they rather like what they see, and then proceed by keeping the back-end of the project in-house.

But in this uneven world in which we find ourselves, I think it’s going to be the security issue that actually drives businesses to deploy more rapidly to the Cloud.  The Cloud providers, of course, will all insist that since it’s your data, the data security liability is yours and yours alone, but wouldn’t you rather be behind a firewall staffed by TEAMS of experienced cyber-security professionals, each separately dedicated to a primary threat source – China, Russia, India or even the U.S. (“Top Ten Hacking Countries”)?  With cyber threats becoming both more numerous and more complex all the time, it’s already tough enough for most firms to fill key data security roles, let alone compete with Google, Amazon, Microsoft and the NSA for the top talent.

Before information truly becomes a utility there are still some business and content creation models to be worked through, conflicting standards to be ironed out, and turf wars over gate-keeping and rent-seeking to be fought, but it does not appear that technology will be a barrier.  Brian Arthur’s “digital economy” is already here (“The Second Economy”) with the advent of large, non-financial but also non-product enterprises such as Facebook and Google.  Arthur’s digital economy continues to build out its nervous system, with the Things of the Internet not just talking to each other, but learning as they go (“Machine Learning”) – they are taking over this conversation, and they don’t care whether it’s cloudy or not.


A Holistic view of Product Quality

While managing quality within the four walls of your own operation is all well and good and totally necessary, both the market and your bottom line are demanding a more holistic, quality lifecycle approach, and in support of that aim there is a treasure trove of downstream data waiting to be tapped and exploited to improve product quality and customer satisfaction.

The impact of this downstream customer quality data can be badly described by this less-than-perfect analogy about having a baby.  It goes like this:  With little or no incoming inspection of supplier material, the baby is conceived and spends the next nine months in production.  Typically born in a hospital, the baby will undergo a number of invasive outgoing inspection procedures before being released for shipment.  Depending on the manufacturer’s health care plan, the baby will come with an 18 to 24 month warranty, during which the proud parents will make regular pediatric dealer visits for routine inoculation maintenance.  The warranty tends to expire before the (now) toddler begins to operate in less forgiving environments, with many of the post-warranty malfunctions being handled by the nearest urgent care or emergency room  (I have three children, and have been to the emergency room four times during their childhood – all four times with the same child.  It now would appear, however, that the involuntary software upgrades (i.e. learning, experience) that accompanied these hardware failures have at last had their intended effect in ameliorating the culpable risk taking behavior).

Then they grow up, go off to college, move out of the house, and live for ANOTHER seventy years; seventy more years of additional hardware (and occasionally, software) breakdowns.  From this article, “The Eleven Most Implanted Medical Devices in America”, the top five are:  lens implants to replace cataracts, ear tubes, stents, artificial knees, and metal screws, pins, plates, and rods.  My son, the other one, the one who never went to the emergency room once despite playing lacrosse through all of high school and college, is in grad school studying biomechanical engineering – it looks like there’s a bright future for him in either joints (knees and hips) or cardio (stents, pacemakers and defibrillators) should he so choose.

The poorly illustrated point here is, of course, that with humans, as with manufactured products, there are numerous downstream quality issues that never get reported back to the original hospital / manufacturing plant.  Just as you will likely never return to the hospital of your birth for any kind of treatment, but will be treated in a variety of specialized care centers across the country or the globe, your malfunctioning manufactured products are going to be repaired at a host of dealers, retailers and repair shops both in and out of your distribution network.  Meanwhile, back at the ranch, you are typically only going to see a fraction of all customer returns, warranty claims and product problems. This gives a false indication of customer return rates and reasons. Failure rates can differ greatly between manufacturing and customer returns. It would be like a medical / public health system ignoring the last seventy years of a person’s life after they left the care of their pediatrician.

You can see this effect especially with consumer electronics as they get more mobile and are used in environments and for purposes that were never foreseen back in the design lab.  While some types of failures, attributed to things like poor shipping or packaging, might surface quickly and consistently enough for a reliable root cause analysis and fix, other problems with the user interface might only show up after years and years of cycles operating in previously untested environments.

As you begin to tackle this, one data management issue that will become readily apparent is the need for common failure symptom descriptions across all stages of data collection. You won’t be able to diagnose the problem if everyone is describing the same thing in five different ways.
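The harmonization step can be sketched very simply: map the free-text variants that dealers, retailers, and repair shops report onto one canonical failure code before any root-cause analysis runs. The vocabulary and codes below are invented for illustration; a production system would add fuzzy matching and a governed code list.

```python
# Normalizing failure symptom descriptions to canonical codes.
# The mapping and code names here are hypothetical examples.
CANONICAL = {
    "no power": "PWR_FAIL",
    "won't turn on": "PWR_FAIL",
    "dead on arrival": "PWR_FAIL",
    "screen cracked": "DISP_DMG",
    "display broken": "DISP_DMG",
}

def normalize(symptom):
    """Map a free-text symptom to a canonical code, or flag it for review."""
    return CANONICAL.get(symptom.strip().lower(), "UNCLASSIFIED")

reports = ["No Power", "won't turn on", "Display broken", "rattles when shaken"]
print([normalize(r) for r in reports])
```

With every collection point speaking the same code set, three "different" symptoms collapse into one power-failure signal strong enough to support a root-cause analysis, and the UNCLASSIFIED bucket tells you where the shared vocabulary still has gaps.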

Collecting, processing and acting on this downstream data will become easier as the Internet of Things evolves into the Connected Consumer with every product communicating continuously with the mothership throughout its life, but awareness now, along with making the best of the data you currently have or can get to, can have a substantial impact on your total quality program.  Making the best of what’s available would most certainly include social media, sentiment and text analytics, where you can assess what’s being said about the quality of your product behind your back.

While it might be difficult today for a single business to justify providing financial inducements to downstream players to incentivize them to report their findings back to the manufacturer, depending on how the Internet gatekeepers of the future structure themselves, we might see the evolution of syndicated warranty / repair information service providers similar to those that operate on the POS side.  And if it’s not you taking advantage of this information, perhaps it will be one of your competitors.


(My special thanks to Jeff Pink, Director of Operations at ViaSat, whose presentation at the IE Group’s Manufacturing Analytics Summit this past May provided the inspiration for this topic)
