Objectives that drive strategy - A lesson in strategic planning from NASA and the Kennedy Space Center

Discussing strategy, and what we mean by it, can be a confusing and sometimes unproductive undertaking. Considering its different uses as a noun and an adjective, defining our terms is a good place to start:

  • Strategic thinking: Characterizing the environment, identifying and assessing risks, and developing and evaluating options.
  • Strategic goals: High level goals immediately derived from the organization’s vision and mission.
  • Strategy: The general approach to how a specific goal / objective is achieved, with consideration given to core values, vision and mission, boundaries and limits to action, and options.
  • Tactics: The specific actions taken to achieve the objective.
  • Strategic plan: The aggregate of the organization’s vision, mission, goals, objectives and strategies.
  • Vision: A roadmap to a projected future, what the organization wants to become.
  • Mission: What the organization does, its purpose, and its core competencies.

With this as a starting point, I’d like to relate the key points of a remarkable presentation that greatly clarified and simplified our conception of the strategic planning process. The venue was the Strategic Planning Summit, managed by the IE Group, in New York last month. The speaker was Suzy Cunningham, Strategy and Integration Manager for the Kennedy Space Center (KSC).


What was particularly noteworthy was the cascade of OBJECTIVES from NASA to KSC that drives KSC’s strategy. NASA starts with just three high-level strategic goals, which might be summarized as: 1) Exploration, 2) Earth science, and 3) Serving the American public. Under these three strategic goals are three sets of objectives, 15 in total, of which only three are directly applicable to KSC’s mission: 1.1) Solar system exploration, 1.2) ISS management, and 1.3) Facilitating commercial space capabilities.

NASA manages several facilities other than KSC, such as the Goddard Space Flight Center, the Jet Propulsion Laboratory, and the Ames and Langley Research Centers, all with their own specific missions and set of core competencies.  KSC in turn has its own vision and mission, distinct from that of NASA proper and from NASA’s other facilities. Compare visions – NASA: “Reach for new heights and reveal the unknown for the benefit of humankind”, versus KSC: “The world’s preeminent launch complex for government and commercial space access”.

However, KSC inherits its priority objectives directly from NASA. From those inherited priority objectives KSC next develops the associated milestones and metrics, and it’s not until this point that “strategies” come into play – strategies for achieving these objectives, typically via supporting projects, such as the redesigned launch complex for the Space Launch System (SLS).

Generalizing this for a typical organization, the process outline would be:

  • Corporate vision and mission
  • Corporate strategic goals
  • Corporate objectives
  • Corporate metrics and strategies for managing its business units to achieve those corporate goals
    • Business Unit/Division vision and mission
    • Business Unit/Division objectives (subset of corporate)
    • Business Unit/Division metrics, milestones, projects and strategies in support of priority objectives
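To make the cascade concrete, here is a minimal sketch of how objectives might flow from a corporation to a business unit as simple nested data. All names, visions and objective numbers below are invented for illustration; they are not NASA’s or anyone else’s actual objectives.

```python
# A minimal sketch of an objectives cascade as nested data.
# All names and objective IDs below are illustrative placeholders.
corporate = {
    "vision": "Preeminent widgets, worldwide",
    "strategic_goals": {
        "G1": "Grow core markets",
        "G2": "Operational excellence",
    },
    "objectives": {
        "1.1": {"goal": "G1", "text": "Enter two new regions"},
        "2.1": {"goal": "G2", "text": "Cut unit cost 10%"},
    },
}

# A business unit inherits a subset of corporate objectives verbatim,
# then attaches its own metrics, milestones and strategies beneath them.
business_unit = {
    "vision": "The region's low-cost producer",
    "objectives": {k: corporate["objectives"][k] for k in ("2.1",)},
    "strategies": {"2.1": ["Automate line 3", "Renegotiate supplier contracts"]},
}

# The inherited objective is identical at both levels -- it cascades
# lock, stock and barrel; only the strategies beneath it vary.
assert business_unit["objectives"]["2.1"] == corporate["objectives"]["2.1"]
```

The design point is that the business unit never rewrites an objective, only the strategies and metrics underneath it.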

This objectives-driven approach to strategic planning has several things going for it:

  • Clarity and simplicity: The focus is on the Objectives; there’s no muddling with the vagaries of strategy, and no wading through multiple strategy levels.
  • Consistency: The objectives cascade lock, stock and barrel. Methods, approaches, strategies, and competencies can all vary as the environment dictates, but the objectives remain the objectives.
  • Communication: One of the biggest complaints you hear from leadership is that “80% (or whatever your number is) of our people cannot articulate our strategy”. And how could they? What do you mean by “strategy” anyhow? Your customer engagement strategy, your global sourcing strategy, your talent management strategy, your product life cycle strategy, your – well, you get the point. What you really want them to understand unambiguously are your values, vision, mission and objectives (or those objectives that pertain to them). Yes, management needs to understand strategy, so that they can coordinate across functions and alter tactics in conformance with mission and objectives, but you can greatly simplify what your individual contributors need to understand by focusing on objectives.
  • Metrics: If you’ve got too many metrics, one reason might be that they are not all focused on objectives.

It is this last topic, metrics, that intrigues me as much as the simplified strategic planning process. I have previously delved into this at least twice, once with “Metrics for the Subconscious Organization”, and again with “Metrics – Too many different ways of keeping score”, with my major bone of contention being the disconnect between the metrics management uses to steer the ship and the metrics the rest of the organization needs in order to get things done and improve their efficiency and effectiveness. Too often what we cascade are operationally irrelevant metrics like ROA, DSO or inventory turns, important to the ship’s captain and to certain, specific functions, but not generally applicable across the board.

Related to this is the distinction we make between business intelligence (BI) and operational intelligence (OI), as if the two are unrelated. Today, for practical reasons, they might very well be separate domains, but there is no need for them to remain so.  With unwavering objectives as the common denominator (versus strategies that will vary by function), BI and OI can be seen as a continuum, with a single, integrated “intelligence” platform, including both BI and OI, communicating the relevant metrics across the organization, assuring consistency of message and alignment of activity.


Automating bad decisions and the Ladder of Inference

There’s more than one way to make a poor decision.  Bad data, inappropriate assumptions and flawed logic are just three of the missteps you can take on your climb up the Ladder of Inference, a concept first developed by Chris Argyris, professor of business at Harvard, in 1974, and later popularized by Peter Senge in his 1990 book, “The Fifth Discipline”.  If we’re not mindful of these mental pitfalls, we’re likely to use our automated business processes to simply make bad decisions faster.

The Ladder of Inference, an oldie-but-goody, is likely familiar to you, although you may not have run across it in some time.  A quick summary of the ladder’s seven rungs would be (starting at the bottom):

  1. Observation: The world of observable data and experience
  2. Filtering: The selection of a subset of this data for further processing
  3. Meaning: Assigning meaning / interpretation to the data, through semantics or culture
  4. Assumptions: Associated context, often from your Framework (below), of what you already know and the new meanings you’ve assigned
  5. Conclusions: Drawn based on the assumptions and meaning applied to the filtered data
  6. Framework: You alter, adjust or adapt your belief system / knowledge framework based on your conclusions
  7. Action: You take action based on the meaning of the data and your updated belief system
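For the programmatically inclined, the seven rungs can be caricatured as a toy data pipeline. Everything below – the data, the thresholds, the “large means risky” belief – is invented purely to illustrate how each rung narrows and colors what came before it.

```python
# A toy rendering of the Ladder of Inference as a data pipeline.
# Each rung is a function; all data and interpretations are illustrative.
def observe():            # Rung 1: the pool of observable data
    return [3, 7, 41, 2, 98]

def filter_data(data):    # Rung 2: we attend to only a subset
    return [x for x in data if x > 5]

def add_meaning(data):    # Rung 3: interpret via semantics/culture
    return {x: ("large" if x > 40 else "moderate") for x in data}

def assume(meanings, framework):  # Rung 4: layer prior beliefs onto meanings
    return {x: (m, framework.get(m)) for x, m in meanings.items()}

def conclude(assumptions):        # Rung 5: draw a conclusion
    return any(ctx == "risky" for _, ctx in assumptions.values())

framework = {"large": "risky"}    # Rung 6: the belief system in play
risky = conclude(assume(add_meaning(filter_data(observe())), framework))
action = "escalate" if risky else "proceed"  # Rung 7: act on it
```

Note how a bad outcome can enter at any rung: skewed observation, over-aggressive filtering, a loaded interpretation, or a stale belief framework all change the action at the top.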




Tell me something I don’t know

What is information? The lack of a working definition plagued both science and the emerging telecommunications industry until the arrival of Claude Shannon and his famous 1948 paper, “A Mathematical Theory of Communication”, based on his cryptography work during WWII while at Bell Labs.  The landmark article is considered the founding work of the field of information theory, and would augment Shannon’s earlier groundbreaking research at MIT into the design of digital circuits and digital computers.

Shannon interpreted his formal definition, H = −∑ pᵢ log(pᵢ), in a number of counterintuitive ways:

  • As a measure of entropy (the formula exactly mirrors Boltzmann’s definition of thermodynamic entropy)
  • As the resolution of uncertainty
  • As a measure of surprise

While that first definition has captured the attention of the likes of physicist Stephen Hawking and has implications for cosmology, black holes and a holographic universe, it’s the latter two that are of interest to us for the moment.
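Since it’s the uncertainty and surprise interpretations we care about here, a quick computational sketch of Shannon’s formula may help – entropy in bits, plus the “surprisal” of a single outcome:

```python
import math

def entropy(probs):
    """Shannon entropy H = -sum(p_i * log2(p_i)), in bits."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

def surprisal(p):
    """Information in one outcome: -log2(p). Rarer events carry more of it."""
    return -math.log2(p)

# A fair coin resolves exactly one bit of uncertainty per flip.
assert abs(entropy([0.5, 0.5]) - 1.0) < 1e-9

# A heavily biased coin is less informative: its outcome is rarely a surprise.
assert entropy([0.9, 0.1]) < 1.0

# "Tell me something I don't know": the 1-in-100 event is far more
# surprising than the 99-in-100 event.
assert surprisal(0.01) > surprisal(0.99)
```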


How steep is your learning curve? On Analytics and Mentors ...

Having a mentor is the number one factor in increasing the steepness of your personal learning curve. So says my oldest, Garik, a Park Scholar at North Carolina State University (class of 2012), during a discussion he recently had with the incoming Park Scholar class of 2019.

To accept the value of mentoring first requires one to understand the centrality and importance of the learning curve. Garik asked the students to imagine plotting the characteristics of two people on a simple X-Y axis. Person A comes to the game with only a moderate amount of resources at their disposal, but importantly, also a relatively steep learning curve, such that a plot of their capabilities crosses the Y-axis at an intercept of 1 and rises with a slope of one-half. Person B, in contrast, has much greater resources at their current disposal: time, talent, smarts, money, education, experience, etc., but for whatever reason has a shallower learning curve, such that their plot intercepts higher up the Y-axis at 2 but rises with a shallower slope of only one-quarter.

Unless you think you’re going to die before the two lines cross, you’d of course be better off as Person A. Based on his domestic and international experiences as an undergrad and grad student, as a researcher and an employee, and as part of two start-ups (so far), Garik’s conclusion is that, while there are several factors impacting the steepness of that learning curve, none is more important than that of having chosen good mentors.
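The arithmetic behind that crossing point is simple enough to spell out. With the intercepts and slopes from the example above, Person A overtakes Person B where 1 + x/2 = 2 + x/4:

```python
# Where does Person A overtake Person B? Solve 1 + x/2 == 2 + x/4.
# Intercepts and slopes are taken from the example above.
a_intercept, a_slope = 1.0, 0.5   # fewer resources, steeper learning curve
b_intercept, b_slope = 2.0, 0.25  # more resources, shallower curve

crossover = (b_intercept - a_intercept) / (a_slope - b_slope)
capability_at_cross = a_intercept + a_slope * crossover

print(crossover, capability_at_cross)  # 4.0 3.0
```

So A pulls ahead after four units of time, at a capability level of 3 – and the gap only widens from there.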

Businesses can be said to have learning curves as well, and my discussion with my son got me to thinking about what factors would have the greatest bearing on organizational learning curve steepness.


Your information supply chain

Viewing data as an asset implies there are benefits to taking a supply chain approach to data management. It’s not just inventory that needs to be at the right place at the right time in the right format and quantity. An end-to-end information supply chain approach, from sourcing/acquisition through transformation and storage, to end users, analytics and insights, allows you to keep your focus on the business problems to be solved, and avoid having that ‘big honking data cube’ become a bottleneck instead of an enabler.

Here are a few best practice supply chain mindsets to consider applying to data management:

  • Inventory turns / Data turns: How fast can you acquire and get the needed data into the hands of the decision makers? The whole point of analytics, BI and corporate performance management is to make better decisions faster, and how you structure your information supply chain will have a big impact on that “faster” part. This could mean anything from self-service BI to data visualization to better and faster data prep and data quality procedures. Critical decision support data delivered too long after the problem surfaces is like not having the inventory you need in the stores until the holiday shopping season is half over.
  • Analytics / Decisions at the edge: Taking “faster” to the extreme can sometimes mean taking action on the data BEFORE you store it, acting on streaming data via event stream processing, which has applications in financial services (e.g. credit scoring), cybersecurity and fraud (e.g. detecting unusual network connection patterns), quality (e.g. process and product quality control), and asset maintenance (e.g. sensor data from critical equipment). Whether it’s humans or machines making those decisions at the edge, real-time and near-real-time decision support capabilities are becoming competitive differentiators across a number of industries.
  • Visibility / Control Towers: Reacting quickly to fluctuating demand requires visibility into your entire physical supply chain: what’s in which store or warehouse, which production lines are down for maintenance, and which suppliers have additional capacity at the ready. In the same way, better business decisions mean having immediate access to ALL the relevant data – a worthy integration challenge for both inventory and data, with a commensurately worthy business outcome.
  • ABC inventory / data classification: Not all inventory is equally important, a maxim understood by every supply chain professional. “A” inventory is high in value but not necessarily volume, warranting tighter controls and monitoring than low-value, high-volume “C” material. Your data can likely be similarly categorized, with revenue, cost, employee and customer data no doubt in the “A” category, and perhaps production, quality and transaction details falling lower in priority. Such a classification approach, based on which data is used most often in the most critical decision support processes, can help you prioritize your data quality and myriad other IT activities and investments.
  • Data Management for Analytics: Designing a physical warehouse that maximizes the efficiency of receiving and storage is not necessarily the best overall approach to supply chain management, where access to raw materials and WIP at the right place and time on the factory floor is more critical to meeting cost, revenue, customer satisfaction and business goals than simply minimizing storage and handling costs. Likewise, a data warehouse built to minimize data storage cost may make it difficult for business and decision support users to access what they need, in the format they need for rapid analysis and insight. This gets back to point number one above about data turns: it’s not about how fast and cheaply you can get data into the warehouse, it’s about how quickly you can turn that data into valuable insights across the entire information supply chain.
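As a rough illustration of the ABC idea applied to data assets, here is a sketch that tiers data sets by how often they are used and how critical the decisions they support are. The asset names, scoring scheme and thresholds are all invented for illustration – there is no standard formula here:

```python
# Illustrative ABC classification of data assets.
# Scores and thresholds are assumptions, not a standard.
data_assets = {
    "revenue":      {"usage": 95, "criticality": 5},
    "customer":     {"usage": 80, "criticality": 5},
    "quality":      {"usage": 40, "criticality": 3},
    "txn_details":  {"usage": 60, "criticality": 2},
    "facility_log": {"usage": 10, "criticality": 1},
}

def classify(asset):
    """Tier an asset by usage frequency times decision criticality."""
    score = asset["usage"] * asset["criticality"]
    if score >= 300:
        return "A"  # tightest quality controls, first in line for investment
    if score >= 100:
        return "B"
    return "C"      # lighter-touch governance

tiers = {name: classify(asset) for name, asset in data_assets.items()}
```

However crude the scoring, the output gives you a defensible starting order for data quality and stewardship investments.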



Painting with big data analytics

Big data, by which most people mean Big Volume, doesn’t get you very far just by itself, but with the addition of Big Variety and analytics, now you’re talking. In fact, most organizations that are making headway in capitalizing on their data assets now refer to the process as "big data analytics" – a combination of data storage and management, data integration, and analytic tools and techniques.

I have previously made the case that the value in big data stems primarily from its Big Variety, and now I want to put that variety into its proper context of related data volumes and analytics for insights.

The potential for getting this combination of Volume + Variety + Analytics right can perhaps be best illustrated by the findings of a U.S. Department of Defense review committee on the September 11, 2001 attacks. What the committee found was that essentially all 19 of the hijackers could have been linked to each other and to the pending attacks via just seven properly targeted mouse clicks through existing public/government databases.


Marketing analytics lessons from the KGB

“Half the money I spend on advertising is wasted, the trouble is I don't know which half.” ~ John Wanamaker, U.S. department store magnate and merchandising / advertising pioneer.

I’m not going to claim that I can pinpoint exactly which half of your marketing dollars are wasted in the space of this post, but I am going to illustrate that basic analytic techniques are available that can considerably narrow down the range of uncertainty and provide actionable insights for your marketing efforts.

Our story begins with a fascinating article that surfaced last week by Jonathan Haslam, professor of the history of international relations at Cambridge University, on how, during the Cold War, the KGB was able to so easily and readily identify undercover CIA agents.

The Soviet efforts were so successful that the head of the KGB counterintelligence group, Yuri Totrov, was known within CIA circles as the “Shadow Director of Personnel” on account of how much he seemed to know about the foreign postings of CIA agents. How he was able to unmask and compromise entire intelligence networks was the subject of much handwringing, debate and speculation, the leading candidate explanation being a highly placed mole within the Agency. What other explanation could there possibly be, right?

Wrong.


What is it like to be a customer?

To paraphrase Thomas Nagel’s famous 1974 paper on consciousness, “What is it like to be a bat?”, I want to instead ask, “What is it like to be a customer?” Nagel’s argument was aimed at refuting reductionism – the philosophical position that a complex system is nothing more than the sum of its parts. Such a materialist approach omits the essential components of consciousness (“emergent properties”, we would say today): an actor with motives and feelings and a personality. We typically approach the customer in the same fashion – our hypothetical target consumer is nothing more than the sum of our data and demographics combined with our own products and services.

I want to digress for a moment to illustrate and highlight this important point about “being like something”. What is it like to be you?


Big Model: The necessary complement to big data

With all the hype over big data we often overlook the importance of modeling as its necessary counterpart. There are two independent limiting factors when it comes to decision support: the quality of the data, and the quality of the model. Most of the big data hype assumes that the data is always the limiting factor, and while that may be true for a majority of projects, I’d venture that bad or inadequate models share more of the blame than we care to admit.

It’s a balancing act between the quantity and quality of our data and the quality and fit-for-purposeness of our models, a relationship that can frequently get significantly out of balance. Or, more likely, complete mismatches between data and modeling crop up all over our organization: in one instance remarkable models are starved for good data, while elsewhere volumes of sensor or customer data sit idle with no established approach to exploration, analysis and action.

This imperative to balance the data with the model reminds me of an espionage story from WWII.


Visualization – Worth a thousand words

Why visualization? Several reasons, actually, the most compelling being that sometimes visualization literally solves the problem for you.

I remember an exercise in eighth grade English class where we were asked to describe, in words only, an object set in front of us with sufficient clarity such that our classmates, sequestered outside the room, could accurately draw the object from our written description. The object was a bowtie shaped set-top UHF antenna.

The exercise was a disaster. Which of course was the objective, at least from the teacher's perspective, who was attempting to demonstrate how difficult clear, comprehensible writing can be. From our perspective, however, all we could focus on was what idiots the recipients of our written descriptions must have been. “Why did you draw the loops at right angles to each other when I clearly indicated they were in the same plane as the base?” Looking back, I realize now that, had I been clever enough, I should have “cheated” and used a typographical approach to illustrate the object diagrammatically with my otherwise 'descriptive' words.

Live and learn. But what did I learn?
