Whitepaper

The Art of Successful Analysis

A successful analysis project is one that directly influences an important decision, satisfies the customer, and makes the analytic team that did the work proud.  Achieving this success in analysis takes a focused effort.  There are specific things that the leader of an analytic team needs to do in order to deliver a quality product that has this impact.  I spent 25 years in this profession in the Pentagon and have now spent over six years at SPA, learning these lessons at the “school of hard knocks” and teaching them to others.  The lessons are the same in both environments, and this article summarizes the ones I have learned.

National security decision-makers use analysis as one of many factors that they balance when making hard decisions on complex issues.  It provides a way, often the only way, of using objective evidence to understand the measurable consequences of the various alternatives available to them.  If analysis is to be useful for this, it has to be completed on a timeline that supports their decision process; it has to be delivered in a concise, to-the-point form (generally a briefing) that fits within the many demands on their time; and it has to be understandable, trustworthy, and objective.  Senior officials have their own judgment and opinions, and plenty of advisors who can add more; they count on analysis to provide accurate, objective, and operationally realistic quantitative evidence to give them insights they could not get by any other means.

There are five steps in the end-to-end process of delivering useful analysis to decision-makers, and they all have to be done right in order for an analytic project to be successful:

  • Defining the problem
  • Attacking the problem
  • Assuring quality
  • Preparing the briefing
  • Delivering the briefing

Defining the Problem

Senior leaders who do not have a background in analysis often do not know how to characterize their problem in terms that define how to approach analyzing it.  Written task statements produced by their staffs and passed down to analytic service providers are often no better.  This is not their fault; our language and techniques are not familiar to them, nor should they be.  Their problem is often complex, with many variables and constraints, and some aspects of it may not be amenable to quantitative techniques.  An analytic project that is designed and executed without a clear understanding of the key problem, and of how analytic techniques will be used to address it in ways useful to the recipient, will not succeed.

It is the job of the leader of an analytic project to make sure that he or she has the problem correctly characterized.  There is no substitute for a direct conversation with the senior leader who has the problem (in the private sector we call this person the “customer”), or at least with an advisor who fully understands that leader’s decision space.  This conversation may be hard to arrange and may not be lengthy if it happens, so it is essential to go into it with specific objectives for what must be elicited in order to correctly focus the analysis.  Foremost among these objectives must be a proposal for how the problem will be stated in terms appropriate for analysis, what key assumptions will govern the work, and what metrics will be used to characterize the relative “goodness” of the analytic outcomes.  It is also important to elicit what constraints, if any, the customer has on the decision space: are there solution options that would be non-starters regardless of analytic outcomes?  And are there stakeholders whose perspectives must be accounted for when shaping and conducting the analysis?

Analysis tasks that are simply handed off to analytic teams without a mechanism for up-front dialog with the customer, followed by regular subsequent interactions, can easily go unintentionally astray.  This is particularly true if the analysis team is not working day-to-day inside the customer’s organization and operating environment, a problem more frequently encountered in the private sector across a contracting “wall” than in government analytic organizations.  If you are not answering the right problem and providing realistic options, then it does not matter how technically excellent your analysis is; it will not be successful.

Attacking the Problem

Once the customer’s problem is clearly defined, it is the analytic project leader’s responsibility to figure out how to produce useful, accurate results within the project’s time and resource constraints.  Time is often the pacing constraint here; perfect analytic results not delivered in time to influence the customer’s decision are a useless academic exercise.  This constraint often limits the choice of analytic techniques, requires the use of simplifying assumptions, or limits the range of options that can be evaluated.  A realistic project schedule that meets the customer’s decision timeline is the first project-management product that the leader has to produce.

It is a rare case when a national security analytic issue is one that has never been previously addressed analytically in some manner by somebody.  As part of the process of gathering data at the beginning of a project, it is helpful to find what has been done before in the area of interest, what techniques were used, and why that previous work was not deemed sufficient for addressing the current problem.  National security analytic work is often not well-archived and accessible across organizations, so this is not always a productive step, but it is one worth trying in order to limit the risk of duplicating previous work or repeating past failures.

Each class of analytic problem has specific techniques and types of models that are appropriate to its characteristics.  It is the job of an analytic project leader to determine what these are and to choose the approach best suited to the problem, then to select the teammates who have the skills to use (or learn) the most appropriate available model or technique.  This should not be a solitary exercise; bounce ideas off colleagues, seek their advice, and use information or relationships gained through MORS (the Military Operations Research Society) or other connections to find approaches used by other organizations.  The customer is not being well-served if the project leader stays in his or her comfort zone and simply uses the most familiar techniques, model, and/or teammates regardless of their suitability to the task.

Whatever analytic technique is selected, the results are generally driven to a significant extent by the inputs: the data on system performance, cost, and threat; the employment concepts and/or tactics of the units or systems involved; and the operating environment, geography, and/or scenario.  Each problem and technique has its own particular set of inputs that will drive the outputs, and these have to be identified so they can be researched by the analytic team with a solid audit trail of authoritative sources credible both to the customer and to key stakeholders.  When using factors based on human judgment, such as relative weightings of the importance of a set of attributes or a concept of employment for a future system, these need to be elicited from individuals who are professionally credible to the customer and stakeholders.  Analytic results are sometimes not popular (!), and it is important to design the project with the understanding that the credibility of the results may be challenged.

Assuring Quality

Quality is the result of well-designed processes and projects, and a sound organizational culture that values critical review throughout a project’s execution.  It cannot simply be inspected in at the end of a project, although time and resources do have to be reserved at the end for a final review.  Assuring quality has a modest short-term cost, but the long-term cost to an analytic organization of failing to get it right is far higher.

There are two dimensions to quality: administrative and intellectual.  The administrative dimension is very straightforward, and there is never an excuse for getting any aspect of it wrong.  It involves such basic questions as:

  • Do the results deliver everything that was committed to in the contract or statement of work, on the agreed timeline and within agreed resource limits?
  • Is the security classification correct and properly marked everywhere?
  • Is the report/briefing free of spelling, grammar, or math errors, and are all the axes and lines on any graphs correctly labeled?
  • Were the model run inputs and calculations done correctly?
  • Are the sources of all data and assumptions documented?

The intellectual dimension of quality requires a more sophisticated type of assessment, and is best assured through a rigorous process of peer review by people with competency in the area who are not directly involved in the day-to-day project work and potentially lost in its details.  This involves addressing deeper questions such as:

  • Does the product identify and address the customer’s fundamental issue(s) in a credible, clear, and non-verbose manner?
  • Are the analytic techniques used appropriate to the issues?
  • Are the assumptions that might affect the conclusions explicitly identified and justified?
  • Are all conclusions directly supported by the analysis provided and are they operationally relevant from the customer’s perspective?
  • Is the tone of the report/briefing dispassionate, logical, and non-subjective?

Preparing the Briefing

The language of national security decision-makers is the PowerPoint briefing.  It is generally all they have time for, and they get a lot of them.  Full-text project reports are important for documenting details for future reference, and somebody may read them someday, but real decisions come from briefings.  For an analytic project to achieve a successful outcome, its briefing has to have a point that directly supports a decision, and it has to get to that point concisely.  It is not necessary in that process to share with the customer everything learned or done en route.

Briefings must be tuned to the seniority of the recipients and the time that they make available to receive them.  The more senior the recipient, the less time they are likely to have.  As a project’s results briefing progresses up a chain of command, it typically becomes shorter: the details on methodology and data that subordinates need in order to support further progression are covered to their satisfaction, then removed or moved to backup slides.  It is important not to overwhelm a briefing recipient at any level with content, or to use every available minute of the briefing time for transmission; leave recipients time to ask questions and discuss.  Generally, a one-hour briefing should be limited to a slide count in the low 30s at most.

Analytic project briefings should start with a statement of the problem and end with a summary of how the analytic work illuminated the consequences of specific options for addressing that problem.  Those options have to follow clearly and logically from the key assumptions and the analytic work presented in between those two points, and they must be objective and technically, operationally, and (if appropriate) programmatically realistic.  The analytic results used to explain the options should rely on graphical presentations to the greatest extent possible, and the words on the slides should be few in number and large in font size.  Slides with more than a dozen lines of text are too wordy.  Decision-makers are regularly buried in verbose, nearly unreadable PowerPoint text and appreciate clarity and conciseness.

Delivering the Briefing

Good briefings on well-done analytic work do not get the project across the goal line to full success unless they are delivered well.  While briefings should be designed so that the slides still convey the key elements of the story even without a human briefer, the briefer should never just read the slides; the recipient can read.  The purpose of the briefer is to identify and explain the key “teaching points” on each slide (particularly the graphics) and to answer the recipient’s questions.  Less is more: elaborating slides with unnecessary additional detail, talking fast to get everything said, pulling up backup slides, and occupying every moment of the allotted time in transmit mode defeat the purpose of the human interaction and annoy the recipient.

A good briefer delivers a project’s results with confidence and clarity.  The briefer does not necessarily have to be the project leader, but it must be someone with a complete understanding of what the project did, how it was done, and how it was run.  They have to be fully confident that, no matter what the recipient asks about the project, they know the correct answer.  They must speak slowly and clearly, never simply reading a script but making regular eye contact with the senior recipient and moving with confidence from behind the podium when necessary to point out key items on slides.  They must consciously restrain themselves from using empty phrases, saying “uh”, or displaying mannerisms that distract the recipients from what is being said by focusing them instead on how it is being said.  This level of personal polish and confidence comes only from peer-reviewed rehearsals.

Summary

Success in an analysis project does not come easily, or solely as the result of technical skill.  It is the result of a team effort that uses a disciplined, coherent five-step process to apply a set of operational, technical, and communications skills to the project. 
