3  Audiences and the utility of decisions

Having learned to specify our intent with precision—whether to AI systems or to human collaborators—we now turn to a prior question: for whom are we specifying this work? The most beautifully crafted specification, the most elegantly generated visualization, the most sophisticated analysis all fail if they do not reach and move their intended audience.

Sometimes in analytics we write primarily for ourselves, as Joan Didion captured:

I write entirely to find out what I’m thinking, what I’m looking at, what I see, and what it means (Didion 1976).

These introspective explorations serve our understanding but rarely transfer directly to others. The gap between what satisfies our own curiosity and what serves decision-makers is where communication lives or dies. To bridge this gap, we must understand the minds we seek to reach.

3.1 Understanding executive audiences

Different executives possess different knowledge, responsibilities, and concerns. Their vocabularies overlap but are not identical. The analytics executive speaks of models and uncertainty. The marketing executive speaks of brand value and cultural resonance. The chief executive speaks of strategy and shareholder returns. Effective communication requires us to map our analysis onto their conceptual terrain.

3.1.1 Analytics executives

Consider the perspective of Citi Bike’s Chief Analytics Officer. She knows what the public experiences with the bike sharing program—experiences documented in news reports like the West Side Rag quoting spokeswoman Dani Simmons: “Rebalancing is one of the biggest challenges of any bike share system, especially in … New York where residents don’t all work a traditional 9-5 schedule, and … people work in a variety of other neighborhoods” (Friedman 2017).

As Chief Analytics Officer, her responsibilities include analyzing and overseeing analyses that inform decisions for solving organizational problems (Zetlin 2017)—like rebalancing. She possesses technical fluency. She understands that “rebalancing” means taking actions ensuring customers can both rent bikes and park them at docking stations.

Now imagine she receives an email beginning:

Citi Bike, a bike sharing program, has struggled to rebalance its bikes. By rebalance, I mean taking actions that ensure customers may both rent bikes and park them at the bike sharing program’s docking stations….

Is she motivated to continue reading? Does she know why she should invest her attention? This opening insults her knowledge—it defines terms she already understands. It wastes her time and signals the sender’s failure to understand their audience.

The lesson: analytics executives require us to match their technical sophistication while respecting their time. They need sufficient detail to verify our methods but not so much that they must wade through explanations of concepts they mastered years ago.

3.1.2 Marketing executives

The Chief Marketing Officer shares some responsibilities with analytics colleagues but owns others exclusively. David Carr, Director of Marketing Strategy at Digitas, describes three value types marketing drives (Carr 2019):

  1. Business value: long and near-term growth, greater efficiency, enhanced productivity
  2. Consumer value: attitudes and behaviors affecting brand choice, frequency, and loyalty
  3. Cultural value: shared beliefs creating favorable environments for operation and influence

These values exist in concentric relationships—cultural value creates the environment where consumer value can flourish, which generates business value. Marketing executives think in these terms. They want to know not merely what the data show but what the data mean for brand perception, customer relationships, and market positioning.

Carr emphasizes that brand strategy must align with business strategy and corporate culture (Carr 2016): “There is nothing more wasteful and damaging than developing a brand identity or vision based on strategic imperative that will not get funded. An empty promise is worse than no promise.”

The lesson: marketing executives require us to translate analytical findings into brand and customer implications. They need to see how our analysis supports or challenges strategic positioning.

3.1.3 Chief executives

Analytics and marketing executives report, directly or indirectly, to the CEO, who bears ultimate responsibility for driving business performance. CEOs are more likely to be generalists than specialists, though more than one quarter of Fortune 500 CEOs hold MBAs (Bertrand 2009). Their education—managerial statistics, business analytics, strategy, marketing, finance, economics, operations—gives them vocabulary intersecting with both analytics and marketing.

But their responsibilities are broader. Bertrand notes that “current-day CEOs may require a broader set of skills as they directly interact with a larger set of employees within their organization.” Their focus remains fixed on creating business value. Communications with CEOs must begin and remain focused on how our analysis helps them fulfill these responsibilities.

The continuum of knowledge matters here. Everyone is a specialist on some subjects and a non-specialist on others. Even among specialists, gradations exist. Specialists want details—technical aspects they can use in their own work and require for conviction. Non-specialists need us to bridge gaps between what they know and what our document discusses: more background to understand need and importance, more interpretation to grasp relevance and implications. And we must remember that audiences are plural¹—even memos addressed to individuals may be passed to team members, creating secondary audiences we must also serve.

Exercise 3.1 (Researching executive backgrounds) Select one executive from each category below. Research their professional backgrounds using LinkedIn, company websites, press releases, or news articles.

Analytics executive: Chief Data Officer, Chief Analytics Officer, or VP of Data Science

Marketing executive: Chief Marketing Officer, VP of Marketing, or equivalent

Chief executive: CEO, President, or Managing Director

For each executive, identify:

  • Educational background (degrees, fields of study)
  • Career trajectory (previous roles, industries, progression path)
  • Technical vs. business orientation (do they have hands-on technical experience or pure business backgrounds?)
  • Scope of responsibilities (team size, budget authority, decision-making autonomy)
  • Public communication style (formal presentations, written reports, media interviews)

Analysis questions:

  • How would you adjust a data analysis memo for each executive, given what you learned about their backgrounds?
  • Which executive would likely ask the most technical questions? Which would focus on strategic impact?
  • What terminology would you use or avoid with each?
  • How does their career path suggest what they value most in data communications?

Organizational context reflection:

For at least one of your selected executives, consider how their responsibilities might differ between:

  • A large established organization (thousands of employees, multiple departments, formal hierarchies)
  • A startup or small company (lean teams, flat structure, resource constraints)

How would an analytics executive’s day-to-day work change? What would a CEO prioritize in each context? How might a marketing executive’s approach to data differ when working with limited resources versus extensive market research departments?

Write a one-paragraph summary for each executive type explaining how their background and organizational context shape communication needs.

3.2 The communication gap

3.2.1 Information and interpretation gaps

Even with audience knowledge, communication challenges persist. We might consider Didion’s (1976) insight again: we generally revise our written words and refine our thoughts together; improvements in thinking and writing reinforce each other (Schimel 2012). Clear writing signals clear thinking. We should clarify our project in writing, then iterate through data collection, technical work, and renewed clarification until we converge on answers supporting actions and goals.

More often overlooked is communicating effectively to others. Analytics projects require diverse skills: project management, data wrangling, analysis, subject expertise, design, storytelling (Berinato 2019). The team must first ask smart questions, wrangle relevant data, and uncover insights. Second—critically—they must communicate what those insights mean for the business.

An interpretation gap frequently exists between data scientists and executive decision-makers they support (Maynard-Atem and Ludford 2020; Brady, Forde, and Chadwick 2017). Brady and colleagues argue that data translators should bridge this gap, address data hubris and decision-making biases, and find linguistic common ground. They suggest teaching quantitative skills to subject-matter experts because “it is easier to teach quantitative theory than practical, business experience.”

Before accepting this argument, consider the perspective. Both sources write for business executives: Harvard Business Review readers “have power, influence, and potential… senior business strategists who have achieved success and continue to strive for more” (“HBR Advertising and Sales,” n.d.); MIT Sloan Management Review reaches an audience where “37% work in top management, while 72% confirm that MIT SMR generates conversation with friends or colleagues” (“Print Advertising Opportunities” 2020). The authors themselves hold senior management positions. Might their conclusion—that business experts should learn data science rather than data scientists learning business—reflect their own backgrounds and audience?

Perhaps the “data translator” need not be an individual but a shared responsibility. Berinato argues data science requires teams. The translation function might distribute across team members, each contributing their expertise to bridge different aspects of the gap.

3.2.2 Bridging the gap

Bridging the gap requires developing a common language. Senior management does not share analysts’ vocabulary. Decision-makers seek clear ways to receive complex insights. Plain language aided by visuals allows easier absorption of meaning.

Effective translation also requires beginning with questions rather than assertions, using analogies and anecdotes that resonate with decision-makers, speaking truth while remaining curious, crafting accessible questions and answers, maintaining high standards, and being self-directed. These habits—questioning, analogizing, truth-telling—transcend individual personalities and can be cultivated across team members regardless of whether they began as business experts or data specialists.

3.2.3 Multiple or mixed audiences

Frequently we encounter mixed audiences—readers with different proximity to subject matter and context. Primary readers are close to the situation; secondary readers are more distant. The challenge: provide secondary readers information we assume primary readers know while keeping primary readers interested.

The conceptual solution: ensure each sentence makes an interesting statement—new to all readers, even if the novelty differs. Doumont (2009) illustrates the principle. Compare:

We worked with IR.

Some readers may not recognize “IR.” One might define it:

We worked with IR. IR stands for Information Resources and is a new department.

But this bores readers who already know. Better:

We worked with the recently launched Information Resources (IR) department.

The “recently launched” provides novelty for specialists while the definition serves non-specialists. Every sentence must earn its place by offering something to everyone.

Exercise 3.2 (Writing for mixed audiences) You have analyzed Citi Bike ridership patterns and discovered that weather affects station usage differently across boroughs. You must write a one-paragraph summary for two audiences simultaneously:

  • Primary audience: The Chief Analytics Officer (technical, already knows the rebalancing challenge)
  • Secondary audience: The CEO (business-focused, may not know operational details)

First, write a version that would bore the CAO: one that defines terms she already knows in order to explain everything for the CEO.

Then, following Doumont’s (2009) principle, revise it so every sentence offers something new to both readers. Ensure specialists get novel insights while non-specialists get necessary context—without either group feeling patronized or confused.

Compare your two versions: What specific changes allowed you to serve both audiences?

3.3 The utility of decisions

For data analysis to grab an audience’s attention, it must answer their question: “so what?” Now that they know what you’ve explained, what should come of it? What does it change?

This question reveals a fundamental truth: analysis exists to inform decisions. Without decisions, analysis is merely intellectual exercise. The connection between analysis and decision is utility—the value of information for making better choices.

3.3.1 The logic of decision-making

Consider the structure of a decision. We face uncertainty about future outcomes. We can choose among various actions. Each action leads to possible outcomes with different probabilities. Each outcome carries different value—positive or negative, large or small.

Rational decision-making combines these elements: probability distributions over possible outcomes and the utility of those outcomes (Parmigiani 2001; Gelman et al. 2013, chap. 9). In simple terms, we weight each possible outcome by its value and probability, then choose the action maximizing expected benefit (or minimizing expected loss).

More formally, optimal decisions choose actions maximizing expected utility or, equivalently, minimizing expected loss, where the expectation integrates over the posterior distribution:

\[ \min_a\left\{ \bar{L}(a) = \textrm{E}[L(a,\theta)]= \int{L(a,\theta)\cdot p(\theta \mid D)\;d\theta} \right\} \]

where \(a\) are actions, \(\theta\) are unobserved variables or parameters, and \(D\) are data. The von Neumann-Morgenstern framework established this foundation for rational decision-making (Neumann and Morgenstern 2004), and Berger provides the classic statistical treatment connecting decision theory with Bayesian analysis (Berger 1985).

Model choice itself is a decision using a zero-one loss function: loss is zero when we choose the correct model, one otherwise. But the framework extends broadly. A business might treat profit as utility (the negative of loss), choosing actions that maximize expected returns. A public health agency might use lives saved. An environmental regulator might use ecosystem services preserved.
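To make the expected-loss calculation concrete, here is a minimal Python sketch, not drawn from the text. It assumes hypothetical posterior draws of station demand and an illustrative loss that combines a per-bike stocking cost with a penalty for unmet demand, approximates the integral with a Monte Carlo average over the draws, and picks the action with the lowest expected loss.

    import numpy as np

    rng = np.random.default_rng(42)

    # Hypothetical posterior draws of morning demand (rentals) at one station,
    # standing in for p(theta | D); in practice these come from a fitted model.
    demand_draws = rng.poisson(lam=30, size=10_000)

    # Candidate actions: how many bikes to pre-position at the station.
    actions = [20, 30, 40, 50]

    # Illustrative loss: a per-bike stocking cost plus a penalty for each
    # customer who finds the station empty (both numbers are invented).
    STOCKING_COST = 1.0
    SHORTFALL_PENALTY = 5.0

    def loss(a, theta):
        """L(a, theta): stocking cost plus penalty for unmet demand."""
        shortfall = np.maximum(theta - a, 0)
        return STOCKING_COST * a + SHORTFALL_PENALTY * shortfall

    # Expected loss for each action, approximating the integral over the
    # posterior with an average over the posterior draws.
    expected_loss = {a: loss(a, demand_draws).mean() for a in actions}
    best_action = min(expected_loss, key=expected_loss.get)

    for a, el in expected_loss.items():
        print(f"pre-position {a} bikes -> expected loss {el:.1f}")
    print("chosen action:", best_action)

Whatever the draws or the loss function, the same three steps apply: enumerate the actions, average the loss over the posterior, and choose the minimizer.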

3.3.2 Why utility matters for communication

The utility framework transforms how we communicate analysis. Without it, we present findings: “Stations near subway entrances experience 40% higher morning ridership.” The audience nods and asks, “So what?”

With utility thinking, we present decisions: “Pre-positioning bikes near subway entrances during morning hours increases expected customer satisfaction by reducing empty-station encounters, outweighing the additional rebalancing costs.” Now the audience sees the action implied, the trade-offs weighed, the value calculated.

The utility framework forces us to specify:

  • Actions available: What can the decision-maker actually do? Pre-position bikes? Adjust pricing? Expand stations?
  • Outcomes contingent on actions: What happens if we take each action? What is uncertain?
  • Probabilities of outcomes: How likely is each result? What does our data tell us?
  • Values of outcomes: How much does each result matter to the decision-maker? Profit? Customer satisfaction? Public health?
  • Decision criteria: Are we maximizing expected value? Minimizing worst-case loss? Satisficing above a threshold?

Each element requires different data, different analysis, different communication. Knowing which elements our audience needs—and which they will fill in themselves—shapes what we must specify in our communication.
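As one way to see how these elements fit together, the sketch below uses hypothetical actions, scenario probabilities, and values, all invented for illustration. It shows how the same specification supports the different decision criteria listed above: maximizing expected value, protecting against the worst case, or satisficing above a threshold.

    import numpy as np

    # Illustrative specification of a decision; every number here is invented.
    # Outcomes of three candidate actions under three weather scenarios.
    scenarios = ["cold", "mild", "warm"]
    probabilities = np.array([0.2, 0.5, 0.3])   # probabilities of outcomes

    # Values (say, net benefit in thousands of dollars) of each action per scenario.
    values = {
        "expand stations":    np.array([-40.0, 10.0, 60.0]),
        "pre-position bikes": np.array([  5.0, 15.0, 25.0]),
        "do nothing":         np.array([  0.0,  0.0,  0.0]),
    }

    threshold = 10.0  # satisficing threshold on expected value

    for action, v in values.items():
        expected = float(probabilities @ v)   # criterion 1: maximize expected value
        worst = float(v.min())                # criterion 2: protect the worst case
        satisfices = expected >= threshold    # criterion 3: satisfice above a threshold
        print(f"{action:18s} expected={expected:6.1f} "
              f"worst-case={worst:6.1f} satisfices={satisfices}")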

3.3.3 The analyst’s advantage

Understanding utility gives analysts leverage. When we frame findings as decisions—when we calculate expected values and trade-offs—we elevate our role from “data provider” to “decision support.” We speak the language of executives not by abandoning rigor but by translating rigor into action.

Consider the Citi Bike example. An analyst might report: “Weather affects ridership.” A decision-support analyst reports: “For every 10-degree temperature drop below 50°F, we expect 15% ridership decline at outer-borough stations but only 5% decline at Manhattan stations. Pre-positioning fewer bikes at outer stations during cold weather saves expected rebalancing costs of $X while maintaining 95% customer satisfaction.”

The second communication enables decision-making. It specifies actions, outcomes, probabilities, and values. It invites the executive to verify assumptions, adjust parameters, or consider alternative actions. It treats the executive as a decision-maker rather than a passive consumer of facts.
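For illustration, here is the kind of arithmetic that sits behind such a statement. The 15% and 5% decline rates come from the example above; the baseline ridership figures, the forecast temperature, and the compounding reading of “15% per 10°F” are assumptions made only for this sketch.

    # Arithmetic behind decline estimates like those above; all inputs are illustrative.
    def expected_rides(baseline, temp_f, decline_per_10f):
        """Scale baseline ridership down for each 10°F below 50°F (fractional drops allowed)."""
        drops = max(0.0, (50.0 - temp_f) / 10.0)
        return baseline * (1.0 - decline_per_10f) ** drops

    forecast_temp = 30.0         # hypothetical forecast, 20°F below the 50°F reference
    outer_baseline = 1_000       # hypothetical daily rides, outer-borough stations
    manhattan_baseline = 4_000   # hypothetical daily rides, Manhattan stations

    print("outer boroughs:", round(expected_rides(outer_baseline, forecast_temp, 0.15)))
    print("Manhattan:     ", round(expected_rides(manhattan_baseline, forecast_temp, 0.05)))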

3.3.4 Practical application

Applying utility thinking does not require formally evaluating the expected-loss integral. It requires asking the right questions:

  • What decision does this analysis inform?
  • What are the available actions?
  • What outcomes might follow from each action?
  • What do we know about probabilities and values?
  • What remains uncertain, and how does that uncertainty affect the decision?

These questions guide both our analysis and our communication. They ensure we do not merely describe data but illuminate decisions. They bridge the gap between analysis and action, between the analyst’s insight and the executive’s authority.

The utility framework also clarifies when analysis should stop. Perfect analysis—complete certainty about all outcomes and values—is impossible and often unnecessary. We need sufficient analysis to distinguish clearly superior actions from inferior ones. Once the decision is clear, further refinement yields diminishing returns. The executive needs “good enough” analysis delivered on time, not perfect analysis delivered late.

This principle guides communication scope. We specify enough to enable confident decisions, not so much that the decision-maker drowns in detail. The discipline of specifying to AI—which we explored in the previous chapter—parallels the discipline of specifying to executives: include what matters, exclude what doesn’t, and verify that the output serves the intended purpose.

Exercise 3.3 (Transforming findings into decisions) You are a Citi Bike analyst who has completed two analyses:

Analysis A: “Ridership on weekends is 34% lower than on weekdays.”

Analysis B: “Stations within two blocks of subway entrances have 23% higher morning utilization than stations further away.”

For each analysis, transform the finding into a decision-ready statement using the utility framework. Your revised statements should specify:

  • What actions are available to the decision-maker?
  • What outcomes follow from each action?
  • What probabilities or expected values does the data suggest?
  • What trade-offs exist between actions?

Then, briefly explain how your transformed versions invite executive engagement while the original findings invite only passive acknowledgment.

3.4 Looking ahead

Having established who our audiences are and why decisions matter, we turn to how we reach them. The next chapters explore the craft of communication: writing persuasively, structuring arguments, using rhetorical tools effectively. We will apply these skills to the Citi Bike case, moving from problem definition through analysis to recommendation. The goal: communications that not only inform but persuade, that not only describe but drive action.


  1. Note the plural. While we identify a single person in memo examples, those memos may be passed to others on their team—secondary audiences matter.