PART 2
BUILDING AN ANALYTICAL CAPABILITY
CHAPTER SIX
A ROAD MAP TO ENHANCED ANALYTICAL CAPABILITIES
PROGRESSING THROUGH THE FIVE STAGES OF ANALYTICAL MATURITY
By this point, developing an analytical capability may seem straightforward. Indeed, some organizations such
as Marriott, GE, and Procter & Gamble have been using intensive data analysis for decades. Others, such as
Google, Amazon, Netflix, Zillow, and Capital One, were founded with the idea of using analytics as the basis
of competition. These firms, with their history of close attention to data, sponsorship from senior
management, and enterprise use of analytics, have attained the highest stage of analytical capability.
The overwhelming majority of organizations, however, have neither a finely honed analytical capability
nor a detailed plan to develop one. For companies that want to become analytical competitors, a quick and
painless journey cannot be promised. There are many moving pieces to put in place, including software
applications, technology, data, processes, metrics, incentives, skills, culture, and sponsorship. One executive
we interviewed compared the complexity of managing the development of analytical capabilities to playing a
fifteen-level chess game.
Once the pieces fall into place, it still takes time for an organization to get the large-scale results it needs to
become an analytical competitor. Changing business processes and employee behaviors is always the most
difficult and time-consuming part of any major organizational change. And by its nature, developing an
analytical capability is an iterative process, as managers gain better insights into the dynamics of their
business over time by working with data and refining analytical models. Our research and experience
suggest that it takes eighteen to thirty-six months of regularly working with data to start developing a steady
stream of rich insights that can be translated into practice. Many organizations, lacking the will or faced with
other pressing priorities, will take much longer than that.
Even highly analytical companies have lots of work left to do to improve their analytical capabilities. For
example, Sprint, which used analytics to realize more than $1 billion of value and $500 million in
incremental revenue over five years, believes that it has just scratched the surface of what it can accomplish
with its analytical capability. And managers at a bank that has been an analytical competitor for years
reported that different units are slipping back into silos of disaggregated customer data. Analytical
competitors cannot rest on their laurels.
Nevertheless, the benefits of becoming an analytical competitor far outweigh the costs. In this chapter, we
will introduce a road map that describes how organizations become analytical competitors and the benefits
associated with each stage of the development process.
Overview of the Road Map
The road map describes typical behaviors, capabilities, and challenges at each stage of development. It
provides guidance on investments and actions that are necessary to build an organization’s analytical
capabilities and move to higher stages of analytical competition.
Figure 6-1 provides an overview of the road map to analytical competition and the exit criteria for each
stage.
FIGURE 6-1
Road map to becoming an analytical competitor
Stage 1: Prerequisites to Analytical Competition
At stage 1, organizations lack the prerequisites for analytics. These companies need first to improve their
transaction data environment in order to have consistent, quality data for decision making. If a company has
poor-quality data, it should postpone plans for analytical competition and fix its data first. Dow Chemical’s
path is instructive. It began installing one of the first SAP systems in the United States in the late 1980s but
did not begin serious initiatives to use data analytically until enough transaction data had been accumulated.
Even if an organization has some quality data available, it must also have executives who are predisposed
to fact-based decision making. A “data-allergic” management team that prides itself on making gut-based
decisions is unlikely to be supportive. Any analytical initiatives in such an organization will be tactical and
limited in impact.
Once a company has surmounted these obstacles, it is ready to advance to a critical juncture in the road
map.
Assessing Analytical Capabilities
Once an organization has some useful data and management support in place, the next task is to take stock
and candidly assess whether it has the strategic insight, sponsorship, culture, skills, data, and IT needed for
analytical competition.
One financial services executive offers sage advice to anyone getting started: “Begin with an
assessment—how differentiated is your offering versus what you and your customers want it to be? Big data
and analytics can be a real game changer. But you must have a clear vision of what value looks like if you are
going to figure out how to unleash that vision.”
While each stage of the road map reflects the ability of an enterprise to compete on analytics, different
parts of the organization may be in very different stages of development. For example, actuarial work
requires an appreciation of statistical methods that may not exist to the same extent in other parts of an
insurance company. Or a pharmaceutical firm’s marketing analytics in the United States may be far more
sophisticated than they are in other countries or regions simply because the US operation has greater access
to data and an analytically minded manager in charge.
Just as some business units or processes may be more or less advanced than others in the enterprise, some
aspects of the business are likely to be more analytically astute than others. For example, an organization may
have a tightly integrated, highly standardized, and flexible IT environment but little demand for analytics; or,
conversely, demand for analytics that far outstrips the capabilities of the IT organization. A utility might use
machine learning capabilities to manage its electrical grid but not elsewhere in the enterprise. There may be
many user departments and even individuals with their own analytical applications and data sources but little
central coordination or synergy.
Organizations need to assess their level of analytical capability in three main areas. (Table 6-1 provides an
overview of these key attributes, each of which is equally vital to successful analytical competition.) One
cautionary note: executives are often tempted to just obtain the data and analytical software they need,
thinking that analytics are synonymous with technology. But unless executives consciously address the other
elements, they will find it difficult to progress to later stages.
TABLE 6-1
The key elements in an analytical capability

Capabilities   Key elements
Organization   Insight into performance drivers
               Choosing a distinctive capability
               Performance management and strategy execution
               Process redesign and integration
Human          Leadership and senior executive commitment
               Establishing a fact-based culture
               Securing and building skills
               Managing analytical people
Technology     Quality data
               Analytic technologies
We’ll address the organizational issues in this chapter and go into greater detail on the human factors in
chapter 7 and the technical ones in chapter 8.
A company needs a clear strategy in order to know which data to focus on, how to allocate analytical
resources, and what it is trying to accomplish (what we refer to as “targets” in the DELTA model that we
describe in chapter 2 and later in this chapter). For example, the business strategy at Caesars Entertainment
dictated the company’s analytical strategy as well. When the explosion of newly legalized gaming
jurisdictions in the mid-1990s ground to a halt, Caesars (then Harrah’s) managers realized that growth could
no longer come from the construction of new casinos. They knew that growth would need to come from
existing casinos and from an increase of customer visits across Caesars’ multiple properties. To achieve this
goal, the company set a new strategic focus to drive growth through customer loyalty and data-driven
marketing operations. This strategic focus enabled Caesars to concentrate its investments on the activities and
processes that have the greatest impact on financial performance—its distinctive capability. Implementing
this strategy required the company to expand and exploit the data it had amassed on the gaming behaviors
and resort preferences of existing customers. (See the box “Choosing a Strategic Focus or Target” for
examples of where companies chose to focus their initial analytical investments.)
CHOOSING A STRATEGIC FOCUS OR TARGET
Organizations initially focus on one or two areas for analytical competition:
Caesars: Loyalty plus service
New England Patriots: Player selection plus fan experience
Intuit: Customer-driven innovation plus operational discipline
Dreyfus Corporation: Equity analysis plus asset attrition
UPS: Operations plus customer data
Walmart: Supply chain plus marketing
Owens & Minor: Internal logistics plus customer cost reduction
Progressive: Pricing plus new analytical service offerings
To have a significant impact on business performance, analytical competitors must continually strive to
quantify and improve their insights into their performance drivers—the causal factors that drive costs,
profitability, growth, and shareholder value in their industry (only the most advanced organizations have
attempted to develop an enterprise-wide model of value creation). In practice, most organizations build their
understanding gradually over time in a few key areas, learning from each new analysis and experiment.
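To make this concrete, here is a minimal sketch, in Python, of one common way to quantify performance drivers: regressing a business outcome on candidate causal factors. The data file and every column name below are hypothetical illustrations, not examples drawn from any company in this book.

# A minimal sketch of quantifying performance drivers: regress a business
# outcome on candidate causal factors. File and column names are hypothetical.
import pandas as pd
import statsmodels.formula.api as smf

# One row per business unit per quarter (hypothetical extract).
units = pd.read_csv("business_unit_quarters.csv")

# Model profit margin against a few candidate drivers.
model = smf.ols(
    "profit_margin ~ customer_retention + on_time_delivery + price_realization",
    data=units,
).fit()

# Coefficients and p-values suggest which drivers merit deeper analysis
# and experimentation; they are a starting point, not proof of causality.
print(model.summary())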
To decide where to focus their resources for the greatest strategic impact, managers should answer the
following questions:
How can we distinguish ourselves in the marketplace?
What is our distinctive capability?
What key decisions in those processes, and elsewhere, need support from analytical insights?
What information really matters to the business?
What are the information and knowledge leverage points of the firm’s performance?
As organizations develop greater insights, they can incorporate them into analytical models and adapt
business processes to leverage them and increase competitive differentiation. These strategically focused
insights, processes, and capabilities form the basis of the organization’s distinctive capability.
Analytical competitors design effective decision making into their processes to ensure that analytical
insights get translated into action and ultimately enhance business performance. They incorporate a way of
thinking, putting plans into place, monitoring and correcting those plans, and learning from the results to help
shape future actions.1 For example, UK police analysts learned that they could predict from early behaviors
which juveniles were likely to become adult criminals. The analysis concluded that if the UK police could
proactively intervene when kids began to pursue the criminal path, and prevent them from going in that
direction, they could dramatically reduce the number of crimes ultimately committed.2 However, for this
insight to have any impact on crime rates, more than awareness is required; it requires a close collaboration
between police, educators, and human services workers to establish programs and practices aimed at
eliminating the root causes of crime.
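As a rough, purely illustrative sketch of the kind of propensity model that could sit behind such an insight (the data source, feature names, and outcome label below are all invented assumptions, not the UK analysts' actual method):

# Hypothetical sketch of scoring early-intervention candidates with a
# logistic regression; the data set and all column names are invented.
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

cases = pd.read_csv("case_histories.csv")
features = ["truancy_rate", "prior_incidents", "age_at_first_contact"]

X_train, X_test, y_train, y_test = train_test_split(
    cases[features], cases["adult_offender"], test_size=0.3, random_state=0
)

model = LogisticRegression().fit(X_train, y_train)
risk = model.predict_proba(X_test)[:, 1]

# A validated score lets agencies direct scarce intervention resources
# toward the highest-risk cases first.
print("Holdout AUC:", roc_auc_score(y_test, risk))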
Finally, to ensure that strategy is converted into operational results, organizations must define and monitor
metrics that are tied to strategic enterprise objectives, and align individual incentives and metrics with
business objectives.
Choosing a Path
After an organization has realistically appraised its analytical capabilities, it must next choose which path to
pursue. Organizations blessed with top management commitment and passion for analytics can move quickly
through the “full steam ahead” path, while the rest are forced to take the slower “prove-it” detour.
Full Steam Ahead
A committed, passionate CEO can put the organization on the fast track to analytical competition. To date,
firms pursuing this path have usually been digital businesses (e.g., Google, Amazon, LinkedIn, Zillow)
whose strategy from the beginning has been to compete on analytics, although occasionally a new CEO at an
established firm (such as Gary Loveman of Caesars or the new general managers at the Chicago Cubs and
Boston Red Sox in baseball) has led a major organizational transformation with analytics.
For a startup, the primary challenge on this path is acquiring and deploying the human and financial
resources needed to build its analytical capabilities. Established companies face a more complex set of
challenges because they already have people, data, processes, technology, and a culture. The existence of
these resources is a double-edged sword—they can either provide a head start on building an analytical
capability or be a source of resistance to using new methods. If organizational resistance is too great, it may
be necessary to take a slower path to build support by demonstrating the benefits of analytics.
The organization going full steam ahead is easily recognized, because the CEO (or other top executive)
regularly articulates the burning platform for competing on analytics. He or she consistently invests and takes
actions designed to build the organization’s strategic analytical capability. At such firms, the top priority is
integrating analytics into the organization’s distinctive capability, with an eye to building competitive
differentiation. Success is characterized in enterprise-wide terms; company metrics emphasize corporate
performance, such as top-line growth and profitability, rather than departmental goals or ROI.
An executive sponsor planning to follow this path must get the rest of the organization on board. The first
step is to set a good example. Executives send a powerful message to the entire organization when they make
decisions based on facts, not opinions. They must also demand that subordinates support their
recommendations with analytically based insights. Second, they must articulate a clear and urgent need for
change. It’s always easier, of course, to move an organization in an entirely new direction when it is facing a
dramatic crisis. Executives at highly successful companies confide that it is difficult to persuade (or compel)
employees to become more analytical when no clear need to change is present.
Third, the CEO or executive sponsor must also be able to commit the necessary resources. Seriously ailing
companies (though they have a clear mandate to do things differently in order to survive) may lack the
resources needed for analytical competition. In such cases, pursuing an analytical strategy is like proposing
“wellness sessions” to a patient in cardiac arrest. Those organizations will have to revive themselves
before pursuing the full-steam-ahead path to analytical competition.
An organization that makes competing on analytics its top priority can expect to make substantial strides in
a year or two. While we are convinced that the full-steam-ahead path is ultimately faster, cheaper, and the
way to the greatest benefits, we have found relatively few companies prepared to go down this road. If top
management lacks the passion and commitment to pursue analytical competition with full force, it is
necessary to prove the value of analytics through a series of smaller, localized projects.
Stage 2: Prove-It Detour
To those already convinced of the benefits of analytical competition, having to forgo the fast track feels like
an unnecessary detour. Indeed, this path is much slower and more circuitous, and there is a real risk that an
organization can remain stalled indefinitely. We estimate that having to “prove it” will add one to three years
to the time needed to become an analytical competitor. But executives unwilling to make the leap should take
a test-and-learn approach—trying out analytics in a series of small steps.
For organizations taking the detour, analytical sponsors can come from anywhere in the organization. For
example, at one consumer packaged goods manufacturer, a new marketing vice president was shocked to
discover that the analytical capabilities he took for granted at his former employer did not exist at his new
company. Rather than try to garner support for a major enterprise-wide program, he logically chose to start
small in his own department by adopting an analytically based model for planning retail trade promotions. In
situations like this, initial applications should often be fairly tactical, small in scale, and limited in scope.
Despite its drawbacks, the slower path also has important advantages. Any true analytical
competitor wants to have a series of experiments and evidence documenting the value of the approach, and
the prove-it path helps the organization accumulate that empirical evidence. As managers get more
experience using smaller, localized applications, they can gain valuable insights that can be translated into
business benefits. Each incremental business insight builds momentum within the organization in favor of
moving to higher stages of analytical competitiveness.
There are practical reasons for taking the prove-it road as well. By starting small, functional managers can
take advantage of analytics to improve the efficiency and effectiveness of their own departments without
having to get buy-in from others. This approach also requires a lower level of initial investment, since
stand-alone analytical tools and data for a single business function cost less than any enterprise-wide
program.
In stage 2, it is best to keep things simple and narrow in scope. The steps essentially boil down to:
1. Finding a sponsor and a business problem that can benefit from analytics
2. Implementing a small, localized project to add value and produce measurable benefits
3. Documenting the benefits and sharing the news with key stakeholders
4. Continuing to build a string of localized successes until the organization has acquired enough
experience and sponsorship to progress to the next stage
An organization can linger in stage 2 indefinitely if executives don’t see results, but most organizations are
ready to move to the next stage in one to three years. By building a string of successes and carefully
collecting data on the results, managers can attract top management attention and executive sponsorship for a
broader application of analytics. At that point, they are ready to progress to stage 3.
Table 6-2 summarizes some of the major differences in scope, resources, and approach between the
full-steam-ahead and the prove-it paths.
TABLE 6-2
Attributes of two paths to analytical competition

                             Full steam ahead                          Prove-it
Management sponsorship       Top general manager/CEO                   Functional manager
Problem set                  Strategic/distinctive capability          Local, tactical, wherever there’s a sponsor
Measure/demonstrate value    Metrics of organizational performance     Metrics of project benefits: ROI,
                             tied to analytics (e.g., revenue growth,  productivity gains, cost savings
                             profitability, shareholder value)
Technology                   Enterprise-wide                           Proliferation of analytics tools,
                                                                       integration challenges
People                       Centralized, highly elite, skilled        Isolated pockets of excellence
Process                      Embedded in process, opportunity          Stand-alone or in functional silo
                             through supply/demand integration
Culture                      Enterprise-wide, large-scale change       Departmental/functional, early adopters
For an example of a company that chose the prove-it path, we will now look at the experiences of an
organization that has begun to develop its analytical capabilities. (We have disguised the company and the
names of the individuals.)
PulpCo: Introducing Analytics to Combat Competitive Pressure
PulpCo is a successful company engaged in the sale of pulp, paper, and lumber-based products, including
consumer goods such as paper cups, industrial products like newsprint, and lumber-based products such as
particleboard for residential construction. It has successfully sold these products in the United States and
Europe for more than twenty years but has been facing increasing pressure from new entrants from
developing economies, newly competitive European firms, and providers of substitute materials for
construction and consumer products. The PulpCo management team has been with the company for years;
most began their careers working at the mills. Traditionally, analytics have taken a back seat to PulpCo’s
more intuitive understanding of the industry and its dynamics.
Under increasing competitive pressure, the CEO, at a board member’s urging, broke with long-standing
tradition and hired a new CFO from outside the industry. Azmil Yatim, the new CFO, had been lured by
PulpCo’s scale of operations and major market position. But after a month on the job, he was wondering
whether he had made a career mistake. The members of PulpCo’s management team behaved as though they
had limited awareness of the financial implications of their actions. Lacking accurate information about their
customers and market competition, executives largely relied on feedback of the last customer they had
visited. Major investment decisions were often made on the basis of inaccurate, untested assumptions. In
operations, managers had grown used to having to make decisions without the right data. They had an
incomplete understanding of their costs for major products such as particleboard, construction lumber, toilet
paper, and newsprint. As a result, they made some costly mistakes, from unnecessary capital investments in
plants and machinery to poor pricing.
The CFO resolved to improve the financial and decision-making capabilities of the organization. Sensing
that the support of COO Daniel Ghani, a widely respected insider, would be critical to his efforts, Yatim
initially sought his support for a bold and far-ranging change initiative to transform the organization. Ghani
rejected it as too radical. But the COO was tired of constantly dealing with one crisis after another, and after
talking with Yatim, he began to realize that many of these crises resulted from a lack of accurate financial
and customer information, which in turn produced poor decisions. Ghani became convinced that the
organization could not afford to continue to make decisions in an analytical vacuum. He urged Yatim to
devise a new plan that would make effective use of financial data generated by the company’s enterprise
systems. Better financial information and control became a shared priority of the two executives.
Using a recent series of embarrassing missteps—including a major customer defection to a key
competitor—as a rallying point, Yatim (together with the CIO, who reported to him) gained approval and
funding for an initiative to improve financially based insights and decision making. He began by reviewing
the skills in the finance function and at the mills and realized that operating managers needed help to leverage
the new system. He arranged for formal training and also created a small group of analysts to help managers
at the mills use and interpret the data. He also provided training to employees engaged in budgeting and
forecasting.
As Yatim began to analyze the new data he was receiving, an unsettling picture emerged. PulpCo was
actually losing money on some major accounts. Others that were deemed less strategically valuable were
actually far more profitable. Armed with these insights, Yatim and his analysts worked with individual
executives to interpret the data and understand the implications. As he did, the executive team became more
convinced that they needed to instill more financial discipline and acumen in their management ranks.
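An account-level profitability analysis of the kind Yatim ran can be sketched in a few lines; the ERP extract and its cost columns below are invented purely for illustration:

# Hypothetical sketch of account profitability analysis; the extract
# and its column names are invented.
import pandas as pd

orders = pd.read_csv("erp_order_lines.csv")

# Fully loaded margin per order line, then aggregated by account.
orders["margin"] = (
    orders["revenue"]
    - orders["production_cost"]
    - orders["freight_cost"]
    - orders["service_cost"]
)
account_profit = orders.groupby("account")["margin"].sum().sort_values()

print(account_profit.head(10))  # the largest loss-making accounts
print(account_profit.tail(10))  # the most profitable ones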
A breakthrough occurred when detailed financial analysis revealed that a new plant, already in the early
stages of construction, would be a costly mistake. Management realized that the company could add
capacity more cost-effectively by expanding and upgrading two existing plants, and the project was canceled.
Managers across the company were shocked, since “everyone knew” that PulpCo needed the new plant and
had already broken ground.
The executive team then declared that all major investments under way or planned in the next twelve
months would be reviewed. Projects that were not supported by a business case and facts derived from the
company’s enterprise system would be shut down. New projects would not be approved without sufficient
fact-based evidence. Managers scrambled to find data to support their projects. But it wasn’t until a new
managerial performance and bonus program was introduced that managers really began to take financial
analysis seriously.
After a year of concerted effort, initial hiccups, and growing pains, PulpCo is a changed organization.
Major financial decisions are aligned with strategic objectives and based on facts. Management is largely
positive about the insights gained from better financial analysis and supportive of further analytical initiatives
in other parts of the business. Forecasts are more accurate, and managers are better able to anticipate and
avoid problems. The culture is no longer hostile to the use of data. Operational managers at the mills and at
corporate headquarters have increased their financial acumen and are more comfortable interpreting financial
analyses. The use of analytics has begun to spread as the organization’s managers realize that better insight
into costs and profitability can give them an edge in competitive situations. Inspired by analytical companies
such as CEMEX, PulpCo has begun a test program to bypass the lumberyard and deliver products directly to
the construction site. Not surprisingly, PulpCo is also enjoying better financial performance.
Initially, Yatim just wanted improved financial data for decision making. With limited sponsorship and
inadequate systems, PulpCo’s CFO realized that he needed to conduct some experiments and build credibility
within the organization before attempting a broader change program. With each success, PulpCo’s leadership
team became more enthusiastic about using analytics and began to see their broader potential. PulpCo is not
an analytical competitor and may never reach stage 5, but today management is enthusiastic about the
benefits of analytics and is considering whether it should make the leap to stage 3.
Stage 3: Analytical Aspirations
Stage 3 is triggered when analytics gain executive sponsorship. The executive sponsor becomes an outspoken
advocate of a more fact-based culture and builds sponsorship with others on the executive team.
Executive sponsorship is so vital to analytical competition that simply having the right sponsor is enough
to move an organization to stage 3 without any improvement in its analytical capabilities. However, those
organizations pursuing the prove-it path will have already established pockets of analytical expertise and will
have some analytical tools in hand. The risk is that several small groups will have their own analytical
fiefdoms, replete with hard-to-integrate software tools, data sets, and practices.
Whether an organization has many analytical groups or none at all, it needs to take a broader, more
strategic perspective in stage 3. The first task, then, is to articulate a vision of the benefits expected from
analytical competition. Bolstered by a series of smaller successes, management should set its sights on using
analytics in the company’s distinctive capability and addressing strategic business problems. For the first
time, program benefits should be defined in terms of improved business performance and care should be
taken to measure progress against broad business objectives. A critical element of stage 3 is defining a set of
achievable performance metrics and putting the processes in place to monitor progress. To focus scarce
resources appropriately, the organization may create a centralized “analytics hub” to foster and support
analytical activities.
In stage 3, companies will launch their first major project to use analytics in their distinctive capability.
The application of more sophisticated analytics may require specialized analytical expertise and adding new
analytical technology. Management attention to change management is critical because significant changes to
business processes, work roles, and responsibilities are necessary.
If it hasn’t already done so, the IT organization must develop a vision and program plan (an analytical
architecture) to support analytical competition. In particular, IT must work more aggressively to integrate and
standardize enterprise data in anticipation of radically increased demand from users.
The length of stage 3 varies; it can be as short as a few months or as long as two years. Once executives
have committed resources and set a timetable to build an enterprise-wide analytical capability, they are ready
to move to the next stage.
BankCo (again, we have disguised the company and other names) is an example of an organization moving
from stage 3 toward stage 4.
BankCo: Moving Beyond Functional Silos to Enterprise Analytics
Wealth management has been a hot topic within the banking industry over the last decade, and banks have
traditionally been well positioned to offer this service. In-house trust departments have provided advice and
services to generations of wealthy clients. But over the past several years, new competitors have emerged.
Banks have moved away from using individual investment managers, taking a more efficient but less
personalized approach. A trend toward greater regulatory oversight further transformed the industry. At the
same time, customers are more open to alternatives to traditional money management. Robo-advisers such as
Betterment and Wealthfront use algorithms to provide fully automated investment portfolio management at a
lower cost than most retail banks charge their clients. This confluence of factors threatens bank trust
departments’ hold on the service of managing individuals’ wealth.
At BankCo, the executive vice presidents of marketing, strategy, and relationship management were asked
by the bank’s senior management team to create a strategic response to this threat. They quickly concluded
that significant changes were needed to improve the bank’s relationships with its customers. BankCo’s trust
department assets had declined by 7 percent over two years, despite a positive market that had produced
increased assets overall and good performance in individual trust accounts. The decline was attributed to
discounting by more aggressive competitors and to cannibalization of accounts by the bank’s brokerage
business. BankCo had tried introducing revamped products to retain customers but with limited success.
As in many banks, each department (retail, brokerage, trust) maintained its own customer data that was not
available to outsiders. As a result, executives could not get a complete picture of their clients’ relationships
throughout the bank, and valuable relationships were jeopardized unnecessarily. For example, one client with
a $100 million trust was charged $35 for bouncing a check. When he called his retail banker to complain, he
was told initially that his savings account wasn’t large enough for the bank to justify waiving the fee.
After analyzing these issues, the team concluded that an enterprise-wide focus on analytics would not only
eliminate the majority of these problems but also uncover cross-selling opportunities. The team realized that
a major obstacle to building an enterprise-level analytical capability would be resistance from department
heads. Their performance measures were based on the assets of their departments, not on enterprise-wide
metrics. The bank’s senior management team responded by introducing new performance metrics that would
assess overall enterprise performance (including measures related to asset size and profitability) and
cross-departmental cooperation.
These changes cleared the path for an enterprise-wide initiative to improve BankCo’s analytical
orientation, beginning with the creation of an integrated and consistent customer database (to the extent
permitted by law) as well as coordinated retail, trust, and brokerage marketing campaigns. At the same time,
an enterprise-wide marketing analytics group was established to work with the marketing teams on
understanding client values and behavior. The group began to identify new market segments and offerings,
and to help prioritize and coordinate marketing efforts to high-net-worth individuals. It also began to develop
a deeper understanding of family relationships and their impact on individual behavior. By bringing together
statisticians and analysts scattered throughout the business, BankCo could deploy these scarce resources
more efficiently. As demand quickly outstripped supply, it hired analytical specialists with industry expertise
and arranged to use an offshore firm to further leverage its scarce analytical talent.
At first, there were occasional breakdowns in decision-making processes. When a competitor restructured
its brokerage pricing, BankCo did not notice until several clients left. On another occasion, an analysis
identified a new market segment, and management authorized changes to marketing, but the organization was
slow to implement the changes. To overcome these hurdles, process changes were implemented to make sure
that decisions were translated into action. Managers received training and tools so that they were equipped to
make decisions analytically and understood how to develop hypotheses, interpret data, and make fact-based
decisions. As the organization’s analytical capabilities improved, these breakdowns became less frequent.
As more tangible benefits began to appear, the CEO’s commitment to competing on analytics grew. In his
letter to shareholders, he described the growing importance of analytics and a new growth initiative to
“outsmart and outthink” the competition. A chief data officer was named to develop and implement the
bank’s data and analytics strategy. Analysts expanded their work to use propensity analysis and neural nets
(an artificial intelligence technology incorporating nonlinear statistical modeling to identify patterns) to target
and provide specialized services to clients with both personal and corporate relationships with the bank. They
also began testing some analytically enabled new services for trust clients. Today, BankCo is well on its way
to becoming an analytical competitor.
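As a hedged illustration of what such propensity scoring with a small neural net might look like in practice (the integrated client file and every column name here are assumptions for illustration, not BankCo's actual system):

# Hypothetical sketch: a small feed-forward neural net scoring clients'
# propensity to accept a specialized service offer. Data is invented.
import pandas as pd
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

clients = pd.read_csv("integrated_client_file.csv")
features = ["assets_under_mgmt", "num_products", "tenure_years", "has_corporate_ties"]

# Scaling inputs matters for neural nets; a pipeline keeps the steps together.
model = make_pipeline(
    StandardScaler(),
    MLPClassifier(hidden_layer_sizes=(16,), max_iter=1000, random_state=0),
)
model.fit(clients[features], clients["accepted_offer"])

# Propensity scores used to prioritize outreach for specialized services.
clients["propensity"] = model.predict_proba(clients[features])[:, 1]
print(clients.nlargest(20, "propensity")[["client_id", "propensity"]])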
Stage 4: Analytical Companies
The primary focus in stage 4 is on building world-class analytical capabilities at the enterprise level. In this
stage, organizations implement the plan developed in stage 3, making considerable progress toward building
the sponsorship, culture, skills, strategic insights, data, and technology needed for analytical competition.
Sponsorship grows from a handful of visionaries to a broad management consensus; similarly, an emphasis
on experimentation and analytics pervades the corporate culture. As the organization learns more from each
analysis, it obtains a rich vein of new insights and ideas to mine and exploit for competitive advantage.
Building analytical capabilities is a major (although not the only) corporate priority.
While there are many challenges at this stage, the most critical one is allocating sufficient attention to
managing cultural and organizational changes. We’ve witnessed many organizations whose analytical
aspirations were squelched by open cultural warfare between the “quant jocks” and the old guard. A related
challenge is extending executive sponsorship to the rest of the management team. If only one or two
executives are committed to analytical competition, interest will immediately subside if they suddenly depart
or retire. In one case, the CEO of a financial services firm saw analytical competition as his legacy to the
organization he had devoted his life to building. But his successors did not share his enthusiasm, and the
analytical systems developed under his leadership quickly fell into disuse.
As each analytical capability becomes more sophisticated, management will gain the confidence and
expertise to build analytics into business processes. In some cases, they use their superior insight into
customers and markets to automate key decision processes entirely.
In stage 4, many organizations realign their analysts and information workers to place them in assignments
that are better suited to their skills. As a company becomes more serious about enterprise-wide analytics, it
often draws its most advanced analysts together into a single group. This gives the organization a critical
mass of talent focused on the most strategic issues and gives the analysts greater job satisfaction and more
opportunities to develop their skills.
Once an organization has an outstanding analytical capability combined with strategically differentiating
analytics embedded into its most critical business processes and has achieved major improvements to its
business performance and competitiveness, it has reached the final stage.
ConsumerCo: Everything but “Fire in the Belly”
At a large consumer products company, analytical competition is at stage 4. ConsumerCo has everything in
place but a strong executive-level commitment to compete on this basis. It has high-quality data about
virtually every aspect of its business, and a capable IT function. It has a group of analysts who are equal to
any company’s. The analysts have undertaken projects that have brought hundreds of millions of dollars in
value to the company. Still, they have to justify their existence by selling individual projects to functional
managers.
The CEO of ConsumerCo is a strong believer in product innovation and product-oriented research but not
particularly in analytics. The primary advocate for analytics is the COO. Analytics are not discussed in
annual reports and analyst calls, though the company does have a culture of fact-based decision making and
uses a large amount of market research. ConsumerCo is doing well financially but has been growing
primarily through acquisitions. In short, analytics are respected and widely practiced but are not driving the
company’s strategy. With only a bit more “fire in the belly” from senior executives, it could become a true
analytical competitor in a short time.
Stage 5: Analytical Competitors
In stage 5, analytics move from being a very important capability for an organization to the key to its strategy
and competitive advantage. Analytical competitors routinely reap the benefits of their enterprise-wide
analytical capability. Proprietary metrics, analytics, processes, and data create a strong barrier to competitors,
but these companies are always attempting to move the analytical bar further.
Executive commitment and passion for analytical competition in this stage is resolute and widespread. The
organization’s expertise in analytical competition is discussed in annual reports and in discussions with
investment analysts. Internal performance measures and processes reinforce a commitment to scientific
objectivity and analytical integrity. Analytics are used to drive innovation across the organization. And
predictive and prescriptive analytics are an important and growing component of the enterprise’s product
offerings as well.
However, analytical competitors must avoid complacency if they are to sustain their competitive
advantage. They need to build processes to continually monitor the external environment for signs of change.
They must also remain vigilant in order to recognize when changing market conditions require them to
modify their assumptions, analytical models, and rules.
We’ve described a large number of these companies already, so we won’t give an example here. Each
stage 5 company is different in terms of the strategic capability it emphasizes, the applications it employs,
and the path it followed to success. But they have in common an absolute passion for analytics and a
resulting strong financial performance.
Progressing Along the Road Map
As you can see from these examples, becoming an analytical competitor takes more than an enthusiasm for
data analysis. Many executives seeking to build their analytical capabilities begin by purchasing software,
hiring quantitative analysts, and piloting some kind of analytical initiative. While those actions can be a good
start, they are just that—a beginning from which analytical leaders must build in order to truly develop their
analytic capability. “Analytics is a muscle we build,” according to Elpida Ormanidou, formerly vice president
of global people analytics at Walmart and now vice president of advanced analytics and testing at retailer
Chico’s FAS, Inc. “You cannot buy yourself into an analytics capability.”3
The path to success with analytics will contain speed bumps along the way. Managers not yet on the
full-steam-ahead path may be tempted to shift resources away from an analytics initiative, or shut it down
completely, if business conditions put pressure on the organization. Also, the shift to analytics will
most likely require employees to change their decision-making processes. It takes time for the organization to
adjust to new skills and behaviors, but without them, no real change can occur. As a result, the most
important activity for the leadership team is to keep analytical initiatives on track and to monitor outcomes to
ensure that anticipated benefits are achieved.
At every stage of development, companies need to manage outcomes to achieve desired benefits, setting
priorities appropriately and avoiding common pitfalls.
As we described in chapter 2, the DELTA model provides guidance to executives seeking to create a road
map for building their organization’s analytical capabilities. DELTA, the Greek letter that signifies “change”
in an equation, seemed like a fitting acronym to describe the elements and steps needed to implement an
analytical capability. Unless you are blessed with analytical leadership and culture, becoming an analytical
competitor means significant change for an organization. To recap, DELTA stands for:
Data: Leveraging data to glean valuable insights
Enterprise: Managing and coordinating resources at an enterprise level
Leadership: Fostering an analytical leadership team and culture
Targets: Focusing analytics investments on the best, high-value areas
Analysts: Developing and managing analytical talent
We’ll briefly summarize each of these capability elements here.
Data
Data is, of course, a prerequisite for using analytics. On occasion, a single data point might be enough to
make a difference. But most of the time, having lots of data is better.4 Having high-quality, diverse, and
dynamic data—easily accessible to users in a data warehouse, data mart, or data lake—generally yields better
results too. Unique data is better still. Analytical competitors view data as a strategic asset, and like any other
strategic asset, it must be managed to maximize its value to the organization. We will talk more about
finding, organizing, and managing data in chapter 8.
Enterprise
As we explained in chapter 2, it is important to take an enterprise perspective. Lacking an enterprise
perspective means having a fractured and incomplete understanding of the issues facing the organization, and
the resources available to address them. Executives require a broad business perspective if they are to address
the strategic issues at the core of business competitiveness and effectiveness. Only an enterprise orientation
can properly answer important questions such as, “Which performance factors have the greatest impact on
future growth and profitability?” “How should we optimize investments across our products, geographies and
marketing channels?” Or “Are decisions aligned with company strategy, or just promoting someone’s
self-interest?” Similarly, since major analytics initiatives invariably touch multiple functions across the
organization, it is important to avoid placing vital analytical resources (such as data, technology, or analysts)
in functional silos. We discuss some of these organizational issues further in chapter 7.
Leadership
It is fitting that leadership is at the center of the DELTA model, because without committed analytical
leadership, the potential for using analytics is quite limited. Analytical leaders are passionate advocates for
analytics and fact-based, data-driven decision making. They set a hands-on example by being voracious
consumers of data and analytics. These executives routinely challenge conventional wisdom and untested
assumptions. Analytical leaders are also highly experimental and innovative. They continually seek
innovative ways to get valuable insights. They explore ways to incorporate proprietary data and algorithms
into new products and services. Analytical leaders prefer to surround themselves with smart, analytical
people. And above all, they encourage a culture that views data as a strategic asset—that strives to be a
meritocracy where the best data and ideas win. These types of leaders aren’t found everywhere, but there
definitely are some. We discuss more considerations of being an analytical leader in chapter 7.
Targets
All organizations have finite resources, and therefore it is critical to prioritize potential investments in
analytics where they will have the most beneficial impact. Picking the right spots for investment is the core of
an analytical road map. The right targets will depend on the organization’s analytical maturity, industry, and
business strategy. Targets should be achievable yet have the potential to make a significant impact by cutting
costs, optimizing processes, improving customer engagement, expanding the business, or increasing
profitability. As an enterprise’s analytical maturity improves, targets should be focused on the organization’s
distinctive capabilities, leading to initiatives that are more strategic and game-changing for the organization
and its customers. The number of targets can also grow with time and greater analytical maturity.
Analysts
Managing and developing analytical talent goes beyond hiring a few smart, analytical employees. Analytical
professionals and data scientists build and maintain the models and algorithms used throughout the
organization. Analytical talent also includes the executives who oversee analytical initiatives, decision
makers who use the results of analyses, and analytical amateurs—the information workers who routinely use
data in their jobs. We describe the roles of different types of analysts in chapter 7—senior executives
(including the chief data and analytics officer), analysts, data scientists, and analytical amateurs—along with
some organizational considerations for getting the most from this valuable resource. Because
technologies and quantitative techniques tend to become more sophisticated with growing maturity, we will
also describe these factors.
In our first cut at the DELTA model, we felt that these five factors were sufficient to explain and predict how
a company could succeed with analytics. But with the advent of big data and a variety of new analytical
techniques (artificial intelligence, for example), it may also be useful to add the following two
capabilities to the model.
Technology
Technology for analytics has changed rapidly over the last decade. Providing the infrastructure, tools, and
technologies to support analytics across the organization is no small task, and this responsibility should
belong to the IT department (though it doesn’t always). Most organizations have plenty of data, software, and
processing power; the challenge is getting it all to work together with a minimum of fuss. Too much data is
locked into organizational silos, and efforts to solve the problem by pulling data into warehouses or data
lakes too often result in conflicting data repositories. Despite the proliferation of analytical software,
statistical programming tools, data warehouses, visualization tools, and the like, relatively few organizations
have a truly robust and well-integrated technical environment that fully supports enterprise-wide analytics
and big data. The technologies underpinning big data and analytics are a rapidly evolving field and beyond
the scope of any one book. In chapter 8, we describe (at a conceptual level) the required components and
considerations for an analytic technical architecture.
Analytical Techniques
There are many quantitative and analytical disciplines that contribute techniques, drawn from fields as
diverse as computer science, statistics, econometrics, informatics, physics, actuarial science, artificial
intelligence, operations research, and biostatistics. Organizations are wise to draw on many different types of
analytical techniques from these disciplines, ranging from simple descriptive statistics and probability to
machine learning and genetic algorithms. Determining the “best” or “right” technique depends on many
factors and will vary with the industry or function, the types of questions being addressed, the characteristics
of the data, and the creativity of the analyst. Sometimes the best
answers are obtained by combining techniques rather than relying on a single quantitative discipline. As
organizations become more sophisticated in their application of analytics, they generally rely on more diverse
and advanced analytical techniques too. We provided a sampling of these techniques and some appropriate
situations in which to use them in chapters 4 and 5.
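As a small illustrative sketch of combining techniques (the data set and column names below are invented), an analyst might start with descriptive statistics and then compare a statistical model against a machine-learning model before deciding whether to blend them:

# Hypothetical sketch of combining disciplines: descriptive statistics
# first, then models from two traditions compared side by side.
import pandas as pd
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

df = pd.read_csv("analysis_dataset.csv")  # invented data set
X, y = df.drop(columns="outcome"), df["outcome"]

# 1. Descriptive statistics: a cheap first look before any modeling.
print(df.describe())

# 2. A statistical model and a machine-learning model, evaluated the same way.
candidates = {
    "logistic regression (statistics)": LogisticRegression(max_iter=1000),
    "gradient boosting (machine learning)": GradientBoostingClassifier(),
}
for name, model in candidates.items():
    score = cross_val_score(model, X, y, cv=5).mean()
    print(f"{name}: mean accuracy {score:.3f}")
# If neither clearly dominates, averaging their predicted probabilities
# is one simple way to combine the techniques.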
For a high-performing analytical capability, all the elements of the DELTA model need to be working
together. If one element gets too far ahead of or behind the others, it can become a roadblock to moving
forward. Table 6-3 describes the typical conditions for each element of the DELTA framework at each
maturity stage, and table 6-4 does the same for technologies and analytical techniques. These can be used as
a quick assessment tool or as a reference to help you understand where you need improvement. For a more
detailed explanation of the assets and capabilities needed at every stage of analytical maturity, our book
Analytics at Work provides much more guidance.5
TABLE 6-3
The DELTA model of analytical capabilities by stage
TABLE 6-4
Additional technical capabilities for advanced analytics
Managing for Outcomes
Four types of outcomes are critical to measuring an initiative’s performance: behaviors; processes and
programs; products and services; and financial results. While financial results may be all that matter in the
end, they probably won’t be achieved without attention to intermediate outcomes.
Behaviors
To a large extent, improved financial outcomes depend on changing employee behaviors. Implementing new
analytical insights into pricing, for example, can require thousands of individuals to change their behaviors.
Salespeople may resist pricing recommendations. Sales managers may initially believe that their own pricing
experience is better than that of any system. Managers have to monitor measures and work with employees
who don’t comply with policies. Executives have to send out frequent messages to reinforce the desired
change of direction.
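A minimal sketch of the kind of compliance monitoring this implies, assuming an invented extract of quotes alongside their analytically recommended prices:

# Hypothetical sketch: flag quotes that deviate from the analytically
# recommended price so managers can follow up. Data is invented.
import pandas as pd

quotes = pd.read_csv("sales_quotes.csv")
quotes["deviation"] = (
    (quotes["quoted_price"] - quotes["recommended_price"])
    / quotes["recommended_price"]
)

# Quotes more than 5 percent off the recommendation, grouped by salesperson.
noncompliant = quotes[quotes["deviation"].abs() > 0.05]
print(noncompliant.groupby("salesperson").size().sort_values(ascending=False))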
Processes and Programs
Fact-based analyses often require process and program changes to yield results. For example, insights into the
best way to persuade wireless customers not to defect to another carrier need to be translated into
actions—such as developing a new program to train customer-facing employees.
One way to ensure that insights are incorporated into business processes is to integrate analytics into
business applications and work processes. Incorporating analytical support applications into work processes
helps employees accept the changes and improves standardization and use. For more advanced analytical
competitors, automated decision-making applications can be a powerful way of leveraging strategic insights.
Products and Services
One of the best ways to add value with data is by creating innovative products and services that incorporate
data and/or analytics; we described these in greater detail in chapter 3. For existing offerings, introducing
smarter products and services can increase margins and be a powerful differentiator. Innovative
analytically based products can open up entirely new customer markets and revenue streams.
Financial Results
It is important to specify the financial outcomes desired from an analytical initiative to help measure its
success. Specific financial results may include improved profitability, higher revenues, lower costs, or
improved market share or market value. Initially, cost savings are the most common justification for an
analytical initiative because it is much easier to specify in advance how costs will be cut. Revenue increases
are more difficult to predict and measure but can be modeled with analytical tools and extrapolated from
small tests and pilot studies. As an organization’s analytical maturity increases, it will be more willing to
invest in initiatives targeted at exploiting growth opportunities and generating revenue.
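For example, a pilot's measured revenue lift can be extrapolated to the full customer base with a confidence interval. The sketch below uses simulated pilot data and invented figures purely for illustration:

# Hypothetical sketch: extrapolate revenue lift from a small pilot to the
# full customer base. All figures are simulated for illustration.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
treated = rng.normal(105.0, 20.0, size=400)  # monthly revenue per pilot customer
control = rng.normal(100.0, 20.0, size=400)  # matched control group

lift = treated.mean() - control.mean()
se = np.sqrt(treated.var(ddof=1) / treated.size + control.var(ddof=1) / control.size)
ci_low, ci_high = stats.t.interval(
    0.95, df=treated.size + control.size - 2, loc=lift, scale=se
)

base = 250_000  # invented size of the full customer base
print(f"Projected annual lift: ${lift * 12 * base:,.0f}")
print(f"95% CI: ${ci_low * 12 * base:,.0f} to ${ci_high * 12 * base:,.0f}")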
Establishing Priorities
Assuming that an organization already has sufficient management support and an understanding of its desired
outcomes, analytical orientation, and decision-making processes, its next step is to begin defining and
prioritizing actions. See the box “Questions to Ask When Evaluating New Analytical Initiatives” for critical
questions managers should use to assess the potential of an analytical initiative. Projects with the greatest
potential benefit to the organization’s distinctive capabilities and competitive differentiation should take
precedence. Taking an analytical approach to investment decisions, requiring accountability, and monitoring
outcomes will help reinforce the analytical culture and maximize investments where they are likely to have
the greatest impact. A common error is to assume that merely having analytical technology is sufficient to
transform an organization. The Field of Dreams approach—“If you build it, they will come”—usually
disappoints. If you build a data warehouse or a full-blown analytical technical infrastructure without
developing the other DELTA model components, the warehouse and the data contained in it will just sit
there.
QUESTIONS TO ASK WHEN EVALUATING NEW
ANALYTICAL INITIATIVES
How will this investment make us more competitive?
To what extent will this investment make us more agile to respond to changing market
conditions?
How does the initiative improve our enterprise-wide analytical capabilities?
How will the investment foster greater innovation and growth opportunities?
What complementary changes need to be made in order to take full advantage of new
capabilities, such as developing new or enhanced skills; improving IT, training, and
processes; or redesigning jobs?
Does the right data exist? If not, can we get it or create it? Is the data timely,
consistent, accurate, and complete?
Is the technology reliable? Is it cost-effective? Is it scalable? Is this the right approach
or tool for the right job?
Avoiding the Potholes
Since every organization is different, we won’t attempt to provide detailed instructions to navigate around all
the potential hazards encountered along the road. Hazards can appear suddenly at any stage of development.
However, we can provide guidelines to help make the planning and implementing efforts go as smoothly as
possible.
First, some missteps are due primarily to ignorance. The most common errors of this kind are:
Focusing excessively on one dimension of analytical capability (e.g., too much technology)
Collecting data without any plans to use it
Attempting to do everything at once
Investing excessive resources on analytics that have minimal impact on the business
Investing too much or too little in any analytical capability, compared with demand
Choosing the wrong problem, not understanding the problem sufficiently, using the wrong analytical
technique or the wrong analytical software
Automating decision-based applications without carefully monitoring outcomes and external
conditions to see whether assumptions need to be modified
Of greater concern to many executives is the intentional undermining of analytical competition. Many
managers share Benjamin Disraeli’s suspicion that there are “lies, damned lies, and statistics.”6 Data analysis
has the potential for abuse when employed by the unscrupulous, since statistics, if tortured sufficiently, will
confess to anything.7 Applying objective criteria and data for decision making is highly threatening to any
bureaucrat accustomed to making decisions based on his or her own self-interest. Analytical competition
cannot thrive if information is hoarded and analytics manipulated. Executives must relentlessly root out
self-serving, manipulated statistics and enforce a culture of objectivity.8
Conclusion
This chapter has explored the key attributes of an analytical capability and provided a directional guide to the
steps that lead to enhanced analytical capabilities. We wish you rapid movement to the full-steam-ahead path
to becoming an analytical competitor. In chapter 7, we will explore ways to successfully manage an
organization’s commitment to key people who are—or need to be—using analytics.
CHAPTER SEVEN
MANAGING ANALYTICAL PEOPLE
CULTIVATING THE SCARCE INGREDIENT THAT MAKES ANALYTICS
WORK
When most people visualize business analytics, they think of computers, software, and printouts or screens
full of numbers. What they should be envisioning, however, are their fellow human beings. It is people who
make analytics work and who are the scarce ingredient in analytical competition.
Analytical Urban Legends
This assertion is contrary to some analytical urban legends, so let us dispel those now. Years ago, we began
hearing extravagant tales of software that would eliminate the need for human analysts. The most popular
story involved a data mining episode involving diapers and beer. The gist of the story was that some grocery
retailer had turned loose its powerful data mining software on a sales database, and it had come up with an
interesting finding. Male customers who came in to buy beer for the weekend also tended to remember that
their wives had asked them to buy diapers (some versions of the story switched around the primary shopping
intent), so they put both products in their shopping carts. The retailer quickly moved diapers over by the beer
(or vice versa), and sales exploded.
We chased this one down, and the most credible version of the story happened at Osco, a drugstore chain.
Some of the company’s data analysts do dimly remember seeing a correlation between diaper and beer sales
in their stores. But the analysts had told the software where and how to look for the relationship; it wasn’t
just stumbled on by an enterprising young computer. Most importantly, the finding was deemed an anomaly,
and diapers and beer were never put in proximity in the Osco stores (not all of which could even sell beer).
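For readers curious about the mechanics behind such stories, the sketch below computes the standard market-basket measures of support, confidence, and lift for one item pair over a toy set of transactions. As the Osco episode shows, an analyst still has to choose which pair to examine and judge whether a lift above 1 is meaningful or an anomaly.

```python
# A toy sketch of the market-basket arithmetic behind the legend.
# The transactions are invented; real analyses run over millions.

transactions = [
    {"beer", "diapers", "chips"},
    {"beer", "diapers"},
    {"milk", "bread"},
    {"beer", "chips"},
    {"diapers", "wipes"},
    {"beer", "diapers", "wipes"},
]

def pair_stats(a, b, baskets):
    n = len(baskets)
    n_a = sum(a in t for t in baskets)
    n_b = sum(b in t for t in baskets)
    n_ab = sum(a in t and b in t for t in baskets)
    support = n_ab / n             # how often the pair occurs together
    confidence = n_ab / n_a        # P(b in basket | a in basket)
    lift = confidence / (n_b / n)  # >1 suggests a positive association
    return support, confidence, lift

s, c, l = pair_stats("diapers", "beer", transactions)
print(f"support={s:.2f} confidence={c:.2f} lift={l:.2f}")
```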
The legend is worth discussing, however, for a few lessons it provides. While data mining software is a
wonderful thing, a smart human still needs to interpret the patterns that are identified, decide which patterns
merit validation or subsequent confirmation, and translate new insights into recommendations for action.
Other smart humans need to actually take action. When we studied more than thirty firms with a strong
analytical capability in 2000, we found that a heavy dose of human skills was present at each of the firms,
and the analytical competitors we’ve studied over the years certainly have lots of smart analysts on board.1
The other key lesson of the diapers and beer legend is that analytics aren’t enough, even when orchestrated
by a human analyst. In order for analytics to be of any use, a decision maker has to make a decision and take
action—that is, actually move the diapers and beer together. Since decision makers may not have the time or
ability to perform analyses themselves, such interpersonal attributes as trust and credibility rear their ugly
heads. If the decision maker doesn’t trust the analyst or simply doesn’t pay attention to the results of the
analysis, nothing will happen and the statistics might as well never have been computed.
We found another good example of this problem in our previous study of analytical capability. We talked
to analysts at a large New York bank who were studying branch profitability. The analysts went through a
painstaking study in the New York area—identifying and collecting activity-based costs, allocating
overheads, and even projecting current cost and revenue trends for each branch into the near future. They
came up with a neat, clear, ordered list of all branches and their current and future profitability, with an even
neater red line drawn to separate the branches that should be left open from those that should be closed.
What happened? Nary a branch was shut down. The retail banking executive who had asked for the list
was mostly just curious about the profitability issue, and he hardly knew the analysts. He knew that there
were many political considerations involved in, say, closing the branch in Brooklyn near where the borough
president had grown up, even if it was well below the red line. Analytically based actions usually require a
close, trusting relationship between analyst and decision maker, and that was missing at the bank. Because of
the missing relationship, the analysts didn’t ask the right questions, and the executive didn’t frame the
question for them correctly.
There are really three groups, then, whose analytical skills and orientations are at issue within
organizations. One is the senior management team—and particularly the CEO—which sets the tone for the
organization’s analytical culture and makes the most important decisions. Then there are the professional
analysts, who gather and analyze the data, interpret the results, and report them to decision makers. The third
group is a diverse collection we will refer to as analytical amateurs, a very large group of “everybody else,”
whose use of the outputs of analytical processes is critical to their job performance. These could range from
frontline manufacturing workers, who have to make multiple small decisions on quality and speed, to middle
managers, who also have to make middle-sized decisions with respect to their functions and units. Middle
managers in the business areas designated as distinctive capabilities by their organizations are particularly
important, because they oversee the application of analytics to these strategic processes. IT employees who
put in place the software and hardware for analytics also need some familiarity with the topic. We’ll describe
each of these groups in this chapter.
Before we go any further, however, it is important to point out that the role of humans in analytics is
changing somewhat, and is likely to change more in the near future. One key development is that machine
learning and other intelligent technologies are changing how analytical models are generated. A human
analyst might be able to generate a few new models per week, but a machine learning system could easily
generate tens of thousands of models per week. Thus far, however, machine learning models still need
humans to kick them off, point them in the direction of the right data and the variables to be predicted, and
ensure that the resulting models make sense. We don’t think that machine learning models have inhibited
employment for quantitative analysts and data scientists yet, but they may in the future.
The other technological factor that’s driving change for humans in the world of analytics is automation of
decisions and actions. While an autonomous system is unlikely to close a set of bank branches without
human intervention, complex decisions and digital tasks can both be performed by machine. They can do
things like approve insurance policies, authorize loans, reboot computer servers, and replace and mail a lost
ATM card. It’s likely that the decisions taken over by analytics and computers will be tactical and repetitive
ones, but these probably constitute the bulk of decisions in many organizations. We don’t think high-level
managers will lose their jobs to autonomous systems, but it’s likely that some employees will. The good news
here is that an automated decision system is unlikely to ignore an analytical result. Humans who ignore
analytics in the future will do so at their own peril.
Senior Executives and Analytical Competition
As if CEOs, presidents, COOs, and other senior executives weren’t busy enough already, it is their
responsibility to build the analytical orientation and capabilities of their organizations. If the CEO or a
significant fraction of the senior executive team doesn’t understand or appreciate at least the outputs of
quantitative analysis or the process of fact-based decision making, analysts are going to be relegated to the
back office, and competition will be based on guesswork and gut feel, not analytics. Fact-based decision
making doesn’t always involve analytics—sometimes “facts” may be very simple pieces of evidence, such as
a single piece of data or a customer survey result—but the desire to make decisions on the basis of what’s
really happening in the world is an important cultural attribute of analytical competitors.2
For an example, take Phil Knight, the founder and chairman emeritus of Nike. Knight has always been
known as an inspirational but intuitive leader who closely guarded the mythical Nike brand. Perhaps needless
to say, it didn’t take a lot of analytics to come up with the famous “swoosh.” At the beginning of 2005,
however, Knight brought in William Perez, formerly head of S. C. Johnson & Son, as CEO of Nike. Perez,
accustomed to the data-intensive world of Johnson’s Wax, Windex, and Ziploc bags, attempted to bring a
more analytical style of leadership into Nike. He notes about himself, “I am a data man—I like to know what
the facts are. If you come from the world of packaged goods, where data is always valuable, Nike is very
different. Judgment is very important. Feel is very important. You can’t replace that with facts. But you can
use data to guide you.”3
Perez attempted, for example, to move Nike into middle-tier retail environments, where his data suggested
that footwear growth was fastest. In response to arguments from Knight and other Nike executives that such a
move would weaken the brand, Perez pointed to companies such as Apple that sell successfully in Walmart
without diluting their brands. But these and other clashes eventually led Knight and the board of directors to
remove Perez little more than a year later.
The good news is that Perez’s departure only delayed the rise of analytics at Nike. Over the last several
years, the company has become much more analytical. It uses data and analytics to influence shoe design,
marketing programs, outlet store locations, logistics, and many other types of decisions. Nike has perhaps the
world’s largest analytics group that’s focused on sustainability. The company may have gotten there faster if
Perez had stayed, but it is definitely moving toward analytical competitor status.
The general lesson here, however, is that if a CEO can’t move the culture in a more analytical direction, a
middle or junior manager would have even less of a chance of doing so. In fact, we’ve seen several
companies in which fairly senior functional managers—corporate heads of marketing or technology, for
example—were trying to bring a more analytical orientation to their firms and faced substantial obstacles.
One, for example, a senior vice president of sales and marketing at a technology company, was known as a
true “data hound,” bringing piles of statistical reports to meetings and having perfect command of his own
(and other managers’) statistics. The sales and marketing organizations slowly began to become more data
focused, but for years the overall company culture continued to emphasize chutzpah and confidence more
than correlations. Again, that company eventually embraced analytics (it helped that the “data hound” kept
getting promoted, eventually becoming president), but a more receptive culture would have allowed it to
happen faster.
While there’s no doubt that almost any employee can move an organization in a more analytical direction,
it takes top management commitment for a company to become an analytical competitor. In fact, we didn’t
find a single stage 4 or 5 analytical competitor where either the CEO or a majority of the senior management
team didn’t believe strongly in analytics as a primary competitive resource. Senior executive support is even
important at stage 3, when organizations begin to aspire to analytical competition.
Characteristics of Analytical Leaders
What are the traits that senior executives and other analytical champions in an analytical competitor should
have? A few key ones are described next.
They should be passionate believers in analytical and fact-based decision making
You can’t inspire others to change their behavior in a more analytical direction if you’re not passionate about
the goal. A truly committed executive would demonstrate fact-based and analytical decision making in his or
her own decisions and continually challenge the rest of the organization to do the same. For example,
whenever Barry Beracha, previously CEO of the private-label baker Earthgrains (which was acquired by Sara
Lee Bakery Group), needed to make a decision, he searched to turn up the right data. He insisted that the
entire organization needed better data, and led the company to implement a new ERP system to create it.
After it was available, he pressed his employees to use it when deciding what products to keep and what
customers to serve. He was so passionate about data-based decisions that his employees referred to him as a
“data dog”—to his delight.
They should have some appreciation of analytical tools and methods
The senior executives of analytical competitors don’t necessarily have to be analytical experts (although it
helps!). As Professor Xiao-Li Meng—formerly the chair of the statistics department at Harvard and now dean
of the Graduate School of Arts and Sciences—points out, you don’t need to become a winemaker to become
a wine connoisseur.4 Management users of data analytics do need to have an awareness of what kinds of
tools make sense for particular business problems, and the limitations of those tools. Just as a politician
analyzing polls should know something about confidence intervals, the CEO making a decision about a plant
expansion should know something about the statistical and qualitative assumptions that went into predicting
demand for the goods the plant will make.
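A minimal sketch of that confidence-interval reasoning, with invented numbers for both the poll and the plant decision, might look like this:

```python
# Illustrative confidence-interval arithmetic; all figures are assumed.

import math

# Poll: 540 of 1,000 respondents favor the candidate.
p, n = 540 / 1000, 1000
moe = 1.96 * math.sqrt(p * (1 - p) / n)   # ~95% margin of error
print(f"Poll: {p:.1%} +/- {moe:.1%}")     # the race may be closer than it looks

# Plant decision: forecast demand of 100,000 units with an assumed
# standard error of 12,000. The interval, not the point estimate,
# should drive the capacity decision.
forecast, se = 100_000, 12_000
print(f"Demand: {forecast - 1.96 * se:,.0f} to {forecast + 1.96 * se:,.0f} units")
```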
They should be willing to act on the results of analyses
There is little point in commissioning detailed analytics if nothing different will be done based on the
outcome. Many firms, for example, are able to segment their customers and determine which ones are most
profitable or which are most likely to defect. However, they are reluctant to treat different customers
differently—out of tradition or egalitarianism or whatever. With such compunctions, they will have a very
difficult time becoming successful analytical competitors—yet it is surprising how often companies initiate
analyses without ever acting on them. The “action” stage of any analytical effort is, of course, the only one
that ultimately counts.
They should be willing to manage a meritocracy
With widespread use of analytics in a company, it usually becomes very apparent who is performing and who
isn’t. Those who perform well should be rewarded commensurately with their performance; those who don’t
perform shouldn’t be strung along for long periods. As with customers, when the differences in performance
among employees and managers are visible but not acted on, nothing good results—and better employees
may well become disheartened. Of course, the leaders of such meritocratic firms have to live and die by this
same analytical sword. It would be quite demoralizing for a CEO to preach the analytical gospel for everyone
else but then to make excuses for his or her own performance as an executive.
How Does Analytical Leadership Emerge?
Some organizations’ leaders had the desire to compete analytically from their beginning. Amazon was
viewed by founder Jeff Bezos as competing on analytics from its start. Its concept of personalization was
based on statistical algorithms and web transaction data, and it quickly moved into analytics on supply chain
and marketing issues as well. Amazon used analytics to determine the timing and extent of its holiday
advertising strategy, outspending Walmart in October and November 2016. Capital One, Netflix, and Google
were also analytical from the beginning because their leaders wanted them to be. The visions of the founders of
these startup businesses led to analytical competition.
In other cases, the demand for analytical competition came from a new senior executive arriving at an
established company. Gary Loveman at Caesars, and Tom Ricketts, the new owner of the Chicago Cubs,
brought with them an entirely new analytical strategy.
Sometimes the change comes from a new generation of managers in a family business. At the winemaker
E. & J. Gallo, when Joe Gallo, the son of one of the firm’s founding brothers, became CEO, he focused much
more than the previous generation of leaders on data and analysis—first in sales and later in other functions,
including the assessment of customer taste. At the National Football League’s New England Patriots, the
involvement in the team by Jonathan Kraft, a former management consultant and the son of owner Bob Kraft,
helped move the team in a more analytical direction in terms of both on-field issues like play selection and
team composition and off-field issues affecting the fan experience.
The prime mover for analytical demand doesn’t always have to be the CEO. At Procter & Gamble, for
example, the primary impetus for more analytics at one point came from the firm’s two vice chairmen. One
of them, Bob McDonald, became CEO and accelerated P&G’s analytical journey. And Jonathan Kraft is the
president of the Patriots, not the CEO.
In addition to the general characteristics described earlier in the chapter (which are generally relevant for
the CEO), there are specific roles that particular executives need to play in analytical competition. Three key
roles are the chief financial officer (CFO), the chief information officer (CIO), and the chief data and
analytics officer (CDAO).
Role of the CFO
The chief financial officer in most organizations will have responsibility for financial processes and
information. Therefore, analytical efforts in these domains would also be the CFO’s concern. Since most
analytical projects should involve some sort of financial information or returns, the CFO is at least a partial
player in virtually all of them.
We have found several companies in which the CFO was leading the analytical charge. In order to play
that role effectively, however, a CFO would have to focus on analytical domains in addition to finance and
accounting. For example, at a large insurance company, the CFO had taken responsibility for analytics
related to cost control and management but also monitored and championed analytical initiatives in the
claims, actuarial, and marketing areas of the company. He also made it his responsibility to try to establish in
the company’s employees the right overall balance of intuitive versus analytical thinking.
At Deloitte’s US business, the person leading the charge on analytics (at least for internal consumption) is
Frank Friedman, the CFO. He has assembled a group of data scientists and quantitative analysts within the
Finance organization. They are working with him to address several initiatives, including optimized pricing,
predictive models of performance, identifying services that help sell other services, and factors that drive
receivables. They have also worked to predict which candidates will be successful recruits to the firm.
Another CFO (technically a senior vice president of finance) at a retail company made analytics his
primary focus, and they weren’t even closely connected to finance. The company had a strong focus on
customers and customer orientation, and he played a very active role in developing measures, systems, and
processes to advance that capability. The company already had good information and analytics on such
drivers of its business as labor, space allocation, advertising, and product assortment. His goal was to add the
customer relationship and customer segment information to those factors. Since his role also incorporated
working with the external financial community (Wall Street analysts, for example), he was also working on
making the company’s analytical story well known to the outside world. He also viewed his role as including
advocacy of a strong analytical orientation in a culture where it wasn’t always emphasized. He noted, “I’m
not the only advocate of analytics in the company—I have a number of allies. But I am trying to ensure that
we tell our stories, both internally and externally, with numbers and analytics.”
At Bank of America, CFO (2005–2006) Al de Molina viewed himself as a major instigator of analytical
activity. The bank had tried—and largely failed with—a big data warehouse in the early 1990s, so managers
were generally wary of gathering together and integrating data. But in his previous job as head of the treasury
function, de Molina felt that in order to accurately assess the bank’s risks, it needed to consolidate
information about assets and rates across the bank. Since the bank was growing rapidly and was assimilating
several acquisitions, integrating the information wasn’t easy, but de Molina pushed it anyway. The CFO also
took responsibility for analytics around US macroeconomic performance. Since it has a wealth of data on the
spending habits of American consumers, Bank of America could make predictions on the monthly
fluctuations in macroeconomic indicators that drive capital markets. This had obvious beneficial implications
for the bank’s risks. Both the interest rate risk and the macroeconomic analytics domains are obvious ones for
a CFO’s focus. De Molina largely deferred to other executives where, for example, marketing analytics were
concerned. De Molina left Bank of America and eventually was named CEO of GMAC, now Ally Financial.
Role of the CIO
The CEO or another top operating executive will have the primary responsibility for changing the culture and
the analytical behaviors of employees. But CIOs play a crucial role in this regard too. They can work with
their executive peers to decide what behaviors are necessary and how to elicit them.
At the telecommunications firm Verizon, the CIO’s objective is to create a similar change in analytical
culture. Verizon and other firms arising out of the “Bell System” have long been analytically oriented, but
decisions were generally made slowly and were pushed up the organizational hierarchy. While CIO at
Verizon from 2000 to 2010 (he later became CEO of Juniper Networks and Coriant, another telecom
equipment firm), Shaygan Kheradpir attempted to change this culture through continual exposure to
information. He created a form of continuous scorecard in which hundreds of performance metrics of various
types are broadcast to PCs around the company, each occupying the screen for fifteen seconds. The idea was
to get everyone—not just senior executives—focused on information and what it means, and to encourage
employees at all levels to address any issues that appear in the data. Kheradpir felt that the use of the
scorecard changed Verizon’s culture in a positive direction.
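A toy sketch of that rotation mechanism appears below; the metric names and values are invented, and this is only an illustration of the fifteen-second cycle described above, not Verizon's actual system.

```python
# A toy sketch of a rotating scorecard: each metric holds the screen
# for fifteen seconds, then yields to the next. Metrics are invented.

import itertools
import time

metrics = [
    ("Call-center answer rate", "92.4%"),
    ("Orders provisioned on time", "88.1%"),
    ("Open network trouble tickets", "1,204"),
]

def run_scorecard(display_seconds=15, cycles=1):
    shown = itertools.islice(itertools.cycle(metrics), cycles * len(metrics))
    for name, value in shown:
        print(f"{name}: {value}")      # stand-in for a broadcast to PCs
        time.sleep(display_seconds)

run_scorecard(display_seconds=1, cycles=1)  # shortened for a quick demo
```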
Of course, the most traditional approach to analytics for CIOs is through technology. The CIO must craft
an enterprise information strategy that serves the needs of everyone in the organization. This includes much
more than running the enterprise transaction applications, management reporting, and external websites. A
technology infrastructure must be capable of delivering the data, analytics, and tools needed by employees
across the organization. Chapter 8 discusses analytical technology, and it should certainly be apparent from
reading that chapter that both an architect and a leader are necessary. Those roles may not have to be played
by the CIO personally, but the person(s) playing them would in all likelihood at least report to the CIO.
The CIO may also provide a home and a reporting relationship for specialized analytical experts. Such
analysts make extensive use of IT and online data, and they are similar in temperament to other IT people.
Some of the analytical competitors where analytical groups report to the office of the CIO include Procter &
Gamble, the trucking company Schneider National, Inc., and Marriott. Procter & Gamble, for example,
consolidated its analytical organizations for operations and supply chain, marketing, and other functions. This
allowed a critical mass of analytical expertise to be deployed to address P&G’s most critical business issues
by “embedded” analysts within functions and units. The group reported to the CIO and was part of an overall
emphasis within the IT function on information and decision making (in fact, the IT function was renamed
“information and decision solutions” at Procter & Gamble). Then-CIO Filippo Passerini worked closely with
vice chairman, later CEO, Bob McDonald to architect a much more analytical approach to global
decision-making at the company. They developed a series of innovations, including “business sphere” rooms
for data-driven decision-making, and “decision cockpits” with real-time data for over fifty thousand
employees.
CIOs wanting to play an even more valuable analytical role than simply overseeing the technology will
focus on the I in their titles—the information. Analytical competition, of course, is all about information—do
we have the right information, is it truly reflective of our performance, and how do we get people to make
decisions based on information? These issues are more complex and multifaceted than buying and managing
the right technology, but organizations wishing to compete on analytics will need to master them. Research
from one important study suggests that companies focusing on their information orientations perform better
than those that address technology alone.5 The authors of the study argue that information orientation
consists of information behaviors and values, information management practices, and information technology
practices—whereas many CIOs address only the latter category. While that study was not primarily focused
on analytics, it’s a pretty safe bet that information orientation is highly correlated with analytical success.
Role of the CDAO (Chief Data and Analytics Officer)
As we mentioned in chapter 2, many analytical competitors have created a new role, the chief data and
analytics officer (or sometimes only chief analytics officer or chief data officer, still with analytics
responsibilities). The CDAO is responsible for ensuring that the enterprise has the data, organizational
capabilities, and mindset needed to successfully compete on analytics. Gartner describes the role this way:
“The CDO is a senior executive who bears responsibility for the firm’s enterprise wide data and information
strategy, governance, control, policy development, and effective exploitation. The CDO’s role will combine
accountability and responsibility for information protection and privacy, information governance, data quality
and data life cycle management, along with the exploitation of data assets to create business value.”6
The CDAO serves as the champion and passionate advocate for the adoption of big data analytics in the
organization. The analysts and data scientists in a firm may report directly to the CDAO, or they may have a
matrixed reporting relationship. At a minimum, the CDAO keeps data scientists and other analysts
productively focused on important business objectives, clears bureaucratic obstacles, and establishes effective
partnerships with business customers. Many CDAOs tell us that they spend half their time “evangelizing” for
analytics with the business community.
Depending on the organization and its strategic priorities, the CDAO may report variously to the CEO,
COO, CIO, chief risk officer, or the chief marketing officer. Since the CDAO does not directly own business
processes, he or she must work closely with the rest of the management team to embed data analytics into
decision making and operations. And this executive must ensure that analysts’ insights are put into practice
and produce measurable outcomes. If data management and analytics are combined into one CDAO role, it’s
important for the incumbents to carve out time for both defense—data security, privacy, governance,
etc.—and offense, which includes the use of analytics to create business value.7
What If Executive Commitment Is Lacking?
The enemies of an analytical orientation are decisions based solely on intuition and gut feel. Yet these have
always been popular approaches to decision making because of their ease and speed and a belief that gut-feel
decisions may be better. As we noted in chapter 6, the presence of a committed, passionate CEO or other top
executive can put the organization on the fast track to analytical competition. But for those organizations
without sufficient demand for data and analysis in executive decision making, the obvious question is
whether such demand can be stimulated. If there is no senior executive with a strong analytical orientation,
must the organization wait for such a manager to be appointed?
If you don’t have committed executives, it’s going to be difficult to do much as an outright analytical
competitor, but you can lay the groundwork for a more analytical future. If you’re in a position to influence
the IT infrastructure, you can make sure that your technical platforms, transaction systems, data, and business
intelligence software are in good shape, which means that they reliably produce data and information that is
timely and accurate. You can obtain and encourage the use of analytical software, programming languages,
and data visualization tools. If you head a function or unit, you can make headway toward a smaller-scale
analytical transformation of your part of the business. If you are really smart, influential, and politically
astute, you might even plot an analytical coup and depose your non-analytical rulers. But needless to say,
that’s a risky career strategy.
There are approaches that can be taken to stimulate demand for analytics by executives. These would
generally be actions on the prove-it detour described in chapter 6. At one pharmaceutical firm where we
interviewed several IT executives, there was generally little demand from senior executives for analytical
decision making, particularly in marketing. IT managers didn’t have access to the decisions marketers were
trying to make, and the marketing executives didn’t know what data or analysis might be available to support
their decisions. However, two external events offered opportunities to build analytical demand. One
marketing manager discovered a vendor who showed how sales data could be displayed graphically in terms
of geography on an interactive map. The company’s IT executives felt that the display technique was
relatively simple, and they began to offer similar capabilities to the manager to try to build on his interest and
nurture the demand for marketing analytics.
A second opportunity was offered by an external study from a consulting firm. One outcome of the study
will be a new set of performance indicators. The IT group plans to seize on the indicators and will offer more
analysis and related data to the management team. These IT managers refuse to wait until more analytically
oriented senior executives happen to arrive at the company.
Analytical Professionals and Data Scientists
There is an old joke about analytical professionals. It goes this way:
Question: What did the math PhD say to the MBA graduate?
Answer: Would you like fries with that?
That joke is now totally obsolete, as demand for analytical talent has skyrocketed. Data science, math, and
other analytical professionals are being avidly recruited to play key roles in helping companies compete on
analytics.
In addition to committed executives, most of the analytical competitors we studied had a group of smart
and hardworking analytical professionals within their ranks. It is the job of these professionals to design and
carry out experiments and tests, to define and refine analytical algorithms, and to perform data mining and
statistical analyses on key data. Analytical pros create the predictive and prescriptive analytics applications
used in the organization. In most cases, such individuals would have advanced degrees—often PhDs—in
such analytical fields as statistics, data science, econometrics, mathematics, operations research, logistics,
physics, and marketing research. As they become more widely available, they are being joined by a new
generation of analysts with master’s degrees in applied analytics, informatics, and data science. In some
cases, where the company’s distinctive capabilities involve a specialized field (such as geology for an oil
exploration firm), the advanced degree will be in that specialty.
One great example of this type of person we found in our research is Katrina Lane. We first encountered
her as vice president of channel marketing at Caesars Entertainment. There, Lane had the job of figuring out
which marketing initiatives to move through which channels, including direct mail, email, call centers, and so
forth. This is a complex field of business that hasn’t been taught in most business schools, so Lane had to
figure a lot out on her own. Fortunately, her skills were up to the task. To start with, she has a PhD in
experimental physics from Cornell. She was head of marketing for a business unit of the May Department
Stores Company and a consultant with McKinsey & Company’s marketing and sales practice. How common
is such a combination of skills and experience? Not very, which is why assembling a capable group of
analytical professionals is never easy. The rarity of her skills also explains why Lane was promoted to chief
technology officer at Caesars, and then recruited to be an executive vice president and general manager of
marketing and operations for Consumer Cards at American Express. Now she is the VP of Global Delivery
Experience for Amazon. She clearly excels in highly analytical management roles in highly analytical
companies.
With her PhD in experimental physics, Lane would today be hired as a “data scientist,” a job that can bring
structure to unstructured data, create sophisticated models to analyze it, and interpret the results for their
implications for key decisions and business directions. Data scientists, whom Tom and his coauthor D. J.
Patil (until recently, the chief data scientist in the White House) described as holding “the sexiest job of the
21st century,”8 are in hot demand. Some starting salaries for data scientists exceed $200,000. Their earliest
employers were Silicon Valley startups, but now they’re being hired by large traditional firms as well.
Procter & Gamble, for example, went from one data scientist in 2013 to over thirty in 2017. GE hired a
couple hundred of them for its GE Digital operation in the San Francisco Bay area. There simply aren’t
enough to go around.
Even Google, which is one of the most desired employers on the planet right now, has challenges getting
this sort of talent. It offers generous salaries, stock options, and what is reputed to be the best cafeteria food
anywhere. Yet UC Berkeley professor Hal Varian, who has worked with Google as a consultant since 2002,
notes the difficulty of hiring analytical professionals and data scientists there: “One point that I think needs
more emphasis is the difficulty of hiring in this area. Given the emphasis on data, data warehousing, data
mining and the like you would think that this would be a popular career area for statisticians. Not so! The
bright ones all want to go into biotech, alas. So it is quite hard to pull the talent together, even for Google.”9
Assuming you can find them, how many of these people are necessary? Of course, the answer depends on
what an organization is attempting to do with analytics. In the companies we studied, the numbers range from
about a dozen analytical professionals and data scientists to several hundred. GE had a goal of hiring four
hundred data scientists for its software and analytics business based in the San Francisco area. We’re not sure
of the exact number the company hired, but there were at least two hundred in this central organization, and
now other business units are hiring their own. Procter & Gamble has two hundred or so. Google has about
five hundred people with the “quantitative analyst” title, and thousands who do some sort of analytical work.
How are they organized? Most companies have centralized them to some degree, although organizational
structures fluctuate over time. Procter & Gamble, for example, took analytical groups that had been dispersed
around the organization, and combined them to form a new global analytics group as part of the IT
organization. Then it decentralized them a bit while retaining a dotted line to the chief data officer. AIG
created a centralized sixty-person science office for advanced analytics, but then decentralized it for greater
responsiveness to the business.
Another logical alternative as an organizational home for these high-powered analysts would be the
business function that is the primary competitive thrust for the organization. For example, Caesars keeps
most of its “rocket scientists” (including Katrina Lane at the time) in the marketing department, because
customer loyalty programs and improved customer service are the primary orientation of its analytics.
One argument in favor of central coordination is that at the most advanced stages of analytics, extensive
knowledge of specialized statistical methods is required. As a result of what we learned about the companies
we studied, we believe it is impractical for these advanced skills to be broadly distributed throughout the
organization. Most organizations will need to have groups that can perform more sophisticated analyses and
set up detailed experiments, and we found them in most of the companies we interviewed. It’s unlikely, for
example, that a “single echelon, uncapacitated, nonstationary inventory management algorithm,” employed
by one analytical competitor we studied in the supply chain area, would be developed by an amateur analyst.
You don’t learn about that sort of thing in a typical MBA program.
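For a sense of what such an algorithm involves, here is a heavily simplified sketch of a single-echelon, uncapacitated, nonstationary order-up-to policy: each period's target stock level is set from that period's demand forecast and a service-level quantile. The forecasts, the normal-demand assumption, and the 95 percent service level are all illustrative; the production algorithm the company used would be considerably more sophisticated.

```python
# A simplified sketch of a nonstationary base-stock (order-up-to)
# policy for a single echelon with no capacity limit. All numbers
# are invented; demand is assumed normally distributed per period.

from statistics import NormalDist

z = NormalDist().inv_cdf(0.95)                 # 95% service-level quantile

# (mean demand, std dev) per period; nonstationary, so targets differ
forecasts = [(100, 20), (140, 25), (90, 18), (200, 40)]

on_hand = 120                                  # assumed starting inventory
for t, (mu, sigma) in enumerate(forecasts, start=1):
    base_stock = mu + z * sigma                # order-up-to target
    order_qty = max(0, base_stock - on_hand)   # never order a negative amount
    print(f"Period {t}: order {order_qty:.0f} units (target {base_stock:.0f})")
    on_hand = base_stock - mu                  # expected position next period
```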
Most analytical competitors we have studied are evolving toward a hub-and-spoke organization reporting
to the CDAO. Analysts are allocated among business units, corporate, and functions so they can specialize
and work closely with decision makers. The centralized hub is responsible for knowledge sharing, spreading
best practices, analytics training, career path development, and common standards and tools. Often, the most
highly skilled analysts and data scientists are centralized in the hub so they can be strategically deployed to
work on the enterprise’s most pressing projects.
Regardless of where the professional analysts are located in their firms, many of the analytical competitors
we interviewed stressed the importance of a close and trusting relationship between these analysts and the
decision makers. As the head of one group put it, “We’re not selling analytics, we’re selling trust.” The need
is for analytical experts who also understand the business in general and the particular business need of a
specific decision maker. One company referred to such individuals as “front room statisticians,”
distinguishing them from “backroom statisticians” who have analytical skills but who are not terribly
business oriented and may also not have a high degree of interpersonal skills.
In order to facilitate this relationship, a consumer products firm with an IT-based analytical group hires
what it calls “PhDs with personality”—individuals with heavy quantitative skills but also the ability to speak
the language of the business and market their work to internal (and, in some cases, external) customers. One
typical set of job requirements (listed on Monster.com for a data scientist in Amazon’s Advertising Platform
group) reads:
PhD in CS [computer science] machine learning, operational research, statistics or in a highly
quantitative field
8+ years of hands-on experience in predictive modeling and analysis
Strong grasp of machine learning, data mining and data analytics techniques
Strong problem solving ability
Comfortable using Java or C++/C. Experience in using Perl or Python (or similar scripting language)
for data processing and analytics
Experience in using R, Weka, SAS, Matlab, or any other statistical software
Communication and data presentation skill
Some of these skills overlap with those of traditional quantitative analysts, but some don’t. Data scientists
tend to have more computer science–oriented backgrounds, whereas analysts tend to have a more statistical
focus. Data scientists are also more likely to be familiar with open-source software and machine learning, and
perhaps more likely to have a PhD in a scientific discipline. There are also differences in culture and attitude
to some degree. A table of typical differences between these two groups is provided in table 7-1, which is
based on a survey Jeanne led in 2014. Over time (and with the popularity of the data scientist title), however,
these categories have become somewhat intermingled, and the differences may be diminishing.
TABLE 7-1

Analysts and data scientists: not quite a different species

| | Analysts | Data scientists |
| --- | --- | --- |
| Data structure | Structured and semistructured, mostly numeric | All, predominantly unstructured |
| Data types | Mostly numeric | All, including images, sound, and text |
| Preferred tools | Statistical and modeling tools, on data usually residing in a repository such as a data warehouse | Mathematical languages (such as R and Python), machine learning, natural language processing, and open-source tools; data on multiple servers (such as Hadoop) |
| Nature of assignment | Report, predict, prescribe, optimize | Explore, discover, investigate, visualize |
| Educational background | Operations research, statistics, applied analytics | Computer science, data science, symbolic systems, cognitive science |
| Mindset: entrepreneurial | 69% | 96% |
| Mindset: explore new area | 58% | 85% |
| Mindset: insights outside of projects | 54% | 89% |

Source: Jeanne G. Harris and Vijay Mehrotra, “Getting Value from Your Data Scientists,” MIT Sloan Management Review (Fall 2014).
Whatever the role is called, business relationships are a critical component of it. At Wells Fargo, a
manager of a customer analytics group described the relationships his group tries to maintain: “We are trying
to build our people as part of the business team; we want them sitting at the business table, participating in a
discussion of what the key issues are, determining what information the business people need to have, and
recommending actions to the business partners. We want this [analytical group] to be more than a general
utility, but rather an active and critical part of the business unit’s success.”10
Other executives who manage or have managed analytics groups described some of the critical success
factors for them in our interviews:
Building a sustainable pipeline. Analytical groups need a pipeline of projects, client relationships, and
analytical technologies. The key is not just to have a successful project or two but to create
sustainability for the organization over time. It’s no good to invest in all these capabilities and have the
analytical initiative be in existence for only a few years. However, it takes time to build success stories
and to have analytics become part of the mythology of the organization. The stories instill a mindset in
the company so that decision makers have the confidence to act.
Relationship to IT. Even if an analytical group isn’t officially part of an IT organization, it needs to
maintain a close relationship with it. Analytical competitors often need to “push the frontier” in terms
of IT. One analytical group’s manager in a consumer products firm said that his group had been early
users of supercomputers and multiuser servers and had hosted the first website for a product (which got
thousands of hits on the first day). Exploration of new information technologies wasn’t an official
mission for the group but one that the company appreciated. It also helped capture the attention and
imagination of the company’s senior management team.
Governance and funding. How the group is directed and funded is very critical, according to the
executives we interviewed. The key issue is to direct analytical groups at the most important problems
of the business. This can be done either by mandate or advice from some form of steering committee,
or through the funding process. When steering committees are made up of lower-level executives, the
tendency seems to be to suboptimize the use of analytical resources. More senior and strategic
members create more strategic priorities for analytical professionals. Funding can also lead to strategic
or tactical targets. One group we interviewed was entirely paid for by corporate overhead, which meant
that it didn’t have to go “tin cupping” for funding and work on unimportant problems that somehow
had a budget. Another group did have to seek funding for each project, and said they had to
occasionally go to senior management to get permission to turn down lucrative but less important
work.
Managing politics. There are often tricky political issues involved in analytical professionals’ work,
ranging from whose projects the group works on to what the function is called. One company called its
analytical group Decision Analysis, only to find that executives objected because they felt it was their
job to make decisions. The group then changed its name to Options Analysis. It can also be politically
difficult simply to employ analysts. As one head of an analytical group put it: “So I go to Market
Research and say, ‘I have a better way to evaluate advertising expenditures.’ They don’t necessarily
react with joy. It’s very threatening to them. It makes it particularly difficult if you are asking them to
pay you to make them look bad!”11
Heads of analytical groups have to be sensitive to political issues and try to avoid political
minefields. The problem is that people who are good at analytics do not often have patience for
corporate politics! Before any analysis, they should establish with the internal client that both client
and analyst have no stake in any particular outcome, and they will let the analytical chips fall where
they may. CEOs can help their analytical professionals by making it clear that the culture rewards those
who make evidence-based decisions, even when they go against previously established policies.
Don’t get ahead of users. It’s important for analytical professionals to keep in mind that their
algorithms and processes must often be implemented by information workers, who may be analytically
astute but are not expert statisticians. If the analytics and the resulting conclusions are too complex or
laced with arcane statistical terminology, they will likely be ignored. One approach is to keep the
analytics as simple as possible or to embed them into systems that hide their complexity. Another is to
train users of analytical approaches as much as possible. Schneider National’s analytics group has
offered courses such as “Introduction to Data Analysis” and “Statistical Process Control” for users in
various functions at the company. It’s not a formal responsibility for the group, but the courses are
popular, and the group feels it makes their job easier in the long run.
Offshore or Outsourced Analytical Professionals
With expert analytical professionals in short supply within US and European companies, many are
considering the possibility of outsourcing them or even going to India or China to find them. It’s certainly
true that an increasing number of firms offer “knowledge process outsourcing” in analytical fields including
data mining, algorithm development, and quantitative finance. In India, firms such as Mu Sigma,
Evalueserve, and Genpact have substantial practices in these domains. Genpact did work in analytical credit
analysis when it was a part of GE Capital, and now also offers services in marketing and sales analytics. Most
major consulting firms, including Accenture, Deloitte, and IBM, have large analytics groups based in India.
However, it is difficult for analysts to develop a trusting relationship with decision makers from several
thousand miles away. It is likely that the only successful business models for this type of work will combine
onshore and offshore capabilities. Onshore analysts can work closely with decision makers, while offshore
specialists can do back-office analytical work. If a particular analytical application can be clearly described
by the business owner or sponsor before it is developed, there is a good chance that development of the
relevant algorithms could be successfully outsourced or taken offshore.
Analytical Amateurs
Much of the daily work of an analytically focused strategy has to be implemented by people without PhDs in
statistics or operations research. A key issue, then, is how much analytical sophistication frontline workers
need in order to do their jobs. Of course, the nature and extent of needed skills will vary by the company and
industry situations. Some firms, such as Capital One, hire a large number of amateur analysts—people with
some analytical background (perhaps MBAs), but mostly not PhD types. At one point, when we looked at
open jobs on Capital One’s website, there were three times as many analyst openings as there were jobs in
operations—hardly the usual ratio for a bank. According to its particular analytical orientation, a company
simply needs to determine how many analytical amateurs it needs in what positions. Some may verge on
being professionals (we call them analytical semi-professionals); others may have very limited analytical
skills but still have to work in business processes that are heavily based on analytics.
The Boston Red Sox’s situation in 2003, which we described in chapter 1, is an example of needing to
spread analytical orientations throughout the organization. For more business-oriented examples, we’ll
describe two organizations that are attempting to compete on supply chain analytics. One, a beer
manufacturer, put in new supply chain optimization software to ensure that it manufactured and shipped the
right amount of beer at the right time. It even created a new position, “beer flow coordinator” (if only we had
such a title on our business cards!) to use the system and oversee the optimization algorithms and process.
Yet the company’s managers admitted that the beer flow coordinators didn’t have the skills to make the
process work. No new people were hired, and no substantial training was done. The new system, at least in its
early days, was not being used. The company was expecting, one might say, champagne skills on a
beer-skills budget.
At a polymer chemicals company, many of the company’s products had become commoditized. Executives
believed that it was important to optimize the global supply chain to squeeze maximum value and cost out of
it. The complexity of the unit’s supply chain had significantly increased over the previous couple of years.
Responding to the increased complexity, the organization created a global supply chain organization,
members of which were responsible for the movement of products and supplies around the world. In the new
organization, someone was responsible for the global supply chain, there were planning groups in the
regions, and then planners in the different sites. The greatest challenge in the supply chain, however,
involved the people who did the work. The new roles were more complex and required a higher degree of
analytical sophistication. The company knew that the people performing the previous supply chain roles
didn’t have the skills to perform the new analytical jobs, but it kept them anyway. At some point, the
company plans to develop an inventory of skills needed and an approach to developing or hiring for them, but
thus far the lack of skills remains a bottleneck in the implementation of its new logistical process.
When a company is an analytical competitor, it will need to ensure that a wide variety of employees have
some exposure to analytics. Managers and business analysts are increasingly being called on to conduct
data-driven experiments, interpret data, and create innovative data-based products and services. Many
companies have concluded that their employees require additional skills to thrive in a more analytical
environment. An Avanade survey found that more than 63 percent of respondents said their employees need
to develop new skills to translate big data analytics into insights and business value.12 Anders Reinhardt,
formerly head of Global Business Intelligence for the VELUX Group—an international manufacturer of
skylights, solar panels, and other roof products based in Denmark—is convinced that “the standard way of
training, where we simply explain to business users how to access data and reports, is not enough anymore.
Big data is much more demanding on the user.”13
To succeed as an analytical competitor, information workers and decision makers need to become adept at
three core skills:14
Experimental: Managers and business analysts must be able to apply the principles of scientific
experimentation to their business. They must know how to construct intelligent hypotheses. They also
need to understand the principles of experimental testing and design, including population selection
and sampling, in order to evaluate the validity of data analyses. As randomized testing and
experimentation become more commonplace in the financial services, retail, and telecommunications
industries, a background in scientific experimental design will be particularly valued. Google’s
recruiters know that experimentation and testing are integral parts of their culture and business
processes. So job applicants are asked questions such as “How many tennis balls would fit in a school
bus?” or “How many sewer covers are there in Brooklyn?” The point isn’t to find the right answer but
to test the applicant’s skills in experimental design, logic, and quantitative analysis. (A minimal sketch of evaluating such an experiment follows this list.)
Numerate: Analytical leaders tell us that an increasingly critical skill for their workforce is to become
more adept in the interpretation and use of numeric data. VELUX’s Reinhardt explains that “Business
users don’t need to be statisticians, but they need to understand the proper usage of statistical methods.
We want our business users to understand how to interpret data, metrics, and the results of statistical
models.” Some companies, out of necessity, make sure that their employees are already highly adept at
mathematical reasoning when they are hired. Capital One’s hiring practices are geared toward hiring
highly analytical and numerate employees into every aspect of the business. Prospective employees,
including senior executives, go through a rigorous interview process, including tests of their
mathematical reasoning, logic, and problem-solving abilities.
Data literate: Managers increasingly need to be adept at finding, manipulating, managing, and
interpreting data, including not just numbers but also text and images. Data literacy is rapidly
becoming an integral aspect of every business function and activity. Procter & Gamble’s former
chairman and CEO Bob McDonald is convinced that “data modeling, simulation, and other digital
tools are reshaping how we innovate.” And that changed the skills needed by his employees. To meet
this challenge, P&G created “a baseline digital-skills inventory that’s tailored to every level of
advancement in the organization.” The current CEO, David Taylor, also supports and has continued
this policy. At VELUX, data literacy training for business users is a priority. Managers need to
understand what data is available, and to use data visualization techniques to process and interpret it.
“Perhaps most importantly, we need to help them to imagine how new types of data can lead to new
insights,” notes Reinhardt.15
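To make the experimental skill concrete, here is a minimal sketch, in Python, of how an analyst might evaluate a randomized price test of the kind mentioned above. The sample sizes and conversion counts are invented, and the pooled two-proportion z-test is a standard textbook technique rather than any particular company's method.

```python
import math

# Hypothetical results of a randomized price test: each visitor was
# assigned at random to a control or treatment price.
control_conversions, control_n = 410, 10_000
treatment_conversions, treatment_n = 466, 10_000

p1 = control_conversions / control_n
p2 = treatment_conversions / treatment_n

# Pooled two-proportion z-test: is the observed lift real or sampling noise?
p_pool = (control_conversions + treatment_conversions) / (control_n + treatment_n)
se = math.sqrt(p_pool * (1 - p_pool) * (1 / control_n + 1 / treatment_n))
z = (p2 - p1) / se

# Two-sided p-value from the normal distribution (erfc covers both tails).
p_value = math.erfc(abs(z) / math.sqrt(2))
print(f"lift = {p2 - p1:.4f}, z = {z:.2f}, p = {p_value:.4f}")
```

The arithmetic itself is routine; the managerial skill is knowing that random assignment and adequate sample size are what make the resulting p-value trustworthy.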
Depending on the business function, additional expertise may be needed. Most IT people, for example,
should have some sense of what analyses are being performed on data, so that they can ensure that IT
applications and databases create and manage data in the right formats for analysis. HR people need to
understand something about analytics so that they can hire people with the right kinds of analytical skills.
Even the corporate legal staff may need to understand the implications of a firm’s approach to analytical and
automated decision making in case something goes awry in the process.
Firms that have upgraded the analytical skills of employees and managers are starting to see benefits. For
example, at a consumer products firm with an analytical strategy, executives report a sea change in middle
managers. Upper middle management has analytical expertise, either from mathematical backgrounds or
from company experience. Two of the central analytical group’s key clients have new managers who are
more analytical. They were sought out for their analytical orientations and have been very supportive of
analytical competition. The analytical managers are more challenging and drive the professional analyst
group to higher levels of performance. The senior management team now has analytical discussions, not
political ones.
Tools for Amateurs
One of the issues for amateur analysts is what IT tools they use to deal with analytics. There are three
possible choices, and none seems ideal. One choice is to give them powerful statistical analysis tools so that
they can mine data and create powerful algorithms (which they are unlikely to have the skills to do). A
second choice is to have the prescriptive models simply spit out the right answer: the price that should be
charged, the amount of inventory to be shipped, and so on. While we think this may be the best of the three
options, it may sometimes limit the person’s ability to use data and make decisions. The third option, which
is by far the most common, is to have amateurs do their analytics on spreadsheets.
Spreadsheets (by which we really mean Microsoft Excel, of course) are still the predominant tool by which
amateurs manipulate data and perform analytics. Spreadsheets have some strengths, or they wouldn’t be so
common. They are easy to use (at least the basic capabilities); the row-and-column format is widely
understood; and they are inexpensive (since Excel comes bundled with widely used office productivity
software). Yet as we point out in chapter 2, spreadsheets are a problematic tool for widespread analytical
activity. It’s very difficult to maintain a consistent, “one version of the truth” analytical environment across
an enterprise with a large number of user-managed spreadsheets. And spreadsheets often have errors. Any
firm that embraces spreadsheets as the primary tool for analytical amateurs must have a strong approach to
data architecture and strong controls over analytics.
An intermediate approach would be to give amateur analysts the ability to view and analyze data, while
still providing a structure for the analytical workflow. Vendors of business intelligence and data visualization
software make such a workflow available. They allow the more analytically sophisticated users to do their
own visual queries or create visual reports, while letting less sophisticated users observe and understand
some of the analytical processes being followed. These tools are becoming increasingly popular and are
leading to the democratization of analytics. At some point, we may even see the emergence of the “citizen
data scientist,” for whom most of the difficult data management and analysis tasks are done by intelligent
machines.
Autonomous Decision Making
Another critical factor involving analytical amateurs that must be addressed is how highly automated a
solution for a given problem should be.16 As automating more and more decisions becomes possible, it is
increasingly important for organizations to address which decisions have to be made by people and which
can be computerized. Automated decision applications are typically triggered without human intervention:
they sense online data or conditions, apply analytical algorithms or codified knowledge (usually in the form
of rules), and make decisions—all with minimal human intervention.
Fully automated applications are configured to translate high-volume or routine decisions into action
quickly, accurately, and efficiently because they are embedded into the normal flow of work. Among
analytical competitors, we found automated decision technologies being used for a variety of operational
decisions, including extension of credit, pricing, yield management, and insurance underwriting. If experts
can readily codify the decision rules and if high-quality data is available, the conditions are ripe for
automating the decision.17 Bank credit decisions are a good example; they are repetitive, are susceptible to
uniform criteria, and can be made by drawing on the vast supply of consumer credit data that is available.
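As an illustration of what such codified rules look like, here is a minimal sketch in Python of an automated credit decision. The fields and thresholds are invented, not any bank's actual underwriting criteria; note that ambiguous cases escalate to a person rather than being decided automatically.

```python
from dataclasses import dataclass

@dataclass
class CreditApplication:
    credit_score: int      # from a consumer credit bureau
    annual_income: float
    debt_to_income: float  # existing debt payments divided by income

def decide(app: CreditApplication) -> str:
    """Codified expert rules, applied with no human intervention."""
    if app.credit_score < 580 or app.debt_to_income > 0.45:
        return "decline"
    if (app.credit_score >= 720 and app.debt_to_income < 0.30
            and app.annual_income >= 30_000):
        return "approve"
    return "refer to human underwriter"  # ambiguous cases escalate

print(decide(CreditApplication(credit_score=740, annual_income=90_000,
                               debt_to_income=0.22)))  # -> approve
```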
Still, some types of decisions, while infrequently made, lend themselves well to automation—particularly
cases where decision speed is crucial. For example, in the electrical energy grid, quick and accurate shutoff
decisions at the regional level are essential to avert a systemwide failure. The value of this rapid response
capability has often been demonstrated in large power outages, when automated systems in regions of the
United States have been able to respond quickly to power surges to their networks by shutting off or
redirecting power to neighboring lines with spare capacity. It is also evident in some of today’s most
advanced emergency response systems, which can automatically decide how to coordinate ambulances and
emergency rooms across a city in the event of a major disaster.
Autonomous decision-making applications have some limitations, however. Even when fully automating a
decision process is possible, fiduciary, legal, or ethical issues may still require a responsible person to play an
active role. Also, automated decisions create some challenges for the organization. Because automated
decision systems can lead to the reduction of large staffs of information workers to just a handful of experts,
management must focus on keeping the right people—those with the highest possible skills and effectiveness.
This expert-only approach, however, raises the question of where tomorrow’s experts will come from.
The Override Issue
A related issue is how amateurs should deal with automated decisions with which they don’t agree. Some
firms, such as Caesars, discourage employees from overriding their automated analytical systems, because
they have evidence that the systems get better results than people do. A hotel manager, for example, is not
allowed to override the company’s revenue management system, which figures out the ideal price for a room
based on availability trends and the loyalty level of the customer.
Marriott, as we’ve described in chapter 3, has similar revenue management systems for its hotels. Yet the
company actually encourages its regional “revenue leaders” to override the system. It has devised ways for
regional managers to introduce fast-breaking, anomalous information when local events unexpectedly affect
normal operating data—such as when Houston was inundated with Hurricane Katrina evacuees. The revenue
management system noticed that an unexpected number of people wanted Marriott rooms in Houston in
August, and on its own it would have raised rates. But Marriott hardly wanted to discourage evacuees from
staying in its Houston-area hotels, so revenue leaders overrode the system and lowered rates. Marriott
executives say that such an approach to overriding automated systems is part of a general corporate
philosophy. Otherwise, they argue, they wouldn’t have bothered to hire and train analytically capable people
who make good decisions.
Why these two different philosophies? There are different systems involved, different business processes,
and different levels of skill. Companies with a high skill level among analytical amateurs may want to
encourage overrides when people think they know more than the system. Partners HealthCare’s physicians,
who are often also professors at Harvard Medical School, are encouraged to override automated decision
systems when doing so is in the best interest of the patient. With such highly trained experts involved in the
process, the best result is probably from the combination of humans and automated decision rules.
Companies that feel they have most of the variables covered in their automated analytical models—and
that have lower levels of analytical skills at the front line—may prefer to take a hard line on overrides. To
some degree, the question can be decided empirically—if overrides usually result in better decisions, they
should be encouraged. If not, they should be prohibited most of the time. If a company does decide to allow
overrides, it should develop some systematic means of capturing the reasons for them so that the automated
model might be improved through the input. At Partners, for example, physicians are asked to give a reason
when they override the automated system, and physicians who constantly override a particular system
recommendation are interviewed about their reasoning.
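The Partners practice suggests a simple mechanism for capturing overrides. Here is a minimal sketch in Python, assuming a hypothetical SQLite table and review threshold: every override must carry a reason, and users who constantly override are flagged for follow-up, mirroring the feedback loop described above.

```python
import sqlite3
from collections import Counter

con = sqlite3.connect(":memory:")  # a shared database in practice
con.execute("""CREATE TABLE overrides
               (user_id TEXT, system_rec TEXT, human_action TEXT, reason TEXT)""")

def record_override(user_id, system_rec, human_action, reason):
    # An override is not accepted without a stated reason.
    if not reason.strip():
        raise ValueError("an override must include a reason")
    con.execute("INSERT INTO overrides VALUES (?, ?, ?, ?)",
                (user_id, system_rec, human_action, reason))

def frequent_overriders(threshold=10):
    """Flag users who constantly override, for a follow-up interview."""
    counts = Counter(u for (u,) in con.execute("SELECT user_id FROM overrides"))
    return [u for u, n in counts.items() if n >= threshold]
```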
Whatever the decision on people versus automation, the key message of this chapter is that the human
resource is perhaps the most important capability an analytical competitor can cultivate. When we asked
analytical competitors what’s hard about executing their strategies, most said it was getting the right kinds of
analytical people in sufficient numbers. Hardware and software alone can’t come close to creating the kinds
of capabilities that analytical strategies require. Whether we’re talking about senior executives, analytical
professionals and data scientists, or frontline analytical amateurs, everyone has a job to do in making
analytical competition successful.
CHAPTER EIGHT
THE ARCHITECTURE OF ANALYTICS AND BIG
DATA
ALIGNING A ROBUST TECHNICAL ENVIRONMENT WITH BUSINESS
STRATEGIES
Over the last decade or so, it has become technically and economically feasible to capture and store huge
quantities of data. The numbers are hard to absorb for all but the geekiest, as data volumes have grown from
megabytes to gigabytes (billions of bytes) to terabytes (trillions of bytes) to petabytes (quadrillions of bytes).
While low-end personal computers and servers lack the power and capacity to handle the volumes of data
required for analytical applications, high-end 64-bit processors, specialty “data appliances,” and cloud-based
processing options can quickly churn through virtually unfathomable amounts of data.
However, while organizations have more data than ever at their disposal, they rarely know what to do with
it. The data in their systems is often like the box of photos you keep in your attic, waiting for the “someday”
when you impose meaning on the chaos. IDC estimated that only 0.5 percent of all data is ever analyzed, and
we would guess that the amount of data is growing faster than the amount of it that’s analyzed.
Further, the unpalatable truth is that most IT departments strain to meet minimal service demands and
invest inordinate resources in the ongoing support and maintenance of basic transactional capabilities. Unlike
the analytical vanguard, even companies with sound transaction systems struggle with relatively prosaic
issues such as data cleansing when they try to integrate data into analytical applications. In short, while
improvements in technology’s ability to store data can be astonishing, most organizations’ ability to manage,
analyze, and apply data has not kept pace.
Companies that compete on analytics haven’t solved all these problems entirely, but they are a lot better
off than their competition. In this chapter, we identify the technology, data, and governance processes needed
for analytical competition. We also lay out the components that make up the core of any organization’s
analytical architecture and forecast how these elements are likely to evolve in the future.
The Architecture of Analytical Technology
While business users of analytics often play an important role, companies have historically delegated the
management of information technology for analytics and other applications to an information technology (IT)
organization. For example, by capturing proprietary data or embedding proprietary analytics into business
processes, the IT department helps develop and sustain an organization’s competitive advantage.
But it is important to understand that this work cannot be delegated to IT alone. Most “small data” can be
easily analyzed on a personal computer, and even the largest dataset can be sent to Amazon Web Services’ or
Microsoft Azure’s clouds and analyzed by anyone with the requisite knowledge and a credit card. This can
lead to uncontrolled proliferation of “versions of the truth,” but it can also lead to insightful answers to
business problems. Determining how to encourage the latter and prevent the former is a critical task in any
analytical architecture.
Even when IT help is required, determining the technical capabilities needed for analytical competition
requires a close collaboration between IT organizations and business managers. This is a principle that
companies like Progressive Insurance understand fully. Glenn Renwick, formerly both CEO of Progressive
Insurance and head of IT there, understands how critical it is to align IT with business strategy: “Here at
Progressive we have technology leaders working arm in arm with business leaders who view their job as
solving business problems. And we have business leaders who are held accountable for understanding the
role of technology in their business. Our business plan and IT are inextricably linked because their job
objectives are.”1
Although Renwick has just retired, Progressive has a long history of IT/business alignment and focus on
analytics, and we’re sure they will continue. We found this same collaborative orientation at many analytical
competitors.
Analytical competitors also establish a set of guiding principles to ensure that their technology investments
reflect corporate priorities. The principles may include statements such as:
We will be an industry leader in adopting new technologies for big data and machine learning.
The risk associated with conflicting information sources must be reduced.
Applications should be integrated, since analytics increasingly draw data that crosses organizational
boundaries.
Analytics must be enabled as part of the organization’s strategy and distinctive capability.
Responsibility for getting the data, technology, and processes right for analytics across the enterprise is the
job of the IT architect (or the chief data or technology officer, if there is one). This executive (working
closely with the chief information officer) must determine how the components of the IT infrastructure
(hardware, software, networks, and external cloud resources) will work together to provide the data,
technology, and support needed by the business. This task is easier for digital companies, such as Netflix or
eBay, that can create their IT environment with analytical competition in mind from the outset. In large
established organizations, however, the IT infrastructure can sometimes appear to have been constructed in a
series of weekend handyman jobs. It does the job it was designed to do but is apt to create problems
whenever it is applied to another purpose.
To make sure the IT environment fully addresses an organization’s needs at each stage of analytical
competition, companies must incorporate analytics and big data technologies into their overall IT
architecture. (Refer to the box “Data and IT Capability by Stage of Analytical Competition.”)
DATA AND IT CAPABILITY BY STAGE OF ANALYTICAL
COMPETITION
Established companies typically follow an evolutionary process to develop their IT analytical
capabilities:
Stage 1. The organization is plagued by missing or poor-quality data, multiple
definitions of its data, and poorly integrated systems.
Stage 2. The organization collects transaction data efficiently but often lacks the right
data for better decision making. Some successful analytical applications or pilot
programs exist and they may even use some sophisticated statistics or technologies.
But these are independent initiatives sponsored by functional executives.
Stage 3. The organization has a proliferation of business intelligence and analytics
tools and data repositories, but some non-transaction data remains unintegrated,
nonstandardized, and inaccessible. IT and data architecture are updated to support
enterprise-wide analytics.
Stage 4. The organization has high-quality data, an enterprise-wide analytical plan, IT
processes and governance principles, and some embedded or automated analytics. It is
also working to some degree on big, less structured data.
Stage 5. The organization has a full-fledged analytics architecture that is
enterprise-wide, automated and integrated into processes, and highly sophisticated.
The company makes effective and integrated use of big and small data from many
internal and external sources, including highly unstructured data. The company begins
to explore and use cognitive technologies and autonomous analytics.
We’re using the term analytics and big data in this context to encompass not only the analysis itself—the
use of large and small data to analyze, forecast, predict, optimize, and so on—but also the processes and
technologies used for collecting, structuring, managing, and reporting decision-oriented data. The analytics
and big data architecture (a subset of the overall IT architecture) is an umbrella term for an enterprise-wide
set of systems, applications, and governance processes that enable sophisticated analytics by allowing data,
content, and analyses to flow to those who need it, when they need it. (Refer to the box “Signposts of
Effective IT for Analytical Competition.”)
SIGNPOSTS OF EFFECTIVE IT FOR ANALYTICAL
COMPETITION
Analysts have direct, nearly instantaneous access to data, some of it real-time.
Information workers spend their time analyzing data and understanding its
implications rather than collecting and formatting data.
Managers focus on improving processes and business performance, not culling data
from laptops, reports, and transaction systems.
Analytics and data are incorporated into the company’s products and services.
Managers never argue over whose numbers are accurate.
Data is managed from an enterprise-wide perspective throughout its life cycle, from its
initial creation to archiving or destruction.
A hypothesis can be quickly analyzed and tested without a lot of manual
behind-the-scenes preparation beforehand—and some analytical models are created
without human hypotheses at all (i.e., with machine learning).
Data is increasingly analyzed at the “edge” of the organization without needing to send
it to a centralized repository.
Both the supply and demand sides of the business rely on forecasts that are aligned and
have been developed using a consistent set of data.
High-volume, mission-critical decision-making processes are highly automated and
integrated.
Data is routinely and automatically shared between the company and its customers and
suppliers.
Reports and analyses seamlessly integrate and synthesize information from many
sources, both internal and external.
Rather than have data warehouse or analytics initiatives, companies manage data and
analytics as strategic corporate resources in all business initiatives.
“Those who need it” will include data scientists, statisticians of varying skills, analysts, information
workers, functional heads, and top management. The analytics architecture must be able to quickly provide
users with reliable, accurate information and help them make decisions of widely varying complexity. It also
must make information available through a variety of distribution channels, including traditional reports, ad
hoc analysis tools, corporate dashboards, spreadsheets, emails, and text message alerts—and even products
and services built around data and analytics. This task is often daunting: Amazon, for example, spent more
than ten years and over $1 billion building, organizing, and protecting its data warehouses.2
Complying with legal and regulatory reporting requirements is another activity that depends on a robust
analytical architecture. The Sarbanes-Oxley Act of 2002, for example, requires executives, auditors, and
other users of corporate data to demonstrate that their decisions are based on trustworthy, meaningful,
authoritative, and accurate data. It also requires them to attest that the data provides a clear picture of the
business, major trends, risks, and opportunities. The Dodd-Frank Act, a regulatory framework for financial
services firms enacted in 2010, has equally rigorous requirements for that specific industry (although there
are doubts that it will continue in its present form). Health care organizations have their own set of reporting
requirements.
Conceptually, it’s useful to break the analytics and big data architecture into its six elements (refer to
figure 8-1):
Data management that defines how the right data is acquired and managed
Transformation tools and processes that describe how the data is extracted, cleaned, structured,
transmitted, and loaded to “populate” databases and repositories
Repositories that organize data and metadata (information about the data) and store it for use
Analytical tools and applications used for analysis
Data visualization tools and applications that address how information workers and non-IT analysts
will access, display, visualize, and manipulate data
Deployment processes that determine how important administrative activities such as security, error
handling, “auditability,” archiving, and privacy are addressed
We’ll look at each element in turn, with particular attention to data since it drives all the other architectural
decisions.
FIGURE 8-1
Analytics and big data architecture
Data Management
The goal of a well-designed data management strategy is to ensure that the organization has the right
information and uses it appropriately. Large companies invest millions of dollars in systems that snatch data
from every conceivable source. Systems for enterprise resource planning, customer relationship management,
and point-of-sale transactions, among others, ensure that no transaction or exchange occurs without leaving a
mark. Many organizations also purchase externally gathered data from syndicated providers such as IRI and
ACNielsen in consumer products and Quintiles IMS in pharmaceuticals. Additionally, data management
strategies must determine how to handle big data from corporate websites, social media, internet
clickstreams, Internet of Things data, and various other types of external data.
In this environment, data overload can be a real problem for time-stressed managers and professionals. But
the greatest data challenge facing companies is “dirty” data: information that is inconsistent, fragmented, and
out of context. Even the best companies often struggle to address their data issues. We found that companies
that compete on analytics devote extraordinary attention to data management processes and governance.
Capital One, for example, estimates that 25 percent of its IT organization works on data issues—an unusually
high percentage compared with other firms.
There’s a significant payoff for those who invest the effort to master data management. For example, GE
addressed the problem of multiple overlapping sources of supplier data within the company. Many business
units and functions had their own versions of supplier databases across hundreds of transaction systems, and
the same suppliers were represented multiple times, often in slightly different ways. As a result, GE couldn’t
perform basic analytics to determine which suppliers sold to multiple business units, which suppliers were
also customers, and how much overall business it did with a supplier. So it embarked on an effort to use new
machine learning tools to curate and integrate the supplier data. After several months, it had created an
integrated supplier database, and it could start pressing the most active suppliers for volume discounts.
Overall, GE estimates that the work led to $80 million in benefits to the company in its first year, and it
expects substantially higher benefits in the future. GE is also working on customer data and parts data using
the same approach.
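GE's project relied on commercial machine-learning curation tools (we return to them later in this chapter), but the heart of the matching problem can be sketched with plain string similarity. This toy Python version, with invented supplier names, flags likely duplicates for human review rather than merging them automatically.

```python
from difflib import SequenceMatcher
from itertools import combinations

# Invented supplier records from different business units' systems.
suppliers = ["Acme Industrial Supply", "ACME Industrial Supply Inc.",
             "Acme Indust. Supply", "Borealis Metals", "Borealis Metals Ltd"]

def similarity(a: str, b: str) -> float:
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()

# Pairs above the threshold are candidate duplicates for review or merging.
for a, b in combinations(suppliers, 2):
    score = similarity(a, b)
    if score > 0.8:
        print(f"{score:.2f}: {a!r} <-> {b!r}")
```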
To achieve the benefits of analytical competition, IT and business experts must tackle their data issues by
answering five questions:
Data relevance: What data is needed to compete on analytics?
Data sourcing: Where can this data be obtained?
Data quantity: How much data is needed?
Data quality: How can the data be made more accurate and valuable for analysis?
Data governance: What rules and processes are needed to manage data from its creation through its
retirement?
What Data Is Needed to Compete on Analytics?
The question behind this question is, what data is most valuable for competitive differentiation and business
performance? To answer, executives must have a clear understanding of the organization’s distinctive
capability, the activities that support that capability, and the relationship between an organization’s strategic
and operational metrics and business performance. Many of the companies described in this book have
demonstrated the creative insight needed to make those connections.
But ensuring that analysts have access to the right data can be difficult. Sometimes a new metric is needed:
the advent of credit scores made the mortgage-lending business more efficient by replacing qualitative
assessments of consumer creditworthiness with a single, comparative metric. But not everything is readily
reducible to a number. An employee’s performance rating doesn’t give as complete a picture of his work over
a year as a manager’s written assessment. The situation is complicated when business and IT people blame
each other when the wrong data is collected or the right data is not available. Studies repeatedly show that IT
executives believe business managers do not understand what data they need.3 And surveys of business
managers reflect their belief that IT executives lack the business acumen to make meaningful data available.
While there is no easy solution to this problem, the beginning of the solution is for business leaders and IT
managers to pledge to work together on this question. This problem has been eased somewhat in companies
like Intel and Procter & Gamble, where quantitative analysts work closely alongside business leaders.
Without such cooperation, an organization’s effort to gather the data it needs to compete analytically is
doomed.
A related issue requiring business and IT collaboration is defining relationships among the data used in
analysis. Considerable business expertise is required to help IT understand the potential relationships in the
data for optimum organization. The importance of this activity can be seen in an example involving health
care customers. From an insurance company’s perspective, it has many different customers—the
corporate customers that contract for policies on behalf of their employees, individual subscribers, and
members of the subscribers’ families. Each individual has a medical history and may have any number of
medical conditions or diseases that require treatment. The insurance company and each person covered by a
policy also have relationships with a variety of service providers such as hospitals, HMOs, and doctors. A
doctor may be a general practitioner or a specialist. Some doctors will work with some hospitals or insurers
but not others. Individuals can have insurance from multiple providers, including the government, that need
to be coordinated. Without insight into the nature of these relationships, the data’s usefulness for analytics is
extremely limited.
Where Can This Data Be Obtained?
Data for analytics and business intelligence originates from many places, but the crucial point is that it needs
to be managed through an enterprise-wide infrastructure. Only by this means will it be streamlined,
consistent, and scalable throughout the organization. Having common applications and data across the
enterprise is critical because it helps yield a “consistent version of the truth,” an essential goal for everyone
concerned with analytics. While it is possible to create such an environment by ex post facto integration and
the transformation of data from many systems, companies are well advised to update and integrate their
processes and transaction systems before embarking on this task.
For internal information, the organization’s enterprise systems are a logical starting point. For example, an
organization wishing to optimize its supply chain might begin with a demand-planning application. However,
it can be difficult to analyze data from transaction systems (like inventory control) because it isn’t defined or
framed correctly for management decisions. Enterprise systems—integrated software applications that
automate, connect, and manage information flows for business processes such as order fulfillment—often
help companies move along the path toward analytical competition: they provide consistent, accurate, and
timely data for such tasks as financial reporting and supply chain optimization. Vendors increasingly are
embedding analytical capabilities into their enterprise systems so that users can develop sales forecasts and
model alternative solutions to business problems. However, the data from such systems usually isn’t very
distinctive to a particular firm, so it must be combined with other types of data to have competitive
differentiation.
In addition to corporate systems, an organization’s personal computers and servers are loaded with data.
Databases, spreadsheets, presentations, and reports are all sources of data. Sometimes these sources are
stored in a common knowledge management application, but they are often not available across the entire
organization.
Internal data also increasingly means data from Internet of Things (IoT) sensors and devices at the “edge”
of the organization—in the oilfield drilling equipment, the retail point of sale device, or the aircraft engine,
for example. The traditional model was to send all this data to a centralized repository to store and analyze it.
But an alternative paradigm of edge analytics is growing in currency. The rapid growth of the IoT and other
edge devices that generate data means that it is often unfeasible to send it all to headquarters or even to the
cloud for analysis. In an oilfield, for example, operational data from drilling equipment (including drill-bit
RPMs, cutting forces, vibration, temperature, and oil and water flows) can be used in real time to change
drilling strategies. It’s often not feasible to send all this data to a central repository. Some drilling operations
already use microprocessor-based analytics to determine drilling strategies in real time. The IoT will make
edge-based analytics approaches much more common in the future.
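A minimal sketch of the edge pattern, in Python: the device keeps a rolling window of readings, acts locally on anomalies, and sends only summaries upstream. The sensor values, window size, and three-sigma threshold are invented for illustration.

```python
import random
from collections import deque
from statistics import mean, stdev

WINDOW = 50                        # readings kept in device memory
vibration = deque(maxlen=WINDOW)

def on_sensor_reading(value: float):
    """Runs on the drilling equipment itself, not in a data center."""
    vibration.append(value)
    if len(vibration) < WINDOW:
        return                     # still filling the window
    mu, sigma = mean(vibration), stdev(vibration)
    if abs(value - mu) > 3 * sigma:    # anomalous vibration spike
        print("edge action: reduce drill-bit RPM")
        print("uplink (summary only):", {"event": "spike", "mean": round(mu, 2)})

for _ in range(500):               # simulated sensor stream
    on_sensor_reading(random.gauss(10.0, 1.0))
```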
There has been an explosion of external data over the past decade, much of it coming from the internet,
social media, and external data providers. There has also long been the opportunity to purchase data from
firms that provide financial and market information, consumer credit data, and market measurement.
Governments at all levels are some of the biggest information providers (more so since the “Open Data”
movement over the past decade), and company websites to which customers and suppliers contribute are
another powerful resource. Less structured data can also come from such sources as email, voice
applications, images (maps and photos available through the internet), photographs (of people, products, and
of course cats), and biometrics (fingerprints and iris identification). The further the data type is from standard
numbers and letters, however, the harder it is to integrate with other data and analyze—although deep
learning technologies are making image recognition much faster and more accurate.
It can be difficult and expensive to capture some highly valuable data. (In some cases, it might even be
illegal—for example, sensitive customer information or competitor intelligence about new product plans or
pricing strategies.) Analytical competitors adopt innovative approaches to gain permission to collect the data
they need. As we described in chapter 3, Progressive’s Snapshot program offers discounts to customers who
agree to install a device that collects data about their driving behavior. Former CEO Peter Lewis sees this
capability as the key to more accurate pricing and capturing the most valuable customers: “It’s about being
able to charge them for whatever happens instead of what they [customers] say is happening. So what will
happen? We’ll get all the people who hardly ever drive, and our competitors will get stuck with the higher
risks.”4 Progressive has now gathered over 10 billion miles of customer driving data, and it has become the
best source of insight about what insurance will cost the company.
How Much Data Is Needed?
In addition to gathering the right data, companies need to collect a lot of it in order to distill trends and
predict customer behavior. What’s “a lot”? In 2007, the largest data warehouse in the world was Walmart’s,
with about 600 terabytes. At roughly the same time, the size of the US Library of Congress’s print collection
was roughly 20 terabytes.5
Fortunately, the technology and techniques for mining and managing large volumes of data are making
enormous strides. The largest databases are no longer enterprise warehouses, but Hadoop clusters storing data
across multiple commodity servers. The 600 terabytes in Walmart’s warehouse in 2007 grew a hundredfold
by 2017 to 60 petabytes. Digital firms manage even bigger data: Yahoo!’s 600 petabytes are spread across
forty thousand Hadoop servers. That’s the equivalent of storing about 30 trillion pages. Yahoo! isn’t the
perfect example of an analytical competitor anymore, but it’s likely that more successful firms like Google
and Facebook have similar volumes in their data centers.
Two pitfalls must be balanced against this need for massive quantities of data. First, unless you are in the
data business like the companies we’ve just described, it’s a good idea to resist the temptation to collect all
possible data “just in case.” For one thing, if executives have to wade through digital mountains of irrelevant
data, they’ll give up and stop using the tools at hand. “Never throwing away data,” which has been advocated
by Amazon’s Jeff Bezos, can be done, but the costs outweigh the benefits for most companies. The
fundamental issue comes back, again, to knowing what drives value in an organization; this understanding
will prevent companies from collecting data indiscriminately.
A related second pitfall: companies should avoid collecting data that is easy to capture but not necessarily
important. Many IT executives advocate this low-hanging-fruit approach because it relieves them of
responsibility for determining what information is valuable to the business. For example, many companies
fall into the trap of providing managers with data that is a by-product of transaction systems, since that is
what is most readily available. Others analyze social media data simply because it’s possible, even when they
don’t have any actions in mind when sentiment trends down or up a bit. Perhaps emerging technologies will
someday eliminate the need to separate the wheat from the chaff. But until they do, applying intelligence to
the process is necessary to avoid data overload.
How Can We Make Data More Valuable?
Quantity without quality is a recipe for failure. Executives are aware of the problem: in a survey of the
challenges organizations face in developing a business intelligence capability, data quality was second only
to budget constraints.6 Even analytical competitors struggle with data quality.
Organizations tend to store their data in hard-walled, functional silos. As a result, the data is generally a
disorganized mess. For most organizations, differing definitions of key data elements such as customer or
product add to the confusion. When Canadian Tire Corporation, for example, set out to create a structure for
its data, it found that the company’s data warehouse could yield as many as six different numbers for
inventory levels. Other data was not available at all, such as comparison sales figures for certain products
sold in its 450-plus stores throughout Canada. Over several years, the company created a plan to collect new
data that fit the company’s analytical needs.7
Several characteristics increase the value of data (a sketch of simple automated checks follows this list):
It is correct. While some analyses can get by with ballpark figures and others need precision to several
decimal points, all must be informed by data that passes the credibility tests of the people reviewing it.
It is complete. The definition of complete will vary according to whether a company is selling cement,
credit cards, season tickets, and so on, but completeness will always be closely tied to the
organization’s distinctive capability.
It is current. Again, the definition of current may vary; for some business problems, such as a major
medical emergency, data must be available instantly to deploy ambulances and emergency personnel in
real time (also known as zero latency); for most other business decisions, such as a budget forecast, it
just needs to be updated periodically—daily, weekly, or monthly.
It is consistent. In order to help decision makers end arguments over whose data is correct,
standardization and common definitions must be applied to it. Eliminating redundant data reduces the
chances of using inconsistent or out-of-date data.
It is in context. When data is enriched with metadata (usually defined as structured data about data), its
meaning and how it should be used become clear.
It is controlled. In order to comply with business, legal, and regulatory requirements for safety,
security, privacy, and “auditability,” it must be strictly overseen.
It is analyzed. Analytics are a primary means of adding value to data, and even creating products from
and monetizing it.8 Insights are always more valuable than raw data, which is a primary theme of this
book.
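Several of these characteristics can be checked mechanically. Below is a minimal sketch in Python (using pandas) against an invented customer extract; the field names and freshness window are assumptions, mapping roughly to the complete, consistent, and current criteria above.

```python
import pandas as pd

# Invented customer extract; the checks map to "complete," "consistent,"
# and "current" from the list above.
df = pd.DataFrame({
    "customer_id": [101, 102, 102, 104],
    "email": ["a@x.com", None, "b@y.com", "c@z.com"],
    "last_updated": pd.to_datetime(["2024-01-05", "2023-02-11",
                                    "2024-03-01", "2021-07-19"]),
})

issues = {
    "missing_email": int(df["email"].isna().sum()),              # complete
    "duplicate_ids": int(df["customer_id"].duplicated().sum()),  # consistent
    "stale_rows": int((df["last_updated"]                        # current
                       < pd.Timestamp("2023-01-01")).sum()),
}
print(issues)  # -> {'missing_email': 1, 'duplicate_ids': 1, 'stale_rows': 1}
```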
What Rules and Processes Are Needed to Manage the Data from Its Acquisition
Through Its Retirement?
Each stage of the data management life cycle presents distinctive technical and management challenges that
can have a significant impact on an organization’s ability to compete on analytics.9 Note that this is a
traditional data management process; an organization seeking to create analytics “at the edge” will have to do
highly abbreviated versions of these tasks.
Data acquisition. Creating or acquiring data is the first step. For internal information, IT managers
should work closely with business process leaders. The goals include determining what data is needed
and how to best integrate IT systems with business processes to capture good data at the source.
Data cleansing. Detecting and removing data that is out-of-date, incorrect, incomplete, or redundant is
one of the most important, costly, and time-consuming activities in any business intelligence
technology initiative. We estimate that between 25 and 30 percent of the effort in an analytics initiative
typically goes toward initial data cleansing. IT’s role is to establish methods and systems to collect,
organize, process, and maintain information, but data cleansing is the responsibility of everyone who
generates or uses data. Data cleansing, integration, and curation can increasingly be aided by new tools
including machine learning and crowdsourcing.10
Data organization and storage. Once data has been acquired and cleansed, processes to systematically
extract, integrate, and synthesize it must be established. The data must then be put into the right
repository and format so that it is ready to use (see the discussion of repositories later in the chapter).
Some storage technologies require substantially more organization than others.
Data maintenance. After a repository is created and populated with data, managers must decide how
and when the data will be updated. They must create procedures to ensure data privacy, security, and
integrity (protection from corruption or loss via human error, software virus, or hardware crash). And
policies and processes must also be developed to determine when and how data that is no longer
needed will be saved, archived, or retired. Some analytical competitors have estimated that they spend
$500,000 in ongoing maintenance for every $1 million spent on developing new analytics-oriented
technical capabilities. We believe, however, that this cost is declining with newer technologies such as
Hadoop and data lakes.
Once an organization has addressed data management issues, the next step is to determine the technologies
and processes needed to capture, transform, and load data into a data warehouse, Hadoop cluster, or data
lake.
Transformation Tools and Processes
Historically, for data to become usable by managers in a data warehouse, it had to first go through a process
known in IT-speak as ETL, for extract, transform, and load. It had to be put into a relational format, which
stores data in structured tables of rows and columns. Now, however, new storage technologies like Hadoop
allow storage in virtually any data format. Data lakes may be based on Hadoop or other underlying
technologies, and the concept formalizes the idea of storing data in its original format. These are particularly
useful for storing data before the organization knows what it will do with it. To analyze data statistically,
however, it must eventually be put in a more structured format—typically rows and columns. The task of
putting data into this format, whether for a data warehouse or a statistics program, can be challenging for
unstructured data.
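To ground the terminology, here is a minimal ETL sketch in Python: semi-structured records of the kind a data lake might hold are extracted, flattened into rows and columns, and loaded into a relational store. The records, field names, and in-memory SQLite target are invented for illustration.

```python
import json
import sqlite3

# Extract: raw JSON records, as they might sit in a data lake.
raw = [
    '{"order_id": 1, "customer": {"id": 9, "region": "EMEA"}, "total": 120.5}',
    '{"order_id": 2, "customer": {"id": 4, "region": "APAC"}, "total": 80.0}',
]

# Transform: flatten the nested structure into rows and columns.
rows = [(r["order_id"], r["customer"]["id"], r["customer"]["region"], r["total"])
        for r in map(json.loads, raw)]

# Load: into a relational table, ready for SQL-based analysis.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE orders (order_id INT, customer_id INT, "
            "region TEXT, total REAL)")
con.executemany("INSERT INTO orders VALUES (?, ?, ?, ?)", rows)
print(con.execute("SELECT region, SUM(total) FROM orders GROUP BY region").fetchall())
```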
While extracting data from its source and loading it into a repository are fairly straightforward tasks,
cleaning and transforming data is a bigger issue. In order to make the data in a warehouse decision-ready, it is
necessary to first clean and validate it using business rules that use data cleansing or scrubbing tools such as
Trillium or Talend, which are also available from large vendors like IBM, Oracle, or SAS. For example, a
simple rule might be to have a full nine-digit ZIP code for all US addresses. Transformation procedures
define the business logic that maps data from its source to its destination. Both business and IT managers
must expend significant effort in order to transform data into usable information. While automated tools from
vendors such as Informatica Corporation, Ab Initio Software Corporation, and Ascential Software can ease
this process, considerable manual effort is still required. Informatica’s former CEO Sohaib Abbasi estimates
that “for every dollar spent on integration technology, around seven to eight dollars is spent on labor [for
manual data coding].”11
Transformation also entails standardizing data definitions to make certain that business concepts have
consistent, comparable definitions across the organization. For example, a “customer” may be defined as a
company in one system but as an individual placing an order in another. It also requires managers to decide
what to do about data that is missing. Sometimes it is possible to fill in the blanks using inferred data or
projections based on available data; at other times, it simply remains missing and can’t be used for analysis.
These mundane but critical tasks require an ongoing effort, because new issues seem to constantly arise.
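The nine-digit ZIP rule mentioned above translates directly into a small transformation step. Here is a minimal Python sketch; treating a five-digit code as ZIP+4 with an imputed suffix is an invented example of filling in the blanks with inferred data, not a standard practice.

```python
import re

def standardize_zip(raw):
    """Business rule: US ZIP codes must be nine digits (ZIP+4)."""
    if raw is None:
        return None                   # missing stays missing
    digits = re.sub(r"\D", "", raw)   # strip punctuation and spaces
    if len(digits) == 9:
        return f"{digits[:5]}-{digits[5:]}"
    if len(digits) == 5:
        return f"{digits}-0000"       # inferred suffix; flag as imputed
    return None                       # fails the rule; route to cleansing queue

for raw in ["021391234", "60614", "1 2 3", None]:
    print(raw, "->", standardize_zip(raw))
```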
Some of these standardization and integration tasks can increasingly be done by automated machine
learning systems. Companies such as Tamr (where Tom is an adviser) and Trifacta work with data to identify
likely overlaps and redundancies. Tamr, for example, worked with GE on the example we described earlier in
this chapter to create a single version of supplier data from what was originally many different overlapping
sources across business units. The project was accomplished over a few months—much faster than with
traditional, labor-intensive approaches. GE is now working with the same tools on consolidating customer
and product data.
For unstructured big data, transformation is typically performed using open-source tools like Pig, Hive,
and Python. These tools require the substantial coding abilities of data scientists, but may be more flexible
than packaged transformation solutions.
Repositories
Organizations have several options for organizing and storing their analytical data:
Data warehouses are databases that contain integrated data from different sources and are regularly
updated. They may contain, for example, time series (historical) data to facilitate the analysis of
business performance over time. They may also contain prepackaged “data cubes” allowing easy—but
limited—analysis by nonprofessional analysts. A data warehouse may be a module of an enterprise
system or an independent database. Some companies also employ a staging database that is used to get
data from many different sources ready for the data warehouse.
A data mart can refer to a separate repository or to a partitioned section of the overall data warehouse.
Data marts are generally used to support a single business function or process and usually contain some
predetermined analyses so that managers can independently slice and dice some data without having
statistical expertise. Some companies that did not initially see the need for a separate data warehouse
created a series of independent data marts or analytical models that directly tapped into source data.
One large chemical firm, for example, had sixteen data marts. This approach is rarely used today,
because it results in balkanization of data and creates maintenance problems for the IT department.
Data marts, then, should be used only if the designers are confident that no broader set of data will ever
be needed for analysis.
A metadata repository contains technical information and a data definition, including information
about the source, how it is calculated, bibliographic information, and the unit of measurement. It may
include information about data reliability, accuracy, and instructions on how the data should be
applied. A common metadata repository used by all analytical applications is critical to ensure data
consistency. Consolidating all the information needed for data cleansing into a single repository
significantly reduces the time needed for maintenance.
Open-source distributed data frameworks like Hadoop and Spark (both distributed by the Apache
Foundation) allow storage of data in any format and typically at substantially lower cost than a
traditional warehouse or mart. However, they may lack some of the security and simultaneous user
controls that an enterprise warehouse employs, and they often require a higher level of technical and
programming expertise to use. One company, TrueCar, Inc., stores a lot of data (several petabytes) on
vehicles for sale and their attributes and pricing. In converting its storage architecture, it did a
comparison of costs between Hadoop and an enterprise data warehouse. It found that its previous cost
for storing a gigabyte of data (including hardware, software, and support) for a month in a data
warehouse was $19. Using Hadoop, TrueCar pays 23 cents a month per gigabyte for hardware,
software, and support. That two-orders-of-magnitude cost differential has been appealing to many
organizations. There can be performance improvements as well with these tools, although they tend to
be less dramatic than the cost differential.
A data lake employs Apache Hadoop, Apache Spark, or some other technology (usually open source)
to store data in its original format. The data is then structured as it is accessed in the lake and analyzed.
It is a more formalized concept of the use of these open-source tools. Traditional data management
vendors like Informatica, as well as startups like Podium Data, have begun to supply data lake
management tools. (A schema-on-read sketch follows this list.)
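To illustrate schema-on-read, the idea that distinguishes a data lake from a warehouse, here is a minimal Python sketch with invented sensor records. A real lake would use Spark or a similar engine rather than a few JSON strings, but the pattern, storing raw records and imposing structure only at read time, is the same.

```python
import json

# Records land in the lake in their original, heterogeneous format...
lake_lines = [
    '{"sensor": "pump-7", "temp_c": 81.2, "ts": "2024-05-01T10:00:00"}',
    '{"sensor": "pump-7", "temp_c": 79.9, "ts": "2024-05-01T10:01:00", "rpm": 1450}',
]

# ...and a schema is imposed only at read time, per analysis.
def read_with_schema(lines, fields):
    for line in lines:
        rec = json.loads(line)
        yield {f: rec.get(f) for f in fields}  # absent fields become None

for row in read_with_schema(lake_lines, ["ts", "temp_c", "rpm"]):
    print(row)
```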
Once the data is organized and ready, it is time to determine the analytic technologies and applications
needed.
Analytical Tools and Applications
Choosing the right software tools or applications for a given decision depends on several factors. The first
task is to determine how thoroughly decision making should be embedded into business processes and
operational systems. Should there be a human who reviews the data and analytics and makes a decision, or
should the decision be automated and something that happens in the natural process workflow? With the rise
of cognitive computing, or artificial intelligence, over the last decade, there are several technologies that can
analyze the data, structure the workflow, reach into multiple computer systems, make decisions, take action,
and even learn over time.12 Some of these are analytical and statistics-based; others rely on previous
technologies like rule engines, event-streaming technology, and process workflow support. We addressed this
issue from a human perspective in chapter 7.
The next decision is whether to use a third-party application or create a custom solution. A growing
number of functionally or industry-specific business applications, such as capital budgeting, mortgage
pricing, and anti–money laundering models, now exist. These solutions are a big chunk of the business for
analytics software companies like SAS. Enterprise systems vendors such as Oracle, SAP, and Microsoft are
building more (and more sophisticated) analytical applications into their products. There is a strong economic
argument for using such solutions. According to IDC, projects that implement a packaged analytical
application yield a median ROI of 140 percent, while custom development using analytical tools yields a
median ROI of 104 percent.13 The “make or buy” decision hinges on whether a packaged solution exists and
whether the level of skill required exists within the organization. Some other research organizations have
found even greater returns from analytics applications; Nucleus Research, for example, argued in 2014 that
analytics projects yielded $13.01 for every dollar spent.14
But there are also many powerful tools for data analysis that allow organizations to develop their own
analyses (see the boxes “Analytical Technologies” and “Equifax Evolves Its Analytics Architecture”). Major
players such as SAS, IBM, and SAP offer product suites consisting of integrated tools and applications, as
well as many industry- or function-specific solutions. Open-source tools R and RapidMiner have been the
fastest-growing analytical packages over the past several years.15 Some tools are designed to slice and dice or
to drill down to predetermined views of the data, while others are more statistically sophisticated. Some tools
can accommodate a variety of data types, while others are more limited (to highly structured data or textual
analysis, for example). Some tools extrapolate from historical data, while others are intended to seek out new
trends or relationships. Programming languages like Python are increasingly used for statistical analysis;
they allow a lot of flexibility but typically require more expertise and effort from the analyst.
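For instance, a minimal Python sketch of such a scripted analysis (the data here is synthetic and purely illustrative):

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
ad_spend = rng.uniform(10, 100, size=200)                 # illustrative predictor
revenue = 50 + 3.2 * ad_spend + rng.normal(0, 25, 200)    # illustrative response

X = sm.add_constant(ad_spend)       # intercept plus predictor
model = sm.OLS(revenue, X).fit()    # ordinary least squares regression
print(model.summary())              # coefficients, R-squared, p-values
```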
ANALYTICAL TECHNOLOGIES
Executives in organizations that are planning to become analytical competitors should be
familiar with the key categories of analytical software tools:
Spreadsheets such as Microsoft Excel are the most commonly used analytical tools
because they are easy to use and reflect the mental models of the user. Managers and
analysts use them for “the last mile” of analytics—the stage right before the data is
presented in report or graphical form for decision makers. But too many users attempt
to use spreadsheets for tasks for which they are ill suited, leading to errors or incorrect
conclusions. Even when used properly, spreadsheets are prone to human error; more
than 20 percent of spreadsheets have errors, and as many as 5 percent of all calculated
cells are incorrect.16 To minimize these failings, managers have to insist that analyses
always start with accurate, validated data and that spreadsheet developers have the
proper skills and expertise to develop models.
Online analytical processors are generally known by their abbreviation, OLAP, and
are used for semistructured decisions and analyses on relational data. While a
relational database (or RDBMS)—in which data is stored in related tables—is a
highly efficient way to organize data for transaction systems, it is not particularly
efficient when it comes to analyzing array-based data (data that is arranged in cells like
a spreadsheet), such as time series. OLAP tools are specifically designed for
multidimensional, array-based problems. They organize data into “data cubes” to
enable analysis across time, geography, product lines, and so on. Data cubes are
simply collections of data in three variables or more that are prepackaged for reporting
and analysis; they can be thought of as multidimensional spreadsheets. While
spreadsheet programs like Excel have a maximum of three dimensions (down, across,
and worksheet pages), OLAP models can have seven or more. As a result, they require
specialized skills to develop, although they can be created by “power users” familiar
with their capabilities. Unlike traditional spreadsheets, OLAP tools must deal with
data proliferation, or the models quickly become unwieldy. SAP’s BusinessObjects
and IBM’s Cognos are among the leading vendors in this category.
Data visualization. OLAP tools were once the primary way to create data
visualizations and reports, but a newer generation of easier-to-use tools that can
operate on an entire dataset (not just a data cube) has emerged and gained substantial
popularity. Tableau and QlikView are the most popular tools in this category; older
vendors like Microsoft, MicroStrategy, and SAS also compete in it.
Statistical or quantitative algorithms enable analytically sophisticated managers or
statisticians to analyze data. The algorithms process quantitative data to arrive at an
optimal target such as a price or a loan amount. In the 1970s, companies such as SAS
and SPSS (now part of IBM) introduced packaged computer applications that made
statistics much more accessible. Statistical algorithms also encompass predictive
modeling applications, optimization, and simulations. SAS remains the proprietary
analytics software market leader; R and RapidMiner have emerged as open-source
market leaders.
Rule engines process a series of business rules that use conditional statements to
address logical questions—for example, “If the applicant for a motorcycle insurance
policy is male and under twenty-five, and does not either own his own home or have a
graduate degree, do not issue a policy." Rule engines can be part of a larger
automated application or provide recommendations to users who need to make a
particular type of decision (see the sketch after this box). FICO, IBM's Operational
Decision Manager, and Pegasystems, Inc. are some of the major providers of rule
engines for businesses.
Machine learning and other cognitive technologies that can learn from data over time
have somewhat superseded rule engines in popularity. They include machine
learning, neural networks, and deep learning (the latter a more complex form of
neural network, with more layers of explanatory variables),
natural language processing and generation, and combinations of these like IBM’s
Watson. They are both more complex to develop and less transparent to understand
than rule-based systems, but the ability to learn from new data and improve their
analytical performance over time is a powerful advantage.
Data mining tools (some of which use machine learning) draw on techniques ranging
from straightforward arithmetic computation to artificial intelligence, statistics,
decision trees, neural networks, and Bayesian network theory. Their objective is to
identify patterns in complex and ill-defined data sets. Sprint and other wireless
carriers, for example, use neural analytical technology to predict which customers are
likely to switch wireless carriers and take their existing phone numbers with them.
SAS and IBM offer both data and text mining capabilities and are major vendors in
both categories; R and RapidMiner offer open-source alternatives.
Text mining tools can help managers quickly identify emerging trends in near-real
time. Spiders, or data crawlers, which identify and count words and phrases on
websites, are a simple example of text mining. Text mining tools can be invaluable in
sniffing out new trends or relationships. For example, by monitoring technical-user
blogs, a vendor can recognize that a new product has a defect within hours of being
shipped instead of waiting for complaints to arrive from customers. Other text mining
products can recognize references to people, places, things, or topics and use this
information to draw inferences about competitor behavior.
Text categorization is the process of using statistical models or rules to rate a
document’s relevance to a certain topic. For example, text categorization can be used
to dynamically evaluate competitors’ product assortments on their websites.
Natural language processing tools go beyond text mining and categorization to make
sense of language and even answer human questions; they may employ semantic
analysis, statistical analysis, or some combination of the two. Natural language
generation creates text for contexts such as sports reporting, business earnings reports,
and investment reports in financial services.
Event streaming isn’t, strictly speaking, an analytical technology, but it is
increasingly being combined with analytics to support real-time smart processes. The
idea is to analyze data as it comes in—typically from voluminous and fast-flowing
applications like the Internet of Things. The goal isn’t normally to perform advanced
analytics on the data, but rather to “curate” it—which may involve filtering,
combining, transforming, or redirecting it. This approach has also been employed for a
decade or longer in fast-moving data in the financial services industry.
Simulation tools model business processes with a set of symbolic, mathematical,
scientific, engineering, and financial functions. Much as computer-aided design (CAD)
systems are used by engineers to model the design of a new product, simulation tools
are used in engineering, R&D, and a surprising number of other applications. For
example, simulations can be used as a training device to help users understand the
implications of a change to a business process. They can also be used to help
streamline the flow of information or products—for example, they can help employees
of health care organizations decide where to send donated organs according to criteria
ranging from blood type to geographic limitations.
Web or digital analytics is a category of analytical tools specifically for managing
and analyzing online and e-commerce data. The bulk of web analytics are
descriptive—telling managers of websites how many unique visitors came to a site,
how long they spent on it, what percentage of visits led to conversions, and so forth.
Some web analytics tools allow A/B testing—statistical comparisons of which version
of a website gets more clicks or conversions. Web analytics has largely been a world
unto itself in the organizational analytics landscape, but is slowly being integrated into
the larger group of quantitative analysts.17 Another related category of analytical tools
is social media analytics—focused not only on counting social activities, but also on
assessing the positive or negative sentiment associated with them.
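The motorcycle-policy rule quoted in the box above maps naturally onto code. Here is a minimal rule-engine sketch in Python; the second rule and all field names are hypothetical, added only to show how rules are evaluated in order:

```python
# Toy rule engine: each rule is (name, predicate, action), checked in order.
RULES = [
    ("deny_young_male_no_anchor",
     lambda a: a["sex"] == "M" and a["age"] < 25
               and not (a["owns_home"] or a["graduate_degree"]),
     "do not issue policy"),
    ("refer_prior_claims",                       # hypothetical second rule
     lambda a: a["prior_claims"] >= 2,
     "refer to underwriter"),
]

def decide(applicant: dict) -> str:
    for name, predicate, action in RULES:
        if predicate(applicant):
            return f"{action} (rule: {name})"
    return "issue policy"                        # default when no rule fires

applicant = {"sex": "M", "age": 23, "owns_home": False,
             "graduate_degree": False, "prior_claims": 0}
print(decide(applicant))  # -> do not issue policy (rule: deny_young_male_no_anchor)
```

Commercial rule engines add rule authoring, conflict resolution, and audit trails on top of this basic evaluate-in-order loop.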
Whether a custom solution or off-the-shelf application is used, the business IT organization must
accommodate a variety of tools for different types of data analysis (see the box "Analytical Technologies" for
current and emerging analytical tools). Employees naturally tend to prefer familiar products, such as a
spreadsheet, even if it is ill suited for the analysis to be done.
Another problem is that without an overall architecture to guide tool selection, excessive technological
proliferation can result. In a 2015 survey, respondents from large organizations reported that their marketing
organizations averaged more than twelve analytics and data management tools for data-driven marketing.18
And there are presumably many other tools being used by other business functions within these firms. Even
well-managed analytical competitors often have a large number of software tools. In the past, this was
probably necessary, because different vendors had different capabilities—one might focus on financial
reporting, another on ad hoc query, and yet another on statistical analysis. While there is still variation among
vendors, the leading providers have begun to offer business intelligence suites with stronger, more integrated
capabilities.
There is also the question of whether to build and host the analytical application onsite or use an “analytics
as a service” application in the cloud. As with other types of IT, the answer is increasingly the latter. Leading
software vendors are embracing this trend by disaggregating their analytics tools into “micro-analytics
services” that perform a particular analytical technique. SAS executives, for example, report that a growing
way to access the vendor’s algorithms and statistical techniques is through open application program
interfaces, or APIs. This makes it possible to combine analytics with other types of transactional and data
management services in an integrated application.
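As an illustration of what consuming such a micro-analytics service might look like, here is a hedged sketch: the endpoint, payload, and response shape are hypothetical, not any vendor's actual API.

```python
import requests

# Hypothetical micro-analytics service: POST a series, get a forecast back.
payload = {
    "series": [112, 118, 121, 119, 127, 131, 129, 140],  # e.g., weekly demand
    "horizon": 4,                                        # periods to forecast
}
resp = requests.post("https://analytics.example.com/v1/forecast",
                     json=payload, timeout=10)
resp.raise_for_status()
print(resp.json())   # e.g., {"forecast": [...], "model": "..."}
```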
EQUIFAX EVOLVES ITS ANALYTICS ARCHITECTURE
In 2010 Tom consulted at Equifax, a leading provider of consumer credit and financial
information, on an assessment of the company's analytical capabilities. Rick Smith, then and still the
company's CEO and an advocate of competing on analytics, wasn't sure the needed
capabilities were present in the firm. The assessment found that a key barrier to success for
Equifax was that analytics activities took too long to complete due to organizational and
data-related issues. The company had the SAS statistical package, but the absence of an
enterprise data warehouse made it difficult to assemble different types of data in the
necessary time frame. There were pockets of strong analytical capability, but the company
didn’t address analytics as an enterprise resource. The assessment also recommended the
creation of a chief analytics officer role.
Now, seven years later, the climate and capability for analytics have changed dramatically.
Prasanna Dhore is the company’s chief data and analytics officer (he participated in our 2006
“Competing on Analytics” study at a different company). Peter Maynard, who arrived at
Equifax from Capital One (another early analytical competitor), is the SVP of Global
Analytics. He told us that both the technology and the speed with which analytics are
conducted have undergone major change under Prasanna’s leadership and Equifax’s
infrastructure investment.
A big component of the change is the shift to a Hadoop-based data lake, which allows
Equifax to store and assemble multiple types of data with ease and at low cost. The company
leverages the SAS High-Performance Analytics platform to get maximum value out of the
data that resides in Hadoop.
Maynard notes that this in-memory architecture has dramatically accelerated the speed of
analytics at Equifax:
We have moved from building a model using a month of consumer credit data to two
years’ worth, and we are always analyzing the trended data across time. We have a
neural network model that looks at all the data and identifies trends in the consumer’s
credit history. Whenever we introduce new data and variables into the model, we need to
determine how they affect the trend. It used to take about a month to evaluate a new data
source, but now it’s just a few days because of our much faster analytics environment.
Maynard said that the neural network model was developed using SAS’s Enterprise Miner
offering. It’s a complex model, because it requires a set of “reason codes” that help explain
specific credit decisions to consumers.
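For illustration only (this is not Equifax's actual method): reason codes are often derived by ranking how many points each input costs the consumer relative to its best-case value. A toy linear surrogate, with invented weights:

```python
# Hypothetical linear scoring surrogate: weights and best-case values are
# invented for illustration; real credit models are far more complex.
weights = {"utilization": -220.0, "recent_delinquency": -180.0,
           "account_age_years": 45.0}
best_case = {"utilization": 0.0, "recent_delinquency": 0.0,
             "account_age_years": 20.0}

def reason_codes(applicant: dict, top_n: int = 2) -> list:
    # Points lost on each feature versus its best-case value.
    shortfalls = {f: weights[f] * (applicant[f] - best_case[f]) for f in weights}
    return sorted(shortfalls, key=shortfalls.get)[:top_n]  # most costly first

applicant = {"utilization": 0.85, "recent_delinquency": 1.0,
             "account_age_years": 4.0}
print(reason_codes(applicant))  # -> ['account_age_years', 'utilization']
```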
The Equifax analytics technology architecture also makes room for open-source tools like
R and Python. Recent graduates in their data science group like them, Maynard notes, but he
says that Equifax has a lot of existing SAS models and code, and many of its data scientists
and quantitative analysts are comfortable with it. Maynard is also considering moving to
SAS streaming analytics for even more speed and to employ SAS Model Risk Management
for ongoing assessment and governance of models.
Maynard and his colleagues regularly attend SAS events and visit the company’s
headquarters in Cary, North Carolina, for briefings and discussions. Equifax’s analytical
leaders have made major changes in their approaches to analytics, and they are satisfied that
SAS’s offerings are changing along with them.
Data Visualization
Since an analysis is only valuable if it is acted on, analytic competitors must empower their people to impart
their insights to others through business intelligence software suites, data visualization tools, scorecards, and
portals. Business intelligence software allows users to create ad hoc reports, interactively visualize complex
data, be alerted to exceptions through a variety of communication tools (such as email, texts, or pagers), and
collaboratively share data. (Vendors such as SAP, IBM, SAS, Microsoft, and Oracle sell product suites that
include data visualization, business intelligence, and reporting solutions.) Commercially purchased analytical
applications usually have an interface to be used by information workers, managers, and analysts. But for
proprietary analyses, these tools determine how different classes of individuals can use the data. For example,
a statistician could directly access a statistical model, but most managers would hesitate to do so.
The current generation of visual analytical tools—from vendors such as Tableau and Qlik and from
traditional analytics providers such as SAS—allow the manipulation of data and analyses through an intuitive
visual interface. A manager, for example, could look at a plot of data, exclude outlier values, and compute a
regression line that fits the data—all without any statistical skills.
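A rough sketch of what such a tool does behind its point-and-click interface, on synthetic data (the outlier handling here uses a simple robust-residual cutoff; real tools let the user select points by hand):

```python
import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(1)
x = rng.uniform(0, 10, 80)
y = 2.0 * x + rng.normal(0, 1.5, 80)
y[:3] += 25                                   # plant a few outliers

# First-pass fit, then exclude points with large robust residuals.
slope0, inter0 = np.polyfit(x, y, 1)
resid = y - (slope0 * x + inter0)
mad = np.median(np.abs(resid - np.median(resid)))
keep = np.abs(resid - np.median(resid)) < 3 * 1.4826 * mad

slope, intercept = np.polyfit(x[keep], y[keep], 1)   # refit on clean points
plt.scatter(x[keep], y[keep], label="data")
plt.scatter(x[~keep], y[~keep], marker="x", label="excluded")
xs = np.sort(x)
plt.plot(xs, slope * xs + intercept, label="fit")
plt.legend()
plt.show()
```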
Because they permit exploration of the data without the risk of accidentally modifying the underlying
model, visual analytics tools significantly increase the population of users who can employ sophisticated
analyses. Over the past several years they have made “analytics for the masses” much more of a reality than a
slogan. At Vertex Pharmaceuticals, for example, longtime CIO Steve Schmidt (now a medical device
analytics entrepreneur) estimated several years ago that only 5 percent of his users could make effective use
of algorithmic tools, but another 15 percent could manipulate visual analytics. Our guess is that the
percentage of potential visual analytics users has increased dramatically with the availability of these new
tools.
Deployment Processes
This element of the analytics architecture answers questions about how the organization creates, manages,
implements, and maintains data and applications. Great algorithms are of little value unless they are deployed
effectively. Deployment processes may also focus on how a standard set of approved tools and technologies
are used to ensure the reliability, scalability, and security of the IT environment. Standards, policies, and
processes must also be defined and enforced across the entire organization. There may be times when a
particular function or business unit will need its own analytics technology, but in general it’s a sign of
analytical maturity for the technology to be centrally managed and coordinated. Some firms are beginning to
use structured "platforms" to manage the deployment process. One firm, FICO, has a deployment platform and
discusses the deployment issue as managing "the analytics supply chain."19
Latter-stage deployment issues such as privacy and security as well as the ability to archive and audit the
data are of critical importance to ensure the integrity of the data and analytical applications. This is a business
as well as a technical concern, because lapses in privacy and security (for example, if customer credit card
data is stolen or breached) can have dire consequences. One consequence of evolving regulatory and legal
requirements is that executives can be found criminally negligent if they fail to establish procedures to
document and demonstrate the validity of data used for business decisions.
Conclusion
For most organizations, an enterprise-wide approach to managing data and analytics will be a major
departure from current practice; it’s often been viewed as a “renegade” activity. But centralized analytical
roles—a chief data and analytics officer, for example—and some degree of central coordination are signs of a
company having its analytics act together. Top management can help the IT architecture team plan a robust
technical environment by helping to establish guiding principles for analytical architecture. Those principles
can help to ensure that architectural decisions are aligned with business strategy, corporate culture, and
management style.20 To make that happen, senior management must be committed to the process. Working
with IT, senior managers must establish and rigorously enforce comprehensive data management policies,
including data standards and consistency in data definitions. They must be committed to the creation and use
of high-quality data—both big and small—that is scalable, integrated, well documented, consistent, and
standards-based. And they must emphasize that the analytics architecture should be flexible and able to adapt
to changing business needs and objectives. A rigid architecture won’t serve the needs of the business in a
fast-changing environment. Given how much the world of analytics technology has changed in the last
decade, it’s likely that the domain won’t be static over the next one.
CHAPTER NINE
THE FUTURE OF ANALYTICAL COMPETITION
APPROACHES DRIVEN BY TECHNOLOGY, HUMAN FACTORS, AND
BUSINESS STRATEGY
Throughout this book, we have largely described the present state of analytical competition. In many cases,
the analytical competitors we have identified are ahead of their industries in sophistication and progressive
practices and are hence bellwethers leading their peers into the future. In this concluding chapter, we
speculate broadly on what analytical competitors of the future will be doing differently.
As William Gibson once noted, the future is already here but unevenly distributed. We’ve already
observed leading companies beginning to adopt the approaches described later in this chapter, and we believe
they’ll simply become more common and more refined. Like most prognosticators of the future, we predict
more of what we are writing about: more companies choosing to compete on analytics as their distinctive
capability, more companies learning from these analytical competitors to become more analytical themselves,
and analytical firms employing analytics in more parts of their businesses. In other cases, we don’t know of
anybody using a particular practice yet, but logic and trends would dictate that the approach will be employed
before long.
We hesitate to be pinned down as to when these elements of the future will transpire, but we estimate that
five years is the approximate horizon for when many of these ideas will come to fruition. It’s possible that
things could accelerate at a faster rate than we predict if the world continues to discover analytical
competition. For the first thirty or so years of the history of analytics and decision support, actual advances
have been relatively slow. But over the past decade, we’ve seen them accelerate dramatically, as the
introduction to this book describes.
We divide the analytical world of the future into three categories: approaches driven by technology, those
involving human capabilities, and those involving changes in business strategy. Technology probably
changes the most quickly of these three domains and often forces changes in the other areas.
Technology-Driven Changes
A series of technological capabilities are already used on a small scale within organizations, and we expect
they will only expand in the near future. These extrapolations of existing practice include:
Pervasive data. Arguably the biggest change in analytics over the past decade—and probably the next
as well—is the availability of massive quantities of data. The internet and social media applications are
already streaming massive quantities of it—more than 25 terabytes of data streamed by fans at the
2017 Super Bowl, for example. Internet of Things sensors (one estimate suggests 8.4 billion of them
will be in use in 2017) in cars, factories, hospitals, and many other settings will provide much more
data. At an individual level, smartphones, activity trackers, and other personal devices both generate
and receive massive amounts of data.
We will of course need analytics to make sense of all this data, and at the moment we are only
scratching the surface of how pervasive data and analytics can change our work and our lives.
Pervasive data is changing the technologies we use to analyze it and the locations for the analysis;
more analytics are being performed at the edge. Pervasive data also implies a strong need for better
tools—including the machine learning tools we described in chapter 8—for "curating" (cleaning,
integrating, matching, and so forth) data. And data is also playing a more important role in creating and
improving models (see the next trend); that's really what machine learning is all about.
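As a small taste of curation, here is a record-matching sketch in Python (the company names and match threshold are illustrative):

```python
from difflib import SequenceMatcher

crm = ["Acme Corporation", "Globex Inc.", "Initech LLC"]      # source A
web = ["ACME Corp", "Globex Incorporated", "Umbrella Group"]  # source B

def similarity(a: str, b: str) -> float:
    # Normalized string similarity in [0, 1].
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()

for name in web:
    best = max(crm, key=lambda c: similarity(name, c))
    score = similarity(name, best)
    verdict = "match" if score > 0.6 else "no match"
    print(f"{name!r} -> {best!r} ({score:.2f}, {verdict})")
```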
More autonomous analytics and decision making, as opposed to relying on humans to look at data and
make decisions. The resource that’s already most in short supply with analytics is the human attention
to look at them, interpret them, and make decisions on the basis of them. Cognitive technologies, AI,
machine learning, deep learning—all of these will increase the ability of smart machines to do
automated analysis, make automated decisions, and take automated actions. Machine learning already
helps many organizations to dramatically increase the productivity of human analysts by creating
thousands of models in the time previously required for one. Quantitative analysts and data scientists’
jobs aren’t threatened yet, but they do need to learn how to work with these new tools. At the moment,
machine-created models can be difficult to interpret, but in the future we may see machines that can
not only find the best-fitting model for data, but also make sense of it for humans who want an
explanation.
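A sketch of that productivity gain, on synthetic data: a loop that fits one model per customer segment, a task that once meant hand-building each model (the segment count and features are invented):

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(2)
models = {}
for segment in range(1000):                 # one model per segment
    X = rng.normal(size=(200, 5))           # synthetic features
    y = (X[:, 0] + rng.normal(0, 1, 200) > 0).astype(int)  # synthetic label
    models[segment] = LogisticRegression().fit(X, y)

print(len(models), "models trained")        # -> 1000 models trained
```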
The democratization of analytics software. The ability to analyze and report on data is already very
common from a software standpoint. Vendors such as Microsoft embed analytical capabilities in
business versions (particularly Office 365, the cloud version) of Microsoft Office, even including web
analytics and personal productivity analytics. Many traditional application systems, such as Salesforce.com
for sales, marketing, service, and e-commerce applications, include various forms of analytics
and even artificial intelligence capabilities. Smaller companies that can't afford expensive analytics
software packages have available free or inexpensive open-source tools such as R and RapidMiner.
And plenty of big, rich companies are using these free tools as well. At one point, the most advanced
analytical capabilities were expensive, but the pace of development for open-source software is such
that they are now more likely to be free. Of course, lower costs for software are sometimes canceled
out by higher costs of people capable of using it—the data scientists who are expert at open-source
tools may be more expensive than traditional quantitative analysts with proprietary software skills.
Increasing use of in-memory processing for analytics that can dramatically speed the response and
calculation time for typical analyses. Instead of storing the data and algorithms on disk, they’re loaded
into the computer’s memory. These are available from vendors like SAP (Hana), SAS, Tableau, Qlik,
and several more. In the future, we may see even greater speed from in-chip analytics. We’re already
also seeing edge analytics in which some analytics and decision making are performed by small, smart
devices at the edge of a network.
Increasing use of real-time (or at least “right-time”) analytics. It has historically taken some
time—from days to weeks—for firms to extract data from transaction systems, load it into analytical
applications, and make sense of it through analysis. Increasingly, however, managers need to make
more rapid decisions, and firms are attempting to implement analytics in real time for at least some
decisions. Of course, some real-time systems make use of autonomous decision making in order to take
humans out of the loop altogether. The granddaddy of real-time applications with human users is
UPS’s ORION, the project we’ve mentioned throughout this book, to provide routing to UPS drivers.
Before they started to use this application, UPS drivers drove the same route every day. Today, they
get a new route every morning that optimizes their deliveries and pickups based on the packages and
requests that came in last night. Tomorrow (or at least within the next few years), their routes will
change in real time based on such factors as traffic, weather, and new requests from customers for
package pickups.
Most organizations should adopt a right-time approach, in which the decision time frame for a class
of decisions is determined within an organization, and the necessary data and analytical processes are
put in place to deliver it by that time. In an InformationWeek survey, 59 percent of the IT executive
respondents said they were trying to support real-time business information.1 But research that Tom
did with SAP suggests that many managers care about real-time information much more in some areas
than others.2 Companies shouldn't waste their efforts on delivering real-time information when there
isn't a need.
Going beyond alerts to preserve management attention. Alerts have been a useful strategy for
organizations seeking to preserve management attention. They say, “Look at this number—you said
you wanted to hear about it if it went this high!” More organizations are beginning to make use of
automated alerts to notify managers when key indicators are at undesirable levels. Intel, for example,
uses alerts to let supply chain commodity managers know when they should act on purchasing and
pricing data.3 The concern with alerts, however, is that too many of them will lead to "alert fatigue" on
the part of the alerted individuals. Of course, if a system can take an automated action, that prevents
humans from needing to be in the loop at all.
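In code, an alert is little more than a threshold check that fires only on breach; a minimal sketch, with invented metrics and limits:

```python
THRESHOLDS = {"copper_price_usd_lb": 4.50, "days_of_inventory": 12}

def check_indicators(current: dict) -> list:
    # Return a message only for indicators that breach their threshold,
    # so managers are not flooded with routine readings.
    return [f"ALERT: {metric} = {current[metric]} exceeds {limit}"
            for metric, limit in THRESHOLDS.items()
            if current[metric] > limit]

for message in check_indicators({"copper_price_usd_lb": 4.72,
                                 "days_of_inventory": 9}):
    print(message)   # route to email or pager in a real system
```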
More explanatory analytics, as opposed to numbers and programming languages. This trend has been
taking place for a while, simply because many managers prefer to see and digest analytics in visual
formats. Of course, different people have different learning styles. Those who prefer verbal narratives
can increasingly have a visual analytic converted into a story that summarizes the result. The new trend
will be for software to deliver the best format for you as an individual, your data, and your decision or
question. Let’s hope that all these developments mean the end of the pie chart, which visual analytics
experts have noted for years is rarely a useful format.
More prediction and prescription (and less reporting). It’s obviously more useful to predict what’s
going to happen than to explain what’s already happened. Prediction, however, generally requires more
sophisticated analysis and data than reporting or explanation. Prescriptive analytics require enough
context about the task and the situation to make an informed recommendation. Despite these
challenges, predictive and prescriptive analytics are extending into more and more business
domains—from predicting the behavior of customers to telling them what to buy; from predicting
disease to recommending treatment strategies. In a discussion with Salesforce.com users, for example,
Tom heard many of them say that they wanted to move beyond descriptive analytics. One commented,
“We don’t have time for people to look at bar charts and figure out what to do.” They prefer the idea
(which Salesforce and other companies have begun to implement) of “smart data discovery” in which
smart tools identify trends and anomalies in data—with no need for a human hypothesis—and point out
their implications to users. Before long, managers may simply be able to converse with their automated
assistants, who will be able to help interpret their financial reports, point out the weaknesses in their
demand planning forecast, and predict that inventory is likely to run out two quarters from now. In
addition to rapid data discovery, another benefit of this approach is that it’s less subject to biased
human interpretation than traditional descriptive and predictive analytics. If a machine is finding
patterns in data, it’s somewhat more difficult to “torture the data until it confesses.”
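One common building block for such hypothesis-free discovery is an anomaly detector; a minimal sketch on synthetic data:

```python
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(3)
# Columns: weekly sales, churn rate (synthetic "normal" behavior).
normal = rng.normal(loc=[100, 0.05], scale=[10, 0.01], size=(500, 2))
odd = np.array([[100.0, 0.25], [40.0, 0.05]])   # two planted anomalies
data = np.vstack([normal, odd])

clf = IsolationForest(contamination=0.01, random_state=0).fit(data)
flags = clf.predict(data)                        # -1 marks an anomaly
print(data[flags == -1])                         # candidates to surface to users
```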
More mining of text, speech, images, and other less structured forms of data. The mining or detailed
analysis of structured data is by now quite advanced, but the mining of text, speech, images, and even
video is clearly in its early stages and is likely to expand considerably over the next few years.
Technological capabilities to categorize and discern meaning in the lab are already better than humans
in many cases, but they have yet to penetrate many business applications. Consumer applications, such
as Apple’s Siri and Amazon’s Echo/Alexa, are further along, and businesses are beginning to employ
them within products and applications. Deep learning algorithms based on neural networks are able to
learn how to categorize and make decisions on unstructured data—at least when given enough data to
learn from. The availability of labeled training data—for example, the 14 million ImageNet images or
the 8 million labeled YouTube videos—will dramatically improve the performance of these algorithms
over the next several years.
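To give a sense of the machinery, here is a toy image-classifier skeleton in Keras; it is far smaller than anything trained on ImageNet, it is untrained, and the ten output categories are arbitrary:

```python
from tensorflow.keras import layers, models

# Tiny convolutional classifier: stacked convolution and pooling layers learn
# visual features; the final softmax assigns one of ten categories.
model = models.Sequential([
    layers.Conv2D(16, 3, activation="relu", input_shape=(64, 64, 3)),
    layers.MaxPooling2D(),
    layers.Conv2D(32, 3, activation="relu"),
    layers.MaxPooling2D(),
    layers.Flatten(),
    layers.Dense(64, activation="relu"),
    layers.Dense(10, activation="softmax"),
])
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.summary()   # training would require a labeled image corpus
```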
Model management finally comes of age. The other major advance lies at the opposite end of the
analytical process and involves the capture of models, learning, and insights from analytical and
experimental results. In cultures with a broad-scale analytical or experimental orientation, there are
likely to be many models created by many people, each with its own assumptions, variables, and
results. Experiments have designs, test and control groups, and results. How do you keep track of such
models and experiments without a repository? The answer, of course, is that you can’t. Capital One,
one of the earliest firms to embrace experimental design for business, had a repository of findings from
its many experiments, but it was extremely unwieldy for users to search through and learn from. What
the company decided to do, then, was to take a just-in-time approach to providing experimental
knowledge to its analysts. The company built a system to guide the analyst through the process of
designing a new credit card offering for a specified class of customers. It uses knowledge from the
company’s experiments to make suggestions at each stage of the design process about what options
might work best. It might suggest everything from the optimal interest rate for balance transfers to the
best color for the envelope to be used in the direct mail offer to the customer.
Such systems for maintaining information about models are called model management systems, and
they are currently widely used only in financial institutions (Capital One was an early adopter of these
too). They are used in that industry primarily because regulators insist on them. However, as analytics
become an enterprise resource—the source of competitive advantage and considerable value—we
expect to see more model management tools employed, even when they aren’t imposed by regulators.
They will not only provide backup when a quantitative analyst leaves the firm, but also prevent the
need for new analyses when a similar one has already been performed elsewhere in an organization.
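At its core, a model management system is a searchable registry of model metadata; a bare-bones sketch, with invented fields:

```python
from dataclasses import dataclass, field
from datetime import date

@dataclass
class ModelRecord:
    name: str
    owner: str
    variables: list
    assumptions: str
    validation_auc: float
    created: date = field(default_factory=date.today)

registry = []
registry.append(ModelRecord(
    name="card_response_v3", owner="j.smith",
    variables=["income", "utilization", "tenure"],
    assumptions="stable rate environment", validation_auc=0.74))

# Later: has anyone already modeled response using utilization?
hits = [m.name for m in registry if "utilization" in m.variables]
print(hits)   # -> ['card_response_v3']
```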
Human-Driven Changes
While humans don’t change as rapidly as information technologies, there are changes in analytical
competition that will be driven by the capabilities and configurations of analytical people within
organizations. We expect, first of all, that the growth of analytical competition will lead to a need for
substantially greater numbers of analytically oriented people—some number of analytical professionals and
data scientists, and a much larger group of analytical amateurs, as we have called them. If many more
decisions are to be made based on detailed analysis, many more people will have to have some understanding
of how those analyses were performed and when they should be overridden. In short, analytics, and the use of
them in decisions and actions, will be increasingly extended to the frontline analytical amateurs within
organizations.
From where will these professional and frontline analysts come? Some of these, we believe, will come
from business schools and other parts of universities, which have always offered some courses in statistics
and data analysis. Over the last five years, literally hundreds of universities have added degree programs,
certificates, and courses in analytics and data science. We expect that perceptive schools and their students
will focus even more on analytical training in the future. The most advanced data scientists, who come from
more diverse backgrounds like computer science and physics PhD programs, will continue to be sourced
from relatively nonconventional academic sources. As yet, there are very few PhD programs in data science.
Corporations may also need to offer internal programs to educate their people on various forms of
analytics and data science. Cisco Systems, for example, created a distance education program on data science
for interested and qualified employees, partnering with two universities. The program lasts nine months and
concludes with a certificate in data science from the partner university. More than two hundred data scientists have
been trained and certified, and are now based in a variety of different functions and business units at Cisco.
Cisco also created a two-day executive program led by business school professors on what analytics and data
science are and how they are typically applied to business problems. The program also covers how to manage
a workforce that includes data scientists, and how to know whether their products are any good.
Of course, not all workers will need to be involved in all analytical activities. It is likely that analytical
professionals will have to become expert not only in quantitative analysis but also in job and process design
for frontline analysts. They may also have to design information environments for frontline workers that
provide just enough analytics and information for them to perform their jobs effectively. As one analyst (who
refers to the concept as “pervasive business intelligence”) put it, “It’s the ability to take relevant information
that is usually reported up to management and push it down to users. At the various organizational levels, the
data is presented so that people see only what is most relevant to their day-to-day tasks . . . with expectations
and performance clearly identified.”4
We also expect substantially increased use of outsourced and offshore analytical resources. Some of that
has already emerged in the form of "math factories" like Mu Sigma in India. We noted in chapter 7 that it's
difficult to establish the close, trusting relationships between analysts and executives that are necessary for
widespread analytical decision making. However, there are certainly analytical tasks that can be
accomplished without close interactions with executives. Back-office development and refinement of
algorithms, cleaning and integration of data, and the design of small-scale experiments can often be done
remotely. Substantial numbers of analytically trained workers in India, Russia, and China will undoubtedly
be doing more analytical work in the future. Offshore companies in the outsourcing business are beginning to
specialize in such services. However, if an analytical task can be outsourced, there is a growing likelihood
that it can be automated with tools like machine learning. This may reduce the growth of analytics
outsourcing over time.
We also anticipate that increasing numbers of firms will develop strong analytical capabilities within their
IT organizations. We’ve already described Gartner surveys from 2006 to 2016 finding that business
intelligence or analytics was the number-one technology priority of corporations.5 As we have also pointed
out in previous chapters, "better management decision making" is the number-one objective of large
companies that have installed enterprise resource planning systems. With these priorities and objectives, it’s
only natural that CIOs and other IT executives will want to increase the IT function’s ability to support
analytics. This means that they’ll be hiring quantitative analysts, specialists in analytics software, and IT
professionals with expertise in data warehouses and marts. Some, like Procter & Gamble, will make this
capability a primary focus of the in-house IT function, while outsourcing less critical capabilities to external
suppliers. The business functions that IT supports—logistics and supply chain management, marketing, and
even HR—will also be hiring analytical experts with a strong IT orientation. It will become increasingly
difficult to distinguish analytical people in IT from those in the rest of the business. If you’re trying to decide
on a technical specialization, analytics are a good bet.
With the rise of analytical people across the entire organization, we can expect a greater need for structure
and guidance in the management of their activities—either provided by humans or by the technology itself.
As we’ve noted earlier, there will be no shortage of analytical tools, whether they’re spreadsheets, visual
analytical systems, machine learning, or some other form of software. However, if corporate strategies
depend on the results of analytics, they have to be done with accuracy and professionalism.
How will companies provide greater structure and human capability for strategically important analyses?
There will be no single method but rather a variety of tools and approaches. One way is to have the software
guide the analysis process—letting a human analyst know what assumptions about the data are being made,
what statistical analysis approach to employ, or what visual display to best summarize the data. Analytical
software applications can guide an analyst through a decision process, either making the decision itself
(perhaps with a human override option) or ensuring that a human decision maker has all needed information.
Another answer would be substantial education. We believe that most organizations would benefit from
education to create more capable analysts and improve the skills of existing ones. A third would be a group
of “coaches” to help amateur analysts and certify their work as well done. For analytical work that
substantially affects financial performance, internal and external auditors may need to get involved. There is
no doubt that audits are becoming increasingly analytical.6 Whatever the means, companies will need to both
build analytical capability in their employees and ensure that they’re doing a good job. As we’ve already
described, some leading companies are beginning to employ these approaches for building human analytical
capability through some sort of centralized analytics hub.
This attention to human capabilities won’t stop at the analysis stage. Many organizations will begin to
automate decisions and actions, which will have a considerable impact on the humans that previously
performed those tasks. Tom and coauthor Julia Kirby have already written a book on how humans can add
value to smart machines, so we won't go into detail on that topic here.7 But determining the relationships
between humans and machines, and how human work and business processes need to be modified to take
advantage of machine intelligence, is clearly going to be an important topic in the near and distant future.
Strategy-Driven Changes
We anticipate that a number of changes in the analytical environment will be driven by business strategies.
As more firms become aware of the possibilities for analytical competition, they will push the boundaries of
analytics in their products, services, and business models. Virtually every provider of data and information
services, for example, will probably offer analytics to its customers as a value-added service. Data itself has
become something of a commodity, and customers for data often can’t find the time or people to analyze it
themselves. Software, once primarily focused on business transactions, increasingly includes analytics.
We also expect to see more analytics embedded in or augmenting products and services—describing, for
example, the optimum way to make use of those offerings within the customer's business processes. The golf
club sensor we described in chapter 3 that tells you how well you are swinging your club is a good example.
We are already seeing automobiles (or at least insurance companies) that tell you how safely you are driving,
health care and fitness trackers that analyze how healthily you are eating and living, and industrial machinery
that tells you how well you are using it. Even industrial companies like GE and Monsanto are now selling
products or services that tell their customers how to use their offerings more effectively. Of course, we may
tire of all this advice, but it is potentially very useful.
This trend will be only a part of a broader one involving supplying analytics to customers and suppliers.
We’ve already mentioned some firms, such as Walmart, that furnish analytical information to their customers
or channel partners. There are others we haven’t mentioned that are beginning to do this to some degree.
Marriott shares analytical information with both channel partners—online and traditional travel agencies, for
example—and major corporate customers. Channel partners get analyses involving pricing, joint promotions,
and inventory; customers receive data and analysis that helps them with their own travel management. We
expect that most firms will begin to view the internal audience for analytics as only one of several potential
recipients, and that relationships with suppliers and customers will increasingly include the provision of
analytics.
Another strategic trend involves the content of analytics. Thus far, most quantitative analyses are about
internal business entities: inventory units, dollars, customers, and so forth. Most organizations realize,
however, that internal information, no matter how well it’s analyzed, gives a limited window on the world.
Peter Drucker commented in 1998 that management has a tendency to "focus inward on costs and efforts,
rather than outward on opportunities, changes, and threats."8 Drucker said that outside information didn't
exist then, but it does now. Data and analytics are increasingly available on what customers and
non-customers are saying about our company, on trends and concerns in our industries, and on economic and
sociopolitical movements that could affect our futures. Any firm that wishes to control—or at least react
quickly to—outside forces must be applying analytics to this external information.
Thus far external information, when accessed at all, has not been put into a structure for easy, ongoing
access. And it has been the subject of descriptive analytics at best—very little predictive or prescriptive
analytics, at least outside of financial services. But such structured information systems—they might be
called situational awareness systems, as they are in the military and intelligence communities—are beginning
to be found in organizations. Several cities (e.g., Chicago's WindyGrid) and police forces (the NYPD's
Domain Awareness System [DAS] is the best one we’ve seen) are using them. Deloitte has created one
(actually, multiple tailored versions) for its senior executives, and it builds them for clients too. Recorded
Future, a company Tom advises, scans and analyzes internet text to better understand what people are saying
and doing around the world, particularly with regard to intelligence and cybersecurity. Many companies are
using similar approaches to understand customer perceptions about products and brands.
Finally, we expect that strategic concerns will also drive firms to pay substantial attention to new metrics
and their interrelationships in analyses and scorecards. We heard from a number of analytical competitors
that they start with metrics in thinking about applying analysis to a distinctive capability. They either invent a
new metric from their own proprietary data or refine an existing one. As metrics become commonplace (e.g.,
as we have discussed, the FICO score in consumer credit or the batting average in baseball), companies and
organizations go beyond them to new measurement frontiers. We anticipate particularly high levels of
activity in the domain of human resources and talent management, since these have been relatively
unmeasured in the past. Once developed, of course, metrics must be incorporated into established scorecards
and measurement processes, and the relationships between different measures must be explored and
understood. Most importantly, these metrics must be incorporated into business and management
decision-making processes. Just developing a measure, and just using it in some analyses, is never enough.
The Future of Analytical Competition
We’ll end this book by discussing broadly what will happen to analytical competitors in the future. This will
serve as both a summary of the key attributes of analytical competitors and a prediction of the future, because
analytical competitors will continue to do more of what made them successful in the first place.
Analytical competitors will continue to examine their strategies and their business capabilities to
understand where they can get an analytical edge. That’s more and more important as more companies jump
into analytics, at least at a surface level. But the best companies will focus on what makes their organizations
distinctive and how analytics can support or drive a distinctive capability. After they address the most
distinctive areas, analytics will eventually be applied to most other parts of their businesses—their motto will
be, “If it’s worth doing, it’s worth doing analytically.” These companies will identify measures of the
distinctive capability that other organizations don’t yet employ. After they identify a measure, they’ll collect
data on it and embed decisions based on the measures into their daily work processes.
Take Google, for example. The company is perhaps the most analytical firm on the planet today. But the
presence of other analytical companies in its markets hasn’t made it retreat at all. Instead, it’s doubled down
on such capabilities as artificial intelligence software, proprietary mapping data, analyzing the data from its
autonomous vehicles, analyzing YouTube videos, and so forth. It started with its PageRank algorithm and
then advertising algorithms, but has since moved on to being the leader in analytics for human resources,
attribution of digital ads, venture capital, and many others. And not surprisingly, the company continues to
perform extremely well.
In order to continue refining their analytical capabilities, companies will focus on both their human and
technological dimensions. On the human side, they’ll try to further embed an analytical orientation into the
culture and to test as many hypotheses as they can. A 2017 survey of executives in fifty large companies by
NewVantage Partners on big data suggests that while companies have found big data efforts successful and
financially rewarding, the creation of a data-driven culture has been problematic.9 Of those who responded,
86 percent said their companies have tried to create a data-driven culture, but only 37 percent said they've
been successful at it.
The best analytical competitors will keep trying to achieve that type of culture, however. Their executives
will argue for analytical strategies and decisions with passion and personal example. Their managers will
constantly press subordinates for data or analytics before they take major actions. Employees at every level
will use data and analytics to make decisions and take actions. And data and analytics will be employed to
seek out truth, not to advance some executive’s private objectives.
The managers of analytical competitors of the future will not be narrow “quant jocks.” They’ll always be
thinking broadly about whether their analytical models and data are still relevant to their businesses. They’ll
constantly be reexamining the assumptions behind their analytical models. If a particular type of analysis
becomes commoditized throughout their industries, they’ll find some new basis for analytical competition.
They’ll use intuition sparingly but strategically when it isn’t possible to test an assertion or gather data for an
analysis. They’ll be able to be more experimental and innovative. They’ll advocate for new methods and new
technologies like machine learning and cognitive technologies. They’ll be looking for how they can employ
these intelligent machines for new business strategies and models, and how to extract more productivity from
every activity.
As a result of their efforts, they’ll undoubtedly be hotly pursued by other firms that also want to be
analytical competitors. If their employers are smart, these analytical heat-seeking missiles will find their jobs
stimulating and satisfying and will stay put as long as they’re recognized and promoted.
There will continue to be people in these organizations whose job primarily involves developing and
refining analytics—analytical professionals and data scientists. They will either work in a central group or be
highly networked, and they’ll share approaches and ideas. They will also work to educate and partner with
the analytical amateurs of the organizations, who need to understand how analytical models and tools support
them in their jobs. The analytical competitor of the future will also supplement internal analytical resources
with outsourced or offshore expertise. And these firms won’t be shy about thinking of ways that machines
themselves can do the difficult, time-consuming work of analytics. They’ll focus particularly on how to
automate the really labor-intensive part of analytics: preparing the data for analysis.
Analytical competitors will continue to have lots of data that is generated from enterprise systems,
point-of-sale systems, and web transactions, as well as external data of various types from customers and
suppliers. They’ll organize it and put it aside for analysis in warehouses and Hadoop-based data lakes. They
will ensure that data is integrated and common in the areas of their business where it really matters. They’ll
have integrated analytics suites or platforms that support reporting and analytics—both proprietary and open
source. In domains where decisions must be made very rapidly or very often, they’ll embed analysis into
automated decision systems, allowing human overrides only under specified conditions.
Perhaps most importantly, analytical competitors will continue to find ways to outperform their
competitors. They’ll get the best customers and charge them exactly the price that the customer is willing to
pay for their product and service. They’ll have the most efficient and effective marketing campaigns and
promotions. Their customer service will excel, and their customers will be loyal in return. Their supply
chains will be ultra-efficient, and they’ll have neither excess inventory nor stock-outs. They will embed data
and analytics into new innovative products. They’ll have the best people in the industry, and the employees
will be evaluated and compensated based on their specific contributions. They’ll understand both their
internal and their external business environments, and they’ll be able to predict, identify, and diagnose
problems before they become too problematic. They will make a lot of money, win a lot of games, or help
solve the world’s most pressing problems. They will continue to lead us into the future.