Description

BMAL 720
PROJECT/LEADERSHIP INTEGRATION ESSAY INSTRUCTIONS
There are two Leadership Integration essays in this course. The first project essay is due by
11:59 p.m. (ET) on Sunday of module/week 4. The second project essay is due by 11:59 p.m.
(ET) on Friday of module/week 8. Each essay must be a minimum of 2,500 words and may
extend to 3,000 words (approximately 10–12 pages). Each essay must use at least 6 high-quality,
peer-reviewed sources in addition to the course textbooks, and the sources must include the
Ahmed (2019), Bartlett (2013), and Merida (2015) course textbooks. These two projects require
substantial pre-work to research the topics, locate high-quality sources, and format each essay to
strict APA requirements.
Authors should read the assigned chapters in Ahmed (2019) and Bartlett (2013) and the section
topics in Merida (2015). Then answer the question posed, as a Christian leader, with critical
reflection on the topic, supported by in-text citations and examples. Each of these essays is
focused on the development of students as Christian leaders in organizations. Demonstrate what
you have conceptualized about each topic.
Project 1 and 2 Topics
Week 4 – Leadership Integration Essay 1: Why are people the most important aspect in an
analytics-driven culture?
Week 8 – Leadership Integration Essay 2: What are the basic collection and management
methods to keep a business competitive?
BMAL 720
TWO-PART ASSIGNMENT: ONE DISCUSSION BOARD/ONE PROJECT ESSAY
PROJECT/LEADERSHIP INTEGRATION ESSAY GRADING RUBRIC
Levels of achievement: Advanced, Proficient, Developing, Not present

Content (70%)

Abstract
• Advanced (14 to 15 points): An abstract statement with the topic selection provides a clear overview of the paper's contents. The statement adheres to APA style and is in-depth enough to properly introduce the paper.
• Proficient (13 points): An abstract statement is provided with a somewhat clear overview of the paper's contents and topic selection. The statement somewhat adheres to APA style, with some depth.
• Developing (1 to 12 points): A brief overview of the paper is provided, but the paper's contents and topic selection are not clear. The statement does not adhere to APA style and is not in-depth enough to properly introduce the paper.
• Not present (0 points): No abstract statement is provided.

Content
• Advanced (69 to 75 points): Each section is complete, with clear, distinct sections (separate headings for each), and all of the content is addressed thoroughly. Each main point of the subsections is thoroughly addressed with specific discussion and applied examples. A thorough, specific discussion of the topic exists, with a clear application to the results using professional and/or personal life examples. Each section is clearly supported with examples and research.
• Proficient (63 to 68 points): Each section is complete, with somewhat clear, somewhat distinct sections (separate headings for each), and some of the content is addressed thoroughly. Each main point is somewhat addressed with some discussion and applied examples. A somewhat thorough discussion of the topic exists, with an application using professional and/or personal life examples. Each section is somewhat supported.
• Developing (1 to 62 points): Each section is completed but is less clear, without distinct sections, and some of the content is not addressed thoroughly. Some of the main points of the subsections are addressed, with little discussion and few applied examples. A brief discussion of the topic exists, with little to no application to professional/personal life examples. Some areas are vaguely supported with examples and research.
• Not present (0 points): Sections are not completed or properly addressed. The main points of the subsections are not addressed with discussion and applied examples. There is no discussion of the major aspects of the topic with correlation to professional/personal life examples. No research or examples were used.

Conclusion
• Advanced (14 to 15 points): The conclusion offers a well-rounded summary of the topics in the paper and suggests a variety of opportunities for future use of the topics, their relevance, and their purpose.
• Proficient (13 points): The conclusion offers a good summary of the topics in the paper and suggests some opportunities for future use of the topics, their relevance, and their purpose.
• Developing (1 to 12 points): The conclusion offers a summary of the topics in the paper.
• Not present (0 points): No conclusion was provided.

Structure (30%)

Materials/Source
• Advanced (14 to 15 points): The reference page contains at least 6 scholarly sources from within the last 5 years, which are evident within the paper. All required sources are cited and listed on the reference page. The materials are properly cited and quoted in current APA style. The SafeAssign originality score is within a proper range.
• Proficient (13 points): The reference page contains scholarly sources, but not from within the last 5 years, and they are somewhat evident within the paper. The materials are somewhat properly cited and quoted in current APA style. The SafeAssign originality score is somewhat within a proper range.
• Developing (1 to 12 points): The reference page lacks enough scholarly sources from within the last 5 years, and they are not all evident within the paper. The materials are not all properly cited and quoted in current APA style. The SafeAssign originality score is somewhat within a proper range.
• Not present (0 points): The reference page does not include the appropriate scholarly sources. The materials were not properly cited and quoted in current APA style. The SafeAssign originality score is not within a proper range.

Structure
• Advanced (14 to 15 points): The transitions between paragraphs and sections are clear, with proper headings and a logical order. The required length is met.
• Proficient (13 points): The transitions between paragraphs and sections are somewhat clear, with headings and a somewhat logical order. The required length is somewhat met.
• Developing (1 to 12 points): The transitions between paragraphs and sections are not clear, despite the use of headings and an attempted logical order. The required length is not fully met.
• Not present (0 points): Transitions between paragraphs and sections are not attempted, and no logical order exists. The required length is not met.

Style
• Advanced (14 to 15 points): The paper is formatted correctly in current APA style, and the paper is without spelling and grammatical errors.
• Proficient (13 points): The paper follows current APA format with minor errors, and the paper has minor spelling and grammatical errors.
• Developing (1 to 12 points): The paper somewhat follows current APA format, with multiple errors, and the paper has multiple spelling and grammatical errors.
• Not present (0 points): The paper does not comply with current APA format and has numerous spelling and grammatical errors.

Total: /150
Week 4 – Project 1
Leadership Integration Essay 1: Why are people the most important aspect in an analytics-driven
culture?
Reading & Study
Textbook Readings
Ahmed (2019): Sections 9.1-9.8
Bartlett (2013): Chapter 5: Organization: The People Side of the Equation.
Merida (2015): Elijah-“A Man Like Us”
Presentation: The People Part of the Equation.
DISCUSSION BOARD INSTRUCTIONS
Weekly Topic: Organization: The People Side of the Equation
Initial Thread: After completing the required reading assignments, provide a critical-analysis discussion
that draws on your own professional work experience and your learning from the reading. At the
postgraduate level you are not to provide a summary; rather, provide a critical-thinking assessment of the
topic. You must use at least one Biblical citation, one peer-reviewed journal article citation, and one
course primary topic textbook citation to inform your discussion of the topic.
Your initial thread is due on Wednesday of the module week and must be 500–750 words with in-text citation and references.
The following 3 sources must be included in your thread:
• The course primary topic textbook
• At least 1 peer-reviewed journal article
• 1 passage of Scripture
You must cite all sources you used in current APA format.
Copyright © 2013 by Randy Bartlett. All rights reserved. Except as permitted under the United States
Copyright Act of 1976, no part of this publication may be reproduced or distributed in any form or by
any means, or stored in a database or retrieval system, without the prior written permission of the
publisher.
ISBN
MHID
978-0-07-180760-9
0-07-180760-8
The material in this eBook also appears in the print version of this title: ISBN 978-0-07-180759-3,
MHID 0-07-180759-4
All trademarks are trademarks of their respective owners. Rather than put a trademark symbol after
every occurrence of a trademarked name, we use names in an editorial fashion only, and to the
benefit of the trademark owner, with no intention of infringement of the trademark. Where such
designations appear in this book, they have been printed with initial caps.
Dedicated to Wei “Cynthia” Huang Bartlett—Wife
&
Patricia “Patty” Rita Stalzer Bartlett—Mother
(1944–2005)
Contents
Preface
Acknowledgments
Part I Introduction and Strategic Landscape
Big Data
Chapter 1 The Business Analytics Revolution
Information Technology and Business Analytics
The Need for a Business Analytics Strategy
The Complete Business Analytics Team
Section 1.1 Best Statistical Practice = Meatball Surgery
Bad News and Good News
Section 1.2 The Shape of Things to Come—Chapter Summaries
PART I The Strategic Landscape—Chapters 1 to 6
PART II Statistical QDR: Three Pillars for Best Statistical Practice—
Chapters 7 to 9
PART III Data CSM: Three Building Blocks for Supporting Analytics—
Chapters 10 to 12
Notes
Chapter 2 Inside the Corporation
Section 2.1 Analytics in the Traditional Hierarchical Management
Offense
Leadership and Analytics
Specialization
Delegating Decisions
Incentives
Section 2.2 Corporate Analytics Failures—Shakespearean Comedy of
Statistical Errors
The Financial Meltdown of 2007–2008: Failures in Analytics
Fannie Mae: Next to the Bomb Blast
The Great Pharmaceutical Sales-Force Arms Race by Tom “T.J.” Scott
Inside the Statistical Underground—Adjustment Factors for the
Pharmaceutical Arms Race by Brian Wynne
Section 2.3 Triumphs of the Nerds
Proving Grounds—Model Review at The Associates/Citigroup
Predicting Fraud in Accounting: What Analytics-Based Accounting Has
Brought to “Bare” by Hakan Gogtas, Ph.D.
Notes
Chapter 3 Decisions, Decisions
Section 3.1 Fact-Based Decision Making
Combining Industry Knowledge and Business Analytics
Critical Thinking
Section 3.2 Analytics-Based Decision Making: Four Acts in a Greek
Tragedy
Act I: Framing the Business Problem
Act II: Executing the Data Analysis
Act III: Interpreting the Results
Act IV: Making Analytics-Based Decisions
Consequences (of Tragedy)
Act V: Reviewing and Preparing for Future Decisions
Section 3.3 Decision Impairments: Pitfalls, Syndromes, and Plagues in
Act IV
Plague: Information and Disinformation Overload
Pitfall: Overanalysis
Pitfall: Oversimplification
Syndrome: Deterministic Thinking
Syndrome: Overdependence on Industry Knowledge
Pitfall: Tunnel Thinking
Syndrome: Overconfident Fool Syndrome
Pitfall: Unpiloted Big Bang Launches
Notes
Chapter 4 Analytics-Driven Culture
Left Brain–Right Brain Cultural Clash—Enter the Scientific Method
Denying the Serendipity of Statistics
Denying the Source—Plagiarism
Section 4.1 The Fertile Crescent: Striking It Rich
Catalysts and Change
Two-Trick Pony
Section 4.2 The Blend: Mixing Industry Knowledge and Advanced
Analytics
Cultural Imbalance
The Gemini Myths
Notes
Chapter 5 Organization: The People Side of the Equation
Section 5.1 Analytics Resources
Business Quants—Denizens of the Deep
Analytics Power Users
Business Analysts
Knowledge Workers
Section 5.2 Structure of Analytics Practitioners
Integration Synergies
Technical Connectivity
Specialization
Teamwork
Technical Compatibility
Section 5.3 Building Advanced Analytics Leadership
Leadership and Management Skills
Business Savvy
Communication Skills
Training and Experience
On-Topic Leadership by Charlotte Sibley
Expert Leaders (ELs)—Corporate Trump Cards
The Blood-Brain Barrier
Advantages of On-Topic Business Analytics Leaders
Management Types by David Young
Section 5.4 Location, Location, Location of Analytics Practitioners
Outsourcing Analytics
Dispersed or Local Groups
Central or Enterprise-Wide Groups
Hybrid: Outside + Local + Enterprise-Wide
Notes
Chapter 6 Developing Competitive Advantage
Approach for Identifying Gaps in Analytics
Strategy
Protecting Intellectual Property
Section 6.1 Triage: Assessing Business Needs
Process Mapping of Analytics Needs
Innovation: Identifying New Killer Apps
Scrutinizing the Inventory
Assigning Rigor and Deducing Resources
Section 6.2 Evaluating Analytics Prowess: The White-Glove Treatment
Leading and Organizing
Progress in Acculturating Analytics
Evaluating Decision-Making Capabilities
Evaluating Technical Coverage
Executing Best Statistical Practice
Constructing Effective Building Blocks
Business Analytics Maturity Model
Section 6.3 Innovation and Change from a Producer on the Edge
Emphasis on Speed
Continual Improvement
Accelerating the Offense—For Those Who Are Struggling
Notes
Part II The Three Pillars of Best Statistical Practice
Blind Man’s Russian Roulette Bluff
Chapter 7 Statistical Qualifications
Section 7.1 Leadership and Communications for Analytics Professionals
Leadership
Communication
Leadership and Communication Training
Section 7.2 Training for Making Analytics-Based Decisions
Statistical “Mythodologies”
Section 7.3 Statistical Training for Performing Advanced Analytics
The Benefits of Training
Academic Training
Post-Academic Training—Best Statistical Practice
Training Through Review
Section 7.4 Certification for Analytics Professionals
The PSTAT® (ASA) (Professional Statistician)— ASA’s New
Accreditation by Ronald L. Wasserstein, Ph.D.
Professionalism
Notes
Chapter 8 Statistical Diagnostics
The Model Overfitting Problem
Section 8.1 Overview of Diagnostic Techniques
External Numbers
Juxtaposing Results
Data Splitting (Cross-Validation)
Resampling Techniques with Replacement
Standard Errors for Model-Based Group Differences: Bootstrapping to
the Rescue by James W. Hardin, Ph.D.
Simulation/Stress Testing
Tools for Performance Measurement
Tests for Statistical Assumptions
Tests for Business Assumptions
Intervals and Regions
DoS (Design of Samples)
DoE (Design of Experiments)
Section 8.2 Juxtaposition by Method
Paired Statistical Models
Section 8.3 Data Splitting
Coping with Hazards
K-Fold Cross-Validation
Sequential Validation (with Three or More Splits)
Notes
Chapter 9 Statistical Review—Act V
Élan
Qualifications and Roles of Reviewers
Statistical Malpractice
Section 9.1 Purpose and Scope of the Review
Purpose
Scope
Context
Section 9.2 Reviewing Analytics-Based Decision Making— Acts I to IV
Reviewing Qualifications of Analytics Professionals—Checking the Q
in QDR
Restrictions Imposed on the Analysis
Appropriate and Reliable Data
Analytics Software
Reasonableness of Data Analysis Methodology
Reasonableness of Data Analysis Implementation
Statistical Diagnostics—Checking the D in QDR
Interpreting the Results (Transformation Back), Act III
Reviewing Analytics-Based Decision Making, Act IV
Closing
Considerations—Documentation,
Maintenance,
Recommendations, and Rejoinder
Notes
Part III Building Blocks for Supporting Analytics
Chapter 10 Data Collection
Randomization
Interval and Point Estimation
Return on Data Investment
Measuring Information
Measurement Error
Section 10.1 Observational and Censual Data (No Design)
Section 10.2 Methodology for Anecdotal Sampling
Expert Choice
Quota Samples
Dewey Defeats Truman
Focus Groups
Section 10.3 DoS (Design of Samples)
Sample Design
Simple Random Sampling
Systematic Sampling
Advanced Sample Designs
The Nonresponse Problem
Post-Stratifying on Nonresponse
Panels, Not to Be Confused with Focus Groups
Section 10.4 DoE (Design of Experiments)
Experimental Design
Completely Randomized Design
Randomized Block Design
Advanced Experimental Designs
Experimental Platforms
Notes
Chapter 11 Data Software
Section 11.1 Criteria
Functional and Technical Capabilities
Maintenance
Governance and Misapplication
Fidelity
Efficiency and Flexibility
Section 11.2 Automation
Data Management
Data Analysis
Presenting Findings
Monitoring Results
Decision Making
Notes
Chapter 12 Data Management
Information Strategy
Data Sources
Security
Section 12.1 Customer-Centric Data Management
Customer Needs
Data Quality—That “Garbage In, Garbage Out” Thing
Inspection
Data Repair
Section 12.2 Database Enhancements
Database Encyclopedia
Data Dictionaries
Variable Organization
Notes
Concluding Remarks
Appendix: Exalted Contributors: Analytics Professionals
References
Index
About the Author
Preface
“… true learning must often be preceded by unlearning …”
—Warren Bennis
A Practitioner’s Guide to Business Analytics is a how-to book for all those
involved in business analytics—analytics-based decision makers, senior
leadership advocating analytics, and those leading and providing data
analysis. The book is written for this broad audience of analytics
professionals and includes discussions on how to plan, organize, execute,
and rethink the business. This is certainly not a “stat book” and, hence, will
not talk about performing statistical analysis.
The book’s objective is to help others build a corporate infrastructure to
better support analytics-based decisions. It is hard to judge a book by its
cover. To get a feel for the book, look at Figure 6.1 on p. 117, which shows
types of business analytics that can support decision making. Table 6.2 on
p. 118 provides a glimpse of how to organize business analytics projects.
Figure 6.4 on p. 123 depicts how to assess the relative technical difficulties
of a set of business problems. Do these items complement how you think
about your business?
There is a tremendous opportunity to improve analytics-based decision
making. This book is designed to help those who believe in business
analytics to better organize and focus their efforts. We will discuss practical
considerations in how to better facilitate analytics. This will include a blend
of the big-picture strategy and specifics of how to better execute the tactics.
Many of these topics are not discussed elsewhere. This journey will require
continually updating the corporate infrastructure. At the center of these
enhancements is placing the right personnel in the right roles.
This book serves to enrich the conversation as the reference book you can
take into planning sessions. It is usually difficult to find a reference that
addresses the specifics of what to do. This is largely because one size does
not fit all. The first part of the book provides insights into how we can
update our infrastructure; the second part provides three pillars for
measuring the quality of analytics and analytics-based decisions; and part
three addresses three building blocks for supporting Business Analytics.
This book has a great deal of breadth so that professionals, despite not
possibly being on the same page, can at least be in the same book.
The recommendations in this book are based upon the cumulative
experience of analytics professionals incorporating analytics in numerous
corporations—Best Statistical Practice. This book contains 12 sidebars
relating experiences from the field and viewpoints on how to best apply
analytics to the business. The more you get excited about new ideas, the
more you are going to enjoy this insight-intensive book.
Finally, I wish to add that the way companies approach analytics is
evolving. Big Data is accelerating this evolution. I fully expect
disagreements and respect different opinions,1 and so should you. To
optimize your reading experience, you should retain those ideas that fit into
how you think about your business, and leave on the shelf, for now, those
ideas that do not complement your approach. Do you want to win? Do you
want your company to gain market share? Of course you do. Now is your
opportunity to take your game to the next level!
Notes
1. This is a contentious topic and I will not go unscathed.
Acknowledgments
It takes a team effort to write a book by yourself. I am indebted to Isaac
“Boom Boom” Abiola, Ph.D.; Jennifer Ashkenazy; Cynthia “Wei” Huang
Bartlett, M.D.; Sigvard Bore; Bertrum Carroll; H. T. David, Ph.D.; Karen
Fender; Les Frailey; Hakan Gogtas, Ph.D.; James W. Hardin, Ph.D.; Anand
Madhaven; Girish Malik; Gaurav Mishra; Robert A. Nisbet, Ph.D.;
Sivaramakrishnan Rajagopalan; Douglas A. Samuelson; Tom “T.J.” Scott;
Prateek Sharma; Charlotte Sibley; W. Robert Stephenson, Ph.D.; Jennifer
Thompson; Ronald L. Wasserstein, Ph.D.; Brian Wynne; and David Young.
Their specific contributions are listed in the Appendix. A reviewed book
provides a better reading experience.
Part I
Introduction and Strategic Landscape
The ambition of this book is to take up the challenging task of addressing
how to adapt the corporation to compete on Business Analytics (BA). We
share discoveries on how to transform the corporation to thrive in an
analytics environment. We cover the breadth of the topic so that this book
may serve as a practical guide for those working to better leverage
analytics, to make analytics-based decisions.
Big Data
There has been a great deal of large talk about Big Data. One sensible
definition of Big Data is that it comprises high-volume, high-velocity,
and/or high-variety (including unstructured) information assets.1 The
threshold beyond which data becomes Big is relative to a corporation’s
capabilities. As we grow our abilities, the challenges of Big Data diminish.
The application of the term, Big Data, is evolving to include Business
Analytics and the term is overused at the moment, so we will write plainly.
The opportunity stems from the volume, velocity, and variety of the
information content. This torrent of information is collected in new ways
using new technologies. It can add a different perspective and provide
synergy when combined with traditional sources of information. This new
information has stimulated fresh ideas and a fresh perspective on (1) how
business analytics fits into our business model; and (2) how we can adapt
our business model to facilitate better analytics-based decisions.
The first challenge is to wrestle the data into a warehouse. This involves
collecting, treating, and storing high-volume, high-velocity, and high-variety data. We address these growing needs by improving our operational
efficiencies for handling the data. Although Business Analytics can help in
a data-reduction and organizational capacity,2 this is largely an IT issue and
not the subject of this book. IT has introduced exciting new solutions for
expanding hardware and software capabilities. Brute force alone, such as
continually purchasing hardware, is not a long-term plan for avoiding the
Big Data abyss.
The second challenge is to handle the explosion of information extracted
from the data. This is largely a business analytics issue and it is addressed
by this book. If the volume, velocity, and variety of the data are difficult to
manage, then how well are we handling the volume, velocity, and variety of
the information? Previous authors have made the case for improving
Business Analytics. One implication of Big Data is that we need to
accelerate our development of BA.
This book’s best practices will facilitate increasing our capabilities for
performing Business Analytics and integrating the information into
analytics-based decisions. Part I of this book will inform our strategic
thinking, enabling us to develop a more effective plan.
1
The Business Analytics Revolution
“All revolutions are impossible till they happen, then they become inevitable.”
—Michael Tigar3
We are poised to enter a new Information Renaissance that involves
making smarter analytics-based decisions. A grove of recent books4
and articles has made the case for competing based upon business analytics
(BA). These books reveal a potpourri of success stories illustrating the
value proposition.
It took a generation or longer to take full advantage of some past
technological revolutions, such as the automobile, electricity, and the
computer. Business analytics has been introduced to corporations, yet most
lack the infrastructure to fully capitalize on the abundance of high quality
decision-making information. This progression requires significant changes.
Foremost among these are changes in personnel, organization, and
corporate culture. The right infrastructure will facilitate moving from
tactical applications hither and yon, to integrating analytics into the
corporation.
Recent interest in business analytics has been characterized by a growing
awareness of analytics applications, mature IT (Information Technology),
ubiquitous electronic data collection devices, increasingly sophisticated
decision makers, more data-junkie senior leadership, shorter information
shelf life, and “Big Data.”5 We are experiencing such a deluge of data that,
in the future, there is the potential for corporations to be buried in it.
Corporate concerns arising from the inefficient use of analytics extend
beyond just leaving money on the table because of missed opportunities.
Ineffective corporations will not see “it” coming—their demise. They will
not know why they suddenly lost their customers one night or why their
product is still on the shelves. They will have the data to explain it, yet they
will struggle to put the pieces together in time because they will not be
prepared. In addition to the need to face Big Data, there is a second layer to
the problem. Corporations will continue to be awash in dirty data and filthy
information. In a future emergency, they will race to clean the data, filter
information from misinformation, and interpret the findings.
In this book, we dispel stubborn myths and provide a perspective for
understanding the organization, the planning, and the tools needed for
business analytics superstardom. We have seen analytics in the trenches of
effective and ineffective corporations. We leverage the perspectives of
analytics professionals charged with making it happen—that is, those
leading their corporations in how to apply analytics, those basing decisions
upon analytics, and those providing data analysis.
Business Intelligence = Information Technology + Business Analytics6
Information technology and business analytics both involve professionals
leveraging data to provide business insights, which, in turn, facilitate better
decisions. They provide complementary benefits, and we emphasize the
synergy of the two.
Concept Box
Information technology—Gathering and managing data to build a data
warehouse and providing data pulls, reports, and dashboards. (Bringing
the data to the business)
Business analytics—Leveraging data analysis and business savvy to
make analytics-based business decisions. (Bringing the business
questions to the data)
IT involves data collection, security, integrity, management, and
reporting. It begins with gathering data and ends with either constructing a
data warehouse or with using the data warehouse for data pulls, reports, and
dashboards. In reporting, IT measures a consistent set of metrics to track
business performance and guide planning. IT places a great deal of
emphasis on efficiency.
BA is focused upon supporting and making business decisions by
connecting business problems to data analysis—analytics. It tends to work
from the business need to the available or potentially available data. BA
involves reporting, exploratory data analysis, and complex data analysis,
and in our definition, we include analytics-based decision making. We want
to minimize the distance between the decision and the analytics. BA
overlaps with IT with regard to reporting. While IT emphasizes efficiency
and reliability in creating standardized reports that address predetermined
key performance indicators, BA scrutinizes the reports based upon
statistical techniques and business savvy. The BA skill set is valuable for
determining and rethinking how these key performance indicators meet the
business needs. Additionally, the BA skill set includes statistical tools such
as quality control charts and other confidence intervals, techniques that
certainly enhance reports for making better decisions.
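As a rough illustration of this kind of report enhancement (a sketch of the general idea, not an example taken from the book), the following Python snippet computes a 95% confidence interval and simple Shewhart-style control limits for a weekly KPI; the data values and names are invented for illustration.

import math

# Hypothetical weekly conversion-rate KPI values pulled from a standard report
kpi = [0.042, 0.039, 0.045, 0.041, 0.038, 0.044, 0.040, 0.043, 0.037, 0.046]

n = len(kpi)
mean = sum(kpi) / n
# Sample standard deviation
std = math.sqrt(sum((x - mean) ** 2 for x in kpi) / (n - 1))

# Approximate 95% confidence interval for the mean (normal approximation)
half_width = 1.96 * std / math.sqrt(n)
ci = (mean - half_width, mean + half_width)

# Control limits at mean +/- 3 standard deviations; flag points outside them
ucl, lcl = mean + 3 * std, mean - 3 * std
out_of_control = [x for x in kpi if x > ucl or x < lcl]

print(f"mean={mean:.4f}  95% CI=({ci[0]:.4f}, {ci[1]:.4f})")
print(f"control limits: LCL={lcl:.4f}  UCL={ucl:.4f}  flagged points: {out_of_control}")

Even a small addition like this turns a static KPI report into something a decision maker can scrutinize: the interval conveys uncertainty, and the control limits separate ordinary variation from points worth investigating.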
BA is concerned with scrutinizing the data. To this end, it recognizes
nuances or problems with the numbers and traces them back through the
data pipeline to discover what these numbers really mean. BA includes
complex data collection, such as statistical sampling, designed experiments,
and simulations. These endeavors need mathematical, statistical, and
algorithmic tools.
We can discern IT and BA by their skill sets; their software; and their
respective locations in the corporation. IT has a stronger computer software
theme, and BA is about data analysis and analytics-based decision making.
IT usually reports to a CIO. BA often resides in or near the same division as
business operations, closer to the business decisions. BA and IT provide an
important synergy. It is difficult to have BA without IT.
We want to redefine the BA team to make it more inclusive and close the
distance between making decisions that are based upon analytics and
performing data analysis to support these decisions.
The Need for a Business Analytics Strategy
Running a large corporation can be compared to flying a commercial jet in
a storm. Industry knowledge is the equivalent of looking out the windows,
while analytics and advanced analytics—tracking, monitoring, and data
analysis—comprise the various gauges, monitoring equipment, and warning
devices. In some corporations, tracking reports and data analysis cannot
withstand the tiniest scrutiny. This means that some portion of the
corporation’s information is fallacious, and, thus, so are some of the
decisions based upon this misinformation. The promise of analytics is to
provide better facts and to facilitate better analytics-based decision making.
Our world is becoming more complex at a dramatic rate, and our brains7
… not so much. The importance of data analysis has crept up on our
corporations over the past decades. Data is now available in abundance, and
our analysis needs range from being straightforward to being extremely
complex. We want to better integrate business analytics into the decision
making process and thus be able to better compete in the marketplace. We
want to meet the quickening pace of decision making, the increased
business complexity, and the deluge of Big Data. Analytics-based decision
making is essential for making the big decisions and thousands of little
ones.
A history of business failures underscores the need to master how to
compete based upon business analytics. One highly developed application
of analytics is in estimating risk and revealing how to manage it. Many of
those corporations that fared the best during the 2007–2008 financial
meltdown made better analytics-based decisions. First, they validated,
reviewed, and refined their risk models. Second, they understood their
models well enough to believe them and interpret them in the face of human
behavior. To return to our commercial jet example, they understood their
instruments well enough to make sense out of them when looking out the
window provided the wrong answer. AIG,8 Fannie Mae, Freddie Mac,
Citigroup, Bear Stearns, Lehman Brothers, Merrill Lynch, WAMU, Fitch
Ratings, Moody’s, and Standard & Poor’s were all competing based upon
analytics in a prominent manner. At the time, they might not have realized
the extent to which their fortunes and their reputations were exposed to
their ability to leverage business analytics into their decision making.
The Complete Business Analytics Team
Facing the next phase of the Information Age will require rethinking
decision management. The turnaround time allowed for making decisions is
decreasing. The amounts of data and the amounts of misinformation are
rising. We need to extend the business analytics team to include senior
leaders investing in analytics, those consuming the information, those
performing the data analyses, and those directing these practitioners. We
must include analytics professionals, who value statistical and mathematical
analysis and yet their job might not call upon them to perform data analysis.
By including everyone involved, we can foster more cohesion between
decision makers, corporate leaders, and those supplying the data analyses.
Also, we need to extend the analytics conversation about how we can apply
analytics to the business. In Table 1.1, we introduce four basic functional
roles.
Our experience has shown that we need sophisticated analytics-based
decision makers and directors of analytics with strong quantitative training
to meet our business analytics needs. Six Sigma has demonstrated that (1)
we must have leadership advocating change, (2) we can change our culture
to better leverage analytics in decision making, and (3) it is impracticable to
train all of our employees to perform data analysis. Instead, we need to
build a specialized group of business analysts and business quants to
provide the data analysis. Organizing and expanding the business analytics
team will lead to making the other infrastructural changes needed for BA
superstardom.
Section 1.1 Best Statistical Practice = Meatball Surgery
“Most people use statistics the way a drunkard uses a lamp post, more for support than
illumination.”
—Mark Twain
Best Statistical Practice (BSP) is our term for our evolving wisdom
acquired from solving business analytics problems in the field. We must
perform a data analysis within the context of the business need. This need
includes addressing considerations of Timeliness, Client Expectation,
Accuracy, Reliability, and Cost. We perform the data analysis within these
constraints using statistics, mathematics, and software algorithms. These
tools provide business insights that support analytics-based decision
making.9
Through experimentation, and some trial and error, we find solutions that
are fast, client suitable, accurate, reliable, and affordable enough to meet
business needs. We call this ongoing experimentation, The Great Applied
Statistics Simulation. Hence, the cumulative wisdom of Best Statistical
Practice includes our understanding of how to execute techniques quickly,
how to meet the client expectation, what information is needed to make the
analytics-based decisions, how well techniques perform for certain
applications, how to measure the accuracy and reliability of the data
analysis, how we can best leverage the serendipity of data analysis, and
how we can provide analyses inexpensively.
Figure 1.1 Business analytics workbench
Much of our learning comes from performing autopsies (Chapter 9) on
failed and on successful analytics-based decisions and data analyses. We
infer the best techniques, judge the right amount of rigor, develop our
business savvy, and foster the synergism between our training and our
experience. We measure the performance of decisions and techniques where
possible and extrapolate these findings to where it is impossible to measure
performance. For example, a generation of analytics professionals mastered
building predictive models on high-quality banking data. Then they applied
their refined techniques to other applications and to industries where the
data quality was too weak to facilitate mastering the techniques.
Best Statistical Practice consists of know-how built upon this continual
learning, which, in turn, facilitates faster, better, and less expensive
analytics-based decisions. It protects us from hazards that we cannot
anticipate.10 We further develop our BSP by improving our training, our
tools, and our understanding of the business problem. This enables us to
make great advances in expanding our capabilities. Finally, we need to keep
in mind that the three most expensive data analyses continue to be the faulty
ones, the absent ones, and the ones nobody uses. The most expensive
decisions are those that fail to leverage the available information.
We wish to emphasize that analyzing the data is a technical problem
within the business analytics problem. The complete problem includes the
broader business needs: Timeliness, Client Expectation, Accuracy,
Reliability, and Cost. We must solve the analytics problem within these
constraints and work toward an infrastructure that will ease them. Our
academic training ignores these business constraints, thus making it
imperative that we adapt the theory to practice. BSP, combined with good
quantitatively trained leadership, facilitates speed and helps avoid both
under-analysis and overanalysis. Quantitatively trained leaders can be relied
upon to understand the trade-offs involved in cutting corners to perform the
analysis within the broader business constraints.
The last six chapters of this book provide the tools necessary to perform
Best Statistical Practice.
Bad News and Good News
First the bad news—all the exciting breakthroughs about leveraging
analytics to create space-age nanite technology and revolutionize business
are full of embellishments intended to impress us and the shareholders.
Corporations are not as sophisticated or as successful as we might grasp
from the sound bytes appearing in conferences, books, and journals. Instead
opinion-based decision making, statistical malfeasance, and counterfeit
analysis are pandemic. We are swimming in make-believe analytics.
One major part of the problem is that corporations have difficulty
measuring the quality of their decisions and the quality of their data
analyses. To measure these, we often need a second layer of data analyses.
This is one of the most disquieting problems because, just like brain
surgery, it takes a second brain surgeon to figure out if the first brain
surgeon is working the correct lobe. Even with the best analysis, it is very
difficult to measure the quality of some decisions and some data analyses.
At present, there is a rather large gap between obtaining the right data
analysis for a decision and actually making the decision. A great deal of
good data analysis is misdirected and fails to drive the business. Some of
this misdirection suits special interests that want the results to match preset
conclusions.11 Meanwhile, it is difficult for others to recognize when there
is a disconnect between the data analysis and the decision.
Now for some good news—this is all one gigantic opportunity and we can
easily make substantial progress. Business analytics can build enormous
competitive advantages and promote innovation. Analytics simplifies the
overwhelming complexity of information12 and decreases misinformation
emissions. Finally, less is more. A tremendous amount of analytics and
advanced analytics can be omitted. The trick is to discern what we need
from what we want.
The current generation of business analysts and business quants are up to
the technical challenges, and they have made incredible breakthroughs. For
example, applying predictive models to banking has built more intelligent
banks, which is contrasted by the fatal opinion-based decisions and sloppy
analyses involved in the financial meltdown of 2007–2008. Also, today’s
statistical software has evolved in efficiency and capabilities. Finally, for
most corporations, IT has matured and can inexpensively provide the data.
We have the talent, we have the software, and the data is overflowing.
Section 1.2 The Shape of Things to Come—
Chapter Summaries
The corporate pacemaker has quickened and analytics is wanted to speed up
and improve decisions. The ambitions of this book are to provide insight
into how analytics can be improved within the corporation, and to address
the major opportunities for corporations to better leverage analytics.
PART I The Strategic Landscape—Chapters 1 to 6
Part I discusses the infrastructure needed to fully leverage analytics in the
corporation. We will discuss changes in corporate culture, personnel,
organization, leadership, and planning.
Chapter 2, “Inside the Corporation,” discusses analytics inside the
corporation based upon experience from both successes and failures.
Section 2.1 discusses how corporations employ a Hierarchical Management
Offense (HMO), which centralizes authority and decision-making. We will
discuss how the right calibration of Leadership, Specialization, Delegation,
and Incentives can nurture analytics. We outline the typical leaders who
support analytics. We note that advanced analytics is a specialization and
discuss the implications of this in a corporate environment. We review good
delegation practices, pointing out that more authority and decision making
must be delegated to those close to the tacit information. Analytics is a team
sport, best encouraged in a meritocracy with team incentives in place.
Section 2.2 provides notorious examples of failure due to the sloppy
implementation of analytics. We review failures at Fannie Mae, AIG,
Moody’s, Standard & Poor’s, the pharmaceutical industry, among others.
Section 2.3 provides examples of triumphs in statistics. These include a
success story in reviewing predictive analytics at The Associates/Citi and
predicting fraud at PricewaterhouseCoopers.
Chapter 3, “Decisions, Decisions,” underscores the importance of
leveraging the facts. It notes the schism between opinion-based and fact-based decision making. Section 3.1 discusses how corporations make
decisions and how they incorporate data analysis into their decision making
—that is, analytics-based decision making. It clarifies the need for both
industry knowledge and analytics expertise.
Section 3.2 breaks down the process of integrating the data analysis into
the analytics-based decision or action. Autopsies have revealed where the
mistakes occur, and we will discuss the interplay between industry
knowledge and analytics. Section 3.3 discusses a long list of decision
impairments, which distract us from appropriately leveraging the facts.
Chapter 4, “Analytics-Driven Culture,” discusses the contents of
corporate cultures that succeed in leveraging analytics. It clarifies that
analytics is transferrable across all industries.13 Section 4.1 discusses what
is involved in an analytics-driven corporate culture and how such cultures
arise. Section 4.2 helps us to better think about blending analytics and
industry expertise. It also illustrates that corporations tend to understate
analytics in that blend.
Chapter 5, “Organization: The People Side of the Equation,” discusses the
composition (Section 5.1), structure (Section 5.2), leadership (Section 5.3),
and location (Section 5.4) of analytics teams within the corporation. We
note the difference between management and leadership as illustrated by
Warren Bennis in his book On Becoming a Leader.
Chapter 6, “Developing Competitive Advantage,” is the lynchpin of this
book. It discusses how to assess a corporation’s analytics needs (Section
6.1) and evaluate its prowess (Section 6.2). In Section 6.1, we outline how
to assess the analytics needs of the corporation and translate that into a
strategic analytics plan. This plan will clarify the corporation’s needs on an
annual basis. Next, in Section 6.2, we lead the reader through evaluating the
analytics capabilities of the corporation. The difference between the needs
and capabilities is the gap to be addressed. Section 6.3 discusses aggressive
measures for pursuing the wanted analytics capabilities.
PART II Statistical QDR: Three Pillars for Best Statistical
Practice—Chapters 7 to 9
PART II of this book introduces Statistical QDR—the three pillars for Best
Statistical Practice. These pillars—Statistical Qualifications (Chapter 7),
Statistical Diagnostics (Chapter 8), and Statistical Review (Chapter 9)—
enable the corporation to measure the quality of the analytics-based
decisions and the data analyses. This is the methodology behind Best
Statistical Practice. These tools create the momentum for continually
improving the analytics-based decisions and analytics, and they measure
our performance in delivering the same. In short, they allow us to “fly on
instruments” in poor visibility.14 At least one analytics practitioner should
be responsible for overseeing and continually improving each of these
pillars.
Chapter 7, “Statistical Qualifications,” discusses the qualifications
necessary to be competent in making analytics-based decisions and
performing advanced analytics—including those qualifications needed for
reviewers of this work. Section 7.1 reinforces the idea that leadership and
communication skills are an essential part of performing analytics. Section
7.2 discusses the needs and training for more sophisticated decision makers
and presents the training required for digesting statistical results.
Section 7.3 discusses the advantages of applied statistical training. The
delay in certifying statisticians for so many decades has facilitated
charlatanism and a credibility problem. Section 7.4 makes the case for
certifying those who are qualified to analyze your data.
Chapter 8, “Statistical Diagnostics,” discusses the Statistical Diagnostics
that business analysts and business quants should apply and decision
makers should recognize. Here we list the usual suspects and focus on a few
effective techniques. Section 8.1 outlines the various Statistical Diagnostics
needed for pursuing success. Section 8.2 discusses applying multiple
solutions to solve the same business analytics problem. Section 8.3
discusses the family of Data Splitting techniques, whereby we partition the
data into development datasets and validation datasets—the latter are also
called control or hold-out datasets.
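As a rough sketch of the data-splitting idea summarized above (my illustration, not the author's code), the following Python snippet partitions a hypothetical dataset into development and validation folds for k-fold cross-validation using only the standard library.

import random

def k_fold_splits(records, k=5, seed=42):
    """Yield (development, validation) index lists for k-fold cross-validation."""
    indices = list(range(len(records)))
    random.Random(seed).shuffle(indices)        # randomize before splitting
    folds = [indices[i::k] for i in range(k)]   # k roughly equal folds
    for i in range(k):
        validation = folds[i]                   # hold-out (validation) fold
        development = [idx for j, fold in enumerate(folds) if j != i for idx in fold]
        yield development, validation

# Hypothetical dataset of 20 records
data = list(range(20))
for dev, val in k_fold_splits(data, k=5):
    print(f"develop on {len(dev)} records, validate on {len(val)} records")

Each record serves in the validation role exactly once, so performance estimated on the held-out folds is less flattered by overfitting than performance measured on the development data itself.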
Chapter 9, “Statistical Review—Act V,” discusses what is involved in
reviewing analytics-based decisions and data analyses. Section 9.1
discusses the considerations going into the purpose and scope of the review.
Section 9.2 discusses the nuances of reviewing the analytics-based
decisions and the data analyses.
PART III Data CSM: Three Building Blocks for Supporting
Analytics—Chapters 10 to 12
The transition toward an analytics-driven culture requires a number of
infrastructural changes. PART III discusses the three usual soft spots that,
when poorly managed, hold corporations back. Every analytics professional
will recognize the importance of these three building blocks: Data
Collection (Chapter 10), Data Software (Chapter 11), and Data
Management (Chapter 12)—Data CSM. However, time after time
corporations fail to adequately cover these areas. At least one analytics
professional should be responsible for overseeing and continually
improving each of them. We will clarify what is getting overlooked and
dispel the usual myths.
Chapter 10, “Data Collection,” discusses “the matter with” data
collection. Most corporations have weak data collection abilities. They rely
upon the data to find them. We will discuss the application of Design of
Samples (DoS); Design of Experiments (DoE); and simulation, and
juxtapose the characteristics of these techniques with those of
observational, censual, and anecdotal data. Section 10.1 discusses analysis
of observational or censual data—the context for data mining, where the
data tend to find us. Section 10.2 discusses anecdotal means of collecting
information. Section 10.3 discusses the advantages of randomly selecting a
representative subset from a population—DoS. Section 10.4 discusses the
advantages of randomly assigning treatments (or factors) to a representative
subset from a population—DoE.
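To make the DoS/DoE distinction concrete, here is a minimal Python sketch (an illustration under assumed names, not taken from the book) that draws a simple random sample from a hypothetical customer population (DoS) and then randomly assigns each sampled customer to a treatment (DoE).

import random

rng = random.Random(2013)

# Hypothetical population of customer IDs
population = [f"cust_{i:04d}" for i in range(10_000)]

# DoS: simple random sample -- every customer has an equal chance of selection
sample = rng.sample(population, k=500)

# DoE: completely randomized design -- each sampled customer is randomly
# assigned to one of the treatments being compared
treatments = ["offer_A", "offer_B", "control"]
assignment = {cust: rng.choice(treatments) for cust in sample}

counts = {t: sum(1 for a in assignment.values() if a == t) for t in treatments}
print(counts)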
Chapter 11, “Data Software,” communicates the advantages of a
complementary suite of data processing and analysis software tools. Section
11.1 discusses the criteria we consider for designing a suite of software
tools for manipulating data. It clarifies the importance of software breadth
and emphasizes using the right tool to solve the right problem. Section 11.2
discusses the productivity benefits of automated software.
Chapter 12, “Data Management,” closes the book with a discussion about
what all analytics professionals need to know about organizing and
maintaining the data. Datasets are corporate assets and need to be managed
to full effect. Section 12.1 discusses the usual data-consumer needs that
corporations overlook. Section 12.2 presents a number of database
enhancements that will make the data a more valuable asset.
Although these chapters build upon each other, the interested reader might
skip ahead to those chapters most relevant to their needs. Chapters 2 – 4 are
burdened by providing support for the more impactful later chapters.
Notes
1. “3D Data Management: Controlling Data Volume, Velocity and Variety” by Douglas Laney,
Gartner, retrieved 6 February 2001; and “The Importance of ‘Big Data’: A Definition” by
Douglas Laney, Gartner, retrieved 21 June 2012.
2. In some situations, the winner is the first corporation to learn just enough from the data.
3. “The Trials of Henry Kissinger” (2003).
4. To name a few: Competing on Analytics by Harris and Davenport; Super Crunchers by Ian
Ayres; Data Driven by Thomas Redman; The Deciding Factor by Rosenberger, Nash, and
Graham; and Business Analytics for Managers by Laursen & Thorlund.
5. Today’s “Big Data” was unimaginable ten years ago. We expect tomorrow’s datasets to be even
more complicated.
6. There are many definitions of Business Intelligence; while less popular, this one is convenient
for our purposes.
7. Oh, our Stone-Age brains. Our brains have not evolved a great deal during the last hundreds of
thousands of years.
8. See “The Man Who Crashed the World,” Vanity Fair, August 2009.
9. We will use the term “statistical” slightly more often because we want to keep in mind the
uncertainty and the inherent unreliability of data.
10. We do not need to always know exactly how every decision or analysis will fail. In many
situations, it is sufficient to know what works and under what circumstances it works.
11. Like in a court case where each side starts with a conclusion and works backward—that being
the appropriate direction.
12. When analytics is making things more complex, then we are doing it wrong.
13. In statistician-speak, statistics, mathematics, and algorithmic software are invariate to industry.
14. A side benefit is that these tools expose charlatans, or alternatively, force them to work harder
to fool us.
2
Inside the Corporation
“There is one rule for the industrialist and that is: Make the best quality of goods possible at the
lowest cost possible, paying the highest wages possible.”
—Henry Ford
A corporation is an association of individuals—shareholders, embodying their private financial interests, yet possessing distinct
powers and liabilities independent of its members. It can be a “legal
person”1 with the right to litigate, hold assets, hire agents, sign contracts,
etc. Over the years, corporations have needed to adapt to changing
technology. To keep up with the Information Age, their assets have shifted
toward intellectual property, company know-how, and more specialized
knowledge-based professionals. The promise of business analytics will
require greater changes. We will never fully leverage business analytics
without changing the corporate infrastructure—culture, leadership,
organization, and planning!2
In this chapter, we address some characteristics of corporations that affect
how well they can leverage analytics. We discuss the role of analytics inside
the corporation. In the last two sections, we share a number of failures and
successes in applying business analytics.
Section 2.1 Analytics in the Traditional
Hierarchical Management Offense
“I didn’t dictate ever because I really felt that creativity doesn’t come from dictation, it comes
from emancipation.”
—Pen Densham3
“’Politics’ comes from the Greek root poly meaning many and ticks meaning blood sucking
parasites.”
—The Smothers Brothers
The Hierarchical Management Offense (HMO) centralizes power and
decision making. It is characterized by a vertical reporting structure serving
as “ductwork,” dispensing directives downward and vacuuming information
upward. The speed and accuracy of communications moving up and down
depends on the length and quality of the vertical chains of relationships.
More hierarchy means that politics can have a greater impact on analytics …
and everything else.
Leadership, Specialization, Delegation, and Incentives are pivot points
for calibrating the emphasis placed upon analytics. Leadership that
embraces analytics-based decision making produces better decisions.
Specialization facilitates more efficient and effective analytics. Delegating
decisions moves the decision closer to the tacit information and expertise.
Aligned Incentive structures encourage the most productive behavior. These
pivot points facilitate some immediate adjustments to the corporate culture
(see Chapter 4), which can increase the productivity of knowledge-based
professionals.
During the progression of the Information Age, we have seen dramatic
growth in IT to keep pace. Most corporations have built large, efficient data
warehouses. One expectation is that the next phase will focus on better
leveraging this information—this investment. This will involve a new
Information Renaissance, using business analytics to make smarter
analytics-based decisions. The role of analytics inside the corporation will
need to be redefined and expanded. It would be easier if corporations could
enhance their business analytics capabilities while changing nothing about
their current business model. They would prefer to alter analytics so that it
will fit their approach. They want analytics to sell in a sales culture, to
manufacture in a manufacturing culture, and to build things in an
engineering culture. This is reasonable up to a point. However, facilitating
analytics requires change; if only because it is intertwined with the
decision-making process. Complete rigidity against adapting the corporate
structure will dilute the value of analytics.
“General, where is your division?”
—General Nathan Shanks Evans
“Dead on the field.”
—General John Bell Hood
Leadership and Analytics
To succeed in applying analytics, leadership must correctly judge the merits
of analytics and how to best integrate this information into corporate
decision making. There are a number of leadership roles that enhance or
retard a corporation’s analytical capabilities. We will describe five general
leadership roles: Enterprise-Wide Advocates, Mid-Level Advocates,
Ordinary Managers of Analytics, Expert Leaders, and On-Topic Business
Analytics Leaders.
The first two roles are advocates of analytics; they are investors in the
technology. The remaining three roles direct those performing the data
analysis. We find that leaders vary dramatically in the degree to which they
encourage analytics. Those most enthusiastic are likely to have a history of
successfully leveraging analytics—data junkies. Some lead with their own
analytics-based decision making. Such a background makes it more likely
that they will push the company to the next plateau in applying analytics.
Enterprise-Wide Advocates put forth the corporate vision and find the
resources to make it happen. The formal name of the Enterprise-Wide
Advocates is up for grabs. The ubiquitous CIOs are in the running. The less
common Chief Economists would be appropriate leaders. Also, there are
burgeoning new roles, such as Chief Analytics Officer or Chief Statistical
Officer. In Section 5.3, we will discuss the leadership of an enterprise-wide
analytics group. Enterprise-Wide Advocates are in a position to:
1. Promote examples of applying analytics-based decision-making
(Chapter 3)—thus, building an analytics-based or data-driven culture
(Chapter 4).
2. Take an interest in the analytics team’s organization (Chapter 5).
3. Embrace a corporate business analytics plan and make certain that
corporate capabilities are evaluated (Chapter 6).
4. Insist that important analyses be performed by professionals with
Statistical Qualifications, using Statistical Diagnostics, and with
Statistical Review (Chapters 7 to 9).
5. Build and maintain the Data Collection, Data Software, and Data
Management infrastructure (Chapters 10 to 12).
6. Remove conflicts of interest and encourage objective analysis, which
might or might not fit preconceived conclusions.
7. Select like-minded mid-level managers—shrewdly.
8. “Manage a meritocracy,” as mentioned in Competing on Analytics.4
9. Spread breakthroughs in statistical practice across the entire
corporation.
10. Ensure one source of the facts; different corporate units are entitled to their own opinions, just not their own facts.
11. Set the tone as to the value of analytics.
Mid-Level Advocates are critical for projecting analytics into the
appropriate areas of the business—putting the corporate vision in motion.
They can
1. Embrace and advocate analytics-based decision making as the way we
do business (Chapter 3)—thus, affirming an analytics-driven culture
(Chapter 4).
2. Take an interest in the analytics team’s organization (Chapter 5).
3. Embrace a corporate business analytics plan and make certain that
corporate capabilities are evaluated (Chapter 6).
4. Insist that important analyses be performed by professionals with
Statistical Qualifications, using Statistical Diagnostics, and with
Statistical Review (Chapters 7 to 9).
5. Build and maintain the Data Collection, Data Software, and Data Management infrastructure (Chapters 10 to 12).
6. Uphold the meritocracy.
7. Increase the involvement of analytics professionals.
8. Recognize and reward training.
9. Recognize statistical analysis as intellectual property.
10. Quell resistance to analytics.
Typically, when a corporation has an Enterprise-Wide Advocate, it will
have or find Mid-Level Advocates. This complete structure does the most
to integrate analytics into the business.5 If a corporation lacks an Enterprise-Wide Advocate but possesses a Mid-Level Advocate, then there will be a
pocket of analytics behind them.6 This pocket will have markedly less
impact throughout the company.
Directors of those performing data analysis (business analysts and
business quants) fall within a spectrum of management and leadership skills
combined with analytics competence (Section 5.3). We will discuss three
roles in this book: Ordinary Managers of Analytics, Expert Leaders, and
On-Topic Business Analytics Leaders. We define the Ordinary Managers
of Analytics as those with the authority to direct analytics resources, yet
who possess less training in business analytics than those who perform it.
An Expert Leader is someone with the training and experience to lead
analytics, yet less leadership authority. Finally, the On-Topic Business
Analytics Leader has the authority, training, and experience—a triple threat.
These three roles are charged with anticipating the information needs of
decision makers and building an infrastructure that can meet these needs on
a timely basis. Corporations have schedules and must make and remake
decisions based upon whatever information is available. The Ordinary
Managers of Analytics tend to be less engaged in the analytics. The
concerns are that they will think about the business from a perspective that
is too light on analytics and that they will miss critical opportunities. These
managers must delegate shrewdly in order to be successful in analytics.
Most of them will spend a great deal of time managing up7—this is
probably more comfortable for them. We are concerned that they will not
spend enough effort leading the analytics practitioners because they might
not be as comfortable with that aspect of the role.
Next, we consider an informal leadership role—the Expert Leader. We
define an Expert Leader as someone regarded as knowledgeable of the
business, competent in analytics, and possessing leadership skills. This
makes this person “bilingual”8—quant and business. They comprehend the
specialization. They can review an analysis, find mistakes or weak points, and judge its reliability.
A corporation can have several Expert Leaders. They possess business
analytics expertise, yet with less formal people management authority. They
are sometimes informally “chosen” by the other analytical professionals to
boost the leadership and to fill a void as a spokesperson or decision maker.
They support the other analytical professionals, and they maintain the
integrity of the science.
By granting more formal leadership authority to an Expert Leader, we can
derive:
Business Analytics Leader9 = Expert Leader + Formal Authority
This is a bilingual role with sufficient formal authority and business
analytics expertise.
Expert Leaders and Business Analytics Leaders are necessarily trained on
the topic of analytics. They can better identify talent and judge results. They
understand “best practices” and can skillfully lead a team of practitioners. It
is not just about technical ability; it is the way they think. They can think
more statistically about the business problem. They have greater
appreciation for getting the numbers right and they create less burden on the
other analytics professionals on their team. These skilled leaders are usually
less politically astute—a trade-off. We will discuss these three roles further
in Section 5.3.
Specialization
Specializations facilitate hyper-productivity in the corporation; statistics is a
peculiar specialization. Ordinarily the benefits due to analytics are easy to
quantify. We can measure an increase in sales, the lift due to a scoring
strategy, or a decrease in risk. However, there are situations where the
benefits are difficult to measure, difficult to trace, and difficult to claim. It
takes analytics ability to measure and trace the benefits, and it takes
political sway to claim the credit due. Statistics can produce modest returns
for months and then unexpectedly revolutionize the business during a single
day—the serendipity of statistics. Many analytics professionals are
passionate about pushing the business forward. In addition to producing
facts, statistical training facilitates a “scientific” approach to perceiving the
business problem. It accelerates the search for solutions, which are yet to be
revealed through the trial and error approach that produced the industry
knowledge of the past.
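For instance, the lift of a scoring strategy is usually measured as the response rate among the highest-scored customers divided by the overall response rate. A minimal sketch in Python, using invented scores and responses rather than any real campaign data:

# Illustrative lift calculation for a scoring strategy; all data are invented.
scored = [(0.92, 1), (0.85, 1), (0.80, 0), (0.71, 1), (0.66, 0),
          (0.55, 0), (0.43, 1), (0.31, 0), (0.20, 0), (0.12, 0)]  # (score, responded)

scored.sort(reverse=True)                  # highest scores first
top = scored[: len(scored) // 5]           # top 20% by score
top_rate = sum(r for _, r in top) / len(top)
overall_rate = sum(r for _, r in scored) / len(scored)
print(f"lift in top 20%: {top_rate / overall_rate:.1f}x")

On these made-up numbers the top fifth of customers respond at 2.5 times the overall rate, which is exactly the kind of benefit that is easy to quantify and claim.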
Corporations invest in any specialization relative to its perceived value.
Estimating the future value of analytics requires foresight integrated with an
understanding of analytics. For less analytical corporations, the potential of
analytics is often undervalued because of missed opportunities, which have
prevented it from providing value.10 Certification for quants is nonexistent
in some countries and is just beginning in others, so corporations struggle to
judge qualifications. Hence, it can be a challenge for them to discern the
reliability of the results.
The benefits due to analytics are a function of the value of the data, the
technical capabilities, the shrewdness of the applications, and the degree to
which the analytics team is resourced.11 In practice, many corporations
ring-fence resources (retain resources earmarked for a particular corporate
need) based upon their competitors’ resourcing and advice from
consultants. There is no complicated economic calculation.
“Analytically based actions usually require a close, trusting relationship between analyst and
decision maker …”
—Davenport and Harris12
“One important dictum is to make decisions at the lowest level possible.”
—Thomas Redman13
Delegating Decisions
Delegation is an important characteristic of HMO. In general, leadership
needs to delegate decision making toward those who are in the best position
to make the decision. A single corporate-wide decision maker is unlikely to
have the most complete knowledge. Furthermore, leadership needs to
delegate the execution of analytics to analytics professionals, who have
“practiced.” Effective delegation requires trusting relationships. There is a
burden on the leadership to build strong relationships with their analytics
practitioners.
Delegating decision making moves the decision closer to the tacit
information and expertise. Most corporations tend to involve too few
decision makers.14 Dispersing the decision-making burden fosters a smarter
and less autocratic15 corporation. Involving more qualified decision makers
implies greater engagement and elicits higher quality decisions. Analytics is
subject to nuances that cannot be easily explained. Decisions based upon
advanced analytics require more sophisticated decision makers and that the
decision makers possess greater familiarity with the facts. We need decision
makers who are themselves analytics professionals and can (1) trust the
analytics, (2) understand analytics, or (3) build relationships with other
analytics professionals and recognize analytics qualifications.
Delegating analytics moves the execution closer to the training and
experience. This generates dividends by getting things done faster, cheaper,
and better. Part of the speed is in avoiding unnecessary or poor analysis. For
specialized problems like analytics, the most successful approach for "off-topic leadership"16 continues to be straightforward:
Delegate analytics to the specialists.
Leadership with on-topic training has serious advantages in delegating to
those performing analytics:
1. They can better predict and motivate timeliness and accuracy.
2. They can trust techniques.
3. They are in a position to delegate what they understand to experts, who
they can understand and trust.
4. They can better communicate with other analytical professionals.
Analytics is a way of thinking.
5. They just do not need as much time to make competent decisions about
analytics.
6. They understand that the quants engage in “meatball surgery.” The
focus is on the critical aspects of the analysis.
7. They recognize that specialists are not interchangeable parts. Analytics
does not all look the same to them. Hence, they can match the right
task to the right expert’s competencies. This enables specialization
within specialization—the leap from a vertical organization to a
horizontal one (Section 5.2). They do not require that everyone be an
interchangeable part in order to simplify their leadership role.
8. It is empowering for the quants to work with analytically enthusiastic
leadership.
There are tricks for delegating to all types of specialists. Here are some
tips intended to help:
1. All leaders should review and evaluate the results of the assignment.
Ordinarily the means used to accomplish the task are less relevant.
However, in the case of analytics, the means are an integral part of the
results. Managers are responsible for making sure that both the process
and the outcome of the delegated task are consistent with the goals.
2. The idea is to retain responsibility while delegating authority and
accountability. The analytics professional knows what needs to be done
and how to do it, and only needs the opportunity to do it. Delegate the
freedom to make decisions and the authority to implement them.
Managers should communicate to all individuals affected by the
project that it has been delegated and who has the authority to complete
the work.
3. Managers should discuss with the analytics professionals what
resources they need for a task and then empower them to secure those
resources.
4. Good leaders allow employees to participate in the delegation process.
5. If we are concerned that the project will take too long, then include the
deadline as part of the problem to be solved. On-topic leaders are better
at communicating this point.
6. If we are concerned about over-analysis, then we set minimum accuracy targets as part of the problem to be solved. This trick burdens the analytics professionals with stopping unproductive data analysis. We should keep trying to improve solutions, yet let the experts discard useless, misleading analysis.
7. Even on-topic leaders should avoid the tendency to intervene simply
due to style differences.
Corporations with the advanced “power” to delegate have the
advantage.17
Incentives
Corporations run on their incentives. With the proper incentives,
corporations can become highly efficient. Incentives are best when they are
aligned with solving the problems. In the U.S., corporate incentive
structures have steepened in the past few decades. About 30 years ago, U.S.
CEOs received approximately 30 times the average employee’s
compensation. This multiple has since exceeded 340 times. As individual incentives steepen, they encourage more individualism and eventually sociopathic18 behavior, which undermines team cohesion and creates horizontal and vertical rifts in the corporation. Analytics, just like
innovation, thrives in a meritocracy with team incentives.19
Dysfunctional or misaligned incentives will lead a corporation toward
destruction. They can make it hazardous to do the right thing for the
company. The senior management of Bear Stearns had their bonuses
aligned to high-risk behavior.20 This encouraged their fatal mistake of
getting overextended and trapped in a liquidity squeeze.21
Incentives for leadership roles need to be long-term, and there should be
some team incentives. The concern with excessive individualistic incentives
is that they encourage everyone to place their self-interests ahead of the
corporation, creating more politics as employees vie for lottery prizes. We
think analytics needs more teamwork and that usually individual incentives
do little to motivate struggling employees. Those employees who are here
for the wrong reasons are difficult to incent.
Complex undertakings, like some analytical projects involving large parts
of the corporation, can be more efficient with team incentives. A
corporation has an incentive problem when it is hazardous or at least not in
an individual’s best interests to solve the statistics problem appropriately.
Section 2.2 Corporate Analytics Failures—
Shakespearean Comedy of Statistical Errors
“Safety is not a lucky system. It’s a system of science, analysis, and facts.”
—Mark Rosenker, Chairman, U. S. National Transportation Safety
Board22
“Data analysis is an aid to thinking and not a replacement for it.”
—Richard Shillington
“It’s easy to lie with statistics. But it is easier to lie without them.”
—Frederick Mosteller
Statistical malfeasance is one of today’s corporate diseases. Several
corporations have gone bankrupt, missed breakthrough opportunities, and
taken big losses because of statistical mistakes. These mishaps go
undiagnosed even after an “autopsy.” Sometimes it is difficult to trace
business mistakes back to absent or faulty data analysis, just as it is difficult
for corporations to measure the quality of the data analysis. The solution to
difficult decisions is not riverboat gambling. It is to measure the quality of
the information and then interpret the facts. Chapters 7 to 9, Statistical QDR
(Qualifications Diagnostics Review), will cover the means by which to
measure the quality of the facts, including certification for quants.23 If we
cannot measure analytics quality and we are unable to reasonably confirm
that there are no problems with our analytics and our analytics-based
decisions, then those are our problems.
For the remainder of this section, we will share accounts of corporate
failures due to poor analytics practice.
The Financial Meltdown of 2007–2008: Failures in
Analytics
Financial corporations are by necessity the most analytically savvy in
the global economy. Part of the banking cycle includes a financial crisis
that culls the weakest. The tinderbox that facilitated the financial
meltdown of 2007–2008 comprised a web of decisions that were based
upon invalid assumptions and faulty analyses. Once the housing bubble
reached a certain size, there was no gentle recourse and the bubble was
popped by additional mistakes in analytics made by the first victims.
Banks, credit rating agencies, and investors struggled to price the risk of
subprime assets, which are notorious for destroying banks.24 Those
highly leveraged banks with the worst algebra tended to lose the most.
The subprime melodrama went as follows. First, in 1999 the U.S. Congress repealed the Glass-Steagall Act of 1933, which protected the economy from banks becoming too big to fail. In time, investment banks were allowed to raise their leverage—the amount they owe versus their cash on hand—to obscene ratios. This boosted their ROEs (returns on equity) but left them too fragile to withstand significant financial stress.
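The arithmetic behind that fragility is straightforward. A minimal sketch with purely hypothetical balance-sheet figures (not any actual bank's numbers) shows how a higher assets-to-equity ratio magnifies both gains and losses on equity:

# Hypothetical illustration of how leverage amplifies returns and losses on equity.
def equity_return(assets, equity, asset_return):
    """Return on equity implied by a given return on assets and leverage."""
    profit = assets * asset_return
    return profit / equity

for leverage in (10, 30):                        # assets-to-equity ratios (invented)
    equity = 1.0
    assets = leverage * equity
    up = equity_return(assets, equity, 0.02)     # assets gain 2%
    down = equity_return(assets, equity, -0.04)  # assets lose 4%
    print(f"leverage {leverage}x: +2% assets -> {up:+.0%} equity, "
          f"-4% assets -> {down:+.0%} equity")

At 10x leverage a 4% fall in asset values costs 40% of the equity; at 30x leverage the same fall more than wipes it out.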
Next, there was a glut of money looking for high-return AAA
investments.25 Investment banks pooled mortgages into MBSs
(mortgage-backed securities) and CDOs (collateralized debt obligations) for
sale on this market. They took their CDOs to the rating agencies, who
rated the riskiness of these investments. Then the banks sold them to
these investors. This generated huge profits for the banks, who
demanded more mortgages. Money was cheap. Housing prices rose. As
time progressed, the supply of prime mortgages shrank and was
outpaced by demand for CDOs. Exotic mortgages appeared to keep the
cycle going. This facilitated a growth in subprime mortgages. People
who ordinarily could not obtain credit for a single house were buying
multiple houses. Real estate investors were also buying a large portion
of the new purchases. Housing prices continued to rise at unsustainable
rates. It was clear that this was a shift in the economy.
In order to market these CDOs, the banks split them into tranches
based upon riskiness. They sold the least risky tranches and tended to
retain the highest-risk mortgages. Through magical accounting tricks,
they put these off of their balance sheet. In order to cover the risk, most
of the investment banks purchased credit default swaps (CDSs) from
AIG as an insurance policy against the worst that could happen. This
was their “originate to distribute” model, which was supposed to
generate fees and distribute all of the risk.
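For readers unfamiliar with tranching, a rough sketch of the loss waterfall may help; the tranche sizes below are invented and do not describe any actual deal. Losses on the mortgage pool are absorbed from the bottom tranche up, which is why the senior pieces could be marketed as very safe while the retained junior pieces carried the first losses:

# Illustrative tranche loss waterfall with made-up tranche thicknesses.
tranches = [            # (name, thickness as a fraction of the pool)
    ("equity", 0.05),
    ("mezzanine", 0.15),
    ("senior", 0.80),
]

def allocate_losses(pool_loss_rate):
    """Apply pool losses to the junior tranche first, then up the stack."""
    remaining = pool_loss_rate
    allocation = {}
    for name, thickness in tranches:
        hit = min(remaining, thickness)
        allocation[name] = hit / thickness      # fraction of the tranche lost
        remaining -= hit
    return allocation

for loss in (0.03, 0.10, 0.25):
    print(f"pool loss {loss:.0%}:", allocate_losses(loss))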
Fitch Ratings, Moody’s, and Standard & Poor’s
They were the "arbiters of value" for these CDOs. They received lucrative fees from the investment banks for the service of "objectively" rating these CDOs. An investment bank would solicit multiple ratings prior to selling the financial investments. The two agencies offering the most favorable ratings would be chosen and paid; any other rating agency received nothing. According to The Big Short, the rating
agencies did not always have their own models or complete granular
data. Instead, they relied upon models provided by the investment banks
and pooled data for any supporting analysis. Through some strange
alchemy, subprime mortgages were turned into AAA-rated investments.
Countrywide Bank, Golden West, and Washington Mutual
To maintain the pace in loan originations, shadow banks dealt in exotic
mortgages for consumers lacking the usual creditworthiness. These
included “no money down, interest only” loans that would balloon in
payments. These loans made sense only in an economy where housing
prices would continue to rise. However, the increase in housing prices
was unsustainable. Housing prices rose by 124% from 1997 to 2006.
The business was so lucrative that these shadow banks failed to heed the
mounting warnings from their risk models.
AIG Financial Products
AIG was a AAA-rated corporation with a small Financial Products
group that basically insured the risk for large blocks of debt. They
unwittingly amassed a vast portfolio of risky CDSs, which essentially
were insurance policies on mortgage-related securities. Their experience
illustrates the classic risk involved in directing analytics with an
ordinary off-topic manager of analytics.
Years before the bubble burst, AIG changed its management of the
Financial Products group. The incoming manager did not have the on-topic training in mathematics, statistics, and algorithms. This led to a
cultural change from vigorous discussions about how well their models
were performing toward apathy. The new head of FP managed up,
and for an analytics group, this is mismanagement. During this time,
AIG took on incredible risk without realizing it.26 At the onset, they
were insuring tranches that contained 2% subprime mortgages, and
before they realized it, they had grown this proportion to 95%.27
Fannie Mae: Next to the Bomb Blast
Although Fannie Mae and Freddie Mac, the two main government-sponsored enterprises (GSEs), continue to haunt the American financial
scene, we should remember that they were not the first monoline
mortgage companies to fail; they were among the last. Countrywide,
Washington Mutual, and a number of others failed first. Golden West, a
thrift in California, was so toxic that it killed off its buyer, Wachovia,
one of the most widely respected national banks in both commercial and
regulatory circles, forcing its sale to Wells Fargo. Of course, non-mortgage businesses, such as the investment banks Lehman Bros. and
Bear Stearns, also failed. These banks generated enormous quantities of
bad mortgage loans and failed to fully implement their “originate to
distribute” model, which would have distributed 100% of their risk.
In order to understand what blew up the mortgage market in 2007–
2008 (and the problems continuing to the present day), we have to go
back in time. Although I’ve worked in a number of environments, my
perspective is fundamentally based on the years I spent regulating
multiple types of risk across a wide variety of banks at the Treasury
Department. This experience gave me a strong appreciation for the logic
and analytics of risk management, but without the level of intimidation
often felt by non-quants around folks with Ph.D.s in things like
statistics, economics (econometrics), operations research, and others.
These experiences included excoriating whole classes of models that
purported to be precise and statistical, but created so-called “reliability”
indices with no underlying statistics on their actual reliability—to the
delight of users and the chagrin of salespeople. It included catching
quants at major institutions faking a model validation. My experience
taught me to trust my own doubts, above all, and to always question,
with the most humble of attitudes, the most lettered of businesspeople.
By the time I got to one of the mortgage GSEs, I had lost my youthful
arrogance, but also my youthful awe. And I began asking questions—
like a financial Columbo.
When the dot-com bubble burst in 2001, the Fed lowered interest rates
substantially. This caused a refinancing boom, which, as had happened
many times before, drove mortgage banks to greatly expand their
staffing to handle the temporary rate-driven increase in volume. As rates
hit bottom and stayed there, the “refi boom” consumed most of the
traditional borrowers (credit-worthy non-investors) and began to tail off.
It was at this point that “exotic” mortgages began to be engineered as a
tactic to keep the level of mortgage originations at an artificially
elevated level. Many of these exotic mortgages were neg-am (negative
amortization), characterized by low down payments and low monthly
payments—for a limited time. A large portion were “zero down, interest
only” loans. At the same time, corruption blossomed in the real estate
industry, and a significant portion of housing purchases were driven by
investors who, for “zero money down,” could buy a call option on the
housing market.
For the most part, these neg-am mortgage products were not actually
new; they were simply rediscovered by an industry that had little data
covering mortgage performance prior to 1995. Of course, anyone who
had been paying attention in the 1990s should have remembered28 that
the neg-am feature, which played a critical role in the new designs, had
also contributed to the hole in Citibank’s balance sheet in the early
1990s. There were two differences between the 1990s and the 2000s.
First, in the 2000s the commercial banks played a trailing, rather than a
leading, role. They caught up to the “leaders,” namely the mortgage
companies and their investment banking partners, only after the latter
had seemingly proven the concept, with several years of low losses and
high profits. Second, the recession of 1989–1992 shut down the first
neg-am experiment before it could grow too large. Whereas in the
2000s, the neg-am experiment was protracted by Wall Street redirecting
investment capital from the stock market. This capital came from
investors seeking moderate-yield investment-grade bonds. Instead, they
would receive new types of sub rosa junk bonds.
Throughout the 1990s and into the early 2000s, there were two major
groups in (first) mortgages: (1) conventional and government (i.e., FHA
or VA-insured), and (2) conforming and jumbo (determined by size
relative to GSE loan limits). Before the boom in exotics, the GSEs and
FHA dominated the smaller loan product market, with the banks and
investment banks dominating the rest, albeit at somewhat higher rates
and stricter terms.
When exotic mortgages were first introduced in the 2000s,
commercial banks did not enthusiastically receive them. They tended to
focus on building portfolios of loans and servicing assets, and had
relatively strong risk-based federal regulation. It was the mortgage
specialists, some non-bank mortgage originators and thrifts, who
partnered with investment banks to devise these exotic negatively
amortizing loans to save the mostly non-bank mortgage companies from
the typical cyclical downturn and mass layoffs following a refi wave. Of
course, the investment banks were also eager to bite off as much of the
GSE/Govi market as they could—a market that for years they had been
complaining was dominated by institutions with an unfair advantage. A
confluence of several factors led to their enthusiasm for this exotic new
product, including the following:
1. Lack of publicly available data that would have underscored the
poor performance of previous experiments in neg-am mortgages
2. Faith that the “originate-to-distribute” model would really allow
the institutions to remove all credit risk from their books
3. A fundamental belief that, because they are backed by real
property, mortgages present minimal to no risk
4. A belief that the historical lack of a national fall in home prices
since the Depression meant that home prices would never fall
nationwide
5. Greed and envy directed toward the sheer scale and profitability of
the GSEs and a desire to take a piece of it
6. Demand for new investment classes as the dot-com crash
discredited equities generally
7. An overly optimistic faith in the ability of financial engineering to
add significant value
So, what likely happened at investment banks is what I witnessed at
Fannie Mae, when it belatedly began approving exotic mortgage
products for purchase through standard channels in 2005 (it was already
buying securitizations). When asked if he had built a model to estimate
the credit guarantee-fee for one of these negatively amortizing products,
a credit modeler, whom we’ll call Jim since he still works there, said to
the SVP in charge:
Yes, I’ve built a model to calculate the credit guarantee-fee, given the information we have.
However, we have no historical data on this product, so we’ve had to make a lot of very
questionable assumptions for the model.29
Of course, by the time this discussion occurred, these new mortgages
had already reduced the GSEs' market share by 50%, and management
was concerned with keeping the agencies “relevant” in this “new
world.”
Jim’s final comment was, “Rather than rely on this very approximate
pricing, what you should ask yourself is, would you want your son to
buy a house with this mortgage.” This idea was ignored by our senior
management, and in their quest to be relevant, they began originating
exotics. Although I haven’t had the opportunity to analyze the GSEs’
actual losses, I believe that they were pushed into failure not by their
origination of these questionable loans in competition with the
investment banks, or even the purchase of pools of asset-backed
securities based on these exotic mortgages. Additionally, it is a well-
repudiated myth that the requirement that the GSEs invest a minimum
amount in loans for low- and moderate-income borrowers pushed them
over the edge. While the GSEs suffered significant losses on their exotics, their required low-mod product suffered minimal losses. Despite what their critics have implied, the GSEs never
dominated these high-risk markets but always played catch up to the
“more nimble” fully private-sector players. A careful dissection of GSE
losses will likely show that the bulk of them were due to the simple fact
that home prices declined at double-digit percentage rates nationwide.
This decline was due to a bubble that was fueled almost entirely by an
out-of-control private sector. What killed the GSEs was nothing that
they themselves did, although they were guilty of some lapses in
judgment. What killed them was the definition of who they were and the
waters in which they sailed.
As the mortgage banking/Wall Street axis pushed down the first year's monthly payment on mortgages, they generated so much
additional purchase business that they drove home prices higher—to the
point where 10% appreciation per year began to be treated as a new
norm. This rate is clearly unsustainable in a world of 3% inflation and 2
to 5% growth, as housing costs would eventually crowd out all other
consumption and production. These price increases could only be
sustained through a Ponzi “bubble” of neg-am mortgages and house
flipping financed by overrated investment vehicles. Underwriters were
allowed to calculate qualifying income-to-payment ratios based on the
temporarily low payments allowed under these rates—meaning that the
ultimate insurance for lenders on the negatively amortizing loan was
infinitely rising prices, which could only occur if new, and ever more
irrational, buyers were found to pay for increasingly inflated property.
Of course, this greater fool was the very same person that the buyers—
an enormous number of them motivated by pure speculation—were
relying on to protect their minimal investment. It is the collapse of that
mechanism that ultimately brought down home prices and with it the
GSEs, and severely threatened the financial health of the U.S. and many
foreign economies.
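To see how qualifying borrowers on the temporarily low payment flattered their ratios, here is a minimal sketch with invented loan terms and an assumed qualifying payment-to-income ratio; a negative-amortization teaser payment would be lower still than the interest-only payment shown:

# Invented loan terms; illustrates teaser-based qualification, not actual underwriting rules.
def amortizing_payment(principal, annual_rate, years):
    """Standard fixed monthly payment for a fully amortizing loan."""
    r = annual_rate / 12
    n = years * 12
    return principal * r / (1 - (1 + r) ** -n)

principal = 300_000                      # hypothetical loan amount
rate = 0.06                              # hypothetical note rate
full = amortizing_payment(principal, rate, 30)
teaser = principal * rate / 12           # interest-only payment, no principal

ratio = 0.28                             # assumed qualifying payment-to-income ratio
print(f"fully amortizing: {full:,.0f}/mo, qualifying income {full / ratio * 12:,.0f}/yr")
print(f"interest-only:    {teaser:,.0f}/mo, qualifying income {teaser / ratio * 12:,.0f}/yr")

On these made-up terms, qualifying on the interest-only payment understates the income needed to carry the fully amortizing payment by roughly one sixth, and the gap is larger for loans that negatively amortize.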
At this point I have to refer again to Jim. Over the 2007–2008 period,
and perhaps before that, Jim would periodically distribute a very
startling graph. It was simply the average nationwide price for homes
from 1985 to the present, using Fannie Mae’s internal repeat-sales
index, in constant dollars (I think it was indexed to 1995). It looked like
an artist’s rendering of a tsunami—a little wavy line blending into a
massive tsunami towering, preparing to crash. I have reproduced it as
well as I can by using publicly available information in Figure 2.1 (this
chart uses the national FHFA index and the monthly national CPI to
deflate it to January 1991 dollars).
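The deflation step described here is mechanically simple: divide the nominal index by CPI and rebase to the chosen month. A minimal sketch with a few illustrative index values, not the actual FHFA or CPI series:

# Converting a nominal house price index to constant dollars; sample values are illustrative only.
nominal_hpi = {"1991-01": 100.0, "2000-01": 128.0, "2006-01": 226.0}
cpi =         {"1991-01": 134.6, "2000-01": 168.8, "2006-01": 198.3}

base = "1991-01"
real_hpi = {
    month: nominal_hpi[month] * cpi[base] / cpi[month]
    for month in nominal_hpi
}
for month, value in real_hpi.items():
    print(month, round(value, 1))        # index in January 1991 dollars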
Needless to say, the marketing department didn’t much care for Jim.
In 2007 he attended an enormous marketing department meeting. At one
point he asked a senior manager, in this open forum, what we were
going to do when the market crashed. The manager retorted, “Jim,
you’ve been saying the market was going to crash for two years. Tell
me, when is it going to happen? When?” In less than six months, the
overconfident manager’s query was answered and a few months after
that, the manager retired.
We, in the Economic Capital group, were so taken with Jim’s simple
analysis that in the fall of 2007, we developed an alternative credit risk
stress tool. The tool generated possible future home price stress paths
for use in calculating the potential economic damage of a stress to
Fannie Mae. We set just a few simple parameters, such as the depth to
which prices might fall measured as a percentage below the previous
real low, how long it would take to reach the real price nadir, and how
long a recovery would take, as well as the inflation rate that would
allow us to translate stress/mean reversion from the real price space
back to nominal. This model did not claim to predict the future. What it
did was to create apparently plausible stress scenarios, using simple
assumptions and obvious logic, that were far worse than those generated
by the Fannie Mae’s statistical models at claimed probability levels
below 0.1%.
Figure 2.1 Housing Price Index (HPI)
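As a rough illustration of what such a stress-path generator might look like (the parameter values and starting index below are invented, and this is not the tool we actually built), the real price path falls to a nadir below the previous real low, recovers over a set horizon, and is then re-inflated back to nominal dollars:

# Sketch of a simple home-price stress-path generator; all parameters are invented.
def stress_path(start_real, prev_real_low, depth_below_low,
                months_to_nadir, months_to_recover, annual_inflation):
    """Piecewise-linear real price path: fall to a nadir below the previous
    real low, then recover to the starting real level; re-inflated to nominal."""
    nadir = prev_real_low * (1 - depth_below_low)
    real = []
    for m in range(months_to_nadir + 1):
        real.append(start_real + (nadir - start_real) * m / months_to_nadir)
    for m in range(1, months_to_recover + 1):
        real.append(nadir + (start_real - nadir) * m / months_to_recover)
    monthly_infl = (1 + annual_inflation) ** (1 / 12)
    return [p * monthly_infl ** m for m, p in enumerate(real)]

path = stress_path(start_real=200.0, prev_real_low=120.0, depth_below_low=0.10,
                   months_to_nadir=36, months_to_recover=60, annual_inflation=0.02)
print(len(path), "months; nominal trough", round(min(path), 1))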
A talented quant on my team built this simple tool, and we wrote up
some documentation. We were listened to with politeness in the spring
of 2008, but could not get such a stress tool implemented into the
corporate credit modeling infrastructure. Of course, since we were taken
over that summer, the model would clearly have done us no good. My
point is rather the following:
The failure at the GSEs was due to two factors. First, they are
mortgage monolines with government charters. They couldn’t diversify
their risks, and their executives would only maintain their status—and
pay—if they continued to be a major force in the market. Second, and
equally important, was the fact that these firms were run primarily by
people whose job it was to be optimistic and whose imagination only
entertained dreams of greater success—never nightmares of doom, no
matter how obvious the analytical evidence. Like many successful
organizations, they were dominated too much by marketing, in this case
with a heavy dose of government relations. Analytics’ job was to make
fine distinctions in value and risk—not to influence strategy. The GSEs
missed the opportunity to integrate analytics into their strategy. The
only criticism I ever heard of Jim from other, senior members of the
company’s analytics community was that he was a “chicken little” who
was making no friends. No one ever questioned the relevance or
legitimacy of his straightforward analyses.
There are only two ways the GSEs could have possibly saved
themselves. The first was to simply stop lending in the mid–2000s. This
could have been done by recognizing the bubble and pricing themselves
completely out of the market. This would have been incredibly brazen,
but because of the GSEs’ role in the market, it might well have moved
the entire market back to rationality in a way that the federal regulatory
authorities did not have the courage to begin until 2007, and might not
have been able to accomplish earlier. The second survival path would
have been to hedge our credit risk, taking short positions in the
relatively illiquid Case–Shiller index, and/or buying credit default swaps on mortgage-backed ABS—or shorting the ABX index of asset-backed securities. But the faithful don't hedge. They believe in their business, so it was much easier for the pilots to keep running the ship downwind, following the market, than to take a stand and tack into the
headwind. I believe that only a senior management with the ability to
imagine tragedy as well as triumph, and an appreciation of risk analytics
and the courage to follow it wherever it may lead, can be relied upon to
safely steer today’s leveraged risk-taking institutions.
The Great Pharmaceutical Sales-Force Arms Race by
Tom “T. J.” Scott
Many senior leaders are not analytically sophisticated. Some lack even a
basic understanding of statistics or scientific methods. As a result, these
leaders often rely on gut instinct when making decisions. When leaders
use gut instinct, they are relying on long-held beliefs and personal
experience. There are two obvious problems with this. First, markets are
changing rapidly, making many long-held beliefs untrue. Second,
relying on one’s personal experience is like doing important research
with a sample size of one. In this case, ideas outside the experience of
our personal sample are considered less reliable.
In big U.S. pharmaceuticals during the last 15 years, sales forces grew
to an immense and inefficient size, maybe two to six times larger than
necessary. This happened for a host of reasons including:
1. Leaders could not let go of their long-held beliefs.
2. Many of the analysts and consultants that supported them were
under-qualified and all too happy to provide senior management
with results that matched preconceived thinking. This provided
unreliable “analysis.”
Meanwhile, there were some of us working in a statistical
underground with complex contradictory analysis. Unfortunately, we
were unsuccessful in convincing our leadership to act on what we had
found. Our leaders could not discern that our analysis was more reliable.
It was difficult to convince them, because pharmaceutical leaders'
long-held beliefs were formed more than 15 years ago, when sales
interactions between doctor and sales representatives were critical. At
that time, physicians relied on sales representatives to provide efficacy
and safety information. Physicians were trained to diagnose; they had
very little pharmacology training. Pharmaceutical sales representatives
made up for this lack of pharmacology knowledge and helped physicians stay current; the interactions between reps and physicians were valuable and meaningful. At that time a physician might
spend 20 minutes discussing clinical trials and mechanisms of action
with a sales rep.
Today, however, the typical sales interaction is a 45-second chat about
the local sports team while the representative gets a signature for
samples. Physicians rely on other sources of information—often
managed care providers—to evaluate which drugs are efficacious, cost-effective, and covered by insurance companies. One response to the
dwindling influence of the sales force was to deny that there was a
problem, and the other was to increase the size of the sales force.
The business rules that senior leaders use to determine how to size
their sales forces and set activity levels assume that selling interactions
today are as valuable and meaningful as they were 15 years ago.
Unfortunately, these business rules don't distinguish between a 45-second chat about sports and a 20-minute talk about the latest clinical
trial. The business rules assume that all interactions are the same—
valuable and meaningful—and assign every sale that occurs to the
selling activity preceding it. Even more unfortunate is that most
interactions today are not valuable or meaningful, and most aren’t even
memorable—but the business rules don’t know it.
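A toy simulation (all numbers invented) of the bias this creates: when calls are targeted at, and recorded against, doctors who were already going to prescribe, a naive comparison of heavily called and lightly called doctors wildly overstates the value of a call:

# Illustrative simulation of call-attribution bias; every number is made up.
import random
random.seed(0)

doctors = 10_000
true_call_effect = 0.01                  # assumed tiny true lift per call

buckets = {"few_calls": [0, 0], "many_calls": [0, 0]}   # [sales, doctors]
for _ in range(doctors):
    intends_to_prescribe = random.random() < 0.3
    # calls are targeted at (and recorded against) likely prescribers
    calls = 8 if intends_to_prescribe else 2
    p_sale = min(1.0, (0.6 if intends_to_prescribe else 0.05)
                 + true_call_effect * calls)
    sale = random.random() < p_sale
    bucket = "many_calls" if calls > 4 else "few_calls"
    buckets[bucket][0] += sale
    buckets[bucket][1] += 1

for bucket, (sales, count) in buckets.items():
    print(bucket, f"sale rate {sales / count:.1%}")

The gap between the two groups reflects targeting and recording behavior, not the assumed 1% true lift per call, yet business rules that credit every sale to the preceding call would read it as call effectiveness.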
The unreliable analysis grossly exaggerated the benefit that increasing
sales calls had on increasing sales. This is because two things changed
during the last 15 years that were not always evident, but structured
qualitative research eventually showed. First, when a doctor
determined that he or she was going to use a product, the doctor
would see that sales representative more often. Second, when a
representative found out a doctor was going to write more
prescriptions for his or her product, the rep would pretend to make
more sales calls on that doctor. Each year the sales representatives
were assigned more visits to the same physician. So each year they
recorded more visits that didn’t actually occur, and each year they were
assigned more to deliver—slow but consistent increases over many
years. In time, the data could no longer be interpreted at face value—but
it was. This was becoming obvious in both qualitative and quantitative
research, but leaders wouldn’t or couldn’t believe it. And often those
supplying numerical execution data an…