Featured Collaborator for May: Robert Bloomfield

Interview with Robert Bloomfield, Nicholas H. Noyes Professor of Management and Professor of Accounting at Cornell University’s Johnson Graduate School of Management

 

What are your main areas of research?

For the first part of my career at Cornell, I investigated whether firms could boost their stock price—or delay a price decline—by reporting in ways that fool less sophisticated investors. The traditional answer in finance and economics is “no”; managers don’t benefit by fooling less sophisticated investors, because any influence they have on stock price will be eliminated by the competitive trading of more sophisticated investors. 

But through a series of experiments that blend psychological theory with detailed modeling of how trade actually works (a field called “market microstructure”), I have been able to demonstrate a number of reasons that unsophisticated investors still influence stock price, even in markets that allow a great deal of competitive trade. One key reason is that no matter how sophisticated an investor may be, they can still learn from seeing how the market is reacting…and it is very difficult for them to distinguish price movements that reflect real information from those that just reflect the biases of unsophisticated traders.

In 2002, I distilled my findings into the Incomplete Revelation Hypothesis (IRH): information that is more costly to extract from public data is less completely revealed in market prices. The IRH contradicts the more widely known Efficient Market Hypothesis (EMH), which states that the market will completely reveal information as long as the information is made public so that some investors can extract it. But the IRH not only explains the results of controlled laboratory experiments better than the EMH, it also helps explain why both managers and policy makers care so much about how information is presented to investors:

 

  • Managers often try to boost earnings by making overly rosy assumptions, or by excluding long-term commitments from their liabilities. Given that these “earnings management” tactics are spelled out in the footnotes to financial statements, the EMH would predict that they have no effect on stock price, and managers would have no reason to engage in such dubious practices. But the IRH provides a simple explanation: managers are tucking the bad news where only investors with time and training can find it. Policy makers like the Financial Accounting Standards Board and the Securities and Exchange Commission are continually revising reporting standards to limit such “window dressing”.
     
  • Managers who are reporting bad news tend to write annual reports that are longer and harder to read. Again, this is hard to explain with the EMH, but it is a natural prediction of the IRH: managers are trying to conceal bad news in a flood of words (or alternatively, they work harder to make good news easy to read). While the SEC has encouraged firms to write in Plain English, it and the FASB have only just begun to tackle the question of when detailed disclosure is “obfuscation” rather than transparency.

Earnings management, window dressing and obfuscation are often perfectly legal, but they still seem both harmful and ethically questionable. Even worse, such behaviors are even more prevalent inside the firm, when managers report to their bosses, peers and subordinates (rather than external investors). This has been the focus of my more recent research and teaching. My first step has been to provide a clear definition of an important class of misreporting, which I call measure management: focusing on improving the measures of performance, rather than improving the true performance those measures are intended to reflect. 

The great social scientist Donald T. Campbell provided a useful starting point for understanding measure management when he articulated what has come to be known as Campbell’s Law:

Campbell’s Law: “The more any quantitative social indicator is used for social decision-making, the more subject it will be to corruption pressures and the more apt it will be to distort and corrupt the social processes it is intended to monitor.”

Unfortunately, Campbell’s Law isn’t very useful in identifying solutions to this widespread problem, because it suggests that measure management will occur any time a measure is tied to incentives. In response, I have crafted a Law of Measure Management that identifies the three necessary conditions for measure management to occur:

Law of Measure Management:  Measure management arises when incentivized measures capture performance constructs with error, the people being evaluated know the details of how performance is measured, and people have discretion to distort either operations or reporting.

This extension of Campbell’s Law is useful in two ways. First, it helps organizations identify several ways to mitigate measure management, since removing any one of the three conditions should eliminate it. As an example of all three conditions at work, consider how Volkswagen responded to strong incentives to report favorable emissions measures to the Environmental Protection Agency (EPA). Because strong emission control reduces performance and increases wear, VW engineered their cars to ramp up emission controls only when they were being tested for emissions; the cars could detect a test by noting, for example, that the engine was running hard but the wheels were not turning. VW’s engineers knew exactly how emissions were measured, and they had the discretion to distort operations accordingly.

The Law of Measure Management also helps us distinguish between two broad classes of measure management, depending on whether the manager has distorted reporting or operations. Discussions of reporting ethics almost always focus on situations where someone is distorting the reporting process itself, as in the case of teachers who change students’ answers on high-stakes tests, or doctors who understate the severity of complications after surgery. 

But few consider the serious ethical issues involved in distorting operations, such as teaching to the test or turning away high-risk patients. In fact, operational distortion typically causes more direct harm than reporting distortion: students learn less when teachers teach to the test, and high-risk patients receive worse medical care when doctors turn them away. Despite these concerns, most people don’t even recognize operational distortion as an important topic for discussions of business ethics.

 

How does your work on accounting help companies that want to improve themselves as ethical systems?

My research reinforces a maxim long emphasized by accountants: it is most useful to view ethical behavior as a result of institutional choices, rather than as a result of moral character. It isn’t that a manager’s character has no effect on the ethicality of their decisions, just that it tends to be swamped by institutional forces. In general, people are far too prone to attribute ethical actions to character when they are actually driven by environment; this bias is so well documented it has been named the Fundamental Attribution Error. Moreover, we rarely know a person’s character, and even if we do, we don’t know much about how to change it. 

But we know a great deal about the systems we place people in, and how those systems can encourage ethical choices. Using the Law of Measure Management, we know we can limit unethical reporting behaviors by not placing too much incentive on measures, by making sure those measures do a good job of capturing the performance they are intended to, by not letting people know too much about how the measure is created, or by limiting their discretion to distort operations and reporting. The ‘or’ is very important here, because every one of these responses encourages ethicality at some cost. Low incentives may result in low motivation. Good measures require a lot of data. Concealing measurement methods may make it hard for employees to know what you want from them. Limiting discretion can limit organizational agility. Each company must decide which of these costs are most bearable.

We can place this advice in the broader context of the well-known ‘fraud triangle’, which is shorthand for the theory that people engage in fraud and other ethically questionable actions when they are under great pressure, when they have opportunities to avoid detection, and when they can easily rationalize their choices. Again, you only need to address one of these conditions to make decisions more ethical: don’t impose too much pressure on your employees and they won’t look for opportunities to get away with unethical actions; treat them and your other stakeholders fairly and respectfully, and they will find it very difficult to rationalize behavior they know is causing harm.

 

If you could only highlight one paper or research finding (or piece of work that you’ve been involved with) that relates to Ethical Systems, which one would it be and why?

One of my recent experiments shows that we are more effective at deceiving others if we also deceive ourselves. 

Jeremy Bentley and I gave one group of subjects (the “reporters”) some information about two investment opportunities, and asked them to make a recommendation to someone from the other group (the “users”). But the reporters weren’t disinterested—they received a commission for persuading the user to invest in one of the opportunities, but not the other. Half of the time, the user had a chance to meet with the reporter face-to-face after receiving the recommendation but before choosing their investment. 

Most of the reporters’ recommendations were self-serving: reporters pushed the opportunity that earned them a commission. Users were better able to divine the reporters’ true beliefs when they met face-to-face, which is consistent with the theory that people often convey their true beliefs (and their deception) through a number of subtle cues, like body language, hesitations and word choice.  But the key result in the paper was that reporters were more persuasive when they changed their own sincere beliefs to think more highly of the opportunity that would earn them a commission.  Deceiving themselves in this way made it easier for them to deceive others, because they didn’t have any deception to betray:  they believed in what they were selling, even though that belief was shaped by their incentives. 

This study shows the difficulty of eliminating deception in business contexts. Admonishing reporters to act ethically probably wouldn’t have helped the users in our experiment, because the reporters probably thought they were behaving ethically. After all, they said what they believed!

 

Tell us about one of your current or future projects (perhaps something on an upcoming book or paper?)

I currently have a series of studies in progress on moral attitudes toward measure management. We conducted two large surveys, each of approximately 1,000 American households, and presented respondents with examples of reporting and operational distortion. 

In our first survey, we asked people how acceptable it would be for a salesman who wants to hit a sales target to misreport when a sale occurred (reporting distortion) or to give a big discount that increased sales but reduced profits (operational distortion). 

In the second, we asked people how acceptable it would be for a school principal to improve test scores by cherry-picking which students’ scores were reported (reporting distortion) or by ordering teachers to teach to the test (operational distortion). Even though we emphasized the harm caused by operational distortion, people still tended to view reporting distortion as less acceptable. This was especially true among those who care deeply about the moral value of “purity”.

 

If you could only give one piece of advice to companies, what would it be?

Focus on the environment you are creating for your employees, not on their moral characters. 

 

If you could only give one piece of advice to individuals, what would it be?

My answer to this one is pretty much the same as for companies: focus on the environment you are shaping for yourself, rather than on whether you are acting ethically. Think like a dieter. If there is a favorite dessert in the house, we have the pressure to cave in (our grumbling belly), the opportunity to do so, and almost always a ready rationalization for why it’s OK. So even though breaking a diet isn’t typically fraudulent or unethical, we can still use the fraud triangle to predict that resistance will be close to futile. We all know how hard it is to resist that slice of cake calling to us from the refrigerator, so we don’t buy it, or we send it home with our dinner guests! Much the same is true for decisions with more serious ethical consequences. If you don’t want to be the person who distorts reporting or operations in unethical ways, don’t put yourself in situations that are ripe for measure management and other problematic behaviors. 


Featured Video

A playlist of videos on measure management and ethics, all drawn from my book, What Counts and What Gets Counted.


Featured Academic Article

I have a thought piece in Accounting Horizons called “A Pragmatic Approach to More Efficient Corporate Disclosure” (see the non-paywalled draft version, “Pragmatics, Implicature and the Efficiency of Elevated Disclosure”). 

Featured Popular Article

Readers of Ethical Systems might be interested in “How to Be a Good Professor.” While it is directed at professors, there is a lot of advice in there that would help anyone be a Good Boss, a Good Employee, or a Good Professional. 

Note that the advice will help you be good at your job, not necessarily successful. So I discuss a number of ethical quandaries that arise when those two goals diverge, and we are led to manage measures rather than the performance we truly care about.