Corporate execs are talking differently on earnings calls to please the machines

Would you talk differently if you knew a machine was listening to you and grading you based on what you were saying, or based on whether you were using positive or negative words, or even if the sound of your voice was optimistic or pessimistic?

Apparently, Wall Street executives are talking differently. They are trying to game machine algorithms on earnings calls.

You’ve heard of George Carlin’s “seven words you can’t say on TV”? We may now have “words you can’t say on an earnings call.”

A recent study found that executives on earnings calls are increasingly avoiding using negative words and trying to sound more upbeat, so machine algorithms will score the call as more “positive” than “negative.”

Oh man. Anything to fool the algos.

This is a new round in the war between machines and people. Machines can fool people, but people are trying to fool machines, too.

All of this makes sense if you understand the evolution of trying to figure out what is “really” going on with corporate earnings.

First, there were earnings reports, which came out of the creation of the Securities and Exchange Commission in the early 1930s. Then there were earnings calls. Then there were analysts trying to figure out the “body language” of the executives on the calls to determine how they “really” felt about their company prospects. Then came machines listening to executives for keywords that were deemed important and deciding whether the calls sounded “upbeat” or “downbeat” based on the words being used.

Now, there’s a new twist: Seems like the executives have figured out that the machines are listening, and that if they (the executives) avoid using certain words that sound “downbeat” or “negative” they can improve the score they will get, and their earnings call will magically sound more positive.
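The kind of scoring being gamed here can be sketched in a few lines: a dictionary-based sentiment model counts positive and negative words in a transcript. The tiny word lists below are toy stand-ins, not the actual dictionaries (in practice, researchers often use the Loughran-McDonald financial word lists), but they show why swapping a blunt negative word for a neutral synonym moves the score:

```python
# Illustrative sketch of dictionary-based sentiment scoring on an earnings
# transcript. The word lists are toy examples, not real dictionaries.
NEGATIVE = {"decline", "loss", "impairment", "headwind", "weak"}
POSITIVE = {"growth", "strong", "record", "improvement", "momentum"}

def sentiment_score(transcript: str) -> float:
    """Return (positive - negative) / total words; > 0 reads as 'upbeat'."""
    words = transcript.lower().split()
    if not words:
        return 0.0
    pos = sum(w.strip(".,") in POSITIVE for w in words)
    neg = sum(w.strip(".,") in NEGATIVE for w in words)
    return (pos - neg) / len(words)

# Same bad news, two phrasings: the hedged version dodges the word list.
blunt = "We saw a decline in margins and a loss in the segment."
hedged = "Margins moderated and the segment underperformed expectations."
print(sentiment_score(blunt) < sentiment_score(hedged))  # True
```

Both sentences describe the same weakness, but only the first trips the negative-word counter, which is exactly the loophole the study says executives have learned to exploit.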

So say Sean Cao, Wei Jiang, Baozhong Yang & Alan L. Zhang, authors of “How to Talk When a Machine Is Listening: Corporate Disclosure in the Age of AI,” published on the National Bureau of Economic Research website.

Their main conclusion: “Firms with high expected machine downloads manage textual sentiment and audio emotion in ways catered to machine and AI readers, such as by differentially avoiding words that are perceived as negative by computational algorithms as compared to those by human readers, and by exhibiting speech emotion favored by machine learning software processors.”

In other words, humans are using words they think the machines want to hear and that will give them a more positive score.

The authors noted that this effect was particularly pronounced among companies whose filings attracted very high machine interest. In other words, the more algorithms paying attention, the more likely the execs were to change their behavior.

Of course, we have known for years about the ability of machines to analyze earnings calls, but the authors say “our study is the first to identify and analyze the feedback effect, i.e., how companies adjust the way they talk knowing that machines are listening.”

OK, so we are in a giant hall of mirrors. Humans (investors) are trying to figure out what other humans (corporate executives) really feel about their company’s prospects by listening to earnings calls that are analyzed by machines, and the humans (corporate executives) are changing their behavior so the machines will tell the other humans (investors) that things are better than they really are, or at least as good as the executives meant them to sound.

Got that? What could go wrong?

“Humans are taking machines and using them to analyze emotional signals so we can analyze other humans more efficiently,” said DataTrek’s Nicholas Colas. “But the machines are doing it on a scale humans could never do. There’s an endless loop that is being set up, and expect this to get even more refined over time.”

Even the study authors are a little worried about where this will ultimately lead us: “Such a feedback effect can lead to unexpected outcomes, such as manipulation and collusion,” they said.
