So be good, for goodness’ sake

Predictive technology can help employers find the roots of both personal and corporate noncompliance. Where are the ethical boundaries?

As data-gathering and analytics technologies gain ever greater ability to squeeze information out of what may feel like thin air, employers face new questions about using these tools to predict and detect behavior. “Can” vs. “can’t” isn’t the only frontier. There’s also “can” vs. “should.” At least one participant in Deloitte’s Cross-Industry Compliance Leadership Summit described themselves as “slightly aghast” at the possibilities.

In addressing the summit, hosted by the Deloitte Center for Regulatory Strategy Americas, Deloitte & Touche LLP Advisory Principal John Lucker said that whatever the benefits of predictive technology, one thing organizations “shouldn’t” do is allow the perfect to be the enemy of the good.

The late British statistician George Box said, “All models are wrong. Some models are useful.” Lucker invoked that principle when he recalled seeing companies invest in the design of predictive systems, only to hold back on implementation while they waited for the data to be perfect. And waited. “Your data is never going to be perfectly clean,” he said. “[If you embrace] this concept that you can’t do anything until your data is all organized and clean, you’ll never get anything done.”

According to Lucker, companies that want to build analytics into their compliance programs – to cross the threshold from “can’t” to “can” – must focus on six goals:

  1. Understand the strategic goal
  2. Understand the nature of the data and the analytic processes that will use it
  3. Integrate the technology of the compliance solution into the business process
  4. Determine what systems need to change and what data needs to be gathered – and know that the priorities compliance brings to these questions may not be the same ones IT brings
  5. Address change management by explaining what is happening, why, and how data-driven compliance helps everyone
  6. Gather all the data that steps one through five above generate, and aggregate it in a way that allows you to measure and adjust what is and isn’t going well

Not every behavioral case requires heavy analytics. One executive at the summit recalled a time when an employee claimed to be out sick – then used a company credit card for a meal in another country. In other cases, a “sentinel effect” – letting people know their behavior is under scrutiny – can influence their behavior before any misdeeds actually happen. Deloitte & Touche LLP Managing Director Tom Delegram noted, “Transparency works. If you send a message at all levels, some of the problems are self-correcting.”
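
That kind of case calls for a cross-check, not a model. As a rough, hypothetical illustration, the Python sketch below joins two invented record sets, reported sick days and corporate-card transactions, and flags any charge made abroad on a day the cardholder was reportedly out sick. The field names, data, and home-country assumption are all made up for the example.

```python
from datetime import date

# Invented records; field names are illustrative, not from any specific system.
sick_days = {("E1001", date(2024, 3, 4)), ("E1002", date(2024, 3, 4))}

card_transactions = [
    {"employee_id": "E1001", "date": date(2024, 3, 4), "country": "FR", "amount": 86.40},
    {"employee_id": "E1003", "date": date(2024, 3, 4), "country": "US", "amount": 12.75},
]

HOME_COUNTRY = "US"  # assumed home country for the example

def flag_sick_day_travel_spend(sick_days, transactions, home_country=HOME_COUNTRY):
    """Flag card charges made abroad on a day the same employee reported sick."""
    return [
        t for t in transactions
        if (t["employee_id"], t["date"]) in sick_days and t["country"] != home_country
    ]

for hit in flag_sick_day_travel_spend(sick_days, card_transactions):
    print(f"Review: {hit['employee_id']} spent {hit['amount']:.2f} in {hit['country']} on a reported sick day")
```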

But in other cases, sophisticated analysis of the information employees generate in the course of doing their jobs — what Lucker called their “data exhaust” — can surface information compliance officers need to know. Sometimes, this is improper behavior already in progress. At other times, the same analysis uncovers latent potential for misbehavior in someone who hasn’t done anything wrong. In those instances, Lucker said, “nudges” can set employees back on the right path, where they remain employed and productive.
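
The summit discussion didn’t specify particular tooling for this kind of “data exhaust” analysis, but one common unsupervised approach is to score each employee’s activity profile against the whole population and surface only the most unusual rows for human review. The sketch below is a minimal illustration using scikit-learn’s IsolationForest on simulated activity counts; the feature columns are assumptions invented for the example, not measures described at the summit.

```python
import numpy as np
from sklearn.ensemble import IsolationForest

# Simulated monthly "data exhaust" per employee. The four columns (after-hours
# logins, USB writes, expense lines, personal-domain emails) are illustrative
# assumptions, not measures from the article.
rng = np.random.default_rng(0)
typical = rng.poisson(lam=[4, 1, 10, 2], size=(500, 4))
unusual = np.array([[30, 12, 55, 20], [25, 9, 40, 18]])  # two deliberately extreme rows
X = np.vstack([typical, unusual]).astype(float)

# Unsupervised anomaly scoring: no labels, just "how unlike the population is this row?"
model = IsolationForest(contamination=0.01, random_state=0).fit(X)
scores = model.score_samples(X)  # lower = more anomalous

# Surface the handful of most anomalous rows for human review -- leads, not verdicts.
for i in np.argsort(scores)[:5]:
    print(f"row {i}: features={X[i].astype(int).tolist()}, anomaly score={scores[i]:.3f}")
```

Anything a model like this flags is a lead for a conversation or a “nudge,” not a finding; the human review step is the point.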

“This is among the most powerful set of tools that’s available to us, and we don’t use them often enough,” Lucker said. “The question is how do you design processes, procedures, and challenges in a way that is friendly, not accusatory? In a way that will tease out more of the bad stuff than if you did nothing?”

This is where the other threshold, from “can” to “should,” comes into play.

One question is how broad to make the data “take.” Lucker noted that existing tools make it easy to review not only an employee’s social media behavior, but also the behavior of that person’s social media contacts and people several levels removed. Some companies do overt psychoanalysis. Checks of credit scores and driving records are already commonplace. Then there are “outlier” behaviors – for example, the person who works weekends unusually often, or who uses a USB drive to take work data home.

“From a data perspective, sometimes it’s not necessary to snoop in great detail, but merely to take an aggregate – to measure each person in the stream of the norm,” Lucker said. “If you’re on one radar screen, you’re probably on others.”
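
As a hypothetical sketch of “measuring each person in the stream of the norm,” the code below reduces one behavior to a single aggregate count per person and expresses it as a distance from the population average, flagging only the extremes rather than inspecting anyone’s activity in detail. The metric, numbers, and cutoff are invented for illustration.

```python
import statistics

# Hypothetical aggregate: weekend logins per person last quarter (invented numbers).
weekend_logins = {"A": 2, "B": 3, "C": 1, "D": 4, "E": 2, "F": 19, "G": 3}

mean = statistics.mean(weekend_logins.values())
stdev = statistics.stdev(weekend_logins.values())

# Express each person as a distance from the norm instead of inspecting details.
z_scores = {person: (count - mean) / stdev for person, count in weekend_logins.items()}

# A wide cutoff keeps the list short; someone far outside the norm here is
# likely to show up on other "radar screens" too.
flagged = {person: round(z, 2) for person, z in z_scores.items() if z > 2.0}
print(flagged)  # only the extreme case clears the cutoff
```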

Transparency plays an important role here as well: If the organization is monitoring employees in ways they can’t plainly see, what is the company’s obligation to tell them it’s happening? Lucker noted that like many employers, Deloitte tells current and prospective employees it will be checking on them, and receives signed permission to do so. But he also noted that at least in the United States, “there are very few boundaries around gathering information on our employees.”

One compliance leader in attendance raised the example of a company that was ahead of its time in behavioral analytics, using data to strengthen its anti-corruption efforts as long ago as 1989. The company was found to have violated privacy laws.

The rules and the practice may have evolved since then, and more organizations accept the idea that “snooping,” under the right controls, is a net moral positive. It’s also attractive to the leaders who would otherwise face more improper behavior on their watch. “Using analytics to predict future problems is a very easy conversation with executives,” a participant said. “It’s a lot easier than saying the problem already happened and we have to fix it.”

A compliance leader who said his organization has had “some turmoil, some arrests,” said his team had amplified its use of data mining and analytics. “We’re looking backwards,” he said. “We’re looking more at the parameters around situations that are neutral-sounding or may seem commonplace. So you’re not looking for what’s wrong; you’re looking for what seems outside a bandwidth of what’s normal.”
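
The speaker didn’t detail how his team defines that bandwidth, but one simple way to picture it, sketched below with invented figures, is a Tukey-style fence around the interquartile range of a neutral-sounding metric: values inside the band pass unnoticed, values outside it get a second look.

```python
import statistics

def normal_band(values, width=1.5):
    """Tukey-style fences: a 'bandwidth of normal' built from the interquartile range."""
    q1, _, q3 = statistics.quantiles(values, n=4)
    iqr = q3 - q1
    return q1 - width * iqr, q3 + width * iqr

# Hypothetical, neutral-sounding metric: invoice amounts approved by one person.
approvals = [1200, 950, 1100, 1300, 1050, 990, 1150, 8200, 1080, 1020]

low, high = normal_band(approvals)
outside = [a for a in approvals if not (low <= a <= high)]
print(f"band of normal: {low:.0f}-{high:.0f}; outside the band: {outside}")
```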

And another summit attendee said implementing a more robust analytics program had saved $17 million in workers’ compensation costs within a year.

Lucker acknowledged that predictive technologies in compliance can point toward possible trouble, but they can’t try or convict anyone.

“You look for data signals,” he said. “You look for anomalies. But just because you’re in a bad zone doesn’t mean you’re going to do a bad thing. It’s been demonstrated that having a low credit score correlates with being a bad driver. But not everyone with low scores crashes their cars.”

Read our full report, Ounce of prevention: Using compliance analytics to unearth the precursors of workplace misconduct, to learn more about predictive technologies in compliance.

