A JP Morgan intern who had to click on 12,000 websites regrets nothing

Algorithms can't do everything, as one unfortunate intern found.

Update, March 31: This story has been updated with details on the intern’s work experience.

The world is awash in stories of interns and low-level staffers doing grunt work: The Devil Wears Prada covered it best.

All of those stories, however, came before the advent of the Internet. Now internships have a whole new frontier of intense demands: clicking, clicking, clicking.

Consider the case of the unfortunate JP Morgan intern whose job was to sift through 12,000 websites that carried the bank’s advertising.

One by one.

Click by click.

12,000 times.

Why humans still think better than machines

Here’s the background: on Wednesday, The New York Times published an article about JPMorgan Chase’s new whitelisting policy, detailing how the company has moved away from programmatic advertising, which dropped the bank’s ads on virtually any website, toward more human intervention.

How it all started: a Times reporter telling the company that a Chase ad was appearing on a “Hillary 4 Prison” website.

It became an unnamed intern’s job to sift through 12,000 websites where JPMorgan’s ads were showing some activity.

Over a 30-day period, the intern had to click on each of those 12,000 websites to make sure the ads weren’t appearing on controversial sites that could lead to bad press: so-called “fake news” sites, formerly known as propaganda sites.

In total, the intern flagged about 7,000 ads.

The result: Chase now gets the same engagement from ads on the 5,000 intern-approved sites as it did from 400,000 sites, many of which featured propaganda. And its brand is no longer housed alongside so-called “fake news.”

Who was that intern?

The Times story is not about this intern: it’s a friendly story about how Chase is making the same profits using fewer ads.

But, in the new world of digital labor, that intern is the most interesting detail. What was that person’s day like? How much did he or she get paid? What was it like to look through 12,000 websites?

Ladders reached out to JPMorgan to learn more about the intern. We’re glad to report that she seems okay. To outsiders, clicking through thousands of websites sounds terrible, but it helps when that work gets recognized. And for Elisabeth Barnett, it did.

Barnett, a recent college graduate, is that intern; she works as a media marketing analyst in the Chase Leadership Development Program.

According to Chase’s Chief Communications Officer Trish Wexler, Barnett was “really proud” that she helped “make a real difference in resisting fake news.”

Wexler told Ladders that she is “not sure” how Barnett managed her time, but the bank plans to find out and clone it.

For her hard work, Barnett was taken out to lunch by her supervisor and even received a public shout-out from JPMorgan Chase’s Chief Marketing Officer, Kristin Lemkau.

The overarching goal of the Chase Leadership Development Program, as with most internships, is for participants to get hired.

“At the end of this we want to hire them and we want to make sure they had a good experience,” Wexler said. Having been recognized by a top executive for her work, Barnett seems well on her way there.

Digital labor can be the hardest to define

While automation is spreading into some white-collar professions, like banking and law, other kinds of work don’t hand over to machines as easily. There are still many simple, repetitive tasks we give to humans because we haven’t figured out how to automate them.

This intern’s work is a case in point: algorithms were set up to distribute Chase’s advertisements across a huge variety of websites, but those algorithms cannot reliably tell real news websites from fake news or propaganda sites. This is where humans are necessary: to make the judgments about quality that algorithms can’t.
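
For readers who want a concrete picture, here is a minimal, purely hypothetical sketch of what “whitelisting” means in practice: instead of letting the programmatic system buy space on any site in its inventory, the buyer only bids on domains a human reviewer has approved. None of the names or numbers below come from Chase; they are invented for illustration.

```python
# Hypothetical sketch of whitelist-based ad buying.
# Nothing here reflects JPMorgan Chase's real tooling; all domains and numbers are made up.

# Domains a human reviewer has approved after clicking through them.
approved_whitelist = {
    "example-news.com",
    "example-finance.com",
    "example-sports.com",
}

# Placements the programmatic system would otherwise buy blindly.
candidate_placements = [
    {"domain": "example-news.com", "cpm": 2.10},
    {"domain": "totally-real-news.example", "cpm": 0.40},  # a propaganda-style site
    {"domain": "example-finance.com", "cpm": 3.25},
]

def filter_placements(placements, whitelist):
    """Keep only placements on domains a human has approved."""
    return [p for p in placements if p["domain"] in whitelist]

if __name__ == "__main__":
    for placement in filter_placements(candidate_placements, approved_whitelist):
        print(f"Bid on {placement['domain']} at ${placement['cpm']:.2f} CPM")
```

The human work in the story is building and maintaining that approved list; the filtering itself is trivial once the judgments have been made.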

What’s so bad about that? Well, one big concern: if you’re a human given a task meant for a robot, you start to feel like one. The clearest example is content moderators, who have sued companies like Facebook, Google and Microsoft over jobs that expose them to hundreds of thousands of traumatic images a day. Content moderators in the Philippines click through the worst of humanity to make sure it doesn’t show up in your Facebook and YouTube feeds. This invisible labor force is estimated to be “half the total workforce for social media sites,” according to a Wired article, yet the companies say very little publicly about the psychological toll.