Taylor Swift Gets Into Some Creepy Stalking of Her Own with AI

When looking at artificial intelligence, step back, find out what it really does, and consider potential consequences.

BY Erik Sherman - 14 Dec 2018

PHOTO CREDIT: Getty Images

Taylor Swift attracts attention, success, and money, as well as stalkers. There is apparently a list of people who have become fixated on the pop star, at times creepily and frighteningly so.

If you're Swift or part of her organization, you take such things seriously and look for ways to regain control. In this case, according to a Rolling Stone report on Monday, the solution was an artificial intelligence facial recognition system hidden inside a kiosk during the May stop of her 2018 Reputation Stadium Tour at the Rose Bowl.

The kiosk reportedly showed rehearsal clips of the show. As people stopped to watch, a camera in the kiosk surreptitiously captured their images and transferred the files to a location in Nashville. A.I. software then compared the images to a "database of hundreds of the pop star's known stalkers," as someone who saw a demo of the system told Rolling Stone.

Hundreds of stalkers. That is a staggering number, and it's understandable that Swift and those working for her would be concerned.

But the approach is disturbing in its own way. Apparently there was no notification and no attempt at informed consent. Nor did concertgoers receive any explanation of how long the captured images would be kept.

As Newsweek noted, it isn't clear whether the entire operation was legal or an infringement of privacy rights.

Even if it was legal, there is the question of what happens when the software incorrectly matches people to suspected stalkers. In July, Amazon landed in hot water when the ACLU used the company's Rekognition software to match members of Congress against a database of publicly available mug shots.

More than five percent of the members of Congress matched at least one face among the mug shots. At the time, people of color made up 20 percent of Congress but accounted for 39 percent of the false matches, including civil rights icon John Lewis.
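Why do such false matches happen? Facial recognition systems typically reduce each face to a numeric vector (an "embedding") and flag a match when two vectors are similar enough. The toy sketch below, with made-up three-number embeddings and a hypothetical 0.8 cutoff, shows how a loose threshold can flag an innocent lookalike; it is an illustration of the general idea, not how any particular vendor's system works.

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two equal-length embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# Hypothetical face embeddings: one watchlist entry and two bystanders.
watchlist_face = [0.9, 0.1, 0.4]
bystander_similar = [0.8, 0.2, 0.5]   # happens to resemble the watchlist face
bystander_distinct = [0.1, 0.9, 0.2]  # clearly different

for name, face in [("similar bystander", bystander_similar),
                   ("distinct bystander", bystander_distinct)]:
    score = cosine_similarity(watchlist_face, face)
    # A loose threshold (here 0.8) flags innocent lookalikes as matches.
    flagged = score >= 0.8
    print(f"{name}: score={score:.2f}, flagged={flagged}")
```

Real systems face the same trade-off at scale: lower the threshold and you catch more real stalkers but sweep in more innocent fans; raise it and you miss the people you were watching for.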

So Swift's team may have deployed a technology that can be wildly inaccurate, and done so in a way that might or might not be legal. That's a double problem: potentially stirring up a legal hornet's nest while getting inaccurate results that could create new problems and undercut the point of the exercise in the first place.

Too many people are far too comfortable adopting cutting-edge technology while assuming they understand how well it works and all the implications of employing it. That's bad management, and it's all too common.

Vendors, and anyone else with an axe to grind about a given technology, will tell you it's everything people claim it to be. Basing decisions on the exhortations of salespeople is a mistake. If you don't understand the technology well enough to do the research yourself, find someone who can do it for you. And talk to a lawyer about the implications of anything novel, particularly when it involves capturing data from the public.

Will Swift get hurt by something like this? Probably not, and if it is a problem, she has the money to get experts who can deal with it. You and your company are another matter. The best way to address problems is not to let them occur in the first place.
