
Why Siri Can’t Find Abortion Clinics

This week, a lot was made in the news about Siri’s supposed pro-life leanings. Essentially, a bunch of people got upset because Siri couldn’t find a local abortion clinic, even though abortion clinics don’t actually call themselves that. Apple denied that Siri had any pro-life leanings whatsoever, saying instead that the service was in “beta.”

So what really happened? Well, Apple just learned its first lesson about search: you’re held responsible when the information people expect to see doesn’t show up in the results, even if that information is only tangentially related to the actual words in the query. It’s a headache Google’s been dealing with for almost a decade.

Danny Sullivan over at Search Engine Land explains:

First, Siri doesn’t have answers to anything itself. It’s what we call a “meta search engine,” which is a service that sends your query off to other search engines.

Siri’s a smart meta search engine, in that it tries to search for things even though you might not have said the exact words needed to perform your search. For example, it’s been taught to understand that teeth are related to dentists, so that if you say “my tooth hurts,” it knows to look for dentists.

Unfortunately, the same thing also makes it an incredibly dumb search engine. If it doesn’t find a connection, it has a tendency to not search at all.

When I searched for condoms, Siri understood those are something sold in drug stores. That’s why it came back with that listing of drug stores. It knows that condoms = drug stores.

It doesn’t know that Plan B is the brand name of an emergency contraception drug. Similarly, while it does know that Tylenol is a drug, and so gives me matches for drug stores, it doesn’t know that acetaminophen is the chemical name of Tylenol. As a result, I get nothing.
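
To make Sullivan’s point concrete, here’s a minimal sketch of the kind of concept-to-category lookup he’s describing. To be clear, this is a hypothetical toy model, not Apple’s actual implementation; the table, the function, and the messages are all invented for illustration:

```python
# A toy model of a concept-to-category lookup with no fallback.
# Nothing here is Apple's code; names and mappings are invented.

# Hand-curated table of concepts the assistant "knows about," each
# mapped to the local-business category it should search for.
CONCEPT_TO_CATEGORY = {
    "tooth": "dentists",
    "condoms": "drug stores",
    "tylenol": "drug stores",
}

def local_search(query: str) -> str:
    """Route a spoken query to a business category, Siri-style."""
    for word in query.lower().split():
        if word in CONCEPT_TO_CATEGORY:
            # Known concept: hand off to a partner engine like Yelp.
            return f"Searching nearby for: {CONCEPT_TO_CATEGORY[word]}"
    # Unknown concept: instead of falling back to a plain web search,
    # the assistant simply gives up. This is the "incredibly dumb" half.
    return "Sorry, I couldn't find any matches."

print(local_search("my tooth hurts"))           # finds dentists
print(local_search("where can I buy condoms"))  # finds drug stores
print(local_search("I need acetaminophen"))     # no mapping, so nothing
```

In this toy model, closing the acetaminophen or Plan B gap is just a matter of adding rows to the table, which is presumably the sort of iteration Apple has in mind when it calls Siri a beta.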

In other words, Siri’s having something of an uncanny valley problem. It’s only as smart as the search engines it’s linked to, like Wolfram Alpha and Yelp, but because the voice recognition is so good and the way Siri interacts with you is so lifelike, people expect her to be as smart as a person… even if she isn’t one.

So when Apple says “Siri is a beta,” it means it. Just like Google had to, Apple needs to learn from experience and program Siri to understand what results to give for queries like “I’ve been raped” or “I want to buy some rubbers.” Give it time, and it will. But just because Apple hasn’t yet anticipated every possible question users might ask Siri doesn’t mean it has an axe to grind against various philosophies, creeds, races, and religions.


Siri could become a political minefield for Apple

Associated Pregnancy & Abortion Information Services, the only result returned, is actually an antiabortion crisis pregnancy center.

Apple addressed criticism on Wednesday night that Siri, the virtual assistant built in to every iPhone 4S, couldn’t retrieve location information for abortion clinics when asked and instead returned results for antiabortion centers. Via a statement to the New York Times, Apple attempted to depoliticize the omission, saying that Siri’s inaccuracy was “not meant to offend anyone” and is just a result of Siri’s being a beta product. Apple said it is looking for “places where [it] can do better, and . . . will in the coming weeks.”

Apple’s official statement is a perfect example of how to respond to a politically charged situation with the textual equivalent of plain-Jane oatmeal. It even avoids coming right out and saying for certain that Apple will add locations like abortion clinics in future iterations of Siri. The statement does suggest that might be the case, however, framing Apple’s efforts as a neutral pursuit of providing access to as much relevant information as possible.

This problem had begun to spin out of control before Apple stepped in, with pro-choice organizations seeking formal explanations. Their response is measured, however; NARAL Pro-Choice America Foundation President Nancy Keenan admits in her letter to Apple CEO Tim Cook that “Siri is not the principal resource for women’s health care” but still says the omission is troubling.

NARAL and Keenan won’t be the last to complain, either. Siri is, for better or for worse, a search tool, especially at the local level. That role will become even more pronounced as Apple rolls out international localization for Siri’s location-aware features, like directions and facility finding. Then, what Siri can and can’t do will be subject to even more scrutiny by international social justice organizations, governments and politically active individuals.

The situation should be familiar to Apple: It faces criticism all the time for apps it does and does not allow on the App Store and for its controversial role as something of a moral arbiter when it comes to App Store content policies. But Siri could potentially be even more of a minefield, with plenty of opportunity for making missteps in tightly controlled political climates like that of China, which also happens to be one of Apple’s most important markets. If Apple thinks it can depoliticize a search tool in that country, it should talk to Google.

Apple may be trying its best to keep politics out of its tech products, but its statement can’t help but ring a little hollow; Siri has existed as a service for four years, after all, and has been in development at Apple for 18 months. The beta label is insurance more than anything, to be invoked in situations just like this one; it’s not a practical reason these specific locations should be omitted while their ideological counterparts are included. Plus, Apple didn’t even say for certain that abortion clinics specifically would be included in future results.

Even if it is just an honest mistake, it’s the first major flare-up of what will become a world of hurt for Apple, especially if Siri succeeds and becomes a go-to resource for iPhone users. Siri as a helpful, sometimes funny and occasionally befuddled virtual assistant is a public relations victory and a service people will appreciate; Siri as a willfully blind disseminator of a particular political perspective, real or perceived, is another situation entirely.
