How Nextdoor Addressed Racial Profiling on Its Platform

May 11, 2018 Phil Simon


On March 3, 2015, hyperlocal social network Nextdoor announced that it had raised $110 million in venture capital. The deal valued the company at more than $1 billion, conferring coveted unicorn status. It had to be a giddy moment for CEO Nirav Tolia and co-founders David Wiesen, Prakash Janakiraman, and Sarah Leary. But just three weeks later, all of that celebrating must have seemed like a distant memory.

The news site Fusion ran an article explaining how Nextdoor “is becoming a home for racial profiling.” Reporter Pendarvis Harshaw detailed how presumably white members were using Nextdoor’s crime and safety forum to report “suspicious” activities by African Americans and Latinos. Jennifer Medina of The New York Times followed up, reporting that “as Nextdoor has grown, users have complained that it has become a magnet for racial profiling, leading African-American and Latino residents to be seen as suspects in their own neighborhoods.”

As I discuss in my book Analytics: The Agile Way, how Nextdoor responded illustrates not only the importance of reacting quickly in a crisis, but how useful a data-driven, agile approach can be.

Agile teams benefit from different perspectives, skills, and expertise, so the co-founders assembled a small, diverse team to tackle the issue. Members included product head Maryam Mohit, communications director Kelsey Grady, a product manager, a designer, a data scientist, and later a software engineer.

Much of the relevant data was unstructured text, which, especially at first, doesn't lend itself to the kind of straightforward analysis that structured data does. That goes double when the issue is as thorny as racial profiling. Five employees were assigned to read through thousands of user posts.

The outcome of all of this was a three-pronged solution: diversity training for Nextdoor’s neighborhood operations team; an update to Nextdoor’s community guidelines and an accompanying blog post; and a redesign of the app. This last step proved to be the thorniest.

Nextdoor had long allowed people to flag inappropriate posts, whether for their content or their location. For instance, commercial posts don't belong in noncommercial areas of the site. Nextdoor realized that a binary flag (i.e., flagged or not flagged) was no longer sufficient. Its first attempt at a fix was simply to add a "report racial profiling" button. But many users didn't understand the new feature. "Nextdoor members began reporting all kinds of unrelated slights as racial profiling. 'Somebody reported her neighbor for writing mean things about pit bulls,' Mohit recall[ed]."
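The shift described here, from a yes/no flag to a structured set of report reasons, is a common data-modeling move. A minimal sketch of the idea, with hypothetical category names (not Nextdoor's actual schema), might look like this:

```python
from enum import Enum

class ReportReason(Enum):
    """Hypothetical report categories: the single 'flagged' bit
    becomes one of several distinct, analyzable reasons."""
    COMMERCIAL_IN_WRONG_AREA = "commercial_in_wrong_area"
    OFF_TOPIC = "off_topic"
    RACIAL_PROFILING = "racial_profiling"
    OTHER = "other"

def file_report(post_id: int, reason: ReportReason) -> dict:
    """Record a report with a structured reason instead of a bare flag,
    so downstream analysis can slice reports by category."""
    return {"post_id": post_id, "reason": reason.value}
```

With categories in place, a data team can count profiling reports separately from ordinary moderation noise, which is exactly what a binary flag cannot do.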

The team responded by developing six different variants of its app and testing them. Doing so helped the company answer key questions such as:

  • If the app alerted users about the potential for racial bias before they posted, would it change user behavior?
  • Characterizing a person isn't easy. How does an application prompt its users for descriptions of others that are full and fair, rather than focused exclusively on race?
  • In describing a suspicious person, how many attributes are enough? Which specific attributes are more important than others?

In keeping with Lean methods, the team conducted a series of A/B tests. Blessed with a sufficiently large user base, Nextdoor ran experiments to determine the right answers to these questions. For instance, consider 50,000 users split into two cohorts, A and B, of 25,000 each. Each cohort would see a version of the Nextdoor app with slight but important differences in question wording, order, required fields, and the like.
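An A/B test like the one described above is typically evaluated with a two-proportion z-test: compare the rate of some outcome (say, profiling reports per post) between the cohorts and check whether the difference is statistically significant. A minimal sketch, using illustrative numbers rather than Nextdoor's actual data:

```python
from math import sqrt

def two_proportion_z(success_a: int, n_a: int, success_b: int, n_b: int) -> float:
    """Z-statistic for the difference in rates between cohorts A and B,
    using the pooled proportion for the standard error."""
    p_a, p_b = success_a / n_a, success_b / n_b
    p_pool = (success_a + success_b) / (n_a + n_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    return (p_a - p_b) / se

# Illustrative counts (not Nextdoor's): 400 vs. 300 profiling
# reports out of 25,000 posts in each cohort.
z = two_proportion_z(400, 25000, 300, 25000)
significant = abs(z) > 1.96  # two-sided test at the 95% confidence level
```

With cohorts this large, even a modest difference in rates clears the significance threshold, which is why a big user base makes this kind of experimentation so effective.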

Over the course of three months, Nextdoor’s different permutations made clear that certain versions of the app worked far better than others. And by August 2015, the team was ready to launch a new posting protocol in its crime and safety section. Users who mentioned race when posting to “Crime & Safety” forums were prompted to provide additional information, such as hair, clothing, and shoes.
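The posting protocol described above amounts to a validation rule: if a description mentions race, require additional descriptive attributes before accepting the post. A simplified sketch of that logic, with hypothetical term lists and field names (Nextdoor's real implementation is not public):

```python
# Hypothetical, simplified term and field lists for illustration only.
RACE_TERMS = {"black", "white", "hispanic", "latino", "asian"}
REQUIRED_EXTRA_FIELDS = {"hair", "clothing", "shoes"}

def needs_more_detail(description: str, provided_fields: set) -> bool:
    """Return True if the description mentions race but the poster has not
    yet supplied the extra descriptive fields the protocol requires."""
    words = {word.strip(".,!?").lower() for word in description.split()}
    mentions_race = bool(words & RACE_TERMS)
    return mentions_race and not REQUIRED_EXTRA_FIELDS.issubset(provided_fields)
```

The point of the friction is not to block posts but to nudge users toward fuller, fairer descriptions, exactly the question the team's six app variants were designed to answer.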

No, simply adding additional details and a little bit of user friction did not eliminate posts by insensitive people or racist users with axes to grind. But by taking a data-oriented and agile approach to design, the company reported it had reduced racial profiling by 75 percent.

Nextdoor was able to stem the bleeding in a relatively short period of time. Many organizations would have announced plans to “study the problem” while it continued unabated. Nextdoor took a different approach, and the results speak for themselves.
