Technically Wrong: Sexist Apps, Biased Algorithms, and Other Threats of Toxic Tech

Author: Sara Wachter-Boettcher

Buying groceries, tracking our health, finding a date: whatever we want to do, odds are that we can now do it online. But few of us realize just how many oversights, biases, and downright ethical nightmares are baked inside the tech products we use every day. It’s time we change that. In Technically Wrong, Sara Wachter-Boettcher demystifies the tech industry, leaving those of us on the other side of the screen better prepared to make informed choices about the services we use, and to demand more from the companies behind them. A Wired Top Tech Book of the Year. A Fast Company Best Business and Leadership Book of the Year.

  • Paperback
  • 240 pages
  • Technically Wrong: Sexist Apps, Biased Algorithms, and Other Threats of Toxic Tech
  • Sara Wachter-Boettcher
  • First published 3 October 2017
  • 9780393356045

About the Author: Sara Wachter-Boettcher

Sara Wachter-Boettcher is a web consultant based in Philadelphia and the author of the forthcoming Technically Wrong from W. W. Norton, as well as two books for web professionals: Design for Real Life (with Eric Meyer) and Content Everywhere. She helps organizations make sense of their digital content and speaks at conferences worldwide.



10 thoughts on “Technically Wrong: Sexist Apps, Biased Algorithms, and Other Threats of Toxic Tech”

  1. John says:

    Well! This is another one of those funny books that is sort of a “5” and sort of a “3.” The book broadly claims that the tech industry builds interfaces and products that are biased, though not necessarily intentionally so. The book says that the main driver is the homogeneity of tech company investors and employees. There is no doubt in my mind that this is true, and on that basis I’d recommend this to anyone in or outside of tech. We product builders and designers are doing a crap job of acknowledging the incredibly broad types of people and styles of interaction out there. Because of tech’s homogeneity, there’s so much stuff that just isn’t thought about critically, e.g., image analysis software not being able to analyze non-white faces. But, as I’ll get into in a moment, I would very strongly recommend this to historians of technology as a little guide to problems that deserve significantly more research. The author’s a web consultant, but I think we need to bring out the scholars. There’s good stuff here about geography and the 2010s; more reports of personal experiences would make the story even more valuable. I keep thinking back to another book I reviewed, Turco’s The Conversational Firm, which shows how far we can get with ethnographic strategies.

    There are some arguments here that are very dear to my heart. For example, on p. 137 and in chapter 3, the author notes how engineers and product designers will focus on the main flow and minimize the importance of “edge cases.” Say 80% of the users are young and only 20% are old, perhaps needing bigger fonts. Well, the company is going to focus on where the money is, so font-changing features may be downplayed. The author rightly stresses harm and consequences: even though the 20% might not be where the money is, the negative consequences of not helping them out with a useful UI can cause a lot of damage. One area I have been concerned about is privacy and security in healthcare. Say a login code is sent to an email address, but that address might belong to a shared account. For the most part this is probably not troubling; the user “opted in” by supplying that email. But should we work harder to ensure that only the individual can access that account? What if it’s a shared account and medical details about domestic violence make it to that address? Again, say the patient has signed a consent to allow that message to go via email to a particular address. Should that minority example make us very concerned to protect the “minority” user pattern? I think so. The book does a good job walking the reader through this.

    But I have some concerns. Geography: time and time again, the examples lean towards West Coast companies (Uber, Facebook, Twitter, etc.). There are some exceptions, but I’d like to know: if the California tech culture is so bad, are there other places that are better? Timespan: is this a particularly bad moment? Wachter-Boettcher provides the appalling facts around the decline of women computer science majors: 37% in 1984, 18% in 2014. “I can’t pretend to know the precise reason for this shift” (p. 182). Me neither. But this book is so anchored in the present that it begs the question of how we would assess, say, the tech culture of the ’80s. I bet it was better. But was it? Just as an example: back in the day, Ann Wollrath was the lead author on the original RMI article. Big stuff. What was the culture? It would mean a lot if Wollrath told us that it was the same back then; then we might understand the core problem as a broader ill. Intentions: there are some good anecdotes here about how female voices are used for Siri, Alexa, Google Maps, etc. (pp. 36-38). Right, but what conclusion should we draw? “Women are expected to be more helpful than men. The more we rely on digital tools in everyday life, the more we bolster the message that women are society’s ‘helpers’” (p. 38). I get this. But then the author says, “Did the designers intend this? Probably not.” I protest: go out and interview the designers. What were their reasons? Apple in particular thinks hard about this stuff. What were the factors going into a female Siri, and how did they outweigh providing other Siris (male; accented; whatever)? I want to know. The book makes an insinuation, but I think there’s a real research task to be performed. Bring out the ethnographers!

    In large part the book is driven by articles in the tech media. The next step is to get out there and start quoting people on their individual experiences in order to test some claims: e.g., is the problem peculiar to tech in California in the 2010s? Or is it men in tech (more geography would help)? Or even a side effect of the investment structure and capitalism (this seems implicit in chapter 9)? And, in particular, figure out where people are doing it right and why. The one positive example given in the book is Slack, but I’m not going to give much quarter there: Slack was produced by advertising and by exploiting the corporate customer’s desire to control discourse in the company, not by inclusiveness.

  2. Rachel says:

    I want to qualify my rating of this book. If you haven’t previously thought about sexism, racism, or other forms of discrimination in the tech industry, this is a five-star recommendation. However, as someone who regularly reads about this topic and pays attention to tech news, I encountered very little new information in this book. It was also a bit disappointing to see so much focus on recent big news stories (e.g., the Google Photos categorization fail, Uber sexism and spying, Facebook Year in Review) rather than a wider range of companies and in-depth looks at what went wrong, how it happened, and how companies are, or could be, doing things differently. So I wasn’t blown away by the book, but it holds valuable information for some folks, and I just might be the wrong audience.

  3. ☘Misericordia☘ ~ The Serendipity Aegis ~ ⚡ϟ⚡ϟ⚡⛈ ✺❂❤❣ says:

    Some interesting concepts: Normalizing. Edge vs. Stress. The ‘select one’ being the problematic idea. The default settings of our lives. And everything else: metrics are only as good as the goals and intentions that underlie them (c). Inappropriate, trying-too-hard, chatty tech products (c). “Marketing negging.” The unlikely delights of a ‘1-800-Flowers purchase particularly relevant to a Scorpio’ (c). DAUs/MAUs/CAUs. Quite a lot of problematic issues, precisely the ones that lead diversity intentions to ruin. Some ludicrous ideas, like going on about how to connect with a made-up persona, made up specifically to connect with. I’m calling this one a BS job. Also a lot of genuinely good material. Here go handpicked examples of both.

    There, there, dear. Don’t worry about what we’re doing with your account. Have a balloon (c).

    Back in 2011, if you told Siri you were thinking about shooting yourself, it would give you directions to a gun store (c). Now I’m tempted to use Siri. Attabot, girl!

    Far too many people in tech have started to believe that they’re truly saving the world. Even when they’re just making another ride-hailing app or restaurant algorithm (c). I’m pretty sure that goes way beyond that. And even beyond the tech.

    I’m writing this in the wake of the 2016 presidential election—an election that gave us an American president who is infamous for allegations of sexual assault, racism, conflicts of interest, collusion, and angry Tweetstorms, and who rode to power on a wave of misinformation (c). The problem was that there were two very problematic candidates, not just one. Hah. Another problem is that people actually expect Facebook or Twitter or some other shit to tell them how to vote. Problem numero trece is that people seem to actually believe that the flow of trash called ‘news’ on FB is ‘news’. So? How very comfy to blame the tech.

    You don’t need a computer science degree or a venture capital fund. You don’t need to be able to program an algorithm. All you need to do is slough away the layers of self-aggrandizement and jargon and get at the heart of how people in technology work—and why their decisions so often don’t serve you (c). That’s actually not true. Self-aggrandizement and jargon: all of that is just perception, which might be skewed or not. Understanding is the key.

    We’ll take a closer look at how the tech industry operates and see how its hiring practices and work culture create teams that don’t represent most of us—no matter how many “diversity” events these companies put on (c). Why should they hire someone who represents anything, instead of someone who’s able to do the job? Diversity is about not refusing to hire a capable young mother or someone of another race. Hiring representatives is a totally different opera.

    Designers and technologists don’t head into the office planning to launch a racist photo filter or build a sexist assumption into a database (c). LOL.

    She spent the next hour listening to older men tell her about the “female market.” The men in the room insisted that most women really care about leisure-time activities (c). Now this must have been fun.

    Even though the company had forty-odd employees and had been in business more than a decade, no staff member had ever been pregnant. “We have three other women of childbearing age on our team, and we don’t want to set a precedent,” the owner told her, as if pregnancy were some sort of new trend (c). Wowser. These guys must have grown on trees. Some rotten fruits.

    The two teams with lots of women on staff were sent an email by a board member asking them to “put together some kind of dance routine” to perform at the company presentation. The heads of each department, all men, stood up and talked about their successes over the course of the year. The only women who graced the stage were a group of her peers in crop tops and hot pants. The men in the audience wolf-whistled while the women danced (c). That’s some company.

    Amélie Lamont, whose manager once claimed she hadn’t seen her in a meeting: “You’re so black, you blend into the chair,” she told her (c). Damn. I’ve actually once had a very similar discussion. I’ve never before or after wanted so much to suggest that that reviewer should buy the effing glasses and spare me the bullshit.

    Tech is also known for its obsession with youth—an obsession so absurd that I now regularly hear rumors about early-thirties male startup founders getting cosmetic surgery so that investors will think they’re still in their twenties (c). Yep, that’s a fact.

    Other companies start their workdays with all-staff meetings held while everyone does planks—the fitness activity where you get on the ground, prop yourself up by your feet and elbows, and hold the position until your abs can’t handle it any more. If you’re physically able to plank, that is. And you’re not wearing a dress. Or feeling modest. Or embarrassed. Or uncomfortable getting on your hands and knees at work (c). Ridiculous. Riddikulus!

    I’m not interested in ping-pong, beer, or whatever other gimmick used to attract new grads. The fact that I don’t like those things shouldn’t mean I’m not a “culture fit.” I don’t want to work in tech to fool around. I want to create amazing things and learn from other smart people. That is the culture fit you should be looking for (c). Golden words.

    The good news is, there’s actually no magic to tech. As opaque as it might seem from the outside, it’s just a skill set—one that all kinds of people can, and do, learn. There’s no reason to allow tech companies to obfuscate their work, to call it special and exempt it from our pesky ethics. Except that we’ve never demanded they do better (c). And except that many of us don’t really bother learning how stuff works. Had these companies disclosed all their proprietary code today, not many of us would know how to make head or tail of it.

    Are you a “Kelly,” the thirty-seven-year-old minivan mom from the Minneapolis suburbs? Or do you see yourself as a “Matt,” the millennial urban dweller who loves CrossFit and cold-brew coffee? Maybe you’re more of a “Maria,” the low-income community college student striving to stay in school while supporting her parents. No? Well, this is how many companies think about you (c). Now that’s a great point.

    She test-drove some menstrual cycle apps, looking for one that would help her get the information she needed. What she found wasn’t so rosy. Most of the apps she saw were splayed with pink and floral motifs, and Delano immediately hated the gender stereotyping. But even more, she hated how often the products assumed that fertility was her primary concern—rather than, you know, asking her (c). LOL. It wasn’t rosy, it was pink and florid.

    Glow works well for women who are trying to get pregnant with a partner. But for everyone else, both services stop making sense—and can be so alienating that would-be users feel frustrated and delete them (c). Well, frankly, I don’t think the right problem is being highlighted here. Glow might be cheesy. It also actually was initially rolled out for women trying to get pregnant, so IMO women who aren’t might do better choosing some other app. No shit, Sherlock. Every single app doesn’t have to be a multitool capable of Python coding, getting one pregnant, and building spaceships. The problem likely is that the market either doesn’t clearly specify the alternative needs and the apps applicable to other cases, or does have voids in some respects. That’s actually both a problem and a business opportunity.

    What happens when those someones are the people we met in Chapter 2: designers and developers who’ve been told that they’re rock stars, gurus, and geniuses, and that the world is made for people like them? (c) The Big Flip-Flop?

    But when default settings present one group as standard and another as “special”—such as men portrayed as more normal than women, or white people as more normal than people of color—the people who are already marginalized end up having the most difficult time finding technology that works for them (c). Amen.

    If you’ve designed a cockpit to fit the average pilot, you’ve actually designed it to fit no one. So what did the air force do? Instead of designing for the middle, it demanded that airplane manufacturers design for the extremes instead—mandating planes that fit both those at the smallest and the largest sizes along each dimension. Pretty soon, engineers found solutions to designing for these ranges, including adjustable seats, foot pedals, and helmet straps—the kinds of inexpensive features we now take for granted (c).

    When designers call someone an edge case, they imply that they’re not important enough to care about—that they’re outside the bounds of concern. In contrast, a stress case shows designers how strong their work is—and where it breaks down (c). Edge vs. Stress makes for an interesting dichotomy.

    I saw race and ethnicity menus that couldn’t accommodate people of multiple races. I saw simple sign-up forms that demanded to know users’ gender, and then offered only male and female options. I saw college application forms that assumed an applicant’s parents lived together at a single address (c). Fucked-up design. And not just design.

    Take Shane Creepingbear, a member of the Kiowa tribe of Oklahoma. In 2014 he tried to log into Facebook, but rather than being greeted by his friends’ posts like usual, he was locked out of his account and shown this message: Your Name Wasn’t Approved. Adding to the insult, the site gave him only one option, a button that said “Try Again.” There was nowhere to click for “This is my real name” or “I need help.” Facebook also rejected the names of a number of other Native Americans: Robin Kills the Enemy, Dana Lone Hill, Lance Brown Eyes. In fact, even after Brown Eyes sent in a copy of his identification, Facebook changed his name to Lance Brown (c). Oh, this is top.

    There’s still the fact that Facebook has placed itself in the position of deciding what’s authentic and what isn’t—of determining whose identity deserves an exception and whose does not (c). Which is quite obviously bonkers.

    People who identify as more than one race end up having to select “multiracial.” As a result, people who are multiracial end up flattened: either they get lumped into a generic category stripped of meaning, or they have to pick one racial identity to prioritize and effectively hide any others. They can’t identify the way they would in real life, and the result is just one more example of the ways people who are already marginalized feel even more invisible or unwelcome (c).

    When you remember how few people change the default settings in the software they use, Facebook’s motivations become a lot clearer. Facebook needs advertisers. Advertisers want to target by gender. Most users will never go back to futz with custom settings. So Facebook effectively designs its onboarding process to gather the data it wants in the format advertisers expect. Then it creates its customizable settings and ensures it gets glowing reviews from the tech press, appeasing groups that feel marginalized—all the while knowing that very few people, statistically, will actually bother to adjust anything. Thus it gets a feel-good story about inclusivity while maintaining as large an audience as possible for advertisers. It’s a win-win, if you’re Facebook or an advertiser, that is (c).

    It was cute, unless you wanted to react to a serious post and all you had was a sad Frankenstein (c). Quite the company.

    “Hi Tyler,” one man’s video starts, using title cards. “Here are your friends.” He’s then shown five copies of the same photo. The result is equal parts funny and sad—like he has just that one friend. It only gets better (or worse, depending on your sense of humor) from there. Another title card comes up, “You’ve done a lot together,” followed by a series of photos of wrecked vehicles, culminating in a photo of an injured man giving the thumbs-up from a hospital bed. I suppose Facebook isn’t wrong, exactly: getting in a car accident is one definition of “doing a lot together” (c). This is both hilarious and horrifying.

    You can probably guess what went wrong in one: Facebook created a montage of a man’s near-fatal car crash, set to an acoustic jazz ditty. Just imagine your photos of a totaled car and scraped-up arms, taken on a day you thought you might die, set to a soft scat vocal track. Doo-be-doo, duh-duh, indeed (c).

    Tumblr: “Beep beep! #neo-nazis is here,” it read. A Tumblr employee told Rooney that it was probably a “what you missed” notification: Rooney had previously read posts about the rise in fascism, and the notification system had used her past behavior to predict that she might be interested in neo-Nazi content. Another Tumblr user shared a version of the notification he received: “Beep beep! #mental illness is here” (c). Well, this is what happens when people are being treated as kids by apps.

    Maybe I’m the only one who’s just not interested in snotty comebacks from my phone (though I doubt it). Why would anyone want their credit card offers to be dependent on the weather? What precisely would we do to make a 1-800-Flowers purchase particularly relevant to a Scorpio? “How the hell did I end up here?” (c) Delight is a concept that’s been tossed around endlessly in the tech industry these past few years, and I’ve always hated it (c).

    What Facebook Thinks You Like: the extension trawls Facebook’s ad-serving settings and spits out a list of keywords the site thinks you’re interested in, and why. There’s the expected stuff. Then there’s a host of just plain bizarre connections: “Neighbors (1981 Film),” a film I’ve never seen and don’t know anything about. A host of no-context nouns: “Girlfriend,” “Brand,” “Wall,” “Extended essay,” “Eternity.” I have no idea where any of this comes from—or what sort of advertising it would make me a target for. Then it gets creepy: “returned from trip 1 week ago,” “frequent international travelers.” I rarely post anything to Facebook, but it knows where I go, and how often (c).

    1,500 individual tidbits of information about you, all stored in a database somewhere and handed out to whoever will pay the price (c).

    The technology is based on deep neural networks, massive systems of information that enable machines to “see” much in the same way the human brain does (c). That’s not precisely correct.

    A future where Facebook AI listens in on conversations to identify potential terrorists, where elected officials hold meetings on Facebook, and where a “global safety infrastructure” responds to emergencies ranging from disease outbreaks to natural disasters to refugee crises (c). Welcome to the fishbowl.

  4. Vish Wam says:

    Why do apps and profile-info pages mostly come with only two gender options, male and female? What if someone doesn’t wish to be identified as either? Why is there still a vast underrepresentation of women and minorities in the tech sector? Why hasn’t there been a massive #MeToo rising in the tech industry across the world? If tech companies are largely run by white or Asian men, do the products they release also reflect the bias and stereotypes they believe in? From Uber’s severely regressive history in handling sexual harassment complaints to why Google Photos inadvertently turned out to be racist, the book is full of anecdotes on where the tech industry is messing up. If you are one who has felt that the tech sector needs to diversify, this book can help you better understand why, through stories from experts in the space. If you thought the tech industry’s idea of ‘meritocracy’ to recruit talent is fair, then this is a must-read for you too.

  5. Manzoor Elahi says:

    Most tech products are full of blind spots, biases, and outright ethical blunders. Like in the spring of 2015, when Louise Selby, a pediatrician in Cambridge, England, joined PureGym, a British chain. Every time she tried to swipe her membership card to access the women’s locker room, she was denied; the system simply wouldn’t authorize her. Finally, PureGym got to the bottom of things: the third-party software it used to manage its membership data—software used at all ninety locations across England—was relying on members’ titles to determine which locker room they could access. And the title “Doctor” was coded as male.

    In March of 2016, JAMA Internal Medicine released a study showing that the artificial intelligence built into smartphones from Apple, Samsung, Google, and Microsoft isn’t programmed to help during a crisis. The phones’ personal assistants didn’t understand words like “rape” or “my husband is hitting me.” In fact, instead of doing even a simple web search, Siri—Apple’s product—cracked jokes and mocked users. Back in 2011, if you told Siri you were thinking about shooting yourself, it would give you directions to a gun store. After getting bad press, Apple partnered with the National Suicide Prevention Lifeline to offer users help when they said something that Siri identified as suicidal. But five years later, no one had looked beyond that one fix. Apple had no problem investing in building jokes and clever comebacks into the interface from the start. But investing in crisis or safety? Just not a priority.

    Three commercially released facial-analysis programs from major technology companies demonstrate both skin-type and gender biases. In the researchers’ experiments, the three programs’ error rates in determining the gender of light-skinned men were never worse than 0.8 percent. For darker-skinned women, however, the error rates ballooned—to more than 20 percent in one case, and more than 34 percent in the other two. The findings raise questions about how today’s neural networks, which learn to perform computational tasks by looking for patterns in huge data sets, are trained and evaluated. For instance, according to the paper, researchers at a major US technology company claimed an accuracy rate of more than 97 percent for a face-recognition system they’d designed. But the data set used to assess its performance was more than 77 percent male and more than 83 percent white. (See Joy Buolamwini’s talk, “How I’m fighting bias in algorithms.”)

    Algorithms used by the police are better at identifying some racial groups than others. The few studies that have been done on facial-recognition software suggest a persistently lower accuracy rate for African American faces—usually about 5 to 10 percent lower than for white faces, and sometimes even worse. This raises concerns, given that African Americans are already overscrutinized by law enforcement. It suggests that facial-recognition technology is likely to be “overused on the segment of the population on which it underperforms.” The inaccuracies are troubling, but nothing new. Many readers might recall that back in 2010, consumer-grade facial-recognition software was famously failing to detect that Asian users had their eyes open, or that black users were in the frame at all. Facial-recognition software used in web services like Flickr and Google has tagged African Americans as primates. The algorithmic bias has been described as “the coded gaze” by Joy Buolamwini, an MIT Media Lab graduate student, in a nod to the literary and sociological term “the white gaze,” which describes seeing the world from the perspective of a white person and assuming, always, that your audience is white.

    From massive businesses like Walmart and Apple to fledgling startups launching new apps, organizations of all types use tools called personas—fictional representations of people who fit their target audiences—when designing their products, apps, websites, and marketing campaigns, so that, ideally, team members think about them regularly and internalize their needs and preferences. That’s great in theory, but when personas are created by a homogeneous team that hasn’t taken the time to understand the nuances of its audience, they often end up designing products that alienate audiences rather than making them feel at home.

    We can see a common example in the story of Fatima, a Middle Eastern American design strategist based in the Bay Area. As the project kicked off, Fatima sat down with the teams from both companies—and was literally the only woman at the table. Pretty soon someone started a video meant to show the product’s positioning. It was all flash: yacht parties, private jets, $2,000 shoes. Fatima cringed. The smartwatch they were designing was meant to target the midrange market. She spent the next hour listening to older men tell her about the “female market,” using tales of their wives’ shopping habits as proof. The team wanted to target women who are fashionable, tech-savvy, or both. About two-thirds of respondents were identified as the former, and half as the latter. Except the men refused to believe Fatima. As soon as she started presenting her data, they wrote her off. “Oh, 51 percent of the women can’t be tech-savvy,” they said. “I felt like I was in an episode of Mad Men. That’s a specific project, a physical piece of technology that would exist in the world or not, based on whether these men in the room accepted what I had to say or not,” she said. “They just weren’t willing to accept the research and use it as a foundation.” The project got shelved, and the brand partnered with a celebrity to design a smartwatch instead. It flopped. “It wasn’t based on needs; it was based on stereotypes,” Fatima said.

    This mind-set—where someone assumes they have all the answers about a product and leaves out anyone with a different perspective—isn’t rare. Scratch the surface at all kinds of companies—from Silicon Valley’s “unicorns” (startups with valuations of more than a billion dollars) to tech firms in cities around the world—and you’ll find a culture that routinely excludes anyone who’s not young, white, and male.

    Biased algorithms. Alienating online forms. Harassment-friendly platforms. All kinds of problems plague digital products, from tiny design details to massively flawed features. But they share a common foundation: a tech culture that’s built on white male values—while insisting it’s brilliant enough to serve all of us. Or, as they call it in Silicon Valley, “meritocracy.” The tech industry clings to meritocracy like a tattered baby blanket. David Sacks, an early executive at PayPal, claimed that “if meritocracy exists anywhere on earth, it is in Silicon Valley.” The meritocracy myth is particularly pernicious in tech because it encourages the belief that the industry doesn’t need to listen to outside voices—because the smartest people are always already in the room. This presumption quickly breeds a sort of techno-paternalism: when a group of mostly white guys from mostly the same places believes it deserves to be at the top, it’s also quick to assume that it has all the perspective it needs in order to make decisions for everyone else. Tied up in this meritocracy myth is also the assumption that technical skills are the most difficult to learn—and that if people study something else, it’s because they couldn’t hack programming. As a result, the system prizes technical abilities—and systematically devalues the people who bring the very skills to the table that could strengthen products, both ethically and commercially: people with the humanities and social science training needed to consider historical and cultural context, identify unconscious bias, and be empathetic to the needs of users.

    Originally, programming was often categorized as “women’s work,” lumped in with administrative skills like typing and dictation; in fact, during World War II, the word “computers” was often applied not to machines but to the women who used them to compute data. As colleges started offering computer science degrees in the 1960s, women flocked to the programs: 11 percent of computer science majors in 1967 were women. By 1984, that number had grown to 37 percent. Starting in 1985, that percentage fell every single year—until, in 2007, it leveled out at the 18 percent figure we saw through 2014. That shift coincides perfectly with the rise of the personal computer—which was marketed almost exclusively to men and boys. We heard endless stories about Steve Jobs, Bill Gates, Paul Allen—garage tinkerers, boy geniuses, geeks. Software companies, and soon after, internet companies, all showcased men at the helm, backed by a sea of techies who looked just like them. And along the way, women stopped studying computer science, even as more of them were attending college than ever before.

    You might assume that much of the attrition comes from women leaving to start or care for a family. Nope: only about 20 percent of those who quit SET leave the workforce. The rest either take their technical skills to another industry (working for a nonprofit or in education, say) or move to a nontechnical position. People call this the “leaky bucket”: when women and underrepresented groups leave because they’re fed up with biased cultures where they can’t get ahead.

    If the tech industry has acknowledged this problem and says it wants to fix it, why are the stats so slow to change? If you ask tech companies, they’ll all point to the same culprit: the pipeline. The term “pipeline” refers to the number of people who are entering the job market prepared to join the tech industry: those who are learning to code in high school and graduating from computer science or similar programs. If the pipeline doesn’t include enough women and people of color (though, honestly, many companies never get beyond talking about gender here), then tech companies simply can’t hire them. Or so the story goes. In a 2014 analysis, USA Today concluded that “top universities turn out black and Hispanic computer science and computer engineering graduates at twice the rate that leading technology companies hire them.” Adding to the problem, potential employers spend their time looking for a “culture fit”—someone who neatly matches the employees already in the company—which ends up reinforcing the status quo rather than changing it.

    In a 2014 report for Scientific American, Columbia professor Katherine W. Phillips examined a broad cross-section of research related to diversity and organizational performance. And over and over, she found that the simple act of interacting in a diverse group improves performance, because it “forces group members to prepare better, to anticipate alternative viewpoints and to expect that reaching consensus will take effort.” In one study that Phillips cited, published in the Journal of Personality and Social Psychology, researchers asked participants to serve on a mock jury for a black defendant. Some participants were assigned to diverse juries, some to homogeneous ones. Across the board, diverse groups were more careful with details than were homogeneous groups, and more open to conversation. When white participants were in diverse groups rather than homogeneous ones, they were more likely to cite facts rather than opinions, and they made fewer errors, the study found. In another study, led by Phillips and researchers from Stanford and the University of Illinois at Urbana-Champaign, undergraduate students from the University of Illinois were asked to participate in a murder-mystery exercise. Each student was assigned to a group of three, with some groups composed of two white students and one nonwhite student, and some composed of three white students. Each group member was given both a common set of information and a set of unique clues that the other members did not have. Group members needed to share all the information they collectively possessed in order to solve the puzzle. But students in all-white groups were significantly less likely to do so, and therefore performed significantly worse in the exercise. The reason is that when we work only with those similar to us, we often “think we all hold the same information and share the same perspective,” Phillips writes. “This perspective, which stopped the all-white groups from effectively processing the information, is what hinders creativity and innovation.”

    Uber may be an extreme example, but it can help us understand tech’s insular culture much more clearly: if tech wants to be seen as special—and therefore able to operate outside the rules—then it helps to position the people working inside tech companies as special, too. And the best way to ensure that happens is to build a monoculture where insiders bond over a shared belief in their own brilliance. That’s also why you see so many ridiculous job titles floating around Silicon Valley and places like it: “rock star” designers, “ninja” JavaScript developers, user experience “unicorns” (yes, these are all real). Fantastical labels like these reinforce the idea that tech and design are magical skill sets that those on the outside wouldn’t understand and could never learn. The reality is a lot more mundane: design and programming are just professions—sets of skills and practices, just like any other field. Admitting that truth would make tech positions feel a lot more welcoming to diverse employees, but tech can’t tell that story to the masses. If it did, then the industry would seem normal, understandable, and accessible—and that would make everyday people more comfortable pushing back when its ideas are intrusive or unethical. So tech has to maintain its insider-y, more-brilliant-than-thou feel—which affects who decides to enter that legendary “pipeline,” and whether they’ll stick around once they’ve arrived.

    Not every tech company looks at the world like Uber does, thank god. Just look at messaging app Slack, a darling of the startup world, with an office motto that’s refreshingly healthy: “Work hard and go home.” Slack is often described as a delight to use—but it’s a delight borne of nuance and detail, not shoved-down-your-throat cuteness. And the company got there by doing what so few tech companies seem to bother with: considering their users as real, whole people. One of the first things CEO Stewart Butterfield wants to know about when interviewing candidates for a position isn’t which programming languages they know or where they studied computer science. It’s whether they believe luck played a role in getting them where they are—whether they think their success is a product not just of merit and talent, but of good circumstances. His goal is simple: to build a team where people don’t assume they’re special. No rock stars, no gurus, no ninjas—just people who bring a combination of expertise, humility, and empathy. Slack doesn’t rely on believing that programmers are the chosen ones; in fact, Butterfield, who has a master’s degree in philosophy, is known for extolling the values of the liberal arts to anyone in tech who’ll listen. Lo and behold, that culture also leads to a more diverse staff: women held more than 40 percent of Slack’s management positions in 2016, and more than a fourth of engineering roles, too. Black people accounted for nearly 8 percent of engineers. Slack’s disarming honesty and disinterest in chest-thumping are antithetical to the way most of tech talks about itself. And it’s working: Slack is the fastest-growing business app ever.

  6. Tam says:

    A good and short read. Plenty of examples, but mostly the famous ones from the internet. The author’s alignment with the truly marginalized is limited, mostly to female, gay, transgender, and nonwhite people, but still the educated, unlike O’Neil in Weapons of Math Destruction: How Big Data Increases Inequality and Threatens Democracy, who places her heart with the poor and the abused, whose stories may not be heard at all, buried deep, powerless. The problems aren’t less worthy of discussion, though. The sexist and racist culture is so embedded, the privileges so taken for granted, the arrogance and the belief that tech people are the coolest and smartest and above everyone else so fierce. That needs to change.

  7. Kelly says:

    Nothing surprising here, but infuriating and important nonetheless: if you work in tech at all as a woman or person of color, you’ll recognize all of this. Well researched and well written. The sexism in algorithms is something I’ve not thought about, but damn, was that interesting.

  8. Jay says:

    Given the title of this book, I assumed it would focus exclusively on the problems of bias in software and machine learning. This has been in the news for quite a while, and on top of the news recently. While most of the book provides stories about bias, as I expected, a large part of the book was about various other behaviors: sexist, racist, illegal, and just bad. Think hiring at Uber. If you have kept up with these kinds of issues in Wired, Fast Company, and their ilk, you get many examples here, but not much by way of solutions. Despite that mild disappointment, I found the writing kept my interest, at least up until the end, when it felt like the author was reaching for things to write about. Good for helping an ITer, data scientist, or tech company exec think through how these issues may touch on your own company, products, and practices.

  9. Kate Kaput says:

    Long review coming. This book was my first Feminist Book Club delivery, and it was brilliant: techie, but written in a digestible, accessible, down-to-earth way for those of us who don’t work in tech. I had no idea of all these problems, like Google Photos identifying black faces as gorillas, mobile ads targeting people in low-income areas with ads for for-profit colleges, or a gym chain in Britain where a woman couldn’t get into the locker rooms because the locker rooms were coded by title (like Mr. or Mrs.), and hers, Doctor, was coded as male. This book tackles problems small and large, including how they occur and how they can be stopped. Read this!

  10. Amy Rhoda Brown says:

    This is a crystal-clear description of how the monoculture of tech leads to terrible apps, toxic online behaviour, and the failure of developers to take responsibility for what their decisions, based on their narrow worldview, have wrought. Easy to read, well laid out, and compelling.

Leave a Reply

Your email address will not be published. Required fields are marked *

10 thoughts on “Technically Wrong Sexist Apps Biased Algorithms and Other Threats of Toxic Tech

  1. John John says:

    Well This is another one of those funny books that is sort of a “5” and sort of a “3” The book broadly claims that the tech industry builds interfaces and products that are not necessarily intentionally biased The book says that the main driver is the homogeneity of tech company investors and employeesThere is no doubt in my mind that this is true and on that basis I’d recommend this to anyone in or outside of tech We product builders and designers are doing a crap job of acknowledging the incredibly broad types of people and styles of interaction out there Because of tech’s homogeneity there’s so much stuff that just isn’t thought about critically eg image analysis software not being able to analyze non white faces But as I’ll get into it in a moment I would very strongly recommend this to historians of technology as a little guide to problems that deserve significantly research The author’s a web consultant but I think we need to bring out the scholars There’s good stuff here about geography the 2010s reports of personal experiences would make the story even valuable I keep thinking about to another book I reviewed Tuco’s The Conversational Firm which shows how far we can get with ethnographical strategiesThere are some arguments here that are very dear to my heart For example on p 137 and chapter 3 the author notes how engineers and product designers will focus on the main experimental flow and minimize the importance of “edge cases” For example say 80% of the users are young and only 20% are old perhaps needing bigger fonts Well the company is going to focus on where the money is So font changing features may be downplayed The author rightly stresses harm and conseuences Even though the 20% might not be where the money is the negative conseuences of not helping them out with a useful UI can cause a lot of damage One area I have been concerned about is privacy and security in healthcare Say a login code is sent to an email But that email might go to a shared account For the most part this is probably not troubling The user “opted in” supplying that email But should we work harder to ensure that only the individual can access that account? What if it’s a shared account and medical details about domestic violence make it to that address Again say the patient has signed a consent to allow that message to go via email to a particular address Should that minority example make us very concerned to protect the “minority” user pattern? I think so The book does a good job walking the reader through thisBut I have some concerns Geography Time and time again the examples lean towards west coast companies Uber Facebook Twitter etc There are some exceptions But I’d like to know If the California tech culture is so bad are there other places that are better? Timespan Is this a particularly bad moment? Wachter Boeettcher provides the appalling facts around the decline of women computer science majors 37% in 1984 18% in 2014 “I can’t pretend to know the precise reason for this shift” p 182 Me neither But this book is so anchored in the present it begs the uestion of how we would assess say the tech culture of the 80s I bet it was better But was it? Just as an example back in the day Ann Wollrath was the lead author on the original RMI article Big stuff What was the culture? 
It would mean a lot if Wollrath told us that it was the same back then Then we might understand the core problem as a broader ill Intentions There are some good anecdotes here about how female voices are used for Siri Alexa and Google Maps etc pp 36 38 Right But what conclusion should we draw? “Women are expected to be helpful than men The we rely on digital tools in everyday life the we bolster the message that women are society’s ‘helpers’” p 38 I get this But then the author says “Did the designers intend this? Probably not” I protest Go out and interview the designers What were their reasons? Apple in particular thinks hard about this stuff What were the factors going into a female Siri and how did they outweigh providing other Siris male; accented; whatever? I want to know The book makes an insinuation but I think there’s a real research task to be performed Bring out the ethnographersIn large part the book is driven by articles in the tech media The next step is to get out there and start uoting people on their individual experiences in order to test some claims eg is the problem peculiarly tech in California in the 2010s? Or is it men in tech geography would help? Or even a side effect of the investment structure and capitalism seems implicit in chapter 9 and in particular figure out where people are doing it right and why The one positive example given in the book is Slack but I’m not going to give much uarter there Slack was produced by advertising and exploring the corporate customer’s desire to control discourse in the company not by inclusiveness

  2. Rachel Rachel says:

    I want to ualify my rating of this book If you haven’t previously thought about sexism racism or other forms of discrimination in the tech industry this is a five star recommendation However as someone who regularly reads about this topic and pays attention to tech news I encountered very little new information in this book It was also a bit disappointing to see so much focus on recent big news stories eg the Google Photos categorization fail Uber sexism and spying Facebook year in review rather than a wider range of companies and in depth looks at what went wrong how it happened and how companies are or could be doing things differently So I wasn’t blown away by the book but it holds valuable information for some folks and I just might be the wrong audience

  3. ☘Misericordia☘ ~ The Serendipity Aegis ~ ⚡ϟ⚡ϟ⚡⛈ ✺❂❤❣ ☘Misericordia☘ ~ The Serendipity Aegis ~ ⚡ϟ⚡ϟ⚡⛈ ✺❂❤❣ says:

    Some interesting concepts Normalizing Edge vs Stress The 'select one' being the problematic idea The default settings of our lives and everything else metrics are only as good as the goals and intentions that underlie them c inappropriate trying too hard chatty tech products c “marketing negging” The unlikely delights of '1 800 Flowers purchase particularly relevant to a Scorpio' © DAUsMAUsCAUsuite a lot of problematic issues Precisely the ones that lead diversity intentions to ruin Some ludicrous ideas like going about how to connect with a made up persona made up specifically to connect with I'm calling this one a BS jobAlso a lot of genuinely good material Here go handpicked examples of bothThere there dear Don’t worry about what we’re doing with your account Have a balloon c Back in 2011 if you told Siri you were thinking about shooting yourself it would give you directions to a gun store c Now I'm tempted to use Siri Attabotgirlfar too many people in tech have started to believe that they’re truly saving the world Even when they’re just making another ride hailing app or restaurant algorithm c I'm pretty sure that goes way beyond that And even beyond the tech I’m writing this in the wake of the 2016 presidential election—an election that gave us an American president who is infamous for allegations of sexual assault racism conflicts of interest collusion and angry Tweetstorms and who rode to power on a wave of misinformation c The problem was that there were 2 very problematic candidates not just one Hah Another problem is that people actually expect Facebook or Twitter or some other shit to tell them how to vote Problem numero trece is that people seem to be actually believing that that the flow of trash called 'news' in FB isn't actually 'news' So How very comfy to blame the techYou don’t need a computer science degree or a venture capital fund You don’t need to be able to program an algorithm All you need to do is slough away the layers of self aggrandizement and jargon and get at the heart of how people in technology work—and why their decisions so often don’t serve you c That's actually not true Self aggrandizement and jargon all of that is just perception which might be skewed or not Understanding is the keywe’ll take a closer look at how the tech industry operates and see how its hiring practices and work culture create teams that don’t represent most of us—no matter how many “diversity” events these companies put on c Why should they hire someone who represents anything instead of someone who's able to do the job? 
Diversity is about not refusing to hire a capable young mother or someone of another race Hiring representatives is a totally different operaDesigners and technologists don’t head into the office planning to launch a racist photo filter or build a sexist assumption into a database c LOL she spent the next hour listening to older men tell her about the “female market” The men in the room insisted that most women really care about leisure time activities c Now this must have been fun Even though the company had forty odd employees and had been in business than a decade no staff member had ever been pregnant “We have three other women of childbearing age on our team and we don’t want to set a precedent” the owner told her as if pregnancy were some sort of new trend c Wowser These guys must have grown on trees Some rotten fruits the two teams with lots of women on staff were sent an email by a board member asking them to “put together some kind of dance routine to perform at the company presentation The heads of each department all men stood up and talked about their successes over the course of the year The only women who graced the stage were a group of her peers in crop tops and hot pants The men in the audience wolf whistled while the women danced c That's some companyAmélie Lamont whose manager once claimed she hadn’t seen her in a meeting “You’re so black you blend into the chair” she told her c Damn I've actually once had a very similar discussion I've never before or after wanted so much to suggest that that reviewer should by the effing glasses and spare me the bullshitTech is also known for its obsession with youth—an obsession so absurd that I now regularly hear rumors about early thirties male startup founders getting cosmetic surgery so that investors will think they’re still in their twenties c Yep that's a fact Other companies start their workdays with all staff meetings held while everyone does planks—the fitness activity where you get on the ground prop yourself up by your feet and elbows and hold the position until your abs can’t handle it any If you’re physically able to plank that is And you’re not wearing a dress Or feeling modest Or embarrassed Or uncomfortable getting on your hands and knees at work c Ridiculous Riddiculus I’m not interested in ping pong beer or whatever other gimmick used to attract new grads The fact that I don’t like those things shouldn’t mean I’m not a “culture fit” I don’t want to work in tech to fool around I want to create amazing things and learn from other smart people That is the culture fit you should be looking for c Golden wordsThe good news is there’s actually no magic to tech As opaue as it might seem from the outside it’s just a skill set—one that all kinds of people can and do learn There’s no reason to allow tech companies to obfuscate their work to call it special and exempt it from our pesky ethics Except that we’ve never demanded they do better c And except that many of us don't really bother learning how stuff works Had these companies disclosed all their proprietary code today not many of us would know how to make head or tail of itAre you a “Kelly” the thirty seven year old minivan mom from the Minneapolis suburbs? Or do you see yourself as a “Matt” the millennial urban dweller who loves CrossFit and cold brew coffee? Maybe you’re of a “Maria” the low income community college student striving to stay in school while supporting her parentsNo? 
"Well, this is how many companies think about you."
c: Now that's a great point.

"She test drove some menstrual cycle apps, looking for one that would help her get the information she needed. What she found wasn't so rosy. Most of the apps she saw were splayed with pink and floral motifs, and Delano immediately hated the gender stereotyping. But even more, she hated how often the products assumed that fertility was her primary concern—rather than, you know, asking her."
c: LOL. It wasn't rosy, it was pink and florid.

"Glow works well for women who are trying to get pregnant with a partner. But for everyone else, both services stop making sense—and can be so alienating that would-be users feel frustrated and delete them."
c: Well, frankly, I don't think the right problem is being highlighted here. Glow might be cheesy. It also actually was initially rolled out for women trying to get pregnant, so IMO women who don't might do better choosing some other app. No shit, Sherlock: every single app doesn't have to be a multitool capable of Python coding, getting one pregnant, and building spaceships. The problem likely is that the market either doesn't clearly specify the alternative needs and apps applicable to other cases, or does have voids in some respects. That's actually both a problem and a business opportunity.

"What happens when those someones are the people we met in Chapter 2: designers and developers who've been told that they're rock stars, gurus, and geniuses, and that the world is made for people like them?"
c: The Big Flip-Flop?

"But when default settings present one group as standard and another as 'special'—such as men portrayed as more normal than women, or white people as more normal than people of color—the people who are already marginalized end up having the most difficult time finding technology that works for them."
c: Amen.

"If you've designed a cockpit to fit the average pilot, you've actually designed it to fit no one. So what did the air force do? Instead of designing for the middle, it demanded that airplane manufacturers design for the extremes instead—mandating planes that fit both those at the smallest and the largest sizes along each dimension. Pretty soon engineers found solutions to designing for these ranges, including adjustable seats, foot pedals, and helmet straps—the kinds of inexpensive features we now take for granted."

"When designers call someone an edge case, they imply that they're not important enough to care about—that they're outside the bounds of concern. In contrast, a stress case shows designers how strong their work is—and where it breaks down."
c: Edge vs. stress makes for an interesting dichotomy.

"I saw race and ethnicity menus that couldn't accommodate people of multiple races. I saw simple sign-up forms that demanded to know users' gender, and then offered only male and female options. I saw college application forms that assumed an applicant's parents lived together at a single address."
c: Fucked-up design. And not just design.

"Take Shane Creepingbear, a member of the Kiowa tribe of Oklahoma. In 2014 he tried to log into Facebook. But rather than being greeted by his friends' posts like usual, he was locked out of his account and shown this message: 'Your Name Wasn't Approved.' Adding to the insult, the site gave him only one option: a button that said 'Try Again.' There was nowhere to click for 'This is my real name' or 'I need help.' Facebook also rejected the names of a number of other Native Americans: Robin Kills the Enemy, Dana Lone Hill, Lance Brown Eyes. In fact, even after Brown Eyes sent in a copy of his identification, Facebook changed his name to Lance Brown."
c: Oh, this is top.

"There's still the fact that Facebook has placed itself in the position of deciding what's authentic and what isn't—of determining whose identity deserves an exception and whose does not."
c: Which is quite obviously bonkers.

"People who identify as more than one race end up having to select 'multiracial.' As a result, people who are multiracial end up flattened: either they get lumped into a generic category stripped of meaning, or they have to pick one racial identity to prioritize and effectively hide any others. They can't identify the way they would in real life, and the result is just one example of the ways people who are already marginalized feel even more invisible or unwelcome."

"When you remember how few people change the default settings in the software they use, Facebook's motivations become a lot clearer. Facebook needs advertisers. Advertisers want to target by gender. Most users will never go back to futz with custom settings. So Facebook effectively designs its onboarding process to gather the data it wants in the format advertisers expect. Then it creates its customizable settings and ensures it gets glowing reviews from the tech press, appeasing groups that feel marginalized—all the while knowing that very few people, statistically, will actually bother to adjust anything. Thus it gets a feel-good story about inclusivity while maintaining as large an audience as possible for advertisers. It's a win-win—if you're Facebook or an advertiser, that is."

"It was cute, unless you wanted to react to a serious post and all you had was a sad Frankenstein."
c: Quite the company.

"'Hi Tyler,' one man's video starts, using title cards. 'Here are your friends.' He's then shown five copies of the same photo. The result is equal parts funny and sad—like he has just that one friend. It only gets better (or worse, depending on your sense of humor) from there. Another title card comes up, 'You've done a lot together,' followed by a series of photos of wrecked vehicles, culminating in a photo of an injured man giving the thumbs-up from a hospital bed. I suppose Facebook isn't wrong, exactly: getting in a car accident is one definition of 'doing a lot together.'"
c: This is both hilarious and horrifying.

"You can probably guess what went wrong in one: Facebook created a montage of a man's near-fatal car crash, set to an acoustic jazz ditty. Just imagine your photos of a totaled car and scraped-up arms, taken on a day you thought you might die, set to a soft scat vocal track. Doo be doo, duh duh, indeed."
c: Tumblr.

"'Beep beep, #neo-nazis is here,' it read. A Tumblr employee told Rooney that it was probably a 'what you missed' notification: Rooney had previously read posts about the rise in fascism, and the notification system had used her past behavior to predict that she might be interested in neo-Nazi content. Another Tumblr user shared a version of the notification he received: 'Beep beep, #mental illness is here.'"
c: Well, this is what happens when people are being treated as kids by apps.

"Maybe I'm the only one who's just not interested in snotty comebacks from my phone, though I doubt it. Why would anyone want their credit card offers to be dependent on the weather? What precisely would we do to make a 1-800-Flowers purchase particularly relevant to a Scorpio? 'How the hell did I end up here?'"

"Delight is a concept that's been tossed around endlessly in the tech industry these past few years, and I've always hated it."

"What Facebook Thinks You Like: the extension trawls Facebook's ad-serving settings and spits out a list of keywords the site thinks you're interested in, and why. There's the expected stuff. Then there's a host of just plain bizarre connections: 'Neighbors (1981 Film),' a film I've never seen and don't know anything about. A host of no-context nouns: 'Girlfriend,' 'Brand,' 'Wall,' 'Extended essay,' 'Eternity.' I have no idea where any of this comes from—or what sort of advertising it would make me a target for. Then it gets creepy: 'returned from trip 1 week ago,' 'frequent international travelers.' I rarely post anything to Facebook, but it knows where I go, and how often."

"1,500 individual tidbits of information about you, all stored in a database somewhere and handed out to whoever will pay the price."

"The technology is based on deep neural networks: massive systems of information that enable machines to 'see' much in the same way the human brain does."
c: That's not precisely correct.

"A future where Facebook AI listens in on conversations to identify potential terrorists, where elected officials hold meetings on Facebook, and where a 'global safety infrastructure' responds to emergencies ranging from disease outbreaks to natural disasters to refugee crises."
c: Welcome to the fish bowl.
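
The cockpit claim quoted above sounds rhetorical, but it is easy to sanity-check numerically. Below is a minimal simulation, not from the book: the ten dimensions, the middle-30-percent definition of "average," and the independence of dimensions are all illustrative assumptions (real body measurements are correlated, which only softens the effect).

    import numpy as np

    rng = np.random.default_rng(0)
    n_people, n_dims = 100_000, 10  # assumed: ten body dimensions, echoing the air force study

    # Model each dimension as standard normal; treating dimensions as
    # independent is a simplification -- this is only a sketch.
    data = rng.normal(size=(n_people, n_dims))

    # Call someone "average" on a dimension if they land in the middle ~30%
    # of that dimension (|z| <= 0.385 for a normal distribution).
    within = np.abs(data) <= 0.385

    print(f"average on any single dimension: {100 * within[:, 0].mean():.1f}%")
    print(f"average on all {n_dims} dimensions: {100 * within.all(axis=1).mean():.4f}%")

Under these assumptions, roughly 30 percent of people are "average" on any one dimension, but essentially none are average on all ten at once, which is the sense in which a cockpit built for the average pilot fits no one.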

  4. Vish Wam Vish Wam says:

    Why do apps and profile info pages mostly come with only two gender options, male and female? What if someone doesn't wish to be identified as either? Why is there still a vast underrepresentation of women and minorities in the tech sector? Why hasn't there been a massive #MeToo uprising in the tech industry across the world? If tech companies are largely run by white or Asian men, do the products they release also reflect the biases and stereotypes they believe in? From Uber's severely regressive history of handling sexual harassment complaints to why Google Photos inadvertently turned out to be racist, the book is full of anecdotes on where the tech industry is messing up. If you are one who has felt that the tech sector needs to diversify, this book can help you better understand why, through stories from experts in the space. If you thought the tech industry's idea of "meritocracy" as a way to recruit talent is fair, then this is a must-read for you too.
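
    On the review's opening question about two-option gender menus: the book's argument is that this is a schema choice, not a technical constraint. A minimal sketch of the alternative, with hypothetical field names that are not from the book: gender as optional, self-described text, with "declined to answer" as a fully supported state.

        from dataclasses import dataclass
        from typing import Optional

        @dataclass
        class Profile:
            display_name: str
            # Free text instead of a forced male/female enum; None means the
            # user declined to answer, and that must be a fully valid state.
            gender: Optional[str] = None

            def greeting(self) -> str:
                # Downstream features should never require gender to function.
                return f"Hi, {self.display_name}!"

        print(Profile(display_name="Shane").greeting())  # works with gender unknown

    The design point is that every code path has to work when the field is empty; collecting the data only matters if something downstream legitimately needs it.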

  5. Manzoor Elahi Manzoor Elahi says:

    Most tech products are full of blind spots, biases, and outright ethical blunders. Like in the spring of 2015, when Louise Selby, a pediatrician in Cambridge, England, joined PureGym, a British chain. But every time she tried to swipe her membership card to access the women's locker room, she was denied: the system simply wouldn't authorize her. Finally PureGym got to the bottom of things: the third-party software it used to manage its membership data (software used at all ninety locations across England) was relying on members' titles to determine which locker room they could access. And the title "Doctor" was coded as male.

    In March of 2016, JAMA Internal Medicine released a study showing that the artificial intelligence built into smartphones from Apple, Samsung, Google, and Microsoft isn't programmed to help during a crisis. The phones' personal assistants didn't understand words like "rape" or "my husband is hitting me." In fact, instead of doing even a simple web search, Siri, Apple's product, cracked jokes and mocked users.

    Back in 2011, if you told Siri you were thinking about shooting yourself, it would give you directions to a gun store. After getting bad press, Apple partnered with the National Suicide Prevention Lifeline to offer users help when they said something that Siri identified as suicidal. But five years later, no one had looked beyond that one fix. Apple had no problem investing in building jokes and clever comebacks into the interface from the start. But investing in crisis or safety? Just not a priority.

    Three commercially released facial-analysis programs from major technology companies demonstrate both skin-type and gender biases. In the researchers' experiments, the three programs' error rates in determining the gender of light-skinned men were never worse than 0.8 percent. For darker-skinned women, however, the error rates ballooned: to more than 20 percent in one case, and more than 34 percent in the other two.

    The findings raise questions about how today's neural networks, which learn to perform computational tasks by looking for patterns in huge data sets, are trained and evaluated. For instance, according to the paper, researchers at a major US technology company claimed an accuracy rate of more than 97 percent for a face-recognition system they'd designed. But the data set used to assess its performance was more than 77 percent male and more than 83 percent white. (See also Joy Buolamwini's talk "How I'm fighting bias in algorithms.")

    Algorithms used by the police are better at identifying some racial groups than others. The few studies that have been done on facial recognition software suggest a persistently lower accuracy rate for African American faces, usually about 5 to 10 percent lower than for white faces, and sometimes even worse. This raises concerns, given that African Americans are already overscrutinized by law enforcement; it suggests that facial recognition technology is likely to be "overused on the segment of the population on which it underperforms." The inaccuracies are troubling but nothing new. Many readers might recall that back in 2010, consumer-grade facial recognition software was famously failing to detect that Asian users had their eyes open, or that black users were in the frame at all. Facial recognition software used in web services like Flickr and Google has tagged African Americans as primates.

    The algorithmic bias has been described as "the coded gaze" by Joy Buolamwini, an MIT Media Lab graduate student, in a nod to the literary and sociological term "the white gaze," which describes seeing the world from the perspective of a white person and assuming, always, that your audience is white.
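
    The accuracy numbers quoted above are the crux: a single overall accuracy figure can hide huge subgroup gaps when the evaluation set is mostly white and male. Here is a minimal sketch of the disaggregated reporting the researchers argue for, using invented counts chosen to echo the quoted 0.8 percent and 34 percent figures (this is not the study's data or code):

        from collections import defaultdict

        # (subgroup, prediction_was_correct) pairs -- illustrative counts only
        results = (
            [("lighter-skinned men", True)] * 992 + [("lighter-skinned men", False)] * 8 +
            [("darker-skinned women", True)] * 66 + [("darker-skinned women", False)] * 34
        )

        totals, errors = defaultdict(int), defaultdict(int)
        for group, correct in results:
            totals[group] += 1
            errors[group] += not correct

        # The aggregate number looks respectable...
        overall = 100 * sum(errors.values()) / len(results)
        print(f"overall error rate: {overall:.1f}%")

        # ...while the per-group numbers tell the real story.
        for group in totals:
            print(f"{group}: {100 * errors[group] / totals[group]:.1f}% error "
                  f"({totals[group]} samples)")

    With 1,000 samples from the overrepresented group and only 100 from the underrepresented one, the overall error rate comes out to about 3.8 percent, even though the error rate for darker-skinned women is 34 percent.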
    From massive businesses like Walmart and Apple to fledgling startups launching new apps, organizations of all types use tools called personas, fictional representations of people who fit their target audiences, when designing their products, apps, websites, and marketing campaigns, so that, ideally, team members think about those audiences regularly and internalize their needs and preferences. That's great in theory, but when personas are created by a homogenous team that hasn't taken the time to understand the nuances of its audience, the team often ends up designing products that alienate audiences rather than making them feel at home.

    We can see a common example in the story of Fatima, a Middle Eastern American design strategist based in the Bay Area. As the project kicked off, Fatima sat down with the teams from both companies, and was literally the only woman at the table. Pretty soon someone started a video meant to show the product's positioning. It was all flash: yacht parties, private jets, $2,000 shoes. Fatima cringed. The smartwatch they were designing was meant to target the midrange market.

    She spent the next hour listening to older men tell her about the "female market," using tales of their wives' shopping habits as proof. The team wanted to target women who are fashionable, tech savvy, or both. About two-thirds of respondents were identified as the former, and half as the latter. Except the men refused to believe Fatima. As soon as she started presenting her data, they wrote her off. "Oh, 51 percent of the women can't be tech savvy," they said.

    "I felt like I was in an episode of Mad Men. That's a specific project, a physical piece of technology that would exist in the world or not based on whether these men in the room accepted what I had to say or not," she said. "They just weren't willing to accept the research and use it as a foundation." The project got shelved, and the brand partnered with a celebrity to design a smartwatch instead. It flopped. "It wasn't based on needs; it was based on stereotypes," Fatima said.

    This mind-set, where someone assumes they have all the answers about a product and leaves out anyone with a different perspective, isn't rare. Scratch the surface at all kinds of companies, from Silicon Valley's "unicorns" (startups with valuations of more than a billion dollars) to tech firms in cities around the world, and you'll find a culture that routinely excludes anyone who's not young, white, and male.

    Biased algorithms. Alienating online forms. Harassment-friendly platforms. All kinds of problems plague digital products, from tiny design details to massively flawed features. But they share a common foundation: a tech culture that's built on white male values, while insisting it's brilliant enough to serve all of us. Or, as they call it in Silicon Valley, "meritocracy." The tech industry clings to meritocracy like a tattered baby blanket. David Sacks, an early executive at PayPal, claimed that "if meritocracy exists anywhere on earth, it is in Silicon Valley."

    The meritocracy myth is particularly pernicious in tech because it encourages the belief that the industry doesn't need to listen to outside voices, because the smartest people are always already in the room. This presumption quickly breeds a sort of techno-paternalism: when a group of mostly white guys from mostly the same places believes it deserves to be at the top, it's also quick to assume that it has all the perspective it needs in order to make decisions for everyone else. Tied up in this meritocracy myth is also the assumption that technical
skills are the most difficult to learn, and that if people study something else, it's because they couldn't hack programming. As a result, the system prizes technical abilities and systematically devalues the people who bring the very skills to the table that could strengthen products both ethically and commercially: people with the humanities and social science training needed to consider historical and cultural context, identify unconscious bias, and be empathetic to the needs of users.

    Originally, programming was often categorized as "women's work," lumped in with administrative skills like typing and dictation; in fact, during World War II, the word "computers" was often applied not to machines but to the women who computed data. As colleges started offering computer science degrees in the 1960s, women flocked to the programs: 11 percent of computer science majors in 1967 were women. By 1984, that number had grown to 37 percent. Starting in 1985, that percentage fell every single year, until, in 2007, it leveled out at the 18 percent figure we saw through 2014.

    That shift coincides perfectly with the rise of the personal computer, which was marketed almost exclusively to men and boys. We heard endless stories about Steve Jobs, Bill Gates, Paul Allen: garage tinkerers, boy geniuses, geeks. Software companies, and soon after, internet companies, all showcased men at the helm, backed by a sea of techies who looked just like them. And along the way, women stopped studying computer science, even as more of them were attending college than ever before.

    You might assume that much of the attrition comes from women leaving to start or care for a family. Nope: only about 20 percent of those who quit SET leave the workforce. The rest either take their technical skills to another industry (working for a nonprofit or in education, say) or move to a nontechnical position. People call this the "leaky bucket": when women and underrepresented groups leave because they're fed up with biased cultures where they can't get ahead.

    If the tech industry has acknowledged this problem and says it wants to fix it, why are the stats so slow to change?
    If you ask tech companies, they'll all point to the same culprit: the pipeline. The term "pipeline" refers to the number of people who are entering the job market prepared to join the tech industry, those who are learning to code in high school and graduating from computer science or similar programs. If the pipeline doesn't include enough women and people of color (though, honestly, many companies never get beyond talking about gender here), then tech companies simply can't hire them. Or so the story goes.

    In a 2014 analysis, USA Today concluded that "top universities turn out black and Hispanic computer science and computer engineering graduates at twice the rate that leading technology companies hire them." Adding to the problem, potential employers spend their time looking for a "culture fit," someone who neatly matches the employees already in the company, which ends up reinforcing the status quo rather than changing it.

    In a 2014 report for Scientific American, Columbia professor Katherine W. Phillips examined a broad cross-section of research related to diversity and organizational performance. And over and over, she found that the simple act of interacting in a diverse group improves performance, because it "forces group members to prepare better, to anticipate alternative viewpoints and to expect that reaching consensus will take effort."

    In one study that Phillips cited, published in the Journal of Personality and Social Psychology, researchers asked participants to serve on a mock jury for a black defendant. Some participants were assigned to diverse juries, some to homogenous ones. Across the board, diverse groups were more careful with details than were homogenous groups, and more open to conversation. When white participants were in diverse groups rather than homogenous ones, they were more likely to cite facts rather than opinions, and they made fewer errors, the study found.

    In another study, led by Phillips and researchers from Stanford and the University of Illinois at Urbana-Champaign, undergraduate students from the University of Illinois were asked to participate in a murder-mystery exercise. Each student was assigned to a group of three, with some groups composed of two white students and one nonwhite student, and some composed of three white students. Each group member was given both a common set of information and a set of unique clues that the other members did not have. Group members needed to share all the information they collectively possessed in order to solve the puzzle. But students in all-white groups were significantly less likely to do so, and therefore performed significantly worse in the exercise. The reason is that when we work only with those similar to us, we often "think we all hold the same information and share the same perspective," Phillips writes. "This perspective, which stopped the all-white groups from effectively processing the information, is what hinders creativity and innovation."

    Uber may be an extreme example, but it can help us understand tech's insular culture much more clearly: if tech wants to be seen as special, and therefore able to operate outside the rules, then it helps to position the people working inside tech companies as special too. And the best way to ensure that happens is to build a monoculture, where insiders bond over a shared belief in their own brilliance. That's also why you see so many ridiculous job titles floating around Silicon Valley and places like it: "rock star" designers, "ninja" JavaScript developers, user experience "unicorns" (yes, these are all real). Fantastical labels like these reinforce the idea
that tech and design are magical skill sets that those on the outside wouldn't understand and could never learn. The reality is a lot more mundane: design and programming are just professions, sets of skills and practices like any other field. Admitting that truth would make tech positions feel a lot more welcoming to diverse employees, but tech can't tell that story to the masses. If it did, then the industry would seem normal, understandable, and accessible, and that would make everyday people more comfortable pushing back when its ideas are intrusive or unethical. So tech has to maintain its insider-y, more-brilliant-than-thou feel, which affects who decides to enter that legendary "pipeline," and whether they'll stick around once they've arrived.

    Not every tech company looks at the world like Uber does, thank god. Just look at messaging app Slack, a darling of the startup world, with an office motto that's refreshingly healthy: "Work hard and go home." Slack is often described as a delight to use, but it's a delight borne of nuance and detail, not shoved-down-your-throat cuteness. And the company got there by doing what so few tech companies seem to bother with: considering their users as real, whole people.

    One of the first things CEO Stewart Butterfield wants to know about when interviewing candidates for a position isn't which programming languages they know, or where they studied computer science. It's whether they believe luck played a role in getting them where they are, whether they think their success is a product not just of merit and talent but of good circumstances. His goal is simple: to build a team where people don't assume they're special. No rock stars, no gurus, no ninjas, just people who bring a combination of expertise, humility, and empathy. Slack doesn't rely on believing that programmers are the chosen ones; in fact, Butterfield, who has a master's degree in philosophy, is known for extolling the values of the liberal arts to anyone in tech who'll listen.

    Lo and behold, that culture also leads to a more diverse staff: women held more than 40 percent of Slack's management positions in 2016, and more than a fourth of engineering roles too. Black people accounted for nearly 8 percent of engineers. Slack's disarming honesty and disinterest in chest-thumping are antithetical to the way most of tech talks about itself. And it's working: Slack is the fastest-growing business app ever.
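
    Back to the PureGym story at the top of this review: the reported flaw fits in a few lines. What follows is a hypothetical reconstruction, with invented names and structure; the account only says that the software mapped members' titles to a sex and coded "Doctor" as male.

        # Access control keyed off a salutation -- the reported design flaw.
        TITLE_TO_SEX = {"Mr": "male", "Mrs": "female", "Miss": "female", "Dr": "male"}

        def may_enter(title: str, locker_room_sex: str) -> bool:
            # Breaks for any member whose title does not encode sex correctly.
            return TITLE_TO_SEX.get(title) == locker_room_sex

        print(may_enter("Dr", "female"))  # False: Dr. Selby is turned away

    The fix isn't a better mapping; it's not deriving one attribute (sex) from an unrelated one (a professional title) in the first place.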

  6. Tam Tam says:

    A good and short read. Plenty of examples, though mostly the famous ones from the internet. The author's alignment with the truly marginalized is limited mostly to women/gays/transgender people/nonwhites (but still the educated), unlike O'Neil in Weapons of Math Destruction: How Big Data Increases Inequality and Threatens Democracy, who places her heart with the poor and the abused, whose stories may not be heard at all: buried deep, powerless. The problems here aren't less worthy of discussion, though. The sexist and racist culture is so embedded, the privileges so taken for granted, the arrogance and the belief that tech people are the coolest and smartest and above everyone else so fierce. That needs to change.

  7. Kelly Kelly says:

    Nothing surprising here, but infuriating and important nonetheless. If you work in tech at all as a woman or person of color, you'll recognize all of this. Well researched and written. The sexism in algorithms is something I'd not thought about, but damn, was that interesting.

  8. Jay Jay says:

    Given the title of this book, I assumed it would focus exclusively on the problems of bias in software and machine learning. This has been in the news for quite a while, and on top of the news recently. While most of the book provides stories about bias, as I expected, a large part of the book was about various other behaviors: sexist, racist, illegal, and just bad. Think hiring at Uber. If you have kept up with these kinds of issues in Wired, Fast Company, and their ilk, you get many more examples here, but not much by way of solutions. Despite that mild disappointment, I found the writing kept my interest, at least up until the end, when it felt like the author was reaching for things to write about. Good for helping an ITer, data scientist, or tech company exec think through how these issues may touch on their own company, products, and practices.

  9. Kate Kaput Kate Kaput says:

    Long review coming. This book was my first Feminist Book Club delivery, and it was brilliant: techie, but written in a digestible, accessible, down-to-earth way for those of us who don't work in tech. I had no idea of all these problems, like Google Photos identifying black faces as gorillas, mobile ads targeting people in low-income areas with ads for for-profit colleges, or a gym chain in Britain where a woman couldn't get into the locker rooms because access was coded by title, like Mr. or Mrs.; hers, Doctor, was coded as male. This book tackles problems small and large, including how they occur and how they can be stopped. Read this.

  10. Amy Rhoda Brown Amy Rhoda Brown says:

    This is a crystal-clear description of how the monoculture of tech leads to terrible apps, toxic online behaviour, and the failure of the developers to take responsibility for what their decisions, based on their narrow worldview, have wrought. Easy to read, well laid out, and compelling.
