Technically Wrong: Sexist Apps, Biased Algorithms, and Other Threats of Toxic Tech (PDF)


10 thoughts on "Technically Wrong: Sexist Apps, Biased Algorithms, and Other Threats of Toxic Tech (PDF)"

  1. says:

    Well, this is another one of those funny books that is sort of a "5" and sort of a "3." The book broadly claims that the tech industry builds interfaces and products that are not necessarily intentionally biased, and says that the main driver is the homogeneity of tech company investors and employees. There is no doubt in…

  2. says:

    Some interesting concepts: normalizing; "edge" vs. "stress"; the "select one" being the problematic idea; "the default settings of our lives" and everything else; "metrics are only as good as the goals and intentions that underlie them" (c); inappropriate, trying-too-hard, chatty tech products (c); "marketing negging"; "the unlikely delights of a 1-800-Flowers purchase particularly relevant to a Scorpio" (c); DAUs/MAUs/CAUs. Quite a lot…

  3. says:

    I want to qualify my rating of this book. If you haven't previously thought about sexism, racism, or other forms of discrimination in the tech industry, this is a five-star recommendation. However, as someone who regularly reads about this topic and pays attention to tech news, I encountered very little new information in this book. It…

  4. says:

    Why do apps and profile info pages mostly come with only two gender options, male and female? What if someone doesn't wish to be identified as either? Why is there still a vast underrepresentation of women and minorities in the tech sector? Why hasn't there been a massive #MeToo rising in the tech industry across the world?…

  5. says:

    Most tech products are full of blind spots, biases, and outright ethical blunders. Like in the spring of 2015, when Louise Selby, a pedi…

  6. says:

    A good and short read. Plenty of examples, but mostly the famous ones on the internet. The author's alignment with the truly marginalized is limited, mostly to female/gay/transgender/nonwhite people, but still the educated, unlike O'Neil in Weapons of Math Destruction: How Big Data Increases Inequality and Threatens Democracy, who places her heart…

  7. says:

    Nothing surprising here, but infuriating and important nonetheless. If you work in tech at all as a woman or person of color, you'll recognize all of this. Well researched and written. The sexism in algorithms is something I've not thought about…

  8. says:

    Given the title of this book, I assumed it would focus exclusively on the problems of bias in software and machine learning. This has been in the news for quite a while, and at the top of the news recently. While most of the book provides stories about bias, as I expected, a large part of the book was about various other behaviors: sexist, racist, illegal…

  9. says:

    Long review coming. This book was my first Feminist Book Club delivery, and it was brilliant: techie, but written in a digestible, accessible, down-to-earth way for those of us who don't work in tech. I had no idea of all these problems, like…

  10. says:

    This is a crystal-clear description of how the monoculture of tech leads to terrible apps, toxic online behaviour, and the failure of developers to take responsibility for what their decisions, based on their narrow worldview, have…


Technically Wrong: Sexist Apps, Biased Algorithms, and Other Threats of Toxic Tech

Summary

Buying groceries, tracking our health, finding a date: whatever we want to do, odds are that we can now do it online. But few of us ask why all these digital products are designed the way they are. It's time we change that. Many of the services we rely on are full of oversights, biases, and downright ethical nightmares: chatbots that harass women; signup forms that fail anyone who's not straight; social media sites that send peppy messages about dead relatives; algorithms that put black people behind bars. Sara Wachter-Boettcher takes an unflinching look at the values, processes, and assumptions that lead to these and other problems. Technically Wrong demystifies the tech industry, leaving those of us on the other side of the screen better prepared to make informed choices about the services we use and demand more from the companies behind them.

Review

Given the title of this book, I assumed it would focus exclusively on the problems of bias in software and machine learning. This has been in the news for quite a while, and at the top of the news recently. While most of the book provides stories about bias, as I expected, a large part of the book was about various other behaviors: sexist, racist, illegal, and just bad. Think hiring at Uber. If you have kept up with these kinds of issues in Wired, Fast Company, and their ilk, you get many examples here, but not much by way of solutions. Despite that mild disappointment, I found the writing kept my interest, at least up until the end, when it felt like the author was reaching for things to write about. Good for helping an ITer, data scientist, or tech company exec think through how these issues may touch on their own company, products, and practices.

Review

Some interesting concepts: normalizing; "edge" vs. "stress"; the "select one" being the problematic idea; "the default settings of our lives" and everything else; "metrics are only as good as the goals and intentions that underlie them" (c); inappropriate, trying-too-hard, chatty tech products (c); "marketing negging"; "the unlikely delights of a 1-800-Flowers purchase particularly relevant to a Scorpio" (c); DAUs/MAUs/CAUs. Quite a lot of problematic issues, precisely the ones that lead diversity intentions to ruin. Some ludicrous ideas, like going on about how to connect with a made-up persona, made up specifically to connect with (I'm calling this one a BS job). Also a lot of genuinely good material. Here go handpicked examples of both.

"There, there, dear. Don't worry about what we're doing with your account. Have a balloon." (c)

"Back in 2011, if you told Siri you were thinking about shooting yourself, it would give you directions to a gun store." (c) Now I'm tempted to use Siri. Atta bot, girl.

"Far too many people in tech have started to believe that they're truly saving the world. Even when they're just making another ride-hailing app or restaurant algorithm." (c) I'm pretty sure that goes way beyond that. And even beyond the tech.

"I'm writing this in the wake of the 2016 presidential election: an election that gave us an American president who is infamous for allegations of sexual assault, racism, conflicts of interest, collusion, and angry Tweetstorms, and who rode to power on a wave of misinformation." (c) The problem was that there were two very problematic candidates, not just one. Hah. Another problem is that people actually expect Facebook or Twitter or some other shit to tell them how to vote. Problem numero trece is that people seem to actually believe that the flow of trash called "news" on FB isn't actually news. So? How very comfy to blame the tech.

"You don't need a computer science degree or a venture capital fund. You don't need to be able to program an algorithm. All you need to do is slough away the layers of self-aggrandizement and jargon and get at the heart of how people in technology work and why their decisions so often don't serve you." (c) That's actually not true. Self-aggrandizement and jargon, all of that is just perception, which might be skewed or not. Understanding is the key.

"We'll take a closer look at how the tech industry operates and see how its hiring practices and work culture create teams that don't represent most of us, no matter how many diversity events these companies put on." (c) Why should they hire someone who represents anything, instead of someone who's able to do the job? Diversity is about not refusing to hire a capable young mother or someone of another race. Hiring representatives is a totally different opera.

"Designers and technologists don't head into the office planning to launch a racist photo filter or build a sexist assumption into a database." (c) LOL.

"She spent the next hour listening to older men tell her about the female market. The men in the room insisted that most women really care about leisure-time activities." (c) Now this must have been fun.

"Even though the company had forty-odd employees and had been in business more than a decade, no staff member had ever been pregnant. 'We have three other women of childbearing age on our team, and we don't want to set a precedent,' the owner told her, as if pregnancy were some sort of new trend." (c) Wowser. These guys must have grown on trees. Some rotten fruits: "The two teams with lots of women on staff were sent an email by a board member asking them to put together some kind of dance routine to perform at the company presentation. The heads of each department, all men, stood up and talked about their successes over the course of the year. The only women who graced the stage were a group of her peers in crop tops and hot pants. The men in the audience wolf-whistled while the women danced." (c) That's some company.

"Amélie Lamont, whose manager once claimed she hadn't seen her in a meeting. 'You're so black, you blend into the chair,' she told her." (c) Damn. I've actually once had a very similar discussion. I've never before or after wanted so much to suggest that that reviewer should buy the effing glasses and spare me the bullshit.

"Tech is also known for its obsession with youth, an obsession so absurd that I now regularly hear rumors about early-thirties male startup founders getting cosmetic surgery so that investors will think they're still in their twenties." (c) Yep, that's a fact.

"Other companies start their workdays with all-staff meetings held while everyone does planks, the fitness activity where you get on the ground, prop yourself up by your feet and elbows, and hold the position until your abs can't handle it anymore. If you're physically able to plank, that is. And you're not wearing a dress. Or feeling modest. Or embarrassed. Or uncomfortable getting on your hands and knees at work." (c) Ridiculous. Riddiculus.

"I'm not interested in ping-pong, beer, or whatever other gimmick used to attract new grads. The fact that I don't like those things shouldn't mean I'm not a culture fit. I don't want to work in tech to fool around; I want to create amazing things and learn from other smart people. That is the culture fit you should be looking for." (c) Golden words.

"The good news is there's actually no magic to tech. As opaque as it might seem from the outside, it's just a skill set, one that all kinds of people can and do learn. There's no reason to allow tech companies to obfuscate their work, to call it special and exempt it from our pesky ethics. Except that we've never demanded they do better." (c) And except that many of us don't really bother learning how stuff works. Had these companies disclosed all their proprietary code today, not many of us would know how to make head or tail of it.

"Are you a Kelly, the thirty-seven-year-old minivan mom from the Minneapolis suburbs? Or do you see yourself as a Matt, the millennial urban dweller who loves CrossFit and cold-brew coffee? Maybe you're more of a Maria, the low-income community college student striving to stay in school while supporting her parents? No? Well, this is how many companies think about you." (c) Now that's a great point.

"She test-drove some menstrual cycle apps, looking for one that would help her get the information she needed. What she found wasn't so rosy. Most of the apps she saw were splayed with pink and floral motifs, and Delano immediately hated the gender stereotyping. But even more, she hated how often the products assumed that fertility was her primary concern, rather than, you know, asking her." (c) LOL. It wasn't rosy, it was pink and florid.

"Glow works well for women who are trying to get pregnant with a partner. But for everyone else, both services stop making sense, and can be so alienating that would-be users feel frustrated and delete them." (c) Well, frankly, I don't think the right problem is being highlighted here. Glow might be cheesy. It also actually was initially rolled out for women trying to get pregnant, so IMO women who don't might do better choosing some other app, no shit, Sherlock. Every single app doesn't have to be a multitool capable of Python coding, getting one pregnant, and building spaceships. The problem likely is that the market either doesn't clearly specify the alternative needs and apps applicable to other cases, or does have voids in some respects. That's actually both a problem and a business opportunity.

"What happens when those someones are the people we met in Chapter 2: designers and developers who've been told that they're rock stars, gurus, and geniuses, and that the world is made for people like them?" (c) The Big Flip-Flop.

"But when default settings present one group as standard and another as special, such as men portrayed as more normal than women, or white people as more normal than people of color, the people who are already marginalized end up having the most difficult time finding technology that works for them." (c) Amen.

"If you've designed a cockpit to fit the average pilot, you've actually designed it to fit no one. So what did the air force do? Instead of designing for the middle, it demanded that airplane manufacturers design for the extremes, mandating planes that fit both those at the smallest and the largest sizes along each dimension. Pretty soon, engineers found solutions to designing for these ranges, including adjustable seats, foot pedals, and helmet straps, the kinds of inexpensive features we now take for granted." (c)

"When designers call someone an edge case, they imply that they're not important enough to care about, that they're outside the bounds of concern. In contrast, a stress case shows designers how strong their work is and where it breaks down." (c) Edge vs. stress gives an interesting dichotomy.

"I saw race and ethnicity menus that couldn't accommodate people of multiple races. I saw simple sign-up forms that demanded to know users' gender, and then offered only male and female options. I saw college application forms that assumed an applicant's parents lived together at a single address." (c) Fucked-up design. And not just design.

"Take Shane Creepingbear, a member of the Kiowa tribe of Oklahoma. In 2014, he tried to log into Facebook. But rather than being greeted by his friends' posts like usual, he was locked out of his account and shown this message: 'Your Name Wasn't Approved.' Adding to the insult, the site gave him only one option: a button that said 'Try Again.' There was nowhere to click for 'This is my real name' or 'I need help.' Facebook also rejected the names of a number of other Native Americans: Robin Kills the Enemy, Dana Lone Hill, Lance Brown Eyes. In fact, even after Brown Eyes sent in a copy of his identification, Facebook changed his name to Lance Brown." (c) Oh, this is top.

"There's still the fact that Facebook has placed itself in the position of deciding what's authentic and what isn't, of determining whose identity deserves an exception and whose does not." (c) Which is quite obviously bonkers.

"People who identify as more than one race end up having to select 'multiracial.' As a result, people who are multiracial end up flattened: either they get lumped into a generic category stripped of meaning, or they have to pick one racial identity to prioritize and effectively hide any others. They can't identify the way they would in real life, and the result is just one more example of the ways people who are already marginalized feel even more invisible or unwelcome." (c)

"When you remember how few people change the default settings in the software they use, Facebook's motivations become a lot clearer. Facebook needs advertisers. Advertisers want to target by gender. Most users will never go back to futz with custom settings. So Facebook effectively designs its onboarding process to gather the data it wants in the format advertisers expect. Then it creates its customizable settings and ensures it gets glowing reviews from the tech press, appeasing groups that feel marginalized, all the while knowing that very few people, statistically, will actually bother to adjust anything. Thus it gets a feel-good story about inclusivity while maintaining as large an audience as possible for advertisers. It's a win-win, if you're Facebook or an advertiser, that is." (c)

"It was cute, unless you wanted to react to a serious post and all you had was a sad Frankenstein." (c) Quite the company.

"'Hi Tyler,' one man's video starts, using title cards. 'Here are your friends.' He's then shown five copies of the same photo. The result is equal parts funny and sad, like he has just that one friend. It only gets better, or worse, depending on your sense of humor, from there. Another title card comes up: 'You've done a lot together,' followed by a series of photos of wrecked vehicles, culminating in a photo of an injured man giving the thumbs-up from a hospital bed. I suppose Facebook isn't wrong, exactly: getting in a car accident is one definition of doing a lot together." (c) This is both hilarious and horrifying.

"You can probably guess what went wrong in one: Facebook created a montage of a man's near-fatal car crash, set to an acoustic jazz ditty. Just imagine your photos of a totaled car and scraped-up arms, taken on a day you thought you might die, set to a soft scat vocal track. Doo-be-doo, duh-duh, indeed." (c)

Tumblr: "'Beep beep! Neo-Nazis is here,' it read. A Tumblr employee told Rooney that it was probably a 'what you missed' notification. Rooney had previously read posts about the rise in fascism, and the notification system had used her past behavior to predict that she might be interested in neo-Nazi content. Another Tumblr user shared a version of the notification he received: 'Beep beep! Mental illness is here.'" (c) Well, this is what happens when people are being treated as kids by apps.

"Maybe I'm the only one who's just not interested in snotty comebacks from my phone, though I doubt it. Why would anyone want their credit card offers to be dependent on the weather? What precisely would we do to make a 1-800-Flowers purchase particularly relevant to a Scorpio? How the hell did I end up here?" (c)

"'Delight' is a concept that's been tossed around endlessly in the tech industry these past few years, and I've always hated it." (c)

"What Facebook Thinks You Like: the extension trawls Facebook's ad-serving settings and spits out a list of keywords the site thinks you're interested in, and why. There's the expected stuff. Then there's a host of just plain bizarre connections: Neighbors (1981 film), a film I've never seen and don't know anything about. A host of no-context nouns: Girlfriend. Brand. Wall. Extended essay. Eternity. I have no idea where any of this comes from or what sort of advertising it would make me a target for. Then it gets creepy: 'returned from trip 1 week ago,' 'frequent international travelers.' I rarely post anything to Facebook, but it knows where I go and how often." (c)

"1,500 individual tidbits of information about you, all stored in a database somewhere and handed out to whoever will pay the price." (c)

"The technology is based on deep neural networks: massive systems of information that enable machines to see much in the same way the human brain does." (c) That's not precisely correct.

"A future where Facebook AI listens in on conversations to identify potential terrorists, where elected officials hold meetings on Facebook, and where a global safety infrastructure responds to emergencies ranging from disease outbreaks to natural disasters to refugee crises." (c) Welcome to the fishbowl.

Review

A good and short read. Plenty of examples, but mostly the famous ones on the internet. The author's alignment with the truly marginalized is limited, mostly to female/gay/transgender/nonwhite people, but still the educated, unlike O'Neil in Weapons of Math Destruction: How Big Data Increases Inequality and Threatens Democracy, who places her heart with the poor, the abused, those whose stories may not be heard at all, buried deep, powerless. The problems aren't less worthy to discuss, though. The sexist and racist culture is so embedded, the privileges so taken for granted, the arrogance and the belief that tech people are the coolest and smartest and above everyone else so fierce. That needs to change.

  • Format: Hardcover
  • Pages: 240
  • Title: Technically Wrong: Sexist Apps, Biased Algorithms, and Other Threats of Toxic Tech
  • Author: Sara Wachter-Boettcher
  • Language: English
  • Date: 05 October 2020
  • ISBN-13: 9780393634631