

Computers be racist


  • #1

    Slate now has incontrovertible proof of cyber-racism:

    Can a computer program be racist? Imagine this scenario: A program that screens rental applicants is primed with examples of personal history, debt, and the like. The program makes its decision based on lots of signals: rental history, credit record, job, salary. Engineers “train” the program on sample data. People use the program without incident until one day, someone thinks to put through two applicants of seemingly equal merit, the only difference being race. The program rejects the black applicant and accepts the white one. The engineers are horrified, yet say the program only reflected the data it was trained on. So is their algorithm racially biased?
    The answer WILL SHOCK YOU:

    Yes, it definitely is, and it’s just one of the dangers that can arise from an overreliance on widespread corporate and governmental data collection. University of Maryland law professor Frank Pasquale’s notable new book, The Black Box Society, tries to come to grips with the dangers of “runaway data” and “black box algorithms” more comprehensively than any other book to date. (An essay I wrote on “The Stupidity of Computers” is quoted in the book, though I wasn’t aware of this until I read it.) It’s an important read for anyone who is interested in the hidden pitfalls of “big data” and who wants to understand just how quantified our lives have become without our knowledge.
    This is basically a plea to programmers: "Please stop generating hatefacts without our knowledge; it's really annoying when a computer discovers something embarrassing and we don't have a ready-made narrative spin for it."
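The quoted scenario is easy to reproduce. Here's a minimal, self-contained sketch of proxy discrimination — all numbers, thresholds, and group labels are invented for illustration, not taken from any real system. The "model" never sees race, but a correlated feature (zip code) lets it reproduce the biased historical decisions it was trained on:

```python
import random

random.seed(42)

# Invented synthetic data. Groups "A" and "B" have identical merit
# (same income distribution), but historical decision-makers rejected
# group B more often, and zip code correlates strongly with group.
def make_record(group):
    income = random.gauss(50, 10)                      # equal merit
    zip_code = 1 if random.random() < (0.9 if group == "B" else 0.1) else 0
    biased_reject = group == "B" and random.random() < 0.7
    approved = income > 45 and not biased_reject       # biased label
    return {"income": income, "zip": zip_code, "approved": approved}

history = [make_record(g) for g in "A" * 5000 + "B" * 5000]

# "Train" the simplest possible model: approval rate per zip code among
# equally qualified applicants, thresholded at 50%. Race is never an input.
def approval_rate(z):
    rows = [r for r in history if r["zip"] == z and r["income"] > 45]
    return sum(r["approved"] for r in rows) / len(rows)

model = {z: approval_rate(z) >= 0.5 for z in (0, 1)}
print(model)  # zip 0 (mostly group A) approved; zip 1 (mostly group B) not
```

Swapping in a real classifier changes nothing essential: any model rewarded for matching the historical labels will rediscover the proxy.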

    Harvard professor Latanya Sweeney found that black-identified names (including her own) frequently generated Google ads like “Lakisha Simmons, Arrested?” while white-identified names did not. Because Google’s secret sauce is, well, secret, Sweeney could only speculate as to whether it was because her first and/or last names specifically linked to ad templates containing “arrest,” because those ads have had higher click-through rates, or some other reason. Though Google AdWords was certainly not programmed with any explicit racial bias, the results nonetheless showed a kind of prejudice.
    Check your privilege, Sweeney. Obviously you're not cool enough to read the Thug Report, or the various arrest record newspapers you can buy at gas stations throughout America's diverse areas. How could you ignore your roots like that?

    (Normal, i.e., nonblack, people would consider maybe not giving their kids ghetto names if they're not trying to get them to fit into that culture. Not a lot of Latanya Changs in the world, maybe a growing number of Latanya Lopezes, but they also fall into the Arrested Demographic.)
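Sweeney's click-through hypothesis is also easy to sketch. The toy simulation below uses invented CTR numbers — this is a cartoon of the feedback loop, not Google's actual ad system. A greedy server that always shows the template with the best observed click-through rate locks onto the "arrest" template given even a small initial gap, with no explicit bias anywhere in the code:

```python
# Invented numbers throughout: a toy model of a CTR feedback loop.
templates = ["neutral", "arrest"]
true_ctr = {"neutral": 0.05, "arrest": 0.07}   # assumed small initial gap

clicks = {t: 1.0 for t in templates}           # optimistic prior:
shows = {t: 2.0 for t in templates}            # 1 click in 2 shows each

served = []
for _ in range(10_000):
    # Greedy policy: always serve the template with the best observed CTR.
    best = max(templates, key=lambda t: clicks[t] / shows[t])
    shows[best] += 1
    clicks[best] += true_ctr[best]             # accrue expected clicks
    served.append(best)

arrest_share = served.count("arrest") / len(served)
print(f"'arrest' template served {arrest_share:.1%} of the time")
```

Because the losing template stops being shown, its statistics freeze while the winner's keep improving, so a two-point CTR gap becomes near-total lock-in — and the same dynamic applies per name or per demographic.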

    Pasquale writes that some third-party data-broker microtargeting lists include “probably bipolar,” “daughter killed in car crash,” “rape victim,” and “gullible elderly.” There are no restrictions on marketers assembling and distributing such lists, nor any oversight, leading to what Pasquale terms “runaway data.” With such lists circulating among marketers, credit bureaus, hiring firms, and health care companies, these categories—which cross the line into racial or gender classification as well—easily slip from marketing tools into reputation indicators.
    When Obama bans consideration of FICO scores for underprivileged communities, the smart will turn to marketing. Also, "runaway data"? Really? Are you saying that data is supposed to be your SLAVE?

    The final paragraph is, of course, predictable:

    Philosophy professor Samir Chopra has discussed the dangers of such opaque programs in his book A Legal Theory for Autonomous Artificial Agents, stressing that their autonomy from even their own programmers may require them to be regulated as autonomous entities.
    Humans barely get that sort of respect.

    Pasquale stresses the need for an “intelligible society,” one in which we can understand how the inputs that go into these black box algorithms generate the effects of those algorithms. I’m inclined to believe it’s already too late—and that algorithms will increasingly have effects over which even the smartest engineers will have only coarse-grained and incomplete control. It is up to us to study the effects of those algorithms, whether they are racist, sexist, error-laden, or simply invasive, and take countermeasures to mitigate the damage. With more corporate and governmental transparency, clear and effective regulation, and a widespread awareness of the dangers and mistakes that are already occurring, we can wrest back some control of our data from the algorithms that none of us fully understands.
    Or you could, like, make them significantly less necessary by not banning the investigation of race/gender/etc. by actual human beings, not hounding out of a job the humans who, in ages past, independently came up with the same conclusions the machines now reach, and not depending on these algorithms to win elections and then expecting those who created them to just pack them up and go home.

  • #2
    How can an algorithm be racist?
    "It's evolution; every time you invent something fool-proof, the world invents a better fool."
    -Unknown

    "Preach the gospel, and if necessary use words." - Most likely St. Francis


    I find that evolution is the best proof of God.



    • #3
      Originally posted by Irate Canadian View Post
      How can an algorithm be racist?
      For that answer, we have to consult the great Rakim:

      (animated GIF)



      • #4
        But yes, in all seriousness, it turns out that when you feed an algorithm the credit history and buying patterns of individuals chosen at random, it will disproportionately judge against black people. Probably in the exact same proportions as the white people of yesteryear did. This will not, of course, cause any liberals to repent of jokes/thefts/murders made at their expense.

