
Sent to Prison by a Software Program’s Secret Algorithms

Discussion in 'Politics Forum (Local/National/World)' started by TAEZZAR, May 9, 2017.



  1. TAEZZAR
    Looks like 1984 finally arrived!!

    Sent to Prison by a Software Program’s Secret Algorithms

    By ADAM LIPTAK MAY 1, 2017


    [Photo] Chief Justice John G. Roberts Jr., center, recently said that the day of using artificial intelligence in courtrooms was already here, “and it’s putting a significant strain on how the judiciary goes about doing things.” Credit: Stephen Crowley/The New York Times

    When Chief Justice John G. Roberts Jr. visited Rensselaer Polytechnic Institute last month, he was asked a startling question, one with overtones of science fiction.

    “Can you foresee a day,” asked Shirley Ann Jackson, president of the college in upstate New York, “when smart machines, driven with artificial intelligences, will assist with courtroom fact-finding or, more controversially even, judicial decision-making?”

    The chief justice’s answer was more surprising than the question. “It’s a day that’s here,” he said, “and it’s putting a significant strain on how the judiciary goes about doing things.”

    He may have been thinking about the case of a Wisconsin man, Eric L. Loomis, who was sentenced to six years in prison based in part on a private company’s proprietary software. Mr. Loomis says his right to due process was violated by a judge’s consideration of a report generated by the software’s secret algorithm, one Mr. Loomis was unable to inspect or challenge.

    In March, in a signal that the justices were intrigued by Mr. Loomis’s case, they asked the federal government to file a friend-of-the-court brief offering its views on whether the court should hear his appeal.

    The report in Mr. Loomis’s case was produced by a product called Compas, sold by Northpointe Inc. It included a series of bar charts that assessed the risk that Mr. Loomis would commit more crimes.

    The Compas report, a prosecutor told the trial judge, showed “a high risk of violence, high risk of recidivism, high pretrial risk.” The judge agreed, telling Mr. Loomis that “you’re identified, through the Compas assessment, as an individual who is a high risk to the community.”

    The Wisconsin Supreme Court ruled against Mr. Loomis. The report added valuable information, it said, and Mr. Loomis would have gotten the same sentence based solely on the usual factors, including his crime — fleeing the police in a car — and his criminal history.

    At the same time, the court seemed uneasy with using a secret algorithm to send a man to prison. Justice Ann Walsh Bradley, writing for the court, discussed, for instance, a report from ProPublica about Compas that concluded that black defendants in Broward County, Fla., “were far more likely than white defendants to be incorrectly judged to be at a higher rate of recidivism.”

    Justice Bradley noted that Northpointe had disputed the analysis. Still, she wrote, “this study and others raise concerns regarding how a Compas assessment’s risk factors correlate with race.”
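    The ProPublica finding quoted above is about error rates: among defendants who did not go on to reoffend, what share were nonetheless labeled high risk. A minimal sketch of that comparison, using hypothetical counts (not ProPublica's actual Broward County data):

    ```python
    # Illustrative sketch only. A "false positive" here means a defendant
    # rated high-risk by the tool who did not in fact reoffend.
    # The counts below are hypothetical, chosen to show the shape of the
    # comparison, not to reproduce ProPublica's published figures.

    def false_positive_rate(high_risk_no_reoffend: int, total_no_reoffend: int) -> float:
        """Share of non-reoffending defendants who were labeled high risk."""
        return high_risk_no_reoffend / total_no_reoffend

    # Two hypothetical defendant groups with the same base size:
    group_a_fpr = false_positive_rate(45, 100)  # 45% wrongly flagged high risk
    group_b_fpr = false_positive_rate(23, 100)  # 23% wrongly flagged high risk

    print(f"Group A FPR: {group_a_fpr:.0%}, Group B FPR: {group_b_fpr:.0%}")
    ```

    A gap like this can exist even when the tool's overall accuracy is similar across groups, which is why the court's concern focused on how risk factors correlate with race rather than on accuracy alone.
    
    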

    Read it all at the link:
    https://www.nytimes.com/2017/05/01/...&contentPlacement=10&pgtype=sectionfront&_r=2
  2. searcher

    Face-reading AI will soon detect your political beliefs and IQ, researcher behind controversial 'gaydar' software claims
    • Researcher Dr Michal Kosinski went viral last week thanks to his 'gaydar' AI
    • The system identified whether someone was gay or straight based on photos
    • Dr Kosinski claims computers could be used to detect political beliefs and IQ
    • He says many personality traits are hidden in people's facial features


    Read more: http://www.dailymail.co.uk/sciencetech/article-4876230/Face-reading-AI-detect-political-beliefs-IQ.html#ixzz4sTbUOYsn