<?xml version="1.0" encoding="UTF-8"?>
<?xml-stylesheet type="text/xsl" href="titles.xsl"?>
<record
    biblionix-libraryname="Mary Riley Styles Public Library"
    biblionix-libraryid="1263"
    biblionix-libraryusername="fallschurch"
    xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
    xsi:schemaLocation="http://www.loc.gov/MARC21/slim http://www.loc.gov/standards/marcxml/schema/MARC21slim.xsd"
    xmlns="http://www.loc.gov/MARC21/slim">

  <leader>02675cam a2200325 i 4500</leader>
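  <!-- Control fields (MARC21 fixed fields): 001 is the record control number
       assigned by the agency named in 003; 005 is the date/time of the latest
       transaction (YYYYMMDDHHMMSS.F); 008 encodes fixed-length data elements,
       here: entered 2022-07-09 (positions 00-05), single publication date
       2023 (06-10), language "eng" (35-37). -->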
  <controlfield tag="001">889030726</controlfield>
  <controlfield tag="003">TxAuBib</controlfield>
  <controlfield tag="005">20230731120000.0</controlfield>
  <controlfield tag="008">220709s2023||||||||||||||||||||||||eng|u</controlfield>
  <datafield tag="010" ind1=" " ind2=" ">
    <subfield code="a">2022019913</subfield>
  </datafield>
  <datafield tag="020" ind1=" " ind2=" ">
    <subfield code="a">9780262047654</subfield>
    <subfield code="q">HRD</subfield>
    <subfield code="c">26.95</subfield>
  </datafield>
  <datafield tag="020" ind1=" " ind2=" ">
    <subfield code="a">0262047659</subfield>
    <subfield code="q">HRD</subfield>
    <subfield code="c">26.95</subfield>
  </datafield>
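  <!-- Paired 020 fields give the ISBN-13 and ISBN-10 for the same hardcover
       issue; $q is qualifying information and $c the terms of availability
       (list price). -->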
  <datafield tag="040" ind1=" " ind2=" ">
    <subfield code="d">TxAuBib</subfield>
    <subfield code="e">rda</subfield>
  </datafield>
  <datafield tag="100" ind1="1" ind2=" ">
    <subfield code="a">Broussard, Meredith,</subfield>
    <subfield code="e">author</subfield>
    <subfield code="t">More than a glitch.</subfield>
  </datafield>
  <datafield tag="245" ind1="1" ind2=" ">
    <subfield code="a">More than a glitch</subfield>
    <subfield code="h">[BOOK] :</subfield>
    <subfield code="b">confronting race, gender, and ability bias in tech /</subfield>
    <subfield code="c">Meredith Broussard.</subfield>
  </datafield>
  <datafield tag="264" ind1=" " ind2="1">
    <subfield code="a">Cambridge, Massachusetts : </subfield>
    <subfield code="b">The MIT Press, </subfield>
    <subfield code="c">[2023]</subfield>
  </datafield>
  <datafield tag="300" ind1=" " ind2=" ">
    <subfield code="a">234 pages :</subfield>
    <subfield code="b">illustrations ;</subfield>
    <subfield code="c">24 cm.</subfield>
  </datafield>
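  <!-- RDA content/media/carrier triplet: 336 "txt" = text, 337 "n" =
       unmediated, 338 "nc" = volume. In an RDA record these supersede the
       old 245 $h general material designation. -->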
  <datafield tag="336" ind1=" " ind2=" ">
    <subfield code="b">txt</subfield>
    <subfield code="2">rdacontent</subfield>
  </datafield>
  <datafield tag="337" ind1=" " ind2=" ">
    <subfield code="b">n</subfield>
    <subfield code="2">rdamedia</subfield>
  </datafield>
  <datafield tag="338" ind1=" " ind2=" ">
    <subfield code="b">nc</subfield>
    <subfield code="2">rdacarrier</subfield>
  </datafield>
  <datafield tag="504" ind1=" " ind2=" ">
    <subfield code="a">Includes bibliographical references (pages 193-332) and index.</subfield>
  </datafield>
  <datafield tag="505" ind1=" " ind2=" ">
    <subfield code="a">Understanding machine bias -- Recognizing bias in facial recognition -- Machine fairness and the justice system -- Real students, imaginary grades -- Ability and technology -- Gender rights and databases -- Diagnosing racism -- An AI told me I had cancer-- Creating public interest technology -- Potential reboot.</subfield>
  </datafield>
  <datafield tag="520" ind1=" " ind2=" ">
    <subfield code="a">The word “glitch” implies an incidental error, as easy to patch up as it is to identify. But what if racism, sexism, and ableism aren’t just bugs in mostly functional machinery—what if they’re coded into the system itself? Meredith Broussard demonstrates in More Than a Glitch how neutrality in tech is a myth and why algorithms need to be held accountable. Broussard, a data scientist and one of the few Black female researchers in artificial intelligence, synthesizes concepts from computer science and sociology. She explores a range of examples: from facial recognition technology trained only to recognize lighter skin tones, to mortgage-approval algorithms that encourage discriminatory lending, to the dangerous feedback loops that arise when medical diagnostic algorithms are trained on insufficiently diverse data. Even when such technologies are designed with good intentions, Broussard shows, fallible humans develop programs that can result in devastating consequences. Broussard argues that the solution isn’t to make omnipresent tech more inclusive, but to root out the algorithms that target certain demographics as “other” to begin with.</subfield>
    <subfield code="c">Provided by publisher.</subfield>
  </datafield>
  <datafield tag="541" ind1=" " ind2=" ">
    <subfield code="d">20230731.</subfield>
  </datafield>
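  <!-- Topical subject headings (650). Second indicator 0 marks Library of
       Congress Subject Headings; assumed here from the headings' LCSH-style
       form and subdivisions. -->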
  <datafield tag="650" ind1=" " ind2=" ">
    <subfield code="a">Technology</subfield>
    <subfield code="x">Social aspects.</subfield>
  </datafield>
  <datafield tag="650" ind1=" " ind2=" ">
    <subfield code="a">Data processing</subfield>
    <subfield code="x">Social aspects.</subfield>
  </datafield>
  <datafield tag="650" ind1=" " ind2=" ">
    <subfield code="a">Artificial intelligence</subfield>
    <subfield code="x">Social aspects.</subfield>
  </datafield>
  <datafield tag="650" ind1=" " ind2=" ">
    <subfield code="a">Discrimination.</subfield>
  </datafield>
  <datafield tag="650" ind1=" " ind2=" ">
    <subfield code="a">Software failures.</subfield>
  </datafield>
</record>