<html xmlns:v="urn:schemas-microsoft-com:vml" xmlns:o="urn:schemas-microsoft-com:office:office" xmlns:w="urn:schemas-microsoft-com:office:word" xmlns:m="http://schemas.microsoft.com/office/2004/12/omml" xmlns="http://www.w3.org/TR/REC-html40">
<head>
<meta http-equiv="Content-Type" content="text/html; charset=us-ascii">
<meta name="Generator" content="Microsoft Word 15 (filtered medium)">
<!--[if !mso]><style>v\:* {behavior:url(#default#VML);}
o\:* {behavior:url(#default#VML);}
w\:* {behavior:url(#default#VML);}
.shape {behavior:url(#default#VML);}
</style><![endif]--><style><!--
/* Font Definitions */
@font-face
        {font-family:"Cambria Math";
        panose-1:2 4 5 3 5 4 6 3 2 4;}
@font-face
        {font-family:Calibri;
        panose-1:2 15 5 2 2 2 4 3 2 4;}
@font-face
        {font-family:"Franklin Gothic Book";
        panose-1:2 11 5 3 2 1 2 2 2 4;}
@font-face
        {font-family:Georgia;
        panose-1:2 4 5 2 5 4 5 2 3 3;}
/* Style Definitions */
p.MsoNormal, li.MsoNormal, div.MsoNormal
        {margin:0in;
        font-size:11.0pt;
        font-family:"Calibri",sans-serif;}
a:link, span.MsoHyperlink
        {mso-style-priority:99;
        color:#0563C1;
        text-decoration:underline;}
span.EmailStyle17
        {mso-style-type:personal-compose;
        font-family:"Franklin Gothic Book",sans-serif;
        color:windowtext;}
.MsoChpDefault
        {mso-style-type:export-only;
        font-family:"Calibri",sans-serif;}
@page WordSection1
        {size:8.5in 11.0in;
        margin:1.0in 1.0in 1.0in 1.0in;}
div.WordSection1
        {page:WordSection1;}
--></style><!--[if gte mso 9]><xml>
<o:shapedefaults v:ext="edit" spidmax="1027" />
</xml><![endif]--><!--[if gte mso 9]><xml>
<o:shapelayout v:ext="edit">
<o:idmap v:ext="edit" data="1" />
</o:shapelayout></xml><![endif]-->
</head>
<body lang="EN-US" link="#0563C1" vlink="#954F72" style="word-wrap:break-word">
<div class="WordSection1">
<p style="mso-margin-top-alt:12.0pt;margin-right:0in;margin-bottom:0in;margin-left:0in">
<b><span style="font-family:"Arial",sans-serif;color:#222222;background:white">OLA IFC Tuesday Topics June 2021: Artificial Intelligence and Libraries</span></b><span style="font-size:12.0pt;font-family:"Georgia",serif"><o:p></o:p></span></p>
<p class="MsoNormal" style="margin-bottom:12.0pt"><o:p> </o:p></p>
<p style="margin:0in"><span style="font-family:"Arial",sans-serif;color:#222222;background:white">Welcome to Tuesday Topics, a monthly series covering topics with intellectual freedom implications for libraries of all types. Each message is prepared by a member
 of OLA's Intellectual Freedom Committee or a guest writer. Questions can be directed to the author of the topic or to the IFC.</span><o:p></o:p></p>
<p class="MsoNormal"><o:p> </o:p></p>
<p align="center" style="margin:0in;text-align:center"><span style="font-family:"Arial",sans-serif;color:#0563C1;border:none windowtext 1.0pt;padding:0in"><img width="392" height="156" style="width:4.0833in;height:1.625in" id="Picture_x0020_1" src="cid:image001.png@01D761C3.1B78D660"></span><o:p></o:p></p>
<p class="MsoNormal"><o:p> </o:p></p>
<p style="mso-margin-top-alt:0in;margin-right:0in;margin-bottom:8.0pt;margin-left:0in">
<b><span style="font-family:"Arial",sans-serif;color:black">What is AI?</span></b><o:p></o:p></p>
<p style="mso-margin-top-alt:0in;margin-right:0in;margin-bottom:8.0pt;margin-left:0in">
<span style="font-family:"Arial",sans-serif;color:black">Artificial intelligence (AI) is no longer just a science fiction trope. In fact, AI technologies have become so prevalent in our lives over the past few years that we encounter them daily
 in applications like GPS navigation, online shopping recommendations, targeted ads, chatbots, virtual assistants, and search engines, to name just a few. Artificial intelligence refers to all forms of
</span><a href="https://www.expert.ai/blog/machine-learning-definition/"><span style="font-family:"Arial",sans-serif;color:#1155CC">machine learning</span></a><span style="font-family:"Arial",sans-serif;color:black">, including
</span><a href="https://www.techopedia.com/definition/32902/deep-neural-network"><span style="font-family:"Arial",sans-serif;color:#1155CC">deep neural networks</span></a><span style="font-family:"Arial",sans-serif;color:black">, as well as
</span><a href="https://www.techopedia.com/definition/32309/computer-vision"><span style="font-family:"Arial",sans-serif;color:#1155CC">computer vision</span></a><span style="font-family:"Arial",sans-serif;color:black">,
</span><a href="https://www.techopedia.com/definition/653/natural-language-processing-nlp"><span style="font-family:"Arial",sans-serif;color:#1155CC">natural language processing</span></a><span style="font-family:"Arial",sans-serif;color:black">, and other
 complex algorithms that attempt to replicate human decision-making. These technologies have a wide variety of applications, revealing patterns in data that would take human analysts eons to process. The American Library Association’s
</span><a href="http://www.ala.org/tools/future"><span style="font-family:"Arial",sans-serif;color:#1155CC">Center for the Future of Libraries</span></a><span style="font-family:"Arial",sans-serif;color:black"> has identified
</span><a href="http://www.ala.org/tools/future/trends/artificialintelligence"><span style="font-family:"Arial",sans-serif;color:#1155CC">Artificial Intelligence</span></a><span style="font-family:"Arial",sans-serif;color:black">, along with several technologies
 powered by AI, such as </span><a href="http://www.ala.org/tools/future/trends/facialrecognition"><span style="font-family:"Arial",sans-serif;color:#1155CC">facial recognition</span></a><span style="font-family:"Arial",sans-serif;color:black"> and
</span><a href="http://www.ala.org/tools/future/trends/selfdriving"><span style="font-family:"Arial",sans-serif;color:#1155CC">self-driving cars</span></a><span style="font-family:"Arial",sans-serif;color:black">, as top technology trends relevant to libraries.
 As libraries adopt these technologies for things like digital text and image processing, algorithmic recommendations and discovery, and even virtual reference, it is important that we consider how these tools might impact intellectual freedom. </span><o:p></o:p></p>
<p style="mso-margin-top-alt:0in;margin-right:0in;margin-bottom:8.0pt;margin-left:0in">
<b><span style="font-family:"Arial",sans-serif;color:black">What are the IF concerns posed by AI?</span></b><o:p></o:p></p>
<p style="mso-margin-top-alt:0in;margin-right:0in;margin-bottom:8.0pt;margin-left:0in">
<span style="font-family:"Arial",sans-serif;color:black">As artificial intelligence transforms our lives and work, it has become clear that, as useful as it is, the technology poses numerous ethical concerns around bias, privacy, and misinformation. AI systems
 are trained on massive quantities of data, often harvested from social media and other online sources. There are concerns around how this data is collected, who controls it, how representative it is, and how it is used to target, profile, and manipulate people. Libraries
 must contend with these issues as they implement AI tools in their own practices and as they help library users navigate digital life. </span><o:p></o:p></p>
<p style="mso-margin-top-alt:0in;margin-right:0in;margin-bottom:8.0pt;margin-left:0in">
<i><span style="font-family:"Arial",sans-serif;color:black">Digital privacy and consent</span></i><o:p></o:p></p>
<p style="mso-margin-top-alt:0in;margin-right:0in;margin-bottom:8.0pt;margin-left:0in">
<span style="font-family:"Arial",sans-serif;color:black">Most people have probably heard by now “</span><a href="https://theconversation.com/if-its-free-online-you-are-the-product-95182"><span style="font-family:"Arial",sans-serif;color:#1155CC">if it’s free
 online, you’re the product</span></a><span style="font-family:"Arial",sans-serif;color:black">,” or more specifically, your data is the product. Whenever you click through a privacy agreement to use an app, you are most likely signing off on the collection,
 use, storage, and sale of your data. This includes your personal information, demographic data, location, and any and all interactions you have on the site. Big data is big business, and privacy policies are
</span><a href="https://www.usdirect.com/business/resource-center/privacy-policy-lengths/"><span style="font-family:"Arial",sans-serif;color:#1155CC">notoriously long</span></a><span style="font-family:"Arial",sans-serif;color:black"> and difficult to parse.
 As IFC member Miranda Doyle discussed in a previous </span><a href="https://ola.memberclicks.net/assets/IntellectualFreedom/TuesdayTopics/tuesdaytopicnovember2019.pdf"><span style="font-family:"Arial",sans-serif;color:#1155CC">Tuesday Topic about student
 privacy</span></a><span style="font-family:"Arial",sans-serif;color:black">, libraries and schools should be safeguarding the privacy of their users when contracting with third-party vendors.
</span><a href="https://libraryfreedom.org/scorecard/"><span style="font-family:"Arial",sans-serif;color:#1155CC">Library Freedom Project’s scorecard</span></a><span style="font-family:"Arial",sans-serif;color:black"> rates the privacy practices of some of
 the most popular library vendors, providing a starting point for selecting and negotiating with vendors. Governments have also begun enacting
</span><a href="https://www.cnbc.com/2021/04/08/from-california-to-brazil-gdpr-has-created-recipe-for-the-world.html"><span style="font-family:"Arial",sans-serif;color:#1155CC">regulations
</span></a><span style="font-family:"Arial",sans-serif;color:black">that give people more control over what data they share. However, many people don’t realize that anything on the web is easily scraped by outside companies, researchers, or individuals who
 want to harvest data, no consent required. Artificial intelligence developers frequently purchase or scrape the large quantities of data they need for machine learning projects. Such developers include companies like
</span><a href="https://www.nytimes.com/interactive/2021/03/18/magazine/facial-recognition-clearview-ai.html"><span style="font-family:"Arial",sans-serif;color:#1155CC">Clearview A.I.</span></a><span style="font-family:"Arial",sans-serif;color:black">, which
 secretly developed a real-time facial recognition database from images and data scraped from the open web.  </span><o:p></o:p></p>
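<p style="mso-margin-top-alt:0in;margin-right:0in;margin-bottom:8.0pt;margin-left:0in">
<span style="font-family:"Arial",sans-serif;color:black">To illustrate how low the barrier is, the short Python sketch below is a purely generic, hypothetical example (it uses only the standard library and a placeholder URL, and is not a description of how Clearview A.I. or any other company actually operates). It fetches a public page and harvests every link on it; at no point is anyone asked for consent.</span><o:p></o:p></p>
<pre style="font-size:10.0pt;font-family:Consolas,monospace">
# Minimal, hypothetical sketch: any publicly served page can be fetched and
# mined for data without the people who appear in it ever being asked.
import re
import urllib.request

# Placeholder URL; any public page can be processed the same way.
html = urllib.request.urlopen("https://example.com").read().decode("utf-8", "replace")

# Pull out every linked URL -- the kind of raw material that data brokers
# and AI developers collect at a massive scale.
links = re.findall(r'href="([^"]+)"', html)
print(len(links), "links harvested:", links[:5])
</pre>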
<p style="mso-margin-top-alt:0in;margin-right:0in;margin-bottom:8.0pt;margin-left:0in">
<i><span style="font-family:"Arial",sans-serif;color:black">Replicating and reinforcing bias</span></i><o:p></o:p></p>
<p style="mso-margin-top-alt:0in;margin-right:0in;margin-bottom:8.0pt;margin-left:0in">
<span style="font-family:"Arial",sans-serif;color:black">MIT researcher Joy Buolamwini’s work, as shown in the film
</span><a href="https://www.codedbias.com/"><span style="font-family:"Arial",sans-serif;color:#1155CC">Coded Bias</span></a><span style="font-family:"Arial",sans-serif;color:black">, has drawn attention to the fact that when facial recognition systems are trained
 on mostly white male faces, they perform poorly at identifying non-white or non-male faces, often
</span><a href="https://www.aclu.org/blog/privacy-technology/surveillance-technologies/amazons-face-recognition-falsely-matched-28"><span style="font-family:"Arial",sans-serif;color:#1155CC">misidentifying</span></a><span style="font-family:"Arial",sans-serif;color:black">
 or failing to identify them. This has led to wrongful arrests, like the case of 
</span><a href="https://www.washingtonpost.com/technology/2021/04/13/facial-recognition-false-arrest-lawsuit/"><span style="font-family:"Arial",sans-serif;color:#1155CC">Robert Williams</span></a><span style="font-family:"Arial",sans-serif;color:black"> in
 Detroit, and has spurred dozens of cities,</span><a href="https://www.cnn.com/2020/09/09/tech/portland-facial-recognition-ban/index.html"><span style="font-family:"Arial",sans-serif;color:#1155CC"> including Portland</span></a><span style="font-family:"Arial",sans-serif;color:black">,
 to ban the use of facial recognition technology by law enforcement and public agencies. Although this issue is most visible with facial recognition, the truth is that any machine learning system replicates, reinforces, and sometimes amplifies biases in the
 data it is trained on. The “black box” nature of AI algorithms can lead people to believe that the decisions these systems make are fairer than those made by humans, but as anyone who works with data knows, “garbage in, garbage out,” and much of the data fed into machine
 learning systems is not cleansed of the racist, sexist, and classist garbage of the society that produced it. Virginia Eubanks’ book
<i>Automating Inequality</i> and Cathy O’Neil’s <i>Weapons of Math Destruction</i> both
detail the harms caused by black box algorithms when they are unchecked by human empathy and judgment. This problem of bias in AI systems has been recognized in fields as wide-ranging as
</span><a href="https://venturebeat.com/2021/02/18/studies-find-racial-and-gender-bias-in-ai-models-that-recommend-ventilator-usage-and-diagnose-diseases/"><span style="font-family:"Arial",sans-serif;color:#1155CC">medical diagnostics</span></a><span style="font-family:"Arial",sans-serif;color:black">,
</span><a href="https://www.technologyreview.com/2020/07/17/1005396/predictive-policing-algorithms-racist-dismantled-machine-learning-bias-criminal-justice/"><span style="font-family:"Arial",sans-serif;color:#1155CC">predictive policing</span></a><span style="font-family:"Arial",sans-serif;color:black">,
 and even search engines. Safiya Noble’s book <i>Algorithms of Oppression</i> is a seminal work on understanding the bias implicit in the search algorithms we rely on every day and the ways they profile and misrepresent BIPOC and other marginalized demographic
 groups. </span><o:p></o:p></p>
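<p style="mso-margin-top-alt:0in;margin-right:0in;margin-bottom:8.0pt;margin-left:0in">
<span style="font-family:"Arial",sans-serif;color:black">For readers who want to see “garbage in, garbage out” in miniature, the Python sketch below is a hypothetical illustration on synthetic data using the scikit-learn library (it is not code from any of the works cited here): a simple classifier trained on data in which one group is badly underrepresented ends up noticeably less accurate for that group, with no malicious intent anywhere in the pipeline.</span><o:p></o:p></p>
<pre style="font-size:10.0pt;font-family:Consolas,monospace">
# Minimal, hypothetical sketch: imbalanced training data alone is enough to
# produce unequal error rates across groups.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

def make_group(n, shift):
    # Two noisy features; the label depends on them via a group-specific cutoff.
    X = rng.normal(loc=shift, scale=1.0, size=(n, 2))
    y = (X[:, 0] + X[:, 1] + rng.normal(scale=0.5, size=n) > 2 * shift).astype(int)
    return X, y

# Group A dominates the training data; group B is barely represented.
X_a, y_a = make_group(2000, shift=0.0)
X_b, y_b = make_group(50, shift=1.5)
model = LogisticRegression().fit(np.vstack([X_a, X_b]), np.concatenate([y_a, y_b]))

# Test on fresh, equally sized samples from each group: accuracy is much
# lower for the group the model rarely saw during training.
for name, shift in [("group A", 0.0), ("group B", 1.5)]:
    X_test, y_test = make_group(1000, shift)
    print(name, "accuracy:", round(model.score(X_test, y_test), 2))
</pre>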
<p style="mso-margin-top-alt:0in;margin-right:0in;margin-bottom:8.0pt;margin-left:0in">
<i><span style="font-family:"Arial",sans-serif;color:black">Deep fakes and viral misinformation</span></i><o:p></o:p></p>
<p style="mso-margin-top-alt:0in;margin-right:0in;margin-bottom:8.0pt;margin-left:0in">
<span style="font-family:"Arial",sans-serif;color:black">Another threat to intellectual freedom is the rise of
</span><a href="https://www.brookings.edu/wp-content/uploads/2020/06/The-role-of-technology-in-online-misinformation.pdf"><span style="font-family:"Arial",sans-serif;color:#1155CC">misinformation produced and spread by artificial intelligence systems</span></a><span style="font-family:"Arial",sans-serif;color:black">.
 Artificially produced content, including text, images, and video, has become so sophisticated that it is nearly indistinguishable from real content. Malicious actors have used fake AI-produced content to interfere with elections and sow widespread confusion.
 Social media algorithms that prioritize high levels of engagement have also been shown to
</span><a href="https://www.brookings.edu/blog/order-from-chaos/2018/05/09/how-misinformation-spreads-on-social-media-and-what-to-do-about-it/"><span style="font-family:"Arial",sans-serif;color:#1155CC">spread misinformation more quickly</span></a><span style="font-family:"Arial",sans-serif;color:black">
 than true stories. Ironically, AI has also been proposed as a solution to the problem of misinformation, as the same tools that are able to produce fake content are best able to detect it. However, this raises questions about how much trust to put in such
 moderation algorithms to define what is true. </span><o:p></o:p></p>
<p style="mso-margin-top-alt:0in;margin-right:0in;margin-bottom:8.0pt;margin-left:0in">
<b><span style="font-family:"Arial",sans-serif;color:black">What can libraries do? </span></b><o:p></o:p></p>
<p style="mso-margin-top-alt:0in;margin-right:0in;margin-bottom:8.0pt;margin-left:0in">
<span style="font-family:"Arial",sans-serif;color:black">AI is here to stay. It offers undeniable benefits, such as better accessibility with speech-to-text and conversational searching, tools for managing and analyzing digital documents, and improved search
 and discovery. Libraries are experimenting with AI to improve optical character recognition in text documents, automate processing of digital images, provide recommendations to users based on their past searches, and offer virtual reference assistance. These
 uses of AI are potentially transformative, but the ethical issues inherent in these technologies also threaten values that librarians hold dear. Libraries using AI applications need to be aware of these intellectual freedom issues, work to mitigate them, and
 help users to understand them. </span><o:p></o:p></p>
<p style="mso-margin-top-alt:0in;margin-right:0in;margin-bottom:8.0pt;margin-left:0in">
<span style="font-family:"Arial",sans-serif;color:black">As digital literacy advocates and conduits to emerging technologies, libraries have an opportunity to demystify and democratize AI technologies and advocate for more equitable data practices. In 2019
 the Urban Libraries Council launched an </span><a href="https://www.urbanlibraries.org/initiatives/securing-digital-democracy"><span style="font-family:"Arial",sans-serif;color:#1155CC">AI and Digital Citizenship initiative</span></a><span style="font-family:"Arial",sans-serif;color:black">,
 calling for libraries to get ahead of the curve by educating themselves and their users about AI and incorporating the technology into their services in an ethical and transparent way. The International Federation of Library Associations (IFLA) also advocates for
</span><a href="https://www.ifla.org/publications/node/93397"><span style="font-family:"Arial",sans-serif;color:#1155CC">ethical use of AI in libraries</span></a><span style="font-family:"Arial",sans-serif;color:black">. Libraries have
</span><a href="https://americanlibrariesmagazine.org/2019/03/01/exploring-ai/"><span style="font-family:"Arial",sans-serif;color:#1155CC">partnered with AI researchers</span></a><span style="font-family:"Arial",sans-serif;color:black"> to develop library-specific
 apps and programs to teach people about AI and with advocacy organizations to provide creative
</span><a href="https://americanlibrariesmagazine.org/2020/09/01/dragging-ai-facial-recognition-software/"><span style="font-family:"Arial",sans-serif;color:#1155CC">programs</span></a><span style="font-family:"Arial",sans-serif;color:black"> about issues of
 bias and surveillance. Libraries have also enabled hands-on exploration through </span>
<a href="https://ejournals.bc.edu/index.php/ital/article/view/10974"><span style="font-family:"Arial",sans-serif;color:#1155CC">maker kits</span></a><span style="font-family:"Arial",sans-serif;color:black"> or
</span><a href="https://web.uri.edu/ai/"><span style="font-family:"Arial",sans-serif;color:#1155CC">lab spaces</span></a><span style="font-family:"Arial",sans-serif;color:black"> and encouraged civic engagement by hosting
</span><a href="https://vimeo.com/374180709"><span style="font-family:"Arial",sans-serif;color:#1155CC">community conversations</span></a><span style="font-family:"Arial",sans-serif;color:black">.  Providing broad access to AI technology and helping people
 understand it is a first step toward </span><a href="https://www.nytimes.com/2021/03/15/technology/artificial-intelligence-google-bias.html"><span style="font-family:"Arial",sans-serif;color:#1155CC">diversifying the artificial intelligence workforce</span></a><span style="font-family:"Arial",sans-serif;color:black">,
 developing public policy solutions, and reducing the problem of bias. Informed librarians can also provide human guidance, such as inclusive data curation and technological literacy, to mitigate the harms of unchecked algorithms. </span><o:p></o:p></p>
<p style="margin:0in"><b><span style="font-family:"Arial",sans-serif;color:black">Ellie Avis</span></b><o:p></o:p></p>
<p style="margin:0in"><span style="font-family:"Arial",sans-serif;color:black">OLA Intellectual Freedom Committee Member</span><o:p></o:p></p>
<p style="margin:0in"><span style="font-family:"Arial",sans-serif;color:black">Technical Services Manager, Josephine Community Library</span><o:p></o:p></p>
<p class="MsoNormal"><o:p> </o:p></p>
<p style="mso-margin-top-alt:0in;margin-right:0in;margin-bottom:8.0pt;margin-left:0in">
<b><span style="font-family:"Arial",sans-serif;color:black">Learn More: </span></b><o:p></o:p></p>
<p style="mso-margin-top-alt:0in;margin-right:0in;margin-bottom:8.0pt;margin-left:0in">
<b><span style="font-family:"Arial",sans-serif;color:black">AI in Libraries</span></b><o:p></o:p></p>
<p style="mso-margin-top-alt:0in;margin-right:0in;margin-bottom:8.0pt;margin-left:0in">
<span style="font-family:"Arial",sans-serif;color:black">Center for the Future of Libraries. (n.d.).
<i>Trends</i>. American Library Association. </span><a href="http://www.ala.org/tools/future/trends"><span style="font-family:"Arial",sans-serif;color:#1155CC">http://www.ala.org/tools/future/trends</span></a><span style="font-family:"Arial",sans-serif;color:black"> </span><o:p></o:p></p>
<p style="mso-margin-top-alt:0in;margin-right:0in;margin-bottom:8.0pt;margin-left:0in">
<span style="font-family:"Arial",sans-serif;color:black">Finley, T. K. (2019). The democratization of artificial intelligence: One library’s approach.
<i>Information Technology and Libraries</i>, <i>38</i>(1), 8-13. </span><a href="https://doi.org/10.6017/ital.v38i1.10974"><span style="font-family:"Arial",sans-serif">https://doi.org/10.6017/ital.v38i1.10974</span></a><span style="font-family:"Arial",sans-serif;color:black"> </span><o:p></o:p></p>
<p style="mso-margin-top-alt:0in;margin-right:0in;margin-bottom:8.0pt;margin-left:0in">
<span style="font-family:"Arial",sans-serif;color:black">Garcia-Febo, L. (2019, March 1). Exploring AI: How libraries are starting to apply artificial intelligence in their work.
<i>American Libraries</i>. </span><a href="https://americanlibrariesmagazine.org/2019/03/01/exploring-ai/"><span style="font-family:"Arial",sans-serif">https://americanlibrariesmagazine.org/2019/03/01/exploring-ai/</span></a><span style="font-family:"Arial",sans-serif;color:black"> </span><o:p></o:p></p>
<p style="mso-margin-top-alt:0in;margin-right:0in;margin-bottom:8.0pt;margin-left:0in">
<span style="font-family:"Arial",sans-serif;color:black">Ghosh, S. (2021, March 15).
<i>Future of AI in libraries</i>. SJSU School of Information. </span><a href="https://ischool.sjsu.edu/ciri-blog/future-ai-libraries"><span style="font-family:"Arial",sans-serif;color:#1155CC">https://ischool.sjsu.edu/ciri-blog/future-ai-libraries</span></a><span style="font-family:"Arial",sans-serif;color:black"> </span><o:p></o:p></p>
<p style="mso-margin-top-alt:0in;margin-right:0in;margin-bottom:8.0pt;margin-left:0in">
<span style="font-family:"Arial",sans-serif;color:black">IFLA. (2020, October 21).
<i>Statement on Libraries and Artificial Intelligence. </i></span><a href="https://www.ifla.org/publications/node/93397"><span style="font-family:"Arial",sans-serif">https://www.ifla.org/publications/node/93397</span></a><span style="font-family:"Arial",sans-serif;color:black"> </span><o:p></o:p></p>
<p style="mso-margin-top-alt:0in;margin-right:0in;margin-bottom:8.0pt;margin-left:0in">
<span style="font-family:"Arial",sans-serif;color:black">Wheatley, A., & Hervieux, S. (2020). Artificial intelligence in academic libraries: An environmental scan.
<i>Information Services & Use, 39</i>(4), 347–356.</span><o:p></o:p></p>
<p class="MsoNormal"><o:p> </o:p></p>
<p style="mso-margin-top-alt:0in;margin-right:0in;margin-bottom:8.0pt;margin-left:0in">
<b><span style="font-family:"Arial",sans-serif;color:black">Algorithmic Bias</span></b><o:p></o:p></p>
<p style="mso-margin-top-alt:0in;margin-right:0in;margin-bottom:8.0pt;margin-left:0in">
<span style="font-family:"Arial",sans-serif;color:black">Altman, A. (2021, May 20). Users, bias, and sustainability in A.I.
<i>Digital Public Library of America.</i> </span><a href="https://dp.la/news/users-bias-and-sustainability-in-ai"><span lang="ES" style="font-family:"Arial",sans-serif;color:#1155CC">https://dp.la/news/users-bias-and-sustainability-in-ai</span></a><span lang="ES" style="font-family:"Arial",sans-serif;color:black"> </span><span lang="ES"><o:p></o:p></span></p>
<p style="mso-margin-top-alt:0in;margin-right:0in;margin-bottom:8.0pt;margin-left:0in">
<span lang="ES" style="font-family:"Arial",sans-serif;color:black">Coded Bias. (2020).
</span><a href="https://www.codedbias.com/"><span lang="ES" style="font-family:"Arial",sans-serif;color:#1155CC">https://www.codedbias.com</span></a><span lang="ES"><o:p></o:p></span></p>
<p style="mso-margin-top-alt:0in;margin-right:0in;margin-bottom:8.0pt;margin-left:0in">
<span lang="ES" style="font-family:"Arial",sans-serif;color:black">Eubanks, V. (2018).
</span><i><span style="font-family:"Arial",sans-serif;color:black">Automating inequality: How high-tech tools profile, police, and punish the poor.
</span></i><span style="font-family:"Arial",sans-serif;color:black">St. Martin’s Press.</span><o:p></o:p></p>
<p style="mso-margin-top-alt:0in;margin-right:0in;margin-bottom:8.0pt;margin-left:0in">
<span style="font-family:"Arial",sans-serif;color:black">Gilman, M. & Madden, M. (2021). Digital barriers to economic justice in the wake of COVID-19.
<i>Data & Society.</i> </span><a href="https://datasociety.net/library/digital-barriers-to-economic-justice-in-the-wake-of-covid-19/"><span style="font-family:"Arial",sans-serif;color:#1155CC">https://datasociety.net/library/digital-barriers-to-economic-justice-in-the-wake-of-covid-19</span></a><o:p></o:p></p>
<p style="mso-margin-top-alt:0in;margin-right:0in;margin-bottom:8.0pt;margin-left:0in">
<span style="font-family:"Arial",sans-serif;color:black">Harwell, D. (2021, April 13). Wrongfully arrested man sues Detroit police over false facial recognition match.
<i>Washington Post. </i></span><a href="https://www.washingtonpost.com/technology/2021/04/13/facial-recognition-false-arrest-lawsuit/"><span style="font-family:"Arial",sans-serif;color:#1155CC">https://www.washingtonpost.com/technology/2021/04/13/facial-recognition-false-arrest-lawsuit/</span></a><i><span style="font-family:"Arial",sans-serif;color:black"> </span></i><o:p></o:p></p>
<p style="mso-margin-top-alt:0in;margin-right:0in;margin-bottom:8.0pt;margin-left:0in">
<span style="font-family:"Arial",sans-serif;color:black">Metz, C. (2021, March 15). Who is making sure the A.I. machines aren’t racist?
<i>New York Times</i>. </span><a href="https://www.nytimes.com/2021/03/15/technology/artificial-intelligence-google-bias.html"><span style="font-family:"Arial",sans-serif;color:#1155CC">https://www.nytimes.com/2021/03/15/technology/artificial-intelligence-google-bias.html</span></a><span style="font-family:"Arial",sans-serif;color:black"> </span><o:p></o:p></p>
<p style="mso-margin-top-alt:0in;margin-right:0in;margin-bottom:8.0pt;margin-left:0in">
<span style="font-family:"Arial",sans-serif;color:black">Noble, S. U. (2018). <i>
Algorithms of oppression: How search engines reinforce racism</i>. NYU Press. </span><o:p></o:p></p>
<p style="mso-margin-top-alt:0in;margin-right:0in;margin-bottom:8.0pt;margin-left:0in">
<span style="font-family:"Arial",sans-serif;color:black">O’Neil, C. (2017). <i>Weapons of math destruction: How big data increases inequality and threatens democracy.
</i>Crown. </span><o:p></o:p></p>
<p style="mso-margin-top-alt:0in;margin-right:0in;margin-bottom:8.0pt;margin-left:0in">
<span style="font-family:"Arial",sans-serif;color:black">Snow, J. (2018, July 26). Amazon’s face recognition falsely matched 28 members of Congress with mugshots.
<i>American Civil Liberties Union. </i></span><a href="https://www.aclu.org/blog/privacy-technology/surveillance-technologies/amazons-face-recognition-falsely-matched-28"><span style="font-family:"Arial",sans-serif;color:#1155CC">https://www.aclu.org/blog/privacy-technology/surveillance-technologies/amazons-face-recognition-falsely-matched-28</span></a><span style="font-family:"Arial",sans-serif;color:black"> </span><o:p></o:p></p>
<p style="mso-margin-top-alt:0in;margin-right:0in;margin-bottom:8.0pt;margin-left:0in">
<span style="font-family:"Arial",sans-serif;color:black">Wiggers, K. (2021, February 18). Studies find bias in AI models that recommend treatments and diagnose diseases.
<i>VentureBeat. </i></span><a href="https://venturebeat.com/2021/02/18/studies-find-racial-and-gender-bias-in-ai-models-that-recommend-ventilator-usage-and-diagnose-diseases/"><span style="font-family:"Arial",sans-serif;color:#1155CC">https://venturebeat.com/2021/02/18/studies-find-racial-and-gender-bias-in-ai-models-that-recommend-ventilator-usage-and-diagnose-diseases/</span></a><span style="font-family:"Arial",sans-serif;color:black"> </span><o:p></o:p></p>
<p class="MsoNormal"><o:p> </o:p></p>
<p style="mso-margin-top-alt:0in;margin-right:0in;margin-bottom:8.0pt;margin-left:0in">
<b><span style="font-family:"Arial",sans-serif;color:black">Laws and Regulations</span></b><o:p></o:p></p>
<p style="mso-margin-top-alt:0in;margin-right:0in;margin-bottom:8.0pt;margin-left:0in">
<span style="font-family:"Arial",sans-serif;color:black">Keane, J. (2021, April 8). From California to Brazil: Europe’s privacy laws have created a recipe for the world.
</span><i><span lang="ES" style="font-family:"Arial",sans-serif;color:black">CNBC</span></i><span lang="ES" style="font-family:"Arial",sans-serif;color:black">.
</span><a href="https://www.cnbc.com/2021/04/08/from-california-to-brazil-gdpr-has-created-recipe-for-the-world.html"><span lang="ES" style="font-family:"Arial",sans-serif;color:#1155CC">https://www.cnbc.com/2021/04/08/from-california-to-brazil-gdpr-has-created-recipe-for-the-world.html</span></a><span lang="ES" style="font-family:"Arial",sans-serif;color:black"> </span><span lang="ES"><o:p></o:p></span></p>
<p style="mso-margin-top-alt:0in;margin-right:0in;margin-bottom:8.0pt;margin-left:0in">
<span style="font-family:"Arial",sans-serif;color:black">Metz, R. (2020, September 9). Portland passes broadest facial recognition ban in the US.
<i>CNN Business.</i> </span><a href="https://www.cnn.com/2020/09/09/tech/portland-facial-recognition-ban/index.html"><span style="font-family:"Arial",sans-serif;color:#1155CC">https://www.cnn.com/2020/09/09/tech/portland-facial-recognition-ban/index.html</span></a><o:p></o:p></p>
<p class="MsoNormal"><o:p> </o:p></p>
<p style="mso-margin-top-alt:0in;margin-right:0in;margin-bottom:8.0pt;margin-left:0in">
<b><span style="font-family:"Arial",sans-serif;color:black">Misinformation</span></b><o:p></o:p></p>
<p style="mso-margin-top-alt:0in;margin-right:0in;margin-bottom:8.0pt;margin-left:0in">
<span style="font-family:"Arial",sans-serif;color:black">Kreps, S. (2020, June). <i>
The role of technology in online misinformation</i>. Foreign Policy at Brookings.
</span><a href="https://www.brookings.edu/wp-content/uploads/2020/06/The-role-of-technology-in-online-misinformation.pdf"><span style="font-family:"Arial",sans-serif;color:#1155CC">https://www.brookings.edu/wp-content/uploads/2020/06/The-role-of-technology-in-online-misinformation.pdf</span></a><o:p></o:p></p>
<p style="mso-margin-top-alt:0in;margin-right:0in;margin-bottom:8.0pt;margin-left:0in">
<span style="font-family:"Arial",sans-serif;color:black">Meserole, C. (2018, May 9). How misinformation spreads on social media—And what to do about it.
<i>Order from Chaos. </i>Brookings. </span><a href="https://www.brookings.edu/blog/order-from-chaos/2018/05/09/how-misinformation-spreads-on-social-media-and-what-to-do-about-it/"><span style="font-family:"Arial",sans-serif;color:#1155CC">https://www.brookings.edu/blog/order-from-chaos/2018/05/09/how-misinformation-spreads-on-social-media-and-what-to-do-about-it/</span></a><span style="font-family:"Arial",sans-serif;color:black"> </span><o:p></o:p></p>
<p class="MsoNormal"><o:p> </o:p></p>
<p style="mso-margin-top-alt:0in;margin-right:0in;margin-bottom:8.0pt;margin-left:0in">
<b><span style="font-family:"Arial",sans-serif;color:black">Digital Privacy</span></b><o:p></o:p></p>
<p style="mso-margin-top-alt:0in;margin-right:0in;margin-bottom:8.0pt;margin-left:0in">
<span style="font-family:"Arial",sans-serif;color:black">Hill, K. (2021, March 18). Your face is not your own.
<i>New York Times Magazine. </i></span><a href="https://www.nytimes.com/interactive/2021/03/18/magazine/facial-recognition-clearview-ai.html"><span style="font-family:"Arial",sans-serif;color:#1155CC">https://www.nytimes.com/interactive/2021/03/18/magazine/facial-recognition-clearview-ai.html</span></a><span style="font-family:"Arial",sans-serif;color:black"> </span><o:p></o:p></p>
<p style="mso-margin-top-alt:0in;margin-right:0in;margin-bottom:8.0pt;margin-left:0in">
<span style="font-family:"Arial",sans-serif;color:black">Hodge, K. (2018, April 19). If it’s free online, you are the product.
<i>The Conversation. </i></span><a href="https://theconversation.com/if-its-free-online-you-are-the-product-95182"><span style="font-family:"Arial",sans-serif;color:#1155CC">https://theconversation.com/if-its-free-online-you-are-the-product-95182</span></a><span style="font-family:"Arial",sans-serif;color:black"> </span><o:p></o:p></p>
<p style="mso-margin-top-alt:0in;margin-right:0in;margin-bottom:8.0pt;margin-left:0in">
<span style="font-family:"Arial",sans-serif;color:black">Roderick, M. (2021, January 18). Visualizing the length of privacy policies.
<i>USDirect. </i></span><a href="https://www.usdirect.com/business/resource-center/privacy-policy-lengths/"><span style="font-family:"Arial",sans-serif;color:#1155CC">https://www.usdirect.com/business/resource-center/privacy-policy-lengths/</span></a><span style="font-family:"Arial",sans-serif;color:black"> </span><o:p></o:p></p>
<p class="MsoNormal"><span style="font-family:"Franklin Gothic Book",sans-serif"><o:p> </o:p></span></p>
<p class="MsoNormal"><!--[if gte vml 1]><v:shapetype id="_x0000_t75" coordsize="21600,21600" o:spt="75" o:preferrelative="t" path="m@4@5l@4@11@9@11@9@5xe" filled="f" stroked="f">
<v:stroke joinstyle="miter" />
<v:formulas>
<v:f eqn="if lineDrawn pixelLineWidth 0" />
<v:f eqn="sum @0 1 0" />
<v:f eqn="sum 0 0 @1" />
<v:f eqn="prod @2 1 2" />
<v:f eqn="prod @3 21600 pixelWidth" />
<v:f eqn="prod @3 21600 pixelHeight" />
<v:f eqn="sum @0 0 1" />
<v:f eqn="prod @6 1 2" />
<v:f eqn="prod @7 21600 pixelWidth" />
<v:f eqn="sum @8 21600 0" />
<v:f eqn="prod @7 21600 pixelHeight" />
<v:f eqn="sum @10 21600 0" />
</v:formulas>
<v:path o:extrusionok="f" gradientshapeok="t" o:connecttype="rect" />
<o:lock v:ext="edit" aspectratio="t" />
</v:shapetype><v:shape id="Straight_x0020_Connector_x0020_2" o:spid="_x0000_s1026" type="#_x0000_t75" style='position:absolute;margin-left:-2.25pt;margin-top:4.2pt;width:317.25pt;height:.75pt;z-index:251659264;visibility:visible;mso-wrap-distance-left:9pt;mso-wrap-distance-top:0;mso-wrap-distance-right:9pt;mso-wrap-distance-bottom:0;mso-position-horizontal:absolute;mso-position-horizontal-relative:text;mso-position-vertical:absolute;mso-position-vertical-relative:text'>
<v:imagedata src="cid:image002.png@01D761C3.1B78D660" o:title="" />
<o:lock v:ext="edit" aspectratio="f" />
</v:shape><![endif]--><![if !vml]><span style="mso-ignore:vglayout;position:relative;z-index:251659264;left:-3px;top:6px;width:423px;height:7px"><img width="423" height="1" style="width:4.4062in;height:.0104in" src="cid:image002.png@01D761C3.1B78D660" v:shapes="Straight_x0020_Connector_x0020_2"></span><![endif]><span style="font-family:"Franklin Gothic Book",sans-serif"><o:p> </o:p></span></p>
<p class="MsoNormal"><span style="font-family:"Franklin Gothic Book",sans-serif"><o:p> </o:p></span></p>
<p class="MsoNormal"><span style="font-family:"Franklin Gothic Book",sans-serif"><o:p> </o:p></span></p>
<br style="mso-ignore:vglayout" clear="ALL">
<p class="MsoNormal"><span style="font-family:"Franklin Gothic Book",sans-serif">Ellie Avis (she/her)<o:p></o:p></span></p>
<p class="MsoNormal"><span style="font-family:"Franklin Gothic Book",sans-serif">Technical Services Mgr<o:p></o:p></span></p>
<p class="MsoNormal"><span style="font-family:"Franklin Gothic Book",sans-serif">Josephine Community Library<o:p></o:p></span></p>
<p class="MsoNormal"><span style="font-family:"Franklin Gothic Book",sans-serif">541-476-0571 x113<o:p></o:p></span></p>
<p class="MsoNormal"><span style="font-family:"Franklin Gothic Book",sans-serif"><o:p> </o:p></span></p>
<p class="MsoNormal"><span style="font-size:10.0pt;font-family:"Franklin Gothic Book",sans-serif">“It wasn’t until I started reading and found books they wouldn’t let us read in school that I discovered you could be insane and happy and have a good life without
 being like everybody else.”  – John Waters<o:p></o:p></span></p>
<p class="MsoNormal"><o:p> </o:p></p>
</div>
</body>
</html>