Law Times

October 1, 2018

The premier weekly newspaper for the legal profession in Ontario

Issue link: https://digital.lawtimesnews.com/i/1033733


'The impact on lives is very real'

Independent oversight of AI needed: report

BY AIDAN MACNAB
For Law Times

The Canadian government's use of artificial intelligence for decision-making in areas such as immigration needs independent oversight and carefully developed standards of use, says a new report from the University of Toronto.

U of T's International Human Rights Program at the Faculty of Law, along with the Citizen Lab at the Munk School of Global Affairs, has released "Bots at the Gate: A Human Rights Analysis of Automated Decision-Making in Canada's Immigration and Refugee System," which says the federal government has been using algorithms and other AI technology since 2014, turning the experiences of immigrants and refugee claimants into experiments for the new technology.

"Without appropriate safeguards and oversight, the use of AI is very risky," says Petra Molnar, a researcher at the IHRP at U of T's Faculty of Law.

"The impact on lives is very real because we're dealing with a population that doesn't have access to the same types of rights that others have.

"This is the last group that should be subject to technological experiments of this nature," says Molnar, whose training is in refugee and immigration law.

The report recommends independent oversight of all automated decision-making by the federal government, publication of all uses of AI for any purpose and the establishment of a task force composed of "government, civil society and academia."

The government has algorithmically automated certain duties formerly done by immigration officers, such as evaluating immigrant and visitor applications.

It has indicated plans to expand this technology, including evaluating the legitimacy of a marriage, whether an application is complete, whether a person should be protected as a refugee or if an applicant is a security risk.
Molnar says Charter rights to freedom of expression, freedom of religion, freedom of mobility, privacy, equality and freedom from discrimination are at stake given the use of this technology.

AI already has a "problematic track record" on race and gender, exemplified in predictive policing, she says.

"As a society, we are really excited about the use of these technologies. But we need to remember that AI is not neutral. It's essentially like a recipe, right? It depends what you put in it," Molnar says.

"And just like a human decision-maker, an algorithm will hold its own biases and replicate them throughout the system."

James Kosa is a technology lawyer and partner at WeirFoulds LLP in Toronto who advises clients involved in AI. He says the report "is correct in so many ways and demonstrates really advanced thinking on many aspects of artificial intelligence," but he would caution against rejecting the technology itself, focusing instead on how to develop it properly.

"I encourage the authors and government to not abandon AI as a potential tool but also to look at how to do it properly so that they can achieve its objectives of efficiency, the cost effectiveness and reliability, without sacrificing human rights."

Kosa says that, as people are imperfect machines themselves, unconscious bias can lead to arbitrary decision-making by real, human immigration officials, for example.

"The tool is not malicious or nefarious; it is just a tool," he says. "In some cases, it may end up being more accurate, reliable and responsive than the humans making the decisions."

Cynthia Khoo, a digital rights lawyer and fellow at the Citizen Lab, refers to a 2014 case in the U.K. where 7,000 foreign students were wrongfully deported because an automated voice analysis system proved unreliable.

Khoo says automated decision-making by the federal government raises serious questions about what happens with personal information.
The report was developed using interviews with government analysts, as well as policies, records and public statements made by the government, and information requests.

The researchers are waiting on responses to 27 access to information requests filed in April of this year.

The "limited" information the researchers obtained from government is "part of the problem" and led to their recommendation for greater transparency, says Khoo.

"The question isn't are we going to be using AI but more, if we are using it, how are we going to ensure that it's done right in an accountable kind of framework of oversight to make sure that the human rights ramifications are accounted for and make sure that we use these technologies critically and carefully," says Molnar.

[Photo caption: James Kosa says that, in some cases, artificial intelligence may "end up being more accurate, reliable and responsive than the humans making the decisions."]
