The premier weekly newspaper for the legal profession in Ontario
Issue link: https://digital.lawtimesnews.com/i/1090990
LAW TIMES | COVERING ONTARIO'S LEGAL SCENE | MARCH 11, 2019 | www.lawtimesnews.com

LSO event focuses on tech: Laws should monitor bias in AI, experts say

BY ANITA BALAKRISHNAN
Law Times

ARTIFICIAL intelligence expert To Anh Tran said that in the wake of Europe's privacy laws, other lawmakers must step up to minimize the concerns about bias and artificial intelligence.

Tran was one of a panel of experts who spoke at the Law Society of Ontario's event in Toronto to mark International Women's Day, on March 6.

"Globally, government would have to make some effort — everyone would have to make some effort — to step up and make those types of regulations: norms, values, ethics," said Tran. "In the United States, we have [the Food and Drug Administration], which has a drug approval process. We need to do something similar with artificial intelligence, where we know which products we want to be accepted . . . I'm glad that Europe has already raised the flag, as you know, with GDPR. When it comes to the privacy of the data, most of the application has to be in compliance, and if not, you don't do the business."

Members of the panel were Helen Kontozopoulos, a startup founder and University of Toronto professor, along with Tran, a senior technology advisor for the Canadian government. Quebec lawyer Valentine Goddard, founder and CEO of AI Impact Alliance, moderated the panel.

Law has not been immune to the rise of artificial intelligence technology, as evidenced by Toronto companies such as DealMaker, a cloud platform to help lawyers manage transactions, said Tran.

"That platform allows you to automate an entire deal from a legal perspective. Legal fields are so risk-averse with anything that is outside of their firewall," said Tran. "Data, signatures, electronic approvals, moving the documentation. . . . As it is today, [lawyers] are probably spending 80 per cent of their time doing administrative tasks."

But the use of artificial intelligence in law enforcement has raised concerns about its bias, she said.
Tran pointed to a U.S. case, Wisconsin v. Loomis, which was published after the judge used a risk-assessment technology called Compas in deciding the sentence for the defendant. In that case, the defendant claimed that the use of Compas violated his right to due process, partially because "it improperly uses gendered assessments in sentencing," but the state's Supreme Court disagreed.

"When you check the data and you analyze on a certain population that is not the entire population, certainly, the bias is there," said Tran.

Goddard said the Supreme Court of Canada last year also raised the issue of bias in technology in Ewert v. Canada, 2018 SCC 30. That case explored a series of tools used by Correctional Service Canada to assess the risk of recidivism. Appellant Jeffrey Ewert claimed that the set of tools was "developed and tested on predominantly non-Indigenous populations and that there was no research confirming that they were valid when applied to Indigenous persons," the SCC's split decision said.

Kontozopoulos said that if an algorithm is based on decisions that have been made in the past — for example, hiring decisions in a field that has been historically male-dominated — there is a risk that the algorithm could perpetuate those trends.

Goddard also noted that lawyers may become involved in the process through the drafting of contracts that may be designed to protect the intellectual property of the underlying technology but also create a "black box" where it becomes difficult to audit the algorithms to ensure fairness.

"This is impacting our lives — how we buy houses, how we buy insurance," said Kontozopoulos. "We are putting people into these boxes, and you might get a more favourable insurance or discount than who is sitting next to you. . . . I'm training something on the information of the past to try and make things faster and easier."
To Anh Tran (left), Helen Kontozopoulos (centre), and Valentine Goddard (right) spoke at the Law Society of Ontario recently. Photo by Angelyn Francis/Law Society of Ontario

LT