Canadian Lawyer

April 2017

The most widely read magazine for Canadian lawyers

Issue link: https://digital.canadianlawyermag.com/i/803157


… grasp the nuances of the responses churned out by the machine learning systems. Implementing new technology, especially AI systems, must be carefully planned and requires time, ongoing support and buy-in from associates and partners.

"You need to continue to evolve your practice as the technology improves, and as you work more closely with the program, you start seeing more opportunities to use the technology that you may have not realized originally," points out Ellis, speaking from her experience overseeing the implementation of Kira. "That all takes time and effort, and that is probably the hardest thing."

Some would argue that the most challenging task is to convince lawyers within the firm or legal department to use the new technology. Buy-in in the middle ranks is critical, says Nextlaw Labs' Jensen. "You have to have buy-in across the board, make sure you can drive the implementation and the integration and put project management skills against it, and then manage expectations about what the tool is and what it's not."

It is also crucial that the AI tool be clean, simple and intuitive; otherwise, lawyers will simply not use it, says Chuck Rothman, director of e-discovery services at Wortzmans, now a division of McCarthy Tétrault. "In order for artificial intelligence to be really adopted in the legal industry, it has to be presented in a way that lawyers can very quickly grasp what the system is saying so that they can use it, because, if they don't understand it, they are not going to trust it and, if they don't trust it, they won't use it."

The drive toward AI, however incremental, will likely also mean that law firms are going to have to review their traditional billing model, says Furlong.
The time when law firms were the only game in town, when lawyers were the "only vehicle" by which legal services could be delivered, is coming to a close, and AI is going to help put an end to it, he says. "All of these innovations like artificial intelligence are going to reduce the amount of time and amount of effort required to obtain a legal outcome, so the very lax business model of selling time and expertise, rather than outcomes and results, is coming to an end."

Firms such as McCarthys, Osler and Torys are paying attention to the evolving market demands. McCarthys is planning to have 50 per cent of its work charged on a non-hourly basis, while Torys is moving toward a fixed-fee billing model. "The model is changing as we incorporate these new technologies and because of the demands of the client," says Nickerson.

In the meantime, in-house counsel like Garcia will likely have to bide their time if they expect to witness a monumental change thanks to AI. As Peters puts it: "For sure, artificial intelligence is going to play a role in the future, but not as soon and not in the way that a lot of people are imagining it now."

The dark side of AI: bias and loss of skills

Algorithms — the set of instructions computers use to carry out a task — have become an integral part of everyday life, and they are making their way into law. In the U.S., judges in some states can use algorithms as part of the sentencing process. Many law enforcement officials in the U.S. are using them to predict when and where crimes are likely to occur. They have been used for years in law firm recruitment. And with advancements in machine learning, they are also being used to conduct legal research, predict legal outcomes and find out which lawyers win before which judges.
Most algorithms are created with good intentions, but questions have surfaced over algorithmic bias at job-hunting web sites, credit reporting bureaus, social media sites and even in the criminal justice system, where sentencing and parole decisions appear to be biased against African-Americans. The issue is likely to gain traction as machine learning and predictive coding become more sophisticated, particularly since, with deep learning (which learns autonomously), algorithms can reach a point where humans can often no longer explain or understand them, says Nicolas Vermeys, assistant director at the Cyberjustice Laboratory in Montreal.

"We have no idea how they arrived at their decision and, therefore, cannot evaluate whether the decision has value or not," says Vermeys, whose research institution is studying the issue. "There is a risk to relying completely on machines without necessarily understanding their reasoning."

No human is completely objective, and neither are algorithms, since they are written by human programmers, notes Ian Kerr, a law professor at the University of Ottawa and the Canada Research Chair in Ethics, Law and Technology. Programmers operate on certain premises and presumptions that are not tested by anybody else, which leads to results based on those premises and presumptions that, in turn, give rise to bias, adds Kerr.

On top of that, it is very difficult to challenge such decisions because "whoever owns the algorithms has trade secrets, isn't likely to show you the source code, isn't likely to want to talk about the secret sauce and what makes the algorithm work," says Kerr. "What justifies the algorithm is its success or perceived success, which is very different from whether or not it operates in biased ways."

Aaron Courville, a professor with the Montreal Institute for Learning Algorithms, shares those concerns.
"We are really in a phase where these algorithms are starting to do interesting things, and we need to take seriously the issues of responsibility," he says.

Both Kerr and Vermeys are also concerned about artificial intelligence performing more and more legal grunt work. By delegating an increasing number of tasks to machines, there is a danger that existing skills will atrophy, says Kerr. "We have to be aware of that and make sure we make good judgments about which things to delegate and which things not to."

Vermeys says there is some merit to performing thankless and menial tasks because that is, in many ways, how lawyers become good and experienced. His institute is also looking into the issue. Lawyers, for instance, should learn how to write contracts, tedious as the work may be, and should do it numerous times to develop a solid grasp of it, he says. "We're going to try to figure out how these artificial intelligence solutions should be used while not affecting the quality of service lawyers are giving today and will be able to give in five to 10 years."
