
“Committing to 380 hours with you – my girlfriend is going to start asking questions…”

Ropes & Gray will allow its most junior lawyers to use a fifth of their billable time to “experiment” with AI. 

The ‘TrAIlblazers’ programme is being rolled out in Europe, having been piloted in the firm’s US offices in November last year. Remarkably, the shoehorning of ‘AI’ into the programme’s name survived the test period.

Trainees and NQs will be given 20% of their billable hours for “hands-on AI exploration” to “actively test” tools, such as Hebbia, Harvey and ILS’ ProVision, in “tackling real client challenges”. Their feedback will help Ropes determine how it will integrate AI at the firm. 

Juniors at the firm are typically set billable targets of 1,900 hours a year, so they’ll be able to devote around 380 hours to the programme. When it comes to salary, trainees in London earn £60k (year one) and £65k (year two), while NQs are paid a base salary of £165k plus a bonus.

The use of AI is now prevalent across the legal profession. Last year, AI tool Harvey arrived in classrooms at four major law schools in the UK; Shoosmiths sought to incentivise staff with an extra bonus pot of £1 million if they collectively made one million prompts on Copilot; and HSF Kramer appointed its first Chief AI Officer.

However, AI’s rapid growth has also caused some staff to worry about their job security. When Freshfields commenced a redundancy process affecting paralegals last year, the firm cited “a fast-changing legal market” and “investing in technology” – which many interpreted as meaning an investment in AI.

RollOnFriday asked Ropes & Gray if it had concerns that, by advancing the firm’s use of AI, juniors could be embracing the very thing that might replace them. Jane Rogers, a partner on the firm’s policy committee, responded: “AI will not replace our lawyers, but will be integrated into their practices. Our junior lawyers are essential to our work, and our focus is on harnessing technology to support and enhance their ability to do their jobs.”

It’s also been a balancing act for firms: while many are increasingly making use of AI, expert supervision is clearly still required, and the SRA has warned lawyers not to rely on machine learning. It’s certainly been a high-risk area for robot cock-ups. Very high-risk.