There is a question that nobody in the legal profession is asking out loud, but that quietly organises almost every conversation about AI and the future of law. It is not whether AI will replace senior partners. Everyone agrees it will not, at least not soon. It is not whether AI will change the way legal work gets done. Everyone agrees it already has.
The question nobody is asking is this: if AI can do what junior lawyers do, who will become the senior lawyers of 2035?
The legal profession runs on a pyramid. At the top, a small number of senior partners who handle the judgment-intensive, relationship-driven, strategically complex work that clients actually pay premium rates for. Below them, a larger tier of senior associates and counsel who manage deals and matters and translate partner strategy into executed work. And at the base, a broad layer of junior associates — the first-, second-, and third-year lawyers who do the document review, the first drafts, the research memos, the due diligence, the contract summaries. The work that is repetitive, time-consuming, and absolutely essential for learning how the law actually functions in practice.
That base layer is shrinking. Not dramatically, not all at once, but measurably and in one direction.
A December 2025 report from Citi Global Wealth at Work and Hildebrandt Consulting, drawing on survey data from nearly 200 US and UK law firms, found that while 86% of large firms plan to grow their associate ranks through 2027, only 35% plan to increase the size of their first-year associate classes. The report was precise about what this gap represents: firms are adjusting their associate populations toward a more senior demographic, driven by the adoption of AI that handles the repetitive, lower-value work that juniors used to do. A survey published by Law360 Pulse found that 70% of attorneys now use AI at least once a week — a sharp increase from the previous year — and that both early-career and senior lawyers believe AI is already replacing the responsibilities typically assigned to junior associates.
The numbers are not yet catastrophic. The most recent data from the American Bar Association found that 82% of the 2024 graduating class secured legal employment — a record high. But median starting salaries at law firms dropped by 3% in the same period. More graduates competing for fewer first-year spots, at lower pay, in an industry that is openly discussing whether it needs as many juniors as it once did. The pattern is not yet a collapse. It is a narrowing.
One newly launched AI-native law firm, General Legal, has made the logic explicit. Its founder said recently that the firm is not hiring junior lawyers at all. Instead it recruits fifth to eighth-year associates from large firms — people who have already been trained, who can already take responsibility for their own work, who already know what good looks like. The junior years, in this model, simply do not exist. The AI does what the junior used to do, and the humans begin where the AI stops.
This is a rational response to the economics. It is also, if it spreads, a profound structural problem for the profession.
The legal pyramid exists not just as a staffing model but as a training system. Junior lawyers do the repetitive work not only because it is cheap but because it teaches them something irreplaceable. You learn to read a contract by reading thousands of them. You learn what a poorly drafted clause looks like by finding them, flagging them, and watching a senior lawyer explain why they matter. You learn judgment, in law as in most fields, by accumulating experience — by making small mistakes in low-stakes contexts and gradually being trusted with more. The junior years are slow and often frustrating and frequently boring. They are also the foundation of everything that comes after.
If the junior work disappears, the question is not just where the junior lawyers go. It is who trains the senior lawyers of the future.
A senior litigator quoted in Above the Law put the problem precisely. She described watching junior associates submit AI-generated drafts, receiving her markup, and then feeding that markup back into the AI to integrate the changes. Her question was whether the associates were actually learning anything from the exchange.
The answer, she suspected, was no. The iteration — the slow, sometimes painful process of being told your work is wrong and understanding why — was being short-circuited. The AI absorbed the feedback. The associate observed.
This is the subtler displacement. Not the junior who loses their job to AI outright, but the junior who keeps their job while AI does the work that was supposed to teach them. They are present. They are employed. And they are not learning.
A Harvard Law School study found that in high-volume litigation, AI tools reduced the time spent on a standard complaint response from sixteen hours to three or four minutes. The efficiency gain is real and significant. But sixteen hours of a junior lawyer reading a case file, thinking about arguments, drafting and redrafting — that was also sixteen hours of becoming a lawyer. Three or four minutes of reviewing an AI output is something else entirely.
The legal profession is not unique in this. The same structural question is emerging across every knowledge-work field where AI is absorbing the entry-level tasks that once served as training. In consulting, junior analysts used to spend months building models and writing decks before anyone trusted them with a client. In journalism, reporters used to cover local council meetings and court hearings for years before they were given the stories that mattered. In medicine, residents do years of supervised procedural work that is repetitive, exhausting, and absolutely essential to developing clinical judgment. In every case, the repetition is the point.
Anthropic’s research published in March 2026 found that the occupations with the highest AI exposure are concentrated in exactly these early-career knowledge-work roles: data entry, document processing, customer service, market research, first-draft writing. The work that entry-level professionals do. The work that teaches them to do the harder work.
The paper also found that hiring of workers aged 22 to 25 into AI-exposed occupations has fallen by around 14% since ChatGPT launched. These are not workers being fired. They are workers who are not being hired. The door to the bottom of the pyramid is narrowing, quietly, without announcement.
There is an optimistic version of this story, and it deserves to be stated fairly. The optimists argue that as AI absorbs the lower-level work, the remaining professionals will be freed to operate at a higher level from the start. Junior lawyers will engage immediately with complex, judgment-intensive work rather than spending years on document review. They will develop faster, and they will be better. The pyramid will not disappear; it will compress, with fewer people doing more valuable work at every level.
This is possible. It is also, at the moment, largely unproven. What is proven is that the training system that produced the current generation of senior lawyers depended on years of structured apprenticeship, with the repetitive work as its scaffolding. Nobody has yet built a convincing alternative. AI can do the work. It cannot, at least not yet, replicate what doing the work taught the people who used to do it.
I have been a corporate lawyer for a decade. I use AI daily now. I find it, as I have written elsewhere on this site, genuinely liberating — the unbillable hours that used to consume my evenings have largely gone, and the quality of my first drafts has improved. I do not think AI will replace me soon. The judgment I exercise, the relationships I maintain, the strategic counsel I provide — that still requires something that current AI cannot supply.
But I think about the lawyer I was at twenty-five, reading contracts for hours, getting things wrong, being corrected, slowly understanding why. I think about what that taught me. And I think about the twenty-five-year-olds who are entering law now, in a world where the AI produces the first draft and the junior feeds back the markup.
The legal profession is not unique in being slow to change. It is possible that it will find a way to preserve the apprenticeship structure even as AI absorbs more of the underlying work. But the economics are pointing in one direction, and the economics of professional services tend, over time, to prevail.