Ever since the 1950s, there have been promises of a new programming language or technology that would eliminate the need for programmers.
Depending on how you look at it, either "real" programmers were eliminated by COBOL, or every attempt has been a failure and we'll always need programmers.
One such transition happened in the 1980s and 1990s, when assembly language was replaced by C, which in turn was replaced by C++ and Java. Developers who knew how to juggle registers found that their skills were no longer needed unless they could also talk to stakeholders and reason about how software can support business needs.
These days somebody who knows one language but doesn't understand the computer that runs it can go pretty far—if they have compensating skills, such as UI design or being able to suss out the real business needs. But sooner or later they'll run into something that just doesn't work, and they'll have no idea why.
Often the problem is performance: network or filesystem calls buried inside a tight loop, not understanding how speculative execution can make a set slower than an array, or not knowing how pages of memory are loaded.
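As a minimal sketch of that first pitfall (the class name and file path below are made up for illustration), here is the same byte-counting loop written twice in Java: once against a raw FileInputStream, where every iteration is a system call, and once with a BufferedInputStream that amortizes those calls away. The loop bodies are identical; the difference is entirely in what happens underneath.

```java
import java.io.BufferedInputStream;
import java.io.FileInputStream;
import java.io.IOException;
import java.io.InputStream;

public class TightLoopIO {

    // Every read() here goes through the kernel: one call per byte.
    static long countBytesSlow(String path) throws IOException {
        long count = 0;
        try (InputStream in = new FileInputStream(path)) {
            while (in.read() != -1) {
                count++;
            }
        }
        return count;
    }

    // Same loop, but most reads now hit an in-memory buffer instead of the OS.
    static long countBytesFast(String path) throws IOException {
        long count = 0;
        try (InputStream in = new BufferedInputStream(new FileInputStream(path))) {
            while (in.read() != -1) {
                count++;
            }
        }
        return count;
    }

    public static void main(String[] args) throws IOException {
        String path = args.length > 0 ? args[0] : "sample.dat"; // hypothetical file

        long start = System.nanoTime();
        long n = countBytesSlow(path);
        System.out.printf("unbuffered: %d bytes in %d ms%n", n, (System.nanoTime() - start) / 1_000_000);

        start = System.nanoTime();
        n = countBytesFast(path);
        System.out.printf("buffered:   %d bytes in %d ms%n", n, (System.nanoTime() - start) / 1_000_000);
    }
}
```

A developer who has never thought about system calls sees two loops that look the same and has no vocabulary for why one is orders of magnitude slower.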
Just yesterday an iOS developer I know tracked down a mysterious UI bug to conflicting equals and hashCode definitions. This is something every Java developer has to worry about, but it's rarely an issue in iOS.
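For anyone who hasn't been bitten by this, here is a small Java sketch (the UserId class is hypothetical, not from that bug) of what a conflicting pair looks like: equals() says two objects are the same, but the inherited identity-based hashCode() puts them in different buckets, so a HashSet quietly fails to find an element it contains.

```java
import java.util.HashSet;
import java.util.Set;

class UserId {
    private final String value;

    UserId(String value) {
        this.value = value;
    }

    // equals() compares the wrapped value...
    @Override
    public boolean equals(Object o) {
        if (this == o) return true;
        if (!(o instanceof UserId)) return false;
        return value.equals(((UserId) o).value);
    }

    // ...but hashCode() is inherited from Object (identity-based), which
    // breaks the contract that equal objects must have equal hash codes.
    // Overriding it to return value.hashCode() would fix the bug.
}

public class HashContractDemo {
    public static void main(String[] args) {
        Set<UserId> seen = new HashSet<>();
        seen.add(new UserId("alice"));

        // Almost always prints "false", even though an equal object is in
        // the set, because the two instances hash to different buckets.
        System.out.println(seen.contains(new UserId("alice")));
    }
}
```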
It's important to have people around who know deep system internals, both to keep others from making rookie mistakes and to track down weird bugs. But if every developer needed an advanced degree, a lot of important software would never get written.