Remember the days when software developers spent almost all of their time developing software? When the only tools they had to master – like SCMs, CI servers, and IDEs – were directly related to software development?
If not, it's probably because you weren't paying attention to software development before the rise of cloud-native technologies like Docker containers and Kubernetes. Those technologies blur the line between software development and infrastructure management – and require developers to learn more about infrastructure than they may like.
I'm not sure that's a good thing. Here's why.
Developers today can no longer focus on coding alone mainly because the advent of cloud-native technologies requires them to master tools and platforms that fall outside the scope of the traditional coder's toolbox.
To understand why, you have to understand how the meaning of "infrastructure" has both expanded and become a bit hazier over the past decade or so due to the rise of cloud-native computing technology.
In many ways, cloud-native platforms like Docker and Kubernetes are essentially infrastructure platforms. Containers that you run with Docker (or your favorite runtime) are a layer of your infrastructure, and one of the main jobs of Kubernetes is to manage containers across a cluster of servers – which is also infrastructure.
To be clear, I'm not trying to say that Docker or Kubernetes are infrastructure in the same sense that a VM is infrastructure. Obviously, containers and container orchestrators are not the same thing as servers. They constitute a different layer of your stack. But my point is that these technologies are basically parts of the infrastructure stack and/or tools that play a role in infrastructure management.
More to the point, however you choose to label these technologies, you can't really call them development tools. Docker and Kubernetes don't help you write, test, or debug software. They package and run it after you've written it. (OK, I guess you could use Docker or Kubernetes to help host a development environment, but that doesn't make them development tools; it makes them part of the infrastructure that hosts your development environment.)
And yet, despite the fact that cloud-native infrastructure platforms are not development tools, they are tools that many developers have to know quite well today. You'd be hard-pressed to find work as a cloud-native developer if you don't know how to build a container image or deploy a Kubernetes Pod.
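To make that baseline concrete, here's a sketch of the kind of artifact a cloud-native developer is now expected to be able to write and debug – a minimal Kubernetes Pod manifest (the image name, labels, and port here are illustrative, not taken from any real project):

```yaml
# Minimal Pod manifest: one container running a hypothetical app image.
apiVersion: v1
kind: Pod
metadata:
  name: example-app        # illustrative name
  labels:
    app: example-app
spec:
  containers:
    - name: example-app
      image: registry.example.com/example-app:1.0  # hypothetical image
      ports:
        - containerPort: 8080   # the port the app listens on
```

Deploying it means learning yet another tool's workflow – something like `kubectl apply -f pod.yaml` against a cluster you may not even manage.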
What that means is that, in essence, the typical developer (at least, the typical cloud-native developer) is no longer just a developer. She's also an infrastructure manager. Even if the Kubernetes-based, Docker-powered production environment into which she deploys her code is managed by someone else (like the IT team), she still has to know a ton about how Kubernetes and Docker work in order to make sure her code runs in that environment.
You may be thinking: "Well, haven't developers always had to understand their production infrastructure in order to build software for it?" The answer is, basically, yes. But what has changed in the cloud-native age is that infrastructure has become much, much more complex. Ten years ago, the only things a developer really had to know in order to build a monolith and deploy it on a VM were basics like how the VM's file system worked or how to build a package that would install through the VM's package manager.
Compare that with the present, where developers can't build cloud-native apps if they don't understand what a container base image is, or how containers manage UIDs, or how their app's storage and networking will behave under whichever CSI and CNI plugins their production Kubernetes cluster uses. In the modern world, there's just a lot more that developers need to know about infrastructure.
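The UID point alone illustrates the shift: even deciding which user a container runs as is now the developer's problem. A sketch of what that looks like in practice (the base image and UID here are illustrative choices, not prescriptions):

```dockerfile
# Hypothetical Dockerfile fragment: running the app as a non-root user.
FROM alpine:3.19

# Create an unprivileged user inside the image. Any files the app
# writes at runtime must be owned by (or writable to) this UID.
RUN adduser -D -u 10001 appuser
USER 10001

COPY ./app /usr/local/bin/app
ENTRYPOINT ["/usr/local/bin/app"]
```

Get the UID wrong and the app fails with permission errors that only appear once it's running inside the cluster – exactly the kind of infrastructure-flavored debugging that used to be someone else's job.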
Now, I'm not here to knock cloud-native technology, or to say that we should go back to the days before Docker and Kubernetes. They are great tools.
But I am also not convinced that it's a good thing for developers to have to master those tools. Historically, developers focused on writing code – which was the thing they excelled at. They left infrastructure management and provisioning to the IT team.
Today, though, lots of developers spend much of their time managing infrastructure, or at least learning how it works. I imagine these coders would be a lot more productive if they could spend all of their time writing, testing, and deploying code rather than debugging Docker or keeping up to date with the latest Kubernetes feature releases. Some developers might be happier, too, because becoming K8s admins or Docker debuggers is probably not what they signed up for when they decided to become coders.
You could argue that it's a good thing for developers to have to learn about infrastructure. The idea that developers should help to "own" tasks that traditionally fell to the IT team is at the core of DevOps.
I'd respond, though, that not every developer is a DevOps engineer, and not every developer should have to become a K8s guru in order to code.
Sure, mastering containerized, Kubernetes-based infrastructure management may make sense for some developers. But it shouldn't be a basic expectation for doing modern coding. Yet, unfortunately, it increasingly is.
Is a better world possible? Can we relieve developers from the necessity of becoming infrastructure experts in addition to coding experts?
I think so. The solution is to simplify the way developers work with containers and other cloud-native technology. So far, most of the focus in the cloud-native ecosystem has been on building really robust infrastructure solutions, not building developer-friendly solutions. That's why it's so much harder for developers to master cloud-native infrastructure platforms than it is for them to understand conventional infrastructure.
But if we made the processes required to containerize and deploy cloud-native apps simpler, developers would enjoy simpler routines and a much lower barrier to entry for cloud-native development. Doing so would return us to the days when a developer's only major job was to develop – which, foreign though it may seem today, would probably be in the interests of both developers and the businesses that employ them.
Chris Tozzi has worked as a Linux systems administrator and freelance writer with more than ten years of experience covering the tech industry, especially open source, DevOps, cloud native and security. He also teaches courses on the history and culture of technology at a major university in upstate New York.