In this post, I'd like to address some changes I've been seeing in the way enterprise applications are developed, and how we, as enterprise developers, need to change along with that trend. We need to learn new skills and change the way we look at our everyday job and its requirements. We're accustomed to working in an Agile way in our day-to-day work, so look at this post as a retrospective of our own work and the implications it has.
Observing the history of the enterprise development world
But let's start out with why I'm interested in these changes in the first place. My job at GROUP9, the company I work for, is twofold. Partly I'm a consultant working for our various clients, often in the role of architect, helping them create software solutions and, in general, become better at delivering those solutions. My other, internal, responsibility is that of competence manager. Basically, that means ensuring our consultants stay the smart people they are, which in turn means reflecting every now and then on what skills our profession needs.
When I started out as a developer, I was taught the basics of Java development: Java SE, Java EE (or J2EE as it was called back then), Spring, Hibernate, log4j, etc. Those were the basic building blocks we taught our developers to create applications with. Of course, that's not enough to build mature applications, so we also educated them in process methodologies like RUP and Agile, and taught them how to design applications: design patterns, architecture, and so on.
Most importantly though, we taught them a "common sense" of how to build enterprise software. This means we taught them topics like databases, transactions, remote method invocation, distributed systems, and security. The enterprise world was quite easy to understand, roughly divided into two big camps: people doing Java EE and people doing Spring. Knowing the underlying principles of software engineering allowed you to move easily from one platform to the other. You could create software effectively and efficiently on either platform, even if you didn't know it inside out.
In practice, given that general understanding and a sound knowledge of how enterprise systems work, adapting to each of these environments was "easy". It was just a matter of learning how to work with the specific tool, and "Bob's your uncle". More importantly, we often saw that people who knew the underlying principles were far more effective than people who merely knew the trick of applying a specific tool. Think of it as being a carpenter: if you know how to handle wood, it doesn't matter which brand of hammer you use. Or like being a car mechanic: if you know how cars work, it doesn't matter whether you're repairing a Honda or a Mercedes.
The ideal case, of course, is someone who knows both the principles and all of the tools by heart, but in a highly competitive job market, that's often a luxury we don't have.
In that world, there were a lot of certifications, like the Sun Certified Java Programmer (and later, of course, the Oracle variants). These certifications required you to be a human "compiler": know the tool by heart, and even without any clue on how to use it effectively, you could still easily pass. This led to a world where many of the more proficient developers frowned upon certification. Holding a certification didn't in any way imply that you knew what you were doing or that you were applying the tool the right way.
Fast forward to the modern world
Currently I'm working at one of our clients, designing the architecture for their core systems, trying to solve the various business challenges they face, and coming up with effective and efficient software solutions (or sometimes no software solution at all, which may be even more efficient). It's my job to advise and support multiple teams by helping them make architectural decisions together and come up with the best solutions.
Doing that, and reflecting on the knowledge modern enterprise developers need, I noticed two trends, which are clearly linked:
- In modern, often microservice-based, architectures, there is a greatly increased use of tools to solve the various business challenges we are facing. We are building software at ever higher levels of abstraction, using more pre-made components and tools.
- The complexity of these tools, and their impact on our ability to deliver solutions, is constantly increasing. Components are often highly complex, require us to think cross-domain or in domains that were previously shielded from enterprise developers, and, when used incorrectly, have a profound negative effect on the workings of our solutions.
Let's start by demonstrating this with a few examples.
In many applications, we use a plethora of tools to build our applications. Tools like Kubernetes, which abstract away the underlying hardware platforms, networking interfaces, and storage facilities, are a prime example. But we can also think of tools like Apache Hadoop and Spark for data processing, or Spring Boot and Quarkus: ready-made systems for building complete applications that support high availability, monitoring, persistence, remote methods (in the form of REST services), etc. We use sidecar containers to ensure our logging is written to ELK instances, and Istio to handle our service-to-service security requirements.
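To give a feel for how much these ready-made systems abstract away, here is a minimal sketch of a Spring Boot application exposing a REST endpoint. The class and endpoint names are purely illustrative; the point is that the framework supplies the embedded HTTP server, serialization, threading, and monitoring hooks, leaving us to write only the business-facing code.

```java
import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.RestController;

// Illustrative example: a complete, runnable REST service in one class.
@SpringBootApplication
@RestController
public class DemoApplication {

    // Spring Boot handles the HTTP server, request routing, and
    // response serialization behind this single annotation.
    @GetMapping("/hello")
    public String hello() {
        return "Hello from a ready-made platform";
    }

    public static void main(String[] args) {
        SpringApplication.run(DemoApplication.class, args);
    }
}
```

Compare that with hand-wiring a servlet container, JSON mapping, and thread pools ourselves: the abstraction buys enormous productivity, but only if we understand what the framework is doing underneath.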
And we see that many of these tools belong to domains that were previously shielded from application developers. Our Docker images run on Linux; on Kubernetes we have to configure load balancers, services, name resolution, and all the other lower-level networking and hardware configuration needed to run our applications. Monitoring systems like Grafana and Kibana are deployed as containers alongside our applications, and we have to think about the metrics we expose to them.
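As a small sketch of what "thinking about the metrics we expose" looks like in code, here is an example using Micrometer, the metrics facade Spring Boot ships with. The metric name and the class are my own illustrative assumptions, not taken from any real project; registered meters like this one are what a Prometheus scrape picks up and a Grafana dashboard visualizes.

```java
import io.micrometer.core.instrument.Counter;
import io.micrometer.core.instrument.MeterRegistry;
import io.micrometer.core.instrument.simple.SimpleMeterRegistry;

// Illustrative example: exposing a business-level metric to a
// monitoring stack via Micrometer.
public class OrderMetrics {

    private final Counter ordersProcessed;

    public OrderMetrics(MeterRegistry registry) {
        // The registry decides where the metric ends up; in an actual
        // deployment this would be a Prometheus-backed registry.
        this.ordersProcessed = Counter.builder("orders.processed")
                .description("Number of orders processed")
                .register(registry);
    }

    public void onOrderProcessed() {
        ordersProcessed.increment();
    }

    public static void main(String[] args) {
        MeterRegistry registry = new SimpleMeterRegistry();
        OrderMetrics metrics = new OrderMetrics(registry);
        metrics.onOrderProcessed();
        System.out.println(registry.get("orders.processed").counter().count());
    }
}
```

Choosing which counters, gauges, and timers to expose is exactly the kind of cross-domain decision that used to sit with operations teams and now sits with us.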
The above two trends are of course a direct result of innovations like cloud, DevOps, and microservice architectures: innovations that allow us to build ever more complex systems serving customers and users across the globe. We see that these innovations are needed to keep up with ever-increasing non-functional requirements, while at the same time keeping productivity high or even increasing it (which, we observe, doesn't always work).
How does this affect us as Enterprise Developers?
Where do we need to adapt and evolve along with these ever-changing and ever more complex environments in which we work? Where previously developers could "get away" with learning the few tools used on the job, I believe this is no longer the case. Current infrastructure, tools, and complexities require us developers to have a more profound knowledge of the various tools we use: to know which buttons to press, and to know their pitfalls.
Of course, this doesn't mean we're shifting away from common sense and good insight into IT. But on top of those comes the added need for knowledge of tools. We need to educate ourselves more in these tools, and grow more accustomed to acquiring really intimate knowledge of them and how to use them.
The effects of not educating ourselves are quite clear: either we build systems riddled with bugs, or our productivity drops, or we decide not to use the tools and get overtaken by our competitors. None of these scenarios is, of course, desirable.
But what about no/low code?
I won't delve into that subject fully, simply because it's a topic too vast to discuss in a single paragraph. I do believe, however, that for this discussion, no/low-code systems can help us solve the "easier" cases of software development more easily, thereby freeing up a workforce able to solve the really hard questions. To me, no/low code is therefore not a threat but a necessary evolution: we need to solve the simple cases in an easy way, so we can devote our brightest minds, us developers, to studying and solving the really hard questions.
What's in store for us then?
Basically, I believe we need to transition to a world where developers have ever more intimate knowledge of the tools at our disposal to solve the business problems we face, and where we start embracing certification, or other ways of verifying our level of knowledge, for these topics too. I still believe it is very hard (read: impossible) to measure common sense or insight into IT using exams or certification; factual knowledge of the tools, however, can easily be tested this way.
I also believe that, more and more, we need to start embracing learning, with learning time scoped as part of the project budget. We often say that people these days need life-long learning; the above shows me this will only accelerate. In extremis, I expect user stories to pop up on our scrum boards about sending people to courses or getting them certified, in preparation for using the tools that solve the real business problems.
To end this in an Agile way: as a competence manager and passionate enterprise developer, I want to ensure we are all educated both in common-sense IT and in the tools we use, so that we can keep pace with the ever-changing world around us and the requirements on our systems.