Around 2,500 years ago, the Greek philosopher Heraclitus said: “the only constant in life is change.” He probably wrote this on a sheet of papyrus made from the pith of the papyrus plant, using ink derived from ground-up organic compounds.
And from where we’re sitting now, with our keyboards, large touch screens and ergonomic everything, we think he deserves 10/10 for his amazing insight.
Working in the ICT industry means working in an environment where everything is constantly evolving. And the role of a developer is no exception. This blog will look at how the developer’s role has evolved over the last few decades.
In the 1990s, monolithic architecture was the norm. A single application would encompass several tightly coupled components, including the UI, application controller and the database. Remember the game Jenga, with its stack of strategically placed wooden blocks forming a single tower? Well, that’s a good analogy for monolithic architecture.
So back then, a developer was expected to be proficient in handling both the back-end and front-end components of a system. They were the equivalent of what today we call a ‘full-stack developer’.
By the early 2000s, the internet and distributed computing had become far more widespread.
Distributed computing splits a solution across multiple machines that communicate over a network, typically using the request-reply messaging pattern familiar from web browsers. The application client (the web browser) sends a request to a given server, the server performs its task, and it sends a response back to the client when the job is complete. The architectural design commonly used in conjunction with this messaging pattern is Service Oriented Architecture (SOA).
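That request-reply exchange can be sketched with nothing but the Python standard library. The handler class, port choice and reply text below are purely illustrative; a tiny server performs its “task” and replies to the client:

```python
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.request import urlopen

# The server side of request-reply: receive a request, do the work, reply.
class EchoHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        body = b"job complete"
        self.send_response(200)
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):  # silence the default request logging
        pass

# Port 0 asks the OS for any free port; run the server in a background thread.
server = HTTPServer(("127.0.0.1", 0), EchoHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()

# The client (the "web browser" role) sends a request and waits for the reply.
with urlopen(f"http://127.0.0.1:{server.server_port}/") as resp:
    reply = resp.read().decode()

print(reply)  # -> job complete
server.shutdown()
```

The client blocks until the server responds, which is exactly the synchronous request-reply behaviour described above.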
In SOA, a solution is broken down into self-contained units responsible for different tasks, allowing services to be reused across applications. If we are equating monolithic architecture to the game Jenga, then SOA is like Lego by comparison. Each individual Lego brick can be assembled with the others to form a larger object.
One of the overall goals of SOA is to allow organisations to deliver solutions more quickly. SOA takes a bottom-up approach, with the functionality built to expose the back-end system data consumed by all the organisation’s business units.
The upshot of all this was that by the early 2000s, the role of a developer became more specialised, and many full-stack developers narrowed their areas of expertise. Some became front-end developers, responsible for the part of a solution the user sees and interacts with. Others became back-end developers in charge of the business logic and code behind the scenes. And many chose to become middleware developers specialising in tying the front-end and back-end components together.
The base principles of SOA are still used today. However, as nothing in life remains constant, the technology industry has largely moved from the Simple Object Access Protocol (SOAP) to REpresentational State Transfer (REST) implementations and event-driven architectures.
REST is a set of architectural constraints that defines how an API should be implemented, including stateless client-server communication and cacheable data, which together streamline client-server interactions (for more details, check out: https://www.redhat.com/en/topics/api/what-is-a-rest-api).
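To make REST’s resource-and-verb style concrete, here is a minimal sketch, assuming a hypothetical in-memory “books” resource: each URL names a resource, the HTTP method names the operation, and the server keeps no per-client session state between requests.

```python
import json
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.request import Request, urlopen

# Illustrative in-memory data store; a real service would use a database.
BOOKS = {"1": {"title": "Refactoring"}}

class BookHandler(BaseHTTPRequestHandler):
    def do_GET(self):  # GET /books/<id> -> read the resource
        book = BOOKS.get(self.path.rsplit("/", 1)[-1])
        body = json.dumps(book).encode() if book else b""
        self.send_response(200 if book else 404)
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def do_DELETE(self):  # DELETE /books/<id> -> remove the resource
        existed = BOOKS.pop(self.path.rsplit("/", 1)[-1], None)
        self.send_response(204 if existed else 404)
        self.end_headers()

    def log_message(self, *args):  # silence the default request logging
        pass

server = HTTPServer(("127.0.0.1", 0), BookHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()
base = f"http://127.0.0.1:{server.server_port}"

# Each request is self-contained (stateless): the verb plus the URL say it all.
with urlopen(f"{base}/books/1") as resp:
    found = json.loads(resp.read())  # the book as JSON

with urlopen(Request(f"{base}/books/1", method="DELETE")) as resp:
    status = resp.status  # 204 No Content: resource removed

server.shutdown()
```

Note that the server never tracks who the client is between calls; all the context needed to serve a request travels inside the request itself, which is the “stateless” constraint in action.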
A lot of organisations are also now adopting the DevOps practice. DevOps is a combination of the terms ‘Development’ and ‘Operations’. It is a set of practices that promote better communication and collaboration between teams, such as continuous integration (CI) and continuous deployment (CD), cloud computing and containerisation. Ultimately, DevOps attempts to break down the barrier between developers and the operations team: once development is finished, the same developer can deliver and deploy the solution.
As for the role of today’s developers? Ironically, the trend is going full circle back to the 1990s.
It’s no longer feasible for the modern developer to focus only on a specific technology skill set or role. Given the trends and choices of technology in the market, developers in the 2020s are expected to work across the entire technology stack. They also need to perform testing (to some extent) and look after the operational side of the solution.
With the cloud computing boom of recent years, writing services or making use of Software-as-a-Service (SaaS) has become less complex and requires less IT infrastructure knowledge. So delivering a full-stack solution is easier.
All this change isn’t a bad thing, though. As the world responds to the coronavirus pandemic, and in light of the desperate shortage of technology resources, the ability to do full-stack development and help customers migrate to or enhance their virtual business environment is a highly sought-after skill. So it’s a win-win for employers and developers alike.
In 1859, Charles Darwin observed that the species that adapt best to their changing environment have the best chance of survival, while those that fail to adapt don’t make it.
And as history shows, developers are nothing if not adaptable.