Every decade for the past forty years we’ve seen a major paradigm shift in software development. The next major paradigm shift for software developers—cloud native application development—is occurring now and it will require developers to once again re-think and modify their approach to developing software.

In the seventies we saw the transition from assembler to third- and fourth-generation programming languages. This change gave rise to implicit processing based on language constructs and the use of functional notation.

In the eighties we saw a paradigm shift from procedural programming to object-oriented programming, and with it the introduction of constructs like inheritance and polymorphism. This gave rise to the ability to develop abstract models that were fluid and could be implemented concretely across a wide range of compute architectures. However, for developers trained in the concrete design of functional notation, abstraction often represented a major hurdle to overcome. In both of these cases developers had to learn to think differently about how to structure their code in order to best leverage the power of these advancements.

In the nineties we saw a paradigm shift from local programming models to distributed programming models. Frameworks like Remote Procedure Call (RPC) and Remote Method Invocation (RMI) attempted to limit the impact of this shift on developers by providing a façade that mimicked a local programming model and hid the complexities of interacting with a remote service, but ultimately developers had to understand the nuances of developing distributed applications in order to build in resiliency and troubleshoot failure.

In the early 2000s we saw a paradigm shift driven by advances in chip and operating system architecture that supported multiple applications running simultaneously. Now applications could run multiple threads of execution in parallel, supporting multiple users. This forced developers to rethink how they structured their code and how they used global and local variables. Many soon found their existing programming methods incompatible with these new requirements, as threads stepped all over each other and corrupted data.
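
As a minimal sketch (my own illustration, not an example from any particular codebase), the kind of corruption described here shows up when two threads do an unsynchronized read-modify-write on a shared variable:

```java
// Two threads increment a shared counter without synchronization.
// Updates are lost when the threads interleave, so the final total is
// usually less than expected.
public class RaceConditionDemo {
    private static int counter = 0; // shared mutable state, no synchronization

    public static void main(String[] args) throws InterruptedException {
        Runnable work = () -> {
            for (int i = 0; i < 1_000_000; i++) {
                counter++; // read, add, write -- another thread can interleave here
            }
        };
        Thread t1 = new Thread(work);
        Thread t2 = new Thread(work);
        t1.start();
        t2.start();
        t1.join();
        t2.join();
        // Expected 2,000,000; typically prints less. Guarding counter++ with a
        // synchronized block (or using AtomicInteger) restores correctness.
        System.out.println("counter = " + counter);
    }
}
```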

The next major paradigm shift in software development is now upon us: developing cloud native applications. Building on the paradigm shift of the early 2000s, applications must now be able to scale across machines as well as within a single machine. Once again, developers will need to rethink how they develop applications to leverage the power of the cloud and escape the limitations and confines of current distributed application development models.

There is a lot of discussion around managing outages in production via the likes of DevOps principles and the corresponding software development lifecycles, which do enable higher-quality output from development. However, one cannot lay all the blame for “bugs” and failures at the feet of those responsible for coding and development. As developers incorporate the features and benefits of these paradigm shifts, there is a learning curve and a point of not knowing what is not known. Sometimes the only way to learn is to actually put code into production and monitor its performance and actions.

I believe one of the greatest flaws we have seen in software engineering is that none of these great paradigm shifts has resulted in instrumentation becoming integral to the application design itself. There has been a considerable disregard for the scientist (the developer) being able to watch their experiment in the wild. Perhaps it is the deadlines and the backlog that have limited this requirement, along with the failure of higher education to incorporate this thinking at the university level, but I digress.
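
As an illustration of what “instrumentation integral to the design” could mean, here is a hypothetical sketch (the service name and metrics are my own, not drawn from any framework): a component that counts its own requests and failures and reports its latency as part of its normal operation, rather than leaving observation entirely to external tooling.

```java
import java.util.concurrent.atomic.AtomicLong;

// Hypothetical sketch of a self-instrumenting component: request counts,
// failure counts, and latency are recorded by the service itself.
public class InstrumentedOrderService {
    private final AtomicLong requests = new AtomicLong();
    private final AtomicLong failures = new AtomicLong();

    public void placeOrder(String orderId) {
        long start = System.nanoTime();
        requests.incrementAndGet();
        try {
            // ... business logic would go here ...
        } catch (RuntimeException e) {
            failures.incrementAndGet();
            throw e;
        } finally {
            long micros = (System.nanoTime() - start) / 1_000;
            // In a real system this would feed a metrics pipeline; printing
            // keeps the sketch self-contained.
            System.out.printf("placeOrder id=%s latency_us=%d requests=%d failures=%d%n",
                    orderId, micros, requests.get(), failures.get());
        }
    }

    public static void main(String[] args) {
        new InstrumentedOrderService().placeOrder("demo-1");
    }
}
```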

The reality is that each major shift has brought tremendous power for businesses to harness compute for growth and financial gain. Each change has allowed more powerful applications to be developed faster. But with each paradigm shift we have to realize that skills need time to catch up to the technology. During that period, we have to accept that taking advantage of advances in compute and information technology means the software will be more brittle and more failures will be incurred.

Along the way, vacuums will form due to limited resources, and there will be those who look for ways to fill the talent void by providing models that bridge the current and future paradigms. In the nineties we had DCE and CORBA, in the early 2000s we had application servers, and now we have Platform-as-a-Service (PaaS). These platforms are developed by those at the leading edge who not only helped foster the paradigm shift but are also tooling the workforce to take advantage of these new capabilities.

So, while PaaS may not yet be seen as essential by businesses and adoption may be low, this will change as more and more businesses attempt to enter the digital revolution and realize they do not have the staff and skills to deliver resilient and scalable cloud native applications. They will be forced to rely on PaaS as a delivery system that allows developers to build components using their existing understanding of distributed application development, and then takes those components and adds availability, scalability, security, and other cloud-centric value.
