Episode 101
In the previous part of this article, we continued our journey through important milestones in the history of software development across the eighties and nineties, including the non-obvious influence of Gamers and Hackers, Version Control, Open Source, Common Runtime Environments, Virtual Machines, Agile, DevOps, Continuous Integration, and Automated Tests.
Image by Jose Borges
In this part, we will mostly explore what happened in the last twenty years, though, as usual, some of the concepts can be traced back to earlier times. As before, we will be interested not only in pure technology but also in methodologies and organizational ideas.
Extreme Programming and Software Craftsmanship
Prominent methodologies within Agile software development, like Scrum or Kanban, provide organizational guidelines for the process which, while very useful, are not very specific on the technical side. While navigating various simple and complex organizational ideas at the top, the industry was seeking a decent codification of good engineering practices at the bottom. One such approach was Extreme Programming (XP), a set of techniques that focus on improving the quality and efficiency of software development at its core. It encourages Test Driven Development, where we work in short cycles: first writing a unit test that fails, then following with just enough production code to make the test pass (one such cycle is sketched below). It emphasizes the role of code review, sometimes advising a continuous version of it where two developers sit at one machine – also known as pair programming. The main driving values of XP (communication, simplicity, feedback, courage, and respect) were elaborated on by Kent Beck in his book Extreme Programming Explained, which came out in 1999.
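As a minimal illustration, here is one red-green cycle of TDD in Python, using the standard unittest module; the function and test names are invented for this example.

```python
import unittest

# Step 2 (green): the simplest production code that makes the test pass.
# In a real TDD cycle this function would not exist yet when the test
# below is first written and run (step 1, red).
def total_price(items):
    return sum(price * qty for price, qty in items)

class TotalPriceTest(unittest.TestCase):
    # Step 1 (red): this test is written first and initially fails.
    def test_sums_price_times_quantity(self):
        self.assertEqual(total_price([(2.0, 3), (1.5, 2)]), 9.0)

if __name__ == "__main__":
    unittest.main()  # Step 3: once the bar is green, refactor and repeat.
```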
Image by Nicolas Ferrand
Many people saw elaborate processes and rules as a helpless attempt to counterbalance the lack of solid technical skills in young development teams across the rapidly accelerating industry. Some would ironically quip that you cannot deliver a software project with any finite number of students, even if your management is perfect (which it is not). The emerging software craftsmanship movement emphasized the strive for technical mastery and the high professional ethics of software developers, drawing a parallel to medieval craftsmen who started as apprentices, went through the journeyman stage of seeking guidance from more advanced colleagues, and finally became master craftsmen themselves. The Software Craftsmanship manifesto extends the Agile Manifesto with concepts like well-crafted software, steadily adding value, a community of professionals, and productive partnerships. The first writings on software development being something other than a pure engineering discipline can be traced to Jack W. Reeves' essay "What Is Software Design?" from 1992. In 1999, The Pragmatic Programmer by Andrew Hunt and David Thomas drew on the medieval craftsman allegory. In 2001, Pete McBreen published the book Software Craftsmanship. In 2008, the industry-beloved Robert C. Martin, aka Uncle Bob, proposed a fifth value for the Agile Manifesto, originally stated as Craftsmanship over Crap and later changed to Craftsmanship over Execution. A year later, the Manifesto for Software Craftsmanship website went live.
Microservices and Continuous Deployment
The microservice architecture arranges a system as a set of multiple loosely-coupled applications. The microservice revolution was triggered by the rising popularity of containers in the 2010s and suited the DevOps ideas well. Moving away from monolithic architectures proved to allow much faster and less risky releases. Working in smaller repositories saved developers from many merge conflicts and from hunting down which exact commit broke the service. Physical runtime separation makes it less likely that an error in one module will bring the entire system down. New languages and frameworks can be tested easily. High availability, fault tolerance, scalability, and cost optimization are easier to achieve.
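To make the idea concrete, here is a minimal sketch of a single, independently deployable microservice in Python, assuming Flask is available; the service name, endpoints, and data are invented for illustration.

```python
from flask import Flask, jsonify

app = Flask(__name__)

# In a real system this state would live in the service's own datastore,
# kept separate from every other service's data.
ORDERS = {1: {"id": 1, "status": "shipped"}}

@app.get("/health")
def health():
    # Orchestrators probe an endpoint like this to restart unhealthy instances.
    return jsonify(status="ok")

@app.get("/orders/<int:order_id>")
def get_order(order_id):
    order = ORDERS.get(order_id)
    if order is None:
        return jsonify(error="not found"), 404
    return jsonify(order)

if __name__ == "__main__":
    app.run(port=8080)  # each service runs, scales, and fails on its own
```

A real deployment would typically package such a service as a container image, so it can be released, scaled, and rolled back independently of the rest of the system.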
Image by Inward
Continuous Delivery is an extension of the Continuous Integration approach where new software features are not only verified but also ready to be deployed after each commit, with a push of a button, if the build is green. From the software developer's perspective, it opens up the possibility of fixing production problems very quickly and in a safe way: just roll back to a previous, stable version to put out the fire immediately, and then calmly investigate the root cause. The term was introduced in 2010 in the book of the same name, written by Jez Humble and David Farley. The next step in software delivery pipeline automation is Continuous Deployment, where we don't even have to push any buttons – code is deployed to production automatically as soon as it passes all the tests.
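The gate logic at the heart of Continuous Deployment fits in a few lines. Below is a deliberately simplified Python sketch: the deploy and rollback steps are placeholders that would call your actual tooling, and it assumes pytest is installed as the test runner.

```python
import subprocess
import sys

def tests_are_green() -> bool:
    # Run the whole test suite; its exit code is the only gate to production.
    return subprocess.run(["pytest", "-q"]).returncode == 0

def deploy(version: str) -> None:
    # Placeholder: push the build artifact tagged `version` to production.
    print(f"deploying {version} ...")

def rollback(previous_version: str) -> None:
    # Placeholder: re-point production at the last known-good version.
    # Mirrors the fire-fighting story above; invoked when a bad release
    # slips through despite the green build.
    print(f"rolling back to {previous_version} ...")

if __name__ == "__main__":
    version = sys.argv[1] if len(sys.argv) > 1 else "HEAD"
    if tests_are_green():
        deploy(version)
    else:
        print("build is red, nothing ships")
```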
Big Data and IoT
The number of users of various online services keeps growing. So does the number of devices connected to the Internet, collecting all kinds of data from their sensors and from interactions with users. We not only have smartphones in our pockets and smartwatches on our wrists but also smart cars, fridges, microwaves, vacuum cleaners, heaters, and more. All of that forms the Internet of Things. As scary as it might seem, all those devices, together with industrial machines, monitoring systems, and our online presence, generate an overwhelming amount of information. It is estimated that in 2020 the global data volume is in the range of tens of zettabytes – that is, on the order of 10^22 bytes – and it continues to grow exponentially. The term Big Data itself was coined in the mid-1990s to describe amounts of data that are difficult or impossible to capture and process with standard software tools in a reasonable time.
All that data, when used correctly, gives us unprecedented insight into trends and into the behavior of people and machines, and lets us make better-informed decisions, often automatically. To handle the data, a new paradigm of processing and new technologies had to emerge. This led to massively parallel systems based on MapReduce frameworks (illustrated in the sketch below) and event-driven architectures, which started to become popular in the mid-2000s. Suddenly an IT system was not necessarily based on a single relational database. Cloud computing was one of the key enabling factors in the big data revolution, providing the ease and flexibility of running a massive number of virtual machines to process and take advantage of the data. The appearance of mobile devices led to another specialization in software development, taking advantage of the additional options offered by smartphones, tablets, and other personal electronics. The growing presence of IoT and the subsequent rise of Industry 4.0 with smart manufacturing changed the type of programmer most in demand in this domain from embedded software specialists to more mainstream developers who were also familiar with Cloud Computing.
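As a toy illustration of the MapReduce paradigm mentioned above, here is a single-process word count in Python; real frameworks such as Hadoop run the same map, shuffle, and reduce phases distributed across many machines.

```python
from collections import defaultdict

def map_phase(document):
    # Map: emit a (word, 1) pair for every word in a document.
    for word in document.split():
        yield (word.lower(), 1)

def shuffle_phase(pairs):
    # Shuffle: group all emitted values by their key.
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(groups):
    # Reduce: combine each key's values into a final count.
    return {key: sum(values) for key, values in groups.items()}

documents = ["big data needs big tools", "data beats opinions"]
pairs = (pair for doc in documents for pair in map_phase(doc))
print(reduce_phase(shuffle_phase(pairs)))
# {'big': 2, 'data': 2, 'needs': 1, 'tools': 1, 'beats': 1, 'opinions': 1}
```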
The Cloud and Serverless
The real beginning of Cloud Computing dates back to 2006, when Amazon Web Services released the Elastic Compute Cloud platform, although some simpler services were available earlier. Microsoft followed with Azure in 2010, and Google Cloud Platform joined to form the Big Three in 2012. The appearance of the Cloud is seen as one of the biggest disruptive forces in the IT industry. Moving infrastructure from on-premise data centers to hosted virtual machines billed per second opened a new level of possibilities in scalability, availability, security, presence, and many other aspects of software systems. The phenomenon was well aligned with the emergence of the DevOps approach, microservice architecture, and Big Data. The programmer's role, to a degree, shifted towards understanding and manipulating the infrastructure itself, something that earlier sat in the administrators' silo. To take full advantage of the new model, we had to learn how to develop cloud-native applications.
Image by Wojtek Fus
They say that no server is easier to manage than no server. Platform as a Service (PaaS) was an intermediate step towards Function as a Service (FaaS), or true Serverless. The idea of PaaS is to simplify the deployment of an application and to provide the required environment with dynamically scalable resources. The first commercially successful PaaS solution was Google App Engine, launched in 2008. FaaS goes a step further and takes over another layer of the complexity of application development. The developer simply uploads the code of a single function in a given programming language, which might be triggered by an HTTP call or by various types of events on the Cloud platform. It allows for lightning-fast prototyping and suits a wide range of solutions well. The first popular FaaS was AWS Lambda, launched in 2014.
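This is roughly what such an uploaded function looks like: a minimal Python handler in the shape AWS Lambda expects when the function sits behind an HTTP endpoint (the greeting logic is, of course, just an example).

```python
import json

def lambda_handler(event, context):
    # `event` carries the trigger's payload; for an HTTP trigger it includes
    # the query parameters. `context` holds runtime metadata such as the
    # remaining execution time.
    params = event.get("queryStringParameters") or {}
    name = params.get("name", "world")
    return {
        "statusCode": 200,
        "headers": {"Content-Type": "application/json"},
        "body": json.dumps({"message": f"hello, {name}"}),
    }
```

Everything else – provisioning, scaling, patching the servers – is the platform's job; the function above is the entire deployable unit.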
The trend of simplifying the building blocks of software systems continued. Modern Cloud providers offer dozens of services covering computing, storage, networking, messaging, databases, security, artificial intelligence, and much more. Usually there are several solutions in a given area, occupying various places on the scale of abstraction. At one end, we dive to a low level and have a lot of control, but we have to take care of the details ourselves. At the other end, we have a managed service that is easy to start with, but tricky to customize and fine-tune to specific scenarios. With the increasing number of building blocks available in one place, the developer's role shifts from building the blocks themselves to choosing which blocks to use and learning how to configure them properly.
Web 2.0 and Artificial Intelligence
Web 2.0 brought us the ease of putting content online. Some used that power to upload cat pictures to their Facebook walls; some used it to help others solve software development problems. In the old days, solving a problem was often accompanied by hours of debugging, diving into thick books, or looking for more experienced colleagues around the office. That is sometimes still the case, but there is now an overwhelming probability that a post on some forum already describes a similar problem and that somebody has provided an answer. The iconic example is StackOverflow. There is a myriad of blog posts and video tutorials on basically any popular technology in the IT world. On the other hand, hosting and engineering the entire Web 2.0, with its explosion of user-created content, was another important catalyst for cloud computing.
Image by Numenera Art
Crowd wisdom can be further enhanced with AI. We can autocomplete code using machine learning models trained on data dumps from StackOverflow or on records of IDE usage. DeepCode, a startup from Zurich, produced an automatic assistant trained on GitHub repositories that helps developers improve their code. Google uses deep learning to perform static code analysis and find problems that evade the classic rule-based approach. Goldman Sachs is successfully leveraging AI-based tools to automatically create unit tests. Existing solutions keep improving with time and data, and the possibilities seem endless.
The Future
The evolution of software development is a long and fascinating journey. The pace of change is increasing, the number of available technologies and of abstraction levels at which a given problem can be solved is astounding, and the trend will continue. Industry automation will increase. IT systems are going to become more interconnected and distributed. Productivity and quality improvements will carry on, from low-level syntactic sugar in established programming languages to machine-learning-powered high-level architecture design assistants and auditors. Cloud platform operators and other software vendors will keep providing new tools. New layers of abstraction are going to emerge to handle the growing complexity. AI is going to take over more responsibility.
The role of a software developer is shifting from coding in a single programming language to understanding a growing number of technologies and elegantly combining them to achieve the desired effect. At the same time, we are automating more and more mundane manual tasks, so we no longer have to care about the many little things we used to. We let ourselves be moved further away from the bare metal of the servers that ultimately run our software creations. We move faster and with more agility, accepting that we may fall, but we are much better prepared to take those falls without damage, stand up, adapt, improve, and continue.
Carlos Herrera
September 29, 2020 at 9:00 am
Ah! I absolutely loved the series. It was quite interesting to see how software development has changed over the years and how it is still evolving now. The way I see it, and as you mentioned in your article, it is going from being just about coding to a more holistic process. This series of articles is really insightful, definitely sharing it with my colleagues.