What is modern application development? That may sound like an impossible question to answer. Not only is “modern” a relative and subjective term, but there are so many different approaches to software development today that it’s hard to sum them up without making sweeping generalizations.
Nonetheless, if you compare mainstream application development practices today to those that prevailed in the past, it is possible to identify some common threads that markedly distinguish modern development strategies from historical approaches. These are the features that define modern application development.
Here’s a look at five defining characteristics of modern application development and their implications for the way developers work.
Before diving into the details of modern application development, let’s quickly clear up one question that often arises in these conversations: the relationship between modern app development and cloud-native applications.
This is a subjective issue (after all, “cloud-native” is itself a pretty ambiguous term), and there is no one right way to think about it. In general, however, I tend to think that it makes the most sense to associate cloud-native apps with certain types of applications – such as those that are distributed and deliver a high degree of agility and reliability. In contrast, modern application development refers to the processes and strategies that allow developers to build apps that offer those features.
Put another way, modern application development is what makes cloud-native applications possible. Modern development is the means, and cloud-native apps are (often, at least) the ends.
It’s difficult to talk about application development today without discussing microservices. As of 2020, 72 percent of organizations reported using microservices architectures for at least some of their applications. That figure is almost certainly even higher today.
That’s a big change from the past, of course, because microservices came into vogue only within the past decade. They were prefigured in some ways by trends like Service Oriented Architecture (which was popular in the 2000s) and microkernels (which were a hot idea circa 1990), but those approaches never matured into the fine-grained, independently deployable architectures of true microservices apps.
Microservices are important not only because they change the nature of the application architecture that developers work with, but also because they change the development processes themselves. When you build a microservices app, you probably assign different developers to oversee different microservices. You also probably update and deploy microservices on an individual basis instead of redeploying the entire application at once.
In other words, developing a microservices application is more like developing a series of individual applications than a single app. Each microservice brings its own development cycle. The result is a much greater degree of complexity for developers to manage, as well as the need for tight coordination between teams developing interdependent microservices.
A heavy reliance on open source is another key factor that distinguishes modern application development from historical approaches. As of 2021, 90 percent of organizations were using open source software in one way or another, according to Red Hat.
Like microservices, open source changes the game when it comes to application development practices and risks. When you use open source libraries or packages to help build your own applications, you are subject to potential security problems that may lurk within those dependencies. That means that developers who rely on open source must build processes for both identifying open source components that they introduce into their development cycles and validating those components for security.
Consider that GitHub identified about 8 million security vulnerabilities in a survey of half a million repositories. That’s a lot of vulnerabilities that could easily end up in applications that pull code from those repositories, especially if developers aren’t aware of exactly where upstream application components originate.
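To make the vetting step concrete, here is a minimal sketch of a pipeline check that flags pinned dependencies matching known advisories before a build proceeds. The advisory entries and package names below are invented for illustration; a real pipeline would pull this data from a live feed such as the GitHub Advisory Database or OSV rather than hard-coding it.

```python
# Sketch: flag pinned dependencies that match known advisories.
# ADVISORIES holds hypothetical sample data, not a real vulnerability feed.
ADVISORIES = {
    ("examplelib", "1.0.2"): "CVE-2021-0001: remote code execution",
    ("otherlib", "2.4.0"): "CVE-2021-0002: path traversal",
}

def parse_requirement(line: str) -> tuple:
    """Parse a 'name==version' pin into a (name, version) tuple."""
    name, _, version = line.strip().partition("==")
    return name.lower(), version

def audit(requirements: list) -> list:
    """Return advisory messages for any pinned dependency with a known issue."""
    findings = []
    for line in requirements:
        key = parse_requirement(line)
        if key in ADVISORIES:
            findings.append(f"{key[0]}=={key[1]}: {ADVISORIES[key]}")
    return findings
```

A gate like this would run on every build and fail the pipeline when `audit()` returns any findings, forcing developers to upgrade or consciously accept the risk.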
Although there are certainly still plenty of apps out there that don’t rely heavily on APIs, the typical application today is API-centric. That means that it uses APIs to integrate with other applications or services. Most Web-based apps, as well as those designed to run on a public cloud, use an API-centric approach.
APIs are similar to open source in that any security, reliability, or performance issues that impact a third-party API will impact an application that depends on it. For this reason, API security and performance testing should be another basic best practice within modern application development workflows.
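One lightweight way to act on this is to gate deployments on a smoke test that enforces both correctness and a latency budget for each third-party API the app depends on. The helper below is an illustrative sketch, not a prescribed implementation; the default budget is an assumption, and the caller supplies the actual API call (for example, a wrapped HTTP request with its own timeout).

```python
import time

def check_api(call, max_latency_s: float = 0.5) -> bool:
    """Invoke an API call and verify it returns HTTP 200 within a latency budget.

    `call` is any zero-argument function returning an HTTP status code,
    e.g. lambda: requests.get(url, timeout=5).status_code.
    """
    start = time.monotonic()
    try:
        status = call()
    except Exception:
        return False  # network or parsing errors count as failures
    elapsed = time.monotonic() - start
    return status == 200 and elapsed <= max_latency_s
```

Running checks like this in the delivery pipeline surfaces a degraded third-party API before it degrades your own application in production.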
While container adoption rates vary significantly from one business to the next, a majority of organizations are running at least some of their applications in containers. They are also orchestrating them with Kubernetes, in most cases.
There are great reasons to use containers and Kubernetes. They make applications more scalable and portable while avoiding the resource overhead of VMs.
Yet containers and Kubernetes also add significant complexity to modern application development cycles. If you deploy your applications using containers, you need to factor container image builds and scans into the delivery pipeline. You also need to create and validate the various configuration files (like security context definitions and RBAC settings) that are typically required to deploy an application into a Kubernetes cluster. If you don’t do these things, you risk configuration problems that could create significant performance or security issues.
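To illustrate what validating those configuration files might look like, here is a minimal sketch that checks a pod manifest (already parsed from YAML into a dict) for two common misconfigurations: a missing `runAsNonRoot` setting and a privileged container. The field names follow the standard Kubernetes pod schema, but the specific checks are just examples of what a pipeline gate might enforce.

```python
def lint_pod_spec(pod: dict) -> list:
    """Return warnings for common security misconfigurations in a pod
    manifest (parsed from YAML into a dict, e.g. via yaml.safe_load)."""
    warnings = []
    spec = pod.get("spec", {})
    # Pod-level security context should force non-root execution.
    if not spec.get("securityContext", {}).get("runAsNonRoot"):
        warnings.append("pod does not set securityContext.runAsNonRoot")
    # No container should request privileged mode.
    for container in spec.get("containers", []):
        ctx = container.get("securityContext", {})
        if ctx.get("privileged"):
            warnings.append(f"container {container.get('name')!r} runs privileged")
    return warnings
```

Wiring a linter like this (or an off-the-shelf policy engine) into the pipeline catches configuration problems before a manifest ever reaches the cluster.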
There’s nothing stopping you from deploying an app on an old-fashioned on-premises server. Even containerized, Kubernetes-based apps can run on-prem.
Yet the more common deployment strategy for modern applications is to run them in the cloud, where organizations can take advantage of limitless scalability.
The tradeoff for cloud-based deployment, however, is that your development lifecycle is more complex. You need to ensure that you configure your app for whichever cloud (or clouds, if you’ve gone the multicloud route) you plan to deploy it to. You must also validate the security of any infrastructure-as-code (IaC) templates, IAM configurations, and other resources you use to manage deployment to the cloud environment.
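As one example of what that validation might look like, a pipeline can scan IAM policy documents (shown here in the AWS JSON policy-document shape) for overly broad grants such as wildcard actions or resources. The policy content and the checks are invented for this sketch; real IaC scanners apply far richer rule sets.

```python
def find_wildcards(policy: dict) -> list:
    """Flag IAM policy statements that allow wildcard actions or resources."""
    findings = []
    for i, stmt in enumerate(policy.get("Statement", [])):
        if stmt.get("Effect") != "Allow":
            continue  # only Allow statements broaden access
        # Both fields may be a single string or a list of strings.
        actions = stmt.get("Action", [])
        resources = stmt.get("Resource", [])
        actions = [actions] if isinstance(actions, str) else actions
        resources = [resources] if isinstance(resources, str) else resources
        if "*" in actions:
            findings.append(f"statement {i}: allows all actions")
        if "*" in resources:
            findings.append(f"statement {i}: applies to all resources")
    return findings
```

Failing the pipeline on findings like these keeps an overly permissive policy from ever reaching the cloud account.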
To be sure, not every developer today embraces each of the modern development practices described above. Nor are all of these practices totally modern. Some, like APIs and open source, have existed for decades (although they weren’t as heavily used until more recently).
By and large, however, strategies like microservices architectures, a reliance on open source components, and the decision to deploy applications using containers and/or public clouds distinguish modern application development processes from those that were the norm ten or even just five years ago.
All of these practices have become popular because they offer benefits like increased application scalability, faster development, or simpler deployments. However, the major tradeoff, in most cases, is complexity: as teams embrace strategies that make their development processes “modern,” they also face new challenges associated with security and performance. It’s only by baking controls for these challenges into application delivery pipelines that organizations can effectively manage the risks of modern app development and ensure that those risks are outweighed by the benefits.
Chris Tozzi has worked as a journalist and Linux systems administrator. He has particular interests in open source, agile infrastructure, and networking. He is Senior Editor of content and a DevOps Analyst at Fixate IO. His latest book, For Fun and Profit: A History of the Free and Open Source Software Revolution, was published in 2017.
The post A Developer’s View: What Exactly Is Modern Application Development? appeared first on Checkmarx.com.