A brief history of the agile methodology

Most organizations today practice some form of agile development, but it wasn't always so. To understand agile's success, it helps to look back to the heyday of the waterfall methodology and the birth of the Agile Manifesto.

By Isaac Sacolick

Contributing Editor, InfoWorld

Every technology organization these days seems to practice some version of agile methodology. Or at least they believe they do. Whether you are new to software development or you started decades ago, your work today is at least influenced by agile methods.

But what is agile, and how do developers and organizations incorporate agile methodologies? This article is a brief history of agile development and how it differs from the classic waterfall methodology. I'll discuss the differences between agile and waterfall methods in practice, and explain why agile is so much better suited to how developers and teams actually work, especially in today's development environments.

Before agile: The waterfall methodology

Old hands like me remember when the waterfall methodology was the gold standard for software development. Using the waterfall method required a ton of documentation up front, before any coding started. Usually, the process started with a business analyst writing a business requirements document that captured what the business needed from the application. These documents were long and detailed, containing everything from overall strategy to comprehensive functional specifications and visual user interface designs.

Technologists used the business requirements document to develop a technical requirements document. This document defined the application’s architecture, data structures, object-oriented functional designs, user interfaces, and nonfunctional requirements.

Once the business and technical requirements documents were complete, developers would kick off coding, then integration, and finally testing. All of this had to be done before an application was deemed production-ready. The whole process could easily take a couple of years.

The waterfall methodology in practice

The documentation used for the waterfall methodology was called "the spec," and developers were expected to know it just as well as its authors did. You could be chastised for failing to properly implement a key detail, say, outlined on page 77 of a 200-page document.

Software development tools also required specialized training, and there weren't anywhere near as many tools to choose from. We developed all the low-level stuff ourselves, such as opening database connections and multithreading our data processing.
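
As an aside for readers who never lived through that era, here is a minimal, hypothetical Java sketch of the kind of plumbing teams hand-rolled themselves: raw threads for parallel data processing and a per-worker database connection. The JDBC URL, credentials, and shard scheme are invented placeholders, and modern syntax is used for brevity.

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.SQLException;

public class BatchJob {
    // Hypothetical connection details; projects of that era often hard-coded values like these.
    private static final String DB_URL = "jdbc:vendor://dbhost/orders";

    public static void main(String[] args) throws InterruptedException {
        // No executor framework: worker threads were created, started, and joined by hand.
        Thread[] workers = new Thread[4];
        for (int i = 0; i < workers.length; i++) {
            final int shard = i;
            workers[i] = new Thread(() -> processShard(shard));
            workers[i].start();
        }
        for (Thread worker : workers) {
            worker.join(); // block until every shard finishes
        }
    }

    private static void processShard(int shard) {
        // No connection pool: each worker opened and closed its own connection.
        try (Connection conn = DriverManager.getConnection(DB_URL, "app", "secret")) {
            // Placeholder for the hand-written data-processing logic that used conn.
            System.out.println("Processing shard " + shard);
        } catch (SQLException e) {
            e.printStackTrace();
        }
    }
}
```

Today, a framework-provided connection pool and an ExecutorService would replace nearly all of this boilerplate.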

For even basic applications, teams were large and communication tools were limited. Our technical specifications aligned us, and we treated them like the Bible. If a requirement changed, we’d put the business leaders through a long review-and-sign-off process. Communicating changes across the team and fixing code were expensive procedures.

Because software was developed based on the technical architecture, lower-level artifacts were developed first and dependent artifacts came next. Tasks were assigned by skill, and it was common for database engineers to construct tables and other database artifacts first. Application developers coded the functionality and business logic next, and the user interface was overlaid last. It took months before anyone saw the application working. By then, stakeholders were usually getting antsy—and often smarter about what they really wanted. No wonder implementing changes was so expensive!

In the end, not everything you put in front of users worked as expected. Sometimes, they wouldn’t use a feature at all. Other times, a capability was wildly successful but required re-engineering to support scalability and performance. In the waterfall world, you learned these things only after the software was deployed, following a long development cycle.

Pros and cons of the waterfall methodology

Invented in 1970, the waterfall methodology was revolutionary because it brought discipline to software development and ensured there was a clear spec to follow. It was based on the waterfall manufacturing method derived from Henry Ford’s 1913 assembly line innovations, which provided certainty about each step in the production process. The waterfall method was intended to ensure that the final product matched what was specified in the first place.

When software teams started adopting the waterfall methodology, computing systems and their applications were typically complex and monolithic, requiring discipline and clear outcomes to deliver. Requirements also changed slowly compared to today, so large-scale efforts were less problematic. In fact, systems were built on the assumption that they would not change but would be perpetual battleships. Multiyear timeframes were common not only in software development but also in manufacturing and other enterprise activities. But waterfall’s rigidity became its downfall as we entered the internet era, when speed and flexibility became more prized.

The Agile Manifesto

Agile was formally launched in 2001, when 17 technologists drafted the Agile Manifesto. They wrote four core values for agile software development, intended to guide teams toward building better software:

Individuals and interactions over processes and tools

Working software over comprehensive documentation

Customer collaboration over contract negotiation

Responding to change over following a plan

The pivot to agile methods

Software development started to change when developers began working on internet applications. A lot of the early work was done at startups where teams were smaller, were colocated, and often did not have traditional computer science backgrounds. There were financial and competitive pressures to bring websites, applications, and new capabilities to market faster. Development tools and platforms changed rapidly in response.

This led many of us working in startups to question the waterfall methodology and look for ways to be more efficient. We couldn’t afford to do all the detailed documentation up front, and we needed a more iterative and collaborative process. We still debated changes to the requirements, but we were more open to experimentation and adapting our software based on user feedback. Our organizations were less structured, and our applications were less complex than enterprise legacy systems, so we were more open to building versus buying applications. More importantly, we were trying to grow businesses, so when users told us something wasn’t working, we usually listened to them.

Having the skills and ability to innovate became strategically important. You could raise all the money you wanted, but you couldn’t attract talented software developers who were able to work with rapidly changing internet technologies and then force them to follow “the spec.” We rejected project managers who led with end-to-end schedules describing what we should develop, when applications should ship, and sometimes even how to structure the code. We were terrible at hitting the three-month and six-month schedules that our project managers drafted and unceasingly updated.

Instead, we started telling them how internet applications needed to be engineered, and we delivered results on a schedule that we drew up iteratively. It turns out we weren’t that bad at delivering what we said we would when we committed to it in one-week to four-week intervals.

In 2001, a group of experienced software developers realized that they were collectively practicing software development differently from the classic waterfall methodology. Not all of them were in startups, either. This group—which included technology luminaries Kent Beck, Martin Fowler, Ron Jeffries, Ken Schwaber, and Jeff Sutherland—came up with the Agile Manifesto that documented their shared beliefs about how a modern software development process should operate. They stressed collaboration over documentation, self-organization rather than rigid management practices, and the ability to manage constant change rather than being locked into a rigid waterfall development process.

From those principles the agile methodology for software development was born.

Why agile development delivers better software

When you take the aggregate of agile principles, implement them in an agile framework, leverage collaboration tools, and adopt agile development practices, you usually get applications that are higher quality and faster to develop. You also get better technical methods, aka hygiene.

The main reason is that agile is designed for flexibility and adaptability. You don’t need to define all the answers up front, as you do in the waterfall method. Instead, you break the problem into digestible components that you then develop and test with users. If something isn’t working well or as expected, or if the effort reveals something that you hadn’t considered, you can adapt the effort and get back on track quickly—or even change tracks if that’s what’s needed. Agile lets each team member contribute to the solution, and it requires that each member assume personal responsibility for their work.

Agile principles, frameworks, and practices are designed for today’s operating conditions. Agile typically prioritizes iterative development, leveraging feedback to improve the application and the development process. Both iteration and feedback are well suited to today’s world of operating smarter and faster.

Agile development also encourages ongoing improvement. Imagine if Microsoft ended Windows development after version 3.1, or Google stopped improving its search algorithms in 2002. Software is in constant need of being updated, supported, and enhanced; agile methodology establishes both a mindset and process for that continuous improvement.

Finally, agile development leads to better software because people on agile teams are typically more productive and happier. Engineers have a say in how much work they take on, and they are proud to show their results. Product owners like seeing their vision expressed in software sooner and being able to change priorities based on the latest insights. Users like getting software that does what they actually need it to do.

Today, enterprises need a high level of software competency to deliver exceptional digital experiences in a hypercompetitive world. And they need to attract and keep great talent to build great software. Agile development helps enterprises do both.

Isaac Sacolick is president of StarCIO and the author of the Amazon bestseller Driving Digital: The Leader’s Guide to Business Transformation through Technology and Digital Trailblazer: Essential Lessons to Jumpstart Transformation and Accelerate Your Technology Leadership. He covers agile planning, devops, data science, product management, and other digital transformation best practices. Sacolick is a recognized top social CIO and digital transformation influencer. He has published more than 900 articles at InfoWorld.com, CIO.com, his blog Social, Agile, and Transformation, and other sites.

Copyright © 2022 IDG Communications, Inc.
