Agile is one of the most common development methodologies — and one of the most commonly misimplemented. So it’s not surprising that the world of online programmers responded strongly to the release of an informative new document from the U.S. Department of Defense (DoD) offering advice to managers on how to fine-tune their projects.
The Defense Department’s “Office of Prepublication and Security Review” had cleared the document, “Detecting Agile BS,” for release in October, but it finally went viral last week, attracting 2,975 upvotes on Reddit, and hundreds of comments from geeks around the world.
“One of the best government documents I’ve read,” wrote one user, adding “Take that as you will.”
Scope of the Problem
Its first sentence warns that because agile is a popular buzzword, “all DoD software development projects are, almost by default, now declared to be ‘agile.’”
Board member Richard Murray, a professor at the California Institute of Technology, shared some sympathetic thoughts for the program managers trying to guess whether their team is actually using agile. “They say they’re using agile, there are a bunch of words there I don’t recognize, maybe they’re really using agile and maybe not,” he told FedScoop.
“We hope that this will be useful…”
The Board’s document says its goal is to help program executives and acquisition professionals spot projects “that are simply waterfall or spiral development in agile clothing.” One attempted hybrid of Waterfall and Agile development has been dubbed “water-Scrum-fall,” but the document warns against impostors claiming such hybrids are agile, branding them with the sardonic moniker “agile-scrum-fall.”
It identifies six signs that a project isn’t really using an agile development methodology — for example, a lack of collaboration, with even the end users “missing-in-action throughout development.” It recommends observing users at work, if not actually talking to them, or giving them prototypes for feedback. “At a minimum, they should be present during Release Planning and User Acceptance Testing…”
Note: “Talking once at the beginning of a program to verify requirements doesn’t count!”
In addition, “the Program Executive Office does not count as an actual user, nor does the commanding officer, unless she uses the code.”
Other signs of a non-agile project include tolerating manual processes that should be automated (for example, testing, continuous integration, and continuous deployment). And perhaps the greatest warning sign of all? A failure to prioritize “getting something useful into the field as quickly as possible.”
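The automated testing the document has in mind can be as simple as checks a CI pipeline runs on every commit, replacing a manual test pass with an executable one. A minimal sketch in Python (the function and checks here are hypothetical illustrations, not taken from the DoD document):

```python
# A minimal, hypothetical automated check a CI pipeline could run on
# every commit -- turning a manual verification step into code.

def parse_version(tag: str) -> tuple:
    """Parse a release tag like 'v1.2.3' into a comparable tuple."""
    return tuple(int(part) for part in tag.lstrip("v").split("."))

def run_checks() -> bool:
    """Run the checks CI would gate a merge on; True if all pass."""
    assert parse_version("v1.2.3") == (1, 2, 3)
    assert parse_version("v2.0.0") > parse_version("v1.9.9")
    return True

if __name__ == "__main__":
    print("ok" if run_checks() else "fail")
```

Once a check like this lives in the pipeline, it runs on every change without anyone remembering to run it.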
“As a member of a tryhard agile-scrum-fall team, I’m offended by the level of accuracy,” commented one reader on Reddit.
“As a member of another fragile-fall team,” added another, “I’m laughing because they literally have described every work environment I’ve had to deal with so far.”
And one commenter seemed to suggest their own workplace represented a worst-case scenario. “We are using ‘Agile’ but our last program took like [four] years to release and the user didn’t even touch the product until we had completed all verification and validation.”
Knowing What to Ask
The document also provides a helpful list of tell-tale questions to ask programming teams. For example, “How automated are your development, testing, security, and deployment pipelines?” with follow-up questions asking about specific tool suites for CI/CD, security scans, and deployment certification, and the all-important “is your infrastructure defined by code?”
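“Infrastructure defined by code” means machine configuration lives in version-controlled files rather than in manual setup steps. As a hedged illustration, a minimal playbook for Ansible (one of the configuration management tools the document itself lists) might look like the following; the host group and package are hypothetical:

```yaml
# Hypothetical Ansible playbook: the desired server state is declared
# in a version-controlled file, so it can be reviewed and applied
# automatically like any other code.
- hosts: webservers        # assumed inventory group, not from the document
  become: true
  tasks:
    - name: Install nginx
      apt:
        name: nginx
        state: present
    - name: Ensure nginx is running and enabled at boot
      service:
        name: nginx
        state: started
        enabled: true
```

Because the file is code, infrastructure changes go through the same review and automation as application changes.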
Some other questions to detect Agile BS include:
- Who are your users and how are you interacting with them?
- What is your (current and future) cycle time for releases to your users?
- What are your management metrics for development and operations?
- What have you learned in your past three sprint cycles and what did you do about it? (“Wrong answers: ‘what’s a sprint cycle’…?”)
There are also questions for the end users to pin down whether they’re really able to communicate their needs to developers (and what kind of feedback they’re getting), and even some questions for program leaders. For example, “Are teams delivering working software to at least some subset of real users every iteration (including the first) and gathering feedback?”
The document even includes a list of commonly used tools for agile development, including version control and issue tracking software, as well as continuous integration services like Jenkins, Circle CI, or Travis CI, and configuration management software like Chef, Ansible, or Puppet. Also on that list is Docker, defined as “a computer program that performs operating-system-level virtualization, also known as ‘containerization,’” along with Kubernetes or Docker Swarm for container orchestration.
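For readers unfamiliar with the containerization the document describes, a minimal Dockerfile shows the idea: an application and its dependencies are declared in a file, built into an image, and run the same way anywhere. The file names and entry point below are hypothetical:

```dockerfile
# Hypothetical Dockerfile for a small Python service; the file names
# and entry point are illustrative, not from the DoD document.
FROM python:3-slim            # base image providing a Python runtime
WORKDIR /app
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt
COPY . .
CMD ["python", "app.py"]      # command the container runs on start
```

An image built from a file like this (`docker build`, then `docker run`) behaves identically on a laptop and in production; orchestrators like Kubernetes or Docker Swarm then schedule many such containers across machines.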
Silicon Valley View
The remarkable document is the product of something called the “Defense Innovation Board.” Its website describes the group as “one of several independent federal advisory committees advising the Secretary of Defense” — but that ignores some of its star power. It was set up in 2016 specifically to bring Silicon Valley’s best practices and innovations to the U.S. military. One site described them as “a Pentagon advisory board of luminaries.” Headed by former Google CEO Eric Schmidt, its members include professors, CEOs, and even astrophysicist Neil deGrasse Tyson.
It sounds like an interesting gig. “DIB Members believe in gathering data firsthand by traveling to military facilities, bases, and commands both within and outside of the U.S. to hear from warfighters at the tactical edge,” explains the home page. “The DIB aims to understand their challenges, learn what they’re doing to overcome them, and offer advice on how to achieve their mission.”
The group describes itself as “part of the larger, emerging innovation ecosystem at DoD,” and notes that its recommendations “are designed to be concise, actionable, high-impact, and swiftly delivered.”
Indeed, the Agile paper was five pages long.
America’s 2018 defense budget had specifically recommended that the Defense Innovation Board do a study “on streamlining software development and acquisition regulations.” (The group’s web site succinctly describes their mission: “Fix the Department’s approach to software.”) Other papers released earlier in the year cover the metrics for software development, along with the “Defense Innovation Board Ten Commandments of Software” and “Defense Innovation Board Do’s and Don’ts for Software.”
As “Detecting Agile BS” is a working draft, DIB is taking feedback for the final revision through February 2.