This article previously appeared in Defense Scoop.
The Department of War's (DoW) senior acquisition leadership (the people who decide what equipment and services the DoW buys, and how it buys them) now comes from private capital (venture capital and private equity):
- Deputy Secretary of War Stephen Feinberg ran Cerberus Capital
- Secretary of the Army Daniel Driscoll is a former VC and investment banker
- Secretary of the Navy John Phelan ran MSD Capital
- Deputy Secretary of the Army Michael Obadal was a senior director at Anduril
The Department of War is in the midst of a once-in-a-lifetime change in how it acquires weapons, software, and systems. The new Warfighting Acquisition System rewards speed and timely delivery of things that matter to the warfighter. But this new system is at risk of making the wrong things go faster.
Here’s why and what they should do.
What Now?
Acquisition in the DoW is being reorganized the way a private equity firm would reorganize a large company: bring in (or empower) a new operating team, swap out executives, change incentives, kill things not core to the mission, cut costs, invest for growth, and restructure to find additional financing.
That playbook is being run at the Department of War right now. The announced consolidation of individual weapons systems (each of which had its own silos of Requirements, Test/Evaluation, Budgeting, and Acquisition) into unified Portfolio Acquisition Executives is a classic private equity strategy. Instead of hundreds of programs operating with separate budgets across different Program Executive Offices, the intent of the Portfolio Acquisition Executives is to consolidate overlapping programs, eliminate redundant ones, pick winners, kill losers, get rid of processes that kill speed, and focus on rapid deployment.
What’s Missing?
Organizing by Portfolio Acquisition Executives is a great start, but simply consolidating the broken parts of the defense acquisition system under one umbrella organization won't make them better. Making bad ideas go faster should not be the goal, yet we're at risk of doing just that. (Pete Newell at BMNT has been reminding me of this for years.)
For example, many of these new Portfolio Acquisition Executives are managing their programs by holding monthly reviews of proposed investments and current portfolio performance (just like private investors). Here they'll decide which programs get funded, which get more funding, and which should stop. (A regular process to kill programs early is sorely needed.) These are great ideas. However, if the meetings start by reviewing prototype progress to show that the technology works or that warfighters want it, and fund programs on those metrics, they will miss the changes needed in an effective acquisition system.
The result will be a faster version of a weapons requirements process that starts with a top-down list of features or, worse, shiny tech products (e.g., “I need drones”). This “requirements-first” process is what will drive the “bad ideas faster” problem.
A more productive approach, one that delivers truly decisive capabilities, would be to build a different process up front: a rigorous problem identification and validation phase at the front end of every acquisition program.
This process would start with a wide funnel of problems, ideas, and technologies, each with a 10-line problem summary that describes the specific challenge to address, why it can't be solved today, what it will take to solve it, and how a solution will be funded and deployed in the field.
The goal would be to 1) consolidate problems that may be different descriptions of the same core problem, and/or 2) discover whether a problem is a symptom of something more complex.
Each problem would then go through an iterative process of problem and technical discovery. This would help define the minimum deployable product and its minimum constraints (security, policy, reliability), along with how long the solution would take to deploy, the source of funding for scale, and who needs to buy in.
This exercise will keep the focus where it needs to be — not on reforming a system but on delivering things to warfighters with speed and urgency so we can actually deter or win a war.
Want to Keep Up With the Changes in the DoW?
Get the free 2026 DoW Directory.
It's both a startup “go-to-market” guide and the first-ever directory of the Department of War, and an invaluable desk reference for figuring out who, what, and where.
Download the free DoW Directory here.
Keep current with updates here.
Order a desk copy here.
Filed under: National Security
The analysis is spot on. The problem is that government has one mission: spend more money in the next fiscal year. Being smart with the people's money is not important.
In the last article of yours I read, this change was an awesome thing and the go-fast-break-things model was awesome… now it may not be. I'm guessing these are the wrong people to lead our defense: a bunch of VC money managers and a distressed-asset liquidator.
I agree that more up-front work can be done, as the post suggests, but more emphasis needs to be placed on the validations. They need to go beyond assessing what the real problems are and include an estimate of warfighting impact: the operational research that can help drive future investment decisions.
Really interesting perspective on how restructuring the Department of War’s acquisition system could accidentally speed up the wrong projects if problem definition isn’t done first. It’s helpful to think about why speed alone isn’t enough — we need smart prioritization too!
Steve, this is exactly the structural risk. A PE-style portfolio review cadence can increase velocity, but if the upstream problem selection is weak, you simply institutionalise faster misalignment.
The most important section is the front-end discipline: a wide problem funnel, tightly written problem briefs, and explicit validation before solution teams spin up. Without that, portfolio reviews revert to “prototype works” and “users like it”, rather than “this materially changes operational outcomes”.
One additional layer I would argue for in defence is an explicit operational impact hypothesis tied to real units, budget holders and deployment pathways. Not just whether the problem is real, but whether it sits inside an adoption pathway that can move from pilot to programme of record. Otherwise the system still produces elegant experiments that cannot scale.
This becomes even more critical in an alliance context. If we want venture-backed dual-use companies to build for NATO rather than just individual services, problem definition, validation standards and interoperability assumptions need to be shared across partners from the outset. Otherwise we optimise locally and fragment globally.
We are currently exploring how to pilot a more structured adoption pipeline with a NATO capital partner and a frontline service, precisely to address this gap between validated problem, venture formation and institutional uptake. Your framing of “don’t make the wrong things go faster” is a useful north star for designing that architecture.