It only took 20 years, but the Strategic Management Society now believes the Lean Startup is a strategy

I’ve always thought of myself as a practitioner. In the startups I was part of, the only “strategy” was my marketing tactics for making the VP of Sales the richest person in the company. After I retired, I created Customer Development and co-created the Lean Startup as a simple methodology that codified founders’ best practices – in a language and process that was easy to understand and implement. All from a practitioner’s point of view.

So you can imagine my surprise when I received the annual “Strategy Leadership Impact” Award from the Strategic Management Society (SMS). The SMS is the strategy field’s main professional society with over 3,100 members. They publish three academic journals: the Strategic Management Journal, the Strategic Entrepreneurship Journal, and the Global Strategy Journal.

The award read: “[Steve Blank], as the Father of Modern Entrepreneurship, changed how startups are built, how entrepreneurship is taught, how science is commercialized, and how companies and government innovate.”

Here’s my acceptance speech.


Thank you for the Strategy Leadership Impact Award. As a practitioner standing in front of a room full of strategists, I’m humbled and honored.

George Bernard Shaw reminded us that the Americans and the British are “two peoples separated by a common language.” I’ve often felt the same way about the gap between practitioners and strategists.

The best analogy I can offer: after a long plane flight to Sydney, I jumped into a taxi, and as the driver started talking I started panicking – wondering what language he was speaking, and how I was going to be able to communicate with him.

It took me almost until we got to the hotel to realize he was speaking English.

That’s sometimes how it feels between those who do strategy and those who study it.

So today, I’d like to share with you how this practitioner accidentally became a strategist, and how that journey led to what we now call the Lean Startup.

It’s a story that begins, perhaps surprisingly, with what I call the Secret History of Silicon Valley.

—

Silicon Valley’s roots lie in solving urgent, high-uncertainty national-security problems during World War II and the Cold War with the Soviet Union.

During WWII, the United States mastered scale and exploitation—mass-producing ships, aircraft, and tanks through centralized coordination. Ford, GM, DuPont, GE and others became the “arsenals of democracy.” In less than four years the U.S. built 300,000 aircraft, 124,000 ships of all types, and 86,000 tanks.

But simultaneously we created something radically different, something no other nation did – the Office of Scientific Research and Development (OSRD). This was a decentralized network of university labs that worked on military problems involving electronics, chemistry and physics. These labs solved problems where outcomes were unknown and time horizons uncertain—exactly the conditions that later came to define innovation under uncertainty.

These labs delivered radar, rockets, proximity fuses, penicillin, sulfa drugs, and for the first two years ran the U.S. nuclear weapons program.

In hindsight, way before we had the language, the U.S. was practicing dynamic capabilities: the capacity to sense, seize, and transform under extreme uncertainty. It was also an early case of organizational ambidexterity—balancing mass production with rapid exploration.

One branch of the Office of Scientific Research and Development – the branch focused on electronic warfare – became the true genesis of the Valley’s innovation model.

In 1943, U.S. bombers over Europe faced catastrophic losses—4–5% of planes were shot down on every mission. The Germans had built a deadly effective radar-based air defense system. The U.S. responded by creating the Harvard Radio Research Lab, led by Stanford’s Fred Terman. The lab had nothing to do with Harvard, Radio or Research.

Its goal was to rapidly develop countermeasures: jammers, receivers, and radar intelligence.

In the span of three years, Terman’s lab created an entire electronic ecosystem to defeat the German air defense systems. By war’s end, U.S. factories were running 24/7, mass-producing tens of thousands of the most complicated electronics and microwave systems, which went on every bomber over Europe and Japan.

These teams were interdisciplinary, field-connected, and operating in continuous learning cycles:

  • Scientists and engineers worked directly with pilots and operators—what we’d now call frontline customer immersion.
  • They built rapid prototypes—the Minimum Viable Products of their time.
  • They engaged in short feedback loops between lab and battlefield—what John Boyd would later formalize as the OODA loop.
  • They were, in essence, running a learning organization under fire—a live example of strategic adaptation and iterative sensemaking.

But what does this have to do with Silicon Valley?

When the war ended, Terman came back to Stanford, became Dean of Engineering, and institutionalized this model. He embedded government research into the university, recruited his wartime engineers as faculty, and redefined Stanford as an outward-facing institution.

While most universities pursued knowledge exploitation – publishing, teaching, and extending established disciplines – Terman did something that few universities in the 1950s, ’60s, or ’70s were doing: he pursued knowledge exploration and recombination, turning Stanford into an outward-facing university focused on commercializing its inventions.

  1. He reconfigured incentives — encouraging professors to consult and found companies, an unprecedented act of strategic boundary spanning.
  2. He believed spinning out microwave and electronics companies from his engineering labs was good for the university and for the country.
  3. He embedded exploration in the curriculum — mixing physics, electronics, and systems engineering.
  4. He cultivated external linkages — he and his professors served on multiple advisory boards with the Department of Defense, intelligence agencies, and industry.

Terman’s policies – by then he was Provost – effectively turned Stanford into an early platform for innovation ecosystems, decades before the term existed.

The technology spinouts from Stanford and the small businesses springing up nearby were by their very nature managing uncertainty, complexity, and unpredictability. These early Valley entrepreneurs weren’t “lone inventors”; they were learning organizations, long before that term existed. They were continuously testing, learning, and iterating based on real operational data and customer feedback rather than long static plans.

However, at the time there was no risk capital to guide them. They were undercapitalized small businesses chasing orders and trying to stay in business.

It wasn’t until the mid-1970s – when the “prudent man” rule was revised for pension funds and venture capital began to be treated as an institutional asset class – that venture capital at scale became a business in Silicon Valley. This is the moment when finance replaced learning as the dominant logic.

For the next 25 years, venture investors – most of them with MBAs or backgrounds in finance – treated startups like smaller versions of large companies. None of them had worked on Cold War projects, nor were they familiar with the agile and customer-centric models defense innovation organizations had built. No VC was thinking about whether lessons from the corporate strategic management thinkers of the time could be used in startups. Instead, VCs imposed a waterfall mindset – business plans and execution of the strategy in the plan – the opposite of how the Valley first innovated. The earlier language of experimentation, iteration, and customer learning disappeared.


And now we come full circle – to the Lean Startup.

At the turn of the century, after 21 years as a practitioner and with a background working on Cold War weapons systems, I retired from startups and had time to think.

The more I looked at the business I had been in, and at the boards I was now sitting on, the more I realized a few things.

  1. No business plan survived first contact with customers.
  2. On day one all startups have is a series of untested hypotheses.
    • Yet startups were executing rather than learning.
  3. Our strategic language and tools—all designed for large firms—were useless in contexts of radical uncertainty.
  4. Startups that succeeded were the ones that learned from their customers and iterated on the plan. Those that didn’t, ended up selling off their furniture.
  5. Most importantly – as I started reading all the literature I found on innovation strategy, almost all of it was about corporate innovation.
    • We had almost a century of management tools and language to describe corporate strategy for both growth and innovation – yet there were no tools, language or methods for startups.
    • But it was worse: because neither practitioners nor their investors were strategists, we had been trapped in thinking that startups were smaller versions of large companies.
    • When the reality was that at their core, large companies were executing known business models, but startups? Startups were searching for business models
    • This distinction between startup search and large company execution had never been clearly articulated.
  6. There was a mismatch between the reality and practice.
    • We needed to reframe entrepreneurship as a strategic process, not a financial one
  7. I realized that every startup believed their journey was unique, and thought they had to find their own path to profitability and scale.
  8. That was because we had no shared methodology, language or common tools. So I decided to build them.
    • The first was Customer Development – at its heart a very simple idea – there are no facts inside the building – so get outside.
    • Here we were reinventing the best practices of the wartime military organizations – and of Lead User Research and Discovery-Driven Planning – this time for startups.
    • The goal is to test all the business model hypotheses – including the two most important, customer and value proposition – which we call product/market fit.
  9. The next, Agile Engineering – a process to build products incrementally and iteratively – was a perfect match for customer development.
  10. And finally, I repurposed Alexander Osterwalder’s Business Model Canvas to map the hypotheses needed to commercialize a technology.

The sum of these tools – Customer Development, Agile Engineering and the Business Model Canvas – is the Lean Methodology.

What I had done was turn a craft into a discipline of strategic learning—a continuous loop of hypothesis testing, experimentation via minimum viable products, and adaptation via pivots.

Lean is a codified system for strategy formation under uncertainty.
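That loop can be sketched in code. This is only an illustrative sketch – the function and field names (`run_lean_loop`, `experiment`, `"supported"`, `"revised"`) are my own shorthand, not part of any published Lean Startup tooling:

```python
# A minimal sketch of the Lean loop: treat each business model
# assumption as a hypothesis, test it with an experiment (customer
# interviews, an MVP), and pivot when the evidence contradicts it.
# All names here are illustrative, not an official API.

def run_lean_loop(hypotheses, experiment, max_iterations=10):
    """Iterate each hypothesis until validated or iterations run out."""
    validated = []
    for h in list(hypotheses):
        for _ in range(max_iterations):
            evidence = experiment(h)       # run an MVP test or interviews
            if evidence["supported"]:
                validated.append(h)        # keep the (possibly revised) hypothesis
                break
            h = evidence["revised"]        # pivot: revise the hypothesis, retest

    return validated

# Toy stand-in for a real experiment: it only "validates" a
# hypothesis once it names a specific customer segment.
def toy_experiment(h):
    if "enterprise" in h:
        return {"supported": True}
    return {"supported": False, "revised": h + " for enterprise buyers"}

result = run_lean_loop(["billing tool"], toy_experiment)
# The original hypothesis comes back pivoted toward a segment
# that the (toy) evidence supports.
```

The point of the sketch is the shape of the process, not the code: the plan is an input to testing, not a script to execute.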

Over the last two decades Lean has turned into the de facto standard for starting new ventures. The classes I created at Stanford were adopted by the National Science Foundation and the National Institutes of Health, to commercialize science in the U.S.

And while contemporary entrepreneurs didn’t know it, they were adopting the continuous learning cycles that had fueled wartime innovation.

What comes next is going to be even more interesting.

We’re going to remember – for better or worse – 2025 as another inflection point.

AI in everything, synthetic biology, and capital at previously unimaginable scale, are collapsing the distance between exploration and exploitation.

The boundary between discovery, invention, and strategy is dissolving.

Given how fast things are changing I’m looking forward to seeing strategy itself become a dynamic capability—not a plan, but a process of learning faster than the environment changes.

I can’t wait to see what you all create next.

In closing, my work at Stanford was made possible by the unflinching support from Tom Byers, Kathy Eisenhardt and Riitta Katila in the Stanford Technology Ventures Program who let a practitioner into the building.

Thank you.

Lean LaunchPad at Stanford – 2025

The PowerPoints embedded in this post are best viewed on steveblank.com

We just finished the 15th annual Lean LaunchPad class at Stanford. The class had gotten so popular that in 2021 we started teaching it in both the winter and spring sessions.

During the 2025 spring quarter the eight teams spoke to 935 potential customers, beneficiaries and regulators. Most students spent 15-20 hours a week on the class, about double that of a normal class.

This Class Launched a Revolution in Teaching Entrepreneurship
This class was designed to break out of “how to write a business plan” as the capstone of entrepreneurial education. A business plan assumed that all startups needed to do was write a plan, raise money, and then execute the plan. We overturned that orthodoxy when we pointed out that while existing organizations execute business models, startups are searching for them – and that a startup is a temporary organization designed to search for a repeatable and scalable business model. This class was designed to teach startups how to search for a business model.
Several government-funded programs have adopted this class at scale. The first was in 2011 when we turned this syllabus into the curriculum for the National Science Foundation I-Corps. Errol Arkilic, the then head of commercialization at the National Science Foundation, adopted the class saying, “You’ve developed the scientific method for startups, using the Business Model Canvas as the laboratory notebook.”

Below are the Lessons Learned presentations from the spring 2025 Lean LaunchPad.

Team Cowmeter – early detection of cow infections through biological monitoring of milk.

If you can’t see the Team Cowmeter presentation click here

I-Corps at the National Institutes of Health
In 2013 I partnered with UCSF and the National Institutes of Health to offer the Lean LaunchPad class for Life Science and Healthcare (therapeutics, diagnostics, devices and digital health). In 2014, in conjunction with the National Institutes of Health, I took the UCSF curriculum and developed and launched the I-Corps @ NIH program.

Team NowPilot – AI copilot for enhancing focus and executive function.

If you can’t see the Team NowPilot presentation click here

I-Corps at Scale
I-Corps is now offered in 100 universities and has trained over 9,500 scientists and engineers: 7,800 participants in 2,546 teams at I-Corps at NSF (National Science Foundation), 950 participants in 317 teams at I-Corps at NIH, and 580 participants in 188 teams at Energy I-Corps (at the DOE). 15 universities in Japan now teach the class.

Team Godela – AI physics engine – with a first disruptive market in packaging.

If you can’t see the Team Godela presentation click here

$4 billion in Venture Capital For I-Corps Teams
1,380 of the NSF I-Corps teams launched startups raising $3.166 billion. Over 300 I-Corps at NIH teams have collectively raised $634 million. Energy I-Corps teams raised $151 million in additional funding.

Team ProspectAI – An AI sales development agent for lean sales teams.

If you can’t see the Team ProspectAI presentation click here

Mission-Driven Entrepreneurship
In 2016, I co-created both the Hacking for Defense course with Pete Newell and Joe Felter and the Hacking for Diplomacy course with Jeremy Weinstein at Stanford. In 2022, Steve Weinstein created Hacking for Climate and Sustainability. In 2024, Jennifer Carolan launched Hacking for Education at Stanford.

Team VLAB – accelerating clinical trials with AI orchestration of data.

If you can’t see the team VLAB presentation click here

Design of This Class
While the Lean LaunchPad students are experiencing what appears to them to be a fully hands-on, experiential class, it’s a carefully designed illusion. In fact, it’s highly structured. The syllabus has been designed so that we are offering continual implicit guidance, structure, and repetition. This is a critical distinction between our class and an open-ended experiential class.

Guidance, Direction and Structure
For example, students start the class with their own initial guidance – they believe they have an idea for a product or service (Lean LaunchPad/I-Corps) or have been given a clear real-world problem (Hacking for Defense). Coming into the class, students believe their goal is to validate their commercialization or deployment hypotheses. (The teaching team knows that over the course of the class, students will discover that most of their initial hypotheses are incorrect.)

Team Blix – IRB clinical trial compliance / A control layer for AI governance for financial services.

If you can’t see the team Blix presentation click here

The Business Model Canvas
The business model / mission model canvas offers students guidance, explicit direction, and structure. First, the canvas offers a complete, visual roadmap of all the hypotheses they will need to test over the entire class. Second, the canvas helps the students goal-seek by visualizing what an optimal endpoint would look like – finding product/market fit. Finally, the canvas provides students with a map of what they learn week-to-week through their customer discovery work. I can’t overemphasize the important role of the canvas. Unlike an incubator or accelerator with no frame, the canvas acts as the connective tissue – the frame – that students can fall back on if they get lost or confused. It allows us to teach the theory of how to turn an idea, need, or problem into commercial practice, week by week a piece at a time.

Team Plotline – A smart marketing calendar for an author’s book launch.

If you can’t see the team Plotline presentation click here

Lean LaunchPad Tools
The tools for customer discovery (videos, sample experiments, etc.) offer guidance and structure for students to work outside the classroom. The explicit goal of 10-15 customer interviews a week along with the requirement for building a continual series of minimal viable products provides metrics that track the team’s progress. The mandatory office hours with the instructors and support from mentors provide additional guidance and structure.

Team Eluna/Driftnet  – Data Center data aggregation and energy optimization software.

If you can’t see the team Eluna/Driftnet presentation click here

AI Embedded in the Class
This was the first year where all teams used AI to help create their business model canvases, build working MVPs in hours, generate customer questions, and analyze and summarize interviews.

It Takes A Village
While I authored this blog post, this class is a team project. The secret sauce of the success of the Lean LaunchPad at Stanford is the extraordinary group of dedicated volunteers supporting our students in so many critical ways.

The teaching team consisted of myself and:

  • Steve Weinstein, partner at America’s Frontier Fund, 30-year veteran of Silicon Valley technology companies and Hollywood media companies. Steve was CEO of MovieLabs, the joint R&D lab of all the major motion picture studios.
  • Lee Redden – CTO and co-founder of Blue River Technology (acquired by John Deere) who was a student in the first Lean LaunchPad class 14 years ago!
  • Jennifer Carolan, Co-Founder and Partner at Reach Capital, the leading education VC, and author of the Hacking for Education class.

Our teaching assistants this year were Arthur C. Campello, Anil Yildiz, Abu B. Rogers and Tireni Ajilore.

Mentors helped the teams understand if their solutions could be a commercially successful business. Thanks to Jillian Manus, Dave Epstein, Robert Feldman, Bobby Mukherjee, Kevin Ray, Deirdre Clute, Robert Locke, Doug Biehn, and John Danner. Martin Saywell from the Distinguished Careers Institute joined the Blix team. The mentor team was led by Todd Basche.

Summary
While the Lean LaunchPad/I-Corps curriculum was a revolutionary break with the past, it’s not the end. In the last decade innumerable variants have emerged. The class we teach at Stanford has continued to evolve. Better versions from others will appear. AI is already having a major impact on customer discovery and validation, and we had each team list the AI tools they used. And one day another revolutionary break will take us to the next level.

Hacking for Defense @ Stanford 2025 – Lessons Learned Presentations

The videos and PowerPoints embedded in this post are best viewed on steveblank.com

We just finished our 10th annual Hacking for Defense class at Stanford.

What a year.

Hacking for Defense, now in 70 universities, has teams of students working to understand and help solve national security problems. At Stanford this quarter the 8 teams of 41 students collectively interviewed 1,106 beneficiaries, stakeholders, requirements writers, program managers, industry partners, etc. – while simultaneously building a series of minimal viable products and developing a path to deployment.

This year’s problems came from the U.S. Army, U.S. Navy, CENTCOM, Space Force/Defense Innovation Unit, the FBI, IQT, and the National Geospatial-Intelligence Agency.

We opened this year’s final presentations session with inspiring remarks by Joe Lonsdale on the state of defense technology innovation and a call to action for our students. During the quarter, guest speakers in the class included former National Security Advisor H.R. McMaster, former Secretary of Defense Jim Mattis, Deputy Commander of the 18th Airborne Corps John Cogbill, former Assistant Secretary of Defense for Cyber Policy Michael Sulmeyer, and Managing Director of Cerberus Capital John Gallagher.

“Lessons Learned” Presentations
At the end of the quarter, each of the eight teams gave a final “Lessons Learned” presentation along with a 2-minute video to provide context about their problem. Unlike traditional demo days or Shark Tanks – which are, “Here’s how smart I am, and isn’t this a great product, please give me money” – the Lessons Learned presentations tell the story of each team’s 10-week journey and hard-won learning and discovery. For all of them it’s a roller coaster narrative describing what happens when you discover that everything you thought you knew on day one was wrong, and how they eventually got it right.
While all the teams used the Mission Model Canvas, Customer Development and Agile Engineering to build Minimal Viable Products, each of their journeys was unique.

This year we had the teams add two new slides at the end of their presentation: 1) tell us which AI tools they used, and 2) their estimate of progress on the Technology Readiness Level and Investment Readiness Level.

Here’s how they did it and what they delivered.

Team Omnyra – improving visibility into AI-generated bioengineering threats.

If you can’t see the team Omnyra summary video click here

If you can’t see the Omnyra presentation click here

These are “Wicked” Problems
Wicked problems are really complex problems – ones with multiple moving parts, where the solution isn’t obvious and lacks a definitive formula. The types of problems our Hacking for Defense students work on fall into this category. They are often ambiguous. They start with a problem from a sponsor, and not only is the solution unclear, but figuring out how to acquire and deploy it is also complex. Most often, students find in hindsight that the problem was a symptom of a more interesting and complex problem – and that acquisition of solutions in the Department of Defense is unlike anything in the commercial world. And the stakeholders and institutions often have different relationships with each other – some are collaborative, some have pieces of the problem or solution, and others might have conflicting values and interests.
The figure shows the types of problems Hacking for Defense students encounter, with the most common ones shaded.

Team HydraStrike – bringing swarm technology to the maritime domain.

If you can’t see the HydraStrike summary video click here.


If you can’t see the HydraStrike presentation click here

Mission-Driven Entrepreneurship
This class is part of a bigger idea – Mission-Driven Entrepreneurship. Instead of students or faculty coming in with their own ideas, we ask them to work on societal problems, whether they’re problems for the State Department, the Department of Defense, non-profits/NGOs, the Oceans and Climate, or anything else the students are passionate about. The trick is we use the same Lean LaunchPad / I-Corps curriculum – and the same class structure, experiential and hands-on – driven this time by a mission model, not a business model. (The National Science Foundation and the Common Mission Project have helped promote the expansion of the methodology worldwide.)
Mission-driven entrepreneurship is the answer to students who say, “I want to give back. I want to make my community, country or world a better place, while being challenged to solve some of the toughest problems.”

Team HyperWatch – tracking hypersonic threats.

If you can’t see the HyperWatch video click here

If you can’t see the HyperWatch presentation click here

It Started With An Idea
Hacking for Defense has its origins in the Lean LaunchPad class I first taught at Stanford in 2011. I observed that teaching case studies and/or how to write a business plan as a capstone entrepreneurship class didn’t match the hands-on chaos of a startup. Furthermore, there was no entrepreneurship class that combined experiential learning with the Lean methodology. Our goal was to teach both theory and practice. The same year we started the class, it was adopted by the National Science Foundation to train Principal Investigators who wanted to get a federal grant for commercializing their science (an SBIR grant). The NSF observed, “The class is the scientific method for entrepreneurship. Scientists understand hypothesis testing” and relabeled the class as the NSF I-Corps (Innovation Corps). I-Corps became the standard for science commercialization for the National Science Foundation, National Institutes of Health and the Department of Energy, to date training 3,051 teams and launching 1,300+ startups.

Team ChipForce – Securing U.S. dominance in critical minerals.

If you can’t see the ChipForce video click here

If you can’t see the ChipForce presentation click here
Note: After briefing the Department of Commerce, the ChipForce team members were offered jobs with the department.

Origins Of Hacking For Defense
In 2016, brainstorming with Pete Newell of BMNT and Joe Felter at Stanford, we observed that students in our research universities had little connection to the problems their government was trying to solve or the larger issues civil society was grappling with. As we thought about how we could get students engaged, we realized the same Lean LaunchPad/I-Corps class would provide a framework to do so. That year we launched both Hacking for Defense and Hacking for Diplomacy (with Professor Jeremy Weinstein and the State Department) at Stanford. The Department of Defense adopted and scaled Hacking for Defense across 60 universities while Hacking for Diplomacy has been taught at Georgetown, James Madison University, Rochester Institute of Technology, University of Connecticut and now Indiana University, sponsored by the Department of State Bureau of Diplomatic Security (see here).

Team ArgusNet – instant geospatial data for search and rescue.

If you can’t see the ArgusNet video click here

If you can’t see the ArgusNet presentation click here

Goals for Hacking for Defense
Our primary goal for the class was to teach students Lean Innovation methods while they engaged in national public service.
In the class we saw that students could learn about the nation’s threats and security challenges while working with innovators inside the DoD and Intelligence Community. At the same time the experience would introduce to the sponsors, who are innovators inside the Department of Defense (DOD) and Intelligence Community (IC), a methodology that could help them understand and better respond to rapidly evolving threats. We wanted to show that if we could get teams to rapidly discover the real problems in the field using Lean methods, and only then articulate the requirements to solve them, defense acquisition programs could operate at speed and urgency and deliver timely and needed solutions.
Finally, we wanted to familiarize students with the military as a profession and help them better understand its expertise, and its proper role in society. We hoped it would also show our sponsors in the Department of Defense and Intelligence community that civilian students can make a meaningful contribution to problem understanding and rapid prototyping of solutions to real-world problems.

Team NeoLens – AI-powered troubleshooting for military mechanics.

If you can’t see the NeoLens video click here

If you can’t see the NeoLens presentation click here

Go-to-Market/Deployment Strategies
The initial goal of the teams is to ensure they understand the problem. The next step is to see if they can find mission/solution fit (the DoD equivalent of commercial product/market fit). But most importantly, the class teaches the teams about the difficult and complex path of getting a solution into the hands of a warfighter/beneficiary. Who writes the requirement? What’s an OTA? What’s the color of money? What’s a Program Manager? Who owns the current contract? …

Team Omnicomm – improving the quality, security and resiliency of communications for special operations units.

If you can’t see the Omnicomm video click here


If you can’t see the Omnicomm presentation click here

Mission-Driven in 70 Universities and Continuing to Expand in Scope and Reach
What started as a class is now a movement.
From its beginning with our Stanford class, Hacking for Defense is now offered in over 70 universities in the U.S., as well as in the UK as Hacking for the MOD and in Australia. In the U.S., the course is a program of record supported by Congress. H4D is sponsored by the Common Mission Project, the Defense Innovation Unit (DIU), and the Office of Naval Research (ONR). Corporate partners include Boeing, Northrop Grumman and Lockheed Martin.
Steve Weinstein started Hacking for Impact (Non-Profits) and Hacking for Local (Oakland) at U.C. Berkeley, and Hacking for Oceans at both Scripps and UC Santa Cruz, as well as Hacking for Climate and Sustainability at Stanford. Jennifer Carolan started Hacking for Education at Stanford.

Team Strom – simplified mineral value chain.

If you can’t see the Strom video click here

If you can’t see the Strom presentation click here

What’s Next For These Teams?
When they graduate, the Stanford students on these teams have their pick of jobs in startups, companies, and consulting firms. This year, seven of our teams applied to the Defense Innovation Unit accelerator – the DIU Defense Innovation Summer Fellows Program – Commercialization Pathway. Seven were accepted. This further reinforced our thinking that Hacking for Defense has turned into a pre-accelerator – preparing students to transition their learning from the classroom to deployment.

See the teams present in person here

It Takes A Village
While I authored this blog post, this class is a team project. The secret sauce of the success of Hacking for Defense at Stanford is the extraordinary group of dedicated volunteers supporting our students in so many critical ways.

The teaching team consisted of myself and:

  • Pete Newell, retired Army Colonel and ex Director of the Army’s Rapid Equipping Force, now CEO of BMNT.
  • Joe Felter, retired Army Special Forces Colonel; and former deputy assistant secretary of defense for South Asia, Southeast Asia, and Oceania; and currently the Director of the Gordian Knot Center for National Security Innovation at Stanford which we co-founded in 2021.
  • Steve Weinstein, partner at America’s Frontier Fund, 30-year veteran of Silicon Valley technology companies and Hollywood media companies. Steve was CEO of MovieLabs, the joint R&D lab of all the major motion picture studios.
  • Chris Moran, Executive Director and General Manager of Lockheed Martin Ventures, the venture capital investment arm of Lockheed Martin.
  • Jeff Decker, a Stanford researcher focusing on dual-use research. Jeff served in the U.S. Army as a special operations light infantry squad leader in Iraq and Afghanistan.

Our teaching assistants this year were Joel Johnson, Rachel Wu, Evan Twarog, Faith Zehfuss, and Ethan Hellman.

31 Sponsors, Business and National Security Mentors
The teams were assisted by the originators of their problems – the sponsors.

Sponsors gave us their toughest national security problems: Josh Pavluk, Kari Montoya, Nelson Layfield, Mark Breier, Jason Horton, Stephen J. Plunkett, Chris O’Connor, David Grande, Daniel Owins, Nathaniel Huston, Joy Shanaberger, and David Ryan.
National Security Mentors helped students, who came into the class with no knowledge of the Department of Defense and the FBI, understand the complexity, intricacies and nuances of those organizations: Katie Tobin, Doug Seich, Salvadore Badillo-Rios, Marco Romani, Matt Croce, Donnie Hasseltine, Mark McVay, David Vernal, Brad Boyd, Marquay Edmonson.
Business Mentors helped the teams understand if their solutions could be a commercially successful business: Diane Schrader, Marc Clapper, Laura Clapper, Eric Byler, Adam Walters, Jeremey Schoos, Craig Seidel, Rich “Astro” Lawson.

Thanks to all!

Teaching National Security Policy with AI

The videos embedded in this post are best viewed on steveblank.com

International Policy students will be spending their careers in an AI-enabled world. We wanted our students to be prepared for it. This is why we’ve adopted and integrated AI in our Stanford national security policy class – Technology, Innovation and Great Power Competition.

Here’s what we did, how the students used it, and what they (and we) learned.


Technology, Innovation and Great Power Competition is an international policy class at Stanford (taught by me, Eric Volmar, and Joe Felter). The course provides future policy and engineering leaders with an appreciation of the geopolitics of the U.S. strategic competition with great power rivals and the role critical technologies are playing in determining the outcome.

This course includes all that you would expect from a Stanford graduate-level class in the Masters in International Policy – comprehensive readings, guest lectures from current and former senior policy officials/experts, and deliverables in the form of written policy papers. What makes the class unique is that this is an experiential policy class. Students form small teams and embark on a quarter-long project that gets them out of the classroom to:

  • select a priority national security challenge, and then …
  • validate the problem and propose a detailed solution tested against actual stakeholders in the technology and national security ecosystem

The class combines multiple teaching tools.

  • Real world – Students work in teams on real problems from government sponsors
  • Experiential – They get out of the building to interview 50+ stakeholders
  • Perspectives – They get policy context and insights from lectures by experts
  • And this year… Using AI to Accelerate Learning

Rationale for AI
In using this quarter to introduce AI, we had three things going for us: 1) by fall 2024 AI tools were good and getting exponentially better, 2) Stanford had set up an AI Playground enabling students to use a variety of AI tools (ChatGPT, Claude, Perplexity, NotebookLM, Otter.ai, Mermaid, Beautiful.ai, etc.), and 3) many students were already using AI in classes, but it was usually ambiguous what they were allowed to do.

Policy students have to read reams of documents weekly. Our hypothesis was that our student teams could use AI to ingest and summarize content, identify key themes and concepts across the content, provide an in-depth analysis of critical content sections, and then synthesize and structure their key insights and apply them to solve their specific policy problem. They did all that, and much, much more.

While Joe Felter and I had arm-waved "we need to add AI to the class," Eric Volmar was the real AI hero on the teaching team. As an AI power user, Eric was most often ahead of our students on AI skills. He threw down a challenge to the students to continually use AI creatively and told them that they would be graded on it. He pushed them hard on AI use in office hours throughout the quarter. The results below speak for themselves.

If you’re not familiar with these AI tools in practice, it’s worth watching these one-minute videos.

Team OSC
Team OSC was trying to understand: what is the appropriate level of financial risk for the U.S. Department of Defense to provide loans or loan guarantees in technology industries?

The team started using AI to do what we had expected – summarizing the stack of weekly policy documents using Claude 3.5. And like all teams, their unexpected use of AI was to create new leads for their stakeholder interviews. They found that they could ask AI for a list of leaders who were involved in similar programs, or who were involved in their program’s initial stages of development.

See how Team OSC summarized policy papers here:

If you can’t see the video click here

Claude was also able to create a list of leaders within the Department of Energy Title 17 credit programs, EXIM, DFC, and other federal credit programs that the team should interview. In addition, it created a list of leaders within the Congressional Budget Office and the Office of Management and Budget who would be able to provide insights. See the demo here:

If you can’t see the video click here
The team also used AI to transcribe podcasts. They noticed that key leaders of the organizations their problem came from had produced podcasts and YouTube videos. They used Otter.ai to transcribe these. That provided additional context for when they did interview them and allowed the team to ask insightful new questions.

If you can’t see the video click here

Note the power of fusing AI with interviews. The interviews ground the knowledge in the team’s lived experience.

The team came up with a use case the teaching team hadn’t thought of – using AI to critique the team’s own hypotheses. The AI not only gave them criticism but supported it with links from published scholars. See the demo here:

If you can’t see the video click here

Another use the teaching team hadn’t thought of was using Mermaid AI to create graphics for their weekly presentations. See the demo here:

If you can’t see the video click here

The surprises from this team kept coming. Their last was using Beautiful.ai to generate PowerPoint presentations. See the demo here:

If you can’t see the video click here

For all teams, using AI tools was a learning/discovery process all its own. Students were largely unfamiliar with most of the tools on day 1.

Team OSC suggested that students start using AI tools early in the quarter and experiment with tools like ChatGPT and Otter.ai. Tools with steep learning curves, like Mermaid, should be adopted at the very start of the project to train their models.

Team OSC’s AI tools summary: AI tools are not perfect, so make sure to cross-check summaries, insights, and transcriptions for accuracy and relevance. Be really critical of their outputs. The biggest takeaway is that AI works best when paired with human effort.

Team FAAST
The FAAST team was trying to understand: how can the U.S. improve and scale the DOE FASST program in the urgent context of great power competition?

Team FAAST started using AI to do what we had expected, summarizing the stack of weekly policy documents they were assigned to read and synthesizing interviews with stakeholders.

One of the features of ChatGPT this team appreciated – and important for a national security class – was the temporary chat feature: data they entered would not be used to train OpenAI’s models. See the demo below.

If you can’t see the video click here

The team used AI to do a few new things we didn’t expect – generating emails to stakeholders and creating interview questions. During the quarter the team used ChatGPT, Claude, Perplexity, and NotebookLM. By the end of the 10-week class they were using AI to do even more things we hadn’t expected. Their use of AI expanded to include simulating interviews. They gave ChatGPT specific instructions on who they wanted it to act like, and it provided personalized and custom answers. See the example here.

If you can’t see the video click here

Learning-by-doing was a key part of this experiential course. The big idea is that students learn both the method and the subject matter together. By learning it together, you learn both better.

Finally, they used AI to map stakeholders, get advice on their next policy move, and asked ChatGPT to review their weekly slides (by screenshotting the slides and putting them into ChatGPT and asking for feedback and advice.)

The FAAST team’s AI tool summary: ChatGPT was especially good with images and screenshots, for multi-step tasks, and when they wanted to use custom instructions, as they did for the stakeholder interviews. Claude was more conversational and human in its writing, so they used it when sending emails. Perplexity was better for research because it provides citations – you can access the web and get directed to the source it’s citing. NotebookLM was something they tried out, but it was not as successful. It was a cool tool that let them summarize specific policy documents into a podcast, but the summaries were often pretty vague.

Team NSC Energy
Team NSC Energy was working on a National Security Council problem, “How can the United States generate sufficient energy to support compute/AI in the next 5 years?”

At the start of the class, the team began by using ChatGPT to summarize their policy papers and generate tailored interview questions, while Claude was used to synthesize research for background understanding. Because ChatGPT occasionally hallucinated information, by the end of the class they were cross-validating the summaries via Perplexity Pro.

The team also used ChatGPT and Mermaid to organize their thoughts and determine who they wanted to talk to. ChatGPT generated the code to put into the Mermaid flowchart organizer. Mermaid has its own language, so ChatGPT was helpful – the team didn’t have to learn all the syntax.
See the video of how Team NSC Energy used ChatGPT and Mermaid here:

If you can’t see the video click here

Team Alpha Strategy
The Alpha Strategy team was trying to discover whether the U.S. could use AI to create a whole-of-government decision-making factory.

At the start of class, Team Alpha Strategy used ChatGPT-4o for policy document analysis and summary, as well as for stakeholder mapping. However, they discovered that going one by one through the countless articles was time-consuming. So the team pivoted to using NotebookLM for document search and cross-analysis. See the video of how Team Alpha Strategy used NotebookLM here:

If you can’t see the video click here

The other tools the team used were custom GPTs to build stakeholder maps and diagrams and to organize interview notes. There is a wide variety of specialized GPTs; one the team found really helpful was a scholar GPT.
See the video of how Team Alpha Strategy used custom GPTs:

If you can’t see the video click here

Like other teams, Alpha Strategy used ChatGPT to summarize their interview notes and to create flow charts to paste into their weekly presentations.

Team Congress
The Congress team was exploring the question, “if the Department of Defense were given economic instruments of power, which tools would be most effective in the current techno-economic competition with the People’s Republic of China?”

As other teams found, Team Congress first used ChatGPT to extract key themes from hundreds of pages of readings each week and from press releases, articles, and legislation. They also used it for mapping and diagramming – to identify potential relationships between stakeholders, or to creatively suggest alternate visualizations.

When Team Congress wasn’t able to reach their sponsor in the initial two weeks of the class, much like Team OSC, they used AI tools to pretend to be their sponsor, a member of the Defense Modernization Caucus. Once they realized its utility, they continued to do mock interviews using AI role play.

The team also used customized models of ChatGPT, but in their case found these were limited in the number of documents they could upload, because they had a lot of content. So they used retrieval-augmented generation, which takes in a user’s query, matches it with relevant sources in their knowledge base, and feeds the results back as the output. See the video of how Team Congress used retrieval-augmented generation here:

If you can’t see the video click here
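For readers who haven’t seen the technique, here is a minimal sketch of the retrieval step behind retrieval-augmented generation. It matches a query against a small knowledge base using simple bag-of-words cosine similarity – production systems use learned embeddings, and the policy documents below are hypothetical examples:

```python
# Minimal sketch of the retrieval step in retrieval-augmented generation (RAG).
# Real systems use learned embeddings; this uses bag-of-words cosine similarity.
import math
from collections import Counter

def vectorize(text: str) -> Counter:
    """Turn text into a bag-of-words term-frequency vector."""
    return Counter(text.lower().split())

def cosine(a: Counter, b: Counter) -> float:
    """Cosine similarity between two term-frequency vectors."""
    dot = sum(a[term] * b[term] for term in a.keys() & b.keys())
    norm_a = math.sqrt(sum(v * v for v in a.values()))
    norm_b = math.sqrt(sum(v * v for v in b.values()))
    return dot / (norm_a * norm_b) if norm_a and norm_b else 0.0

def retrieve(query: str, knowledge_base: list[str], k: int = 2) -> list[str]:
    """Return the k documents most relevant to the query."""
    query_vec = vectorize(query)
    ranked = sorted(knowledge_base,
                    key=lambda doc: cosine(query_vec, vectorize(doc)),
                    reverse=True)
    return ranked[:k]

# Hypothetical knowledge base of policy documents
docs = [
    "The NDAA authorizes defense spending levels each fiscal year.",
    "Export controls restrict sales of advanced semiconductors abroad.",
    "The farm bill sets agricultural subsidy programs.",
]
top = retrieve("defense spending authorization", docs, k=1)
# The retrieved sources are then prepended to the user's query as context
# for the language model, instead of uploading every document.
```

Feeding only the matched sources to the model, rather than the whole document set, is what lets a team work around upload limits.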

Team NavalX
The NavalX team was learning how the U.S. Navy could expand its capabilities in Intelligence, Surveillance, and Reconnaissance (ISR) operations on general maritime traffic.

Like all teams, they used ChatGPT to summarize and extract from long documents, organize their interview notes, and define technical terms associated with their project. In this video, note their use of prompting to guide ChatGPT to format their notes.

See the video of how Team NavalX used tailored prompts for formatting interview notes here:

If you can’t see the video click here

They also asked ChatGPT to role-play a critic of their argument and solution so that they could find the weaknesses. They also began uploading many interviews at once and asked Claude to find themes or ideas in common that they might have missed on their own.

Here’s how the NavalX team used Perplexity for research.

If you can’t see the video click here
Like other teams, the NavalX team discovered you can customize ChatGPT by telling it how you want it to act.

If you can’t see the video click here

Another surprising insight from the team is that you can use ChatGPT to tell you how to write better prompts for itself.

If you can’t see the video click here
In summary, Team NavalX used Claude to translate texts from Mandarin, and found that ChatGPT was best for writing tasks, Perplexity best for research tasks, Claude best for reading tasks, and NotebookLM best for summarization.

Lessons Learned

  • Integrating AI into this class took a dedicated instructor with a mission to create a new way to teach using AI tools
  • The result was that AI vastly enhanced and accelerated the learning of all teams
    • It acted as a helpful collaborator
    • Fusing AI with stakeholder interviews was especially powerful
  • At the start of the class students were familiar with a few of these AI tools
    • By the end of the class they were fluent in many more of them
    • Most teams invented creative use cases
  • All Stanford classes we now teach – Hacking for Defense, Lean Launchpad, Entrepreneurship Inside Government – have AI integrated as part of the course
  • Next year’s AI tools will be substantially better

What Does Product Market Fit Sound Like? This.

I got a call from an ex-student asking me “how do you know when you found product market fit?”

There’s been lots of words written about it, but no actual recordings of the moment.

I remembered I had saved this 90-second, 26-year-old audio file because this is when I knew we had found it at Epiphany.

The speaker was the Chief Financial Officer of a company called Visio, subsequently acquired by Microsoft.

I played it for her and I think it provided some clarity.

It’s worth a listen.

If you can’t hear the audio click here

Lean LaunchPad @Stanford 2024 – 8 Teams In, 8 Companies Out

This post previously appeared in Poets and Quants.

We just finished the 14th annual Lean LaunchPad class at Stanford. The class had gotten so popular that in 2021 we started teaching it in both the winter and spring sessions.

During the quarter the eight teams spoke to 919 potential customers, beneficiaries and regulators. Most students spent 15-20 hours a week on the class, about double that of a normal class.

In the 14 years we’ve been teaching the class, something happened that had never happened before – all eight teams in this cohort decided to start a company.

This Class Launched a Revolution in Teaching Entrepreneurship
Several government-funded programs have adopted this class at scale. The first was in 2011, when we turned this syllabus into the curriculum for the National Science Foundation I-Corps. Errol Arkilic, the then head of commercialization at the National Science Foundation, adopted the class saying, “You’ve developed the scientific method for startups, using the Business Model Canvas as the laboratory notebook.”

Below are the Lessons Learned presentations from the spring 2024 Lean LaunchPad.

Team Neutrix – Making Existing Nuclear Reactors More Profitable By Upgrading Their Fuel

If you can’t see the Neutrix video, click here

If you can’t see the Neutrix Presentation, click here

I-Corps at the National Institutes of Health
In 2013 I partnered with UCSF and the National Institutes of Health to offer the Lean LaunchPad class for Life Science and Healthcare (therapeutics, diagnostics, devices and digital health). In 2014, in conjunction with the NIH, I took the UCSF curriculum and developed and launched the I-Corps @ NIH program.

Team Virgil – Capturing Memoirs of Loved Ones (and Using AI to Do It Profitably)

If you can’t see the Virgil video, click here

If you can’t see the Virgil Presentation, click here.

I-Corps at Scale
I-Corps is now offered in 100 universities and has trained over 9,500 scientists and engineers; 7,800 in 2,546 teams in I-Corps at NSF (National Science Foundation), 950 participants at I-Corps at NIH in 317 teams, and 580 participants at Energy I-Corps (at the DOE) in 188 teams.

Team Claim CoPilot – Overturning Denied Healthcare Claims

If you can’t see the Claim CoPilot Presentation, click here

If you can’t see the Claim CoPilot video of their demo click here

$4 billion in Venture Capital For I-Corps Teams
1,380 of the NSF I-Corps teams launched startups raising $3.166 billion. Over 300 I-Corps at NIH teams have collectively raised $634 million. Energy I-Corps teams raised $151 million in additional funding.

Team Emy.ai – Using Brainwaves to Biohack Moods

If you can’t see the Emy.ai video, click here

If you can’t see the Emy.ai Presentation, click here

Mission-Driven Entrepreneurship
In 2016, I co-created both the Hacking for Defense course with Pete Newell and Joe Felter as well as the Hacking for Diplomacy course with Jeremy Weinstein at Stanford. In 2022, Steve Weinstein created Hacking for Climate and Sustainability. This fall Jennifer Carolan will launch Hacking for Education at Stanford.

Team TeachAssist – Automating Student Assessments for Special Education Teachers

If you can’t see the TeachAssist video, click here

If you can’t see the TeachAssist Presentation, click here

Design of This Class
While the Lean LaunchPad students are experiencing what appears to them to be a fully hands-on, experiential class, it’s a carefully designed illusion. In fact, it’s highly structured. The syllabus has been designed so that we are offering continual implicit guidance, structure, and repetition. This is a critical distinction between our class and an open-ended experiential class.

Guidance, Direction and Structure
For example, students start the class with their own initial guidance – they believe they have an idea for a product or service (Lean LaunchPad/I-Corps) or have been given a clear real-world problem (Hacking for Defense). Coming into the class, students believe their goal is to validate their commercialization or deployment hypotheses. (The teaching team knows that over the course of the class, students will discover that most of their initial hypotheses are incorrect.)

Team Maurice.ai – A Home Robot for the GPT Era

If you can’t see the Maurice.ai video, click here

If you can’t see the Maurice.ai Presentation, click here

The Business Model Canvas
The business/mission model canvas offers students guidance, explicit direction, and structure. First, the canvas offers a complete, visual roadmap of all the hypotheses they will need to test over the entire class. Second, the canvas helps the students goal-seek by visualizing what an optimal endpoint would look like – finding product/market fit. Finally, the canvas provides students with a map of what they learn week-to-week through their customer discovery work.

I can’t overemphasize the important role of the canvas. Unlike an incubator or accelerator with no frame, the canvas acts as the connective tissue – the frame – that students can fall back on if they get lost or confused. It allows us to teach the theory of how to turn an idea, need, or problem into commercial practice, week by week a piece at a time.

Team Waifinder – Personalized Guidance For High School Students to Effectively Apply to College

If you can’t see the Waifinder video, click here

If you can’t see the Waifinder Presentation, click here

Lean LaunchPad Tools
The tools for customer discovery (videos, sample experiments, etc.) offer guidance and structure for students to work outside the classroom. The explicit goal of 10-15 customer interviews a week, along with the requirement of building a continual series of minimum viable products, provides metrics that track each team’s progress. The mandatory office hours with the instructors and support from mentors provide additional guidance and structure.

Team PocketDot – Gamified Braille Self-Learning Solution for Braille Learners

If you can’t see the PocketDot video, click here.

If you can’t see the PocketDot Presentation, click here

It Takes A Village
While I authored this blog post, this class is a team project. The secret sauce of the success of the Lean LaunchPad at Stanford is the extraordinary group of dedicated volunteers supporting our students in so many critical ways.

The teaching team consisted of myself and:

Our teaching assistants this year were Chapman Ellsworth, Francesca Bottazzini and Ehsan Ghasemi.

Mentors helped the teams understand if their solutions could be a commercially successful business. Thanks to Lofton Holder, Bobby Mukherjee, Steve Cousins, David Epstein, Kevin Ray, Rekha Pai, Rafi Holtzman and Kira Makagon. They were led by Todd Basche.

Summary
While the Lean LaunchPad/I-Corps curriculum was a revolutionary break with the past, it’s not the end. In the last decade innumerable variants have emerged. The class we teach at Stanford has continued to evolve. Better versions from others will appear. AI is already having a major impact on customer discovery and validation. And one day another revolutionary break will take us to the next level.

But today, we get to celebrate – 8 teams in – 8 companies out.

Founders Need to Be Ruthless When Chasing Deals

One of the most exciting things a startup CEO in a business-to-business market can hear from a potential customer is, “We’re excited. When can you come back and show us a prototype?”

This can be the beginning of a profitable customer relationship or a disappointing sinkhole of wasted time, money, resources, and a demoralized engineering team.

It all depends on one question every startup CEO needs to ask.


I was having coffee and pastries with Justin, an ex-student, listening to him complain about the time he wasted with a potential customer. He was building a complex robotic system for factories. “We spent weeks integrating the sample data they gave us to build a functional prototype, and then after our demo they just ghosted us. I still don’t know what happened!”

After listening to how he got into that predicament, I realized it sounded exactly like the mistake I had made selling enterprise software.

Enthusiasm Versus Validation
Finding product/market fit is the holy grail for startups. For me, it was a real rush when potential users in a large company loved our slideware and our minimum viable product (MVP). They were ecstatic about the time the product could save them and started pulling others into our demos. A few critical internal recommenders and technical evaluators gave our concept the thumbs up. Now we were in discussions with the potential buyers who had the corporate checkbook, and they were ready to have a “next step” conversation.

This buyer wanted us to transform our slideware and MVP into a demonstration of utility with their actual data. This was going to require our small, overcommitted engineering team to turn the MVP into a serviceable prototype.

When I heard a potential customer offer us their own internal customer data, I was already imagining popping Champagne corks once we showed them our prototype. (For context, our products sold for hundreds of thousands of dollars, and lifetime value to each customer was potentially measured in millions.) I rallied our engineering team to work for the next few months to get the demo of the prototype ready. As much as we could, we integrated the customer’s users and technical evaluators into our prototype development process. Then came the meeting with the potential customer. And it went great. The users were in the room, the buyer asked lots of questions, everyone made some suggestions, and then we all went home. And the follow-up from the potential customer? Crickets…

Even our user advocates stopped responding to emails.

What did I do wrong?
In my unbridled and very naive enthusiasm for impressing a potential customer, I made a rookie mistake – I never asked the user champion or the potential buyer what the steps were for turning the demo into a purchase order. I had made a ton of assumptions – all of them wrong. And most importantly, I wasted the most precious things a startup has – engineering resources, time, and money.

In hindsight I had no idea whether my potential customer was asking other companies to demo their product. I had no idea whether the buyer had a budget or even purchase authority. If they did, I had no idea of their timeline for a decision. I had no idea who were the other decision-makers in the company to integrate, deploy and scale the product. I didn’t even know what the success criteria for getting an order looked like. I didn’t check for warning signs of a deal that would go nowhere: whether the person requesting the demo was in a business unit or a tech evaluation/innovation group, whether they’d pay for a functional prototype they could use, etc.  And for good measure, I never even considered asking the potential customer to pay for the demo and/or my costs.

(My only excuse was that this was my first foray into enterprise sales.)

Be Ruthless about the Opportunity Costs of Chasing Deals
After that demoralizing experience I realized that every low-probability demo got us further from success rather than closer. While a big company could afford to chase lots of deals, I had just a small set of engineering resources. I became ruthless about the opportunity costs of chasing deals whose outcome I couldn’t predict.

So we built rigor into our sales process.

We built a sales road map of finding first product/market fit with the users and recommenders. However, we realized that there was a second product/market fit with the organization(s) that controlled the budget and the path to deployment and scale.

For this second group of gatekeepers we came up with a cheap hack to validate that a demo wasn’t just a tire-kicking exercise on their part. First, we asked them basic questions about the process: the success criteria, the decision timeline, did a budget exist, who had the purchase authority, what were the roles and approval processes of other organizations (IT, Compliance and Security, etc.) and what was the expected rate of scaling the product across their enterprise. (All the rookie questions I should have asked the first time around.)

That was just the starting point to decide if we wanted to invest our resources. We followed up our questions by sending them a fully cancelable purchase order. We listed all the features we had demoed that had gotten the users excited and threw in the features the technical evaluators had suggested. And we listed our price. In big letters the purchase order said, “FULLY CANCELABLE.” And then we sent it to the head of the group that asked us for the prototype.

As you can imagine most of the time the response was – WTF?

Figure Out Who’s A Serious Prospect
That’s when the real learning started. It was more than OK with me if they said they weren’t ready to sign. Or they told me there were other groups who needed to be involved. I was now learning things I never would have if I had just showed up with a prototype. By asking the customer to sign a fully cancelable purchase order, we excluded the “least likely to close” prospects: those who weren’t ready to make a purchase decision, or those who already had a vendor selected but needed to go through “demo theater” to make the selection seem fair. But most importantly, it started a conversation with serious prospects that informed us about the entire end-to-end approval process to get an order – who were the additional people who needed to say yes across the corporation – and what were their decision processes.

Our conversions of demos into orders went through the roof.

Finally, I was learning some of the basics of complex sales.

Justin stared at his uneaten pastry for a while and then looked up at me and said, smiling, “I never knew you could do that. That’s given me a few ideas about what we could do.” And just like that he was gone.

Lessons Learned

  • In complex sales there are multiple product/market fits – Users, Buyers, etc. — each with different criteria
  • Don’t invest time and resources in building on-demand prototypes if you don’t know the path to a purchase order
  • Use polite forcing functions, e.g. cancelable purchase orders, to discover who else needs to say “yes”

Apple Vision Pro – Tech in the Search of a Market

A version of this article previously appeared in Fortune.


If you haven’t been paying attention, Apple has started shipping the Apple Vision Pro, its take on a headset that combines Virtual Reality (VR) and Augmented Reality (AR). The product is an amazing technical tour de force.

But the product/market fit of this first iteration is a swing and a miss.


I’ve watched other world class consumer product companies make the same mistakes:

  1. Come up with amazing hardware that creates entirely new capabilities
  2. Forecast demand based on volumes of their previous consumer products
  3. Confuse consumers by defining a new category without a frame of reference
  4. Discover the hardware doesn’t match their existing consumer customer base needs
  5. Work hard (read spend a lot of money) on trying to “push” sales to their existing customers
  6. Revenue is woefully short of forecast. Marketing and capital expenses (new factory, high R&D expense) were predicated on consumer-scale sales. The new product is burning a ton of cash
  7. Ignore/not understand adjacent niche markets that would have “pulled” the product out of their hands, if they had developed niche-specific demos and outreach
  8. Eventually pivot to the niche markets that are excited about the product
  9. The niche markets make great beachhead markets, but are too small to match the inflated forecasts and the built-in burn rates of consumer scale sales
  10. Either…
    • After multiple market pivots and changes in leadership, abandon the product
    • Pivot and persevere

Déjà vu All Over Again
I lived the equivalent of this when Kodak (remember them?) launched a product in 1990 called PhotoCD. Kodak wanted consumers to view their film photos on their televisions via a home CDROM drive. You dropped off your film at a film processor and, instead of just getting physical prints of your pictures, they would scan the film and burn the images onto a Compact Disc you took home.

I got a preview of PhotoCD when I was the head of marketing at SuperMac, a supplier of hardware and software for graphics professionals. The moment I saw the product I knew every one of my professional graphics customers (ad agencies, freelancers, photo studios, etc.) would want to use it. In fact, they would have paid a premium for it. I was floored when Kodak told me they were launching PhotoCD as a consumer product.

The problem was that in 1990 consumers did not have CDROM drives to display the pictures. At the time even most personal computers lacked them. But every graphics professional did own a CDROM drive, and most didn’t own a high-resolution film scanner; PhotoCD would have been perfect for them, and they were the perfect launch customer. To this day I remember being lectured by a senior Kodak executive: “Steve, you don’t get it, we’re experts at selling to consumers. We’ll sell them the CDROM drives as well.” (The Kodak CDROM drives were the size of professional audio equipment and, depending on the model, cost $600-$1,000 in today’s dollars.)

(And when consumer CDROM drives became available, they couldn’t play the PhotoCD discs, which were encoded in a proprietary Kodak standard to lock you into Kodak’s drives!) The result was that PhotoCD failed miserably as a consumer product. Subsequent pivots to professional graphics users (a segment another part of Kodak knew well) came too late, as low-cost scanners and non-proprietary standards (JPEG) prevailed.

So what’s the lesson for Apple?

  1. Apple is trying to push Vision Pro into their existing consumer customers
  2. All the demos and existing applications are oriented to their consumer customers
  3. Apple did not create demos showing how the Vision Pro could be used in new markets where users would jump at buying one. For example:
    1. There is proof of demand (here, here and here) in an adjacent mass market: helping millions of homeowners repair things around the home
    2. There is proof of demand in industrial applications outside the consumer space (here.) Every company with complex machinery has been experimenting with AR for years. Imagine car repair with a Vision Pro AR tutorial. Or jet engine maintenance. Or the entire gamut of complex machinery

All of these would have been great Vision Pro demos for training and repair. It’s hard to understand why Apple ignored these easy wins.

Getting it Right
Apple’s record of entering new markets by creating new product categories – iPods, iPads, iPhones – is unprecedented in the history of the modern corporation: $300 billion (75% of their revenue) comes from non-computer hardware. In addition, they’ve created an entirely new $85+ billion subscription business model: the App Store, iTunes, Apple Care, Apple Pay, Apple Cash, Apple Arcade, Apple Music, Apple TV.

It’s hard to remember, but the first versions of these products launched with serious limitations that follow-on versions remedied. The first version of the iPhone ran only Apple software: it was a closed system without an app store, had no copy and paste, couldn’t record video, etc. The original Apple Watch was positioned as a fashion accessory. It wasn’t until later that Apple realized the killer apps for the Watch were fitness and health. Fixing the technical flaws while finding the right markets for all these products took time and commitment.

The same will likely be true for the Vision Pro. Apple marketers will realize that adjacent spaces they are less familiar with will provide the first “got to have it” beachhead markets. Follow-on versions will ride the technology wave, becoming lighter and cheaper.

Apple’s CEO Tim Cook has made a personal bet on the Vision Pro. More than any other company, Apple has sufficient resources (cash on hand and engineering talent) to pivot its way to product/market fit in the real markets that need it.

Here’s hoping they find it.

Profound Beliefs

This post previously appeared in EIX.

In the early stages of a startup your hypotheses about all the parts of your business model are your profound beliefs. Think of profound beliefs as “strong opinions loosely held.”

You can’t be an effective founder or in the C-suite of a startup if you don’t hold any.

Here’s how I learned why they were critical to successful customer development.


I was an aggressive, young, and very tactical VP of Marketing at Ardent, a supercomputer company, who really didn’t have a clue about the relationship between profound beliefs, customer discovery and strategy.

One day the CEO called me into his office and asked, “Steve I’ve been thinking about this as our strategy going forward. What do you think?” And he proceeded to lay out a fairly complex and innovative sales and marketing strategy for our next 18 months.  “Yeah, that sounds great,” I said. He nodded and then offered up, “Well what do you think of this other strategy?” I listened intently as he spun an equally complex alternative strategy. “Can you pull both of these off?” he asked looking right at me.  By the angelic look on his face I should have known that I was being set up. I replied naively, “Sure, I’ll get right on it.”

Ambushed
Decades later I still remember what happened next. All of a sudden the air temperature in the room dropped by about 40 degrees. Out of nowhere the CEO started screaming at me, “You stupid x?!x. These strategies are mutually exclusive. Executing both of them would put us out of business. You don’t have a clue about what the purpose of marketing is, because all you are doing is giving engineering a list of feature requests and executing a series of tasks like they’re a big To Do list. Without understanding why you’re doing them, you’re dangerous as the VP of Marketing; in fact, you’re just a glorified head of marketing communications. You have no profound beliefs.”

I left in a daze, angry and confused. There was no doubt my boss was a jerk, but I didn’t understand the point. I was a great marketer. I was getting feedback from customers, and I’d pass on every list of what customers wanted to engineering and tell them that’s the features our customers needed. I could implement any marketing plan handed to me regardless of how complex. In fact I was implementing three different ones. Oh…hmm… perhaps I was missing something.

I was executing a lot of marketing “things” but why was I doing them? The CEO was right. I had approached my activities as simply a task-list to get through. With my tail between my legs I was left to ponder: What was the function of marketing in a startup? And more importantly, what was a profound belief and why was it important?

Hypotheses about Your Business Model = Your Profound Beliefs Loosely Held
Your hypotheses about all the parts of your business model are your profound beliefs. Think of them as strong opinions loosely held. You can’t be an effective founder or in the C-suite if you don’t have any.

The whole role of customer discovery and validation outside your building is to inform your profound beliefs. By inform I mean use the evidence you gather outside the building to validate your beliefs/hypotheses, invalidate them, or modify them. Specifically, which beliefs and hypotheses? Start with those around product/market fit: who are your customers and what features do they want? Who are the payers? Then march through the rest of the business model. What price will they pay? What role do regulators play? Etc. The best validation you can get is an order. (BTW, if you’re creating a new market, it’s even OK to ignore customer feedback, but you have to be able to articulate why.)

The reality of a startup is that on day one most of your beliefs/hypotheses are likely wrong. However, you will be informed by those experiments outside the building, and data from potential customers, partners, regulators, et al. will modify your vision over time.

It’s helpful to diagram the consequences of combining (or lacking) hypotheses/beliefs and customer discovery. (See the diagram.)

If you have no beliefs and haven’t gotten out of the building to gather evidence, then your role inside a new venture is neutral. You act as a tactical implementer and add no insight or value to product development.

If you’ve gotten out of the building to gather evidence but have no profound beliefs to guide your inquiries, then your role inside a new venture is negative. You’ll collect a laundry-list of customer feature requests and deliver them to product development, without any insight. This is essentially a denial of service attack on engineering’s time. (I was mostly operating in this box when I got chewed out by our CEO.)

The biggest drag on a startup comes from those who have strong beliefs but haven’t gotten out of the building to gather evidence. Meetings become opinion contests, and those with the loudest voices (or worse, “I’m the CEO and my opinion matters more than your facts”) dominate planning and strategy. (They may be right, but Twitter/X is an example where Elon is in the box on the bottom right of the diagram.)

The winning combination is strong beliefs that are validated or modified by evidence gathered outside the building. These are “strong opinions loosely held.”
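For readers who like their frameworks explicit, the four boxes of the 2x2 diagram can be sketched as a tiny lookup. This is my own paraphrase of the post’s framework, not anything from the original; the function name and labels are invented for illustration:

```python
# A sketch of the 2x2 the diagram describes: a founder's effect on a venture
# as a function of (holds profound beliefs?, gathered evidence outside the building?).

def founder_role(has_beliefs: bool, has_evidence: bool) -> str:
    """Map the two axes of the diagram to the outcome described in the text."""
    if not has_beliefs and not has_evidence:
        return "neutral: tactical implementer, no insight added"
    if not has_beliefs:  # evidence but no beliefs to guide inquiry
        return "negative: feature-list courier, a denial-of-service on engineering"
    if not has_evidence:  # beliefs but no evidence
        return "drag: opinion contests, loudest voice wins"
    return "winning: strong opinions loosely held"
```

The point of writing it this way is that only one of the four combinations produces a positive outcome; the other three are failure modes of different severity.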

Strategy is Not a To Do List, It Drives a To Do List
It took me a while, but I began to realize that the strategic part of my job was to recognize that (in today’s jargon) we were still searching for a scalable and repeatable business model. Therefore my job was to:

  • Articulate the founding team’s strong beliefs and hypotheses about our business model
  • Do an internal check-in to see if a) the founders were aligned and b) if I agreed with them
  • Get out of the building and test our strong beliefs and hypotheses about who were potential customers, what problems they had and what their needs were
  • Test product development’s/engineering’s beliefs about customer needs with customer feedback
  • When we found product/market fit, put together a strategy/plan for marketing and sales. That should be easy: if we had done enough discovery, customers would have told us what features were important to them, how we compared to competitors, how we should set prices, and how best to sell to them

Once I understood the strategy, the tactical marketing To Do list (website, branding, PR, tradeshows, white papers, data sheets) became clear. It allowed me to prioritize what I did and when, and to instantly understand what would be mutually exclusive.

Lessons Learned

  • Profound beliefs are your hypotheses about all the parts of your business model
    • No profound beliefs but lots of customer discovery ends up as a feature list collection which is detrimental to product development
    • Profound beliefs but no customer discovery ends up as opinion contests and those with the loudest voices dominate
  • The winning combination is strong beliefs that are validated or modified by evidence gathered outside the building. These are “strong opinions loosely held.”

Lean Meets Wicked Problems

This post previously appeared in Poets & Quants.

I just spent a month and a half at Imperial College London co-teaching a “Wicked” Entrepreneurship class. In this case Wicked doesn’t mean morally evil; it refers to really complex problems, ones with multiple moving parts, where the solution isn’t obvious. (Understanding and solving homelessness, disinformation, climate change mitigation or an insurgency are examples of Wicked problems. Companies also face Wicked problems. In contrast, designing AI-driven enterprise software or building dating apps are comparatively simple problems.)


I’ve known Professor Cristobal Garcia since 2010, when he hosted my first visit to Catholic University in Santiago, Chile, and to southern Patagonia. Cristobal, now at Imperial College Business School and Co-Founder of the Wicked Acceleration Labs, and I wondered if we could combine the tenets of Lean (get out of the building, build MVPs, run experiments, move with speed and urgency) with the expanded toolset developed by researchers who work on Wicked problems and Systems Thinking.

Our goal was to see if we could get students to stop admiring problems and work rapidly on solving them. As Wicked and Lean seem to be mutually exclusive, this was a pretty audacious undertaking.

This five-week class was going to be our MVP.

Here’s what happened.

Finding The Problems
Professor Garcia scoured the world to find eight Wicked/complex problems for students to work on. He presented to organizations in the Netherlands, Chile, Spain, the UK (Ministry of Defense and the BBC), and aerospace companies. The end result was a truly ambitious, unique, and international set of curated Wicked problems.

  • Increasing security and prosperity amid the Mapuche conflict in the Araucania region of Chile
  • Enabling and accelerating a Green Hydrogen economy
  • Turning the Basque Country in Spain into an AI hub
  • Solving Disinformation/Information Pollution for the BBC
  • Creating Blue Carbon projects for the UK Ministry of Defense
  • Improving patient outcomes for Ukrainian battlefield injuries
  • Imagining the future of a low-earth-orbit space economy
  • Creating a modular architecture for future UK defense ships

Recruiting the Students
With the problems in hand, we set about recruiting students from both Imperial College’s business school and the Royal College of Art’s design and engineering programs.

We held an info session explaining the problems and the unique parts of the class. We were going to share with them a “Swiss Army Knife” of traditional tools to understand Wicked/Complex problems, but they were not going to research these problems in the library. Instead, using the elements of Lean methodology, they were going to get out of the building and observe the problems first-hand. And instead of passively observing them, they were going to build and test MVPs.  All in six weeks.

Fifty students signed up to work on the eight problems, with different degrees of “wickedness”.

Imperial Wicked Problems and Systems Thinking – 2023 Class

The Class
The pedagogy of the class (our teaching methods and the learning activities) was similar to that of all the Lean/I-Corps and Hacking for Defense classes we’ve previously taught. This meant the class was team-based, Lean-driven (hypothesis testing/business model/customer development/agile engineering) and experiential: rather than being presented with all of the essential information, the students had to discover that information rapidly for themselves.

The teams were going to get out of the building and talk to 10 stakeholders a week. Then each week every team would present: 1) here’s what we thought, 2) here’s what we did, 3) here’s what we learned, 4) here’s what we’re going to do this week.

More Tools
The key difference between this class and previous Lean/I-Corps and Hacking for Defense classes was that Wicked problems required more than just a business model or mission model to grasp the problem and map the solution. Here, to get a handle on the complexity of their problem, the students needed a suite of tools: Stakeholder Maps, Systems Maps, Assumptions Mapping, Experimentation Menus, Unintended Consequences Maps, and finally Dr. Garcia’s derivative of Alexander Osterwalder’s Business Model Canvas, the Wicked Canvas, which added unintended consequences and the “sub-problems” seen from different stakeholders’ perspectives to the traditional canvas.

During the class the teaching team offered explanations of each tool, but the teams got a firmer grasp on the Wicked tools from a guest lecture by Professor Terry Irwin, Director of the Transition Design Institute at Carnegie Mellon (see her presentation here.) Throughout the class, teams had the flexibility to select the tools they felt appropriate to rapidly gain a holistic understanding while developing a minimum viable product to address and experiment with each of the Wicked problems.

Class Flow
Week 1 

  • What is a simple idea? What are big ideas and Impact Hypotheses? 
    • Characteristics of each. Rewards, CEO, team, complexity, end point, etc. 
  • What is unique about Wicked Problems?
    • Beyond TAM and SAM (“back of the napkin”) for Wicked Problems
  • You need Big Ideas to tackle Wicked Problems: but who does it?
    •  Startups vs. Large Companies vs. Governments
    • Innovation at Speed for Horizon 1, 2 and 3 (Managing the Portfolio across Horizons)
  • What is Systems Thinking?
  • How to map stakeholders and systems’ dynamics?
  • Customer & Stakeholder Discovery: getting outside the building, city and country: why and how? 

Mapping the Problem(s), Stakeholders and Systems –  Wicked Tools

Week 2

  • Teams present for 6 min and receive 4 mins feedback
  • The Wicked Swiss Army Knife for the week: Mapping Assumptions Matrix, unintended consequences and how to run and design experiments
  • Prof Erkko Autio (ICBS and Wicked Labs) on AI Ecosystems and Prof Peter Palensky (TU Delft) on Smart Grids, Decarbonization and Green Hydrogen
  • Lecture on Minimum Viable Products (MVPs) and Experiments
  • Homework: getting outside the building & the country to run experiments

Assumption Mapping and Experimentation Type –  Wicked Tools

Week 3

  • Teams present in 6 min and receive 4 mins feedback
  • The Wicked Swiss Army Knife for the week: from problem to solution via “How Might We…” Builder and further initial solution experimentation
  • On Canvases: What, Why and How 
  • The Wicked Canvas 
  • Next Steps and Homework: continue running experiments with MVPs and start validating your business/mission/wicked canvas

The Wicked Canvas –  Wicked Tools

Experimentation Design and How Might We… –  Wicked Tools

Week 4

  • Teams present in 6 min and receive 5 mins feedback
  • Wicked Business Models – validating all building blocks
  • The Geography of Innovation – the milieu, creative cities & prosperous regions 
  • How World War II and the UK Started Silicon Valley
  • The Wicked Swiss Tool: maps for acupuncture in the territory
  • Storytelling & Pitching 
  • Homework: Validated MVP & Lessons learned

Acupuncture Map for Regional System Intervention  – Wicked Tools


Week 5

  • Teams presented their Final Lessons Learned journey – Validated MVP, Insights & Hindsight (see the presentations at the end of the post.)
    • What did we understand about the problem on day 1?
    • What do we now understand?
    • How did we get here?
    • What solutions would we propose now?
    • What did we learn?
    • Reflections on the Wicked Tools

Results
To be honest, I wasn’t sure what to expect. We pushed the students way past what they had done in other classes. In spite of what we said in the info session and syllabus, many students were in shock when they realized they couldn’t take the class by just showing up, and heard in no uncertain terms that doing no stakeholder/customer interviews in week 1 was unacceptable.

Yet, everyone got the message pretty quickly. The team working on the Mapuche conflict in the Araucania region of Chile flew to Chile from London, interviewed multiple stakeholders, and was back in time for the next week’s class. The team working to turn the Basque Country in Spain into an AI hub did the same: they flew to Bilbao and interviewed several stakeholders. The team working on Green Hydrogen got connected to the Rotterdam ecosystem and key stakeholders in the Port, energy incumbents, VCs and tech universities. The team working on Ukraine did not fly there for obvious reasons. The rest of the teams spread out across the UK, all of them furiously mapping stakeholders, assumptions, systems, etc., while proposing minimum viable solutions. By the end of the class it was a whirlwind of activity as students not only presented their progress but saw that of their peers. No one wanted to be left behind. They all moved with speed and alacrity.

Lessons Learned

  • Our conclusion? While this class is not a substitute for a years-long deep analysis of Wicked/complex problems, it gave students:
    • a practical hands-on introduction to tools to map, sense, understand and potentially solve Wicked Problems
    • the confidence and tools to stop admiring problems and work on solving them
  • I think we’ll teach it again.

Team final presentations

The teams’ final lessons learned presentations were pretty extraordinary, matched only by their post-class comments. Take a look below.

Team Wicked Araucania

Click here if you can’t see the Araucania presentation.

Team Accelerate Basque

Click here if you can’t see the Accelerate Basque presentation.

Team Green Hydrogen

Click here if you can’t see the Green Hydrogen presentation.

Team Into The Blue

Click here if you can’t see the Team Blue presentation.

Team Information Pollution

Click here if you can’t see the Team Information Pollution presentation.

Team Ukraine

Click here if you can’t see the Team Ukraine presentation.

Team Wicked Space

Click here if you can’t see the Team Wicked Space presentation.

Team Future Proof the Navy

Click here if you can’t see the Future Proof the Navy presentation.