Lean Launchpad at Stanford – 2025

The PowerPoints embedded in this post are best viewed on steveblank.com

We just finished the 15th annual Lean LaunchPad class at Stanford. The class had gotten so popular that in 2021 we started teaching it in both the winter and spring sessions.

During the 2025 spring quarter the eight teams spoke to 935 potential customers, beneficiaries and regulators. Most students spent 15-20 hours a week on the class, about double that of a normal class.

This Class Launched a Revolution in Teaching Entrepreneurship
This class was designed to break out of the “how to write a business plan” capstone of entrepreneurial education. A business plan assumed that all startups needed to do was write a plan, raise money, and then execute the plan. We overturned that orthodoxy when we pointed out that while existing organizations execute business models, startups are searching for them. And that a startup is a temporary organization designed to search for a repeatable and scalable business model. This class was designed to teach startups how to search for a business model.
Several government-funded programs have adopted this class at scale. The first was in 2011 when we turned this syllabus into the curriculum for the National Science Foundation I-Corps. Errol Arkilic, the then head of commercialization at the National Science Foundation, adopted the class saying, “You’ve developed the scientific method for startups, using the Business Model Canvas as the laboratory notebook.”

Below are the Lessons Learned presentations from the spring 2025 Lean LaunchPad.

Team Cowmeter – early detection of cow infections through biological monitoring of milk.

If you can’t see the Team Cowmeter presentation click here

I-Corps at the National Institutes of Health
In 2013 I partnered with UCSF and the National Institutes of Health to offer the Lean LaunchPad class for Life Science and Healthcare (therapeutics, diagnostics, devices and digital health). In 2014, in conjunction with the National Institutes of Health, I took the UCSF curriculum and developed and launched the I-Corps @ NIH program.

Team NowPilot – AI copilot for enhancing focus and executive function.

If you can’t see the Team NowPilot presentation click here

I-Corps at Scale
I-Corps is now offered in 100 universities and has trained over 9,500 scientists and engineers; 7,800 participants in 2,546 teams at I-Corps at NSF (National Science Foundation), 950 participants in 317 teams at I-Corps at NIH, and 580 participants in 188 teams at Energy I-Corps (at the DOE).  15 universities in Japan now teach the class.

Team Godela – AI physics engine – with a first disruptive market in packaging.

If you can’t see the Team Godela presentation click here

$4 billion in Venture Capital For I-Corps Teams
1,380 of the NSF I-Corps teams launched startups raising $3.166 billion. Over 300 I-Corps at NIH teams have collectively raised $634 million. Energy I-Corps teams raised $151 million in additional funding.

Team ProspectAI – An AI sales development agent for lean sales teams.

If you can’t see the Team ProspectAI presentation click here

Mission Driven Entrepreneurship
In 2016, I co-created both the Hacking for Defense course with Pete Newell and Joe Felter and the Hacking for Diplomacy course with Jeremy Weinstein at Stanford. In 2022, Steve Weinstein created Hacking for Climate and Sustainability. In 2024, Jennifer Carolan launched Hacking for Education at Stanford.

Team VLAB – accelerating clinical trials with AI orchestration of data.

If you can’t see the team VLAB presentation click here

Design of This Class
While the Lean LaunchPad students are experiencing what appears to them to be a fully hands-on, experiential class, it’s a carefully designed illusion. In fact, it’s highly structured. The syllabus has been designed so that we are offering continual implicit guidance, structure, and repetition. This is a critical distinction between our class and an open-ended experiential class.

Guidance, Direction and Structure
For example, students start the class with their own initial guidance – they believe they have an idea for a product or service (Lean LaunchPad/I-Corps) or have been given a clear real-world problem (Hacking for Defense). Coming into the class, students believe their goal is to validate their commercialization or deployment hypotheses. (The teaching team knows that over the course of the class, students will discover that most of their initial hypotheses are incorrect.)

Team Blix – IRB clinical trial compliance / A control layer for AI governance for financial services.

If you can’t see the team Blix presentation click here

The Business Model Canvas
The business model / mission model canvas offers students guidance, explicit direction, and structure. First, the canvas offers a complete, visual roadmap of all the hypotheses they will need to test over the entire class. Second, the canvas helps the students goal-seek by visualizing what an optimal endpoint would look like – finding product/market fit. Finally, the canvas provides students with a map of what they learn week-to-week through their customer discovery work. I can’t overemphasize the important role of the canvas. Unlike an incubator or accelerator with no frame, the canvas acts as the connective tissue – the frame – that students can fall back on if they get lost or confused. It allows us to teach the theory of how to turn an idea, need, or problem into commercial practice, week by week, a piece at a time.

Team Plotline – A smart marketing calendar for an author’s book launch.

If you can’t see the team Plotline presentation click here

Lean LaunchPad Tools
The tools for customer discovery (videos, sample experiments, etc.) offer guidance and structure for students to work outside the classroom. The explicit goal of 10-15 customer interviews a week along with the requirement for building a continual series of minimal viable products provides metrics that track the team’s progress. The mandatory office hours with the instructors and support from mentors provide additional guidance and structure.

Team Eluna/Driftnet  – Data Center data aggregation and energy optimization software.

If you can’t see the team Eluna/Driftnet presentation click here

AI Embedded in the Class
This was the first year in which all teams used AI to help create their business model canvas, build working MVPs in hours, generate customer questions, and analyze and summarize interviews.

It Takes A Village
While I authored this blog post, this class is a team project. The secret sauce of the success of the Lean LaunchPad at Stanford is the extraordinary group of dedicated volunteers supporting our students in so many critical ways.

The teaching team consisted of myself and:

  • Steve Weinstein, partner at America’s Frontier Fund, 30-year veteran of Silicon Valley technology companies and Hollywood media companies. Steve was CEO of MovieLabs, the joint R&D lab of all the major motion picture studios.
  • Lee Redden – CTO and co-founder of Blue River Technology (acquired by John Deere) who was a student in the first Lean LaunchPad class 14 years ago!
  • Jennifer Carolan, Co-Founder and Partner at Reach Capital, the leading education VC, and creator of the Hacking for Education class.

Our teaching assistants this year were Arthur C. Campello, Anil Yildiz, Abu B. Rogers and Tireni Ajilore.

Mentors helped the teams understand if their solutions could be a commercially successful business. Thanks to Jillian Manus, Dave Epstein, Robert Feldman, Bobby Mukherjee, Kevin Ray, Deirdre Clute, Robert Locke, Doug Biehn, and John Danner. Martin Saywell from the Distinguished Careers Institute joined the Blix team. The mentor team was led by Todd Basche.

Summary
While the Lean LaunchPad/I-Corps curriculum was a revolutionary break with the past, it’s not the end. In the last decade innumerable variants have emerged. The class we teach at Stanford has continued to evolve. Better versions from others will appear. AI is already having a major impact on customer discovery and validation, and we had each team list the AI tools they used. And one day another revolutionary break will take us to the next level.

Hacking for Defense @ Stanford 2025 – Lessons Learned Presentations

The videos and PowerPoints embedded in this post are best viewed on steveblank.com

We just finished our 10th annual Hacking for Defense class at Stanford.

What a year.

Hacking for Defense, now in 70 universities, has teams of students working to understand and help solve national security problems. At Stanford this quarter the 8 teams of 41 students collectively interviewed 1,106 beneficiaries, stakeholders, requirements writers, program managers, industry partners, etc. – while simultaneously building a series of minimal viable products and developing a path to deployment.

This year’s problems came from the U.S. Army, U.S. Navy, CENTCOM, Space Force/Defense Innovation Unit, the FBI, IQT, and the National Geospatial-Intelligence Agency.

We opened this year’s final presentations session with inspiring remarks by Joe Lonsdale on the state of defense technology innovation and a call to action for our students. During the quarter, guest speakers in the class included former National Security Advisor H.R. McMaster, former Secretary of Defense Jim Mattis, Deputy Commander of the 18th Airborne Corps John Cogbill, former Assistant Secretary of Defense for Cyber Policy Michael Sulmeyer, and John Gallagher, Managing Director of Cerberus Capital.

“Lessons Learned” Presentations
At the end of the quarter, each of the eight teams gave a final “Lessons Learned” presentation along with a 2-minute video to provide context about their problem. Unlike traditional demo days or Shark Tanks, which are “Here’s how smart I am, and isn’t this a great product, please give me money,” the Lessons Learned presentations tell the story of each team’s 10-week journey and hard-won learning and discovery. For all of them it’s a roller coaster narrative describing what happens when you discover that everything you thought you knew on day one was wrong, and how they eventually got it right.
While all the teams used the Mission Model Canvas, Customer Development and Agile Engineering to build Minimal Viable Products, each of their journeys was unique.

This year we had the teams add two new slides at the end of their presentation: 1) tell us which AI tools they used, and 2) their estimate of progress on the Technology Readiness Level and Investment Readiness Level.

Here’s how they did it and what they delivered.

Team Omnyra – improving visibility into AI-generated bioengineering threats.

If you can’t see the team Omnyra summary video click here

If you can’t see the Omnyra presentation click here

These are “Wicked” Problems
Wicked problems are really complex problems – ones with multiple moving parts, where the solution isn’t obvious and there is no definitive formula. The types of problems our Hacking For Defense students work on fall into this category. They are often ambiguous. They start with a problem from a sponsor, and not only is the solution unclear but figuring out how to acquire and deploy it is also complex. Most often students find that in hindsight the problem was a symptom of a more interesting and complex problem – and that acquisition of solutions in the Department of Defense is unlike anything in the commercial world. And the stakeholders and institutions often have different relationships with each other – some are collaborative, some have pieces of the problem or solution, and others might have conflicting values and interests.
The figure shows the types of problems Hacking for Defense students encounter, with the most common ones shaded.

Team HydraStrike – bringing swarm technology to the maritime domain.

If you can’t see the HydraStrike summary video click here.


If you can’t see the HydraStrike presentation click here

Mission-Driven Entrepreneurship
This class is part of a bigger idea – Mission-Driven Entrepreneurship. Instead of students or faculty coming in with their own ideas, we ask them to work on societal problems, whether they’re problems for the State Department or the Department of Defense or non-profits/NGOs or the Oceans and Climate or for anything the students are passionate about. The trick is we use the same Lean LaunchPad / I-Corps curriculum — and the same class structure – experiential, hands-on – driven this time by a mission model, not a business model. (The National Science Foundation and the Common Mission Project have helped promote the expansion of the methodology worldwide.)
Mission-driven entrepreneurship is the answer to students who say, “I want to give back. I want to make my community, country or world a better place, while being challenged to solve some of the toughest problems.”

Team HyperWatch – tracking hypersonic threats.

If you can’t see the HyperWatch video click here

If you can’t see the HyperWatch presentation click here

It Started With An Idea
Hacking for Defense has its origins in the Lean LaunchPad class I first taught at Stanford in 2011. I observed that teaching case studies and/or how to write a business plan as a capstone entrepreneurship class didn’t match the hands-on chaos of a startup. Furthermore, there was no entrepreneurship class that combined experiential learning with the Lean methodology. Our goal was to teach both theory and practice. The same year we started the class, it was adopted by the National Science Foundation to train Principal Investigators who wanted to get a federal grant for commercializing their science (an SBIR grant). The NSF observed, “The class is the scientific method for entrepreneurship. Scientists understand hypothesis testing” and relabeled the class as the NSF I-Corps (Innovation Corps). I-Corps became the standard for science commercialization for the National Science Foundation, National Institutes of Health and the Department of Energy, to date training 3,051 teams and launching 1,300+ startups.

Team ChipForce – Securing U.S. dominance in critical minerals.

If you can’t see the ChipForce video click here

If you can’t see the ChipForce presentation click here
Note: After briefing the Department of Commerce, the ChipForce team members were offered jobs with the department.

Origins Of Hacking For Defense
In 2016, brainstorming with Pete Newell of BMNT and Joe Felter at Stanford, we observed that students in our research universities had little connection to the problems their government was trying to solve or the larger issues civil society was grappling with. As we thought about how we could get students engaged, we realized the same Lean LaunchPad/I-Corps class would provide a framework to do so. That year we launched both Hacking for Defense and Hacking for Diplomacy (with Professor Jeremy Weinstein and the State Department) at Stanford. The Department of Defense adopted and scaled Hacking for Defense across 60 universities while Hacking for Diplomacy has been taught at Georgetown, James Madison University, Rochester Institute of Technology, University of Connecticut and now Indiana University, sponsored by the Department of State Bureau of Diplomatic Security (see here).

Team ArgusNet – instant geospatial data for search and rescue.

If you can’t see the ArgusNet video click here

If you can’t see the ArgusNet presentation click here

Goals for Hacking for Defense
Our primary goal for the class was to teach students Lean Innovation methods while they engaged in national public service.
In the class we saw that students could learn about the nation’s threats and security challenges while working with innovators inside the DoD and Intelligence Community. At the same time, the experience would introduce the sponsors – innovators inside the Department of Defense (DOD) and Intelligence Community (IC) – to a methodology that could help them understand and better respond to rapidly evolving threats. We wanted to show that if we could get teams to rapidly discover the real problems in the field using Lean methods, and only then articulate the requirements to solve them, defense acquisition programs could operate at speed and urgency and deliver timely and needed solutions.
Finally, we wanted to familiarize students with the military as a profession and help them better understand its expertise, and its proper role in society. We hoped it would also show our sponsors in the Department of Defense and Intelligence community that civilian students can make a meaningful contribution to problem understanding and rapid prototyping of solutions to real-world problems.

Team NeoLens – AI-powered troubleshooting for military mechanics.

If you can’t see the NeoLens video click here

If you can’t see the NeoLens presentation click here

Go-to-Market/Deployment Strategies
The initial goal of the teams is to ensure they understand the problem. The next step is to see if they can find mission/solution fit (the DoD equivalent of commercial product/market fit). But most importantly, the class teaches the teams about the difficult and complex path of getting a solution into the hands of a warfighter/beneficiary. Who writes the requirement? What’s an OTA? What’s “color of money”? What’s a Program Manager? Who owns the current contract? …

Team Omnicomm – improving the quality, security and resiliency of communications for special operations units.

If you can’t see the Omnicomm video click here


If you can’t see the Omnicomm presentation click here

Mission-Driven in 70 Universities and Continuing to Expand in Scope and Reach
What started as a class is now a movement.
From its beginning with our Stanford class, Hacking for Defense is now offered in over 70 universities in the U.S., as well as in the UK as Hacking for the MOD and in Australia. In the U.S., the course is a program of record supported by Congress. H4D is sponsored by the Common Mission Project, the Defense Innovation Unit (DIU), and the Office of Naval Research (ONR). Corporate partners include Boeing, Northrop Grumman and Lockheed Martin.
Steve Weinstein started Hacking for Impact (Non-Profits) and Hacking for Local (Oakland) at U.C. Berkeley, Hacking for Oceans at both Scripps and UC Santa Cruz, and Hacking for Climate and Sustainability at Stanford. Jennifer Carolan started Hacking for Education at Stanford.

Team Strom – simplified mineral value chain.

If you can’t see the Strom video click here

If you can’t see the Strom presentation click here

What’s Next For These Teams?
When they graduate, the Stanford students on these teams have their pick of jobs in startups, companies, and consulting firms. This year, seven of our teams applied to the Defense Innovation Unit accelerator – the DIU Defense Innovation Summer Fellows Program – Commercialization Pathway. All seven were accepted. This further reinforced our thinking that Hacking for Defense has turned into a pre-accelerator – preparing students to transition their learning from the classroom to deployment.

See the teams present in person here

It Takes A Village
While I authored this blog post, this class is a team project. The secret sauce of the success of Hacking for Defense at Stanford is the extraordinary group of dedicated volunteers supporting our students in so many critical ways.

The teaching team consisted of myself and:

  • Pete Newell, retired Army Colonel and ex Director of the Army’s Rapid Equipping Force, now CEO of BMNT.
  • Joe Felter, retired Army Special Forces Colonel; and former deputy assistant secretary of defense for South Asia, Southeast Asia, and Oceania; and currently the Director of the Gordian Knot Center for National Security Innovation at Stanford which we co-founded in 2021.
  • Steve Weinstein, partner at America’s Frontier Fund, 30-year veteran of Silicon Valley technology companies and Hollywood media companies. Steve was CEO of MovieLabs, the joint R&D lab of all the major motion picture studios.
  • Chris Moran, Executive Director and General Manager of Lockheed Martin Ventures; the venture capital investment arm of Lockheed Martin.
  • Jeff Decker, a Stanford researcher focusing on dual-use research. Jeff served in the U.S. Army as a special operations light infantry squad leader in Iraq and Afghanistan.

Our teaching assistants this year were Joel Johnson, Rachel Wu, Evan Twarog, Faith Zehfuss, and Ethan Hellman.

31 Sponsors, Business and National Security Mentors
The teams were assisted by the originators of their problems – the sponsors.

Sponsors gave us their toughest national security problems: Josh Pavluk, Kari Montoya, Nelson Layfield, Mark Breier, Jason Horton, Stephen J. Plunkett, Chris O’Connor, David Grande, Daniel Owins, Nathaniel Huston, Joy Shanaberger, and David Ryan.
National Security Mentors helped students who came into the class with no knowledge of the Department of Defense and the FBI understand the complexity, intricacies and nuances of those organizations: Katie Tobin, Doug Seich, Salvadore Badillo-Rios, Marco Romani, Matt Croce, Donnie Hasseltine, Mark McVay, David Vernal, Brad Boyd, Marquay Edmonson.
Business Mentors helped the teams understand if their solutions could be a commercially successful business: Diane Schrader, Marc Clapper, Laura Clapper, Eric Byler, Adam Walters, Jeremey Schoos, Craig Seidel, Rich “Astro” Lawson.

Thanks to all!

Teaching National Security Policy with AI

The videos embedded in this post are best viewed on steveblank.com

International Policy students will be spending their careers in an AI-enabled world. We wanted our students to be prepared for it. This is why we’ve adopted and integrated AI in our Stanford national security policy class – Technology, Innovation and Great Power Competition.

Here’s what we did, how the students used it, and what they (and we) learned.


Technology, Innovation and Great Power Competition is an international policy class at Stanford (taught by me, Eric Volmar and Joe Felter). The course provides future policy and engineering leaders with an appreciation of the geopolitics of the U.S. strategic competition with great power rivals and the role critical technologies are playing in determining the outcome.

This course includes all that you would expect from a Stanford graduate-level class in the Masters in International Policy – comprehensive readings, guest lectures from current and former senior policy officials/experts, and deliverables in the form of written policy papers. What makes the class unique is that this is an experiential policy class. Students form small teams and embark on a quarter-long project that takes them out of the classroom to:

  • select a priority national security challenge, and then …
  • validate the problem and propose a detailed solution tested against actual stakeholders in the technology and national security ecosystem

The class combines multiple teaching tools.

  • Real world – Students worked in teams on real problems from government sponsors
  • Experiential – They get out of the building to interview 50+ stakeholders
  • Perspectives – They get policy context and insights from lectures by experts
  • And this year… Using AI to Accelerate Learning

Rationale for AI
In using this quarter to introduce AI we had three things going for us: 1) by fall 2024, AI tools were good and getting exponentially better; 2) Stanford had set up an AI Playground enabling students to use a variety of AI tools (ChatGPT, Claude, Perplexity, NotebookLM, Otter.ai, Mermaid, Beautiful.ai, etc.); and 3) many students were already using AI in classes, but it was usually ambiguous what they were allowed to do.

Policy students have to read reams of documents weekly. Our hypothesis was that our student teams could use AI to ingest and summarize content, identify key themes and concepts across the content, provide an in-depth analysis of critical sections, and then synthesize and structure their key insights and apply them to their specific policy problem. They did all that, and much, much more.

While Joe Felter and I had arm-waved “we need to add AI to the class,” Eric Volmar was the real AI hero on the teaching team. As an AI power user, Eric was most often ahead of our students on AI skills. He threw down a challenge to the students to continually use AI creatively and told them that they would be graded on it. He pushed them hard on AI use in office hours throughout the quarter. The results below speak for themselves.

If you’re not familiar with these AI tools in practice, it’s worth watching these one-minute videos.

Team OSC
Team OSC was trying to understand the appropriate level of financial risk for the U.S. Department of Defense in providing loans or loan guarantees in technology industries.

The team started using AI to do what we had expected: summarizing the stack of weekly policy documents using Claude 3.5. And like all teams, their unexpected use of AI was to create new leads for their stakeholder interviews. They found that they could ask AI for a list of leaders who were involved in similar programs, or who were involved in their program’s initial stages of development.

See how Team OSC summarized policy papers here:

If you can’t see the video click here

Claude was also able to create a list of leaders within the Department of Energy Title 17 credit programs, EXIM, DFC, and other federal credit programs that the team should interview. In addition, it created a list of leaders within the Congressional Budget Office and the Office of Management and Budget who would be able to provide insights. See the demo here:

If you can’t see the video click here

The team also used AI to transcribe podcasts. They noticed that key leaders of the organizations their problem came from had produced podcasts and YouTube videos. They used Otter.ai to transcribe these. That provided additional context for when they did interview them and allowed the team to ask insightful new questions.

If you can’t see the video click here

Note the power of fusing AI with interviews. The interviews ground the knowledge in the team’s lived experience.

The team came up with a use case the teaching team hadn’t thought of – using AI to critique the team’s own hypotheses. The AI not only gave them criticism but supported it with links from published scholars. See the demo here:

If you can’t see the video click here

Another use the teaching team hadn’t thought of was using Mermaid AI to create graphics for their weekly presentations. See the demo here:

If you can’t see the video click here

The surprises from this team kept coming. The last was that the team used Beautiful.ai to generate PowerPoint presentations. See the demo here:

If you can’t see the video click here

For all teams, using AI tools was a learning/discovery process all its own. Students were largely unfamiliar with most of the tools on day one.

Team OSC suggested that students start using AI tools early in the quarter and experiment with tools like ChatGPT and Otter.ai. Tools that have steep learning curves, like Mermaid, should be adopted at the very start of the project to train their models.

Team OSC AI tools summary: AI tools are not perfect, so make sure to cross-check summaries, insights and transcriptions for accuracy and relevancy. Be really critical of their outputs. The biggest takeaway is that AI works best when paired with human effort.

Team FAAST
The FAAST team was trying to understand how the U.S. can improve and scale the DoE FASST program in the urgent context of great power competition.

Team FAAST started using AI to do what we had expected, summarizing the stack of weekly policy documents they were assigned to read and synthesizing interviews with stakeholders.

One of the features of ChatGPT this team appreciated, and important for a national security class, was the temporary chat feature – data they entered would not be used to train OpenAI’s models. See the demo below.

If you can’t see the video click here

The team used AI to do a few new things we didn’t expect: generating emails to stakeholders and creating interview questions. During the quarter the team used ChatGPT, Claude, Perplexity, and NotebookLM. By the end of the 10-week class they were using AI to do a few more things we hadn’t expected. Their use of AI expanded to include simulating interviews. They gave ChatGPT specific instructions on who they wanted it to act like, and it provided personalized and custom answers. See the example here.

If you can’t see the video click here

Learning-by-doing was a key part of this experiential course. The big idea is that students learn both the method and the subject matter together. By learning it together, you learn both better.

Finally, they used AI to map stakeholders, get advice on their next policy move, and ask ChatGPT to review their weekly slides (by screenshotting the slides, putting them into ChatGPT, and asking for feedback and advice).

The FAAST team AI tool summary: ChatGPT was especially good with images and screenshots, with multi-step tasks, and when they wanted more custom instructions, as they used for the stakeholder interviews. Claude was more conversational and human in its writing, so they used it when sending emails. Perplexity was better for research because it provides citations, so you’re able to access the web and get directed to the source it’s citing. NotebookLM was something they tried out, but it was not as successful. It was a cool tool that allowed them to summarize specific policy documents into a podcast, but the summaries were often pretty vague.

Team NSC Energy
Team NSC Energy was working on a National Security Council problem, “How can the United States generate sufficient energy to support compute/AI in the next 5 years?”

At the start of the class, the team began by using ChatGPT to summarize their policy papers and generate tailored interview questions, while Claude was used to synthesize research for background understanding. Because ChatGPT occasionally hallucinated information, by the end of the class they were cross-validating the summaries via Perplexity Pro.

The team also used ChatGPT and Mermaid to organize their thoughts and determine who they wanted to talk to. ChatGPT was used to generate code to put into the Mermaid flowchart organizer. Mermaid has its own language, so ChatGPT was helpful – the team didn’t have to learn all the syntax for this language.
See the video of how Team NSC Energy used ChatGPT and Mermaid here:

If you can’t see the video click here
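
The reason this division of labor works is that Mermaid diagrams are just plain text, which any LLM can emit. As a hedged illustration (the function and the stakeholder names below are invented for the example, not the team’s actual tooling), here is the kind of flowchart source ChatGPT was generating, built programmatically from a stakeholder edge list:

```python
# Minimal sketch: build Mermaid flowchart source from (source, target) pairs.
# Paste the printed output into any Mermaid renderer to get the diagram.

def to_mermaid(edges):
    """Render stakeholder relationships as a Mermaid flowchart definition."""
    lines = ["flowchart TD"]  # "TD" = top-down layout
    for src, dst in edges:
        # Mermaid node ids can't contain spaces, so replace them
        lines.append(f"    {src.replace(' ', '_')} --> {dst.replace(' ', '_')}")
    return "\n".join(lines)

# Hypothetical relationships for an energy-policy stakeholder map
edges = [("NSC", "Dept of Energy"), ("Dept of Energy", "Utilities")]
print(to_mermaid(edges))
```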

Team Alpha Strategy
The Alpha Strategy team was trying to discover whether the U.S. could use AI to create a whole-of-government decision-making factory.

At the start of class, Team Alpha Strategy used ChatGPT-4o for policy document analysis and summary, as well as for stakeholder mapping. However, they discovered that going one by one through countless articles was time consuming, so the team pivoted to using NotebookLM for document search and cross-analysis. See the video of how Team Alpha Strategy used NotebookLM here:

If you can’t see the video click here

The other tools the team used were custom GPTs to build stakeholder maps and diagrams and to organize interview notes. There is a wide variety of specialized GPTs; one that was really helpful, they said, was a scholar GPT.
See the video of how Team Alpha Strategy used custom GPTs:

If you can’t see the video click here

Like other teams, Alpha Strategy used ChatGPT to summarize their interview notes and to create flow charts to paste into their weekly presentations.

Team Congress
The Congress team was exploring the question, “If the Department of Defense were given economic instruments of power, which tools would be most effective in the current techno-economic competition with the People’s Republic of China?”

As other teams found, Team Congress first used ChatGPT to extract key themes from hundreds of pages of readings each week and from press releases, articles, and legislation. They also used it for mapping and diagramming to identify potential relationships between stakeholders, or to creatively suggest alternate visualizations.

When Team Congress wasn’t able to reach their sponsor – a member of the Defense Modernization Caucus – in the initial two weeks of the class, much like Team OSC they used AI tools to pretend to be the sponsor. Once they realized its utility, they continued to do mock interviews using AI role play.

The team also used customized models of ChatGPT, but in their case found these were limited in the number of documents they could upload, because they had a lot of content. So they used retrieval-augmented generation, which takes in a user’s query, matches it with relevant sources in their knowledge base, and feeds those sources back to the model to ground the output. See the video of how Team Congress used retrieval-augmented generation here:

If you can’t see the video click here
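
The retrieval-augmented pattern the team describes is easy to sketch. As a hedged illustration – the documents, the word-count scoring, and the prompt below are invented stand-ins, and a real pipeline like the team’s would typically use embeddings rather than bag-of-words matching – a minimal pure-Python version looks like this:

```python
# Minimal retrieval-augmented generation sketch: score documents against a
# query, keep the best matches, and send only those to the model as context.
import math
from collections import Counter

def cosine(a, b):
    """Cosine similarity between two bag-of-words Counters."""
    dot = sum(a[w] * b[w] for w in set(a) & set(b))
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(query, docs, k=2):
    """Return up to k documents most similar to the query."""
    q = Counter(query.lower().split())
    scored = [(cosine(q, Counter(d.lower().split())), d) for d in docs]
    return [d for score, d in sorted(scored, reverse=True)[:k] if score > 0]

# Invented knowledge base standing in for the team's uploaded documents
docs = [
    "The Defense Modernization Caucus focuses on acquisition reform.",
    "Export controls limit sales of advanced chips to the PRC.",
    "The farm bill reauthorizes nutrition assistance programs.",
]
context = retrieve("chips export controls PRC", docs)
# Only the matched sources are handed to the model as grounding context
prompt = "Answer using only this context:\n" + "\n".join(context)
```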

Team NavalX
The NavalX team was learning how the U.S. Navy could expand its capabilities in Intelligence, Surveillance, and Reconnaissance (ISR) operations on general maritime traffic.

Like all teams, they used ChatGPT to summarize and extract from long documents, organize their interview notes, and define technical terms associated with their project. In this video, note their use of prompting to guide ChatGPT to format their notes.

See the video of how Team NavalX used tailored prompts for formatting interview notes here:

If you can’t see the video click here
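
Prompt-guided formatting like this amounts to giving the model a fixed template to fill. A hedged sketch of the idea – the section names, sample notes, and interviewee below are invented for illustration, not the team’s actual prompt:

```python
# Minimal sketch of a structured formatting prompt: a fixed template with
# the raw notes spliced in, so every interview write-up has the same shape.

TEMPLATE = """Format the interview notes below into exactly these sections:
Interviewee, Role, Key Insights (3 bullets), Open Questions (2 bullets).
Keep every fact from the notes; do not invent details.

Notes:
{notes}"""

def build_prompt(notes: str) -> str:
    """Splice raw interview notes into the fixed formatting template."""
    return TEMPLATE.format(notes=notes.strip())

# Hypothetical raw notes from a maritime ISR interview
prompt = build_prompt("Met CDR Smith, ISR lead. Sensor feeds are siloed.")
print(prompt)
```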

They also asked ChatGPT to role-play a critic of their argument and solution so they could find the weaknesses. And they began uploading many interviews at once, asking Claude to find themes or ideas in common that they might have missed on their own.

Here’s how the NavalX team used Perplexity for research.

If you can’t see the video click here

Like other teams, the NavalX team discovered you can customize ChatGPT by telling it how you want it to act.

If you can’t see the video click here

Another surprising insight from the team is that you can use ChatGPT to tell you how to write better prompts for itself.

If you can’t see the video click here

In summary, Team NavalX used Claude to translate texts from Mandarin, and found that ChatGPT was best for writing tasks, Perplexity best for research tasks, Claude best for reading tasks, and NotebookLM best for summarization.

Lessons Learned

  • Integrating AI into this class took a dedicated instructor with a mission to create a new way to teach using AI tools
  • The result was that AI vastly enhanced and accelerated the learning of all teams
    • It acted as a helpful collaborator
    • Fusing AI with stakeholder interviews was especially powerful
  • At the start of the class students were familiar with a few of these AI tools
    • By the end of the class they were fluent in many more of them
    • Most teams invented creative use cases
  • All Stanford classes we now teach – Hacking for Defense, Lean Launchpad, Entrepreneurship Inside Government – have AI integrated as part of the course
  • Next year’s AI tools will be substantively better