Hacking for Defense (H4D) @ Stanford – Week 2

We just held our second week of the Hacking for Defense class. This week the 8 teams spoke to 106 beneficiaries (users, program managers, etc.), we held a DOD/IC 101 workshop, our advanced lecture was on the Value Proposition Canvas, and we watched as the students ran into common customer discovery obstacles and found new ones.

(This post is a continuation of the series. See all the H4D posts here. Because of the embedded presentations this post is best viewed on the website.)


DOD/IC 101 – Workshop
We started the week by holding a Monday night workshop – DOD/IC 101. Our goal was to give the students with no military background a tutorial on the challenges facing the DOD/IC in the current asymmetric threat environment: how the DOD/IC defines its missions and specifies the products it needs, how it accomplishes those missions, and how products get to their ultimate users. This knowledge will help the students understand the overall environment in which their Mission Model Canvas operates.

We posted the slides here and more important, an annotated narrative for each of the slides here. It’s truly a landmark presentation. Even if you think you know how the DOD works, read the narrative alongside the slides. I learned a lot.

If you can’t see the presentation click here

If you can’t see the narrative click here.

Hacking for Defense: Week 2
The second week started with the 8 teams presenting what they learned in their first full week of class.

Capella Space
Team Capella is launching a constellation of synthetic aperture radar satellites into space to provide real-time radar imaging.

This week the team tested whether other beneficiaries – the Coast Guard and the oil and gas industry – might be interested in their solution. Great learning.

If you can’t see the presentation click here

 

Live Tactical Threat Toolkit (LTTT)
Team LTTT (Live Tactical Threat Toolkit) is trying to enhance the capacity of foreign military explosive ordnance disposal (EOD) teams to accomplish their mission. The team is developing tech-informed options for these teams to consult with their American counterparts in real time to disarm IEDs, and to document key information about what they have found.

The team did a good job starting to diagram the customer workflow, and in this week’s customer discovery efforts intends to gain an appreciation for the challenges ground users face in accomplishing these types of missions.

If you can’t see the presentation click here

Narrative Mind
Team Narrative Mind is trying to understand, disrupt, and counter adversaries’ use of social media. Current tools do not provide users with a way to understand the meaning within adversary social media content and there is no automated process to disrupt, counter and shape the narrative.

The team did a good job in starting to diagram the customer workflow and their understanding of how to prioritize MVP features.

If you can’t see the presentation click here

Skynet
Team Skynet is also using drones to provide ground troops situational awareness. (Almost the inverse of Team Guardian.)

Their Mission Model Canvas had a ton of learning, and their MVP engendered a lot of conversation from those who’ve been in combat and were familiar with the challenges of maintaining situational awareness under fire.

If you can’t see the presentation click here

Aqualink
Team aquaLink is working to give Navy divers a way to monitor their own physiological conditions while underwater (core temperature, maximum dive pressure, blood pressure and pulse.) Knowing all of this would give divers early warning of hypothermia or the bends.

In the first week of the class this team was suiting up in full Navy diving gear and doing customer discovery by spending an hour in the life of the beneficiary. They did their homework.
Aqualink suiting up

If you can’t see the presentation click here

Guardian
Team Guardian is working to protect soldiers from cheap, off-the-shelf commercial drones. What happens when adversaries learn how to weaponize drones with bullets, explosives, or chemical weapons? This team is actively working to identify viable responses to these battlefield inevitabilities.

If you can’t see the presentation click here

Sentinel
Team Sentinel is trying to use low cost sensors to monitor surface ships in an A2/AD (anti-access/area denial) environment. The team appreciates that the problem includes the sensors as well as the analytics of the sensor data.

Really good summary of hypotheses, experiments, results and action.

If you can’t see the presentation click here

Customer Discovery and the Flipped Classroom – Learnings
After talking to teams in office hours (the teaching team meets every team for 20 minutes every week), watching teams present, and then seeing a team send a sponsor an email that read like a bad business school sales pitch, we realized some students had skipped their homework and/or still hadn’t grasped the basics of Customer Discovery.

As a reminder, we run the class as a “flipped classroom” – the lectures – the basics of Customer Discovery and the Mission Model Canvas – are homework, watched on Udacity and on Vimeo. It was painfully clear that many of the students hadn’t done their homework. We plan to remedy that in next week’s class, warning the students that we will be cold calling on them to show us what they learned.

Some teams did their homework and understood that customer discovery meant “becoming the customer.” For example, the team solving a problem for Navy divers managed to get the Navy to suit them up in full diving regalia. On the other hand, some teams thought that customer discovery simply meant interviewing people and building minimal viable products. For example, we suggested to the team working on solutions for defusing Improvised Explosive Devices (IEDs) that truly understanding their customer needs might require them to get close to the dirt with some explosive ordnance disposal (EOD) teams. (Looking ahead, we have no doubt that this team will respond aggressively to instructor feedback and suit up in EOD equipment as part of their customer discovery efforts for week 3. Stay tuned.)

Part of the student confusion about customer discovery was the fault of the teaching team. We normally hold a “How to Do Customer Discovery” evening workshop, but we got caught by a tight spring break schedule and we punted this workshop to hold the DOD/IC workshop. In hindsight it was a bad idea – we should have found a way to hold both. We will remedy that by giving an abbreviated workshop first thing next week in the classroom.

All of these were problems we’ve seen before, and we’re course correcting quickly to solve them. But given the new form of the class, we also hit a few problems we hadn’t encountered.

First, some teams were stymied by the classified nature of the specific data sets they thought they needed to understand the customer problem and build MVPs. In every case, what they lacked was a deep understanding of the customer problem – which simply required going back to the basics of customer discovery.

Second, a few teams were truly blocked by a few sponsors who were also having a difficult time understanding the role they played in Customer Discovery and required follow up clarification by the teaching team and H4D military liaison officers.

Sponsor Education – Learnings
A few DOD sponsors believed they were not only the gatekeepers to the problem but the sole source of information for our teams. Given that sponsors are supposed to maximize the number of beneficiaries the teams talk to, the teaching team jumped on this and rapidly addressed it.

In another case the sponsor defined the problem so narrowly that the team saw it as providing incremental changes to a solution the sponsor already had. After discussion the sponsor agreed that the team should focus on the realm of the possible – how they would address the problem if no current solution were in place – and in the process define new plans for how the solutions could be used.

In other cases a few of our sponsors had difficulty generating the leads and contacts within their own ecosystems that were necessary to sustain our teams’ customer discovery beyond the sponsor’s primary contacts. Ultimately teams are required to interview 80-120 beneficiaries, advocates and stakeholders (customers). This is a heavy lift if the sponsor has not thought through who those people are and where they will be found.

Finally, one of our problem sponsors departed their organization and was replaced by an alternate. This created some lag time in reestablishing contact and effectively interacting with the team. Next time we’ll designate a primary and secondary sponsor – the pace of this course requires this.

For us, this was a good learning opportunity to understand the type of sponsor education we need to do in the next class.

Advanced Lecture: Value Proposition Canvas
The advanced lecture for week 2 was on the Value Proposition Canvas – finding product/market fit between Beneficiaries (customers, stakeholders, users) and the Value Proposition (the product/service) in a DOD setting.

Pete Newell started the lecture with a video from his time in the Army’s Rapid Equipping Force.

Pete used the video to take the students through a value proposition canvas and asked the class:

  1. Who are the primary beneficiaries? Who are the other beneficiaries?
  2. What’s the value proposition:
    • To the sergeant?
    • To the mechanics?
    • To the base commander?
    • To contract engineers?
    • To the military vehicle Program Manager?

Pete’s experience-based vignettes and discussion helped the students appreciate the sometimes competing interests of a diverse array of beneficiaries.

If you can’t see the presentation click here

Lessons Learned from Week 2

  • Teams are running at full speed
  • Running a flipped classroom requires constant management
    • Problems need to be vetted to ensure they can support customer discovery expectations
  • A Customer Discovery Workshop needs to be held
    • Teams need to understand how to work around security issues
  • Sponsors need education and management

Hacking for Defense (H4D) @ Stanford – Week 1

We just had our first Hacking for Defense class and the 8 teams have hit the ground running.

They talked to 86 customers/stakeholders before the class started.

(Because of the embedded presentations this post is best viewed on the website.)


Hacking for Defense is a new class in Stanford’s School of Engineering, where students learn about the nation’s security challenges by working with innovators inside the Department of Defense (DoD) and Intelligence Community. The class teaches students the Lean Startup approach to entrepreneurship while they engage in what amounts to national public service.

Hacking for Defense uses the same Lean LaunchPad Methodology adopted by the National Science Foundation and the National Institutes of Health and proven successful in Lean LaunchPad and I-Corps classes with thousands of teams worldwide. Over 70 students applied to this new Stanford class and we selected 32 of them in 8 teams.

One of the surprises was the incredible diversity of the student teams – genders, nationalities, expertise. The class attracted students from all departments and from undergrads to post docs.

Before the class started, the instructors worked with the Department of Defense and the Intelligence Community to identify 20 problems that the class could tackle. Teams then were free to select one of these problems as their focus for the class.

Most discussion about innovation of defense systems acquisition starts with writing a requirements document. Instead, in this class the student teams and their DOD/IC sponsors will work together to discover the real problems in the field and only then articulate the requirements to solve them and deploy the solutions.

Hacking for Defense: Class 1
We started the first class with the obligatory class overview slides. (Most of the students had already seen them during our pre-class information sessions but the class also had team mentors seeing them for the first time.)

If you can’t see the slides click here

Then it was time for each of the 8 teams to tell us what they did before class started. Their pre-class homework was to talk to 10 beneficiaries, and at the first class each team was asked to present a 5-slide summary of what they learned:

  • Slide 1: Title slide
  • Slide 2: Who’s on the team
  • Slide 3: Minimal Viable Product
  • Slide 4: Customer Discovery
  • Slide 5: Mission Model Canvas

As the teams presented the teaching team offered a running commentary of suggestions, insights and direction.

Unlike the other Lean LaunchPad / I-Corps classes we’ve taught, even before we gave the teams feedback on their findings, we were impressed by the initial level of sophistication most teams brought to deconstructing their sponsor’s problem.

Here are the first week presentations:

Team aquaLink is working on a problem for divers in the Navy who work 60 to 200 feet underwater for 2-4 hours, but currently have no way to monitor their core temperature, maximum dive pressure, blood pressure and pulse. Knowing all of this would give them early warning of hypothermia or the bends. The goal is to provide a wearable sensor system and apps that will allow divers to monitor their own physiological conditions while underwater.

If you can’t see the presentation click here.

Team Guardian is asking how to protect soldiers from cheap, off-the-shelf commercial drones conducting Intelligence, Surveillance and Reconnaissance. What happens when adversaries learn how to weaponize drones with bullets, explosives, or chemical weapons?

Slides 6 and 7 use the Value Proposition canvas to provide a deeper understanding of product/market fit.

If you can’t see the presentation click here.

Team Skynet is also using drones to provide ground troops situational awareness. (Almost the inverse of Team Guardian.)

Slides 6 – 8 use the Value Proposition canvas to provide a deeper understanding of product/market fit.

If you can’t see the presentation click here.

Team LTTT (Live Tactical Threat Toolkit) is providing assistance to other countries’ explosive ordnance disposal (EOD) teams – the soldiers trying to disarm roadside bombs (Improvised Explosive Devices – IEDs). They’re trying to develop tools that would allow foreign explosive experts to consult with their American counterparts in real time to disarm IEDs, and to document key information about what they have found.

If you can’t see the presentation click here.

Team Narrative Mind is trying to determine how to use data mining, machine learning, and data science to understand, disrupt, and counter adversaries’ use of social media (think ISIS). Current tools do not provide users with a way to understand the meaning within adversary social media content and there is no automated process to disrupt, counter and shape the narrative.

If you can’t see the presentation click here.

Team Capella is launching a constellation of satellites with synthetic aperture radar into space to provide the Navy’s 7th fleet with real-time radar imaging.

If you can’t see the presentation click here.

Pre-Computing the Problem and Solution
As expected, a few teams with great technical assets jumped into building the MVP and were off coding/building hardware. It’s a natural mistake. We’re trying to get students to understand the difference between an MVP and a prototype and the importance of customer discovery (hard when you think you’re so smart you can pre-compute customer problems and derive the solution sitting in your dorm room.)

Mentors/Liaisons/DIUx Support
Besides working with their government sponsors, each team has a dedicated industry mentor. One of the surprises was the outpouring of support from individuals and companies who emailed us from across the country (even a few from outside the U.S.) volunteering to mentor the teams.

Each team is also supported by an active duty military liaison officer drawn from Stanford’s Senior Service College Fellows.

Another source of unexpected support for the teams came from the Secretary of Defense’s DIUx Silicon Valley Innovation Outpost. DIUx has adopted the class and, along with the military liaisons, translates “military-speak” from the sponsors into English and vice versa.

Advanced Lectures
The Stanford teaching team uses a “flipped classroom” (the lectures are homework watched on Udacity.)  However, for this class some of the parts of the business model canvas, which make sense in a commercial setting, don’t work in the Department of Defense and Intelligence Community. So we are supplementing the video lectures with in-class “advanced” lectures that explain the new Mission Model Canvas. (We’re turning these lectures into animated videos which can serve as homework for the next time we teach this class.)

The first advanced lecture was on Beneficiaries (customers, stakeholders, users, etc.) in the Department of Defense. Slides 4-7 clearly show that solutions in the DOD are always a multi-sided market. Almost every military program has at least four customer segments: Concept Developers, Capability Managers, Program Managers, Users.

If you can’t see the presentation click here.

Each team is keeping a running blog of their customer interactions so we can virtually look over their shoulder as they talk to customers. From the look of the blogs week 2 is going to be equally exciting. Check in next week for an update.

Steve, Pete, Joe & Tom

Lessons Learned from Class 1

  • Talented and diverse students seem eager to solve national defense problems
  • Teams jumped on understanding their sponsors’ problems – even before the class
  • We’ve put 800+ teams through the NSF I-Corps and another 200 or so through my classes, but this class feels really different. There’s a mission focus and passion to these teams I’ve not seen before

Learning Through Reflection

“Sometimes, you have to look back in order to understand the things that lie ahead.”

We just finished the 6th annual Lean LaunchPad class. This year we made a small but substantive addition to the way we teach the class, adding a week for reflection. The results have made the class massively better.


For the last 6 years I’ve taught the Lean LaunchPad class at Stanford and Berkeley. To be honest I built the class out of frustration watching schools teach aspiring entrepreneurs that all they need to know is how to write a business plan or how to sit in an incubator building a product.

If you’ve read any of my previous posts, you know I believe that:

  1. a product is just a part of a startup, but understanding customers, channel, pricing – the business model – is what turns a product into a business
  2. business plans are fine for large companies where there is an existing market, existing product and existing customers, but they are useless in a startup where most often none of these are known
  3. entrepreneurship is experiential and requires theory and a ton of practice.

Therefore, we developed the 8-week Lean LaunchPad class to teach students how to think about all the parts of building a business, not just the product.  We organized the class as:

  • Team-based
    • Students apply and learn as teams of 4. Eight teams per class
  • A “flipped” classroom
    • Students watch the lectures as homework via our MOOC
  • Every week we teach a new part of the theory of how to commercialize an insight or invention
    • using the business model canvas as the framework
  • Every week we teach the practice of Lean
    • by having the students get out of the classroom and talk to 10-15 customers a week and build a new Minimum Viable Product weekly
    • in order to validate/invalidate their business model hypotheses
    • The teaching team critiques their progress and suggests what they might do next
  • Every week the teams present their results
    • “Here’s what we thought, here’s what we did, here’s what we found, here’s what we are going to do next”

The combination of the Business Model Canvas, Customer Development and Agile Engineering is an extremely efficient template for the students to follow. It drives a hyper-accelerated learning process which leads the students to an “information-dense, evidence-based” set of conclusions. (Translation: they learn a lot more, in a shorter period of time, than in any other entrepreneurship course we’ve taught or seen.)

Demo Days Versus Lesson Learned Presentations
One thing we always kept in mind – we were teaching students a methodology and a set of skills for the rest of their lives – not running an incubator or accelerator. As a consequence, we couldn’t care less about a “Demo Day” at the end of the class. We don’t want our students focused on fund-raising, we want them to maximize their learning. Secondly, even for fund-raising, you couldn’t invent a less useful format to evaluate a startup’s potential than the Demo Days held by accelerators. Demo Days historically have been exactly what they sound like: “Show me how smart your team is at this instant in time.” Everything depends on a demo, presentation and speaking style.

We designed our class to do something different. We wanted the teams to tell the story of their journey, sharing with us their “Lessons Learned from our Customers”. They needed to show what they learned and how they learned it after speaking to 100+ customers, using the language of the class: interviews, iterations, pivots, restarts, experiments, minimal viable products, evidence. The focus of their presentations is on how they gathered evidence and how it impacted their understanding of their business models – while they were building their MVP.

Reflection Week
In the past, our teams would call on customers until the last week of the class and then present their Lessons Learned. The good news is that their presentations were dramatically better than those given at demo days – they showed us what they learned over 8 weeks, which gave us a clear picture of the velocity and trajectory of the teams. The bad news is that since their heads were down working on customer discovery until the very end, they had no time to reflect on the experience.

We realized that we had been so focused on packing content and work into the class that we failed to give the students time to step back and think about what they actually learned.

So this year we made a change. We turned the next-to-last week of the class into a reflection week. Our goal: to have the students extract the insights and meaning from the work they had done in the previous seven weeks.

We asked each team to prepare a draft Lessons Learned presentation telling us about their journey and showing us their:

  • Initial hypotheses and Petal diagram
  • Quotes from customers that illustrated learnings and insights
  • Diagrams of key parts of the Canvas – customer flow, channel, get/keep/grow (before and after)
  • Pivot stories
  • Screen shots of the evolution of Minimum Viable Product (MVP)
  • Demo of final MVP

The teaching team reviewed the drafts and provided feedback to the teams and to the class as a whole. We discussed what general patterns and principles they extracted from all the customer interaction they had. On the last day of class, each team shared their Lessons Learned presentations, giving everyone in the class the benefit of what every team had learned.

We used this week to help teams realize that they had accomplished more than they first thought. For the teams who found that their ideas weren’t a scalable business, we let them conclude that while it was great to celebrate the wins, they could also embrace and celebrate their failures as low-cost learning.

By the time the final week of the final Lessons Learned presentations rolled around, the students were noticeably more relaxed and happier than teams in past classes. It was clear they had a solid understanding of the magnitude of their journey and the size of their accomplishments – eight teams had spoken to nearly 900 customers, built 50 minimum viable products, and tested tons of hypotheses.

Here are four examples from our 2016 Stanford class:

Pair Eyeware
Be sure to look at how they tested their hypotheses on slides 11 and 12, and the before and after value proposition canvases on slides 13-17. A great competitive Petal diagram is on slide 22.

Share and Tell
Great story and setup in slides 3-7. Understanding their market in week 6, slide 31.

Allocate
Notice how they learned about their customer archetypes on slides 12-14. After 80 interviews, a big pivot on slide 16.

Nova Credit
Look at the key hypotheses on slide 2 and their journey in the class on slide 5.

Lessons Learned

  • Dedicating a week for reflection expands what everyone learns
  • Students extract the insights and meaning from the work they did
  • See all the presentations here

The Mission Model Canvas – An Adapted Business Model Canvas for Mission-Driven Organizations

As we prepared for the new Hacking for Defense class at Stanford, we had to stop and ask ourselves: How do we use the Business Model Canvas if the primary goal is not to earn money, but to fulfill a mission? In other words, how can we adapt the Business Model Canvas when the metric of success for an organization is not revenue?


Alexander Osterwalder and I think we have the answer – the new Mission Model Canvas.

Here are our collective thoughts.

—-

The Lean Startup is the way most innovators build startups and innovate inside of existing companies. As a formal method, the Lean Startup consists of three parts: the Business Model Canvas, Customer Development, and Agile Engineering.

The Business Model Canvas has been a great invention for everyone from startups to large companies. Unlike an org chart, which describes how a company executes to deliver known products to known customers, the Business Model Canvas illustrates the search for the unknowns that most new ventures face. The 9 boxes of the canvas let you visualize all the components needed to turn customer needs/problems into a profitable company.

From Revenue Streams to Mission Achievement
The Business Model Canvas has served all of us well in thinking about building businesses – and therein lies the problem. In a business the aim is to earn more money than you spend. What if you’re a government or a military organization or part of the intelligence community? In these cases you don’t earn money, but you mobilize resources and a budget to solve a particular problem and create value for a set of beneficiaries (customers, support organizations, warfighters, Congress, the country, etc.)

For these organizations, the canvas box labeled Revenue Streams doesn’t make sense. In a mission-driven organization such as the defense and intelligence community, there is no revenue to measure. So the first step in building a canvas for mission-driven organizations is to change the Revenue Streams box in the canvas and come up with a counterpart that would provide a measure of success.

We’re calling this alternative Mission Achievement. Later in this post I’ll explain how we’ll measure and describe Mission Achievement, but first our Mission Model Canvas needs four more tweaks.

  • Customer Segments is changed to Beneficiaries
  • Cost Structure is changed to Mission Cost/Budget
  • Channel is changed to Deployment
  • Customer Relationships is changed to Buy-in/Support

The rest of this blog post explains the how and why of these changes to the canvas.
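Before diving in, here is a minimal sketch of the Mission Model Canvas as a Python dataclass. This is our illustration, not official canvas tooling; the field names mirror the boxes above and the example values are hypothetical.

```python
from dataclasses import dataclass, field

@dataclass
class MissionModelCanvas:
    # Boxes renamed from the Business Model Canvas
    beneficiaries: list = field(default_factory=list)        # was Customer Segments
    deployment: list = field(default_factory=list)           # was Channel
    buy_in_support: list = field(default_factory=list)       # was Customer Relationships
    mission_achievement: list = field(default_factory=list)  # was Revenue Streams
    mission_cost_budget: list = field(default_factory=list)  # was Cost Structure
    # Boxes carried over unchanged
    value_propositions: list = field(default_factory=list)
    key_activities: list = field(default_factory=list)
    key_resources: list = field(default_factory=list)
    key_partners: list = field(default_factory=list)

# Hypothetical example loosely based on the aquaLink problem
canvas = MissionModelCanvas(
    beneficiaries=["Navy divers", "dive medical officers", "program managers"],
    value_propositions=["wearable physiological monitoring underwater"],
    mission_achievement=["divers warned of hypothermia or the bends before injury"],
)
```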

Customer Segments Change to Beneficiaries
At first glance, when developing a new technology for use in the defense and intelligence community, the customer appears obvious – it’s the ultimate war fighter. They will articulate pains in terms of size, weight, form fit, complexity and durability. But there are other key players involved.  Requirement writers and acquisition folks look at systems integration across the battlefield system, while contracting officers, yet another segment, will count beans, measure the degree of competition and assess the quality of market research involved. The support organizations need to worry about maintainability of code or hardware. Does legal need to sign off for cyber operations?  So yes, war fighters are one customer segment, but others need to be involved before the war fighter can ever see the product.

So the first insight is that in the defense and intelligence community mission models are always multi-sided markets with the goal of not just building a great demo but getting the product adopted and deployed.

Second, in the defense and intelligence communities almost all of the mission models look like that of an OEM supplier – meaning there are multiple layers of customers in the value chain. Your product/service is just part of someone else’s larger system.

So to differentiate “customers” from the standard business model canvas we’ll call all the different customer segments and the layers in the defense and intelligence value chain beneficiaries.

The Value Proposition Canvas
Of all the nine boxes of the canvas, two of the most important parts of the model are the Value Proposition (what you’re building) and the beneficiaries – and the relationship between them. These two components of the business model are so important we give them their own name: Product/Market Fit.

Because of the complexity of multiple beneficiaries and to get more detail about their gains and pains, Osterwalder added an additional canvas called the Value Proposition Canvas.  This functions like a plug-in to the Mission Model Canvas, zooming in to the value proposition to describe the interactions among these beneficiaries, war fighters, etc. and the product/service in more detail. Using the Value Proposition Canvas with the Mission Model Canvas lets you see both the big picture at the mission model level and the detailed picture of each beneficiary at the “product/market fit” level.

In the defense and intelligence community mission models, there will always be multiple beneficiaries. It’s important that each beneficiary gets its own separate Value Proposition Canvas.

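Continuing the hypothetical Python illustration above, here is a sketch of the “one canvas per beneficiary” rule; the jobs, pains and gains shown are invented for the example.

```python
from dataclasses import dataclass

@dataclass
class ValuePropositionCanvas:
    beneficiary: str
    # Beneficiary profile
    jobs: list             # what this beneficiary is trying to get done
    pains: list            # obstacles and risks they face today
    gains: list            # outcomes they want
    # Value proposition side
    products_services: list
    pain_relievers: list
    gain_creators: list

# One canvas per beneficiary -- never a single merged canvas for all of them
vp_canvases = {
    "war fighter": ValuePropositionCanvas(
        beneficiary="war fighter",
        jobs=["complete the dive mission safely"],
        pains=["no warning before hypothermia sets in"],
        gains=["early physiological alerts without surfacing"],
        products_services=["wearable sensor plus companion app"],
        pain_relievers=["continuous core-temperature readout"],
        gain_creators=["alerts delivered underwater"],
    ),
    # "contracting officer": ValuePropositionCanvas(...),
    # "support organization": ValuePropositionCanvas(...),
}
```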

Distribution Channel changes to Deployment
In the commercial world we ask, “What type of distribution channel (direct sales, app store, system integrator, etc.) do we use to get the product/service from our company to the customer segments?”  For the Department of Defense or Intelligence organizations, we ask instead:

  • “What will it take to deploy the product/service from our current Minimum Viable Product to widespread use among people who need it?” (What architecture components can they innovate on and what can’t they?)
  • “What constitutes a successful deployment? (number of users, units in the field, time to get it into the field, success in the field, etc.)”
  • “How do we turn a Horizon 3 innovation into something that gets adopted by a Horizon 1 organization?”

Customer Relationships changes to Buy-In/Support
In an existing business, Customer Relationships is defined as establishing and maintaining a relationship to support existing customers. In a startup we redefined Customer Relationships to answer the question:  How does a company get, keep and grow customers?

For the defense and intelligence communities, we have modified Customer Relationships to answer the question: for each beneficiary (customer segment), how does the team get “Buy-In”?

Customer discovery helps you understand whose buy-in is needed in order to deploy the product/service (legal, policy, procurement, etc.) and how to get those beneficiaries to buy in (funding? mandates? user requests? etc.). In addition, the long-term support and maintenance of new projects needs to be articulated, understood and bought into by the support organizations.

At the Pentagon a favorite way to kill something is to coordinate it to death by requiring buy-in from too many people too early. Determining the small group of critical people whose buy-in is needed first – and the next set required to sustain the iterative development of future MVPs – is one of the arts of entrepreneurship in the defense and intelligence community.

Revenue Streams changes to Mission Achievement
Mission Achievement is the value you are creating for the sum of all of the beneficiaries / the greater good.

It’s important to distinguish between the value for individual beneficiaries (on the Value Proposition Canvas) and overall Mission Achievement. For example, Mission Achievement could be measured in a variety of ways: the number of refugees housed and fed, the number of soldiers saved from roadside bombs, the number of cyberattacks prevented, the increased target surveillance from sensor fusion, etc. None of these are measured in dollars and cents. Keep in mind, there is only mission achievement if it delivers value to the end beneficiary.

Lessons Learned

  • In the defense and intelligence community the metrics of success are not revenue but mission achievement
    • We’ve modified the Business Model Canvas into a Mission Model Canvas
    • Changed Revenue Streams to Mission Achievement
    • Changed Customer Segments to Beneficiaries
    • Changed Cost Structure to Mission Cost/Budget
    • Changed Channel to Deployment
    • Changed Customer Relationships to Buy-in/Support
  • Organizations without specific revenue goals can now use a version of the Business Model Canvas

Hacking for Defense @ Stanford – Making the World a Safer Place

Introducing Hacking for Defense – Connecting Silicon Valley Innovation Culture and Mindset to the Department of Defense and the Intelligence Community
Hacking for Defense is a new course at Stanford’s Engineering School in the Spring of 2016. It is being taught by Tom Byers, Steve Blank, Joe Felter and Pete Newell, and is advised by former Secretary of Defense Bill Perry. Join a select cross-disciplinary class that will put you hands-on with the masters of lean innovation to help bring rapid-fire innovative solutions to address threats to our national security. Why?

Army, Navy, Air Force, Marines, CIA, NSA
What do all these groups in the Department of Defense and Intelligence Community (DOD/IC) have in common? Up until the dawn of the 21st century, they defined military technology superiority. Our defense and intelligence community owned and/or could buy and deploy the most advanced technology in the world. Their R&D groups and contractors had the smartest domain experts who could design and manufacture the best systems. Not only were they insulated from technological disruption, they were often also the disrupters. (During the Cold War we used asymmetric technologies in silicon and software to disrupt the Soviet Union’s lead in conventional weapons.) Yet in the last decade the U.S. Department of Defense and Intelligence Community have faced their own disruption: ISIS. al-Qaeda. North Korea. Crimea. Ukraine. DF-21 and islands in the South China Sea.

Today these potential adversaries are able to harness the power of social networks, encryption, GPS, low-cost drones, 3D printers, simpler design and manufacturing processes, agile and lean methodologies, ubiquitous Internet and smartphones. The expertise in people, processes and systems that we once closely held has evolved into commercial off-the-shelf technologies. U.S. agencies that historically owned technology superiority and fielded cutting-edge technologies now find that off-the-shelf solutions may be more advanced than the solutions they are working on, or that adversaries can rapidly create asymmetric responses using these readily available technologies.

It’s Not Just the Technology
Perhaps more important than the technologies, these new adversaries can acquire and deploy disruptive technology at a speed that to us looks like a blur. They can do so because most have little legacy organizational baggage, no government overhead, some of the best software talent in the world, cheap manpower costs, no career risk when attempting new unproven feats and ultimately no fear of failure.

Terrorists today live on the ‘net and they are all early adopters. They don’t need an office in Silicon Valley to figure out what’s out there. They are experts in leveraging Web 2.0 and 3.0. They are able to collaborate using Telegram, Instagram, Facebook, Skype, FaceTime, YouTube, wikis, IM/chat. Targeting, assessments, technology, recipes, and tactics all flow at the speed of a Lean Startup. They can crowd-source designs, find components through eBay, fund through PayPal, train using virtual worlds and refine tactics, techniques and procedures using massive on-line gaming. All while we’re still writing a Request for Proposal from within the US Government procurement and acquisition channels.


We’re Our Own Worst Enemy
In contrast to the agility of many of our adversaries, the Department of Defense and the Intelligence Community have huge investments in existing systems (aircraft carriers, manned fighters and bombers, large satellites, etc.), an incentive system (promotions) that supports the status quo, an existing contractor base with major political influence over procurement and acquisition, and the talent to deliver complex systems that are the answer to past problems.

Efficiently Being Inefficient
Our drive for ultimate efficiency in buying military systems (procurement) has made us our own worst enemy. These acquisition and procurement “silos” of excellence are virtually impenetrable by new ideas and requirements. Even in the rare moments of crisis and need, when they do show some flexibility, their reaction is often so slow and cumbersome that by the time the solutions reach the field, the problem they intended to solve has changed so dramatically the solutions are useless.

The incentives for acquiring and deploying innovation in the DOD/IC with speed and urgency are not currently aligned with the government acquisition, budgeting, and requirements processes, all of which have remained unchanged for decades or even centuries.

The Offset Dilemma – Technology is not the Silver Bullet
Today, many in the Department of Defense and Intelligence Community are searching for a magic technology bullet – the next Offset Strategy – convinced that if they could only get close to Silicon Valley, they would find the right technology advantage.

It turns out that’s a massive mistake. What Silicon Valley delivers is not just new technology but – perhaps even more importantly – an innovation culture and mindset. We will not lose because we had the wrong technology.  We will lose because we couldn’t adopt, adapt and deploy technology at speed and in sufficient quantities to overcome our enemies.

Ultimately the solution isn’t reforming the acquisition process (incumbents will delay/kill it) or buying a new technology and embedding it in a decade-long procurement process (determined adversaries will find asymmetric responses).

The solution requires new ways to think about, organize, and build and deploy national security people, organizations and solutions.

Stanford’s new Hacking for Defense class is a part of the solution.

Hacking for Defense (H4D) @ Stanford
In Hacking for Defense, a new class at Stanford’s School of Engineering this spring, students will learn about the nation’s emerging threats and security challenges while working with innovators inside the Department of Defense (DoD) and Intelligence Community. The class teaches students entrepreneurship while they engage in what amounts to national public service.

Hacking for Defense uses the same Lean LaunchPad Methodology adopted by the National Science Foundation and the National Institutes of Health and proven successful in Lean LaunchPad and I-Corps classes with thousands of teams worldwide. Students apply as a 4-person team and select from an existing set of problems provided by the DoD/IC community or introduce their own ideas for DoD/IC problems that need to be solved.

Student teams will take actual national security problems and learn how to apply Lean Startup principles to discover and validate customer needs and to continually build iterative prototypes to test whether they understood the problem and solution.

Most discussion about innovation of defense systems acquisition starts with writing a requirements document. Instead, in this class the student teams and their DOD/IC sponsors will work together – using an agile process – to discover the real problems in the field and only then articulate the requirements to solve them and deploy the solutions.

Each week, teams will use the Mission Model Canvas (a DOD/IC variant of the Business Model Canvas) to develop a set of initial hypotheses about a solution to the problem and will get out of the building and talk to all the beneficiaries: Requirement Writers, Buyers (acquisition project managers) and Users (the tactical folks). As they learn, they’ll iterate and pivot on these hypotheses through customer discovery and build minimal viable prototypes (MVPs). Each team will be guided by two mentors, one from the agency that proposed the problem and a second from the local community. In addition to these mentors, each H4D student team will be supported by an active duty military liaison officer drawn from Stanford’s Senior Service College Fellows to facilitate effective communication and interaction with the problem sponsors.

Today if college students want to give back to their country they think of Teach for America, the Peace Corps, or Americorps. Few consider opportunities to make the world safer with the Department of Defense, Intelligence Community and other government agencies. The Hacking for Defense class will promote engagement between students and the military and provide a hands-on opportunity to solve real national security problems.

Our goal is to open-source this class to other universities and create the 21st Century version of Tech ROTC. By creating a national network of colleges and universities, the Hacking for Defense program can scale to provide hundreds of solutions to critical national security problems every year.

We’re going to create a network of entrepreneurial students who understand the security threats facing the country and get them engaged in partnership with islands of innovation in the DOD/IC. This is a first step to a more agile, responsive and resilient approach to national security in the 21st century.

Lessons Learned

 Hacking for Defense is a new class that teaches students how to:

  • Use the Lean LaunchPad methodology to deeply understand the problems/needs of government customers
  • Rapidly iterate technology to produce solutions while searching for product-market fit
  • Deliver minimum viable products that match DOD/IC customer needs in an extremely short time

The class will also teach the islands of innovation in the Department of Defense and Intelligence Community:

  • How the innovation culture and mindset operate at speed
  • Advanced technologies that exist outside their agencies and contractors (in university labs, and as commercial off-the-shelf solutions)
  • How to use an entrepreneurial mindset and Lean Methodologies to solve national security problems


Sign up here.

Doubling Down On a Good Thing: The National Science Foundation’s I-Corps Lite

I’ve known Edmund Pendleton from the University of Maryland as the Director of the D.C. National Science Foundation (NSF) I-Corps Node (a collaboration among the University of Maryland, Virginia Tech, George Washington, and Johns Hopkins). But it wasn’t until seeing him lead the first I-Corps class at the National Institutes of Health that I realized Edmund could teach my class better than I can.

After seeing the results of 500+ teams go through the I-Corps, the NSF now offers an introduction to building a Lean Startup to all teams who’ve received government funding to start a company.

Here’s Edmund’s description of the I-Corps Lite program.

SBIR/STTR Program and Startup Seed Funding
The Small Business Innovation Research (SBIR) and Small Business Technology Transfer (STTR) programs are startup seed funds created by Congress to encourage U.S. small businesses to turn Government-funded research into commercial businesses. Eleven U.S. agencies participate in the SBIR/STTR program, with DOD, HHS (NIH), NSF, DOE, and NASA offering the majority of funding opportunities.

The SBIR/STTR program made ~6,200 seed stage investments in 2014, dwarfing the seed investments made by venture capital. The SBIR/STTR program represents a critical source of seed funding for U.S. startups that don’t fit whatever’s hot in venture capital. In fact, half of all seed-stage investments in tech companies in the U.S. were funded by the SBIR program.

The SBIR/STTR program
The SBIR/STTR program funds companies in three phases. Phase I funding is for teams to prove feasibility, both technical and commercial.

Since most of the founders come from strong technical roots, companies in Phase I tend to focus on the technology – and spend very little time understanding what it takes to turn the company’s technology into a scalable and repeatable commercial business.

In 2011 the National Science Foundation recognized that many of the innovators they were funding were failing – not from an inability to make their technologies work – but because they did not understand how to translate the technology into a successful business. To address this problem, the NSF collaborated with Steve Blank to adapt his Lean LaunchPad class at Stanford for NSF-funded founders. By focusing on hypothesis testing, the Lean LaunchPad had actually developed something akin to the scientific method for entrepreneurship (see here, here and the results here). This was an approach that would immediately make sense to the scientists and technologists NSF was funding. Steve and the NSF collaborated on adapting his curriculum and the result was the 9-week NSF I-Corps program.

NSF’s original I-Corps program was specifically designed for academic innovators still in the lab; fundamentally, to help them determine the best path to commercialization before they moved to the start-up stage. (I-Corps participants are at the “pre-company” stage.) But NSF realized the Lean LaunchPad approach would be equally beneficial for the many startups they fund through the SBIR/STTR program.

The “Beat the Odds” Bootcamp – an I-Corps “Lite”
The good news is that the NSF found that the I-Corps program works spectacularly well. But the class requires the founding team to get out of the building and talk to 10-15 customers a week, and then present what they learned – it is essentially a full-time commitment.

Was there a way to expose every one of the ~240 companies/year who receive an NSF grant to the I-Corps? The NSF decided to pilot a “Beat the Odds Boot Camp” (essentially an I-Corps Lite) at the biannual gathering of new SBIR/STTR Phase I grantees in Washington.

Steve provided an overview of the Lean LaunchPad methodology in an introductory webinar. Then the companies were sent off to do customer discovery before coming to an optional “bootcamp workshop” 12 weeks later. Four certified I-Corps instructors provided feedback to these companies at the workshop. The results of the pilot were excellent. The participating companies learned a significant amount about their business models, even in this very light-touch approach. The NSF SBIR/STTR program had found a way to improve the odds of building a successful company.

During the past two years, I’ve taken the lead to expand and head up this program, building on what Steve started. We now require the participating companies to attend kick-off and mid-point webinars, and to conduct 30 customer interviews over the twelve-week program. The companies present to I-Corps instructors at a “Beat the Odds Bootcamp” – the day before the biannual NSF Phase I Grantee Workshop.

In March we conducted our fourth iteration of this workshop with a record number of companies participating (about 110 of 120, or 90%) and 14 certified I-Corps instructors giving feedback to teams. This time, we added afternoon one-on-one sessions with the teams in addition to group presentations in the morning. Companies are very happy with the program, and many have requested even more face time with I-Corps instructors throughout the process.

The smart companies in Phase I realize that this Bootcamp program provides a solid foundation for success in Phase II, when more dollars are available.

What’s Next
Currently, once these teams leave I-Corps Lite, they do not have any “formal” touch points with their instructors. Over time, we hope to offer more services to the teams and develop a version of I-Corps (I-Corps-Next?) for Phase II grantees.

We envision even greater startup successes if SBIR/STTR funded teams can take advantage of I-Corps classes through their entire life cycle:

  • “Pre-company” academic researchers – current I-Corps
  • Phase I SBIR/STTR teams – current I-Corps Lite
  • Phase II SBIR/STTR teams – develop a new I-Corps Next class


The emphasis and format would change for each, but all would be solidly rooted in the Lean LaunchPad methodology. And of course, we don’t want to stop with only NSF teams/companies. As we all know, the opportunity is huge, and we can have a significant impact on the country’s innovation ecosystem.

Summary
NSF led the development of the SBIR program in the late 1970s. It has since been adopted by the entire federal research community. We believe NSF’s leadership with I-Corps will deliver something of equal significance… a program that teaches scientists and engineers what it takes to turn those research projects into products and services for the benefit of society.  I-Corps Lite is one more piece of that program.

Lessons Learned

  • The SBIR/STTR program is a critical source of seed funding for technology startups that don’t fit the “whatever’s hot” category for venture capital
  • The program is a national treasure and envied around the world, but we can (and should) improve it.
  • SBIR/STTR Phase I applicants needed more help with “commercial feasibility”…a perfect fit for business model design, customer discovery and agile engineering – so we rolled out the NSF I-Corps
  • The I-Corps was so successful we wanted more NSF-funded entrepreneurs, not just a select few, to be exposed to the Lean methodology – so we built I-Corps Lite

Why Build, Measure, Learn – isn’t just throwing things against the wall to see if they work – the Minimal Viable Product

I am always surprised when critics complain that the Lean Startup’s Build, Measure, Learn approach is nothing more than “throwing incomplete products out of the building to see if they work.”

Unfortunately the Build, Measure, Learn diagram is the cause of that confusion. At first glance it seems like a fire-ready-aim process.

It’s time to update Build, Measure, Learn to what we now know is the best way to build Lean startups.

Here’s how.


Build, Measure, Learn sounds pretty simple. Build a product, get it into the real world, measure customers’ reactions and behaviors, learn from this, and use what you’ve learned to build something better. Repeat, learning whether to iterate, pivot or restart until you have something that customers love.

Waterfall Development
While it sounds simple, the Build Measure Learn approach to product development is a radical improvement over the traditional Waterfall model used throughout the 20th century to build and ship products. Back then, an entrepreneur used a serial product development process that proceeded step-by-step with little if any customer feedback. Founders assumed they understood customer problems/needs, wrote engineering requirements documents, designed the product, implemented/built the hardware/software, verified that it worked by testing it, and then introduced the product to customers in a formal coming out called first customer ship.

Waterfall Development was all about execution of the requirements document. While early versions of the product were shared with customers in Alpha and Beta Testing, the goal of early customer access to the product was to uncover bugs not to provide feedback on features or usability. Only after shipping and attempting to sell the product would a startup hear any substantive feedback from customers. And too often, after months or even years of development, entrepreneurs learned the hard way that customers were not buying their product because they did not need or want most of its features.

It often took companies three tries to get products right. Version 1 was built without customer feedback, and before version 1 was complete, work had already started on version 2, so it took until version 3 before the customer was really heard (e.g. Microsoft Windows 3.0).

Best practices in software development started to move to agile development in the early 2000s. This methodology improved on waterfall by building software iteratively and involving the customer. But it lacked a framework for testing all commercialization hypotheses outside of the building. With Agile you could end up satisfying every feature a customer asked for and still go out of business.

Then came the Build-Measure-Learn focus of the Lean Startup.

Build-Measure-Learn
The goal of Build-Measure-Learn is not to build a final product to ship or even to build a prototype of a product, but to maximize learning through incremental and iterative engineering. (Learning could be about product features, customer needs, the right pricing and distribution channel, etc.) The “build” step refers to building a minimal viable product (an MVP). It’s critical to understand that an MVP is not the product with fewer features. Rather it is the simplest thing that you can show to customers to get the most learning at that point in time. Early on in a startup, an MVP could simply be a PowerPoint slide, wireframe, clay model, sample data set, etc. Each time you build an MVP you also define what you are trying to test/measure. Later, as more is learned, the MVPs go from low fidelity to higher fidelity, but the goal continues to be to maximize learning, not to build a beta/fully featured prototype of the product.

A major improvement over Waterfall development, Build Measure Learn lets startups be fast, agile and efficient.

The three-circle diagram of Build Measure Learn is a good approximation of the process. Unfortunately, using the word “build” first often confuses people. The diagram does seem to imply build stuff and throw it out of the building. A more detailed version of the Build Measure Learn diagram helps to clarify the meaning by adding three more elements: Ideas-Build-Code-Measure-Data-Learn.

The five-part version of the Build Measure Learn diagram helps us see that the real intent of building is to test “ideas” – not just to build blindly without an objective. The circle labeled “code” could easily be labeled “build hardware” or “build artificial genome.” The circle labeled “data” indicates that after we measure our experiments we’ll use the data to further refine our learning. And the new learning will influence our next ideas. So we can see that the goal of Build-Measure-Learn isn’t just to build things; the goal is to build things to validate or invalidate the initial idea.

The focus on testing specific ideas counters the concern that Build-Measure-Learn is just throwing things against the wall to see if they work.

But it’s still not good enough. We can now do better.

Start With Hypotheses
What Build-Measure-Learn misses is that new ventures (both startups and new ideas in existing companies) don’t start with “ideas” – they start with hypotheses (a fancy word for guesses). It’s important to understand that the words “idea” and “hypothesis” mean two very different things. For most innovators the word “idea” conjures up an insight that immediately requires a plan to bring it to fruition. In contrast, a hypothesis means we have an educated guess that requires experimentation and data to validate or invalidate.

These hypotheses span the gamut from who’s the customer(s), to what’s the value proposition (product/service features), pricing, distribution channel, and demand creation (customer acquisition, activation, retention, etc.)

That the Lean Startup begins with acknowledging that your idea is simply a series of untested hypotheses is a big idea. It’s a really big idea because what you build needs to match the hypothesis you want to test.

The minimum viable product you’ll need to build to find the right customers is different from the minimum viable product you need for testing pricing, which is different from an MVP you would build to test specific product features. And all of these hypotheses (and minimal viable products) change over time as you learn more. So instead of Build-Measure-Learn, the diagram for building minimal viable products in a Lean Startup looks like Hypotheses – Experiments – Tests – Insights.
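One way to picture this cycle is in code. The Python sketch below is our illustration, not a formal part of the method; the hypothesis, MVP and pass criterion are all invented for the example.

```python
from dataclasses import dataclass
from typing import Callable, List

@dataclass
class Hypothesis:
    canvas_box: str   # which canvas box the guess came from
    statement: str    # stated so that evidence can invalidate it
    mvp: str          # the simplest artifact that can test it

def run_cycle(h: Hypothesis, evidence: List[bool],
              passes: Callable[[List[bool]], bool]) -> str:
    """One turn of Hypotheses -> Experiments -> Tests -> Insights."""
    validated = passes(evidence)  # Test: compare the evidence to a pass criterion
    # Insight: iterate, pivot, or proceed is a founder's call, informed by the data
    return "validated - proceed" if validated else "invalidated - iterate or pivot"

# Hypothetical example: a pricing hypothesis tested in 10 customer interviews
h = Hypothesis(
    canvas_box="revenue streams",
    statement="Customers will pay $99/month for the service",
    mvp="a one-page pricing sheet shown during interviews",
)
evidence = [True] * 4 + [False] * 6  # 4 of 10 customers said they would pay
print(run_cycle(h, evidence, passes=lambda e: sum(e) >= 7))
```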

Generating Hypotheses
Using this new Hypotheses – Experiments – Tests – Insights diagram, the question then becomes, “What hypotheses should I test?” Luckily Alexander Osterwalder’s business model canvas presents a visual overview of the nine components of a business on one page. They are (an illustrative sketch follows the list):

  • value proposition, product/service the company offers (along with its benefits to customers)
  • customer segments, such as users and payers or moms or teens
  • distribution channels to reach customers and offer them the value proposition
  • customer relationships to create demand
  • revenue streams generated by the value proposition(s)
  • activities necessary to implement the business model
  • resources needed to make the activities possible
  • partners, the 3rd parties needed to make the activities possible
  • cost structure resulting from the business model

Business Model Canvas
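To connect the nine boxes to the cycle sketched above, here is a hypothetical example of seeding one initial hypothesis per canvas box; every statement is invented, not a template.

```python
# Illustrative only: one starting hypothesis per Business Model Canvas box.
initial_hypotheses = {
    "value proposition":      "Early hypothermia warning is divers' #1 unmet need",
    "customer segments":      "Dive units, not individual divers, make the buy decision",
    "distribution channels":  "We reach users through unit dive officers, not app stores",
    "customer relationships": "A live demo at a dive school is what gets/keeps/grows users",
    "revenue streams":        "Buyers will pay per unit, not per subscription",
    "key activities":         "Sensor calibration must be done in-house",
    "key resources":          "A dive-medicine advisor is a must-have resource",
    "key partners":           "A certified housing manufacturer is a required partner",
    "cost structure":         "Hardware dominates cost; software is marginal",
}
# Each entry becomes a Hypothesis to run through the cycle sketched earlier.
```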

And it brings us to the definition of a startup: A startup is a temporary organization designed to search for a repeatable and scalable business model.

Testing Hypotheses
And once these hypotheses fill the Business Model Canvas, how does an entrepreneur go about testing them? If you’re a scientist the answer is easy: you run experiments. The same is true in a Lean Startup. (The National Science Foundation described the Lean LaunchPad class as the scientific method for entrepreneurship.)

The Customer Development process is a simple methodology for taking new venture hypotheses and getting out of the building to test them. Customer discovery captures the founders’ vision and turns it into a series of business model hypotheses. Then it develops a series of experiments to test customer reactions to those hypotheses and turn them into facts. The experiments can be a series of questions you ask customers, but most often the questions are accompanied by a minimal viable product that helps potential customers understand your solution.

So another big idea here is that startups are not building minimal viable products to build a prototype; they are building minimal viable products to learn the most they can.


Finally, the goal of designing these experiments and minimal viable products is not to get data. The data is not the endpoint. Anyone can collect data. Focus groups collect data. This is not a focus group. The goal is to get insight. The entire point of getting out of the building is to inform the founder’s vision. The insight may come from analyzing customer responses, but it also may come from ignoring the data or realizing that what you are describing is a new, disruptive market that doesn’t exist, and that you need to change your experiments from measuring specifics to inventing the future.

Lessons Learned

  • Build, Measure, Learn is a great improvement over Waterfall product development and provided the framework to truly join the customer to agile development
  • However, emphasizing “Build” or “Ideas” as the first step misses the key insight about a Lean Startup – you are starting with hypotheses to be tested and are searching for a repeatable and scalable business model
  • Hypotheses, Experiments, Tests, Insights better represents the Lean Startup process:
    • Use the Business Model Canvas to frame hypotheses, Customer Development to get out of the building to test hypotheses, and Agile Engineering to build the product iteratively and incrementally