In the early stages of a startup your hypotheses about all the parts of your business model are your profound beliefs. Think of profound beliefs as “strong opinions loosely held.”
You can’t be an effective founder or in the C-suite of a startup if you don’t hold any.
Here’s how I learned why they were critical to successful customer development.
I was an aggressive, young, and very tactical VP of Marketing at Ardent, a supercomputer company – and I really didn’t have a clue about the relationship between profound beliefs, customer discovery and strategy.
One day the CEO called me into his office and asked, “Steve I’ve been thinking about this as our strategy going forward. What do you think?” And he proceeded to lay out a fairly complex and innovative sales and marketing strategy for our next 18 months. “Yeah, that sounds great,” I said. He nodded and then offered up, “Well what do you think of this other strategy?” I listened intently as he spun an equally complex alternative strategy. “Can you pull both of these off?” he asked looking right at me. By the angelic look on his face I should have known that I was being set up. I replied naively, “Sure, I’ll get right on it.”
Ambushed
Decades later I still remember what happened next. All of a sudden the air temperature in the room dropped by about 40 degrees. Out of nowhere the CEO started screaming at me, “You stupid x?!x. These strategies are mutually exclusive. Executing both of them would put us out of business. You don’t have a clue about what the purpose of marketing is, because all you are doing is giving engineering a list of feature requests and executing a series of tasks like they’re a big To Do list. Without understanding why you’re doing them, you’re dangerous as the VP of Marketing. In fact you’re just a glorified head of marketing communications. You have no profound beliefs.”
I left in a daze, angry and confused. There was no doubt my boss was a jerk, but I didn’t understand the point. I was a great marketer. I was getting feedback from customers, and I’d pass on every list of what customers wanted to engineering and tell them that’s the features our customers needed. I could implement any marketing plan handed to me regardless of how complex. In fact I was implementing three different ones. Oh…hmm… perhaps I was missing something.
I was executing a lot of marketing “things” but why was I doing them? The CEO was right. I had approached my activities as simply a task-list to get through. With my tail between my legs I was left to ponder: What was the function of marketing in a startup? And more importantly, what was a profound belief and why was it important?
Hypotheses about Your Business Model = Your Profound Beliefs Loosely Held
Your hypotheses about all the parts of your business model are your profound beliefs. Think of them as strong opinions loosely held. You can’t be an effective founder or in the C-suite if you don’t have any.
The whole role of customer discovery and validation outside your building is to inform your profound beliefs. By inform I mean use the evidence you gather outside the building to either validate, invalidate, or modify your beliefs/hypotheses. Specifically, which beliefs and hypotheses? Start with those around product/market fit – who are your customers and what features do they want? Who are the payers? Then march through the rest of the business model. What price will they pay? What role do regulators play? Etc. The best validation you can get is an order. (BTW, if you’re creating a new market, it’s even OK to ignore customer feedback but you have to be able to articulate why.)
The reality of a startup is that on day one most of your beliefs/hypotheses are likely wrong. However, you will be informed by those experiments outside the building, and data from potential customers, partners, regulators, et al will modify your vision over time.
It’s helpful to diagram the relationship between hypotheses/beliefs and customer discovery. (See the diagram.)
If you have no beliefs and haven’t gotten out of the building to gather evidence, then your role inside a new venture is neutral. You act as a tactical implementer and add no insight or value to product development.
If you’ve gotten out of the building to gather evidence but have no profound beliefs to guide your inquiries, then your role inside a new venture is negative. You’ll collect a laundry-list of customer feature requests and deliver them to product development, without any insight. This is essentially a denial of service attack on engineering’s time. (I was mostly operating in this box when I got chewed out by our CEO.)
The biggest drag on a startup is those who have strong beliefs but haven’t gotten out of the building to gather evidence. Meetings become opinion contests and those with the loudest voices (or worse “I’m the CEO and my opinion matters more than your facts”) dominate planning and strategy. (They may be right, but Twitter/X is an example where Elon is in the box on the bottom right of the diagram.)
The winning combination is strong beliefs that are validated or modified by evidence gathered outside the building. These are “strong opinions loosely held.”
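The diagram itself isn’t reproduced here, but the four quadrants described above can be captured in a small sketch. This is purely illustrative – the function and labels below are my paraphrase of the post, not part of the original diagram:

```python
# Illustrative sketch of the beliefs-vs-evidence 2x2 described above.
# Keys are (has_profound_beliefs, gathered_evidence_outside_the_building).
ROLES = {
    (False, False): "Neutral - tactical implementer, adds no insight or value",
    (False, True):  "Negative - feature-list collector, a denial-of-service attack on engineering",
    (True,  False): "Drag - opinion contests, loudest voice (or the CEO) wins",
    (True,  True):  "Winning - strong opinions loosely held",
}

def founder_role(has_profound_beliefs: bool, gathered_evidence: bool) -> str:
    """Return the quadrant a founder or executive falls into."""
    return ROLES[(has_profound_beliefs, gathered_evidence)]

print(founder_role(True, True))   # Winning - strong opinions loosely held
print(founder_role(False, True))  # Negative - feature-list collector, ...
```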
Strategy is Not a To Do List, It Drives a To Do List
It took me a while, but I began to realize that the strategic part of my job was to recognize that (in today’s jargon) we were still searching for a scalable and repeatable business model. Therefore my job was to:
Articulate the founding team’s strong beliefs and hypotheses about our business model
Do an internal check-in to see if a) the founders were aligned and b) if I agreed with them
Get out of the building and test our strong beliefs and hypotheses about who were potential customers, what problems they had and what their needs were
Test product development’s/engineering’s beliefs about customer needs with customer feedback
When we found product/market fit, marketing’s job was to put together a strategy/plan for marketing and sales. That should be easy. If we did enough discovery, customers would have told us what features were important to them, how we compared to competitors, how we should set prices, and how to best sell to them
Once I understood the strategy, the tactical marketing To Do list (website, branding, pr, tradeshows, white papers, data sheets) became clear. It allowed me to prioritize what I did, when I did it and instantly understand what would be mutually exclusive.
Lessons Learned
Profound beliefs are your hypotheses about all the parts of your business model
No profound beliefs but lots of customer discovery ends up as a feature list collection which is detrimental to product development
Profound beliefs but no customer discovery ends up as opinion contests and those with the loudest voices dominate
The winning combination is strong beliefs that are validated or modified by evidence gathered outside the building. These are “strong opinions loosely held.”
I just saw the movie Oppenheimer. A wonderful movie on multiple levels.
But the Atomic Bomb story that starts at Los Alamos with Oppenheimer and General Groves misses the fact that from mid-1940 to mid-1942 it was Vannevar Bush (and his number 2, James Conant, the president of Harvard) who ran the U.S. atomic bomb program and laid the groundwork that made the Manhattan Project possible.
Here’s the story.
During World War II, the combatants (Germany, Britain, the U.S., Japan, Italy, and the Soviet Union) made strategic decisions about what types of weapons to build (tanks, airplanes, ships, submarines, artillery, rockets), what was the right mix (aircraft carriers, fighter planes, bombers, light/medium/heavy tanks, etc.) and how many to build.
But only one country – the U.S. — succeeded in building nuclear reactors and nuclear weapons during the war, moving from atomic theory and lab experiments to actually deploying nuclear weapons in a remarkable 3 years.
Three reasons unique to the U.S. made this possible:
Passionate advocacy by émigré and U.S. physicists, who feared that the Nazis would get an atomic bomb, before the government became involved.
A Presidential Science Advisor who created a civilian organization for building advanced weapons systems, funded and coordinated atomic research, then convinced the president to authorize an atomic bomb program and order the Army to build it.
The commitment of U.S. industrial capacity and manpower to the atomic bomb program as the No. 1 national priority.
The Atom Splits
In December 1938 scientists in Nazi Germany reported a new discovery – that the uranium atom split (fissioned) when hit with neutrons. Other scientists calculated that splitting the uranium atom released an enormous amount of energy.
Fear and Einstein
Once it became clear that in theory a single bomb with enormous destructive potential was possible, it’s hard to overstate the existential dread, fear, and outright panic of U.S. and British émigré physicists – many of them Jewish refugees who had fled Germany and occupied Europe. In the 1920s and ’30s, Germany was the world center of advanced physics and the home of many first-class scientists. After seeing firsthand the terror of Nazi conquest, these physicists understood all too well what an atomic bomb in the hands of the Nazis would mean. They assumed that German scientists had the know-how and capacity to build an atomic bomb. This was so concerning that physicists convinced Albert Einstein in August 1939 to write to President Roosevelt pointing out the potential of an atomic weapon and the risk of the bomb in German hands.
Motivated by fear of a Nazi atomic bomb, for the next two years scientists in the U.S. lobbied, pushed and worked at a frantic speed to get the government engaged, believing they were in a race with Nazi Germany to build a bomb.
After Einstein’s letter, Roosevelt appointed an Advisory Committee on Uranium. In early 1940 the Committee recommended that the government fund limited research on Uranium isotope separation. It spent $6,000.
Vannevar Bush Takes Over – National Defense Research Committee (NDRC)
European émigré physicists (Einstein, Fermi, Szilard, and Teller) and Ernest Lawrence at Berkeley were frustrated at the pace the Advisory Committee on Uranium was moving. As theorists, they thought it was clear an atomic bomb could be built. They wanted the U.S. government to aggressively fund atomic research, so that the U.S. could build an atomic bomb before the Germans had one.
They weren’t alone in their frustration with the U.S. approach to advanced weapons – and not just atomic bombs.
In June 1940 Vannevar Bush, ex-MIT dean of engineering, and a group of the country’s top science and research administrators (Harvard President James Conant, Bell Labs President and head of the National Academy of Sciences Frank Jewett, and Caltech Dean Richard Tolman) all felt that there was a huge disconnect. The U.S. military had little idea of what science could provide in the event of war, and scientists were wholly in the dark as to what the military needed. As a result, they believed the U.S. was woefully unprepared and ill-equipped for a war driven by technology.
This group engineered a massive end run around the existing Army and Navy Research and Development labs. Bush and others believed that advanced weapons could be created better and faster if they could be designed by civilian scientists and engineers in universities and companies.
The scientists drafted a one-page plan for a National Defense Research Committee (NDRC). The NDRC would look for new technologies that the military labs weren’t working on (radar, proximity fuses, and anti-submarine warfare). (At first, atomic weapons weren’t even on their list.)
In June 1940 Bush got Roosevelt’s approval for the NDRC. In a masterful bureaucratic sleight of hand the NDRC sat in the newly created Executive Office of the President (EOP), where it got its funding and reported directly to the president. This meant that the NDRC didn’t need legislation or a presidential executive order. More importantly, it could operate without congressional or military oversight.
Roosevelt’s decision gave the United States an 18-month head start for employing science in the war effort.
The NDRC was divided into five divisions and one committee, each run by a civilian director and each having a number of sections. (See the diagram below.)
Bush became chairman of the NDRC and the first U.S. Presidential Science Advisor, systematically applying science to develop advanced weapons. The U.S., alone among the Axis and Allied powers, now had a science advisor who reported directly to the president and had the charter and budget to fund advanced weapons systems research – outside the confines of the Army or Navy.
The NDRC was run by science administrators who had managed university researchers as well as complex research and applied engineering projects before. They took input from theorists, experimental physicists, and industrial contractors, and were able to weigh the advice they were receiving. They understood the risks, scale and resources needed to turn blackboard theory into deployed weapons. Equally important, they weren’t afraid to make multiple bets on a promising technology, nor were they afraid to kill projects that seemed like dead ends for the war effort.
200+ Contracts
Prior to mid-1940, research in U.S. universities was funded by private foundations or companies. There was no government funding. The NDRC changed that. With a budget of $10,000,000 to fund research proposed by the five division chairmen, the NDRC funded 200+ contracts for research in radar, physics, optics, chemical engineering, and atomic fission.
For the first time ever, U.S. university researchers were receiving funding from the U.S. government. (It would never stop.)
The Uranium Committee
In addition to the five NDRC divisions working on conventional weapons, the NDRC took over the moribund standalone Uranium Committee and made it a scientific advisory board reporting directly to Bush. The goal was to understand whether the theory of an atomic weapon could be turned into a practical weapon. Now the NDRC could directly fund research scientists to investigate ways to separate U-235 to make a bomb.
What Didn’t Work at the NDRC?
After a year, it was clear to Bush that while the NDRC was funding advanced research, the military wasn’t integrating those inventions into weapons. The NDRC had no authority to build and acquire weapons. Bush decided what he needed was a way to bypass traditional Army and Navy procurement processes and get those advanced weapons built.
Read the sidebars for background.
The Office of Scientific Research and Development Stands Up
In May 1941 Bush went back to President Roosevelt, this time with a more audacious request: turn the NDRC into an organization that not only funded research but built prototypes of new advanced weapons and had the budget and authority to write contracts with industry to build these weapons at scale. In June 1941 Roosevelt agreed and signed the Executive Order creating the Office of Scientific Research and Development (OSRD). (It’s worth reading the Executive Order here to see the extraordinary authority he gave OSRD.)
OSRD expanded the National Defense Research Committee’s (NDRC) original five divisions into 19 weapons divisions, five research committees and a medical portfolio. Each division managed a broad portfolio of projects from research to production and deployment. Its organization chart is shown below.
These divisions spearheaded the development of an impressive array of advanced weapons including radar, rockets, sonar, the proximity fuse, Napalm, the Bazooka and new drugs such as penicillin and cures for malaria.
The OSRD was a radical experiment. Instead of the military controlling weapons development Bush was now running an organization where civilian scientists designed and built advanced weapons systems. Nearly 10,000 scientists and engineers received draft deferments to work in these labs.
As a harbinger of much bigger things, the NDRC Uranium Committee was enlarged and renamed the S-1 Section on Uranium.
Throughout the next year the pace of atomic research picked up. And Bush’s involvement in launching the U.S. nuclear weapons program would grow larger.
By the middle of 1941 Bush was beginning to believe that building an atomic bomb was possible. But he felt he did not have enough evidence to suggest to the president that the country commit to the massive engineering effort to build the bomb.
Then the MAUD report from the British arrived.
The British Nuclear Weapons Program codenamed “Tube Alloys” and the MAUD Report
Meanwhile in the UK, British nuclear physicists had not only concluded that building an atomic bomb was feasible, but they had calculated the size of the industrial effort needed. In March 1940 scientists had told the British government that nuclear weapons could be built.
In June 1940 the UK formed the MAUD Committee to study the possibility of developing a nuclear weapon. A year later they had their answer: the July 1941 MAUD Committee report, “Use of Uranium for a Bomb,” said that it was possible to build a bomb from uranium using gaseous diffusion on a massive scale to produce uranium-235. It kick-started the UK’s own nuclear weapons program, called Tube Alloys. (Read the MAUD report here.)
They delivered their report to Vannevar Bush in July 1941. And it changed everything.
Bush is Convinced by the MAUD Report
The MAUD Report finally pushed Bush over the edge. The British report showed how it was possible to build an atomic bomb. The fact that the British were independently saying what passionate advocates like Lawrence, Fermi, et al were saying convinced Bush that an atomic bomb program was worth investing in at the scale needed.
For a short period of time in 1941 the UK was ahead of the U.S. in thinking about how to weaponize uranium, but British officials dithered on approaching the U.S. for a full nuclear partnership. By mid-1942, when the British realized their industrial capacity was stretched too thin and they couldn’t build the uranium separation plants and the bomb alone during the war, the Manhattan Project was scaling up and the U.S. had no need for the UK.
The UK would play a minor role in the Manhattan project.
Bush Tells Roosevelt – We Can Build an Atomic Bomb
In October 1941, Bush told the President about the British MAUD report conclusions: the bomb’s uranium core might weigh twenty-five pounds, its explosive power might equal eighteen hundred tons of TNT, but to separate the U-235 they would need to build a massive industrial facility. The President asked Bush to work with the Army Corps of Engineers to figure out what type of plant to build, how to build it and how much it would cost.
A month later, in November 1941 the U.S. National Academy of Sciences confirmed to Bush that the British MAUD report conclusions were correct.
Bush now had all the pieces lined up to support an all-out effort to develop an atomic bomb.
December 1941 – Let’s Build an Atomic Bomb
In December 1941, the day before the Japanese attack on Pearl Harbor, the atomic bomb program was placed under Vannevar Bush. He renamed the Uranium program the S-1 Committee of OSRD.
In addition to overseeing the 19 Divisions of OSRD, Bush’s new responsibility was to coordinate all the moving parts of the atomic bomb program – the research, the lab experiments, and now the beginning of construction contracts.
With the President’s support, Bush reorganized the program to take it from research to a weapons program. The goal now was to find the best ways to produce uranium-235 and plutonium in large quantities. He appointed Harold Urey at Columbia to lead the gaseous diffusion and centrifuge methods and heavy-water studies. Ernest Lawrence at Berkeley took electromagnetic and plutonium responsibilities, and Arthur Compton at Chicago ran the chain reaction and weapons theory programs. This team proposed to begin building pilot plants for all five methods of separating U-235 before they were proven. Bush and Conant agreed and sent the plan to the President, Vice President, and Secretary of War, suggesting the Army Corps of Engineers build these plants.
With the U.S. now at war with Germany and Japan, the race to build the bomb was on.
In January 1942, Compton made Oppenheimer responsible for fast neutron research at Berkeley. This very small part of the atomic bomb program is the first time Oppenheimer was formally engaged in atomic bomb work.
Enter the Army
The Army began attending OSRD S-1 (the Atomic Bomb group) meetings in March 1942. Bush told the President that by the summer of 1942 the Army should be authorized to build full-scale plants.
Build the U-235 Separation and Plutonium Plants
By May 1942 it was still unclear which U-235 separation method would work and what was the right way to build a nuclear reactor to make Plutonium, so the S-1 Committee recommended – build all of them. Build centrifuge, electromagnetic separation, and gaseous diffusion plants as fast as possible; build a heavy water plant for the nuclear reactors as an alternative to graphite; build reactors to produce plutonium; and start planning for large-scale production and select the site(s). The S-1 Committee also recommended the Army be in charge of building the plants.
Meanwhile that same month, Oppenheimer was made the “Coordinator of Rapid Rupture.” He headed up a group of theorists working with experimentalists to calculate how many pounds of U-235 and Plutonium were needed for a bomb.
The Manhattan Engineering District – The Atomic Program Moves to the Army
In June 1942, the president approved Bush’s plan to hand building the bomb over to the Army. The Manhattan Engineering District became the new name for the U.S. atomic bomb program. General Groves was appointed its head in September 1942.
To everyone’s surprise Groves selected Oppenheimer to administer the program. It was a surprise because up until then Oppenheimer had been a theoretical physicist, not an experimentalist, nor had he ever run or managed any programs.
Groves and Oppenheimer decided that in addition to the massive production facilities – U-235 in Oak Ridge, TN, and Plutonium in Hanford, WA – they would need a central laboratory to design the bomb itself. This would become Los Alamos. And Oppenheimer would head that lab, bringing together a diverse set of theorists, experimental physicists, explosives experts, chemists, and metallurgists.
Bush, Conant and Groves at the Plutonium production site at Hanford – July 1945
At its peak in mid-1944 130,000 people were working on the Manhattan Project; 5,000 of them worked at Los Alamos.
Vannevar Bush would be present at the test of the Plutonium weapon at the Trinity test site in July 1945.
The OSRD would be the organization that made the U.S. the leader in 20th century research. At the end of World War II, Bush laid out his vision for future U.S. support of research in a report called “Science, The Endless Frontier.” OSRD was disbanded in 1947, but after a long debate it was resurrected in pieces. The National Science Foundation, the National Institutes of Health, the Atomic Energy Commission, and ultimately NASA and DARPA would all spring from its roots.
Fifty years before it happened, Bush described what would become the internet in a 1945 article called “As We May Think.”
Summary
By the time Oppenheimer and Groves took over the Atomic Bomb program, Vannevar Bush had been running it for two years
The U.S. atomic bomb program was the sum of multiple small decisions guided by OSRD and a Presidential science advisor – Vannevar Bush
Bush’s organizations kick-started the program. The NDRC invested (in 2023 dollars) $10M in nuclear research; OSRD put in another $250M for nuclear experiments
The Manhattan project would ultimately cost ~$40 billion to build the two bombs.
With the country in a crisis, decisions were made in days/weeks by small groups with the authority to move with speed and urgency.
Large-scale federal funding for science research in U.S. universities started with the Office of Scientific Research and Development (OSRD) – more to come in subsequent posts
I just spent a month and a half at Imperial College London co-teaching a “Wicked” Entrepreneurship class. In this case Wicked doesn’t mean morally evil, but refers to really complex problems, ones with multiple moving parts, where the solution isn’t obvious. (Understanding and solving homelessness, disinformation, climate change mitigation or an insurgency are examples of wicked problems. Companies also face Wicked problems. In contrast, designing AI-driven enterprise software or building dating apps are comparatively simple problems.)
I’ve known Professor Cristobal Garcia since 2010, when he hosted my first visit to Catholic University in Santiago, Chile and to southern Patagonia. Now at Imperial College Business School and Co-Founder of the Wicked Acceleration Labs, Cristobal and I wondered if we could combine the tenets of Lean (get out of the building, build MVPs, run experiments, move with speed and urgency) with the expanded toolset developed by researchers who work on Wicked problems and Systems Thinking.
Our goal was to see if we could get students to stop admiring problems and work rapidly on solving them. As Wicked and Lean seem to be mutually exclusive, this was a pretty audacious undertaking.
This five-week class was going to be our MVP.
Here’s what happened.
Finding The Problems
Professor Garcia scoured the world to find eight Wicked/complex problems for students to work on. He presented to organizations in the Netherlands, Chile, Spain, the UK (Ministry of Defense and the BBC), and aerospace companies. The end result was a truly ambitious, unique, and international set of curated Wicked problems.
Increasing security and prosperity amid the Mapuche conflict in the Araucania region of Chile
We held an info session explaining the problems and the unique parts of the class. We were going to share with the students a “Swiss Army Knife” of traditional tools to understand Wicked/complex problems, but they were not going to research these problems in the library. Instead, using the elements of the Lean methodology, they were going to get out of the building and observe the problems first-hand. And instead of passively observing them, they were going to build and test MVPs. All in six weeks.
50 students signed up to work on the eight problems with different degrees of “wickedness”.
Imperial Wicked Problems and Systems Thinking – 2023 Class
The Class
The pedagogy of the class (our teaching methods and the learning activities) was similar to all the Lean/I-Corps and Hacking for Defense classes we’ve previously taught. This meant the class was team-based, Lean-driven (hypothesis testing/business model/customer development/agile engineering) and experiential – where the students, rather than being presented with all of the essential information, must discover that information rapidly for themselves.
The teams were going to get out of the building and talk to 10 stakeholders a week. Then each week every team would present: 1) here’s what we thought, 2) here’s what we did, 3) here’s what we learned, 4) here’s what we’re going to do this week.
More Tools
The key difference between this class and previous Lean/I-Corps and Hacking for Defense classes was that Wicked problems required more than just a business model or mission model to grasp the problem and map the solution. Here, to get a handle on the complexity of their problem the students needed a suite of tools – Stakeholder Maps, Systems Maps, Assumptions Mapping, Experimentation Menus, an Unintended Consequences Map, and finally Dr. Garcia’s derivative of Alexander Osterwalder’s Business Model Canvas – the Wicked Canvas – which added to the traditional canvas the concept of unintended consequences and the “sub-problems” seen from the different stakeholders’ perspectives.
During the class the teaching team offered explanations of each tool, but the teams got a firmer grasp on Wicked tools from a guest lecture by Professor Terry Irwin, Director of the Transition Design Institute at Carnegie Mellon (see her presentation here). Throughout the class teams had the flexibility to select the tools they felt were appropriate to rapidly gain a holistic understanding of each wicked problem while still developing a minimum viable product to address and experiment with it.
Class Flow
Week 1
What is a simple idea? What are big ideas and Impact Hypotheses?
Characteristics of each. Rewards, CEO, team, complexity, end point, etc.
What is unique about Wicked Problems?
Beyond TAM and SAM (“back of the napkin”) for Wicked Problems
You need Big Ideas to tackle Wicked Problems: but who does it?
Startups vs. Large Companies vs. Governments
Innovation at Speed for Horizon 1, 2 and 3 (Managing the Portfolio across Horizons)
What is Systems Thinking?
How to map stakeholders and systems dynamics?
Customer & Stakeholder Discovery: getting outside the building, city and country: why and how?
Mapping the Problem(s), Stakeholders and Systems – Wicked Tools
Week 2
Teams present for 6 min and receive 4 mins feedback
The Wicked Swiss Army Knife for the week: Mapping Assumptions Matrix, unintended consequences and how to run and design experiments
Prof Erkko Autio (ICBS and Wicked Labs) on AI Ecosystems and Prof Peter Palensky (TU Delft) on Smart Grids, Decarbonization and Green Hydrogen
Lecture on Minimal Viable Products (MVPs) and Experiments
Homework: getting outside the building & the country to run experiments
Assumption Mapping and Experimentation Type – Wicked Tools
Week 3
Teams present in 6 min and receive 4 mins feedback
The Wicked Swiss Army Knife for the week: from problem to solution via “How Might We…” Builder and further initial solution experimentation
On Canvases: What, Why and How
The Wicked Canvas
Next Steps and Homework: continue running experiments with MVPs and start validating your business/mission/wicked canvas
The Wicked Canvas – Wicked Tools
Experimentation Design and How Might We… – Wicked Tools
Week 4
Teams present in 6 min and receive 5 mins feedback
Wicked Business Models – validating all building blocks
The Geography of Innovation – the milieu, creative cities & prosperous regions
How World War II and the UK Started Silicon Valley
The Wicked Swiss Tool – maps for acupuncture in the territory
Storytelling & Pitching
Homework: Validated MVP & Lessons learned
Acupuncture Map for Regional System Intervention – Wicked Tools
Week 5
Teams presented their Final Lessons Learned journey – Validated MVP, Insights & Hindsight (see the presentations at the end of the post.)
What did we understand about the problem on day 1?
What do we now understand?
How did we get here?
What solutions would we propose now?
What did we learn?
Reflections on the Wicked Tools
Results
To be honest, I wasn’t sure what to expect. We pushed the students way past what they had done in other classes. In spite of what we said in the info session and syllabus, many students were in shock when they realized that they couldn’t take the class by just showing up, and heard in no uncertain terms that doing no stakeholder/customer interviews in week 1 was unacceptable.
Yet everyone got the message pretty quickly. The team working on the Mapuche conflict in the Araucania region of Chile flew to Chile from London, interviewed multiple stakeholders and were back in time for the next week’s class. The team working to turn the Basque Country in Spain into an AI hub did the same – they flew to Bilbao and interviewed several stakeholders. The team working on Green Hydrogen got connected to the Rotterdam ecosystem and key stakeholders in the Port, energy incumbents, VCs and tech universities. The team working on Ukraine did not fly there for obvious reasons. The rest of the teams spread out across the UK – all of them furiously mapping stakeholders, assumptions, systems, etc., while proposing minimal viable solutions. By the end of the class it was a whirlwind of activity as students not only presented their progress but saw that of their peers. No one wanted to be left behind. They all moved with speed and alacrity.
Lessons Learned
Our conclusion? While this class is not a substitute for a years-long deep analysis of Wicked/complex problems, it gave students:
a practical hands-on introduction to tools to map, sense, understand and potentially solve Wicked Problems
the confidence and tools to stop admiring problems and work on solving them
I think we’ll teach it again.
Team final presentations
The team’s final lessons learned presentations were pretty extraordinary, only matched by their post-class comments. Take a look below.
Today, the U.S. is supporting a proxy war with Russia while simultaneously attempting to deter a China cross-strait invasion of Taiwan. Both are wakeup calls that victory and deterrence in modern war will be determined by a state’s ability to both use traditional weapons systems and simultaneously rapidly acquire, deploy, and integrate commercial technologies (drones, satellites, targeting software, et al) into operations at every level.
Ukraine’s military is not burdened with the DoD’s 65-year-old acquisition process and 20th-century operational concepts. It is learning and adapting on the fly. China has made the leap to a “whole of nation” approach. This has allowed the People’s Liberation Army (PLA) to integrate private capital and commercial technology and use them as a force multiplier to dominate the South China Sea and prepare for a cross-strait invasion of Taiwan.
The DoD has not done either of these. It is currently organized and oriented to execute traditional weapons systems and operational concepts with its traditional vendors and research centers but is woefully unprepared to integrate commercial technologies and private capital at scale.
Copying SecDef Ash Carter’s 2015 strategy, China has been engaged in Civil/Military Fusion, employing a whole-of-government coordinated effort to harness these disruptive commercial technologies for its national security needs. To fuel the development of technologies critical for defense, China has tapped into $900 billion of private capital in Civil/Military Guidance (Investment) Funds and has taken state-owned enterprises public to fund new shipyards, aircraft, and avionics. Worse, China will learn from and apply the lessons of Russia’s failures in Ukraine at an ever-increasing pace.
But unlike America’s arch strategic rival, the US to date has been unwilling and unable to adapt and adopt new systems and operational concepts at the speed of our adversaries. These include attritable systems, autonomous systems, swarms, and other emerging defense platforms that threaten legacy systems, incumbent vendors, organizations, and cultures. (Until today, the U.S. effort was stillborn, with half-hearted support of its own Defense Innovation Unit and a history of lost capabilities like those that were inherent in the US Army’s Rapid Equipping Force.)
Viewing the DoD budget as a zero-sum game has turned the major defense primes and K Street lobbyists into saboteurs of the DoD organizational innovations that threaten their business models. Using private capital could be a force multiplier, adding hundreds of billions of dollars outside the DoD budget. Today, private capital is disincentivized from participating in national security, and incentives are aligned to ensure the U.S. military is organized and configured to fight and win the wars of the last century. The U.S. is on a collision course to experience catastrophic failure in a future conflict because of it. Only Congress can alter this equation.
For the U.S. to deter and prevail against China the DoD must create both a strategy and a redesigned organization to embrace those untapped external resources – private capital and commercial innovation. Currently the DoD lacks a coherent plan and an organization with the budget and authority to do so.
A reorganized and refocused DoD could acquire traditional weapons systems while simultaneously rapidly acquiring, deploying, and integrating commercial technologies. It would create a national industrial policy that incentivizes the development of 21st-century shipyards, drone and satellite factories and a new industrial base along the lines of the CHIPS and Innovation and Competition acts.
Congress must act to identify and implement changes within the DoD needed to optimize its organization and structure. These include:
Create a new defense ecosystem that uses the external commercial innovation ecosystem and private capital as a force multiplier. Leverage the expertise of prime contractors as integrators of advanced technology and complex systems, refocus Federally Funded Research and Development Centers (FFRDCs) on areas not covered by commercial tech (kinetics, energetics, nuclear and hypersonics).
Reorganize DoD Research and Engineering. Allocate its budget and resources equally between traditional sources of innovation and new commercial sources of innovation and capital. Split the OSD R&E organization in half. Keep the current organization focused on the status quo. Create a peer organization – the Under Secretary of Defense for Commercial Innovation and Private Capital.
Scale up the new Office of Strategic Capital (OSC) and the Defense Innovation Unit (DIU) to be the lead agencies in this new organization. Give them the budget and authority to do so and provide the services the means to do the same.
Reorganize DoD Acquisition and Sustainment. Allocate its budget and resources equally between traditional sources of production and the creation of new 21st-century arsenals – new shipyards, drone manufacturers, etc. – that can make thousands of low-cost, attritable systems.
Coordinate with Allies. Expand the National Security Innovation Base (NSIB) to an Allied Security Innovation Base. Source commercial technology from allies.
Why Is It Up To Congress?
National power is ephemeral. Nations decline when they lose allies, economic power, or interest in global affairs, experience internal/civil conflicts, or miss disruptive technology transitions and new operational concepts.
The case can be made that all of these have happened or are happening to the U.S.
There is historical precedent for Congressional action to ensure the DoD is organized to fight and win our wars. The 1986 Goldwater/Nichols Act laid the foundation for conducting coordinated and effective joint operations by reorganizing the roles of the military services and the Joint Chiefs, and creating the Joint Staff and the combatant commands. Congress must take Ukraine and China’s dominance in the South China Sea as a call to action and immediately establish a commission to determine what reforms and changes are needed to ensure the U.S. can fight and win our future wars.
While parts of the DoD understand we’re in a crisis to deter, or if that fails, win a war in the South China Sea, the DoD as a whole shows little urgency and misses a crucial point: China will not defer solving the Taiwan issue on our schedule. Russia will not defer its future plans for aggression to meet our dates. We need to act now.
We fail to do so at our peril and the peril of all those who depend on U.S. security to survive.
The world is very different now. For man holds in his mortal hands the power to abolish all forms of human poverty and all forms of human life.
John F. Kennedy
Humans have mastered lots of things that have transformed our lives, created our civilizations, and might ultimately kill us all. This year we’ve invented one more.
Artificial Intelligence has been the technology right around the corner for at least 50 years. Last year a set of specific AI apps caught everyone’s attention as AI finally crossed from the era of niche applications to the delivery of transformative and useful tools – Dall-E for creating images from text prompts, Github Copilot as a pair programming assistant, AlphaFold to calculate the shape of proteins, and ChatGPT 3.5 as an intelligent chatbot. These applications were seen as the beginning of what most assumed would be domain-specific tools. Most people (including me) believed that the next versions of these and other AI applications and tools would be incremental improvements.
We were very, very wrong.
This year with the introduction of ChatGPT-4 we may have seen the invention of something with the equivalent impact on society of explosives, mass communication, computers, recombinant DNA/CRISPR and nuclear weapons – all rolled into one application. If you haven’t played with ChatGPT-4, stop and spend a few minutes to do so here. Seriously.
At first blush ChatGPT is an extremely smart conversationalist (and homework writer and test taker). However, this is the first time ever that a software program has become human-competitive at multiple general tasks. (Look at the links and realize there’s no going back.) This level of performance was completely unexpected. Even by its creators.
In addition to its outstanding performance on what it was designed to do, what has surprised researchers about ChatGPT is its emergent behaviors. That’s a fancy term that means “we didn’t build it to do that and have no idea how it knows how to do that.” These are behaviors that weren’t present in the small AI models that came before but are now appearing in large models like GPT-4. (Researchers believe this tipping point is a result of the complex interactions between the neural network architecture and the massive amounts of training data it has been exposed to – essentially everything that was on the Internet as of September 2021.)
(Another troubling potential of ChatGPT is its ability to manipulate people into beliefs that aren’t true. While ChatGPT “sounds really smart,” at times it simply makes things up, and it can convince you of something even when the facts aren’t correct. We’ve seen this effect in social media when it was people who were manipulating beliefs. We can’t predict where an AI with emergent behaviors may decide to take these conversations.)
But that’s not all.
Opening Pandora’s Box
Until now ChatGPT was confined to a chat box that a user interacted with. But OpenAI (the company that developed ChatGPT) is letting ChatGPT reach out and interact with other applications through an API (an Application Programming Interface). On the business side that turns the product from an incredibly powerful application into an even more incredibly powerful platform that other software developers can plug into and build upon.
By exposing ChatGPT to a wider range of input and feedback through an API, developers and users are almost guaranteed to uncover new capabilities or applications for the model that were not initially anticipated. (The notion of an app being able to request more data and write code itself to do that is a bit sobering. This will almost certainly lead to even more new unexpected and emergent behaviors.) Some of these applications will create new industries and new jobs. Some will obsolete existing industries and jobs. And much like the invention of fire, explosives, mass communication, computing, recombinant DNA/CRISPR and nuclear weapons, the actual consequences are unknown.
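To make that concrete, here is a minimal sketch of what “plugging into” ChatGPT looks like, assuming the 2023-era openai Python package (pre-1.0 interface); the API key, model name, and prompt are placeholders, and this is an illustration rather than OpenAI’s official sample code:

```python
# Minimal sketch: another application calling ChatGPT through the API,
# using the 2023-era openai Python client (pre-1.0 interface).
# The API key, model name, and prompt below are placeholders.
import openai

openai.api_key = "YOUR_API_KEY"  # assumed: supply your own key

def summarize(text: str) -> str:
    """Send application data to the model and return its reply."""
    response = openai.ChatCompletion.create(
        model="gpt-4",
        messages=[
            {"role": "system", "content": "You are a concise assistant."},
            {"role": "user", "content": f"Summarize this in one sentence: {text}"},
        ],
    )
    return response.choices[0].message.content

print(summarize("Customer discovery means getting out of the building to test your hypotheses."))
```

Once a model is reachable this way, any product – a CRM, a code editor, a search engine – can embed it, which is what turns an application into a platform.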
Should you care? Should you worry?
First, you should definitely care.
Over the last 50 years I’ve been lucky enough to have been present at the creation of the first microprocessors, the first personal computers, and the first enterprise web applications. I’ve lived through the revolutions in telecom, life sciences, social media, etc., and watched as new industries, markets and customers were created literally overnight. With ChatGPT I might be seeing one more.
One of the problems about disruptive technology is that disruption doesn’t come with a memo. History is replete with journalists writing about it and not recognizing it (e.g. the NY Times putting the invention of the transistor on page 46) or others not understanding what they were seeing (e.g. Xerox executives ignoring the invention of the modern personal computer with a graphical user interface and networking in their own Palo Alto Research Center). Most people have stared into the face of massive disruption and failed to recognize it because to them, it looked like a toy.
Others look at the same technology and recognize at that instant the world will no longer be the same (e.g. Steve Jobs at Xerox). It might be a toy today, but they grasp what inevitably will happen when that technology scales, gets further refined and has tens of thousands of creative people building applications on top of it – they realize right then that the world has changed.
It’s likely we are seeing this here. Some will get ChatGPT’s importance instantly. Others will not.
Perhaps We Should Take A Deep Breath And Think About This?
A few people are concerned about the consequences of ChatGPT and other AGI-like applications and believe we are about to cross the Rubicon – a point of no return. They’ve suggested a 6-month moratorium on training AI systems more powerful than ChatGPT-4. Others find that idea laughable.
In 1974, molecular biologists were alarmed when they realized that newly discovered genetic editing tools (recombinant DNA technology) could put tumor-causing genes inside of E. coli bacteria. There was concern that without any recognition of biohazards and without agreed-upon best practices for biosafety, there was a real danger of accidentally creating and unleashing something with dire consequences. They asked for a voluntary moratorium on recombinant DNA experiments until they could agree on best practices in labs. In 1975, the U.S. National Academy of Sciences sponsored what is known as the Asilomar Conference. Here biologists came up with guidelines for lab safety containment levels depending on the type of experiment, as well as a list of prohibited experiments (cloning things that could be harmful to humans, plants and animals).
Until recently these rules have kept most biological lab accidents under control.
Nuclear weapons and genetic engineering both had advocates for unlimited experimentation, unfettered by controls. “Let the science go where it will.” Yet even the minimal controls that were adopted have kept the world safe from potential catastrophes for 75 years.
Goldman Sachs economists predict that 300 million jobs could be affected by the latest wave of AI. Other economists are just realizing the ripple effect that this technology will have. Simultaneously, new startups are forming, and venture capital is already pouring money into the field at an outstanding rate that will only accelerate the impact of this generation of AI. Intellectual property lawyers are already arguing who owns the data these AI models are built on. Governments and military organizations are coming to grips with the impact that this technology will have across Diplomatic, Information, Military and Economic spheres.
Now that the genie is out of the bottle, it’s not unreasonable to ask that AI researchers take 6 months and follow the model that other thoughtful and concerned scientists did in the past. (Stanford took down its version of ChatGPT over safety concerns.) Guidelines for use of this tech should be drawn up, perhaps paralleling the ones for genetic editing experiments – with Risk Assessments for the type of experiments and Biosafety Containment Levels that match the risk.
Unlike moratoriums of atomic weapons and genetic engineering that were driven by the concern of research scientists without a profit motive, the continued expansion and funding of generative AI is driven by for-profit companies and venture capital.
Let the discussion begin.
Data shows that pre-seed and seed startups with employees showing up in a physical office have 3½ times higher revenue growth than those that are solely remote.
During the pandemic, companies engaged in one of the largest unintended experiments in how to organize office work – remotely, in offices, or a hybrid of the two.
Post-pandemic, startups are still struggling with the best way to manage return-to-office issues – i.e. employees’ expectations of continuing to work remotely versus the best path to build and grow a profitable company.
Before we can ask which is the best configuration, the first question is: what, exactly, do we mean by “remote work” versus “office work”? Today work configurations span the spectrum from no office (fully remote, default digital) to some office (flexible hybrid, synchronized hybrid, office first) to office only.
James Kim at Reach Capital, an early-stage edtech investor, surveyed their portfolio of 37 companies using the following taxonomy of how virtual and physical work could be configured.
Using this model, James found that pre-seed and seed-stage startups that had employees returning to some type of office had 3½ times the revenue growth of startups that were fully remote. Those are staggeringly large differences, and while other factors may play some role (see “What Does This Mean?” below), the impact of the all-hands-on-deck approach can’t be ignored.
What might account for these differences? Not surprisingly, almost 90% of the responses from pre-seed/seed startups said team culture was influenced by work configuration. However, unexpectedly, self-reported team culture, eNPS (employee Net Promoter Score) and regrettable attrition – departures that hurt the company – are similar across work configurations.
So while employees said the team culture didn’t appear to change regardless of the office configuration, the performance of very early-stage startups (as measured by revenue growth) told a different story.
What Does This Mean?
The data is suggestive but not conclusive. See a full summary of the survey results here.
Let’s start with the data set. The survey sample size was 37 companies from the Reach Capital portfolio. That’s large enough to see patterns, but not large enough to generalize across all startups. Next, Reach Capital’s portfolio companies are in education and the future of work. The revenue results by workplace configuration may be different in other markets. Reach Capital’s investments are made in many regions including Brazil, so the geography is not limited to Silicon Valley.
Finally, office configuration is only one factor that might influence a startup’s growth rate. Still, the results are suggestive enough that other VCs might want to run the same survey across their portfolio companies and see if the results match.
(BTW, Nick Bloom at Stanford and others have done extensive research with thousands of people on remote and hybrid work here, and here. Their research is mostly focused on employees working on independent day-to-day tasks such as travel agents. However, we’re interested in the very specific subset of creative knowledge workers in the early stage of startups – specifically at the stage when startups are searching for product/market fit and a business model, not when they are executing day-to-day tasks.)
If the results appear elsewhere, then one can speculate why. Working from home may offer more distractions – chores, family, network issues. Do those little things add up to meaningful productivity differences?
Is it that in early-stage startups the random conversations between employees at unscheduled and unplanned times lead to better insights and ideas? And if so, is the productive brainstorming occurring inside of departments – e.g. engineer to engineer – or is it the cross-fertilization between departments – e.g. engineering to marketing?
Research since the 20th century has shown that informal face-to-face interaction is important for the coordination of group activities, maintaining company culture, and team building. This informal interaction gives employees access to new, non-redundant information through connections to different parts of an organization’s formal org chart and through connections to different parts of its informal communication network. In addition, research has found that creativity is greatly enhanced in a “small world” network – a network structure that is both highly locally clustered and often a hotbed of unscheduled fluid interactions that support innovation. In other words, inside an early-stage startup.
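For readers who haven’t met the term, here is a minimal, illustrative sketch of what “highly locally clustered with short paths” means, using the classic Watts–Strogatz small-world model from the open-source networkx library; it is my own illustration, not code from the research cited above:

```python
# Illustrative sketch of a small-world network (Watts-Strogatz model).
import networkx as nx

# 30 "employees", each wired to 4 near neighbors (tight local clusters),
# with a 10% chance of rewiring each link to a random colleague (shortcuts).
G = nx.connected_watts_strogatz_graph(n=30, k=4, p=0.1, seed=42)

print("average clustering:", round(nx.average_clustering(G), 2))
print("average path length:", round(nx.average_shortest_path_length(G), 2))
# High clustering plus short paths is the "small world" signature: lots of
# tight local interaction, plus shortcuts that spread ideas across the group.
```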
For decades Silicon Valley company founders and investors have known this small world network effect as tacit knowledge. It has been a hallmark of the physical design of Silicon Valley office space – from Xerox PARC to Pixar’s headquarters, to Google and Apple.
So perhaps the converse is true. Does remote work with ad hoc or fixed meetings via Zoom actually stunt the growth of creativity and new insights, just at the time a startup most needs them? Are there new tools such as Discord and others that can duplicate the water cooler effect of physical proximity?
Either way, it’s the beginning of an interesting discussion.
What has been your experience?
Lessons Learned
Data from one VC shows pre-seed and seed-stage startups with employees that show up to the office have 3½ times the revenue growth of those that work remotely
Is the data valid? Is it the same in all markets/industries?
If it’s valid, why?
Is there a difference in remote vs. in-office productivity for creative tasks versus execution tasks?
Three types of organizations – Incubators, Accelerators and Venture Studios – have emerged to reduce the risk of early-stage startup failure by helping teams find product/market fit and raise initial capital. Venture Studios are an “idea factory” with their own employees searching for product/market fit and a repeatable and scalable business model. They do the most to de-risk the early stages of a startup.
Outside a small university in the Midwest, I was having coffee with Carlos, a rising star inside a mid-sized manufacturing company. He had a track record of taking small teams and growing them into successful product lines. However, after a decade working for others, Carlos was interested in building and growing a company of his own. I asked how much he knew about how to get started. He said that from what he read, the path to building and funding a company seemed to be: 1) come up with an idea, 2) form a team, 3) start testing minimal viable products, 4) raise seed funding, 5) then obtain venture capital.
As he described his work in additive manufacturing and 3D printing, Carlos said he knew that there were seed investors in his town, but venture capital was still largely on the coasts, and it was hard to get their attention. He also wasn’t sure his idea was great. But he still had the itch to grow something small into a substantive company.
As we grabbed dessert, Carlos asked, “Other than raising money, are there other ways to start a company?”
I pointed out that there were.
Reducing Startup Risk
In the last two decades, three types of organizations – incubators, accelerators and venture studios – have emerged to reduce the risk of early-stage startup failure by helping teams find product/market fit and raise initial capital. Most are founded and run by experienced entrepreneurs who have previously built companies and who understand the difference between theory and practice.
I pointed out to Carlos that accelerators like Y-Combinator, Techstars, and 500 Startups offer a cohort of startups a six to 12-week bootcamp. But these look for founders who have a technical or business model insight and a team. Accelerators provide these teams with technical and business expertise and connect them to a network of other founders and advisors. The culmination of this bootcamp is a “demo day” where all startups in the cohort have a few minutes to pitch their companies to venture capitalists and angel investors. (In some cases the accelerator provides initial funding themselves.) In exchange for attending an accelerator, startups give up 5% to 10% of their company’s equity.
There are thousands of accelerators across the globe. The business model for most of these accelerators is to select startups that can generate venture-class returns – i.e. grow into companies that can potentially be worth billions of dollars. For most accelerators, admission is by application and interview. Some, like Y-Combinator, Techstars, and 500 Startups are open to all types of startups in any market, while others like SOSV, IndieBio, HAX, Orbit, dLab are more specialized.
Incubators are similar to accelerators in that they provide space and shared resources to startups, but usually no or very small amounts of capital. Their financial models are based on membership fees that grant access to a shared coworking space, resources, and access to other founders and operational expertise.
Carlos stirred his coffee. “Accelerators don’t sound like a fit for where I am at in my career,” he offered. “I don’t have a killer idea, or a technical team, but I do know how to build, grow, and manage teams.”
The Alternative: Venture Studios I pointed out there were organizations that might be a better fit for his skills and passion to go out on his own — venture studios. Unlike an accelerator, a venture studio does not fund existing startups.
Venture studios create startups by incubating their own ideas or ideas from their partners. The studio’s internal team builds the minimal viable product, then validates an idea by finding product/market fit and early customers. If the idea passes a series of “Go/No Go” decisions based on milestones for customer discovery and validation, the studio recruits entrepreneurial founders to run and scale those startups. Examples of companies that have emerged from venture studios include Overture, Twilio, Bitly, Aircall, and the most famous alum, Moderna.
I suggested Carlos think of a venture studio as an “idea factory” with their own full-time employees engaged in searching for product/market fit and a repeatable and scalable business model.
How Venture Studios Work Unlike an accelerator or incubator, a venture studio doesn’t fund existing startups. It’s a company that creates multiple startups in-house, then finds entrepreneurs who take them over to grow them.
Most venture studios create and launch several startups each year. These have a greater success rate than startups that come out of accelerators or traditional venture-funded companies. That’s because unlike accelerators, which operate on a six- to 12-week cadence, studios don’t have a set timeframe. Instead, they search and pivot until product/market fit is found. Unlike an accelerator or a VC firm, a venture studio kills most of the ideas that can’t find traction and won’t launch a startup unless it finds evidence that the idea can become a scalable and profitable company.
Comparing Startup Funding Options Venture studios are a good fit for entrepreneurs who don’t have an idea or team but would like to run and grow a startup. The venture studio’s employees have already identified a product, found product/market fit, and landed early customers — meaning someone else has eliminated many of the early risks of a new venture. In return for the lower risk, a venture studio typically takes a larger percentage of equity.
There are four main types of venture studios:
Tech transfer studios, such as America’s Frontier Fund, work with companies and/or government labs to source ideas and intellectual property. They then transfer the IP and build the startup inside the venture studio.
Corporate studios, such as Applied Materials, source ideas and intellectual property inside their own company. They then build the startup inside a separate corporate venture studio inside the company.
A niche studio is a standalone venture studio that generates its own ideas and IP in a specific industry and domain – for example Flagship Pioneering, which is focused on health care and incubated LS18 — the company that became Moderna.
An industry agnostic studio, such as Rocket Internet, is a standalone venture studio that generates its own ideas and IP and is industry and market agnostic.
Today there are around 720+ venture studios across the world – half are in Europe. In both North America and Europe, many venture studios in non-major cities are funded by government agencies to stimulate local growth, at times with matching donations from companies. These studios have different metrics than startup studios whose limited partners are private family offices or venture capitalists.
Why Would an Entrepreneur Join a Venture Studio? While we were on our second cup of coffee, I told Carlos about the downside to joining a company created by a venture studio — how much equity/ownership they take.
In contrast with an accelerator that takes 5%-10% of a startup’s equity, venture studios take anywhere from 30%-80% of a startup’s equity. This is because a company exiting a venture studio has already had much of the early-stage startup process de-risked. (There’s a direct correlation between the amount of equity a venture studio takes and how much they want their founding CEO to be an entrepreneur versus an executor.)
Why would an entrepreneur join a venture studio and give up the majority of their company rather than go to an accelerator? Most accelerators tend to look for a “founder type” — a stereotypical techie, fresh out of college, who already has an idea and cofounders.
Most people don’t fit that pattern. Yet many are more than capable of taking an idea that’s been stress-tested and validated and building it.
What To Look for in a Venture Studio? As we got up to leave, Carlos asked, “How would I know whether a venture studio is a good one?”
It was a great question. While there are no hard-and-fast rules, I advise entrepreneurs to ask these four questions:
Is the studio run by a former founder and does it have former founders as full-time employees? The most successful venture studios are founded by entrepreneurs who have previously built companies with $10+M in revenue and 100+ employees.
What percentage of equity are they asking for? The answer will be directly proportional to what they think your value is. Firms asking for greater than 60% are actually hiring an employee rather than a founder.
Do you want a studio with specific expertise? Studios that focus on specific niches and industries can build a deep bench of domain experts – e.g. founders, advisors, and mentors – who are experts in this one field.
Do they have enough funding? Watch out for Zombie studios. If you’ve given away a majority of your company to a studio, it would be helpful to have them around for support after you’ve started. If they don’t have enough funding to keep the lights on for several years, you’re on your own. Make sure your studio has raised more than $10m in funding.
A few weeks later I got a note from Carlos letting me know that he had found a venture studio in his city, another run by the state, and a third in his region focused on manufacturing. He had applied to all of them.
A CEO running a B-to-B startup needs to live in the city where their business is – or else they’ll never scale.
I was having breakfast with Erin, an ex-student, just off a red-eye flight from New York. She’d built a 65-person startup selling enterprise software to the financial services industry. Erin had previously worked in New York for one of those companies and had a stellar reputation in the industry. As one would expect with banks and hedge funds as customers, the majority of her customers were based in the New York metropolitan area.
Where Are Your Biggest Business Deals?
Looking a bit bleary-eyed, Erin explained, “Customers love our product, and I think we’ve found product/market fit. I personally sold the first big deals and hired the VP of sales who’s building the sales team in our New York office. They’re growing the number of accounts and the deal size, but it feels like we’re incrementally growing a small business, not heading for exponential growth. I know the opportunity is much bigger, but I can’t put my finger on what’s wrong.”
Erin continued, “My investors are starting to get impatient. They’re comparing us to another startup in our space that’s growing much faster. My VP of Sales and I are running as fast as we can, but I’ve been around long enough to know I might be the ex-CEO if we can’t scale.”
While Erin’s main sales office is in New York, next to her major prospects and customers, Erin’s company was headquartered in Silicon Valley, down the street from where we were having breakfast. During the Covid pandemic, most of her engineering team worked remotely. Her inside sales team (Sales Development and Business Development reps) used email, phone, social media and Zoom for prospecting and generating leads. At the same time, her account executives were able to use Zoom for sales calls and close and grow business virtually.
There’s a Pattern Here Over breakfast, I listened to Erin describe what at first seemed like a series of disconnected events.
First, a new competitor started up. Initially, she wasn’t concerned, as the competitor’s product had only a subset of the features that Erin’s did. However, the competitor’s headquarters was based in New York, and their VP of Sales and CEO were now meeting face-to-face with customers, most of whom had returned to their offices. While Erin’s New York-based account execs were selling to middle-tier management, the CEO of her competitor had developed relationships with the exec staff of potential customers. She lamented, “We’ve lost a couple of deals because we were selling at the wrong level.”
Second, Erin’s VP of sales had just bought a condo in Miami to be next to her aging parents, so she was commuting to NY four days a week and managing the sales force from Miami when she wasn’t in New York. Erin sighed, “She’s as exhausted as I am flying up and down the East Coast.”
Third, Erin’s account execs were running into the typical organizational speedbumps and roadblocks that come with closing big deals. However, solving them via email, Zoom and once-a-month fly-in meetings wasn’t the same as the NY account execs being able to say, “Hey, our VP of Sales and CEO are just down the street. Can we all grab a quick coffee and talk this over?” Issues that could have been solved casually and quickly ballooned into ones that took more work and sometimes a plane trip for her VP of Sales or Erin to solve.
By the time we had finished breakfast it was clear to me that Erin was the one putting obstacles in front of her path to scale. Here’s what I observed and suggested.
Keep Your Eye on The Prize While Erin had sold the first deals herself, she needed to consider whether each deal happened because as CEO, she could call on the company’s engineers to pivot the product. Were the account execs in New York trying to execute a sales model that wasn’t yet repeatable and scalable without the founder’s intervention? Had a repeatable and scalable sales process truly been validated? Or did each sale require a heroic effort?
Next, setting up their New York office without Erin or her VP of Sales physically living in New York might have worked during Covid but was now holding her company back. At this phase of her company the goal of the office shouldn’t be to add new accounts incrementally – but should be how to scale – repeatably. Hiring account execs in an office in New York let Erin believe that she had a tested, validated, and repeatable sales playbook that could rapidly scale the business. The reality was that without her and the VP of Sales living and breathing the business in New York, they were trying to scale a startup remotely.
Her early customers told Erin that her company had built a series of truly disruptive financial service products. But now, the company was in a different phase – it needed to build and grow the business exponentially. And in this phase, her focus as a CEO needed to change – from searching for product/market fit to driving exponential growth.
Exponential Growth Requires Relentless Execution Because most of her company’s customers were concentrated in a single city, Erin and her VP of Sales needed to be there – not visiting in a hotel room. I suggested that:
Erin had to quickly decide if she wanted to be the one to scale the business. If not, her investors were going to find someone who could.
If so, she needed to realize that she had missed an important transition in her company. In a high-dollar B-to-B business, building and scaling sales can’t be done remotely. And she was losing ground every day. Her New York office needed a footprint larger than she was. It needed business development and marketing people rapidly creating demand.
Her VP of Sales might be wonderful, but with all the travel the company was only getting her half-time. Erin needed a full-time head of sales in New York. Time to have a difficult conversation.
Because she was behind, Erin needed to rent an apartment in New York for a year, and spend the next six months there and at least two weeks a month after that. Her goal was to:
1) Validate that there was a repeatable sales process. If not, build one
2) Build a New York office that could create a sales and marketing footprint without her presence. Only then could she cut back her time in the City.
Finally, she needed to ask: if her customers were primarily in New York and the engineers were working remotely, why wasn’t the company headquartered in New York?
I Hate New York As we dug into these issues, I was pretty surprised to hear her say, “I spent a big part of my career in New York. I thought coming out to Stanford and the West Coast meant I could leave the bureaucracy of large companies and that culture behind. Covid let me do that for a few years. I guess now I’m just avoiding jumping back into an environment I thought I had left.”
We lingered over coffee as I suggested it was time for her to take stock of what’s next. She had something rare – a company that provided real value, with products that early customers loved. Her staff didn’t think they were joining a small business, and neither did her investors. If she wasn’t prepared to build something to its potential, what was her next move?
Lessons Learned
For a startup, the next step after finding product/market fit is finding a repeatable and scalable sales process
This requires a transition to the relentless execution of creating demand and exponentially growing sales
If your customers are concentrated in a city or region, you need to be where your customers are
The CEO needs to lead this growth focus
And then hand it off to a team equally capable and committed
Joe Felter, Raj Shah and I designed the class to 1) give our students an appreciation of the challenges and opportunities for the United States in its enduring strategic competition with the People’s Republic of China, Russia and other rivals, and 2) offer insights on how commercial technologies (AI, machine learning, autonomy, cyber, quantum, semiconductors, access to space, biotech, hypersonics, and others) are radically changing how we will compete across all the elements of national power – e.g. diplomatic, informational, military, economic, financial, intelligence and law enforcement (our influence and footprint on the world stage).
Why This Class?
The return of strategic competition between great powers became a centerpiece of the 2017 National Security Strategy and 2018 National Defense Strategy. The 2021 Interim National Security Guidance and the administration’s recently released 2022 National Security Strategy make clear that China has rapidly become more assertive and is the only competitor potentially capable of combining its economic, diplomatic, military, and technological power to mount a sustained challenge to a stable and open international system. And as we’ve seen in Ukraine, Russia remains determined to wage a brutal war and play a disruptive role on the world stage.
Prevailing in this competition will require more than merely acquiring the fruits of this technological revolution; it will require a paradigm shift in the thinking of how this technology can be rapidly integrated into new capabilities and platforms to drive new operational and organizational concepts and strategies that change and optimize the way we compete.
Class Organization The readings, lectures, and guest speakers explored how emerging commercial technologies pose challenges and create opportunities for the United States in strategic competition with great power rivals with an emphasis on the People’s Republic of China. We focused on the challenges created when U.S. government agencies, our federal research labs, and government contractors no longer have exclusive access to these advanced technologies.
This course included all that you would expect from a Stanford graduate-level class in the Masters in International Policy – comprehensive readings, guest lectures from current and former senior officials/experts, and written papers. What makes the class unique however, is that this is an experiential policy class. Students formed small teams and embarked on a quarter-long project that got them out of the classroom to 1) identify a priority national security challenge, and then to 2) validate the problem and propose a detailed solution tested against actual stakeholders in the technology and national security ecosystem.
The class was split into three parts. Part 1, weeks 1 through 4, covered international relations theories, strategies and policies around Great Power Competition, specifically focused on the People’s Republic of China (PRC) and the Chinese Communist Party (CCP). Part 2, weeks 5 through 8, dove into the commercial technologies: semiconductors, space, cyber, AI and Machine Learning, High Performance Computing, and Biotech. In between parts 1 and 2 of the class, the students had a midterm individual project. It required them to write a 2,000-word policy memo describing how a U.S. competitor is using a specific technology to counter U.S. interests and a proposal for how the U.S. should respond. (These policy memos were reviewed by Tarun Chhabra, the Senior Director for Technology and National Security at the National Security Council.)
Each week the students had to read 5-10 articles (see class readings here). And each week we had guest speakers on great power competition and on technology and its impact on national power, along with lectures and class discussion.
Guest Speakers In addition to the teaching team, the course drew on the experience and expertise of guest lecturers from industry and from across U.S. Government agencies to provide context and perspective on commercial technologies and national security.
Our class opened with three guest speakers: former U.S. Secretary of Defense James Mattis and the CIA’s CTO and COO, Nand Mulchandani and Andy Makridis. The last class closed with a talk by Google ex-Chairman Eric Schmidt.
In the weeks in-between we had teaching team lectures followed by speakers who led discussions on the critical commercial technologies. For semiconductors, the White House Coordinator for the CHIPS Act – Ronnie Chatterji, and the CTO of Applied Materials – Om Nalamasu. For commercial tech integration and space, former Defense Innovation Unit (DIU) Director Mike Brown and Brig. General Bucky Butow – Director of the Space Portfolio. For Artificial Intelligence, Lt. Gen. (Ret.) Jack Shanahan, former director of the Joint Artificial Intelligence Center. And for synthetic biology, Stanford Professor Drew Endy – President, BioBricks Foundation.
Team-based Experiential Project
The third part of the class was unique – a quarter-long, team-based project. Students formed teams and developed hypotheses of how commercial technologies can be used in new and creative ways to help the U.S. wield its instruments of national power. And consistent with all our Gordian Knot Center classes, they got out of the classroom and interviewed 20+ beneficiaries, policy makers, and other key stakeholders testing their hypotheses and proposed solutions. At the end of the quarter, each of the teams gave a final “Lessons Learned” presentation and followed up with a 3,000 to 5,000-word team-written paper.
By the end of the class all the teams realized that the problem they had selected had morphed into something bigger, deeper, and much more interesting.
Team 1: Climate Change
Original Problem Statement: What combinations of technologies and international financial relationships should the US prioritize to mitigate climate change?
Final Problem Statement: How should the US manage China’s dominance in solar panels?
We knew that these students could write a great research paper. As we pointed out to them, while you can be the smartest person in the building, it’s unlikely that 1) all the facts are in the building, and 2) you’re smarter than the collective intelligence sitting outside the building.
Jonah Cader: “Technology, Innovation and Great Power Competition (TIGPC) is that rare combination of the theoretical, tactical, and practical. Over 10 weeks, Blank, Felter, and Shah outline the complexities of modern geopolitical tensions and bring students up the learning curves of critical areas of technological competition, from semiconductors to artificial intelligence. Each week of the seminar is a crash course in a new domain, brought to life by rich discussion and an incredible slate of practitioners who live and breathe the content of TIGPC daily. Beyond the classroom, the course plunges students into getting “out of the building” to iterate quickly while translating learnings to the real world. Along the way the course acts as a strong call to public service.”
Team 2: Networks
Original Problem Statement: How might we implement a ubiquitous secure global access to the internet in order to help circumvent censorship in authoritarian regimes?
Final Problem Statement: How can we create an open, free Internet and maintain effective lines of communication in Taiwan in preparation for a potential invasion?
By week 2 of the class students formed teams around a specific technology challenge facing a US government agency and worked throughout the course to develop their own proposals to help the U.S. compete more effectively through new operational concepts, organizations, and/or strategies.
Jason Kim: “This course doesn’t just discuss U.S. national security issues. It teaches students how to apply an influential and proven methodology to rapidly develop solutions to our most challenging problems.”
Team 3: Acquisition
Original Problem Statement: How can the U.S. Department of Defense match or beat the speed of great power competitors in acquiring and integrating critical technologies?
Final Problem Statement: How can the U.S. Department of Defense deploy alternative funding mechanisms in parallel to traditional procurement vehicles to enable and incentivize the delivery of critical next-generation technology in under 5 years?
We wanted to give our students hands-on experience on how to deeply understand a problem at the intersection of our country’s diplomacy, information, its military capabilities, economic strength, finance, intelligence, and law enforcement and dual-use technology. First by having them develop hypotheses about the problem; next by getting out of the classroom and talking to relevant stakeholders across government, industry, and academia to validate their assumptions; and finally by taking what they learned to propose and prototype solutions to these problems.
Matt Kaplan: “The TIGPC class was a highlight of my academic experience at Stanford. Over the ten week quarter, I learned a tremendous amount about the importance of technology in global politics from the three professors and from the experts in government, business, and academia who came to speak. The class epitomizes some of the best parts of my time here: the opportunity to learn from incredible, caring faculty and to work with inspiring classmates. Joe, Steve, Raj instilled in my classmates and me a fresh sense of excitement to work in public service.”
Team 4: Wargames
Original Problem Statement: The U.S. needs a way, given a representative simulation, to rapidly explore a strategy for possible novel uses of existing platforms and weapons.
Final Problem Statement: Strategic wargames stand to benefit from a stronger integration of AI+ML but are struggling to find adoption and usage. How can this be addressed?
We wanted our students to build the reflexes and skills to deeply understand a problem by gathering first-hand information and validating that the problem they are solving is the real problem, not a symptom of something else. Then, students began rapidly building minimal viable solutions (policy, software, hardware …) as a way to test and validate their understanding of both the problem and what it would take to solve it.
Etienne Reche-Ley: “Technology, Innovation and Great Power Competition gave me an opportunity to dive into a real world national security threat to the United States and understand the implications of it within the great power competition. Unlike any other class I have taken at Stanford, this class allowed me to take action on our problem about networks, censorship and the lack of free flow of information in authoritarian regimes, and gave me the chance to meet and learn from a multitude of experts on the topic. I finished this class with a deep understanding of our problem, a proposed actionable solution and a newfound interest in the intersection of technology and innovation as it applies to national defense. I am very grateful to have been part of this course, and it has inspired me to go a step further and pursue a career related to national security.”
Team 6: Disinformation
Original Problem Statement: Disinformation is a national security threat.
Final Problem Statement: The U.S.’s ability to close the disinformation response kill chain is hampered by a lack of coordination between U.S. government agencies, no clear ownership of the disinformation problem, and a lack of clear guidelines on public-private partnerships.
One other goal of the class was to continue to validate and refine our pedagogy of combining a traditional lecture class with an experiential project. We did this by tasking the students to 1) use what they learned from the lectures and 2) then test their assumptions outside the classroom. The external input they received would be a force multiplier. It would make the lecture material real, tangible and actionable. And we and they would end up with something quite valuable.
Shreyas Lakhtakia: “TIGPC is an interdisciplinary class like no other. It is a fabulous introduction to some of the most significant tech and geopolitical challenges and questions of the 21st century. The class, like the topics it covers, is incredible and ambitious – it’s a great way to level up your understanding of not just international policy, political theory and technology policy but also deep tech and the role of startups in projecting national power. If you’re curious about the future of the world and the role of the US in it, you won’t find a more unique course, a more dedicated teaching team or better speakers to hear from than this!”
Team 7: Quantum Technology
Original Problem Statement: China’s planned government investment in quantum dwarfs that of the U.S. by a factor of 10.
Final Problem Statement: The US quantum ecosystem does not generate enough awareness of opportunities to pursue careers in quantum that could catalyze industry growth.
We knew we were asking a lot from our students. We were integrating a lecture class with a heavy reading list with the best practices of hypothesis testing from Lean Launchpad/Hacking for Defense/I-Corps. But I’ve yet to bet wrong in pushing students past what they think is reasonable. Most rise way above the occasion.
Team 9: Lithium-Ion Batteries
Original Problem Statement: Supply and production of lithium-ion batteries is centered in China. How can the U.S. become competitive?
Final Problem Statement: China controls the processing of critical materials used for lithium-ion batteries. To regain control the DOE needs to incentivize short and long-term strategies to increase processing of critical materials and decrease dependence on lithium-ion batteries.
All of our students put in an extraordinary amount of work. Our students came from a diverse set of backgrounds and interests – from undergraduate sophomores to 5th-year PhDs – in a mix including international policy, economics, computer science, business, law and engineering. Some will go on to senior roles in State, Defense, policy or other agencies. Others will join or found the companies building new disruptive technologies. They’ll be the ones to determine what the world order will look like for the rest of the century and beyond. Will it be a rules-based order where states cooperate to pursue a shared vision for a free and open region and where the sovereignty of all countries large and small is protected under international law? Or will it be an autocratic and dystopian future coerced and imposed by a neo-totalitarian regime?
This class changed the trajectory of many of our students. A number expressed newfound interest in exploring career options in the field of national security. Several will be taking advantage of opportunities provided by the Gordian Knot Center for National Security Innovation to further pursue their contribution to national security.
This course and our work at Stanford’s Gordian Knot Center would not be possible without the unrelenting support and guidance from Ambassador Mike McFaul and Professor Riitta Katila, GKC founding faculty and Principal Investigators, and the tenacity of David Hoyt, Gordian Knot Center Assistant Director.
Lessons Learned
We combined lecture and experiential learning so our students can act on problems not just admire them
The external input the students received was a force multiplier
It made the lecture material real, tangible and actionable
Pushing students past what they think is reasonable results in extraordinary output. Most rise way above the occasion
The class creates opportunities for our best and brightest to engage and address challenges at the nexus of technology, innovation and national security
The final presentations and papers from the class are proof that it will happen
In the past, headlines about the Pentagon failing its financial audit again would never have caught my attention. But having been in the middle of this conversation when I served on one of the Defense Department’s advisory boards, I understand why the Pentagon can’t count. The experience taught me a valuable lesson about innovation and imagination in large organizations, and the difference visionary leadership – or the lack of it – can make.
With audit costs approaching a billion dollars a year, the Pentagon had an opportunity to lead in modernizing auditing. Instead it opted for more of the same.
Auditing the Department of Defense By law, the Department of Defense has to provide Congress and the public with an assessment of where it spends its money and to provide transparency of its operations. A financial audit counts what the Department of Defense has, where it has it, and whether it knows where its money is being spent.
Auditing the Department of Defense is a massive undertaking. For one thing, it is the country’s largest employer, with 2.9 million people (1.3 million on active duty, 800,000 in the reserve components, and 770,000 civilians). The audit has to count the location and condition of every piece of military equipment, property, inventory, and supplies. And there are a lot of them. The department has 643,900 assets – from buildings to pipelines, roads, and fences – located on over 4,860 sites, as well as 19,700 aircraft and over 290 battle force ships. To complicate the audit, the department has 326 different and separate financial management systems, 4,700 data warehouses and over 10,000 different and disconnected data management systems.
(BTW, just like in the private sector, financial audits and audits of contracts are separate. While the DoD Office of Inspector General is responsible for these financial audits of trillions of dollars of assets and liabilities, the Defense Contract Audit Agency is responsible for auditing the hundreds of billions of dollars of acquisition contracts. They have the same issues.)
This is the fifth year the Department has undergone a financial statement audit – and failed it. The audit was not a trivial effort: it required 1,600 auditors – 1,450 from public accounting firms and 150 from the Office of Inspector General. In 2019, the audit cost $428 million ($186 million to the auditors and $242 million for audit support), plus another $472 million to fix the issues the audit discovered.
Let’s Invent the Future of Audit The Department of Defense’s 40-plus advisory boards are staffed by outsiders who can provide independent perspectives and advice. I sat on one of these boards, and our charter was to leverage private sector lessons to improve audit quality.
With defense spending on auditing approaching a billion dollars a year, it was clear it would take a decade or more to catch up to the audit standards of private companies. But no single company or even entire industry was spending this much money on auditing. And remarkably, the Defense Department seemed intent on doing the same thing year after year, just with more people and with a few more tools and processes to get incrementally better. It dawned on me that if we tried to look over the horizon, the department could audit faster, cheaper, and more effectively by inventing the future tools and techniques rather than repeating the past.
Nothing in our charter asked the advisory board to invent the future. But I found myself asking, “What if we could?” What if we could provide the defense department with new technology, new approaches to auditing, analytics practices, audit research, and standards, all while creating audit and data management research and a new generation of finance applications and vendors?
The Pentagon Once Led Business Innovation I reminded my fellow advisory board members that in 1959, at the dawn of the computer age, the Defense Department was the largest user of computers for business applications.
However, there was no common business programming language. So rather than wait for one, the Defense Department led the effort to create one – the COBOL programming language. And 20 years later, it did the same with the Ada programming language.
With that history in mind, I proposed we lead again, and that we start an initiative for the 5th generation of audit practices (the Audit 5.0 Initiative) built on machine learning, predictive analytics, intelligent sampling and prediction. This initiative would also include automated ETL (extract, transform, load), fraud detection, and a new generation of audit standards.
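To make that idea a bit more concrete, here is a minimal illustrative sketch of the kind of machine-learning anomaly screening an Audit 5.0 pipeline could run over payment records before human auditors decide where to sample. This is not the Defense Department’s actual tooling; every feature, value, and threshold below is a hypothetical stand-in.

```python
# Illustrative sketch only – not the DoD's actual audit tooling.
# Shows unsupervised anomaly screening of transaction records, the kind of
# ML-driven triage an "Audit 5.0" pipeline might run before human sampling.
# All features, values, and thresholds here are hypothetical.
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(0)

# Hypothetical transaction features: amount ($), days-to-payment, vendor risk score
normal = rng.normal(loc=[10_000, 30, 0.2], scale=[2_000, 5, 0.05], size=(5_000, 3))
unusual = rng.normal(loc=[90_000, 2, 0.8], scale=[5_000, 1, 0.05], size=(50, 3))
transactions = np.vstack([normal, unusual])

# Fit an unsupervised anomaly detector; flagged records go to auditors first
model = IsolationForest(contamination=0.01, random_state=0)
labels = model.fit_predict(transactions)  # -1 = anomalous, 1 = looks normal

print(f"Flagged {(labels == -1).sum()} of {len(transactions)} transactions for review")
```

Real audit data would need far richer features and governance, but the point stands: instead of armies of auditors sampling by hand, software can surface the records most worth a human’s time.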
I pointed out that this program wouldn’t need more funds, since the Department of Defense could allocate 10% of the $428M we were spending on auditors – roughly $43 million a year – to fund SBIR (Small Business Innovation Research) programs in auditing, data management and finance, generating 5-10 new startups in this space each year. Simultaneously we could fund academic research to incentivize work on machine learning as applied to Audit 5.0 challenges in finance, auditing and data management.
By investing 10 percent of the existing auditing budget over the next few years, these activities would create a defense audit center of excellence that would fund academic centers for advanced audit research, stand up “future of audit” programs that would create 5-10 new startups each year, be the focal point for government and industry finance and audit standards, and create public-private partnerships rather than mandates.
Spinning up these activities would dramatically reduce the department’s audit costs, standardize its financial management environment, and provide confidence in its budget, auditability, and transparency. And as a bonus, it would create a new generation of finance, audit and data management startups, funded by private capital.
The Road Not Taken I was in awe of my fellow advisory board members. They had spent decades in senior roles in finance and accounting in both the public and private sectors. Yet, when I pitched this idea, they politely listened to what I had to say and then moved on to their agenda – providing the DoD with incremental improvements.
At the time I was disappointed, but not surprised. An advisory board is only as good as what it’s chartered and staffed to do. If board members are asked to provide 10 percent incremental advice, they’ll do so. But if they’re asked for revolutionary – i.e. 10x – advice, they can change the world. That requires a different charter, leadership, people, innovation, and imagination.
In the end, the Department of Defense, the largest purchaser of accounting services in the world, whiffed on a chance to be the leader in creating the next generation of audit tools and services, not only for financial audits, but for the hundreds of billions of dollars of acquisition contracts the Defense Contract Audit Agency audits. By now the department could have had audit tools driven by machine learning algorithms, ferreting out fraud by vendors or contractors and anticipating programs that are at risk.
Lessons Learned
If you only get what you ask for, you haven’t hired people with imagination
America’s defense leaders ought to ask for transformational, contrarian and disruptive advice
And ensure they have the will and organizations to act on it
Move requests for advice for incremental improvements to the consulting firms that currently serve the Defense Department
Defense leaders need to consider whether spending a billion dollars a year for an audit is causing the department to become appreciably more efficient or better managed