February 5, 2021
Philip Bouchard, Executive Director of the TrustedPeer Entrepreneurship Advisory, interviewed me recently about the origins and growth of Hacking for Defense, the university program we created to make the world a safer place.
We talked about why Steve Blank, Joe Felter and I created the program; how it’s helping to build a national security innovation pipeline as we partner with the National Security Innovation Network to expand to 30 universities (and counting); and what’s involved for universities that want to adopt the class. Philip captured the conversation in detail; see below.
Interview highlights:
Philip Bouchard: As the co-founder and board director of Hacking for Defense, what do you see as your primary roles?
Pete Newell: I have two primary roles.
PB: For this 2019-2020 academic year, H4D is being offered at 30 universities. Is there an ideal university profile that is the best fit for the H4D course?
PN: When we started Hacking for Defense, I looked around the United States for places where a university with a very strong research department sat in the middle of an entrepreneurial community, had some connection to an industrial base, whether manufacturing or otherwise, and had ready access to military installations or government research facilities. As I went around the country I counted 13 places where those criteria existed. If you look at the growth of Hacking for Defense, almost all 13 of those locations are now represented.
However, I learned over the years that there were a number of small universities out there that didn't meet any of those requirements yet had an incredible drive to build a team to run a course that was highly value added. As a result, I’ve softened my stance and now believe it's really about the community that universities are able to build around the course. That's the most important discriminator. Can you bring the alumni, local community and other people into the classroom to work with your students in real time?
PB: What should a university consider when deciding whether to participate in Hacking for Defense? What are the costs? What are the necessary resources that a university needs to put in place to support a H4D course?
PN: First and foremost, neither the government sponsor, the National Security Innovation Network, nor the nonprofit H4D pays for instructor salaries or TAs for the course. Everybody who participates in a Hacking for Defense course, whether it is the university, the government, corporates or students, is putting their own equity into the classroom. The students bring their time and their energy, plus the opportunity cost of another class they could be taking. The government is working hard on its own time to curate problems and to get the right problem people into the classroom.
Corporate folks are donating expertise to work with students and the teams.
We feel it is the university's responsibility to pull its weight, which means it needs to pony up highly qualified instructors and pull resources from across the university into the classroom to support the class. The university needs to own the class and run it the way it typically runs its courses. That can be hard, but I've yet to see a university that couldn't do it when it really wanted to.
H4D is strongly built on Steve Blank’s principles of discovery, which means students need to get out of the classroom and do discovery. This requires students to travel to visit with the problem sponsors. We've done a good job of finding funds from other places to help the university teams get out and do their discovery for a couple of weeks.
For example, at Stanford the Office of Naval Research provides funding for student travel and prototyping. This year Lockheed Martin donated funds for one university, and eventually those funds will go to multiple universities. We have corporate sponsors coming in through the H4D nonprofit to provide some of that funding. In other cases, universities rely on their own networks for course funding.
In terms of course construct, it's a flipped classroom. The best teaching team is made up of three people: a dyed-in-the-wool academic, an engineering or entrepreneurship professor, who lends academic credibility to the classroom; somebody with a deep technical background who understands science and engineering from a broad perspective; and someone with a government background who understands the gobbledygook of the defense and intelligence agencies.
The other thing you want is a couple of good teaching assistants, or TAs. At Stanford, our TAs are all graduates of the H4D program. They've been through the class and they run the class. That lets the teaching team focus on the content and on working with the teams. The setup, the organization, the day-to-day managing of who is briefing when and doing what is the TAs' responsibility. Our TAs have written a TA handbook for the universities to use to help them get set up.
Beyond that, every problem we bring into a university is assigned a problem sponsor from the government who owns the problem, as well as an industry mentor who is familiar with the technology area the problem comes from. A mentor must be able to put in three to four hours a week working with a team to help guide it.
We treat each team as if it were a start-up. The teaching team acts like a board of directors; the problem owner and industry mentor are advisors attached to the team. At the start of the course, each team comes in with six to seven initial advisors, people that the team can reach out to and ask questions about the problem. Beyond that, it is the responsibility of the team to determine who they need to talk to.
One of the challenges a lot of universities face initially is figuring out how to motivate their alumni to get involved in a H4D class and how to motivate students to want to take the class. Our TAs do the best recruiting for the class at Stanford. We pay for a series of social engagements like beer-and-pizza information sessions.
In addition, we provide a sign-up-with-your-interest list, but some students come to us with a team and already know which problem they want to work on. Others will say, “I don't have a team but I'm interested in doing this; can you help me find people?” In that case, our TAs do some matchmaking to help teams form around problems. If you can put all that together in a classroom, you've got a solid class to work from.
PB: You've got so many constituents involved in every Hacking for Defense course: student teams, teaching assistants, the teaching team, end-user beneficiaries at the Department of Defense and other government agencies, corporate mentors, military liaisons, problem sponsors and others. How do you keep all of the course constituents coordinated?
PN: Managing chaos sometimes means not trying to manage it. If you're trying to teach entrepreneurship to students by letting them experience it, then you have to let them experience some of the chaos that comes with it. I'm hesitant to stick my nose in and fix things that a student team should figure out for itself.
Nevertheless, the National Security Innovation Network and BMNT, my company, which are responsible for curating the problems, are very meticulous about the government problem sponsor's performance in the classroom. Part of the problem curation process is making sure the government sponsor shows up, is ready to be part of the team and does their job during the course. The last thing we want is for a student team to have questions for the government and not get an answer for four or five days. We manage the government side very tightly.
As a teaching team, we are constantly in contact with the team mentors. We have office hours to meet with the teams every week and we speak with the team mentors every week. The purpose is to ensure that we're getting feedback on what's going on with the team to determine whether we need to tweak something, fix something or help them bring people in.
We do assessments at the end of the course, looking at the advisors and mentors we brought in to see if they were truly valuable to the teams. We do this to make sure that we get high-quality people coming into the classroom. Between the TAs and the instructors, we look at each team and the ecosystem built around it and make sure they have everything they need. We have a deep Rolodex and are ready, at a moment's notice, to fill a gap if we have to.
I'm not a professional instructor, but I will tell you that when I teach at Stanford, I don't take on any extra projects because the class consumes my time.
PB: What are your goals and desired outcomes for the Hacking for Defense course at any particular university?
PN: First and foremost, it's about the students. I want students who would never have an opportunity to work with the government, or for the government, to experience what the government really does and touch some of the really wicked, sexy problems it has to solve, all while learning what our military and intelligence agencies really do versus what they see in the press. In my mind, Hacking for Defense is a civics class. It's an opportunity for students who would never get a chance to do something with the government to “open the green door,” see what's behind it and do something that has an impact.
Second, I am enamored with the concept of experiential education. There is a place for academic learning and rote learning. However, the workplace today demands people who are agile and have the skills to do different things. We want to help industry and universities produce students who have had an opportunity to use every tool and trick they learned at the university, to use every network they've built in order to work on a real problem with real people and get real experience that lends itself to the workplace that they're going to join.
Third, I want to help the government better understand its own problems faster so that it can focus the resources it has on solving things that are important.
PB: The semester-long Hacking for Defense course was created and first launched at Stanford University in 2016. Thirty universities are participating this year. Is there a limit to the number of universities that you can support? What is your plan to scale operations?
PN: Congress has funded H4D to get to 50 universities in the United States. The limiting factor is our ability to deliver quality problems into the classroom. One would think that the government agencies have hundreds of thousands of problems. And they do. However, a problem has to be tied to an energetic, directed problem sponsor who's going to give time and energy to trying to solve it. Problems can't be classified, and each one has to be scoped so that the team can get in and work on it, but not so broad that the team wallows in trying to find a direction.
Let me share some statistics. We sourced 400 problems out of the Air Force. By the time we got finished curating them, only 50 were useful for the classroom. We experience a 20 percent pass rate as we go through the government looking for problems. Only 20 percent of what is sent to us works well for the classroom. That's the limiting factor. Can we get the right problems in the classroom that are appropriate for the university so that they're attractive to students and motivate them to take the class, and can we help the university be successful with those problems?
I know we can do 50 universities; beyond that, there is probably no theoretical limit as you start looking beyond the fringes of the direct hard challenges inside the government. This class is phenomenal whether you're looking at a tech problem, a policy problem or a business process problem. They all fit. We've worked with everything from the Veterans Administration's problems to the FBI to the State Department to monitoring sanctions on North Korea. The craftier you are with the types of problems you look at, the richer the classroom environment is.
Right now the answer to how many universities we can support is 50 because that's what Congress is funding the DOD to do.
We have also launched Hacking for Defense in the UK, at Imperial College and King's College, and we have another four colleges in the UK that are standing up to do it. We spent two years talking to the UK Ministry of Defence. The first time we talked to them at the headquarters level, their fear was, “We can't produce enough problems to run a Hacking for Defense class.” As we got down to the units and the tactical folks, they said, “Oh my God. I've got hundreds of problems.”
It comes down to whether you can get enough rich problems and whether the university can build the right classroom environment for solving them. In some places it's easier than others.
PB: Your H4D co-creator, Steve Blank (see TrustedPeer Entrepreneurship Advisory interview with Steve Blank), has said that, “Innovation needs to be designed as a process from start to deployment that is supported by a software platform that is engineered to support the Innovation process.” What technologies and platforms are you currently leveraging to scale your Hacking for Defense program?
PN: Innovation needs to be founded on a framework.
We call that the Innovation Pipeline. There's also a framework that allows you to create a data-driven, disciplined process for deciding what moves from one stage to the next within the pipeline. For example, the National Security Innovation Network has a software program that was built to keep track of the people involved in the classroom: students, professors, advisors, mentors and military people, an ecosystem that you can continue to energize long after the class is over.
At BMNT, we tried every piece of management software we could possibly find and could not find any innovation portfolio management software that was the answer to helping people manage innovation at the enterprise level. There's just nothing adequate for it. So, we started coding about a year ago with an internal team to demonstrate what one would look like. We are now at the point where a number of government innovation cells are beta testing it and giving us feedback on how they're using it to manage their portfolios, so that they're not missing decision points and the tool reflects what they're really doing.
PB: What is the process for the Hacking for Defense course? What are the milestones, tasks and deliverables that a student team must complete?
PN: Every university is slightly different in how they start this. I'll walk you through how Stanford builds the course because I have taught it for four years.
We start as a teaching team in October-November by selecting the TAs who are going to work with the course. Stanford is on a quarter system and we don't teach until April. We pick the TAs early because students sign up for classes a couple of months before the quarter starts, and we begin our recruiting before students start signing up for their classes, well before we even know what the problems are.
NSIN and BMNT work together to source and curate the problems. For universities teaching in January, our cut-off for the first pass of problem curation is Oct. 1, and we continue curating until Oct. 20. By Nov. 1, NSIN delivers the problems to the universities teaching in the winter or spring quarter. For example, if Texas A&M teaches a Hacking for Defense class that starts in January, they'll get their problems in November.
We provide universities with more problems than they plan to have teams for. We want the problem sponsor to have to compete to get students to take their problem. Universities need to socialize the problems a couple of months prior to the start of the semester. We invite the problem sponsors in to meet socially with the students, talk about their problems, and try to do some of their own recruiting to get teams to take them on.
Once we know which problems have been accepted, we identify and recruit mentors for the class based on the number of teams we expect to have. Then it is a matter of recruiting the advisors. At this point, we start socializing the problems across the university system, saying, for example, “I have eight problems: three from an intelligence agency and one from the Department of Energy.” Next, we'll start talking to people in the university, asking, “Who in your networks would be willing to talk to these students to help them get started in understanding these problems?”
We start building out the list for each problem so that when we start the class the ecosystem is ready to go. A month prior to the course start date, our TAs provide the application package to interested students using a Google sheet to keep track of all the students who have asked to participate. TAs look at the teams that have formed and have expressed interest in a specific problem and work to match teams.
TAs also work with individual students to help flesh out the team if that’s needed. We want the team to understand the problem when they come in to the course. Their applications are due two to three weeks before the course starts. Once we have all the applications in, the teaching team interviews every team.
The team is expected to come in and brief their understanding of the problem, the strengths and skills of their team, and their understanding of the competitive space around that problem. Typically 12 to 15 teams apply. We look for the right diversity of thought among the team members against the type of problem they're trying to bring into the classroom. I don't want a team full of MBAs; I want one that's got, for example, an MBA, an engineering student, a nursing student, etc. We interview them as if we were investing in a start-up: is this a team that we think can hang together through a really hard course, do its due diligence and do a good job on the problem? Once we've done all those steps, we select the final eight teams for the class.
All that happens before day one of the course. There are no lectures, but we do hold two seminars as the course starts.
The first seminar is called DoD and Intelligence Agency 101 to help students who have never touched the military understand the framework of where things come from. We talk about everything from national security strategy all the way down to how the combatant commands are put together so they understand where their problem sits within the national military organization chart.
The second is a two-hour seminar on doing discovery. We found that the thing that hinders the teams most early on is their inability to organize their interviews and build MVPs (Minimum Viable Products). That discovery process that Steve Blank has worked so hard on for so many years is still really hard for people. Therefore, in the first week, we teach how to build an MVP as a conversation driver to allow them to interview somebody in order to elicit feedback. We take them through that process and let them practice it a couple times before turning them loose to start the interviews.
We have the teams log every interview they do. That's how we keep up with what's going on as a teaching team. Each team does ten interviews per week, and with eight teams that's 80 blog entries that we're supposed to be reading every week.
Once the course is up and running, there are no further lectures in the classroom. All the pre-reading content is available online. The real learning, however, comes from the feedback session each week. Every week, each team gets 15 minutes to present their deliverables. We're looking for their “Aha” moments of learning, and they get direct feedback from the “board,” the course instructors, in terms of challenges to their thought process or their data analysis. We set up office hours and require every team to meet with one of the instructors every week for 30 minutes.
Because you have three very different personalities teaching a class, there are days when Steve Blank will tell a team something that I disagree with, or vice versa, and that's real life. Sometimes board members disagree with each other about something. Or there may be days when I have an issue with something a team has done and they won't understand what I was trying to tell them. In that instance, they have an opportunity to go to another instructor and say, “Pete was trying to get us to do something and we still don't understand what he was saying.” We force students to take that opportunity to talk to people and get the help that moves them forward.
PB: How does the H4D course grading system work?
PN: There's a lot that goes into the grading process.
Each team presents weekly, and we require a different team member to do the presentation each time. The TAs select, on a rotating basis just prior to class, who that team member will be, so they all have to show up prepared. We grade each team's presentation on a scale of 1 to 10, looking at the quality of the work presented to us that week. The students in the class also peer-evaluate the presentation. So the teaching team grades, the peers grade, and the TAs, who have looked at the body of background work (the blogs and MVPs), also grade. At the end of the course all of this grading is aggregated. There are plenty of opportunities for evaluation.
We have seen a team come in and “blow” a briefing. They will get more help from TAs, instructors and mentors. We tell them, “You're not allowed to blow this two weeks in a row.” Sometimes you have conflict within the teams because they're students, and sometimes one student has things going on and isn't carrying the load that week versus another week. The teams peer-evaluate each other in terms of their performance and carrying the load. We do a peer rating internal to the team; we do a peer rating external to the team looking at team quality; we do the background evaluation that the TAs do on the blogs; and the instructors evaluate the teams.
PB: The foundation for the H4D course is the Lean LaunchPad which was developed by Steve Blank. Steve also created the NSF I-Corps program. What is the difference between NSF I-Corps and H4D? Why should a university support both?
PN: The basic difference is that H4D is an academic course and I-Corps is an extracurricular activity. I-Corps was designed to transition basic research out of the university and commercialize it so that the government is able to get something back on their funding.
H4D’s focus is on the people, giving young men and women who are brilliant in their own right an opportunity to see the really hard, wicked, sexy problems that the government has to solve and to participate in things that are important to the country. While H4D sometimes produces companies out of the class, that is not our intent. Our intent is to give the body of brilliant young men and women resident in our universities an opportunity to work on things that are important to the country. H4D is an academic course because we strongly feel that without a grade it would be hard to get students to stay focused and stick with something this difficult.
The value proposition for the government is to bring problems to the table and get back world-class market research wrapped around those problems. Students will validate whether the government got the problem right. No problem that has come into a Hacking for Defense course has ever survived the first two weeks in its original form. At the end of the course, even if the team doesn't hand the government a solution, its research into the problem area and a potential solution pathway is the best world-class market research around that area that you can get your hands on. You can't go to Deloitte or Booz Allen and get that quality of work in that short a time.
It was important to make sure there was some incentive (a grade and professional reputation) for the students, to keep them at the table and keep them working on a project when it got really hard. We have had H4D teams go through the course and then receive an invitation at the end to join an I-Corps cohort. We have also looked at I-Corps cohorts or Ph.D.s and said, “You need to come to the H4D class.”
H4D and I-Corps are not competitive. They complement each other if you do it right.
PB: What happens to the teams and the proposed solutions to the problems after the course is completed?
PN: There are four possible outcomes. In the Stanford course, at the end of the final presentations, we require the students to do a final deliverable for the government problem owner. It is essentially a four- to five-page paper that captures their discovery, the learning and their recommendations for the government. That's the deliverable benefit to the government.
In most cases, the government says, “Thank you very much. Not sure what we're going to do with this, but we really appreciate the hard work.” The team scatters and goes off to do its own thing.
In some cases, the government will look at the team and say, “We love the work you have done; there's more to do here. We have fellowships and internships available for you; we'd like you to take them and keep working on this problem with us because we're not done yet.”
In other cases the team discovers a product that needs to be built or a company that needs to be launched, and they form a company coming out of the course. At Stanford this year we had two of the eight teams launch companies. One of them immediately augered itself into the ground and realized they weren't ready to be a company. The other team has already won close to a million dollars in government contracts only two months after the end of the course.
There are some cases where the students learned something while they were in this process, caught the entrepreneurial bug and either moved on