I get tired of all the talk about rewarding "failure" because it shows that people are trying and won't be penalized for it.
So much failure in innovation is unnecessary. It comes from hubris and a failure to acknowledge and learn from previous programs and best practices developed in other iterations.
I write about this a lot, in discussions of "positive deviance" and "pre-mortems."
I bring this up because of a conversation with some city planners at a conference last week. It's really tiring.
It's why I am fond of the quote attributed to Bismarck:
"Fools learn from experience. I prefer to learn from the experience of others."

From a 2007 blog entry:
Pre-mortems. From "Analyzing Failure Beforehand," in the New York Times:
Post-mortems, trying to figure out why a new idea failed, are a common business process. But wouldn’t “pre-mortems” make more sense?
They would, argues Gary Klein, chief scientist at Klein Associates, a division of Applied Research Associates, which works with companies to show them how to conduct pre-mortems and “identify risks at the outset.”
“A pre-mortem in a business setting comes at the beginning of a project rather than the end, so the project can be improved rather than autopsied,” Mr. Klein explains in The Harvard Business Review.
In the pre-mortem, company officials assume they have just learned that a product or a service they are about to introduce has “failed spectacularly.” They then write down every plausible reason they can think of to explain the failure. The list is then used to eliminate potential flaws before the new idea is actually introduced into the marketplace.
While companies frequently engage in risk analysis beforehand, employees are often afraid to speak up, fearing they will be seen as naysayers or will suffer the political consequences of objecting to an idea that is popular internally. An exercise that assumes the new idea fails frees people to be more candid, and can, Mr. Klein writes, serve as a check on the “damn-the-torpedoes attitude often assumed by people who are overinvested in a project.”

Positive deviance. I have also written a lot about "positive deviance." See the past blog entries "Positive Deviance and the DC Public Schools" and "Positive deviance in NYC school system remains unrecognized."
The idea of positive deviance is that even low-performing organizations have pockets of excellence.
Unlike with opposition to importing "best practices" from outside, members of the organization can't ascribe the difference in outcomes to different organizational conditions. Instead of working to import best practices, positive deviance is sort of like the Chrysler commercials -- "Imported from Detroit" -- stressing the excellence already present in the organization and "exporting" it to other sites across the organization.
-- "Your Company's Secret Change Agents," Harvard Business Review, May 2005, is the article that introduced me to the positive deviance concept.
Big data and pattern recognition. In the past few weeks, I have realized that my reluctance to jump on the big data train is that it makes people like me obsolete.
I am good at pattern recognition, structural analysis, and process design. Most people aren't. But using analytical tools and big data sets, people who aren't good at pattern recognition can have computer applications generate interesting results that they can then address.
I wrote about this a few years ago, somewhat derisively, in connection with an Arlington County initiative focused on battling student truancy. See "Creating the right program vs. the hype of big data." I wrote:
I regularly laud Arlington County's government here, and get some sniping in the comments. But I have to admit that I laughed out loud when I saw an article in the Sunday Washington Post, "Arlington schools tap 'big data' to reduce dropout rate," stating that ArCo is offering a $10,000 prize for using "big data" to figure out the school truancy problem.
Big data -- mining massive data sets with specialized programming tools and algorithms -- is like "Apps for Democracy," not the real issue (see "All the talk of e-government, digital government, and open source government is really about employing the design method").
The issue is process design, creating the right programs, devoting the necessary resources, and staying committed for the many years it will take to right the problem.
I don't know whether there is tons of waste in government, but there is a great deal of focusing resources on the "wrong" problems, along with mis- and dis-coordination of programs and resources.
My approach to planning is to start from the endpoint -- "what are your preferred outcomes?" -- figure out whether you are achieving preferred outcomes as a matter of course, routinely, and if not, look backwards at the processes that produce the outcomes.
Based on a wide variety of analyses (best practice review, program evaluation, etc.), figure out why preferred outcomes aren't being generated routinely, and change the processes accordingly, taking potential "unintended consequences" into account in advance, rather than not considering them at all.
This comes up again in a recent article in Government Technology ("Cities Are Having a Data and Analytics-Driven Moment, and It's Likely to Stay") about an initiative in New Orleans, where the firefighters decided to be proactive in distributing smoke alarms in neighborhoods with a higher rate of and risk for fires. From the article:
In New Orleans, the city has been saving lives by using data to predict which of the city’s buildings need to be equipped with fire alarms. Using data collected by the Census and New Orleans Fire Department, the city identified building age, building inhabitant income, and building inhabitant occupation length as strong predictors for determining if a structure may not have a smoke alarm installed. It then mapped this information along with fire risk calculated from resident age data and fire data over the previous five years. The program’s results now inform NOFD’s door-to-door program to install free smoke alarms.

To me the issue isn't big data but, first, the decision (1) to be proactive in distributing smoke alarms, and (2) to do so not willy-nilly, but in those neighborhoods with a higher risk of fires.
I find it hard to believe that the Fire Department doesn't already analyze runs and fires to know what types of properties and situations are high risk.
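For what it's worth, the targeting exercise the article describes doesn't require exotic tools. Here is a toy sketch in Python -- not New Orleans' actual model, which drew on Census and NOFD data -- that ranks buildings for door-to-door smoke-alarm outreach using the three predictors the article names (building age, resident income, length of occupancy) plus recent fire history. All of the data, weights, and thresholds below are hypothetical, purely for illustration.

```python
# Toy illustration of risk-based outreach targeting.
# Predictors follow the Government Technology article: building age,
# inhabitant income, occupancy length, plus local fire history.
# Every value, weight, and cutoff here is made up for the example.

from dataclasses import dataclass

@dataclass
class Building:
    address: str
    year_built: int
    median_income: int      # dollars, hypothetical Census-tract figure
    years_occupied: float   # current resident's tenure in years
    fires_nearby_5yr: int   # fires on the block over the past five years

def outreach_score(b: Building) -> float:
    """Higher score = more likely to lack a smoke alarm / higher fire risk."""
    score = 0.0
    if b.year_built < 1960:            # older housing stock
        score += 2.0
    if b.median_income < 30_000:       # lower-income tract
        score += 2.0
    if b.years_occupied > 15:          # long tenure: alarm may be absent or expired
        score += 1.0
    score += 0.5 * b.fires_nearby_5yr  # weight recent fire history
    return score

buildings = [
    Building("123 Oak St", 1948, 24_000, 22.0, 3),
    Building("456 Elm St", 2005, 78_000, 3.0, 0),
]

# Rank door-to-door visits, highest-risk first.
for b in sorted(buildings, key=outreach_score, reverse=True):
    print(f"{b.address}: {outreach_score(b):.1f}")
```

The point of the sketch is the same one made above: the analytical step is almost trivial once someone has made the underlying decisions -- to be proactive, and to target by risk rather than willy-nilly.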