Rebuilding Place in the Urban Space

"A community’s physical form, rather than its land uses, is its most intrinsic and enduring characteristic." [Katz, EPA] This blog focuses on place and placemaking and all that makes it work--historic preservation, urban design, transportation, asset-based community development, arts & cultural development, commercial district revitalization, tourism & destination development, and quality of life advocacy--along with doses of civic engagement and good governance watchdogging.

Tuesday, February 08, 2011

Artificial intelligence

There was an op-ed in the Wall Street Journal last week on artificial intelligence. AI is the effort to make machines (software and hardware) intelligent, so that they can make decisions independently of specific human direction.

I laughed at the headline, though, thinking that "artificial intelligence" is mostly what we do as elected officials and advocates: we make recommendations for how to improve things when we don't really know what's going on--what the fundamentals of the issue are, what the real problem is, how to respond, not to mention how to implement.

One current example is the Walmart issue in DC, which I have been writing about.

Another is a recent hearing on improving bicycle and pedestrian safety in DC. At least as rendered in a piece in TBD, it sounds like it was mostly incoherence ("Blogging the D.C. pedestrian and bicycle safety hearing"). I wasn't able to go--clearly I should have testified, and I will submit a position paper on the topic--but it would have driven me bat-sH** to sit there and listen to all the drivel.

But I could probably come up with a dozen more examples easily.

For example, I get sick and tired of reading articles about new books on the topic of making mistakes, such as "Not So Sorry: Kathryn Schulz, 'Being Wrong,' at Politics & Prose" from the Express, or about how failure is essential to making good entrepreneurs and to future success.

Yes, it's ok to make mistakes. Yes, you should admit when you're wrong.

No, it's not ok to not learn from mistakes.

And it's downright stupid not to learn the right lesson from mistakes, because that means you have learned nothing.

Speaking of quotes, I am fond of Bismarck's apt point:

"Fools learn from experience. I prefer to learn from the experience of others."

His point is that by paying attention to what others have done, you don't have to make the same mistakes yourself, and so you can be more successful.

So I will reprint this blog entry, which is relevant to all people, regardless of where they work, what they do, or how they are involved:

Helping Government Learn, March 23, 2009

Michael Skapinker of the Financial Times, in the column "Managers must listen before disaster strikes," calls our attention to the report from the UK National Audit Office, "Helping Government Learn." Also see the Executive summary (PDF - 238KB), Press notice (HTML), and Literature Review (PDF - 606KB).

From the column:

In 1999, newspapers were full of pictures of would-be British holidaymakers queuing in the rain because a new processing system at the UK Passport Agency had failed to produce their passports.

By contrast, the 2006 roll-out of “e-passports” – passports with electronic chips – proceeded without a bump. Why the difference? The leadership of the agency adopted new strategies. They talked about possible risks at the beginning of every meeting rather than at the end. Managers had a commitment to “knowledge transfer” – learning from others and teaching them – written into their contracts. The e-passports team also made regular contact with passport agencies involved in similar programmes in five other countries.

People at the top began with an assumption that they did not know everything – that front-line staff and customers probably knew more. Several of the report’s examples of listening managers were in departments that had, like the Passports Agency, previously suffered catastrophic setbacks.

The real challenge will be finding leaders who listen before disaster strikes. That will be harder. It is self-confidence that propels people to the top, not openness to criticism.

His piece is about organizations generally, and he starts by discussing the financial crash. He asks "Will employees who dissent always be ignored or harassed?"

I think yes. Too often, people see analysis as criticism, and as personally directed. I find two past articles from the Harvard Business Review particularly relevant to this issue: "Performing a Project Premortem," which is summarized as:

In a premortem, team members assume that the project they are planning has just failed--as so many do--and then generate plausible reasons for its demise. Those with reservations may speak freely at the outset, so that the project can be improved rather than autopsied.

and this article, "The Experience Trap," which is pretty damning. Rather than analyze issues and problems, most managers just expect that problems will occur and take this for granted, dooming themselves to repetitive failure rather than developing robust systems designed to capture learning and apply it to increase the likelihood of future success. From the article:

Why Learning Breaks Down

When anyone makes a decision, he or she draws on a preexisting stock of knowledge called a mental model. It consists largely of assumptions about cause-and-effect relationships in the environment. As people observe what happens as a result of their decisions, they learn new facts and make new discoveries about environmental relationships. Discoveries that people feel can be generalized to other situations are fed back, or “appropriated,” into their mental models. On the face of it, the process seems quite scientific—people form a hypothesis about a relationship between a cause and an effect, act accordingly, and then interpret the results from their actions to confirm or revise the hypothesis. The problem is that the approach seems to be effective only in relatively simple environments, where cause-and-effect relationships are straightforward and easily discovered. In more complex environments, such as software projects, the learning cycle frequently breaks down. In the experiments we carried out with our study participants, we identified three types of real-world complications that were associated with the cycle’s breakdown.

What I try to do through the writings in this blog is to capture and disseminate knowledge, to suss out the reasons for success or failure, and figure out how to capture that learning into the development of robust change programs:

1. Indicate -- identify the particulars of the processes and structures of success and failure.
2. Duplicate -- figure out how to duplicate (repeat) success.
3. Replicate -- develop the systems, structures, frameworks to apply programs to different situations and communicate them throughout innovation networks.
4. Accelerate -- figure out how to speed up successful innovation and programs.

Transformation, innovation, and creative problem solving are the goal.

But most people aren't too interested in dealing with challenges to their world-view--especially higher-up officials, and often people in "think tanks."
