Archive for February, 2010

Antimetabole is more fun than it sounds

Saturday, February 27th, 2010

“Be precise in the use of words and expect precision from others” —Pierre Abelard

I love words, especially obscure but delightfully precise ones. I enjoy it when authors use language that sends me to the dictionary.

The other day a friend of mine quoted to me “Good fish ain’t cheap and cheap fish ain’t good.” He described this style of phrasing as “chiasmus,” a word I did not know. Upon looking it up I discovered that while the phrase was indeed chiasmus, it was also a good example of a subset of chiasmus called “antimetabole.” According to Wikipedia, many people confuse the two.

Chiasmus refers only to the arrangement of grammatical elements in the sentence, while antimetabole depends on the repetition of the words. The structure of chiasmus is this:

Subject, adverb, verb, conjunction, subject, verb, adverb

Thus, the phrase “He brightly spoke and I replied clearly” is chiasmus. The statement “I meant what I said, and I said what I meant” is antimetabole.
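The repetition-and-reversal structure of antimetabole is mechanical enough to sketch in code. Here is a toy checker of my own devising (not from the original post; the comma-splitting heuristic and the stopword list are illustrative assumptions, and it will miss plenty of legitimate examples):

```python
import re

# Common function words to ignore when comparing the two halves.
STOP = {"i", "a", "an", "and", "the", "what", "when", "you", "your",
        "not", "to", "of", "for", "in", "is", "are", "can", "do"}

def words(phrase):
    """Lowercase the phrase and keep only the words."""
    return re.findall(r"[a-z']+", phrase.lower())

def is_antimetabole(phrase):
    """Rough check: split at the first comma or semicolon, find the
    content words the two halves share, and see whether the second
    half repeats them in reverse order."""
    parts = re.split(r"[,;]", phrase, maxsplit=1)
    if len(parts) != 2:
        return False
    first, second = words(parts[0]), words(parts[1])
    shared = []
    for w in first:
        if w in second and w not in STOP and w not in shared:
            shared.append(w)
    if len(shared) < 2:
        return False
    positions = [second.index(w) for w in shared]
    return positions == sorted(positions, reverse=True)

print(is_antimetabole("I meant what I said, and I said what I meant"))   # True
print(is_antimetabole("When the going gets tough, the tough get going"))  # True
print(is_antimetabole("He brightly spoke and I replied clearly"))         # False
```

The chiasmus example fails the test precisely because, as noted above, chiasmus reverses only grammatical roles, not the words themselves.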

Chiasmus is sufficiently obscure to be the province of English lit students and Shakespeare deconstructionists, but antimetabole is one of those far more accessible and amusing little sideshows of the English language, despite its scholarly moniker. Its rhythm and pattern make it particularly memorable, and thus many great quotations and aphorisms follow the antimetabole pattern, such as:

“Those who know aren’t talking, and those who are talking don’t know.”

“If everybody is thinking alike, then somebody isn’t thinking.” —George S. Patton, Jr.

“When the going gets tough, the tough get going.” —Joseph Kennedy

Here’s a transcript of a brief Public Radio story on the use of antimetabole during the ‘08 presidential campaign. It starts with what is arguably the most famous example of oratorical antimetabole, when President John Kennedy said in his 1961 Inaugural Address, “Ask not what your country can do for you; ask what you can do for your country.”

Of course, there are many unnamed cousins of antimetabole. There is the homophonic variant, where instead of reversing the order, we change the meaning by changing the spelling, but not the pronunciation, of one of the words. For example, my colleague Jonathan Korman says, “No whiteboards, no design; know whiteboards, know design.” Korman’s phrase is a delightfully secular twist on the fundamentalist Christian version, “No Jesus, no peace; know Jesus, know peace.”

Most crafts encapsulate their wisdom in antimetabole, and airplane pilots have some good ones. They say, “Plan your flight and fly your plan.” My personal favorite pilot saying is “It’s better to be on the ground wishing you were in the air, than to be in the air wishing you were on the ground.” To drive home the point that caution is a healthy attribute in a pilot, they say “There are old pilots and bold pilots, but no old, bold pilots.”

And then there is the homonymic variant, where the spelling remains constant but the meaning changes. These aren’t strictly chiasmus in their grammatical order, but they are nevertheless fascinating:

“You will be fired with enthusiasm or you will be fired with enthusiasm.” —Vince Lombardi

“We will hang together or we will hang separately.” —Benjamin Franklin

Some other favorite examples of antimetabole include:

“In theory, there is no difference between theory and practice; in practice, there is a big difference.”

“Not everything that can be counted counts and not everything that counts can be counted.” —Albert Einstein

“I never entertain wicked thoughts; wicked thoughts entertain me.”

“You don’t get what you don’t pay for.”

“Just because you can do something doesn’t mean you should do it, and just because you should do something doesn’t mean that you can do it.”

“I have always found that plans are useless, but planning is indispensable.” —Gen. Dwight D. Eisenhower

“Conservatives believe it when they see it; liberals see it when they believe it.” —Rep. Dick Armey

“Good judgment comes from experience and experience comes from bad judgment.”

“As far as the laws of mathematics refer to reality, they are not certain; as far as they are certain, they do not refer to reality.” —Albert Einstein

“I have taken more out of alcohol than alcohol has taken out of me.” —Winston Churchill

If you know of any other good examples of antimetabole, please share them with me.

There Because It’s There

Friday, February 26th, 2010

I was poking around the internet when I stumbled upon a reference to a year-old blog posting by Jakob Nielsen. The referring person was a UX professional who asked,

“Have you seen what Jakob Nielsen suggests about masking passwords: I think he’s gone cuckoo.”

Upon reading that provocative accusation, I had to follow the link to see just how crazy old Jakob had become.

But Jakob’s brief, clear post was relevant, correct and well-reasoned. He instructed us to stop masking password entry fields. It was the UX professional who had “gone cuckoo.”

Showing bullets instead of the actual characters of your password to obscure it from onlookers is one of those interface idioms that have been around forever. Unfortunately, its age is the only possible reason for its continued existence. Though intended to enhance security, its effect is the opposite.

One of my design axioms is, “Design for the probable; provide for the possible.” It is possible that some nefarious person with both the means and motive to steal your identity is just awaiting the opportunity to peer over your shoulder and memorize your 8-character, mixed-case, partially numeric, non-mnemonic password. However, it is far, far more probable that you are alone, or somewhere nobody can clearly see your mobile’s screen, or in a pub surrounded by friends with whom you have shared far more than just access to your Amazon account.

What’s more, because the characters are obscured, it is far, far more probable that you will hesitate halfway through typing your password and lose your confidence that you have typed correctly. This forces you into taking the extra step of erasing and retyping. In other words, the extra thought and work is frequently necessitated but rarely useful. Instead, a simple option to turn on masking would push the extra work onto the rare—but possible—case when one is surfing the internet in a hostile environment.

Password masking undoubtedly originated when some clever programmer put it in a program to show off. I can hear him now, bragging to his colleagues, “Somebody might need to enter his password in a hostile environment.” Whenever you hear that telltale phrase, “Somebody might…” you are about to be covered in interface slime.

The “Panic” button on automobile remote-entry keyfobs presents an identical problem. I suppose it is theoretically possible to imagine a case when someone would want to intentionally set off their car alarm, but I have never heard even a whisper of a real situation. What is far more probable is what happened to me just the other night. I was watching TV and accidentally dropped the remote. Upon bending to pick it up, something in my pants pocket pressed against my keyfob, and my car’s alarm went off. Everybody in the neighborhood heard the blasting horn while I fumbled to shut it off.

I guarantee that some automotive engineer a decade ago, working on the new remote keyless entry system, had a brainstorm about a rare possibility. “Somebody might want to set the alarm off intentionally,” he said to himself, and created the Panic button. The marketing department loved the idea because it seemed they could offer a new feature at no additional cost.

Sadly, there is additional cost, one not measured in money, but in the lowered quality of experience. I would gladly pay to have that evil Panic button removed from my keyfob, yet every new car still comes with one, simply because it has always been there, and that’s a terrible rationale.

Jakob Nielsen pleads with us to “clean up the cobwebs and remove stuff that’s there only because it’s always been there.” I think it’s cuckoo not to.

The oxymoron of consumer-facing open-source

Sunday, February 21st, 2010

My colleague, Dave Malouf (@daveixd), asked me an intriguing question in response to a tweet I twittered. I asserted that “There is no greater oxymoron than consumer-focused open-source software!” Dave replied, wondering if I thought “that Firefox has shown that open source consumer software CAN work?”

Firefox is an excellent program, reasonably easy to use, with some very user-friendly attributes. Firefox certainly proves something about open source, but I don’t think it proves that it can be consumer facing, exactly.

I believe that Firefox proves only that, on rare occasion, and by accident, some non-consumer-facing software faces consumers successfully. That may sound like I’m splitting hairs, but let me explain.

The defining characteristic of open source is that it is designed and built by and for those who will use it. Since no one is paid to write it, open source is software written by interested parties whose only compensation is the joy of their personal accomplishment. Of necessity, those parties must possess the skills to create it. Non-programmers might like some program, but they are not capable of creating it, so it will remain unwritten. Thus open source is always designed and created by and for computer programmers, for geeks.

This means that open source products always fit into one of two categories: system software or geek verticals. Both of these categories are things that software geeks love.

System software is the largest category. Operating system geeks build open source operating systems like Linux to run their computers. Programmers trying to wrangle large collections of code build open source version control systems.

Geek verticals is the broadest category. Geek astronomers who love gazing at the night sky build open source telescope management software. Geek motocross drivers build open source driver-rating software to manage their sport-driving meets. Geek model railroaders (nobody geekier) write open source software to manage their miniature train empires.

Some people falsely extrapolate from the success of open source software that one could compel geeks to write open source software for their commercial needs, but this will never, ever happen. The only case in which this would appear to be untrue is in the extremely unlikely case where the needs of consumers happen to coincide exactly with some system software or geek vertical category.

Of course, this is exactly the case with Firefox. Any web browser is both system software and a geek vertical. The original concept of the web browser was as a system tool, on the order of ftp or awk, and was never imagined to include normal users. It allowed geeks to reach across network boundaries. But it turned out that consumers could also benefit from accessing information across a broad internetwork.

In addition, the original concept of the web browser was as a geek vertical, on the order of grep. It was a simplified tool for technologists to quickly scan geeky academic data stored on the geeky Unix networks of the world’s universities and corporate research labs. It was quite an accident that the Web browser turned out also to be something that normal humans wanted.

And the scope of the browser, while broad, actually didn’t demand much explicit control for the normal range of consumer usage. The complexity could be managed by the particular website. This should sound familiar. The original personal computer operating systems of the 1970s, CP/M and MS-DOS, were an identical rare accident of desire and capability, with the ability to encapsulate the complexity of applications. The spectacular success and world-changing nature of both the personal computer and the World Wide Web are testimony to the power of this incredibly rare congruence.

Nice video

Sunday, February 21st, 2010

I love this video!

I love the style: very fast-paced, with the narration matched to the words and enhanced by great diagrams.

The visual interplay of the diagrams is very effective and entertaining.

The Walter-Winchell-esque narration will appeal to some and not others, but I like it. Doing the voice-over this way makes it sound less serious than it otherwise might be.

This video was a school project made to illustrate one of William Lidwell’s Universal Principles of Design. It’s one of my personal favorite principles largely because it isn’t based on aesthetics, but on organization.

The Fucking Shovel Joke

Thursday, February 18th, 2010

The Mother Superior went to the construction site next to the convent seeking out the Foreman. She asked him to “Please stop your men from using crude and foul language as the Novices and Sisters should not be exposed to such offensive speech.”

The Foreman stood his ground, saying that while he sympathized, rough working men and laborers sometimes “called a spade a spade.”

The Mother Superior drew herself up and replied “Well, most of the time they call it a fucking shovel!”

This is a very old joke; so old that it doesn’t get told in its entirety very often, and I thought that there would be some benefit to repeating it in full. (I apologize if the strong language offends you. If it does, please stop reading my blog.)

Sometimes, when talking to others, I use hyperbole to make my point. Sometimes, I go just a little bit too far, and I use this joke to get myself off the hook. My intent is to drive home a point, not to drive people away.

Many programmers and other hyper-intelligent people I have known over the years have this same tendency to call things “fucking shovels” when they merely want to call a spade a spade. They want to be scrupulously clear about their meaning. For everyone’s sanity, it’s good to understand that taking a strongly polarized position in discourse is a rhetorical technique, and not necessarily representative of a rigid, polarized opinion.

We who enjoy polemics play this game often. It combats complacency and it motivates the reluctant to act positively. It can be a very good thing to paint a picture in black and white.

On the other hand, many people have difficulty distinguishing between rhetorical hyperbole and rock-solid belief. Such a listener will discount everything you say if they find fault with anything you say.

On the third hand, the nature of people in groups always trends toward maintenance of the status quo. Because I always want to see forward movement, I will always be guilty of calling a spade a fucking shovel.

But I understand that there is a twelve-step program for this…

Book Review: On the Origin of Stories by Brian Boyd

Wednesday, February 17th, 2010

Darwin’s theory of evolution is generally regarded as one of the most important scientific revelations in history, if not the most important. Even amateurs like me give it props, but respect and understanding are two different things. It’s particularly difficult to wrap your head around the theory’s profound implications when you are one of those evolved forms. To put it another way, while humans have evolved many powerful mental abilities to help us understand the world in which we live, we haven’t evolved any particular mental ability to let us clearly see ourselves in an evolutionary perspective. We struggle to grasp what it really means to be an animal, driven by instinct, composed of an unordered set of adaptations, and not the rational, clear-headed, self-directed person we imagine ourselves to be.

It’s true that humans are animals with a highly evolved set of cognitive powers, but that doesn’t mean that we are without instinct or without invisible motivations rooted in our survival adaptations. Not only do humans have a history of ignoring the effects of their adaptations, but our recent history shows us misunderstanding and abusing them. In the middle of the twentieth century, the pseudo-science of eugenics was used as a justification for genocide. Academia, in recoiling from eugenics, banished any enquiry into how Darwinism might apply to Homo sapiens. For at least a half-century, serious enquiry into the evolutionary basis of human behavior was suppressed. Unfortunately, in the vacuum, touchy-feely psycho-babble like Freudianism dominated the landscape of study.

The passage of time, as well as such technical tools as computer-aided tomography, has finally allowed serious scientists to turn their attention to human evolution, provoking only ragged outbursts of hysteria. The last couple of decades have seen a tremendous explosion of fascinating and important work in the many new evolution-based fields of study.

Enough researchers have probed the subject with sufficient rigor and repeatability to elevate cutting-edge, evolutionary psychology to the level of “hard” science. We are not just working with theories and metaphors any more. What’s more, there are many skilled writers bringing scientific findings to the amateur reader. Brian Boyd’s new book, “On the Origin of Stories”, subtitled “Evolution, Cognition, and Fiction”, is a fine example.

The book is a fascinating enquiry into how humans think, behave, and conceive of the world and each other through the startling lens of our evolved survival mechanisms. Boyd says that “Minds exist to predict what will happen next.” What sets this book apart is its focus on the role of art, and particularly the art of telling not-strictly-true-stories, in shaping human behavior and civilization.

Some readers will object to Boyd’s pedantic, step-by-relentless-step defense of fiction—and all art—as an evolved behavior. But political correctness, in its unyielding fight against eugenics, still actively combats contemporary researchers when they look human-ward. To make his case, Boyd clearly feels that he must not only examine art from an evolutionary perspective, but he must also examine evolutionary biology as well, which he does doggedly, but effectively. Boyd explains, “Since evolution challenges deeply held intuitions about our special place in the world, since the controversies have been sharp and the confusions and misrepresentations profuse, we have to tread carefully.” Even if you are not a hyper-sensitive, political-correctness Nazi, or a bible-thumping Creationist, seeing human nature through Darwinian lenses can be challenging to one’s self-image, and while Boyd’s methodical arguments are demanding, they are not unwelcome.

I’m neither an academic nor a scientist, but a designer and builder of software, which means that I endow high-technology products with human-facing behavior. Understanding how humans conceive of the world around them, and how they are motivated, is an essential skill for anyone in my line of work.

Historically, computer programmers have ignored all of this human stuff and instead immersed themselves in the abundant technical minutia of programming. Of course, the software they wrote was heartbreakingly difficult to use. It took many years and many tears before an awareness of the linkage between bad software and misconstrued human nature emerged. And sort of like those vestigial defenders against eugenics, there remain many who resist the idea that evolutionary psychology plays a role in the design of software. Yet, because evolutionary psychology directly addresses human motivation, it is arguably the single most useful tool for understanding and designing the form and behavior of software.

Not only are Boyd’s opening chapters on evolutionary psychology an excellent précis of the territory, but his focus on the evolutionary origins of storytelling is even more useful to the software designer. Narrative, or storytelling, is a vitally important tool both for the design of behavior and for the communication of that design.

Boyd gives us a vocabulary for understanding storytelling. He shows how humans conceive of the world through stories. Our relations with others are framed by our physical memory into narratives with characters and events to enable future recall. Our values and our perceptions are based on these storytelling mechanisms. We imagine the world through narrative eyes: plot and character, event and intent, attention and pattern, anticipation and surprise.

Storytelling is what allows the software designer to imagine real people in front of our software creations. Storytelling allows us to see their instinctive human motivations and perceptions at work as they manipulate the interfaces that we design. Storytelling allows us to share our abstract designs with others who must implement them.

The evolutionary basis of art is probably the least examined, and least understood, area of contemporary evolutionary study. Even Steven Pinker’s 1997 book, How the Mind Works, a sweeping overview of the field, fumbled the art ball. Pinker posed two theories. The first is that art is merely a by-product—or vestigial remnant—of our big brains used for other, more important things. Pinker’s second argument is that art is used to attract mates. Being otherwise without purpose, owning an expensive painting, for example, communicates one’s wealth and, by extension, one’s ability to nurture offspring.

Boyd kindly but thoroughly dismantles both of Pinker’s arguments. Art-as-by-product falls to the observation that evolution quickly abandons costly but useless abilities; Boyd cites the way cave-dwelling salamanders soon become sightless. Art-as-sexual-attractant is a more resilient argument. Darwin described such mechanisms, like a peacock’s tail, calling the process “sexual selection.” Boyd argues,

“If art were sexually selected, this would predict that it is overwhelmingly male and directed to females, developing rapidly at puberty, peaking just before mate selection, and diminishing drastically afterward. But mothers of all cultures sing to infants; infants prefer their mother’s singing to their fathers’; infants of both sexes engage in cooing and singing, clapping, and dancing as soon as they can.”

It is easier to recognize human adaptations when we look at the simple, fundamental manifestations of art in human behavior than it is to discern them by gazing at a Vermeer or a Klee.

Boyd says “Evolution by natural selection is a simple principle with staggeringly complex and unpredictable results.” Nowhere is this more true than in that uniquely human affect, art.

“Despite its many forms, art, too, is a specifically human adaptation, biologically part of our species. It offers tangible advantages for human survival and reproduction, and it derives from play, itself an adaptation widespread among animals with flexible behaviors.”

Some people might find it hard to believe that something instinctive and important for human survival could also be so entertaining, enjoyable, and often inconsequential. But humans persist in having sex even when there is no chance for reproduction.

Boyd also argues that just because something serves one purpose doesn’t mean that it can’t serve others as well.

“Eyes evolved for vision, but we also use them for communications: hence our contrastive white sclera, which highlight the direction of another’s gaze, and our highly refined capacities for registering and inferring attention and intention from others’ eye direction. That we can now intimidate others with our stare does not refute the fact that eyes evolved for vision.”

The core of Boyd’s case is that art is a form of cognitive play. “We can define art as cognitive play with pattern. Just as play refined behavioral options over time by being self-rewarding, so art increases cognitive skills, repertoires, and sensitivities.” The role of play is to safely practice what we will eventually need to survive. We play at fighting because eventually we will fight for our lives. We play at storytelling because eventually the real story that plays out around us will determine our fitness to succeed.

Storytelling, and all art, is the tool and training ground for humans to learn and practice social interaction. Humans develop a sense of “event comprehension” while still in the cradle. Events, of course, are a key element of plot and narrative.

The brain works by strengthening paths that are used repeatedly, so we need to practice those skills necessary for survival. In the ultra-social world of humans, perceiving and understanding the honesty and intent of others is paramount.

Boyd describes it by saying,

“A work of art acts like a playground for the mind, a swing or a slide or a merry-go-round of visual or aural or social pattern. Art’s appeal to our preferences for pattern ensures that we expose ourselves to high concentrations of humanly appropriate information eagerly enough that over time we strengthen the neural pathways that process key patterns in open-ended ways.”

The portion of our brains that is uniquely human, the neocortex, is constructed differently from the older parts of the brain, and it functions differently, too. It works more like an executive, integrating signals from widely disparate faculties. This necessitates a mechanism for aiming the executive. Boyd asserts that this mechanism is attention.

Attention is what allows us to focus on the tiny behavioral cues of others, to determine their intent and to assess their validity. Conversely, it allows us to alter our behavior, so that the cues we send to others suit our own intentions. Stories become cognitive exercises that focus our attention on “perceived patterns of behavior in order to infer intent.”

Attention is remarkably important to humans. To a significant extent, attention is the true currency of human civilization. The maxim “survival of the fittest” seems to say that all of us evolved beasts are constantly in competition not only with our surroundings, but with each other. That is certainly true, but it is also true that we cooperate, and we often do so across large populations. By definition, any social species must cooperate effectively, and humans are exceptional in this regard.

“All social species prosper more together than alone, or they would not remain social, but humans take this to another level, ultrasociality, the most intense cooperativeness of all individualized animal societies. Not endowed by nature with formidable strength or speed, we have been able for hundreds of thousands of years to coordinate our activity sufficiently to kill large prey—and, for thousands of years to construct pyramids or cathedrals and settlements of thousands or even millions.”

Cooperation itself is a multi-faceted thing. At the lowest level is mutualism, wherein simply being near others of the same species is helpful: watching for predators, for example.

The next step, active cooperation, explains why parents look out for their children. But in this case, the direct tie to genes is obvious. How can cooperation be explained when those involved don’t share genes? Deduced from game theory, the answer is termed reciprocal altruism: “I help you in the expectation that you may help me later.” Of course, it is still very easy to cheat, so humans have evolved many cognitive tools for the detection, prevention, and punishment of cheaters. These uniquely human tools include

“Sympathy, so that I am inclined to help another; trust, so that I can offer help now and expect it will be somehow repaid later; gratitude, to incline me, when I have been helped, to return the favor; shame, to prompt me to repay when I still owe a debt; a sense of fairness, so that I can intuitively gauge an adequate share or repayment; indignation, to spur me to break off cooperation with or even inflict punishment on a cheat; and guilt, a displeasure at myself and fear of exposure and reprisal to deter me from seeking the short-term advantages of cheating.”

Boyd says, “Rather than merely taking these emotions as givens, we can account for them as natural selection’s way of motivating widespread cooperation in highly social species.”

One of the more fascinating aspects of this book is its description of the empirical tools employed by evolutionary psychologists to explore their theories. Boyd describes the work of evolutionary psychologist Leda Cosmides, who attempted to discern the workings of the human anti-cheating mechanism. She found that although people struggled to solve problems in logical reasoning when presented in the abstract, they solved them easily when the problems were presented in terms involving cheating in social exchange.

Behaving in non-selfish ways is logical only if you see the bigger picture of cooperative groups, but people don’t think of groups; they simply behave according to their emotions. And those emotions are mechanisms that function regardless of logic. Our justice-detection mechanism, key to reciprocal altruism, often makes us behave in ways that defy the assumptions of logicians.

“In one experiment, the dictator game, two strangers play for (usually real) money, say $100. I, as dictator, must offer you a share. If you accept the division, both of us keep our agreed portions. If you reject the offer, neither of us receives anything. In terms of strict economic rationality, an offer of a dollar, even a cent, would leave the second participant better off, and should therefore be accepted. But … often if the sum offered is only a little under $40, the respondent rejects it. A sense of fairness in social exchange overrides the rational calculation of gain. We have evolved not to be ‘rational individuals’, profit maximizers, but social animals, holding others to fair dealings even at our own cost.”
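The payoff structure Boyd describes can be sketched in a few lines of code. This is a toy illustration of my own, not from the book; the function name is invented, and the $40 rejection threshold simply mirrors the figure in the quotation:

```python
def play_round(offer, threshold=40, pot=100):
    """One round of the game Boyd describes: the proposer offers `offer`
    dollars out of the pot; the responder accepts only if the offer meets
    a fairness threshold. Rejection leaves both players with nothing."""
    if offer >= threshold:
        return pot - offer, offer  # (proposer's take, responder's take)
    return 0, 0                    # fairness overrides the small rational gain

# A strictly 'rational' responder (threshold=1) would accept almost anything...
print(play_round(1, threshold=1))   # (99, 1)
# ...but human responders typically reject offers much under $40.
print(play_round(30))               # (0, 0)
print(play_round(40))               # (60, 40)
```

The point of the experiment is in the second line of output: a human responder happily pays $30 of his own potential winnings just to punish an unfair proposer.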

Stories originated in the need for informing our justice systems; for monitoring people’s compliance with fairness. Gossip is the simplest and most widespread form of this, and ultimately, all fiction derives from it. Out of such simple mechanisms grow mighty civilizations.

Slowly, methodically, Boyd steers us away from conventional thinking about the role of art and narrative. The importance of art can be seen in its ubiquity. “Art,” he says,

“(1) is universal in human societies; (2) it has persisted over several thousand generations; (3) despite the vast number of actual and possible combinations of behavior in all known human societies, art has the same major forms (music and dance; the manual creation of visual design; story and verse) in all; (4) it often involves high costs in time, energy, and resources; (5) it stirs strong emotions, which are evolved indicators that something matters to an organism; (6) it develops reliably in all normal humans without special training, unlike purely cultural products such as reading, writing, or science. The fact that it emerges early in individual development—that young infants respond with special pleasure to lullabies and spontaneously play with colors, shapes, rhythms, sounds, words and stories—particularly supports evolutionary against nonevolutionary explanations.”

As I read this book, I assumed that Boyd was a scientist, an evolutionary biologist. At some point I glanced at the dust jacket and was surprised to find that he is a professor of English in New Zealand, and that he is “the world’s foremost authority on the works of Vladimir Nabokov.” The book is divided into two equal portions. The first half is pure evolutionary psychology, and I found it quite fascinating and informative. In the second half of the book, Boyd-the-English-professor emerges. He presents two timeless works of literature from the evolutionary point of view. You can get a glimpse of his dual nature just by knowing that his selections are “The Odyssey” by Homer, and “Horton Hears a Who” by Dr. Seuss.

While I devoured the first half of the book, the literary second half seemed prolix and redundant to me. You may have a different experience. In any case, I recommend the book to anyone interested in evolutionary science, and in particular to any practitioner in the world of human-facing software.

Agile and Responsibility

Friday, February 12th, 2010

Agile gives developers something of great value: it gives them responsibility.

Agile developers take responsibility not just for implementation quality, but for the correctness and appropriateness of the product they build. While agile is composed of several programming techniques and several more team-organization methods, the essential element is that individual programmers are expected to behave as though they have an ownership stake in what is being produced. They must act as responsible craftsmen, not as misunderstood, misdirected, over-qualified assembly-line workers.

Pre-agile development methods forced the judgment-intensive, knowledge-rich, internally motivated work of programming into a rusty, old-fashioned, command-and-control management hierarchy based on following orders, need-to-know, and external motivation. The fit was terrible, and programmers grew isolated in self-defense. Hunkering down protected them from disaster, but it rarely resulted in work they could be proud of.

Nobody was happy with this situation, and both sides began to take corrective action. Business people began to send their development work offshore. They figured, “If I can’t get good quality, I might as well pay less for it.” At the same time, developers realized that the risk of changing their role was smaller than the risk of losing their job. Developers began to use minimalist running code as a tool for communicating with their business colleagues, and agile was born.

Rather than doing what the boss tells them to do, the agile developer engages the boss in an active, collaborative dialog, with running code as the focal point, to jointly determine the right software to build. The vital element of this new paradigm is a shift in power away from command and control and towards a more enlightened organizational form. Agile developers enjoy the increased responsibility and self-determination, and business people enjoy the benefits of better quality software reaching the market sooner. Everyone benefits from the cultural change away from polarized, armed camps towards a more fraternal team sharing common goals.

I love my cat, but…

Saturday, February 6th, 2010

I love my cat but it’s hard to type like this.

I love my cat but sometimes he makes it hard to type.

Is the Microsoft Way counter-productive?

Saturday, February 6th, 2010

Dick Brass just published a fascinating op-ed piece discussing Microsoft’s inability to innovate. As a Vice President at Microsoft from 1997 to 2004, he had an excellent vantage point. At the risk of over-simplifying his argument, he lays the blame on the company’s over-competitive culture, and my experience supports this.

He says “when the competition becomes uncontrolled and destructive [it] creates a dysfunctional corporate culture in which the big established groups are allowed to prey upon emerging teams, belittle their efforts, compete unfairly against them for resources, and over time hector them out of existence.”

Not too many years ago, Microsoft was the poster child for the post-industrial company: smarter people in a flatter organization with open lines of communication, an antidote to industrial-age command and control. Most software companies copied the model, and it allowed them to build enterprise infrastructure and desktop productivity applications.

But the Microsoft Way hasn’t really satisfied the users of software or those who labor over its creation. And the Way has been largely abandoned by the new, nimble, social-networking and Web 2.0 applications entrepreneurs of the last few years. It has become clear that Microsoft’s Way was just a first approximation of the company of the future.

The model now emerging is based on cooperative, collaborative, self-organizing teams. All of the many creative disciplines in the high tech world resonate with its supportive warmth. Surprisingly, its leading proponents are programmers, and even the community of interaction designers is playing catch-up. Under the loose rubric of “agile”, they advocate teams that are “cross-skilled,” egalitarian, and self-organizing; they espouse the value of taking responsibility for one’s own actions. This new organizational meme is still morphing and trying to find its optimal configuration but it is clearly superior and will come to dominate the industry.

In the late 1980s I sold some software to Microsoft and got a glimpse into their culture. They not only wanted to buy my program, but they wanted to hire me too. Had I joined in 1988 I would be a very rich man today, but I turned them down, much to their consternation. At the time, I couldn’t have articulated why I didn’t want to join them, but this summation by Brass does a good job.

Back then, if you were creative, inventive, aggressive, and ambitious, Microsoft was far and away the best place to be. I’ve always thought of myself as possessing those attributes but, when the time for a decision came, I voted with my feet. I chose to indulge my devotion to creativity and invention by not subjecting them to the intense competition of aggressively ambitious co-workers at Microsoft.

Over the years, Microsoft’s approach has worked pretty well from a money point of view, and while a company that big may lose its leadership role, it will take a long, long time for it to diminish in fiscal size. But, as Brass points out, Microsoft’s approach doesn’t sustain it as a leader. For a company to innovate over the long haul, it can’t buy all of its intellectual property from outside its walls. It has to create a culture where innovators feel comfortable. And innovation can only thrive where competition isn’t dominant.

As Scott Berkun points out in his excellent book “The Myths of Innovation”, real breakthroughs tend to look silly at first blush. For example, if I had said ten years ago that I imagined a portable MP3 player with only one button, it would have sounded silly. Portable MP3 players existed back then, and they, like all techno-geek products, had lots of buttons on them. And they all failed. The overnight success of the iPod made the one-button concept orthodox, and its silliness forgotten. But what matters is how the culture reacts to ideas before they are validated, or not, by the marketplace. Highly aggressive, competitive, and ambitious people are often uncomfortable with things that look silly, and Microsoft’s culture tends to reject ideas that don’t feel orthodox.

As an inventor, I haven’t really helped my own case, always focusing on the creative idea and expending little effort on the business case. Sometimes my vision of how the innovation will actually be deployed is simply wrong. Visual Basic is an excellent case in point. I originally sold it to Bill Gates to be a front-end shell for Windows. Whether or not it could have succeeded in that role is moot, but to Gates’s credit, he found a way to deploy my invention so that it would make many millions of dollars. But the central innovation didn’t originate at Microsoft, and I’ve no doubt that I couldn’t have invented it had I been working inside Microsoft.

There is another sub-culture that exhibits the same characteristics as Microsoft’s over-competitive, too-serious-for-silly-innovation mindset: high tech venture capital. While the aggressive, competitive, Menlo Park venture capital model has made lots of money over the years, it has little role in innovation today. As a group, VCs have enormous amounts of available capital, and a dearth of places to put it. Yet, when they do invest, they always demand performance metrics that are virtually guaranteed to create an overly-competitive, too-aggressive, deadly-serious culture that stifles silliness on the perceived path to “innovation.” Rational business minds tend to believe that success will somehow look successful when they first see it. They reject the notion that innovation might look wrong.

Over the years, I’ve brought several innovative ideas to many venture capitalists, and all of them have been rejected (for which I’m as grateful as I am about not joining Microsoft back in the day). At the time, each rejection mystified me, as the VCs themselves usually had a high regard for my past performance. They would squirm uncomfortably when I showed them prototypes embodying innovative and powerful new concepts. (For years I unsuccessfully pitched what, when finally invented by others, came to be known as tags and XML.) Ultimately, they couldn’t put their fund’s money into something that was purely creative and innovative for the simple reason that it required them to believe that my silliness was a marker for eventual profit.

Contrary to lore, the VC world has never invested much in innovation. They restrict their participation to companies that merely deliver proven products at a lower price point, or to later-stage investments, after the aura of silliness has dissipated in the glow of market acceptance. VCs will continue to make money, but they won’t play a big role in innovation; the same fate as Microsoft.

Although innovation is the fuel for successful business, it doesn’t thrive in the culture of competitive business. The two are antithetical. While innovation can make lots of money for a company, until it has fully gestated, the one thing you won’t see in truly innovative thinking is the shadow of money and competition.

Craftsmanship Rant #729B

Thursday, February 4th, 2010

A new cyber-friend of mine, Bob MacNeal (@BobMacNeal) Twittered me a good question a few days ago. He wanted my opinion of the relationship between craftsmanship and beauty. I felt that a proper answer deserved more than 140 characters, so this is my reply to him.

I try to use the term “interaction design” to differentiate what I think is important from the many aspects of design that are amusing and nice-to-have, but ultimately, don’t improve the user’s experience.

To my great chagrin, some of the most tragic confusion about this differentiation exists in the minds of those who claim to be able to help agile developers by applying their personal variant of “design”. Bah!

Interaction design is about taking responsibility for making certain that one is solving the correct problem, and that the problem is solved correctly. In the world of software, aesthetic appeal might be on the list of desired goals, but it ALWAYS comes a far distant second to making the user feel successful and rewarded.

These wacky art-based designers try to make things look good, but it’s the behavior of the software that makes the difference. Britney Spears *LOOKS* good, but she acts like an imbecile. Her behavior is what makes the difference.

The most successful websites in history (Google, Amazon, Wikipedia) are tragically ugly. Some of the most beautiful sites in history are, well, history.

I use a simple test to differentiate art and design (or art-based-design versus craft-based-design), and it is this:

The artist can change the problem to suit his solution; the designer cannot. An artist takes a tree and finds the beautiful artifact hidden within it. He might find a table or a railing or a chest. The problem morphs to exalt the solution (as an amateur woodworker, I am experienced with, and deeply sympathetic to, this point of view).

On the other hand, if a craftsman is tasked with making a chair, he will make a chair, even if the wood is better suited to a table. If he is a *responsible* craftsman, he will attempt to collaborate, communicate, and compromise with his client, so that the wood is, in fact, placed in its most exalted role, but he will produce a chair (he’ll also verify that a chair was indeed what the client wanted). That chair will be sturdy, economical, and robust, and will delight its owner every time he puts his butt in the seat.

Now, the confusion comes from this simple fact: Good art depends on good craftsmanship. All good artists are also good craftsmen, and many good artists take their ART for granted and emphasize their CRAFT.

However, good craft does not necessarily demand good art, and many good craftsmen are weak at the art-based stuff. I probably could not *design* a Sam Maloof rocker, but I could *make* an excellent one (to be fair, many craftsmen take their craft for granted and emphasize the artistic nature of their work, but there is no real downside to this, unless the client values his time and money over his other goals).