How does tool design affect the result?

A while back, I visited Chocolaterie Tessa, an Austin confectionery with global accolades. The proprietor, Tessa Halstead, opened her establishment for a free chocolate tasting and educational presentation. Most attendees were affiliated with the Creative Mornings organization.

After sharing the origin story of her profession, Tessa drew our attention to a squat device sitting on a nearby table. Imagine that a Dalek became romantically involved with a water boiler and you’d be in the right visual territory. She described this machine with obvious pride — it was responsible for the international award that one of her caramels had earned in a recent competition.

It was a water-jacketed tempering vat for achieving the optimal temperature in caramel. The delicate interplay between butter and sugar is won and lost with proper heating.

You and I would make caramel on the stove in a pot slapped directly on the burner. If you wanted to be fancy, you’d set up a double boiler and a candy thermometer to keep things in check.

Tessa’s machine is like a laser-guided heat-seeking ballistic missile compared to the primitive firepower of the cottage confectioner.

This device, this tool is designed to allow artisans such as Tessa to achieve remarkable outcomes.

In my experience, when people talk about design, they talk about outcomes. Very rarely do we discuss tools as outcomes themselves.

Tool design is often a response to a specific need, not an original pursuit. A dull knife will suffice in a pinch. A mangled screwdriver is serviceable, though not preferred.

We make do with the tools at hand, because what matters is the outcome.

Step back and think about tool design for the outcomes you are responsible for. As a professional, you know how to get things done. The tools you use are a means to an end, not an end in themselves.

The trouble with this thinking is that it leads people to overlook opportunities to redesign their tools. When you change the process, not just with sharper knives and crisp screwdrivers, but with radical new approaches to the tooling itself — you open up wild possibilities.

Consider your tools. The caramel reflects the temperature of the vat. The floorboard holds an inverse image of the saw’s teeth. The word-processing document follows the constraints of its software.

Tessa uses a water-jacketed tempering vat to create an award-winning caramel. Her tool facilitates an exceptional outcome. Tools and process are integral; they might as well be synonyms.

The transformative possibilities come when you attempt to reach the same outcome with unfamiliar means. Perhaps designing new tools can lead to completely unprecedented results, not just the same old things made better, faster, cheaper, stronger.

Why do children love breakfast cereal?

* As part of unpacking questions on this blog, I’m going to be applying Ken Wilber’s four-quadrant, or integral, model. In short, I’ll be looking at each question from four separate angles: the objective, the subjective, the isolated and the collective. I abbreviate this as OSIC, which is shockingly close to the Inuit word – oosik – for the penis bone of a walrus – incidentally used as a lethal weapon. Draw your own conclusions; mnemonic hooks are helpful.

Back to breakfast cereal.

The humble origins of today’s breakfast cereal lie in porridge. Soaking and cooking grains makes them much easier to digest, and requires less mechanical transformation than bread. Mr. Kellogg reportedly took a corn mush, ran it through some rollers and let it dry; the result was cornflakes. These could be reconstituted with a bit of milk.

Growing up, I was stunned to discover that “cereal” was the term applied to the grasses from which Cheerios and such were derived, and not the other way round. The word cereal comes from the Roman goddess of agriculture, Ceres.

In my mind, cereal came from a box, not a plant, much less grass. The idiomatic use of “cereal” is still strange to me.

I adored cereal as a kid. I remember topping my Honey Nut Cheerios with extra honey and anxiously peering into a friend’s kitchen cupboard after a sleep-over, hoping for Cinnamon Toast Crunch (CTC) or something equally magical and sugary.

Therein lies the rub: porridge is a categorical letdown, whereas most breakfast cereal might as well be dessert!

My family ate lots of home-made granola, which was scrumptious in its own right. It did not provide the fascination that comes from eating a bowl of CTC. Frankly, breakfast cereal seems to have been conceived for children to consume: it enters the bowl in lovely bite-sized pieces that have been transformed into tiny pieces of toast, or cookies, or perfect puffed spheres. I remember wondering what Corn Puffs and Rice Krispies were made from…

They appeared nothing like the raw materials indicated by the name.

Just the other morning, I was pouring Honey Nut Cheerios into my daughter’s bowl and she remarked how easily they came out of the box. Most four-year-olds are ham-fisted when it comes to eating and so a meal that doesn’t need to be cut or wrestled onto a d$#@ spoon is a delight to the young.

Step aside, Willy Wonka: you can prattle on about everlasting gobstoppers; what I want to see is how they managed to color and flavor individual Froot Loops!

There is also the matter of marketing. On Saturday morning you have a Captain telling you that he’s put his Crunch in a box, and what’s more, this time it tastes like peanut butter! I must partake in this maritime miracle.

Bill Watterson’s Chocolate Frosted Sugar Bombs (Calvin & Hobbes) strike me as a delightful excoriation of the industry behind breakfast cereal. The phrase “part of this complete breakfast” is a subtle concession that this food is primarily garnish, not substance.

Nevertheless, breakfast cereal becomes a touchstone of youth, either because you ate too much and loved it, or because you pined for something more interesting than oatmeal and fried eggs.

Few things garner such consistently fond memories. Breakfast cereal becomes a shared identity.

What Makes a Good Question?

Given that I’m writing a “question blog,” I’ve dedicated quite a bit of thought to this question.

My nearly-4-year-old daughter has also spurred my thinking on this topic. For example,

”Daddy, are you making macaroni and cheese for supper?”

Helpful context: I am NOT making macaroni and cheese. It’s quite apparent that I’m preparing rice and beans. She’s watched and helped make both types of dishes – she knows the signs. However, I’m doing everything I can to avoid shutting down her curiosity with an over-corrective response.

”Sophia, think about your question.”

She examines the evidence.

”No. You’re making rice and beans.”

My suspicion is that she asked the original question because she was hoping, in her clever toddler brain, that Daddy could be swayed in his choice of dinner with a little suggestive questioning. It’s an angle she is more than capable of using. There are times when she asks questions of this nature where the answer is either contained within the question, or plainly accessible by turning up the steam in her miniature brain-pan dynamo.

As an adult, I’d be tempted to say that I rarely ask such inane questions. The truth is less favorable.

Thus I think the first criterion of a good question is that the answer shouldn’t be contained in the question itself. This requires reflecting before you ask.

The second criterion for a solid investigative question is that you not have the answer already in mind. You must learn to let go of the answer you expect. This requires you to be brave enough to ask what appear to be obvious questions, which can look similar to inane questions. The key difference is whether the answer is contained in the question itself: if it is, it’s a poor question; if not, it’s probably worth asking.

There could be more criteria for asking good questions. If you have ideas, please add them in the comments.

Why are numbers actually big fat liars?

Sometimes marketing analytics is like this…


As a person who has worked in marketing I have spent a great deal of time involved in an activity that I shall call “plunging the metaphorical analytics toilet.”

Allow me to expound.

In marketing, tracking metrics is hugely important. It’s the reason that advertising and marketing professionals love to quote “Half the money I spend on advertising is wasted; the trouble is, I don't know which half,” and attribute it to a famous retailing magnate (I couldn’t verify the authenticity of the attribution, so I won’t perpetuate it).

The principle is simple, but nuanced: even in the age of “drill-down” targeting for media like pay-per-click ads, only a portion of the audience is interested in what you’re advertising. Even a perfectly positioned offer isn’t going to convert 100% of viewers. Thus, any money spent showing ads to people who aren’t interested is, effectively, wasted. The amount of waste varies greatly, but 50% has a very authoritative ring to it.

Back to the analytics toilet.

There are tons and tons of metrics you can track. You set up Google Analytics, maybe a few custom spreadsheets and boom, the plumbing works and the water (data) is flowing. Until it doesn’t. At which point you grab a plunger and a plumber’s snake and attack the blockage. You look for unmonitored traffic sources, or robot impressions or poor email delivery – anything that might be clogging your report and making it stink.

Which brings me to another thoughtless truism that people regurgitate in business: “Numbers don’t lie.”

You might as well say that a sledgehammer doesn’t lie. Strictly speaking, that’s true, but I can use that hammer to drive tent stakes or pulverize a human head – intent is everything.

I contend that numbers are, in fact, the finest instrument ever conceived to broadcast and harmonize deceit. Enron is a choice example.

I have misreported numbers more times than I care to admit – not because I was trying to cover up a problem, but simply because I didn’t know they were wrong in the first place. Only later did I discover my error.

Numbers as measurement (conceding that there are other applications) are approximate, a tool of separation and aggregation, as sharp and sticky as a scalpel and superglue, or as dull and weak as a rolling pin and chewed gum.

Why do cashiers hand you the coins on top of your paper bills?

This post is a petty gripe, but I suspect that it's shared secretly by anyone who uses cash to buy things.

The transaction is pretty simple. Unless you start the day carrying coins, you will pay for something using a larger denomination than required – “change” must be made.

The cashier reviews the till readout and pulls the appropriate amount.

What follows is logical from one perspective: counting out the change to back-calculate how much you gave.

Example: You buy $16.73 in groceries and pay with a $20 bill. The cashier pulls three $1 bills, one quarter and two pennies.

“Seventeen, eighteen, nineteen and twenty-seven cents is $20”

The bills are placed in your hand and then the coins. It’s a simple, codified way to ensure the arithmetic is accurate.
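The counting-out routine above is essentially a greedy walk down through the denominations, starting from the difference between the price and the amount tendered. A minimal Python sketch of that logic (the denomination table and function name are my own, for illustration; amounts are in cents to avoid floating-point rounding trouble):

```python
# Making change the way a cashier does: take the difference and hand
# back the largest denominations first. All amounts are in cents.

DENOMINATIONS = [  # (value in cents, name), largest first
    (2000, "$20 bill"), (1000, "$10 bill"), (500, "$5 bill"),
    (100, "$1 bill"), (25, "quarter"), (10, "dime"),
    (5, "nickel"), (1, "penny"),
]

def count_up_change(price_cents, tendered_cents):
    """Return the list of denominations handed back, largest first."""
    change = []
    remaining = tendered_cents - price_cents
    for value, name in DENOMINATIONS:
        while remaining >= value:
            change.append(name)
            remaining -= value
    return change

# $16.73 in groceries, paid with a $20 bill:
# three $1 bills, one quarter and two pennies.
print(count_up_change(1673, 2000))
```

The spoken count-up (“seventeen, eighteen, nineteen…”) is just the running total as each piece is added; the denominations that come out are the same either way.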

Next you either wad the entire handful into your pocket, purse or wallet, or you have to get the coins into your free hand without dumping them on the floor (they slide off the paper like water off a duck’s bill – ha!), then pull out your wallet to insert the bills and put the coins in your pocket.

If the cashier placed the coins in your hand first, you could easily drop them into your pocket and then receive the paper bills.

I started this by saying it was a gripe; however, it’s one of those things that just happens, and it stays inconvenient because it’s always been done that way. The change-counting scenario I gave above could simply be reversed so that you start with the coins. A rare few cashiers have done this for me, and I always walk away smiling on the inside.

I’m not saying this is a huge inconvenience we need to fix right away. I am saying it’s a thoughtless act that is also inconvenient.

I might be overlooking a more compelling reason for the current system, but perhaps it’s just one of those “always been done that way” behaviors. Strange how such a specific behavior could be so widespread.

What happens after advertising?

A skeletal sign at a defunct self-carwash

To say that advertising is ubiquitous is equivalent to saying that human beings wear clothing – my mental construct of a human being includes an unconscious acknowledgement of clothing. We cover our bodies, yet the object exists separate from the wrapper.

Advertising is to business as clothes are to human beings.

Business exists without advertising – but it’s awkward to behold.

Yes. I suspect you’re disagreeing with me. Business exists without advertising.

Even if we expand the definition of advertising to encompass word of mouth and any place the business name appears (receipts, legal documents, etc.), the exchange of value exists apart from it.

It is this characteristic of advertising that permits the question I’m asking.

We can’t very well conceive of the next evolution in human skin – because skin is an integral, critical organ.

Advertising is not. Theoretically speaking. Functionally speaking, values like growth, brand awareness, market saturation (as a spectrum, not a state of being), and competitive advantage are compromised in a world without advertising.

Human survival is compromised without clothing to insulate from the elements (hot and cold, light and abrasion). We might find other means to mitigate the elements, but clothing has developed as a practical means and a form of cultural expression.

So also has advertising developed as a practical means and cultural expression.

And where clothing has become highly specialized and effective, advertising has become highly specialized and only marginally more effective than nothing at all.

If you change the terms of what makes a business successful, you would also change the importance of advertising.

An economy could exist without advertising on every unclaimed surface. Advertising is just the best solution we’ve developed for gathering leads and establishing a presence.

If business values change, so will advertising. Human beings may never cease to wear clothing. It is difficult to prevent yourself from believing the same is true of advertising. It isn’t.

Why do human beings want to abdicate their intelligence?

In direct response marketing, the importance of credibility is difficult to overstate. An authoritative quote or endorsement can make or break a piece of sales copy. Find a doctor to quote. Find a scientific study to cite. Find 15 people who agree with your point of view and it’s possible to make it sound like an army.

This is important because it assures the reader that the claims being made are substantive.

The first argument to be made is simple and sound: There is more information in the world than any one person can comprehend, let alone use to make an informed decision. People who have spent years in rigorous professional training are more likely to wield a high command of that information, at least in their area of expertise.

The concern comes when you equate a piece of paper (a diploma or license) with an authoritative command of the information.

This isn’t a new dynamic. Examine the history of world religions and you find people desperate to abdicate their decision-making responsibility (i.e. intelligence) to a priest, imam or cult leader.

The second argument to be made is more nuanced: Human beings have a limited amount of decision-making energy. Piggy-backing on the decision “work” of others conserves that energy for other decisions.

We don’t really have metrics for the reserve or expenditure of decision energy. If you don’t pay attention to the types of decisions that deplete your reserves, or the volume of decisions you’re making in a given time frame – you’ll run out.

We use trust in relationships to mitigate the decision “load.” If someone I know makes a decision that turns out well – I see the evidence – my trust in them grows. We begin sharing the decision work.

Again, once I conflate relationship trust and authority conferred by a piece of paper – I’ve abdicated a piece of my own intelligence. Conferred authority begets subsequent layers of authority – in short, bureaucracy (“if you say so”, “it’s the law”, “you’re the boss”, etc.).

Relational trust is more reliable than bureaucracy, but it doesn’t scale as well.

If our cultural focus is on economic growth as the greatest good, then bureaucracy will always win.

What if doors were designed to be opened with your feet?

The first time I saw a door designed this way was in a men’s restroom at a restaurant. A cast iron plate stuck out from the bottom of the door – slip your shoe underneath and pull the door open. Someone decided this would be an improvement on the sanitary technique of grasping the door handle with a paper towel.

I agree.

I don’t think the principle has been taken far enough. Opening a door with an armload of boxes is tricky. Lever-style handles can be worked without too much trouble, but far too often I find myself wedging my load against the doorjamb, slipping a hand out and swinging the door open before gravity yanks its inertial chain.

Imagine a door with a “handle” at foot-level. I suppose that makes it a “footle” or a foot-knob. You could step on the latch to push the door open, or pull it up to operate it in the other direction.

Of course, a door handle of this nature could cause temporary imbalance. However, if you’ve got a balance problem, just use the conventional handle.

It would appear (from page 1 of my Google search) that most design solutions aim at the sanitary benefits. This is a narrow application, especially since they apply only to free-swinging doors.

Residential front doors are an obvious next step.

The next time you try to use a door with both hands full, consider how simple it would be if you could just use your foot instead.