post

Why Google+ works (UXtraordinary blog preview)

Excerpt from UXtraordinary:

I am thrilled to see Andy Hertzfeld’s social circles concept implemented so beautifully in Google+. My semi-educated guess is that empowering users to define access to themselves according to their own purposes and needs—in context—will build engagement and strong loyalty. Why? Because it comes naturally.

Too many social media-based sites, including many social networks, pay only lip service to how users think about groups, let alone to user privacy empowerment. The user is seen as one of the site’s members, and the business creates a mental model in which the user is the center of a series of widening circles. “Empowerment” of user content privacy is typically limited to enabling permissions control within those circles.

Users don’t see themselves that way. People think of sharing information in terms of a constantly changing algorithm of need, purpose, and ability. We trust some friends more deeply than others; we have acquaintances who know a great deal about us whom we barely know. It’s my belief that these clashing mental models are one of the primary reasons social media and social networks fail to engage.

Mental models of user grouping
Read more.

post

How not to be overwhelmed by data

When dealing with vast amounts of data, how do you prioritize it? Lois Beckett reports this approach, from John O’Neil, curator of the New York Times’ topic pages:

The most pressing criterion for what gets updated first, O’Neil said, is whether “we would feel stupid not having it there.”

Guess that’s as good a standard as any ;–)


Beckett, Lois (23 Feb. 2011). The context-based news cycle: editor John O’Neil on the future of The New York Times’ Topics Pages. Nieman Journalism Lab, Harvard.

post

Am I there yet? How progress bar dynamics drive users

Gavin Davies wrote a nice piece, Are we nearly there yet? The role of the progress bar on the web, discussing four requirements of a useful progress bar for tracking software task completion. Per Davies, a good progress bar should be:

  • Accurate – watching a bar fill up gradually only to chug to a halt at around 90% can infuriate all but the most Zen. Worse still on the hair ripping scale are bars that fill up, only to empty and begin anew!
  • Responsive and smooth – the bar should be updated regularly to show that things are still working. This means that, on the web, we should update progress bars via Ajax rather than hefty page reloads. Research shows that a linear, consistent progress increase is better than the bar jerking around like a malfunctioning robot dancer.
  • Precise – the bar should show an estimate of time remaining, and perhaps other data such as percent or file size remaining so the user knows if he or she should start any long books in the interim.
  • Appropriate – before using a progress bar, consider carefully whether it is appropriate, both in terms of User Experience and technical feasibility.

I’d expand on this and add that the progress bar is useful beyond tracking the completion of a software task; it also works for personal accomplishments: “Am I there yet?” moments, if you will. For example, LinkedIn’s progress bar prompts users toward profile completeness.

The progress bar is actually a game element, one that triggers the twin desires to complete and to compete. It’s invaluable in educational, social media, and other contexts. As Seth Priebatsch of SCVNGR said, “Humans love progress bars. If you see a progress bar, you want to complete it.”

The requirements Davies suggests still apply, but the nature of the progression dynamic changes. Instead of time remaining, the user may have tasks remaining, or user-generated content to add, or a certain amount of time spent using the application or exploring the site. These progress milestones can themselves incorporate game elements, becoming a quest for which progress bar completion is only one of many rewards.
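To make the milestone idea concrete, here’s a minimal sketch in TypeScript of a profile-completeness calculation; the milestone names and the profileProgress helper are illustrative assumptions, not taken from Davies’ article or from LinkedIn.

```typescript
// Illustrative sketch: progress measured in tasks completed, not time elapsed.
interface Milestone {
  label: string;
  done: boolean;
}

// Hypothetical helper: percent complete for a profile-completeness bar.
function profileProgress(milestones: Milestone[]): number {
  if (milestones.length === 0) return 0;
  const completed = milestones.filter((m) => m.done).length;
  return Math.round((completed / milestones.length) * 100);
}

const milestones: Milestone[] = [
  { label: "Add a photo", done: true },
  { label: "Write a summary", done: true },
  { label: "List three skills", done: false },
  { label: "Request a recommendation", done: false },
];

// "Am I there yet?": 50% complete, two milestones remaining.
console.log(`${profileProgress(milestones)}% complete`);
```

Each remaining milestone is a natural place to hang a reward, which is where the game dynamic comes in.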

Apart from games themselves, game mechanics (or gamification) have been used primarily in the educational field, although they have been spreading through interaction design since the nineties. For those interested in exploring the field, a good starting point is the work of Clark Aldrich, a true guru of “serious games” and simulation design.

Cross-posted from UXtraordinary, my professional blog. Related UXtraordinary post: Fun is Fundamental.

post

Simplicity is not a goal but a tool

Simplicity in design is not a goal but a tool. The goal is the need of the moment: to sell a product, to express an opinion, to teach a concept, to entertain. While elegance and optimal function in design frequently overlap with simplicity, there are times when simplicity is not only impossible but actively hurts usability. Yet many designers do not understand this, and over the years I’ve seen the desire to “keep it simple, stupid” lead to poor UX.

I was therefore glad to see Francisco Inchauste’s well-thought-out, longer take on Einstein’s “as simple as possible, but no simpler” remark.

From the column:

As an interactive designer, my first instinct is to simplify things. There is beauty in a clean and functional interface. But through experience I’ve found that sometimes I can’t remove every piece of complexity in an application. The complexity may be unavoidably inherent to the workflow and tasks that need to be performed, or in the density of the information that needs to be presented. By balancing complexity and what the user needs, I have been able to continue to create successful user experiences.

Plus, as I’ve commented before, messy is fun!

[Cross-posted from UXtraordinary.]


post

The biggest barrier to UX implementation

My response to a LinkedIn UX Professionals question, Why they don’t like to spend or invest in the User Experience tasks?

My personal experience has been that ignorance is the largest barrier to UX implementation. While there are many exceptions, too often developers, marketers, executive management, or others with a large degree of control over UX strategy and tactical development feel that user experience is simply “common sense.” They believe that because they are users, they have insight into the process. This is natural.

It’s the responsibility of UX professionals to educate them and evangelize the value of user experience. (Though it’s nice if you can get executive support, it’s frequently not there.) At my current company, I approached this from several angles:

  • I held one-on-one meetings with stakeholders and others, seeking to understand their needs and start a conversation about possible UX solutions.
  • I wrote and presented brown bags, open to all, on subjects like Why Taxonomy Matters: Taxonomy and the User Experience, in order to promote understanding of UX and its considerations.
  • I introduced concepts designed to make people think more from the user perspective. For example, like many sites we’re interested in user-generated content. I expanded this to user-generated experience (a concept I’d already developed from previous social media work and user analysis), and measured/discussed user-generated activity. The point, of course, was that thinking about user activity required thinking about user flow and perspective. Eventually key stakeholders were talking about UGA as a matter of course, and we even discovered ways to convert some UGA into UGC.

This was successful enough that UX became a standard consideration not just in design, but in product strategy. It is of course beyond your control what others do with your information – but you have to provide it!

People understand success. Show your co-workers and management how UX solves their problems. Provide numbers, using performance indicators that matter to your audience. Present before/after case studies. Remember to focus on solutions, not problems (never show a problem for which you don’t have a suggested solution). In short, provide the best possible user experience for your internal customers.


Update

Ahmed Kamal, the person who posed the question, responded positively to my comment:

Alex O’Neal, I raise my hat! I appreciate it really! your comments are really touching, reflecting a real long experience, comprehensive and concluding the problem and how to solve it!!

Aw, shucks :–)

[Cross-posted on UXtraordinary.]

link

The BBC’s style guide. An exceptional, best-in-class example of what a style guide should do.

Also of interest: a BBC blog post on their goal of a new global visual language.

post

Nature and nurture, not nature or nurture

Recent research reported in Scientific American shows that for some people, mother’s milk may promote a higher IQ. That’s all very well and good, but it was this paragraph that had me skipping with joy:

As for the study’s implications on the nature / nurture debate, Linda Gottfredson, a professor of education at the University of Delaware, says that a person’s DNA is not really a blueprint, as it is commonly portrayed. “[Genes] are more like playbooks,” she says. “It’s not nature or nurture, but your genes operate frequently by making you more susceptible or less susceptible to certain environmental conditions.”

I find this a beautiful example of stepping away from the tyranny of dichotomy (I’ve been saying “nature and nurture” since high school). One of my pet peeves is either-or thinking applied to more complex discussions. Such black-and-white, right-or-wrong pigeonholing erodes critical thought.

post

Good blog from founder on IxDA Sturm und Drang

Off-site comment capture of the moment:

David Malouf, one of the founders of IxDA (the Interaction Design Association), has heard a lot of grief about a recent site redesign. He shared his thoughts on the subject in an open letter. Definitely worth reading.

My comment:

Personally, I see great improvement in the new site. I wasn’t able to contribute to it myself, but I’m impressed at what was done through donated, as-possible time by many people.

I think a lot of the hostility is probably misdirected energy from the economy, a lack of empowerment at work, a need to set themselves apart in the job market, etc. Sadly, it’s always easier to put down another’s work and show “expertise” with criticism, rather than to do something well yourself. (This is not to say that criticism isn’t a valid skill, just that when it exists in a vacuum it’s not very helpful.)

What about raising funds via IxDA-backed training? Leverage local IxDA members to provide reliable training in specific areas. IxDA sets the standards and accepts the payments; the trainer gets the street cred of IxDA backing, and half the money (via PayPal or whatever).

I’ll volunteer to author/present on how taxonomy affects interaction design :–)

link

Much more than just a pretty image, this is a full-featured interactive web application that showcases the periodic table from a variety of perspectives. My favorite? The orbitals view. Mouse down the columns in this view to get a visual sense of the similarities between the elements.

post

Saving face for the other guy works

Off site comment capture:

Schott’s Vocab, at The New York Times, asked readers for the face-saving excuses they regularly turn to. Mine actually got marked as an Editor’s Highlight (rare for me), and received a whopping (again, for me) 15 “recommended” clicks ;–)

In a reverse approach, but still aimed at smoothing out the occasional awkwardness:

As someone with a lot of diabetes in the family history (though thankfully I’ve escaped it so far!), I avoid sugar in colas, etc. Sometimes I go through a drive-through and I’m uncertain the person gave me a diet soda (they didn’t mark the top of the cup, or didn’t repeat it back to me). But people typically don’t like their ability to perform simple tasks questioned, so I put the onus on me: “Did I remember to ask for diet soda?” They happily tell me yes, and I am reassured.

This “it’s my fault” approach works in a variety of situations.

post

Evolutional UX

[Cross-posted from UXtraordinary.]

I subscribe to the school of evolutional design. In evolution, species change not to reach for some progressively-closer-to-perfection goal, but in response to each other and their ever-changing environment. My user experience must do likewise.

Rather than reach for pixel-perfection, which is relatively unattainable outside of print (and is probably only “perfect” to myself and possibly my client), I reach for what’s best for my users, which is in the interests of my client. I expect that “best” to change as my users change, and as my client’s services and products change. This approach makes it much easier to design for UX.

Part of evolutional design is stepping away from the graceful degradation concept. The goal is not degraded experience, however graceful, but differently adapted experience. In other words, it’s not necessary that one version of a design be best. Two or three versions can be equally good, so long as the experience is valuable. Think of the differences simply resizing a window can have on well-planned liquid design, without hurting usability. Are the different sizes bad? Of course not.

This approach strongly supports behavioral design, in which design focuses on the behavior and environment of the user. You might be designing for mobile, or a laptop, or video, or an e-newsletter; you might be designing for people being enticed to cross a pay wall, or people who have already paid and are enjoying your service. You might be appealing to different demographics in different contexts. Evolutional UX thinks in terms of adaptation within the digital (and occasionally analog) ecology.

Evolutional UX also reminds the designer that she herself is part of an evolving class of worker, with many species appearing and adapting and mutating and occasionally dying out. We must adapt, or fall out of the game—and the best way to do that is to design for your ever-changing audience and their ever-changing tools.

And now, some words of wisdom from that foremost evolutional ecologist, Dr. Seuss. Just replace the “nitch” spelling with “niche” and you’ve got sound ecological theory, as every hermit crab knows.

And NUH is the letter I use to spell Nutches,
Who live in small caves, known as Nitches, for hutches.
These Nutches have troubles, the biggest of which is
The fact there are many more Nutches than Nitches.
Each Nutch in a Nitch knows that some other Nutch
Would like to move into his Nitch very much.
So each Nutch in a Nitch has to watch that small Nitch
Or Nutches who haven’t got Nitches will snitch.


post

Technological evolution in the Cambrian age

Another comment left in the wild, this time in response to a post by the deeply wonderful Irving Wladawsky-Berger, The Data Center in the Cambrian Age. I strongly recommend it. In the meantime, here’s what I had to say:

Great post! Reminds me of Stephen Jay Gould’s Wonderful Life, which discusses the explosion of fauna of the Burgess Shale. That book transformed my understanding not just of biology, but of creativity and human development. Since then I’ve observed this effect in other areas.

For example, movies of the ’20s and ’30s used techniques set aside in later decades, as the industry determined what they thought most appealed to the market. Ironically, some of these were then called innovative when re-used by more modern directors. Typography has gone through a similar pattern of evolution as well.

The interesting thing about an explosion of human technological innovation is that unlike competing animal species, whose success is due as much to chance as to adaptation, humans can at least partly evaluate the value of a new idea. But in the marketplace, established companies using older approaches can crush new ideas and better approaches. Humans have to leverage the internet, governments, and their purchasing power to make sure we know *all* our options, and can choose the best one for our needs, not an abstract corporate entity’s profit line.

post

Page title versus Call to action: who wins?

Off-site comment of the moment:

An IxDA member posted a question about whether or not the header of a page should take precedence over the primary call to action. My response is below. Essentially, I believe the very act of forcing a choice between the two is a mistake.

Re: [IxDA Discuss] Visual Importance of Page Titles
Tue, 24 Feb 2009 13:52:37

I think you can, and should, have both. The page title can be obvious and clear to the user, while there can also be a clear call to action (your marketing splash). Your title is not only the description of the page’s content, it’s part of the navigational scent of the site, and as such it can’t be left out. This in no way stops you from having a clear “center” to the page’s content. In fact, having a primary focus to your content is better than scattering the focus.

If marketing wants to leave the title off completely, that’s a problem not only in usability but in SEO. There’s a reason the H1 tags are important to search engines – they’re important to users! And as search engines overcome the Heisenbergian issue of depth of analysis vs breadth of pages, things like semantically sound page construction will become even more important.

Alex

P.S. As an aside, I would recommend trying to eliminate either-or questions in design. Frequently there are many choices, not just two, and in the few cases where there truly are only two options, the attempt to find more will be the exception that proves the rule. :–) The tyranny of dichotomy limits much more than it resolves.
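Stepping out of the email for a moment, here is a minimal sketch in TypeScript of the “have both” answer: a semantically sound page that keeps the h1 title for users and search engines while still giving the call to action its own clear center. The function and field names are illustrative, not from the original discussion.

```typescript
// Illustrative sketch: the page title and the primary call to action coexist.
interface PageCopy {
  title: string;   // the page's descriptive title, rendered as the h1
  pitch: string;   // the marketing splash that anchors the call to action
  ctaHref: string; // where the call to action sends the user
}

function renderPage({ title, pitch, ctaHref }: PageCopy): string {
  return `
    <main>
      <!-- Navigational scent and SEO: the h1 names the page's content. -->
      <h1>${title}</h1>

      <!-- The visual center of the page: one primary focus, not a scattered one. -->
      <section class="primary-offer">
        <p>${pitch}</p>
        <a class="call-to-action" href="${ctaHref}">Get started</a>
      </section>
    </main>
  `;
}

console.log(renderPage({
  title: "Topic pages for busy readers",
  pitch: "Catch up on any story in five minutes.",
  ctaHref: "/signup",
}));
```

The title describes, the call to action persuades; neither has to displace the other.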

post

Thomas Huxley’s letter, on the death of his son

The letter below is from Leonard Huxley’s The Life and Letters of Thomas Henry Huxley, courtesy of Project Gutenberg. Quoted by Stephen Jay Gould.

A letter written in response to well-meant advice from Cambridge professor and priest Charles Kingsley.


14, Waverley Place, September 23, 1860.

My dear Kingsley,

I cannot sufficiently thank you, both on my wife’s account and my own, for your long and frank letter, and for all the hearty sympathy which it exhibits–and Mrs. Kingsley will, I hope, believe that we are no less sensible of her kind thought of us. To myself your letter was especially valuable, as it touched upon what I thought even more than upon what I said in my letter to you. My convictions, positive and negative, on all the matters of which you speak, are of long and slow growth and are firmly rooted. But the great blow which fell upon me seemed to stir them to their foundation, and had I lived a couple of centuries earlier I could have fancied a devil scoffing at me and them–and asking me what profit it was to have stripped myself of the hopes and consolations of the mass of mankind? To which my only reply was and is–Oh devil! truth is better than much profit. I have searched over the grounds of my belief, and if wife and child and name and fame were all to be lost to me one after the other as the penalty, still I will not lie.

And now I feel that it is due to you to speak as frankly as you have done to me. An old and worthy friend of mine tried some three or four years ago to bring us together–because, as he said, you were the only man who would do me any good. Your letter leads me to think he was right, though not perhaps in the sense he attached to his own words.

To begin with the great doctrine you discuss. I neither deny nor affirm the immortality of man. I see no reason for believing in it, but, on the other hand, I have no means of disproving it.

Pray understand that I have no a priori objections to the doctrine. No man who has to deal daily and hourly with nature can trouble himself about a priori difficulties. Give me such evidence as would justify me in believing anything else, and I will believe that. Why should I not? It is not half so wonderful as the conservation of force, or the indestructibility of matter. Whoso clearly appreciates all that is implied in the falling of a stone can have no difficulty about any doctrine simply on account of its marvellousness. But the longer I live, the more obvious it is to me that the most sacred act of a man’s life is to say and to feel, “I believe such and such to be true.” All the greatest rewards and all the heaviest penalties of existence cling about that act. The universe is one and the same throughout; and if the condition of my success in unravelling some little difficulty of anatomy or physiology is that I shall rigorously refuse to put faith in that which does not rest on sufficient evidence, I cannot believe that the great mysteries of existence will be laid open to me on other terms. It is no use to talk to me of analogies and probabilities. I know what I mean when I say I believe in the law of the inverse squares, and I will not rest my life and my hopes upon weaker convictions. I dare not if I would.

Measured by this standard, what becomes of the doctrine of immortality?

You rest in your strong conviction of your personal existence, and in the instinct of the persistence of that existence which is so strong in you as in most men.

To me this is as nothing. That my personality is the surest thing I know–may be true. But the attempt to conceive what it is leads me into mere verbal subtleties. I have champed up all that chaff about the ego and the non-ego, about noumena and phenomena, and all the rest of it, too often not to know that in attempting even to think of these questions, the human intellect flounders at once out of its depth.

It must be twenty years since, a boy, I read Hamilton’s essay on the unconditioned, and from that time to this, ontological speculation has been a folly to me. When Mansel took up Hamilton’s argument on the side of orthodoxy (!) I said he reminded me of nothing so much as the man who is sawing off the sign on which he is sitting, in Hogarth’s picture. But this by the way.

I cannot conceive of my personality as a thing apart from the phenomena of my life. When I try to form such a conception I discover that, as Coleridge would have said, I only hypostatise a word, and it alters nothing if, with Fichte, I suppose the universe to be nothing but a manifestation of my personality. I am neither more nor less eternal than I was before.

Nor does the infinite difference between myself and the animals alter the case. I do not know whether the animals persist after they disappear or not. I do not even know whether the infinite difference between us and them may not be compensated by THEIR persistence and MY cessation after apparent death, just as the humble bulb of an annual lives, while the glorious flowers it has put forth die away.

Surely it must be plain that an ingenious man could speculate without end on both sides, and find analogies for all his dreams. Nor does it help me to tell me that the aspirations of mankind–that my own highest aspirations even–lead me towards the doctrine of immortality. I doubt the fact, to begin with, but if it be so even, what is this but in grand words asking me to believe a thing because I like it.

Science has taught to me the opposite lesson. She warns me to be careful how I adopt a view which jumps with my preconceptions, and to require stronger evidence for such belief than for one to which I was previously hostile.

My business is to teach my aspirations to conform themselves to fact, not to try and make facts harmonise with my aspirations.

Science seems to me to teach in the highest and strongest manner the great truth which is embodied in the Christian conception of entire surrender to the will of God. Sit down before fact as a little child, be prepared to give up every preconceived notion, follow humbly wherever and to whatever abysses nature leads, or you shall learn nothing. I have only begun to learn content and peace of mind since I have resolved at all risks to do this.

There are, however, other arguments commonly brought forward in favour of the immortality of man, which are to my mind not only delusive but mischievous. The one is the notion that the moral government of the world is imperfect without a system of future rewards and punishments. The other is: that such a system is indispensable to practical morality. I believe that both these dogmas are very mischievous lies.

With respect to the first, I am no optimist, but I have the firmest belief that the Divine Government (if we may use such a phrase to express the sum of the “customs of matter”) is wholly just. The more I know intimately of the lives of other men (to say nothing of my own), the more obvious it is to me that the wicked does NOT flourish nor is the righteous punished. But for this to be clear we must bear in mind what almost all forget, that the rewards of life are contingent upon obedience to the WHOLE law–physical as well as moral–and that moral obedience will not atone for physical sin, or vice versa.

The ledger of the Almighty is strictly kept, and every one of us has the balance of his operations paid over to him at the end of every minute of his existence.

Life cannot exist without a certain conformity to the surrounding universe–that conformity involves a certain amount of happiness in excess of pain. In short, as we live we are paid for living.

And it is to be recollected in view of the apparent discrepancy between men’s acts and their rewards that Nature is juster than we. She takes into account what a man brings with him into the world, which human justice cannot do. If I, born a bloodthirsty and savage brute, inheriting these qualities from others, kill you, my fellow-men will very justly hang me, but I shall not be visited with the horrible remorse which would be my real punishment if, my nature being higher, I had done the same thing.

The absolute justice of the system of things is as clear to me as any scientific fact. The gravitation of sin to sorrow is as certain as that of the earth to the sun, and more so–for experimental proof of the fact is within reach of us all–nay, is before us all in our own lives, if we had but the eyes to see it.

Not only, then, do I disbelieve in the need for compensation, but I believe that the seeking for rewards and punishments out of this life leads men to a ruinous ignorance of the fact that their inevitable rewards and punishments are here.

If the expectation of hell hereafter can keep me from evil-doing, surely a fortiori the certainty of hell now will do so? If a man could be firmly impressed with the belief that stealing damaged him as much as swallowing arsenic would do (and it does), would not the dissuasive force of that belief be greater than that of any based on mere future expectations?

And this leads me to my other point.

As I stood behind the coffin of my little son the other day, with my mind bent on anything but disputation, the officiating minister read, as a part of his duty, the words, “If the dead rise not again, let us eat and drink, for to-morrow we die.” I cannot tell you how inexpressibly they shocked me. Paul had neither wife nor child, or he must have known that his alternative involved a blasphemy against all that was best and noblest in human nature. I could have laughed with scorn. What! because I am face to face with irreparable loss, because I have given back to the source from whence it came, the cause of a great happiness, still retaining through all my life the blessings which have sprung and will spring from that cause, I am to renounce my manhood, and, howling, grovel in bestiality? Why, the very apes know better, and if you shoot their young, the poor brutes grieve their grief out and do not immediately seek distraction in a gorge.

Kicked into the world a boy without guide or training, or with worse than none, I confess to my shame that few men have drunk deeper of all kinds of sin than I. Happily, my course was arrested in time–before I had earned absolute destruction–and for long years I have been slowly and painfully climbing, with many a fall, towards better things. And when I look back, what do I find to have been the agents of my redemption? The hope of immortality or of future reward? I can honestly say that for these fourteen years such a consideration has not entered my head. No, I can tell you exactly what has been at work. “Sartor Resartus” led me to know that a deep sense of religion was compatible with the entire absence of theology. Secondly, science and her methods gave me a resting-place independent of authority and tradition. Thirdly, love opened up to me a view of the sanctity of human nature, and impressed me with a deep sense of responsibility.

If at this moment I am not a worn-out, debauched, useless carcass of a man, if it has been or will be my fate to advance the cause of science, if I feel that I have a shadow of a claim on the love of those about me, if in the supreme moment when I looked down into my boy’s grave my sorrow was full of submission and without bitterness, it is because these agencies have worked upon me, and not because I have ever cared whether my poor personality shall remain distinct for ever from the All from whence it came and whither it goes.

And thus, my dear Kingsley, you will understand what my position is. I may be quite wrong, and in that case I know I shall have to pay the penalty for being wrong. But I can only say with Luther, “Gott helfe mir, Ich kann nichts anders.”

I know right well that 99 out of 100 of my fellows would call me atheist, infidel, and all the other usual hard names. As our laws stand, if the lowest thief steals my coat, my evidence (my opinions being known) would not be received against him. [The law with respect to oaths was reformed in 1869.]

But I cannot help it. One thing people shall not call me with justice and that is–a liar. As you say of yourself, I too feel that I lack courage; but if ever the occasion arises when I am bound to speak, I will not shame my boy.

I have spoken more openly and distinctly to you than I ever have to any human being except my wife.

If you can show me that I err in premises or conclusion, I am ready to give up these as I would any other theories. But at any rate you will do me the justice to believe that I have not reached my conclusions without the care befitting the momentous nature of the problems involved.

And I write this the more readily to you, because it is clear to me that if that great and powerful instrument for good or evil, the Church of England, is to be saved from being shivered into fragments by the advancing tide of science–an event I should be very sorry to witness, but which will infallibly occur if men like Samuel of Oxford are to have the guidance of her destinies–it must be by the efforts of men who, like yourself, see your way to the combination of the practice of the Church with the spirit of science. Understand that all the younger men of science whom I know intimately are ESSENTIALLY of my way of thinking. (I know not a scoffer or an irreligious or an immoral man among them, but they all regard orthodoxy as you do Brahmanism.) Understand that this new school of the prophets is the only one that can work miracles, the only one that can constantly appeal to nature for evidence that it is right, and you will comprehend that it is of no use to try to barricade us with shovel hats and aprons, or to talk about our doctrines being “shocking.”

I don’t profess to understand the logic of yourself, Maurice, and the rest of your school, but I have always said I would swear by your truthfulness and sincerity, and that good must come of your efforts. The more plain this was to me, however, the more obvious the necessity to let you see where the men of science are driving, and it has often been in my mind to write to you before.

If I have spoken too plainly anywhere, or too abruptly, pardon me, and do the like to me.

My wife thanks you very much for your volume of sermons.

Ever yours very faithfully,

T.H. Huxley.

post

The tyranny of dichotomy

An informational cascade is a perception—or misperception—spread among people because we tend to let others think for us when we don’t know ourselves. For example, recently John Tierney (tierneylab.blog.nytimes.com) discussed the widely held but little-supported belief that too much fat is nutritionally bad. Peter Duesberg contends that the HIV hypothesis for AIDS is such an error.

Sometimes cultural assumptions can lead to such errors. Stephen Jay Gould described countless such mistakes, spread by culture or simple lack of data, in The Mismeasure of Man. Gould points out errors such as reifying abstract concepts into entities that exist apart from our abstraction (as has been done with IQ), and forcing measurements into artificial scales, both assumptions that spread readily within and without the scientific community without any backing.

Mind, informational cascades do not have to be errors—one could argue that the state of being “cool” comes from an informational cascade. Possibly many accurate understandings come via informational cascades as well, but it’s harder to demonstrate those because of the nature of the creatures.

It works like this: people tend to think in binary, all-or-nothing terms. Shades of gray do not occur to us. In fact, it seems the closest we come to a non-binary understanding of a concept is to have many differing binary decisions about related concepts, which balance each other out.

So, in the face of no or incomplete information, we take our cues from the next human. When Alice makes a decision, she decides yes-or-no; then Bob, who knows nothing of the subject, takes his cue from Alice in a similarly binary fashion, and Carol takes her cue from Bob, and so it spreads, in a cascade effect.

Economists and others rely on this binary herd behavior in their calculations.
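A toy simulation shows how quickly such a cascade locks in. This is a minimal sketch in TypeScript under stated assumptions: each person gets a weak private signal, and the “follow the crowd once it clearly leans one way” rule and the accuracy number are illustrative, not any economist’s actual model.

```typescript
// Illustrative cascade: later deciders copy earlier ones instead of using their own signal.
function simulateCascade(people: number, signalAccuracy = 0.55): boolean[] {
  const truth = true; // the "right" answer nobody observes directly
  const choices: boolean[] = [];

  for (let i = 0; i < people; i++) {
    // A noisy private signal, correct with probability signalAccuracy.
    const signal = Math.random() < signalAccuracy ? truth : !truth;

    // Count the yes/no choices already visible to this person.
    const yes = choices.filter((c) => c).length;
    const no = choices.length - yes;

    // If the visible crowd clearly leans one way, follow it; otherwise trust the signal.
    let choice: boolean;
    if (yes > no + 1) choice = true;
    else if (no > yes + 1) choice = false;
    else choice = signal;

    choices.push(choice);
  }
  return choices;
}

// Run it a few times: the first two or three (nearly random) choices
// routinely decide what everyone after them "thinks."
console.log(simulateCascade(20));
```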

But.

The problem is that people don’t always think this way; therefore people don’t have to think this way. Some people seem to have the habit of critical thought at an early age. As well, the very concept of binary thinking seems to fit too neatly into our need to measure. It’s much easier to measure all-or-nothing than shades of gray, so a model that assumes we behave in an all-or-nothing manner can easily be measured, and is therefore more easily accepted within the community of discourse.

Things tend to be more complex than we like to acknowledge. As Stephen Wolfram observed in A New Kind of Science,

One might have thought that with all their successes over the past few centuries the existing sciences would long ago have managed to address the issue of complexity. But in fact they have not. And indeed for the most part they have specifically defined their scope in order to avoid direct contact with it.

Which makes me wonder if binary classification isn’t its own informational cascade. In nearly every situation, there are more than two factors and more than two options.

The tradition of imposing a binary taxonomy on our world goes back a long way. Itkonen (2005) speaks about the binary classifications that permeate all mythological reasoning. By presenting different quantities as two aspects of the same concept, they are made more accessible to the listener. By placing them in the same concept, the storyteller shows their similarities and uses analogical reasoning to reach the audience.

Philosophy speaks of the law of the excluded middle—something is either this or that, with nothing in between—but this is a trick of language. A question that asks for only a yes or no answer does not allow for responses such as both or maybe.

Neurology tells us that neurons either fire or they don’t. But neurons are much more complex than that. From O’Reilly and Munakata’s Computational Explorations in Cognitive Neuroscience (italics from the authors):

In contrast with the discrete boolean logic and binary memory representations of standard computers, the brain is more graded and analog in nature… Neurons integrate information from a large number of different input sources, producing essentially a continuous, real valued number that represents something like the relative strength of these inputs… The neuron then communicates another graded signal (its rate of firing, or activation) to other neurons as a function of this relative strength value. These graded signals can convey something like the extent or degree to which something is true….

Gradedness is critical for all kinds of perceptual and motor phenomena, which deal with continuous underlying values….

Another important aspect of gradedness has to do with the fact that each neuron in the brain receives inputs from many thousands of other neurons. Thus, each individual neuron is not critical to the functioning of any other—instead, neurons contribute as part of a graded overall signal that reflects the number of other neurons contributing (as well as the strength of their individual contribution). This fact gives rise to the phenomenon of graceful degradation, where function degrades “gracefully” with increasing amounts of damage to neural tissue.
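To see the contrast in miniature, here is a small TypeScript sketch: a boolean threshold unit next to a graded unit whose output behaves like a firing rate. The inputs, weights, and the particular sigmoid are made-up illustrations, not the authors’ model.

```typescript
// Illustrative only: one "neuron" summing three weighted inputs.
const inputs = [0.2, 0.9, 0.4];
const weights = [0.5, 0.8, -0.3];

const netInput = inputs.reduce((sum, x, i) => sum + x * weights[i], 0);

// Binary, boolean-logic style: it either fires or it doesn't.
const binaryOutput = netInput > 0.5 ? 1 : 0;

// Graded: a continuous activation that reflects how strongly the inputs agree.
const gradedOutput = 1 / (1 + Math.exp(-4 * (netInput - 0.5)));

console.log({ netInput, binaryOutput, gradedOutput });
```

The binary unit reports only “fired”; the graded value carries the extent to which the inputs agreed, which is the extra information the authors are pointing to.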

So, now that we have a clue that binary thinking may be an informational cascade all its own, what do we do about it? That’s a subject for another post.


References

  • Itkonen, E. (2005). Analogy as structure and process: Approaches in linguistics, cognitive psychology and philosophy of science. Amsterdam: John Benjamins Publishing.
  • O’Reilly, R.C., and Y. Munakata. (2000). Computational Explorations in Cognitive Neuroscience: Understanding the Mind by Simulating the Brain. Cambridge, MA: MIT Press.
post

Messy is fun: stepping away from Occam’s Razor

The scientific method is the most popular form of scientific inquiry, because it provides measurable testing of a given hypothesis. This means that once an experiment is performed, whether the results were negative or positive, the foundation on which you are building your understanding is a little more solid, and your perspective a little broader. The only failed experiment is a poorly designed one.

So, how to design a good experiment? The nuts and bolts of a given test will vary according to the need at hand, but before you even go about determining what variable to study, take a step back and look at the context. The context in which you are placing your experiment will determine what you’re looking for and what variables you choose. The more limited the system you’re operating in, the easier your test choices will be, but the more likely you are to miss something useful. Think big. Think complicated. Then narrow things down.

But, some say, simple is good! What about Occam’s razor and the law of parsimony (entities should not be unnecessarily multiplied)?

Occam’s razor is a much-loved approach that helps make judgment calls when no other options are available. It’s an excellent rule of thumb for interpreting uncertain results. Applying Occam’s razor, you can act “as if” and move on to the next question, and go back if it doesn’t work out.

Still, too many people tend to use it to set up the context of the question, unconsciously limiting the kind of question they can ask and limiting the data they can study. It’s okay to do this consciously, by focusing on a simple portion of a larger whole, but not in a knee-jerk fashion because “simple is better.” Precisely because of this, several scientists and mathematicians have suggested anti-razors. These do not necessarily undermine Occam’s razor. Instead, they phrase things in a manner that helps keep you focused on the big picture.

Some responses to Occam’s concept include these:

Einstein: Everything should be as simple as possible, but no simpler.

Leibniz: The variety of beings should not rashly be diminished.

Menger: Entities must not be reduced to the point of inadequacy.

My point is not that Occam’s razor is not a good choice in making many decisions, but that one must be aware that there are alternative views. Like choosing the correct taxonomy in systematics, choosing different, equally valid analytic approaches to understand any given question can radically change the dialogue. In fact, one can think of anti-razors as alternative taxonomies for thought: ones that let you freely think about the messy things, the variables you can’t measure, the different perspectives that change the very language of your studies. You’ll understand your question better, because you’ll think about it more than one way (thank you, Dr. Minsky). And while you’ll need to pick simple situations to test your ideas, the variety and kind of situations you can look at will be greatly expanded.

Plus, messy is fun.


Note: Some of these thoughts sprang from a letter to the editor I posted on Salon.

Cross-posted on UXtraordinary.com.

post

Zombie ideas

In 1974 Robert Kirk wrote about the “zombie idea,” describing the notion that the universe, the circle of life, humanity, and our moment-to-moment existence could all have developed identically, with “particle-for-particle counterparts,” and yet lack feeling and consciousness. The idea is that, evolutionarily speaking, it was not essential that creatures evolve consciousness or raw feels in order to evolve rules promoting survival and adaptation. Such a world would be a zombie world, acting and reasoning but just not getting it (whatever it is).

I am not writing about Kirk’s idea. (At least, not yet.)

Rather, I’m describing the term in the way it was used in 1998, by four University of Texas Health Science Center doctors, in a paper titled, “Lies, Damned Lies, and Health Care Zombies: Discredited Ideas That Will not Die” (pdf). Here the relevant aspect of the term “zombie” is refusal to die, despite being killed in a reasonable manner. Zombie ideas are discredited concepts that nonetheless continue to be propagated in the culture.

While they (and, just today, Paul Krugman) use the term, they don’t explicate it in great detail. I thought it might be fun to explore the extent to which a persistent false concept is similar to a zombie.

  • A zombie idea is dead. For the vast majority of the world, the “world is flat” is a dead idea. For a few, though, the “world is flat” virus has caught hold, and this idea persists even in technologically advanced cultures.
  • A zombie idea is contagious. Some economists are fond of the concept of “binary herd behavior.” The idea is that when most people don’t know about a subject, they tend to accept the view of the person who tells them about it; and they tend to do that in an all-or-nothing manner. Then they pass that ignorant acceptance on to the next person, who accepts it just as strongly. (More about the tyranny of the dichotomy later.) So, when we’re children and our parents belong to Political Party X, we may be for Political Party X all the way, even though we may barely know what a political party actually is.
  • A zombie idea is hard to kill. Some zombie viruses are very persistent. For example, most people still believe that height and weight are a good basis for calculating appropriate calorie intake. Studies, however, repeatedly show that, height and weight being equal, other factors can change the body’s response. Poor gut flora, certain bacteria, and even having been slightly overweight in the past can mean that of two people of the same height and weight, one will eat the daily recommended calories and keep their weight steady, and one will need to consume 15% less in order to maintain the status quo. Yet doctors and nutritionists continue to counsel people to use the national guidelines to determine how much to eat.
  • A zombie idea eats your brain. Zombie ideas, being contagious and false, are probably spreading through binary thinking. A part of the brain takes in the data, marks it as correct, and because it works in that all-or-nothing manner, contradictory or different data has a harder time getting the brain’s attention. It eats up a part of the brain’s memory, and by requiring more processing power to correct it, eats up your mental processing time as well. It also steals all the useful information you missed because your brain just routed the data right past your awareness, thinking it knew the answer.
  • Zombies are sometimes controlled by a sorcerer, or voodoo bokor. Being prey to zombie ideas leaves you vulnerable. If you have the wrong information, you are more easily manipulated by the more knowledgeable. Knowledge, says Mr. Bacon, is power.
  • Zombies have no higher purpose than to make other zombies. This is closely related to the previous point. Even if you are not being manipulated, your decision-making suffers greatly when you are wrongly informed. You are also passing on your wrong information to everyone you talk to about it. Not being able to fulfill your own purposes, you are simply spreading poor data.

So we see that the tendency to irony is not just useful in and of itself, but useful in helping prevent zombie brain infections. As lunchtime is nearly over, and I can’t think of more similarities, I’m stopping here to get something to eat.

[Exit Alex stage right, slouching, mumbling, “Must…eat…brains.”]


Cross-posted in UXtraordinary.