    Can Brains Be Binary? Due To Context, Even Or Odd Is No Easy Feat For Our Minds
    By News Staff | December 23rd 2013 04:31 AM

    Like computers, our brains can be described in binary terms: a switch is open or closed, a signal is passed or it isn't. Brains follow rules, like computers.

    But if the brain is like a computer, why do brains make mistakes that computers don't?

    Psychologist Gary Lupyan at the University of Wisconsin–Madison says that our brains stumble on even the simplest rule-based calculations because humans get caught up in contextual information, even when the rules are as clear-cut as separating even numbers from odd.

    Almost all adults understand that it is the last digit, and only the last digit, that determines whether a number is even. In a new study, that didn't keep them from mistaking a number like 798 for odd.
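The rule the study participants struggled with is trivially mechanical, which is the article's point. A minimal sketch (the function name `is_even` is illustrative, not from the study) showing that parity depends only on the final decimal digit:

```python
def is_even(n: int) -> bool:
    """Return True if n is even.

    Parity is determined entirely by the last decimal digit:
    checking it gives the same answer as n % 2 == 0.
    """
    last_digit = abs(n) % 10          # e.g. 798 -> 8
    return last_digit in {0, 2, 4, 6, 8}

# 798 ends in 8, so it is even despite its odd-looking leading digits.
print(is_even(798))   # True
print(is_even(400))   # True
print(is_even(39))    # False
```

A computer applies this rule without ever being distracted by the "mostly odd" digits 7 and 9, which is exactly the contextual pull the study documents in humans.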

    A significant minority of people, regardless of their formal education, judge 400 to be a better even number than 798, according to Lupyan, and systematically mistake numbers like 798 for odd. After all, its digits are mostly odd, right?

    "Most of us would attribute an error like that to carelessness, or not paying attention," says Lupyan. "But some errors may appear more often because our brains are not as well equipped to solve purely rule-based problems."

    Asked in experiments to sort numbers, shapes, and people into simple categories like evens, triangles, and grandmothers, study subjects often broke simple rules in favor of context.

    For example, when asked to consider a contest open only to grandmothers, in which every eligible contestant had an equal chance of victory, people tended to think that a 68-year-old woman with six grandchildren was more likely to win than a 39-year-old woman with a newborn grandchild.

    "Even though people can articulate the rules, they can't help but be influenced by perceptual details," Lupyan says. "Thinking of triangles tends to involve thinking of typical, equilateral sorts of triangles. It is difficult to focus on just the rules that make a shape a triangle, regardless of what it looks like exactly."

    In many cases, eschewing rules is no big deal. In fact, it can be an advantage in assessing the unfamiliar.

    "This serves us quite well," Lupyan says. "If something looks and walks like a duck, chances are it's a duck."

    Not on a math test, though, where rules are absolutely necessary for success. Thankfully, humans have learned to transcend their reliance on similarity.

    "After all, although some people may mistakenly think that 798 is an odd number, not only can people follow such rules — though not always perfectly — we are capable of building computers that can execute such rules perfectly," Lupyan says. "That itself required very precise, mathematical cognition. A big question is where this ability comes from and why some people are better at formal rules than other people."

    That question may be important to educators, who spend a great deal of time teaching rules-based systems of math and science.

    "Students approach learning with biases shaped both by evolution and day-to-day experience," Lupyan says. "Rather than treating errors as reflecting lack of knowledge or as inattention, trying to understand their source may lead to new ways of teaching rule-based systems while making use of the flexibility and creative problem solving at which humans excel."



    Published in the journal Cognition.
    Source: University of Wisconsin-Madison

    Comments

    rholley
    That figures with me.

    Although a part of my work has been as a Math Historian, I have never been happy with the bald logical algebraic “even-odd” proof that the square root of two cannot be a rational number.  It makes my brain hurt.
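    [For reference, the algebraic proof in question is the standard parity argument, reproduced here in its textbook form:]

    ```latex
    Suppose $\sqrt{2} = p/q$ with $p, q$ integers in lowest terms.
    Then $p^2 = 2q^2$, so $p^2$ is even, hence $p$ is even; write $p = 2k$.
    Substituting, $4k^2 = 2q^2$, so $q^2 = 2k^2$ is even, hence $q$ is even.
    But then $p$ and $q$ share the factor $2$, contradicting lowest terms.
    ```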

    On the contrary, this quasi-geometric proof by John Conway (now at Princeton) I could grasp immediately.  My notes at the time said: 
    This is John Conway’s 20th-century “geometric” proof. Assume that there is a minimum integer ratio of diagonal to side. One takes two squares with that integral side, and fits them into a square on the integral diagonal. One then obtains smaller squares A and B. Can you prove the contradiction?
    However, I say could, in the past tense, because since retirement a lot of that subject has been somewhat mothballed.  Though even as I write, it is coming back to me ...
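    [For readers curious about the contradiction left open above, the construction can be completed roughly as follows; this is a standard rendering of Conway's argument, not the commenter's own notes:]

    ```latex
    Let $d/s$ be the minimal integer ratio of diagonal to side, so $d^2 = 2s^2$.
    Place the two $s \times s$ squares in opposite corners of the
    $d \times d$ square: they overlap in a central square of side $2s - d$
    and leave two uncovered corner squares of side $d - s$.
    The two $s$-squares have the same total area as the $d$-square,
    so the doubly covered region must equal the uncovered region:
    \[ (2s - d)^2 = 2(d - s)^2. \]
    Thus $(2s - d)/(d - s)$ is a smaller integer ratio with the same
    property, contradicting the minimality of $d/s$.
    ```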



    Robert H. Olley / Quondam Physics Department / University of Reading / England