    Emotional Developmental Symbol Creation
    By Samuel Kenyon | December 2nd 2012

    An artificial intelligence system that is similar to humans or other animals has to have some way to generate meaning. A potential mechanism of meaning is an emotional system with built-in symbol grounding.

    When I talk about mental symbols in this article, I assume that ideas, concepts, knowledge, representations, etc. are all composed of mental symbols.

    Before getting into emotions, let's take a higher-level look at the developmental levels of children.

    Development Levels


    Piaget's theoretical framework of how a child's mind transforms as it grows has four stages.

    Stage 1: Sensorimotor (0–2 years)
        Object permanence (knowing an object exists even if it's hidden); connecting actions to perceptions.

    Stage 2: Preoperational (2–7 years)
        Egocentrism; symbolic play.

    Stage 3: Concrete Operational (7–11 years)
        Logical thought applied to physical objects; conservation (redistributing material doesn't affect its mass, number, or volume).

    Stage 4: Formal Operational (11 years +)
        Manipulating ideas in the head, e.g. abstract reasoning.

    (Table adapted from [1])

    The actual years in which the stages occur, especially Formal Operational, have been debated for decades, but in this article I am not concerned with exact years. An AI system could be time compressed or expanded, and might actually be a new kind of animal (not purely human) so its stages could be different in many ways.

    Emotional Development


    In the book The First Idea, Stanley I. Greenspan and Stuart G. Shanker proposed nine childhood stages of cognitive development based on emotional functional development. The theories proposed in their book are based on child and infant clinical studies as well as non-human primate studies.

    Each stage overlaps with the previous stages, unlike Piaget's first three stages, each of which ends as the next begins:
    these emotional abilities build on one another. For example, a baby must be engaged in a relationship with a caregiver for loving feelings to become part of an emotional exchange of signals. Using emotional ideas---”I feel sad”---precedes building logical bridges between emotional ideas: “I feel sad because you didn't play with me.” [2]

    Here is a table summarizing the stages. Note that Greenspan and Shanker warn that these are rough categorizations of a continuous development process.

    Stage 1: Regulation and Interest in the World (0 months +)
        Pleasurable interest in sights, sounds, touch, movement, and other sensory experiences. Looking, listening, calming, awareness of the outer world and simple patterns.

    Stage 2: Engaging and Relating (2–4 months +)
        Pleasurable feelings characterize relationships. Growing feelings of intimacy.

    Stage 3: Intentionality (4–8 months +)
        A range of feelings become used in back-and-forth emotional signaling to convey intentions; the beginning of cause-and-effect thinking.

    Stage 4: Problem Solving, Mood Regulation, and a Sense of Self (9–18 months +)
        A continuous flow of emotional interactions to express wishes and needs and solve problems.

    Stage 5: Creating Symbols and Using Words and Ideas (18 months +)
        Experiences, including feelings, intentions, wishes, action patterns, etc., are put into words, pretend play, drawings, or other symbolic forms at different levels.

    Stage 6: Emotional Thinking, Logic, and a Sense of “Reality” (2.5 years +)
        Symbolized experiences are connected together logically to enable thinking, the ability for differentiated feelings, and the ability to create connections between feeling states.

    Stage 7: Multiple-Cause and Triangular Thinking
        Exploring multiple reasons for a feeling, comparing feelings, and understanding triadic interactions among feeling states.

    Stage 8: Gray-Area Emotionally Differentiated Thinking
        Shades and gradations among differentiated feeling states.

    Stage 9: Intermittent Reflective Thinking
        Reflecting on feelings in relationship to an internalized sense of self, and an internal standard.

    (Table adapted from [3]).

    Greenspan and Shanker also theorized adolescence and adulthood stages as follows, although they seem to be oriented towards what I would guess are Greenspan's clinical psychiatry ideas of a healthy and normal adult. I won't discuss these here.
    Stage 10: Expanded Sense of Self
    Stage 11: Reflecting on a Personal Future
    Stage 12: Stabilizing a Separate Sense of Self
    Stage 13: Intimacy and Commitment
    Stage 14: Creating a Family
    Stage 15: Middle Age
    Stage 16: “Wisdom of the Ages”

    (Table adapted from [3]).

    Greenspan and Shanker claim that Piaget and company did not figure out how symbols are formed, and that Piaget viewed emotions as more of a secondary phenomenon. There are some interesting design ideas in Piaget's schemas, and they aren't necessarily at cross-purposes with the emotional approach, but I am not going to get into schemas here.

    Early Emotions


    Even abstract, intellectual concepts, those that underlie theoretical scientific speculations, also reach back to a child's felt experience. [2]

    From the beginning, a baby dual-codes every sensation:

    1. The physical effect

    2. The emotional effect


    As Greenspan and Shanker exemplify:
    Thus, a blanket may feel smooth and pleasant or itchy and irritating...a voice loud and inviting or jarring...
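
    To make the dual-coding idea concrete, here is a minimal sketch of my own (not code from the book or from any particular AI system) of a sensation record that carries both codings; the field names and values are hypothetical.

    ```python
    from dataclasses import dataclass

    @dataclass
    class Sensation:
        """A single sensory event, dual-coded: the physical qualities of the
        stimulus plus the emotional tone it evokes in this particular agent."""
        modality: str     # e.g. "touch", "sound", "vision"
        physical: dict    # measured qualities, e.g. {"texture": "smooth"}
        valence: float    # emotional coding, from -1.0 (aversive) to +1.0 (pleasurable)

    # The blanket example above, as two dual-coded records: the same kind of object
    # can be coded as pleasant or irritating depending on how it is experienced.
    pleasant_blanket = Sensation("touch", {"texture": "smooth"}, valence=0.7)
    irritating_blanket = Sensation("touch", {"texture": "itchy"}, valence=-0.4)
    ```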

    In Stage 1 (Regulation and Interest in the World), a baby's interest in the world is largely due to the relationship with its caregivers. In Stage 2 (Engaging and Relating), a baby is obsessed with the primary caregiver. The emotional interactions, according to Greenspan and Shanker, cause a baby to discriminate the emotional interests of the human world from those of the inanimate world, and to learn pattern recognition.

    Greenspan and Shanker claim that the baby-caregiver interactions are crucial to cognitive development and involve these constructs (a toy sketch follows the list):

    1. Co-regulation: Partners create new behaviors and meanings.

    2. Reciprocity: back and forth conversation-like patterns.


    During Stage 5, a baby begins using mental symbols (the baby is also using verbal symbols, a.k.a. words, but that is outside the scope of this article).

    Forming Symbols with Emotions


    There are two conditions to form mental symbols:

    1. Separation of perceptions from actions

    2. Relevant emotional experiences


    Perception directly triggers action in nonhuman animals, young human babies, and impulsive older humans (Greenspan and Shanker make reference to the barroom brawler stereotype). I would assume that, depending on the animal, “directly” means something slightly different, with more or less machinery in the path from perception to action.

    W. Grey Walter's turtle robots, built in the 1940s and 1950s, were based on a tight coupling between perception and action. Over the years many other reflexive AI architectures have been made, such as the subsumption architecture. Sometimes the reflexive system is a layer within a larger system that also includes, for instance, high-level (but slower) planning. These reflexive robots and layers do not have mental symbols.
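
    To make the contrast concrete, here is a minimal sketch in the spirit of these reflexive robots and layers (not the code of any actual system; the sensor names and thresholds are made up), where perception maps directly to action with nothing stored in between:

    ```python
    def reflexive_step(light_level: float, bumped: bool) -> str:
        """One control cycle of a purely reflexive layer: perception is wired
        directly to action, with no freestanding representation in between."""
        if bumped:                 # highest-priority reflex, subsumption-style
            return "reverse_and_turn"
        if light_level > 0.5:      # phototaxis, in the spirit of Walter's light-seeking robots
            return "steer_toward_light"
        return "wander"

    # Each cycle simply re-evaluates the reflexes; nothing persists between cycles
    # that could be manipulated as a mental symbol.
    action = reflexive_step(light_level=0.8, bumped=False)   # -> "steer_toward_light"
    ```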

    Here is a diagram representing perception-action coupling.



    Somehow, humans have a cognitive development step that allows an image (a multisensory structure) to be disconnected from action. This is a “freestanding image”.



    The aforementioned “somehow” is via emotional interactions with a caregiver that result in an infant taming emotional patterns. One thing that is strange about this is that the development of symbol formation essentially depends on initial external interactions before it can continue internally.

    Then the next step that allows a true symbol to form is the emotional linkage:


    Objective Concepts


    Jackie Chappell and Aaron Sloman have also written about altricial cognitive architectures, but they have challenged the assumption that all learned symbols are grounded.

    Human-like AI implementations would have to learn objective concepts about the world [4]:
    For example, young human infants reach to grasp an object with their mouths if their hands are not available, and once they have discovered that they can pull a toy resting on a blanket towards them by pulling the blanket, they can transfer the same action to perceptually very different, but functionally similar materials (such as a sheet of paper).

    But Chappell and Sloman claim that if the objective knowledge is tied closely to particular sensory and action signals within the body, then generalizations cannot be applied to new contexts [4].
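
    A toy illustration of that claim (my own, not Chappell and Sloman's): knowledge expressed in terms of the body's particular sensory signals fails to transfer, while an objective concept expressed in terms of functional properties carries over to the new material. The object descriptions are hypothetical.

    ```python
    # Hypothetical objects described by raw sensory features and by functional properties.
    blanket = {"texture": "soft",  "color": "blue",  "supports_toy": True, "pullable": True}
    paper   = {"texture": "stiff", "color": "white", "supports_toy": True, "pullable": True}

    def can_retrieve_toy_sensory(surface: dict) -> bool:
        """Knowledge tied to particular sensory signals: only works for the blanket."""
        return surface["texture"] == "soft" and surface["color"] == "blue"

    def can_retrieve_toy_objective(surface: dict) -> bool:
        """An objective concept: the toy can be pulled closer if the surface holds it
        and the surface itself can be pulled, however it looks or feels."""
        return surface["supports_toy"] and surface["pullable"]

    print(can_retrieve_toy_sensory(paper))    # False: fails to generalize
    print(can_retrieve_toy_objective(paper))  # True: transfers to the new material
    ```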

    Can emotionally invested symbols overcome this problem? It seems that they can, if the description of Stage 6 with regard to a sense of reality is correct. In Stage 6, a child learns how to contextually connect symbols logically [2].
    Connecting ideas logically is also the basis for reality testing, because the child now connects experiences inside herself with those outside and categorizes which are which…

    Emotional investment in symbols allows general reasoning to emerge. This is the beginning of reasoning about non-emotional aspects of the world [2].
    For example, cause-and-effect thinking with symbols comes from dealing logically with someone else's intentions or feelings: "When I'm mean, my mom gets annoyed with me."
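
    A small sketch of what such logical bridges might look like on top of emotionally invested symbols (an illustration of the idea only; the representation is hypothetical):

    ```python
    # Emotionally invested symbols from earlier stages: label -> valence.
    symbols = {
        "i_am_mean":               -0.3,
        "mom_is_annoyed":          -0.6,
        "you_didnt_play_with_me":  -0.5,
        "i_feel_sad":              -0.7,
    }

    # Stage-6-style logical bridges: directed cause -> effect links between emotional ideas.
    causal_links = [
        ("i_am_mean", "mom_is_annoyed"),           # "When I'm mean, my mom gets annoyed with me."
        ("you_didnt_play_with_me", "i_feel_sad"),  # "I feel sad because you didn't play with me."
    ]

    def explain(feeling: str) -> list:
        """A miniature of reality testing: retrieve causes linked to a known feeling."""
        if feeling not in symbols:
            return []
        return [cause for cause, effect in causal_links if effect == feeling]

    print(explain("i_feel_sad"))   # -> ['you_didnt_play_with_me']
    ```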

    AI Implications


    One thing to consider is the actual design of the emotional aspect of the architecture. There are different ways to do this that would be compatible with this theoretical framework.
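
    One way to keep that design choice open (a sketch of my own, not a known framework) is to put the emotional appraisal behind a small interface, so that different implementations, innate or learned, can be swapped in without changing the rest of the architecture:

    ```python
    from typing import Protocol

    class Appraiser(Protocol):
        """Any component that maps a percept to an emotional valence could play
        the 'emotional aspect' role in the architecture."""
        def appraise(self, percept: dict) -> float: ...

    class InnateAppraiser:
        """Fixed, built-in reactions, e.g. very loud sounds are aversive."""
        def appraise(self, percept: dict) -> float:
            return -0.8 if percept.get("loudness", 0.0) > 0.7 else 0.2

    class LearnedAppraiser:
        """Valences shaped over time by interactions with caregivers."""
        def __init__(self) -> None:
            self.learned: dict = {}
        def appraise(self, percept: dict) -> float:
            return self.learned.get(percept.get("label"), 0.0)

    def emotionally_code(percept: dict, appraiser: Appraiser) -> float:
        """Attach a valence to a percept using whichever appraiser is plugged in."""
        return appraiser.appraise(percept)
    ```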

    Another implication for an altricial AI system that implements this kind of symbol formation is that it requires organism-level caregivers (for emotional interaction) and suitable environments for the various development stages. The design of the development context is just as important as the baby AI's cognitive architecture design.

    Another consideration is that there is not necessarily a “correct” set of symbols.
    Though we generally assume that we all experience sensations---such as sound and touch---in more or less the same way, significant variations are now known to exist in the ways individuals process even very simple sensory information. We have explored the emotional consequences of those sensory differences first described by Jean Ayres, a pioneer in occupational therapy. A given sensation can produce quite different emotional effects in different individuals---pleasure, for example, in one person, but anxiety in another. Each of us, therefore, quite unwittingly creates our own personal, and sometimes idiosyncratic, “catalogue” of sensory and emotional experience.[2]

    The implementation and context differences will result in slightly different internal structures over time. In software and robotics, one can swap modules between systems (indeed reusability is often desirable). But with this kind of internal structure, a new kind of problem emerges if one wants to reuse the knowledge embedded in the artificial mind-world system for another agent, or in order to generate future agents with innate knowledge. I discussed some aspects of this problem in my essay “Transplanting Commonsense Structures.”


    References

    [1] McLeod, S. A. (2009). Jean Piaget | Cognitive Theory. Retrieved from http://www.simplypsychology.org/piaget.html

    [2] Greenspan, S. I. & Shanker, S. G., The First Idea: How Symbols, Language, and Intelligence Evolved from Our Primate Ancestors to Modern Humans, Da Capo Press, 2004.

    [3] Ibid. pp. 88-91.

    [4] Chappell, J. & Sloman, A., "Natural and artificial meta-configured altricial information-processing systems." International Journal of Unconventional Computing, 3:211-239 (2007).