Our auditory system can detect sounds at an implicit level. The brain distinguishes between even very similar sounds, but we do not always consciously recognize these differences. A new study demonstrated this implicit discrimination during passive listening, when the subject is not explicitly trying to hear the differences.

The researchers found that the human brain unconsciously distinguishes between even very similar sound signals during passive listening.

The experiment involved 20 healthy volunteers. The participants listened to sounds while the researchers used electroencephalography (EEG) to measure their brain responses to the stimuli. The sounds were so similar that the participants could explicitly distinguish them with only 40% accuracy. First, the volunteers listened to sequences of three sounds in which one sound was repeated often, while the other two appeared rarely; they were asked to press a key if they heard a difference in the sounds. Then, in passive listening mode, the same sounds appeared in more elaborate sequences: groups of five similar sounds and groups in which the fifth sound was different.

Two types of sound sequences were used in the experiment: those with local irregularities and those with global irregularities. In the first type, groups of similar sounds were often repeated, while a group with a different sound at the end appeared randomly and rarely. In the second type, groups with a different sound at the end appeared often and groups of similar sounds appeared rarely.
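The local/global design described above can be sketched in a few lines of code. This is a minimal illustration, not the study's actual stimulus generator: the labels "A" and "B", the group length, the number of trials, and the rarity probability are all assumptions made for the sketch.

```python
import random

def make_block(frequent_group, rare_group, n_trials=100, rare_prob=0.15, seed=0):
    """Build a trial sequence in which `frequent_group` dominates and
    `rare_group` appears randomly and rarely (hypothetical proportions)."""
    rng = random.Random(seed)
    return [rare_group if rng.random() < rare_prob else frequent_group
            for _ in range(n_trials)]

# "A" stands in for the standard sound, "B" for the deviant; the real
# stimuli were barely distinguishable tones.
standard = ("A", "A", "A", "A", "A")   # group of five similar sounds
deviant  = ("A", "A", "A", "A", "B")   # group whose fifth sound differs

# Local-irregularity block: deviant-ending groups are the rare event.
local_block = make_block(standard, deviant)

# Global-irregularity block: deviant-ending groups are frequent, so the
# all-similar group becomes the rare, globally deviant event.
global_block = make_block(deviant, standard)
```

Swapping which group is frequent is the key design choice: the same physical sounds become either a local oddity (a deviant tone inside a group) or a global oddity (a group that violates the learned pattern of the block).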

Detecting these two types of sound sequences requires attention at different levels. The brain reacts differently to them, and EEG registers different types of potentials. Local irregularity can be detected without explicit attention and elicits mismatch negativity (MMN) and P3a potentials. Global irregularity demands concentration and elicits the P3b potential, which reflects a higher level of consciousness. The same potentials were registered in earlier experiments using the same methodology. What distinguishes the current study is that the researchers used barely distinguishable sounds, whereas in earlier studies the stimuli (sounds or images) could be recognized with 100% accuracy.

"We made the sound sequence more complicated, assuming this would facilitate sound recognition. We would see this in an increased amplitude of potential. But the result was unexpected. Instead of P3b potential in global irregularities, we saw emerging N400 potential, which is related to explicit information processing but can also appear in implicit attention. The appearance of this potential is a sign of a hidden, implicit form of learning that is constantly happening in our lives," commented Olga Martynova, Senior Research Fellow at the National Research University Higher School of Economics in Moscow.

The authors say the appearance of the N400 potential supports the theory of predictive coding, in which the brain creates a model of the environment based on its experience and uses predictions to optimize its operations. When faced with experiences that contradict these predictions, the brain updates its model of the world. This process forms the basis of implicit (unconscious) learning and serves to minimize prediction errors, enabling better adaptation and faster reaction to changes in the environment.