A study published in eLife found that wakefulness, non-REM, and REM sleep serve complementary functions for learning. Researchers at the University of Bern found that generating new, virtual sensory inputs during REM sleep facilitates the extraction of semantic concepts, mimicking a machine learning technique, while replaying episodic memories via perturbed dreaming during non-REM sleep can lead to deeper learning. The researchers used simulations of the cortex to show how the different sleep phases shape learning: wakefulness, non-REM, and REM sleep each play a role in processing information, from experiencing a stimulus to extracting a semantic representation of that experience.
In this study, learning in the model was organized across three global brain states: wakefulness, non-rapid eye movement (NREM) sleep, and REM sleep, each optimizing a different objective function. The researchers based the model on a machine learning technique called Generative Adversarial Networks (GANs), which provides a computational perspective on sleep states, memory replay, and dreams. A cortical implementation of GANs involves two neural networks that compete with each other: one generates new data resembling the training set, while the other tries to tell generated data apart from real data. The dataset used here consisted of simple pictures of animals and objects. To evaluate the model's performance, a classifier assesses pictures read out from the cortical representations.
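To make the adversarial setup concrete, here is a minimal sketch in PyTorch of how two competing networks are trained against each other. This is not the authors' cortical model: the layer sizes, the latent dimensionality, and the use of random placeholder tensors in place of real pictures of animals and objects are all illustrative assumptions.

```python
# Minimal GAN sketch (illustrative only, not the study's implementation).
import torch
import torch.nn as nn

IMG_DIM = 32 * 32     # flattened "picture" size (assumed)
LATENT_DIM = 64       # size of the latent / "cortical" code (assumed)

# Generator: maps a latent code to a synthetic picture (a "dream").
generator = nn.Sequential(
    nn.Linear(LATENT_DIM, 256), nn.ReLU(),
    nn.Linear(256, IMG_DIM), nn.Tanh(),
)

# Discriminator: tries to tell real pictures from generated ones.
discriminator = nn.Sequential(
    nn.Linear(IMG_DIM, 256), nn.LeakyReLU(0.2),
    nn.Linear(256, 1), nn.Sigmoid(),
)

opt_g = torch.optim.Adam(generator.parameters(), lr=2e-4)
opt_d = torch.optim.Adam(discriminator.parameters(), lr=2e-4)
bce = nn.BCELoss()

# Placeholder batch standing in for real pictures (animals, objects, ...).
real_batch = torch.rand(128, IMG_DIM) * 2 - 1

for step in range(1000):
    # Discriminator update: real pictures -> 1, generated pictures -> 0.
    z = torch.randn(real_batch.size(0), LATENT_DIM)
    fake_batch = generator(z).detach()
    d_loss = (bce(discriminator(real_batch), torch.ones(real_batch.size(0), 1))
              + bce(discriminator(fake_batch), torch.zeros(real_batch.size(0), 1)))
    opt_d.zero_grad(); d_loss.backward(); opt_d.step()

    # Generator update: try to make the discriminator label fakes as "real".
    z = torch.randn(real_batch.size(0), LATENT_DIM)
    g_loss = bce(discriminator(generator(z)), torch.ones(real_batch.size(0), 1))
    opt_g.zero_grad(); g_loss.backward(); opt_g.step()
```

In the study's evaluation, a classifier reads out the cortical representations; in a sketch like this, that would correspond to training a separate readout on the latent codes and checking how well it identifies the content of the pictures.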
When we sleep, we cycle through two alternating sleep phases: NREM sleep and REM sleep. During NREM periods, the brain replays sensory experiences from wakefulness. REM sleep is characterized by spontaneous bursts of intense brain activity that usually produce vivid dreams. The University of Bern study found that dreams play a key role in forming semantic representations, and that this process is driven mainly by REM sleep. Dreams help us perform better on tasks by integrating old and new memories, and this processing of information happens at multiple levels.
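The contrast between the two phases can be caricatured in code. The sketch below, which reuses the illustrative generator from the example above, shows one way the two kinds of dreams might be distinguished: NREM-style replay of a single stored memory that is then perturbed (for example by occlusion), versus REM-style creative combination of several stored memories plus noise. The functions, the mixing rule, and the occlusion scheme are assumptions for illustration, not the paper's implementation.

```python
# Rough illustration of perturbed (NREM-like) vs. combined (REM-like) dreams.
import torch

def nrem_dream(generator, stored_latent, occlusion_fraction=0.3):
    """Replay one episodic memory, then perturb the resulting picture,
    mimicking perturbed dreaming during NREM sleep (illustrative only)."""
    picture = generator(stored_latent)
    mask = (torch.rand_like(picture) > occlusion_fraction).float()
    return picture * mask  # parts of the replayed experience are blanked out

def rem_dream(generator, latent_a, latent_b, noise_scale=0.5):
    """Creatively combine two stored memories plus noise into a new,
    never-experienced picture, mimicking REM dreaming (illustrative only)."""
    mixed = 0.5 * (latent_a + latent_b) + noise_scale * torch.randn_like(latent_a)
    return generator(mixed)
```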
Senior author Jakob Jordan explains that “Non-REM and REM dreams become more realistic as our model learns. While non-REM dreams resemble waking experiences quite closely, REM dreams tend to creatively combine these experiences.” Extracting general concepts from sensory experience appears to come naturally to humans and animals. The cognitive processes that occur when the brain is “offline” during sleep involve systematically replaying previous experiences and integrating this information with new experiences.