Memory Encoding, Retrieval and Updating
Memory is key to human behavior and involves multiple processes, including encoding (getting information into our memory system), consolidation (stabilization of memory traces), and retrieval (bringing memories back to mind). How can we encode, maintain, and retrieve our memories? Can we decode our memories? Is it possible to modify consolidated memories? Is it possible to restore impaired memory functions?
Memory studies have emphasized the critical role of the hippocampus in memory formation and retrieval. Memory models further suggest that various cortical regions contribute to the encoding, storage, and retrieval of specific components of a memory. Neuroimaging studies of mental imagery and working memory support the view that the mechanisms engaged during the original perception or encoding of an event may be reactivated during its retrieval.
Newly formed memories stabilize through a consolidation process; until consolidation is complete, each memory remains in a labile, malleable state. Accumulated evidence demonstrates that even after this initial consolidation, stable memories become labile again immediately following retrieval and must then undergo reconsolidation. Much attention has focused on this labile state and on the reconsolidation of memory traces, as researchers hope it may offer a way to manipulate memories.
The goal of our research is to understand the neural processes underlying human memory. Such an understanding will not only provide fundamental insights into the memory mechanisms of the brain, but may also ultimately guide the development of effective interventions for the treatment of mental disorders.
In our everyday lives, we constantly experience feelings such as happiness, anger, and sadness. What neural mechanisms underlie our emotional processes? How do emotional contexts affect cognitive processes such as memory, attention, and motivation? Can we read our emotions from brain activity patterns? Prior studies indicate that regions of the primary emotion network, including the amygdala, play a critical role in emotional processing. However, higher-order areas such as the prefrontal cortex are also thought to be involved. Our recent work suggests that higher-order information processed in the prefrontal cortex is more strongly integrated into sensory representations when people experience emotional stimuli than when they experience neutral stimuli.
How do we recognize objects, faces, or the spaces around us? How do our behavioral goals, memories, or emotions affect our perceptual processing? Our everyday visual perception is the product of an interaction between externally driven “bottom-up” sensory information and internally generated “top-down” signals, which guide our interpretation of sensory input. The perception of even a single object can differ depending on top-down signals such as intent, attention, or prior knowledge. Our research mainly focuses on the influence of top-down processing on perception, and on comparing perceived information with information retrieved from memory.