Name:
Lihan Chen
Title:
Associate Professor, Doctoral Supervisor
Research Areas:
Crossmodal information integration; cognitive neuroscience; engineering psychology
Mailing Address:
Wang Kezhen Building, Peking University, 100080
Email:
CLH AT pku.edu.cn
Homepage:
https://www.multisensorylab.com/

Lihan Chen is an Associate Professor at Peking University (2015-present). He holds a B.Eng. (1999, Zhejiang University), an M.Ed. (2005, Zhejiang University), and a Ph.D. (2010, University of Munich). He is Deputy Chair of the Department of Brain and Cognitive Sciences (2019-present), a member of the Chinese Psychological Society, a member of its Engineering Psychology Committee (2016-present), a member of the Scientific Committee of the International Multisensory Research Forum (2016-present), Vice Chair of the Cognitive Modeling Committee of the Chinese Association for Cognitive Science (2021-2025), a member of the Human Factors and Ergonomics in Complex Systems Division of the Chinese Ergonomics Society (2021-2025), and an expert member of the User Experience Standards Working Committee of the China Electronics Standardization Association (2022-present). He serves on the editorial boards of Perception (2021-2024) and Frontiers in Virtual Reality (2019-present). His research focuses on multisensory attention and cognitive neuroscience, using psychophysics, virtual reality, EEG and MEG, eye tracking, and computational modeling.

Teaching

  1. Undergraduate required course "Research Methods in Psychology: MATLAB", Fall 2011, Spring 2012-present
  2. Graduate innovation-capacity course "Multisensory Information Processing", Spring 2013, Spring 2014, Spring 2017
  3. Graduate education innovation program course "Engineering Psychology", Spring 2018-present

Service

Served as class advisor for master's students (2011-2015). Editorial board member of Frontiers in Virtual Reality and Frontiers in Augmented Reality (2019-present) and of Perception/i-Perception (2021-2024). Wrote the "Perception" entry (General Psychology) for the Psychology volume of the Encyclopedia of China (3rd edition). Reviewer for Chinese core journals including Acta Psychologica Sinica, Psychological Science, Science China, Advances in Psychological Science, and Chinese Journal of Applied Psychology, and for international journals and conferences including Attention, Perception & Psychophysics; Cognition; Cerebral Cortex; Cortex; Frontiers in Psychology; Fundamental Research; Journal of Experimental Psychology: Human Perception and Performance; Journal of Vision; NeuroImage; Neuroscience; Perception; Scientific Reports; Somatosensory & Motor Research; IEEE Transactions on Haptics; Multisensory Research; Vision Research; and the International Conference on Intelligent Robots and Systems (IROS 2019). Guest editor of the Frontiers in Psychology research topic "Sub- and supra-second timing: brain, learning and development". Associate editor for Haptic Science at the 2015 IEEE World Haptics Conference. Co-organizer of the 17th International Multisensory Research Forum (IMRF 2016, June 2016). Publication chair of AsiaHaptics 2022.

Research interests: multisensory time perception; crossmodal stimulus encoding and feature binding; tactile perception; engineering psychology and human factors

Main lines of research:

(1) Mechanisms of multisensory perception and information processing. Studies attentional processing of sub-second (under 1 s) multisensory signals (visual, auditory, and tactile), focusing on the cognitive neuroscience of spatiotemporal information processing.

(2) Multisensory integration in near-realistic environments and neuroergonomic assessment. Studies cognitive neuroscience and neuroergonomics questions concerning human cognitive performance, sensory augmentation, and cognitive training (rehabilitation) in specific work contexts.

Current Projects

  1. Science and Technology Innovation 2030 "Brain Science and Brain-Inspired Research" Major Project (2021ZD0202600), "Neural Mechanisms of Multisensory Integration", 2021.12-2026.1, project PI.
  2. NSFC International (Regional) Cooperation and Exchange Program, Sino-German major interdisciplinary collaborative research project (62061136001), "Crossmodal Learning: Adaptivity, Prediction and Interaction", 2020.1-2023.12, sub-project PI.
  3. Industry-sponsored project, "Human Factors Research on Innovative User Experience", 2022.6-2023.9, PI.
  4. Cyrus Tang Foundation project, "Campus Screenomics and Time Bank Construction", 2022.3-2024.12, PI.

Completed Projects

  • NSFC International (Regional) Cooperation and Exchange Program (No. 31861133012), "Value-Driven Crossmodal Attentional Processing", 2019.1-2021.12, sub-project PI.
  • NSFC General Program (No. 11774379), "Attentional Selection of Multidimensional Features in the Cocktail Party Effect and Its Cognitive Modeling", 2018.1-2021.12, participant.
  • Industry-sponsored project, "Cognitive Mechanisms of Crossmodal Statistical Learning", 2019.1-2020.12, PI.
  • Industry-sponsored project, "Sensory Integration and Brain-Cognitive Function Tests of Time Perception and Behavioral Performance in Children with Autism", 2018.5-2019.6, PI.
  • NSFC Major Scientific Instrument Development Project (No. 61527804), "Perceptual Quality Assessment Instrument for Ubiquitous Video", 2016.1-2020.12, participant.
  • NSFC Sino-German major international collaborative project "Crossmodal Learning: Adaptivity, Prediction and Interaction" (No. 61621136008), 2016.1-2019.12, sub-project PI.
  • NSFC General Program (No. 81371206), "Cognitive Neural Mechanisms of Perceptual Learning Deficits in Chinese Children with Developmental Dyslexia", 2014.1-2017.12, participant.
  • NSFC General Program (No. 31371127), "Differential Effects of Speed and Distance on Spatiotemporal Integration in Three-Dimensional Space: Behavioral and Neural Evidence", 2014.1-2017.12, participant.
  • NSFC General Program (No. 31470978), "Crossmodal Correspondences Between Visual Motion and Auditory and Tactile Features in Three-Dimensional Space", 2015-2018, participant.
  • Institute of Acoustics, Chinese Academy of Sciences, contract project "Experiments and Tests on Audiovisual Feature Binding and Attentional Processing in the Cocktail Party Effect", 2016-2017, PI.
  • Ministry of Education Scientific Research Startup Fund for Returned Overseas Scholars (45th batch), "Effects of Perceptual Competition Among Auditory Signals on Visual Apparent Motion and Its Processing Mechanisms", 2013.1-2013.12, PI.
  • China Postdoctoral Science Foundation Special Grant (4th batch), "Effects of Spatial Mapping and Spatiotemporal Metaphors on Crossmodal Apparent Motion", 2011.1-2011.12, PI.
  • Fundamental Research Funds for the Central Universities, "Multisensory Information Integration and Motor Control", 2010-2012, participant.
  • China Postdoctoral Science Foundation (47th batch), "Effects of the Temporal Structure and Physical Properties of Auditory Stimuli on Tactile Apparent Motion", 2010-2011, PI.
  • NSFC Young Scientists Fund (No. 31000456), "Uncertainty in Sensorimotor Control Based on Bayesian Theory", 2011-2013, participant.
  • NSFC Young Scientists Fund (No. 31200760), "Crossmodal Information Interaction and Perceptual Averaging", 2013.1-2015.12, PI.
  • 863 Program (No. 2012AA011602), "Key Technologies and Platform for Brain-Machine Collaborative Audiovisual Information Processing", 2012-2015, participant.
  • Institute of Acoustics, Chinese Academy of Sciences, contract project "Multivariate Processing of Acoustic Signals and Perceptual Processing of Sound Sequences", 2013-2014, PI.
  • Open project of the Center for Cognition and Brain Disorders, Hangzhou Normal University, "Tactile Temporal Perceptual Learning and Brain Plasticity", 2013-2015, PI.
  • Institute of Acoustics, Chinese Academy of Sciences, contract project "Psychological and Physiological Tests of Temporal Perception of Pure-Tone and Speech Sequences", 2014-2015, PI.

English Publications

Hechuan Zhang, Xuewei Liang, Ying Lei, Yanjun Chen, Zhenxuan He, Yu Zhang, Lihan Chen, Hongnan Lin, Teng Han*, and Feng Tian. 2024. Understanding the Effects of Restraining Finger Coactivation in Mid-Air Typing: from a Neuromechanical Perspective. In Proceedings of the 37th Annual ACM Symposium on User Interface Software and Technology (UIST '24). Association for Computing Machinery, New York, NY, USA, Article 51, 1–18. https://doi.org/10.1145/3654777.3676441

Guo, L., Bao, M*., Chen, Z., & Chen, L*. (2024). Contingent magnetic variation and beta-band oscillations in sensorimotor temporal decision-making. Brain research bulletin, 215, 111021. https://doi.org/10.1016/j.brainresbull.2024.111021

Liu, J., Chen, L., Gu, J., Buidze, T., Zhao, K*., Liu, C. H., Zhang, Y., Gläscher, J., & Fu, X. (2024). Common intentional binding effects across diverse sensory modalities in touch-free voluntary actions. Consciousness and cognition, 123, 103727. https://doi.org/10.1016/j.concog.2024.103727

He, X., Ke, Z., Wu, Z., Chen, L*., & Yue, Z*. (2023). The speed and temporal frequency of visual apparent motion modulate auditory duration perception. Scientific reports, 13(1), 11281. https://doi.org/10.1038/s41598-023-38183-w

Kang, G., Luo, X., Chen, L., Chen, J., Chen, J., Dai, H., & Zhou, X. (2023). Reward delays quitting in visual search. Psychological research, 10.1007/s00426-023-01860-6. Advance online publication. https://doi.org/10.1007/s00426-023-01860-6

Liu, Y., Katzakis, N., Steinicke, F., Chen, L*. (2023). Peripersonal Space Tele-Operation in Virtual Reality: The Role of Tactile - Force Feedback. In: Wang, D., et al. Haptic Interaction. AsiaHaptics 2022. Lecture Notes in Computer Science, vol 14063. Springer, Cham. https://doi.org/10.1007/978-3-031-46839-1_13

Chen, L.* (accepted). Synesthetic correspondence: an overview. In Gu, Y. & Zaidel, A. (Eds.), Advances of Multisensory Integration in the Brain. Springer. https://link.springer.com/book/9789819976102

Feng, S., Wang, Q., Hu, Y., Lu, H., Li, T., Song, C., Fang, J., Chen, L.*, Yi, L.* (2022). Increasing Audiovisual Speech Integration in Autism Through Enhanced Attention to Mouth. Developmental Science, e13348. https://doi.org/10.1111/desc.13348

Xu, S., Zhou, X.*, & Chen, L.* (2022). Intermodulation from Unisensory to Multisensory Perception: A Review. Brain Sciences, 12, 1617. https://doi.org/10.3390/brainsci12121617

Huang, J., Chen, L.*, Zhou, X.*(2022). Self-reference modulates the perception of visual apparent motion. Attention, Perception & Psychophysics. https://doi.org/10.3758/s13414-022-02620-1

Gu, X., Chen,L., Wang,G., & Li, S.* (2022). An alternative paradigm for assessing attitudes in virtual reality -interpersonal distance paradigm: taking weight stigma as an example. Frontiers in Virtual Reality. doi: 10.3389/frvir.2022.1015791

Kun Liang, Wu Wang, Xiao Lei, Huanke Zeng, Wenxiao Gong, Chunmiao Lou, Lihan Chen*(2022). Odor-induced sound localization bias under unilateral intranasal trigeminal stimulation, Chemical Senses, Volume 47, bjac029, https://doi.org/10.1093/chemse/bjac029

Yue,S., Chen,L* (2022) Graphemic and Semantic Pathways of Number–Color Synesthesia: A Dissociation of Conceptual Synesthesia Mechanisms. Brain Sciences. https://www.mdpi.com/2076-3425/12/10/1400

Wang,W., Lei X., Gong,W., Liang,K.,Chen,L.* (2022). Facilitation and inhibition effects of anodal and cathodal tDCS over areas MT+ on the flash-lag effect. Journal of Neurophysiology, 128:239-248. https://journals.physiology.org/doi/abs/10.1152/jn.00091.2022

Lou, C., Zeng,H., & Chen, L.* (2022). Asymmetric switch cost between subitizing and estimation in tactile modality. Current Psychology, https://doi.org/10.1007/s12144-022-02858-w

Gong, W., Gu, L., Wang, W., & Chen, L.* (2022). Interoception visualization relieves acute pain. Biological Psychology, 169, Article ID 108276. https://doi.org/10.1016/j.biopsycho.2022.108276

Liu,S.*, Nakajima, Y., Chen, L., Arndt, S., Kakizoe, M., Elliott, M., & Remijn, G. (2022). How Pause Duration Influences Impressions of English Speech: Comparison Between Native and Non-native Speakers. Front. Psychol. 13:778018.doi: 10.3389/fpsyg.2022.778018

Li, B*., Wang, K. & Chen, L. (2021). The rhythm aftereffect induced by adaptation to the decelerating rhythm. Psychonomic Bulletin & Review, https://doi.org/10.3758/s13423-021-02014-8

Feng S, Lu H, Wang Q, Li T, Fang J, Chen L*, Yi L*(2021).Face-viewing patterns predict audiovisual speech integration in autistic children. Autism Research, https://doi.org/10.1002/aur.2598.

Feng S, Lu H, Fang J,Li X, Yi,L*,Chen L* (2021). Audiovisual speech perception and its relation with temporal processing in children with and without autism. Reading and Writing, https://doi.org/10.1007/s11145-021-10200-2

Ding-Zhi Hu, Kai Wen, Li-Han Chen*, Cong Yu*(2021).Perceptual learning evidence for supramodal representation of stimulus orientation at a conceptual level,Vision Research,187:120-128.

Li B, Jia J, Chen L*, Fang F* (2021). Electrophysiological correlates of the somatotopically organized tactile duration aftereffect. Brain Research, 1762, Article ID 147432.

Chen L*, Hsin-I Liao* (2020). Microsaccadic eye movements but not pupillary dilation response characterizes the crossmodal freezing effect, Cerebral Cortex Communications,tgaa072, https://doi.org/10.1093/texcom/tgaa072

Tian Y, Liu X* and Chen L* (2020). Mindfulness Meditation Biases Visual Temporal Order Discrimination but Not Under Conditions of Temporal Ventriloquism. Front. Psychol. 11:1937. doi: 10.3389/fpsyg.2020.01937

Katzakis N*, Chen L and Steinicke F (2020) Visual-Haptic Size Estimation in Peripersonal Space. Front. Neurorobot. 14:18. doi: 10.3389/fnbot.2020.00018

Chen L (2020) Education and visual neuroscience: A mini-review. Psych Journal. doi: 10.1002/pchj.335

Nikolaos Katzakis, Lihan Chen, Oscar Ariza, Robert Teather, Frank Steinicke (in press). Evaluation of 3D Pointing Accuracy in the Fovea and Periphery in Immersive Head-Mounted Display Environments. IEEE Transactions on Visualization and Computer Graphics (TVCG)

Li B., Chen L.* and Fang F.* (2019) Somatotopic representation of tactile duration: evidence from tactile duration aftereffect. Behavioural Brain Research. Volume 371, Article ID 111954

Zeng, H and Chen, L* (2019) Robust Temporal Averaging of Time Intervals Between Action and Sensation. Front. Psychol. 10:511. doi: 10.3389/fpsyg.2019.00511

Lei X, Zhang T, Chen K, Zhang J, Tian Y, Fang F* and Chen L* (2019). Psychophysics of wearable haptic/tactile perception in a multisensory context. Virtual Reality & Intelligent Hardware, 1(2): 185-200. DOI: 10.3724/SP.J.2096-5796.2018.0012

Chen, L.(2019) Discrimination of empty and filled intervals marked by auditory signals with different durations and directions of intensity change. PsyCh Journal. DOI: 10.1002/pchj.267

Shen L#, Han B#, Chen L#, and Chen Q* (2019) Perceptual inference employs intrinsic alpha frequency to resolve perceptual ambiguity. PLoS Biology 17(3): e3000025.(# co-first authors)

Zheng,W*.,and Chen, L*.(2019) Illusory perception of auditory filled duration is task- and context-dependent. British Journal of Psychology, DOI:10.1111/bjop.12379

Lihan Chen*, Xiaolin Zhou, Hermann J. Müller, and Zhuanghua Shi (2018). What You See Depends on What You Hear: Temporal Averaging and Crossmodal Integration. Journal of Experimental Psychology: General.

Zheng,W.* & Chen, L.* (2018). The Roles of Attentional Shifts and Attentional Reengagement in Resolving The Spatial Compatibility Effect in Tactile Simon-like Tasks. Scientific Reports, 8:8760. doi:10.1038/s41598-018-27114-9.

Wan, Y. & Chen, L.* (2018). Temporal Reference, Attentional Modulation,and Crossmodal Assimilation. Front. Comput. Neurosci. 12:39. doi: 10.3389/fncom.2018.00039.

Yiltiz, H. & Chen, L.*(2018) Emotional cues and social anxiety resolve ambiguous perception of biological motion. Exp Brain Res. https://doi.org/10.1007/s00221-018-5233-3 

Tian,Y. & Chen, L.* (2018) Cross-modal attention modulates tactile subitizing but not tactile numerosity estimation.Attention, Perception, & Psychophysics, https://doi.org/10.3758/s13414-018-1507-x 

Chen, L.*, Feng, W.,Yue, Z. (2018). Introduction to the Special Issue on Multisensory Research Forum (IMRF 2016, Suzhou). Multisensory Research,31,345-349.

Liu Y*, Zang X, Chen L, Assumpção L, Li H*. (2018). Vicariously touching products through observing others' hand actions increases purchasing intention, and the effect of visual perspective in this process: an fMRI study. Human Brain Mapping, 39, 332-343.

Nikolaos Katzakis, Jonathan Tong, Oscar Ariza, Lihan Chen, Gudrun Klinker, Brigitte Röder, and Frank Steinicke (2017). Stylo and Handifact: Modulating Haptic Perception through Visualizations for Posture Training in Augmented Reality. In Proceedings of SUI '17, Brighton, United Kingdom, October 16-17, 2017, 10 pages. https://doi.org/10.1145/3131277.3132181

Guo,L., Bao, M*., Guan,L., Chen, L.*.(2017).Cognitive Styles Differentiate Crossmodal Correspondences Between Pitch Glide and Visual Apparent Motion, Multisensory Research 30,363-385.

Chen, L.*, Guo, L., Bao,M*.(2017).Sleep-dependent consolidation benefits fast transfer of time interval training. Experimental Brain Research 235(3),661-672. 

Chen, L.*, Guo, L., Wan, Y., Bao, M. (accepted). Temporal perceptual grouping and transfer in a multisensory context. Technical Report of the Auditory Research Meeting, Acoustical Society of Japan.

Chen L.*, Bao Y and Wittmann M (2016). Editorial: Sub-and supra-second timing: brain, learning and development. Front. Psychol. 7:747. doi: 10.3389/fpsyg.2016.00747

Yue Z.*, Gao T., Chen L., Wu J.(2016) Odors bias time perception in visual and auditory modalities. Frontiers in Psychology. 10.3389.

Chen L*., Zhang M., Ai F.#, Xie W.#, Meng X.(2016).Cross-modal synesthetic congruency improves visual timing in dyslexic children.Research in Developmental Disabilities. 55,14-26 (# equal contributions).

Zhang Y and Chen L * (2016) Crossmodal statistical binding of temporal information and stimuli properties recalibrates perception of visual apparent motion. Front. Psychol. 7:434.doi:10.3389/fpsyg.2016.00434

Gupta, D.S*., and Chen L.* (2016) Brain oscillations in perception, timing and action, 8:161-166, Current opinion in Behavioral Sciences.

Wang Q., Guo L., Bao M.* and Chen L.*(2015) Perception of visual apparent motion is modulated by a gap within concurrent auditory glides, even when it is illusory. Front. Psychol. 6:564.doi:10.3389/fpsyg.2015.00564

Yiltiz H. & Chen L.* (2015). Tactile input and empathy modulate the perception of ambiguous biological motion. Front. Psychol. 6:161. doi: 10.3389/fpsyg.2015.00161

Chen L. (2014) How many neural oscillators we need on sub- and supra-second intervals processing in the primate brain. Front. Psychol. 5:1263. doi: 10.3389/fpsyg.2014.01263.

Chen L. (2014) Statistical Learning in a Multisensory World. Austin Biom and Biostat.1(1): 3.

Chen, L. & Zhou, X.* (2014). Fast transfer of cross-modal time interval training. Experimental Brain Research, 232, 1855-1864.

Chen L *, Wang Q., & Bao M* (2014). Spatial references and audio-tactile interaction in cross-modal dynamic capture. Multisensory Research, 27, 55-70.

Wang Q., Bao M*. & Chen L.* (2014). The role of spatio-temporal and spectral cues in segregating short sound events: evidence from auditory Ternus display. Experimental Brain Research, 232,273-282.

Chen, L. & Vroomen, J.(2013). Intersensory Binding across Space and Time: A Tutorial Review. Attention,Perception, & Psychophysics ,75,790-811

Chen, L. (2013). Tactile Flash Lag Effect: Taps with Changing Intensities Lead Briefly Flashed Taps. IEEE World Haptics Conference 2013, The 5th Joint Eurohaptics Conference and IEEE Haptics Symposium, pp 253-258.

Jiang, Y.,& Chen, L.* (2013). Mutual influences of intermodal visual/tactile apparent motion and auditory motion with uncrossed and crossed arms. Multisensory Research,26,19-51.

Chen, L.(2013). Synaesthetic correspondence between auditory clips and colors: an empirical study.In J. Yang, F. Fang, and C. Sun (Eds): IScIDE 2012, Lecture Notes in Computer Science (LNCS) 7751, pp. 507-513,Springer-Verlag Berlin Heidelberg.

Zhang, H., Chen, L.* & Zhou, X. (2012). Adaptation to visual or auditory time intervals modulates the perception of visual apparent motion. Frontiers in Integrative Neuroscience. 6:100. doi: 10.3389/fnint.2012.00100

Chen, L., & Zhou, X. (2011). Capture of intermodal visual/tactile apparent motion by moving and static sounds. Seeing and Perceiving, 24, 369-389. DOI: 10.1163/187847511X584434

Chen, L., & Zhou, X. (2011). Visual apparent motion can be modulated by task-irrelevant lexical information. Attention, Perception, & Psychophysics, 73, 1010-1015. DOI: 10.3758/s13414-010-0083-5

Chen L, Shi Z, & Müller H. J. (2011) Interaction of Perceptual Grouping and Crossmodal Temporal Capture in Tactile Apparent-Motion. PLoS ONE 6(2): e17130. doi:10.1371/journal.pone.0017130

Shi, Z., Zou, H., Rank, M., Chen, L., Hirche, S., Müller, H. J. (2010). Effects of packet loss and latency on the temporal discrimination of visual-haptic events. IEEE Transactions on Haptics,3 (1),28-36.

Shi, Z., Chen, L., & Müller , H. J. (2010). Auditory temporal modulation of the visual Ternus effect: the influence of time interval. Experimental Brain Research, 203 (4), 723-735.

Chen, L., Shi, Z., & Müller , H. J. (2010). Influences of intra- and crossmodal grouping on visual and tactile Ternus apparent motion. Brain Research,1354,152-162.

Chinese Publications

Chen, L. (2023). The Brain Science of Learning Ability. China Renmin University Press. (In Chinese)

Gong, X., Zhang, J., & Chen, L. (2021). Mental models of older adults and their application in interaction design. Packaging Engineering, 42(24), 84-92. (In Chinese)

Chen, L. (2017). Research Methods in Psychology: Based on MATLAB and Psychtoolbox. Peking University Press. (In Chinese)

Ma, J., Wei, K., & Chen, L.* (2015). The intentional binding paradigm in research on the sense of agency: A review. Psychological Science, 38(2), 506-510. (In Chinese)

Chen, L. (2008). The meaning and applications of the lateralized readiness potential. Advances in Psychological Science, 16(5), 712-720. (In Chinese)

Chen, L., & Zhang, X. (2008). Occupational values of college students with different job-seeking purposes and experiences. Chinese Journal of Ergonomics, 14(4), 42-45. (In Chinese)

Teng, T., He, J., Chen, L., Sun, K., Lin, Y., & Huang, G. (2007). Autonomic responses to and pleasantness ratings of negative emotional pictures in compulsorily detoxified drug users. Chinese Journal of Drug Dependence, 16(1), 57-62. (In Chinese)

Liu, Y., Zhang, Z., Xu, Q., & Chen, L. (2007). Effects of precues on different stages of motor preparation. Chinese Journal of Applied Psychology, 13(2), 131-137. (In Chinese)