Publications

2021

Journal articles

  1. Ogasawara T., Fukamachi H., Aoyagi K., Kumano S., Togo H., & Oka K. (2021). Archery Skill Assessment Using an Acceleration Sensor. IEEE Transactions on Human-Machine Systems, 1-8.
  2. Kawabe T. (2021). Perceptual Properties of the Poisson Effect. Frontiers in Psychology, 11, 612368.
  3. Zhou Y., Nakamura Y., Mugitani R., & Watanabe J. (2021). Influence of prior auditory and visual information on speech perception: Evidence from Japanese singleton and geminate words. Acoustical Science and Technology, 42 (1), 36-45.
  4. Liao H.-I., Kashino M., & Shimojo S. (2021). Attractiveness in the eyes: A possibility of positive loop between transient pupil constriction and facial attraction. Journal of Cognitive Neuroscience, 33 (2), 315-340.
  5. Kuroki S., & Nishida S. (2021). Motion direction discrimination with tactile random-dot kinematograms. i-Perception, 12, 1-20.
  6. Wong-Villacres M., Garcia A. A., Badillo-Urquiola K., Machuca M. D. B., Felice M. C., Gaytán-Lugo L. A., Lemus O. A., Reynolds-Cuéllar O., & Perusquía-Hernández M. (2021). Lessons from Latin America: embracing horizontality to reconstruct HCI as a pluriverse. Interactions, 28 (2), 56-63.
  7. Perusquia-Hernandez M. (2021). Are people happy when they smile?: Affective assessments based on automatic smile genuineness identification. Emotion Studies, 6 (1), 57-71.
  8. Sakamoto M., Watanabe J., & Yamagata K. (2021). Automatic Estimation of Multidimensional Personality From a Single Sound-Symbolic Word. Frontiers in Psychology, 12, 1339.
  9. Kawabe T., Ujitoko Y., Yokosaka T., & Kuroki S. (2021). Sense of Resistance for a Cursor Moved by User's Keystroke. Frontiers in Psychology, 12, 652781.
  10. Takamuku S., Ohta H., Kanai C., Hamilton A. F., & Gomi H. (2021). Seeing motion of controlled object improves grip timing in adults with autism spectrum condition: evidence for use of inverse dynamics in motor control. Experimental Brain Research, 239 (4), 1047-1059.
  11. Koumura T., Nakatani M., Liao H.-I., & Kondo H. M. (2021). Dark, loud, and compact sounds induce frisson. Quarterly Journal of Experimental Psychology, 74 (6), 1140-1152.
  12. Ooishi Y., Fujino M., Inoue V., Nomura M., & Kitagawa N. (2021). Differential effects of focused attention and open monitoring meditation on autonomic cardiac modulation and cortisol secretion. Frontiers in Physiology.
  13. Murata A., Nomura K., Watanabe J., & Kumano S. (2021). Interpersonal physiological synchrony is associated with first person and third person subjective assessments of excitement during cooperative joint tasks. Scientific Reports, 11 (12543).
  14. Kumano S., Hamilton A. F., & Bahrami B. (2021). The role of anticipated regret in choosing for others. Scientific Reports, 11 (12557).
  15. Ota Y., Ujitoko Y., Sakurai S., Nojima T., & Hirota K. (2021). Inside Touch: Presentation of Tactile Feeling Inside Virtual Object Using Finger-Mounted Pin-Array Display. IEEE Access, 9, 75150-75157.
  16. Kuroki S., Sawayama M., & Nishida S. (2021). The roles of lower- and higher-order surface statistics in tactile texture perception. Journal of Neurophysiology, 126, 95-111.
  17. Otsuka S., & Furukawa S. (2021). Conversion of amplitude modulation to phase modulation in the human cochlea. Hearing Research, 408, 108274.
  18. Uezu Y., Hiroya S., & Mochida T. (2021). Articulatory compensation for low-pass filtered formant-altered auditory feedback. The Journal of the Acoustical Society of America, 150, 64.
  19. Kuroki S. (2021). Anisotropic distortion in the perceived orientation of stimuli on the arm. Scientific Reports, 11, 14602.
  20. Abekawa N., Gomi H., & Diedrichsen J. (2021). Gaze control during reaching is flexibly modulated to optimize task outcome. J Neurophysiol, 126 (3), 816-826.
  21. Yokosaka T., Kuroki S., & Nishida S. (2021). Describing the sensation of the ‘velvet hand illusion’ in terms of common materials. IEEE Transactions on Haptics, 14 (3), 680-685.
  22. Yamashita J., Terashima H., Yoneya M., Maruya K., Koya H., Oishi H., Nakamura H., & Kumada T. (2021). Pupillary fluctuation amplitude before target presentation reflects short-term vigilance level in Psychomotor Vigilance Tasks. PLoS One, 16 (9), e0256953.

Books/Chapters

  1. Yasu K., & Ishikawa M. (2021). Magnetact Animals: A Simple Kinetic Toy Kit for a Creative Online Workshop for Children. Extended Abstracts of the 2021 CHI Conference on Human Factors in Computing Systems. (p.4). Association for Computing Machinery.

Conferences

  1. Koumura T., Terashima H., & Furukawa S. (2021). Temporal modulation transfer function based on time-averaged responses of units in a neural network model. Association for Research in Otolaryngology (ARO) Midwinter Meeting 2021. Online.
  2. Yamagishi S., & Furukawa S. (2021). To What Extent Do Visual- and Auditory-Targeting Saccades Share Common Mechanisms?. Association for Research in Otolaryngology (ARO) Midwinter Meeting 2021. Online.
  3. Kuroki S. (2021). Motion direction discrimination using tactile random-dot kinematograms with distributed lateral skin stretch. World Haptics Conference 2021.
  4. Yokosaka T., Suzuishi Y., & Kuroki S. (2021). Feel illusory texture through a hole: Rotating stimulus modulates tactile sensation for touched object's surface. World Haptics Conference 2021.
  5. Ban Y., & Ujitoko Y. (2021). Hit-Stop in VR: Combination of Pseudo-haptics and Vibration Enhances Impact Sensation. World Haptics Conference 2021.
  6. Kawabe T., Ujitoko Y., & Yokosaka T. (2021). Pseudo-heaviness during mid-air gestures is tuned to visual speed. World Haptics Conference 2021.
  7. Nagano M., Ijima Y., & Hiroya S. (2021). Impact of Emotional State on Estimation of Willingness to Buy from Advertising Speech. Proc. Interspeech2021.

2020

Journal articles

  1. Murata A., Nishida H., Watanabe K., & Kameda T. (2020). Convergence of physiological responses to pain during face-to-face interaction. Scientific Reports, 10 (1), 450.
  2. Yamagishi S., Yoneya M., & Furukawa S. (2020). Relationship of postsaccadic oscillation with the state of the pupil inside the iris and with cognitive processing. Journal of Neurophysiology, 123 (2), 484-495.
  3. Koumura T., Terashima H., & Furukawa S. (2020). Chimeric sounds with shuffled “texture” and “content” synthesized by a model of the auditory system. Acoustical Science and Technology, 41 (1), 337-340.
  4. Terashima H., & Furukawa S. (2020). Examination of efficient coding model for auditory nerves during infant development. Acoustical Science and Technology, 41 (1), 351-354.
  5. Furukawa S., Terashima H., Koumura T., & Tsukano H. (2020). Data-driven approaches for unveiling the neurophysiological functions of the auditory system. Acoustical Science and Technology, 41 (1), 63-66.
  6. Yokosaka T., Inubushi M., Kuroki S., & Watanabe J. (2020). Frequency of switching touching mode reflects tactile preference judgment. Scientific Reports, 10 (1), 3022.
  7. Honda S., Ishikawa Y., Konno R., Imai E., Nomiyama N., Sakurada K., Koumura T., Kondo H. M., Furukawa S., Fujii S., & Nakatani M. (2020). Proximal Binaural Sound Can Induce Subjective Frisson. Frontiers in Psychology, 11, 316.
  8. Ito S., & Gomi H. (2020). Visually-updated hand state estimates modulate the proprioceptive reflex independently of motor task requirements. eLife.
  9. Marmolejo-Ramos F., Murata A., Sasaki K., Yamada Y., Ikeda A., Hinojosa J. A., Watanabe K., Parzuchowski M., Tirado C., & Ospina R. (2020). Your Face and Moves Seem Happier When I Smile. Experimental Psychology, 67, 14-22.
  10. Takagi A., Maxwell S., Melendez-Calderon A., & Burdet E. (2020). The dominant limb preferentially stabilizes posture in a bimanual task with physical coupling. J. Neurophysiology, 123, 2154-2160.
  11. Kawabe T. (2020). Mid-Air Action Contributes to Pseudo-Haptic Stiffness Effects. IEEE Transactions on Haptics, 13, 18-24.
  12. Kawabe T., & Sawayama M. (2020). A Computational Mechanism for Seeing Dynamic Deformation. eNeuro, 7, 1-14.
  13. Yang, Y.-H., & Wolfe J. M. (2020). Is apparent instability a guiding feature in visual search?. Visual Cognition, 28.
  14. De Havas J., Ito S., & Gomi H. (2020). On stopping voluntary muscle relaxations and contractions: evidence for shared control mechanisms and muscle state specific active breaking. Journal of Neuroscience, 40, 6035-6048.
  15. Kuroki S. (2020). Visual motion information modulates tactile roughness perception. Scientific Reports, 10, 13929.
  16. Chen L., & Liao H.-I. (2020). Microsaccadic eye movements but not pupillary dilation response characterizes the crossmodal freezing effect. Cerebral Cortex Communications, 1.
  17. Terashima H., Kihara K., Kawahara J., & Kondo H. M. (2020). Common principles underlie the fluctuation of auditory and visual sustained attention. Quarterly Journal of Experimental Psychology, 1747021820972255.
  18. Yamagishi S., & Furukawa S. (2020). Factors influencing saccadic reaction time: Effects of task modality, stimulus saliency, spatial congruency of stimuli, and pupil size. Frontiers in Human Neuroscience, 14, 571893.
  19. Uetsuki M., Watanabe J., & Maruya K. (2020). “Textual Prosody” Can Change Impressions of Reading in People With Normal Hearing and Hearing Loss. Frontiers in Psychology, 11, 548619.
  20. Takagi A., De Magistris G., Xiong G., Micaelli A., Kambara H., Koike Y., Savin J., Marsot J., & Burdet E. (2020). Analogous adaptations in speed, impulse and endpoint stiffness when learning a real and virtual insertion task with haptic feedback. Scientific Reports, 10 (1).
  21. Takagi A., Furuta R., Saetia S., Yoshimura N., Koike Y., & Minati L. (2020). Behavioral and physiological correlates of kinetically tracking a chaotic target. PLOS ONE, 15 (9), e0239471.
  22. Takagi A., Li Y., & Burdet E. (2020). Flexible assimilation of human's target for versatile human-robot physical interaction. IEEE Transactions on Haptics, 1-1.
  23. Arslanova I., Wang K., Gomi H., & Haggard P. (2020). Somatosensory evoked potentials that index lateral inhibition are modulated according to the mode of perceptual processing: comparing or combining multi-digit tactile motion. Cognitive Neuroscience, 1-13.
  24. Ooishi Y., Hiraoka D., Mugitani R., & Nomura M. (2020). Relationship between oxytocin and maternal approach behaviors to infants' vocalizations. Comprehensive Psychoneuroendocrinology, 4.

Conferences

  1. Yang Y.-H., Liao H.-I., & Furukawa S. (2020). Sensitivity of eye-metrical responses to sound salience: Contributions of detectability, signal-to-noise ratio, and spectral consistency of acoustic context. 42nd Association for Research in Otolaryngology (ARO) Midwinter Meeting 2020. San Jose, USA.
  2. Koumura T., Terashima H., & Furukawa S. (2020). Modulation transfer functions measured with broad- and narrow-band noise carriers in a deep neural network trained for natural sound recognition. 42nd Association for Research in Otolaryngology (ARO) Midwinter Meeting 2020. San Jose, USA.
  3. Yamagishi S., & Furukawa S. (2020). Simultaneous measures of auditory brainstem frequency following response, pupillary response, and microsaccade during auditory selective attention task. 42nd Association for Research in Otolaryngology (ARO) Midwinter Meeting 2020. San Jose, USA.
  4. Ebina T., Otsuka S., Furukawa S., Okamoto Y., Nakagawa S., Morimoto T., Fujisaka Y., Nonaka T., & Kanzaki S. (2020). Phase characteristics of otoacoustic emissions evoked by amplitude-modulated low-frequency tone. 42nd Association for Research in Otolaryngology (ARO) Midwinter Meeting 2020. San Jose, USA.
  5. Tsuzaki M., Matsuura Y., Otsuka S., Furukawa S., & Yamamoto E. (2020). Medial olivocochlear reflexes of musicians with various specialties. 42nd Association for Research in Otolaryngology (ARO) Midwinter Meeting 2020. San Jose, USA.
  6. Otsuka S., Nakagawa S., & Furukawa S. (2020). Temporal expectation modulates medial olivocochlear bundle reflex. 42nd Association for Research in Otolaryngology (ARO) Midwinter Meeting 2020. San Jose, USA.
  7. Furukawa S., & Maki K. (2020). Stimulus-specific information on sound azimuth conveyed by gerbil collicular neurons. 42nd Association for Research in Otolaryngology (ARO) Midwinter Meeting 2020. San Jose, USA.
  8. Terashima H., Kihara K., Kawahara J. I., & Kondo H. M. (2020). Auditory Sustained Attention Fluctuates Similarly to Visual Sustained Attention. The Abstracts of the Association for Research in Otolaryngology. San Jose, USA.
  9. Kawano H. (2020). Parallel Permutation for Linear Full-resolution Reconfiguration of Heterogeneous Sliding-only Cubic Modular Robots. 2020 IEEE International Conference on Robotics and Automation (ICRA). Paris, France.
  10. Abdallah E. A., Perusquia-Hernandez M., Denman P., Abdelrahman Y., Hassib M., Meschtscherjakov A., Ferreira D., & Henze N. (2020). MEEC: First Workshop on Momentary Emotion Elicitation and Capture. Extended Abstracts of the 2020 CHI Conference on Human Factors in Computing Systems. Honolulu, HI, USA.
  11. Garcia A. A., Badillo-Urquiola K., Machuca M. D. B., Cibrian F. L., Felice M. C., Gaytan-Lugo L. S., Gomez-Zara D., Griggio C. F., Perusquia-Hernandez M., Silva-Prietch S., Tejada C. E., & Wong-Villacres M. (2020). Fostering HCI Research in, by, and for Latin America. Extended Abstracts of the 2020 CHI Conference on Human Factors in Computing Systems. Honolulu, HI, USA.
  12. Yasu K. (2020). MagneLayer: Force Field Fabrication by Layered Magnetic Sheets. Proceedings of the 2020 CHI Conference on Human Factors in Computing Systems. Honolulu, HI, USA.
  13. Yang Y.-H., Liao H.-I., & Furukawa S. (2020). The impact of conscious states on the pupillary responses as revealed by face inversion effect. The 43rd Annual Meeting of the Japan Neuroscience Society. Online. Hyogo, Japan.
  14. Yamagishi S., & Furukawa S. (2020). Relationship between auditory brainstem response and microsaccade during auditory selective attention task. The 43rd Annual Meeting of the Japan Neuroscience Society. Online. Hyogo, Japan.
  15. Murata A., Kumano S., & Watanabe J. (2020). Interpersonal physiological linkage is related to excitement during a joint task. The 42nd Annual Virtual Meeting of Cognitive Science Society. Toronto, Canada.
  16. Otsuka S., Nakagawa S., & Furukawa S. (2020). Temporal expectation modulates cochlear efferent feedback. The 43rd Annual Meeting of the Japan Neuroscience Society. Online. Kobe, Japan.
  17. Gomi H., Abekawa N., & Ueda H. (2020). Functional roles of visual motion for hand reaching movement - New lines of evidence dissociate posture related and target related responses. The 43rd Annual Meeting of the Japan Neuroscience Society. Online. Kobe, Japan.
  18. Nakamura D., & Gomi H. (2020). Spatiotemporal processing of visual motion for generating quick ocular and manual responses examined by convolutional neural network. The 43rd Annual Meeting of the Japan Neuroscience Society. Online. Kobe, Japan.
  19. Takemura A., Nakamura D., Abekawa N., & Gomi H. (2020). Effects of cerebral/cerebellum lesions on short-latency manual responses in monkeys. The 43rd Annual Meeting of the Japan Neuroscience Society. Online. Kobe, Japan.
  20. Yasu K. (2020). MagneLayer: Force Field Fabrication for Rapid Prototyping of Haptic Interactions. ACM SIGGRAPH 2020 Labs. New York, NY, USA.
  21. Kuroki S. (2020). Arm's blind line: Anisotropic distortion in perceived orientation of stimuli on the arm. EuroHaptics 2020: Haptics: Science, Technology, Applications. Leiden, Netherlands.
  22. Yokosaka T., Suzuishi Y., & Kuroki S. (2020). Feel illusory texture through a hole: Rotating stimulus modulates tactile sensation for touched object's surface. EuroHaptics 2020: Haptics: Science, Technology, Applications. Leiden, Netherlands.
  23. Perusquia-Hernandez M., Cuberos Balda M., Gomez Jauregui D. A., Paez-Granados D., Dollack F., & Salazar J. V. (2020). Robot Mirroring: Promoting Empathy with an Artificial Agent by Reflecting the User's Physiological Affective States. The 29th IEEE International Conference on Robot and Human Interactive Communication (RO-MAN). Online.
  24. Zushi N., Perusquia-Hernandez M., & Ayabe-Kanamura S. (2020). The Effect of Different Affective Arousal Levels on Taste Perception. Companion Publication of the 2020 International Conference on Multimodal Interaction. New York, NY, USA.

2019

Journal articles

  1. Fukiage T., Kawabe T., & Nishida S. (2019). Perceptually Based Adaptive Motion Retargeting to Animate Real Objects by Light Projection. IEEE Transaction on Visualization and Computer Graphics, 25, 2061-2071.
  2. Sakamoto M., & Watanabe J. (2019). Visualizing Individual Perceptual Differences Using Intuitive Word-Based Input. Frontiers in Psychology, 10, 1108.
  3. Koumura T., Terashima H., & Furukawa S. (2019). Cascaded Tuning to Amplitude Modulation for Natural Sound Recognition. Journal of Neuroscience.
  4. Hiraoka D., Ooishi Y., Mugitani R., & Nomura M. (2019). Differential Effects of Infant Vocalizations on Approach-Avoidance Postural Movements in Mothers. Frontiers in Psychology, 10, 1378.
  5. Kawabe T. (2019). Perceptual transparency from cast shadow. i-Perception, 10, 1-14.
  6. Kawabe T. (2019). Temperature as an exteroceptive sense: Challenges remain in thermal modeling of skin-object interactions. i-Perception, 6, 104-105.
  7. Hosokawa K., Maruya K., Nishida S., Takahashi M., & Nakadomari S. (2019). Gamified vision test system for daily self-check. IEEE-GEM 2019.
  8. Kawabe T. (2019). Shadow-based illusion of depth and transparency in printed materials. ACM Transactions on Applied Perception, 16, 10: 1-10: 12.
  9. Kawano H. (2019). Distributed Linear Heterogeneous Reconfiguration of Cubic Modular Robots via Simultaneous Tunneling and Permutation. IEEE Transactions on Robotics (T-RO), in press.
  10. Kawada A., Nagasawa M., Murata A., Mogi K., Watanabe K., Kikusui T., & Kameda T. (2019). Vasopressin enhances human preemptive strike in both males and females. Scientific Reports, 9(1).
  11. Dollack F., Perusquia-Hernandez M., Kadone H., & Suzuki K. (2019). Head anticipation during locomotion with auditory instruction in the presence and absence of visual input. Frontiers in Human Neuroscience, 13, 293.
  12. Kanayama N., Hara M., Watanabe J., Kitada R., Sakamoto M., & Yamawaki S. (2019). Controlled emotional tactile stimulation during functional magnetic resonance imaging and electroencephalography. Journal of Neuroscience Methods, 327, 108393.
  13. Zhao S., Chait M., Dick F., Dayan P., Furukawa S., & Liao H.-I. (2019). Pupil-linked phasic arousal evoked by violation but not emergence of regularity within rapid sound sequences. Nature Communications, 10 (1).
  14. Otsuka S., Nakagawa S., & Furukawa S. (2019). Relationship between cochlear mechanics and speech-in-noise reception performance. The Journal of the Acoustical Society of America, 146 (3), EL265-EL271.
  15. Zhao S., Yum N. W., Benjamin L., Benhamou E., Yoneya M., Furukawa S., Dick F., Slaney M., & Chait M. (2019). Rapid ocular responses are modulated by bottom-up driven auditory salience. Journal of Neuroscience, 39 (39), 7703-7714.
  16. Takamuku S., & Gomi H. (2019). Better grip force control by attending to the controlled object: Evidence for direct force estimation from visual motion. Scientific Reports, 9 (1), 13114.
  17. Kawabe T. (2019). Visual assessment of causality in the Poisson effect. Scientific Reports, 9, 1-10.
  18. Perusquía-Hernández M., Ayabe-Kanamura S., & Suzuki K. (2019). Human perception and biosignal-based identification of posed and spontaneous smiles. PLOS ONE, 14 (12), 1-26.

Conferences

  1. Koumura T., Terashima H., & Furukawa S. (2019). Emergence of ITD Selectivity in a Deep Neural Network Trained for Binaural Natural Sound Detection. 42nd Annual MidWinter Meeting, Association for Research in Otolaryngology.
  2. Terashima H., Tsukano H., & Furukawa S. (2019). Mapping Areal Organization of Mouse Auditory Cortex by Data-driven Decomposition of Responses to Naturalistic Sounds. 42nd Annual MidWinter Meeting, Association for Research in Otolaryngology (ARO). Baltimore, USA.
  3. Liao H.-I., Fujihira H., & Furukawa S. (2019). Pupillary Light Response Reveals Covert Attention to Auditory Space and Object. 42nd Annual MidWinter Meeting, Association for Research in Otolaryngology.
  4. Liu X., Sawayama M., Hayashi R., Ozay M., Okatani T., & Nishida S. (2019). Perturbation tolerance of deep neural networks and humans in material recognition. CiNet 5th conference. Osaka, Japan.
  5. Murata A., Abe O. M. & Watanabe K. (2019). Implicit effect of dyadic coordination during joint action. International Convention of Psychological Science (ICPS2019). Paris, France.
  6. Kawabe T. (2019). Visual illusion meets augmented reality techniques. IEEE VR 2019 Tutorials T1:(AR) Hack Our Material Perception in Spatial Augmented Reality. Osaka, Japan.
  7. Sawayama M., Barla P., & Nishida S. (2019). Color information processing in intrinsic image decomposition. Dynamic in Vision and Touch 2019 Workshop. Giessen, Germany.
  8. Nishida S. (2019). Hacking Human Visual Perception. IEEE VR 2019 Keynote Talks. Osaka, Japan.
  9. Fukiage T., Kawabe T., & Nishida S. (2019). Demonstration of Perceptually Based Adaptive Motion Retargeting to Animate Real Objects by Light Projection. IEEE VR 2019 Research Demos. Osaka, Japan.
  10. Maruya K., & Ohtani T. (2019). Shadows can change the shape appearances of real and virtual objects. IEEE VR 2019 Poster. Osaka, Japan.
  11. Perusquia-Hernandez M., Ayabe-Kanamura S., Suzuki K., & Kumano S. (2019). The Invisible Potential of Facial Electromyography: A Comparison of EMG and Computer Vision when Distinguishing Posed from Spontaneous Smiles. Proceedings of the 2019 CHI Conference on Human Factors in Computing Systems. Glasgow, UK.
  12. Yang Y.-H., Liao H.-I., Yamagishi S., & Furukawa S. (2019). Pupillometry and microsaccade response reveal unconscious processing of face information under interocular suppression. 19th Annual Meeting of Vision Science Society (VSS). Florida, US.
  13. Liao H.-I., Fujihira H., Yamagishi S., & Furukawa S. (2019). Microsaccades and pupillary responses represent the focus of auditory attention. 19th Annual Meeting of Vision Science Society (VSS). Florida, US.
  14. Yasu K. (2019). Magnetact: Magnetic-sheet-based Haptic Interfaces for Touch Devices. Proceedings of the 2019 CHI Conference on Human Factors in Computing Systems. New York, NY, USA.
  15. Sawayama M., Fukiage T., & Nishida S. (2019). Slant-dependent image modulation for perceiving translucent objects. 19th Annual Meeting of Vision Sciences Society (VSS). Florida, US.
  16. Kuroki S., Sawayama M., & Nishida S. (2019). Haptic discrimination of 3D-printed patterns based on natural visual textures. 19th Annual Meeting of Vision Sciences Society (VSS). Florida, US.
  17. Kawabe T. (2019). The judgment of causality for deformations of stretchy materials. 19th Annual Meeting of Vision Sciences Society (VSS). Florida, US.
  18. Kawabe T., Sawayama M., & Hoshika T. (2019). Moiré effects on real object's appearances. Demo night at the 19th Annual Meeting of Vision Sciences Society (VSS). Florida, US.
  19. Maruya K., Fujita Y., & Ohtani T. (2019). Café-Wall illusion caused by shadows on a surface of three dimensional object. 19th Annual Meeting of Vision Sciences Society (VSS). Florida, US.
  20. Hosokawa K., Maruya K., Nishida S., & Nakadomari S. (2019). Test battery for daily self-assessment of visual abilities. 19th Annual Meeting of Vision Sciences Society (VSS). Florida, US.
  21. Kawano H. (2019). Linear Heterogeneous Reconfiguration of Cubic Modular Robots via Simultaneous Tunneling and Permutation. 2019 International Conference on Robotics and Automation (ICRA 2019). Montreal, Canada.
  22. Hosokawa K., Maruya K., Nishida S., Takahashi M., & Nakadomari S. (2019). Gamification of vision test improves usability for internet experiments. Asian-Pacific Conference on Vision 2019. Osaka, Japan.
  23. Kawabe T. (2019). The perception of motion direction for deformation flow. Asian-Pacific Conference on Vision 2019. Osaka, Japan.
  24. Yokosaka T., Kuroki S., & Nishida S. (2019). Describing the Sensation of 'Velvet Hand Illusion' in Terms of Common Materials. IEEE World Haptics Symposium 2019. Tokyo, Japan.
  25. Kawabe T., & Sawayama M. (2019). A computational analysis of Moire-induced illusory deformation. Visual Science of Art Conference 2019. Leuven, Belgium.
  26. Yang Y.-H., Huang T.-R., & Yeh S.-L. (2019). Can Semantic Information be Temporally Integrated Under Interocular Suppression? An fMRI study. The 15th Asia-Pacific Conference on Vision (APCV). Osaka, Japan.
  27. Chien S.-E., Yang Y.-H., Teramoto S., Ono Y., & Yeh S.-L. (2019). Neural correlates of semantic priming under visual crowding: An MEG study. The 15th Asia-Pacific Conference on Vision (APCV). Osaka, Japan.
  28. Liao H.-I. (2019). Unified audio-visual spatial attention revealed by pupillary light response. The 15th Asia-Pacific Conference on Vision (APCV). Osaka, Japan.
  29. Liao H.-I. & Chen L. (2019). Sound freezes transient visual presentations as revealed by microsaccade inhibition. The 42nd European Conference on Visual Perception (ECVP). Leuven, Belgium.
  30. Ho H.-N., Terashima H., Wakamatsu K., Kwon J., Sakamoto M., Nakauchi S., & Nishida S. (2019). Visual inference for warm/cold perception of surfaces. The 42nd edition of the European Conference on Visual Perception (ECVP2019). Leuven, Belgium.
  31. Koumura T., Terashima H., & Furukawa S. (2019). “Psychophysical” modulation transfer functions in a deep neural network trained for natural sound recognition. The 7th International Symposium on Auditory and Audiological Research (ISAAR2019). Nyborg, Denmark.
  32. Zhou Y., Nakamura K., Murata A., Watanabe K., & Watanabe J. (2019). An investigation of the influence of false heartbeat feedback. IEEE World Haptics Symposium 2019. Tokyo, Japan.
  33. Nomura K., Murata A., Yotsumoto Y., & Kumano S. (2019). Bayesian Item Response Model with Condition-specific Parameters for Evaluating the Differential Effects of Perspective-taking on Emotional Sharing. The 41st Annual Meeting of the Cognitive Science Society (CogSci 2019). Montréal, Canada.
  34. Nomura K., Kumano S., & Yotsumoto Y. (2019). Multitask Item Response Model Revealed Bias in Estimated Emotional Features due to Response Style within the Open Affective Standardized Image Set (OASIS). The 52nd Annual Meeting of the Society for Mathematical Psychology (MathPsych 2019). Montréal, Canada.
  35. Yamagishi S., Yoneya M., & Furukawa S. (2019). The effects of the saliency and the spatial congruency of the stimuli on saccadic eye movement elicited by visual and auditory stimuli. The 42nd Annual Meeting of the Japan Neuroscience Society. Niigata, Japan.
  36. Ooishi Y., Hiraoka D., Nomura M., & Mugitani R. (2019). Relationship between maternal approach-avoidance behavior and emotional contents of infant vocalization. The 42nd Annual Meeting of the Japan Neuroscience Society. Niigata, Japan.
  37. Koumura T., Terashima H., & Furukawa S. (2019). Tuning of Single-Unit Responses to Interaural Time Difference in a Deep Neural Network. The 42nd Annual Meeting of the Japan Neuroscience Society. Niigata, Japan.
  38. Koumura T., Terashima H., & Furukawa S. (2019). “Psychophysical” Detectability of Amplitude Modulation in a Deep Neural Network Trained for Natural Sound Recognition. The Proceedings of the 29th Annual Conference of the Japanese Neural Network Society. Tokyo, Japan.
  39. Abekawa N., & Gomi H. (2019). Learning and retrieving motor memories depending on gaze-reach coordination. Neuroscience2019. (Neuroscience2019). Chicago, USA.
  40. Gomi H., & Nakamura D. (2019). Synthetic modeling of human visual motion analysis for generating quick ocular and manual responses. Neuroscience2019. (Neuroscience2019). Chicago, USA.
  41. Ito S., & Gomi H. (2019). Online modulation of proprioceptive reflex gain depending on uncertainty in multisensory state estimation. Neuroscience2019. (Neuroscience2019). Chicago, USA.
  42. Ueda H., Abekawa N., Ito S., & Gomi H. (2019). Distinct temporal frequency-dependent modulations of direct and indirect visual motion effects on reaching adjustments. Neuroscience2019. (Neuroscience2019). Chicago, USA.
  43. Uezu Y., Hiroya S., & Mochida T. (2019). Naturalness of transformed auditory feedback sounds changes the patterns of compensatory articulatory responses and self-agency ratings in speech production. Neuroscience2019. (Neuroscience2019). Chicago, USA.
  44. Kawabe T. (2019). Implementation of visual illusions in real-world situations. The 1st meeting of Applying Neuroscience to Business. Yokohama, Japan.
  45. Kuroki S. (2019). Arm's blind line: Anisotropic distortion in perceived orientation of stimuli on the arm. Neuroscience 2019 (SfN). Chicago, USA.
  46. Terashima H., & Furukawa S. (2019). Efficient codes of the auditory nerves reconsidered with natural reverberations. Advances and Perspectives in Auditory Neuroscience (APAN). Chicago, USA.
  47. Terashima H., Tsukano H., & Furukawa S. (2019). Data-driven auditory field mapping for mice using naturalistic sounds. Neuroscience 2019 (SfN). Chicago, USA.
  48. Palumbo C., Kriening H., Wajda B., & Perusquia-Hernandez M. (2019). Understanding User Customization Needs : Requirements for an Augmented Reality Lamp Customization Tool. Proceedings of the Design and Semantics of Form and Movement Conference, XI Edition. Cambridge, MA, USA.
  49. Perusquía-Hernández M., Ayabe-Kanamura S., & Suzuki K. (2019). Posed and spontaneous smile assessment with wearable skin conductance measured from the neck and head movement. 2019 8th International Conference on Affective Computing and Intelligent Interaction (ACII). Cambridge, U.K.
  50. Nunez E., Hirokawa M., Perusquia-Hernandez M., & Suzuki K. (2019). Effect on Social Connectedness and Stress Levels by Using a Huggable Interface in Remote Communication. 2019 8th International Conference on Affective Computing and Intelligent Interaction (ACII). Cambridge, U.K.

2018

Journal articles

  1. Furukawa, S., Onikura, K., Kidani, S., Kato, M. & Kitagawa, N. (2018). Light-synchronized tapping task as an objective method for estimating auditory detection threshold. Acoustical Science and Technology, 39, 30-36.
  2. Ooishi, Yuuki (2018). Correlation between resting testosterone/cortisol ratio and sound-induced vasoconstriction at fingertip in men. Frontiers in Physiology, 9, 164.
  3. Scinob Kuroki & Shin'ya Nishida (2018). Human tactile detection of within- and inter-finger spatiotemporal phase shifts of low-frequency vibrations. Scientific Reports, 8, 4288.
  4. Sakamoto, Maki & Watanabe, Junji (2018). Bouba/Kiki in Touch: Associations Between Tactile Perceptual Qualities and Japanese Phonemes. Frontiers in Psychology, 8 (295).
  5. Abekawa, Naotoshi, Ferrè, Elisa Raffaella, Gallagher, Maria, Gomi, Hiroaki & Haggard, Patrick (2018). Disentangling the visual, motor and representational effects of vestibular input. Cortex, 104, 46-57.
  6. Takamuku, Shinya, Forbes, Paul A. G., Hamilton, Antonia F. de C. & Gomi, Hiroaki (2018). Typical use of inverse dynamics in perceiving motion in autistic adults: Exploring computational principles of perception and action. Autism Research: Official Journal of the International Society for Autism Research.
  7. Masataka Sawayama & Shin'ya Nishida (2018). Material and shape perception based on two types of intensity gradient information. PLoS Computational Biology, 14 (4), e1006061-40.
  8. Yokosaka, Takumi, Kuroki, Scinob, Watanabe, Junji & Nishida, Shin'ya (2018). Estimating Tactile Perception by Observing Explorative Hand Motion of Others. IEEE Transactions on Haptics, 11 (2), 192-203.
  9. De Havas, Jack, Ito, Sho, Haggard, Patrick & Gomi, Hiroaki (2018). Low Gain Servo Control During the Kohnstamm Phenomenon Reveals Dissociation Between Low-Level Control Mechanisms for Involuntary vs. Voluntary Arm Movements. Frontiers in Behavioral Neuroscience, 12.
  10. Antaket, Layla Chadaporn, Matsuda, Masafumi, Otsuka, Kazuhiro & Kumano, Shiro (2018). Analyzing Generation and Cognition of Emotional Congruence using Empathizing-Systemizing Quotient. International Journal of Affective Engineering, 17 (3), 183-192.
  11. Kawabe Takahiro & Shin'ya Nishida (2018). Deformation-induced transparency resolves color scission. Journal of Vision, 18 (8), 3: 1-12.
  12. Shin'ya Nishida, Kawabe Takahiro, Masataka Sawayama & Taiki Fukiage (2018). Motion Perception: From Detection to Interpretation. Annual Review of Vision Science, 4, in press.
  13. Li Li, Shiro Kumano, Anita Keshmirian, Bahador Bahrami, Jian Li & Nicholas D. Wright (2018). Parsing cultural impacts on regret and risk in Iran, China and the United Kingdom. Scientific Reports, 8, 13862.
  14. Otsuka, Sho, Nakagawa, Seiji & Furukawa, Shigeto (2018). A Preceding Sound Expedites Medial Olivocochlear Reflex. Acta Acustica united with Acustica, 104 (5), 804-808.
  15. Ohga, Shinpei, Tsukano, Hiroaki, Horie, Masao, Terashima, Hiroki, Nishio, Nana, Kubota, Yamato, Takahashi, Kuniyuki, Hishida, Ryuichi, Takebayashi, Hirohide & Shibuki, Katsuei (2018). Direct Relay Pathways from Lemniscal Auditory Thalamus to Secondary Auditory Field in Mice. Cerebral Cortex, bhy234.
  16. Koumura, Takuya & Furukawa, Shigeto (2018). Do Speech Contexts Induce Constancy of Material Perception Based on Impact Sound Under Reverberation?. Acta Acustica united with Acustica, 104 (5), 796-799.
  17. Hsin-I Liao, Yoneya Makoto, Makio Kashino & Shigeto Furukawa (2018). Pupillary dilation response reflects surprising moments in music. Journal of Eye Movement Research, 11 (2), 13.
  18. Kawabe, Takahiro (2018). Linear Motion Coverage as a Determinant of Transparent Liquid Perception. i-Perception, 9 (6), 2041669518813375.
  19. Ueda, Hiroshi, Abekawa, Naotoshi & Gomi, Hiroaki (2018). The faster you decide, the more accurate localization is possible: Position representation of "curveball illusion" in perception and eye movements. PLOS ONE, 13 (8), e0201610.
  20. Mugitani, Ryoko, Kobayashi, Tessei, Hayashi, Akiko & Fais, Laurel (2018). The Use of Pitch Accent in Word-Object Association by Monolingual Japanese Infants. Infancy.

Conferences

  1. Liao, H.-I., Fujihira, H. & Furukawa, S. (2018). The pupillary light response reveals the focus of auditory spatial attention. Association for Research in Otolaryngology (ARO) 41st Annual MidWinter Meeting. San Diego (USA).
  2. Furukawa, S. (2018). Eye metrics as indicators of auditory salience?. Symposium: Understanding auditory salience, 41st ARO Midwinter Meeting. San Diego (USA).
  3. Koumura, Takuya, Terashima, Hiroki & Furukawa, Shigeto (2018). Representation of Amplitude Modulation in a Deep Neural Network Optimized for Sound Classification. 41st Annual MidWinter Meeting, Association for Research in Otolaryngology. San Diego (USA).
  4. Fujihira, Haruna, Yamagishi, Shimpei, Liao, Hsin-I & Furukawa, Shigeto (2018). Sensitivities of Pupillary Dilation Responses and Microsaccade Rates to Alert Sounds. 41st Annual MidWinter Meeting, Association for Research in Otolaryngology. San Diego (USA).
  5. Terashima, Hiroki & Furukawa, Shigeto (2018). Reconsidering the Efficient Coding Model of the Auditory Periphery under Reverberations. The 41st MidWinter Meeting of Association for Research in Otolaryngology (ARO). San Diego (USA).
  6. Yokosaka, Takumi, Kuroki, Scinob, Watanabe, Junji & Nishida, Shin'ya (2018). Linkage between Free Exploratory Movements and Subjective Tactile Ratings. IEEE Haptics Symposium 2018. San Francisco, USA.
  7. Ho, H.-N., Sato, K., Kuroki, S., Watanabe, J., Maeno, T. & Nishida, S. (2018). Physical-Perceptual Correspondence for Dynamic Thermal Stimulation. IEEE Haptics Symposium 2018. San Francisco, USA.
  8. Maruya, Kazushi & Ohtani, Tomoko (2018). The optical illusion blocks: Optical illusion patterns in a three dimensional world. VSS 2018(demo night). Florida, USA.
  9. Hosokawa, Kenchi & Maruya, Kazushi (2018). Quick estimation of contrast sensitivity function using a tablet device. VSS 2018(demo night). Florida, USA.
  10. Yokosaka, Takumi, Kuroki, Scinob, Watanabe, Junji & Nishida, Shin'ya (2018). Explorative hand motion reflects process of value-based binary decision-making for tactile stimuli. Euro Haptics 2018. Pisa, Italy.
  11. Kuroki, Scinob, Sawayama, Masataka & Nishida, Shin'ya (2018). Haptic texture perception on 3D-printed surfaces transcribed from visual natural textures. Euro Haptics 2018. Pisa, Italy.
  12. Kuroki, Scinob & Nishida, Shin'ya (2018). Direction judgements with random-dot motion in touch. Euro Haptics 2018. Pisa, Italy.
  13. Otsuka, Sho, Nakagawa, Seiji & Furukawa, Shigeto (2018). A preceding sound expedites medial olivocochlear reflex. International Symposium on Hearing 2018. Snekkersten, Denmark.
  14. Koumura, Takuya & Furukawa, Shigeto (2018). Do speech contexts induce constancy of material perception based on impact sound under reverberation? International Symposium on Hearing 2018. Snekkersten, Denmark.
  15. Koumura, Takuya, Terashima, Hiroki & Furukawa, Shigeto (2018). Emergence of auditory-system-like representation of amplitude modulation in a deep neural network trained for sound classification. 27th Annual Computational Neuroscience Meeting (CNS*2018). Seattle (USA).
  16. Terashima, Hiroki & Furukawa, Shigeto (2018). Efficient coding of natural sounds at the auditory periphery with consideration of environmental modulations: a computational study. The 11th FENS Forum of Neuroscience (FENS 2018). Berlin (Germany).
  17. Terashima, Hiroki & Furukawa, Shigeto (2018). Revisiting efficient coding of natural sounds in the environment: unsupervised learning or task-based optimization?. 27th Annual Computational Neuroscience Meeting (CNS*2018). Seattle (USA).
  18. Koumura, Takuya, Terashima, Hiroki & Furukawa, Shigeto (2018). Single unit recording in a deep neural network reveals representation of amplitude modulation similar to the auditory nervous system. The 11th FENS Forum of Neuroscience (FENS 2018). Berlin (Germany).
  19. Hisanaga,Satoko, Mugitani, Ryoko & Sekiyama, Kaoru (2018). Selective attention to the mouth of a talking face in Japanese learning infants and toddlers. The 40th International Conference on Infant Studies (ICIS 2018). Philadelphia (USA).
  20. Miyazaki, Michiko, Mugitani, Ryoko & Asai, Tomohisa (2018). “Touching!!”: An AR system for unveiling face topography in very young children. The 40th International Conference on Infant Studies (ICIS 2018). Philadelphia (USA).
  21. Tsukano, Hiroaki, Ohga, Shinpei, Horie, Masao, Terashima, Hiroki, Nishio, Nana, Kubota, Yamato, Takahashi, Kuniyuki, Hishida, Ryuichi, Takebayashi, Hirohide & Shibuki, Katsuei (2018). Thalamocortical structures that differentiate complexity in functional organizations between primary and secondary auditory cortices in mice. The 41st Annual Meeting of the Japan Neuroscience Society. Kobe, Japan.
  22. Ryohei Shibue & Makoto Yoneya (2018). Scan path modeling using marked point processes. The 41st Annual Meeting of the Japan Neuroscience Society. Kobe, Japan.
  23. Nishida, Shin'ya (2018). Understanding human recognition of material properties for innovation in SHITSUKAN science and technology. 2018 UK-JSPS Symposium "SHITSUKAN approach to digital colour sensing: human colour vision for material quality". Manchester, UK.
  24. Jan Jaap R. van Assen, Shin'ya Nishida & Roland W. Fleming (2018). Estimating perceived viscosity of liquids with neural networks. European Conference on Visual Perception. Trieste, Italy.
  25. Hiroya, S. & Mochida, T. (2018). Neural mechanisms underlying the impact of speech sound naturalness during transformed auditory feedback. 10th Annual Meeting of the Society for Neurobiology of Language, Québec City, Canada.
  26. Fukiage, Taiki, Kawabe, Takahiro & Nishida Shin'ya (2018). Hidden Stereo: Hiding Phase-Based Stereo Disparity for Ghost-Free Viewing Without Glasses. IMID2018. Busan, Korea.
  27. Daiki, Amanai, Tomoko, Ohtani & Kazushi, Maruya (2018). A Suggestion of the Optical Illusion Blocks for an Architectural Theory: Toward an Architecture in the Near Future. The 18th International Conference on Geometry & Graphics (ICGG 2018). Milan, Italy.
  28. Tomoko, Ohtani, Daiki, Amanai & Kazushi, Maruya (2018). The Effect of a Two-Dimensional Optical Illusion Pattern on the Three-Dimensional Interpretation of Objects Using Café Wall “Illusion Blocks”. The 18th International Conference on Geometry & Graphics (ICGG 2018). Milan, Italy.
  29. Kawano, Hiroshi (2018). Distributed Tunneling Reconfiguration of Sliding Cubic Modular Robots in Severe Space Requirements. Proceedings of 14th International Symposium on Distributed Autonomous Robotic Systems.
  30. Otsuka, Kazuhiro, Kasuga, Keisuke & Köhler, Martina (2018). Estimating Visual Focus of Attention in Multiparty Meetings Using Deep Convolutional Neural Networks. Proceedings of the 20th ACM International Conference on Multimodal Interaction. New York, NY, USA.
  31. Ishii, Ryo, Otsuka, Kazuhiro, Kumano, Shiro, Higashinaka, Ryuichiro & Tomita, Junji (2018). Analyzing Gaze Behavior and Dialogue Act During Turn-taking for Estimating Empathy Skill Level. Proceedings of the 20th ACM International Conference on Multimodal Interaction. New York, NY, USA.
  32. Koumura, Takuya, Terashima, Hiroki & Furukawa, Shigeto (2018). Chimeric sounds with shuffled “texture” and “content” synthesized by a model of the auditory system. International Symposium on Universal Acoustical Communication 2018.
  33. Koumura, Takuya, Sawayama, Masataka & Nishida, Shin'ya (2018). Explaining Behavioral Data of Visual Material Discrimination with a Neural Network for Natural Image Recognition. The 28th Annual Conference of the Japanese Neural Network Society.
  34. Terashima, Hiroki & Furukawa, Shigeto (2018). An examination of the efficient coding model for auditory nerves during infant development. International Symposium on Universal Acoustical Communication 2018.
  35. Terashima, Hiroki (2018). Computational understanding of auditory neural codes and natural sounds. Joint workshop of UCL-ICN, NTT, UCL-Gatsby, and AIBS: Analysis and Synthesis for Human/Artificial Cognition and Behaviour.
  36. Ho, Hsin-Ni, Terashima, Hiroki, Wakamatsu, Kohta, Kwon, Jinhwan, Sakamoto, Maki, Nakauchi, Shigeki & Nishida, Shin'ya (2018). Image statistics and the warm/cold perception of surfaces. Joint workshop of UCL-ICN, NTT, UCL-Gatsby, and AIBS: Analysis and Synthesis for Human/Artificial Cognition and Behaviour.
  37. Terashima, Hiroki, Tsukano, Hiroaki & Furukawa, Shigeto (2018). Unsupervised Area Segmentation of Mouse Auditory Cortex based on Responses to Naturalistic Complex Sounds. The Proceedings of the 28th Annual Conference of the Japanese Neural Network Society.
  38. Furukawa, Shigeto, Terashima, Hiroki, Koumura, Takuya & Tsukano, Hiroaki (2018). Data-driven approaches for unveiling the neurophysiological functions of the auditory system. Seminar on brain, hearing and speech sciences for universal speech communication.
  39. Furukawa, Shigeto (2018). Probing auditory attention by eye metrics. Joint workshop of UCL-ICN, NTT, UCL-Gatsby, and AIBS: Analysis and Synthesis for Human/Artificial Cognition and Behaviour.
  40. Yamagishi, Shimpei, Yoneya, Makoto & Furukawa, Shigeto (2018). Dynamic overshoot in saccadic movement of pupil inside iris during pro- and anti-saccade tasks. Neuroscience 2018, Annual Meeting of the Society for Neuroscience.
  41. Gatica-Perez, Daniel, Sanchez-Cortes, Dairazalia, Do, Trinh-Minh-Tri, Babu Jayagopi, Dinesh & Otsuka, Kazuhiro (2018). Vlogging Over Time: Longitudinal Impressions and Behavior in YouTube. Proceedings of 17th International Conference on Mobile and Ubiquitous Multimedia (MUM2018).
  42. Ryohei Shibue & Makoto Yoneya (2018). Scan path modeling using marked point processes. Neuroscience 2018, Annual Meeting of the Society for Neuroscience.
  43. Jan Jaap R. van Assen, Shin'ya Nishida & Roland W. Fleming (2018). How neural networks perceive the viscosity of liquids. Skin of things symposium. Amsterdam, Netherlands.
  44. Kawabe, Takahiro (2018). Spatially Augmented Depth and Transparency in Paper Materials. SIGGRAPH Asia 2018 Emerging Technologies. New York, NY, USA.
  45. Kawabe, Takahiro (2018). Spatially Augmented Depth and Transparency in Paper Materials. SIGGRAPH Asia 2018 Posters. New York, NY, USA.
  46. Kawabe, Takahiro (2018). Danswing Papers. SIGGRAPH Asia 2018 Posters. New York, NY, USA.
  47. Fukiage, Taiki, Kawabe, Takahiro & Nishida Shin'ya (2018). Hidden Stereo: Synthesizing Ghost-free Stereoscopic Images for Viewers without 3D Glasses. IDW2018. Nagoya, Japan.
  48. Takumi Shimada, Vibol Yem, Yasushi Ikei, Tomohiro Amemiya, Koichi Hirota & Michiteru Kitazaki (2018). Spatiotemporal Tactile display with tangential force and normal skin vibration generated by shaft end-effectors. Proceedings of AsiaHaptics 2018.
  49. Uezu, Yasufumi, Hiroya, Sadao & Mochida, Takemi (2018). Sound naturalness of feedback speech affects articulatory compensation for transformed auditory feedback. Joint workshop of UCL-ICN, NTT, UCL-Gatsby, and AIBS.
  50. Takamuku, Shinya & Gomi, Hiroaki (2018). Low sensitivities of walking speed adjustment and self-motion velocity perception to dense optic flow. JNNS Satellite Workshop "Analysis and Synthesis for Human/Artificial Cognition and Behaviour". Okinawa, Japan.
  51. Takamuku, Shinya & Gomi, Hiroaki (2018). Increase in density of optic flow deteriorates self-motion velocity perception and decreases implicit adjustments of walking speed. Neuroscience2018.
  52. Shimizu, Koichi, Sueta, Gaku, Yamaoka, Kentaro, Sawamura, Kazuki, Suzuki, Yujin, Yoshida, Keisuke, Yem, Vibol, Ikei, Yasushi, Amemiya, Tomohiro, Sato, Makoto, Hirota, Koichi & Kitazaki, Michiteru (2018). FiveStar VR: Shareable Travel Experience Through Multisensory Stimulation to the Whole Body. Proc. ACM SIGGRAPH Asia 2018 Virtual & Augmented Reality. Tokyo, Japan.
  53. Nakamura, Daiki & Gomi, Hiroaki (2018). Statistical analysis of optic flow induced by body motion characterizing OFR and MFR. JNNS Satellite Workshop "Analysis and Synthesis for Human/Artificial Cognition and Behaviour". Okinawa, Japan.
  54. Kaneko, Hirofumi, Amemiya, Tomohiro, Yem, Vibol, Ikei, Yasushi, Hirota, Koichi & Kitazaki, Michiteru (2018). Leg-Jack: Generation of the sensation of walking by electrical and kinesthetic stimuli to the lower limbs. Proc. ACM SIGGRAPH Asia 2018 Emerging Technologies. Tokyo, Japan.
  55. Ito, Sho & Gomi, Hiroaki (2018). Multimodal contribution to body state representation for generating proprioceptive reflexes. JNNS Satellite Workshop "Analysis and Synthesis for Human/Artificial Cognition and Behaviour". Okinawa, Japan.
  56. Hiroya, Sadao (2018). Speech production and perception share common brain mechanisms. Joint workshop of UCL-ICN, NTT, UCL-Gatsby, and AIBS.
  57. Hiroya, Sadao, Cai, Q., Sethi, A., Lavan, N., Chen, S.H., Meekings, S. & Scott, S.K. (2018). Representational similarity analysis reveals the involvement of supplementary motor area in perceiving speech rhythm. Society for Neuroscience 2018 Abstracts.
  58. Hiroaki Gomi (2018). Implicit visuomotor control and its effect on self-awareness (invited talk). Workshop on mechanism of brain and mind. Hokkaido, Japan.
  59. Gomi, Hiroaki (2018). Contribution of internal models on sensorimotor control (Invited talk). The 75th Fujihara Seminar "The Cerebellum as a CNS hub - from its evolution to therapeutic strategies". Tokyo Medical and Dental University.
  60. Gomi, Hiroaki (2018). Output Modality Dependent Visual Motion Analysis in the Brain (invited talk). Brain and AI symposium by Korea Society of Brain and Nerve.
  61. Gomi, Hiroaki & Nakamura, Daiki (2018). Specificities of manual and ocular following responses and natural statistics of optic flow induced by body movements. The 41st Annual Meeting of the Japan Neuroscience Society. Kobe, Japan.
  62. Gomi, Hiroaki & Ito, Sho (2018). Portable 2DoF force display gadget realized by 'anisotropic rigidity bridging'. EuroHaptics 2018. Pisa, Italy.
  63. Gomi, Hiroaki, Ito, Sho & Watanabe, Junji (2018). Let's feel bodily sensation in visual experience with a portable 2-dof force display. EuroHaptics 2018. Pisa, Italy.
  64. De Havas, Jack, Ito, Sho & Gomi, Hiroaki (2018). The inhibition of voluntary muscle relaxations depends on similar mechanisms to the inhibition of muscle contractions. Neuroscience2018. San Diego, USA.
  65. De Havas, Jack, Ito, Sho & Gomi, Hiroaki (2018). Does the inhibition of voluntary muscle relaxations depend on similar mechanisms to the inhibition of muscle contractions?. Joint workshop of UCL-ICN, NTT, UCL-Gatsby, and AIBS "Analysis and Synthesis for Human/Artificial Cognition and Behaviour". OIST, Okinawa, Japan.
  66. Arslanova, Irena, Gomi, Hiroaki & Haggard, Patrick (2018). Intra-hemispheric limitations in integrating spatiotemporal information across fingers. JNNS Satellite Workshop "Analysis and Synthesis for Human/Artificial Cognition and Behaviour". Okinawa.
  67. Abekawa, Naotoshi & Gomi, Hiroaki (2018). Different eye-hand coordination forms distinct motor memories in visuomotor adaptation. Neuroscience2018. San Diego, USA.
  68. Abekawa, Naotoshi & Gomi, Hiroaki (2018). Difference in eye-hand coordination forms distinct motor memories in implicit visuomotor adaptation. JNNS Satellite Workshop "Analysis and Synthesis for Human/Artificial Cognition and Behaviour". OIST, Okinawa, Japan.
  69. Yasu, Kentaro (2018). Magnetact: Magnetic-sheet-based Haptic Interfaces for Touch Devices. SIGGRAPH Asia 2018 Emerging Technologies. New York, NY, USA.

2017

Journal articles

  1. Amemiya, Tomohiro, Beck, Brianna, Walsh, Vincent, Gomi, Hiroaki & Haggard, Patrick (2017). Visual area V5/hMT+ contributes to perception of tactile motion direction: a TMS study. Scientific Reports, 7 (January), 40937.
  2. Fukiage, Taiki, Kawabe, Takahiro, Sawayama, Masataka & Nishida, Shin'ya (2017). Animating Static Objects by Illusion-Based Projection Mapping. Journal of the Society for Information Display, 25 (7), 434-443.
  3. Fukiage, Taiki, Kawabe, Takahiro & Nishida, Shin'ya (2017). Hiding of Phase-Based Stereo Disparity for Ghost-Free Viewing Without Glasses. ACM Transactions on Graphics (Proc. SIGGRAPH 2017), 36 (4).
  4. Yamagishi, S., Otsuka, S., Furukawa, S. & Kashino, M. (2017). Comparison of perceptual properties of auditory streaming between spectral and amplitude modulation domains. Hearing Research.
  5. Altmann, C. F., Ueda, R., Furukawa, S., Kashino, M., Mima, T. & Fukuyama, H. (2017). Auditory Mismatch Negativity in Response to Changes of Counter-Balanced Interaural Time and Level Differences. Frontiers in neuroscience, 11.
  6. Altmann, C. F., Ueda, R., Bucher, B., Furukawa, S., Ono, K., Kashino, M., Mima, T. & Fukuyama, H. (2017). Trading of dynamic interaural time and level difference cues and its effect on the auditory motion-onset response measured with electroencephalography. NeuroImage, 159, 185-194.
  7. De Havas, J., Gomi, H. & Haggard, P. (2017). Experimental investigations of control principles of involuntary movement: a comprehensive review of the Kohnstamm phenomenon. Exp Brain Res, 235 (7), 1953-1997.
  8. Igeta, Takako, Hiroya, Sadao & Arai, Takayuki (2017). Overlapping of /o/ and /u/ in modern Seoul Korean: focusing on speech rate in read speech. Phonetics and Speech Sciences, 1-7.
  9. Ho, H.-N., Sato, K., Kuroki, S., Watanabe, J., Maeno, T. & Nishida, S. (2017). Physical-Perceptual Correspondence for Dynamic Thermal Stimulation. IEEE Transactions on Haptics, 10 (1), 84-93.
  10. Ho, H.-N. (2017). Material recognition based on thermal cues: Mechanisms and applications. Temperature, 1-20.
  11. Zhou, Y., Ho, H.-N. & Watanabe, J. (2017). Perceptual-semantic congruency facilitates semantic discrimination of thermal qualities. Frontiers in Psychology, 8 (2113).
  12. Kawabe, Takahiro (2017). What Property of the Contour of a Deforming Region Biases Percepts toward Liquid?. Frontiers in Psychology, 8.
  13. Kawabe, Takahiro (2017). Perceiving Animacy From Deformation and Translation. i-Perception, 8 (3), 2041669517707767.
  14. Kawabe, Takahiro, Sasaki, Kyoshiro, Ihaya, Keiko & Yamada, Yuki (2017). When categorization-based stranger avoidance explains the uncanny valley: A comment on MacDorman and Chattopadhyay (2016). Cognition, 161, 129-131.
  15. Kawabe, Takahiro & Kogovšek, Rok (2017). Image deformation as a cue to material category judgment. Scientific Reports, 7, 44274.
  16. Shiro Kumano, Kazuhiro Otsuka, Ryo Ishii & Junji Yamato (2017). Collective First-Person Vision for Automatic Gaze Analysis in Multiparty Conversations. IEEE Trans. Multimedia, 19 (1), 107-122.
  17. Kuroki, S, Watanabe, J & Nishida, S (2017). Integration of vibrotactile frequency information beyond the mechanoreceptor channel and somatotopy. Scientific Reports, 7, 2758.
  18. Kuroki, S, Yokosaka, T & Watanabe, J (2017). Sub-Second Temporal Integration of Vibro-Tactile Stimuli: Intervals between Adjacent, Weak, and Within-Channel Stimuli Are Underestimated. Frontiers in Psychology, 8, 1295.
  19. Watanabe, Ken, Ooishi, Yuuki & Kashino, Makio (2017). Heart rate responses induced by acoustic tempo and its interaction with basal heart rate. Scientific Reports, 7.
  20. Ooishi, Yuuki, Mukai, Hideo, Watanabe, Ken, Kawato, Suguru & Kashino, Makio (2017). Increase in salivary oxytocin and decrease in salivary cortisol after listening to relaxing slow-tempo and exciting fast-tempo music. PloS one, 12 (12), e0189075.
  21. Sawayama, Masataka, Adelson, Edward H. & Nishida, Shinya (2017). Visual wetness perception based on image color statistics. Journal of Vision, 17(5): 7, 1-24.
  22. Sawayama, Masataka, Nishida, Shinya & Shinya, Mikio (2017). Human perception of subresolution fineness of dense textures based on image intensity statistics. Journal of Vision, 17(4): 8, 1-18.
  23. Uetsuki, Miki, Watanabe, Junji, Ando, Hideyuki & Maruya, Kazushi (2017). Reading traits for dynamically presented texts: Comparison of the optimum reading rates of dynamic text presentation and the reading rates of static text presentation. Frontiers in Psychology, 8 (1390), 1-10.
  24. Sakamoto, Maki & Watanabe, Junji (2017). Exploring tactile perceptual dimensions using materials associated with sensory vocabulary. Frontiers in Psychology, 8 (569), 1-10.
  25. Doizaki, Ryuichi, Watanabe, Junji & Sakamoto, Maki (2017). Automatic estimation of multidimensional ratings from a single sound-symbolic word and word-based visualization of tactile perceptual space. IEEE Transactions on Haptics, 10 (2), 173-182.
  26. Yokosaka, Takumi, Kuroki, Scinob, Watanabe, Junji & Nishida, Shin'ya (2017). Linkage between free exploratory movements and subjective tactile ratings. IEEE Transactions on Haptics, 10 (2), 217-225.
  27. Hayashi, Ryusuke, Watanabe, Osamu, Yokoyama, Hiroki & Nishida, Shinya (2017). A new analytical method for characterizing nonlinear visual processes with stimuli of arbitrary distribution: Theory and applications. Journal of Vision, 17(6): 14, 1-20.
  28. Klimova, Michaela, Nishida, Shinya & Roseboom, Warrick (2017). Grouping by feature of cross-modal flankers in temporal ventriloquism. Scientific Reports, 7.

Conferences

  1. Amemiya, Tomohiro, Ikei, Yasushi, Hirota, Koichi & Kitazaki, Michiteru (2017). Vibration on the soles of the feet evoking a sensation of walking expands peripersonal space. IEEE World Haptics Conference 2017 (WHC 2017).
  2. Masato Kurosawa, Ken Ito, Hirofumi Kaneko, Yasushi Ikei, Koichi Hirota, Tomohiro Amemiya & Michiteru Kitazaki (2017). Cutaneous sensation of airflow for presentation of body motion. IEEE World Haptics Conference 2017 (WHC 2017).
  3. Tashiro, Kento, Fujie, Toi, Ikei, Yasushi, Amemiya, Tomohiro, Hirota, Koichi & Kitazaki, Michiteru (2017). TwinCam: Omni-directional Stereoscopic Live Viewing Camera Reducing Motion Blur during Head Rotation. SIGGRAPH 2017 Emerging Technologies.
  5. Ikei, Yasushi, Amemiya, Tomohiro, Hirota, Koichi & Kitazaki, Michiteru (2017). A New Experience Presentation in VR2.0. HCI International 2017.
  6. Otsuka, S., Tsuzaki, M., Tanaka, J. & Furukawa, S. (2017). Comparison of time course of frequency following response and otoacoustic emission following short-duration acoustic exposure. 40th ARO Midwinter Meeting. Baltimore, MD.
  7. Yoneya, M., Zhao, S., Chait, M., Furukawa, S. & Kashino, M. (2017). Eye-metrics: A measure of auditory distraction?. 40th ARO Midwinter Meeting. Baltimore, MD.
  8. Liao, H.I., Yoneya, M., Kashino, M. & Furukawa, S. (2017). What does pupil tell about musical processing?. Conference on Music & Eye-Tracking. Frankfurt, Germany.
  9. Ueda, Hiroshi, Abekawa, Naotoshi & Gomi, Hiroaki (2017). Temporal integration of sensory evidence for position representation of a moving object containing motion signal in perceptual and motor decision making.
  10. Ueda, Hiroshi, Abekawa, Naotoshi & Gomi, Hiroaki (2017). Temporal development of an interaction effect between internal motion and contour signals of drifting target on reaching adjustment.
  11. Hiroya, Sadao, Lavan, Nadine, Chen, Sinead, Meekings, Sophie & Scott, Sophie K. (2017). Impact of articulator velocity-controlled rhythm in perceiving speech. Washington, D.C.
  12. Abekawa, Naotoshi & Gomi, Hiroaki (2017). Modulation difference in visuomotor responses in implicit and explicit motor tasks depending on postural stability.
  13. Kawabe, Takahiro & Nishida, Shin'ya (2017). Transparent surface formation from dynamic image deformation. Symposium at European Conference on Visual Perception 2017. Berlin (Germany).
  14. Kawabe, Takahiro & Nishida, Shin'ya (2017). Spatial configuration modulates perceptual transparency from dynamic image deformation. Vision Sciences Society 17th Annual Meeting (VSS2017). St. Pete Beach (Florida, US).
  15. Kawano, Hiroshi (2017). Tunneling-based Self-Reconfiguration of Heterogeneous Sliding Cube-shaped Modular Robots in Environments with Obstacles. 2017 IEEE International Conference on Robotics and Automation (ICRA 2017). Singapore, Singapore.
  16. Koumura, Takuya & Furukawa, Shigeto (2017). Effect of Reverberation and Its Presentation Context on Material Perception Based on Impact Sounds. 2017 Spring Meeting, Acoustical Society of Japan.
  17. Yokosaka, T, Tajima, S, Kuroki, S, Watanabe, J & Nishida, S (2017). Modeling Pleasantness Ratings for Touched Materials by Explorative Hand Motion. IEEE World Haptics 2017. Munich (Germany).
  18. Kuroki, S & Nishida, S (2017). Envelope wave modulates not only amplitude but also perceived frequency of carrier wave. IEEE World Haptics 2017. Munich (Germany).
  19. Chen, L. & Liao, H.-I. (2017). Cross-modal freezing effect: Evidence from eye tracking on audiovisual integration. the 18th Annual International Multisensory Research Forum (IMRF).
  20. Liao, H.-I., Chen, Y.-C., Kashino, M. & Shimojo, S. (2017). How does pupillary response contribute to interpersonal preference evaluation?. the 13th Asia-Pacific Conference on Vision (APCV).
  21. Liao, H.-I., Kashino, M. & Shimojo, S. (2017). Pupil constriction reflects not only facial attractiveness, but also appraisal evaluation for natural scenes. the 40th European Conference on Visual Perception (ECVP).
  22. Liao, H.-I., Nakatani, M., Miyazaki, H. & Furukawa, S. (2017). Sensing frisson by material sounds. the 21st Annual Meeting of the Association for the Scientific Study of Consciousness (ASSC).
  23. Liao, H.-I., Zhao, S., Chait, M., Kashino, M. & Furukawa, S. (2017). How the eyes detect acoustic transitions: A study of pupillary response to transitions between regular and random frequency patterns. Association for Research in Otolaryngology (ARO) 40th Annual MidWinter Meeting.
  24. Norimichi, Kitagawa, Masahiro, Fujino, Vimala, Inoue, Michio, Nomura & Yuuki, Ooishi (2017). Mindfulness meditation can weaken audiovisual integration in multisensory perception. Mind & Life Europe Summer Research Institute. Chiemsee (Germany).
  25. Sawayama, M., Fukiage, T. & Nishida, S. (2017). Perceiving shape of thin translucent objects from spatial transmittance variation. Vision Sciences Society 17th Annual Meeting (VSS2017). Florida (USA).
  26. Takamuku, Shinya, Nagasawa, Tomoyuki & Gomi, Hiroaki (2017). Automatic adjustment of walking speed by optic flow benefits from binocular vision. Annual Meeting of the Society for Neuroscience.
  27. Takamuku, Shinya, Forbes, Paul, Hamilton, Antonia & Gomi, Hiroaki (2017). Inverse dynamics computation in adults with autism - examination based on perceptual biases -. Winter workshop on mechanism of brain and mind.
  28. Terashima, Hiroki & Okada, Masato (2017). Using a V1 model to understand the disordered topography and "complex" pitch cells of A1. The 6th International Conference on Auditory Cortex (ICAC2017). Banff, Canada.
  29. Watanabe, Junji (2017). Media Technologies for Education Workshops. International Display Workshops 2017. Sendai, JP.
  30. Watanabe, Junji (2017). Tactility for Communication and Well-being. International Display Workshops 2017. Sendai, JP.
  31. Tanoue, Tomo, Watanabe, Junji, Maeda, Taro & Ando, Hideyuki (2017). 3D communication system using slit light field. SIGGRAPH ASIA 2017. Bangkok, TH.
  32. Ando, Hideyuki, Watanabe, Junji, Chen, Dominick & Sakakura, Kyosuke (2017). Development of Information Technology Guidelines for Promoting Japanese-style Wellbeing. Science Centre World Summit. Tokyo, JP.
  33. Ueda, Tomoya, Miyagi, Takuya, Kuroda, Tsuyoshi, Watanabe, Junji, Suegami, Takashi, Daimoto, Hiroshi & Miyazaki, Makoto (2017). Cross-modal effects in speed judgments during virtual motorcycle riding. Neuroscience 2017. Washington, DC, US.
  34. Ando, Hideyuki, Watanabe, Junji, Chen, Dominick & Sakakura, Kyosuke (2017). Development of Information Technology Guidelines for Promoting Japanese-style Wellbeing. CHI Conference on Human Factors in Computing Systems 2017. Denver, US.
  35. Hashimoto, Yuki, Watanabe, Junji, Maeda, Taro & Ando, Hideyuki (2017). Tactile illusion of texture using vibration to a finger for active touch. IEEE World Haptics Conference 2011. Istanbul, TR.
  36. Yasu, Kentaro (2017). Magnetic Plotter: A Macrotexture Design Method Using Magnetic Rubber Sheets. ACM SIGGRAPH 2017 Studio. New York, NY, USA.
  37. Yasu, Kentaro (2017). Magnetic Plotter: A Macrotexture Design Method Using Magnetic Rubber Sheets. Proceedings of the 2017 CHI Conference on Human Factors in Computing Systems. New York, NY, USA.
  38. Nishida, Shinya (2017). Visual Material Perception (Rank Prize Lecture). European Conference on Visual Perception (ECVP2017). Berlin (Germany).
  39. Tsukano, Hiroaki, Ohga, Shinpei, Horie, Masao, Terashima, Hiroki, Nishio, Nana, Kubota, Yamato, Takahashi, Kuniyuki, Hishida, Ryuichi, Takebayashi, Hirohide & Shibuki, Katsuei (2018). Thalamocortical structures that differentiate complexity in functional organizations between primary and secondary auditory cortices in mice. The 41st Annual Meeting of the Japan Neuroscience Society. Kobe, Japan.
  40. Ryohei Shibue & Makoto Yoneya (2018). Scan path modeling using marked point processes. The 41st Annual Meeting of the Japan Neuroscience Society. Kobe, Japan.
  41. Nishida, Shin'ya (2018). Understanding human recognition of material properties for innovation in SHITSUKAN science and technology. 2018 UK-JSPS Symposium "SHITSUKAN approach to digital colour sensing: human colour vision for material quality". Manchester, UK.
  42. Jan Jaap R. van Assen, Shin'ya Nishida & Roland W. Fleming (2018). Estimating perceived viscosity of liquids with neural networks. European Conference on Visual Perception. Trieste, Italy.
  43. Hiroya, S. & Mochida, T. (2018). Neural mechanisms underlying the impact of speech sound naturalness during transformed auditory feedback.

Misc

  1. Kuroki, S. (2017). Directional remapping in tactile motion perception. NTT Technical Review, 15 (1).

2016

Journal articles

  1. Ishii, R., Otsuka, K., Kumano, S. & Yamato, J. (2016). Prediction of "Who Will Be the Next Speaker and When" Using Gaze Behavior in Multi-Party Meetings. ACM Transactions on Interactive Intelligent Systems, 6 (1).
  2. Asai, T. (2016). Agency elicits body-ownership: proprioceptive drift toward a synchronously acting external proxy. Experimental Brain Research, 234 (5), 1163-1174.
  3. Izawa, J., Asai, T. & Imamizu, H. (2016). Computational motor control as a window to understanding schizophrenia. Neuroscience Research, 104, 44-51.
  4. Otsuka, S., Tsuzaki, M., Sonoda, J., Tanaka, S. & Furukawa, S. (2016). A role of medial olivocochlear reflex as a protection mechanism from noise-induced hearing loss revealed in short-practicing violinists. PLoS ONE, 11 (1), e0146751.
  5. Liao, H. I., Kidani, S., Yoneya, M., Kashino, M. & Furukawa, S. (2016). Correspondences among pupillary dilation response, subjective salience of sounds, and loudness. Psychonomic Bulletin & Review, 23 (2), 412-425.
  6. Liao, H.I., Yoneya, M., Kidani, S., Kashino, M. & Furukawa, S. (2016). Human pupillary dilation response to deviant auditory stimuli: Effects of stimulus properties and voluntary attention. Frontiers in Neuroscience, 10, 43.
  7. Amemiya, Tomohiro, Hirota, Koichi & Ikei, Yasushi (2016). Tactile Apparent Motion on the Torso Modulates Perceived Forward Self-Motion Velocity. IEEE Transactions on Haptics, 9 (4), 474-482.
  8. Amemiya, Tomohiro & Gomi, Hiroaki (2016). Active Manual Movement Improves Directional Perception of Illusory Force. IEEE Transactions on Haptics, 9 (4), 465-473.
  9. Yamagishi, S., Otsuka, S., Furukawa, S. & Kashino, M. (2016). Subcortical correlates of auditory perceptual organization in humans. Hearing Research, 339, 104-111.
  10. Otsuka, S., Furukawa, S., Yamagishi, S., Hirota, K. & Kashino, M. (2016). Relation between cochlear mechanics and performance of temporal-fine-structure based tasks. Journal of the Association for Research in Otolaryngology, 17, 541-557.
  11. Petsas, T., Harrison, J., Kashino, M., Furukawa, S. & Chait, M. (2016). The effect of distraction on change detection in crowded acoustic scenes. Hearing Research, 341, 179-189.
  12. Ochi, Atsushi, Yamasoba, Tatsuya & Furukawa, Shigeto (2016). Contributions of coding efficiency of temporal-structure and level information to lateralization performance in young and early-elderly listeners. Advances in Experimental Medicine and Biology, 894, 19-28.
  13. Sakurada, T., Ito, K. & Gomi, H. (2016). Bimanual motor coordination controlled by cooperative interactions in intrinsic and extrinsic coordinates. Eur J Neurosci, 43 (1), 120-30.
  14. De Havas, J., Ghosh, A., Gomi, H. & Haggard, P. (2016). Voluntary motor commands reveal awareness and control of involuntary movement. Cognition, 155, 155-67.
  15. Kawabe, Takahiro, Fukiage, Taiki, Sawayama, Masataka & Nishida, Shin'ya (2016). Deformation Lamps: A Projection Technique to Make Static Objects Perceptually Dynamic. ACM Transactions on Applied Perception, 13 (2), 10:1-10:17.
  16. Ryo Ishii, Kazuhiro Otsuka, Shiro Kumano & Junji Yamato (2016). Using Respiration to Predict Who Will Speak Next and When in Multiparty Meetings. ACM Trans. Interactive Intelligent Systems (TiiS), 6 (2), Article No. 20.
  17. Kuroki, S, Hagura, N, Nishida, S, Haggard, P & Watanabe, J (2016). Sanshool on The Fingertip Interferes with Vibration Detection in a Rapidly-Adapting (RA) Tactile Channel. PLOS ONE, 11 (12), e0165842.
  18. Sato Goto, Takashi, Watanabe, Junji & Moriya, Takehiro (2016). Presenting changes in acoustic features synchronously to respiration alters the affective evaluation of sound. International Journal of Psychophysiology, 110, 179-186.
  19. Kuroki, Scinob, Watanabe, Junji & Nishida, Shin'ya (2016). Neural timing signal for precise tactile timing judgments. Journal of Neurophysiology, 115, 1620-1629.
  20. Iida, Naoki, Kuroki, Scinob & Watanabe, Junji (2016). Comparison of tactile temporal numerosity judgments between unimanual and bimanual presentations. Perception, 45, 99-113.
  21. Amano, Kaoru, Qi, Liang, Terada, Yoshikazu & Nishida, Shinya (2016). Neural correlates of the time marker for the perception of event timing. eNeuro, 3 (4), ENEURO-0144.
  22. Hisakata, Rumi, Nishida, Shinya & Johnston, Alan (2016). An adaptable metric shapes perceptual space. Current Biology, 26 (14), 1911-1915.
  23. Rider, Andrew T., Nishida, Shinya & Johnston, Alan (2016). Multiple-stage ambiguity in motion perception reveals global computation of local motion directions. Journal of Vision, 16 (15), 7, 1-11.

Books/Chapters

  1. Ikei, Yasushi, Hirota, Koichi, Amemiya, Tomohiro & Kitazaki, Michiteru (2016). Five Senses Theater: A Multisensory Display for the Bodily Ultra-Reality. Emotional Engineering Volume 4. Springer International Publishing.

Conferences

  1. Ishii, R., Kumano, S. & Otsuka, K. (2016). Prediction of Next-Utterance Timing using Head Movement in Multi-Party Meetings. (INTERSPEECH).
  2. Otsuka, K. (2016). MMSpace: Kinetically-augmented telepresence for small group-to-group conversations. (IEEE Virtual Reality 2016 (VR2016)).
  3. Furukawa, S. (2016). Natural combinations of interaural time and level differences in realistic auditory scenes. (39th ARO Midwinter Meeting).
  4. Terashima, H. & Furukawa, S. (2016). A developmental analysis of infant vocalizations under a sparse-code framework. (39th ARO Midwinter Meeting).
  5. Otsuka, S., Tsuzaki, M., Tanaka, S., Sonoda, J. & Furukawa, S. (2016). Temporary effect of short-duration acoustic exposure on human frequency following responses. (39th ARO Midwinter Meeting).
  6. Yamagishi, S., Otsuka, S., Furukawa, S. & Kashino, M. (2016). Comparison of properties of perceptual switching in auditory streaming based on spectral and temporal cues. (39th ARO Midwinter Meeting).
  7. Liao, H. I., Yoneya, M., Furukawa, S. & Kashino, M. (2016). Detecting variations in music by pupil. (39th ARO Midwinter Meeting).
  8. Amemiya, T., Hirota, K. & Ikei, Y. (2016). Topographic surface perception modulated by pitch rotation of motion chair. (18th International Conference on Human-Computer Interaction (HCI International 2016)).
  9. Ikei, Yasushi, Kato, Shunki, Komase, Kohei, Imao, Shogo, Sakurai, Sho, Amemiya, Tomohiro, Kitazaki, Michiteru & Hirota, Koichi (2016). Vestibulohaptic passive stimulation for a walking sensation. IEEE VR.
  10. Amemiya, Tomohiro, Takamuku, Shinya & Gomi, Hiroaki (2016). Changes in direction discriminability of illusory force after adaptation to skin stretch. Eurohaptics 2016.
  11. Saka, Naoyuki, Ikei, Yasushi, Amemiya, Tomohiro, Hirota, Koichi & Kitazaki, Michiteru (2016). Passive Arm Swing Motion for Virtual Walking Sensation. The 26th International Conference on Artificial Reality and Telexistence and the 21st Eurographics Symposium on Virtual Environment (ICAT-EGVE '16). Goslar, Germany.
  12. Fukiage, Taiki, Kawabe, Takahiro & Nishida, Shin'ya (2016). Interactive Editing and Automatic Projection of Motion Impression on Real-World Objects. The 24th International Display Workshops. Fukuoka, Japan.
  13. Fukiage, Taiki, Kawabe, Takahiro & Nishida, Shin'ya (2016). A model of V1 metamer can explain perceived deformation of a static object induced by light projection. Vision Sciences Society 16th Annual Meeting (VSS2016). St. Pete Beach (Florida, US).
  14. Otsuka, S. & Furukawa, S. (2016). Conversion of amplitude modulation to phase modulation on the basilar membrane and its implication in the perceptual consequences of disrupted temporal coding. 5th Joint Meeting of Acoustical Society of America and Acoustical Society of Japan. Honolulu, HI.
  15. Yamagishi, S., Otsuka, S., Furukawa, S. & Kashino, M. (2016). Comparison of brainstem frequency-following responses associated with auditory streaming based on spectral and temporal cues. 5th Joint Meeting of Acoustical Society of America and Acoustical Society of Japan. Honolulu, HI.
  16. Takamuku, Shinya, Teshima, Tetsuhiko, Amemiya, Tomohiro & Gomi, Hiroaki (2016). Does proprioceptive information contribute to illusory force sensation elicited by asymmetric vibration? Annual Meeting of the Society for Neuroscience.
  17. Gomi, Hiroaki, Ito, Sho, Amemiya, Tomohiro & Takamuku, Shinya (2016). Ungrounded 6-dof force display by asymmetric vibration: `Buru-Navi4 CubicForce 6D'. EuroHaptics 2016.
  18. Abekawa, Naotoshi, Gomi, Hiroaki & Diedrichsen, Joern (2016). Task demands change online coordination of eye and hand movements.
  19. Abekawa, Naotoshi, Ferre, Elisa, Gallagher, Maria, Gomi, Hiroaki & Haggard, Patrick (2016). Modulation of egocentric spatial frames of references by vestibular stimulation.
  20. Hiroya, Sadao, Jasmin, Kyle, Krishnan, Saloni, Lima, Cesar, Ostarek, Markus, Boebinger, Dana & Scott, Sophie K. (2016). Speech rhythm measure of non-native speech using a statistical phoneme duration model. The 8th Annual Meeting of the Society for the Neurobiology of Language.
  21. Kawabe, Takahiro & Nishida, Shin'ya (2016). Seeing Jelly: Judging Elasticity of a Transparent Object. Proceedings of the ACM Symposium on Applied Perception. New York, NY, USA.
  22. Kawano, Hiroshi (2016). Full Resolution Reconfiguration Planning for Heterogeneous Cube-shaped Modular Robots with only Sliding Motion Primitive. 2016 IEEE International Conference on Robotics and Automation (ICRA 2016). Stockholm, Sweden.
  23. Ryo Ishii, Shiro Kumano & Kazuhiro Otsuka (2016). Analyzing mouth-opening transition pattern for predicting next speaker in multi-party meetings. International Conference on Multimodal Interaction (ICMI 2016).
  24. Kuroki, S. (2016). Motion system in touch. 17th International Multisensory Research Forum. Suzhou, China.
  25. Liao, H.-I., Shimojo, S. & Kashino, M. (2016). Reading appraisal of faces via pupillary responses. The Annual Meeting of Taiwanese Psychology Association.
  26. Liao, H.-I., Yoneya, M., Kidani, S., Barascud, N., Zhao, S., Chait, M., Kashino, M. & Furukawa, S. (2016). Detecting auditory changes by pupillary response. The ASA-ASJ Joint Meeting.
  27. Ooishi, Yuuki, Kobayashi, Maori, Ueno, Kanako & Kashino, Makio (2016). The effect of the acoustical presence of music on autonomic response. 10th FENS Forum of Neuroscience. Copenhagen (Denmark).
  28. Watanabe, Ken, Ooishi, Yuuki & Kashino, Makio (2016). The entrainment of heart rate to acoustic tempo. 10th FENS Forum of Neuroscience. Copenhagen (Denmark).
  29. Sawayama, M., Shinya, M. & Nishida, S. (2016). Perception of super-fine structures based on image intensity statistics. Vision Sciences Society 16th Annual Meeting (VSS2016). Florida (USA).
  30. Hiroki Terashima (2016). Sparse coding strategy shared by auditory and visual cortices. CNRS-NTT Joint Seminar 2016. Fontainebleau, France.
  31. Watanabe, Junji (2016). Toward "Haptic Design": Psychological foundations and practices. ERC PATCH Closing Workshop on Computational Touch. Paris, FR.
  32. Sato, Takao & Watanabe, Junji (2016). Media Technology and Psychology (Workshop organizer). International Congress of Psychology 2016. Yokohama, JP.
  33. Watanabe, Junji (2016). Tactility for understanding and producing information. International Congress of Psychology 2016. Yokohama, JP.
  34. Watanabe, Junji (2016). Interactions between Physical and Semantic Temperature. EuroHaptics 2016. London, GB.
  35. Ho, Hsin-Ni, Sato, Katsunari, Kuroki, Scinob, Watanabe, Junji, Maeno, Takashi & Nishida, Shin'ya (2016). Perceptual representation of dynamic thermal stimulation. EuroHaptics 2016. London, GB.
  36. Zhou, Yizhen, Ho, Hsin-Ni & Watanabe, Junji (2016). Interactions between Physical and Semantic Temperature. IMRF2016 17th International Multisensory Research Forum. Suzhou, CN.
  37. Kosohara, Mai, Watanabe, Junji, Hiranuma, Yasuaki, Doizaki, Ryuichi, Matsuda, Takahide & Sakamoto, Maki (2016). A system to visualize tactile perceptual space of young and old people. AAAI 2016 Spring Symposia. Palo Alto, US.
  38. Nishida, Shinya (2016). Motion perception: From a dark room to the real world. Asia-Pacific Conference on Vision (APCV2016). Fremantle (Australia).
  39. Nishida, Shinya, Kawabe, Takahiro, Fukiage, Taiki & Sawayama, Masataka (2016). Animating Static Objects by Illusion-Based Projection Mapping. IDW/AD'16. Fukuoka (Japan).
  40. Hayashi, Ryusuke, Yokoyama, Hiroki, Watanabe, Osamu & Nishida, Shinya (2016). A new analytical method for characterizing nonlinear visual processes. The 39th European Conference on Visual Perception (ECVP 2016). Barcelona (Spain).
  41. Nishida, Shinya (2016). Perception of material properties. 31st International Congress of Psychology (ICP 2016). Yokohama (Japan).

Misc

  1. Yamada, T., Takahashi, S., Naya, S., Ikebe, T. & Furukawa, S. (2016). Artificial Intelligence Research Activities and Directions in the NTT Group. NTT Technical Review, 14 (5).
  2. Furukawa, S., Yoneya, M., Liao, H.I. & Kashino, M. (2016). The Eyes as an Indicator of the Mind: A Key Element of Heart-Touching-AI. NTT Technical Review, 14 (5).
  3. Ho, H.-N. (2016). Influence of Object Geometry on Skin Temperature Responses during Hand-Object Interactions. 9774, 281-290.

2015

Journal articles

  1. Ishii, R., Otsuka, K., Kumano, S. & Yamato, J. (2015). Prediction of "Who Will Be Next Speaker and When" in Multi-Party Meetings. NTT Technical Review, 13 (7).
  2. Asai, T. (2015). Feedback control of one's own action: self-other sensory attribution in motor control. Consciousness and Cognition, 38, 118-129.
  3. Imaizumi, S. & Asai, T. (2015). Dissociation of agency and body ownership following visuomotor temporal recalibration. Frontiers in Integrative Neuroscience., 9, 35.
  4. Ujiie, Y., Asai, T. & Wakabayashi, A. (2015). The relationship between level of autistic traits and local bias in the context of the McGurk effect. Frontiers in Psychology, 6, 891.
  5. Ujiie, Y., Asai, T., Tanaka, A. & Wakabayashi, A. (2015). The McGurk effect and autistic traits: an analogue perspective. Letters on Evolutionary Behavioral Science, 6 (2), 9-12.
  6. Kihara, K., Takeuchi, T., Yoshimoto, S., Kondo, H.M. & Kawahara, J.I. (2015). Pupillometric evidence for the locus coeruleus-noradrenaline system facilitating attentional processing of action-triggered visual stimuli. Frontiers in Psychology, 6 (827).
  7. Yokosaka, T., Kuroki, S., Nishida, S. & Watanabe, J. (2015). Apparent time interval of visual stimuli is compressed during fast hand movement. PLOS ONE, 10 (4), e0124901.
  8. Oohashi, H., Hiroya, S. & Mochida, T. (2015). Real-time robust formant estimation system using a phase equalization-based autoregressive exogenous model. Acoustical Science and Technology, 36 (6), 478-488.
  9. Takamuku, S. & Gomi, H. (2015). What you feel is what you see: inverse dynamics estimation underlies the resistive sensation of a delayed cursor. Proceedings of the Royal Society B: Biological Sciences, 282 (1811).
  10. De Havas, J., Ghosh, A., Gomi, H. & Haggard, P. (2015). Sensorimotor organization of a sustained involuntary movement. Frontiers in behavioral neuroscience, 9, 185.
  11. Lin, I.-F., Yamada, T., Komine, Y., Kato, N., Kato, M. & Kashino, M. (2015). Vocal identity recognition in autism spectrum disorder. PLOS ONE.
  12. Kondo, H.M., Nomura, M. & Kashino, M. (2015). Different roles of the COMT and HTR2A genotypes in working memory subprocesses. PLOS ONE, 10 (5), e0126511.
  13. Ukezono, M., Nakashima, S.F., Sudo, R., Yamazaki, A. & Takano, Y. (2015). The combination of perception of other individuals and exogenous manipulation of arousal enhances social facilitation: re-examination of Zajonc's drive theory. Frontiers in Psychology.
  14. Fleming, R. W., Gegenfurtner, K. R. & Nishida, S. (2015). Visual perception of materials: The science of stuff. Vision Research, 109, 123-124.
  15. Kuehn, E., De Havas, J., Silkoset, E., Gomi, H. & Haggard, P. (2015). On the bimanual integration of proprioceptive information. Experimental Brain Research, 233, 1273-1288.
  16. Yamada, Y., Kawabe, T. & Miyazaki, M. (2015). Awareness shaping or shaped by prediction and postdiction: Editorial. Frontiers in Psychology, 6 (166).
  17. Sawayama, M. & Kimura, E. (2015). Stain on texture: Perception of a dark spot having a blurred edge on textured backgrounds. Vision Research, 109, 209-220.
  18. Roseboom, W., Linares, D. & Nishida, S. (2015). Sensory adaptation for timing perception. Proceedings of the Royal Society B: Biological Sciences, 282 (1805).
  19. Paulun, V. C., Kawabe, T., Nishida, S. & Fleming, R.W. (2015). Seeing liquids from static snapshots. Vision Research, 115, Part B, 163-174.
  20. Kumano, S., Otsuka, K., Mikami, D., Matsuda, M. & Yamato, J. (2015). Analyzing Interpersonal Empathy via Collective Impressions. IEEE Transactions on Affective Computing (TAFFC), 6 (4), 324-336.
  21. Kanaya, S., Fujisaki, W., Nishida, S., Furukawa, S. & Yokosawa, K. (2015). Effects of frequency separation and diotic/dichotic presentations on the alternation frequency limits in audition derived from a temporal phase discrimination task. Perception, 44 (2), 198-214.
  22. Nakashima, S.F., Ukezono, M., Nishida, H., Sudo, R. & Takano, Y. (2015). Receiving of emotional signal of pain from conspecifics in laboratory rats. Royal Society Open Science.
  23. Abekawa, N. & Gomi, H. (2015). Online gain update for manual following response accompanied by gaze shift during arm reaching. Journal of Neurophysiology, 113 (4), 1206-1216.
  24. Huang, T.-H., Yeh, S.L., Yang, Y.H., Liao, H.I., Tsai, Y.-Y., Chang, P.-J. & Chen, H. H. (2015). Method and experiments of subliminal cueing for real-world images. Multimedia Tools and Applications, 74 (22), 10111-10135.
  25. Ho, H.-N. (2015). Color-temperature correspondence: Its nature and its impact on object temperature perception. NTT Technical Review, 13 (1).
  26. Kawabe, Takahiro (2015). Delayed Visual Feedback of One's Own Action Promotes Sense of Control for Auditory Events. Frontiers in Integrative Neuroscience, 9, 57.
  27. Kawabe, Takahiro, Maruya, Kazushi & Nishida, Shin'ya (2015). Perceptual transparency from image deformation. Proceedings of the National Academy of Sciences, 112 (33), E4620-E4627.
  28. Kawabe, Takahiro, Maruya, Kazushi, Fleming, Roland W. & Nishida, Shin'ya (2015). Seeing liquids from visual motion. Vision Research, 109, Part B, 125-138.
  29. Kato, Masaharu & Mugitani, Ryoko (2015). Pareidolia in infants. PLoS ONE, 10 (2), e0118539.
  30. Watanabe, Ken, Ooishi, Yuuki & Kashino, Makio (2015). Sympathetic Tone Induced by High Acoustic Tempo Requires Fast Respiration. PLoS ONE, 10 (8), e0135589.
  31. Sakamoto, Maki & Watanabe, Junji (2015). Cross-modal associations between sounds and drink tastes/textures: A study with spontaneous production of sound-symbolic words. Chemical Senses, 41 (3), 197-203.
  32. Ho, Hsin-Ni, Iwai, Daisuke, Yoshikawa, Yuki, Watanabe, Junji & Nishida, Shin'ya (2015). Impact of hand and object colors on object temperature perception. Temperature, 2 (3), 344-345.
  33. Yang, Jiale, Watanabe, Junji, Kanazawa, So, Nishida, Shin'ya & Yamaguchi, Masami K. (2015). Infants' visual system non-retinotopically integrates color signals along motion trajectory. Journal of Vision, 15 (1), 25, 1-10.
  34. Fleming, Roland W, Nishida, Shin'ya & Gegenfurtner, Karl R (2015). Perception of material properties. Vision Research, 115 (PB), 157-162.
  35. Terao, Masahiko, Murakami, Ikuya & Nishida, Shin'ya (2015). Enhancement of motion perception in the direction opposite to smooth pursuit eye movement. Journal of Vision, 15 (13), 2-11.

Conferences

  1. Ishii, R., Ozawa, S., Kojima, A., Otsuka, K., Hayashi, Y. & Nakano, Y. (2015). Design and Evaluation of Mirror Interface MIOSS to Overlay Remote 3D Spaces. (IFIP International Conference on Human-Computer Interaction (INTERACT 2015)).
  2. Ishii, R., Otsuka, K. & Kumano, S. (2015). Multimodal Fusion using Respiration and Gaze for Predicting Next Speaker in Multi-Party Meetings. (International Conference on Multimodal Interaction (ICMI 2015)).
  3. Sonoda, J. & Otsuka, K. (2015). The intensity of the expression of Lutheran faith in "Herr, wenn ich nur dich habe" (SWV 280) by H. Schütz: A comparison with D. Buxtehude (BuxWV 38) based on the auditory sound spectrum. (Internationaler Kongress für Kirchenmusik).
  4. Imaizumi, S. & Asai, T. (2015). Time Contraction during Delayed Visual Feedback of Hand Action. (The European Conference on Visual Perception (ECVP 2015)).
  5. Imaizumi, S. & Asai, T. (2015). Visuomotor temporal recalibration leads to sense of agency but not to body-ownership. (16th International Multisensory Research Forum (IMRF 2015)).
  6. Asai, T. (2015). Between self and other in motor control. (16th International Multisensory Research Forum (IMRF 2015)).
  7. Kawano, H. (2015). Complete Reconfiguration Algorithm for Sliding Cube-shaped Modular Robots with only Sliding Motion Primitive. (2015 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS 2015)).
  8. Yoneya, M., Furukawa, S. & Kashino, M. (2015). Potential use of Microsaccade in Personal Identification. (37th Annual International Conference of the IEEE Engineering in Medicine and Biology Society).
  9. Takeuchi, T., Yoshimoto, S., Shimada, Y., Kochiyama, T. & Kondo, H. M. (2015). Individual differences in visual motion perception and the associated excitatory and inhibitory neurotransmitter concentrations in the brain. (OSA Fall Vision Meeting 2015).
  10. Kondo, H. M. (2015). Sensory-perceptual transformations for auditory scene analysis. (The 9th International Conference on Complex Medical Engineering (CME 2015)).
  11. Liao, H. I., Shimojo, S. & Kashino, M. (2015). Correspondence between pupillary response and facial attractiveness. (Asia-Pacific Conference on Vision).
  12. Liao, H. I., Kidani, S., Yoneya, M., Kashino, M. & Furukawa, S. (2015). Correspondence between pupillary dilation response and subjective rating of sound salience. (Association for Research in Otolaryngology (ARO) 38th Annual MidWinter Meeting.).
  13. Hiroya, S., Jasmin, K., Evans, S., Krishnan, S., Ostarek, M., Boebinger, D. & Scott, S.K. (2015). Effects of speaking rhythm naturalness on the neural basis of speech perception. (Neuroscience 2015).
  14. Amemiya, T. (2015). Perceptual Illusions for Multisensory Displays. (The 22nd International Display Workshops (IDW '15)).
  15. Amemiya, T., Beck, B., Gomi, H. & Haggard, P. (2015). Getting a feeling for where things are going: human perception of tactile motion. (The International Association for the Study of Affective Touch: Inaugural Congress).
  16. Shimabukuro, S., Kato, S., Ikei, K., Hirota, K., Amemiya, T. & Kitazaki, M. (2015). Characteristics of virtual walking sensation created by a 3-dof motion seat. (IEEE Virtual Reality Conference 2015 (VR 2015)).
  17. Okuya, Y., Ikei, K., Amemiya, T. & Hirota, K. (2015). Third person's footsteps enhanced moving sensation of seated person. (IEEE Virtual Reality Conference 2015 (VR 2015)).
  18. Ikei, Y., Shimabukuro, S., Kato, K., Okuya, Y., Hirota, K., Kitazaki, M. & Amemiya, T. (2015). Five senses theatre project: Sharing experiences through bodily ultra-reality. (IEEE Virtual Reality Conference 2015 (VR 2015)).
  19. Ikei, Y., Kato, S., Komase, K., Shimabukuro, S., Hirota, K., Amemiya, T. & Kitazaki, M. (2015). Experience simulator for the digital museum. (17th International Conference on Human-Computer Interaction (HCI International 2015)).
  20. Gomi, H., Amemiya, T., Takamuku, S. & Ito, S. (2015). Mechanisms of illusory continuous force sensation induced by asymmetric vibration: A computational approach to sensory processing. (NEUROSCIENCE 2015).
  21. Amemiya, T., Beck, B., Gomi, H. & Haggard, P. (2015). TMS over V5/hMT+ disrupts tactile direction discrimination. (NEUROSCIENCE 2015).
  22. Teshima, T, Takamuku, S, Amemiya, T & Gomi, H (2015). Light touch on pillar array surface greatly improves directional perception induced by asymmetric vibration. (SIGGRAPH ASIA 2015 workshop on haptic media and contents design.).
  23. Ito, S. & Gomi, H. (2015). Mirror-reversed visual feedback reduces quick somatomotor response evoked by mechanical perturbation. (The Society for Neuroscience 45th Annual Meeting).
  24. Kidani, S., Liao, H. I., Yoneya, M., Kashino, M. & Furukawa, S. (2015). Effect of distractor saliency on amplitude-modulation detection task. (The Association for Research in Otolaryngology 38th Annual MidWinter Meeting).
  25. Otsuka, S., Tsuzaki, M., Sonoda, J. & Furukawa, S. (2015). Effects of Short-duration instrument practice on the auditory peripheral functions of violin players. (38th ARO Midwinter Meeting).
  26. Kumano, S., Otsuka, K., Ishii, R. & Yamato, J. (2015). Automatic Gaze Analysis in Multiparty Conversations based on Collective First-Person Vision. (International Workshop on Emotion Representation, Analysis and Synthesis in Continuous Time and Space (EmoSPACE 2015)).
  27. Furukawa, S., Onikura, S., Kidani, S., Kato, M. & Kitagawa, N. (2015). An objective measure of auditory detection threshold based on a light-synchronized tapping task. (Association for Research in Otolaryngology (ARO) 38th Annual MidWinter Meeting).
  28. Ishii, R., Otsuka, K., Kumano, S. & Yamato, J. (2015). Predicting Next Speaker Based on Head Movement in Multi-Party Meetings. (IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP 2015)).
  29. Oohashi, H., Hiroya, S. & Mochida, T. (2015). Real-time formant tracking system using a phase equalization-based autoregressive exogenous model. (ICASSP2015).
  30. Kitagawa, N., Mochida, T. & Kitamura, M. (2015). The McGurk effect does not depend on perceptual synchrony between audiovisual signals. (IMRF 2015, 16th International Multisensory Research Forum).
  31. Maruya, K., Watanabe, J., Takahashi, H. & Hashida, S. (2015). A learning system utilizing learners' active tracing behaviors. (The 5th International Conference on Learning Analytics and Knowledge Conference).
  32. Lin, I.F., Mochida, T., Asada, K., Ayaya, S., Kumagaya, R. & Kato, M. (2015). Atypical audio-vocal system regulation in autism spectrum disorder. (Association for Research in Otolaryngology Annual MidWinter Meeting).
  33. Terashima, H. & Okada, M. (2015). A Computational Model for the Disorganized Tonotopy and Pitch Cells of the Auditory Cortex: Analogy with the Visual Cortex. (ARO The 38th Annual MidWinter Meeting).
  34. Takamuku, S. & Gomi, H. (2015). Seeing the manipulated object improves temporal estimation of its inertial force. (Winter workshop on mechanism of brain and mind).
  35. Kumano, S., Otsuka, K., Ishii, R. & Yamato, J. (2015). Collaborative First-Person Vision: Calibration-Free Gaze Analysis in Social Interaction. (IEEE Int'l Conf. Automatic Face and Gesture Recognition(FG2015)).
  36. Kawabe, Takahiro & Nishida, Shin'ya (2015). Seeing transparent liquids from refraction-based image deformation and specular reflection. Vision Sciences Society 15th Annual Meeting (VSS2015). St. Pete Beach (Florida, US).
  37. Kawabe, T., Sawayama, M. & Nishida, S. (2015). Deformation lamps: a projection technique to make a static picture dynamic. ACM SIGGRAPH 2015 Emerging Technologies.
  38. Sawayama, M. & Nishida, S. (2015). Visual perception of surface wetness. Vision Sciences Society 15th Annual Meeting (VSS2015). Florida (USA).
  39. Nishida, S., Sawayama, M. & Shimokawa, T. (2015). Material-dependent shape distortion by local intensity order reversal. Vision Sciences Society 15th Annual Meeting (VSS2015). Florida (USA).
  40. Morinaga, Sayo & Watanabe, Junji (2015). Toward tactile contents design using laser engraving machines. SIGGRAPH ASIA 2015. Kobe, JP.
  41. Suzuki, Yasuhiro, Suzuki, Rieko, Watanabe, Junji & Sakurazawa, Shigeru (2015). Haptic vibration for hands and bodies. SIGGRAPH ASIA 2015. Kobe, JP.
  42. Yokosaka, Takumi, Kuroki, Scinob, Watanabe, Junji & Nishida, Shin'ya (2015). Predictability of tactile evaluation from exploratory movement in free-touch and task-dependent situations. IEEE World Haptics2015. Chicago, US.
  43. Kuroki, Scinob, Hagura, Nobuhiro, Nishida, Shin'ya, Patrick Haggard & Watanabe, Junji (2015). Asian spice sets fingers trembling. IEEE World Haptics2015. Chicago, US.
  44. Tanaka, Yoshihiro, Watanabe, Junji & Okamoto, Shogo (2015). Toward Comfortable Texture Design. IEEE World Haptics2015. Chicago, US.

Misc

  1. Furukawa, S., Yamagishi, S., Liao, H.-I., Yoneya, M., Otsuka, S. & Kashino, M. (2015). Biological Measures that Reflect Auditory Perception. NTT Technical Review, 13.

2014

Journal articles

  1. Abekawa, N. & Gomi, H. (2014). Understanding the coordination mechanisms of gaze and arm movements. NTT Technical Review, 12 (7), 1-8.
  2. Takano, Y. & Ukezono, M. (2014). An experimental task to examine the mirror system in rats. Scientific Reports, 4, 6652.
  3. Sugimori, E. & Asai, T. (2014). Attribution of movement: Potential links between subjective reports of agency and output monitoring. The Quarterly Journal of Experimental Psychology, 68 (5), 900-916.
  4. Imaizumi, S., Asai, T., Kanayama, N., Kawamura, M. & Koyama, S. (2014). Agency over a phantom limb and electromyographic activity on the stump depend on visuomotor synchrony: a case study. Frontiers in Human Neuroscience, 8, 545.
  5. Gomi, H., Sakurada, T. & Fukui, T. (2014). Lack of motor prediction, rather than perceptual conflict, evokes an odd sensation upon stepping onto a stopped escalator. Frontiers in Behavioral Neuroscience, 8 (77).
  6. Asai, T. (2014). Illusory body-ownership entails automatic compensative movement: for the unified representation between body and action. Experimental Brain Research, 233 (3), 777-785.
  7. Nakashima, S.F., Morimoto, Y., Takano, Y., Yoshikawa, S. & Hugenberg, K. (2014). Faces in the dark: interactive effects of darkness and anxiety on the memory for threatening faces. Frontiers in Psychology, 5, 1091.
  8. Bergman, P., Ho, H.-N., Koizumi, A., Tajadura-Jiménez, A. & Kitagawa, N. (2014). The pleasant heat? Evidence for thermal-emotional implicit associations occurring with semantic and physical thermal stimulation. Cognitive Neuroscience, 6 (1), 24-30.
  9. Abekawa, N., Inui, T. & Gomi, H. (2014). Eye-hand coordination in on-line visuomotor adjustments. Neuroreport, 25 (7), 441-445.
  10. Fujisaki, W., Goda, N., Motoyoshi, I., Komatsu, H. & Nishida, S. (2014). Audio-visual integration in the human perception of materials. Journal of Vision, 14, 1-20.
  11. Ho, H.-N., Van Doorn, G.H., Kawabe, T., Watanabe, J. & Spence, C. (2014). Colour-temperature correspondences: When reactions to thermal stimuli are influenced by colour. PLoS One, 9 (3), e91854.
  12. Ho, H.-N., Iwai, D., Yoshikawa, Y., Watanabe, J. & Nishida, S. (2014). Combining colour and temperature: A blue object is more likely to be judged as warm than a red object. Scientific Reports, 4, 5527.
  13. Otsuka, S., Furukawa, S., Yamagishi, S., Hirota, K. & Kashino, M. (2014). Inter-individual variation of sensitivity to frequency modulation: Its relation with click-evoked and distortion-product otoacoustic emissions. Journal of the Association for Research in Otolaryngology, 15 (2), 175-186.
  14. Altmann, C. F., Terada, S., Kashino, M., Goto, K., Mima, T., Fukuyama, H. & Furukawa, S. (2014). Independent or integrated processing of interaural time and level differences in human auditory cortex?. Hearing Research, 312, 121-127.
  15. Kumano, S., Otsuka, K., Matsuda, M. & Yamato, J. (2014). Analyzing Perceived Empathy based on Reaction Time in Behavioral Mimicry. IEICE Trans., E97-D (8), TBA.
  16. Kondo, H. M., Toshima, I., Pressnitzer, D. & Kashino, M. (2014). Probing the time course of head-motion cues integration during auditory scene analysis. Frontiers in Neuroscience, 8 (170), 1-7.
  17. Ochi, A., Yamasoba, T. & Furukawa, S. (2014). Factors that account for inter-individual variability of lateralization performance revealed by correlations of performance among multiple psychoacoustical tasks. Frontiers in Neuroscience, 8, 27.
  18. Otsuka, K., Furuyama, N. & Bono, M. (2014). The Future of Idobata Kaigi ("Congregation at the Well"): Realizing Natural Conversations with Distance Access. NII Today, 48, 006-007.
  19. Masuda, Ayako, Watanabe, Junji, Terao, Masahiko, Yagi, Akihiro & Maruya, Kazushi (2014). A temporal window for estimating surface brightness in the Craik-O'Brien-Cornsweet effect. Frontiers in Human Neuroscience, 8 (855), 1-11.
  20. Aruga, Reiko, Saito, Hideo, Ando, Hideyuki & Watanabe, Junji (2014). Two-dimensional grouping affects perisaccadic perception of depth and synchrony. Perception, 43 (6), 589-594.

Conferences

  1. Liao, H.I., Yoneya, M., Kidani, S., Kashino, M. & Furukawa, S. (2014). Human pupil dilation responses to auditory stimulations: Effects of stimulus property, context, probability, and voluntary attention. (37th ARO Midwinter Meeting).
  2. Ishii, R. (2014). Analyzing Multi-party Turn-taking Mechanism and Predicting Next Speaking. (ACM International Conference on Multimodal Interaction (ICMI 2014), Workshop on Multimodal, Multi-Party, Real-World Human-Robot Interaction).
  3. Suzuki, Y., Suzuki, R. & Watanabe, J. (2014). Transformation from text to touch - Touching a “Japanese old tale”. (CHI2014).
  4. Gomi, H. (2014). Implicit visual and motor processing in manual following responses. (Neuroscience2014).
  5. Takano, Y., Ukezono, M., Nakashima, S.F. & Fukasawa, S. (2014). Actor's performance of blinking. (1st World Congress on Facial Expression).
  6. Takano, Y., Nakashima, S.F. & Ukezono, M. (2014). Social learning and the cingulate cortex in rats. (9th Federation of European Neuroscience Societies).
  7. Kimura, T., Mochida, T., Ijiri, T. & Kashino, M. (2014). Real-time sonification of motor coordination to support motor skill learning in sports. (icSports 2014).
  8. Ikei, Y., Shimabukuro, S., Kato, S., Okuya, Y., Abe, K., Hirota, K. & Amemiya, T. (2014). Rendering of Virtual Walking Sensation by a Passive Body Motion. (Eurohaptics 2014).
  9. Shirama, A., Kato, N. & Kashino, M. (2014). Weak individualization of spontaneous eye movements in individuals with autism spectrum disorders. (37th European Conference on Visual Perception).
  10. Yamagishi, S., Ashihara, T., Otsuka, S., Furukawa, S. & Kashino, M. (2014). The frequency-following response reflects spontaneous perceptual switching in auditory streaming. (Frequency Following Response (FFR) Workshop).
  11. Yamagishi, S., Otsuka, S., Furukawa, S. & Kashino, M. (2014). Neural correlates of auditory streaming in human scalp potentials generated from the brainstem and thalamocortical auditory pathway. (37th ARO Midwinter Meeting).
  12. Otsuka, S., Furukawa, S., Yamagishi, S., Hirota, K. & Kashino, M. (2014). Inter-Individual variation of sensitivities to frequency modulation, amplitude modulation, and interaural-phase difference: Relation with click-evoked otoacoustic emissions. (Association for Research in Otolaryngology (ARO) 37th Annual MidWinter Meeting).
  13. Furukawa, S., Ikeda, S., Numata, R., Sugimoto, S. & Hirokawa, J. (2014). Modulation of the auditory-evoked potential by continuous laser irradiation: Effects of wavelength and induced temperature change. (37th ARO Midwinter Meeting).
  14. Ishii, R., Otsuka, K., Kumano, S. & Yamato, J. (2014). Analysis of Timing Structure of Eye Contact in Turn-changing. (Workshop on Eye Gaze in Intelligent Human Machine Interaction: Eye-Gaze and Multimodality (GazeIn 2014)).
  15. Ishii, R., Otsuka, K., Kumano, S. & Yamato, J. (2014). Analysis of Respiration for Prediction of "Who Will Be Next Speaker and When?" in Multi-Party Meetings. (International Conference on Multimodal Interaction (ICMI 2014)).
  16. Roseboom, W., Linares, D. & Nishida, S. (2014). Audio-visual asynchrony exposure changes sensitivity for temporal synchrony: adaptation in relative timing mirrors adaptation in vision. (Annual meeting of the International Multisensory Research Forum).
  17. Ohtani, T. & Maruya, K. (2014). Using optical illusion patterns affixed to toy blocks for learning human errors in three-dimensional projections. (The 16th International Conference on Geometry & Graphics (ICGG 2014)).
  18. Nakashima, S.F., Ukezono, M., Nishida, H., Murata, A., Takano, Y. & Takahashi, N. (2014). Frequencies and types of 50kHz vocalization emission in feeding situation depend on social context in laboratory rats. (Ultrasonic communication in Rodents-2nd international workshop).
  19. Nakashima, S.F., Ukezono, M. & Takano, Y. (2014). Recognition of facial expression in laboratory rats. (1st World Congress on Facial Expression of Emotion).
  20. Ukezono, M. & Takano, Y. (2014). Development of an experimental task for rat's mirror system. (9th Federation of European Neuroscience Societies).
  21. Maruya, K. & Nishida, S. (2014). Adaptation to a non-uniform motion pattern reveals a mechanism to encode local flow changes. (The 10th Asia-Pacific Conference on Vision (APCV 2014)).
  22. Lin, I.F., Yamada, T., Nakamura, M., Watanabe, H., Takayama, Y., Iwanami, A., Kato, N. & Kashino, M. (2014). Selective listening in autism: The influence of informational masking. (International Meeting for Autism Research).
  23. Imaizumi, S., Asai, T., Kanayama, N., Kawamura, M. & Koyama, S. (2014). Visuo-motor timing and sense of agency in a phantom limb: a case study. (ECVP 2014).
  24. Hisakata, R. & Nishida, S. (2014). A directional bias of apparent position shift of a moving element with a hard edge. (The 10th Asia-Pacific Conference on Vision).
  25. Fujisaki, W., Goda, N., Motoyoshi, I., Komatsu, H. & Nishida, S. (2014). Optimal audiovisual integration of object appearance and impact sounds in human perception of materials. (Asia-Pacific Conference on Vision).
  26. Mikami, D., Matsumoto, A., Kimura, T., Ozawa, S. & Kojima, A. (2014). Visual feedback system for intuitive comprehension of self-movement and sensor data for effective motor learning. (2nd International Congress on Sports Science Research and Technology Support).
  27. Asai, T. (2014). Positive bias in agency judgment. (Association for the Scientific Study of Consciousness (ASSC18)).
  28. Amemiya, T. & Gomi, H. (2014). Buru-Navi3: Movement Instruction Using Illusory Pulled Sensation Created by Thumb-sized Vibrator. (ACM SIGGRAPH 2014).
  29. Kawabe, T., Maruya, K. & Nishida, S. (2014). Image deformation as a perceptual cue to a transparent layer. (The 10th Asia-Pacific Conference on Vision).
  30. Yoneya, M., Liao, H.I., Kidani, S., Furukawa, S. & Kashino, M. (2014). Auditory attention could affect the positioning-control of microsaccade. (the 37th Annual Meeting of the Japan Neuroscience Society).
  31. Liao, H.I., Kidani, S., Yoneya, M., Kashino, M. & Furukawa, S. (2014). Pupillary response reflects subjective salience of sound. (the 37th Annual Meeting of the Japan Neuroscience Society).
  32. Furukawa, S., Liao, H. I., Kidani, S., Yoneya, M. & Kashino, M. (2014). Evaluating the salience of auditory events through eyes. (7th Forum Acusticum Krakow 2014).
  33. Hisakata, R., Nishida, S. & Johnston, A. (2014). No motion-induced sensitivity modulation for chromatic gratings. (Vision Sciences Society 14th Annual Meeting (VSS2014)).
  34. Maruya, K. & Nishida, S. (2014). Adaptation to a non-uniform motion pattern reveals a mechanism to encode local flow changes. (Vision Sciences Society 14th Annual Meeting (VSS2014)).
  35. Sawayama, M. & Nishida, S. (2014). Discrimination of highlights from reflectance changes using isophote maps of surface images. (Vision Sciences Society 14th Annual Meeting (VSS2014)).
  36. Sawayama, M. & Nishida, S. (2014). Shape and material from intensity gradient: A hypothesis. (The 10th Asia-Pacific Conference on Vision).
  37. Kawabe, T., Maruya, K. & Nishida, S. (2014). What do human observers see in dynamic image deformation? (Vision Sciences Society 14th Annual Meeting (VSS2014)).
  38. Amemiya, T. & Gomi, H. (2014). Distinct pseudo-attraction force sensation by a thumb-sized vibrator that oscillates asymmetrically. (Eurohaptics 2014).
  39. Ikei, Y., Okuya, Y., Shimabukuro, S., Abe, K., Amemiya, T. & Hirota, K. (2014). To Relive a Valuable Experience of the World at the Digital Museum. (In Proceedings of 16th International Conference on Human-Computer Interaction (HCI International 2014)).
  40. Ishii, R., Otsuka, K., Kumano, S. & Yamato, J. (2014). Analysis and Modeling of Next Speaking Start Timing based on Gaze Behavior in Multi-party Meetings. (IEEE International Conference on Acoustics, Speech, and Signal Processing (ICASSP)).
  41. Kitamura, M., Watanabe, K. & Kitagawa, N. (2014). Effects of positive emotion on audiovisual integration. (International Multisensory Research Forum 2014).
  42. Takamuku, S. & Gomi, H. (2014). Fine-tuned force control under consistent visual feedback of object motion. (Annual meeting of the Society for Neuroscience (Neuroscience2014)).
  43. Yamagishi, S., Ashihara, T., Otsuka, S., Furukawa, S. & Kashino, M. (2014). Auditory brainstem and thalamo-cortical evoked responses that correlate perceptual switching on auditory streams. (2014 ICME International Conference on Complex Medical Engineering (CME2014)).
  44. Kitagawa, N. & Kitamura, M. (2014). The stream-bounce illusion depends on subjective audiovisual simultaneity. (15th International Multisensory Research Forum).
  45. Tajadura Jimenez, A., Deroy, O., Marquardt, T., Bianchi-Berthouze, N., Asai, T., Kimura, T. & Kitagawa, N. (2014). Auditory-tactile induced changes in represented leg height when dropping a ball. (15th International Multisensory Research Forum).
  46. Liao, H. I., Yoneya, M., Kidani, S., Kashino, M. & Furukawa, S. (2014). Can pupil dilation response be a marker for auditory salience?. (2014 International Conference on Complex Medical Engineering (CME2014)).
  47. Asai, T. (2014). Agency in sensorimotor dynamics. (IMRF 2014).
  48. Kihara, K., Takeuchi, T., Yoshimoto, S., Kondo, H.M. & Kawahara, J.I. (2014). The locus coeruleus-noradrenaline system facilitates attentional processing of action-triggered visual stimuli. (Vision Sciences Society 14th Annual Meeting (VSS2014)).
  49. Kidani, S., Liao, H.I., Yoneya, M., Furukawa, S. & Kashino, M. (2014). Deriving the “Salience Level” of a Target Sound using a Tapping Technique. (Association for Research in Otolaryngology (ARO) 37th Annual MidWinter Meeting).
  50. Yoneya, M., Liao, H.I., Kidani, S., Furukawa, S. & Kashino, M. (2014). Sounds in Sequence Modulate Dynamic Characteristics of Microsaccades. (Association for Research in Otolaryngology (ARO) 37th Annual MidWinter Meeting).
  51. Liao, H.I., Shimojo, S. & Kashino, M. (2014). Pupil constriction during visual preference decision. (Vision Sciences Society 14th Annual Meeting (VSS2014)).
  52. Ishii, R., Ozawa, S., Kawamura, H., Kojima, A. & Nakano, Y. (2014). Evaluation of Window Interface in Remote Collaboration Involving Pointing Gestures. (International Conference on Advances in Computer-Human Interactions(ACHI)).
  53. Ito, Sho & Gomi, Hiroaki (2014). Sensory mismatch decreases stretch reflex amplitude in mirrored visual tracking task. The 24th Annual Conference of the Japanese Neural Network Society.
  54. De Havas, Jack, Ito, Sho, Gomi, Hiroaki & Haggard, Patrick (2014). Control mechanisms underlying an involuntary movement: the effect of resistive and assistive perturbation on the Kohnstamm phenomenon. The Society for Neuroscience 44th Annual Meeting.
  55. Ooishi, Yuuki & Sato, Takashi (2014). Sound presentation during different respiration phases alters the sound-induced sympathetic tone. 9th FENS Forum of Neuroscience. Milan (Italy).
  56. Watanabe, Ken, Ooishi, Yuuki & Kashino, Makio (2014). The entrainment of heart rate to acoustic tempo. 9th FENS Forum of Neuroscience. Milan (Italy).
  57. Watanabe, Junji, Hirahara, Seiichiro, Maeda, Taro & Ando, Hideyuki (2014). Interactive Technologies with Applied Perception. International Display Workshops 2014. Niigata, JP.
  58. Ando, Hideyuki, Hirahara, Seiichiro, Maeda, Taro & Watanabe, Junji (2014). Slit-based Light Field 3D Display. SIGGRAPH 2014. Vancouver, CA.
  59. Kagitani, Tatsuki, Goto, Mao, Watanabe, Junji & Sakamoto, Maki (2014). Sound Symbolic Relationship between Onomatopoeia and Emotional Evaluations in Taste. CogSci 2014. Quebec City, CA.
  60. Tsuboi, Hiroki, Inoue, Makoto, Kuroki, Scinob, Mochiyama, Hiromi & Watanabe, Junji (2014). Roughness perception of micro-particulate plate: a study on two-size-mixed stimuli. Eurohaptics 2014. Paris, FR.
  61. Kuroki, Scinob, Watanabe, Junji & Nishida, Shin'ya (2014). Vibrotactile frequency discrimination performance with cross-channel distractors. Eurohaptics 2014. Paris, FR.
  62. Doizaki, Ryuichi, Watanabe, Junji & Sakamoto, Maki (2014). A System Evaluating Tactile Feelings Expressed by Sound Symbolic Words. Eurohaptics 2014. Paris, FR.
  63. Suzuki, Yasuhiro, Suzuki, Rieko & Watanabe, Junji (2014). Transformation from text to touch - Touching a “Japanese old tale”. CHI Conference on Human Factors in Computing Systems 2014. Toronto, CA.
  64. Ando, Hideyuki & Watanabe, Junji (2014). Implementation of the multi-slit stereoscopic display -Device design-. ASIAGRAPH Forum 2014. Bali, ID.

2013

Journal articles

  1. Gomi, Hiroaki, Abekawa, Naotoshi & Shimojo, Shinsuke (2013). The hand sees visual periphery better than the eye: motor-dependent visual motion analyses. The Journal of Neuroscience, 33 (42), 16502-9.
  2. Roseboom, W., Kawabe, T. & Nishida, S. (2013). The cross-modal double flash illusion depends on featural similarity between cross-modal inducers. Scientific Reports, 3, 3437.
  3. Linares, D. & Nishida, S. (2013). A synchronous surround increases the motion strength gain of motion. Journal of Vision, 13, 1-15.
  4. Roseboom, W., Kawabe, T. & Nishida, S. (2013). Audio-visual temporal recalibration can be constrained by content cues regardless of spatial overlap. Frontiers in Psychology, 4, 189.
  5. Yamada, Y., Kawabe, T. & Miyazaki, M. (2013). Pattern randomness aftereffect. Scientific Reports, 3, 2906.
  6. Yamada, Y. & Kawabe, T. (2013). Gaze-cueing of attention distorts visual space. Universitas Psychologica, 12 (5), 1501-1510.
  7. Kawabe, T. (2013). Inferring sense of agency from the quantitative aspect of action outcome. Consciousness and Cognition, 22 (2), 407-412.
  8. Kawabe, T. (2013). Side effect of acting on the world: Acquisition of action-outcome statistic relation alters visual interpretation of action outcome. Frontiers in Human Neuroscience, 7, 610.
  9. Kawabe, T., Roseboom, W. & Nishida, S. (2013). The sense of agency is action-effect causality perception based on cross-modal grouping. Proceedings of the Royal Society B: Biological Sciences, 280 (1763), 20130991.
  10. Amemiya, T. & Gomi, H. (2013). Directional Torque Perception with Brief, Asymmetric Net Rotation of a Flywheel. IEEE Transactions on Haptics, 6 (3), 370-375.
  11. Hiroya, S. (2013). Non-negative temporal decomposition of speech parameters by multiplicative update rules. IEEE Transactions on Audio, Speech, and Language Processing, 21 (10), 2108-2117.
  12. Hiroya, S. (2013). Speaking rhythm extraction and control by non-negative temporal decomposition. NTT Technical Review, 11 (12).
  13. Koizumi, A., Kitagawa, N., Kondo, H. M., Kitamura, M.S., Sato, T. & Kashino, M. (2013). Serotonin transporter gene-linked polymorphism affects detection of facial expressions. PLOS ONE, 8, e59074.
  14. Mochida, T., Kimura, T., Hiroya, S., Kitagawa, N., Gomi, H. & Kondo, T. (2013). Speech misperception: speaking and seeing interfere differently with hearing. PLOS ONE, 8 (7), e68619.
  15. Furukawa, S., Nishida, T., Kondo, T. & Kakehi, K. (2013). Insensitivity to the coherence of interaural-time-difference modulation across frequency channels. Acoust. Sci. Tech., 34, 397-403.
  16. Ishii, R., Nakano, Y. & Nishida, T. (2013). Gaze Awareness in Conversational Agents: Estimating User's Conversational Engagement using Eye-gaze. ACM Transactions on Interactive Intelligent Systems (special issue on interaction with smart objects; special section on eye gaze and conversation), 3 (2), 11.
  17. Liao, H.I., Wu, D.A., Halelamien, N. & Shimojo, S. (2013). Cortical stimulation consolidates and reactivates visual experience: Neural plasticity from magnetic entrainment of visual activity. Scientific Reports, 3, 2228.
  18. Liao, H.I., Shimojo, S. & Yeh, S.L. (2013). Happy faces are preferred regardless of familiarity - Sad faces are preferred only when familiar. Emotion, 13 (3), 391-396.
  19. Yamada, Y. & Kawabe, T. (2013). Localizing non-retinotopically moving objects. PLoS One, 8 (1).
  20. Edwards, M., Cassnello, R.C., Badcock, D.R. & Nishida, S. (2013). Effect of form cues on 1D and 2D motion pooling. Vision Research, 76, 94-104.
  21. Maruya, K., Holcombe, A.O. & Nishida, S. (2013). Rapid encoding of relationships between spatially remote motion signals. Journal of Vision, 13(2) (4), 1-20.
  22. Roseboom, W., Kawabe, T. & Nishida, S. (2013). Direction of visual apparent motion driven by perceptual organisation in cross-modality signals. Journal of Vision, 13(1) (6), 1-13.
  23. Furukawa, S., Washizawa, S., Ochi, A. & Kashino, M. (2013). How independent are the pitch and the interaural-time-difference mechanisms that rely on temporal fine structure information?. Basic Aspects of Hearing: Advances in Experimental Medicine and Biology, 787, 91-99.
  24. Hosokawa, K., Maruya, K. & Sato, T. (2013). Temporal characteristics of depth perception from motion parallax. Journal of Vision, 13(1) (16), 1-8.
  25. Kato, M. & Konishi, U. (2013). Where and how infants look: The development of fixations and scan paths in face perception. Infant Behavior and Development: An International and Interdisciplinary Journal, 36 (1), 32-41.
  26. Seno, T., Kawabe, T., Ito, H. & Sunaga, S. (2013). Vection modulates emotional valence of autobiographical episodic memories. Cognition, 126, 115-120.
  27. Liao, H.-I. & Yeh, S.-L. (2013). Capturing attention is not that simple: Different mechanisms for stimulus-driven and contingent capture. Attention, Perception, & Psychophysics, 75, 1703-1714.
  28. Kuroki, Scinob, Watanabe, Junji & Nishida, Shin'ya (2013). Contribution of within- and cross-channel information to vibrotactile frequency discrimination. Brain Research, 1529, 46-55.
  29. Ando, Hideyuki, Yoshida, Tomofumi & Watanabe, Junji (2013). "SaveYourSelf !!!" - An externalized sense of balance. Leonardo, 46 (3), 286-287.
  30. Watanabe, Junji (2013). Pseudo-haptic sensation elicited by background visual motion. ITE Transactions on Media Technology and Applications, 1 (2), 199-202.

Conferences

  1. Liao, H.I., Yang, Y.H. & Yeh, S.L. (2013). Unconscious feature binding of color and orientation. (Neuro2013).
  2. Nishida, S. (2013). Hierarchical processing of motion information by human vision. (The Kyoto University and University College London Workshop on the Perception of Motion and Pattern 2013.).
  3. Maruya, K., Kawabe, T. & Nishida, S. (2013). Material from motion - Human perception of fluid properties from motion vector fields. (Vision Science Society 2013 (VSS2013)).
  4. Kawabe, T., Maruya, K. & Nishida, S. (2013). Transparent liquid impression: Evidence for a tuning to spatiotemporal frequencies of motion vectors. (PRISM2: The science of light and shade workshop).
  5. Kawabe, T. (2013). Non-retinotopic motion processing underlies postdictive appearance modulation. (European Conference on Visual Perception (ECVP2014)).
  6. Amemiya, T., Hirota, K. & Ikei, Y. (2013). Tactile Apparent Motion Presented from Seat Pan Facilitates Racing Experience. (In Proceedings of 15th International Conference on Human-Computer Interaction (HCI International 2013)).
  7. Sasaki, T., Hirota, K., Amemiya, T. & Ikei, Y. (2013). Train Ride Simulation using Assist Strap Device. (In Proceedings of 15th International Conference on Human-Computer Interaction (HCI International 2013)).
  8. Hirota, K., Ito, Y., Amemiya, T. & Ikei, Y. (2013). Presentation of Odor in Multi-sensory Theater. (In Proceedings of 15th International Conference on Human-Computer Interaction (HCI International 2013)).
  9. Ikei, Y., Abe, K., Masuda, Y., Okuya, Y., Amemiya, T. & Hirota, K. (2013). Virtual Experience System for a Digital Museum. (In Proceedings of 15th International Conference on Human-Computer Interaction (HCI International 2013)).
  10. Ito, S., Kimura, T. & Gomi, H. (2013). State estimation model explains perceived fatigue modulation by preceding and delayed visual feedback. (The Society for Neuroscience 2013, 43rd Annual Meeting).
  11. Mochida, T. & Gomi, H. (2013). Speech adaptation under delayed auditory feedback environment affects speech perception in an articulator-specific manner. (Society for Neuroscience 2013).
  12. Takamuku, S. & Gomi, H. (2013). Impacts of delayed visual feedback and visual collision on perception of resistive force applied during reaching movements. (Annual meeting of the Society for Neuroscience (Neuroscience2013)).
  13. Kumano, S., Otsuka, K., Matsuda, M., Ishii, R. & Yamato, J. (2013). Using A Probabilistic Topic Model to Link Observers' Perception Tendency to Personality. (The fifth biannual Humaine Association Conference on Affective Computing and Intelligent Interaction (ACII 2013)).
  14. Sanchez-Cortes, D., Biel, J., Kumano, S., Yamato, J., Otsuka, K. & Gatica-Perez, D. (2013). Inferring mood in ubiquitous conversational video. (The 12th International Conference on Mobile and Ubiquitous Multimedia (MUM '13 )).
  15. Ooishi, Y., Mukai, H., Watanabe, K., Kawato, S. & Kashino, M. (2013). The effect of the tempo of music on the autonomic and endocrinological systems. (UCL-NTT meeting).
  16. Koizumi, A., Shirama, A. & Kitagawa, N. (2013). The expressers' gaze distance affects perception of facial expressions. (International Society for Research on Emotion).
  17. Shirama, A., Koizumi, A. & Kitagawa, N. (2013). Your eye movements tell who you are. (European Conference on Visual Perception).
  18. Altmann, C. F., Terada, S., Kashino, M., Mima, T., Fukuyama, H. & Furukawa, S. (2013). Independent or integrated processing of interaural time and level differences in human auditory cortex?. (Neuroscience2013).
  19. Kawano, H. (2013). Effect of Virtual Work braking on Distributed Multi-Robot Reinforcement Learning. (2013 IEEE International Conference on Systems, Man, and Cybernetics).
  20. Toshima, I., Kondo, H.M., Pressnitzer, D. & Kashino, M. (2013). Evaluating the effect of head motion on auditory streaming using an acoustical telepresence robot: TeleHead. (Final Symposium on JST-ANR Binaural Active Audition for Humanoid Robots).
  21. Lin, I.F., Kashino, M., Ohta, H., Yamada, T., Watanabe, H., Kanai, C., Tani, M., Ohno, T., Ichihashi, K., Takayama, Y., Iwanami, A. & Kato, N. (2013). The Effect of Oxytocin On Sympathetic Responses While Listening to Emotional Sounds in Autism. (Annual meeting of International Society for Autism Research).
  22. Ishii, R., Otsuka, K., Kumano, S., Matsuda, M. & Yamato, J. (2013). Predicting Next Speaker and Timing from Gaze Transition Patterns in Multi-Party Meetings. (ACM International Conference on Multimodal Interaction).
  23. Otsuka, K., Kumano, S., Ishii, R., Zbogar, M. & Yamato, J. (2013). MM+Space: n x 4 Degree-of-Freedom Kinetic Display for Recreating Multiparty Conversation Spaces. (ACM International Conference on Multimodal Interaction).
  24. Otsuka, K. (2013). Conversation Scene Analysis and Reconstruction. (UCL-NTT Collaboration “Deep Brain Communication” Project 2nd Meeting).
  25. Wang, Q., Motoyoshi, I. & Nishida, S. (2013). Characterization of high-level image features for surface gloss perception. (Vision Science Society 2013 (VSS2013)).
  26. Motoyoshi, I. (2013). Visual aftereffects in natural object categories. (Vision Science Society 2013 (VSS2013)).
  27. Kamachi, M., Ishii, T. & Motoyoshi, I. (2013). Voluntary disattention facilitates global motion detection. (Vision Science Society 2013 (VSS2013)).
  28. Nakayama, R., Motoyoshi, I. & Sato, T. (2013). Motion pop-out is determined by extra-retinal coordinate. (Vision Science Society 2013 (VSS2013)).
  29. Kanaya, S., Fujisaki, W., Nishida, S., Furukawa, S. & Yokosawa, K. (2013). Comparisons of temporal frequency limits for cross-attribute binding tasks in vision and audition. (Vision Science Society 2013 (VSS2013)).
  30. Amano, K., Qi, L., Takeda, T. & Nishida, S. (2013). Neural correlates of time marker for simultaneity judgment. (Vision Science Society 2013 (VSS2013)).
  31. Kawabe, T., Maruya, K. & Nishida, S. (2013). Seeing transparent liquids from dynamic image distortion. (Vision Science Society 2013 (VSS2013)).
  32. Sato, H., Motoyoshi, I. & Sato, T. (2013). Perception of global trend from dynamic stimuli. (Vision Sciences Society Annual meeting).
  33. Linares, D., Motoyoshi, I. & Nishida, S. (2013). Facilitation of rapid motion perception by a static, but not dynamic, synchronous surround. (Vision Sciences Society Annual Meeting).
  34. Watanabe, J. & Sakamoto, M. (2013). Sound Symbolic Relationship between Onomatopoeias and Emotional Evaluations in Taste and Touch. (Iconicity: East meets West).
  35. Amemiya, T. & Maeda, T. (2013). Depth and Rate Estimation for Chest Compression CPR with Smartphone. (IEEE 8th Symposium on 3D User Interfaces (3DUI 2013)).
  36. Amemiya, T., Hirota, K. & Ikei, Y. (2013). Tactile Flow on Seat Pan Modulates Perceived Forward Velocity. (IEEE 8th Symposium on 3D User Interfaces (3DUI 2013)).
  37. Hirota, K., Ito, Y., Amemiya, T. & Ikei, Y. (2013). Generation of Directional Wind by Colliding Airflows. (World Haptics Conference (WHC) 2013).
  38. Amemiya, T. & Gomi, H. (2013). Camera pose estimation with a two-dimensional marker grid for haptic navigation. (IEEE Virtual Reality 2013 (VR 2013)).
  39. Amemiya, T., Hirota, K. & Ikei, Y. (2013). Perceived Forward Velocity Increases with Tactile Flow on Seat Pan. (IEEE Virtual Reality 2013 (VR 2013)).
  40. Furukawa, S., Nishida, T., Kondo, T. & Kakehi, K. (2013). Sensitivities to the relative phase of interaural-time-difference modulations between carrier frequencies. (36th ARO (Association for Research in Otolaryngology) MidWinter Meeting).
  41. Yamagishi, S., Ashihara, T., Otsuka, S., Furukawa, S. & Kashino, M. (2013). Neural Correlates of Auditory Streaming in the Human Brainstem. (36th ARO (Association for Research in Otolaryngology) MidWinter Meeting).
  42. Kashino, M., Furukawa, S., Nakano, T., Washizawa, S., Yamagishi, S., Ochi, A., Nagaike, A., Kitazawa, S. & Kato, N. (2013). Specific Deficits of Basic Auditory Processing in High-Functioning Pervasive Developmental Disorders. (36th ARO (Association for Research in Otolaryngology) MidWinter Meeting).
  43. Lin, I.F., Chait, M. & Kashino, M. (2013). The effect of visual cues on auditory segregation. (36th ARO (Association for Research in Otolaryngology) MidWinter Meeting).
  44. Otsuka, S., Yamagishi, S., Hirota, K., Furukawa, S. & Kashino, M. (2013). Relationship between middle-ear transmission characteristics and frequency modulation detection. (36th ARO (Association for Research in Otolaryngology) MidWinter Meeting).
  45. Masutomi, K., Barascud, N., Overath, T., Kashino, M., McDermott, J. & Chait, M. (2013). Recovering Sound Sources from Embedded Repetition And Directed Attention: Effect of Spectral Information on Sound Segregation. (36th ARO (Association for Research in Otolaryngology) MidWinter Meeting).
  46. Kawano, H. (2013). Hierarchical Sub-task Decomposition for Reinforcement Learning of Multi-robot Delivery Mission. (2013 IEEE International Conference on Robotics and Automation(ICRA2013)).
  47. Watanabe, Junji & Ho, Hsin-Ni (2013). Color-temperature association: A blue object is more likely to be judged as warm than a red object. Sound Symbolism Workshop 2013. Tokyo, JP.
  48. Watanabe, Junji (2013). Experience Design for Symbol Grounding and Evoking Concepts. 15th JSAI International Symposia on AI - SIGNAC Natural Computing Meets Computational Aesthetics. Yokohama, JP.
  49. Suzuki, Yasuhiro, Suzuki, Rieko, Watanabe, Junji & Akiba, Fuminori (2013). Tactile score, a method of describing the sense of touching. 5th International Congress of International Association of Societies of Design Research. Tokyo, JP.
  50. Sakamoto, Maki, Yoshino, Junya & Watanabe, Junji (2013). Development of tactile materials representing human basic tactile sensations. 5th International Congress of International Association of Societies of Design Research. Tokyo, JP.
  51. Kuroki, Scinob, Watanabe, Junji & Nishida, Shin'ya (2013). Synthesis of vibrotactile frequencies. 36th European Conference on Visual Perception (ECVP 2013). Bremen, DE.
  52. Ho, Hsin-Ni, Iwai, Daisuke, Yoshikawa, Yuki, Watanabe, Junji & Nishida, Shin'ya (2013). Effects of color on perceived temperature. 36th European Conference on Visual Perception (ECVP 2013). Bremen, DE.
  53. Suzuki, Yasuhiro, Suzuki, Rieko & Watanabe, Junji (2013). Picture books for tactual storytelling. SIGGRAPH 2013. Vancouver, CA.
  54. Ando, Hideyuki, Miyazaki, Youhei, Maeda, Taro & Watanabe, Junji (2013). Three-Dimensional Perception in Multi-Slit Vision for Public Display. IEEE: Workshop on Ambient Information Technologies. Orlando, US.
  55. Watanabe, Junji (2013). Time and frequency in touch - psychophysical approach. Computational Neuroscience Society 2013 Workshop “Early Touch: From Neural Coding to Haptic Space Geometry” (Organizers: Jonathan Platkiewicz, Vincent Hayward). Paris, FR.
  56. Watanabe, Junji & Sakamoto, Maki (2013). Sound symbolic relationship between onomatopoeia and emotional evaluations in touch and taste. Iconicity: East meets West 9th International Symposium on Iconicity in Language and Literature. Tokyo, JP.
  57. Ando, Hideyuki & Watanabe, Junji (2013). Geometric optics design of the multi-slit stereoscopic display - From the viewpoint of public use. ASIAGRAPH 2013. Hawai'i, US.
  58. Watanabe, Junji (2013). Visualization of relationships between tactile sensory qualities. IEEE World Haptics Conference 2013 Workshop “Quantification of Tactile Feelings: How can We Analyze, Measure, and Design Diverse Textures in Touch?”. Daejeon, KR.

2012

Journal articles

  1. Yamada, Y., Kawabe, T. & Miura, K. (2012). One's own name distorts visual space. Neuroscience Letters, 531 (2), 96-98.
  2. Kobayashi, M. & Kashino, M. (2012). Effect of flanking sounds on the auditory continuity illusion. PLoS ONE, 7 (12).
  3. Fuentes, C., Gomi, H. & Haggard, P. (2012). Temporal features of tendon vibration illusions. European Journal of Neuroscience, 36 (12), 3709-3717.
  4. Linares, D., Motoyoshi, I. & Nishida, S. (2012). Surround facilitation for rapid motion perception. Journal of Vision, 12.
  5. Amano, K., Takeda, T., Haji, T., Terao, M., Maruya, K., Matsumoto, K., Murakami, I. & Nishida, S. (2012). Human neural responses involved in spatial pooling of locally ambiguous motion signals. Journal of Neurophysiology, 107, 3493-3508.
  6. Lin, I.F. & Kashino, M. (2012). Perceptual grouping over time within and across auditory and tactile modalities. PLoS ONE, 7 (7), e41661.
  7. Constantino, F.C., Pinggera, L., Paranamana, S., Kashino, M. & Chait, M. (2012). Different sensitivity to appearing and disappearing objects in complex acoustic scenes. PLoS ONE, 7, e46167.
  8. Kato, Y.X., Furukawa, S., Samejima, K., Hironaka, N. & Kashino, M. (2012). Photosensitive-polyimide based method for fabricating various neural electrode architectures. Frontiers in Neuroengineering, 5, 11.
  9. Furukawa, S. (2012). Detection of simultaneous modulation of interaural time and level differences: Effects of modulation rate and relative phase (L). Journal of the Acoustical Society of America, 132, 1-4.
  10. Tajadura-Jiménez, A., Väljamäe, A., Toshima, I., Kimura, T., Tsakiris, M. & Kitagawa, N. (2012). Action sounds recalibrate perceived tactile distance. Current Biology, 22 (13), R516-R517.
  11. Kondo, H.M., Kitagawa, N., Kitamura, M.S., Koizumi, A., Nomura, M. & Kashino, M. (2012). Separability and commonality of auditory and visual bistable perception. Cerebral Cortex, 22, 1915-1922.
  12. Kondo, H.M., Pressnitzer, D., Toshima, I. & Kashino, M. (2012). Effects of self-motion on auditory scene analysis. Proceedings of the National Academy of Sciences of the United States of America, 109, 6775-6780.
  13. Kashino, M. & Kondo, H.M. (2012). Functional brain networks underlying perceptual switching: auditory streaming and verbal transformations. Philosophical Transactions of the Royal Society of London. Series B: Biological Sciences, 367, 977-987.
  14. Ooishi, Yuuki & Kashino, Makio (2012). Habituation of rapid sympathetic response to aversive timbre eliminated by change in basal sympathovagal balance. Psychophysiology, 49 (8), 1059-1071.
  15. Sato, Takashi G & Ooishi, Yuuki (2012). Sound presentation during different respiration phases alters the sound-induced vasoconstriction mediated by the sympathetic nerve. Neuroscience Letters, 521 (1), 67-70.
  16. Watanabe, Junji, Maeda, Taro & Ando, Hideyuki (2012). Gaze-contingent visual presentation technique with electro-ocular-graph-based saccade detection. ACM Transactions on Applied Perception, 9(2) (6), 1-12.
  17. Watanabe, Junji, Godai, Yusuke & Ando, Hideyuki (2012). Length and roughness perception in a moving-plateau touch display. Advances in Human-Computer Interaction, 2012 (74629).
  18. Kuroki, Scinob, Watanabe, Junji, Mabuchi, Kunihiko, Tachi, Susumu & Nishida, Shin'ya (2012). Directional remapping in tactile inter-finger apparent motion: A motion aftereffect study. Experimental Brain Research, 216 (2), 311-320.

Books/Chapters

  1. Liao, H.-I. & Shimojo, S. (2012). Dynamic preference formation via gaze and memory. Dolan, R. J. & Sharot, T. (Eds.), The neuroscience of preference and choice: Cognitive and Neural Mechanisms. (pp.277-292). Elsevier Publishing.

Conferences

  1. Otsuka, Y., Motoyoshi, I., Hill, H., Kobayashi, M., Kanazawa, S. & Yamaguchi, M. (2012). The role of contrast polarity of eyes on face recognition by 7- to 8-month-olds. (43rd NIPS International Symposium).
  2. Ishii, T., Motoyoshi, I. & Kamachi, M. (2012). Removal of attention facilitates global motion detection. (The 11th International Symposium on Advanced Technology (ISAT)).
  3. Ochi, A., Furukawa, S., Karino, S. & Yamasoba, T. (2012). Threshold of temporal gap detection in patients with sensorineural hearing loss and with functional hearing loss. (The First Asian Otology Meeting & The 3rd East Asian Symposium on Otology).
  4. Motoyoshi, I. (2012). Climate, illumination statistics, and the style of painting. (Visual Science of Art Conference (VSAC/ECVP2012)).
  5. Lin, I.F. & Kashino, M. (2012). Perceptual grouping over time within and across auditory and tactile modalities. (Annual Meeting of Society for Neuroscience (SfN, Neuroscience 2012)).
  6. Takamuku, S. & Gomi, H. (2012). Principle factor yielding the sluggish sensation during movements with delayed visual feedback. (Society for Neuroscience (SfN, Neuroscience 2012)).
  7. Ito, S., Kimura, T. & Gomi, H. (2012). Enhancement of fatigue perception and phase lag of cyclic movement induced by delayed visual feedback. (Society for Neuroscience (SfN, Neuroscience 2012)).
  8. Kimura, T., Haggard, P. & Gomi, H. (2012). Voluntary movement compensates sensory uncertainty. (Society for Neuroscience, Annual meeting (SfN, Neuroscience 2012)).
  9. Tajadura-Jimenez, A., Väljamäe, A., Toshima, I., Kimura, T., Tsakiris, M. & Kitagawa, N. (2012). Action sounds recalibrate perceived tactile distance. (The 16th annual meeting of the ASSC (ASSC16)).
  10. Kitamura, M., Watanabe, K. & Kitagawa, N. (2012). Moods alter audiovisual integration. (International Multisensory Research Forum (IMRF2012)).
  11. Tajadura-Jiménez, A., Väljamäe, A., Toshima, I., Kimura, T., Tsakiris, M. & Kitagawa, N. (2012). Action sounds recalibrate perceived tactile distance. (International Multisensory Research Forum (IMRF2012)).
  12. Roseboom, W., Kawabe, T. & Nishida, S. (2012). Content cues can constrain AV temporal recalibration regardless of spatial overlap. (European Conference on Visual Perception 2012 (ECVP2012)).
  13. Ho, H-N., Bergman, P., Koizumi, A., Tajadura-Jiménez, A. & Kitagawa, N. (2012). The pleasant heat? A study of thermal-emotion associations. (International Multisensory Research Forum (IMRF2012)).
  14. Ando, H., Watanabe, J. & Sato, M. (2012). Empathetic heartbeat. (ACM Multimedia 2012).
  15. Maruya, K., Uetsuki, M., Ando, H. & Watanabe, J. (2012). "Yu bi yomu": interactive reading of dynamic text. (ACM Multimedia 2012).
  16. Ikei, K., Abe, H., Hirota, K. & Amemiya, T. (2012). A multisensory VR system exploring the Ultra-reality. (18th International Conference on Virtual Systems and Multimedia (VSMM2012)).
  17. Motoyoshi, I. (2012). Broad spatial tunings of the object aftereffect: Evidence for global statistical representations of 3D shape and material. (European Conference on Visual Perception 2012 (ECVP2012)).
  18. Nishida, S., Fujisaki, W., Goda, N., Motoyoshi, I. & Komatsu, H. (2012). Not glass but plastic - Audiovisual integration in human material perception. (European Conference on Visual Perception 2012 (ECVP2012)).
  19. Shirama, A., Koizumi, A. & Kitagawa, N. (2012). Categorizing identity from biological motion of eyes. (European Conference on Visual Perception 2012 (ECVP2012)).
  20. Suzuki, Y., Watanabe, J. & Suzuki, R. (2012). Tactile Score, a knowledge media of tactile sense for creativity. (KES IIMSS 2012).
  21. Linares, D., Holcombe, A.O., Motoyoshi, I. & Nishida, S. (2012). Perceived timing of different features at surface formation. (APCV2012).
  22. Gomi, H., Amano, K. & Kimura, T. (2012). Spatial weighting of visual motion coding in manual following response and MEG. (European Conference on Visual Perception 2012 (ECVP2012)).
  23. Takano, Y., Ukezono, M., Takahashi, N. & Hironaka, N. (2012). Phase-locking of hippocampal theta during operant lever-press task in rats. (8th FENS Forum of Neuroscience).
  24. Amemiya, T. & Gomi, H. (2012). Active touch sensing of being pulled illusion for pedestrian route navigation. (SIGGRAPH 2012).
  25. Kuroki, S., Watanabe, J. & Nishida, S. (2012). Dissociation of vibrotactile frequency discrimination performances for supra-threshold and near-threshold vibrations. (EuroHaptics 2012).
  26. Kashino, M., Adachi, E. & Hirose, H. (2012). A computational model for the dynamic aspects of primitive auditory scene analysis. (Acoustics 2012).
  27. Nakayama, R., Motoyoshi, I., Kusano, T. & Sato, T. (2012). Spatial motion coordinates that determine perceptual dominance in binocular rivalry. (Vision Sciences Society Annual Meeting 2012).
  28. Motoyoshi, I. (2012). Visual aftereffects in 3D shape and material of a single object. (Vision Sciences Society Annual Meeting 2012).
  29. Kanaya, S., Fujisaki, W., Nishida, S. & Yokosawa, K. (2012). Temporal frequency limits for within- and cross-attribute binding in vision and audition. (Vision Sciences Society Annual Meeting 2012).
  30. Roseboom, W., Kawabe, T. & Nishida, S. (2012). Changes in visual apparent motion direction by cross-modal interaction are not dependent on temporal ventriloquism. (Vision Sciences Society Annual Meeting 2012).
  31. Kawabe, T., Maruya, K. & Nishida, S. (2012). The role of dynamic visual information in the estimation of liquid viscosity. (Vision Sciences Society Annual Meeting 2012).
  32. Maruya, K. & Nishida, S. (2012). Long-range relationship between separated local motion signals is rapidly encoded in a point-to-point manner. (Vision Sciences Society Annual Meeting 2012).
  33. Furukawa, S., Washizawa, S., Ochi, A. & Kashino, M. (2012). How independent are the pitch and the interaural-time-difference mechanisms that rely on temporal fine structure information? (16th International Symposium of Hearing 2012).
  34. Kondo, H.M., Pressnitzer, D., Toshima, I. & Kashino, M. (2012). Effect of source-motion and self-motion on the resetting of auditory scene analysis. (Acoustics 2012).
  35. Terao, Masahiko, Murakami, Ikuya & Nishida, Shin'ya (2012). Motion correspondence based on the perisaccadically compressed space. Vision Sciences Society 12th Annual Meeting (VSS2012).
  36. Kashino, Makio, Adachi, Eisaku & Hirose, Haruto (2012). A computational model for dynamic aspects of primitive auditory scene analysis. 16th International Symposium of Hearing 2012.
  37. Watanabe, Ken, Ooishi, Yuuki & Kashino, Makio (2012). The effects of respiratory rate and acoustic tempo on the autonomic nervous system. 8th FENS Forum of Neuroscience. Barcelona (Spain).
  38. Watanabe, Junji & Sakamoto, Maki (2012). Comparison between onomatopoeia and adjective for evaluating tactile sensations. IEEE: International Conference on Soft Computing and Intelligent Systems. Kobe, JP.

Misc

  1. Mugitani, Ryoko & Hiroya, Sadao (2012). Development of vocal tract and acoustic features in children. Acoustical Science and Technology, 33, 215-220.

2011

Journal articles

  1. Ho, H.-N., Watanabe, J., Ando, H. & Kashino, M. (2011). Mechanisms underlying referral of thermal sensations to sites of tactile stimulation. Journal of Neuroscience, 31 (1), 208-213.
  2. Masuda, Ayako, Watanabe, Junji, Terao, Masahiko, Watanabe, Masataka, Yagi, Akihiro & Maruya, Kazushi (2011). Awareness of central luminance edge is crucial for the Craik-O'Brien-Cornsweet effect. Frontiers in Human Neuroscience, 5 (125), 1-9.

Conferences

  1. Ooishi, Yuuki & Kashino, Makio (2011). Interaction between slow and rapid sympathetic responses is essential for sound-induced aversiveness. 8th IBRO World Congress of Neuroscience. Florence (Toscana, Italy).
  2. Watanabe, Junji (2011). Phonological analysis of onomatopoeias for expressing tactile sensations. Tactile Research Group Meeting. Seattle, US.

2010

Journal articles

  1. Abekawa, N. & Gomi, H. (2010). Spatial coincidence of intentional actions modulates an implicit visuomotor control. J Neurophysiol, 103 (5), 2717-27.
  2. Ho, H.-N., Watanabe, J., Ando, H. & Kashino, M. (2010). Somatotopic or Spatiotopic? Frame of reference for localizing thermal sensations under thermo-tactile interactions. Attention, Perception & Psychophysics, 72 (6), 1666-1675.
  3. Tajadura-Jiménez, A., Väljamäe, A., Kitagawa, N. & Ho, H.-N. (2010). Whole-body vibration influences on front-back sound localization. Proceedings of the Institution of Mechanical Engineers, Part D, Journal of Automobile Engineering, 224, 1311-1320.
  4. Watanabe, Junji & Ando, Hideyuki (2010). Pace-sync shoes: Intuitive walking-pace guidance based on cyclic vibro-tactile stimulation for the foot. Virtual Reality, 14 (3), 213-219.
  5. Watanabe, Junji, Amemiya, Tomohiro, Nishida, Shin'ya & Johnston, Alan (2010). Tactile duration compression by vibrotactile adaptation. Neuroreport, 21 (13), 856-860.
  6. Terao, Masahiko, Watanabe, Junji, Yagi, Akihiro & Nishida, Shin'ya (2010). Smooth pursuit eye movements improve temporal resolution for color perception. PLOS ONE, 5(6) (e11214).
  7. Kuroki, Scinob, Watanabe, Junji, Kawakami, Naoki, Tachi, Susumu & Nishida, Shin'ya (2010). Somatotopic dominance in tactile temporal processing. Experimental Brain Research, 203 (1), 51-62.

Conferences

  1. Ooishi, Yuuki & Kashino, Makio (2010). Correlation between sound-induced aversiveness and autonomic response. Neuroscience 2010 (Society for Neuroscience). San Diego (California, USA).

2009

Journal articles

  1. Noritake, Atsushi, Uttl, Bob, Terao, Masahiko, Nagai, Masayoshi, Watanabe, Junji & Yagi, Akihiro (2009). Saccadic compression of rectangle and Kanizsa figures: Now you see it, now you don't. PLOS ONE, 4(7) (e6383).
  2. Watanabe, Junji, Nakatani, Masashi, Ando, Hideyuki & Tachi, Susumu (2009). Haptic localization for onset and offset of vibro-tactile stimuli are dissociated. Experimental Brain Research, 193 (3), 483-489.

Misc

  1. Galie, J., Ho, H.-N. & Jones, L.A. (2009). Influence of Contact Conditions on Thermal Responses of the Hand. 587-592.

2008

Journal articles

  1. Jones, L.A. & Ho, H.-N. (2008). Warm or Cool, Large or Small? The Challenge of Thermal Displays. IEEE Transactions on Haptics, 1 (1), 53-70.
  2. Johnston, Alan, Bruno, Aurelio, Watanabe, Junji, Quansah, Ben, Patel, Natasha, Dakin, Steven & Nishida, Shin'ya (2008). Visually-based temporal distortion in dyslexia. Vision Research, 48 (17), 1852-1858.
  3. Terao, Masahiko, Watanabe, Junji, Yagi, Akihiro & Nishida, Shin'ya (2008). Reduction of stimulus visibility compresses apparent time intervals. Nature Neuroscience, 11 (5), 541-542.

2007

Journal articles

  1. Watanabe, Junji & Nishida, Shin'ya (2007). Veridical perception of moving colors by trajectory integration of input signals. Journal of Vision, 7(11) (3), 1-16.
  2. Watanabe, Junji, Hayashi, Seiichiro, Kajimoto, Hiroyuki, Tachi, Susumu & Nishida, Shin'ya (2007). Tactile motion aftereffects produced by appropriate presentation for mechanoreceptors. Experimental Brain Research, 180 (3), 577-582.
  3. Watanabe, Junji, Ando, Hideyuki, Maeda, Taro & Tachi, Susumu (2007). Gaze-contingent visual presentation based on remote saccade detection. Presence: Teleoperators and Virtual Environments, 16 (2), 224-234.
  4. Ando, Hideyuki, Watanabe, Junji, Inami, Masahiko, Sugimoto, Maki & Maeda, Taro (2007). A fingernail-mounted tactile display for augmented reality systems. Electronics and Communications in Japan Part 2, 90 (4), 56-65.
  5. Nishida, Shin'ya, Watanabe, Junji, Kuriki, Ichiro & Tokimoto, Toyotaro (2007). Human visual system integrates color signals along a motion trajectory. Current Biology, 17 (4), 366-372.

Misc

  1. Tajadura-Jiménez, A., Väljamäe, A., Kitagawa, N. & Ho, H.-N. (2007). Whole-Body Vibration Influences Sound Localization in the Median Plane.

2006

Journal articles

  1. Gomi, H., Abekawa, N. & Nishida, S. (2006). Spatiotemporal tuning of rapid interactions between visual-motion analysis and reaching movement. The Journal of Neuroscience, 26 (20), 5301-5308.

2005

Journal articles

  1. Noritake, Atsushi, Watanabe, Junji, Ando, Hideyuki, Terao, Masahiko & Yagi, Akihiro (2005). Spatial independency in perceived lengths of saccade-induced images. Psychologia, 48 (2), 146-153.
  2. Watanabe, Junji, Maeda, Taro & Tachi, Susumu (2005). Time course of localization for a repeatedly flashing stimulus presented at perisaccadic timing. Systems and Computers in Japan, 36 (9), 77-86.
  3. Watanabe, Junji, Noritake, Atsushi, Maeda, Taro, Tachi, Susumu & Nishida, Shin'ya (2005). Perisaccadic perception of continuous flickers. Vision Research, 45 (4), 413-430.

1998

Journal articles

  1. Nishida, Shin'ya & Shinya, Mikio (1998). Use of image-based information in judgments of surface-reflectance properties. Journal of the Optical Society of America A, 15 (12), 2951-2965.
