October 2021
Date and time: Wednesday, 13 October 2021, 15:25-17:55
Venue: Talks are online. After the talks, we will get together in Kazuo's office, Room 709 on the 7th Floor of Building 3, Ohashi Campus, Kyushu University, Fukuoka, Japan <http://www.design.kyushu-u.ac.jp/kyushu-u/english/access>
Language: English
Organizer: Kazuo UEDA (Kyushu Univ./ReCAPS/Research and Development Center for Five-Sense Devices)
15:10-15:25 Checking the Zoom sharing function and pre-session mingle
15:25-15:30 Brief Introduction
15:30-16:00 Color inference on warm/cold perception of surfaces
Hsin-Ni HO*1, Hiroki TERASHIMA*1, Kohta WAKAMATSU2, Jinhwan KWON3,4, Maki SAKAMOTO3, Shigeki NAKAUCHI2, Shin'ya NISHIDA1 (*Equal contribution)
1. NTT Communication Science Labs, NTT Corporation; 2. Toyohashi University of Technology; 3. The University of Electro-Communications; 4. Kyoto University of Education
Colors like red and blue not only indicate different wavelengths of light; they also signify warm and cold. This relationship is an established design norm and permeates our daily lives through consumer products, home decor, and art. In this study, we investigated how color information predicts the warm/cold perception of the beholder. We asked participants to give warm/cold ratings to 1934 color photographs of surfaces from various material categories and built regression models that predict a thermal rating of an image from its color statistics. The results showed that the color-temperature relationship is shared by a wide range of materials, from natural foliage to man-made plastic. In particular, the values on the a* (green to red) and b* (blue to yellow) axes of the CIELAB color space best predict people's temperature ratings. Overall, color information explained up to 30% of the variation in people's ratings. Our results suggest that the color-temperature relationship is governed by low-level color statistics and shared by a wide range of material categories.
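The regression approach described in the abstract can be illustrated with a minimal sketch: fit a linear model that predicts a warmth rating from per-image CIELAB a* and b* statistics, then report the proportion of variance explained (R²). All data below are synthetic and the coefficients are invented for illustration; the actual study used color statistics computed from 1934 surface photographs.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stand-in data: one mean a* (green-red) and b* (blue-yellow)
# value per image, plus a simulated warm/cold rating. All values synthetic.
n_images = 200
a_mean = rng.uniform(-60, 60, n_images)   # a*: negative = green, positive = red
b_mean = rng.uniform(-60, 60, n_images)   # b*: negative = blue, positive = yellow
rating = 0.02 * a_mean + 0.015 * b_mean + rng.normal(0, 0.5, n_images)

# Ordinary least-squares regression: rating ~ a* + b* + intercept
X = np.column_stack([a_mean, b_mean, np.ones(n_images)])
coef, *_ = np.linalg.lstsq(X, rating, rcond=None)

# Proportion of variance explained (R^2), the quantity behind the
# "up to 30%" figure reported in the abstract
pred = X @ coef
r2 = 1 - np.sum((rating - pred) ** 2) / np.sum((rating - rating.mean()) ** 2)
print(f"R^2 = {r2:.2f}")
```

With real images, the predictors would be richer color statistics (e.g., means and higher moments of a* and b* over each photograph), but the fitting and evaluation steps are the same.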
16:00-16:15 Spatiotemporal brain mechanism of auditory and tactile time shrinking
Takako MITSUDO*,**, Naruhito HIRONAGA*, Hiroshige TAKEICHI***, Yoshitaka NAKAJIMA****, Shozo TOBIMATSU*,*****
*Department of Clinical Neurophysiology, Neurological Institute, Faculty of Medicine, Graduate School of Medical Sciences, Kyushu University; **Department of Neuropsychiatry, Graduate School of Medical Sciences, Kyushu University; ***Advanced Data Science Project (ADSP), RIKEN Information R&D and Strategy Headquarters (R-IH), RIKEN; ****Department of Human Science/Research Center for Applied Perceptual Science, Faculty of Design, Kyushu University; *****Department of Orthoptics, Faculty of Medicine, Fukuoka International University of Health and Welfare
16:15-16:30 Modality specificity of amodal completion: Revised for Fechner Day 2021
Hiroshige TAKEICHI*, Keito TANIGUCHI**, and Hiroaki SHIGEMASU**
*RIKEN; **Kochi University of Technology
16:30-16:45 Checkerboard speech: A new experimental paradigm for investigating speech perception
Kazuo UEDA*,**, Riina KAWAKAMI**,***, and Hiroshige TAKEICHI****
*Department of Human Science, Kyushu University; **Department of Acoustic Design, Kyushu University; ***Rion Co. Ltd.; ****Advanced Data Science Project (ADSP), RIKEN Information R&D and Strategy Headquarters (R-IH), RIKEN
16:45-17:00 Temporal and frequency resolution needed for auditory communication: Comparison between young and senior listeners utilizing mosaic speech
Yoshitaka NAKAJIMA*,**,***, Tatsuro ONAKA**, Akinori OYAMA***, Kazuo UEDA*,**, and Gerard B. REMIJN*,**
*Department of Human Science, Kyushu University; **Department of Acoustic Design, Kyushu University; ***Sound Corporation
17:00-17:10 Break
17:10-17:25 Acoustic correlates of English consonant-vowel-consonant (CVC) words obtained with multivariate analysis
Yixin ZHANG*, Yoshitaka NAKAJIMA*,**, Kazuo UEDA*, Gerard B. REMIJN*
*Kyushu University; **Sound Corporation
17:25-17:40 Vection depends on the luminance contrast, average luminance, and spatial frequency of the stimulus
Xuanru GUO1, Shinji NAKAMURA2, Yoshitaka FUJII3, Takeharu SENO1, Stephen PALMISANO4
1. Faculty of Design, Kyushu University; 2. School of Psychology, Nihon Fukushi University; 3. Graduate School of Humanities and Social Sciences, Kumamoto University; 4. School of Psychology, University of Wollongong
17:40-17:55 Psychology teaching using volumetric-video-based AR and avatar AR technology is more attractive than teaching face to face or over Zoom
Xuanru GUO1, Takashi YOSHINAGA2,3, Aaron HILTON3, Shuji HARUMOTO3, Ema HILTON3, Shizuka ONO3, Takeharu SENO1
1. Faculty of Design, Kyushu University; 2. Institute of Systems, Information Technologies and Nanotechnologies; 3. Steampunk Digital, Co. Ltd.