March 2017


The 33rd Perceptual Frontier Seminar: Human Communication, Perception, and Computer Interfacing

Date and time: Tuesday, 7 March 2017, 14:00-16:00
Venue: Room 411, Building 4, Ohashi Campus, Kyushu University
Organizer: Gerard B. REMIJN (Kyushu University/ReCAPS)

Program

1. An introduction to spatial filters in BCI for classification using machine learning
Mohd Ibrahim SHAPIAI*
*Department of Electronics Systems Engineering, Malaysia-Japan International Institute of Technology, Universiti Teknologi Malaysia, Kuala Lumpur, Malaysia

Brain-Computer Interfacing (BCI) is a demanding task, as the recorded electroencephalogram (EEG) signal is not only noisy and of limited spatial resolution, but also intrinsically non-stationary. The non-stationarities in the signal may come from many different sources, for instance electrode artifacts, muscular activity, or changes in task involvement, and they often degrade classification performance. Spatial filtering, together with the selection of the most appropriate frequency bands and the related signal components in the frequency domain, is known to improve classification accuracy. We will look at some advantages and disadvantages of several spatial filters, especially for extracting the important features in typical EEG research on BCI, and will then discuss one popular spatial filter in more detail: the common spatial pattern (CSP). The extracted features are subsequently used for classification with machine-learning techniques, and a brief discussion of specific machine-learning techniques will be shared. The talk aims to encourage discussion and an exchange of ideas among the audience on BCI with spatial filters for classification using machine learning.
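
For readers unfamiliar with CSP, the sketch below illustrates its usual formulation as a generalized eigenvalue problem on the two class-averaged channel covariance matrices, followed by the log-variance features commonly fed to a classifier. This is a minimal illustration of the general technique, not the speaker's code; the function names, the NumPy/SciPy implementation, and the choice of six filters are assumptions made for this page.

```python
# Minimal common spatial pattern (CSP) sketch in Python/NumPy/SciPy.
# Assumes EEG trials shaped (n_trials, n_channels, n_samples); all names here
# are illustrative and not taken from the talk.
import numpy as np
from scipy.linalg import eigh

def csp_filters(trials_a, trials_b, n_filters=6):
    """Spatial filters maximizing the variance ratio between two classes."""
    mean_cov = lambda trials: np.mean([np.cov(t) for t in trials], axis=0)
    ca, cb = mean_cov(trials_a), mean_cov(trials_b)
    # Generalized symmetric eigenproblem: ca w = lambda (ca + cb) w
    _, eigvecs = eigh(ca, ca + cb)          # eigenvalues in ascending order
    # Keep filters from both ends of the spectrum (most discriminative per class)
    idx = np.r_[np.arange(n_filters // 2), np.arange(-(n_filters // 2), 0)]
    return eigvecs[:, idx].T                # shape: (n_filters, n_channels)

def csp_features(trials, filters):
    """Log of normalized variance of the spatially filtered signals."""
    feats = []
    for t in trials:
        var = np.var(filters @ t, axis=1)   # variance of each CSP component
        feats.append(np.log(var / var.sum()))
    return np.array(feats)                  # input to any standard classifier
```

The features returned by csp_features could then be passed to a standard classifier such as a linear discriminant or an SVM, which is the machine-learning stage the abstract refers to.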

2. User authentication with eye-tracking and visual password formation to prevent shoulder surfing
Yesaya Tommy PAULUS*, Chihiro HIRAMATSU**, Yvonne KAM HWEI SYN***, Gerard B. REMIJN**
*Graduate School of Design, Kyushu University, Fukuoka, Japan
**Department of Human Science / Research Center for Applied Perceptual Science, Kyushu University, Fukuoka, Japan
***Faculty of Engineering, Malaysia Multimedia University, Cyberjaya, Malaysia

As a new visual password method, eye-tracking can be used to facilitate eye-gaze authentication on public screens, such as those of ATMs. In order to determine the best visual password format that can be used safely and reliably in public by any user, including the elderly, we will focus on factors important in authentication enrollment, such as the role of viewing distance, viewing angle, and lighting conditions. We will also focus on finding the most suitable visual password format with regard to the number of visual objects, user eye movements, and gaze duration.
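
As background for how gaze-based entry of a visual password can work in practice, the toy sketch below implements simple dwell-time selection: an on-screen object counts as "entered" once the gaze stays within a radius of it for a minimum dwell time. This is only an illustration of the general idea; the data format, radius, and dwell threshold are assumptions and do not describe the system under study.

```python
# Toy dwell-time gaze selection; all names and thresholds are illustrative.
import numpy as np

def select_by_dwell(gaze_xy, timestamps, targets, radius=60.0, dwell_ms=500.0):
    """Return indices of on-screen objects 'selected' by dwelling on them.

    gaze_xy:    (n, 2) array of gaze positions in screen pixels.
    timestamps: (n,)   array of sample times in milliseconds.
    targets:    (m, 2) array of centers of the candidate password objects.
    """
    selections = []
    current, dwell_start, registered = None, None, False
    for (x, y), t in zip(gaze_xy, timestamps):
        dists = np.hypot(targets[:, 0] - x, targets[:, 1] - y)
        hit = int(np.argmin(dists)) if dists.min() <= radius else None
        if hit != current:                      # gaze moved to another object or away
            current, dwell_start, registered = hit, t, False
        elif hit is not None and not registered and t - dwell_start >= dwell_ms:
            selections.append(hit)              # one selection per dwell episode
            registered = True
    return selections
```

The returned sequence of object indices could then be compared against the user's enrolled visual password.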

3. Intelligibility of English mosaic speech: Influence of manipulating mosaic block duration
Santi*, Yoshitaka NAKAJIMA**
*Graduate School of Design, Kyushu University, 4-9-1 Shiobaru, Minami-ku, Fukuoka, 815-8540 Japan
**Department of Human Science, Faculty of Design, Kyushu University/Research Center for Applied Perceptual Science, 4-9-1 Shiobaru, Minami-ku, Fukuoka, 815-8540 Japan

The purpose of this research is to examine the influence of manipulating block duration on the intelligibility of mosaic speech. The original speech samples were transformed into mosaic form using J Software. A silent part of 20 ms was always placed at the beginning of the original speech samples, and the whole signal was divided into narrow frequency bands, each one critical bandwidth wide, together covering 100-4400 Hz. A rise and a fall time of 4 ms were given to the mosaic blocks. Listeners were able to hear intelligible speech when the mosaic block duration ranged from 20 to 80 ms.
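
As a rough illustration of the mosaicking procedure described above (band-splitting, per-block power averaging, and 4-ms rise/fall ramps), here is a sketch in Python/SciPy. It is not the J software used in the study; the Butterworth filters, the noise carrier, and the function names are assumptions made only for illustration, and the critical-band edges spanning 100-4400 Hz are left as an input parameter.

```python
# Illustrative "mosaic speech" sketch: split the signal into band-pass
# channels, average the power within fixed-duration blocks, and refill each
# block with band-limited noise carrying that power, with 4-ms ramps.
# Filter design, noise carrier, and names are assumptions, not the J software.
import numpy as np
from scipy.signal import butter, sosfiltfilt

def mosaicize(signal, fs, band_edges, block_ms=40.0, ramp_ms=4.0):
    block = int(fs * block_ms / 1000)           # samples per mosaic block
    ramp = int(fs * ramp_ms / 1000)             # samples in the rise/fall ramps
    n_blocks = len(signal) // block
    out = np.zeros(n_blocks * block)
    rng = np.random.default_rng(0)
    # Raised-cosine rise and fall applied to every mosaic block
    win = np.ones(block)
    win[:ramp] = 0.5 * (1 - np.cos(np.pi * np.arange(ramp) / ramp))
    win[-ramp:] = win[:ramp][::-1]
    for lo, hi in zip(band_edges[:-1], band_edges[1:]):
        sos = butter(4, [lo, hi], btype="bandpass", fs=fs, output="sos")
        band = sosfiltfilt(sos, signal[:n_blocks * block])
        noise = sosfiltfilt(sos, rng.standard_normal(n_blocks * block))
        for b in range(n_blocks):
            seg = slice(b * block, (b + 1) * block)
            target_rms = np.sqrt(np.mean(band[seg] ** 2))
            noise_rms = np.sqrt(np.mean(noise[seg] ** 2)) + 1e-12
            out[seg] += win * noise[seg] * (target_rms / noise_rms)
    return out
```

Setting block_ms anywhere from 20 to 80 ms corresponds to the range of block durations whose effect on intelligibility the study examined.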

Photos


The 34th Perceptual Frontier Seminar: Time Perception

Date and time: Tuesday, 21 March 2017, 17:00-18:30
Venue: Room 601, Building 3, Ohashi Campus, Kyushu University
Organizer: Yoshitaka NAKAJIMA (Kyushu University/ReCAPS)

Program

1. Perceptual assimilation and contrast between time intervals marked by sound bursts
Yoshitaka NAKAJIMA*
*Department of Human Science/ReCAPS, Faculty of Design, Kyushu University

When two adjacent time intervals of up to 600 ms, T1 and T2 in this order, are marked by very short sound bursts, their durations are perceptually assimilated to or contrasted against one another in some conditions. Assimilation and contrast are more conspicuous in T2 than in T1, indicating temporal asymmetry. Contrast takes place only as overestimation of T1 or T2. The relationship between different sensory modalities is another issue to be examined.

2. Filled duration illusion in vision
Erika TOMIMATSU*
*Faculty of Design, Kyushu University

When we compare the subjective length of an unfilled duration delimited by two short sounds with that of the filled duration of a continuous sound, the filled duration is often perceived as the longer of the two, even though they are physically equivalent. In the present study, we investigated whether this filled-duration illusion also clearly occurs in vision for durations within one second. We employed random-dot figures as duration markers, presented on a mean-luminance background to avoid afterimages. The results showed that, also in the visual modality, the filled-duration illusion occurred for durations within one second.

3. The effects of speed and frequency on perceived duration
Kentaro YAMAMOTO*
*Faculty of Human-Environment Studies, Kyushu University

While it is known that perceived duration is distorted by visual motion, there is ongoing debate as to which factor plays a key role in the distortion. Here, we manipulated the speed and the direction-alternation frequency of moving objects and measured perceived duration to determine which factor is more important. The results suggested that both factors influenced perceived duration, and that the magnitude of the effects varied depending on the salience of each feature.

Photos


Farewell Lunch for Ms. NAOI

Date and time: Friday, 31 March 2017, 12:00-13:50
Venue: West, Ohashi branch (ウエスト大橋店; Chinese cuisine)
Organizer: Yoshitaka NAKAJIMA (Kyushu University/ReCAPS)

Ms. Masako NAOI (直井雅子), who worked for many years as an administrative assistant for ReCAPS, reached retirement age, and Friday, 31 March was her last day at work. To mark the occasion, we took a commemorative photograph and then held a lunch party, joined by her successor, Ms. Yuko MARUYAMA (圓山裕子). We wish her good health and every happiness in the years ahead.

Photos

Copyright (c) 2013-2020 Research Center for Applied Perceptual Science, Kyushu University. All rights reserved.