
You are searching for the phrase "Facial expression." by criterion: Subject


Title:
Single point motion kinematics convey emotional signals in children and adults.
Authors:
Roberti E; Psychology Department, University of Milano-Bicocca, Milan, Italy.; Neuromi, Milan Center for Neuroscience, Milan, Italy.
Turati C; Psychology Department, University of Milano-Bicocca, Milan, Italy.; Neuromi, Milan Center for Neuroscience, Milan, Italy.
Actis-Grosso R; Psychology Department, University of Milano-Bicocca, Milan, Italy.; Neuromi, Milan Center for Neuroscience, Milan, Italy.
Source:
PloS one [PLoS One] 2024 Apr 10; Vol. 19 (4), pp. e0301896. Date of Electronic Publication: 2024 Apr 10 (Print Publication: 2024).
Publication type:
Journal Article
MeSH Terms:
Emotions*/physiology
Facial Expression*
Adult ; Child ; Humans ; Biomechanical Phenomena ; Fear ; Happiness
Scientific journal

Title:
Additive effects of emotional expression and stimulus size on the perception of genuine and artificial facial expressions: an ERP study.
Authors:
Ziereis A; Department for Cognition, Emotion and Behavior, Affective Neuroscience and Psychophysiology Laboratory, Georg-August-University of Göttingen, 37073, Göttingen, Germany.
Schacht A; Department for Cognition, Emotion and Behavior, Affective Neuroscience and Psychophysiology Laboratory, Georg-August-University of Göttingen, 37073, Göttingen, Germany.
Source:
Scientific reports [Sci Rep] 2024 Mar 06; Vol. 14 (1), pp. 5574. Date of Electronic Publication: 2024 Mar 06.
Publication type:
Journal Article
MeSH Terms:
Facial Expression*
Emotions*
Humans ; Physical Examination ; Anger ; Visual Perception
Scientific journal

Title:
Perceiving the outlier in the crowd: The influence of facial identity.
Authors:
Ping Y; Capital Medical University, People's Republic of China.
Ouyang Y; Capital Medical University, People's Republic of China.
Zhang M; Capital Medical University, People's Republic of China.
Zheng W; Capital Medical University, People's Republic of China.
Source:
Perception [Perception] 2024 Mar; Vol. 53 (3), pp. 163-179. Date of Electronic Publication: 2023 Dec 29.
Publication type:
Journal Article
MeSH Terms:
Emotions*
Facial Expression*
Humans
Scientific journal

Title:
Machine Learning-Based Interpretable Modeling for Subjective Emotional Dynamics Sensing Using Facial EMG.
Authors:
Kawamura N; Computational Cognitive Neuroscience Laboratory, Graduate School of Informatics, Kyoto University, Yoshida-Honmachi, Sakyo, Kyoto 606-8501, Japan.; Psychological Process Team, Guardian Robot Project, RIKEN, 2-2-2 Hikaridai, Seika-cho, Soraku-gun, Kyoto 619-0288, Japan.
Sato W; Computational Cognitive Neuroscience Laboratory, Graduate School of Informatics, Kyoto University, Yoshida-Honmachi, Sakyo, Kyoto 606-8501, Japan.; Psychological Process Team, Guardian Robot Project, RIKEN, 2-2-2 Hikaridai, Seika-cho, Soraku-gun, Kyoto 619-0288, Japan.
Shimokawa K; Psychological Process Team, Guardian Robot Project, RIKEN, 2-2-2 Hikaridai, Seika-cho, Soraku-gun, Kyoto 619-0288, Japan.
Fujita T; Multimodal Data Recognition Research Team, Guardian Robot Project, RIKEN, 2-2-2 Hikaridai, Seika-cho, Soraku-gun, Kyoto 619-0288, Japan.
Kawanishi Y; Multimodal Data Recognition Research Team, Guardian Robot Project, RIKEN, 2-2-2 Hikaridai, Seika-cho, Soraku-gun, Kyoto 619-0288, Japan.
Source:
Sensors (Basel, Switzerland) [Sensors (Basel)] 2024 Feb 27; Vol. 24 (5). Date of Electronic Publication: 2024 Feb 27.
Publication type:
Journal Article
MeSH Terms:
Facial Expression*
Emotions*/physiology
Humans ; Electromyography ; Face ; Facial Muscles/physiology ; Machine Learning
Scientific journal

Title:
Explainable Depression Detection Based on Facial Expression Using LSTM on Attentional Intermediate Feature Fusion with Label Smoothing.
Authors:
Mahayossanunt Y; Department of Computer Engineering, Faculty of Engineering, Chulalongkorn University, Phayathai Rd, Pathumwan, Bangkok 10330, Thailand.
Nupairoj N; Department of Computer Engineering, Faculty of Engineering, Chulalongkorn University, Phayathai Rd, Pathumwan, Bangkok 10330, Thailand.; Center of Excellence in Digital and AI Innovation for Mental Health (AIMET), Chulalongkorn University, Phayathai Rd, Pathumwan, Bangkok 10330, Thailand.
Hemrungrojn S; Center of Excellence in Digital and AI Innovation for Mental Health (AIMET), Chulalongkorn University, Phayathai Rd, Pathumwan, Bangkok 10330, Thailand.; Department of Psychiatry, Faculty of Medicine, Chulalongkorn University, Phayathai Rd, Pathumwan, Bangkok 10330, Thailand.; Cognitive Fitness and Biopsychiatry Technology Research Unit, Faculty of Medicine, Chulalongkorn University, Phayathai Rd, Pathumwan, Bangkok 10330, Thailand.
Vateekul P; Department of Computer Engineering, Faculty of Engineering, Chulalongkorn University, Phayathai Rd, Pathumwan, Bangkok 10330, Thailand.; Center of Excellence in Digital and AI Innovation for Mental Health (AIMET), Chulalongkorn University, Phayathai Rd, Pathumwan, Bangkok 10330, Thailand.; Cognitive Fitness and Biopsychiatry Technology Research Unit, Faculty of Medicine, Chulalongkorn University, Phayathai Rd, Pathumwan, Bangkok 10330, Thailand.
Source:
Sensors (Basel, Switzerland) [Sensors (Basel)] 2023 Nov 25; Vol. 23 (23). Date of Electronic Publication: 2023 Nov 25.
Publication type:
Journal Article
MeSH Terms:
Facial Expression*
Depressive Disorder, Major*/diagnosis
Humans ; Depression/diagnosis ; Emotions ; Eye Movements
Scientific journal

Title:
Effects of wearing an opaque or transparent face mask on the perception of facial expressions: A comparative study between Japanese school-aged children and adults.
Authors:
Miyazaki Y; Fukuyama University, Japan.
Kamatani M; Hokkaido University, Japan.
Tsurumi S; Hokkaido University, Japan.; Chuo University, Japan.
Suda T; Unicharm Corporation, Japan.
Wakasugi K; Unicharm Corporation, Japan.
Matsunaga K; Unicharm Corporation, Japan.
Kawahara JI; Hokkaido University, Japan.
Source:
Perception [Perception] 2023 Nov; Vol. 52 (11-12), pp. 782-798. Date of Electronic Publication: 2023 Sep 20.
Publication type:
Journal Article
MeSH Terms:
Masks*
Facial Expression*
Adult ; Humans ; Child ; East Asian People ; Emotions ; Perception
Scientific journal

Title:
Asynchrony enhances uncanniness in human, android, and virtual dynamic facial expressions.
Authors:
Diel A; Cardiff University School of Psychology, Cardiff, UK.; RIKEN Institute, Kyoto, Japan.; Clinic for Psychosomatic Medicine and Psychotherapy, LVR University Hospital Essen, University of Duisburg-Essen, 45147, Essen, Germany.; Center for Translational Neuro- and Behavioral Sciences (C-TNBS), University of Duisburg-Essen, 45147, Essen, Germany.
Sato W; RIKEN Institute, Kyoto, Japan.
Hsu CT; RIKEN Institute, Kyoto, Japan.
Minato T; RIKEN Institute, Kyoto, Japan.
Source:
BMC research notes [BMC Res Notes] 2023 Dec 11; Vol. 16 (1), pp. 368. Date of Electronic Publication: 2023 Dec 11.
Publication type:
Journal Article
MeSH Terms:
Facial Expression*
Pattern Recognition, Visual*
Humans ; Emotions
Scientific journal

Title:
Development of the RIKEN database for dynamic facial expressions with multiple angles.
Authors:
Namba S; RIKEN, Psychological Process Research Team, Guardian Robot Project, Kyoto, 6190288, Japan.; Department of Psychology, Hiroshima University, Hiroshima, 7398524, Japan.
Sato W; RIKEN, Psychological Process Research Team, Guardian Robot Project, Kyoto, 6190288, Japan.
Namba S; Department of Psychology, Hiroshima University, Hiroshima, 7398524, Japan.
Nomiya H; Faculty of Information and Human Sciences, Kyoto Institute of Technology, Kyoto, 6068585, Japan.
Shimokawa K; KOHINATA Limited Liability Company, Osaka, 5560020, Japan.
Osumi M; KOHINATA Limited Liability Company, Osaka, 5560020, Japan.
Source:
Scientific reports [Sci Rep] 2023 Dec 08; Vol. 13 (1), pp. 21785. Date of Electronic Publication: 2023 Dec 08.
Publication type:
Journal Article
MeSH Terms:
Facial Expression*
Emotions*
Humans ; Movement ; Face ; Databases, Factual
Scientific journal

Title:
Influence of stimulus manipulation on conscious awareness of emotional facial expressions in the match-to-sample paradigm.
Authors:
Sato W; Psychological Process Research Team, Guardian Robot Project, RIKEN, 2-2-2 Hikaridai, Seika-cho, Soraku-gun, Kyoto, 619-0288, Japan.; Field Science Education and Research Center, Kyoto University, Oiwake-cho, Kitashirakawa, Sakyo, Kyoto, 606-8502, Japan.
Yoshikawa S; Field Science Education and Research Center, Kyoto University, Oiwake-cho, Kitashirakawa, Sakyo, Kyoto, 606-8502, Japan.; Faculty of the Arts, Kyoto University of the Arts, 2-116 Uryuyama, Kitashirakawa, Sakyo, Kyoto, 606-8501, Japan.
Source:
Scientific reports [Sci Rep] 2023 Nov 25; Vol. 13 (1), pp. 20727. Date of Electronic Publication: 2023 Nov 25.
Publication type:
Journal Article
MeSH Terms:
Facial Expression*
Emotions*
Humans ; Anger ; Social Perception ; Consciousness
Scientific journal

Title:
Electromyographic Validation of Spontaneous Facial Mimicry Detection Using Automated Facial Action Coding.
Authors:
Hsu CT; Psychological Process Research Team, Guardian Robot Project, RIKEN, Soraku-gun, Kyoto 619-0288, Japan.
Sato W; Psychological Process Research Team, Guardian Robot Project, RIKEN, Soraku-gun, Kyoto 619-0288, Japan.
Source:
Sensors (Basel, Switzerland) [Sensors (Basel)] 2023 Nov 09; Vol. 23 (22). Date of Electronic Publication: 2023 Nov 09.
Publication type:
Journal Article
MeSH Terms:
Facial Expression*
Face*
Humans ; Facial Muscles/physiology ; Electromyography/methods ; Videotape Recording ; Emotions/physiology
Scientific journal
