Corpus search results (sorted by the first word after the keyword)
Click a serial number to open the corresponding PubMed page
1 otential threat (i.e., suspicious or fearful facial expression).
2 f emotional information (fearful and neutral facial expressions).
3 y viewed as being specialized for processing facial expression.
4 ing macaque faces depicting the fear grimace facial expression.
5 rocessing of facial affect during changes in facial expression.
6 prosody and through the face in the form of facial expression.
7 ical or continuous neural representations of facial expression.
8 cement and decreased recognition of negative facial expression.
9 nisms that participate in self-monitoring of facial expression.
10 mood and attenuated recognition of negative facial expression.
11 n between simultaneous speech production and facial expression.
12 passive viewing of dynamic angry and neutral facial expressions.
13 uralistic angry, fearful, happy, and neutral facial expressions.
14 l vocalizations (e.g., crying), and (silent) facial expressions.
15 a are essential for the voluntary control of facial expressions.
16 in eye contact, gaze follow, and reciprocate facial expressions.
17 n produce an extraordinarily large number of facial expressions.
18 wing the positive anticipation and agonistic facial expressions.
19 various visual cues, such as eye contact and facial expressions.
20 onset, to fearful, but not neutral or happy, facial expressions.
21 creased amygdala responsiveness to emotional facial expressions.
22 y derived from neural responses to emotional facial expressions.
23 d expressive reactions to negative emotional facial expressions.
24 ce for a facilitation of processing positive facial expressions.
25 random, and did so using neutral or fearful facial expressions.
26 pothalamus, and periaqueductal grey to angry facial expressions.
27 vity, and left amygdala response to negative facial expressions.
28 sociability and enhanced recognition of sad facial expressions.
29 ongside emotional words, stories, movies, or facial expressions.
30 enhanced recognition of positive vs negative facial expressions.
31 al pyramidal motor system controls voluntary facial expressions.
32 ed through modification of ancestral primate facial expressions.
33 have evolved from ancestral primate rhythmic facial expressions.
34 ala to angry and fearful compared with happy facial expressions.
35 nts, only one was impaired at the imagery of facial expressions.
36 ciality, pleasure, and motivation, and coded facial expressions.
37 tivity, and autonomic responses to emotional facial expressions.
38 including gaze direction, body gesture, and facial expressions.
39 whether similar overlap occurs in real-life facial expressions.
40 in transmitting diagnostic cues for decoding facial expressions.
41 -specific modulation of spatial attention to facial expressions.
42 th amygdala responsiveness to threat-related facial expressions.
43 roduction) or lagged (monitor) initiation of facial expressions.
44 acial features in processing of naturalistic facial expressions.
45 showing similar biases for detecting fearful facial expressions.
46 and a stranger showing either happy or angry facial expressions.
47 ll stimuli, including social signals such as facial expressions.
48 recognition of faces and interpretations of facial expressions.
49 on their recognition of six basic emotional facial expressions.
50 ity of a BCI to display emotions by decoding facial expressions.
51 n of fearful, happy, angry, sad, and neutral facial expressions.
52 context-sensitivity in the interpretation of facial expressions.
53 emotional spoken sentences with accompanying facial expressions.
54 otions through electroencephalogram (EEG) or facial expressions.
55 nd increase attentional bias toward positive facial expressions.
56 ve effect on their behavior towards aversive facial expressions.
57 mely, affective multimedia content, EEG, and facial expressions.
58 l touch and attentional bias toward positive facial expressions.
59 ammals, since dogs do not display human-like facial expressions.
60 ained visuocortical facilitation to aversive facial expressions.
61 amygdala and hypothalamus responses to angry facial expressions.
62 Most mammalian species produce facial expressions.
66 right amygdala habituation to threat-related facial expressions, a phenotype associated with resilien
67 tested whether these individuals can imagine facial expressions, a process also hypothesized to be un
68 fically, while the perception of incongruent facial expressions activates somatosensory-related repre
70 -oxygen-level-dependent (BOLD) response on a facial expression affective-reactivity task in both elde
71 significantly more infants had no change in facial expression after sucrose administration (seven of
73 responsiveness of this structure to fearful facial expressions, an effect that predicts superior per
75 ding changes in volition/motivation, posture/facial expression and derealization/depersonalization.
78 n perception of basic social signals such as facial expression and gaze direction, and preferential a
79 tigate this view, the present study examines facial expression and identity recognition abilities in
80 the probability density function of neutral facial expression and the natural logarithm of mode on t
81 nd a double dissociation of areas processing facial expression and those processing head orientation.
82 where it might be implicated in controlling facial expression and urinary voiding, and also in bladd
84 pecific "preparatory" system learns aversive facial expressions and autonomic responses such as skin
85 f repeated oxytocin on autistic individuals' facial expressions and demonstrated a time-course change
86 mygdala hold information about self-executed facial expressions and demonstrates an intimate overlap
87 he question of whether we can identify fetal facial expressions and determine their developmental pro
90 nly underpinned by brain regions involved in facial expressions and emotion recognition processing.
91 magnetic resonance imaging (MRI) (emotional facial expressions and executive functioning) and were c
93 tage to induce in viewer monkeys spontaneous facial expressions and looking patterns in the laborator
94 cifically painful, angry, happy, and neutral facial expressions and questionnaires including a measur
95 dala processes both the degree of emotion in facial expressions and the categorical ambiguity of the
96 g control over enactment of subtly different facial expressions and therefore skills in emotional com
97 hich measured amygdala responsivity to angry facial expressions and ventral striatum responsivity to
98 aces varying freely in viewpoint, hairstyle, facial expression, and age; and for well known cars embe
99 fect expressions but not positive or neutral facial expressions, and impaired in Stroop cognitive con
100 MA impaired recognition of angry and fearful facial expressions, and the larger dose (1.5 mg/kg) incr
106 roducing facial expressions, suggesting that facial expressions are not just inflexible and involunta
108 arch using CFS has demonstrated that fearful facial expressions are prioritised by the visual system
110 study, we aimed to test whether domestic dog facial expressions are subject to audience effects and/
113 rimate lineage, also have the ability to use facial expressions as a means of gaining social informat
116 ontrollable appearance cues (e.g., clothing, facial expressions) as shown previously, but also by fea
117 hes in the STS fundus were most sensitive to facial expression, as was the amygdala, whereas those on
118 ") and using manual analysis with changes in facial expressions assessed using the Mouse Grimace Scal
119 ively studied the human ability to recognize facial expressions associated with dog emotions (hereaft
120 to approach photographic stimuli displaying facial expressions associated with positive attention an
121 ponent displays either an angry or a neutral facial expression at the beginning of each trial and del
123 assessment while viewing fearful and neutral facial expressions at baseline and again 8 weeks later.
124 the extent to which mothers mirrored infant facial expressions at two months postpartum predicted in
125 ing facial proportions, but the same neutral facial expression, baldhead and skin tone, as stimuli.
127 g newer brain-computer interfaces as well as facial-expression-based algorithms to read emotional res
128 ere, we demonstrate the utility of surprised facial expressions because exemplars within this emotion
129 "pain/distress" also demonstrates that this facial expression becomes significantly more complete as
130 o matrix of fearful, sad, happy, and neutral facial expressions before they were deployed to Iraq.
131 entials (nociceptive cortical activity), and facial expression (behavior) were acquired in individual
132 d manifested as subtle changes in providers' facial expression behaviours during the clinical interac
134 hemodynamic response to implicitly presented facial expressions between depressed and healthy control
136 is a key structure for processing emotional facial expressions, but it remains unclear what aspects
138 als, the analytical methods used to quantify facial expressions can be subjective, relying heavily on
143 Understanding the degree to which human facial expressions co-vary with specific social contexts
144 the probability density function of neutral facial expression compared with placebo in exploratory (
145 eases in these indices characterize autistic facial expression, compared with neurotypical individual
146 lcus was equally sensitive to all changes in facial expression, consistent with a continuous represen
147 universal language of emotion, some negative facial expressions consistently elicit lower recognition
148 at information about both dynamic and static facial expressions could be robustly decoded from Mf are
149 acebo showed reduced recognition of positive facial expressions, decreased speed in responding to pos
150 fundamental role in enhancing recognition of facial expression despite the complex stimulus changes a
151 or faces learnt through video clips, dynamic facial expression did not create better transfer of lear
155 years] passively viewed videos of universal facial expressions during functional MRI acquisition, wi
158 resonance scanning while viewing pictures of facial expressions from the Ekman and Friesen series.
160 such as eye-contact induced reciprocation of facial expression, gaze aversion, and gaze following, th
163 mans have revealed that the motor control of facial expressions has a distributed neural representati
165 namic relation between speech production and facial expression in children with autism and to establi
166 This study demonstrates how differences in facial expression in emotionally ambiguous contexts may
167 s unclear to what extent there exists common facial expression in species more phylogenetically dista
168 The authors examined neural responses to facial expressions in adults and adolescents with social
169 monstrated elevated responses to threatening facial expressions in amygdala, as well as left fusiform
173 r system during observation and execution of facial expressions in nine-month-old infants, implicatin
174 quantify and compare human and domestic dog facial expressions in response to emotionally-competent
176 amygdala responded selectively to changes in facial expression, independent of changes in identity.
177 te valence for the remaining stimulus (e.g., facial expressions), indicating a representation of vale
178 ade trials with pain faces relative to other facial expressions, indicating a difficulty disinhibitin
179 gnition contend that to understand another's facial expressions, individuals map the perceived expres
182 do not seem to be affected by facial injury, facial expression, intellectual disability, drug history
183 resentative values of objectively quantified facial expression intensity in a repeatable part of the
184 recognition has been dominated by studies on facial expression interpretation; very little is known a
186 nd emotion ambiguity (the uncertainty that a facial expression is categorized as fearful or happy).
187 er a resolution to the controversy about how facial expression is processed in the brain by showing t
191 evidence linking social context to specific facial expressions is sparse and is largely based on sur
192 Emotion processing-including signals from facial expressions-is often altered in individuals with
195 tofrontal cortex (OFC) struggle to recognize facial expressions, make poor social judgments, and freq
197 he right pSTS were better at differentiating facial expressions measured outside of the scanner.
199 screen) evoked by the perception of various facial expressions (neutral, fearful, aggressive, and ap
200 ated whether gaze cues paired with emotional facial expressions (neutral, happy, suspicious and fears
202 volume and reduced responsiveness to fearful facial expressions observed in psychopathic individuals.
203 we examined the extent to which 16 types of facial expression occurred systematically in thousands o
204 association tracts in the recognition of the facial expression of emotion and identify specific tract
206 e correlates with asymmetry, indicating that facial expression of the mouse is itself correlated with
207 inadequate to reliably distinguish universal facial expressions of "fear" and "disgust." Rather than
209 Finally, it suggests that the forms of human facial expressions of anger and happiness may have coevo
210 enteen healthy participants (8 females) with facial expressions of anger, disgust, fear, happiness, s
212 ccurred, infants were less likely to display facial expressions of distaste initially when eating the
214 ACS, an anatomically-based method for coding facial expressions of dogs, we found that the "Ears addu
215 se to morphed images to directly address how facial expressions of emotion are represented in the bra
216 nts from both cultures visually discriminate facial expressions of emotion by relying on culturally d
218 iological and social evolutionary pressures, facial expressions of emotion comprise specific facial m
219 & Scholl (F&S), we have shown that perceived facial expressions of emotion depend on the congruency b
223 Darwin's seminal works, the universality of facial expressions of emotion has remained one of the lo
225 Understanding the different categories of facial expressions of emotion regularly used by us is es
226 d Bayesian classifiers, we show that dynamic facial expressions of emotion transmit an evolving hiera
227 systematically vary intensities of six basic facial expressions of emotion, and employed a self-paced
228 r results question the universality of human facial expressions of emotion, highlighting their true c
233 pyramidal system enables humans to simulate facial expressions of emotions not actually experienced.
234 trapyramidal motor system drives spontaneous facial expressions of felt emotions, and a cortical pyra
236 ng participants (N = 20) were presented with facial expressions of happy and sad emotion at three int
237 imuli depicting individuals being harmed and facial expressions of pain were compared between incarce
239 (Equus caballus) could discriminate between facial expressions of their conspecifics captured in dif
240 in a new expression after several different facial expressions of these faces had been shown during
243 oscience mainly focused on the processing of facial expressions, overlooking the exploration of emoti
245 ain activation in regions involved in infant facial expression processing and empathic and mentalizin
246 onth-old infants, implicating this system in facial expression processing from a very young age.
247 nd the left fusiform gyrus, recruited during facial expression processing, were positively correlated
250 A popular hypothesis holds that efficient facial expression recognition cannot be achieved by visu
251 dividuals can achieve normotypical efficient facial expression recognition despite a congenital absen
252 t the neural computations that contribute to facial expression recognition in each region are functio
253 ous work has tested this theory by examining facial expression recognition in participants with Mobiu
255 gnetic resonance imaging (fMRI) and (ii) the facial expression recognition task (FERT), a decision-ma
256 s (rpSTS) that responds more strongly during facial expression recognition tasks than during facial i
257 d nonsynesthetic participants on measures of facial expression recognition, but not on control measur
258 ery of emotional processing tasks comprising facial expression recognition, emotional categorization,
263 ychology propound that visual recognition of facial expressions requires an intermediate step to iden
266 of unknown conspecifics portraying different facial expressions, showing appropriate behavioural and
268 n covert vigilance and avoidance of aversive facial expressions, social anxiety appears to confer a s
269 that a combination of head orientation with facial expression, specifically involving both the eyes
270 completed the anti-saccade task with dynamic facial expressions, specifically painful, angry, happy,
271 reated second-order faces with happy and sad facial expressions specified solely by local directions
272 ants attended to the emotion category of the facial expression, suggesting an interaction between the
273 the human's attentional state when producing facial expressions, suggesting that facial expressions a
275 of 10 Hz for 500 ms, selectively impaired a facial expression task but had no effect on a matched fa
277 r activations to fearful relative to neutral facial expressions than did healthy comparison subjects
278 esults reveal fine-grained patterns in human facial expressions that are preserved across the modern
280 ear (happy, angry) and ambiguous (surprised) facial expressions, then re-rated similar stimuli after
284 UFS is also characterized by an abnormal facial expression upon smiling, and bilateral weakness i
285 cant risk of kidney failure, and an abnormal facial expression upon smiling, laughing, and crying.
286 ted amygdala reactivity to fearful and angry facial expressions using functional magnetic resonance i
287 r social information conveyed by conspecific facial expressions using the framework of optimal foragi
290 y binding together social signals carried by facial expressions, vocalizations, and social grooming.S
291 the probability density function of neutral facial expression was found in confirmatory trial (P < 0
292 -modal coordination of speech production and facial expression was greater when the neurotypical chil
293 image onset, and identity across a change in facial expression was uniquely associated with neural pa
294 nisms that participate in self-monitoring of facial expression, we simultaneously recorded the elicit
299 , regions varied in how frequently different facial expressions were produced as a function of which
301 aging to measure neural responses to dynamic facial expressions with positive and negative valence an