Corpus search results (sorted by the first word to the right of the keyword)
Click a serial number to open the corresponding PubMed page
1 mes) to long-duration cues (e.g., syllables, prosody).
2 tions based upon vocal inflection (affective prosody).
3 states recognized cross-culturally in speech prosody.
4 istic comprehension to alterations in speech prosody.
5 cessing of pitch changes in speech affecting prosody.
6 produced in either a happy, angry or neutral prosody.
7 ctor contributing to right lateralization of prosody.
8 usly associated with perception of emotional prosody.
9 brain showed particular sensitivity to happy prosody.
10 o words spoken with neutral, happy, or angry prosody.
11 nd the generation/comprehension of emotional prosody.
12 ng between acoustic features and emotions in prosody.
13 onous stimuli or failed to fully control for prosody.
14 and intonation-features often referred to as prosody.
15 could not accurately map visual and auditory prosodies.
17 RH regions change to support both emotional prosody and also typically left-lateralized language fun
18 pitch, which is an important cue for speech prosody and for tonal languages such as Mandarin Chinese
20 sive overlap in neural resources involved in prosody and music processing, music perception seems to
21 itry disambiguate uncertainties in affective prosody and propositional linguistic content of language
25 aryngeal control of pitch-related aspects of prosody and song are coordinated by a hierarchically org
26 there is not clear brain lateralization for prosody and there may be bilateral representation for th
27 ech; through the voice in the form of speech prosody and through the face in the form of facial expre
29 eaker's body movements) complements auditory prosody, and it is unclear how the brain temporally anal
30 y on tasks involving expression of affective prosody, and on tests of linguistic prosody, compared wi
31 cally driven combinatorial processing beyond prosody, and that linguistic content (i.e., structure an
32 e pSTS, a region known for processing speech prosody, and the orbitofrontal cortex and amygdala, brai
33 ilities-sentence processing versus emotional prosody-are supported by the left (LH) versus the right
35 we introduce a data-driven model for English prosody, based on large-scale analysis of spontaneous co
36 that when participants prepare for emotional prosody, bilateral ventral striatum is specifically acti
37 tences and phrases): alignment of syntax and prosody boosted EEG responses, whereas their misalignmen
38 also impaired in comprehension of affective prosody, but many patients with impairments in prosodic
39 gyrus (STG) is considered a critical hub for prosody, but the role of earlier auditory regions like H
40 statistical inference methods, we find that prosody can communicate at least 12 distinct kinds of em
41 tream decreased participants' performance in prosody categorization, arguing for a motor involvement
42 comparisons indicated that infants detected prosody changes across a broader range of conditions dur
43 y, infants demonstrated greater detection of prosody changes from IDS speech to ADS speech in native
51 ralization of cortical regions mobilized for prosody control could point to efficient processing of s
52 p children to tap into the rhythm of speech (prosody), cueing them to prosodic markers of grammatical
55 fway through each trial, the speaker changed prosody from infant-directed speech (IDS) to adult-direc
59 -face communication, multimodal cues such as prosody, gestures, and mouth movements can play a crucia
61 or the neural processing stages of emotional prosody has suggested that auditory emotion is perceived
63 We examined sensitivity to emotion in speech prosody in a sample of individuals with congenital amusi
64 re (RH) appears to be dominant for affective prosody in adults, while the left hemisphere (LH) mediat
66 d the perception and adaptation of receptive prosody in autistic adolescents and two groups of non-au
67 The interactive time course of emotional prosody in the context of emotional semantics was invest
68 reported difficulty understanding emotional prosody in their daily lives, suggesting some awareness
69 cry vocalisations and for building blocks of prosody (intonation) over the first 6 months of life, mo
72 ude that processing of both overt and covert prosody is reflected in the frequency-tagged neural resp
73 an integrative overview, arguing that speech prosody is subserved by the same anatomical and neuroche
75 asynchrony detection of visual and auditory prosodies leads to increased delta power in left motor c
78 icipants judged emotions in emotional speech prosody, nonverbal vocalizations (e.g., crying), and (si
79 shed light on the influence of language and prosody on infant attention and highlight the complexity
80 eted automatically (e.g., various aspects of prosody), or more deliberatively (e.g., description of r
82 previous research has investigated emotional prosody perception in these diseases under non-ideal lis
83 Previous research suggests that emotional prosody perception is impaired in neurodegenerative dise
84 dal neuroimaging and brain stimulation, that prosody perception takes dual routes along dorsal and ve
86 to cognitive and motor aspects of emotional prosody preparation and production and is more strongly
87 cisely how sentence processing and emotional prosody processing are both organized in the intact RH o
88 nd non-human primate models, we investigated prosody processing in narrative speech, focusing on pitc
89 d auditory sentence processing and emotional prosody processing, comparing the overlap for these two
94 peech amplitude envelope, resounding speaker prosody, temporally align with articulatory and body ges
96 regarding how the brain perceives emotional prosody, the neural bases involved in the generation of
97 the underlying mechanisms of conversational prosody: They empirically inform and refine existing the
98 specifically employ rhythmic modulations of prosody to estimate the duration of upcoming sentences,
101 nces lacking overt prosodic cues (instructed prosody trial) and were instructed to imagine the prosod
105 esses by which people recognize emotion from prosody, US and Indian participants were asked to judge
106 n and expression of affective and linguistic prosody were tested in subjects with documented unilater
107 showed attenuated adaptivity in categorizing prosody, whereas they were equivalent to controls in ter
108 sia have some deficits recognizing emotional prosody, whereas those with the semantic variant show mo
109 o be mostly driven by changes in emotion and prosody, which are mainly captured by acoustic features.
110 ns production and comprehension of emotional prosody, while a second model proposes that emotional pr
111 than matched controls at decoding emotional prosody, with decoding rates for some emotions up to 20%
112 wocky controls with morphemes and sentential prosody, word lists with lexical content but no phrase s
113 to a large collection of intended emotional prosody, yielding more than 3,000 minutes of recordings.