
Corpus search results (sorted by the word one position after the keyword)

Click a serial number to open the corresponding PubMed page.
1 ict and decision uncertainty during impaired speech perception).
2  listeners activate motor brain areas during speech perception.
3 from language-universal to language-specific speech perception.
4 s' knowledge of speech production to explain speech perception.
5 ing day-to-day functioning on tasks, such as speech perception.
6 accompany learning drive changes in auditory speech perception.
7 reports implicating premotor cortex (PMC) in speech perception.
8 tral and temporal benefits to yield improved speech perception.
9 t least in part, for their difficulties with speech perception.
10 early oscillation-based selection influences speech perception.
11 knowledge are integrated in the brain during speech perception.
12 nfluence is modulatory but not necessary for speech perception.
13 vidual variability in children's audiovisual speech perception.
14 t the motor system is critically involved in speech perception.
15 ive and expressive language, and outcomes of speech perception.
16 f both types of cues would negatively affect speech perception.
17 s thought to mediate phonological aspects of speech perception.
18 c categorization and be important for robust speech perception.
19 e pure tone hearing loss, and marked loss of speech perception.
20 enewed interest in gesture-based theories of speech perception.
21 er the motor cortex has an essential role in speech perception.
22 syllable patterns and is critical for normal speech perception.
23 s exist and that they affect auditory-visual speech perception.
24 her this simulation process is necessary for speech perception.
25 ransients and may be especially relevant for speech perception.
26  perceptual and cognitive functions, such as speech perception.
27 h visual information about lip movements for speech perception.
28  sensitive period for bimodal integration in speech perception.
29 al and auditory cues are combined to improve speech perception.
30 s-by-synthesis" mechanism in auditory-visual speech perception.
31 ach of the major theoretical perspectives on speech perception.
32 t, from activation of brain areas underlying speech perception.
33  temporal lobe neural systems engaged during speech perception.
34 d discourse tracking (CDT) as the measure of speech perception.
35 s for the neural mechanisms underlying human speech perception.
36 al network computer simulation of disordered speech perception.
37  the primary auditory cortex does not affect speech perception.
38 s how formants are processed neurally during speech perception.
39 ack the rapid neural computations underlying speech perception.
40 s are critical for many behaviors, including speech perception.
41 ctrally salient features can improve tactile speech perception.
42 ive of speech comprehension rather than mere speech perception.
43 ity but not with peripheral hearing or clear speech perception.
44 sm whereby learning to write might influence speech perception.
45 ilar decomposition strategy is used in early speech perception.
46 ipients with respect to binaural hearing and speech perception.
47  different prosodic features in multisensory speech perception.
48 ated in many previous studies of audiovisual speech perception.
49  short gaps in sound, which are critical for speech perception.
50 to directly investigate its causal effect on speech perception.
51 cognitive benefits, one of which is enhanced speech perception.
52 p movements and syllables during audiovisual speech perception.
53 s may be linked to phenomena in auditory and speech perception.
54 tory objects in human AC during multi-talker speech perception.
55 ural processes that are specific to auditory speech perception.
56 STG), a brain area known to be important for speech perception.
57 he hierarchical generative models underlying speech perception.
58 al sulcus (pSTS) is known to be critical for speech perception.
59 agrammatism and subjective difficulties with speech perception.
60  the brain tracks lip movements to help with speech perception.
61  evidence for a Predictive Coding account of speech perception.
62  reliability of them to identify deficits in speech perception.
63 als in a region of the brain specialized for speech perception.
64 o accounts of how prior knowledge influences speech perception.
65 xtend current mechanistic perspectives on AV speech perception.
66 ability of articulatory codes during passive speech perception.
67 n of sensorimotor information during passive speech perception.
68 y of amblyopia have impaired visual-auditory speech perception.
69 ementation of top-down control in continuous speech perception.
70 lations in 22 participants during continuous speech perception.
71 information from the articulators influences speech perception.
72 nts are an effective technique for enhancing speech perception abilities in quiet environments for pe
73                                Self-reported speech perception abilities were significantly better fo
74 g loss.SIGNIFICANCE STATEMENT Differences in speech perception ability between individuals with simil
75 strate that neural oscillations predict both speech perception ability in noise and listening effort.
76                                              Speech perception after implantation can vary widely--we
77 imately .3 between cognitive performance and speech perception, although some variability in associat
78                                              Speech perception among cochlear implant (CI) listeners
79 nnel noise-vocoded speech, thereby impairing speech perception and assessing whether this evokes doma
80 de modulation encoding is critical for human speech perception and complex sound processing in genera
81 nvolved in both language-specific processes (speech perception and comprehension, verbal working memo
82  involved in multisensory integration during speech perception and feedback control.
83 cordingly, they downweight pitch cues during speech perception and instead rely on other dimensions s
84 spheres are equally and actively involved in speech perception and interpretation.
85  provides a link between auditory signals of speech perception and motor programs of speech productio
86 ing, key for understanding behaviors such as speech perception and multimodal sensory integration.
87 s retain residual hearing, which can improve speech perception and overall outcomes.
88 oken language, grounding cognitive models of speech perception and production in human neurobiology.
89 e close connection between brain systems for speech perception and production, and in particular, ind
90  present in the posterior insula during both speech perception and production, suggesting an anatomic
91 though evidence for this has been limited to speech perception and production.
92 temporal, frontal and parietal lobes linking speech perception and production.
93 inks between the phonological mechanisms for speech perception and production.
94 nspeech categories, and the relation between speech perception and production.
95 ract form compared with speech coding during speech perception and production.
96 cortex, consistent with previous findings in speech perception and production.
97  by a critical computational unit for online speech perception and production: syllables.
98 etween frontal and temporal contributions to speech perception and reveal a hidden cost to processing
99  explain the foundations of the link between speech perception and speech production.
100 rom previous studies on visual awareness and speech perception and suggest that correlates of conscio
101  paradigm that dissociated between conscious speech perception and task relevance while recording EEG
102 aring loss can produce prolonged deficits in speech perception and temporal processing.
103 redictive top-down control during continuous speech perception and that top-down control is largely d
104 tex will inform our growing understanding of speech perception and the processing of other complex so
105 rticular those forming the prerequisites for speech perception and understanding.
106 voices in schizophrenia arise from disrupted speech perception and verbal working memory systems rath
107       People who upweight low frequencies in speech perception (and so perceive Laurel rather than Ya
108 erages of 0.74 for motor movements, 0.84 for speech perception, and 0.74 for speech production) in co
109  different cognitive tasks (motor movements, speech perception, and speech production), we show that
110 supratemporal plane during rhythm listening, speech perception, and speech production.
111 aced with restoration of hearing thresholds, speech perception, and synchronous activity in auditory
112 r skills with the auditory skills underlying speech perception, and the possible phylogenetic interac
113 distortions can lead to systematic errors in speech perception, and therefore hearing aid prescriptio
114                                     But like speech perception, and unlike several other perceptual s
115 esponses can perform transformations between speech-perception- and speech-production-based represent
116                                    Models of speech perception are centered around a hierarchy in whi
117 , with benefits shown to extend to untrained speech perception as well.
118                                           In speech perception, as in other domains, two functionally
119                       Hierarchical models of speech perception assume that, to extract semantic meani
120                         Results suggest that speech perception automatically triggers motor action, b
121 ovided by studies that have investigated (a) speech perception, (b) intensity discrimination, and (c)
122 ed specification of a computational model of speech perception based on predictive coding frameworks.
123 shold elevation and associated reductions in speech perception because speech sounds, especially cons
124 surements to investigate interactions during speech perception between native phonemes and talker's v
125 istive technology may improve not only their speech perception but also their connection and orientat
126     Cochlear implants enable improvements in speech perception, but music perception outcomes remain
127           Visual speech facilitates auditory speech perception, but the visual cues responsible for t
128 y cortical oscillations could play a role in speech perception by fostering hemispheric triage of inf
129          These results transform theories of speech perception by suggesting that even at the initial
130           Some of the age-related decline in speech perception can be accounted for by peripheral sen
131                                        Human speech perception can be described as Bayesian perceptua
132 e rather than specialization is critical for speech-perception capabilities that some have suggested
133 lation associated with tinnitus and impaired speech perception cause cochlear synaptopathy, character
134 ed model of causal inference in multisensory speech perception (CIMS) that predicts the perception of
135 sults support a predictive coding account of speech perception; computational simulations show how a
136 arpened Signals and Prediction Errors during speech perception could both explain these behavioural a
137 tudy, sentence recognition from the Mandarin speech perception database was measured in adult and ped
138                 Subsequent tests of Mandarin speech perception demonstrated that exposure to Mandarin
139 vide novel insights into the neural basis of speech perception, demonstrating that both acoustic feat
140        The primary symptom in humans is poor speech perception despite normal pure tone audiometry.
141 s article, we review the literature on human speech perception development within the context of this
142  preventable-likely contribute to individual speech perception differences.
143 hearing loss etiology may explain heightened speech perception difficulties in people overexposed to
144             Hidden hearing loss manifests as speech perception difficulties with normal hearing thres
145 her electrophysiological, psychophysical, or speech perception effects.
146                                     Auditory speech perception enables listeners to access phonologic
147 motor and sensorimotor systems can influence speech perception even in infants too young to produce s
148 ll as sounds, and lip-reading contributes to speech perception, even for listeners with good hearing,
149 ural ENV coding was a primary contributor to speech perception, even in noise; and (2) neural TFS con
150  measures and then evaluated with subjective speech perception experiments for both normal hearing an
151 o stimulus-brain rhythm interaction predicts speech perception facilitation.
152 nding and treating individual differences in speech perception for people suffering from SNHL.SIGNIFI
153 tures and language and it suggests a role in speech perception for the motor system underlying speech
154 e effects of ARHL on brain areas involved in speech perception, from the auditory cortex, through att
155  left-sided dominance in Wernicke's area for speech perception has been demonstrated in 2.5-mo-old ba
156                                              Speech perception has been studied for over a half centu
157                   Two long-opposing views of speech perception have posited a basis either in acousti
158              Debates about motor theories of speech perception have recently been reignited by a burs
159 ver, most functional neuroimaging studies of speech perception have used metalinguistic tasks that re
160                                Self-reported speech perception, hearing thresholds (0.25-16 kHz), and
161 hus crucial for rigorously testing models of speech perception; however, to the best of our knowledge
162                         For instance, during speech perception, humans transform a continuously varyi
163 up and top-down markers of poor multi-talker speech perception identified here could inform the desig
164 tions for: (i) perception-action theories of speech perception, (ii) the impact of "motherese" on ear
165 ts (32%) with NF1 showed clinically abnormal speech perception in background noise compared with 1 pa
166                           The HINT tests for speech perception in background noise, the major complai
167 s that a similar organization might underlie speech perception in humans.
168 rom background noise, leading to deficits in speech perception in modulated background noise.SIGNIFIC
169  showed clinically meaningful improvement in speech perception in noise (39 of 49 children [79.6%]) a
170 D, CROS, and no treatment groups in terms of speech perception in noise and disease-specific QOL in p
171 ovide improvements in sound localization and speech perception in noise over unilateral CIs, bilatera
172 onths, the CI group had significantly better speech perception in noise scores than the BCD group (di
173 inal cohort study, cognitive functioning and speech perception in noise showed a clinically meaningfu
174                      However, no benefit for speech perception in noise was found for the dual-proces
175 entation of repeating elements is crucial to speech perception in noise, since it allows superior "ta
176 he SMS than auditory regions for categorical speech perception in noise.
177 understanding of the neural basis for robust speech perception in noise.
178 ich can be considered a process analogous to speech perception in noise.
179 elated positively with behavioral indices of speech perception in noise.
180 fidence interval, 2.8 to 5.7], respectively) speech perception in noise.
181  were air conduction audiometry and binaural speech perception in noise.
182 and objective intelligibility (up to 97%) of speech perception in noise.
183 Sound energy above 8 kHz thus contributes to speech perception in noise.
184 dict variability in hearing-aid outcomes for speech perception in noise.
185 negatively by aging, potentially diminishing speech perception in noisy environments.
186  Moreover, each dimension predicts outcomes (speech perception in quiet and noise, subjective listeni
187  this manipulation significantly degraded CI speech perception in quiet by 15% and speech reception t
188 essed speech on bimodal listeners' telephone speech perception in quiet environments.
189 e most important known determinants of later speech perception in young children after cochlear impla
190 ) scores explained 60% of the variability in speech-perception in noise.
191   Outcomes were (1) postoperative changes in speech perception (in quiet was measured as a proportion
192 ul example of this phenomenon is categorical speech perception, in which a continuum of acoustically
193 he auditory system has centered around human speech perception, in which categorical processes result
194                                              Speech perception involves the integration of sensory in
195                                    Emotional speech perception is a multisensory process.
196                                        Human speech perception is a paradigm example of the complexit
197 e implanted safely and that their subsequent speech perception is at least as good as children implan
198         Therefore, a key step in audiovisual speech perception is deciding whether auditory and visua
199                  It is well established that speech perception is improved when we are able to see th
200                       A striking property of speech perception is its resilience in the face of acous
201                                              Speech perception is mediated by both left and right aud
202                                        Human speech perception is multisensory, integrating auditory
203       Recent psychophysics data suggest that speech perception is not limited by the capacity of the
204                                        Human speech perception is profoundly influenced by vision.
205                                              Speech perception is supported by both acoustic signal d
206        The influence of speech production on speech perception is well established in adults.
207 ts high temporal acuity, which is pivotal to speech perception, is a central issue of auditory scienc
208 S), a brain region known to be important for speech perception, is complex, with some regions respond
209 order language processes, or bilateral, like speech perception, is controversial.
210 g (fMRI) approach assessed sound perception, speech perception, language comprehension, and covert co
211                                       During speech perception, linguistic elements such as consonant
212 nd are instead retained over long durations, speech perception may require encapsulated memory buffer
213 73 SNPs were replicated with a self-reported speech perception measure.
214  account for 47% of the variance in the same speech-perception measures.
215                            The robustness of speech perception might, in part, result from multiple,
216    Electrophysiological correlates of infant speech perception (mismatch response to speech stimuli)
217        These results demonstrate that during speech perception, missing acoustic content is synthesiz
218                                              Speech perception most strongly activated superior tempo
219                         Historic theories of speech perception (Motor Theory and Analysis by Synthesi
220 ate or frequency, plays an important role in speech perception, music perception, and listening in co
221 anterior middle temporal and angular gyri; a speech perception network involving superior temporal an
222 owerful modulator of activity throughout the speech perception network.
223                   A computer simulation of a speech perception neural network was developed.
224      Although recent evidence indicates that speech perception occurs bilaterally, prevailing models
225                    Individual differences in speech perception often arise from disparities in access
226 we will demonstrate how our understanding of speech perception, one important facet of language, has
227 culatory phonological production rather than speech perception or lexical-semantic processes.
228 ion (tACS) leads to rhythmic fluctuations in speech perception outcomes after the end of electrical s
229  conversation (lipreading) markedly improves speech perception, particularly in noisy conditions.
230 y, a considerable reduction of the spread of speech perception performance from 40% to 93% for advanc
231 operties that have shown to be important for speech perception performance, and needs to be considere
232 ment" in articulator movement can compromise speech perception performance, raising the question of w
233 the latter, the interaction directly impacts speech perception performance.
234 latory configurations can influence infants' speech perception performance.
235 ing important spectral and temporal cues for speech perception, performance on speech tests is variab
236     Theories about the neural foundations of speech perception postulate that the left and right audi
237                                              Speech perception presumably arises from internal models
238                                              Speech-perception problems associated with noise overexp
239 nvolvement of specific motor circuits in the speech-perception process, we used event-related functio
240 sed with reference to their implications for speech perception processes.
241 imuli and a silence baseline; (ii) mid-level speech perception processing abilities were assessed by
242                     As expected, audiovisual speech perception recruited both auditory and visual cor
243 nts and auditory envelope during audiovisual speech perception reduced the accuracy of subsequent the
244 he temporal structure of language influences speech perception-related brain activity.
245   Our results show that spatial multi-talker speech perception relies upon a separable pre-attentive
246                        Our results show that speech perception relies upon a shared cortical auditory
247 ich sensorimotor integration plays a role in speech perception remains highly controversial, however.
248                                              Speech perception requires the rapid and effortless extr
249                                              Speech perception requires the successful interpretation
250                                        Human speech perception results from neural computations that
251                       Hearing thresholds and speech perception scores were evaluated with respect to
252 together explain ~ 25% of the variability in speech-perception scores in quiet using the cochlear imp
253 rin Hearing in Noise Test than with Mandarin speech perception sentences at the normal rate.
254 idence that infants' increasing precision in speech perception shapes which signals they will link to
255 ral evidence for a phonological loop linking speech perception, short-term memory and production rema
256 ith cross-modal predictive mechanisms during speech perception.SIGNIFICANCE STATEMENT Verbal communic
257 is procedure is problematic for multisensory speech perception since audiovisual speech and auditory-
258                          During multisensory speech perception, slow delta oscillations (~1-3 Hz) in
259  1 outcome of interest measured numerically: speech perception, sound localization, device use, and p
260 essing in humans, including implications for speech perception, spatial auditory processing and audit
261 ation in precentral gyrus shows that, during speech perception, specific motor circuits are recruited
262 in different cortical areas along the dorsal speech perception stream are distributed on different sp
263 versial, however, to what extent the brain's speech perception system actively uses articulatory (mot
264                     Sensorimotor activity in speech perception tasks varies as a function of context,
265 pmental milestones bidirectionally on infant speech perception tasks.
266 TFS sensitivity to performance in a range of speech perception tasks.
267 hey performed single-talker and multi-talker speech perception tasks.
268 en with cochlear implants, outcomes of adult speech perception tests were greater than preimplanted l
269 me measures were audibility, scores from the speech perception tests, and scores from a questionnaire
270 t success as measured by pediatric and adult speech perception tests.
271 bited more human-like sound localization and speech perception than models without, consistent with a
272 r the HINT results, the E+P group had poorer speech perception than the E and control groups across a
273         It is well established that in human speech perception the left hemisphere (LH) of the brain
274                                       During speech perception, the electric cortical activity of the
275 peech motor system (SMS) is activated during speech perception, the functional role of this activatio
276                                          For speech perception, this learning is selective: initially
277  disruption of human premotor cortex impairs speech perception, thus demonstrating an essential role
278  motor activation contributes to categorical speech perception under adverse listening conditions.
279 nduced by this disorder may actually improve speech perception under narrow conditions within an over
280                                              Speech perception uses information from both the auditor
281 d the cortical regions mediating categorical speech perception using an advanced brain-mapping techni
282           In contrast, this study shows that speech perception via a cochlear implant is unaffected b
283 f communication, and socioeconomic group) on speech perception was analysed.
284                                    Narrative speech perception was assessed through use of a masked s
285 r age at implant and interval since implant, speech perception was highest for children with hearing
286        To examine the role of motor areas in speech perception, we carried out a functional magnetic
287                                  However, in speech perception, we lack evidence of perception being
288                           But in addition to speech perception, we routinely extract from voices a we
289 s at these characteristic time scales during speech perception, we studied the spatial and dynamic pr
290         Specifically, we obtained changes in speech perception when adaptation to altered auditory fe
291 movements of communication partners improves speech perception when auditory signals are degraded or
292 y judgments about speech sounds (rather than speech perception, which involves decoding of sounds).
293 ogy of mirror neurons to the Motor Theory of speech perception, which posits that perception and prod
294  concerns whether humans are specialized for speech perception, which some researchers argue is demon
295 ation of Heschl's gyrus selectively disrupts speech perception, while stimulation of planum temporale
296 ion will improve phoneme recognition and (b) speech perception will improve when channels with high t
297 sory detail and prior expectations influence speech perception with computational modelling, we provi
298 hat Broca's area participates in categorical speech perception, with a possible role of translating s
299               Language development builds on speech perception, with early disruptions increasing the
300 uperior temporal gyrus (STG) is critical for speech perception, yet the organization of spectrotempor
