
Corpus search results (sorted by the word one position after the keyword)

Click a line's serial number to open the corresponding PubMed page.
1 isually guided learning in a self-organizing neural network model.
2 across areas and reproduced by a 'reservoir' neural network model.
3 ation should be abandoned for an integrative neural network model.
4 as quantitative response variables in a deep neural network model.
5 ued with quantitative label-free imaging and neural network model.
6 ation in mixed communities, we implemented a neural network model.
7 nd the role of inhibition, based on our deep neural network model.
8 thod for interpreting the deep convolutional neural network model.
9  cognitive task activations in a large-scale neural network model.
10 control task and compared the results with a neural network model.
11 patially extended, conductance-based spiking neural network model.
12 a framework to develop complex and realistic neural network models.
13 true positives for the linear regression and neural network models.
14  species as input to train two Convolutional Neural Network models.
15  gradient training and yields more sensitive neural network models.
16 obabilities, or by a popular class of simple neural network models.
17 nsory biology, phylogenetics, and artificial neural network models.
18 than simpler "bag of words" or convolutional neural network models.
19 ing data privacy and algorithmic bias of the neural network models.
20  SITDL can be usefully applied in artificial neural network models.
21 DP-M-net and U-finger are both convolutional neural network models.
22 s to simply and efficiently simulate spiking neural network models.
23 rn new tasks more efficiently than some deep neural network models.
24 light into the black box that is the trained neural network model, a task that has proved difficult a
25 mputer simulations of a recurrent inhibitory neural network model account not only for enhanced respi
26                                Convolutional neural network models accurately identified the L3 level
27      Even when trained on limited data, deep neural network models accurately predict capsid viabilit
28 ing a taxon-specific dataset, using a larger neural network model and improving consensus basecalls i
29                      We consider a two-layer neural network model and investigate which STDP rules ca
30 with LASSO has equal performance to the best neural network model and that the use of administrative
31                                   Using both neural network modeling and concepts from the study of d
32 evelop supervised, multi-task, convolutional neural network models and apply them to a large number o
33          We used a combination of artificial neural network models and behavioral experiments to inve
34         Here we give a brief introduction to neural network models and deep learning for biologists.
35 sion, support vector machines and artificial neural network models and demonstrate the ability of our
36 lated the evolutionary history of artificial neural network models and observed an emergent bias towa
37 a, we build anatomically-constrained shallow neural network models and train them to identify visual
38 cross visual features (as measured by a deep neural network model) and different types of environment
39 lasticity research-theoretical (the field of neural network modeling) and neurobiological (long-term
40 ks by changing conductances in a biophysical neural network model, and we investigate how it affects
41                      By applying a bottom-up neural network modeling approach, our results suggest th
42 e assessments, CSNN outperformed preexisting neural network modeling approaches for both cancer diagn
43                                          The neural network models are based on parameters generally
44 network model response function, and because neural network models are nonlinear, the gradient depend
45                     Furthermore, explainable neural network models are not fully investigated for thi
46 ANCE STATEMENT When computerized distributed neural network models are required to generate both feat
47 ptic plasticity, embedded in decision-making neural network models, are shown to yield matching behav
48 novel perspective on the utility of decoding neural network models as a metric for quantifying the en
49 ist scanning (PAS), a strategy that trains a neural network model based on measurements of cellular r
50 sensors and a well-trained dilated recurrent neural network model based on prototype learning.
51                               A hierarchical neural network model based on simple computational princ
52 epsy centers were used to train feed-forward neural network models based on tissue volume or graph-th
53 version of the algorithm, closely related to neural network models based on topographic representatio
54 act (n = 20) were used to successfully train neural-network models based on either presence/absence o
55 ope of C-N couplings was actively learned by neural network models by using a systematic process to d
56 veloped a novel probability-based Artificial Neural Network model, called NORF model, using 21 years
57  we demonstrated that the deep convolutional neural network model can accurately diagnose the latent
58                     Here we show that a deep neural network model can be trained to analyse ordinary
59     Conclusion A deep-learning convolutional neural network model can estimate skeletal maturity with
60 sociated with antibiotic activity learned by neural network models can be identified and used to pred
61 ow how the binding mechanism learned by deep neural network models can be interrogated, using a recen
62                       Furthermore, shallower neural network models can exhibit stronger symmetry brea
63    Because of this assumption, the resulting neural network models cannot describe long-range interac
64                                           Do neural network models capture the cognitive demands of h
65                                    A Hebbian neural network model captures some aspects of listener p
66             A simple, hedonically structured neural network model captures this computation.
67                      Finally, recurrent deep neural network models clearly outperform parameter-match
68                                            A neural network model consisting of miR-200a/200b/429/141
69 work, we proposed a novel deep convolutional neural network model (DCNN) for HLA-peptide binding pred
70                           Here, we develop a neural network model, DeepEdit, that not only recognizes
71 splantation calculation produced by our deep neural network model demonstrated 89 +/- 4% accuracy and
72 cohort studies, the single-input ECG-AF deep neural network model demonstrated good performance in pr
73 ieved without re-training the parameters and neural-network models, demonstrating the robustness and
74                            Using a realistic neural network model describing ion mechanisms, we show
75                        Compared with earlier neural-network models, DFINE enables flexible inference,
76  a group feature selection-based deep sparse neural network model (DNN-GFS) that is optimized for neo
77                                         Deep neural network models (DNNs) are essential to modern AI
78                               A conventional neural network model (e.g., integrate-and-fire) would re
79    Here we algorithmically construct spiking neural network models, each composed of 5000 neurons.
80 d functional clustering in trained recurrent neural network models embedded with a columnar topology.
81   Third, a final 3-dimensional convolutional neural network model evaluated echocardiographic videos
82                                     Existing neural network models fail to account for these properti
83 stimuli, using image-computable hierarchical neural network models fit directly to psychophysical tri
84           In 1982, John Hopfield published a neural network model for memory retrieval, a model that
85 yante, a multi-task five-layer convolutional neural network model for predicting variant type (SNP or
86 logical network-based regularized artificial neural network model for prediction of phenotype from tr
87          Here, we investigate whether a deep neural network model for speech separation can utilize t
88 , we developed a 3-dimensional convolutional neural network model for view selection ensuring stringe
89       Although recent studies explored using neural network models for BioNER to free experts from ma
90 ial networks may expand the use of validated neural-network models for the evaluation of data collect
91 ng algorithms, Random Forest and a Recurrent Neural Network model, for gaze event classification.
92                           Because training a neural network model from scratch is costly in terms of
93           Here, we demonstrate that the same neural network models from AF2 developed for single prot
94                                    Recurrent neural network models fulfill this need, but there are m
95 jointly modeling RMST at multiple times, the neural network model gains prediction accuracy by inform
96                                              Neural network models have also been used to study the l
97    Originally inspired by neurobiology, deep neural network models have become a powerful tool of mac
98                                 For decades, neural network models have been widely popular in fields
99                                         Deep neural network models have revived long-standing debates
100                         Base-pair-resolution neural network models identified strong cell-type-specif
101                   The unique features of our neural network model in handling missing data and calcul
102           Combining human brain imaging with neural network modelling in a probabilistic reversal lea
103 EM dataset and to compare many convolutional neural network models (Inception-v3, Inception-v4, ResNe
104                                 An attractor neural network model incorporating social information as
105 re, we developed a physiologically realistic neural network model incorporating the three classes of
106 ains were used, the performance of recurrent neural network models increased in relation to the quant
107                                              Neural-network models indicate that this form of spontan
108                                A large-scale neural network model indicated that asymmetric depressio
109                                An artificial neural network model inspired by such nonassociative lea
110 ary linear threshold circuits (an artificial neural network model) into DNA strand displacement casca
111 and (in press) showed that an extension of a neural network model introduced by N. A. Schmajuk and J.
112 he silicon photonic circuit and a continuous neural network model is demonstrated through dynamical b
113                                          The neural network model is derived from a larger model that
114                                      Next, a neural network model is implemented to detect missing te
115 he training set, the generalizability of the neural network model is limited.
116 odel by 7%, indicating that the graph-driven neural network model is robust and beneficial for accura
117                                  Because the neural network model is trained on all samples simultane
118                                An artificial neural network model is trained to predict compatibility
119                                     First, a neural network model is used based on the principle of d
120                                            A neural network model is used to retain fragments with th
121                                              Neural network modeling is often concerned with stimulus
122 -symmetric viewpoint tuning in convolutional neural network models is not unique to faces: it emerges
123              A common criticism against such neural network models is that it is difficult to interpr
124         However, a major limitation with the neural network models is the requirement of high volumes
125   A critical challenge in training effective neural network models lies in hyperparameter optimizatio
126                            However, existing neural network models mainly focus on optimizing network
127                Finally, we consider how deep neural network models might help us understand brain com
128                  Two different convolutional neural networks models (MobileNet and InceptionResNet V2
129 r function prediction method, we developed a neural network model, named NNTox, which uses predicted
130 thin the broader framework of a mathematical neural network model of associative learning.
131                                            A neural network model of biophysical neurons in the midbr
132 l model, combining (1) a large-scale spiking neural network model of cat V1 and (2) a virtual prosthe
133        Here, we use a brain-constrained deep neural network model of category formation and symbol le
134                The present article applies a neural network model of classical conditioning to invest
135    The reported data are well described by a neural network model of classical conditioning.
136              Our simulations using a spiking neural network model of cortex reproduce a range of cogn
137           In this work, we extend a previous neural network model of countermanding to account for th
138  by features extracted by an artificial deep neural network model of face processing.
139  present the Latent Cause Network (LCNet), a neural network model of LC inference.
140                         Here, we show that a neural network model of prefrontal cortex and basal gang
141                                     A recent neural network model of single-neuron integration derive
142 ith attractor network theory, we developed a neural network model of the CA3 with attractors for both
143 These predictions are validated in a spiking neural network model of the OB-PC pathway that satisfies
144            In a Hodgkin-Huxley-based spiking neural network model of visual cortex, we show that modu
145               Building upon recent recurrent neural network models of choice processes, we propose a
146                         Previously described neural network models of cognitive tasks suggest that se
147 ), which stabilizes decoding using recurrent neural network models of dynamics.
148          A continuum is consistent with most neural network models of learning, in which synaptic str
149 s in the medial temporal lobe inform current neural network models of memory, and may lead to a more
150                                              Neural network models of persecutory delusions highlight
151                                         Deep neural network models of sensory systems are often propo
152 ne hand and CMR(glc(ox)) on the other allows neural network models of such activity to probe for poss
153                      In contrast, artificial neural network models of vision are typically feedforwar
154                             Deep feedforward neural network models of vision dominate in both computa
155 s to the explanatory depth and reach of deep neural network models of visual and other forms of intel
156                            By contrast, deep neural network models of visual object recognition remai
157                                    Recently, neural network models of visual object recognition, incl
158 lls in the genus Conus can be generated by a neural-network model of the mantle.
159 two pressures with information-theoretic and neural-network models of complexity and ambiguity and si
160                      By training an advanced neural network model on the difference in high-fidelity
161         Here, we present DeltaSplice, a deep neural network model optimized to learn the impact of mu
162 has been difficult to explain theoretically, neural network models optimized for WM typically also ex
163    Here we show: (1) Recurrent convolutional neural network models outperform feedforward convolution
164                                 However, the neural network model outperformed even the maximum opera
165 ating a new sequence-embedding convolutional neural network model over a thermodynamic ensemble of RN
166                                    Recurrent neural network model performance was superior under a va
167                        Analysis of recurrent neural network models performing the task revealed that
168                        Data-driven recurrent neural network modeling pointed to altered intra-habenul
169 fic at the level of individual synapses, but neural network models predict interactions between plast
170                                          The neural network models predict substructures and toxicity
171 size, grade of vascular invasion, artificial neural network models predicting the likelihood of HCC r
172  by PyMS at dates from 4 to 20 months apart, neural network models produced at earlier times could no
173                                         Deep neural network models provide a powerful experimental pl
174                                     A simple neural network model recapitulated these results and off
175       The sensitivity is the gradient of the neural network model response function, and because neur
176                     Dissecting our recurrent neural network model revealed strong inhibitory-to-inhib
177                           Finally, a spiking neural network model revealed that spatially clustered I
178                           Here, we propose a neural network model, RMSF-net, which outperforms previo
179  from the array data employing an artificial neural network model (root mean square error for testing
180 r diagnostic accuracy study, a convolutional neural network model, SCORE-AI, was developed and valida
181  invariant recognition of objects by humans, neural network models should explicitly incorporate buil
182 climate change scenarios and a convolutional neural network model show a further increase in the numb
183 gistic regression model and a deep recurrent neural network model, show very poor performance charact
184 oscopy/US system paired with a convolutional neural network model showed high diagnostic performance
185                                        A new neural network model shows that our results can only be
186                                  In a simple neural network model, simulated populations tuned to det
187                        In standard attractor neural network models, specific patterns of activity are
188  such plastic synapses are incorporated into neural network models, stability problems may develop be
189                  In a simulated hierarchical neural network model, such labels are learned quickly an
190                               We developed a neural network model that accounts for this finding as w
191 epresentation, here we propose a feedforward neural network model that adjusts its learning rate onli
192                               We developed a neural network model that can account for major elements
193                                     A simple neural network model that can efficiently compress infor
194       Here, we provide a purely unsupervised neural network model that can explain these and other re
195                          Here we introduce a neural network model that can predict various quantum pr
196 research paper proposes a deep convolutional neural network model that can rapidly and accurately ide
197                 Our results show that a deep neural network model that directly estimates the paramet
198 tex of the olfactory system with a realistic neural network model that incorporates two general mecha
199         Here, we present TweetyNet: a single neural network model that learns how to segment spectrog
200            Here, we attempt to do so using a neural network model that learns to map an internal cont
201 icroT-CNN, an avant-garde deep convolutional neural network model that moves the needle by integratin
202 troduce PDGrapher, a causally inspired graph neural network model that predicts combinatorial perturb
203                              ElemNet, a deep neural network model that predicts formation energy from
204 nic systems; we propose a correlation-driven neural network model that predicts the useful lifetime b
205 erpretable-by-design" approach, we present a neural network model that provides insights into RNA spl
206 e favorably with the predictions of a recent neural network model that uses a recurrent architecture
207 mputational models, especially convolutional neural network models that have shown success in explain
208 aper we study basic nonconvex 1- and 2-layer neural network models that learn random patterns and der
209 dataset, we used NSD to build and train deep neural network models that predict brain activity more a
210 genic regions and their vicinity, we develop neural network models that predict gene expression state
211 or transmission, and can be implemented in a neural-network model that makes testable predictions abo
212                 Here, we developed a spiking neural-network model that produces spontaneous slow ramp
213                              In keeping with neural-network models that incorporate bidirectional lea
214 class of biologically plausible hierarchical neural network models, there is a strong correlation bet
215 using the images as input to a convolutional neural network model, they were standardized and augment
216 ere, we investigate the capability of a deep neural network model to automate design of sequences ont
217              In this work, we develop a deep neural network model to automatically detect eye contact
218 this computational arrangement by training a neural network model to solve causal inference for motio
219                 First, we implemented a deep neural network model to substitute the 3D electromagneti
220 210)Pb ((210)Pb(ex)) profiles and then use a neural network model to upscale these observations.
221  as images and uses pretrained convolutional neural network models to classify copy number states.
222                             Finally, we used neural network models to demonstrate that behavioural si
223      In addition, we developed convolutional neural network models to discriminate these subtypes bas
224 graph neural networks, which generalize deep neural network models to graph structured data, have sho
225 ts in this scenario and use simulations with neural network models to illustrate why.
226  aortic domain to serve as training data for neural network models to predict the initiating combined
227     In sum, we present a framework for using neural network models to probe the sequences instructing
228  and Cognitive Development Study, we trained neural network models to stratify general psychopatholog
229                                      We used neural network models to study how context-specific visu
230 g (ML, i.e., Gaussian process and artificial neural network) models to encode the structure-property
231 we update and utilize Akita, a convolutional neural network model, to extract the sequence preference
232             By analyzing a spiking recurrent neural network model trained on a WM task and activity o
233                          We developed a deep neural network model trained on brain MRI scans of healt
234                           Deep convolutional neural network model trained on natural image sets and a
235                                            A neural network model trained on similar tasks replicated
236                    Using a modular recurrent neural network model trained to perform analogous tasks,
237                                            A neural network model trained to perform the same tasks s
238         In many applications, one works with neural network models trained by someone else.
239 ct recognition and data-driven convolutional neural network models trained end-to-end on large popula
240 rability modulation, we probed convolutional neural network models trained to categorize objects.
241 a computationally, we investigated recurrent neural network models trained to perform several WM-depe
242                     Finally, using recurrent neural network models trained to perform the same task,
243                                    Recurrent neural network models trained to perform the task reveal
244 f ramping activity also emerged in recurrent neural network models trained to solve a similar spatial
245            Moreover, current decision-making neural network models typically aim at explaining behavi
246                            In contrast, most neural network models use homogeneous units that vary on
247              A distinguishing feature of the neural network models used in Physics and Chemistry is t
248  baseline BNT performance was explained by a neural network model using left and right (1)H-MRS ratio
249 ee approaches applied: (i) CODESSA PRO, (ii) Neural Network modeling using large pools of theoretical
250                        Finally, we construct neural network models using Hyperband, an automatic hype
251  several models, including a deep multi-task neural-network model using multiple loss optimization.
252 ion model, support vector machine model, and neural network model, using a large dataset of verified
253                                       A deep neural network model was associated with higher predicti
254                                An artificial neural network model was built to predict the ammonium c
255                                An artificial neural network model was developed for the correlation o
256                                            A neural network model was developed that outperformed sen
257                  Here, the transformer-based neural network model was first pre-trained to recognize
258                            A fully recurrent neural network model was optimized to perform a spatial
259                                            A neural network model was previously developed to predict
260 c to assess the performance of the recurrent neural network model was quadratic weighted kappa (QWK)
261 eceiver Operating Characteristic curve for a neural network model was significantly larger than that
262 assessments, a cross-validated probabilistic neural network model was superior and could discriminate
263 el was used to select variables, and a fuzzy neural network model was then constructed using factors
264                                            A neural network model was then trained to output vertebra
265                              A convolutional neural network model was trained (n=1238) and validated
266                                            A neural network model was trained on each donor's pairwis
267 hods: A Lung Cancer Prediction Convolutional Neural Network model was trained using computed tomograp
268                                An artificial neural network model was used to select variables, and a
269 the dimensionality of the data and a grid of neural network models was run to optimize the model desi
270 2D) and three-dimensional (3D) convolutional neural network models was trained and internally validat
271                  Using a purely feed forward neural network model, we show that following repeated di
272 excitability and connectivity into a spiking neural network model, we were able to demonstrate that c
273                                      Through neural network modeling, we further show that a purely g
274 rprints and using them as a dataset to train neural network models, we obtained models that successfu
275                       Using simulations with neural network models, we show that contemporary statist
276                         Drawing on attractor neural-network models, we propose a framework treating s
277  the predictions of the maximum operator and neural network model were not significantly different fr
278                                          The neural network models were built using temperature-const
279 d (US) and histopathologic data, three graph neural network models were compared to predict ALNM in e
280 procedures across 44 U.S. institutions, deep neural network models were created to classify anesthesi
281                                Convolutional neural network models were trained to extract traffic an
282                                   Artificial neural network models were trained to predict the circad
283                                Convolutional neural network models were trained using pathologist ann
284 ence for small amounts of nonlinear effects, neural-network models were outperformed by linear regres
285 e these features of behavior by developing a neural network model where planning itself is controlled
286        DEOCSU entails the deep convolutional neural network model which was trained with curated ChIP
287   These images are then used to train a deep neural network model, which is used to call SNPs.
288                             We constructed a neural-network model whose output is the weighted sum of
289 crobial communities, the application of such neural network models will increase accuracy of predicti
290      Here, we introduce SpatialGlue, a graph neural network model with a dual-attention mechanism tha
291         Here, we constructed a convolutional neural network model with a large-scale GWAS meta-analys
292 model and also develop a graph convolutional neural network model with both utilizing Hi-C data and 5
293  of a strong linear component, a feedforward neural network model with entirely random connectivity c
294                           We also compared a neural network model with multiple regression analysis t
295       The present work describes a recurrent neural network model with probabilistic spiking mechanis
296 n cortex, we test whether a cortical spiking neural network model with such a mechanism can learn a m
297 mputational framework to point-process-based neural network models with exponential stochastic intens
298 of query peptides, and applying a sequential neural network model, with one long short-term memory ce
299 enotype space to the TCN space using a graph neural network model without intermediate clustering of
300 urthermore, we show that our spatial-feature neural network model, without imposing mechanistic assum
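The ordering above — concordance lines sorted by the first word to the right of the node phrase — can be sketched in a few lines. This is a minimal illustration, not the tool that generated this page; the `kwic` helper, the `width` parameter, and the sample sentences are all hypothetical.

```python
import re

def kwic(sentences, node, width=45):
    """Build KWIC lines for `node` and sort them by the first word after it.

    Matches the node phrase (allowing a plural 's'), pads the left context
    to a fixed width so the node column aligns, and sorts on the lowercased
    word immediately to the right -- the '1 word after' sort used here.
    """
    lines = []
    for s in sentences:
        m = re.search(re.escape(node) + r"s?", s)
        if not m:
            continue
        left = s[:m.start()][-width:].rjust(width)
        right = s[m.end():]
        words_after = right.split()
        sort_key = words_after[0].lower() if words_after else ""
        lines.append((sort_key, left + m.group(0) + right))
    return [line for _, line in sorted(lines)]

# Hypothetical sample sentences, echoing the style of the hits above.
sents = [
    "We consider a two-layer neural network model and investigate STDP rules.",
    "Existing neural network models fail to account for these properties.",
    "A neural network model was trained on each donor's data.",
]
for line in kwic(sents, "neural network model"):
    print(line)
```

Sorting on the right-hand neighbor ("and" < "fail" < "was") groups hits by the construction that follows the keyword, which is exactly why the listing above clusters "…model and…", "…models are…", "…model was trained…" and so on into runs.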

 