
Corpus search results for "backpropagation" (sorted by the first word to the right)

Click a serial number to display the corresponding PubMed page.
1 r signals required by strict formulations of backpropagation.
2 transpose operations required in traditional backpropagation.
3 mplex combines epistemic autonomy with error backpropagation.
4 ade dendrites and dendritic spines by active backpropagation.
5 t onset of somatic AP was produced solely by backpropagation.
6 c potentials and facilitate action potential backpropagation.
7 nts are consistent with active, not passive, backpropagation.
8 t require axonal action potential firing and backpropagation.
9  travel surveys are used as ground truth for backpropagation.
10 S to increase neuronal gain and the speed of backpropagation.
11 ells being diagnostic to be calculated using backpropagation.
12 owing for the calculation of gradients using backpropagation.
13 t parameters are optimized through a revised backpropagation.
14 that this distinct mechanism, in contrast to backpropagation, (1) underlies learning in a well-establ
15 to solve classification tasks using "in situ backpropagation," a photonic analog of the most popular
16                     In machine learning, the backpropagation algorithm assigns blame by multiplying e
17  This study presents a neuromorphic, spiking backpropagation algorithm based on synfire-gated dynamic
18 ive power of this modeling framework and the backpropagation algorithm for setting the parameters.
19 iking neural networks (SNNs) using the error backpropagation algorithm has made significant progress
20 y believed that end-to-end training with the backpropagation algorithm is essential for learning good
21                                          The backpropagation algorithm learns quickly by computing sy
22 edforward networks trained end-to-end with a backpropagation algorithm on simple tasks.
23                                          The backpropagation algorithm solves this problem in deep ar
24 g Neural Network implementation of the exact backpropagation algorithm that is fully on-chip without
25 classical conditioning tasks using the error backpropagation algorithm to guide learning.
26 te structure in large data sets by using the backpropagation algorithm to indicate how a machine shou
27  so far(10-22) have been unable to apply the backpropagation algorithm to train unconventional novel
28 arge models with the same performance as the backpropagation algorithm widely used in deep learning t
29 , the workhorse of modern deep learning, the backpropagation algorithm, has proven difficult to trans
30 al neural network-based model trained by the backpropagation algorithm.
31 e implemented in a network without using the backpropagation algorithm.
32                            In addition, this backpropagation also results in an unusually high rate o
33 ral networks faced a similar challenge until backpropagation and automatic differentiation transforme
34 fire tonically resulting in action potential backpropagation and dendritic Ca(2+) influx.
35  the influence of apical length on dendritic backpropagation and excitability, based on a Na(+) chann
36 ing algorithms, such as k-nearest neighbors, backpropagation and probabilistic neural networks, often
37 of Kv4.2 subunits, regulate action potential backpropagation and the induction of specific forms of s
38                           We report enhanced backpropagation and theta resonance and decreased summat
39 ring rates reach 40 Hz (activity-independent backpropagation); and (3) do not exhibit signs of a 'cal
40  activation and loss functions and with both backpropagation- and Hebbian-based learning rules.
41                                              Backpropagation Applied to Handwritten Zip Code Recognit
42 ing decision trees, support vector machines, backpropagation artificial neural networks, extreme grad
43 hat FSI was associated with action potential backpropagation (bAP) and a supralinear increase in dend
44                               To train PNNs, backpropagation-based and backpropagation-free approache
45              fastISM reduces the gap between backpropagation-based feature attribution methods and IS
46 utation effects from Integrated Gradients, a backpropagation-based feature attribution, and character
47 logical cognition can be achieved with error backpropagation-based learning.
48              It far surpasses the runtime of backpropagation-based methods on multi-output architectu
49 tudy on the meteorological data and types of backpropagation (BP) algorithms used to train and develo
50  that average obtained from the conventional backpropagation (BP) method can hardly overcome 0.35 and
51                                        After backpropagation (BP) training, the ANN model was able to
52                                      Despite backpropagation (BP)-based training algorithms being the
53  (10 mM) had no effect on activity-dependent backpropagation but blocked the effect of CCh.
54     Instead, recurrent networks trained with backpropagation capture the time-encoding properties and
55 In association with the enhancement of spike backpropagation, CCh increased the amplitude and duratio
56 rd Backpropagation (FFBP) and Cascadeforward Backpropagation (CFBP), were developed and trained using
57 o dopamine, the pulses of GABA prohibited AP backpropagation distally from the application site, even
58 es can likely be explained by differences in backpropagation efficiency, arising from the specific co
59           Two ANN architectures, Feedforward Backpropagation (FFBP) and Cascadeforward Backpropagatio
60 ward-propagating light and simulated in situ backpropagation for 64-port photonic neural networks tra
61 ransforming data representations, as well as backpropagation for model finetuning, saliency map compu
62 th a smooth constitutive update that enables backpropagation for the NN training.
63     To train PNNs, backpropagation-based and backpropagation-free approaches are now being explored.
64 oreover, we demonstrated that AGOP, which is backpropagation-free, enabled feature learning in machin
65                 Yet training ANN using error backpropagation has created the current revolution in ar
66                            The advantages of backpropagation have made it the de facto training metho
67             Non-decremental, non-dichotomous backpropagation in AOB primary dendrites ensures fast, r
68 ar dendritic computation would support error backpropagation in biological neurons.
69 ty-dependent attenuation of action-potential backpropagation in current-clamp simulations of a CA1 py
70 itic recording data means that the extent of backpropagation in thalamocortical (TC) and thalamic ret
71 alcium dynamics during action potential (AP) backpropagation in three types of V1 supragranular inter
72       Na(V)1.2 loss reduced action potential backpropagation into dendrites, impairing synaptic plast
73 uence synaptic potentiation following active backpropagation into dendrites.
74 tion and undergo strong attenuation in their backpropagation into the dendrites (length constant, 76
75 or local synaptic stimulation, and the rapid backpropagation into the dendritic arbor depended upon v
76 potential repolarization, repetitive firing, backpropagation (into dendrites) of action potentials, a
77    At the same time, the traditional form of backpropagation is biologically implausible.
78                                              Backpropagation is limited, however, and action potentia
79                                              Backpropagation is widely used to train artificial neura
80         We implement the Levenberg Marquardt backpropagation (LMB) algorithm for investigating an inn
81                                          The backpropagation method has enabled transformative uses o
82 mong the various training schemes, the error backpropagation method that directly uses the firing tim
83 nberg-Marquardt, quasi-Newton, and resilient backpropagation methods are employed to train the ANN.
84 phic findings and patient age, a three-layer backpropagation network was developed to predict whether
85  the four classifiers are then fused using a backpropagation neural network (BNN) to diagnose each re
86  analyzed with multivariate statistics and a backpropagation neural network (NN).
87 rs evaluated the performance of feed-forward backpropagation neural networks in predicting rapid prog
88 f the spectra using the multivariate methods backpropagation neural networks, decision tree, adaboost
89 ting the frequency of repetitive firing, the backpropagation of action potential into dendrites, and
90 onsiders nonlinear dendritic integration and backpropagation of action potentials from the soma to th
91 tion of synaptic input or the initiation and backpropagation of action potentials in a branch-selecti
92 ing expression of Nav1.2 channels attenuates backpropagation of action potentials into dendrites of c
93 s to an increase in synaptic integration and backpropagation of action potentials into distal dendrit
94                                          The backpropagation of action potentials into the dendritic
95 ppear at odds with the unusually weak active backpropagation of action potentials into the soma and d
96 h plasticity between LTD and LTP by boosting backpropagation of action potentials.
97  but is thought to rely in part on dendritic backpropagation of action potentials.
98 n, Na(V)1.2 at the proximal AIS promotes the backpropagation of APs to the soma.
99                                              Backpropagation of autonomously generated APs was reliab
100  the dendritic arbor, calcium signals during backpropagation of both single APs and AP trains were re
101 of p-hydroxyphenacyl (pHP) GABA demonstrates backpropagation of GABAAR-mediated depolarizations from
102                                              Backpropagation of somatic action potentials generated S
103 ials in the axon initial segment followed by backpropagation of these spikes throughout the neuron re
104  this computation,(1)(,)(2) facilitating the backpropagation of value from the predicted reward to th
105 works have reinvigorated interest in whether backpropagation offers insights for understanding learni
106 rs of neurons and performs as effectively as backpropagation on a variety of tasks.
107 opamine is known to inhibit action potential backpropagation, our experiments revealed an unexpected
108                    Some biological models of backpropagation rely on feedback projections that are sy
109 site of AP generation and the true extent of backpropagation remain unknown.
110  and action potential output caused by spike backpropagation results in the appearance of high spike
111 terpreting deep survival models using a risk backpropagation technique.
112 tal deep-learning models primarily relies on backpropagation that is unsuitable for physical implemen
113 xclusively controls I(NaP) generation and AP backpropagation, thereby playing a prominent role in syn
114                        Our approach combines backpropagation through time with cascade learning, enab
115 introduce the mechanical analogue of in situ backpropagation to enable highly efficient training of m
116                                ElastNet uses backpropagation to learn the hidden elasticity of object
117 nity, and we experimentally demonstrate this backpropagation to obtain gradient with high precision.
118  called physics-aware training, that applies backpropagation to train controllable physical systems.
119 ng between inner layers is scrambled because backpropagation training does not require perceptrons to
120                               Simulations of backpropagation using the device properties reach ideal
121                   This spatial control of AP backpropagation was mediated by Ia-type potassium curren
122                                              Backpropagation was reliant on Na+ channels: in 1 microm
123 med that credit assignment is best solved by backpropagation, which is also the foundation of modern
124 s-aware training combines the scalability of backpropagation with the automatic mitigation of imperfe
125 and (iii) can be specified and trained using backpropagation with the same ease-of-use as contemporar
