Corpus search results (sorted by the word one position after the keyword)

Click a serial number to open the corresponding PubMed entry.
1 scales for the Euler fluid equations without regularization.
2 oduced for variance stabilization as well as regularization.
3 d clustering and soft clustering by sparsity regularization.
4 r mixed regressions with Lp (0 < p < 1) norm regularization.
5 al methods are based on mixture modeling and regularization.
6 inverse procedure and abolishes the need for regularization.
7 y selected by Bayesian information criterion regularization.
8 ystems of linear equations by using sparsity regularization.
9 g techniques like Group lasso with l(1)/l(2) regularization.
10 o generalized linear models with elastic net regularization (14VF and 14GT), based on the 14 genes, s
11 CAPS) project conducted extensive review and regularization across studies of all schizophrenia linka
12 nforcement may be pivotal to achieve pattern regularization and restore the neural activity in the nu
13 using an iterative regression algorithm with regularization and subsequently fit with a zero-memory n
14 field (STRF) with two hold-back datasets for regularization and validation.
15                         In addition, we used regularization and variable selection via the elastic ne
16           Age at menarche, time to menstrual regularization, and duration or intensity of menstrual f
17                         The influence of the regularization appears solely on the right-hand side, wh
18 on the negative binomial distribution, use a regularization approach to select a few transcripts coll
19 rning technique we develop uses a task-based regularization approach.
20 ssion calls lends itself perfectly to modern regularization approaches that thrive in high-dimensiona
21 odel utilizes maximum entropy modeling with regularization-based structure learning to statistically
22 , we highlight two approaches of statistical regularization, Bayesian least absolute shrinkage and se
23  logistic regression algorithm with Bayesian regularization (BLogReg) is available from http://theova
24  of a constant second-order coefficient, our regularization by a singular function is straightforward
25                                   Reweighted regularization can enhance sparsity and obtain better sp
26 ntly developed NCA-r algorithm with Tikhonov regularization can help solve the first issue but cannot
27 esent an ensemble approach using elastic net regularization combined with mRNA expression profiling a
28           For IFBP with 15 iterations and no regularization compared to 3DRP, both using a ramp filte
29  program CONTIN, the parameter governing the regularization constraint is adjusted by variance analys
30 ions, and the various physical and numerical regularizations contributing in different proportions in
31 multivariate calibration method, constrained regularization (CR), and demonstrate its utility via num
32 pose a clustering threshold gradient descent regularization (CTGDR) method, for simultaneous cluster
33                     We study how the rate of regularization depends on the frequency of word usage.
34 ichardson-Lucy algorithm with Total Variance regularization, enabling submicrometer image fidelity, d
35 ng positive relationship between the rate of regularization errors and damage to the posterior half o
36 ption words, especially on trials where over-regularization errors occurred.
37 ency exception words, and made frequent over-regularization errors.
38 eficits were not correlated with the rate of regularization errors.
39 istinct from the lesion site associated with regularization errors.
40  with this deficit also make characteristic 'regularization' errors, in which an irregularly spelled
41            Our results show that the optimal regularization factor can be determined well with the L-
42 graph' that implements the graph-constrained regularization for both sparse linear regression and spa
43 reproducibility of the model, we incorporate regularization for simultaneous shrinkage of gene sets b
44 ans of interaction matrix weighting and dual regularization from both chemicals and proteins.
45 lution method that combines an entropy-based regularization function with kernels that can exploit ge
46 torization with a new biologically motivated regularization function.
47 new computational method, called graph-based regularization (GBR), for expressing a pairwise prior th
48  In the other two regimes, the two different regularizations give rise to the same generalized flows
49 ermediate compressibility, the two different regularizations give rise to two different scaling behav
50 y and traveling waves could lead to synaptic regularization, giving unique insights into the role and
51 ssion analysis, our model uses a new form of regularization, group l(2,1)-norm (G(2,1)-norm), to inco
52 e of the nature of sparsity in DOT, sparsity regularization has been utilized to achieve high-quality
53                Estimation can be improved by regularization, i.e. by imposing a structure on the esti
54 , the transaxial resolution for IFBP without regularization improved 35% compared to 3DRP; with regul
55                                  Statistical regularization improves predictions and provides a gener
56 ctors, underscoring the importance of strong regularization in noisy datasets with far more features
57     In this paper we develop a novel type of regularization in support vector machines (SVMs) to iden
58 ages this trait network to encode structured regularizations in a multivariate regression model over
59 dependence of the RT mixing rate on nonideal regularizations; in other words, indeterminacy when mode
60 The model for CVD chosen through elastic net regularization included interaction terms suggesting tha
61 Hanning roll-off, the noise for IFBP without regularization increased by a factor of 6 compared to 3D
62      Epithelial healing time, DeltaKmax, and regularization index (RI) were significantly better in t
63                              We incorporated regularization into the model fitting procedure to addre
64 (i) introducing combined structured sparsity regularizations into multimodal multitask learning to in
65 er analysis we demonstrate that the observed regularization is associated with memory effects in the
66                  In addition, an l(2,1)-norm regularization is utilized to couple feature selection a
67         Our approach modifies the Tikhonov's regularization, known previously in CONTIN and Maximum E
68 n by sparsity inducing regression (l(1) norm regularization) leads to a stable set of features: i.e.
69 sic particle filter (PF) with resampling and regularization, maximum likelihood estimation via iterat
70 ification and the threshold gradient descent regularization method for estimation and biomarker selec
71                                 The Bayesian regularization method for high-throughput differential a
72                  Second, we use a reweighted regularization method to obtain the sparse representatio
73             We propose a network constrained regularization method to select important SNPs by taking
74 TGDR (clustering threshold gradient directed regularization) method to genetic association studies us
75                    The integrated group-wise regularization methods increases the interpretability of
76 d procedure outperforms existing main-stream regularization methods such as lasso and elastic-net whe
77      Compared to the standard TGDR and other regularization methods, the CTGDR takes into account the
78 der, Tikhonov first-order and L1 first-order regularization methods.
79 verfitting, which can be avoided by standard regularization methods.
80 e STRE model significantly outperforms other regularization models that are widely used in current pr
81 evaluations indicate that the most efficient regularization norm is the identity matrix.
82      Morphologic results showed an analogous regularization of corneal shape with a significant reduc
83 namics of language evolution, we studied the regularization of English verbs over the past 1,200 year
84 ges were generated by voxelwise rank-shaping regularization of exponential spectral analysis (RS-ESA)
85  statistical techniques for dealing with the regularization of noisy data.
86 ll-known grammatical changes in English: the regularization of past-tense verbs, the introduction of
87  that some such models show will not survive regularization of the constitutive description, by inclu
88 sional and low-sample-sized data: one is the regularization of the covariance matrix, and the other i
89                                          The regularization of the parameter space achieved by the ME
90 n ideal observer paradigm, we illustrate how regularization of the spike train can significantly impr
91 or many types of statistical analyses, while regularization of variances (pooling of information) can
92 cial anatomical obstacle induces slowing and regularization of VF, impairs the persistence of VF as j
93                             Two most natural regularizations of this problem, namely the regularizati
94                       We propose a method of regularization on protein concentration estimation at th
95  points, and meanwhile impose the trace-norm regularization onto the unfolded coefficient tensor to a
96                        Through wavelet-basis regularization, our method sharpens signal without sharp
97         The sensitivity to the values of the regularization parameter and to the shape parameters is
98 sian approach can be taken to eliminate this regularization parameter entirely, by integrating it out
99                                          The regularization parameter is automatically set using Baye
100  An efficient method to optimally select the regularization parameter is proposed for obtaining a hig
101  a simple Bayesian approach to integrate the regularization parameter out analytically using a new pr
102 ct of this baseline correction method is the regularization parameter that prevents overfitting that
103   The number of components in the basis, the regularization parameter, and the mass spectral range fr
104 ity obtained is determined by the value of a regularization parameter, which must be carefully tuned
105 y obtained is determined by the value of the regularization parameter.
106 d have no selection criteria to optimize the regularization parameter.
107 nted effects of label noise when setting the regularization parameter.
108                                          The regularization parameters are chosen by a new, data-adap
109 iding a means for self-consistently choosing regularization parameters from data, we derive posterior
110 of a newly developed pure component Tikhonov regularization (PCTR) process that does not require labo
111 ts on a method named pure component Tikhonov regularization (PCTR) that does not require laboratory p
112  behavior in Rayleigh-Taylor (RT) mixing are regularizations (physical and numerical), which produce
113 rajectory prediction problem into a standard regularization problem; the solution becomes solving thi
114 onclusions: The proposed network-constrained regularization procedure efficiently utilizes the known
115  article, we introduce a network-constrained regularization procedure for linear regression analysis
116                         We establish a clear regularization procedure for the use of LDA in ultrafast
117                                          Our regularization procedure is based on a combination of th
118 , and find a stable solution by exploiting a regularization procedure to cope with large matrices.
119 proximal in the structure; and a statistical regularization procedure to prevent overfitting.
120 tudy provides a quantitative analysis of the regularization process by which ancestral forms graduall
121 ch as averaging, and establish crowding as a regularization process that simplifies the peripheral fi
122 Along the path of our analysis, we present a regularization process via complexification and explore
123                           We present a novel regularization scheme called The Generalized Elastic Net
124                           We propose a novel regularization scheme over multitask regression called j
125 d populations, and it employs a novel spline regularization scheme that greatly reduces estimation er
126 st absolute shrinkage and selection operator regularization selected the clinical parameters: histolo
127 ly applied pairwise GC model (PGC) and other regularization strategies can lead to a significant numb
128 he recovery coefficient, compared with other regularization strategies such as the premature terminat
129 esents a novel physics-driven spatiotemporal regularization (STRE) method for high-dimensional predic
130 us adverse events chosen through elastic net regularization suggested that male sex, current smoking,
131                               Furthermore, a regularization technique is employed to increase the num
132 d with Tikhonov-Phillips and maximum entropy regularization techniques that provide the simplest dist
133                                        Using regularization techniques, we estimated the linear compo
134  define least-squares fitting, we motivate a regularization term in the objective function that penal
135            Sparsity is achieved by adding an regularization term to the variational principle, which
136 nstructed image from the measured data and a regularization-term based upon the sum of the moduli of
137 Bayesian approach that incorporates data and regularization terms directly into a path integral.
138 es these prior structures by introducing new regularization terms to encourage weight similarity betw
139 ential Tilt Model (pETM) using network-based regularization that captures both mean and variance sign
140 rization improved 35% compared to 3DRP; with regularization the improvement dropped to 19%.
141 ased by a factor of 6 compared to 3DRP; with regularization the noise was increased only by a factor
142                                With a strong regularization, the transaxial and axial resolution impr
143  the known labels of the protein, and uses a regularization to exploit the interactions between prote
144 ure representation together with group lasso regularization to extract a collection of sequence signa
145 We perform model selection using elastic net regularization to prevent overfitting.
146 ng species are combined with maximum entropy regularization to represent a continuous size-distributi
147 bling method is the introduction of sparsity regularization to the solution of the inverse problem, w
148  but also addresses the spatial and temporal regularizations to improve the prediction performance.
149 z., CGC-2SPR (CGC using two-step prior Ridge regularization) to resolve the problem by incorporating
150                 A 2-norm variant of Tikhonov regularization (TR) has been used with spectroscopic dat
151                     In both cases, Laplacian regularization turned out to be crucial for achieving a
152  regularizations of this problem, namely the regularization via adding small molecular diffusion and
153 via adding small molecular diffusion and the regularization via smoothing out the velocity field, are
154                              Using L1 and L2 regularizations, we developed a new variable selection m
155         Boundary element methods and numeric regularization were used to compute electrograms at 194
156 arameter is automatically set using Bayesian regularization, which not only saves the computation tim
157        We compared the method of elastic net regularization, which uses repeated internal cross-valid
158                   Our method uses trace norm regularization with a highly efficient ADMM (alternating
159                                              Regularization with L(1) that based methods controls the
160 est if the statistical method of elastic net regularization would improve the estimation of risk mode
161  Parkinsonism, it is unclear how the pattern regularization would originate from HFS.
162                                     Tikhonov regularization yields distance distributions that can be
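Many of the excerpts above concern Tikhonov (ridge) regularization, in which a penalty λ‖β‖² is added to the least-squares objective so that the estimate is shrunk toward zero and ill-conditioned problems become solvable. A minimal sketch for the single-predictor, no-intercept case (the data values here are hypothetical, chosen only for illustration); the closed form is β = Σxᵢyᵢ / (Σxᵢ² + λ):

```python
def ridge_1d(x, y, lam):
    """Closed-form ridge (Tikhonov) estimate for one predictor, no intercept:
    beta = sum(x_i * y_i) / (sum(x_i ** 2) + lam).
    lam = 0 recovers ordinary least squares; larger lam shrinks beta toward 0."""
    sxy = sum(xi * yi for xi, yi in zip(x, y))
    sxx = sum(xi * xi for xi in x)
    return sxy / (sxx + lam)

x = [1.0, 2.0, 3.0]
y = [2.0, 4.0, 6.0]          # exact slope 2 with no regularization
print(ridge_1d(x, y, 0.0))   # 2.0 (OLS solution)
print(ridge_1d(x, y, 14.0))  # 1.0 (shrunk toward zero)
```

As several excerpts note (e.g. nos. 97-107 on the "regularization parameter"), the quality of the result hinges on choosing λ, typically by cross-validation, an L-curve criterion, or Bayesian integration.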
