
Corpus search results (sorted on the word 1 position after the keyword)

Click a serial number to display the corresponding PubMed entry.
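The listing below is a keyword-in-context (KWIC) concordance: each hit is shown with its left and right context, and rows are ordered by the word immediately following the keyword. A minimal sketch of that sort, assuming plain-text input sentences (the function name `kwic` and the column width are illustrative, not part of the original tool):

```python
import re

def kwic(lines, keyword, width=45):
    """Build keyword-in-context rows and sort them by the first
    word after the keyword, mirroring this concordance view."""
    rows = []
    pat = re.compile(re.escape(keyword), re.IGNORECASE)
    for line in lines:
        for m in pat.finditer(line):
            left = line[:m.start()][-width:]   # truncated left context
            right = line[m.end():][:width]     # truncated right context
            after = right.strip().split()
            # sort key: the word 1 position after the keyword (if any)
            key = after[0].lower() if after else ""
            rows.append((key, f"{left:>{width}}{m.group(0)}{right}"))
    rows.sort(key=lambda r: r[0])
    return [row for _, row in rows]
```

For example, a hit followed by "algorithm" sorts before one followed by "classifier", which matches the ordering of the entries below.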
1 ods (logistic regression, decision trees and support vector machines).
2 discriminative methods (lasso regression and support vector machines).
3 a one-dimensional Bayesian classifier with a support vector machine.
4 forest, naive Bayes, K-nearest neighbors and support vector machine.
5 e feature vectors which were used to train a support vector machine.
6 -predictor, MetaPred2CS, which is based on a support vector machine.
7 rived by using a radial basis function-based support vector machine.
8 icated predictive model, a well-parametrized support vector machine.
9 t, adaptive boosting, gradient boosting, and support vector machine.
10 h were used in classification model based on Support-Vector Machine.
11 kernelized machine learning methods, such as support vector machines.
12  from both Ktrans and rCBV maps coupled with support vector machines.
13 comparable to that of common methods such as Support Vector Machines.
14 e multivariate searchlight analysis based on support vector machines.
15 ictive techniques, such as random forest and support vector machines.
16 formance of the signature was assessed using support vector machines.
17 ivoxel classification scheme based on linear support vector machines.
18 0.79; for voxel-wise approach using a linear support vector machine: 0.88) and similar sensitivity fo
19       For all disease group comparisons, the support vector machine 10-fold cross-validation area und
20  as a classifier (accuracy = 73.6%) than the support vector machine (accuracy = 68.1%).
21                                          The support vector machine achieved high classification scor
22           We recently developed a one-vs-one support vector machine algorithm (OVO SVM) that enables
23  or a low confidence class assignment by the support vector machine algorithm at 1 or both sites are
24                                 A multiclass support vector machine algorithm was used for classifica
25  ingesta related hardness categories using a support vector machine algorithm.
26 erarchical cluster, principal component, and support vector machine analyses.
27 ed (hierarchical clustering) and supervised (support vector machine) analyses of these different dist
28 excellent classification and accuracy, while support vector machine analysis leads to the quantificat
29                           Here, we show that support vector machine analysis of bovine E. coli O157 i
30 models were developed: i) random forest, ii) support vector machine and iii) artificial neural networ
31  classification is performed using nonlinear support vector machine and kernel Fisher discriminant an
32   Our models, including logistic regression, support vector machine and random forest, showed robust
33 entional machine learning algorithms such as support vector machine and random forest.
34 ncluding ridge regression, lasso regression, support vector machine and random forests.
35 rs with comparable accuracy were selected by support vector machine and regression models and include
36  other algorithms (including random forests, support vector machine and single-task multiple kernel l
37 twork on the lincRNA data sets compared with support vector machine and traditional neural network.
38 hmark our method against various regression, support vector machines and artificial neural network mo
39 , including neural networks, random forests, support vector machines and boosting, on 10 September 20
40                                              Support vector machines and convolutional neural network
41                                        Using support vector machines and deep feed-forward neural net
42 ples of machine learning techniques, such as support vector machines and gradient tree boosting.
43 f network has better performance compared to Support Vector Machines and Neural Networks on the prote
44 enomics classification techniques, including Support Vector Machines and Prediction Analysis for Micr
45 ervised classification techniques, including Support Vector Machines and Random Forest, to phase dele
46 ome outperforms state-of-the-art models like Support Vector Machines and Random Forests for gene expr
47 ve supervised-learning algorithms (including support vector machines and random forests) and one-clas
48 ber of features; logistic regression, linear support vector machine, and quadratic support vector mac
49 ast squares discriminant analysis, recursive-support vector machine, and random forests revealed 196
50 stic Regression, the K-Nearest Neighbor, the Support Vector Machine, and the Artificial Neural Networ
51 ng random forests, direct-coupling analysis, support vector machines, and deep networks (stacked deno
52                                    Methods A support vector machine approach using voxel-wise lesion
53 me of interest based and voxel based using a support vector machine approach.
54  algorithms such as linear discriminators or support vector machines are limited due to the complexit
55                           A set of one-class support vector machines are then trained on each biologi
56                                          The support vector machine assigned patients' diagnoses with
57   As a complement of experimental methods, a support vector machine based-method is proposed to ident
58 of topological features and then leverages a support vector machine-based approach to identify predic
59             We have developed MetaPred2CS, a support vector machine-based metapredictor for prokaryot
60 hylation sites and methylation types using a support vector machine-based network.
61 established a new genome browser for viewing support vector machine-based NOL scores.
62                             Here, we offer a support vector machine-based, automated, Barnes-maze unb
63 d (RNA-protein interaction predictor), a new support-vector machine-based method, to predict protein-
64 east squared-discriminant analysis (PLS-DA), support vector machine classification analysis (SVM-C),
65                      Eggs were classified by Support Vector Machine classification and Linear and Qua
66                                            A support vector machine classification based on pattern e
67 43, 93.85 and 84.67%, respectively, by using support vector machine classification method.
68 paper we employ graph-theoretic measures and support vector machine classification to assess, in 12 h
69                                          The Support Vector Machine Classification was applied to ide
70                                            A support vector machine classifier accurately classified
71 ntegrates six sequence-based methods using a support vector machine classifier and has been intensive
72                    Development of a Gaussian support vector machine classifier based on HS-27 fluores
73                                 We trained a support vector machine classifier based on MPRA data to
74                                      Then, a support vector machine classifier is trained to categori
75 mparable or greater accuracy than IL17A in a support vector machine classifier of psoriasis and healt
76  of future drinking behavior, we applied the support vector machine classifier that had been trained
77                    Our method incorporates a support vector machine classifier that uses biomechanica
78 were subsequently used in conjunction with a support vector machine classifier to create a map of het
79                                     We use a support vector machine classifier to estimate the averag
80                                Training of a support vector machine classifier was performed with dia
81                                      Results Support vector machine classifier with intranodular radi
82 ed 97.2% accuracy for classification using a support vector machine classifier with radial basis.
83 natory features, which in conjunction with a support vector machine classifier yielded an Az of 0.73
84                                      Using a support vector machine classifier, we found that mothers
85 ormances of two different NN classifiers and support vector machine classifier.
86 alysis, and lesion classification by using a support vector machine classifier.
87                                            A support-vector machine classifier, trained on three dist
88 he static and dynamic SPHARM approach with a support-vector-machine classifier and compare their clas
89 ng the Random Forest, k Nearest Neighbor and Support Vector Machine classifiers show that POS achieve
90 s to train deep learning neural networks and support vector machine classifiers to predict N-/O-linke
91                                              Support vector machine classifiers were applied to UV-Vi
92 features from the DCE and T2w sequences, and support vector machine classifiers were trained on the C
93 linear support vector machine, and quadratic support vector machine classifiers were trained through
94 subjects, using both k nearest neighbors and support vector machine classifiers, demonstrate that the
95 lassification) and a 2-class (movement type) Support Vector Machine classifiers.
96             Our computational framework uses support vector machines combined with transfer machine l
97 supervised machine learning model, recursive-support vector machine, could classify abiotic and bioti
98 plicated UTI symptoms using random forest or support vector machine coupled with recursive feature el
99                                              Support vector machine decoding of alpha power patterns
100                                              Support-vector-machine decoding demonstrated accurate ne
101 d histologic assessment, we have developed a support vector machine-derived decision algorithm, which
102                                              Support vector machines determined whether morphometric
103 nostic or prognostic classifiers modelled by support vector machine, diagonal discriminant analysis,
104 ysis (PLSDA), K nearest neighbors (KNN), and support vector machines discriminant analysis (SVMDA).
105                                              Support vector machines-discriminant analysis (SVM-DA) w
106 ial least squares discriminant analysis, and support vector machines discriminated well cassava sampl
107 e refined the features to predictive models (support vector machine, elastic net) and validated those
108 racted and considered as input features to a support vector machine for classification.
109 basis of the breast tissue components, and a support vector machine framework was used to develop a s
110                    Gapped k-mer kernels with support vector machines (gkm-SVMs) have achieved strong
111                Using logistic regression and support-vector machine (i.e., pattern classifiers) model
112 protein sequences as text documents and uses support vector machine in text classification for allerg
113 our commonly used ML methods (random forest, support vector machines, K-nearest neighbor and logistic
114                                    Using the support vector machine learning algorithm favored by the
115                                 We trained a support vector machine learning algorithm to calculate t
116          The results were used to generate a support vector machine learning classifier, which was in
117 teria-specific melt curves are identified by Support Vector Machine learning, and individual pathogen
118 discriminating pattern was extracted using a support vector machine-learning algorithm performing an
119 s in multivoxel patterns exploited by linear Support Vector Machine, Linear Discriminant Analysis and
120 orithms including artificial neural network, support vector machine, logistic regression, and a novel
121 ipal component analysis (PCA), least squares-support vector machines (LS-SVM) and PCA-back propagatio
122               In combination with the use of support vector machine method (SVM), the impact on the s
123 imilarity, binary kernel discrimination, and support vector machine methods.
124              We show that a species specific support vector machine model based on Arabidopsis sequen
125                                          Our support vector machine model could be trained effectivel
126               Of all measures evaluated, the support vector machine model ranked highest both without
127  high gamma (70-150 Hz) time features with a support vector machine model to classify individual word
128 aid of Random Forest in feature selection, a support vector machine model was created to predict targ
129 igns these peptides probability scores using Support Vector Machine model, whose feature set includes
130 nt analysis followed by machine learning and support vector machine modeling.
131                                     However, support vector machine models exhibited similar performa
132 s findings insofar as both random forest and support vector machine models relied on alpha-band activ
133                                              Support Vector Machine models were trained and validated
134 to demonstrate the method, random forest and support vector machine models were trained on within-par
135                               Furthermore, a support vector machine multivariate classification of re
136 t, multivariate adaptive regression splines, support vector machine, naive Bayes' classification, and
137  random forests combined hierarchically with support vector machines or logistic regression (LR), and
138  (PLSR), principal component analysis with a support vector machine (PCA + SVM) and principal compone
139                        Results The quadratic support vector machine performed best in training and wa
140                                    Moreover, Support Vector Machines permitted to predict the TPI and
141                                              Support vector machines provided superior classification
142 odel, Multi-Layer Perceptron Neural Network, Support Vector Machine, Random Forest, and LASSO regress
143 rning methods including logistic regression, support vector machine, random forest, Gaussian naive Ba
144 n compared with XGBoost, k-nearest neighbor, support vector machine, random forest, logistic regressi
145 ith 3 different machine-learning algorithms (support vector machines, random forests, and artificial
146                                        Also, support vector machine recognized 87.71-96.74% of class
147 lood pressure prediction method based on the support vector machine regression (SVR) algorithm to sol
148 stimate FEV1/FVC ratios across patients, via support-vector-machine regression.
149  performance of neural nets, random forests, support vector machines (regression) and ridge regressio
150                         Interpreting trained support vector machine revealed MAP morphologies that, u
151 r computational statistical algorithms (e.g. support vector machines, ridge regression), holds much p
152  Bayesian network, semi-definite programming-support vector machine (SDP-SVM), relevance vector machi
153        Analyses using cross-validated linear support vector machines showed that the PAG discriminate
154  Square Discrimination Analysis "PLS-DA" and Support Vectors Machines "SVM"), was able to discriminat
155                We have developed CellSort, a support vector machine (SVM) algorithm that identifies o
156               To solve this, we utilized the support vector machine (SVM) algorithm to classify pheno
157 onal variables for noncorrected (NC) data by Support Vector Machine (SVM) algorithm, a computer metho
158 disorder (CVID), which was recognizable by a support vector machine (SVM) algorithm.
159 , and then the images are classified using a Support Vector Machine (SVM) algorithm.
160                For the prediction of stroke, support vector machine (SVM) algorithms had a pooled AUC
161 s predictors of pathological diagnosis using support vector machine (SVM) algorithms.
162 pport vector machine ("tarSVM"), that uses a Support Vector Machine (SVM) and a new score the normali
163 a-driven approach by using machine learning (Support Vector Machine (SVM) and Deep Learning (DL)) to
164 mber of classifiers (multi-layer perceptron, Support Vector Machine (SVM) and Dempster-Shafer ensembl
165 y: decision tree, artificial neural network, support vector machine (SVM) and k-nearest neighbor clas
166 ve features elimination (RFE) technique with support vector machine (SVM) and logistic regression).
167 alysis (PLS-DA), k-nearest neighbors (k-NN), support vector machine (SVM) and Random Forest (RF) were
168 o NMP-like populations using a purpose-built support vector machine (SVM) based on the embryo CLE and
169                                              Support vector machine (SVM) classification of the spect
170 ter resolution feature selection (CR-FS) and support vector machine (SVM) classification.
171                Second, we train a peak-level Support Vector Machine (SVM) classifier by using human-e
172                                            A support vector machine (SVM) classifier separating 56 he
173 DSM-5 AUD) features using supervised, Linear Support Vector Machine (SVM) classifier to test which fe
174                                Training of a support vector machine (SVM) classifier used diagnostic
175                                            A support vector machine (SVM) classifier was trained usin
176 d frequency domain as inputs to a non-linear support vector machine (SVM) classifier.
177 iscriminative features were used to generate support vector machine (SVM) classifiers.
178 g (TMT) quantitative proteomics approach and Support Vector Machine (SVM) cluster analysis of three c
179                                      Using a support vector machine (SVM) combined with nested cross-
180  spectral dimensionality (1203 features) and support vector machine (SVM) for classification with 95%
181 lassification models, and is as efficient as support vector machine (SVM) for classifying 1000 Genome
182                          Our strategy uses a support vector machine (SVM) framework that combines bot
183                                   Finally, a support vector machine (SVM) learning algorithm was trai
184 ibility phenotypes were separated by using a support vector machine (SVM) machine learning algorithm.
185                                          The support vector machine (SVM) method is also presented fo
186                                The quadratic support vector machine (SVM) model built using all selec
187 ral features were used to establish a linear support vector machine (SVM) model of sixteen diagnostic
188  while the prediction accuracy of edge-based support vector machine (SVM) model was poorer, due to th
189       Regardless of normalization methods, a support vector machine (SVM) model with the radial basis
190                          We have developed a support vector machine (SVM) model, trained using brain
191 9 cancer types, to build more than 2 x 10(8) Support Vector Machine (SVM) models for reconstructing a
192                             Four independent support vector machine (SVM) models performed to ensure
193   Partial least squares regression (PLS) and support vector machine (SVM) regression methods were use
194 nown answers and open questions using linear support vector machine (SVM) resulted in an above-chance
195                        DecRiPPter combines a Support Vector Machine (SVM) that identifies candidate R
196  the functional families in CATH; building a support vector machine (SVM) to automatically assign dom
197 i-layer perceptron (MLP) neural network, and support vector machine (SVM) to create an intelligent cl
198                  CADD trains a linear kernel support vector machine (SVM) to differentiate evolutiona
199 r size and histological type) are fed into a support vector machine (SVM) to generate the final predi
200 tion-alternating least squares (MCR-ALS) and support vector machine (SVM) to quantify biomolecular di
201                                            A support vector machine (SVM) using AF-CKSAAP achieves th
202 able predictive performance against survival support vector machine (SVM) using significantly fewer g
203 ase studies, accuracy exceeds an alternative support vector machine (SVM) voting model in most situat
204               The machine learning algorithm Support Vector Machine (SVM) was trained to perform supe
205                                            A support vector machine (SVM) was used on these triplet e
206 st square-discriminant analysis (PLS-DA) and support vector machine (SVM) were applied to the results
207 optimal set of features and train them using Support Vector Machine (SVM) with linear kernel to selec
208 ly, two-class classifiers are designed using support vector machine (SVM) with radial basis function
209 resulting from random forest (RF), nonlinear support vector machine (SVM), and deep neural network (D
210  such as Principal Component Analysis (PCA), Support Vector Machine (SVM), and Random Forest (RF).
211  Each descriptor is used to train a separate support vector machine (SVM), and results are combined b
212 ast-squares discriminant analysis (sPLS-DA), support vector machine (SVM), and SVM classification tre
213 e mould recognition methods were built using support vector machine (SVM), back-propagation neural ne
214 s using machine learning models that include support vector machine (SVM), boosted regression tree (B
215       Three machine learning methods, namely Support Vector Machine (SVM), k-Nearest Neighbors (k-NN)
216       Then, we compared neural network (NN), support vector machine (SVM), k-nearest neighbors (kNN)
217  (EBMC)) and three regression methods (i.e., Support Vector Machine (SVM), Logistic Regression (LR),
218  regression (LR), k-nearest neighbor (k-NN), support vector machine (SVM), random forest (RF), and de
219 n (SSR), Stochastic gradient boosting (SGB), support vector machine (SVM), Relevance vector machine (
220                           Here, we develop a support vector machine (SVM)-based classifier to investi
221 ces and the homolog noise is counteracted by support vector machine (SVM).
222 ic origin using predictive modeling based on support vector machine (SVM).
223 using principal component analysis (PCA) and support vector machine (SVM).
224 l was built based only on 7 features using a support vector machine (SVM).
225            Position-specific scoring matrix, support vector machines (SVM) and artificial neural netw
226                                Nevertheless, support vector machines (SVM) and artificial neuron netw
227 tive accuracy, which are then presented into support vector machines (SVM) and random forests (RF).
228 was developed through the integration of the support vector machines (SVM) classifier and ensemble le
229           The three QT features were used in Support Vector Machines (SVM) classifiers, and classific
230 ndom walks with an ensemble of probabilistic support vector machines (SVM) classifiers, and we show t
231  developed a machine learning approach using support vector machines (SVM) for automatic VP placement
232                  We applied state-of-the-art Support Vector Machines (SVM) methodology to pant hoots
233                                              Support vector machines (SVM) models can be successfully
234 neural networks (NN), fuzzy models (FM), and support vector machines (SVM) to predict physicochemical
235                                    SMOQ uses support vector machines (SVM) with protein sequence and
236 ), Quadratic Discriminant Analysis (QDA) and Support Vector Machines (SVM), analyzed in the biofinger
237 ), Quadratic Discriminant Analysis (QDA) and Support Vector Machines (SVM), coupled with dimensionali
238 fferent classifiers: probabilistic bayesian, Support Vector Machines (SVM), deep neural network, deci
239 chniques known as black box methods, such as support vector machines (SVM), random forests and AdaBoo
240 ch as Restricted Boltzmann Machines (RBM) or Support Vector Machines (SVM).
241 rating Characteristics Curve (AUC) using the Support Vector Machines (SVM).
242 application of classification models such as support vector machines (SVM).
243 rate methods in an optimized predictor using Support Vector Machines (SVM).
244                    A statistical classifier, Support Vectors Machine (SVM), was then used for partici
245          The method is first demonstrated on support-vector machine (SVM) models, which generally pro
246 e-of-interest (VOI) and voxel-based (using a support vector machine [SVM] approach) (18)F-FDG PET ana
247  ML algorithms (ANN, random forest [RF], and support vector machine [SVM]) using multiple combination
248 its) with a machine learning technique (i.e. support vector machine, SVM).
249                        We trained and tested Support Vector Machine, SVM, classifiers to compare the
250 g random forests (RFs), elastic net (ELNET), support vector machines (SVMs) and boosted trees in comb
251 hniques: principal component analysis (PCA), support vector machines (SVMs) and hierarchical cluster
252 ogistic regression (LR), random forest (RF), support vector machines (SVMs) and multilayer perceptron
253                                              Support vector machines (SVMs) are advantageous in that
254  are ubiquitous in pattern recognition, with support vector machines (SVMs) being the best known meth
255           Supervised classification based on support vector machines (SVMs) has successfully been use
256                       For several years now, support vector machines (SVMs) have proven to be powerfu
257 s (PCA), linear discriminant analysis (LDA), support vector machines (SVMs), and artificial neural ne
258 ning the input sequences against an array of Support Vector Machines (SVMs), each examining the relat
259 rformance of three machine learning methods (support vector machines (SVMs), multilayer perceptrons (
260                                    Utilizing support vector machines (SVMs), SPICA was also able to u
261  of our system is operated by an ensemble of support vector machines (SVMs), where each SVM is traine
262 diction engine is operated by an ensemble of support vector machines (SVMs), with each SVM trained on
263 the performance of deep networks relative to support vector machines (SVMs).
264 clei used a string-based Profile Kernel with Support Vector Machines (SVMs).
265 amework that utilises a nearest-neighbour or support vector machine system, to integrate heterogeneou
266 iant filtering pipeline, targeted sequencing support vector machine ("tarSVM"), that uses a Support V
267 -based features are then used to construct a support vector machine that can be used for accurate pre
268 ar relationship between voxels judged by the support vector machine to be highly infiltrated and subs
269 el two-step algorithm, COMBI, first trains a support vector machine to determine a subset of candidat
270 ly, we classify stimulus orientation using a support vector machine to learn a linear classifier on t
271 n a supervised learning procedure to train a support vector machine to predict the cleavability of 3.
272                                      We used support vector machines to classify disease state based
273 ssible deep extensions and high-order kernel support vector machines to predict major histocompatibil
274        We used a machine learning algorithm (support vector machine) to examine whether ASD detection
275  robust supervised classification algorithm (Support Vector Machine) to identify characters from sent
276 ive matrix factorization, cluster 'fitness', support vector machine) to resolve rare and common cell-
277                               We applied ML (support vector machines) to MRI data (regional cortical
278 e developed a classifier, GlyStruct based on support vector machine, to predict glycated and non-glyc
279               This accuracy exceeded that of support vector machines, traditional linear discriminant
280                       We implement CADD as a support vector machine trained to differentiate 14.7 mil
281                                              Support vector machines trained on brain patterns relate
282                                              Support vector machines trained with graph metrics of wh
283 squares-discriminant analysis (sPLS-DA), and support vector machine tree type entropy (SVMtreeH), are
284 riable selection, partial least squares, and support vector machines using the radial basis function
285          Regional analysis of variance and a support vector machine were used to compare and discrimi
286      Logistic regression, random forest, and support vector machines were used as candidate methods t
287                                              Support vector machines were used to compare clinical da
288                                       Linear support vector machines were used to construct a predict
289 icial neural networks; LS-SVM, least squares support vector machine) were applied to building the cal
290 stacked denoising autoencoders combined with support vector machines) were our best performing deep n
291  beats are used as inputs for one-versus-one support vector machine, which is conducted in form of 10
292                                            A support vector machine with k-fold cross-validation was
293 rve: random forest (0.782), XGBoost (0.781), support vector machine with linear kernel (0.780), and [
294 ach using (1) data-driven methods, including Support Vector Machine with Recursive Feature Eliminatio
295                                              Support Vector Machines with gapped k-mer kernels (gkm-S
296 asis function and nu regression (SVM-R(NU)), support vector machines with polynomial kernel and epsil
297  kernel and epsilon regression (SVM-P(EPS)), support vector machines with polynomial kernel and nu re
298 stochastic search variable selection (SSVS), support vector machines with radial basis function and e
299 unction and epsilon regression (SVM-R(EPS)), support vector machines with radial basis function and n
300 ods (logistic regression, random forest, and support vector machines) yielded similar results, and a

 