Corpus search results (sorted by the first word after the keyword)
Click a hit's serial number to open the corresponding PubMed page
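"Sorted by the first word after the keyword" means the concordance lines below are ordered alphabetically by the token immediately to the right of "support vector machine(s)". A minimal Python sketch of that KWIC ordering (illustrative only, not the corpus tool's own code; the regex, helper name, and demo sentences are invented for this example):

```python
import re

# Assumed sketch, not the corpus site's actual implementation: build
# KWIC (keyword-in-context) lines and sort them by the first word to
# the right of the keyword -- the "sorted one word after" ordering.
KEYWORD = re.compile(r"support[- ]vector[- ]machines?", re.IGNORECASE)

def kwic_lines(sentences, width=45):
    hits = []
    for s in sentences:
        m = KEYWORD.search(s)
        if m is None:
            continue
        left = s[:m.start()][-width:].rjust(width)
        right = s[m.end():][:width]
        words = right.split()
        # Sort key: first token after the keyword, punctuation stripped;
        # hits where the keyword ends the sentence sort first.
        key = words[0].strip(".,;:()").lower() if words else ""
        hits.append((key, left + m.group(0) + right))
    hits.sort(key=lambda h: h[0])
    return [line for _, line in hits]

if __name__ == "__main__":
    demo = [
        "Features were extracted to train a support vector machine.",
        "We compared random forests and support vector machines (SVMs).",
        "A linear support vector machine classifier was trained.",
    ]
    for n, line in enumerate(kwic_lines(demo), 1):
        print(n, line)
```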
1 ods (logistic regression, decision trees and support vector machines).
2 discriminative methods (lasso regression and support vector machines).
3 a one-dimensional Bayesian classifier with a support vector machine.
4 forest, naive Bayes, K-nearest neighbors and support vector machine.
5 e feature vectors which were used to train a support vector machine.
6 -predictor, MetaPred2CS, which is based on a support vector machine.
7 rived by using a radial basis function-based support vector machine.
8 icated predictive model, a well-parametrized support vector machine.
9 t, adaptive boosting, gradient boosting, and support vector machine.
10 h were used in classification model based on Support-Vector Machine.
11 kernelized machine learning methods, such as support vector machines.
12 from both Ktrans and rCBV maps coupled with support vector machines.
13 comparable to that of common methods such as Support Vector Machines.
14 e multivariate searchlight analysis based on support vector machines.
15 ictive techniques, such as random forest and support vector machines.
16 formance of the signature was assessed using support vector machines.
17 ivoxel classification scheme based on linear support vector machines.
18 0.79; for voxel-wise approach using a linear support vector machine: 0.88) and similar sensitivity fo
23 or a low confidence class assignment by the support vector machine algorithm at 1 or both sites are
27 ed (hierarchical clustering) and supervised (support vector machine) analyses of these different dist
28 excellent classification and accuracy, while support vector machine analysis leads to the quantificat
30 models were developed: i) random forest, ii) support vector machine and iii) artificial neural networ
31 classification is performed using nonlinear support vector machine and kernel Fisher discriminant an
32 Our models, including logistic regression, support vector machine and random forest, showed robust
35 rs with comparable accuracy were selected by support vector machine and regression models and include
36 other algorithms (including random forests, support vector machine and single-task multiple kernel l
37 twork on the lincRNA data sets compared with support vector machine and traditional neural network.
38 hmark our method against various regression, support vector machines and artificial neural network mo
39 , including neural networks, random forests, support vector machines and boosting, on 10 September 20
43 f network has better performance compared to Support Vector Machines and Neural Networks on the prote
44 enomics classification techniques, including Support Vector Machines and Prediction Analysis for Micr
45 ervised classification techniques, including Support Vector Machines and Random Forest, to phase dele
46 ome outperforms state-of-the-art models like Support Vector Machines and Random Forests for gene expr
47 ve supervised-learning algorithms (including support vector machines and random forests) and one-clas
48 ber of features; logistic regression, linear support vector machine, and quadratic support vector mac
49 ast squares discriminant analysis, recursive-support vector machine, and random forests revealed 196
50 stic Regression, the K-Nearest Neighbor, the Support Vector Machine, and the Artificial Neural Networ
51 ng random forests, direct-coupling analysis, support vector machines, and deep networks (stacked deno
54 algorithms such as linear discriminators or support vector machines are limited due to the complexit
57 As a complement of experimental methods, a support vector machine based-method is proposed to ident
58 of topological features and then leverages a support vector machine-based approach to identify predic
63 d (RNA-protein interaction predictor), a new support-vector machine-based method, to predict protein-
64 east squared-discriminant analysis (PLS-DA), support vector machine classification analysis (SVM-C),
68 paper we employ graph-theoretic measures and support vector machine classification to assess, in 12 h
71 ntegrates six sequence-based methods using a support vector machine classifier and has been intensive
75 mparable or greater accuracy than IL17A in a support vector machine classifier of psoriasis and healt
76 of future drinking behavior, we applied the support vector machine classifier that had been trained
78 were subsequently used in conjunction with a support vector machine classifier to create a map of het
82 ed 97.2% accuracy for classification using a support vector machine classifier with radial basis.
83 natory features, which in conjunction with a support vector machine classifier yielded an Az of 0.73
88 he static and dynamic SPHARM approach with a support-vector-machine classifier and compare their clas
89 ng the Random Forest, k Nearest Neighbor and Support Vector Machine classifiers show that POS achieve
90 s to train deep learning neural networks and support vector machine classifiers to predict N-/O-linke
92 features from the DCE and T2w sequences, and support vector machine classifiers were trained on the C
93 linear support vector machine, and quadratic support vector machine classifiers were trained through
94 subjects, using both k nearest neighbors and support vector machine classifiers, demonstrate that the
97 supervised machine learning model, recursive-support vector machine, could classify abiotic and bioti
98 plicated UTI symptoms using random forest or support vector machine coupled with recursive feature el
101 d histologic assessment, we have developed a support vector machine-derived decision algorithm, which
103 nostic or prognostic classifiers modelled by support vector machine, diagonal discriminant analysis,
104 ysis (PLSDA), K nearest neighbors (KNN), and support vector machines discriminant analysis (SVMDA).
106 ial least squares discriminant analysis, and support vector machines discriminated well cassava sampl
107 e refined the features to predictive models (support vector machine, elastic net) and validated those
109 basis of the breast tissue components, and a support vector machine framework was used to develop a s
112 protein sequences as text documents and uses support vector machine in text classification for allerg
113 our commonly used ML methods (random forest, support vector machines, K-nearest neighbor and logistic
117 teria-specific melt curves are identified by Support Vector Machine learning, and individual pathogen
118 discriminating pattern was extracted using a support vector machine-learning algorithm performing an
119 s in multivoxel patterns exploited by linear Support Vector Machine, Linear Discriminant Analysis and
120 orithms including artificial neural network, support vector machine, logistic regression, and a novel
121 ipal component analysis (PCA), least squares-support vector machines (LS-SVM) and PCA-back propagatio
127 high gamma (70-150 Hz) time features with a support vector machine model to classify individual word
128 aid of Random Forest in feature selection, a support vector machine model was created to predict targ
129 igns these peptides probability scores using Support Vector Machine model, whose feature set includes
132 s findings insofar as both random forest and support vector machine models relied on alpha-band activ
134 to demonstrate the method, random forest and support vector machine models were trained on within-par
136 t, multivariate adaptive regression splines, support vector machine, naive Bayes' classification, and
137 random forests combined hierarchically with support vector machines or logistic regression (LR), and
138 (PLSR), principal component analysis with a support vector machine (PCA + SVM) and principal compone
142 odel, Multi-Layer Perceptron Neural Network, Support Vector Machine, Random Forest, and LASSO regress
143 rning methods including logistic regression, support vector machine, random forest, Gaussian naive Ba
144 n compared with XGBoost, k-nearest neighbor, support vector machine, random forest, logistic regressi
145 ith 3 different machine-learning algorithms (support vector machines, random forests, and artificial
147 lood pressure prediction method based on the support vector machine regression (SVR) algorithm to sol
149 performance of neural nets, random forests, support vector machines (regression) and ridge regressio
151 r computational statistical algorithms (e.g. support vector machines, ridge regression), holds much p
152 Bayesian network, semi-definite programming-support vector machine (SDP-SVM), relevance vector machi
154 Square Discrimination Analysis "PLS-DA" and Support Vectors Machines "SVM"), was able to discriminat
157 onal variables for noncorrected (NC) data by Support Vector Machine (SVM) algorithm, a computer metho
162 pport vector machine ("tarSVM"), that uses a Support Vector Machine (SVM) and a new score the normali
163 a-driven approach by using machine learning (Support Vector Machine (SVM) and Deep Learning (DL)) to
164 mber of classifiers (multi-layer perceptron, Support Vector Machine (SVM) and Dempster-Shafer ensembl
165 y: decision tree, artificial neural network, support vector machine (SVM) and k-nearest neighbor clas
166 ve features elimination (RFE) technique with support vector machine (SVM) and logistic regression).
167 alysis (PLS-DA), k-nearest neighbors (k-NN), support vector machine (SVM) and Random Forest (RF) were
168 o NMP-like populations using a purpose-built support vector machine (SVM) based on the embryo CLE and
173 DSM-5 AUD) features using supervised, Linear Support Vector Machine (SVM) classifier to test which fe
178 g (TMT) quantitative proteomics approach and Support Vector Machine (SVM) cluster analysis of three c
180 spectral dimensionality (1203 features) and support vector machine (SVM) for classification with 95%
181 lassification models, and is as efficient as support vector machine (SVM) for classifying 1000 Genome
184 ibility phenotypes were separated by using a support vector machine (SVM) machine learning algorithm.
187 ral features were used to establish a linear support vector machine (SVM) model of sixteen diagnostic
188 while the prediction accuracy of edge-based support vector machine (SVM) model was poorer, due to th
191 9 cancer types, to build more than 2 x 10(8) Support Vector Machine (SVM) models for reconstructing a
193 Partial least squares regression (PLS) and support vector machine (SVM) regression methods were use
194 nown answers and open questions using linear support vector machine (SVM) resulted in an above-chance
196 the functional families in CATH; building a support vector machine (SVM) to automatically assign dom
197 i-layer perceptron (MLP) neural network, and support vector machine (SVM) to create an intelligent cl
199 r size and histological type) are fed into a support vector machine (SVM) to generate the final predi
200 tion-alternating least squares (MCR-ALS) and support vector machine (SVM) to quantify biomolecular di
202 able predictive performance against survival support vector machine (SVM) using significantly fewer g
203 ase studies, accuracy exceeds an alternative support vector machine (SVM) voting model in most situat
206 st square-discriminant analysis (PLS-DA) and support vector machine (SVM) were applied to the results
207 optimal set of features and train them using Support Vector Machine (SVM) with linear kernel to selec
208 ly, two-class classifiers are designed using support vector machine (SVM) with radial basis function
209 resulting from random forest (RF), nonlinear support vector machine (SVM), and deep neural network (D
210 such as Principal Component Analysis (PCA), Support Vector Machine (SVM), and Random Forest (RF).
211 Each descriptor is used to train a separate support vector machine (SVM), and results are combined b
212 ast-squares discriminant analysis (sPLS-DA), support vector machine (SVM), and SVM classification tre
213 e mould recognition methods were built using support vector machine (SVM), back-propagation neural ne
214 s using machine learning models that include support vector machine (SVM), boosted regression tree (B
217 (EBMC)) and three regression methods (i.e., Support Vector Machine (SVM), Logistic Regression (LR),
218 regression (LR), k-nearest neighbor (k-NN), support vector machine (SVM), random forest (RF), and de
219 n (SSR), Stochastic gradient boosting (SGB), support vector machine (SVM), Relevance vector machine (
227 tive accuracy, which are then presented into support vector machines (SVM) and random forests (RF).
228 was developed through the integration of the support vector machines (SVM) classifier and ensemble le
230 ndom walks with an ensemble of probabilistic support vector machines (SVM) classifiers, and we show t
231 developed a machine learning approach using support vector machines (SVM) for automatic VP placement
234 neural networks (NN), fuzzy models (FM), and support vector machines (SVM) to predict physicochemical
236 ), Quadratic Discriminant Analysis (QDA) and Support Vector Machines (SVM), analyzed in the biofinger
237 ), Quadratic Discriminant Analysis (QDA) and Support Vector Machines (SVM), coupled with dimensionali
238 fferent classifiers: probabilistic bayesian, Support Vector Machines (SVM), deep neural network, deci
239 chniques known as black box methods, such as support vector machines (SVM), random forests and AdaBoo
246 e-of-interest (VOI) and voxel-based (using a support vector machine [SVM] approach) (18)F-FDG PET ana
247 ML algorithms (ANN, random forest [RF], and support vector machine [SVM]) using multiple combination
250 g random forests (RFs), elastic net (ELNET), support vector machines (SVMs) and boosted trees in comb
251 hniques: principal component analysis (PCA), support vector machines (SVMs) and hierarchical cluster
252 ogistic regression (LR), random forest (RF), support vector machines (SVMs) and multilayer perceptron
254 are ubiquitous in pattern recognition, with support vector machines (SVMs) being the best known meth
257 s (PCA), linear discriminant analysis (LDA), support vector machines (SVMs), and artificial neural ne
258 ning the input sequences against an array of Support Vector Machines (SVMs), each examining the relat
259 rformance of three machine learning methods (support vector machines (SVMs), multilayer perceptrons (
261 of our system is operated by an ensemble of support vector machines (SVMs), where each SVM is traine
262 diction engine is operated by an ensemble of support vector machines (SVMs), with each SVM trained on
265 amework that utilises a nearest-neighbour or support vector machine system, to integrate heterogeneou
266 iant filtering pipeline, targeted sequencing support vector machine ("tarSVM"), that uses a Support V
267 -based features are then used to construct a support vector machine that can be used for accurate pre
268 ar relationship between voxels judged by the support vector machine to be highly infiltrated and subs
269 el two-step algorithm, COMBI, first trains a support vector machine to determine a subset of candidat
270 ly, we classify stimulus orientation using a support vector machine to learn a linear classifier on t
271 n a supervised learning procedure to train a support vector machine to predict the cleavability of 3.
273 ssible deep extensions and high-order kernel support vector machines to predict major histocompatibil
275 robust supervised classification algorithm (Support Vector Machine) to identify characters from sent
276 ive matrix factorization, cluster 'fitness', support vector machine) to resolve rare and common cell-
278 e developed a classifier, GlyStruct based on support vector machine, to predict glycated and non-glyc
283 squares-discriminant analysis (sPLS-DA), and support vector machine tree type entropy (SVMtreeH), are
284 riable selection, partial least squares, and support vector machines using the radial basis function
286 Logistic regression, random forest, and support vector machines were used as candidate methods t
289 icial neural networks; LS-SVM, least squares support vector machine) were applied to building the cal
290 stacked denoising autoencoders combined with support vector machines) were our best performing deep n
291 beats are used as inputs for one-versus-one support vector machine, which is conducted in form of 10
293 rve: random forest (0.782), XGBoost (0.781), support vector machine with linear kernel (0.780), and [
294 ach using (1) data-driven methods, including Support Vector Machine with Recursive Feature Eliminatio
296 asis function and nu regression (SVM-R(NU)), support vector machines with polynomial kernel and epsil
297 kernel and epsilon regression (SVM-P(EPS)), support vector machines with polynomial kernel and nu re
298 stochastic search variable selection (SSVS), support vector machines with radial basis function and e
299 unction and epsilon regression (SVM-R(EPS)), support vector machines with radial basis function and n
300 ods (logistic regression, random forest, and support vector machines) yielded similar results, and a
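Taken together, the hits above catalog one standard workflow: train a support vector machine classifier, typically with a linear or radial basis function kernel, or an SVM regressor (SVR), often after feature selection. As a generic illustration only, here is a hedged scikit-learn sketch on a bundled toy dataset; it reproduces none of the cited studies:

```python
from sklearn import datasets
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

# Generic SVM classification, as referenced throughout the hits above:
# an RBF-kernel SVM on standardized features (swap kernel="linear" for
# the linear SVMs several hits mention). Toy data, not a cited study.
X, y = datasets.load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))
clf.fit(X_train, y_train)
print("test accuracy:", clf.score(X_test, y_test))
```

Replacing SVC with sklearn.svm.SVR gives the regression variant that the SVR hits refer to.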