Corpus search results (sorted by the first word to the right of the keyword)
Click a serial number to open the corresponding PubMed page.
1 Markov models) and discriminative learning (support vector machines).
2 ods (logistic regression, decision trees and support vector machines).
3 discriminative methods (lasso regression and support vector machines).
4 -predictor, MetaPred2CS, which is based on a support vector machine.
5 re for each residue to be interfacial with a support vector machine.
6 a one-dimensional Bayesian classifier with a support vector machine.
7 , and used by a well-matched one-versus-rest support vector machine.
8 lity and hydrophobicity) are used to train a support vector machine.
9 forest, naive Bayes, K-nearest neighbors and support vector machine.
10 e feature vectors which were used to train a support vector machine.
11 method is based on supervised learning using support vector machines.
12 comparable to that of common methods such as Support Vector Machines.
13 eria to build an siRNA design algorithm with support vector machines.
14 e multivariate searchlight analysis based on support vector machines.
15 ictive techniques, such as random forest and support vector machines.
20 or a low confidence class assignment by the support vector machine algorithm at 1 or both sites are
21 een 94.1% and 100%, with the outcomes of the Support Vector Machine algorithm being identical to thos
23 d classification method was established on a support vector machine algorithm, and the reference stan
25 model was developed using random forest and support vector machine algorithms and was then applied t
26 Raman microspectroscopy in combination with support vector machines allow an identification of impor
29 ed (hierarchical clustering) and supervised (support vector machine) analyses of these different dist
31 excellent classification and accuracy, while support vector machine analysis leads to the quantificat
36 re used to train two distinct classifiers, a support vector machine and an easy to interpret exhausti
37 that our method performed as efficiently as support vector machine and Naive Bayesian and outperform
39 rs with comparable accuracy were selected by support vector machine and regression models and include
40 twork on the lincRNA data sets compared with support vector machine and traditional neural network.
41 as evaluated by radial basis function kernel support vector machines and 10-fold cross validation yie
42 hmark our method against various regression, support vector machines and artificial neural network mo
43 d and compared machine learning classifiers (support vector machines and conditional random fields) o
44 ypes and disease pathway genes compared with support vector machines and Label Propagation in cross-v
45 f network has better performance compared to Support Vector Machines and Neural Networks on the prote
46 pplied machine learning techniques including support vector machines and neural networks to identify
47 o-stage classification system comprising two support vector machines and one linear discriminant anal
48 enomics classification techniques, including Support Vector Machines and Prediction Analysis for Micr
49 ervised classification techniques, including Support Vector Machines and Random Forest, to phase dele
50 chine learning approaches and find that both support vector machines and random forests (RFs) can pro
51 ome outperforms state-of-the-art models like Support Vector Machines and Random Forests for gene expr
53 R (partial least squares, random forest, and support vector machine), and the most predictive model w
54 g approaches--a naive Bayes classifier and a support vector machine, and a random forest model--was l
55 ast squares discriminant analysis, recursive-support vector machine, and random forests revealed 196
59 tion methods such as logistic regression and support vector machine approaches may be suboptimal.
61 As a complement of experimental methods, a support vector machine based-method is proposed to ident
62 of topological features and then leverages a support vector machine-based approach to identify predic
64 ons in sample and environmental conditions), support vector machine-based least-squares nonlinear reg
65 dy in Arabidopsis and created an integrative support vector machine-based localization predictor call
68 eventual clinical translation, we also apply support vector machine-based multivariate pattern analys
74 d (RNA-protein interaction predictor), a new support-vector machine-based method, to predict protein-
75 datasets, random forests are outperformed by support vector machines both in the settings when no gen
76 ested by a large body of literature to date, support vector machines can be considered "best of class
78 east squared-discriminant analysis (PLS-DA), support vector machine classification analysis (SVM-C),
79 odium and strontium were used to construct a support vector machine classification model, obtaining a
80 paper we employ graph-theoretic measures and support vector machine classification to assess, in 12 h
82 ntegrates six sequence-based methods using a support vector machine classifier and has been intensive
85 were subsequently used in conjunction with a support vector machine classifier to create a map of het
87 using geometric criteria, we have trained a support vector machine classifier to predict the likelih
90 ed 97.2% accuracy for classification using a support vector machine classifier with radial basis.
91 natory features, which in conjunction with a support vector machine classifier yielded an Az of 0.73
92 ence: a position weight matrix approach with support vector machine classifier, and RELIEF attribute
97 ng the Random Forest, k Nearest Neighbor and Support Vector Machine classifiers show that POS achieve
99 subjects, using both k nearest neighbors and support vector machine classifiers, demonstrate that the
101 landscape of a model plant genome, we used a support vector machine computational algorithm trained o
102 supervised machine learning model, recursive-support vector machine, could classify abiotic and bioti
103 d histologic assessment, we have developed a support vector machine-derived decision algorithm, which
105 nostic or prognostic classifiers modelled by support vector machine, diagonal discriminant analysis,
106 ysis (PLSDA), K nearest neighbors (KNN), and support vector machines discriminant analysis (SVMDA).
108 ial least squares discriminant analysis, and support vector machines discriminated well cassava sampl
109 approach called, Doubly Optimized Calibrated Support Vector Machine (DOC-SVM), concurrently optimizes
110 e refined the features to predictive models (support vector machine, elastic net) and validated those
112 basis of the breast tissue components, and a support vector machine framework was used to develop a s
113 niques such as relevance vector machines and support vector machines have been applied to predictive
114 machine learning approach, Hybrid huberized support vector machine (HH-SVM) to prioritize phospholip
115 uated different machine learning strategies (support vector machines, hidden Markov model and decisio
117 o, and has been shown to be superior to, the support vector machine in situations that are fundamenta
118 protein sequences as text documents and uses support vector machine in text classification for allerg
121 nsfer is slightly more discriminating than a support vector machine learner using profiles and predic
124 teria-specific melt curves are identified by Support Vector Machine learning, and individual pathogen
125 discriminating pattern was extracted using a support vector machine-learning algorithm performing an
126 orithms including artificial neural network, support vector machine, logistic regression, and a novel
127 ipal component analysis (PCA), least squares-support vector machines (LS-SVM) and PCA-back propagatio
133 high gamma (70-150 Hz) time features with a support vector machine model to classify individual word
134 ned and tested on a linear regression model, support vector machine model, and neural network model,
135 igns these peptides probability scores using Support Vector Machine model, whose feature set includes
137 t, multivariate adaptive regression splines, support vector machine, naive Bayes' classification, and
138 ive tumor classification methods such as the support vector machine often appear like a black box not
139 ed, machine learning (ML) algorithms, namely support vector machine or random forest (RF) classifiers
140 random forests combined hierarchically with support vector machines or logistic regression (LR), and
142 In leave-one-out cross-validation using support vector machines- or top-scoring gene pair classi
143 mpared to other state-of-the-art models like Support Vector Machine (p=0.03, p=0.13, and p<0.001) and
144 ination (i.e., higher AUCs) when compared to Support Vector Machine (p=0.38, p=0.29, and p=0.047) and
145 (PLSR), principal component analysis with a support vector machine (PCA + SVM) and principal compone
148 rning methods including logistic regression, support vector machine, random forest, Gaussian naive Ba
149 ith 3 different machine-learning algorithms (support vector machines, random forests, and artificial
151 y of the feature selection methods including support vector machine recursive feature elimination (SV
152 m the well-known feature selection method of Support Vector Machine Recursive Feature Elimination and
153 y parameterized machine learning algorithms: Support Vector Machine, Recursive Partitioning, Random F
154 of microarray data with Tanimoto kernel for support vector machine reduces the effect of the choice
155 sets, we have compared the CRF and recursive support vector machines (RSVM) approaches to feature sel
156 Bayesian network, semi-definite programming-support vector machine (SDP-SVM), relevance vector machi
158 tasets, the probe alignment kernel used with support vector machines significantly improved patient s
159 Square Discrimination Analysis "PLS-DA" and Support Vectors Machines "SVM"), was able to discriminat
162 onal variables for noncorrected (NC) data by Support Vector Machine (SVM) algorithm, a computer metho
165 between people with ASD and controls using a support vector machine (SVM) analytic approach, and to f
166 pport vector machine ("tarSVM"), that uses a Support Vector Machine (SVM) and a new score the normali
167 A) of these two brain measures, using linear support vector machine (SVM) and cross-validation, predi
168 mber of classifiers (multi-layer perceptron, Support Vector Machine (SVM) and Dempster-Shafer ensembl
169 alysis (PLS-DA), k-nearest neighbors (k-NN), support vector machine (SVM) and Random Forest (RF) were
170 n by the popular 1 df chi-squared statistic, support vector machine (SVM) and the random forest (RF)
172 f the FABS algorithm: FABS-SVM that utilizes support vector machine (SVM) as black box, and FABS-Spec
180 molecular properties as features, we trained support vector machine (SVM) classifiers to discriminate
183 hat classification using the Elastic Net and Support Vector Machine (SVM) clearly outperforms competi
184 g (TMT) quantitative proteomics approach and Support Vector Machine (SVM) cluster analysis of three c
188 we present a bioinformatics method based on support vector machine (SVM) learning that identifies se
189 ibility phenotypes were separated by using a support vector machine (SVM) machine learning algorithm.
191 ral features were used to establish a linear support vector machine (SVM) model of sixteen diagnostic
192 nd used this dataset to successfully train a Support Vector Machine (SVM) model that predicts an addi
195 9 cancer types, to build more than 2 x 10(8) Support Vector Machine (SVM) models for reconstructing a
196 o a sample vapor and, when processed using a support vector machine (SVM) pattern recognition algorit
197 Partial least squares regression (PLS) and support vector machine (SVM) regression methods were use
198 nown answers and open questions using linear support vector machine (SVM) resulted in an above-chance
199 the functional families in CATH; building a support vector machine (SVM) to automatically assign dom
200 e developed an algorithm that uses a trained support vector machine (SVM) to determine variants from
202 present study is to explore the novel use of support vector machine (SVM) to predict infarct on a pix
209 st square-discriminant analysis (PLS-DA) and support vector machine (SVM) were applied to the results
210 ovel computational methodology, which uses a support vector machine (SVM) with kmer sequence features
211 optimal set of features and train them using Support Vector Machine (SVM) with linear kernel to selec
212 ne learning algorithms (Naive Bayes (NB) and Support Vector Machine (SVM)) using three different data
213 eural net (multi-layer perception, MLP), and support vector machine (SVM), all of which were created
214 e sequence features were used as inputs to a support vector machine (SVM), allowing the assignment of
215 e mould recognition methods were built using support vector machine (SVM), back-propagation neural ne
217 (EBMC)) and three regression methods (i.e., Support Vector Machine (SVM), Logistic Regression (LR),
218 echanism by building predictive models using Support Vector Machine (SVM), Random Forest (RF) and k-N
219 we compare these two data sets and develop a support vector machine (SVM)-based classifier to discrim
228 aliber differentiation algorithm is based on support vector machines (SVM) and partial least squares
229 igation as an example, we propose the use of support vector machines (SVM) as a nonlinear classificat
230 these curved effects, we propose the use of support vector machines (SVM) as a nonlinear regression
231 was developed through the integration of the support vector machines (SVM) classifier and ensemble le
233 ndom walks with an ensemble of probabilistic support vector machines (SVM) classifiers, and we show t
234 developed a machine learning approach using support vector machines (SVM) for automatic VP placement
236 rtitioning and Regression Trees (RPART), and Support Vector Machines (SVM) in the presence and absenc
238 neural networks (NN), fuzzy models (FM), and support vector machines (SVM) to predict physicochemical
240 chniques known as black box methods, such as support vector machines (SVM), random forests and AdaBoo
249 e-of-interest (VOI) and voxel-based (using a support vector machine [SVM] approach) (18)F-FDG PET ana
251 hniques: principal component analysis (PCA), support vector machines (SVMs) and hierarchical cluster
258 ti-dimensional vector space and then applies support vector machines (SVMs) to measure the separation
260 ning the input sequences against an array of Support Vector Machines (SVMs), each examining the relat
262 rformance of three machine learning methods (support vector machines (SVMs), multilayer perceptrons (
264 of our system is operated by an ensemble of support vector machines (SVMs), where each SVM is traine
265 diction engine is operated by an ensemble of support vector machines (SVMs), with each SVM trained on
270 standard classification techniques [such as support vector machines (SVMs)] in a number of ways: fle
272 amework that utilises a nearest-neighbour or support vector machine system, to integrate heterogeneou
273 iant filtering pipeline, targeted sequencing support vector machine ("tarSVM"), that uses a Support V
274 stering, Bayesian learning and inference and support vector machine tasks to be performed for heterog
275 -based features are then used to construct a support vector machine that can be used for accurate pre
276 ar relationship between voxels judged by the support vector machine to be highly infiltrated and subs
277 el two-step algorithm, COMBI, first trains a support vector machine to determine a subset of candidat
280 spectral latent features were then fed into support vector machines to fine-tune the classification
281 arning theory have led to the application of support vector machines to MRI for detection of a variet
282 ssible deep extensions and high-order kernel support vector machines to predict major histocompatibil
283 robust supervised classification algorithm (Support Vector Machine) to identify characters from sent
287 riable selection, partial least squares, and support vector machines using the radial basis function
288 ed machine learning algorithm, hidden Markov support vector machines, was then used to classify the u
289 controls), both relevance vector machine and support vector machine 'weighting factors' (used for mak
292 mong protein superfamilies in SCOP database, support vector machines were trained on four sets of dis
295 icial neural networks; LS-SVM, least squares support vector machine) were applied to building the cal
296 alysis, k-Nearest neighbor, naive Bayes, and support vector machines, when tested on the Microarray Q
297 ision tree, adaboost with decision tree, and support vector machine, which interrogated the intrinsic
298 beats are used as inputs for one-versus-one support vector machine, which is conducted in form of 10
299 eneralization to different test sets, as did support vector machines, whose hyperparameters could not
300 We developed a linear programming based support vector machine with L(1) and joint L(1,infinity)
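
Many of the concordance lines above describe the same basic workflow: features are extracted, used to train a support vector machine (often with a linear or radial basis function kernel), and the classifier is evaluated by k-fold cross-validation. The following is a minimal illustrative sketch of that pattern, not taken from any of the cited studies; it assumes scikit-learn and uses a bundled toy dataset purely as a stand-in for the feature matrices mentioned in the abstracts.

```python
# Illustrative only: an RBF-kernel support vector machine evaluated with
# 10-fold cross-validation, mirroring the usage pattern in the lines above.
# Assumes scikit-learn; the dataset is a toy stand-in, not study data.
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

# Toy binary-classification data (feature matrix X, labels y).
X, y = load_breast_cancer(return_X_y=True)

# Standardize features, then fit an SVM with a radial basis function kernel.
model = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0, gamma="scale"))

# 10-fold cross-validation, reporting mean and spread of accuracy.
scores = cross_val_score(model, X, y, cv=10)
print(f"mean accuracy: {scores.mean():.3f} +/- {scores.std():.3f}")
```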
Technical terms or usages not yet included in WebLSD can be submitted via 新規対訳 ("New Translation Pair").