Corpus search results (sorted by the first word after the keyword)
Click a serial number to display the corresponding PubMed page.
1 issociation, electron transfer dissociation, decision tree).
2 oftware was used to construct a risk-benefit decision tree.
3 esponds to the implementation of a molecular decision tree.
4 uture sequences; instead, one must prune the decision tree.
5 as values of individual branching steps in a decision tree.
6 ed in 2 formats, a standard "x/y" list and a decision tree.
7 hical decision making in a task with a small decision tree.
8 based on sequential detection of SNPs and a decision tree.
9 cally accumulates upon advancing through the decision tree.
10 d number of teeth identified AMI in the CART decision trees.
11 interlinked web pages, online databases, and decision trees.
12 ve methods significantly improve accuracy of decision trees.
13 performing a formal decision analysis using decision trees.
14 l user interface for building and evaluating decision trees.
15 ng regression based on conditional-inference decision trees.
19 ate methods backpropagation neural networks, decision tree, adaboost with decision tree, and support
21 s, we designed and embedded a data-dependent decision tree algorithm (DT) to make unsupervised, real-
22 ared the prediction performances of a single decision tree algorithm C4.5 with six different decision
23 method, a pattern recognition algorithm (the decision tree algorithm C4.5) is then used to construct
26 Our machine learning approach based on a decision tree algorithm successfully provided several se
27 ces from the UniProt database and employed a decision tree algorithm to identify the sequence determi
28 Self-Organizing Global Ranking algorithm, a decision tree algorithm, and a support vector machine al
29 MediBoost significantly outperformed current decision tree algorithms in 11 out of 13 problems, givin
32 nce-based attributes of the graph to train a decision tree allowed us to increase accuracy of SNP cal
34 Furthermore, the techniques utilized for decision tree analysis have broad range of applicability
35 olar screening hit, PD012527, use of Topliss decision tree analysis led to the discovery of the nanom
42 functional imaging with PET/CT and recursive decision-tree analysis to combine measurements of tumor
44 RBF Neural Nets, MLP Neural Nets, Bayesian, Decision Tree and Random Forest methods have been used
46 d the relationships among GO attributes with decision trees and Bayesian networks, using the annotati
47 iBoost's performance to that of conventional decision trees and ensemble methods on 13 medical classi
48 eviously-described genefinder which utilizes decision trees and Interpolated Markov Models (IMMs).
50 ere, two different machine-learning methods, decision trees and support vector machines (SVMs), are a
52 ort vector machines, hidden Markov model and decision tree) and developed an algorithm (SpiderP) for
53 eural networks, decision tree, adaboost with decision tree, and support vector machine, which interro
54 ard machine-learning algorithms-naive Bayes, decision trees, and artificial neural networks-can be ap
56 based on a simple exclusion mechanism and a decision tree approach using the C4.5 data-mining algori
57 etting, this study shows that the use of the decision tree approach with the option of substituting c
64 ilable treatment modalities and to provide a decision tree as a general guide for clinicians to aid i
65 ox methods including logistic regression and decision trees as well as less interpretable techniques
66 work using machine learning method, in which decision tree based classifier ensembles coupled with fe
67 ome by a novel mathematical approach using a decision tree based on genes ranked by increasing varian
69 ision tree algorithm C4.5 with six different decision-tree based classifier ensembles (Random forest,
70 vaccination program in the United States, a decision tree-based analysis was conducted with populati
71 ction results indicate that our (non-linear) decision tree-based classifier can predict operons in a
74 g three types of machine-learning technique: decision trees (C4.5), Naive Bayes and Learning Classifi
75 but uses a non-parametric procedure based on decision trees (called 'jump trees') to reconstruct the
80 iscriminant analysis, 3-nearest-neighbor and decision trees (CART)-using both synthetic and real brea
82 healthy men were used to train and develop a decision tree classification algorithm that used a nine-
84 iminating rules generated with the automated decision tree classifier allowed for discrimination betw
85 nsemble classifiers always outperform single decision tree classifier in having greater accuracies an
87 , by using the enriched rules in alternating decision tree classifiers, we are able to determine the
89 baseline, RA prevalence was higher using the decision tree compared with the list approach (63% versu
90 This study evaluated the performance of the decision tree compared with the list approach in the asc
94 n of redundant sequences; (ii) clustering by decision trees coupled with analysis of ClustalW alignme
96 -CID/ETD workflow combines the best possible decision tree dependent MS(2) data acquisition modes cur
101 racy of two fingerprint-based classifiers, a decision tree (DT) algorithm and a medoid classification
105 als and coral reef ecosystems, and propose a decision tree for incorporating assisted evolution into
112 show better generalization performance, but decision trees have the advantage of generating interpre
115 hod consistently improves the performance of decision trees in predicting peptide-MHC class I binding
122 ine, and outperforms other learning methods (decision trees, k-nearest neighbour and naive Bayes).
123 ing set is provided to an adaptively boosted decision tree learner to create a classifier for predict
124 e phase) was investigated in a pragmatic and decision tree-like performance evaluation strategy.
125 ian and outperformed other learning methods (decision trees, linear discriminate analysis, and k-near
131 ring the training and feature selection, the decision tree method achieves 82.6% overall accuracy wit
134 neural net approach and also in contrast to decision tree methods described recently, the pharmacoph
138 70% overall classification accuracy and the decision tree model reveals insights concerning the comb
148 und to be a cheaper diagnostic strategy in a decision tree model when United Kingdom-based costs were
149 ir levels of enzymatic activity based on the decision tree model which showed GPX and total protein a
157 Machine learning optimization resulted in a decision tree modeling that predicted T2D incidence with
158 ity in emergency admissions, using empirical decision Tree models because they are intuitive and may
161 opic Roux-en-Y gastric bypass and to develop decision tree models to optimize diagnostic accuracy.
165 We used five statistical learning methods (decision trees, neural networks, support vector regressi
166 he samples were classified using the oblique decision tree (OC1) algorithm to provide a procedure for
167 tioning analysis (RPA), a method of building decision trees of significant prognostic factors for out
168 solve subtasks, and maladaptively prune the decision trees of subtasks in a reflexive manner upon en
172 ression tree analysis was used to generate a decision tree predicting 28-day mortality based on a com
176 ulation of single cells using a data-derived decision tree representation that can be applied to cell
179 of support vector machines, neural networks, decision trees, self-organizing feature maps (SOFM) and
181 The CT + PET strategy in the conservative decision tree showed a saving of $1154 per patient witho
182 e CT + PET strategy in the less conservative decision tree showed a savings of $2267 per patient but
183 nces, and secondary structural features in a decision tree that categorizes mRNAs into those with pot
184 ntation of the discriminant functions into a decision tree that constitutes a new program called Firs
186 ur current understanding of those tests, the decision tree that is involved in this process, and an e
188 dicted with a high predictive accuracy using decision trees that include four to six readily attainab
189 ediBoost, a novel framework for constructing decision trees that retain interpretability while having
190 of an established machine learning method, a decision tree, that can rigorously classify sequences.
192 se metrics are used to create a RandomForest decision tree to classify the sequencing data, and PARTI
193 ecursive partitioning was used to generate a decision tree to determine the likelihood that a bactere
194 in conjunction with a testing and treatment decision tree to estimate the cost-effectiveness of birt
196 s and different genetic causes, we propose a decision tree to guide clinical genetic testing in patie
197 herapeutic challenges are highlighted, and a decision tree to guide treatment in patients with early
199 annotation comparison results, we designed a decision tree to obtain a consensus functional annotatio
200 alysis algorithms along with a sophisticated decision tree to place reads into "best fit" functional/
202 Our objective was to develop a user-friendly decision tree to predict which organisms are ESBL produc
203 HE score are steps toward the development of decision trees to define EoE subpopulations and, consequ
206 charomyces cerevisiae by using probabilistic decision trees to integrate multiple types of data, incl
208 Inductive Logic Programming data mining, and decision trees to produce prediction rules for functiona
209 the findings of this review into a pragmatic decision tree, to guide the further management of the in
210 n of a particular machine learning approach, decision trees, to the tasks of predicting a protein's s
213 as random forests, which relies on a set of decision trees trained using length, sequence and other
216 The resulting trees are of the same type as decision trees used throughout clinical practice but hav
218 riteria were applied cumulatively, while the decision tree was applied cross-sectionally using either
225 d accuracy (95% confidence intervals) of the decision tree were 82.4% (63.9%-93.9%), 0% (0%-10.4%), a
229 chine (SVM), k-Nearest Neighbors (k-NN), and Decision Tree, were employed to study the sophisticated
230 tion accuracies compared to that of a single decision tree when applied to a pancreatic cancer proteo
231 In this economic evaluation, we combined a decision tree with a Markov state transition model to co
235 ecision nodes finds a large number of unique decision trees with an average sensitivity and specifici
Technical terms (usages) not yet included in WebLSD can be submitted via "新規対訳" (New Bilingual Entry).