
Corpus search results (sorted by the word one position to the right)

Click a serial number to display the corresponding PubMed page.
1 -Transformer), and Bayesian Optimization (BO-Transformer).
2  between the fiber and the imprinted optical transformer.
3 ncoder, hidden space embedding, and temporal Transformer.
4 he sex differentiation pathway downstream of transformer.
5 1.0% unassisted and 92.3% assisted by DeepDR-Transformer.
6 appropriate expression of the female form of transformer.
7 lled by the somatic sex differentiation gene transformer.
8 lus influenzae Rd is a gram-negative natural transformer.
9 ocusing on convolutional neural networks and transformers.
10 same computational principle as that used in transformers.
11                         Do humans learn like transformers?
12 gets of the nematode global sexual regulator Transformer 1 (TRA-1), a transcription factor acting at
13 la regulatory proteins Transformer (Tra) and Transformer 2 (Tra2) recruit different members of the SR
14 4; a member of the SR protein family), human transformer 2 beta (hTra2 beta; an exonic splicing enhan
15 sing and is dependent on the function of the transformer 2 gene.
16 ne depends upon the constitutively expressed transformer-2 (tra-2) gene, but does not seem to require
17        The Drosophila sex determination gene transformer-2 (tra-2) is a splicing regulator that affec
18       Among the best studied of these is the transformer-2 (TRA-2) protein which, in combination with
19 ex determination genes transformer (tra) and transformer-2 (tra-2) switch fru splicing from the male-
20 oint mutation in the sex-determination gene, transformer-2 (tra-2), using CRISPR/Cas9 (clustered regu
21                          The splicing factor Transformer-2 (Tra2) is expressed almost ubiquitously in
22       In the male germline of Drosophila the transformer-2 protein is required for differential splic
23  characterized one of the CPT targets, Tra2 (Transformer-2).
24 contributes 25% to emissions, soils 31%, and transformers 21%.
25                                              Transformer 2beta1 (Tra2beta1) is a splicing effector pr
26    A multi-scale sparse temporal autoencoder Transformer (3D-CNN-MSTA-Transformer) is proposed for se
27  to the chart review, Generative Pre-trained Transformer 4 achieved 87.8%-98.8% accuracies for identi
28 duce AOVIFT (adaptive optical vision Fourier transformer)-a machine learning-based aberration sensing
29 ins and small molecules, we introduce Ligand-Transformer, a deep learning method based on the transfo
30                      Furthermore, we use the Transformer, a state-of-the-art machine learning model,
31  is at least partially controlled by somatic transformer activity.
32    Here we introduce GET (general expression transformer), an interpretable foundation model designed
33 o other deep-learning architectures based on Transformer and ConvMixer in three different case studie
34 ng each of the two backbone types with seven transformer and convolution block types.
35                  The sex determination genes transformer and doublesex ensure that hub formation occu
36 eed industry, are sometimes adulterated with transformer and mineral oil as a means of illegally incr
37  and prediction statistics were obtained for transformer and mineral oils.
38                                         Both transformer and recurrent-based foundation models genera
39  machine learning system leveraging a vision transformer and supervised contrastive learning for the
40 et, and two attention-based backbones Vision Transformer and Swin Transformer to localize the importa
41                            Code for the gait transformer and the trained weights are available at .
42 ets compared to current benchmarks like Swin-Transformer and ViT.
43 ybrid model consisting of double encoders of transformers and a deep neural network to learn the rela
44                                              Transformers and building sealants represent the largest
45 cussing the key parameters for the design of transformers and generators as well as the development o
46                    The chemistry achieved by transformers and generators is (ideally) independent of
47 odules, distributing them in two subclasses: transformers and generators, which can be described resp
48 ave encoded DNA sequences into vectors using Transformers and have shown promising results in tasks i
49 inated biphenyls (PCBs), once used widely in transformers and other applications, and 1,1-dichloro-2,
50 we design and test an ensemble of our Vision Transformers and the ConvNeXt, outperforming the state-o
51 tering, available in two versions: one using transformers and the other convolutional neural networks
52 etermine differences in rCBV values between "transformers" and "nontransformers" at defined time poin
53 dom Search (RS-Transformer), Grid Search (GS-Transformer), and Bayesian Optimization (BO-Transformer)
54 hanism based on RT-DETR (Real-Time Detection Transformer), and thus the method is coded as HSA-RTDETR
55 are extracted using Adaptive Causal Decision Transformers, and the DCCMSA-DCKTKT model parameters are
56                                Employing the transformer architecture alongside dilated convolutions,
57                  By incorporating a scalable transformer architecture and a diffusion module for buil
58 lly, we demonstrated that TPN2.0 employing a transformer architecture enabled guideline-adhering, phy
59 end pipeline that leverages state of the art transformer architecture in conjunction with QA approach
60    Additionally, we propose a novel WeedSwin Transformer architecture specifically designed to addres
61 quencing model called Cascadia, which uses a transformer architecture to handle the more complex data
62  find the highest performance by combining a transformer architecture with a new modified chaos game
63  a generative adversarial network based on a transformer architecture with self-attention on coarse-g
64 tationS), a deep learning model based on the Transformer architecture, designed to predict chromatin
65                    Based on a variant of the transformer architecture, the model learns to call CNVs
66 sformer, a deep learning method based on the transformer architecture.
67  we modify the GPT(6) (generative pretrained transformer) architecture to model the progression and c
68 eneration process and incorporating advanced transformer architectures and general knowledge, our app
69 ectral encoder integrating convolutional and transformer architectures to project raw MS2 spectra int
70                                      Because Transformers are so successful across a wide variety of
71  identified the sex-specific splicing factor transformer as a functionally significant target of miR-
72 at includes a Dense Associative Memory and a Transformer as two limiting cases.
73 rward-propagated trajectories of the trained transformer, at densities not encountered during trainin
74 rther propose a novel bi-branch masked graph transformer autoencoder (BatmanNet) to learn molecular r
75                       By leveraging a shared transformer backbone, MR-IPT effectively learns universa
76                   AMPDeep fine-tunes a large transformer based model on a small amount of peptides an
77 0 to 3-4), as were PVH-Rank predictions by a transformer-based AI model ["PVPI"].
78                   Furthermore, we apply this transformer-based approach to three additional inference
79 ings underscored the superior performance of transformer-based architectures compared to traditional
80                                         Five transformer-based architectures-Swin Transformer, ViT, P
81              Additionally, the AFB uses Swin Transformer-based attention and deformable convolution-b
82 of 0.9697, outperforming GRU, LSTM, RNN, and transformer-based baselines.
83              Finally, we employ a supervised transformer-based classifier to perform the HSI classifi
84                        First, using BEHRT (a transformer-based deep learning architecture), the embed
85                       Here, we develop a new transformer-based deep learning model called UNADON, whi
86                                 Conclusion A transformer-based deep learning model increased cardiac
87                          CelloType leverages transformer-based deep learning techniques for improved
88 ich integrate features learned from advanced transformer-based deep neural network with cell-specific
89                        By employing a vision transformer-based encoder, it discerns latent image-gene
90 RNAs and human language, we present LAMAR, a transformer-based foundation LAnguage Model for RNA Regu
91 n performance (average AUROC decay of 3% for transformer-based foundation model vs. 7% for count-LR a
92 our experimentation with the proposed Vision Transformer-based framework, it achieves a higher accura
93                  Here, we introduce L-MAP, a transformer-based large language model that is trained o
94 s have existed for decades, the emergence of transformer-based large language models (LLMs) has capti
95 sage passing baselines and recently proposed transformer-based methods on more than 70 node and graph
96 -IPT outperforms both CNN-based and existing transformer-based methods, achieving superior quality ac
97                 Additionally, we developed a transformer-based model (MolBART) to predict all end poi
98  developed T-cell receptor cross (TCRoss), a transformer-based model for large-scale learning.
99               Here, we introduce HLApollo, a transformer-based model for peptide-MHC-I (pMHC-I) prese
100 opment of an interpretable and generalizable transformer-based model that accurately predicts cancer
101                                            A transformer-based model was developed and validated usin
102                                Deep learning transformer-based models using longitudinal electronic h
103                                   Customized transformer-based models were also trained to perform st
104                   Compared with standard and transformer-based models, the model attains better resul
105  of diverse programs using specially trained transformer-based networks and then filtering and cluste
106                                      A novel transformer-based neural network architecture was traine
107                                    Here, the transformer-based neural network model was first pre-tra
108                 Here we develop a recurrent, transformer-based neural network that learns to decode t
109                       Here, we develop a new transformer-based pipeline for end-to-end biomarker pred
110                               DetectYSF uses Transformer-based PLMs (e.g., BERT, RoBERTa) as its back
111 Experimental outcomes revealed that the Swin Transformer-based YOLOv10 model delivered the best overa
112                        A previously trained, transformer-based, deep learning model automatically cat
113 directional encoder representations from the transformers (BERT) approach, we pre-train Ped-BERT via
114 s bidirectional encoder representations from transformers (BERT) pretrained on 1.14 million scientifi
115 o bidirectional encoder representations from transformers (BERT) with different subsets of reports (f
116 g bidirectional encoder representations from transformers (BERT), is introduced to explore AI trends
117 a bidirectional encoder representations from transformers (BERT), the robustly optimized BERT approac
118 ucleotide (FAD/FADH2) redox center acts as a transformer by accepting two electrons from soluble nico
119 xpression of downstream target genes such as transformer by regulating RNA splicing.
120 xpression of downstream target genes such as transformer , by regulating pre-mRNA splicing and mRNA t
121 entum transformation metasurface (i.e., meta-transformer), by fully synergizing intrinsic properties
122 ith contemporary models such as LSTM and the Transformer, C-FGM clearly achieved a higher accuracy (M
123 les efficient photon transport, the photonic transformer can operate with a near-unity conversion eff
124 g models, such as bidirectional encoder from transformer, can incorporate massive unlabeled molecular
125                                 Furthermore, transformer capacity requirement analysis reveals that o
126                       Expression of a female transformer cDNA under the control of a dsf enhancer in
127 revealed that rapaprotin acts as a molecular transformer, changing from an inactive cyclic form into
128                  Chat Generative Pre-trained Transformer (ChatGPT) has demonstrated preliminary initi
129 especially OpenAI Chat Generative Pretrained Transformer (ChatGPT), a large language model.
130 AI) platforms, such as Generative Pretrained Transformer (ChatGPT), have achieved a high degree of po
131 d limitations of Chat Generative Pre-trained Transformer (ChatGPT): its ability to (i) define a core
132 nels to reduce computational costs; The MSTA Transformer classification module consists of four parts
133 tronics, including a single rf amplifier and transformer coil.
134                                  The LM meta-transformer converts red, green and blue illuminations t
135                                   Built on a transformer decoder and trained with causal masking, Tok
136 sion Transformer (ViT), Data-efficient Image Transformer (DeiT), ImageIntern, and Swin Transformer (v
137                        Among these, the Swin Transformer demonstrated the highest performance, achiev
138 CHC profile that differs from doublesex- and transformer-dependent CHC dimorphism.
139                     We also demonstrate that transformer-dependent interactions influence whether XX
140 gerprints, including autoencoder embeddings, transformer embeddings, and topological Laplacians.
141  a novel deep learning model consisting of a transformer encoder layer, a convolutional network backb
142  pathology slides by combining a pre-trained transformer encoder with a transformer network for patch
143 th the existing deep learning models, the MR-Transformer exhibited state-of-the-art performance in pr
144 ore, eliminating the expression of Fru(M) by transformer expression in OCT/tyramine neurons changes t
145      Using the GAL4/UAS system to manipulate transformer expression, we feminized or masculinized dif
146 s takeout are feminized by expression of the Transformer-F protein, male courtship behavior is dramat
147 wer cables, magnetic energy-storage devices, transformers, fault current limiters and motors, largely
148   Bidirectional encoder representations from transformers for biomedical text mining (BioBERT) for sp
149                                 Based on the transformer framework, the model decomposes the traditio
150                For comparison, ClarityNet, a transformer-free variant of the architecture, is present
151 ap, this paper introduces the Focused Reward Transformer (FRT) .
152 1-3p, that has predicted target sites in the transformer gene (Bdtra) required for female sex determi
153                     Transgenic expression of transformer gene during development transforms chromosom
154 rtion of cassette exons from the C. capitata transformer gene into a heterologous tetracycline-repres
155 cally spliced intron from the C. hominivorax transformer gene within the Lshid gene ensures that only
156                      By targeting the medfly transformer gene, we also demonstrate how CRISPR-Cas9 ge
157      By manipulating either the fruitless or transformer genes in the brains of male or female flies,
158  creating a customized generative pretrained transformer (GPT) within the OpenAI GPT Store and search
159 avinci-003 variant of Generative Pre-trained Transformer (GPT)-3) on a range of analogical tasks, inc
160 odels (LLMs), such as generative pre-trained Transformers (GPTs), across various tasks, they often pe
161 ough conventional methods: Random Search (RS-Transformer), Grid Search (GS-Transformer), and Bayesian
162 ortable X-ray generator, replacing the heavy transformer had been necessary to generate power with a
163                                              Transformers had a slightly (but not statistically signi
164  phase combinations, such ultra-compact meta-transformer has potential in information storage, nanoph
165 ong the invasion chronosequence of an exotic transformer, Heracleum mantegazzianum, which was replica
166 sNet outperforms VGG16, ResNet50, and Vision Transformer in accuracy, precision, recall, and F1-score
167                     Conversely, knockdown of transformer in chromosomal females eliminates the female
168 s an increasing role compared to traditional transformer in some specific power supply systems.
169  basic fibroblast growth factor, a candidate transformer in Xenopus, caused malformation but not hind
170 as ChatGPT (OpenAI), a generative pretrained transformer, in radiology.
171 ther germline gene, ovo, whose regulation is transformer-independent.
172 rging locations can be supported with modest transformer infrastructure (25 to 50 kVA).
173      Availability: The implementation of GRN-transformer is available at
174                                         Each transformer is used to make predictions of its correspon
175 f the key sex-determination gene tra-2 (tra, transformer) is controlled by a 28-nucleotide repeat ele
176 emporal autoencoder Transformer (3D-CNN-MSTA-Transformer) is proposed for sentiment classification of
177 th gene, fit (female-specific independent of transformer), is not controlled by tra and dsx, suggesti
178  ATGO lies in the utilization of pre-trained transformer language models which can extract discrimina
179 and a single negatively acting receptor, Eye Transformer/Latran (Et/Lat).
180 ing model, DeepECtransformer, which utilizes transformer layers as a neural network architecture to p
181  on a pretrained Local Feature Matching with Transformer (LoFTR) model to verify each component's pre
182 ement SKiM with a knowledge graph built with transformer machine-learning models to aid in interpreti
183               We instead propose a 'cochlear transformer' mechanism to interpret cochlear performance
184      Experimental results using the proposed transformer methodology on audio clips of gunshots show
185  long short-term memory (BiLSTM) model and a Transformer model and can analyze POD5 and FAST5 signal
186                GAI uses machine learning and transformer model architectures to generate useful text,
187 mpared to baseline models, the proposed LSTM-Transformer model for Caltech data shows a significant i
188 /threonine kinases to develop an explainable Transformer model for kinase-peptide interaction predict
189 sting sets, the results demonstrate that the Transformer model optimized by MOEBS significantly outpe
190 anning the years 2020-2050 using the Stacked Transformer model show a progressive decrease in the LAI
191     Here, we introduce SEQUOIA, a linearized transformer model that predicts cancer transcriptomic pr
192 ical representations from a generative music transformer model to predict human brain activity.
193                Here, we introduce RiboTIE, a transformer model-based approach designed to enhance the
194 e Bidirectional Encoder Representations from Transformers model is demonstrated on an analog inferenc
195 e-Bidirectional Encoder Representations from Transformers model was used to encode unstructured text
196 e bidirectional encoder representations from transformers model, our model detects nested entities an
197                   Additionally, it evaluates Transformer models optimized through conventional method
198 node attributes, xSiGra employs hybrid graph transformer models to delineate spatial cell types.
199                Application of fine-tuned NLP transformer models to routine ED visit data to automate
200 ntion-based convolutional neural network and Transformer models, DeepPhosPPI accurately predicts phos
201 ges unsupervised pre-training features and a Transformer module to learn both compound and protein ch
202 correlations, derived from an adaptive graph Transformer module.
203 ing a pre-trained transformer encoder with a transformer network for patch aggregation.
204                                        Gated transformer network models trained with OCT data may be
205                                        Gated transformer network performance was compared with non-de
206                                        Gated transformer network performance was worse for eyes with
207 e graph, (2) utilizing a heterogeneous graph transformer network, and (3) computing relationship scor
208         After the development of the famous "Transformer" network architecture and the meteoric rise
209                                        Gated transformer networks trained and optimized with the othe
210          Combining the advantages of NAS and Transformer networks, local and global information is ca
211 ng technology recently developed for spatial transformer networks, we propose ZephIR, an image regist
212 sanovo, a machine learning model that uses a transformer neural network architecture to translate the
213                           Further leveraging transformer neural networks for the design of phase-only
214                    Here, we demonstrate that Transformer Neural Networks learn atom-mapping informati
215 nd biosynthetic reactions through end-to-end transformer neural networks.
216                   Here, we introduce TECSAS (Transformer of Epigenetics to Chromatin Structural Annot
217  Synechococcus and Aphanizomenon as the main transformers of mercury.
218  BVB's and soya oil samples adulterated with transformer oil and mineral oil were characterised using
219 amples collected from a site contaminated by transformer oil.
220                          By pre-training the transformer on extensive synthetic materials data and fi
221 ct layer outperforms existing deep MPNNs and transformers on QM9.
222 phy (UV-NIL) to fabricate functional optical transformers onto the core of an optical fiber in a sing
223                                              Transformer parental RNAi could be used to produce all m
224                        The imprinted optical transformer probe was used in an actual NSOM measurement
225                 Here, we present Protein Set Transformer (PST), a protein-based genome language model
226                                 The OAM meta-transformer reconstructs different topologically charged
227       ChatGPT's (Chat Generative Pre-Trained Transformer) remarkable capacity to generate human-like
228 alysis reveals that only 1.8% of residential transformers require upgrades, while over 80% of commerc
229  than more resource-intensive models such as transformers, resulting in faster inference times.
230 ive to that of the germline (by manipulating transformer) reveal a dominant role for the soma in regu
231 al experimental data to fine-tune regression transformer (RT) models for generative molecular design
232  architecture, is presented to highlight the transformer's impact.
233                                 In contrast, transformers showed a continuous increase in rCBV up to
234 nsplantations show that a spatially distinct transformer signal emanates from tissues other than the
235                                              Transformer splicing regulatory proteins determine the s
236 equential, opening up the opportunity to use transformers, state-of-the-art deep learning architectur
237  AS-OCT images of 5746 eyes and (2) a Vision Transformer supervised model coupled to the autoencoder
238  we introduce a set of models including a 3D transformer (SwinUNetR) and a novel 3D self-supervised l
239   Even though higher macro-F1 is achieved by transformer systems, based on the findings, a meticulous
240 one female-specific isoforms of T. castaneum transformer (Tctra) were identified.
241  vein using the linear variable differential transformer technique.
242 ime points were also significantly higher in transformers than in nontransformers.
243 t Coulomb drag is analogous to an electrical transformer that functions at zero frequency.
244 around a three-dimensional multistage vision transformer that operates on Fourier domain embeddings.
245 ere, we present Graphormer-RT, a novel graph transformer that performs the first single-model method-
246 methods, such as CNN, LSTM, BiLSTM, GRU, and Transformer, the hyperparameter optimization of these mo
247                       By applying the vision transformer to a lung-tumour dataset, we identified and
248 gy to encode protein features by pre-trained transformer to address the issue of scarce labelled data
249 odify a recently proposed architecture named Transformer to capture the interactions between the two
250                    Cryo2Struct utilizes a 3D transformer to identify atoms and amino acid types in cr
251 -based backbones Vision Transformer and Swin Transformer to localize the important regions of insect
252 metry measurements is to design an impedance transformer to match the impedance of the load to the ch
253 nostic model, HERPAI, was developed based on Transformer to predict risk of invasive disease-free sur
254  operating in the nonlinear regime acts as a transformer to simultaneously boost the energy and brigh
255  novel sequences we use self-attention-based transformers to capture global patterns in sequences.
256  we designed new architectures for these two Transformers to enable Grad-CAM to be applicable in such
257 ls that represent residues as frames and use transformers to learn relative positions.
258 BetaAlign draws on NLP techniques and trains transformers to map a set of unaligned biological sequen
259 ly, WaveCastNet also generalizes better than transformers to rare and critical seismic scenarios, suc
260 odule) and image-based deep learning (DeepDR-Transformer), to provide individualized diabetes managem
261 ntegrates convolutional neural networks with transformer, to predict effects of genetic variations on
262       They found that lipoxygenases are like Transformer toys, being converted from one enzyme type t
263      We show that the sex determination gene transformer (tra) acts in the developing nervous system,
264 phila suppressor-of-white-apricot (SWAP) and Transformer (Tra) alternative splicing factors, we found
265 the role of the sexual differentiation genes transformer (tra) and doublesex (dsx) in regulating the
266 ed and include the hermaphroditic (her), the transformer (tra) and feminization (fem) mutations.
267 nown sex determination pathway, specifically transformer (tra) and its downstream target doublesex (d
268 show that the Drosophila regulatory proteins Transformer (Tra) and Transformer 2 (Tra2) recruit diffe
269  that the Drosophila sex determination genes transformer (tra) and transformer-2 (tra-2) switch fru s
270 entify a key role for sex determination gene transformer (tra) in regulating the male-female differen
271 ) within the regulated 3' splice site of the transformer (tra) pre-mRNA.
272 RA-2) protein which, in combination with the transformer (TRA) protein, directs sex-specific splicing
273 tein to continually instruct the target gene transformer (tra) to make its feminizing product, TRA-F.
274 ethal (Sxl) acting on its switch-gene target transformer (tra) to produce Tra(F) protein.
275 he alternative splicing of Sex-lethal (Sxl), transformer (tra), and Ultrabithorax (Ubx).
276 e sex determination pathway through the gene transformer (tra), the expression of roX1 is independent
277 npf (ms-npf) expression is controlled by the transformer (tra)-dependent sex-determination pathway.
278 morphism by controlling only the switch gene transformer (tra).
279 ination gene hierarchy, Sex-lethal (Sxl) and transformer (tra).
280 lf-supervised representations using a Vision Transformer, trained on 1.7 M histology images across 23
281 tide repeat elements, which are required for transformer/transformer-2-mediated splicing of the femal
282 ile BinaryET is trained from scratch using a transformer-type architecture.
283 t was highest for EfficientNet V2-S and Swin Transformer V2-B at 94.63% and 94.34%, respectively.
284 guage processing, the Generative Pre-trained Transformer version 2 (GPT-2), is shown to generate mean
285 ge Transformer (DeiT), ImageIntern, and Swin Transformer (versions v1 and v2).
286 rained foundational model combining a vision transformer (ViT) image encoder with PubMedBERT text enc
287   By leveraging satellite imagery and Vision Transformer (ViT) models, SolarScope achieves an Area un
288 ntNetB6, EfficientNetB7, DenseNet201, Vision Transformer (ViT), Data-efficient Image Transformer (Dei
289                                     A Visual Transformer (ViT)-based DL model that integrates informa
290  including ResNet-50, ResNet-101, and Vision Transformer (ViT).
291 n medical and veterinary diagnostics, Vision Transformers (ViT) remain relatively unexplored in this
292    Five transformer-based architectures-Swin Transformer, ViT, PVT, MobileViT, and Axial Transformer-
293 hybrid network of EfficientNet-B4 and Swin-B transformer was developed to classify patients according
294  Transformer, ViT, PVT, MobileViT, and Axial Transformer-were systematically evaluated, resulting in
295                      First, we propose Blend-Transformer, which introduces Group External Attention (
296                      dsx splicing depends on transformer, which is also alternatively spliced such th
297 erforms LSTM whose average MAPE is 4.34% and Transformer whose average MAPE is 5.42%).
298 driven tool leveraging generative pretrained transformers with retrieval-augmented generation technol
299 latory inputs from the soma (as regulated by transformer) with those from the germline (involving ovo
300  compared with a reference multistage fusion transformer without these novel strategies.