Corpus search results (sorted by the word one position after the keyword)
Click a serial number to open the corresponding PubMed page
1 -Transformer), and Bayesian Optimization (BO-Transformer).
2 between the fiber and the imprinted optical transformer.
3 ncoder, hidden space embedding, and temporal Transformer.
4 he sex differentiation pathway downstream of transformer.
5 1.0% unassisted and 92.3% assisted by DeepDR-Transformer.
6 appropriate expression of the female form of transformer.
7 lled by the somatic sex differentiation gene transformer.
8 lus influenzae Rd is a gram-negative natural transformer.
9 ocusing on convolutional neural networks and transformers.
10 same computational principle as that used in transformers.
11 Do humans learn like transformers?
12 gets of the nematode global sexual regulator Transformer 1 (TRA-1), a transcription factor acting at
13 la regulatory proteins Transformer (Tra) and Transformer 2 (Tra2) recruit different members of the SR
14 4; a member of the SR protein family), human transformer 2 beta (hTra2 beta; an exonic splicing enhan
16 ne depends upon the constitutively expressed transformer-2 (tra-2) gene, but does not seem to require
19 ex determination genes transformer (tra) and transformer-2 (tra-2) switch fru splicing from the male-
20 oint mutation in the sex-determination gene, transformer-2 (tra-2), using CRISPR/Cas9 (clustered regu
26 A multi-scale sparse temporal autoencoder Transformer (3D-CNN-MSTA-Transformer) is proposed for se
27 to the chart review, Generative Pre-trained Transformer 4 achieved 87.8%-98.8% accuracies for identi
28 duce AOVIFT (adaptive optical vision Fourier transformer)-a machine learning-based aberration sensing
29 ins and small molecules, we introduce Ligand-Transformer, a deep learning method based on the transfo
32 Here we introduce GET (general expression transformer), an interpretable foundation model designed
33 o other deep-learning architectures based on Transformer and ConvMixer in three different case studie
36 eed industry, are sometimes adulterated with transformer and mineral oil as a means of illegally incr
39 machine learning system leveraging a vision transformer and supervised contrastive learning for the
40 et, and two attention-based backbones Vision Transformer and Swin Transformer to localize the importa
43 ybrid model consisting of double encoders of transformers and a deep neural network to learn the rela
45 cussing the key parameters for the design of transformers and generators as well as the development o
47 odules, distributing them in two subclasses: transformers and generators, which can be described resp
48 ave encoded DNA sequences into vectors using Transformers and have shown promising results in tasks i
49 inated biphenyls (PCBs), once used widely in transformers and other applications, and 1,1-dichloro-2,
50 we design and test an ensemble of our Vision Transformers and the ConvNeXt, outperforming the state-o
51 tering, available in two versions: one using transformers and the other convolutional neural networks
52 etermine differences in rCBV values between "transformers" and "nontransformers" at defined time poin
53 dom Search (RS-Transformer), Grid Search (GS-Transformer), and Bayesian Optimization (BO-Transformer)
54 hanism based on RT-DETR (Real-Time Detection Transformer), and thus the method is coded as HSA-RTDETR
55 are extracted using Adaptive Causal Decision Transformers, and the DCCMSA-DCKTKT model parameters are
58 lly, we demonstrated that TPN2.0 employing a transformer architecture enabled guideline-adhering, phy
59 end pipeline that leverages state of the art transformer architecture in conjunction with QA approach
60 Additionally, we propose a novel WeedSwin Transformer architecture specifically designed to addres
61 quencing model called Cascadia, which uses a transformer architecture to handle the more complex data
62 find the highest performance by combining a transformer architecture with a new modified chaos game
63 a generative adversarial network based on a transformer architecture with self-attention on coarse-g
64 tationS), a deep learning model based on the Transformer architecture, designed to predict chromatin
67 we modify the GPT(6) (generative pretrained transformer) architecture to model the progression and c
68 eneration process and incorporating advanced transformer architectures and general knowledge, our app
69 ectral encoder integrating convolutional and transformer architectures to project raw MS2 spectra int
71 identified the sex-specific splicing factor transformer as a functionally significant target of miR-
73 rward-propagated trajectories of the trained transformer, at densities not encountered during trainin
74 rther propose a novel bi-branch masked graph transformer autoencoder (BatmanNet) to learn molecular r
79 ings underscored the superior performance of transformer-based architectures compared to traditional
88 ich integrate features learned from advanced transformer-based deep neural network with cell-specific
90 RNAs and human language, we present LAMAR, a transformer-based foundation LAnguage Model for RNA Regu
91 n performance (average AUROC decay of 3% for transformer-based foundation model vs. 7% for count-LR a
92 our experimentation with the proposed Vision Transformer-based framework, it achieves a higher accura
94 s have existed for decades, the emergence of transformer-based large language models (LLMs) has capti
95 sage passing baselines and recently proposed transformer-based methods on more than 70 node and graph
96 -IPT outperforms both CNN-based and existing transformer-based methods, achieving superior quality ac
100 opment of an interpretable and generalizable transformer-based model that accurately predicts cancer
105 of diverse programs using specially trained transformer-based networks and then filtering and cluste
111 Experimental outcomes revealed that the Swin Transformer-based YOLOv10 model delivered the best overa
113 directional encoder representations from the transformers (BERT) approach, we pre-train Ped-BERT via
114 s bidirectional encoder representations from transformers (BERT) pretrained on 1.14 million scientifi
115 o bidirectional encoder representations from transformers (BERT) with different subsets of reports (f
116 g bidirectional encoder representations from transformers (BERT), is introduced to explore AI trends
117 a bidirectional encoder representations from transformers (BERT), the robustly optimized BERT approac
118 ucleotide (FAD/FADH2) redox center acts as a transformer by accepting two electrons from soluble nico
120 xpression of downstream target genes such as transformer, by regulating pre-mRNA splicing and mRNA t
121 entum transformation metasurface (i.e., meta-transformer), by fully synergizing intrinsic properties
122 ith contemporary models such as LSTM and the Transformer, C-FGM clearly achieved a higher accuracy (M
123 les efficient photon transport, the photonic transformer can operate with a near-unity conversion eff
124 g models, such as bidirectional encoder from transformer, can incorporate massive unlabeled molecular
127 revealed that rapaprotin acts as a molecular transformer, changing from an inactive cyclic form into
130 AI) platforms, such as Generative Pretrained Transformer (ChatGPT), have achieved a high degree of po
131 d limitations of Chat Generative Pre-trained Transformer (ChatGPT): its ability to (i) define a core
132 nels to reduce computational costs; The MSTA Transformer classification module consists of four parts
136 sion Transformer (ViT), Data-efficient Image Transformer (DeiT), ImageIntern, and Swin Transformer (v
140 gerprints, including autoencoder embeddings, transformer embeddings, and topological Laplacians.
141 a novel deep learning model consisting of a transformer encoder layer, a convolutional network backb
142 pathology slides by combining a pre-trained transformer encoder with a transformer network for patch
143 th the existing deep learning models, the MR-Transformer exhibited state-of-the-art performance in pr
144 ore, eliminating the expression of Fru(M) by transformer expression in OCT/tyramine neurons changes t
145 Using the GAL4/UAS system to manipulate transformer expression, we feminized or masculinized dif
146 s takeout are feminized by expression of the Transformer-F protein, male courtship behavior is dramat
147 wer cables, magnetic energy-storage devices, transformers, fault current limiters and motors, largely
148 Bidirectional encoder representations from transformers for biomedical text mining (BioBERT) for sp
152 1-3p, that has predicted target sites in the transformer gene (Bdtra) required for female sex determi
154 rtion of cassette exons from the C. capitata transformer gene into a heterologous tetracycline-repres
155 cally spliced intron from the C. hominivorax transformer gene within the Lshid gene ensures that only
157 By manipulating either the fruitless or transformer genes in the brains of male or female flies,
158 creating a customized generative pretrained transformer (GPT) within the OpenAI GPT Store and search
159 avinci-003 variant of Generative Pre-trained Transformer (GPT)-3) on a range of analogical tasks, inc
160 odels (LLMs), such as generative pre-trained Transformers (GPTs), across various tasks, they often pe
161 ough conventional methods: Random Search (RS-Transformer), Grid Search (GS-Transformer), and Bayesian
162 ortable X-ray generator, replacing the heavy transformer had been necessary to generate power with a
164 phase combinations, such ultra-compact meta-transformer has potential in information storage, nanoph
165 ong the invasion chronosequence of an exotic transformer, Heracleum mantegazzianum, which was replica
166 sNet outperforms VGG16, ResNet50, and Vision Transformer in accuracy, precision, recall, and F1-score
169 basic fibroblast growth factor, a candidate transformer in Xenopus, caused malformation but not hind
175 f the key sex-determination gene tra-2 (tra, transformer) is controlled by a 28-nucleotide repeat ele
176 emporal autoencoder Transformer (3D-CNN-MSTA-Transformer) is proposed for sentiment classification of
177 th gene, fit (female-specific independent of transformer), is not controlled by tra and dsx, suggesti
178 ATGO lies in the utilization of pre-trained transformer language models which can extract discrimina
180 ing model, DeepECtransformer, which utilizes transformer layers as a neural network architecture to p
181 on a pretrained Local Feature Matching with Transformer (LoFTR) model to verify each component's pre
182 ement SKiM with a knowledge graph built with transformer machine-learning models to aid in interpreti
185 long short-term memory (BiLSTM) model and a Transformer model and can analyze POD5 and FAST5 signal
187 mpared to baseline models, the proposed LSTM-Transformer model for Caltech data shows a significant i
188 /threonine kinases to develop an explainable Transformer model for kinase-peptide interaction predict
189 sting sets, the results demonstrate that the Transformer model optimized by MOEBS significantly outpe
190 anning the years 2020-2050 using the Stacked Transformer model show a progressive decrease in the LAI
191 Here, we introduce SEQUOIA, a linearized transformer model that predicts cancer transcriptomic pr
194 e Bidirectional Encoder Representations from Transformers model is demonstrated on an analog inferenc
195 e-Bidirectional Encoder Representations from Transformers model was used to encode unstructured text
196 e bidirectional encoder representations from transformers model, our model detects nested entities an
198 node attributes, xSiGra employs hybrid graph transformer models to delineate spatial cell types.
200 ntion-based convolutional neural network and Transformer models, DeepPhosPPI accurately predicts phos
201 ges unsupervised pre-training features and a Transformer module to learn both compound and protein ch
207 e graph, (2) utilizing a heterogeneous graph transformer network, and (3) computing relationship scor
211 ng technology recently developed for spatial transformer networks, we propose ZephIR, an image regist
212 sanovo, a machine learning model that uses a transformer neural network architecture to translate the
218 BVB's and soya oil samples adulterated with transformer oil and mineral oil were characterised using
222 phy (UV-NIL) to fabricate functional optical transformers onto the core of an optical fiber in a sing
228 alysis reveals that only 1.8% of residential transformers require upgrades, while over 80% of commerc
230 ive to that of the germline (by manipulating transformer) reveal a dominant role for the soma in regu
231 al experimental data to fine-tune regression transformer (RT) models for generative molecular design
234 nsplantations show that a spatially distinct transformer signal emanates from tissues other than the
236 equential, opening up the opportunity to use transformers, state-of-the-art deep learning architectur
237 AS-OCT images of 5746 eyes and (2) a Vision Transformer supervised model coupled to the autoencoder
238 we introduce a set of models including a 3D transformer (SwinUNetR) and a novel 3D self-supervised l
239 Even though higher macro-F1 is achieved by transformer systems, based on the findings, a meticulous
244 around a three-dimensional multistage vision transformer that operates on Fourier domain embeddings.
245 ere, we present Graphormer-RT, a novel graph transformer that performs the first single-model method-
246 methods, such as CNN, LSTM, BiLSTM, GRU, and Transformer, the hyperparameter optimization of these mo
248 gy to encode protein features by pre-trained transformer to address the issue of scarce labelled data
249 odify a recently proposed architecture named Transformer to capture the interactions between the two
251 -based backbones Vision Transformer and Swin Transformer to localize the important regions of insect
252 metry measurements is to design an impedance transformer to match the impedance of the load to the ch
253 nostic model, HERPAI, was developed based on Transformer to predict risk of invasive disease-free sur
254 operating in the nonlinear regime acts as a transformer to simultaneously boost the energy and brigh
255 novel sequences we use self-attention-based transformers to capture global patterns in sequences.
256 we designed new architectures for these two Transformers to enable Grad-CAM to be applicable in such
258 BetaAlign draws on NLP techniques and trains transformers to map a set of unaligned biological sequen
259 ly, WaveCastNet also generalizes better than transformers to rare and critical seismic scenarios, suc
260 odule) and image-based deep learning (DeepDR-Transformer), to provide individualized diabetes managem
261 ntegrates convolutional neural networks with transformer, to predict effects of genetic variations on
263 We show that the sex determination gene transformer (tra) acts in the developing nervous system,
264 phila suppressor-of-white-apricot (SWAP) and Transformer (Tra) alternative splicing factors, we found
265 the role of the sexual differentiation genes transformer (tra) and doublesex (dsx) in regulating the
266 ed and include the hermaphroditic (her), the transformer (tra) and feminization (fem) mutations.
267 nown sex determination pathway, specifically transformer (tra) and its downstream target doublesex (d
268 show that the Drosophila regulatory proteins Transformer (Tra) and Transformer 2 (Tra2) recruit diffe
269 that the Drosophila sex determination genes transformer (tra) and transformer-2 (tra-2) switch fru s
270 entify a key role for sex determination gene transformer (tra) in regulating the male-female differen
272 RA-2) protein which, in combination with the transformer (TRA) protein, directs sex-specific splicing
273 tein to continually instruct the target gene transformer (tra) to make its feminizing product, TRA-F.
276 e sex determination pathway through the gene transformer (tra), the expression of roX1 is independent
277 npf (ms-npf) expression is controlled by the transformer (tra)-dependent sex-determination pathway.
280 lf-supervised representations using a Vision Transformer, trained on 1.7 M histology images across 23
281 tide repeat elements, which are required for transformer/transformer-2-mediated splicing of the femal
283 t was highest for EfficientNet V2-S and Swin Transformer V2-B at 94.63% and 94.34%, respectively.
284 guage processing, the Generative Pre-trained Transformer version 2 (GPT-2), is shown to generate mean
286 rained foundational model combining a vision transformer (ViT) image encoder with PubMedBERT text enc
287 By leveraging satellite imagery and Vision Transformer (ViT) models, SolarScope achieves an Area un
288 ntNetB6, EfficientNetB7, DenseNet201, Vision Transformer (ViT), Data-efficient Image Transformer (Dei
291 n medical and veterinary diagnostics, Vision Transformers (ViT) remain relatively unexplored in this
292 Five transformer-based architectures-Swin Transformer, ViT, PVT, MobileViT, and Axial Transformer-
293 hybrid network of EfficientNet-B4 and Swin-B transformer was developed to classify patients according
294 Transformer, ViT, PVT, MobileViT, and Axial Transformer-were systematically evaluated, resulting in
298 driven tool leveraging generative pretrained transformers with retrieval-augmented generation technol
299 latory inputs from the soma (as regulated by transformer) with those from the germline (involving ovo