
Corpus search results (sorted by the first word to the right)

Click a serial number to open the corresponding PubMed page.
1 99m)Tc-tetrofosmin and a solid-state cardiac camera.
2 m were measured using a rotating Scheimpflug camera.
3 the emitted SWCNTs fluorescence using a CMOS camera.
4 an inertial measurement unit and a 3D stereo camera.
5  and quantitatively with use of a smartphone camera.
6 erates chemiluminescence captured with a CCD camera.
7 sensitive cooled charge-coupled device (CCD) camera.
8 photograph captured with an ordinary digital camera.
9 y trained imagers using a wide-field digital camera.
10 regions of the sensor chip in the smartphone camera.
11 focal system than in the conventional fundus camera.
12 c (Visuscout 100(R)-Germany) digital retinal camera.
13 ctral peaks, like a digital zoom in a simple camera.
14 ients are directly imaged using an ultrafast camera.
15 ith a simple flatbed scanner or a smartphone camera.
16 sing a photomultiplier tube (PMT) or digital camera.
17 climbing fish was observed with a high speed camera.
18 films on metallic surfaces using an infrared camera.
19 ent, compared to a conventional flash fundus camera.
20 adout method that requires only a smartphone camera.
21 ve readouts were obtained using a smartphone camera.
22 ging spectral images onto an intensified CCD camera.
23 e of tissue, while imaging this plane onto a camera.
24  photographs were acquired with each retinal camera.
25 igned Single-Photon Avalanche Diodes (SPADs) camera.
26 as12a system was recorded using a smartphone camera.
27 d with temperature monitoring by an infrared camera.
28 bandage and an unmodified commercial digital camera.
29 2 capture montages when possible, UWF fundus cameras.
30 e that parallelizes image acquisition across cameras.
31 y increased from R1 to R3 disease using both cameras.
32 t supervisors and residents were captured by cameras.
33 the prone position proposed with other SPECT cameras.
34 that mimic the capabilities of line-of-sight cameras.
35 ince, but eluded capture by high-speed video cameras.
36 f the lower resolution of conventional Anger cameras.
37 ient alternative to commonly used scientific cameras.
38 evice than for the conventional flash fundus camera (0.071% versus 0.025%, p < 0.001).
39 lting ejecta were recorded by the Deployable CAMera 3 for >8 minutes, showing the growth of an ejecta
40 er budget lower than a 200 mW (10 mW for the camera, 90 mW for the liquid lens and ~100 mW for the co
41  degrees panoramic-view (lesions detected by camera A and B) over the standard 172 degrees -view (les
42                               The smartphone camera acquired the images of fluorescence derived from
43               The spatial resolution of some cameras allowed investigators to confirm 22 flashes with
44 t the light collected from the sample to the camera, an external lens for image formation, and diffus
45                        The system uses a CCD camera and a line laser beam to measure the height of th
46  we have tested a desktop scanner, a digital camera and a smartphone to determine iron using three st
47 ges are simultaneously followed by a digital camera and analyzed by Mathematica software.
48 compared with a conventional tabletop fundus camera and clinical examination.
49        The Callascope contains a 2 megapixel camera and contrast agent spray mechanism housed within
50 -green complex was captured using smartphone camera and DIC was employed using Red, Green, and Blue (
51 ivities (AAAs) for 13 noncardiac adult gamma camera and PET/CT examinations were derived retrospectiv
52 st, position of camera, and distance between camera and sample solution.
53 or between the heart rate estimates from the camera and the average from two reference pulse oximeter
54 ween the respiratory rate estimates from the camera and the reference values (computed from the Elect
55 act setup employing a snapshot hyperspectral camera and the related algorithms for the automated dete
56 nized with a second high-speed visible light camera and two thermocouples to provide insight into the
57 ns simultaneously by polysomnography and 3-D-camera and visual perceptive computing (19 patients with
58 non-invasive approaches, exploiting distance cameras and lasers, have recently arisen to measure the
59 ta is typically acquired on highly sensitive cameras and often requires complex image processing algo
60 n efficacy of OGI surveyors, using their own cameras and protocols, with controlled releases in an 8-
61 olling the visibility of objects to infrared cameras and, more broadly, opportunities for quantum mat
62 ideo, (ii) load platform, (iii) Kinect depth camera, and (iv) Kinematic data.
63 pe of phone, region of interest, position of camera, and distance between camera and sample solution.
64 analytical images were captured by a digital camera, and lipid peroxidation was monitored using color
65 tically tracks pig movement with depth video cameras, and automatically measures standing, feeding, d
66                             Using concurrent camera- and live-trapping, we investigated the local-sca
67                                 A smartphone camera application and an user interface are developed t
68 ce the multichannel pipette and the infrared camera are both operated with batteries.
69                      Both, the laser and the camera are controlled by a Field Programmable Gate Array
70 tless measurements during the night by a 3-D-camera are less time-consuming in comparison to polysomn
71  purchased by every health clinic, so fundus cameras are an inconvenient tool for widespread screenin
72                              However, fundus cameras are too big and heavy to be transported easily a
73  chances of effective treatment where fundus cameras are used to capture retinal image.
74 to recordings from the tablets' front-facing camera (area under the receiver operating characteristic
75     While observational data from systematic camera arrays have informed inferences on the spatiotemp
76 arefully characterize an industry-grade CMOS camera as a cost-efficient alternative to commonly used
77 mages from a digital hand-held non-mydriatic camera at a medical clinic, with dilatation of pupil of
78 ards (a) the detective quantum efficiency of cameras at high resolution, (b) converting phase modulat
79 survey flights with both optical and thermal cameras at the Caliente lava dome, part of the Santiagui
80 onditions and the systematic use of the same camera avoid the periodical calibration of the system im
81                          The PMT and visual (camera)-based detections showed linear responses from 1
82  address the inherent experimental biases in camera-based and MINFLUX-based single-molecule tracking,
83 resolution using wide-field illumination and camera-based detection methods.
84            Specifically, a simple smartphone-camera-based device measures the colour signal provided
85   In this paper, we develop PulseCam - a new camera-based, motion-robust, and highly sensitive blood
86  an intensified charge-coupled device (ICCD) camera-based, time-domain USF imaging system.
87                                 The infrared camera blow rate averaged 49 blows/whale/hour over 5-8 J
88             Police departments use body-worn cameras (body cams) and dashboard cameras (dash cams) to
89  were taken in 4 subjects using 3 smartphone cameras{Bq, Iphone, Nexus}, 2 lighting levels{high 815 l
90                             Using an in-situ camera, bubble dynamics at the source were measured ever
91 livering either positive or negative news to camera, but were not instructed to deliberately or artif
92                             Police body-worn cameras (BWCs) have been widely promoted as a technologi
93 ening using a nonmydriatic hand-held digital camera by trained general physicians in a non-ophthalmic
94 e detected in the field without the need for camera calibration or a fixed imaging position.
95 the significant advantages that smartphones' cameras can provide in teleophthalmology and artificial
96                    Conventional flash fundus cameras capture color images that are oversaturated in t
97  silicon charge-coupled device (CCD), and IR camera CCD to monitor the activity, surface body tempera
98 olded injection process, and a low cost CMOS camera chip.
99 iation is analyzed through a high-resolution camera (CMOS) and the reacted image is processed by a RG
100 Raspberry Pi single-board computer and color camera combined with Fourier ptychography (FP), to compu
101 low-energy collimation was available in this camera configuration.
102                             The fast-optical camera could lead to multiple applications in Quantum In
103            Most of the periods for which the camera could not produce a reliable heart rate estimate
104 y of a Raspberry Pi microcontroller and a Pi camera, coupled with three ultraviolet light emitting di
105 structures and an imaging station, with five cameras covering the whole soil surface, complement the
106  body-worn cameras (body cams) and dashboard cameras (dash cams) to monitor the activity of police of
107                          For example, remote camera data (1314 observations of toad activity times ov
108 ative prior had the best agreement with nest camera data, outperforming TDFs derived from captive fee
109 flow immunoassay paper sensor and smartphone camera demonstrates its potential in DR screening.
110 omparing model output to high-precision nest camera diet estimates.
111 ng technologies, from windows and glasses to cameras, digital displays and photonic devices.
112               However, the implementation of cameras directly mimicking the eyes of common arthropods
113  have emerged as alternatives to multi-pixel cameras due to reduced costs and superior durability.
114 - and standard-speed video and digital still cameras, electric field meters, fast electric-field ante
115 s of its precision, providing guidelines for camera exposure time settings depending on imaging signa
116                                     Instead, camera exposure times that allow the nanoparticle to ful
117                           Utilizing only the camera feed from the smartphone and built-in image proce
118           Using co-located vehicle dashboard camera footage, we find that wiper measurements are a st
119 t to image four 96-well plates with a single camera for automated measurements of activity in a 384-w
120 a nanoprobe compatible with a smartphone RGB camera for detection of nucleic acids.
121  microphone synchronized with two high-speed cameras for 3D flightpath reconstructions.
122 ctroscopy, and fast single-frame FLIM at the camera frame rate with 10(3)-10(5) times higher throughp
123                  With images from the Topcon camera, graders reported similar sensitivities and speci
124                  Demand for smartphones with cameras has driven down the price and size of CMOS image
125 ry Pi, smartphones and other sub-$50 digital cameras) has lowered the barrier to accessing cost-effic
126                                 Single-pixel cameras have emerged as alternatives to multi-pixel came
127 olution, and the need for wiring or external cameras, have limited their application.
128 ith the chest leaning forward on the D.SPECT camera-head at 35 degrees from vertical, had an addition
129 ava fountain height was estimated by thermal camera image processing.
130                       High-frequency digital camera imagery, and vegetation indices derived from that
131 ace hyperspectral remote sensing and digital camera imagery, tower-based CO(2) flux measurements, and
132                                Scintillation camera images contain a large amount of Poisson noise.
133 the retinal vasculature morphology in fundus camera images.
134 of tethered, adult Drosophila using multiple camera images.
135 The automated system performs real-time dual-camera imaging (brightfield & fluorescent), processing,
136                                        alpha-Camera imaging of (225)Ac-L1 revealed high renal cortica
137 ging of the tibial marrow compartment, alpha-camera imaging, and Monte Carlo dosimetry modeling revea
138                                              Camera imaging-based systems can be used to measure thes
139 ssed blood activity concentrations and gamma-camera imaging.
140 ned using the biodistribution data and alpha-camera imaging.
141 h-sensitivity D.SPECT cadmium-zinc-telluride camera in a forward-leaning bikerlike position, which ma
142 stic value of MPI performed with a CZT SPECT camera in a large cohort of patients suspected of having
143  be clearly monitored and imaged by a corona camera in daylight and room light.
144 and respiratory events measured with the 3-D-camera in OSA patients (r = 0.823; p < 0.001).
145  obtained with calibrated and non-calibrated cameras, in different lighting conditions and optical ma
146 or quantifying gait pathology with commodity cameras increase access to quantitative motion analysis
147 ) in the Eidon images compared to the Topcon camera, indicating a greater richness of color.
148                             Video from these cameras informs review of police conduct in disputed cir
149 uses high-precision diet estimates from nest cameras installed on a subset of nests in lieu of a cont
150 te the ability to modify a commercial fundus camera into a low-cost laser speckle contrast imaging (L
151       However, calibration of a smartphone's camera is essential when extracting objective data from
152 self-assemble, and readout by eye or digital camera is possible within 5 minutes of analyte addition,
153  compensate for the highly aberrated re-used camera lens and to compensate for misalignments associat
154 n's evaluation was not affected by different cameras, lighting conditions or optical magnifications,
155 ong all three species, individuals looked at cameras longer when they were young, were associating wi
156  colposcope demonstrated that the Callascope camera meets visual requirements for cervical imaging.
157 to integrated audio-video systems containing cameras, microphones, and sensors, capturing and synthes
158 er, the combination of optical data from the camera module and chemical data obtained by mass analysi
159                                      We used cameras mounted on remotely operated vehicles to study t
160 content, though weaker relationships between camera-NDVI and LAI.
161 und significant linear relationships between camera-NDVI and leaf chlorophyll concentration, and betw
162  leaf chlorophyll concentration, and between camera-NDVI and leaf nitrogen content, though weaker rel
163                                      Several camera networks are already well established, and some c
164 rd bright-field microscope with a lamp and a camera - no laser or interferometry is needed.
165 mproved by attaching microscopic lens on the camera objective.
166 a 3D-printed compact holder and the built-in camera of a smartphone.
167 to-mechanical attachment coupled to the rear camera of the smartphone, which contains two white light
168 n imaging (MPI) performed with a solid-state camera on patients in 2 positions (semiupright, supine)
169 d 172 degrees -view (lesions detected by one camera only).
170                                     Existing camera-only blood perfusion imaging modality suffers fro
171  future infrared camera surveys use multiple cameras optimised for different ranges offshore.
172 lude, for example, controlling an ingestible camera or expelling a kidney stone.
173 g rapid spatiotemporal inspection by digital cameras or the naked eye.
174 nd can be quantified using a cell phone, CCD camera, or plate reader.
175 s depending on imaging signal properties and camera parameters for optimal time resolution.
176                       The rotary Scheimpflug camera (Pentacam(R) type) analyzed the topographic, pach
177 rapeutic activity can interfere with the PET camera performance and degrade both image quality and qu
178                      The method improves the camera performance, enabling fast, low-light and quantit
179 ficantly lower than prior studies focused on camera performance.
180 HdG and that are discernible on a smartphone camera photograph.
181                                ACsN combines camera physics and layered sparse filtering to significa
182  discuss imaging protocols used with the new cameras, potential imaging pitfalls, and the latest data
183 ings show that MPI acquired with a CZT SPECT camera provides excellent prognostic information, with l
184 aticity space, while the conventional fundus camera provides images with a clear shift toward red at
185                                    Using CCD camera quantification of CL with femto-luminol reagent g
186            A high-resolution thermal imaging camera recorded facial imagery, from which a computation
187                        A state-of-the-art IR camera recorded IR images of the subject's breasts, a 3D
188  (i.e., data acquisition time, including CCD camera recording time and intensifier gate delay; focuse
189 works are already well established, and some camera records are a more than a decade long.
190 plementary Metal-Oxide-Semiconductor (sCMOS) camera relies on statistical compensation for its pixel-
191                 Our single-shot polarization camera requires no moving parts, specially patterned pix
192 ouse and rat models of cancer with a thermal camera reveals material heterogeneity and delineates dis
193                  We used a baited underwater camera rig to record the behavioural responses of eight
194 ite sharks spent less time around the baited camera rig when the artificial sound was presented, but
195 M measured by polysomnography and by the 3-D-camera (RLS: r = 0.654; p = 0.004).
196 ally-explicit capture-recapture analyses for camera sampling data, and a removal analysis for removal
197                                              Camera sampling was the least expensive, but had large v
198 ion methods (i.e. non-invasive DNA sampling, camera sampling, and sampling from trapping) by applying
199                                          For camera sampling, effort could only be marginally reduced
200 id mobile and fixed-location ultra-widefield camera screening system was able to be deployed to carry
201 e imaging in humans by utilizing non-contact camera sensing of Cherenkov emission during the radiatio
202 servations of lightning have been limited by camera sensitivity, distance from Jupiter and long expos
203  of post-processing binary raw data from any camera source to improve the sensitivity of functional f
204 bined stress TPD was computed using sex- and camera-specific normal limits.
205 ts 40 Hz imaging of 700 mum-thick volumes if camera speed is sufficient.
206                          We also address the camera speed limitation by introducing Distributed Plana
207 s are limited by volume scanning rate and/or camera speed.
208 the other in Iceland (the latter captured on camera), spontaneously using a small wooden stick to scr
209 ch is compatible with the output of the xcms/CAMERA suite, uses the data-rich output of mass spectrom
210              We suggest that future infrared camera surveys use multiple cameras optimised for differ
211 d LiDAR approach which combines a multi-view camera system for position and distance information, and
212 ra system with a frequency-domain (FD) based camera system regarding their measuring characteristics
213 rformance of an older time-domain (TD) based camera system with a frequency-domain (FD) based camera
214  rotation (orientation) data obtained from a camera system.
215          Lastly, general-purpose solid-state camera systems and hybrid SPECT/CT systems have also bee
216                                 When the two camera systems are used separately to image human partic
217                            Solid-state SPECT camera systems have facilitated dramatic reductions in b
218 ary leap in SPECT imaging with the advent of camera systems that use solid-state crystals and novel c
219 uced by the use of the 3dMDface or Vectra H1 camera systems, in conjunction with the MeshMonk registr
220                       For studies using both camera systems, this upper bound increases to 0.85 mm, o
221 ive imaging using commonly used fluorescence camera systems.
222  the eyelids, nostrils, and mouth by the two camera systems.
223 ur sources, using the 3dMDface and Vectra H1 camera systems.
224 t here, an open-source solution for multiple-camera tandem-lens optical systems for multiparametric m
225 improvements in fluorescent probes and multi-camera technology have increased the power of optical ma
226      Here, we combine high-speed intensified camera technology with a versatile, reconfigurable and d
227 tion capture system (3-D system) and a video camera that was positioned perpendicular to the obstacle
228    Our system makes use of 62 machine vision cameras that encircle an open 2.45 m x 2.45 m x 2.75 m e
229                       Using a smartphone RGB camera, the nanoprobe response can be readily detected a
230                           Using a high-speed camera, three regimes (no cavitation, cavitation, and ps
231 them to those of a conventional flash fundus camera through chromaticity analysis.
232 nch testing for comparison of the Callascope camera to a $20,000 high-end colposcope demonstrated tha
233  equipped to allow an ultra-widefield fundus camera to be mounted inside, and the van was sent with a
234 able sensors, light and sound sensors, and a camera to collect data on patients and their environment
235 tation and emission filters on the flash and camera to image the signal from distinct fluorophore-lab
236 use a new electron source, energy filter and camera to obtain a 1.7 angstrom resolution cryo-EM recon
237 of fixed-location and mobile ultra-widefield cameras to extend the reach of the screening opportunity
238                                We used trail cameras to monitor the fates of size-appropriate chicken
239 the Remidio FOP and a Topcon tabletop fundus camera (Topcon Medical Systems, Inc., Oakland, NJ).
240 ribe a new PLIM imager based on the Timepix3 camera (Tpx3cam) and its application for imaging of O(2)
241 ions using an intensified high-speed optical camera, Tpx3Cam.
242                                 We then used camera trap and high-altitude photographic data to compa
243 orne light detection and ranging (LiDAR) and camera trap data to characterize the response of a tropi
244 trated a stronger looking impulse toward the camera trap device compared to chimpanzees, suggesting h
245           We also investigated the effect of camera trap placement on detection probability.
246                                 Experimental camera trap studies have already begun to play an import
247                                   Only 9% of camera trap studies on predator-prey ecology in our revi
248 review challenges to conducting experimental camera trap studies.
249  conceptual guide for designing experimental camera trap studies.
250                                              Camera trap technology has galvanized the study of preda
251                   We discuss applications of camera trap-based experiments to evaluate the diversity
252                 To illustrate the utility of camera trap-based experiments using a case study, we pro
253 ge carnivores and 11 ungulates across 21,430 camera trap-nights in West Africa.
254 ield sites in Africa) to a novel object, the camera trap.
255 ge methods to classify wildlife species from camera-trap data, we shed light on the methods themselve
256                         Using an exceptional camera-trap dataset spanning 32 protected areas across s
257 tifying individually-unique individuals from camera-trap photos may not be as reliable as previously
258 e animals with individually-unique markings, camera-trap surveys are a benchmark standard for estimat
259  captive snow leopards (Panthera uncia) were camera-trapped on 40 occasions and eight observers indep
260 cascading interactions using high-resolution camera trapping and remote sensing data in the best-pres
261 rchid pollination, and describe novel remote camera trapping methods.
262 density estimates obtained through worldwide camera trapping studies for an emblematic and ecological
263                            Using large-scale camera trapping, we monitored western gorillas in Republ
264 sy darting, scat detection dogs, and regular camera-trapping.
265           We examined surveillance data from camera traps at bait sites and records of wild pig remov
266 onnectivity in red fox social groups, we set camera traps for four consecutive seasons to record cont
267                                        Using camera traps in forest areas with different degrees of d
268  and recapture probabilities were higher for camera traps placed near water resources compared with t
269 ile benefiting from the distinct capacity of camera traps to generate large datasets from multiple sp
270  advances in the experimental application of camera traps to investigate fundamental mechanisms under
271      Experimental study designs that utilize camera traps uniquely allow for testing hypothesized mec
272 mpling technique that leverages imagery from camera traps with conventional distance sampling, valida
273                    However, such pairings of camera traps with experimental methods remain underutili
274                                        Using camera traps, we evaluated crossing rates of a pipeline
275 ars on marked pumas with detection data from camera-traps substantially improved density estimates by
276             We applied clustered sampling to camera-traps to detect marked (collared) and unmarked pu
277 adova, Italy) and a traditional flash fundus camera (TRC-NW8, Topcon Corporation, Tokyo, Japan) were
278 oncave mirror eyes in scallops and the large camera-type eyes of the more derived cephalopods.
279  fluorescence is measured using a high-speed camera under spatially multiplexed two-colour laser illu
280 ngth and in the IR detection range of our IR camera) validate our theoretical findings.
281 of detection of a whale blow by the infrared camera was the same at night as during the day.
282 th at half-maximum intensity, HWHM), but the camera was unlikely to have detected the dim outer edges
283                      A Phantom v7 high-speed camera was used to record images at a rate of 66,700 fra
284                                     A single camera was used to simultaneously image two acrylic, FTI
285   By mounting the new lenses on a commercial camera, we demonstrate their functionality, showing how
286                The images obtained using the camera were converted into pressure/force maps using a p
287                            Additional fundus cameras were placed in several high-volume clinic locati
288 detected in the epi-direction by a far-field camera where individual binding events show up as diffra
289 ctively captured by a wearable point-of-view camera which provides a unique viewpoint.
290                                       A mini camera, which is mounted on the tip of the pen, magnifie
291 the LFs were carried out with a mobile phone camera whose imaging resolution was improved by attachin
292 ustly combining the video recording from the camera with a pulse waveform measured using a convention
293 at panel detector placed in front of a gamma camera with cone beam collimator focused at the x-ray fo
294 8)F as they were imaged on a preclinical PET camera with increasing activities of (177)Lu.
295 tomography (SPECT) is feasible using cardiac cameras with solid-state detectors.
296 ovel approach combining multiple ultraviolet cameras with synchronous aerial measurements, we calcula
297                                              Cameras with tracking software were used to visually tra
298 e-gated Single Photon Avalanche Diode (SPAD) camera, with acquisition rates up to 1 Hz.
299 y that successfully captured an image from a camera, with resolution and clarity that remain impressi
300 s enable a compact, full-Stokes polarization camera without standard polarization optics.
301 d scene are outside the line of sight of the camera, without requiring controlled or time-varying ill
