Corpus search results (sorted by the word one position after the keyword)
Click a serial number to open the corresponding PubMed page
1 99m)Tc-tetrofosmin and a solid-state cardiac camera.
2 m were measured using a rotating Scheimpflug camera.
3 the emitted SWCNTs fluorescence using a CMOS camera.
4 an inertial measurement unit and a 3D stereo camera.
5 and quantitatively with use of a smartphone camera.
6 erates chemiluminescence captured with a CCD camera.
7 sensitive cooled charge-coupled device (CCD) camera.
8 photograph captured with an ordinary digital camera.
9 y trained imagers using a wide-field digital camera.
10 regions of the sensor chip in the smartphone camera.
11 focal system than in the conventional fundus camera.
12 c (Visuscout 100(R)-Germany) digital retinal camera.
13 ctral peaks, like a digital zoom in a simple camera.
14 ients are directly imaged using an ultrafast camera.
15 ith a simple flatbed scanner or a smartphone camera.
16 sing a photomultiplier tube (PMT) or digital camera.
17 climbing fish was observed with a high speed camera.
18 films on metallic surfaces using an infrared camera.
19 ent, compared to a conventional flash fundus camera.
20 adout method that requires only a smartphone camera.
21 ve readouts were obtained using a smartphone camera.
22 ging spectral images onto an intensified CCD camera.
23 e of tissue, while imaging this plane onto a camera.
24 photographs were acquired with each retinal camera.
25 igned Single-Photon Avalanche Diodes (SPADs) camera.
26 as12a system was recorded using a smartphone camera.
27 d with temperature monitoring by an infrared camera.
28 bandage and an unmodified commercial digital camera.
29 2 capture montages when possible, UWF fundus cameras.
30 e that parallelizes image acquisition across cameras.
31 y increased from R1 to R3 disease using both cameras.
32 t supervisors and residents were captured by cameras.
33 the prone position proposed with other SPECT cameras.
34 that mimic the capabilities of line-of-sight cameras.
35 ince, but eluded capture by high-speed video cameras.
36 f the lower resolution of conventional Anger cameras.
37 ient alternative to commonly used scientific cameras.
39 lting ejecta were recorded by the Deployable CAMera 3 for >8 minutes, showing the growth of an ejecta
40 er budget lower than a 200 mW (10 mW for the camera, 90 mW for the liquid lens and ~100 mW for the co
41 degrees panoramic-view (lesions detected by camera A and B) over the standard 172 degrees -view (les
44 t the light collected from the sample to the camera, an external lens for image formation, and diffus
46 we have tested a desktop scanner, a digital camera and a smartphone to determine iron using three st
50 -green complex was captured using smartphone camera and DIC was employed using Red, Green, and Blue (
51 ivities (AAAs) for 13 noncardiac adult gamma camera and PET/CT examinations were derived retrospectiv
53 or between the heart rate estimates from the camera and the average from two reference pulse oximeter
54 ween the respiratory rate estimates from the camera and the reference values (computed from the Elect
55 act setup employing a snapshot hyperspectral camera and the related algorithms for the automated dete
56 nized with a second high-speed visible light camera and two thermocouples to provide insight into the
57 ns simultaneously by polysomnography and 3-D-camera and visual perceptive computing (19 patients with
58 non-invasive approaches, exploiting distance cameras and lasers, have recently arisen to measure the
59 ta is typically acquired on highly sensitive cameras and often requires complex image processing algo
60 n efficacy of OGI surveyors, using their own cameras and protocols, with controlled releases in an 8-
61 olling the visibility of objects to infrared cameras and, more broadly, opportunities for quantum mat
63 pe of phone, region of interest, position of camera, and distance between camera and sample solution.
64 analytical images were captured by a digital camera, and lipid peroxidation was monitored using color
65 tically tracks pig movement with depth video cameras, and automatically measures standing, feeding, d
70 tless measurements during the night by a 3-D-camera are less time-consuming in comparison to polysomn
71 purchased by every health clinic, so fundus cameras are an inconvenient tool for widespread screenin
74 to recordings from the tablets' front-facing camera (area under the receiver operating characteristic
75 While observational data from systematic camera arrays have informed inferences on the spatiotemp
76 arefully characterize an industry-grade CMOS camera as a cost-efficient alternative to commonly used
77 mages from a digital hand-held non-mydriatic camera at a medical clinic, with dilatation of pupil of
78 ards (a) the detective quantum efficiency of cameras at high resolution, (b) converting phase modulat
79 survey flights with both optical and thermal cameras at the Caliente lava dome, part of the Santiagui
80 onditions and the systematic use of the same camera avoid the periodical calibration of the system im
82 address the inherent experimental biases in camera-based and MINFLUX-based single-molecule tracking,
85 In this paper, we develop PulseCam - a new camera-based, motion-robust, and highly sensitive blood
89 were taken in 4 subjects using 3 smartphone cameras{Bq, Iphone, Nexus}, 2 lighting levels{high 815 l
91 livering either positive or negative news to camera, but were not instructed to deliberately or artif
93 ening using a nonmydriatic hand-held digital camera by trained general physicians in a non-ophthalmic
95 the significant advantages that smartphones' cameras can provide in teleophthalmology and artificial
97 silicon charge-coupled device (CCD), and IR camera CCD to monitor the activity, surface body tempera
99 iation is analyzed through a high-resolution camera (CMOS) and the reacted image is processed by a RG
100 Raspberry Pi single-board computer and color camera combined with Fourier ptychography (FP), to compu
104 y of a Raspberry Pi microcontroller and a Pi camera, coupled with three ultraviolet light emitting di
105 structures and an imaging station, with five cameras covering the whole soil surface, complement the
106 body-worn cameras (body cams) and dashboard cameras (dash cams) to monitor the activity of police of
108 ative prior had the best agreement with nest camera data, outperforming TDFs derived from captive fee
113 have emerged as alternatives to multi-pixel cameras due to reduced costs and superior durability.
114 - and standard-speed video and digital still cameras, electric field meters, fast electric-field ante
115 s of its precision, providing guidelines for camera exposure time settings depending on imaging signa
119 t to image four 96-well plates with a single camera for automated measurements of activity in a 384-w
122 ctroscopy, and fast single-frame FLIM at the camera frame rate with 10(3)-10(5) times higher throughp
125 ry Pi, smartphones and other sub-$50 digital cameras) has lowered the barrier to accessing cost-effic
128 ith the chest leaning forward on the D.SPECT camera-head at 35 degrees from vertical, had an addition
131 ace hyperspectral remote sensing and digital camera imagery, tower-based CO(2) flux measurements, and
135 The automated system performs real-time dual-camera imaging (brightfield & fluorescent), processing,
137 ging of the tibial marrow compartment, alpha-camera imaging, and Monte Carlo dosimetry modeling revea
141 h-sensitivity D.SPECT cadmium-zinc-telluride camera in a forward-leaning bikerlike position, which ma
142 stic value of MPI performed with a CZT SPECT camera in a large cohort of patients suspected of having
145 obtained with calibrated and non-calibrated cameras, in different lighting conditions and optical ma
146 or quantifying gait pathology with commodity cameras increase access to quantitative motion analysis
149 uses high-precision diet estimates from nest cameras installed on a subset of nests in lieu of a cont
150 te the ability to modify a commercial fundus camera into a low-cost laser speckle contrast imaging (L
152 self-assemble, and readout by eye or digital camera is possible within 5 minutes of analyte addition,
153 compensate for the highly aberrated re-used camera lens and to compensate for misalignments associat
154 n's evaluation was not affected by different cameras, lighting conditions or optical magnifications,
155 ong all three species, individuals looked at cameras longer when they were young, were associating wi
156 colposcope demonstrated that the Callascope camera meets visual requirements for cervical imaging.
157 to integrated audio-video systems containing cameras, microphones, and sensors, capturing and synthes
158 er, the combination of optical data from the camera module and chemical data obtained by mass analysi
161 und significant linear relationships between camera-NDVI and leaf chlorophyll concentration, and betw
162 leaf chlorophyll concentration, and between camera-NDVI and leaf nitrogen content, though weaker rel
167 to-mechanical attachment coupled to the rear camera of the smartphone, which contains two white light
168 n imaging (MPI) performed with a solid-state camera on patients in 2 positions (semiupright, supine)
177 rapeutic activity can interfere with the PET camera performance and degrade both image quality and qu
182 discuss imaging protocols used with the new cameras, potential imaging pitfalls, and the latest data
183 ings show that MPI acquired with a CZT SPECT camera provides excellent prognostic information, with l
184 aticity space, while the conventional fundus camera provides images with a clear shift toward red at
188 (i.e., data acquisition time, including CCD camera recording time and intensifier gate delay; focuse
190 plementary Metal-Oxide-Semiconductor (sCMOS) camera relies on statistical compensation for its pixel-
192 ouse and rat models of cancer with a thermal camera reveals material heterogeneity and delineates dis
194 ite sharks spent less time around the baited camera rig when the artificial sound was presented, but
196 ally-explicit capture-recapture analyses for camera sampling data, and a removal analysis for removal
198 ion methods (i.e. non-invasive DNA sampling, camera sampling, and sampling from trapping) by applying
200 id mobile and fixed-location ultra-widefield camera screening system was able to be deployed to carry
201 e imaging in humans by utilizing non-contact camera sensing of Cherenkov emission during the radiatio
202 servations of lightning have been limited by camera sensitivity, distance from Jupiter and long expos
203 of post-processing binary raw data from any camera source to improve the sensitivity of functional f
208 the other in Iceland (the latter captured on camera), spontaneously using a small wooden stick to scr
209 ch is compatible with the output of the xcms/CAMERA suite, uses the data-rich output of mass spectrom
211 d LiDAR approach which combines a multi-view camera system for position and distance information, and
212 ra system with a frequency-domain (FD) based camera system regarding their measuring characteristics
213 rformance of an older time-domain (TD) based camera system with a frequency-domain (FD) based camera
218 ary leap in SPECT imaging with the advent of camera systems that use solid-state crystals and novel c
219 uced by the use of the 3dMDface or Vectra H1 camera systems, in conjunction with the MeshMonk registr
224 t here, an open-source solution for multiple-camera tandem-lens optical systems for multiparametric m
225 improvements in fluorescent probes and multi-camera technology have increased the power of optical ma
226 Here, we combine high-speed intensified camera technology with a versatile, reconfigurable and d
227 tion capture system (3-D system) and a video camera that was positioned perpendicular to the obstacle
228 Our system makes use of 62 machine vision cameras that encircle an open 2.45 m x 2.45 m x 2.75 m e
232 nch testing for comparison of the Callascope camera to a $20,000 high-end colposcope demonstrated tha
233 equipped to allow an ultra-widefield fundus camera to be mounted inside, and the van was sent with a
234 able sensors, light and sound sensors, and a camera to collect data on patients and their environment
235 tation and emission filters on the flash and camera to image the signal from distinct fluorophore-lab
236 use a new electron source, energy filter and camera to obtain a 1.7 angstrom resolution cryo-EM recon
237 of fixed-location and mobile ultra-widefield cameras to extend the reach of the screening opportunity
239 the Remidio FOP and a Topcon tabletop fundus camera (Topcon Medical Systems, Inc., Oakland, NJ).
240 ribe a new PLIM imager based on the Timepix3 camera (Tpx3cam) and its application for imaging of O(2)
243 orne light detection and ranging (LiDAR) and camera trap data to characterize the response of a tropi
244 trated a stronger looking impulse toward the camera trap device compared to chimpanzees, suggesting h
255 ge methods to classify wildlife species from camera-trap data, we shed light on the methods themselve
257 tifying individually-unique individuals from camera-trap photos may not be as reliable as previously
258 e animals with individually-unique markings, camera-trap surveys are a benchmark standard for estimat
259 captive snow leopards (Panthera uncia) were camera-trapped on 40 occasions and eight observers indep
260 cascading interactions using high-resolution camera trapping and remote sensing data in the best-pres
262 density estimates obtained through worldwide camera trapping studies for an emblematic and ecological
266 onnectivity in red fox social groups, we set camera traps for four consecutive seasons to record cont
268 and recapture probabilities were higher for camera traps placed near water resources compared with t
269 ile benefiting from the distinct capacity of camera traps to generate large datasets from multiple sp
270 advances in the experimental application of camera traps to investigate fundamental mechanisms under
271 Experimental study designs that utilize camera traps uniquely allow for testing hypothesized mec
272 mpling technique that leverages imagery from camera traps with conventional distance sampling, valida
275 ars on marked pumas with detection data from camera-traps substantially improved density estimates by
277 adova, Italy) and a traditional flash fundus camera (TRC-NW8, Topcon Corporation, Tokyo, Japan) were
279 fluorescence is measured using a high-speed camera under spatially multiplexed two-colour laser illu
282 th at half-maximum intensity, HWHM), but the camera was unlikely to have detected the dim outer edges
285 By mounting the new lenses on a commercial camera, we demonstrate their functionality, showing how
288 detected in the epi-direction by a far-field camera where individual binding events show up as diffra
291 the LFs were carried out with a mobile phone camera whose imaging resolution was improved by attachin
292 ustly combining the video recording from the camera with a pulse waveform measured using a convention
293 at panel detector placed in front of a gamma camera with cone beam collimator focused at the x-ray fo
296 ovel approach combining multiple ultraviolet cameras with synchronous aerial measurements, we calcula
299 y that successfully captured an image from a camera, with resolution and clarity that remain impressi
301 d scene are outside the line of sight of the camera, without requiring controlled or time-varying ill