Field of view (image processing)

The field of view (FOV) of an imaging sensor or imaging system is the maximum vertical and horizontal angular extent "viewed" by the sensor or system. Optical tests for measuring the FOV are versatile and can measure the FOV of ultraviolet, visible, and infrared sensors (roughly 0.1 to 20 microns of the electromagnetic spectrum).

Field of view test

The purpose of this test is to determine the horizontal and vertical field of view of a lens used in an imaging system or sensor. (This is one typical method used in the optics industry to measure FOV; many other methods exist.)

Optical apparatus

UV/visible light from an integrating sphere is combined with infrared radiation from a blackbody and focused onto the focal plane of a collimator. A square test target lies in the focal plane of the collimator. Within the collimator, the light from the square test target is reflected off a fold mirror (a flat plane mirror) onto an off-axis parabolic mirror. The light/radiation incident on the parabolic mirror is reflected and collimated back out of the collimator, so all of the light leaving the off-axis parabolic mirror is very nearly parallel. The unit under test processes the collimated light/radiation, and the image is displayed on a monitor. [Mazzetta, J.A.; Scopatz, S.D. (2007). "Automated Testing of Ultraviolet, Visible, and Infrared Sensors Using Shared Optics." Infrared Imaging Systems: Design Analysis, Modeling, and Testing XVIII, Vol. 6543, pp. 654313-1 to 654313-14.]

Monitor display

The target is displayed on a monitor in pixels. The dimensions of the screen display in pixels are known, and the dimensions of the displayed target image are determined by inspection.
* D_y = vertical dimension of the display (pixels)
* D_x = horizontal dimension of the display (pixels)
* d_y = vertical dimension of the image of the target (pixels)
* d_x = horizontal dimension of the image of the target (pixels)

Angular extent

To calculate the angular extent of the target (α), use the following formulas: [Young, H.D.; Freedman, R.A. (2004). University Physics (11th ed.). San Francisco, CA: Addison Wesley]
* α_x = 2 arctan(L_x / (2 f_c))
* α_y = 2 arctan(L_y / (2 f_c))

where
* L_x = horizontal dimension of the target
* L_y = vertical dimension of the target
* f_c = focal length of the collimator

For a derivation of the angular extent formula, see "Derivation of the angle-of-view formula" in the article Angle of view.

Example calculation

* The dimensions of the square target and the focal length of the collimator are known at the outset.
** Given: f_c = 2000.00 mm, L_y = 25.00 mm, and L_x = 25.00 mm
** Find: the horizontal and vertical angular extents α_x and α_y in radians (rad).
** Solution:
*** α_x = 2 arctan(25 mm / (2 × 2000 mm)) ≈ 0.0125 rad
*** α_y = 2 arctan(25 mm / (2 × 2000 mm)) ≈ 0.0125 rad
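
A minimal Python sketch of the angular-extent calculation above, reproducing this worked example (the helper name angular_extent is illustrative, not from the source):

    import math

    def angular_extent(target_size_mm, collimator_focal_length_mm):
        # Angular extent (radians) of a target of size L placed in the
        # collimator's focal plane: alpha = 2 * arctan(L / (2 * f_c)).
        return 2.0 * math.atan(target_size_mm / (2.0 * collimator_focal_length_mm))

    # Values from the worked example: 25.00 mm square target, 2000.00 mm collimator.
    alpha_x = angular_extent(25.00, 2000.00)
    alpha_y = angular_extent(25.00, 2000.00)
    print(f"alpha_x = {alpha_x:.4f} rad, alpha_y = {alpha_y:.4f} rad")  # ~0.0125 rad each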

Calculating the field of view

The horizontal and vertical fields of view (HFOV and VFOV, respectively) are estimated (in radians) by multiplying the angular extent (in radians) by the ratio of the display pixel count to the target-image pixel count. [Electro Optical Industries, Inc. (2005). "EO TestLab Methodology." In Education/Ref. http://www.electro-optical.com/html/toplevel/educationref.asp]

* HFOV = α_x · (D_x / d_x)
* VFOV = α_y · (D_y / d_y)

Example calculation

* The dimensions of the screen display are known at the outset, and the dimensions of the displayed target image are known from inspection.
** Given: D_x = 1280 pixels, d_x = 640 pixels, D_y = 1024 pixels, d_y = 640 pixels, and α_x = α_y = 0.0125 rad
** Find: the horizontal and vertical fields of view (HFOV and VFOV) in radians (rad).
** Solution:
*** HFOV = 0.0125 rad × (1280 px / 640 px) = 0.025 rad
*** VFOV = 0.0125 rad × (1024 px / 640 px) = 0.020 rad
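
A minimal Python sketch of the field-of-view estimate above, using the numbers from this example (the helper name field_of_view is illustrative, not from the source):

    import math

    def field_of_view(alpha_rad, display_pixels, target_pixels):
        # Estimate the field of view (radians) by scaling the target's angular
        # extent by the ratio of display pixels to target-image pixels.
        return alpha_rad * display_pixels / target_pixels

    # Angular extent from the previous example (~0.0125 rad).
    alpha = 2.0 * math.atan(25.00 / (2.0 * 2000.00))
    hfov = field_of_view(alpha, display_pixels=1280, target_pixels=640)
    vfov = field_of_view(alpha, display_pixels=1024, target_pixels=640)
    print(f"HFOV = {hfov:.3f} rad, VFOV = {vfov:.3f} rad")  # ~0.025 rad and ~0.020 rad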

References

External links

# http://spie.org/x399.xml
# http://www.electro-optical.com/html/

