Neighborhood operation

In computer vision and image processing, a neighborhood operation is a commonly used class of computations on image data in which the data is processed according to the following pseudo code:

Visit each point p in the image data and do {
  N = a neighborhood or region of the image data around the point p
  result(p) = f(N)
}
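The pseudo code above can be sketched directly in Python. This is a minimal illustration, not an efficient implementation: the function and variable names are chosen here for clarity, and the borders are zero-padded so that every point has a full neighborhood (border handling is discussed later in the article).

```python
import numpy as np

def neighborhood_op(image, f, size=3):
    """Apply f to the size x size neighborhood N around each point p."""
    r = size // 2
    padded = np.pad(image, r, mode="constant")  # zero padding at the borders
    result = np.empty_like(image, dtype=float)
    for (y, x), _ in np.ndenumerate(image):
        N = padded[y:y + size, x:x + size]      # neighborhood around p = (y, x)
        result[y, x] = f(N)                     # result(p) = f(N)
    return result

img = np.arange(16, dtype=float).reshape(4, 4)
means = neighborhood_op(img, np.mean)  # local 3x3 average at each point
```

Note that the result has the same shape as the input, as described below, even though its values (here, local means) need not be interpretable as intensities.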

This general procedure can be applied to image data of arbitrary dimensionality. Moreover, the image data on which the operation is applied does not have to be defined in terms of intensity or color; it can be any type of information organized as a function of the spatial (and possibly temporal) variables in p.

The result of applying a neighborhood operation to an image can again be interpreted as an image of the same dimensionality as the original data. The value at each image point, however, does not have to be directly related to intensity or color; instead it is an element of the range of the function f, which can be of arbitrary type.

Normally the neighborhood N is of fixed size and is a square (or a cube, depending on the dimensionality of the image data) centered on the point p. The function f is also normally fixed, but may in some cases have parameters that vary with p; see below.

In the simplest case, the neighborhood N may be only a single point. This type of operation is often referred to as a point-wise operation.
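A point-wise operation can be sketched as follows; thresholding is a common instance, where f maps each single-pixel "neighborhood" to a binary value (the names below are illustrative, not from the article):

```python
import numpy as np

def pointwise(image, f):
    """Point-wise operation: the neighborhood N is the pixel itself."""
    return np.vectorize(f)(image)

img = np.array([[10, 200], [90, 130]])
binary = pointwise(img, lambda p: 1 if p > 128 else 0)  # thresholding
```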

Examples

The most common examples of a neighborhood operation use a fixed function f that in addition is linear, so that the computation is a linear shift-invariant operation. In this case, the neighborhood operation corresponds to convolution. A typical example is convolution with a low-pass filter, where the result can be interpreted as a local average of the image data around each image point. Other examples are computations of local derivatives of the image data.
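As a minimal sketch of the linear shift-invariant case, consider convolving a 1-D signal with a 3-tap box filter, so that each output sample is the local average of its neighborhood:

```python
import numpy as np

# Low-pass filtering as a linear shift-invariant neighborhood operation:
# convolution with a box filter whose coefficients sum to one.
signal = np.array([0., 0., 9., 0., 0.])
box = np.ones(3) / 3.0                          # 3-tap box (low-pass) filter
smoothed = np.convolve(signal, box, mode="same")  # local averages
```

The single spike is spread into equal local averages over its 3-sample neighborhood, which is the expected low-pass behavior.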

It is also rather common to use a fixed but non-linear function f. Examples include median filtering and the computation of local variances.
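A 1-D median filter illustrates the non-linear case; unlike a linear average, it removes an impulse outlier entirely rather than smearing it out (a sketch, using border extension for the edges):

```python
import numpy as np

def median_filter(signal, size=3):
    """Non-linear neighborhood operation: f is the median of N."""
    r = size // 2
    padded = np.pad(signal, r, mode="edge")  # border extension
    return np.array([np.median(padded[i:i + size])
                     for i in range(len(signal))])

noisy = np.array([1., 1., 99., 1., 1.])  # single impulse outlier
clean = median_filter(noisy)             # outlier removed, not smeared
```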

There is also a class of neighborhood operations in which the function f has additional parameters which can vary with p:

Visit each point p in the image data and do {
  N = a neighborhood or region of the image data around the point p
  result(p) = f(N,parameters(p))
}

This implies that the result is not shift invariant. An example is the adaptive Wiener filter.
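A simple sketch of an f with spatially varying parameters: each pixel is mixed with its 3x3 local mean, with a per-pixel mixing weight alpha(p). This is not the Wiener filter itself, only an illustration of the parameters(p) pattern; all names are chosen here for clarity.

```python
import numpy as np

def adaptive_smooth(image, alpha):
    """result(p) = f(N, alpha(p)): blend pixel with its 3x3 local mean.

    alpha: per-pixel weights in [0, 1]; 0 keeps the original value,
    1 replaces it with the local mean (border extension at the edges).
    """
    padded = np.pad(image.astype(float), 1, mode="edge")
    out = np.empty_like(image, dtype=float)
    for (y, x), v in np.ndenumerate(image):
        local_mean = padded[y:y + 3, x:x + 3].mean()
        out[y, x] = (1 - alpha[y, x]) * v + alpha[y, x] * local_mean
    return out
```

Because alpha varies with p, shifting the input does not simply shift the output, which is exactly the loss of shift invariance noted above.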

Implementation aspects

The pseudo code given above suggests that a neighborhood operation is implemented in terms of an outer loop over all image points. However, since the results are independent, the image points can be visited in arbitrary order, or can even be processed in parallel. Furthermore, in the case of linear shift-invariant operations, the computation of f at each point is a summation of products between the image data and the filter coefficients. The implementation can then be arranged with the summation loop over filter coefficients outside the loop over all image points.
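The loop reordering can be sketched as follows for a 1-D filter: the outer loop runs over the filter coefficients, and each iteration adds one scaled, shifted copy of the whole signal, so the "loop over all image points" becomes a single array operation (a sketch with zero padding; applicable as written only to symmetric filters, since it computes correlation rather than convolution):

```python
import numpy as np

def filter_sum_outside(signal, coeffs):
    """Linear shift-invariant filtering with the summation loop outermost."""
    n, k = len(signal), len(coeffs)
    padded = np.pad(signal, k // 2)      # zero padding at the borders
    out = np.zeros(n)
    for j, c in enumerate(coeffs):       # outer loop: filter coefficients
        out += c * padded[j:j + n]       # all image points updated at once
    return out

sig = np.array([0., 0., 9., 0., 0.])
res = filter_sum_outside(sig, np.array([1/3, 1/3, 1/3]))  # box filter
```

For a symmetric filter such as this box filter, the result agrees with ordinary convolution.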

An important issue related to neighborhood operations is how to deal with the fact that the neighborhood N becomes more or less undefined for points p close to the edge or border of the image data. Several strategies have been proposed:

  • Compute result only for points p for which the corresponding neighborhood is well-defined. This implies that the output image will be somewhat smaller than the input image.
  • Zero padding: Extend the input image sufficiently by adding extra points outside the original image which are set to zero. The loops over the image points described above visit only the original image points.
  • Border extension: Extend the input image sufficiently by adding extra points outside the original image which are set to the image value at the closest image point. The loops over the image points described above visit only the original image points.
  • Mirror extension: Extend the image sufficiently by mirroring it at the image boundaries. This method is less sensitive than border extension to local variations at the image boundary.
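The three extension strategies map directly onto NumPy's padding modes (a sketch; the mode names are NumPy's, not the article's terms):

```python
import numpy as np

row = np.array([1, 2, 3])
zero   = np.pad(row, 1, mode="constant")  # zero padding:     [0 1 2 3 0]
border = np.pad(row, 1, mode="edge")      # border extension: [1 1 2 3 3]
mirror = np.pad(row, 1, mode="reflect")   # mirror extension: [2 1 2 3 2]
```

The first strategy, computing the result only where N is well-defined, corresponds to using no padding at all and accepting a smaller output.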

