Sensor fusion

Sensor fusion is the combining of sensory data, or data derived from sensory data, from disparate sources such that the resulting information is in some sense "better" than would be possible when these sources were used individually. "Better" in this context can mean more accurate, more complete, or more dependable, or it can refer to the result of an emerging view, such as stereoscopic vision (the calculation of depth information by combining two-dimensional images from two cameras at slightly different viewpoints).
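
For the stereoscopic example, the standard pinhole-camera relation recovers depth from disparity for a rectified camera pair; the symbols used here are the usual textbook names and are not defined in this article:

 Z = \frac{f \cdot b}{d}

where Z is the depth of the scene point, f the focal length, b the baseline between the two cameras, and d the disparity between the point's image positions in the two views.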

The data sources for a fusion process are not required to originate from identical sensors. One can distinguish "direct fusion", "indirect fusion", and fusion of the outputs of the former two. Direct fusion is the fusion of sensor data from a set of heterogeneous or homogeneous sensors, soft sensors, and history values of sensor data, while indirect fusion uses information sources like a priori knowledge about the environment and human input.
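
As a minimal sketch of direct fusion of two homogeneous sensors (an illustrative example only, not a method prescribed by this article; the function name and numeric values are assumptions), two noisy measurements of the same quantity can be combined by inverse-variance weighting, and the fused estimate has lower variance than either measurement alone:

def fuse_two(z1, var1, z2, var2):
    # Inverse-variance weighted fusion of two scalar measurements.
    # z1, z2: readings of the same quantity; var1, var2: their noise variances.
    w1, w2 = 1.0 / var1, 1.0 / var2
    fused = (w1 * z1 + w2 * z2) / (w1 + w2)
    fused_var = 1.0 / (w1 + w2)          # always smaller than var1 and var2
    return fused, fused_var

# A 10.2 m reading with variance 0.04 and a 9.9 m reading with variance 0.01
# fuse to 9.96 m with variance 0.008.
print(fuse_two(10.2, 0.04, 9.9, 0.01))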

Sensor fusion is also known as "(multi-sensor) data fusion" and is a subset of "information fusion".

Transducer Markup Language (TML) is an XML-based markup language that enables sensor fusion.

Examples of sensors

* Radar
* Sonar and other acoustic
* Infra-red / thermal imaging camera
* TV cameras
* Sonobuoys
* Seismic sensors
* Magnetic sensors
* Electronic Support Measures (ESM)
* Phased array

Sensor fusion algorithms

Sensor fusion is a term that covers a number of methods and algorithms, including the following (a minimal Kalman-filter sketch appears after the list):

* Kalman filter
* Bayesian networks
* Dempster-Shafer
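
As a rough, illustrative sketch of the first entry (the matrices, noise covariances, and measurement values here are assumptions chosen for the example, not taken from this article), the following one-dimensional Kalman filter fuses a stream of noisy position measurements with a constant-velocity motion model:

import numpy as np

# State is [position, velocity]; constant-velocity motion model with step dt.
dt = 1.0
F = np.array([[1.0, dt], [0.0, 1.0]])    # state-transition model
H = np.array([[1.0, 0.0]])               # measurement model: only position is observed
Q = 0.01 * np.eye(2)                     # process-noise covariance
R = np.array([[0.25]])                   # measurement-noise covariance

def kalman_step(x, P, z):
    # Predict the next state from the motion model.
    x_pred = F @ x
    P_pred = F @ P @ F.T + Q
    # Update: weight prediction and measurement by their uncertainties.
    y = z - H @ x_pred                   # innovation
    S = H @ P_pred @ H.T + R             # innovation covariance
    K = P_pred @ H.T @ np.linalg.inv(S)  # Kalman gain
    x_new = x_pred + K @ y
    P_new = (np.eye(2) - K @ H) @ P_pred
    return x_new, P_new

x = np.array([[0.0], [0.0]])             # initial state estimate
P = np.eye(2)                            # initial estimate covariance
for z in [0.9, 2.1, 2.9, 4.2, 5.0]:      # simulated noisy position readings
    x, P = kalman_step(x, P, np.array([[z]]))
print(x.ravel())                         # fused position and velocity estimates

The same predict-update cycle extends to multiple sensors by applying one update per incoming measurement, each with its own measurement model and noise covariance.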

Levels

There are several categories or levels of sensor fusion that are commonly used; a small illustrative encoding of them follows the list.

* Level 0 - Data Alignment
* Level 1 - Entity Assessment (e.g. signal/feature/object)
** Tracking and object detection/recognition/identification
* Level 2 - Situation Assessment
* Level 3 - Impact Assessment
* Level 4 - Process Refinement (i.e. sensor management)
* Level 5 - User Refinement
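
Purely as an illustration of how these levels might be represented in software (the enum and its member names are assumptions made for this sketch, not part of the JDL model's definition):

from enum import IntEnum

class JDLLevel(IntEnum):
    # The JDL data-fusion levels listed above.
    DATA_ALIGNMENT = 0
    ENTITY_ASSESSMENT = 1      # signal/feature/object tracking and identification
    SITUATION_ASSESSMENT = 2
    IMPACT_ASSESSMENT = 3
    PROCESS_REFINEMENT = 4     # sensor management
    USER_REFINEMENT = 5

# Example: label a processing stage by its level.
print(f"Level {JDLLevel.SITUATION_ASSESSMENT.value}: situation assessment")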

[http://www.infofusion.buffalo.edu/tm/Dr.Llinas'stuff/Rethinking%20JDL%20Data%20Fusion%20Levels_BowmanSteinberg.pdf Rethinking JDL Data Fusion Levels]

See also

* Information integration
* Data mining
* Data fusion
* Data (computing)
* Multimodal integration
* Fisher's method for combining independent tests of significance

External links

* http://www-prima.inrialpes.fr/Prima/Homepages/jlc/papers/SigProc-Fusion.pdf

