Implicit data collection

Implicit data collection is used in human-computer interaction to gather data about the user in an implicit, non-invasive way.

Overview

In human-computer interaction, user-related data are collected in order to adapt the computer interface to the end user. The collected data are used to build a user model, which in turn helps the application filter information for the end user. Such systems are useful in recommender applications, military applications (e.g., implicit stress detection), and others.
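The pipeline described above (observe implicitly, build a user model, filter for the user) can be sketched as follows. This is a minimal, hypothetical illustration: the event format, the interest-weight model, and the ranking function are assumptions for the example, not a standard API.

```python
from collections import defaultdict

def build_user_model(events):
    """Build a user model from implicitly observed (topic, seconds_viewed)
    pairs: each topic's weight is its share of total viewing time."""
    model = defaultdict(float)
    total = sum(seconds for _, seconds in events) or 1.0
    for topic, seconds in events:
        model[topic] += seconds / total  # normalised interest weight
    return dict(model)

def rank_items(model, items):
    """Filter step: order candidate items by the inferred topic interest."""
    return sorted(items,
                  key=lambda item: model.get(item["topic"], 0.0),
                  reverse=True)

# Implicitly collected viewing durations (no explicit ratings asked for).
events = [("sports", 120), ("politics", 30), ("sports", 90)]
model = build_user_model(events)
ranked = rank_items(model, [{"id": 1, "topic": "politics"},
                            {"id": 2, "topic": "sports"}])
```

Here the user never rated anything explicitly; the 210 seconds spent on sports items alone push sports content to the top of the ranking.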

Channels for collecting data

The system can record the user's explicit interactions and build an MPEG-7 usage history log from them. Furthermore, the system can use other channels to gather information about the user's emotional state. The following implicit channels have been used so far to infer the affective state of the end user:

* facial activity
* posture activity
* hand tension and activity
* gestural activity
* vocal expression
* language and choice of words
* electrodermal activity
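In practice, readings from several of the channels above are combined into one affective estimate. The sketch below shows one plausible approach, a confidence-weighted average over per-channel arousal readings; the channel names, value ranges, and weighting scheme are illustrative assumptions, not a standard fusion method.

```python
def fuse_arousal(readings):
    """Fuse per-channel readings into one arousal estimate.

    readings: dict mapping channel name -> (arousal, confidence),
    both in [0, 1]. Returns the confidence-weighted mean arousal.
    """
    num = sum(arousal * conf for arousal, conf in readings.values())
    den = sum(conf for _, conf in readings.values())
    return num / den if den else 0.0

# Illustrative readings from three of the implicit channels listed above.
readings = {
    "electrodermal": (0.8, 0.9),  # strong, high-confidence cue
    "posture":       (0.4, 0.3),  # weaker, low-confidence cue
    "vocal":         (0.6, 0.6),
}
arousal = fuse_arousal(readings)
```

Weighting by confidence lets a noisy channel (here, posture) contribute without dominating the more reliable electrodermal signal.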

Emotional spaces

The detected emotional value is usually described in either of the two most popular notations:

* a 3D emotional vector: valence, arousal, dominance
* degree of affiliation to the 6 basic emotions (sadness, happiness, anger, fear, disgust, surprise)
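The two notations can be related: a point in the 3D valence-arousal-dominance space can be mapped to the nearest basic emotion. The sketch below does this with a Euclidean nearest-neighbour lookup; the reference VAD coordinates for the six basic emotions are illustrative placements, not empirically calibrated values.

```python
import math

# Illustrative (valence, arousal, dominance) anchors for the six basic
# emotions, each component in [-1, 1]. These are assumed placements.
BASIC_EMOTIONS = {
    "happiness": ( 0.8,  0.5,  0.4),
    "sadness":   (-0.6, -0.4, -0.3),
    "anger":     (-0.5,  0.7,  0.3),
    "fear":      (-0.6,  0.6, -0.4),
    "disgust":   (-0.6,  0.2,  0.1),
    "surprise":  ( 0.2,  0.8,  0.0),
}

def nearest_basic_emotion(vad):
    """Return the basic emotion whose VAD anchor is closest (Euclidean)."""
    return min(BASIC_EMOTIONS,
               key=lambda name: math.dist(vad, BASIC_EMOTIONS[name]))

label = nearest_basic_emotion((0.7, 0.4, 0.3))  # high valence, mid arousal
```

Real systems would instead report a degree of affiliation to each basic emotion (e.g., a softmax over negative distances) rather than a single hard label, but the nearest-neighbour version shows the mapping between the two notations most directly.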

External links

* [http://affect.media.mit.edu/pdfs/05.picard-daily.pdf Evaluating affective interactions: Alternatives to asking what users feel] Rosalind Picard, Shaundra Briant Daily

