Bag of words model

The bag-of-words model is a simplifying representation used in natural language processing and information retrieval. In this model, a text (such as a sentence or a document) is represented as an unordered collection of its words, disregarding grammar and even word order but keeping word multiplicity.
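
As a minimal sketch (the tokenizer, the example sentence, and the use of Python's Counter are illustrative choices, not part of the original article), a bag of words can be built simply by counting tokens:

    import re
    from collections import Counter

    def bag_of_words(text):
        # Minimal tokenizer: lowercase and keep runs of letters; real systems
        # use more careful tokenization.
        tokens = re.findall(r"[a-z]+", text.lower())
        # The "bag" is a multiset: word counts with no order and no grammar.
        return Counter(tokens)

    print(bag_of_words("John likes to watch movies. Mary likes movies too."))
    # Counter({'likes': 2, 'movies': 2, 'john': 1, 'to': 1, 'watch': 1, 'mary': 1, 'too': 1})

Two texts with the same words in a different order produce exactly the same bag, which is the point of the simplification.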

The bag-of-words model is used in some methods of document classification. When a Naive Bayes classifier is applied to text, for example, the conditional independence assumption leads to the bag-of-words model (Lewis 1998). Other methods of document classification that use this model are latent Dirichlet allocation and latent semantic analysis (Blei, Ng & Jordan 2003).
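
Concretely, for a document made up of the words w_1, ..., w_n, the conditional independence assumption lets the class-conditional probability factorize as

    P(w_1, \ldots, w_n \mid c) = \prod_{i=1}^{n} P(w_i \mid c),

so the score of a class c depends only on which words occur and how often, not on their order; the classifier then chooses \arg\max_c P(c) \prod_{i=1}^{n} P(w_i \mid c).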

Example: Spam filtering

In Bayesian spam filtering, an e-mail message is modeled as an unordered collection of words selected from one of two probability distributions: one representing spam and one representing legitimate e-mail ("ham"). Imagine that there are two literal bags full of words. One bag is filled with words found in spam messages, and the other bag is filled with words found in legitimate e-mail. While any given word is likely to be found somewhere in both bags, the "spam" bag will contain spam-related words such as "stock", "Viagra", and "buy" much more frequently, while the "ham" bag will contain more words related to the user's friends or workplace.

To classify an e-mail message, the Bayesian spam filter assumes that the message is a pile of words that has been poured out randomly from one of the two bags, and uses Bayesian probability to determine which bag it more likely came from.
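
The sketch below (not part of the original article; the word counts, vocabulary, and the 50/50 prior are invented for illustration) shows this comparison as a naive Bayes decision over bag-of-words counts, with add-one smoothing for words missing from a bag:

    import math
    from collections import Counter

    # Hypothetical word counts tallied from training messages (illustrative numbers).
    spam_counts = Counter({"buy": 30, "stock": 25, "viagra": 20, "meeting": 2, "friend": 3})
    ham_counts = Counter({"meeting": 28, "friend": 22, "project": 18, "buy": 4, "stock": 3})
    VOCAB = set(spam_counts) | set(ham_counts)

    def log_prob(words, counts, alpha=1.0):
        # Log-probability of the bag of words under one bag's word distribution,
        # with add-alpha (Laplace) smoothing so unseen words do not zero the score.
        total = sum(counts.values())
        size = len(VOCAB | set(words))
        return sum(math.log((counts[w] + alpha) / (total + alpha * size)) for w in words)

    def classify(message, prior_spam=0.5):
        # Compare posterior scores; word order in the message is ignored entirely.
        words = message.lower().split()
        spam_score = math.log(prior_spam) + log_prob(words, spam_counts)
        ham_score = math.log(1 - prior_spam) + log_prob(words, ham_counts)
        return "spam" if spam_score > ham_score else "ham"

    print(classify("buy stock now"))           # -> spam
    print(classify("friend meeting project"))  # -> ham

Because only word counts enter the score, "buy stock now" and "now buy stock" receive identical classifications.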

See also

* Natural language processing
* Document classification
* Machine learning
* Document-term matrix
* Bag of words model in computer vision

References

* Lewis, David (1998). "Naive (Bayes) at Forty: The Independence Assumption in Information Retrieval". Proceedings of ECML-98, 10th European Conference on Machine Learning, Chemnitz, DE. Springer Verlag, Heidelberg, DE, pp. 4–15. http://citeseer.ist.psu.edu/lewis98naive.html
* Blei, David M.; Ng, Andrew Y.; Jordan, Michael I. (2003). "Latent Dirichlet Allocation". Journal of Machine Learning Research 3: 993–1022. MIT Press, Cambridge, MA. doi:10.1162/jmlr.2003.3.4-5.993