Data farming
Data Farming is the process of using a high-performance computer or computing grid to run a simulation thousands or millions of times across a large parameter and value space. The result of Data Farming is a “landscape” of output that can be analyzed for trends, anomalies, and insights across multiple parameter dimensions.
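The basic loop is simple even though the run counts are large: enumerate design points in the parameter space, replicate a stochastic simulation at each point, and harvest summary outputs into a landscape for analysis. The following is a minimal sketch in Python; simulate() is a hypothetical stand-in for a real model, and an actual study would distribute these runs across a grid rather than a single loop.

    import itertools
    import random
    import statistics

    def simulate(speed, sensor_range, seed):
        # Hypothetical toy model; a real study would invoke an
        # agent-based distillation model here.
        random.seed(seed)
        return speed * sensor_range * random.random()

    # The parameter/value space to "plant": every combination of
    # factor levels is one design point.
    speeds = [1.0, 2.0, 3.0]
    sensor_ranges = [5.0, 10.0, 15.0]
    replications = 100  # stochastic models are replicated per point

    landscape = {}
    for speed, rng in itertools.product(speeds, sensor_ranges):
        outputs = [simulate(speed, rng, seed) for seed in range(replications)]
        # "Harvest" summary statistics for later trend and anomaly analysis.
        landscape[(speed, rng)] = (statistics.mean(outputs),
                                   statistics.stdev(outputs))

    for point, (mean, stdev) in sorted(landscape.items()):
        print(point, round(mean, 2), round(stdev, 2))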
Origins of the term
The term Data Farming comes from the idea of planting data in a simulation's parameter and value space, and then harvesting the data that results from the simulation runs.
Usage
Data Farming was originally used in the Marine Corps' Project Albert. Small agent-based distillation models (simulations) were created to capture a specific military challenge. These models were run thousands or millions of times at the Maui High Performance Computing Center and other facilities. Project Albert analysts worked with military subject-matter experts to refine the models and interpret the results. The Naval Postgraduate School also worked closely with Project Albert in model generation, output analysis, and the creation of new experimental designs to better leverage the computing capabilities at Maui and other facilities.
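Covering a large parameter space with a manageable number of runs is the role of the experimental designs mentioned above. As an illustration only: a basic Latin hypercube design spreads points so that each factor's range is sampled evenly. The designs actually developed at the Naval Postgraduate School (such as nearly orthogonal Latin hypercubes) are more sophisticated than this sketch.

    import random

    def latin_hypercube(n_points, bounds, seed=0):
        # Split each factor's range into n_points equal bins, draw one
        # sample per bin, and permute bin order independently per factor.
        rnd = random.Random(seed)
        columns = []
        for low, high in bounds:
            step = (high - low) / n_points
            samples = [low + (i + rnd.random()) * step for i in range(n_points)]
            rnd.shuffle(samples)
            columns.append(samples)
        return list(zip(*columns))  # one tuple per design point

    # e.g., 8 design points over two hypothetical factors:
    # speed in [1, 3] and sensor range in [5, 15]
    for point in latin_hypercube(8, [(1.0, 3.0), (5.0, 15.0)]):
        print(tuple(round(x, 2) for x in point))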
Workshops
International Data Farming Workshops are held twice each year, in the spring and fall. Workshop information, including proceedings from prior workshops and registration information for future ones, can be found at the Naval Postgraduate School's SEED Center for Data Farming.
External links
- An article summarizing data farming in the June 2005 issue of SIGNAL.
- MITRE Corporation research paper on data farming