Data efficiency

Data Efficiency refers to the efficiency of the many processes that can be applied to data, such as storage, access, filtering, and sharing, and to whether or not those processes lead to the desired outcome within resource constraints.

A management definition of Data Efficiency is the measure of how data storage and usage, whether across an enterprise, within a department, or within a project, impact the organization’s costs and revenues.

On the broadest level:

DE = (expected benefit of applying I.T. to a given task) / (cost of applying I.T. to that task)
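
As a minimal worked illustration of this ratio, the short Python sketch below computes DE for a hypothetical reporting task; the benefit and cost figures are assumptions invented for the example, not data from this article.

```python
# Minimal sketch of the DE ratio above; all figures are hypothetical.

expected_benefit = 120_000.0   # assumed annual benefit of applying I.T. to the task (e.g. labour saved)
cost_of_it = 80_000.0          # assumed annual cost of the I.T. involved (licences, hardware, support)

data_efficiency = expected_benefit / cost_of_it
print(f"DE = {data_efficiency:.2f}")   # 1.50; a ratio above 1 suggests benefits exceed costs
```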

On the technical side, in the development of computer hardware, software, and systems, Data Efficiency can refer to many things, such as how densely bits are packed onto a physical medium1, chip area usage on a silicon wafer2, or the use of data in programs so as to require less time and fewer computational resources3.
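
As an illustrative sketch of the programming sense of the term, the Python fragment below (an invented example, not code from any cited source) packs eight boolean flags into a single byte instead of storing them as eight separate values; the flag values are hypothetical.

```python
# Sketch: pack eight boolean flags into one byte, then recover them.
# The flag values are hypothetical and chosen only for illustration.

flags = [True, False, True, True, False, False, True, False]

# Pack: each flag occupies one bit of a single integer.
packed = 0
for i, flag in enumerate(flags):
    if flag:
        packed |= 1 << i

# Unpack: recover the original booleans from the packed byte.
unpacked = [bool((packed >> i) & 1) for i in range(len(flags))]

assert unpacked == flags
print(f"packed byte: {packed:#010b}")   # 0b01001101
```

The packed form stores eight values in one byte at the cost of a little extra packing and unpacking code, which is the kind of time/space trade-off the technical usage refers to.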

Examples of these two categories of use for “Data Efficiency” (managerial and technical) can be found in process industries and computer chip research and development:

1. Traditional water/wastewater management procedures include travel to pump stations, reading and hand-recording of meter numbers, transposition of log sheets, and other manual operations. This whole process can be said to have low data efficiency4.

2. In the design of today’s Dynamic Random Access Memory (DRAM) computer chips, R&D optimizes parameters such as row and column access times, chip area usage, burst length, and row granularity. Input/output times are measured in nanoseconds. The latest versions of these chips are said to have high data efficiency2.

Both of these examples show the application of different information technologies that process data to reach a defined outcome; sometimes the processes stay within time, space, and resource constraints, and sometimes they do not.
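
To make the DRAM example more concrete, the back-of-the-envelope Python sketch below estimates peak throughput from the kinds of parameters listed in example 2; the module configuration (DDR4-3200, 64-bit data bus, burst length 8) is an assumption chosen for illustration, not a figure from the cited work.

```python
# Sketch: peak bandwidth of a hypothetical DDR4-3200 module with a 64-bit data bus.
# Real modules differ in configuration and in protocol overheads.

transfers_per_second = 3_200_000_000   # 3200 MT/s (a DDR bus transfers data on both clock edges)
bus_width_bytes = 64 // 8              # 64-bit data bus moves 8 bytes per transfer
burst_length = 8                       # transfers per burst (BL8, common for DDR4)

peak_bandwidth = transfers_per_second * bus_width_bytes   # bytes per second
bytes_per_burst = burst_length * bus_width_bytes          # data delivered by one burst

print(f"peak bandwidth: {peak_bandwidth / 1e9:.1f} GB/s")   # 25.6 GB/s
print(f"one burst moves {bytes_per_burst} bytes")           # 64 bytes, one typical cache line
```

Under these assumptions a single burst delivers 64 bytes, one typical cache line, at a peak rate of roughly 25.6 GB/s; the row and column access times mentioned above then determine how much of that peak a workload can actually reach.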

References


