Data verification

Data verification is a process in which data is checked for accuracy and consistency after data migration has been completed.[1]

It helps to determine whether the data was accurately translated when transported from one system to another, whether it is complete, and whether it supports the processes in the new system. During verification, a parallel run of both systems may be needed to identify areas of disparity and forestall data loss.
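For example, a basic completeness and accuracy check after migration might compare record counts and per-record fingerprints between the source and target systems. The following sketch (in Python) is purely illustrative: the record layout, primary-key structure, and function names are assumptions for the example, not part of any standard verification tool.

    # Minimal post-migration verification sketch, assuming both systems can
    # export their records as dictionaries keyed by a primary key.
    import hashlib
    import json


    def fingerprint(record: dict) -> str:
        """Return a stable hash of a record so rows can be compared field by field."""
        canonical = json.dumps(record, sort_keys=True, default=str)
        return hashlib.sha256(canonical.encode("utf-8")).hexdigest()


    def verify_migration(source_rows: dict, target_rows: dict) -> dict:
        """Compare source and target record sets and report disparities."""
        missing_in_target = sorted(source_rows.keys() - target_rows.keys())
        unexpected_in_target = sorted(target_rows.keys() - source_rows.keys())
        mismatched = sorted(
            key for key in source_rows.keys() & target_rows.keys()
            if fingerprint(source_rows[key]) != fingerprint(target_rows[key])
        )
        return {
            "source_count": len(source_rows),
            "target_count": len(target_rows),
            "missing_in_target": missing_in_target,
            "unexpected_in_target": unexpected_in_target,
            "mismatched": mismatched,
        }


    if __name__ == "__main__":
        # Illustrative sample data: one record was dropped and one was altered in transit.
        source = {
            1: {"name": "Alice", "balance": "100.00"},
            2: {"name": "Bob", "balance": "250.50"},
            3: {"name": "Carol", "balance": "75.25"},
        }
        target = {
            1: {"name": "Alice", "balance": "100.00"},
            2: {"name": "Bob", "balance": "250.05"},  # value changed during migration
        }
        print(verify_migration(source, target))

In a parallel-run scenario, a check of this kind could be repeated periodically against both systems, with the reported disparities reviewed before the old system is retired.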

References



