21 restore
"A multi-phase process that copies all the data and log pages from a specified backup to a specified database (the data-copy phase) and rolls forward all the transactions that are logged in the backup (the redo phase). At this point, by default, a restore rolls back any incomplete transactions (the undo phase), which completes the recovery of the database and makes it available to users." -
22 staging
"The process of running a management agent that imports data from a connected data source into the connector space, and then immediately stopping the run." -
23 validation
"The process by which Remote Storage compares files on local volumes with their associated data in secondary storage. Validating a volume ensures that the correct data is recalled from remote storage when a user attempts to open a file from the local volume." -
24 DPS Agent
"Software, installed on a server, that tracks changes to protected data and transfers the changes to the DPM server. The protection agent also identifies data on a server that can be protected and is involved in the recovery process." -
25 sync
The process of reconciling the differences between data stored in one location and a copy of the same data stored in another location. -
26 synchronization
The process of reconciling the differences between data stored in one location and a copy of the same data stored in another location. -
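Both entries describe the same reconciliation idea; a minimal Python sketch of a one-way reconciliation (illustrative only, the data model and function name are assumptions, not part of the definition) could look like this:

    # Reconcile a replica against a source copy of the same keyed data.
    def reconcile(source, replica):
        to_add    = {k: v for k, v in source.items() if k not in replica}
        to_update = {k: v for k, v in source.items()
                     if k in replica and replica[k] != v}
        to_delete = [k for k in replica if k not in source]
        return to_add, to_update, to_delete

    adds, updates, deletes = reconcile({"a": 1, "b": 2}, {"b": 3, "c": 4})
    # adds == {"a": 1}, updates == {"b": 2}, deletes == ["c"]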
27 Windows Disk Diagnostic
"Feature of Windows that proactively detects impending disk failures and can alert the support center to replace the failing hard disk before total failure occurs. For administrators, this feature will guide them through the process of backing up their data so the hard disk can be replaced without data loss." -
28 ar-sa أيام عدم النشاط ("days of inactivity")
A data management process that identifies similar or identical data. -
29 duplicate detection
A data management process that identifies similar or identical data. -
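As a rough illustration (Python, not part of the definition; real duplicate detection also matches similar, not just identical, data), exact duplicates can be found by hashing a normalized form of each record:

    import hashlib
    from collections import defaultdict

    def find_exact_duplicates(records):
        groups = defaultdict(list)
        for record in records:
            digest = hashlib.sha256(record.strip().lower().encode()).hexdigest()
            groups[digest].append(record)
        return [group for group in groups.values() if len(group) > 1]

    print(find_exact_duplicates(["Contoso Ltd", "contoso ltd ", "Fabrikam"]))
    # [['Contoso Ltd', 'contoso ltd ']]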
30 type inference
A process in which the compiler determines the data type of a local variable that has been declared without an explicit data type declaration. The type is inferred from the initial value provided for the variable. -
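The definition refers to compilers such as C#'s or Visual Basic's; a rough analogue in Python, as a static checker such as mypy would infer the types (illustrative only):

    count = 42            # inferred as int from the initial value
    label = "restore"     # inferred as str
    ratio = count / 8     # inferred as float (the / operator yields float)
    items = [1, 2, 3]     # inferred as list[int]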
31 access token
"A data structure that contains authentication and authorization information for a user. Windows creates the access token when the user logs on and the user's identity is confirmed. The access token contains the user's security ID (SID), the list of groups that the user is a member of, the list of privileges held by that user. Each process or thread started for the user inherits a copy of the access token. In some cases a user may have more than one access token, with different levels of authority." -
32 encryption
The process of converting readable data (plaintext) into a coded form (ciphertext) to prevent it from being read by an unauthorized party. -
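For example (Python, illustrative only; assumes the third-party cryptography package, which provides the Fernet symmetric scheme):

    from cryptography.fernet import Fernet

    key = Fernet.generate_key()        # symmetric key, kept secret
    cipher = Fernet(key)

    ciphertext = cipher.encrypt(b"plaintext data")   # readable -> coded form
    plaintext = cipher.decrypt(ciphertext)           # coded form -> readable
    assert plaintext == b"plaintext data"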
33 software upgrade
A software package that replaces an existing version of a product with a newer and/or more powerful or sophisticated version of the same product. The upgrade process typically leaves existing customer data and preferences intact while replacing the existing software with the newer version. -
34 audio mixing
The process of combining multiple streams of audio data into a single stream. -
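In the simplest case, mixing is a per-sample combination of the streams; a minimal Python sketch (assuming equal-length mono streams of float samples in [-1.0, 1.0]):

    def mix(stream_a, stream_b):
        """Average two streams sample by sample into a single stream."""
        return [(a + b) / 2 for a, b in zip(stream_a, stream_b)]

    mixed = mix([0.5, -0.2, 0.0], [0.1, 0.4, -0.6])
    # mixed is approximately [0.3, 0.1, -0.3]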
35 compression
A process for removing redundant data from a digital media file or stream to reduce its size or the bandwidth used. -
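A round trip with the standard-library zlib module illustrates both this entry and decompression (entry 37 below); Python, illustrative only:

    import zlib

    data = b"data " * 1000                   # 5000 bytes of highly redundant input
    compressed = zlib.compress(data, 9)      # remove redundancy to shrink the payload
    restored = zlib.decompress(compressed)   # decompression restores the original bytes

    assert restored == data
    print(len(data), "->", len(compressed))  # the compressed form is far smaller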
36 concurrency
A process that allows multiple users to access and change shared data at the same time. The Entity Framework implements an optimistic concurrency model. -
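Optimistic concurrency can be sketched with a per-row version check (Python, illustrative only; the class and exception names are assumptions, not Entity Framework APIs):

    class ConcurrencyError(Exception):
        pass

    class Row:
        def __init__(self, value):
            self.value = value
            self.version = 0            # bumped on every successful update

    def update(row, new_value, expected_version):
        if row.version != expected_version:
            raise ConcurrencyError("row was changed by another user")
        row.value = new_value
        row.version += 1

    row = Row("draft")
    seen_by_a = row.version                     # user A reads version 0
    update(row, "edited by B", row.version)     # user B commits first; version -> 1
    try:
        update(row, "edited by A", seen_by_a)   # user A's stale update is rejected
    except ConcurrencyError:
        pass                                    # user A must re-read and retry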
37 decompression
"The process of reversing the procedure that is run by compression software. During decompression, the compressed data returns to its original file size and format so it can be accessed or played." -
38 implicit targeting
The process of delivering targeted content by using existing user data to extrapolate unknown information about users who browse your site. -
39 Knowledge Consistency Checker
"A built-in process that runs on all domain controllers and generates the replication topology for the Active Directory forest. At specified intervals, the KCC reviews and makes modifications to the replication topology to ensure propagation of data either directly or transitively."English-Arabic terms dictionary > Knowledge Consistency Checker
-
40 primary master
"An authoritative DNS server for a zone that can be used as a point of update for the zone. Only primary masters have the ability to be updated directly to process zone updates, which include adding, removing, or modifying resource records that are stored as zone data. Primary masters are also used as the first sources for replicating the zone to other DNS servers."