21 paging
"The process of moving virtual memory back and forth between physical memory and the disk. Paging occurs when physical memory limitations are reached and only occurs for data that is not already ""backed"" by disk space. For example, file data is not paged out because it already has allocated disk space within a file system." -
22 restore
"A multi-phase process that copies all the data and log pages from a specified backup to a specified database (the data-copy phase) and rolls forward all the transactions that are logged in the backup (the redo phase). At this point, by default, a restore rolls back any incomplete transactions (the undo phase), which completes the recovery of the database and makes it available to users." -
23 staging
"The process of running a management agent that imports data from a connected data source into the connector space, and then immediately stopping the run." -
24 validation
The process by which Remote Storage compares files on local volumes with their associated data in secondary storage. Validating a volume ensures that the correct data is recalled from remote storage when a user attempts to open a file from that volume.
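Validation of this kind can be sketched as a checksum comparison between local files and records kept in secondary storage. The function names and the choice of SHA-256 below are illustrative assumptions, not how Remote Storage actually works:

```python
import hashlib

def file_digest(path):
    """Return the SHA-256 hex digest of a file's contents."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest()

def validate(local_paths, stored_digests):
    """Compare local files against digests recorded in secondary storage.

    Returns the paths whose current contents no longer match the stored copy.
    """
    return [p for p in local_paths if file_digest(p) != stored_digests.get(p)]
```

A mismatch flags a file whose local copy has diverged from its secondary-storage data.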
25 DPS Agent
"Software, installed on a server, that tracks changes to protected data and transfers the changes to the DPM server. The protection agent also identifies data on a server that can be protected and is involved in the recovery process." -
26 sync
The process of reconciling the differences between data stored in one location and a copy of the same data stored in another location.
27 synchronization
The process of reconciling the differences between data stored in one location and a copy of the same data stored in another location.
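The sync/synchronization entries above can be sketched as a diff-and-apply over two copies of a key-value data set. This is a simplified illustration, not any particular product's algorithm:

```python
def reconcile(primary, replica):
    """Compute the differences between two copies of the same data set.

    Returns (to_add, to_update, to_delete): the changes needed to make
    `replica` match `primary`.
    """
    to_add = {k: v for k, v in primary.items() if k not in replica}
    to_update = {k: v for k, v in primary.items()
                 if k in replica and replica[k] != v}
    to_delete = [k for k in replica if k not in primary]
    return to_add, to_update, to_delete

def synchronize(primary, replica):
    """Apply the reconciled differences so the copies match."""
    to_add, to_update, to_delete = reconcile(primary, replica)
    replica.update(to_add)
    replica.update(to_update)
    for k in to_delete:
        del replica[k]
```

Real synchronization engines additionally track change direction and resolve conflicting edits; this one-way sketch only reconciles toward the primary copy.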
28 Windows Disk Diagnostic
"Feature of Windows that proactively detects impending disk failures and can alert the support center to replace the failing hard disk before total failure occurs. For administrators, this feature will guide them through the process of backing up their data so the hard disk can be replaced without data loss." -
30 duplicate detection
A data management process that identifies similar or identical data.
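A minimal sketch of exact-match duplicate detection, under the assumption that records are plain strings and that a normalization step (lowercasing, collapsing whitespace) defines what counts as "identical":

```python
from collections import defaultdict

def normalize(record):
    """Canonical form used for comparison: lowercase, collapse whitespace."""
    return " ".join(record.lower().split())

def find_duplicates(records):
    """Group records whose normalized forms are identical."""
    groups = defaultdict(list)
    for r in records:
        groups[normalize(r)].append(r)
    return [g for g in groups.values() if len(g) > 1]
```

Detecting merely *similar* (rather than identical) data requires fuzzy matching, for example edit-distance thresholds, which this exact-match sketch does not attempt.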
31 type inference
A process in which the compiler determines the data type of a local variable that has been declared without an explicit data type declaration. The type is inferred from the initial value provided for the variable.
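The definition describes statically typed compilers (for example, C#'s `var` or Visual Basic's `Dim` without `As`), where the declared type is fixed from the initializer. Python binds types at run time rather than at compile time, but the same "type follows the initial value" idea can be observed directly, and static checkers such as mypy infer types for Python the same way:

```python
# In C#, `var n = 42;` makes n an int for the rest of its scope.
# In Python the value's type is simply carried by the object:
count = 42          # an integer literal yields int
ratio = 42 / 5      # true division always yields float
name = "answer"     # a string literal yields str

assert type(count) is int
assert type(ratio) is float
assert type(name) is str
```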
32 access token
"A data structure that contains authentication and authorization information for a user. Windows creates the access token when the user logs on and the user's identity is confirmed. The access token contains the user's security ID (SID), the list of groups that the user is a member of, the list of privileges held by that user. Each process or thread started for the user inherits a copy of the access token. In some cases a user may have more than one access token, with different levels of authority." -
33 encryption
The process of converting readable data (plaintext) into a coded form (ciphertext) to prevent it from being read by an unauthorized party.
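The plaintext-to-ciphertext transformation can be demonstrated with a toy XOR stream cipher. This is strictly illustrative: XOR with a short repeating key is trivially breakable and must never be used as real encryption, which relies on vetted algorithms such as AES:

```python
def xor_cipher(data: bytes, key: bytes) -> bytes:
    """Toy XOR 'cipher' for illustration only -- NOT secure encryption.

    XOR with a repeating key turns plaintext into ciphertext; applying
    the same operation with the same key restores the plaintext.
    """
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

plaintext = b"attack at dawn"
ciphertext = xor_cipher(plaintext, b"secret")
assert ciphertext != plaintext                         # unreadable without the key
assert xor_cipher(ciphertext, b"secret") == plaintext  # same operation decrypts
```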
34 software upgrade
A software package that replaces an existing version of a product with a newer or more capable version of the same product. The upgrade process typically leaves existing customer data and preferences intact while replacing the existing software with the newer version.
35 audio mixing
The process of combining multiple streams of audio data into a single stream.
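At the sample level, mixing is summing the corresponding samples of each stream and clipping the result to the sample range. This sketch assumes equal-length streams of signed 16-bit PCM samples represented as plain integer lists:

```python
def mix(streams, sample_min=-32768, sample_max=32767):
    """Mix equal-length 16-bit PCM streams by summing and clipping samples."""
    return [max(sample_min, min(sample_max, sum(samples)))
            for samples in zip(*streams)]
```

Production mixers typically also apply per-stream gain before summing to avoid the clipping this naive sum can produce.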
36 compression
A process for removing redundant data from a digital media file or stream to reduce its size or the bandwidth used.
37 concurrency
The ability of multiple users to access and change shared data at the same time. The Entity Framework implements an optimistic concurrency model.
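The optimistic model mentioned above can be sketched with a version stamp: writers do not lock the row, but an update succeeds only if the row is unchanged since it was read. The class and function names here are hypothetical, not the Entity Framework API:

```python
class ConcurrencyError(Exception):
    """Raised when a row changed between read and write."""

class Row:
    """A record carrying a version stamp for optimistic concurrency."""
    def __init__(self, value):
        self.value = value
        self.version = 0

def optimistic_update(row, expected_version, new_value):
    """Apply the change only if no one else modified the row in between."""
    if row.version != expected_version:
        raise ConcurrencyError("row was changed by another user")
    row.value = new_value
    row.version += 1
```

On a `ConcurrencyError`, the application typically re-reads the row and retries or asks the user to resolve the conflict.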
38 decompression
"The process of reversing the procedure that is run by compression software. During decompression, the compressed data returns to its original file size and format so it can be accessed or played." -
39 implicit targeting
The process of delivering targeted content by using existing user data to extrapolate unknown information about users who browse your site.
40 Knowledge Consistency Checker
"A built-in process that runs on all domain controllers and generates the replication topology for the Active Directory forest. At specified intervals, the KCC reviews and makes modifications to the replication topology to ensure propagation of data either directly or transitively."English-Arabic terms dictionary > Knowledge Consistency Checker