61 EdgeSync synchronization
The task or process that the Microsoft Exchange EdgeSync service performs to propagate data from the Active Directory directory service to the subscribed Edge Transport server.
62 clean installation
The process of installing Windows onto a bare-metal system or overwriting an existing operating system installation. Clean installations do not migrate data from previous installations.
63 test agent
A background process that receives, runs, and reports on tests and collects data on a single computer. The test agent communicates with a test agent controller, usually located on another computer.
64 postback
The process in which a Web page sends data back to the same page on the server.
65 database replication
The process of creating two or more special copies (replicas) of an Access database. Replicas can be synchronized: changes made to the data in one replica, or design changes made in the Design Master, are sent to the other replicas.
66 batching
The process of sending changes in small groups instead of in a one-shot transfer of the data in its entirety.
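As an illustration of the definition above, a minimal Python sketch (not part of the glossary source) that splits a set of changes into small groups for transfer:

```python
# A minimal sketch of batching: ten changes travel as four small
# transfers instead of one large one.
def batched(items, batch_size):
    """Yield successive groups of at most batch_size items."""
    for i in range(0, len(items), batch_size):
        yield items[i:i + batch_size]

changes = list(range(10))
batches = list(batched(changes, 3))
# batches == [[0, 1, 2], [3, 4, 5], [6, 7, 8], [9]]
```

Each batch would then be sent and acknowledged independently, which keeps any single transfer small and allows a failed batch to be retried on its own.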
67 source file
A file that contains the data that a program will process and store in a destination file.
68 Font Cache Service
A Win32 service process that optimizes performance of applications by caching commonly used font data. Applications will start this service if it is not already running. It can be disabled, though doing so will degrade application performance.
69 Windows Font Cache Service
A Win32 service process that optimizes performance of applications by caching commonly used font data. Applications will start this service if it is not already running. It can be disabled, though doing so will degrade application performance.
70 peer discovery
The process of locating peers that have the data one would like to retrieve.
71 table lookup
The process of using a known value to search for data in a previously constructed table of values: for example, using a purchase price to search a tax table for the appropriate sales tax.
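The sales-tax example in the definition can be sketched in Python; the bracket bounds and tax amounts below are hypothetical values invented for illustration:

```python
import bisect

# Hypothetical tax table: upper purchase-price bounds (sorted) and the
# sales tax due for each bracket; the extra entry covers "over 100.00".
price_bounds = [10.00, 50.00, 100.00]
tax_due = [0.50, 2.50, 5.00, 8.00]

def lookup_tax(price):
    """Use the known price to find the matching tax in the pre-built table."""
    return tax_due[bisect.bisect_left(price_bounds, price)]
```

A binary search (`bisect_left`) is the usual choice when the table is sorted; a plain dictionary lookup serves the same role when the known value is an exact key.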
72 type checking
The process performed by a compiler or interpreter to make sure that when a variable is used, it is treated as having the same data type as it was declared to have.
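A toy sketch of the check described above, using a hypothetical symbol table in place of a real compiler's; the variable names and types are invented for illustration:

```python
# Toy version of compiler type checking: each variable is declared with a
# type, and every later assignment is verified against that declaration.
declared = {"count": int, "name": str}  # hypothetical symbol table

def check_assignment(var, value):
    """Raise TypeError if value does not match var's declared type."""
    if not isinstance(value, declared[var]):
        raise TypeError(f"{var} was declared as {declared[var].__name__}")
    return True
```

A real compiler performs this check statically, before the program runs; the sketch performs it at run time, which is closer to how an interpreter enforces types.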
73 validity check
The process of analyzing data to determine whether it conforms to predetermined completeness and consistency parameters.
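A minimal Python sketch of such a check, with hypothetical completeness and consistency rules (required fields, and a non-negative integer quantity):

```python
# A validity check against predetermined parameters:
# completeness = all required fields are present,
# consistency  = the quantity is a non-negative integer.
REQUIRED = {"id", "name", "qty"}

def is_valid(record):
    complete = REQUIRED <= record.keys()
    consistent = complete and isinstance(record["qty"], int) and record["qty"] >= 0
    return complete and consistent
```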
74 Microsoft Assessment and Planning Toolkit
A powerful inventory, assessment, and reporting tool that can run securely in small or large IT environments without requiring the installation of agent software on any computers or devices. The data and analysis provided by this toolkit can significantly simplify the planning process for migrating to Windows.
75 MAP
A powerful inventory, assessment, and reporting tool that can run securely in small or large IT environments without requiring the installation of agent software on any computers or devices. The data and analysis provided by this toolkit can significantly simplify the planning process for migrating to Windows.
76 nagling
An optimization process for HTTP over TCP that increases efficiency by trying to minimize the number of packets that are required before data is sent.
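Nagling is enabled by default on TCP sockets; latency-sensitive code commonly opts out by setting the standard `TCP_NODELAY` socket option, as this Python sketch shows:

```python
import socket

# Nagling coalesces small writes into fewer packets. Setting TCP_NODELAY
# disables it so small writes are sent immediately.
sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
sock.setsockopt(socket.IPPROTO_TCP, socket.TCP_NODELAY, 1)  # disable nagling
nodelay = sock.getsockopt(socket.IPPROTO_TCP, socket.TCP_NODELAY)
sock.close()
```

The trade-off is fewer packets (nagling on) versus lower per-message latency (nagling off).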
77 throttle
A process that restricts the flow of data.
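One common way to restrict flow is a token bucket; this Python sketch (an illustration, not a definition from the glossary) admits a burst up to the bucket's capacity and then limits throughput to the refill rate:

```python
import time

# A minimal token-bucket throttle: each unit of data costs one token,
# and tokens refill at a fixed rate, restricting the overall flow.
class Throttle:
    def __init__(self, rate, capacity):
        self.rate, self.capacity = rate, capacity
        self.tokens, self.last = capacity, time.monotonic()

    def allow(self, cost=1):
        now = time.monotonic()
        self.tokens = min(self.capacity, self.tokens + (now - self.last) * self.rate)
        self.last = now
        if self.tokens >= cost:
            self.tokens -= cost
            return True
        return False

t = Throttle(rate=1, capacity=2)
burst = [t.allow() for _ in range(4)]  # only the first 2 pass immediately
```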
78 reporting
The process of programmatically generating reports to present a customized view of stored data.
79 decision engine
Software or a service that can process a large amount of data and provide reports that help users make more informed decisions.
80 slab consolidation
The process of reducing the number of slabs allocated in thinly provisioned arrays and thinly provisioned virtual disks by rearranging data from sparsely populated slabs to densely populated slabs.