61 clean installation
The process of installing Windows onto a bare-metal system or overwriting an existing operating system installation. Clean installations do not migrate data from previous installations.
62 test agent
A background process that receives, runs, and reports on tests and collects data on a single computer. The test agent communicates with the test agent controller, which is usually located on another computer.
63 postback
The process in which a Web page sends data back to the same page on the server.
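As a minimal sketch of the idea (assuming a WSGI application; the route and field names are illustrative, not from any specific framework): a GET renders a form whose action is the page's own URL, and submitting the form POSTs the data back to that same page.

```python
from urllib.parse import parse_qs

# Form whose action points back at the page that served it.
FORM = b'<form method="post" action="/comment"><input name="msg"><input type="submit"></form>'

def app(environ, start_response):
    if environ["REQUEST_METHOD"] == "POST":
        # Postback: the same page receives and handles its own form data.
        size = int(environ.get("CONTENT_LENGTH") or 0)
        fields = parse_qs(environ["wsgi.input"].read(size).decode())
        body = ("You said: " + fields.get("msg", [""])[0]).encode()
    else:
        body = FORM  # first visit: render the form
    start_response("200 OK", [("Content-Type", "text/html")])
    return [body]
```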
64 Platform Server Role
A role that consists of the services that are used to process data. To run Microsoft Dynamics CRM, you must have at least one computer that is running the Platform Server role.
65 database replication
The process of creating two or more special copies (replicas) of an Access database. Replicas can be synchronized: changes made to the data in one replica, or design changes made in the Design Master, are sent to the other replicas.
66 batching
The process of sending changes in small groups instead of in a one-shot transfer of the data in its entirety.
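A minimal illustration in Python (the helper name is ours, not from any particular API): split a change set into fixed-size groups so each group can be sent separately.

```python
def batches(items, size):
    """Yield the items in successive groups of at most `size`."""
    for i in range(0, len(items), size):
        yield items[i:i + size]

# Instead of send(all_changes) in one shot, a caller would do:
# for group in batches(all_changes, 100):
#     send(group)
```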
67 source file
A file that contains the data that a program will process and store in a destination file.
68 Font Cache Service
A Win32 service process that optimizes performance of applications by caching commonly used font data. Applications will start this service if it is not already running. It can be disabled, though doing so will degrade application performance.
69 Windows Font Cache Service
A Win32 service process that optimizes performance of applications by caching commonly used font data. Applications will start this service if it is not already running. It can be disabled, though doing so will degrade application performance.
70 peer discovery
The process of locating peers that hold the data one wants to retrieve.
71 table lookup
The process of using a known value to search for data in a previously constructed table of values: for example, using a purchase price to search a tax table for the appropriate sales tax.
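The sales-tax example can be sketched in Python as follows (the table values are made up for illustration):

```python
import bisect

# Tax table: (price upper bound, sales tax) rows, sorted by price bound.
# The figures are illustrative only.
TAX_TABLE = [(10.00, 0.60), (20.00, 1.20), (50.00, 3.00), (100.00, 6.00)]

def lookup_tax(price):
    """Find the tax for the first table row whose bound covers the price."""
    bounds = [row[0] for row in TAX_TABLE]
    i = bisect.bisect_left(bounds, price)
    if i == len(TAX_TABLE):
        raise ValueError("price exceeds table range")
    return TAX_TABLE[i][1]
```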
72 type checking
The process performed by a compiler or interpreter to make sure that when a variable is used, it is treated as having the same data type as it was declared to have.
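A toy illustration of the idea (real compilers perform this check over the program's syntax tree; the variable names and helper are purely illustrative):

```python
# Declared types, as a compiler would record them from declarations.
declared = {"count": int, "name": str}

def check_assignment(var, value):
    """Reject a use of `var` whose value does not match its declared type."""
    expected = declared[var]
    if not isinstance(value, expected):
        raise TypeError(
            f"{var} declared {expected.__name__}, got {type(value).__name__}"
        )
    return value
```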
73 validity check
The process of analyzing data to determine whether it conforms to predetermined completeness and consistency parameters.
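For instance, a sketch of a validity check on an order record (the field names and rules are illustrative): completeness means all required fields are present, and consistency means the ship date is not earlier than the order date.

```python
from datetime import date

REQUIRED = {"order_id", "order_date", "ship_date"}

def is_valid(record):
    # Completeness: every required field must be present.
    if not REQUIRED <= record.keys():
        return False
    # Consistency: an order cannot ship before it was placed.
    return record["ship_date"] >= record["order_date"]
```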
74 Microsoft Assessment and Planning Toolkit
A powerful inventory, assessment, and reporting tool that can run securely in small or large IT environments without requiring the installation of agent software on any computers or devices. The data and analysis provided by this toolkit can significantly simplify the planning process for migrating to Windows.
75 MAP
Acronym for Microsoft Assessment and Planning Toolkit: a powerful inventory, assessment, and reporting tool that can run securely in small or large IT environments without requiring the installation of agent software on any computers or devices. The data and analysis provided by this toolkit can significantly simplify the planning process for migrating to Windows.
76 nagling
An optimization process for HTTP over TCP that increases efficiency by trying to minimize the number of packets that are required before data is sent.
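The term comes from Nagle's algorithm, which TCP enables by default; applications toggle it per socket through the standard TCP_NODELAY option (note the inverted sense: TCP_NODELAY = 1 turns nagling off). A small Python sketch, with helper names of our own:

```python
import socket

def set_nagling(sock, enabled):
    """Enable or disable Nagle's algorithm on a TCP socket."""
    # TCP_NODELAY == 1 means "no delay", i.e. Nagle is OFF, so invert.
    sock.setsockopt(socket.IPPROTO_TCP, socket.TCP_NODELAY, 0 if enabled else 1)

def nagling_enabled(sock):
    return sock.getsockopt(socket.IPPROTO_TCP, socket.TCP_NODELAY) == 0
```

Latency-sensitive protocols (many small request/response messages) often disable nagling; bulk transfers usually leave it on.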
77 throttle
A process that restricts the flow of data.
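One common way to implement a throttle is a token bucket; a minimal sketch (the class and parameter names are ours, not from any library):

```python
import time

class Throttle:
    """Token bucket: allow at most `rate` operations per second,
    with short bursts up to `burst` operations."""

    def __init__(self, rate, burst):
        self.rate, self.capacity = rate, burst
        self.tokens, self.last = burst, time.monotonic()

    def allow(self):
        # Refill tokens in proportion to elapsed time, capped at capacity.
        now = time.monotonic()
        self.tokens = min(self.capacity, self.tokens + (now - self.last) * self.rate)
        self.last = now
        if self.tokens >= 1:
            self.tokens -= 1
            return True   # operation permitted
        return False      # flow restricted: caller must wait or drop
```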
78 reporting
The process of programmatically generating reports to present a customized view of stored data.
79 decision engine
Software or a service that can process a large amount of data and provide reports that help users make more informed decisions.
80 slab consolidation
The process of reducing the number of slabs allocated in thinly provisioned arrays and thinly provisioned virtual disks by rearranging data from sparsely populated slabs to densely populated slabs.