41 offline cube file
A file you create on your hard disk or a network share to store OLAP source data for a PivotTable or PivotChart report. Offline cube files allow you to keep working when you are not connected to the OLAP server.
42 full import
"An import of all data from a file, or a scoped view, of a connected data source to the connector space. Data from the connected data source is compared with data in the connector space. If there are no attribute changes, the object is not changed in the connector space." -
43 query
"A formalized instruction to a data source to either extract data or perform a specified action. The query can be in the form of a query expression, or a method-based query (or a combination of the two). The data source can be in different forms, e.g. relational databases, XML documents, or in-memory objects." -
44 ODC file
"A file that stores information about a connection to a data source, such as an Access database, spreadsheet, or text file, and that facilitates data source administration." -
45 connector framework
"A software component that can be used to connect to a data source, and index and include data from that source in search results." -
46 bind
"To connect a control to a field or group in the data source so that data entered into the control is saved to the corresponding field or group. When a control is unbound, it is not connected to a field or group, and data entered into the control is not saved." -
47 module
"In programming, a collection of routines and data structures that performs a particular task or implements a particular abstract data type. Modules usually consist of two parts: an interface, which lists the constants, data types, variables, and routines that can be accessed by other modules or routines, and an implementation, which is private (accessible only to the module) and which contains the source code that actually implements the routines in the module." -
48 file DSN
Short for file Data Source Name: a file-based data source that can be shared among all users who have the same drivers installed. These data sources are not dedicated to a user or local to a computer.
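For illustration, a file DSN is usually a short text file with an [ODBC] section naming the driver and connection attributes, and ODBC-aware code can point at it by path. The sketch below assumes the pyodbc package, an installed SQL Server ODBC driver, and fictitious server, share, and database names; treat it as an outline rather than a tested configuration.

    # Example contents of a shared file DSN, e.g. \\fileserver\dsn\sales.dsn
    # (all values are fictitious):
    #
    #   [ODBC]
    #   DRIVER=ODBC Driver 17 for SQL Server
    #   SERVER=sales-db.example.com
    #   DATABASE=Sales
    #   Trusted_Connection=Yes
    #
    # Any user on a machine with the same driver installed can reuse it.
    import pyodbc

    # FILEDSN is the ODBC connection-string keyword that points at a .dsn file.
    conn = pyodbc.connect(r"FILEDSN=\\fileserver\dsn\sales.dsn", timeout=5)
    print(conn.execute("SELECT 1").fetchone())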
49 replica
"A complete copy of protected data residing on a single volume on the DPM server. A replica is created for each protected data source after it is added to its protection group. With co-location, multiple data sources can have their replicas residing on the same replica volume." -
50 dynaset
"A database recordset with dynamic properties. Unlike a snapshot, which is a static view of the data, a recordset object in dynaset mode stays synchronized with the data source and data updates made by other users." -
51 cube (ar-sa: مكعب)
A known data source specific to a target type that provides data to a collector type.
52 packet
"An Open Systems Interconnection (OSI) network layer transmission unit that consists of binary information representing both data and a header containing an identification number, source and destination addresses, and error-control data." -
53 provider
"In a Web Parts connection, a server control that sends data to a consumer control. A provider can be a WebPart control or any type of server control, but must be designed to function as a provider. A provider must have a special callback method marked with a ConnectionProviderAttribute attribute in the source code. This method provides data to consumer controls in the form of an interface instance." -
55 refresh interval
"An interval of time used by of a zone to determine how often to check if their zone data needs to be refreshed. When the refresh interval expires, the secondary master checks with its source for the zone to see if its zone data is still current or if it needs to be updated using a zone transfer. This interval is set in the SOA (start-of-authority) resource record for each zone." -
56 retry interval
"The time, in seconds after the refresh interval expires, used by secondary masters of a zone to determine how often to try and retry contacting its source for zone data to see if its replicated zone data needs to be refreshed. This interval is set in the for each zone." -
57 staging
"The process of running a management agent that imports data from a connected data source into the connector space, and then immediately stopping the run." -
58 destination file
"The file that a linked or embedded object is inserted into, or that data is saved to. The source file contains the information that is used to create the object. When you change information in a destination file, the information is not updated in the source file." -
59 IDC/HTX files
Microsoft Internet Information Server uses an IDC file and an HTX file to retrieve data from an ODBC data source and format it as an HTML document.
60 attribute field
"A field in the data source that can contain data and that is an attribute, instead of an element. Attribute fields cannot contain other fields."
See also in other dictionaries:
Source data — is the origin of information found in electronic media. Often when data is captured in one electronic system and then transferred to another, there is a loss of audit trail or the inherent data cannot be absolutely verified. There are systems… … Wikipedia
source data — primary data. Status: T; field: informatics; definition: data produced by a ↑data source. Sometimes data supplied by the user to solve a task are also called primary data, although the latter are more properly called ↑initial … Enciklopedinis kompiuterijos žodynas
source data automation — A method for reusing recorded, coded data … IT glossary of terms, acronyms and abbreviations
Open Source Data Integration — The Open Source Data Integration framework from the SnapLogic project (http://snaplogic.org; see also http://www.snaplogic.org, "Open Source Data Integration Framework") is an open source framework for enterprise scale data integration.… … Wikipedia
Common Source Data Base — Technical documentation is used in many areas of everyday life. Nearly everything has to be provided with at least a drawing including a few locators. Product liability and many other issues regarding consumer protection have to be… … Wikipedia
Postal Source Data System — (PSDS) An electronic data processing network that gathers operational and administrative data (such as mail volume and labor hours) from large post offices. It gathers the data with little or no manual intervention, processes it at a central site … Glossary of postal terms
Data compression — In computer science and information theory, data compression, source coding or bit rate reduction is the process of encoding information using fewer bits than… … Wikipedia
Data transformation — Data transformation / source transformation. Related concepts: metadata · data mapping · data transformation · model transf … Wikipedia
Data warehouse — In computing, a data warehouse (DW) is a database used for reporting and analysis. The data stored in the warehouse is uploaded from the operational systems. The data may pass through an operational data store for additional operations… … Wikipedia
Data exchange — is the process of taking data structured under a source schema and actually transforming it into data structured under a target schema, so that the target data is an accurate representation of the source data. Data exchange is… … Wikipedia
Data acquisition — is the process of sampling signals that measure real world physical conditions and converting the resulting samples into digital numeric values that can be manipulated by a computer. Data acquisition systems (abbreviated with the acronym DAS or… … Wikipedia