D
DAB (digital audio broadcasting)
DAB transmits digital signals rather than the analog audio signals traditionally used in broadcast radio. DAB is broadcast on terrestrial networks, with future prospects for satellite broadcasting. In addition to high-quality audio entertainment, DAB programs can be accompanied by text, such as song lyrics. The DAB-IP variant used by Virgin in the U.K. can also support video.
DAB+ (digital audio broadcasting plus)
DAB+ is an extension of DAB that supports newer, more efficient codecs and stronger error correction. This, however, creates backward-compatibility challenges with older DAB radios.
daemon
A Unix background process, typically started at system boot, that runs unattended to perform a particular task.
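As a rough illustration, the sketch below shows the classic Unix double-fork technique such a process typically uses to detach itself from the controlling terminal and run unattended; the heartbeat task and file path are purely illustrative.

import os
import sys
import time

def daemonize():
    # Unix-only sketch: first fork so the child is adopted by init when the parent exits.
    if os.fork() > 0:
        sys.exit(0)
    os.setsid()              # become session leader, detaching from the controlling terminal
    if os.fork() > 0:        # second fork: the daemon can never reacquire a terminal
        sys.exit(0)
    os.chdir("/")
    os.umask(0)
    devnull = os.open(os.devnull, os.O_RDWR)
    for fd in (0, 1, 2):     # detach stdin/stdout/stderr
        os.dup2(devnull, fd)

if __name__ == "__main__":
    daemonize()
    while True:              # the "particular task": here, just write a heartbeat file
        with open("/tmp/glossary_daemon.heartbeat", "w") as f:
            f.write(str(time.time()))
        time.sleep(60)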
daisy-chaining
The connection of multiple devices in a serial fashion. An advantage of daisy-chaining is savings in transmission facilities. A disadvantage is that if a device malfunctions, all of the devices daisy-chained behind it are disabled.
DAP (Directory Access Protocol)
A protocol for working among X.500 Directory Service Agents.
DAP (Distributed Application Platform)
An application framework introduced in 1997 by Visigenic (later acquired by Borland International).
DAPP (data analysis and provider profiling)
DAPP vendors are those that provide healthcare value-added analytic applications to support analysis of administrative data for the purposes of network management, actuarial and underwriting functions, medical management and performance measurement — including Health Plan Employer Data and Information Set (HEDIS) reporting.
dark fiber
Fiber-optic cable deployments that are not yet being used to carry network traffic. (The word "dark" refers to the fact that no light is passing through the optical fibers.)
DARPA (Defense Advanced Research Projects Agency)
The agency of the U.S. Department of Defense that developed the Transmission Control Protocol/Internet Protocol (TCP/IP) architecture for the ARPANET.
DAS (distributed antenna system)
System that uses passive (non-powered) or active (powered) networking equipment, such as antennas, fiber-optic cable, coaxial cable and other technologies, to extend RF coverage (of any technology) inside a building.
DAS (dual-attached station)
In Fiber Distributed Data Interface (FDDI), a device that is attached to both the primary and secondary rings.
DASD (direct-access storage device)
Generic nomenclature for a storage peripheral that can respond directly to random requests for information; usually denotes a disk drive.
dashboard
This subset of reporting includes the ability to publish formal, Web-based reports with intuitive interactive displays of information, including dials, gauges, sliders, check boxes and traffic lights. These displays indicate the status of an item or the state of a performance metric compared with a goal or target value. Increasingly, dashboards are used to disseminate real-time data from operational applications.
DAT (digital audiotape)
A magnetic tape that stores audio data converted to digital form.
DAT (dynamic address translation)
The change of a logical storage address to an actual storage address.
data center
The data center is the department in an enterprise that houses and maintains back-end information technology (IT) systems and data stores – its mainframes, servers and databases. In the days of large, centralized IT operations, this department and all the systems resided in one physical place, hence the name data center...
data center outsourcing
Data center outsourcing is a multiyear or annuity contract or relationship involving the day-to-day management responsibility for operating server/host platforms, including distributed servers and storage. Services include any combination (or all) of the product support and professional services as they specifically relate to the ongoing management of the computing and storage resources. Minimally, data center outsourcing contracts always include services encompassed by the computing platform of the operation services segment. Help desk management services are included only to the extent that problem determination and resolution is at the computing hardware level or the infrastructure software or OS software level. Application management services are included only to the extent of the infrastructure software or OS software level. Information management software and system management tools may be provided and used by the outsourcer or the enterprise client. Services may be provided at the client site or off-site. IT assets may be owned by either the client or the ESP, or a third party. Contracts may include the transfer of client employees, IT assets and facilities to the ESP.
data integration tools
The discipline of data integration comprises the practices, architectural techniques and tools for achieving the consistent access and delivery of data across the spectrum of data subject areas and data structure types in the enterprise to meet the data consumption requirements of all applications and business processes.
Data integration tools have traditionally been delivered via a set of related markets, with vendors in each market offering a specific style of data integration tool. In recent years, most of the activity has been within the ETL tool market. Markets for replication tools, data federation (EII) and other submarkets each included vendors offering tools optimized for a particular style of data integration, and periphery markets (such as data quality tools, adapters and data modeling tools) also overlapped with the data integration tool space. The result of all this historical fragmentation in the markets is the equally fragmented and complex way in which data integration is accomplished in large enterprises — different teams using different tools, with little consistency, lots of overlap and redundancy, and no common management or leverage of metadata. Technology buyers have been forced to acquire a portfolio of tools from multiple vendors to amass the capabilities necessary to address the full range of their data integration requirements.
This situation is now changing, with the separate and distinct data integration tool submarkets converging at the vendor and technology levels. This is being driven by buyer demands as organizations realize they need to think about data integration holistically and have a common set of data integration capabilities they can use across the enterprise. It is also being driven by the actions of vendors, such as those in individual data integration submarkets organically expanding their capabilities into neighboring areas, as well as by acquisition activity that brings vendors from multiple submarkets together. The result is a market for complete data integration tools that address a range of different data integration styles and are based on common design tooling, metadata and runtime architecture.
The data integration tool market comprises vendors that offer software products to enable the construction and implementation of data access and delivery infrastructure for a variety of data integration scenarios, including:
Data acquisition for BI and data warehousing – Extracting data from operational systems, transforming and merging that data, and delivering it to integrated data structures for analytic purposes. BI and data warehousing remains a mainstay of the demand for data integration tools.
Creation of integrated master data stores – Enabling the consolidation and rationalization of the data representing critical business entities, such as customers, products and employees. Master data management (MDM) initiatives may or may not be subject-based, and data integration tools can be used to build the data consolidation and synchronization processes that are key to their success.
Data migrations and conversions – These were traditionally addressed most often via the custom coding of conversion programs, but data integration tools are increasingly addressing the data movement and transformation challenges inherent in the replacement of legacy applications and consolidation efforts during merger and acquisition activities.
Synchronization of data between operational applications – Similar in concept to each of the prior scenarios, data integration tools provide the capability to ensure database-level consistency across applications, both on an internal and interenterprise basis, and in a bidirectional or unidirectional manner.
Creation of federated views of data from multiple data stores – Data federation, often referred to as EII, is growing in popularity as an approach for providing real-time integrated views across multiple data stores without physical movement of data. Data integration tools are increasingly including this type of virtual federation capability.
Delivery of data services in an SOA context – More an architectural technique than a distinct data integration use case in its own right, data services are the emerging approach to the role and implementation of data integration capabilities within SOAs. Data integration tools will increasingly enable the delivery of many types of data services.
Unification of structured and unstructured data – This is also not a specific use case itself, and it is relevant to each of the above scenarios. There is an early but growing trend toward leveraging data integration tools for merging structured and unstructured data sources as organizations work on delivering a holistic information infrastructure that addresses all data types.
The following describes the different functional capabilities that are included in data integration tools:
- Data integration adapters – Data integration adapters provide simplified connectivity and access to databases, files and other data structures to enable applications and other data integration tools to perform read and update operations. Using adapters, applications and tools can reach into a variety of database types to access data for a range of purposes. This technology is commonly used to make legacy data accessible to more-modern applications and processes.
- Data federation – Federation technology, sometimes referred to as EII, executes distributed queries across multiple databases to create a "virtual data layer." By leaving the data in place, this approach provides integrated views of data without creating a physical database to house them. These integrated views can be consumed and manipulated by applications, other integration infrastructure components, or query and reporting tools via SQL interfaces, XML or Web services. The data view is virtual in the sense that it resides only in memory or cache (that is, it generally is not persisted for long-term storage and is not backed up in its intermediate form). Data federation tools leave the application data in place, in contrast to ETL tools, which perform data movement to create a new copy of data. Federation tools can access a range of heterogeneous data sources on different platforms. Data may be cached for performance reasons and thus may not be entirely virtual. The early usage of data federation products tends to be simplistic, generally not addressing complex transformations, data cleansing or updates. This is typically because of the performance challenges inherent in delivering large-scale federated views.
- Data replication/synchronization – Data replication/synchronization technology enables the creation and ongoing maintenance of multiple copies of data. Replication/synchronization is generally focused on real-time consistency of transactions between databases. Although most replication/synchronization scenarios tend to be between homogeneous DBMSs and schemas, some replication/synchronization technology supports heterogeneous environments as well. A common use pattern for replication/synchronization technology is maintaining consistency of data between multiple, geographically dispersed instances of an operational application.
- ETL – ETL technology supports the acquisition and integration of data from multiple-source databases and provides the ability to change the syntax and semantics of that data and then deliver it to one or more target databases. ETL technology typically supports the movement of data for batch-oriented data integration processes, often in the context of building an integrated data structure, such as a data warehouse or master data store. BI-oriented data integration has been the traditional focus of ETL vendors; however, there is a trend toward broader use of the tools, particularly for implementing batch-oriented data consistency integration patterns between operational applications. ETL technology is well-suited when low-latency delivery of data is not a requirement and rich set-based transformation capabilities are important.
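To make the ETL pattern above concrete, here is a minimal sketch in Python: rows are extracted from a flat file dumped by an operational system, their syntax and semantics are changed, and the result is loaded into a target table. The file name, column names and warehouse schema are hypothetical.

import csv
import sqlite3

SOURCE_CSV = "orders_export.csv"   # extract: a flat file from an operational system (hypothetical)
TARGET_DB = "warehouse.db"         # load: an integrated analytic data store (hypothetical)

def extract(path):
    with open(path, newline="") as f:
        yield from csv.DictReader(f)

def transform(rows):
    for row in rows:
        # syntax change: normalize the date format; semantic change: derive a revenue measure
        yield {
            "order_id": int(row["id"]),
            "order_date": row["date"].replace("/", "-"),
            "revenue": float(row["qty"]) * float(row["unit_price"]),
        }

def load(rows, db_path):
    con = sqlite3.connect(db_path)
    con.execute("CREATE TABLE IF NOT EXISTS fact_orders (order_id INTEGER, order_date TEXT, revenue REAL)")
    con.executemany("INSERT INTO fact_orders VALUES (:order_id, :order_date, :revenue)", rows)
    con.commit()
    con.close()

if __name__ == "__main__":
    load(transform(extract(SOURCE_CSV)), TARGET_DB)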
data mining
The process of discovering meaningful correlations, patterns and trends by sifting through large amounts of data stored in repositories. Data mining employs pattern recognition technologies, as well as statistical and mathematical techniques.
data quality tools
The market for data quality tools has become highly visible in recent years as more organizations understand the impact of poor-quality data and seek solutions for improvement. Traditionally aligned with cleansing of customer data (names and addresses) in support of CRM-related activities, the tools have expanded well beyond such capabilities, and forward-thinking organizations are recognizing the relevance of these tools in other data domains. Product data – often driven by MDM initiatives – and financial data (driven by compliance pressures) are two such areas in which demand for the tools is quickly building.
Data quality tools are used to address various aspects of the data quality problem:
- Parsing and standardization – Decomposition of text fields into component parts and formatting of values into consistent layouts based on industry standards, local standards (for example, postal authority standards for address data), user-defined business rules, and knowledge bases of values and patterns
- Generalized "cleansing" – Modification of data values to meet domain restrictions, integrity constraints or other business rules that define sufficient data quality for the organization
- Matching – Identification, linking or merging related entries within or across sets of data
- Profiling – Analysis of data to capture statistics (metadata) that provide insight into the quality of the data and aid in the identification of data quality issues
- Monitoring – Deployment of controls to ensure ongoing conformance of data to business rules that define data quality for the organization
- Enrichment – Enhancing the value of internally held data by appending related attributes from external sources (for example, consumer demographic attributes or geographic descriptors)
The tools provided by vendors in this market are generally consumed by technology users for internal deployment in their IT infrastructure, although hosted data quality solutions are continuing to emerge and grow in popularity. The tools are increasingly implemented in support of general data quality improvement initiatives, as well as within critical applications, such as ERP, CRM and BI. As data quality becomes increasingly pervasive, many data integration tools now include data quality management functionality.
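The toy Python sketch below illustrates three of the capabilities listed above (parsing/standardization, matching and profiling) on a pair of fabricated customer records; the field names and rules are illustrative only and do not reflect any particular vendor's implementation.

import re
from collections import Counter

records = [                                        # fabricated input data
    {"name": "  ACME Corp.", "postcode": "ab1 2cd"},
    {"name": "Acme Corporation", "postcode": "AB12CD"},
    {"name": None, "postcode": "ZZ9 9ZZ"},
]

def standardize(rec):
    # parsing/standardization: case-fold, strip punctuation, normalize a common suffix and spacing
    name = re.sub(r"[^\w\s]", "", (rec["name"] or "").upper())
    name = re.sub(r"\bCORPORATION\b", "CORP", name)
    name = re.sub(r"\s+", " ", name).strip()
    postcode = re.sub(r"\s+", "", (rec["postcode"] or "").upper())
    return {"name": name, "postcode": postcode}

clean = [standardize(r) for r in records]

# matching: link records that agree on the standardized key
matches = Counter((r["name"], r["postcode"]) for r in clean)
duplicates = [key for key, count in matches.items() if count > 1]

# profiling: a simple completeness statistic that hints at quality issues
missing_names = sum(1 for r in records if not r["name"])

print("possible duplicates:", duplicates)
print("records missing a name:", missing_names, "of", len(records))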
data replication
The data replication segment includes a set of data replication products that reside in the disk array controller, in a device in the storage network or on a server. Included are local and remote replication products, migration tools, and disk imaging products. Also included are replication products specifically targeted as an alternative to backup applications. Not included are database replication products, log-based DBMS replication products or application-based replication products.
data synchronization
A form of embedded middleware that allows applications to update data on two systems so that the data sets are identical. These services can run via a variety of different transports but typically require some application-specific knowledge of the context and notion of the data being synchronized.
data warehouse
A storage architecture designed to hold data extracted from transaction systems, operational data stores and external sources. The warehouse then combines that data in an aggregate, summary form suitable for enterprisewide data analysis and reporting for predefined business needs. The five components of a data warehouse are production data sources, data extraction and conversion, the data warehouse database management system, data warehouse administration and business intelligence (BI) tools.
database design
This includes logical (entity relationship) and physical (table, column and key) design tools for data. Physical data modeling is becoming almost mandatory for applications using relational database management systems (RDBMSs). Strong support for physical modeling is paired with facilities to manage multiple models, to submodel or extract from larger models, and to reverse-engineer a database design from established tables. Data architects/analysts and database designers/administrators are the primary targeted users of these tools, although developers are a secondary market often targeted with a subset of the complete functionality.
DBMS (database management system)
A DBMS is a product used for the storage and organization of data that typically has defined formats and structures. DBMSs are categorized by their basic structures and, to some extent, by their use or deployment.
DBMS management
Included here are tools for monitoring and diagnosing problems with databases, analyzing and improving the performance of databases, and routine administration of databases, including configuration changes. Examples include database management monitors, SQL tuners, space tuners, reorganization tools, utilities, loaders and unloaders, and many other tools, as well as suites that may include several of the above.
DBS (direct broadcast satellite)
Type of satellite used for consumer services, primarily the transmission of radio and TV programs. A direct broadcasting satellite is similar to a fixed-service satellite (FSS); however, it offers a higher power output, requiring smaller antennas to receive the signal. Typical DBS services offer digital programming, digital audio services and, increasingly, high-definition TV (HDTV).
DDBMS (distributed database management system)
A DBMS that enables end users or application programmers to view a collection of physically separate databases as one logical single-system image. The concept that is most fundamental to the DDBMS is location transparency, meaning the user should not be conscious of the actual location of data.
DDL (data definition language)
A language used to describe the data model for a database, i.e., the names and access paths for the data and how they are interrelated. In some software products, the DDL describes the logical, not the physical, data. Other products use it to describe both.
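As a small illustration (using Python's built-in sqlite3 module; the tables and columns are invented), the script below hands a block of DDL to the DBMS, which builds the described tables, key relationships and index:

import sqlite3

ddl = """
CREATE TABLE department (
    dept_id INTEGER PRIMARY KEY,
    name    TEXT NOT NULL
);
CREATE TABLE employee (
    emp_id  INTEGER PRIMARY KEY,
    name    TEXT NOT NULL,
    dept_id INTEGER REFERENCES department(dept_id)  -- describes how the tables interrelate
);
CREATE INDEX idx_employee_dept ON employee(dept_id);  -- an access path for lookups by department
"""

con = sqlite3.connect(":memory:")
con.executescript(ddl)  # the DBMS interprets the DDL and creates the described structures
print([row[0] for row in con.execute("SELECT name FROM sqlite_master WHERE type = 'table'")])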
demand-driven value network (DDVN)
A demand-driven value network is a business environment holistically designed to maximize value across the set of processes and technologies that senses and orchestrates demand based on a near-zero-latency signal across multiple networks of employees and trading partners.
Deming PDCA cycle
Continuous improvement model of "Plan, Do, Check, Act." Often represented as the four quadrants of the rim of a circle to reflect the fact that once all four elements have been accomplished, the cycle repeats.
deployment
Deployment services support the implementation and rollout of new applications or infrastructure. Activities may include hardware or software procurement, configuration, tuning, staging, installation and interoperability testing.
desktop outsourcing
Desktop outsourcing is a multiyear or annuity contract or relationship involving the day-to-day management responsibility for operating desktop/client platforms. Services include any combination (or all) of the product support and professional services as they specifically relate to the ongoing management of the desktop resources (including desktop peripherals). Minimally, desktop outsourcing contracts always include services encompassed by the computing environment of the operation services segment. Help desk management services are included only to the extent that problem determination and resolution is at the computing hardware level or the infrastructure software or OS software level. Application management services are included only to the extent of the infrastructure software or OS software level. A desktop system can include any client system (including a notebook) and may include the client systems of remote employees, such as telecommuters and mobile staff. Services may be provided at the client site or off-site. IT assets may be owned by either the client, the ESP or a third party. Contracts may include the transfer of client employees, IT assets and facilities to the ESP.
desktop virtualization
Desktop virtualization is not a single market category, but rather is made up of four distinct markets that address different requirements – virtualization software, hosted virtual desktops, application virtualization and portable personality solutions...
DEVA (document-enabled vertical application)
A Gartner concept that applies integrated document and output management (IDOM) technologies in specific industries for support of vertical (or sometimes horizontal) processes. Examples of industries and related processes include insurance (claims processing), engineering (technical document management), pharmaceuticals (new drug application), financial services (retirement processing) and cross-industry applications (call-center support).
development and integration services
Development and integration services support the implementation and rollout of new network infrastructure, including consolidation of established network infrastructure. Activities may include hardware or software procurement, configuration, tuning, staging, installation and interoperability testing.
device resource management
Storage subsystems and SAN infrastructure component software products provide configuration utilities and agents that collect capacity, performance and status information, usually for a single device type or a set of devices from a single vendor. Most of the products in this segment are called element managers.
DFSS (design for Six Sigma)
Design for Six Sigma is a technique that prescribes a specific approach to product design emphasizing variability reduction and quality.
DFX (design for X)
Design for "x," where "x" can be manufacturing, service, quality, maintenance, etc.
DGT (Directorate General of Telecommunications, Taiwan)
Regulator for telecommunications, broadcast radio and TV in Taiwan, an agency of the Ministry of Transportation and Communications.
DIF (Data Interchange Format)
A file format developed for VisiCalc, the first electronic spreadsheet. Still used today as a means for transferring files to and from spreadsheets.
digital
Signal transmission that conveys information through a series of coded pulses representing 1s and 0s (binary code).
digital copiers
Copiers that capture images by digital scanning and transfer them using electronic impulses. The image is scanned from the platen and digitized into electronic data, which is processed to enable the image to be transferred to the photoconductor. The electronic image data is then sent to a print engine, which may use any of a number of technologies, such as laser, LED or solid ink.
digital dial tone
A Gartner term describing the combination of XML and Internet transport protocols – such as HTTP, SMTP and FTP – to create a ubiquitous capability to exchange structured information. The metaphor relies on the contrast between previous business-to-business (B2B) message exchanges – such as electronic data interchange (EDI) using X12, EDI for Administration, Commerce and Transport (EDIFACT) and various vertical-industry-specific standards – and the unstructured communications media of telephone and fax. The former required considerable investment and the interposition of a private network – or substantial bilateral negotiation – to achieve interoperability, while the latter are ubiquitous and broadly interoperable. With the Internet and ubiquitous software to implement Internet transport protocols, business partners can reduce the cost, delay and risk of implementing B2B messaging.
Internet transport protocols by themselves do not constitute a complete digital-dial-tone solution. For secure application messaging to occur over the Internet, communicating systems must be programmed to additional specifications in key areas such as security, routing and access control. Ubiquity will not be achieved until integration products agree on standards for meeting these requirements.
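A minimal sketch of the underlying idea, assuming a hypothetical partner endpoint URL: a small XML business document is built and shipped over a standard Internet transport (HTTP) using only ubiquitous, off-the-shelf software.

import urllib.request
import xml.etree.ElementTree as ET

# build a small structured business document (the purchase-order content is illustrative)
order = ET.Element("purchaseOrder", id="1234")
ET.SubElement(order, "partner").text = "ACME"
ET.SubElement(order, "amount", currency="USD").text = "199.00"
payload = ET.tostring(order, encoding="utf-8", xml_declaration=True)

# send it over a ubiquitous Internet transport; the endpoint URL is hypothetical
req = urllib.request.Request(
    "https://partner.example.com/b2b/orders",
    data=payload,
    headers={"Content-Type": "application/xml"},
    method="POST",
)
# urllib.request.urlopen(req)  # uncomment to send against a real endpoint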
digital divide
The gap in opportunities experienced by those with limited accessibility to technology, especially the Internet. This includes, but is not limited to, accessibility challenges in the following areas:
- Cultural (e.g., membership of a community that prohibits or restricts access to technology)
- Physical (e.g., having a disability that makes it difficult or impossible to use a computer)
- Economic (e.g., being unable to afford a computer)
- Educational (e.g., not knowing how to use a computer)
digital forensics
Gartner defines digital forensics as the use of specialized, investigative techniques and technologies to determine whether illegal or otherwise inappropriate events have occurred on computer systems, and provide legally defensible information about the sequence of those events.
digital loopback
A technique for testing the digital processing circuitry of a communications device. It can be initiated locally or remotely via a telecommunications circuit; the device being tested echoes back a received test message (after first decoding and then re-encoding it), and the results are compared with the original message.
digital modem
A system component that enables communication over digital access facilities with a remotely located system connected to the public network over analog facilities.
digital network
A network incorporating both digital switching and digital transmission.
digital signature
A core function of a public key infrastructure (PKI). A digital signature can prove identity because it is created with the private key portion (which only the key holder should access) of a public/private key pair. Anyone with the sender's widely published public key can decrypt the signature and, by doing so, receive the assurance that the data must have come from the sender (nonrepudiation of the sender) and that the data has not changed (integrity). The data that is encrypted with the private key is not the entire message, but a short, fixed-length block of data that is computed from the message using a so-called "hash" function.
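The sketch below walks through the sign-and-verify flow described above, assuming the third-party pyca/cryptography package is available; the message content is illustrative.

from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import padding, rsa

message = b"purchase order #1234"   # illustrative payload

# sender: compute a short, fixed-length hash of the message and sign it with the private key
private_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
pss = padding.PSS(mgf=padding.MGF1(hashes.SHA256()), salt_length=padding.PSS.MAX_LENGTH)
signature = private_key.sign(message, pss, hashes.SHA256())

# receiver: verify with the widely published public key
public_key = private_key.public_key()
try:
    public_key.verify(signature, message, pss, hashes.SHA256())
    print("signature valid: the message is intact and came from the key holder")
except InvalidSignature:
    print("signature invalid: the message was altered or not signed by this key")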
digital switching
The process of establishing and maintaining a connection under stored program control where binary-encoded information is routed between an input and an output port.
digitize
To convert or express an analog form in a digital format.
DIP (document image processing)
A technology used to scan, digitize and store documents (e.g., checks or invoices) as images.
DIP (dual in-line package)
A method of packaging electronic components for mounting on printed circuit boards.
direct channel
This is a channel through which hardware, software and peripherals are sold by the manufacturer directly to the end user:
Direct sales force – This is a channel through which products move directly from the manufacturer or vendor to the end user, usually by a professionally trained field sales force.
Direct fax/phone/Web – This is a channel through which manufacturers sell their own products directly to end users through the use of the telephone, Web, fax, fax back and mail, including e-mail and catalog.
Direct retail – These are storefront operations owned and managed by the vendor, typically a manufacturer of computer systems. Direct stores are more common in Europe and Japan than in other parts of the world. Sales through direct stores are not reported separately by Gartner's worldwide services. They are grouped under direct sales force or one of the indirect channels.
directed speech recognition
A system that uses a script-like dialog instead of complete, free-form natural language. For each question asked, there are a limited number of valid responses. With this approach, accuracy rates may go up dramatically on less-expensive hardware.
directory services
Middleware that locates the correct and full network address for a mail addressee from a partial name or address. A directory service provides a naming service and extends the capabilities to include intelligent searching and location of resources in the directory structure.
"dirty" protocols
Many Internet Protocol (IP) applications assume that direct IP connectivity exists between hosts. In today's Internet or extranets, this is often not true. The problems of limited IP address space have caused many enterprises to use private Request for Comments (RFC) 1918 addresses. These addresses cannot be routed and, for enterprises to connect to the Internet or to communicate in an extranet, address translations or application proxies must be used. For applications that exchange their IP addresses between the client and the server, these "dirty" IP addresses are not valid when one or both of the end systems exist on an RFC 1918 network. In addition, without using special techniques, applications like File Transfer Protocol (FTP) will not work when an enterprise uses private RFC 1918 addresses.
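For illustration, the short Python snippet below (standard library only) checks whether an address falls within one of the RFC 1918 private ranges, the kind of "dirty" address that breaks applications, such as active-mode FTP, that embed IP addresses in their payload.

import ipaddress

# RFC 1918 defines three private IPv4 blocks
RFC1918_NETS = [
    ipaddress.ip_network("10.0.0.0/8"),
    ipaddress.ip_network("172.16.0.0/12"),
    ipaddress.ip_network("192.168.0.0/16"),
]

def is_rfc1918(addr: str) -> bool:
    ip = ipaddress.ip_address(addr)
    return any(ip in net for net in RFC1918_NETS)

print(is_rfc1918("192.168.1.25"))  # True  -> not routable on the Internet; needs NAT or a proxy
print(is_rfc1918("8.8.8.8"))       # False -> a publicly routable address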
disciplined multisourcing
The disciplined provisioning and blending of business and IT services from the optimal set of internal and external providers in the pursuit of business goals.
discrete manufacturing
The production of a discrete category of goods (e.g., automobiles, aircraft, computers or component assemblies).
discretionary security controls
An operating-system security rating of C2 or higher based on U.S. Department of Defense trusted computer system evaluation criteria.
discussion database
A database designed specifically for the capture, exchange and storage of ideas (e.g., Lotus Notes).
distributed computing
A form of computing in which data and applications are distributed among disparate computers or systems, but are connected and integrated by means of network services and interoperability standards such that they function as a single environment. See DCE (distributed computing environment).
distributed database
A database whose objects (tables, views, columns and files) reside on more than one system in a network, and can be accessed or updated from any system in the network.
distributed data management
A form of client/server computing in which some portion of the application's data management executes on two or more computers.
distributed function
A form of client/server computing in which some of the application program logic executes on one computer, possibly with a database, and the rest of the application resides on another computer, possibly along with presentation services.
DMAIC (define, measure, analyze, improve, control)
"Define, Measure, Analyze, Improve, Control"; a problem-solving methodology associated with Six Sigma process improvement.
DMB (digital multimedia broadcasting)
Technology that can transmit digital video to mobile devices. It developed out of the DAB standard, which established itself as the best terrestrial radio system for delivering CD-quality, digital stereo sound in fixed, portable and mobile reception conditions.
DMI (data management and integration)
Gartner defines data management and integration as the practices, architectural techniques and tools for achieving consistent access to and delivery of data across the spectrum of data subject areas and data structure types in the enterprise, to meet the data consumption requirements of all applications and business processes.
document management
Document management is a function in which applications or middleware perform data management tasks tailored for typical unstructured documents...
document management hardware services
This segment includes copier and printer services.
- Copier services – Copiers perform image capture and transfer. This category includes analog (optical technology) and digital (digital scanning and printing technology) copiers.
- Printer services – A printer is the peripheral output device of a computer system for producing computer-generated images on paper using various marking technologies. To be classified in this segment, the device needs to be capable of using plain or coated papers with a minimum size of International Organization for Standardization A4, U.S. size A (letter) or continuous forms with an 8-inch print width or greater, but it excludes products that support paper widths above A2 or U.S. size C (17 inches x 22 inches). The definition also excludes other classes of application-specific printers, such as point-of-sale printers, airline ticket printers, video printers and dedicated photo printers.
DoJa (DoCoMo Java)
The DoJa profile is the NTT DoCoMo Java environment specification for i-Mode mobile phones, used mainly for i-Mode games.
domain
- A group of nodes on a network forming an administrative entity.
- On the Internet, a part of the naming hierarchy that refers to groupings of networks based on organization type or geography.
domain name
A unique identifier for an Internet site or Internet Protocol (IP) network address, consisting of at least two segments separated by periods. Enterprises must register their domain names with an accredited registrar and pay a yearly fee to maintain the registration.
downlink
Satellite communication link that involves signal transmission or retransmission from in-orbit satellites to earth stations or other receiving terminals on the ground. See also uplink.
download
The process of transferring a file down to a computer over a network, typically from a server or some other computing device. Download times can be greatly affected by the method of connection to the network.
downtime
The total time a system is out of service.
DPMO (defects per million opportunities)
A critical measure associated with Six Sigma-based quality management.
DPO (defects per opportunity)
A measure of quality that reflects whether a specific product or service has any defects.
DPU (defects per unit)
A measure of quality that measures how many defects are associated with a single product or service unit.
DR (disaster recovery)
- The use of alternative network circuits to re-establish communications channels in the event that the primary channels are disconnected or malfunctioning.
- Methods and procedures for returning a data center to full operation after a catastrophic interruption (e.g., including recovery of lost data).
DR (distributed request)
A single read-only request to multiple data sources.
DRAM (dynamic random-access memory)
A computer memory chip that requires electronic refresh cycles to preserve data stored for manipulation by logic chips.
DRM (digital rights management)
Trusted exchange of digital information over the Internet whereby the user is granted only the privileges that the document sender allows.
DRM (distributed resource management)
An evolving discipline consisting of a set of software, hardware, network tools, procedures and policies for enabling distributed enterprise systems to operate effectively in production. DRM embraces solutions for the daily monitoring, resource planning, system administration, change management, operations, performance and other initiatives that are needed to maintain effective productivity in a distributed networked computing environment.
DRP (disaster recovery planning)
Planning to ensure the timely recovery of information technology assets and services following a catastrophe, such as fire, flood or hardware failure.
DRP (distribution requirements planning)
The process of assessing from which location products and services should be deployed, and determining the stock-keeping unit (SKU) and location-level replenishment plan.
drum, buffer, rope
A constraint-aware workflow control process in which the "drum" beat sets the pace of production based on the constraint's capacity, the "buffer" provides a contingency, and the "rope" controls the flow of work.
DSL (digital subscriber line)
A technology for high-speed network or Internet access over voice lines. There are various types, including asymmetric DSL (ADSL), high-bit-rate DSL (HDSL), symmetric DSL (SDSL) and very-high-bit-rate DSL (VDSL). The whole group is sometimes referred to as "xDSL."
DSL/cable-sharing residential/small-office gateway/router
This device can be wired or wireless. It is similar to a residential gateway/router but does not have an integrated DSL modem. It is distinguished by its ability to work with different types of broadband distribution networks, such as cable or DSL, and therefore has a port for connecting to the output of an external modem (either DSL or cable).
DSM (distributed systems management)
A technology for managing the interconnected parts of a system. As managed items – i.e., components of applications, nodes, links or subsystems – become active, they must notify their manager of their status. DSM tools are capable of dealing with a limited number of distinct elements and require a strong directory.
DTH (direct to home)
TV and broadcasting services delivered by satellite directly to consumer households, enabled by individual reception systems (an antenna/dish and a satellite integrated receiver-decoder (IRD)/receiver). DBS satellite providers deliver a form of direct-to-home service. See also DBS and IRD.
dual-band
Mobile device that supports voice and data communications conforming to one bearer technology, such as GSM, but on two different sets of frequencies. For example, to support additional mobile network operators or to provide additional capacity and coverage, many European and Asia/Pacific countries/markets have licensed deployment of GSM networks on both 900MHz and 1,800MHz spectrum. A dual-band GSM phone enables the user to roam automatically across networks on either frequency. Most GSM phones sold in these countries are dual-band. A tri-band phone is required to roam among operators in Asia/Pacific, Europe and North America, because GSM has been deployed in 1,900MHz spectrum in North America. See also tri-band.
dual-band network
Cellular radio system that operates in two different frequency bands in which network elements conform to identical network architectures and radio interfaces.
dual mode
Mobile device that functions on two different bearer technologies, such as GSM and WCDMA, or 1x and WCDMA. Most 3G phones are dual-mode and tri- or quad-band, enabling users to roam onto 2G networks when they are outside the 3G coverage area. See also tri-band.
dumb terminal
A terminal that does not perform local processing of entered information, but serves only as an input/output device for an attached or network-linked processor.
dump
To transfer all information from a record to another storage medium, e.g., copying from memory to a printer.
duplex channel
Two-way radio communications channel.
DVB-H (digital video broadcasting – handheld)
Technology standard for systems that transmit digital multimedia data to mobile devices in the form of IP datagrams. It is a development from the Digital Video Broadcasting – Terrestrial (DVB-T) standard, which was intended mainly for portable and stationary reception using rooftop antennas.
DVB-RCS (digital video broadcasting – return channel via satellite)
Technical standard that defines a complete air interface specification for two-way satellite broadband very small aperture terminal (VSAT) systems. The technology is being used as part of a low-cost VSAT system to provide highly dynamic, demand-assigned transmission capacity to residential and commercial/institutional users. DVB-RCS enables the near-equivalent speeds of asymmetric digital subscriber line or cable Internet connections without the need for local terrestrial infrastructure. Depending on satellite link budgets and other system design parameters, DVB-RCS-based systems can dynamically provide up to 20 Mbps to each terminal on the downlink, and up to 5 Mbps or more from each terminal on the uplink. The standard is published by ETSI as EN 301 790. See DVB Project and VSAT.
DVB-SH (digital video broadcasting – satellite services to handheld)
Transmission system standard designed to deliver video, audio and data services to small handheld devices via satellite using S-band frequencies. DVB-SH aims to take advantage of the characteristics of higher frequency S-band, where there is less congestion than in an ultrahigh frequency spectrum. DVB-SH's key feature: it is a hybrid satellite/terrestrial system enabling the use of a satellite to achieve coverage of large regions or a whole country. In areas where direct reception of the satellite signal is not possible, a terrestrial gap filler system, such as an ATC system, can be used to provide coverage. DVB-SH systems are designed to use frequencies below 3GHz, typically around 2.2GHz. DVB began work on the DVB-SH specifications in 2006. The system and waveform specifications have been published as ETSI standards TS 102 585 and EN 302 583. See also DVB Project, DVB-H, ETSI, and S-band.
DVB-T (digital video broadcasting – terrestrial)
Standard used in Europe and Asia to transmit and decode digital television signals. North America, parts of Latin America and South Korea have adopted the Advanced Television Systems Committee (ATSC) standard. See also DVB-H.
DXC (digital cross-connect)
DXCs are used at major network nodes to cross-connect a number of inbound and outbound circuits. The cross-connecting of circuits is done when circuits are provisioned, but, typically, cross-connects are also used to implement various schemes for protection switching and network restoration. In the SDH market, the abbreviation "DXC" is used for digital cross-connects, whereas they are referred to as "DCSs" in the SONET world.
dye sublimation
A printing technology in which the device prints one line at a time, using an electrically heated element to produce images. Instead of spraying jets of ink onto a page as inkjet printers do, dye sublimation printers apply dye from a plastic film. The film takes the form of a roll or a ribbon, similar to that used by thermal wax printers, usually containing consecutive panels of cyan, magenta, yellow and black dye.
dynamic adaptive routing
Automatic selection and use of alternative communications paths among two or more midrange systems of the same supplier in the event of a congested, faulty or downed circuit within the preferred data path.
dynamic application security testing and static application security testing
Web application and source code security vulnerability scanners are technologies that identify application or code conditions indicative of an exploitable vulnerability.
dynamic bandwidth allocation
The process of determining current traffic loads over a channel, and automatically increasing or decreasing the bandwidth of the channel to optimize overall utilization efficiency.
dynamic content
Web site content that is continually refreshed to provide new or updated information to attract new viewers and to keep prior viewers returning to the site.
dynamic database restructuring
The ability to change the relational-database structure, table capacities and security without unloading and reloading the database.
dynamic routing
A method of wide-area network transmission that uses a router to select the most appropriate path for each section of data packet transmission along a network.
dynamic Web application tools
The dynamic Web application tool market includes tools that support interpreted and dynamic languages, such as Perl, Python, PHP, Ruby and ECMAScript. These tools are focused primarily (although not exclusively) on combining traditional integrated development environment features with Web design features.