-
81 речевой ввод
1. verbal input 2. vocal input 3. voice input -
82 символ
1. cipher 2. token 3. emblem 4. digit 5. icon 6. letter 7. sign 8. symbolic unit 9. badge 10. char 11. character 12. symbols
символ входа; имя входа — entry symbol
13. trappings 14. symbol -
83 сортировка с двухканальными вставками — two-way insertion sort
ошибка ложного восприятия; ложная вставка — insertion error
вставной блок; блок для вставки — insertion unit
-
84 сортировка с простыми вставками — straight (simple) insertion sort
ошибка ложного восприятия; ложная вставка — insertion error
вставной блок; блок для вставки — insertion unit
точка вставки; место вставки — insertion point
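The two entries above name insertion-sort variants (two-way and straight insertion sort). A minimal Python sketch of the straight variant, for illustration only; the function and variable names are mine, not the dictionary's:

    def insertion_sort(items):
        """Straight (simple) insertion sort: grow a sorted prefix one element at a time."""
        for i in range(1, len(items)):
            key = items[i]
            j = i - 1
            # Shift larger elements of the sorted prefix one slot to the right.
            while j >= 0 and items[j] > key:
                items[j + 1] = items[j]
                j -= 1
            items[j + 1] = key  # Drop the element into its insertion point.
        return items

    print(insertion_sort([5, 2, 4, 6, 1, 3]))  # [1, 2, 3, 4, 5, 6]

The two-way (two-channel) variant works the same way but inserts into an output area that can grow at both ends, which roughly halves the average number of shifts.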
-
85 тензометрическое множительное устройство — strain-gauge multiplier unit
-
86 устройство вывода
1. output equipment 2. output mechanism
выходное устройство; устройство вывода — output device
программа вывода; выходная программа — output routine
механизм вывода; устройство вывода — output mechanism
3. output 4. output block 5. output device 6. output unit
визуальный выход; визуальный вывод — visual output
-
87 четырехканальное множительное устройство — four-channel multiplier unit
-
88 электронное множительное устройство — electronic multiplier unit
-
89 bit
E-com. 1. a binary digit (0 or 1), the smallest unit of computerized data 2. an item of information or knowledge -
90 Shannon, Claude Elwood
b. 30 April 1916 Gaylord, Michigan, USA
American mathematician, creator of information theory.
As a child, Shannon tinkered with radio kits and enjoyed solving puzzles, particularly cryptographic ones. He graduated from the University of Michigan in 1936 with a Bachelor of Science in mathematics and electrical engineering, and earned his Master's degree from the Massachusetts Institute of Technology (MIT) in 1937. His thesis on applying Boolean algebra to switching circuits has since been acclaimed as possibly the most significant of the century. Shannon earned his PhD in mathematics from MIT in 1940 with a dissertation on the mathematics of genetic transmission.
Shannon spent a year at the Institute for Advanced Study in Princeton, then in 1941 joined Bell Telephone Laboratories, where he began studying the relative efficiency of alternative transmission systems. Work on digital encryption systems during the Second World War led him to think that, just as ciphers hide information from the enemy, "encoding" information could also protect it from noise. About 1948, he decided that the amount of information was best expressed quantitatively in a two-value number system, using only the digits 0 and 1. John Tukey, a Princeton colleague, named these units "binary digits" (or, for short, "bits"). Almost all digital computers and communications systems use such on-off, or two-state, logic as their basis of operation.
Also in the 1940s, building on the work of H. Nyquist and R.V.L. Hartley, Shannon proved that there was an upper limit to the amount of information that could be transmitted through a communications channel in a unit of time, a limit which could be approached but never reached because real transmissions are subject to interference (noise). This was the beginning of information theory, which has been used by others in attempts to quantify many sciences and technologies, as well as subjects in the humanities, but with mixed results. Before 1970, when integrated circuits were developed, Shannon's theory was not the preferred circuit-and-transmission design tool it has since become.
Shannon was also a pioneer in the field of artificial intelligence, claiming that computing machines could be used to manipulate symbols as well as do calculations. His 1953 paper on computers and automata proposed that digital computers were capable of tasks then thought exclusively the province of living organisms. In 1956 he left Bell Laboratories to join the MIT faculty as Professor of Communications Science.
On the lighter side, Shannon has built many devices that play games and, in particular, has made a scientific study of juggling.
Principal Honours and Distinctions
National Medal of Science. Institute of Electrical and Electronics Engineers Medal of Honor. Kyoto Prize.
Bibliography
His seminal paper (on what has subsequently become known as information theory) was entitled "A Mathematical Theory of Communication", first published in the Bell System Technical Journal in 1948; it is also available in a monograph (written with Warren Weaver) published by the University of Illinois Press in 1949, and in Key Papers in the Development of Information Theory, ed. David Slepian, IEEE Press, 1974, 1988. For readers who want all of Shannon's works, see N.J.A. Sloane and A.D. Wyner, 1992, The Collected Papers of Claude E. Shannon.
HO -
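The "upper limit" mentioned in the Shannon entry above is the channel capacity. For the special case of a channel of bandwidth B with additive white Gaussian noise at signal-to-noise ratio S/N, it is given by the Shannon-Hartley formula, quoted here for reference (the notation is the standard one, not the dictionary's):

    C = B \log_2\!\left(1 + \frac{S}{N}\right) \quad \text{bits per second}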
91 Binär-Ausgabeeinheit
f; binary output unit -
92 Binär-Eingabeeinheit
f; binary input unit -
93 десятичная дробь
1. periodical decimal 2. decimal fraction -
94 дробь — fraction
-
95 неправильная дробь — improper fraction
-
96 несократимая дробь — irreducible fraction
-
97 периодическая дробь
1. periodical fraction 2. repeating decimal -
98 подобные дроби — like fractions
-
99 разложение дроби — decomposition of a fraction
-
100 разложение на простые дроби — decomposition into partial fractions
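A worked example of the operation named in the entry above, decomposition into partial fractions; the rational function is chosen purely for illustration:

    \frac{1}{x(x+1)} = \frac{A}{x} + \frac{B}{x+1}, \qquad 1 = A(x+1) + Bx \;\Rightarrow\; A = 1,\ B = -1,

    \text{so}\qquad \frac{1}{x(x+1)} = \frac{1}{x} - \frac{1}{x+1}.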
See also in other dictionaries:
Binary Unit — This is the basic unit of information that computers use, consisting of an electronic signal for 0 or 1. The 0 means off and the 1 means on. Also called Bit … The writer's dictionary of science fiction, fantasy, horror and mythology
Binary prefix — Prefixes for bit and byte multiples. Decimal value (SI): 1000 — k (kilo); 1000² — M (mega) … Wikipedia
Binary economics — is a heterodox theory of economics that endorses both private property and a free market but proposes significant reforms to the banking system. The aim of binary economics is to ensure that all individuals receive income from their own… … Wikipedia
Binary Synchronous Communications — Binary Synchronous Communication (BSC or Bisync) is an IBM link protocol, announced in 1967 after the introduction of System/360. It replaced the synchronous transmit receive (STR) protocol used with second generation computers. The intent was… … Wikipedia
Binary option — In finance, a binary option is a type of option where the payoff is either some fixed amount of some asset or nothing at all. The two main types of binary options are the cash or nothing binary option and the asset or nothing binary option. The… … Wikipedia
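The two payoffs described in the binary-option entry above can be written compactly. For an underlying price S_T at expiry, strike K, and fixed cash amount Q (symbols are mine, not the entry's):

    \text{cash-or-nothing call payoff:}\quad Q \cdot \mathbf{1}_{\{S_T > K\}} \qquad\qquad \text{asset-or-nothing call payoff:}\quad S_T \cdot \mathbf{1}_{\{S_T > K\}}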
Binary icosahedral group — In mathematics, the binary icosahedral group is an extension of the icosahedral group I of order 60 by a cyclic group of order 2. It can be defined as the preimage of the icosahedral group under the 2:1 covering homomorphism Sp(1) → … Wikipedia
Binary tetrahedral group — In mathematics, the binary tetrahedral group is an extension of the tetrahedral group T of order 12 by a cyclic group of order 2. It is the binary polyhedral group corresponding to the tetrahedral group, and as such can be defined as the preimage… … Wikipedia
Binary octahedral group — In mathematics, the binary octahedral group is an extension of the octahedral group O of order 24 by a cyclic group of order 2. It can be defined as the preimage of the octahedral group under the 2:1 covering homomorphism Sp(1) → … Wikipedia
Binary-coded decimal — In computing and electronic systems, binary coded decimal (BCD) is a digital encoding method for numbers using decimal notation, with each decimal digit represented by its own binary sequence. In BCD, a numeral is usually represented by four bits … Wikipedia
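A small Python sketch of the encoding described in the BCD entry above, packing each decimal digit into its own four-bit group; the function names are illustrative, not from any particular library:

    def to_bcd(n):
        """Encode a non-negative integer as packed BCD, four bits per decimal digit."""
        result = 0
        for shift, digit in enumerate(reversed(str(n))):
            result |= int(digit) << (4 * shift)
        return result

    def from_bcd(bcd):
        """Decode packed BCD back to an ordinary integer."""
        n, place = 0, 1
        while bcd:
            n += (bcd & 0xF) * place
            bcd >>= 4
            place *= 10
        return n

    assert to_bcd(59) == 0x59 and from_bcd(0x59) == 59

For example, decimal 59 is stored as the two groups 0101 1001 (0x59), whereas plain binary 59 is 111011.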
Binary logarithm — In mathematics, the binary logarithm (log₂ n) is the logarithm for base 2. It is the inverse function of n ↦ 2^n. The binary logarithm is often used in computer science and information theory (where it is frequently written lg n, or… … Wikipedia
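A concrete instance of the definition in the binary-logarithm entry above, added for illustration:

    \log_2 n = \frac{\ln n}{\ln 2}, \qquad \log_2 1024 = 10 \ \text{because}\ 2^{10} = 1024.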
Binary multiplier — A binary multiplier is an electronic circuit used in digital electronics, such as a computer, to multiply two binary numbers. It is built using binary adders. A variety of computer arithmetic techniques can be used to implement a digital… … Wikipedia
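A sketch of the shift-and-add scheme that such a multiplier circuit implements, written in Python for illustration; in hardware each conditional addition corresponds to one row of binary adders, and the names below are mine:

    def shift_and_add_multiply(a, b):
        """Multiply two non-negative integers by adding a shifted copy of the
        multiplicand for every 1-bit of the multiplier."""
        product, shift = 0, 0
        while b:
            if b & 1:                   # current multiplier bit is 1
                product += a << shift   # add the shifted multiplicand
            b >>= 1
            shift += 1
        return product

    assert shift_and_add_multiply(13, 11) == 143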