
History of computer science


While formal algorithms have existed for millennia (Euclid's algorithm for determining the greatest common divisor of two numbers is still used in computation), it was not until 1936 that Alan Turing, Alonzo Church and Stephen Kleene formalized the definition of an algorithm in terms of computation. Binary and logical systems of mathematics had existed earlier, but it was in 1703 that Gottfried Leibniz formalized logic with binary values for true and false. Likewise, although logical inference and mathematical proof had existed since ancient times, in 1931 Kurt Gödel proved with his incompleteness theorem that there were fundamental limitations on what statements could be proved or disproved.
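For concreteness, here is a minimal Python sketch of Euclid's algorithm as it is typically written today; the function name and the sample inputs are illustrative only.

```python
def gcd(a: int, b: int) -> int:
    # Euclid's algorithm: repeatedly replace (a, b) with (b, a mod b)
    # until the remainder is zero; the surviving value is the GCD.
    while b:
        a, b = b, a % b
    return a

print(gcd(252, 105))  # -> 21
```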
These developments have led to the modern study of logic and computability, and indeed to the field of theoretical computer science as a whole. Information theory was added to the field with Claude Shannon's 1948 paper "A Mathematical Theory of Communication". In the same decade, Donald Hebb introduced a mathematical model of learning in the brain; as mounting biological data supported this hypothesis (with some modification), the fields of neural networks and parallel distributed processing were established. In 1971, Stephen Cook and, working independently, Leonid Levin proved that there exist practically relevant problems that are NP-complete – a landmark result in computational complexity theory.
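As an illustration of the kind of model Hebb proposed, the sketch below applies a plain Hebbian weight update, strengthening connections in proportion to correlated pre- and post-synaptic activity. It is a minimal toy: the learning rate, layer sizes, and random initialization are placeholders chosen here, not details from the original work.

```python
import numpy as np

def hebbian_update(w, x, y, lr=0.1):
    # Hebb's rule in its simplest form: delta w_ij is proportional to y_i * x_j,
    # i.e. connections between co-active units are strengthened.
    return w + lr * np.outer(y, x)

rng = np.random.default_rng(0)
w = rng.normal(scale=0.1, size=(1, 3))  # illustrative initial weights
x = np.array([1.0, 0.0, 1.0])           # pre-synaptic activity
y = w @ x                               # post-synaptic activity
print(hebbian_update(w, x, y))
```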
With the development of quantum mechanics at the beginning of the 20th century came the realization that mathematical operations could be performed on an entire particle wavefunction – in other words, that one could compute functions on multiple states simultaneously. This led to the concept of a quantum computer in the latter half of the 20th century, a field that took off in the 1990s when Peter Shor showed that such methods could be used to factor large numbers in polynomial time, which, if implemented, would render most modern public-key cryptography systems insecure.
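The quantum speedup lives entirely in the period-finding step; the classical reduction from factoring to period finding fits in a few lines. The sketch below substitutes brute-force (exponential-time) period finding for the quantum subroutine and assumes n is an odd composite with at least two distinct prime factors; it is an illustrative reduction, not Shor's algorithm itself.

```python
from math import gcd
from random import randrange

def order(a, n):
    # Brute-force period finding: smallest r > 0 with a**r mod n == 1.
    # This is the step Shor's algorithm performs in polynomial time on a
    # quantum computer; here it is exponential-time classical code.
    r, x = 1, a % n
    while x != 1:
        x = (x * a) % n
        r += 1
    return r

def factor(n):
    # Classical reduction: a nontrivial factor of n falls out of the period
    # of a random base a, provided the period is even and a**(r/2) is not
    # congruent to -1 mod n; otherwise retry with a new base.
    while True:
        a = randrange(2, n)
        g = gcd(a, n)
        if g > 1:
            return g  # lucky: a already shares a factor with n
        r = order(a, n)
        if r % 2 == 0:
            g = gcd(pow(a, r // 2, n) - 1, n)
            if 1 < g < n:
                return g

print(factor(15))  # -> 3 or 5
```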
Modern theoretical computer science research builds on these foundational developments, but also encompasses many other mathematical and interdisciplinary problems that have since been posed.
