Computing Quotes - page 2
Let it be remarked ... that an important difference between the way in which we use the brain and the machine is that the machine is intended for many successive runs, either with no reference to each other, or with a minimal, limited reference, and that it can be cleared between such runs; while the brain, in the course of nature, never even approximately clears out its past records. Thus the brain, under normal circumstances, is not the complete analogue of the computing machine but rather the analogue of a single run on such a machine. We shall see later that this remark has a deep significance in psychopathology and in psychiatry.
Norbert Wiener
When taken as a way of modeling cognitive architecture, connectionism really does represent an approach that is quite different from that of the classical cognitive science that it seeks to replace. Classical models of the mind were derived from the structure of Turing and von Neumann machines. They are not, of course, committed to the details of these machines as exemplified in Turing's original formulation or in typical commercial computers - only to the basic idea that the kind of computing that is relevant to understanding cognition involves operations on symbols. In contrast, connectionists propose to design systems that can exhibit intelligent behavior without storing, retrieving, or otherwise operating on structured symbolic expressions. The style of processing carried out in such models is thus strikingly unlike what goes on when conventional machines are computing some function.
Jerry Fodor
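Fodor's contrast can be made concrete. The sketch below (our illustration, not Fodor's) computes the same Boolean function two ways: classically, as an explicit rule over symbols, and connectionist-style, as a single weighted threshold unit that stores no symbolic expressions at all. The function names, weights, and threshold are invented for the example.

```python
# Illustrative contrast: the same function (logical AND) computed in the
# "classical" style and in the "connectionist" style Fodor describes.

def and_symbolic(p, q):
    # classical style: an explicit rule operating on symbols
    return "TRUE" if (p == "TRUE" and q == "TRUE") else "FALSE"

def and_connectionist(x1, x2, w1=0.6, w2=0.6, threshold=1.0):
    # connectionist style: behavior falls out of weights, not stored rules
    return 1 if (w1 * x1 + w2 * x2) >= threshold else 0

print(and_symbolic("TRUE", "FALSE"))  # FALSE
print(and_connectionist(1, 1))        # 1
```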
Solomon: Your entry in Wikipedia says that your work has inspired many students to begin careers in computing and artificial intelligence.
Hofstadter: I have no interest in computers. The entry is filled with inaccuracies, and it kind of depresses me.
Solomon: So fix it.
Hofstadter: The next day someone will fix it back.
Douglas Hofstadter
A religious college in Cairo is considering issues of nanotechnology: If replicators are used to prepare a copy of a strip of bacon, right down to the molecular level, but without it ever being part of a pig, how is it to be treated? (If the mind of one of the faithful is copied into a computing machine's memory by mapping and simulating all its synapses, is the computer now a Moslem? If not, why not? If so, what are its rights and duties?)
Charles Stross
What we wanted to preserve was not just a good environment in which to do programming, but a system around which fellowship could form. We knew from experience that the essence of communal computing, as supplied by remote-access, time-shared machines, is not just to type programs into a terminal instead of a keypunch, but to encourage close communication.
Dennis Ritchie
I started keeping a list of these annoyances but it got too long and depressing so I just learned to live with them again. We really are using a 1970s era operating system well past its sell-by date. We get a lot done, and we have fun, but let's face it, the fundamental design of Unix is older than many of the readers of Slashdot, while lots of different, great ideas about computing and networks have been developed in the last 30 years. Using Unix is the computing equivalent of listening only to music by David Cassidy.
Rob Pike
On a related topic, let me say that I'm not much of a fan of object-oriented design. I've seen some beautiful stuff done with OO, and I've even done some OO stuff myself, but it's just one way to approach a problem. For some problems, it's an ideal way; for others, it's not such a good fit. [...] OO is great for problems where an interface applies naturally to a wide range of types, not so good for managing polymorphism (the machinations to get collections into OO languages are astounding to watch and can be hellish to work with), and remarkably ill-suited for network computing. That's why I reserve the right to match the language to the problem, and even - often - to coordinate software written in several languages towards solving a single problem. It's that last point - different languages for different subproblems - that sometimes seems lost to the OO crowd.
Rob Pike
The most important application of quantum computing in the future is likely to be a computer simulation of quantum systems, because that's an application where we know for sure that quantum systems in general cannot be efficiently simulated on a classical computer.
David Deutsch
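The standard counting argument behind Deutsch's claim fits in a few lines. This back-of-the-envelope sketch (ours, not Deutsch's) uses the fact that a general state of n qubits requires 2^n complex amplitudes, so exact classical simulation needs exponentially growing memory; the byte figures assume 16 bytes per complex amplitude.

```python
# Memory needed to store a general n-qubit state vector classically:
# 2**n complex amplitudes, here costed at 16 bytes (complex128) each.
for n in (10, 30, 50):
    print(f"{n} qubits: 2**{n} = {2**n:,} amplitudes "
          f"({16 * 2**n:,} bytes)")
```

At 50 qubits the state vector alone is about 18 petabytes, which is why simulating quantum systems is itself the natural quantum-computing application.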
If an a-machine prints two kinds of symbols, of which the first kind (called figures) consists entirely of 0 and 1 (the others being called symbols of the second kind), then the machine will be called a computing machine.
Alan Turing
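A minimal sketch of such a computing machine, in Python: it reproduces the behavior of Turing's first example in "On Computable Numbers" (1936), which prints the figures 0 and 1 on alternating squares, i.e. computes the sequence 010101... The state labels and dict-based encoding here are our assumptions, not Turing's notation.

```python
# transition table: state -> (figure to print or None, head move, next state)
TABLE = {
    "print0": ("0", +1, "skip0"),
    "skip0":  (None, +1, "print1"),
    "print1": ("1", +1, "skip1"),
    "skip1":  (None, +1, "print0"),
}

def run(steps=8):
    tape, head, state = {}, 0, "print0"
    for _ in range(steps):
        figure, move, state = TABLE[state]
        if figure is not None:
            tape[head] = figure   # print a symbol of the first kind
        head += move
    return "".join(tape.get(i, "_") for i in range(max(tape) + 1))

print(run())  # -> 0_1_0_1  (figures on alternate squares)
```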
In the post-Snowden world, you need to enable others to build their own cloud and have mobility of applications. That's both because of the physicality of computing - where the speed of light still matters - and because of geopolitics.
Satya Nadella
Nobody deserves to have to die - not Jobs, not Mr. Bill, not even people guilty of bigger evils than theirs. But we all deserve the end of Jobs' malign influence on people's computing.
Richard Stallman
You and I, we exist for ourselves, fundamentally. We should care about others, but each human being is a source of value; each human being deserves things. And so if you lose control over your computing, that's bad for you, directly bad for you. So my first reaction is to say: Oh, what a shame; I hope you recover control over your computing, and the way you do that is to stop using the non-free software.
Richard Stallman
More computing sins are committed in the name of efficiency (without necessarily achieving it) than for any other single reason - including blind stupidity.
William Wulf
We must not forget that it is not our [computing scientists'] business to make programs, it is our business to design classes of computations that will display a desired behaviour.
Edsger W. Dijkstra
For me, the first challenge for computing science is to discover how to maintain order in a finite, but very large, discrete universe that is intricately intertwined. And a second, but not less important challenge is how to mould what you have achieved in solving the first problem, into a teachable discipline: it does not suffice to hone your own intellect (that will join you in your grave), you must teach others how to hone theirs. The more you concentrate on these two challenges, the clearer you will see that they are only two sides of the same coin: teaching yourself is discovering what is teachable.
Edsger W. Dijkstra
Industry suffers from the managerial dogma that for the sake of stability and continuity, the company should be independent of the competence of individual employees. Hence industry rejects any methodological proposal that can be viewed as making intellectual demands on its work force. Since in the US the influence of industry is more pervasive than elsewhere, the above dogma hurts American computing science most. The moral of this sad part of the story is that as long as computing science is not allowed to save the computer industry, we had better see to it that the computer industry does not kill computing science.
Edsger W. Dijkstra
It is time to unmask the computing community as a Secret Society for the Creation and Preservation of Artificial Complexity.
Edsger W. Dijkstra
Computing spread out much, much faster than educating unsophisticated people can happen. In the last 25 years or so, we actually got something like a pop culture, similar to what happened when television came on the scene and some of its inventors thought it would be a way of getting Shakespeare to the masses. But they forgot that you have to be more sophisticated and have more perspective to understand Shakespeare. What television was able to do was to capture people as they were. So I think the lack of a real computer science today, and the lack of real software engineering today, is partly due to this pop culture.
Alan Kay
In the early days of modern computing - the 40s, 50s and 60s - computing was a priesthood. Only a few were allowed to commune directly with the machine; all others would give their punched card offerings to the anointed, who would in turn genuflect before their card readers and perform their rituals amid the flashing of lights, the clicking of relays, and the whirring of fans and motors. If the offering was well-received, the anointed would call the communicants forward and in solemn silence hand them printed manuscripts, whose signs and symbols would be studied with fevered brow.
Grady Booch
I have discovered that there are two types of command interfaces in the world of computing: good interfaces and user interfaces.
Daniel J. Bernstein
The world got enamored with smartphones and tablets, but what's interesting is those devices don't do everything that needs to be done. 3-D printing, virtual-reality computing, robotics are all controlled by PCs.
Michael Dell
In my essay I refuted the Hayek-Robbins argument by showing how a market mechanism could be established in a socialist economy which would lead to the solution of the simultaneous equations by means of an empirical procedure of trial and error. [...] Were I to rewrite my essay today my task would be much simpler. My answer to Hayek and Robbins would be: so what's the trouble? Let us put the simultaneous equations on an electronic computer and we shall obtain the solution in less than a second. The market process with its cumbersome tatonnements appears old-fashioned. Indeed, it may be considered as a computing device of the preelectronic age.
Oskar R. Lange
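Lange's "put the simultaneous equations on an electronic computer" is now literally a one-liner. A hedged sketch with made-up illustrative coefficients (nothing here is from Lange's essay): the planner's simultaneous equations become a linear system solved directly, with no market tatonnement.

```python
import numpy as np

# A toy 3-sector system: A encodes how outputs draw on inputs,
# b the required net outputs. All numbers are invented for illustration.
A = np.array([[2.0, 1.0, 0.0],
              [1.0, 3.0, 1.0],
              [0.0, 1.0, 2.0]])
b = np.array([5.0, 10.0, 7.0])

x = np.linalg.solve(A, b)  # the whole "trial and error" process in one call
print(x)
```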