Can we see the brain as an analogue computer?
Douglas R. Hofstadter sees the brain as a computer. And I think he means a kind of digital machine, as pulses of electrical potential are running around in there like in hot hell. Large collective parallel concerts are continuously whirling around in the safe shelter of the skull, in continuous contact with the body and the outside world. You could say the processes going on started with the big bang, which could be one of an infinite array. Right now a new universe might just have formed from a big bang behind me, though this will probably only happen if this universe has experienced something like a big rip.
Now when you examine the brain more closely, you find it functions completely differently from an electronic digital computer. The pulses of electricity are not pushed by an external voltage, there is no external program let loose on the pulses of information, there is no separation between program and data, there is massive parallelism, the neurons have erratic lightning-flash forms, etc.
Can we say the brain is an analogue computer, able to simulate all physical processes in the world, even a lightning flash?
Comments (194)
It's one thing that we tend to find it easy to think of things in a computational manner (on/off, etc.). It's another thing to say that what we are describing is actually computational in nature. People do computations, but they do many other things as well.
I can imagine a supernova, let alone a lightning flash. I have curiosity. For instance, when I wrote this response, I wrote "lightening"; that's what I say when I describe an electrical discharge in the clouds. Then I noticed how you spelled it. "Hmmm," I thought; "which one is correct?" It turns out your spelling is correct. "Lightening" is the present participle of lighten, to reduce the darkness of something. All these years (I'm 75) I've been saying it wrong.
A computer cannot ask itself whether it is right or wrong, and cannot 'feel' anything, either way.
[quote]But here is what we are not born with: information, data, rules, software, knowledge, lexicons, representations, algorithms, programs, models, memories, images, processors, subroutines, encoders, decoders, symbols, or buffers – design elements that allow digital computers to behave somewhat intelligently. Not only are we not born with such things, we also don’t develop them – ever.
We don’t store words or the rules that tell us how to manipulate them. We don’t create representations of visual stimuli, store them in a short-term memory buffer, and then transfer the representation into a long-term memory device. We don’t retrieve information or images or words from memory registers. Computers do all of these things, but organisms do not.[/quote]
There can be doubt, philosophically, that we even truly know we have brains. (Though I admit we do have smarts.)
Um...I think we need to remember, though, that we're not just brains -- we're these whole bodies, we really are. The fact that we are bodies is and isn't contained in the brain. And if the body is a computer, it's a robot.
But since we imagine robots with free will, we should give ourselves the benefit of the doubt that our inner experience is richer than any mundane vision of mechanism -- it's not just wholly expendable, it's worth something more than any sum of its parts. That is, if one chooses to define the parts as ordinary...
I certainly am not an expert on this, but it is my understanding that an analog computer works by modelling a specific process using an analogous process, e.g. the flow of water modelled using the flow of electricity. I don't think a single analog set-up is capable of doing multiple sets of calculations the way a digital computer can using software. Analog computers are much more limited than digital ones are.
Someone please correct the errors in my understanding.
We don’t develop rules, knowledge, lexicons and representations?
"I'm afraid the brain has to be a digital computer, "
Why are you afraid? It can be an analogue one too.
"modelling a specific process using an analogous process, e.g. the flow of water modelled using the flow of electricity. I don't think a single analog set-up is capable of doing "
Exactly.
"not a mechanical one."
Why not?
This!
The article does well in showcasing how the narrative and context of our time change the way we see ourselves. However, discrediting such a brain-computer metaphor is not necessary in my opinion. Here's the deal: it's all about language.
The metaphors that are used might not be accurate - but they're not exactly wrong either about the process they're trying to describe. I'd say they're simply appropriate to the time and place in which they came to be used. They're the best that the language, powered by the knowledge of that time, had to offer. It's simply that the brain works in reference and comparison - it cannot evaluate something which is completely unknown, which it has no comparable experience of. So it makes sense to me that we'd use our newest and most complex framework and the language it provides to describe the processes of our brain.
It's perfectly viable, then, to break down the brain into computer lingo. Your brain might not work exactly like a computer, but a computer has enough complexity to reflect the workings of the brain. It's not just that either: the thing is that programming languages are just that - languages. And they're constructed in such a way that you can express any process, similar to how we use our own language every day.
Quoting Bitter Crank
Let's take this as a little example:
1. Brain continuously runs a thread (process) called "SensoryInput".
2. The SensoryInput thread recognizes a new object of the type "Text".
3. Every letter, every word, as well as the whole sentence, gets passed by SensoryInput to a function named "compare".
4. The compare function checks the object of type Text with the internal reference archive. If a word is in that archive, we'll have a meaning to it. In the same way, this is where our spelling would be stored. In this case, we realize the difference between our own internal archive and the text we have as input (lightning/lightening).
5. The compare function, having found a discrepancy, calls the function "check". We look up the correct spelling for the word lightning.
6. We see that our spelling of lightning was incorrect. The function check calls the function "learn" and updates our internal archive with the correct spelling.
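Steps 4-6 of the list above can be sketched in a few lines of Python (a toy illustration; the `archive` and `corrections` tables, and the function names, are made up for the example, not a claim about how the brain stores anything):

```python
# Toy sketch of steps 4-6: an internal "archive" of known words, a
# compare step, and a learn step that updates the archive.

archive = {"lightning": "an electrical discharge in the clouds"}

def compare(word):
    """Step 4: check the input word against the internal reference archive."""
    if word in archive:
        return archive[word]          # known word: we have a meaning for it
    return check(word)                # discrepancy: hand over to 'check'

def check(word):
    """Step 5: look up the correct form (here: a made-up corrections table)."""
    corrections = {"lightening": "lightning"}
    correct = corrections.get(word)
    if correct is not None:
        return learn(word, correct)
    return None                       # genuinely unknown word

def learn(wrong, correct):
    """Step 6: update the internal archive with the corrected spelling."""
    archive[wrong] = f"misspelling of '{correct}'"
    return archive[wrong]
```

After `compare("lightening")` runs once, the misspelling is in the archive, so the discrepancy is only "noticed" the first time, much like in the story above.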
"Your brain might not work exactly like a computer but a computer has enough complexity to reflect the workings of the brain"
Not exactly? Exactly not! Enough complexity? Not even a zillionth!
The thing is, we have no access to how our brain produces "us". Most of the brain's activities are sub- or non-conscious. The conscious mind is cut out of all the internal traffic -- or at least, 99.9%. This isn't a defect--it's a feature. It enables us to attend to what we wish to attend to consciously, or what is forced upon us. We don't have to deal with both the action on the baseball diamond and the details of digestion, proprioception, and keeping our heart and lungs operating all at the same time.
We can observe something about the brain with EEGs and fMRIs, which are still a very far cry from observing our brain producing "us".
Can't I upvote your comment?:smile:
The brain receives a lot of input all the time, from internal and external sources. A lot of what passes through seems to be unexamined. Sometimes it's appalling how much, sometimes important stuff, can pass along this flow of information and not be noticed. I didn't notice the revolution, but suddenly noticed that dog-shaped cloud.
There's nothing wrong with your flow chart; it looks like what you would put together if you were applying yourself to writing an input/output program. Maybe you composed the list using memory plus imagination. I bet you have no knowledge of how the brain actually delivered that material to your fingers so you could type it. I can't explain what part of my brain is generating the text I am now typing. (Sure, the language production area; saying so comes out of memory not experience.)
User Illusion
[quote=Wikipedia]The user illusion is the illusion created for the user by a human–computer interface, for example the visual metaphor of a desktop used in many graphical user interfaces. The phrase originated at Xerox PARC.
Some philosophers of mind have argued that consciousness is a form of user illusion.[/quote]
Was it Sigmund Freud (I can't recall) who used the iceberg analogy for consciousness? It basically states that what we're aware of consciously is only 10% of all the brain processing at any one time.
N.B. Only 10% of an iceberg is visible above the water. The rest, a huge 90%, is underwater.
See:
Quoting Hermeticus
Quoting Bitter Crank
It's true that the process of the brain is intangible.
As the Upanishads put it:
"You cannot see That which is the Seer of seeing; you cannot hear That which is the Hearer of hearing; you cannot think of That which is the Thinker of thought; you cannot know That which is the Knower of knowledge. This is your Self, that is within all; everything else but This is perishable."
This stands true from an intellectual and psychological point of view. However, through science, we can now take a physical look at the brain. For instance, we now know about neurotransmitters and synapses. We know synapses have a sort of threshold: a signal has to reach a certain strength to cross the synaptic cleft and reach either the next neuron or a cell itself. This is precisely what you're describing:
Quoting Bitter Crank
It's not just that stimuli seem to be unexamined. Some certainly are, because the sensory input isn't strong enough to pass this "presynaptic evaluation". In recent years it has even come to light that this process is a two-way street: based on experience, the brain will adjust these synaptic thresholds. The brain programs its own operation based on what it experiences.
In this sense, our thinking apparatus can be equated to a self-improving algorithm.
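As a toy sketch of that idea (purely illustrative; the numbers and the update rule are invented, not actual synaptic biochemistry): a synapse that passes a signal only above a threshold, and shifts that threshold with experience.

```python
# Toy model: a synapse passes a signal only if it exceeds a threshold,
# and the threshold adapts based on what it experiences.
# (Illustrative only; real synaptic plasticity is far more complex.)

class Synapse:
    def __init__(self, threshold=0.5, rate=0.1):
        self.threshold = threshold
        self.rate = rate              # how fast experience shifts the threshold

    def fire(self, strength):
        """Signal passes only if it exceeds the current threshold."""
        passed = strength > self.threshold
        # The "two-way street": strong input lowers the threshold
        # (sensitisation); weak input raises it (habituation).
        if passed:
            self.threshold -= self.rate * (strength - self.threshold)
        else:
            self.threshold += self.rate * (self.threshold - strength)
        return passed
```

Run the same stimulus through repeatedly and the synapse's own behaviour changes: the "program" is nothing but the current state of the threshold.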
Quoting Bitter Crank
It's like this: I can't actually have a peek into the "source code" but by comparing my inputs with the results that they deliver, over time, I can make rather accurate guesses as to how that source code might look.
Ultimately my point is: No, our brains are not exactly computers. But much of the processes are comparable. And I reckon it's fair to describe the happenings in technological terms for the sake of discussion. It's just a matter of taste. All words in the end are an attempt to infuse a certain meaning into communicational interaction.
But the question is: is it an analogue one, as I think it is?
Truth be told, our brains seem capable of grasping multivalent and fuzzy logic and that seems to suggest some kind of analog nature to it. However, even digital computers seem as able, which undermines any conclusions that our encephalon is analog in form and/or function.
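That digital machines can handle fuzzy, multivalent logic is easy to show; here is a minimal sketch using the standard min/max fuzzy operators (the "tall" membership function is a made-up example):

```python
# Digital hardware handling fuzzy (multivalent) logic: truth values are
# degrees in [0, 1] rather than just 0 or 1 (standard Zadeh operators).

def fuzzy_and(a, b):
    return min(a, b)

def fuzzy_or(a, b):
    return max(a, b)

def fuzzy_not(a):
    return 1.0 - a

# Made-up membership function: to what degree is a person "tall"?
def tall(height_cm):
    if height_cm <= 160:
        return 0.0
    if height_cm >= 190:
        return 1.0
    return (height_cm - 160) / 30.0
```

So handling graded truth values says nothing, by itself, about whether the underlying machine is analog or digital.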
" seem as able"
Indeed, seem. The brain functions on completely different principles from those of a digital computer. If I think of a chess board, there is literally an analogue pattern in the brain.
...not quite. Neurons do indeed tend to "fire" or "not", but they quite often fire at frequencies which can increase and decrease. Furthermore, neural firings are affected by their local chemistries, and there's a lot of chemistry going on in the brain. Said chemistries often work at the level of individual gates on the neurons affecting the relative concentration of ions across the cell barrier. It's overly simplistic to conclude that neurons must be digital just because they fire all or none.
It has nothing to do with simplicity.
An analog computer uses an analog signal. It means that I have a continuous input in some shape and form - be it a mechanical motion or an electric current.
A digital signal on the other hand is a sequence of isolated values within a certain range. Neurons must be digital precisely because they fire all or none. It's binary. It falls under the definition of a digital signal.
Here's the difference visualized:
Neurons firing and changing in frequencies only means just that: a sequence. A pattern. Do my senses communicate to me "00100100" or "10101010"?
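The difference can also be put in code (a sketch; the sine wave and the single 0/1 threshold are just illustrative choices): an analog signal is a continuous function of time, while a digital one is that signal sampled at discrete instants and reduced to discrete values.

```python
import math

# An analog signal is continuous in time and amplitude; a digital one
# is the same signal sampled at discrete instants and quantized to a
# fixed set of levels (here just 0 and 1).

def analog(t):
    """Continuous signal: defined for every real t."""
    return math.sin(2 * math.pi * t)

def digitize(signal, n_samples, threshold=0.0):
    """Sample at discrete instants and reduce each sample to 0 or 1."""
    return [1 if signal(i / n_samples) > threshold else 0
            for i in range(n_samples)]
```

The analog function carries information in its exact value at every moment; the digitized list carries only a sequence of isolated symbols.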
Quoting InPitzotl
Now this is where the interesting things happen. Neurons just go "true" or "false". But these "gates" decide what to do with the true or false. This is where the programming, the algorithm of the brain, lies.
"Now this is where the interesting things happen. Neurons just go "true" or "false". But these "gates" decide what to do with the true or false. This is where the programming, the algorithm of the brain, lies."
There is no programming, in the sense of a program running somewhere in other parts of the brain that directs the flow of information, as in a digital computer. The information just follows a path of least resistance, and this resistance is contained in the connection strengths between neurons.
A further difference is that the 1s and 0s in a computer are driven by an external potential, a voltage and current source. The potential peaks traveling in the brain don't need these (there are just out- and inflows of positive ions through ion channels along the dendrites). A totally different mechanism.
There are no machines in nature. This point was proved decisively when Newton discovered action at a distance, which shows that there need not be physical contact between objects for there to be motion.
Prior to that, it was thought that the world was essentially a giant clock, working in essentially a mechanistic manner.
Now with the newer physics, the world is even less machine like than ever. I don't see what is particularly computer like about the brain. People can "compute" things, and do many other things as well.
"There are no machines in nature. This point was proved decisively when Newton discovered action at a distance."
Isn't the connected couple of human body and brain (emerged in Nature, insomuch as you can still speak of Nature) a machine? Not one made by humans, but one that just evolved slowly. I don't see why action at a distance proves there are no machines. Action at a distance is matter "dominated". What forms the content of matter will always remain a mystery. Though we can feel it!!!
Machines are something we attribute to nature. They aren't found in nature.
Action at a distance disproved the world is machine because mechanistic materialism thought the world worked like giant clock, based on contact mechanics. The idea behind this being "if we can build it, we can understand it."
But it turns out the world does not work like a giant clock: there can be action without direct mechanistic contact.
But many people still stipulate that the brain is a machine, or aspects of the universe a machine. But they aren't. Machines happen to be a common way people think about things.
Chomsky and E.A. Burtt go over this history quite thoroughly. It can be found to some extent in Russell too. The brain aspect is covered quite well by Tallis.
Isn't the planetary system like a giant clockwork, without the planets being in contact? In that respect, my brain looks more like a clockwork. The BIG difference being that there is no spring (the voltage source in digital computers) driving the parts.
It depends how you view a machine. It is not necessarily something made from parts and subsequently put in motion. People can't construct people. The brain is not set in motion at some point.
And does the brain not direct the flow of information? The programming in this sense is the brain's chemistry. Sensory input is received, a neuron fires a signal, the synapse checks if the condition to jump this particular gap is true or false and, if true, lets the signal pass on to a cell. The cell then, in accordance with the received signal, runs a function of its own.
My point again: It's all just a metaphor anyway. A brain is a brain. A computer is a computer. I do believe the mechanics are quite similar. But even if they weren't I could construct a metaphor of it being so because programming languages were made for exactly that: describing any process.
Quoting Prishon
Also, an external power source is not a criterion for being a digital system. It depends on the signal structure, as I explained above. Furthermore, there is no such thing as an internal power source. Yes, our biology does create energy - but why do you think you have to take in sustenance to stay alive?
And does the brain not direct the flow of information?
The brain only directs the body. And the external world and body direct the brain. The information just flows, like lightning flows to Earth. No program involved. The fact that there are pulsed potentials flowing (very differently than in computers) can of course be compared with the zeros and ones in computers. Coincidentally, in digital computers and in the brain there are pulsed signals to be seen; that's why the comparison is made. It turns out that working with ones and zeros (which are defined much more strictly for computers than for brains; computer ones and zeros are abstractions of those occurring in the brain) is a good way to represent or simulate physical events.

In a computer the 1s and 0s are computed: a program is let loose on them to compute their next state. In a brain that's not happening. The information (the brain's potential spikes) flows freely in the neuronal medium, with no program computing its new state. The information just follows paths of least resistance, just like real physical processes flow.
Because of the huge number of connections, a truly cosmic number of paths can be followed (a one with a billion zeros behind it!), thereby making it possible to represent, in analogue form, all physical processes in the universe.
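A back-of-envelope calculation supports the scale of that claim (assuming, as a commonly cited ballpark rather than a measured figure, around 10^14 synaptic connections, and treating each as merely on or off):

```python
import math

# With roughly 10**14 synaptic connections (an assumed ballpark) and
# each treated as merely on or off, there are 2**(10**14) possible
# configurations. The number of decimal digits of 2**n is about
# n * log10(2), so this number has around 3 * 10**13 digits,
# far more than "a one with a billion zeros behind it".

connections = 10 ** 14                # assumption, not a measurement
decimal_digits = int(connections * math.log10(2))
```

Even this undercounts, since real connections have graded strengths rather than just on/off states.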
Exactly.
But that's the point, how one views a machine. We project it into nature. It doesn't follow that our projections are correct. More often than not, they're incorrect, so far as the nature of the world (including the brain as part of the world) is concerned.
What is the correct view on the human machine?
Good for him! :smile:
First of all, computers work on a digital basis. The brain, on the other hand, is neither analog nor digital. It works using signal processing. It receives and transmits signals.
The computer consists of digital circuits, which are used to create combinational logic and perform functions based on boolean logic. Does anything of this remind you of the brain? :smile:
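For concreteness, this is what "combinational logic from boolean functions" amounts to (a sketch, with gates written as plain functions and everything built from NAND alone):

```python
# Combinational logic in miniature: every digital circuit can be built
# from NAND gates alone. Inputs and outputs are the bits 0 and 1.

def NAND(a, b):
    return 1 - (a & b)

def NOT(a):
    return NAND(a, a)

def AND(a, b):
    return NOT(NAND(a, b))

def OR(a, b):
    return NAND(NOT(a), NOT(b))

def XOR(a, b):
    # Standard four-NAND construction of exclusive-or.
    n = NAND(a, b)
    return NAND(NAND(a, n), NAND(b, n))
```

Whether anything in the brain composes like this, gate by gate, is exactly what is in dispute.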
Sorry, but the comparison is totally unsuccessful. :sad:
That's my idea too. Mister Hofstadter is a computer nerd himself. That's why he looks at it in a rather biased way. Everywhere signals are traveling, he sees a digital framework. :smile:
Which is not to say that no analogue of outside processes can be found. I once saw the squares of a chess board literally light up on neural structures. There are almost infinite patterns to be made on the neural network.
Certainly.
Quoting Prishon
Interesting! Neural networking is also a hot subject in artificial intelligence!
As for chess, although computers (chess programs) can beat good chess players --IBM's Deep Blue chess-playing computer even beat the world champion Garry Kasparov, back in 1997-- they lack an important thing the human brain (actually, "mind") has: imagination!
Cute picture. So, your digital picture shows 101010101010?
Or does it show 110011001100110011001100?
Or is it 111000111000111000111000111000111000?
Or are you making unwarranted assumptions?
"important thing the human brain (actually, "mind") has: imagination!"
That is very true (not true but very true, whatever I mean by that...). A computer like good old Deep Blue can't envision moves or experience smoke blown in its eyes! :smile:
Quoting Alkis Piskas
I think computers owe their AI mainly to speed. The neuron firings in the brain can't compete with the computer clock. Nevertheless, the processes in the brain are much more complicated.
Exactly.
I find the "brain as computer" metaphor as useful as everyone else. But a metaphor is "a figure of speech in which a word or phrase is applied to an object or action to which it is not literally applicable."
Indeed. Computers are the very model of unimaginative idiot savants.
Another huge thing the partisans of "brain as computer" do not account for is that the brain is flesh. Animals are embodied, and the body is subject to all sorts of gross and subtle influences of which silicon circuits know nothing. Cool, fresh air and bright sunshine in the morning can make one feel glad to be alive. A beer or three can smooth out the rough edges of reality for a while. The prospect of great sex can organize one's whole day. All that and much more because we are flesh. Computers have no bodies. Brains are always part of a body -- even in the case of C. elegans, where the "brain" is composed of 302 neurons.
I often like to remind people that before "computer" was a type of machine you bought with a keyboard and monitor, it was a job title.
Can we see the human eye as a video camera? the leg as a kickstand? the skull as a hat rack?
Tools such as the computer can at best mimic the activity of the human body, but are never accurate representations of it. The question should be the other way about: is the computer a brain? The answer is always no.
That could very well be. If it is the unconscious mind that does all the work, it would need an interface. But then the question arises, Who uses the interface the unconscious mind creates? Is the ego part of that interface, and the part that interacts with other egos' interfaces?
You could, but it would be question-begging to say that it is a simulation without any justification. The world in our heads need not be anything like the world outside of it.
The interface IS the ego
If I think of a face, there is an analogue structure visible in the brain.
:smile:
Re "AI": Yes, speed and also storage capacity (e.g. big data). Human memory capacity looks tiny compared to it! And, although thinking works at the speed of light (maybe faster), retrieval from memory takes "eons" compared to retrieval using AI techniques.
Re "neuron firings": They must also be quite fast, but I guess thought (as a thinking process) is much faster and, of course, much more complicated. Although brain waves can be traced and measured, I don't think that thought can be (or ever will be).
You can think of every process in the universe. The memory capacity of the brain is larger than that of all computers together.
Don't be so ready to belittle them! Computers can do a million things better and faster than us! :smile:
Besides, they are extensions of our mind. In fact, where would we be today without them?
(BTW, you are "talking" to a computer programmer, who refuses to work with idiots! :grin:)
Quoting Bitter Crank
Just forget about them ... They certainly don't know what a computer and/or the brain is, and they certainly ignore the mind. Because, if you can compare computers with something human, that would be the mind, not the brain. Totally different things! (But that's another story ...)
Good one! :lol:
Quoting Prishon
What kind of "process"? Anyway, it doesn't really matter. I am talking from a practical point of view. I will give you a very simple example.
Just write a totally random (i.e. not structured or formed in any way) 16-digit number, as quickly as possible --no extra processing, no mnemonics or other tricks-- in a text processor, save and close the file. Then try to remember it. Most probably you can't. (Except maybe if you are a mnemonist! :smile:) Then open the file with the written number. There it is! Instantly!
You can now see how impotent your memory capacity and retrieval process are compared to those of a computer!
Potentially, you can remember the sequence (though it is highly unlikely the sequence is truly random, but that's another issue). The problem with computer memories is that if one memory spot is occupied, it can't participate in a memory for another object. That's where neurons come in handy. :smile:
Yes, I got that you are talking theoretically (because this is what "potentially" implies). That's why I gave you a practical example, one that can be applied to life.
Besides, since you refer to potentiality, whatever physical information a human being can receive from the environment can be stored either on a huge computer disk (... "potentially"! :smile:) or, more realistically, distributed across smaller ones. But even a relatively small disk can already contain much more than what a human being can remember.
Only intangible, non-physical, things (feelings, emotions, abstract ideas etc.) cannot be stored as such, i.e. exactly how they are felt or thought of by the human being, but this refers to a spiritual reality, out of the present frame of reference.
In general, whenever you are talking about data (information), you must remember that this is the realm and "reason of existence" of computers! This is why we have built them. We wouldn't have had to if we could do all the computations and remember everything ourselves ...
There is, however, something else that occurred to me just a little while ago: there's a belief that a person's memory actually contains everything that has ever been registered into it, and that it's only the recollection of the information that is at fault or deficient. Yet this is certainly theoretical, as it has never been proven, and, most importantly, even if it had been, it would be of use only if we were able to develop reliable and efficient memory-recovery techniques, something like hypnosis, but more scientific and reliable.
Well, we have gone astray of the topic, the subject of which is proven (at least by me) to be baseless, anyway! :smile:
I couldn't remember anymore what the topic was. I haven't looked yet, but I guess it to be information... :smile:
If you consider computers to be built because we want to store information, yes. A memory spot can contain the information of the places of black dots on a white board. All configurations contain the same amount of information (the number of bits in the memory chip), but every memory is different. They are explicitly present on a memory chip. All of them can be present in a part of your brain too, the difference being that the neurons involved are the same, while on a chip the different memories are stored in separate parts. :smile:
Exactly! It's what I mentioned at the end of my previous message! :grin:
(The topic was --and still is-- "Can we see the brain as an analogue computer?") It is good to write it down every now and then in these long discussions, so that we become aware that the wind has made the sailing boat drift away from its course! However, uncontrolled (unmonitored) discussions like these open up new, interesting and juicy subjects! And fun too! :grin:)
Actually, computers were created for computations (as the word itself implies). The first computers had a very small memory capacity. You used them exclusively for solving problems, demonstrations and that sort of thing. (I had worked with one such computer!)
Man, this site is a true relief! It almost brings tears to my eyes! What a difference from Philosophy Stack Exchange! I got kicked out there by some crazy mods... Ah, I see you made another comment. I'll read!
It's time for me to mention that all this stuff about memory, neurons, cells, etc. is kind of "floating on air". There's still no definite proof that memory is part of the brain. (I say "still" because scientists continue to change, every now and then, both the supposed location of memory and its functioning. I've been watching this serial since the early 70s ...) Much less has it been proven that thinking and consciousness are products of the brain, as most scientists still believe (mainly because, as pure materialists, they can't figure out where else these could be!). That's why I believe it is safer to use the word "mind" instead of "brain" (both for those who identify it with the brain and for the others, like myself, who believe these are two different things). And certainly avoid talking about "neurons"! I don't know exactly how they function, but I know that they work for receiving and transmitting signals, and this is more or less what the brain does. There may also be some kind of "memory" that has to do exclusively with the body and is located in cells other than neurons or in other specific parts of the brain, but I cannot tell. I was never much interested in finding out!
I don't know about that, never subscribed to it, but I have been to a couple of other philosophical forums and they suck big time! I can openly say that TPF is the best by far!
What scientists overlook (I don't consider myself one, although I know a lot about "the hardest" of them all, elementary particles and quantum fields in curved spacetimes, which made me realize there is more than matter and space only) is the content of matter. There is no memory like in computers (which is one of the reasons I consider the brain an analogue computer, although literally no computation takes place). The connection strengths between neurons (looking materialistically now) determine how the patterns flow freely, unforced by an external potential source. On the same set of interconnected neurons, many patterns can flow.
I really can't say anything more about neurons than what I have already said.
Quoting Prishon
OK, but as we have already established there's no analogue computer. So any comparison with human memory falls apart, doesn't it?
Quoting Prishon
Isn't this one more reason for not comparing the mind with a computer?
The definition of an analogue computer is not a device that actually computes (this makes the term confusing indeed). It is a device in which a process analogous to some external process takes place. Like the analogue computer used to represent a drop falling from a tap (chaos can appear): there is a changing electric current representing the drops, not a current based on 1s and 0s as in a digital computer.
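A crude digital imitation of that dripping-tap setup can give a feel for the kind of process being represented (a toy simulation, not the actual analogue circuit; all parameters are invented, loosely inspired by the classic mass-on-a-spring drip models): the hanging drop is treated as a mass on a spring that grows until part of it detaches.

```python
# Toy dripping-tap model (assumed parameters, purely illustrative):
# the drop is a mass on a spring; mass grows at a constant rate, and
# when the drop stretches past a critical point part of it detaches.

def drip_times(flow=1.0, k=10.0, g=9.8, y_crit=1.0, loss=0.7,
               dt=0.001, t_end=20.0):
    m, y, v, t = 0.1, 0.0, 0.0, 0.0
    times = []                      # moments at which a drop detaches
    while t < t_end:
        a = g - (k * y) / m         # gravity pulls down, spring pulls back
        v += a * dt                 # crude explicit Euler step
        y += v * dt
        m += flow * dt              # the tap keeps feeding the drop
        if y > y_crit:              # drop detaches: lose part of the mass
            times.append(t)
            m *= (1.0 - loss)
            y, v = 0.0, 0.0
        t += dt
    return times
```

The point of the analogue machine is that the changing current itself plays the role this loop only imitates step by step.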
Quoting Prishon
OK, maybe you mean this:"An analog computer or analogue computer is a type of computer that uses the continuously variable aspects of physical phenomena such as electrical, mechanical, or hydraulic quantities to model the problem being solved. In contrast, digital computers represent varying quantities symbolically and by discrete values of both time and amplitude." (Wikipedia)
If this is so, then I have wrongly interpreted your subject. My bad! See, we are often biased by the things we know best or the words/terms that are more commonly in use, and in this case it is "digital computers". Digital computers --or just "computers"-- are so involved in our lives that they almost hide any chance that there's some other kind of computer! At least, this is what happened to me. I'm sorry.
Bad luck again, though! I am not good at and know very little about mechanics in general, so I can't help here either (as I couldn't with neurons)! So there's a chance that the brain works in a somewhat similar way to analogue computers as defined above.
But wait a min! There's a branch in artificial intelligence called "Neural Networks". There may be some similarity between them, which refer to a digital world, and the brain. This is again from Wikipedia:
"A neural network is a network or circuit of neurons, or in a modern sense, an artificial neural network, composed of artificial neurons or nodes. Thus a neural network is either a biological neural network, made up of biological neurons, or an artificial neural network, for solving artificial intelligence (AI) problems. The connections of the biological neuron are modeled in artificial neural networks as weights between nodes. A positive weight reflects an excitatory connection, while negative values mean inhibitory connections. All inputs are modified by a weight and summed. This activity is referred to as a linear combination."
Simplified view of a feedforward artificial neural network
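The weighted-sum idea in the quoted passage can be sketched in a few lines of Python (a toy illustration only; the weights, inputs and threshold activation are invented for the example):

```python
# Toy artificial neuron: inputs are combined as a weighted sum
# (a linear combination), then passed through an activation function.
def neuron(inputs, weights, bias=0.0):
    # Positive weights model excitatory connections,
    # negative weights model inhibitory ones.
    s = sum(x * w for x, w in zip(inputs, weights)) + bias
    return 1.0 if s > 0 else 0.0  # simple threshold activation

# Example: two excitatory inputs outweigh one inhibitory input.
print(neuron([1.0, 1.0, 1.0], [0.5, 0.4, -0.6]))  # -> 1.0
```

Real ANNs use smooth activations and learned weights, but the "all inputs are modified by a weight and summed" step is exactly this linear combination.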
I don't know if this can help. But it is the least I can do for misinterpreting your "analogue computer"!
:up: :smile:
That helps indeed! Thanks! It looks indeed like what's going on in the brain. Artificial neural networks are pretty good at "recognizing" patterns. I think you can see why. The networks are too straight in my vision (contrary to the lightning-shaped real neurons). I'm not sure if there is a digital program lying under ANNs. Thanks again!
I am glad I have inspired you! :smile: And I see that you are knowledgeable in the AI field!
However, although there is some parallel between ANNs and the brain, we must not forget that neurons are analogue --they operate on a continuum of signals-- whereas computers (AI) work on a 0-1 basis. This, and also the computing abilities of the latter, make their comparison impossible. We can only talk figuratively: the previous image of a neural network is no more than what in programming we call control flow, just a representation, a rough description of a process or program, which I am sure you know well.
So we can say that ANN's are programmed processes, contrary to the real stuff.
I didn't know that. I understand these are less efficient than their high-level counterparts in standard computers. I think there are no higher-level functions in the brain. It's one whole in which whirling is going on in different forms.
https://www.theguardian.com/science/neurophilosophy/2015/mar/09/false-memories-implanted-into-the-brains-of-sleeping-mice
Neuroscientists in France have implanted false memories into the brains of sleeping mice. Using electrodes to directly stimulate and record the activity of nerve cells, they created artificial associative memories that persisted while the animals snoozed and then influenced their behaviour when they awoke.
Manipulating memories by tinkering with brain cells is becoming routine in neuroscience labs. Last year, one team of researchers used a technique called optogenetics to label the cells encoding fearful memories in the mouse brain and to switch the memories on and off, and another used it to identify the cells encoding positive and negative emotional memories, so that they could convert positive memories into negative ones, and vice versa.
The new work, published today in the journal Nature Neuroscience, shows for the first time that artificial memories can be implanted into the brains of sleeping animals. It also provides more details about how populations of nerve cells encode spatial memories, and about the important role that sleep plays in making such memories stronger.
Yes, we can say that.
There's something missing from this definition, something that is crucial to the question whether the brain works like an analogue computer. What's missing is the fact that we use the computer to model the problem being solved. The computer doesn't use the physical phenomena, we do.
A slide rule is a mechanical analogue computer. Mathematical calculations are performed by aligning a mark on a sliding central strip with a mark on one of two outer fixed strips, and then observing the relative positions of other marks on the strips.
The computer's output has to be interpreted by an outside observer. It's the same with any computer, analogue or digital. It isn't the same with the brain/mind. Your brain is producing your present conscious experiences regardless of how an outside observer might interpret what is happening.
Therefore the brain/mind does not work in the same way as a computer. QED
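The slide-rule mechanism described above can be imitated numerically: the scales are logarithmic, so sliding one strip along another adds logarithms, and adding logarithms multiplies numbers. A rough sketch (not an emulation of any particular slide rule):

```python
import math

# A slide rule multiplies by adding lengths on logarithmic scales:
# the position of a mark x on the scale is proportional to log10(x).
def slide_rule_multiply(a, b):
    # "Slide" the strip: add the two logarithmic positions...
    position = math.log10(a) + math.log10(b)
    # ...then read off the mark at the combined position.
    return 10 ** position

print(slide_rule_multiply(2, 8))  # -> 16.0 (within floating-point error)
```

Note that the physical device only lines up marks; it is the user reading the scales who interprets the result as a product, which is exactly the point made above about outside observers.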
Can you list some examples of analogue computers? (if there have been any in the real world)
https://www.analogcomputermuseum.org/
:smile:
What about a record player vs compact disc player?
They are just electrical devices, not computers.
Quoting Corvus
I dunno. The compact disc player translates (computes) information into sound waves. The record player does the same, but there is no computation.
These ARE computers. Only non-digital. They conform to the definition of an analogue computer. True, they are non-digital. What do you expect as an answer?
"... controlled the big guns on the USS St. Louis. The St. Louis was the first ship to make it out of Pearl Harbor on December 7, 1941!"
If definitions are too wide, then discussions end up science fiction.
The mentioned computers are no fiction. They are real computers, just not conforming to the standard view of computers.
I was not saying the devices are fiction.
I said the discussion will end up in fiction :D
That depends on the people discussing... ?
I see them all the time :)
So, no, we cannot see the human brain as a computer.
https://en.wikipedia.org/wiki/Analog_computer
Quoting Corvus
https://en.wikipedia.org/wiki/Computer
https://en.wikipedia.org/wiki/Computer_(occupation)
Good examples not to trust everything you see on the internet sites.
Could have been written by teen nerds.
If you see machines as soulless things made by man, no, they can't.
You can see machines however you want, but human brains cannot be reduced to machines.
And analogue computers? That is just another contradictory concept which makes no sense.
Computers use "bits = 0 or 1", not continuous analogue voltages.
All devices using analogue voltages are called appliances.
Aren't we made out of matter? That's what you eat. You can say we evolved from some initial state of the universe, and these days we have an internal representation of the physical outside world. In this inside world things are going on like in the outside world. I'm not saying we are matter only (I think that's what you mean by a machine). Matter has content.
There's nothing to debate here.
That was my point mate.
Sure but the fact that humans are made of matter, and the physical body is not enough argument to say the real entity of mind and souls are also the physical body and matter.
The real entity? What's that?
The essence.
Sorry, I'm lost. First you were saying that to your knowledge there has never been an analog computer. Then I gave you a listing of them (a museum manifest), and you said those were not computers, "just" electric devices. I then linked you to wiki articles, and you mumbled something about teen nerds. So I said there's nothing to debate... and that was your point?
Do you have something interesting to say or not?
Can't the essence be the non-physical content of matter?
If anyone comes with a picture of old electronic analogue meters or vintage recording machines and calls them analogue computers, then no, I have no further comments to make.
There are many different views on the question. It depends on what you believe, but it is not so simple as to just say either this or that.
Why is that not easy? I can simply say that there is some magical stuff inside matter that becomes our soul and feelings once inside us.
Explain what is the content of the matter, and how is it non-physical.
And elaborate why it should be the essence of human mind with evidence.
I am away for lunch and some work, so my further replies will be later.
I'll think about it. Buon appetito!
Cute narrative, but that is not what happened. I linked to a museum manifest and a wikipedia article. I've yet to call anything an analog computer... I linked to other people calling things analog computers.
But I'll be happy to do that:
This is a picture of an analog computer. More precisely, it's a picture of a picture of one; that picture being from the operating manual of a TR-10.
Was it using punch cards for the data storage? That is still bits. Maybe it was merely powered by analogue power, "analogue" denoting just the electricity for the machine.
I suppose you could call a horse cart a car, saying that it has wheels, moves, and takes you from A to B.
Thanks mate. The pizza was good. But seriously ...
There are about 900 pages in my Oxford Handbook of Mind, and it weighs like a brick.
So there must be loads of different theories on the Mind and Body topic, and it has been a hot topic ever since philosophy began in ancient Greece.
But my own point of view on the topic of why the human mind cannot be seen as a computer is this.
Why human brains are not computers.
1. Non-replaceability due to uniqueness
No technical perfection in replacing the matter in computers can render the uniqueness of the individual mind. There are billions of minds out there, but no mind is the same as any other mind. All minds are unique.
You cannot build Prishon's mind no matter what you do in physics, chemistry and computer technology. But a computer mind can be built to be exactly the same, identical in every possible way, by using the exact parts and components. So computers cannot be minds.
2. Non-revivability
Once a mind dies, it can never be revived. It is gone forever. Matter cannot replace the uniqueness of a mind. But machines can be rebuilt, repaired and revived. Minds cannot. Therefore minds are not computers.
3. Detectability
No matter how human-like computers are created and put into action, they will be detected as machines by real humans. There will be no human feelings between the machines and humans. Human consciousness evolves with time and interacts with environments and situations. Machines lack that property. Well, one can argue that it will be developed to that level in the future; however, at the present moment, it has not been. The human mind is not a computer.
You wrote (in your last comments) what I was thinking. It seems hard to imagine that computers are only computers if bits are involved. Thanks again for your examples! Nice material. I wanna use them in a book. I never knew about these guys! I've only seen one used in a chaotic drop experiment.
cool mate. have a nice day :up:
I was just making a comment about what you said! Don't you want to continue? Are you offended that I said that computers can be analogue?
No I am off to do some work now. Never get offended by philosophical discussions. I just present my views on the points, that is it. I could be wrong, then I stand to be corrected, and learn. If my point was meaningful to the others, that's cool. Nothing less or more than that. I will be back later when peace and quiet. :)
No.
Quoting Corvus
That would be changing the standard usage of terms. But that's not what's going on here. The TR-10 was commercially sold as an analog computer, as you can clearly see from the operator manual cover. That would make you the one changing the standard usage of the terms.
It sounds like the time when nobody knew what a computer was, or was for. Really, it looks like a grossly far-fetched definition of computer compared with the modern definition we are familiar with, in any shape, form or meaning. They might have written anything in their user manual, but that does not qualify as a formally acceptable, meaningful term for the contemporary population or computer scientists or philosophers, just because someone wrote it and published it in a wee leaflet.
No idea what you're talking about, but:
The PACE TR-10 was developed in 1959.
The ALGOL computer language was originally developed in 1958.
The first computer science degree program was in 1953.
The Association for Computer Machinery was founded in 1947.
The ENIAC was from 1943.
Quoting Corvus
Which modern definition exactly? The most popular kind of computers are digital computers, but to say you have never heard of an analog computer in the history of humankind, where "computer" is taken to mean digital computer, is a bit weird and meaningless. To "philosophically" only count a computer as a computer if it is a digital computer is a bit ridiculous.
All computers are digital devices, by my first-order definition.
Give us your definition of what "analogue" and "computer" is.
The existence of analogue computers is already established a few times in this thread! This is one more:
https://en.wikipedia.org/wiki/Analog_computer
That is for ANALOG Computing (an acronym for Atari Newsletter And Lots Of Games) . :D
Atari is a company name, gone bust a long time ago.
Sorry for the sarcasm. OK, analogue computers existed in history. I admit that they existed. I was totally unaware of it. I learned something about computers. Cool. But are they relevant to human brains?
Quoting Corvus
Please make an effort to actually read the reference (i.e. more than its sub-reference) at https://en.wikipedia.org/wiki/Analog_computer again. Then try not to laugh ironically, but when something is actually funny.
Here. I make it easier for you:
Analog computer
For the Atari 8-bit computer magazine, see ANALOG Computing.
An analog computer or analogue computer is a type of computer that uses the continuously variable aspects of physical phenomena such as electrical, mechanical, or hydraulic quantities to model the problem being solved. In contrast, digital computers represent varying quantities symbolically and by discrete values of both time and amplitude.
Analog computers can have a very wide range of complexity. Slide rules and nomograms are the simplest, while naval gunfire control computers and large hybrid digital/analog computers were among the most complicated. Systems for process control and protective relays used analog computation to perform control and protective functions.
Analog computers were widely used in scientific and industrial applications even after the advent of digital computers, because at the time they were typically much faster, but they started to become obsolete as early as the 1950s and 1960s, although they remained in use in some specific applications, such as aircraft flight simulators, the flight computer in aircraft, and for teaching control systems in universities. More complex applications, such as aircraft flight simulators and synthetic-aperture radar, remained the domain of analog computing (and hybrid computing) well into the 1980s, since digital computers were insufficient for the task.
Analogue: there is a real physical process in the physical world, and there is a process analogous to it. Say, the motion of the planets and an electric process performing the same oscillatory motion. Then that electric current is an analogue process and can be used to model the planetary system. No computation by a program on data is involved.
If you can't stand dialectics, then why are you here?
Thanks for your explanations. Excellent.
Quoting Daemon
I agree. Please do.
You're welcome. I hadn't yet seen the comment before mine. Excellent indeed!
I questioned until it was clear. I still deny that those old machines are computers, but I did admit that someone went and wrote them into Wiki.
BTW, your attitude is not philosophical; it's actually despicable. You don't allow people to doubt and question things which are murky in origin.
There's something missing from this definition, something that is crucial to the question whether the brain works like an analogue computer. What's missing is the fact that we use the computer to model the problem being solved. The computer doesn't use the physical phenomena, we do.
A slide rule is a mechanical analogue computer. Mathematical calculations are performed by aligning a mark on a sliding central strip with a mark on one of two outer fixed strips, and then observing the relative positions of other marks on the strips.
The computer's output has to be interpreted by an outside observer. It's the same with any computer, analogue or digital. It isn't the same with the brain/mind. Your brain is producing your present conscious experiences regardless of how an outside observer might interpret what is happening.
Therefore the brain/mind does not work in the same way as a computer. QED
Great point, mate. :up:
Thank you for being a patient and respectable dialectic interlocutor Prishon in this thread. From the dialectic process I have learnt something new today. It was cool. Never a waste of time. I salute ~
In a digital computer there is a program (fixed 1s and 0s) let loose on information (flowing 1s and 0s). The program pulls the 1s and 0s through, by means of a voltage source. That's computation. In the brain this doesn't happen. Nor in an analogue process involved in modeling other processes.
Hope to see you again! :smile:
There are no 1s and 0s in a PC. There are voltages, or "pits and lands" on an optical disc, and we interpret these as representing 1s and 0s.
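That point can be illustrated with a toy sketch: the wire carries continuous voltages, and it is only a reading convention (the threshold here is invented for illustration) that turns them into bits:

```python
# Continuous voltages measured on a wire; nothing about them is
# inherently "0" or "1" until a reading convention is applied.
voltages = [0.1, 3.2, 0.4, 2.9, 0.2, 3.3]

def interpret(voltage, threshold=1.5):
    # The convention: above the threshold counts as 1, below as 0.
    return 1 if voltage > threshold else 0

bits = [interpret(v) for v in voltages]
print(bits)  # -> [0, 1, 0, 1, 0, 1]
```

Change the convention (say, invert the threshold rule) and the same physical voltages "are" different bits, which is the sense in which the 1s and 0s live in the interpretation, not in the hardware.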
The difference between the potentials, though, is that the spike in a neuron is not pulled through by a potential difference between the two ends of the neuron. Between the body and the axon, that is.
Look, you've just told us that computation is 0's and 1's, the program pulls through the 0's and 1's, whatever that means.
Then I've pointed out to you that there are no 0's and 1's. You need to address that.
Stop wasting our time.
Well that rules out analog computers (as well as quantum computers), but it doesn't sound like it's talking the same language as people who use terms like "analog computers" (including you) and "quantum computers".
Quoting Corvus
It's given in the links already provided to you. Here's a definition for "computer":
Here's a definition for "analog computer":
...and as for the standalone definition of analog, it's a red herring. Analog computer is a compound term with meaning and referents. The usage of the compound term establishes the meaning of it.
Quoting Corvus
You've got this entire exercise backwards. Terms get their meaning from established usage. Per the established usage, the TR-10 is referred to as an analog computer, not a meter, and not a vintage recording machine. The quality of your definition comes from its ability to describe the established usage... so it's kind of futile for you to argue that because you define "analog computer" as a square circle, the TR-10 is not one. The absolute best you could do with this argument is to argue that an analog computer doesn't match your definition of a computer, which is uninteresting.
Per linguistic standards, insofar as your definition does not fit the established usage, it's your definition that's wrong. And by wrong, what I specifically mean is that it fails to describe the established usage of the term. It's fine to have local, arbitrary definitions, but if that's what you mean when you say this:
Quoting Corvus
...then all you're saying is you have not seen a square circle. So what?
Normally I go with the dictionary definitions for most concepts, but Wiki? I don't trust Wiki, sorry.
And in my profession, I have dealt with myriads of analogue devices, so I know exactly what they do, and are for. In real life and the world, they cannot be classed as computers.
For computers, they must be able to store, retrieve, compute and search for data, and process them into useful and organised form of information. Analogue devices cannot do that from the limitation of their structure. OK, if you are desperate, you can call an ancient abacus a computer.
But due to the misuse and widening of the concepts, you will find that the confusions will never go away in the discussions and even in real life. I try to narrow the concepts whenever possible.
I asked the question! I can't help it that you don't understand my vision of the brain. Man! @Corvus was right! What an attitude. When you don't understand, you say to stop wasting your time. Now, what do I mean by pulling bits through? Isn't that clear?
I judge Wikipedia's definition the same way I judge yours. Wikipedia's definition is good by these criteria. Yours is wanting.
Quoting Corvus
Everyone deals with myriads of analog devices. I interact with SSD's all of the time in my profession. This has nothing to do with your definition.
Quoting Corvus
Then (a) what would you call a TR-10? (b) Given everyone else calls the TR-10 an analog computer, why should I care what you call a TR-10?
Quoting Corvus
Second time... the TR-10 was commercially sold as an analog computer. That goes in the established usage bucket, not the desperate relabeling bucket.
Quoting Corvus
What confusions? The biggest confusion here is your weird claim that to your knowledge there has never been an analog computer, followed by denying that what everyone else calls an analog computer is an analog computer. If that's the confusion you're talking about, I have another idea of how to resolve it.
At the end of the day, there is the TR-10. We call things like the TR-10 analog computers. There's nothing here to be confused about, and outside this brief lesson on linguistics, there's not really anything interesting here.
It is not as simple as that.
The most significant difference between an analogue device and a computer is that computers have microprocessors in them. The processors have pre-programmed instructions for processing data. So, a computer is flexible and versatile. Computers can also be interfaced with other specially designed interfaces to perform a myriad of other functions.
Analogue devices don't have microprocessors in them. They cannot process any data. They are purely mechanical in their design and structure, and cannot perform even 1% of what modern digital computers can. They are data-monitoring devices or receivers/players at best.
What microprocessor did TR-10 have? Which programming languages does it operate on?
And what is the O/S for the TR-10?
You're begging the question here.
Quoting Corvus
Not all of them.
Quoting Corvus
None. The TR-10 includes interchangeable plug-in components including coefficient setting potentiometers, integrator networks, function switches, comparators, function generators, reference panels, tie point panels, multipliers, and operational amplifiers, as described in the operations manual.
Is there a point to this game or are you just going to indefinitely annoy me? You mentioned something in just the last post about confusion. Now suddenly you're rambling something about microprocessors and lecturing me on the thing I'm using to type messages at you.
The TR-10 analogue "computer" computes a continuous input to give a continuous output. Just like the brain. Without bits and bytes. Only with current. Like in the brain. The difference being that the currents in our brain are not pulled through by voltage sources at the ends of wires.
I am just trying to clarify the points that you have been throwing at me. No emotions here.
Computers must have,
a microprocessor
input and output devices
a storage device, at the very minimum, for HW.
For SW, they must have an OS for central instruction processing and ROM (for booting).
And computers must be able to process data, and take new instructions via programming languages.
Now which analogue device is equipped with the above components and capabilities?
Yes. You are wrong. The brain IS an analogue computer.
My point is that rather than accepting concepts and ideas from Wiki or other internet sites just because someone has typed them up and uploaded them there, why not try to come to the knowledge by discussing and arguing towards clearer concepts and conclusions through philosophical and logical discourse?
The meanings and concepts revealed in this process seem far clearer and more logical than what some unknown bloke has written out and put on the net. This is the whole point of philosophy. We don't take anything prima facie. We discuss and debate till the truths emerge from pure reason.
The PDP-11 did not have a microprocessor.
Quoting Corvus
Nope. The OS is absolutely unnecessary. Usually this is referred to as bare metal.
Quoting Corvus
Also unnecessary. I recall using the hex keypad to punch machine code into the 6800. Yeah, I did actually write a program first, but I did the compiling, not the 6800.
Quoting Corvus
Irrelevant. This is yet another round of barking out requirements that are absolutely not requirements, then pretending you have a gotcha.
Quoting Corvus
Philosophy forum is an internet site, you are a "they", and I don't accept your ideas just because you typed it up here.
Quoting Corvus
If you are after "clearer concepts", let's start with what's so unclear about calling the TR-10 an analog computer.
Quoting Corvus
You're giving me the wrong lecture. I'm not buying what some unknown bloke (Corvus) wrote out and put them on the net (philosophy forum). Why should I trust you?
But do you think the brain is an analogue unit?
The brain is the brain. It's probably not useful to think of the brain as a digital computer or as an analog computer.
I don't expect you to trust me. I never said that.
It is a principle of philosophical discussion (finding truths via dialectic discussion and letting reason reveal them). If one rejects that, then there is no point in discussion.
You're lecturing me, virtually calling me a clueless wiki-zombie, despite my explicitly giving you my criteria for rejecting your definitions. Which still apply.
Quoting Corvus
But you're not doing any philosophy here. You were directly asked what was so confusing about calling the TR-10 an analog computer. Instead of replying, and giving an argument, you chose to lecture me on how trusting random blokes on the internet yada yada yada, yada yada yada. In other words, you went on a tirade, which is not an argument.
If you're serious about clearer concepts, get to it. If you're just going to lecture me on how mindless you think I am, I've got more important things to do than tease your fantasies of me. You don't accomplish anything meaningful by patronizing me.
I think I have given a clear reason why analogue devices are not computers, with all the necessary conditions for being a computer, in one of my posts with the HW and SW specs.
The reason I mentioned Wiki and such was that your only argument for believing the analogue machines were computers was that you had seen that page on Wiki, and someone's write-up on it, and you were presenting it as some infallible, necessary, universal truth rather than telling us your arguments for why analogue devices are computers.
How is ranting on the importance of establishing clearer concepts, instead of, oh I don't know, actually trying to do that... not lecturing?
Which leads to the elephant in the room. Why is it so unclear to call the TR-10 an analog computer?
Quoting Corvus
Nope. You basically said, all swans have white feathers. That thing has black feathers, so there's no way it's a swan.
Your definition is one of niche and habit; not generalized applicability. If we're going to discuss whether the brain is some form of analog computer, we're probably not going after whether it has a microprocessor in it; and despite your diatribes, philosophy isn't hanging in the balance over whether or not we count the brain as a non-computer because there's no silicon wafers in it. You are lacking all sense of proportion here.
I asked you for details on the TR-10 you were talking about, its specs and the SW/OS it uses, but you have not given your replies at all. All you ever seem to do is go on about lecturing and Wiki and details irrelevant to the dialectic process.
I feel that it is important to clarify the concepts involved in the debate, otherwise you will end up talking about rivers and cakes and mountains, when the topic is human brains and computers.
That's fair. But so is my analogy.
Quoting Corvus
I told you it didn't have a microprocessor.
Quoting Corvus
Not true. I told you a PDP-11 doesn't have a microprocessor, an OS is optional (gave you the term "bare metal"), and told you how I did the compiling for that 6800 I programmed. IOW, I am dismantling your arbitrary criteria.
Incidentally, again, the thread isn't about whether brains have silicon wafers in them. It's not asking whether brains run an operating system. It's not asking whether brains run on software written in programming languages. So all of these are fanciful distractions. There's nothing clarified here about how the brain works, and how it doesn't work, and how that might compare to what we call analog computers and what we call digital computers, to be found in these criteria that don't always apply to the things we call analog computers and digital computers anyway.
The PDP-11 had a processor in the form of an LSI chip. PDP stands for Programmed Data Processor.
To me, your talk about analogue computers didn't make any sense at all. And bringing up those ancient analogue devices, insisting they are computers, into a philosophical discussion of the human brain as a computer just didn't sound right.
Quoting InPitzotl
For the OP, it was hugely meaningful to clarify the contradictory concept "analogue computer". Without clarification of concepts, discussions tend to degrade into long-drawn chitchats.
That's all I want to know.
Oh. If anyone is lecturing, then it's you.
Oh, I am definitely lecturing!
When I get more time I'll fill in a few more details for that quote if you like.
Mea culpa; I meant to refer to the PDP-8 as having no microprocessor... the PDP-11 did have one. Then again, I note that you mutated this from "microprocessor" into "processor".
I don't think anyone would be interested in the PDP-8 or 11 in this thread. See, you are the one who keeps bringing up these dinosaur devices insisting they are computers.
What is relevant to the OP about analogue devices would be their inputs/outputs being continuous voltages rather than digital 0/1 bits, which is the same as the human brain. When electronic probes are attached to the human brain, what can be measured is continuous voltage. That is all that is measurable from human brain activity. So human brains and analogue devices share the type of signal they generate, namely continuous electric voltages, measurable and monitorable by oscilloscopes or voltage meters. I think this is a significant point for the OP. In this regard human brains and analogue devices have a common output data type. I think human brains also generate some sort of radio waves which can be monitored via wave receivers and monitoring apparatus, but I don't know about that in detail offhand.
Your main interest in your posts in this thread seems to be pointing out that I have done this and that, and that it is wrong, blah blah... instead of focusing on the OP and trying to come to some conclusions after clarifying the concepts by seeking logical arguments. No one would be interested in what you are pointing out about me, I am sure, and that is a waste of time in my view.
As I said earlier, the only point that you brought and presented to us, insisting that analogue computers exist, was the Wiki pages on the internet. And your explanations had little to do with the OP in any way.
I don't blindly reject Wiki. I am sure there are excellent Wiki contents on some topics, but there are also poor and wrong contents. So always be open-minded about them.
But we are not here to keep bringing in Wiki pages and insisting something is true just because someone wrote about it there. I could register with Wiki, blab about something, shifting some data from some other places, and put it up there. Would it be more certain information because it is in Wiki than someone's argument in the forum threads? I doubt it. It is a methodical doubt in principle.
Being in a philosophy forum means that we try to avoid that type of truth-gullible tendency, and instead try to be critical about all the issues we meet, trying to come to some conclusions and truths by our own discourse based on reasoning and basic logical sense, while also clarifying the concepts. If one says that is a waste of time, then I will say: No! You are wrong.
You have answered my question! :love:
Glad to hear it sir. :strong: :wink:
Im glad you came back! Very good last reply to your "opponent", by the way... :smile:
Thanks! I tend to be around here on and off most days. I might be doing other stuff, and not able to engage more than would like to. :)
I think you did already enough! At least for me, by participating in this discussion.
Sorry, what I meant was "doing my other things for making living" here in the office. :)
Wish I were a full-time student of philosophy, not having to think about anything other than philosophy.
I will keep buying the lottery :D
This was a very interesting thread in that initially I had no clear idea about the whole thing and the concept of "analogue computer" sounded contradictory but interesting.
It was only after exchanging many conversations, questions and answers with you that things got clearer and clearer. This morning I was reflecting on it again briefly and was able to come up with more ideas about it.
But great that you found it interesting too. Keeping up dialectic discourse and reflecting on a topic until the truths emerge out of pure reason, just as Socrates and his interlocutors did, still seems one of the best ways of doing philosophy. I must thank you for that. Cheers.
Gee! You know how to express yourself! The meaning of a dialectic discourse is clear to me now! Thanks again, and I hope you win the lottery! Then we can have endless discourse... No, just kidding! Don't let your office stress you too much! :smile:
Maaan! This "guy" has a way! Pure reason (something I despised before) is a shining beacon now. Dialectic discourse made clear in practice! He truly deserves a medal. And to think Popper was awarded the title "Sir"... The real sir resides here. A tear leaves my eye. NO irony involved!
What an interlocutor is has become clear now too! Gee, you should have been a philosophy teacher! But enough bootlicking now. I'll give a kiss only.
Quoting Prishon
Not really. Corvus is making a mistake similar to the one Hermeticus made on the first page of this topic; he's conflating analog and digital signals with analog and digital computation.
As a generic example, I offer the internet. In particular, I'm connected to the internet via a cable modem. The cable modem I use communicates using QPSK (quadrature phase shift keying); in this scheme, each transmitted symbol takes one of four values (two bits), conveyed over a carrier wave (a sine wave) by shifting the phase of the wave. QPSK signaling is digital; however, the carrier uses continuous values. The point being, you can't just assume that since you're measuring continuous values, you've got analog computation on your hands.
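To make the point concrete, here's a toy Python sketch of the QPSK idea (all names and the sampling scheme are my own simplifications, not how an actual cable modem is built): the carrier samples are continuous values, yet the information that survives modulation and demodulation is one of only four discrete symbols per interval.

```python
import math

# Toy QPSK mapping: each 2-bit pair selects one of four carrier phases.
# The carrier itself is a continuous sine wave; the information is discrete.
PHASES = {(0, 0): math.pi / 4,
          (0, 1): 3 * math.pi / 4,
          (1, 1): 5 * math.pi / 4,
          (1, 0): 7 * math.pi / 4}

def modulate(bits, samples_per_symbol=8):
    """Turn a bit sequence into continuous carrier samples."""
    pairs = [(bits[i], bits[i + 1]) for i in range(0, len(bits), 2)]
    signal = []
    for pair in pairs:
        phase = PHASES[pair]
        signal.extend(math.sin(2 * math.pi * t / samples_per_symbol + phase)
                      for t in range(samples_per_symbol))
    return signal

def demodulate(signal, samples_per_symbol=8):
    """Recover the discrete symbols from the continuous samples by
    correlating each chunk against the four candidate phases."""
    bits = []
    for i in range(0, len(signal), samples_per_symbol):
        chunk = signal[i:i + samples_per_symbol]
        best = max(PHASES, key=lambda p: sum(
            s * math.sin(2 * math.pi * t / samples_per_symbol + PHASES[p])
            for t, s in enumerate(chunk)))
        bits.extend(best)
    return tuple(bits)

# The continuous samples round-trip back into the same discrete bits.
assert demodulate(modulate((1, 0, 0, 1))) == (1, 0, 0, 1)
```

The point of the sketch: every sample of `signal` is a continuous value, but the only thing the receiver ever extracts is which of four phases was sent, so the signaling is digital.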
Neurons do indeed have continuous-valued voltages, but they aren't wires or electric circuits... they're living cells. Neurons communicate using an all-or-nothing principle; they basically either fire or don't fire. A neuron signals another neuron by firing, and in the act of firing it releases neurotransmitters along its axon to the next neuron. A neuron that isn't firing just isn't doing that, and in not doing that it does not send signals to the next neuron. These signals mostly involve sodium and potassium ions, and it is the distribution of those ions that generates the electric charges you measure with probes.
At this level, computation requires transmission of signals, not just having them. My QPSK cable modem uses its continuous signals to communicate four-valued symbols, which are decoded/encoded into two-valued symbols as they start to travel along my network (initially over Ethernet). The continuous-valued nature of the carrier wave is irrelevant, because what actually makes it across is the four-state symbols. For an example the other way, Hermeticus showed a nice stereotypical square wave, which was presumed to carry the bits 101010 (I'll cut it off there; it's annoying doing six pairs)... but if I were to use a generic scheme that looked like that to carry digital signals, the wave might be carrying 110011001100, or maybe even just 111, depending on the exact times I'm supposed to be reading the signal out. Neural signaling is even messier, because it doesn't seem to follow nice clean clocks all of the time; how long between neural firings does it take for a neuron to represent three 0's in a row? What if it fires a fraction of the way into the fourth? How does it signal two 1's in a row?
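The square-wave ambiguity is easy to demonstrate. Here's a minimal sketch (the sampling scheme is invented for illustration): the very same waveform decodes to 101010, 110011001100, or 111 depending only on the assumed bit clock.

```python
# A square wave as voltage samples: high for 2 ticks, low for 2 ticks, etc.
wave = [1, 1, 0, 0] * 3  # 12 samples

def decode(wave, samples_per_bit):
    """Read one bit per `samples_per_bit` samples (taking the first sample)."""
    return [wave[i] for i in range(0, len(wave), samples_per_bit)]

# With a 2-sample bit clock, the wave reads as alternating bits:
print(decode(wave, 2))  # [1, 0, 1, 0, 1, 0]
# With a 1-sample bit clock, the very same wave reads as pairs:
print(decode(wave, 1))  # [1, 1, 0, 0, 1, 1, 0, 0, 1, 1, 0, 0]
# With a 4-sample bit clock, it reads as all 1's:
print(decode(wave, 4))  # [1, 1, 1]
```

Nothing about the waveform itself tells you which reading is right; the bit clock is a convention shared by sender and receiver, which is exactly what neural signaling appears to lack.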
Despite the discrete signaling of neurons, the fact that they fire at varying frequencies allows for the possibility that there is indeed something analog going on; only not with the "voltage", as Corvus oversimplifies it, but with the frequency. The general problem of how neurons communicate signals is referred to as neural coding.
We can sketch how things work by looking at particular areas. For example, the cones in your eyes generally transmit signals based on opsins absorbing photons. There's a particular probability that an opsin will absorb a photon and kick off the cascade, based on the opsin and the frequency of the photon. What this generally means is that if more light is present, more of those opsins will absorb it, and therefore there will be more cascades involving the cone signals. But again, this doesn't produce continuous signals coming out; rather, it modulates the frequency at which the cones fire. Those signals come out of the cones and travel down through the optic nerve; so it must be true that, at least at some points on the way to the brain, the intensity the cone measured is encoded more or less in the firing rate of those signals. But note that even if this is an apt description of how color perception works at a signaling level, it's inadequate to establish how the brain works overall.
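A toy rate-coding sketch may help here. This is a deliberate cartoon, not a model of real photoreceptors: I assume light intensity simply sets the probability of an all-or-nothing spike in each time step, so the analog quantity lives in the firing rate, never in a spike's amplitude.

```python
import random

def spike_train(intensity, steps=1000, seed=0):
    """Toy rate code: each time step, fire (1) with probability `intensity`.
    Each spike is all-or-nothing; the analog quantity (light level)
    is carried by the firing RATE, not by any continuous spike amplitude."""
    rng = random.Random(seed)
    return [1 if rng.random() < intensity else 0 for _ in range(steps)]

def decoded_intensity(train):
    """Downstream readout: estimate the intensity from the firing rate."""
    return sum(train) / len(train)

dim = spike_train(0.1)
bright = spike_train(0.6)

# Every individual signal is discrete...
assert set(dim) <= {0, 1} and set(bright) <= {0, 1}
# ...yet the brighter stimulus is recoverable from the rate alone.
assert decoded_intensity(bright) > decoded_intensity(dim)
```

This mirrors the post's point: the discreteness of individual spikes doesn't rule out an analog quantity being carried, because the quantity rides on frequency rather than voltage.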
Finally a second answer actually addressing the brain question! But I have to say that I'm still on Corvus's side. Grainy as the currents in the neural network might be, there is still an analogue process flowing on the network. Massive parallel, non-externally-driven flows representing external processes. Or externally driven by the senses. The huge variety of possible flow paths creates the opportunity to represent virtually all processes in the universe. The number of possible paths is about a 1 followed by 10^30 zeros!
Not sure what you mean by Corvus's side.
And he even had the decency not to reduce people to the brain: human. I don't like choosing sides though.
So, am I typing on a digital computer? My QPSK cable modem uses a continuous signal.
No, you are not typing on a digital computer. You are typing on a dial board.
...to compute (A+B)×C.
Let's consider two scenarios. In each scenario, there will be two cases. In case 1, we compute (2+3)×4, using A=2, B=3, and C=4. In case 2, we compute (3+3)×4, using A=3, B=3, C=4.
In scenario one, we will use a toy analog computer that works with continuous voltage. In case 1, we pump 2v into A, 3v into B, and 4v into C. Via this value encoding, and by definition of an adder, our computer's adder must then produce 5v at (A+B); and by definition of the multiplier it must produce 20v at (A+B)×C. In case 2, we change the input at A to 3v. Doing this must then produce 6v at (A+B), and consequently 24v at (A+B)×C.
In scenario two, we will use a toy binary computer. Binary computers only have 2-value symbols, so we'll have the inputs be strings of 5 symbols, and let's label the two values 0 and 1. So in case 1, we pump in the string 00010 into A, the string 00011 into B, and the string 00100 into C. Via this value encoding, and by definition of an adder, our computer's adder must then produce the string 00101 at A+B, and produce the string 10100 at (A+B)×C. In case 2, we change the input at A to the string 00011. The result of doing this must then produce the string 00110 at A+B, and the string 11000 at (A+B)×C.
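The two scenarios can be written out side by side. A minimal sketch, assuming ideal components (real analog hardware has noise and drift, which is part of the point being argued): the analog machine operates on the encoded voltages directly, while the binary machine operates on 5-symbol strings.

```python
def analog_compute(a_volts, b_volts, c_volts):
    """Scenario one: values are encoded directly as (ideal) voltages.
    The adder and multiplier act on the continuous quantities."""
    sum_v = a_volts + b_volts  # adder output at (A+B)
    return sum_v * c_volts     # multiplier output at (A+B)*C

def binary_compute(a_bits, b_bits, c_bits, width=5):
    """Scenario two: values are 5-symbol strings over the alphabet {0, 1}."""
    a, b, c = (int(s, 2) for s in (a_bits, b_bits, c_bits))
    return format((a + b) * c, f'0{width}b')

# Case 1: (2+3)*4 = 20
assert analog_compute(2, 3, 4) == 20  # 20v at the output
assert binary_compute('00010', '00011', '00100') == '10100'

# Case 2: (3+3)*4 = 24
assert analog_compute(3, 3, 4) == 24  # 24v at the output
assert binary_compute('00011', '00011', '00100') == '11000'
```

In both machines the computation only works because the components respond differently to different inputs; that distinguishability requirement is what the next paragraphs turn on.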
For the machine to work in scenario one, it's insufficient that the input signals on A, B, and C are continuous values; and that the output signals on (A+B) and (A+B)×C are also continuous values. The components absolutely must be capable of distinguishing the input values to affect the output values as required by each computation.
If it turns out that the 2v versus 3v inputs into A don't affect the adder's output at all, then you can't possibly compute addition, or multiplication, or any function that depends on A (aka produces different results for 2 and 3). Likewise for scenario 2, it's insufficient that you are able to encode strings and send them to the adder; the adder absolutely must be able to distinguish all 2^5 values of the strings.
So back to the neuron case, it's firing at different rates, let's say. Okay. But the resulting signals go from neuron to neuron. For it to use 200Hz and 202Hz as distinct values in computations, it is absolutely necessary that there is something in the network that can distinguish those two values to produce different outputs; otherwise, the fact that they are continuous inputs is completely irrelevant.
Now, as I've sketched out before, there probably is indeed a significance here, as in the color perception case. But you cannot derive this simply from the fact that you found some analog inputs going into the neurons, and analog outputs coming out of them. If the neurons don't distinguish the values when reacting, the neurons can't compute using them. The processing of values is critical in a computer.
Analogue signals can always be converted to digital signals using an ADC within the system. I don't see much significance in emphasising the difference in their nature here.
What can be measured from the human brain via monitoring instruments is analogue voltage signals, not digital signals. Maybe the workings of the brain can be explained in digital signal form.
Whatever the case, it seems the human mind cannot be reduced to the workings of the signals. But the signals could be converted to replicate / emulate some of the functions of the human brain.
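ADC conversion is easy to sketch, and the sketch also shows why the distinction still matters (parameters here are arbitrary choices for illustration): quantization throws information away, so two analog values inside the same step become indistinguishable after conversion.

```python
def adc(volts, full_scale=5.0, bits=3):
    """Toy analog-to-digital converter: map a voltage in [0, full_scale)
    onto one of 2**bits discrete codes."""
    levels = 2 ** bits
    code = int(volts / full_scale * levels)
    return min(max(code, 0), levels - 1)  # clamp to the valid code range

# Two distinct voltages inside the same quantization step come out
# as the same code: after conversion they cannot be told apart.
assert adc(2.0) == adc(2.1)
# A coarser difference survives the conversion:
assert adc(2.0) != adc(3.0)
```

So "can be converted" is true, but the conversion is only faithful down to the step size; whatever the brain does below that resolution, if anything, is lost.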
It took a while to get a grip (I have read this sentence over and over again, with different emphasis, even out loud, to the annoyance of my wife... Women...). But now that I have that grip, it's crystal clear; another example of how discourse is done. If you have not eaten yet: buon appetito!
:100: :up:
Thanks for the great OP - very interesting, and I learnt a lot via the thinking and dialectic process.
Just returned from lunch. Thanks. You too.
:100: