Your Brain As a Transducer

Image by ParallelVision from Pixabay

For about as long as we have had anything considered a computer, we have compared it to the brain. As we try to create artificial intelligence that is like a human brain, we have, conversely, used computers to try to understand our brains.

Is it a fair comparison? A computer has storage. So does a brain. Different computers have different processing speeds. So do different brains. We always talk about computer memory, and we talk a lot about our human memory.

Both send information as signals, though the brain uses electrochemical impulses while computers use purely electrical ones. The nervous system is fast, but the computer is faster.

What about those on or off (binary) computer switches? Our neurons also either fire or don't.

Computer memory can grow by adding computer chips. The brain has plenty of memory space and it expands by making stronger synaptic connections.

But they are not really the same things. Computers are faster than brains and computer memory is more precise. But humans have more storage capacity. And computers still can't access memories with the nuance a brain can.

A typical computer runs on about 100 watts of power, but a brain only needs about 10 watts. Super energy efficient.

The computer-as-brain metaphor has been the dominant metaphor in neuroscience, but it has fallen out of favor. In fact, it might even have sent scientists in the wrong direction for decades. How about this: your brain is a transducer.

What is a transducer? It is a device that converts variations in a physical quantity, such as pressure or brightness, into an electrical signal, or vice versa. They are all around us: microphones, loudspeakers, thermometers, position and pressure sensors, and antennas.

The brain is still pretty much a mystery. It's not a mystery like ghosts, but more of a mystery like dreams. For example, my fingers are right now putting pressure on my keyboard and moving a mouse, and both the movements and the pressures are causing transduction. Analog is converting to digital. Words are appearing on the screen. The words, converted to bits, are flying through the air in my family room to a WiFi access point and then through a wire off to a server in the "cloud" that might just as well be in the real clouds.
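That analog-to-digital step can be sketched in a few lines. The sketch below is a toy quantizer, not a real keyboard driver; the 3-bit resolution, the 5-volt reference, and the sample readings are all illustrative assumptions.

```python
# A minimal sketch of analog-to-digital conversion (quantization):
# a continuous physical quantity, such as a transducer's output
# voltage, becomes a discrete number.

def quantize(voltage, v_ref=5.0, bits=3):
    """Map an analog voltage in [0, v_ref) to one of 2**bits integer codes."""
    levels = 2 ** bits
    step = v_ref / levels
    # Clamp to the representable range, then bucket by step size.
    clamped = max(0.0, min(voltage, v_ref - 1e-9))
    return int(clamped // step)

# A pressure sensor's rising voltage becomes a series of discrete codes:
readings = [0.2, 1.3, 2.6, 4.9]
codes = [quantize(v) for v in readings]  # -> [0, 2, 4, 7]
```

Everything downstream, from the letters on the screen to the bits on the WiFi link, works on those discrete codes rather than the original continuous signal.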

But let’s back up to before my fingers started putting pressure on keys. Organic transduction via our sense organs — eyes, ears, nose, tongue, and skin — is happening. I can’t even comprehend what effect electromagnetic radiation, air pressure waves, airborne chemicals, liquid-borne chemicals, textures, pressure, and temperature are having on my writing right now. Electrical and chemical activity in my brain is somehow sending those words in a reasonable order down to my fingertips.

Thank you evolution for all the forms of transduction we possess. And thanks for most of the forms of transduction that humans have invented and are still inventing.

There are still some missing transducers. I can’t connect to plants or the universe. I know there are those who say via things like ayahuasca that they can connect to the unseen. Religions all seem to offer connections to a transcendent reality. Neither path has worked for me.

Let’s see if transduction theory catches hold and leads to a better understanding of the brain or the universe.


Ignore More

Ever since I was a very young student, I've been told to pay attention and focus, and was sometimes scolded for not doing so. Both things are obviously important to succeeding in school and in later life. But I have also come to recognize how important it is to ignore some things.

I suppose “ignore” has a negative connotation, so let me clarify. You need to better allocate your limited time, attention and focus to find the most factual, practical and useful knowledge needed to make informed decisions and choices.

A simple example is screen time. Whether the screen is a big flat one on the wall or a small one in your hand, there is more information available there than anyone can view, process or use. The current information age is a time of scrolling and interruptions. You need to be effective at ignoring information that turns out to be factually wrong or just irrelevant. If only filtering information were as easy as turning on a filtration system in your home.

In my lifetime of teaching, I have seen that teachers are always working with students doing research to make them more intelligent and effective at filtering out the irrelevant and the inaccurate.

All that sounds good and uncontroversial – but it’s not. Social media has come under increasing pressure to be better at filtering just as we have taught students, but every filtering method has been criticized. They have tried using trained humans but that is slow and not very efficient. They have tried using algorithms and technology but that isn’t always as smart as a human though it is faster and more efficient.

Bias also enters the equation. This past week the Facebook and Twitter CEOs faced tough (and sometimes ill-informed) questions about how they operate. Do the platforms filter with a bias that disfavors conservatives, Republicans, and President Trump, or is that simply where the most disinformation is generated?

“The net is designed to be an interruption system, a machine geared to dividing attention,” wrote Nicholas Carr in his book The Shallows: What the Internet Is Doing to Our Brains.

Can we unplug from the Net and media? Of course we can. You can hide away in a cabin on a remote mountaintop, but is that a way to live? It's an extreme reaction.

It makes more sense to improve your filtering, but that isn’t easy. There is no course you can take or an easy list of ten things to do. You can start by knowing that you can’t read every article, tweet, email, Facebook or Twitter post.  Can you resist? You’ve probably heard the acronym FOMO (Fear of Missing Out) about the actual physically measurable “fear” people get when they see that badge that says they have unread, unseen content. It’s hard for some people to just ignore or delete without checking.

Carr’s book covers research that shows that this flood of information is more than our brain is configured to handle. TMI – Too Much Information – is literally the case. We take it in, and, relevant or not, our brain tries to categorize and store it. The brain gets filled, like a storage room, with a lot of stuff we don’t need. It’s easier to clean that storage space than to clean your brain.

I have taken to watching the half-hour evening news rather than putting on a 24-hour news channel that repeats the same news over and over and adds in a lot of opinions. Do I miss some news? Yes, but I get the major stories and if I want to know more about a story I can easily find it online.

In the same way that tobacco companies used formulas and advertising to keep people wanting more, networks and media platforms work hard at keeping us looking.  When one streaming episode or movie ends, another one is queued up for you to continue. When you search for a certain book, video or topic, the Net will certainly suggest others. Going down that rabbit hole is very, very easy. As the title of a book by Adam Alter puts it, Irresistible: The Rise of Addictive Technology and the Business of Keeping Us Hooked.

“Behavioral addiction” is what makes us obsess over text messages, emails, likes, and feeds and makes us binge video. We average about three hours each day on our smartphones. Back in the 1950s, 60s, and 70s there was a lot of research and talk about how broadcast television was hurting kids’ learning. If all that research had some validity, imagine what Millennial (Generation Y), Generation Z (Zoomer), and Generation Alpha kids are doing to their brains and learning with the amount of screen time and information they consume.

What can we do? Alter suggests that we reverse engineer behavioral addictions. Good luck with that.

Our Brain’s Constant Predicting

The end of a year and the start of a new year brings many predictions about things to come. Predictive coding has nothing to do with “coding” computers or predicting trends and everything to do with our personal neuroscience.

The classical view of perception states that we experience the world by receiving input from our environment, processing it at the higher levels of our brain, and then responding accordingly.

A newer alternative theory adds to those three steps the idea that our higher faculties often “predict” the input from our environment. That means we have a perception of some things before we experience them. This is called predictive coding or predictive processing.

I read an article by Sara Briggs and then followed up with another titled “To Make Sense of the Present, Brains May Predict the Future.” In those readings, I encountered this theory (still controversial) that suggests that perception, motor control, memory, and other brain functions all depend on comparisons between ongoing actual experiences and the brain’s modeled expectations.

The next day I noticed a connection when my son’s visiting dog seemed to do some predictive processing. Pepper reacts to her doorbell at home by barking and sprinting to the front window. We were watching the movie Love Actually and in one scene Hugh Grant’s character rang a series of doorbells looking for a woman’s home. Even though these were different doorbell sounds from the sound in Pepper’s home, she reacted to each ring in the same way that she does at home. Her actual experience in my home and her brain’s modeled expectation created a match.

One way scientists look for evidence to support this theory is to look at cases where the brain predicts too much or too little. For example, individuals with autism would presumably have a weak predictive filter. That would mean that they have a harder time categorizing items based on past experiences.  They would have an extreme sensitivity to input from the environment and the many “new” experiences could be overwhelming.

A person with schizophrenia would be at the other extreme, with an overly strong predictive filter. Their brain would be so certain about what it's looking at that it would cancel out new information and produce false perceptions, possibly even hallucinations.

What is considered “normal” is somewhere in the middle of this spectrum.

Of course, we can change that by changing our brain chemistry, which is why some research uses psychedelic substances. Some neuroscientists might say that our “normal” perception is a “controlled hallucination.” Substances like psilocybin and LSD weaken the predictive filter, so when someone under their influence sees something common to daily life, such as a tree, there is no prediction, and alternative perceptions emerge. The branches moving in the wind are arms and the leaves are flames. The drugs don’t add to perception; by removing the filter, they allow other possibilities.

How does this predictive coding affect learning new things?

To learn new things we need to be open to new perceptions which means the filter must be reduced to some extent. But in order to retain the new information and use it in the future, we need a predictive model of that information, which requires that filter to be operating normally. When the two are balanced, learning and memory are optimized.

In a more simplified explanation, being open-minded should lead to greater learning. We don’t put information in a box and move on.

Some of this theorizing isn’t new at all. Back in the 1860s, Hermann von Helmholtz introduced the concept of unconscious inference, the forerunner of the framework now known as the “Bayesian brain.” It proposes that the brain makes probabilistic inferences about the world based on an internal model: it calculates a “best guess” interpretation of what it’s perceiving. The name comes from Bayesian statistics, which quantifies the probability of an event based on relevant information gleaned from prior experiences.
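That “best guess” calculation is just Bayes’ rule: a prior belief combined with the likelihood of the sensory evidence. The scenario and every number below are made up for illustration.

```python
# A sketch of a Bayesian "best guess" between two competing
# hypotheses about one piece of sensory evidence.

def posterior(prior, likelihood, likelihood_other):
    """P(hypothesis | evidence), given the prior for the hypothesis
    and the likelihood of the evidence under each hypothesis."""
    numerator = prior * likelihood
    evidence = numerator + (1 - prior) * likelihood_other
    return numerator / evidence

# Prior: 90% of human-sized shapes seen at dusk in this yard are trees.
# Evidence: a waving outline is more likely for a person (0.7)
# than for a tree in the wind (0.3).
belief_tree = posterior(prior=0.9, likelihood=0.3, likelihood_other=0.7)
# Even with evidence favoring "person," the strong prior keeps
# "tree" as the best guess (posterior is about 0.79).
```

This is why, on this account, perception can lag behind evidence: a strong enough prior takes a lot of surprising input to overturn.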

These “controlled hallucinations” based on predictions don’t wait for all sensory information to drive cognition. We are constantly constructing hypotheses about the world. We use these to explain new experiences. The brain is constantly generating and updating a mental model of sensory input.


I’m Not Right-Brained After All


Neuromyths are false beliefs about the brain. Some of them have affected the ways we teach and try to learn.

An article from The Chronicle’s Teaching Newsletter pointed me to a report on “Neuromyths and Evidence-Based Practices in Higher Education.”

One of those myths is one I was a believer in a few decades ago. According to the report, it is one of the most widely believed neuromyths: that students learn best when they’re taught according to their preferred learning style. This idea emerged in the 1970s and led to articles, books, and approaches to teaching that often focused on learning styles (such as visual or auditory learners) and led to the idea that some of us are right-brained and some of us are more left-brained.

The report states that there is no evidence to support the idea that people learn best when taught in their preferred learning style.

Back when this was a popular theory, I had come to really believe that I was right-brained and that this explained both my problems with math and my more creative interests and abilities. I was a visual and auditory learner for sure. Not so, says the newer research. In fact, teaching or learning based on a learning-styles approach may hurt students, who may then seek only information presented in a particular way.

This debunking of the myth of learning styles is not breaking news. It has been around for about a decade itself.

Learning styles and right/left brain styles are not the only neuromyths. Another one that I have heard since I was a child is that “we use only 10 percent of our brain.”

The report also suggests that some commercial products for the brain and learning (brain games, for example) might encourage the belief in neuromyths.

I will say though that I still enjoy and refer to my copy of Drawing on the Right Side of the Brain as an interesting approach to art. Research be damned.


Read and download the full report at…

Free Will, Regret and the Choice Engine


While I was on vacation earlier this month, I had a few “heavy” talks with a friend who was with us. At one point we got into a discussion of regrets. My philosophy is no regrets. I think regrets hurt our present and future. I’m a believer in the idea that if you change one thing in your past, you change everything that follows. I am not unhappy with my present, and changing something in the past that I wasn’t happy about would move me out of this present. Yes, changing something might make my present better in some ways, but there’s no guarantee of that positive result.

Of course, this is all a thought experiment since we can’t change the past. That only happens in science-fiction.

Are you reading this article because you chose to? Or are you doing so as a result of forces beyond your control? That is how an article I read this past week about free will and regrets begins.

Tom Stafford is a Senior Lecturer in Psychology and Cognitive Science at the University of Sheffield who studies learning and decision making. “The Choice Engine” is an “interactive essay” about the psychology, neuroscience and philosophy of free will.

How and why do we choose? Are our choices free, or determined by things like our past, our brains or our environment? Are our choices ours?

Studies have shown that people who believe things happen randomly and not through our own choice often behave much worse than those who believe in free will. That makes sense. If you don’t think you have a choice in the matter, then what-the-hell is the difference?

There is a simple example given using an insect to illustrate. When a female digger wasp is ready to lay her eggs, she hunts down a cricket or similar prey, paralyses it with a sting, drags it back to the lip of her burrow, and then enters to check for blockages. If you move the cricket a few centimetres away before she re-emerges, she will again drag it to the threshold and again leave it to check for blockages. She will do this over and over. The wasp has no free will – no choice in the matter. The digger wasp has become an example for biologists of determinism.

Determinism is the idea that what we think of as a “choice” is in fact a path dictated by pre-existing factors.  I don’t subscribe to that philosophy.

“I’m no wasp,” you might say. “My choices are my own. Freely made.” But these neuroscience-of-decision-making people seem to think that, sophisticated animals that we are, we are also trapped in behavior beyond our control. Free will is just an illusion.

I disagree.  Stafford, a cognitive scientist, disagrees.  I would like to believe that he is correct and that “… the evidence shows that most people have a sense of their individual freedom and responsibility that is resistant to being overturned by neuroscience.”

Stafford’s book, Mind Hacks: Tips & Tricks for Using Your Brain, has hacks/exercises that examine specific operations of the brain. They are a hands-on way to see how your brain responds and to learn about the “architecture” of the brain. You can try to “Release Eye Fixations for Faster Reactions,” “See Movement When All is Still,” “Feel the Presence and Loss of Attention,” and “Understand Detail and the Limits of Attention.”

It is your choice whether or not to read the book.

The Strange Case of Phineas Gage

I love reading stories about brain research. Exploring the brain is like exploring outer space. I’m not sure that we will ever figure everything out about either of them.

Go back a hundred years or more and scientists didn’t know which parts of the brain handled cognitive functions or where the senses were located. It was still considered legitimate for doctors to use phrenology: measuring bumps on someone’s head as a way to detect mental illness.

And then came Phineas Gage. Poor Phineas had a tragic accident.

In 1848, Phineas Gage was the 25-year-old foreman of a railroad crew that was cutting a railroad bed into rock for a new rail line in Cavendish, Vermont.

It was dangerous work. They would pack explosives into a hole, pack it down with a tamping iron, top it with sand and then stand back and blow it up.

September 13 was not a lucky day for Phineas. He was packing a hole and probably was distracted. He looked away while he was tamping, and the metal pole hit rock, set off a spark, and ignited the explosive.

The explosion shot the tamping iron through Gage’s skull. It entered just under his eye socket and came out the top of his head.

That tamping iron was 3 feet long and 1.25 inches in diameter, and it weighed 13 pounds. It did not kill him.

Animation of Gage’s injury in the frontal lobe.

But it did change him. I don’t mean that he looked different, but he did. It was his personality that changed.

His friends and workers described the pre-accident Gage as being “amiable, with a well-balanced mind… shrewd, smart businessman, very energetic and persistent in executing all his plans of operation.”

But Gage after the accident swore at everyone, was constantly drunk, became a lousy worker and was no longer much of a friend.

He lost his railroad job and could only get menial work.

But his injury showed doctors that there was a link between the brain and our personalities. With our current knowledge, we can map where the tamping iron passed through his brain: the frontal lobe. We now know that the frontal lobe is responsible for planning, decision-making, and impulse control, and that damage there can change behavior and personality, which is exactly what Gage’s friends and coworkers observed.

Phineas Gage moved to San Francisco to be near his mother and sister, but after suffering a number of seizures, he died in 1860 at age 36.