Take a Tech Holiday

[Image: a bullet through a cell phone]

I considered taking a tech holiday this holiday weekend. You have heard of this, I’m sure – turn off or leave behind the phone and laptop; go where there is no wifi or service, or just ignore all that at home for a day, weekend or week.

I have tried this in the past. On two vacations I had no phone or wifi on a regular basis – and I survived. But when I did go somewhere with free wifi, I was quick to log in and check messages, email, etc. So, I cheated.

I generally write these posts during the week and schedule them for the weekend when possible, so it may appear that I am online when I am not. But this weekend I will be mostly at home, hoping to spend a lot of time out in the garden. My only tech out there will be the podcasts I always listen to while I work.

I queued up three posts during the past week: Friday’s post on Memorial Day becoming a weekend thing, this one, and a follow-up reminder for the original Memorial Day on May 30. So, nothing for tomorrow, Sunday. Let that be my tech holiday.

Here are some thoughts on technology to ponder, with a few comments of mine – and for you to comment on, if you are so inclined.

“Even when you take a holiday from technology, technology doesn’t take a break from you.” – Douglas Coupland
This would be a good essay writing prompt for my students. What does he mean?

“Technology can be our best friend, and technology can also be the biggest party pooper of our lives. It interrupts our own story, interrupts our ability to have a thought or a daydream, to imagine something wonderful, because we’re too busy bridging the walk from the cafeteria back to the office on the cell phone.”  –  Steven Spielberg
I will admit to “bridging” my walks, sitting in waiting rooms, riding on a train, sitting on the toilet, drinking coffee at a cafe with my phone. It’s hard to be totally bored if I have my phone. Is that a bad thing?

“Technology is, of course, a double edged sword. Fire can cook our food but also burn us.” – Jason Silva
This has always been true, from knives and swords to breaking the atom.

“Technology is nothing. What’s important is that you have a faith in people, that they’re basically good and smart, and if you give them tools, they’ll do wonderful things with them.” – Steve Jobs
Nice words from a man who was obsessed with technology and not particularly nice to people, no matter what kind of “Zen” vibes he tried to give off.

“The advance of technology is based on making it fit in so that you don’t really even notice it, so it’s part of everyday life.” – Bill Gates
That may be a goal of technologists, but it is also sneaky. Is this what Coupland’s quote means? Is it our unconsciousness about picking up the phone 100 times a day, or about the phone pinging cell towers and reporting my location even when I’m not using it?

“Every time there’s a new tool, whether it’s Internet or cell phones or anything else, all these things can be used for good or evil. Technology is neutral; it depends on how it’s used.” – Rick Smolan
Sounds like the same idea as the Silva quote, but then there’s the “neutral” part. Some tech is not neutral. A weapon is not neutral. A computer virus is not neutral. Technologies, and the uses of technology, are sometimes designed for evil. I know that a smart bomb won’t do anything without a person using it with intent, but that tech is not neutral.

Unplugging

Well, you haven’t quite missed out on the National Day of Unplugging. Here you are, once again, online. All day you have been checking your phone’s email and messages, working online, posting photos to Instagram, checking on who tagged you on Facebook and Twitter.

Need a break? The National Day of Unplugging this year is from sundown March 1 to sundown March 2, so you can still give it a try.

Sign the Unplug pledge and disconnect. Talk to people you meet. Eat a few uninterrupted meals. Read a printed book to yourself or aloud to a child or partner.

This project is an outgrowth of The Sabbath Manifesto, which adapts our ancestors’ practice of carving out one day per week to unwind, relax, reflect, get outdoors, and connect with loved ones. Our ancestors never had to “unplug,” but nowadays that is the hardest part of any Sabbath Manifesto.

Isaac Asimov Predicted Some of 2019 Back in 1983

“It’s difficult to make predictions, especially about the future,” said someone clever.  It is difficult, and yet people keep doing it.

I have written that I tend to believe the predictions made by scientists more than those made by mystics. Of course, Sir Isaac Newton throws my theory against the wall with his predictions of the end of the world that he based on The Bible.

Scientists don’t always get it right, but sometimes science fiction writers do a good job of predicting. The best science fiction is probably fiction that is actually grounded in real science. Some of my favorite sci-fi writers, such as Philip K. Dick, have gotten some of it right and a lot of it very wrong.

Isaac Asimov was born in Russia in 1920, but his family immigrated to the United States when he was three years old. His parents owned a candy store in Brooklyn, and young Isaac spent a lot of time there – much of it reading the store’s popular magazines, including the “pulps” that carried science fiction.

At 21, this very prolific writer wrote one of his most anthologized stories, “Nightfall.” The story was inspired by a conversation with his friend and editor John Campbell. Campbell had been reading Ralph Waldo Emerson’s Nature and noted this passage: “If the stars should appear one night in a thousand years, how would men believe and adore; and preserve for many generations the remembrance of the city of God which has been shown!” Asimov wrote a story about a planet with six suns that has a sunset only once every 2,049 years.

What did Asimov predict back in 1983 for us living in 2019? (And why did he pick 36 years in the future to target?)

“The consequences of human irresponsibility in terms of waste and pollution will become more apparent and unbearable with time and, again, attempts to deal with this will become more strenuous.” A “world effort” must be applied, necessitating “increasing co-operation among nations and among groups within nations” out of a “cold-blooded realization that anything less than that will mean destruction for all.”

Is that the climate crisis? It was obvious to some scientists in 1983 that things were headed in the wrong direction.

He was more optimistic that we would be dealing better with overpopulation, pollution and militarism. We probably are dealing better with those issues, though we haven’t “solved” any of them.

Education – a career and life choice for me – was something he predicted “will become fun because it will bubble up from within and not be forced in from without.” I wouldn’t use “fun” as my main adjective for education today, but through MOOCs, alternate degrees, customized programs and other DIY educational paths there is more education “bubbling up” than ever before.

What about technology? Like others, he believed that the increased use of everyday technology would enable a better quality of life and more free time for many people. He said that “… more and more human beings will find themselves living a life rich in leisure. This does not mean leisure to do nothing, but leisure to do something one wants to do; to be free to engage in scientific research, in literature and the arts, to pursue out-of-the-way interests and fascinating hobbies of all kinds.”

You can read his full essay at The Star. I was alerted to his predictions by an article on the always interesting Open Culture website.

A Beginner’s Guide to the Internet

The title “A Beginner’s Guide to the Internet” is going to make some readers move on because they figure “I know all about the Internet. I’m no beginner.” Of course you are.

This is 1999. To a viewer who is under 20 years old, this may seem like a film from the 1950s. This is the World Wide Web – you know, the “www” in a web address. No social media, no streaming video, no blogs. Your web browser was Netscape Navigator or Opera or Mozilla, or maybe the Internet Explorer that was pre-installed on your Windows computer.

Google had launched the year before, but there was no Chrome browser, just a search page. And one competitor guiding you along the information superhighway was the Internet portal company Lycos, which made this film with John Turturro.

John Turturro was no unknown. The year before, we saw him in the cultish film The Big Lebowski. In this short film (38 minutes), he plays a history teacher (and aspiring comedian) whose car breaks down in Tick Neck, Pennsylvania, on his way cross-country to Las Vegas.

While he is stuck there, he stops in a diner, connects his laptop modem to the phone there and dials up his internet service provider’s number.

1999 was the end of the 20th century and just before the Internet (we used to capitalize it) exploded.

Where did you see this film? Definitely not online. A film of that length would have eaten up all my data for a month, and probably wouldn’t have loaded anyway on my dial-up connection. But you could get a free rental VHS videotape copy at your friendly Blockbuster or West Coast Video store, or at a public school library. It was probably shown in some classrooms.

The film, funded by Lycos, was a good promotional tool, and it might have helped educate the public about the World Wide Web. In 1999, Lycos was the most visited online destination in the world. In 2000, Telefónica acquired it for $12.5 billion.

There are some now-funny lines in the film. A kid tells Turturro “My family doesn’t own a computer, and my dad doesn’t like ’em. He says facts are facts.” His dad was probably quite happy with the 2016 election result.

How Smart Do You Think You Are?

I read about two studies that were done concerning IQ and the more general sense of just how smart we think we are.

Your IQ (intelligence quotient) was probably tested and measured in school, though you probably were never told your magic IQ number. Think you might be a genius?

Genius IQ is generally considered to begin around 140 to 145. That’s roughly 0.25% of the population, or 1 in 400 people. There are varying guides to how the geniuses are divided up. One guide shows:
115-124 – Above average (e.g., university students)
125-134 – Gifted (e.g., post-graduate students)
135-144 – Highly gifted (e.g., intellectuals)
145-154 – Genius (e.g., professors)
155-164 – Genius (e.g., Nobel Prize winners)
165-179 – High genius
180-200 – Highest genius
>200 – “Unmeasurable genius”
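
If you want to sanity-check that “1 in 400” figure, here is a minimal back-of-the-envelope sketch – my own, not from the guide above – assuming the usual convention that IQ scores follow a normal distribution with a mean of 100 and a standard deviation of 15. The cutoffs and the helper function are just for illustration.

```python
# A rough, illustrative check of the "genius" percentages above,
# assuming IQ scores are normally distributed with mean 100 and SD 15.
from math import erfc, sqrt

def share_above(iq, mean=100, sd=15):
    """Fraction of the population scoring at or above `iq` under a normal model."""
    z = (iq - mean) / sd
    return 0.5 * erfc(z / sqrt(2))  # upper-tail probability of the bell curve

for cutoff in (140, 145):
    p = share_above(cutoff)
    print(f"IQ >= {cutoff}: about {p:.2%}, or roughly 1 in {round(1 / p)}")

# Prints approximately:
#   IQ >= 140: about 0.38%, or roughly 1 in 261
#   IQ >= 145: about 0.13%, or roughly 1 in 741
```

The quoted “about 0.25%, or 1 in 400” sits between those two cutoffs, which is about as precise as these guides get.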

Einstein was considered to “only” have an IQ of about 160.

Since the early 20th century, IQ scores had been increasing by about 10 points per generation, but in the last twenty or thirty years humans have started getting dumber – if dropping IQ scores are to be believed.

The trend of rising IQ throughout the 20th century is known as the Flynn effect, named for intelligence researcher James Flynn, who observed the rise in IQ scores in every decade of the 20th century. But in recent years there has been a slowdown, or even a reversal, of this upward trend, at least in some countries.

The Flynn Effect is attributed to a variety of societal improvements during the 20th century, including prenatal and early post-natal care, reduced exposure to lead, reduction of pathogens, improved nutrition, better education and improved social environment.

But from the 1970s onwards, our intelligence has started falling. Are we getting dumber?

One theory divides our intelligence into two types: fluid and crystallized. Blame is thrown at schools that value and judge you on your ability to recall information for tests and exams. That is crystallized intelligence. It is a type of intelligence that is fine for many service-class jobs. An increasing number of people are going into these kinds of service jobs, and many of those jobs are being dumbed down. You don’t need to add or subtract, or even key in amounts, when the iconized cash register shows you a picture of a soda or a burger or fries and does it all for you.

But fluid intelligence is what we use for problem solving, critical thinking and higher order skills. It’s not that fluid is better; it’s that both kinds are needed for higher intelligence.

Let me bring in a second effect here: the Dunning-Kruger effect. It was described by David Dunning and Justin Kruger of Cornell University, who found a cognitive bias that occurs when people fail to adequately assess their level of competence (or incompetence) at a task. They consider themselves to be more competent than they actually are.

The effect has a far less academic name in the Urban Dictionary: “Mount Stupid.” This is a mountain you climb until you get to the place where “you have enough knowledge of a subject to be vocal about it, without the wisdom to gather the full facts or read around the topic.”

It sounds like pop psychology, but there have been serious studies done on the effect. People with low ability do not have the necessary critical ability and self-awareness to recognize how low their ability actually is, and that leads them to have an inflated view of their own competence and knowledge.

In much cruder terms, this effect occurs when people are “too stupid to know how stupid they are.” Have you ever noticed this effect?

Dunning and Kruger tested and developed their theory with assessments of humor, logic, science and grammar. They found that those who performed best consistently underestimated their ability, while those who performed worst believed that they had in fact done well. As cognitive ability worsens, so does a participant’s ability to accurately assess that ability.

Again, in simpler terms, those with only a little knowledge were more dangerous than those who knew they had no knowledge about a subject. “A little learning is a dangerous thing,” wrote Alexander Pope way back in 1709.

The more you learn, the more you realize how much you don’t know. You have heard that, right? It is a common saying, but it actually describes a different cognitive bias, known as “imposter syndrome.”

When Nicholas Carr published The Shallows: What the Internet Is Doing to Our Brains in 2010 (it was a Pulitzer Prize finalist), people kept quoting his earlier Atlantic Monthly article “Is Google Making Us Stupid?” He hit a nerve at the time – we enjoy the Internet a lot, but are we sacrificing our ability to read and think deeply by using it too much?

Carr references earlier thinkers from Plato to McLuhan and notes that every information technology – from printed books to the Net – also changes the nature of our knowledge and intelligence.

Thinking people feared that the printed book would erode our use of memory. But it actually served to focus our attention and promoted deep and creative thought.

Carr doesn’t think the Internet is doing good things. It encourages rapid, distracted dipping into bits of information from many sources. His theory is that what it is making us better at is scanning and skimming – not concentration, contemplation, and reflection.

But you’re reading this article and you’re thinking about it. Did it make you feel a bit stupider or a bit smarter to read it? Will you comment on it, or share it, or read more about it, or talk to someone else about it?

What If Steve Jobs Had Been Your Professor?

Steve Jobs was never a teacher in the classroom. He only did a year of college himself. But in his workplaces he was surrounded with talented people, most of whom had college degrees and many of whom had advanced degrees. He seems to have taken on a teaching role in many of his interactions.

Walter Isaacson has written about several geniuses and innovators, and his book, Steve Jobs, portrayed the Apple co-founder and CEO as a visionary and a difficult and sometimes cruel person to have as a boss.

Jobs didn’t take to college. He attended Reed College in 1972 and dropped out that same year. He wandered a bit aimlessly, then after two years he traveled through India seeking enlightenment and studying Zen Buddhism like a good mid-70s late-to-the-party hippie.

What kind of professor would he have been?

He would have been a tough grader. He would not hold back on his criticism. For example, he obviously did not like his early competitor, Microsoft, and called Windows “the worst development environment that’s ever been invented.”

Jobs was not really a geeky tech guru. It was Steve Wozniak who built the first Apple computer, and Jobs partnered with him as the sales guy for Wozniak’s Apple I personal computer. Jobs’s tech side was more about the outside of the machine, and he was famous for his demands for sleek and simple designs. He was a good salesman.

I came across a series of videos of Jobs “teaching” at MIT in 1992, when he was 37. At that point he was running his company NeXT, founded in 1985 after he was originally forced out of Apple.

A few years after this, the little computer graphics division he had bought would become famous as Pixar. And the technology and designs that he implemented at NeXT would end up revolutionizing Apple when it bought NeXT in 1997.

But before he would take back Apple in a pretty ruthless fashion, he was in this MIT classroom. I would call this lecturing and not teaching. (I know a lot of you had lectures that passed for teaching in college but…)

In his uniform of a turtleneck tucked into jeans, pacing back and forth, he talked about tough topics. (These video clips were on YouTube, but disappeared this past week – perhaps they will return; perhaps the Jobs estate had them taken down.) He spoke about why Windows NT was lousy, how he stole people from Microsoft, and why the Apple III and Lisa computers failed.

When asked what he learned by being fired by his own Apple company, he took a very long pause before answering. (This clip was posted by another source and hopefully it will still be there when you read this.)

 

If Steve Jobs were an adjunct professor at my university, I wouldn’t be sure where to place him. Should he teach in the school of management, computer science, or communications? Would students like him as a teacher, beyond admiring him for what he had done?

I think the answers would vary greatly depending on which Steve you had in the classroom: the young Apple founder, the boss just dismissed from Apple, the NeXT/Pixar visionary, the tough, calculating CEO of the new Apple, or the late-years Steve who knew his remaining time was limited. Any of them would have made for an interesting semester.


Steve Jobs gave a commencement speech at Stanford in 2005 that is often quoted (text version). The three stories he tells are three lessons he might have used in the classroom if he had been teaching at that point in his life.