The memory gap: how technology took over the mind

What happens when we outsource part of our brains to the internet?

Lara Prendergast

13 August 2016


Ask me what I had for lunch yesterday and I couldn’t tell you. Names disappear as swiftly as smoke. Birthdays, capital cities, phone numbers — the types of facts that used to come so readily — are no longer forthcoming. I’m 26, yet I feel I have the memory of a 70-year-old. My brain is a port through which details pass, but don’t stay.

I’m not alone. Many young people feel our memories have been shot to pieces. It’s the embarrassing secret of my generation. We can hardly recall a thing. We joke about having early-onset Alzheimer’s, often with a hint of real anxiety. We know that when we reach to remember any detail — a route, a phrase, a historical fact — our minds do not perform at the critical moment. So we reach instead for our phones, which are much more trustworthy. We do so as naturally as we might scratch an itch. How to get from A to B. How to make risotto. How to write a magazine article. Can’t spell a word? No bother — tap a guess on to your screen and Google will figure it out.

Every day, and increasingly in every way, we are outsourcing our brains to the internet. But at what cost? As smartphones get smarter, it’s easy to argue that we’re getting thicker. That’s not quite true. Our brains are not necessarily shrivelling; they are adapting. Thanks to technology, the need to know has been replaced by the ability to find out. Younger people, especially the ‘digital natives’ who have never known life without the web, are most comfortable in this new environment. In almost every profession, expertise is being made redundant. In my profession, journalism, the sharp kids with fresh IT skills can claim to have the edge over the seasoned hacks (or at least that’s what they want our bosses to think). Or look at the taxi industry, which has been revolutionised by apps such as Uber. London’s black cabbies have discovered that ‘the Knowledge’ — that impressively encyclopaedic study of the capital’s streets — has become all but obsolete. With Uber, anyone who can drive and has a smartphone can earn money as a taxi driver.

Even in high-pressure fields such as medicine, politics or law, the speed with which information can be found means that professionals rely on technology as much as everyone else. Some doctors freely admit to searching Google for symptoms as their patients describe them. Lawyers no longer need to remember the intricacies of tort law — at least not parrot fashion.

What does that mean for education? University finals, where students rely on memory unaided, seem an anachronism. Once, it was the perfect training for later life: learn a subject, store the information, use it later to your career advantage. But when every fact is just a click or two away, what’s the point?

I can at least still remember what it used to be like to commit a fact to memory. You could take pride in it. It was a delicious, joyful thing, a gentle high. That has now been replaced by addictive short, sharp hits of dopamine mixed with adrenaline. Who can search first? Who has the fastest fingers tap-tapping away on their phone? The pleasure of contemplation has been replaced by the constant buzz of ephemera passing us by: on Instagram, on Facebook, on crack-like news apps. Even language is often bypassed; we increasingly communicate via images to save time. Forget a thousand words; send an emoji. Or a picture via Snapchat that will self-destruct after a few seconds.

Neurologists talk about the ‘plasticity’ of the brain — its ability to adapt its function according to which neural pathways are most employed — and there is evidence to suggest that our brains are changing to meet the demands of this high-octane modern world. It’s reactionary to assume that this is bad news: the idea that technology is ruining our ability to think and communicate properly is as old as technology itself. People blamed the telegram for curtailing speech. Radio was thought to be dangerously mindless, and everybody has always said that television rots the brain. But, for all these obstacles, humanity has just become more ingenious, so much so that we invented the internet, a medium for being clever without using our intelligence.

If our brains are changing to the new digital environment, maybe we should feel encouraged by our resourcefulness. Perhaps memory is something we can afford to sideline, and instead we can focus on skimming off facts and figures while relying on our short-term memory. Ensuring that knowledge is actually remembered requires time and concentration. And in this world of instant notifications and non-stop info, speed is king. Why bother learning ten things when your phone can find out any one of a million things in a few seconds?

The answer is that the brain requires exercise, and we allow it to atrophy at our peril. While we get better at juggling ideas, our memories are taking a battering. An academic study into the ‘Google effect’ showed that people tend not to bother remembering something if they believe it can be looked up later. People were more likely to index; to remember where information was located rather than the actual information itself. That study was five years ago and technology has moved on significantly. Want to bet that people’s memories have got better — or worse — since then? Last year, 91 per cent of people surveyed for another study into ‘digital amnesia’ said they used the internet as an ‘online extension’ of their brain and 44 per cent relied on their smartphone. Of 6,000 adults surveyed across Europe, more than a third turned to computers to help recall information. The UK had one of the worst rates: more than half of British adults admitted that they don’t even try to remember answers, they just search online. We are becoming a flabby-brained nation.

Techno-libertarians rejoice at the idea of computers becoming integral to the human experience. The big nerdy fantasy is that we’ll all become hybrids — part human, part computer. And with the advent of wearable tech — Apple watches, Fitbits and so on — that process seems well under way. But if we cease to be fully human, life must become something less. Already, young people depend on technology for peace of mind. When our phone batteries run out, we feel a deep anxiety, not because we desperately want to read our emails, but because our gizmos are now part of who we are.

Worse still, the machines have an insatiable appetite for more information, which must come at our expense. My phone keeps telling me it has run out of memory and that I must buy more. Then there is the ‘cloud’, a separate memory bank where the phone also suggests I store things. I assume this cloud brain floats somewhere above California. Keeping it full of my information could become a very expensive habit. But the thought of losing that stuff is terrifying. I would (and do) pay handsomely to insure against it.

Suppose my cloud and all other clouds vanished, though. What if, instead of a nuclear strike or tsunami, an electromagnetic pulse wiped every hard drive, every detail of every bank account, every family photo? Then what? History is littered with examples of knowledge being destroyed or damaged. Details about ancient Roman sanitation were lost for hundreds of years during the medieval period; the destruction of the Great Library of Alexandria — one of the ancient world’s great archives of knowledge — should still serve as a warning for us. We assume that nothing is ever lost online, but that’s not true. A computer-science study from 2012 showed that almost a third of recorded history shared over social media during the ‘Arab Spring’ uprising in Egypt has since been deleted.

If everything is lost in the digital ether and nobody has bothered to remember anything, then what? The Long Now Foundation hopes to become a ‘long-term cultural institution’ to counter the fact that ‘civilisation is revving itself into a pathologically short attention span’. Its grandiose plan — to help archive digital material in a responsible way for the next 10,000 years — sounds whimsical, but the idea behind it opens up an interesting discussion: how do we preserve our experiences so that when future civilisations look back at us, they don’t just see another dark age?

The problem of digital amnesia is more immediate. John Locke thought that memory and our sense of self were inevitably linked, because personal identity was founded on consciousness. (At least that’s what his Wikipedia page says.) Surely that’s right. When memories fail in old age, we feel we lose a part of us that rests deep within. That is why Alzheimer’s, which afflicted both my paternal grandparents, is such a cruel disease. It may well be that memory is more spiritual than we like to admit. By using our minds, we nourish a part of us that goes beyond the physical. Equally, by storing memory outside of ourselves on a piece of technology, we lose something fundamental.

Ted Hughes recommended memorising poetry not just for its own sake but as a form of exercise — for mind and soul (thanks again, Wiki). My plan now is to try to do just that: remember a few more things each day, rely less on my smartphone and have a go at learning the odd poem or two. Reflection, I hope, can cure this modern affliction.
kenshiro

Interestingly, in America "rote memorization" has been out-of-fashion for a long time in pedagogy because it makes black people look bad. Every educational fad for decades has been trying to unlock a student's "critical thinking" skills, which isn't as focused on "mere facts." I found privately that the best way to think critically is to vacuum as many facts as possible and use those to start trying to infer general patterns about how the world works, but most formalized teaching is not interested in this. This is why Americans of the last 2-2.5 generations know practically nothing, not even the names of all 50 states or the Presidents or who Napoleon was. This is an old 8th grade test from Kentucky in 1912, and I can confidently say that there are exactly zero members of U.S.'s "intelligentsia" who could get a satisfactory score:


Ironically, I think the affect of smart phones on memory in USZOG will be minimal compared to the rest of the world. Parents who want their kids to sort of know things should probably restrict the amount of time with electronics (especially TV), and stay away from public education in general.

Thoughts

I have often thought that the entire idea of education is largely subjective.

People today probably know as many disconnected "facts" as people did before, but the facts fall into different categories from the traditional ones. Scratch anybody allegedly "poorly educated" and you'll find some "encyclopedic" knowledge of bicycles, baseball games or other garbage. Anyone can easily acquire facts; all it requires is the ability to read and write. I remember cramming for geography tests during grade school the night before an exam, subsequently "acing" them, and then forgetting everything a few months afterwards. Anybody you pick can easily write a "test" on individual facts that other people will fail instantly (but would obviously pass if they were simply given a list of topics to cram for a few days).

Nor is "critical thinking" any better. Anyone can perform "critical thinking". All you have to do is: be mentally retarded, and chafe at the sign of the slightest obstacle in front of you. The non-critical student merely performs arithmetic. The critical thinker thinks that it's possible to make mistakes while performing arithmetic, and therefore "it cannot be trusted" (and by the way, he wonders, does the world really exist at all, or am I in a virtual reality program?). He subsequently becomes a tenured professor of "feminist ethics" and rants about how science is a patriarchal, phallic construct.

Pure "critical thinking" is just mentally wandering around: you don't need to actually focus or think your way out of a paper bag, which partly explains the mediocrity of academic philosophy departments. Simply memorizing individual facts doesn't require you to closely perform work, write things down, closely look at visual content, organize and construct things, etc. It also comes very close to vacuous and empty pure thought, the mental equivalent of "doing nothing".

But the primary difference between knowledge of inert facts and knowledge of skills and technique is not even difficulty. It is that in the second case, one is reminded of the real-world consequences of making a mistake, usually consisting of physical pain (or long-term damage of some kind).

As people use mindless consumer devices like smartphones, TV, etc., they get stupider. Instead of using a tool (such as a knife, or a hammer) that reminds them of the objective consequences of making mistakes, they simply swipe around with pudgy fingers on a glass touchscreen, with no real-life responsibilities. They lose any kind of technical skill, all of which requires hands-on work (actually writing things down). They acquire, however, some kind of ability to rote memorize an "encyclopedic" body of knowledge about celebrity gossip, techno-vaporware and health fads given in internet news feeds. Joy.

Thoughts

This thread is part of the broader topic of how human memory has been altered throughout history: the ancient Greeks and Romans could memorize things far better than people today can (they could recite entire epic poems after a single hearing), and they had very sophisticated mnemonic techniques that are mostly forgotten except for some relatively simple ones. Writing instruments led to deteriorated memory, but improved other features of cognition in which a "layer" of physical movement and response could coordinate with what is purely mental. E.g., poetry is often best written using an actual pen and piece of paper, since there are details in how you can "pace" yourself, while prose is better written on a typewriter.

By the way, this is an important point of general cognition that should be discussed in depth. In a certain sense it must be true, because every act of learning requires memorization, or else what is learned would be forgotten (forgetting being the only alternative to memorization). Hence, there can be no learning -- or even human cognition -- without memorization.

"Rote" memorization is often the very first step in gaining technical skill, and in those cases it is indispensable, but in most cases it is not the basis for any kind of technical or in-depth knowledge whatsoever. In the case of geography, it is the basis for making immediate decisions and perhaps responding to current events, but it is not even the most important part of that basis (which should consist more in the skill of closely observing the complexities of a situation that is directly drawn out, whether it is drawn out from memory or otherwise).

In other cases, what appears to be "rote" memorization is not rote memorization at all, but intelligent memorization -- for example, how theorems, formulae or complicated machinery are remembered. In that case, "memorization" obviously requires analyzing the objects into components and putting them together in a specific order so that you can unpack them extremely quickly and in a visual format useful for problem-solving. (For example, memorizing objects by gradually "expanding" them from the simple to the complicated, or memorizing the location of an object within machinery by gradually "expanding" from that object to the entire machinery.)

Not only does "intelligent" memorization imply analytic or observational skill, but the second also leads to the first (the necessity for memorization being obviously 'seen' during the course of any analysis -- except by those incapable of observing their own thought).

Memorization without analytic skill, or which does not serve as the basis for further such skill, is certainly not any important part of general education in the sense that it does not indicate any kind of "intelligence" or mental skill. It is also not that important in terms of its consequences, as explained above. It (or regions of it) may be necessary however in a moral sense, simply because history is part of the human heritage.

I also note how educators and the like frame intelligent thinking as "critical" thinking, or even "reasoning", which is poisonous for various reasons that I can't go into for reasons of space. Analytic (hence physical, manual, and *perceptual*) work -- moving the hands and the eyes around to construct something complicated and to observe all its features, memorization for the purpose of *holding up* a picture and picturing tons of items in the side panels of one's field of vision (as when playing chess) -- is the essence of responsible thinking. Add physical pain there, too, though it should be noted that those who are responsible only for simple things easily lack responsibility for the consequences of seeing details that have a slightly longer-term impact. The better alternative is to suffer extreme physical pain, or at least public humiliation for all posterity, as a consequence of not looking at some detail of a situation or distinction (and yet promoting certain decisions). Certainly, as far as 'responsibility' is concerned, most everyone is fatally flawed in some region or another.
kenshiro

Another example of the rapid decline of higher educational standards just within a century.


Thoughts
kenshiro SixtusVIth Welund

On the decline of education, here is a link to my post on mnemonic techniques in the past:

https://salo-forum.com/index.php?threads/lost-sciences-of-the-ancient-world.4622/#post-39902

Jesuits like Ricci could learn a foreign language in a very short span of time; in fact, Ricci was known to be able to memorize Chinese characters immediately upon seeing them.

There's a completely symptomatic and delusional book that says basic algebra and calculus are "useless" because you can get a degree in political "science" or "most fields" (the unrigorous ones) while knowing nothing about actual mathematics or physics, or indeed anything whatsoever.


So education has certainly declined, although the caveat I touched on above is that both modern and pre-modern education were deficient in an absolute sense. As necessary as the more straightforward "rote" memorization is, in relative importance it is less than the dense connection of memorization and analysis (such as is developed in skills like chess, various other board games, mathematics, chemistry, physics and so forth), as briefly explained here. The alternative is the purely superficial analysis of facts, which (as mentioned elsewhere) has its place but is not to be emphasized over the knowledge of detailed technique. In this respect, both modern and pre-modern education were relatively deficient. There's a whole bunch of subjects that should be taught today in elementary school, but are not: mechanism design, combinatorics, statistics, etc.

A final point to repeat is that even the knowledge of surface facts (while even worse now than previously) has always been deficient in terms of how subjective it was. E.g.: apart from basic geography and such, there should be knowledge of basic industrial geography and infrastructure, which is not inferior to geography in importance. One kind of conceit is that urban people imagine that they "know more" than farmers or rural inhabitants, whereas this is not usually the case but is rather a matter of different directions of knowledge (if one actually examines it in detail).

Another illustration: I can immediately name the biggest industrial firms in each sector (say Du Pont, or Corning in glass) and their relative revenues and market capitalization for each period. This is much more important than knowledge of the exact placement of each river (say, which regions each river runs through).

As textbooks, one can use something like this, or this.
Welund

This article doesn't go the extra mile to connect intelligence with physical reality, but it includes the dysgenic trend, the "decay" of education into compliance certificates, a comparison between farmer knowledge and modern "service worker" knowledge, as well as an explanation for the costume worn by professionals and professed unprofessionals: mildly unauthentic, thick-brimmed glasses (very bad eyesight = intelligence?), possible flannel/beard (to signify an intellectual-romantic "return to nature" and "hedged consumerism", I guess).

http://www.unz.com/article/better-than-a-tigers-maw/

Better Than a Tiger’s Maw
Why the Future is Looking Bright for Dullards
ROBERT BAXTER

Welund
This thread can serve as an even broader topic of "How Technology Took Over the Mind," reverse Turing tests, etc.

Here reality decides to troll poster Thoughts:


YouTube: https://www.youtube.com/embed/dHLSoSqzlzo
Thoughts