Will the web mean the end of the contemplative mind?
The human brain is an organ that adapts readily to experience. It was once believed that the malleability of our neural pathways ended with childhood, but we now know that even the adult brain is constantly changing in response to stimuli and cues from the environment. "Plasticity," says Alvaro Pascual-Leone, a neuroscientist at the Harvard Medical School, is "the normal ongoing state of the nervous system throughout the life span."
The tools we use to aid our brains in gathering, analyzing, and storing information - what I call our intellectual technologies - exert a particularly powerful influence over the functioning of the billions of neurons in our skulls and the trillions of synapses that connect those neurons into the circuits that govern cognition, perception, and emotion. The influence has been documented by many neurological studies over the past forty years. It is also immediately apparent from even a cursory glance at mankind's history. Intellectual technologies like the map, the mechanical clock and the printing press helped to spur major shifts in the way our ancestors thought, with enormous ramifications for society and culture.
As we celebrate the 20th birthday of the World Wide Web, we would be wise to spare a moment to consider the effects the Web is having, and will continue to have, on the lives of our minds. The Internet is rapidly subsuming all of our traditional communications media, the various tools we use to transmit, discover and share information. We devote more and more of our time and attention to the Web because it is so cheap and convenient to use and so responsive to our needs and desires. A touch of a key or a keypad brings a gratifying response - an answer, a message, a connection. Who can resist?
But as the Web bombards us with an endless stream of entrancing multimedia tidbits, as it prods us to skitter from one task to another, it is also altering the circuitry of our brains. Just as the intellectual technology of the book taught us how to be deep readers (calm, reflective, patient), so the Web is teaching us how to be info-surfers (hurried, distracted, anxious). The neural changes do not go away when we turn off our computer; they persist in the structure and functioning of our gray matter. They become part of our mental "wiring." A Stanford University study, published just last month, showed that heavy "media multitaskers" score much more poorly on tests of attentiveness and concentration than do people who do little media multitasking. Heavy multitaskers are "suckers for irrelevancy," said the lead researcher, Clifford Nass. "They're distracted by everything."
For most of the past 500 years, the ideal mind was the contemplative mind. The loss of that ideal, and that mind, may be the price we pay for the Web's glittering treasure.
Comment number 1.
At 14th Sep 2009, SheffTim wrote:I’m posting a link to more on the Stanford University study on multitasking that Carr mentions. The page includes a video about it too.
Comment number 2.
At 14th Sep 2009, EnglishFolkfan wrote:The question I would ask is: what about people who have passed, and who will continue to pass, through the 'learning to read' net? I mean those who fail to become natural and continuous consumers of the paper-based printed word. Over the past four decades, as technology has placed the computer and mobile phone keyboard in more 'non-reading' people's hands, I wonder whether these people are also creating and reading more text than they would otherwise have been doing. I haven't checked for any scientific references, but I'm theorising against the old idea that 'males don't read/converse'.
As for reading for pleasure, we were told in my childhood that this would become an outdated pastime with the advent of television (just as cinema would die out too). Now, globally, there are more books published and films made annually than when the Web started. Indeed, thanks to the Web and the various book projects discussed elsewhere on BBC DigRev, even out-of-print books will become available again.
When I sat and read 4 novels for pleasure over 3 days last week, it did not matter whether they were printed in conventional form or were ebooks; it was still a sustained reading task. I believe there is much more flexibility in ebook usage; and don't get me started on how amazing the audio book is for giving many people a chance to indulge their literary needs. Plus, I suspect, the brain is probably conditioned for this medium from when we were pre/early readers and enjoyed having books read to us.
I have no knowledge of the age range or personality types of the subjects in the Stanford multitasking research tests. I am also sceptical of the results because I don't agree with their definition of a multitasker, or 'media multitasker' as Nicholas Carr calls it. One of the things I would want taken into consideration when measuring performance is personality type as defined in the Myers-Briggs Type Indicator, which measures preference, not ability. According to the MBTI research we learn and operate according to our 'Type' and, if I remember correctly, two of these can in a very simplified way be described as those who prefer to undertake tasks in a linear way and those who prefer to see the whole problem before understanding the solution. From this I would surmise that some people would not choose to be multitaskers and some will naturally be multitaskers. There has been much research on the importance of this being understood and applied to the presentation of online learning.
Perhaps the better solution for how we access web based information will be in letting us each do so in ways that fit our 'Learning Style', maybe that will lessen the neurological problems that Nicholas Carr fears.
The Wikipedia explanation of MBTI is a good summary of its US use:
The MBTI Foundation is here
Georgia State University: some good ideas on the supporting of learning styles:
Some recent research on 'The use of learning styles in adaptive hypermedia'
For a little historical perspective and interest plus for the fun of looking back too
here is a paper from the '90s on 'Adaptive Hypermedia & Online Learning'
and also the 'Brave New Worlds'
Sadly I doubt that many online 'multitaskers' will be commenting here but for me, Nicholas Carr, the distraction of thinking and researching online has kept my little grey cells occupied this dull September afternoon so - Thank You.
Comment number 3.
At 14th Sep 2009, A_PERSON_NOT_A_BOT wrote:What’s of more concern than the way the Web is altering our brains is that today’s neuroscientists seem to be losing the language, the reasoning, the imagination and the art of science.
“PLASTICITY”???!!!
Please, I need some space…….. to LOL and cough at the same time!
From a Harvard neuroscientist too (Alvaro Pascual-Leone), especially after Professor Greenfield’s pop psychology about twittering socks possibly correlating with some form of Freudian ego-superego-id deficiencies.
Wow, the state of neuroscience. No wonder everyone’s worry wart is getting bigger --- only they’re obviously worrying about the wrong target: the Web and its potential wrongs.
You see, the last time I checked the brain is NATURAL and composed of ORGANIC matter. It's not a synthetic (silicate, polymer or acrylic).
Moreover, “plasticity” would imply that a noticeable amount of induced artificial external energy (and noxious chemicals like chlorine and acetone) is being directly applied to the human brain in order to “mould” it…….All within an INDUSTRIALIZED MACHINE. That’s what happens in the molding of plastics: machines, extreme temperatures, and hazardous inputs.
Unless anyone wants to raise their hands and say that they have a plastic brain, um……..”Plasticity” is not the most precise or pertinent scientific term to use to convey the dynamism and adaptability of our natural neurals. Even if you were trying to make science accessible to “John Doe on Main Street”, you’d hope a Harvard neuroscientist would have more imagination than that!
Ai-yo….
Seriously, by all means analogize about how the electrons coursing through our synapses are able to slingshot, attach/detach and spark off our nodes in much the same way that Spiderman can shoot, bind/unbind and blitz his cellulose superstrings from one building or tree to another…..
BUT PLEASE……….NO “PLASTICITY”!!!
O!
M!
Newtons!
Well, at least he didn’t use “ductile”. That would really have floored me --- since ductility is most appropriately applied to metals. Now, if anyone wants to hold up their hands and admit they’re a Cyborg with a metal brain implant, they’re welcome to. No value judgment of them and their ID on my part.
Seriously, I am almost ROTFLOL @ the choice of “plasticity”.
“It was once believed that the malleability of our neural pathways ended with childhood” --- Nicholas Carr.
Clearly, this must be a Western heritage thing. Orientals haven’t believed this since Confucius (circa 500 BC), a view later reinforced by Sun Tzu (exact dates not established, but I’d estimate circa 425 BC) and Zhūgě Liàng (circa 200 AD).
This Western “ending with childhood” concept is also surprising and something of an incongruence given that, with the exception of Mozart and a handful of other child prodigies, the greatest body of brilliance and “Eureka!” solutions was achieved by people well into their adulthood. Presumably this was because their neural pathways were continuously and contiguously remapping on contact with additional concepts until they finally got those gold-nugget insights --- rather than that they’d reached the apex of their intelligence in childhood but simply didn’t publish their scintillating works out of laziness, arrogance, forgetfulness or the lack of a Gutenberg press.
The tools we use to aid our brains in gathering, analyzing, and storing information - what I call our intellectual technologies…--- Nicholas Carr.
Hmmn, it’s interesting Carr classifies our brain abilities so linearly and makes no mention of:
• PERCEIVING
• IMAGINING
• CROSS-POLLINATING
• SENSE-MAKING
• COMMUNICATING
• SATIRIZING (or at least humoring)
• CULTIVATING
Heavy multitaskers are "suckers for irrelevancy," said the lead researcher, Clifford Nass. "They're distracted by everything."
I wonder what today’s neuroscientists would make of Leonardo da Vinci, Queen Elizabeth I, Qin Shi Huang, Orson Welles, Peter Ustinov or any of the world’s known polymaths then? No, it’s probably better not to let today’s neuroscientists anywhere near our polymaths or they might try to make Bakelite from their brains.
LOL, again.
Naturally, by polymaths I mean people who are able to multi-task (or examine multiple dimensions of a situation or project-task simultaneously), complete objectives in a dynamic way average humans can’t and who apply the full flexibility of their brains: for example, paint the Mona Lisa and The Last Supper, build military weapons, revolutionize the concepts of anatomy and imagine a whole range of futuristic contraptions --- the parachute, the helicopter, the tank, the plane, scuba diving gear, solar panels, plus etcetera etcetera, ancora un'altra volta.
Then again the likes of Nass would probably argue that da Vinci didn’t do those things all-at-once on the Internet and if da Vinci had only learned to contemplate and concentrate more, he’d have bequeathed to us a larger collection of completed masterpieces rather than what he has.
Yes, what a failure that neural multi-tasker, da Vinci, was!
See? Somehow, “suckers for irrelevancy” is probably not a correct label for heavy multi-taskers. The Stanford team probably didn’t have an appropriate sample population because the true heavy multi-taskers like Oprah or the top-flight Hollywood directors and dynamic CEOs are out there running US$ billion projects instead of being on campus or in class, or separating blue squares from all the red ones. Plus we should ask whether their political affiliations may have caused any bias in their ability to see blue when it was red and vice versa.
In seriousness, what would be a much more interesting and important study is to work out how some people are capable of engaging with multiple sources of media information and EXCLUSIVELY FILTER IN what’s relevant, accurate, up-to-date and pertinent as an input they need to go and complete a task on their multiple to-do lists.
MISSING INFORMATION ABOUT THE STANFORD STUDY
==================================
It says in the article that the sample population was 100 students. What would be more informative is what the male:female ratio of this sample population was (no info on the video either) since there are contrasting studies which indicate women are more adept multi-taskers and that age is also a contributing factor:
Also, the Stanford team is composed purely of men, so their tests for multi-tasking are going to be pre-emptively……..male. This compares with some other studies on the matter where the research team comprises both genders or only women.
This returns us to the often debated, “We (inadvertently/deliberately) pre-affect the conditions to produce the results we want” argument about scientific methodology.
It would be interesting to see the results if the same Stanford team ran the study again with 3 female researchers helping to design it, and with an as-near-50:50 male:female ratio in the sample population as possible. Age group and ethnic group sampling should also be included.
Then we’d probably get a clearer perspective on the whole multi-tasking issue.
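To make the kind of balanced design being suggested here concrete, a minimal sketch of stratified sampling (hypothetical candidate pool, field names and per-cell quotas of my own invention; nothing to do with the Stanford team's actual recruitment):

```python
import random
from collections import defaultdict

random.seed(42)

# Hypothetical candidate pool: (id, gender, age_band) records.
candidates = [(i,
               random.choice(["F", "M"]),
               random.choice(["18-24", "25-34", "35-49", "50+"]))
              for i in range(1000)]

def stratified_sample(pool, per_cell):
    """Draw the same number of subjects from every gender x age-band
    cell, so no single stratum can dominate the results."""
    cells = defaultdict(list)
    for record in pool:
        _, gender, age_band = record
        cells[(gender, age_band)].append(record)
    sample = []
    for members in cells.values():
        sample.extend(random.sample(members, min(per_cell, len(members))))
    return sample

# 2 genders x 4 age bands x 13 per cell is roughly the study's
# 100 students, but balanced 50:50 by construction.
panel = stratified_sample(candidates, per_cell=13)
print(len(panel))  # ~104 subjects
```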
MULTI-TASKING, DISTRACTED OR CONNECTED
=============================
For those who are seriously interested and can afford the US$695 report from eMarketer:
Otherwise, simply google for the free resources or check the Amazon books on the matter.
GOOGLE IS MAKING US STUPID --- Nicholas Carr
What a pity the Atlantic didn’t allow for threads on that post. Maybe they didn’t think readers would be able to concentrate long enough to read the article and get to the end to thread or that we’d have to go and google “Google” because our mayfly brains just couldn’t even remember what Carr was writing about.
LOL, sorry more humor slipped from me there.
From my perspective and experience, Google is certainly NOT making us more stupid. It's simply surfacing resources with more speed and in a different dimension. When I go in search of resources, being able to bookmark them certainly helps, but MY BRAIN is still the natural and personally tailored system that has to perceive the content appropriately, make sense of it, contextualize it, reconfigure/re-imagine it for a project, cross-pollinate it, synergize it to derive a solution, communicate it and do all this with some GSOH!
My brain is also the one which has to nanosecond Rolodex through a list of memories of not just the keyword I need to input into the search box, but also the when and where of how I remember my first, last and most recent exposure to that item, and who it was that flagged me on it.
Yes and my usage of Google hasn’t diminished my ability to recall and recite entire tracts of literature acquired when I was a teenager, my algebraic reasoning or my flexibility to switch between several languages.
Ditto my knowledge that “plasticity” is a scientifically inappropriate analogy for the brain’s adaptability.
As for Carr’s references to the Frederick Winslow Taylor methodology in industrial processes, presumably he’s unaware that Taylor became increasingly redundant after the advent of the Ford philosophy of car manufacturing, and that the Japanese in the 1980s showed JIT to be a superior approach to systematic processes. This itself was then superseded by the more holistic Six Sigma approach, and we’re now on the Global Brain approach as proposed by Professor Mohanbir Sawhney and his colleague, Satish Nambisan.
Or perhaps he simply forgot to google it, apply concentration and make the connections to bridge the gap between Taylor’s framework in his 1911 treatise, when there was no GATT (that appeared in 1947, for the historians amongst us), and today’s globalization effects on manufacturing protocols.
Unfortunately it’s exposure to quasi-science, disjointed history and pop psychology that’s supposedly passing as scientific fact (facilitated, ironically enough, by the accessibility of the Web) that’s in danger of making us all more stupid than the Web, all the digital influxes and Google --- of and in themselves.
The sooner the contextualization and coherency filters for the Web arrive, the smarter it will be for all our brains!
Comment number 4.
At 14th Sep 2009, A_PERSON_NOT_A_BOT wrote:Now I'm going to google "Nicholas Carr" 140 times as a Google-Twitter brain cross-training test. LOL.
Oh and the "contemplative mind" as the ideal mind concept? Around for the last 500 years only?
Carr might want to google "origins of philosophy". The Aristotelians, the Marcus Aurelius-types, the Confucians, the Buddhists, the Sikhs, the Taoists and some other deep thinkers might have an opinion on that.
Comment number 5.
At 14th Sep 2009, EnglishFolkfan wrote:I rather think that neuroplasticity is a well established scientific term.
Seven scientific journals and some of the articles can be seen here:
This was found via the search tool that is now usually the first one used.
From the quote:
"What is brain plasticity? Does it mean that our brains are made of plastic? Of course not. Plasticity, or neuroplasticity, is the lifelong ability of the brain to reorganize neural pathways based on new experiences."
Now to go fire up some more synapses!
Comment number 6.
At 14th Sep 2009, A_PERSON_NOT_A_BOT wrote:Ah, okay and for the purposes of fair play it should be noted that Carr's postulation is:
Is Google Making Us Stupid?
Rather than as a foregone conclusion in the definitive.
Nonetheless, my observations stand.
Comment number 7.
At 14th Sep 2009, A_PERSON_NOT_A_BOT wrote:Oh, I'm au fait with neuroplasticity --- just about......because elasticity gets associated with rubber bands and then rubber gets etymologically melded with plastic and ergo "plasticity".
However, it's still NOT the smartest or most eloquent way to analogize the brain's flexibility.
Comment number 8.
At 14th Sep 2009, A_PERSON_NOT_A_BOT wrote:It's doubly funny because the neuroscientists are worrying about how the Web is remapping our brains and they're not even aware that the way they're labeling brain morphing as "plasticity" is also rewiring our brains.
Comment number 10.
At 14th Sep 2009, SheffTim wrote:As we've been discussing literacy, attention spans and learning styles, and previous posters have commented on educational uses of the Web, it could be worth asking whether limitations in real-world education are limiting Web take-up, even in first-world countries.
Unless people reach a minimum literacy (reading/writing & digital skills) threshold they are unlikely to access Internet services, know how to use them or understand information* on web pages - no matter how skilfully designed or educationally valuable those pages are - because they struggle with reading.
----------
*The wider your vocabulary, the greater the number of concepts you understand and the greater the comprehension (the ability to understand what is being said within the context in which it is written, as well as connecting it to existing knowledge).
----------
DigRev should at least touch on the growing digital divide between the haves and have-nots; it is unlikely to go away in the foreseeable future. The past 20 years have seen a revolution, but it hasn't affected everyone.
According to the Leitch Review of Skills (2005) 15% of the UK population were functionally illiterate; 45,000 16-year-olds leave UK schools each year functionally illiterate and/or innumerate.
The Basic Skills Agency reported (2004) 'that adults described as functionally illiterate total 24% of the UK adult population rising to as much as 40% in some areas.'
According to NIACE (Annual Media Literacy Survey 2008), 17 million people in the UK (30%) are still excluded from digital technology:
6 million who cannot afford internet access.
One in three people over 50 don’t use the internet.
Over one-third (36%) of adults do not have access to a computer and over two-fifths (42%) lack access to the Internet.
And there are digital refuseniks who wouldn't use the Web even if given free access:
"We are at a tipping point in relation to the online world. It is moving from conferring advantage on those who are in it to conferring active disadvantage on those who are without, whether... [on] offers and discounts, lower utility bills, access to information and access to public services. Despite that increasing disadvantage there are several obstacles facing those that are off-line: availability, affordability, capability and relevance..." NIACE.
Comment number 11.
At 14th Sep 2009, A_PERSON_NOT_A_BOT wrote:+1 for SHEFFTIM's comments.
It is important to gain the perspectives of those who are not online at all, particularly if there is an educational disparity involved. How is their identity, sense of self and awareness about net lingo like "plasticity", "tweet" and "digital native" different from those who are immersed in these trajectories of language evolution online?
If they're not keeping pace with this rapid linguistic re-routing, is it also affecting their relative intelligence?
Comment number 12.
At 15th Sep 2009, nicholascarr wrote:Dear A_Person_Not_a_Bot,
I believe the word "plasticity" derives from the Greek word "plastikos" rather than from the man-made material to which you so animatedly refer. The use of "plasticity" to refer to adaptability is well established in biology.
Yours,
Nick Carr
Comment number 13.
At 15th Sep 2009, A_PERSON_NOT_A_BOT wrote:XLNT, now we have true interactivity and inclusiveness: a key interviewee is communicating and clarifying in a common exchange of contextualization.
Comment number 14.
At 15th Sep 2009, A_PERSON_NOT_A_BOT wrote:Dear Nicholas Carr,
The Greek word is PLAISTIKOS, not plastikos as you wrote. If we google the latter we will end up mostly on plastic injection-moulding sites and plastic surgery sites.
Yours,
A_PERSON_NOT_A_BOT
Comment number 15.
At 15th Sep 2009, A_PERSON_NOT_A_BOT wrote:Here is something interesting about language, its origins, its orientations, its fluidity, its semiotics and its neural metamorphosis which has relevance to the Web. Digital indexability and searchability (aka functional utilities provided by the likes of Google) means that those of us who have access to the Internet can trace a word's originations and mutations over time. It's not limited to Classical scholars.
Obviously, we do need to be able to spell whatever word it is properly in the first place (with appropriate accents, umlauts, conjunctions and other denotations on the vowels if need be). Plus, in the case of "plaistikos" it's smarter to search for it in Google (Greek) as "ΠΛΑΣΤΙΚΟΣ" (also spelt as "πλαστικός").
If we do this it becomes clear that there is a "loss in translation" effect somewhere in the derivation from the Greek word to the American usage and that this was probably seeded in the fact that the English alphabet stems from Latin sources, which has variations from the Greek alphabet. Plus the fact that English also has Nordic influences in its DNA as well as Latin.
If we do a letter-for-letter swap from Greek to Latin alphabets, "ΠΛΑΣΤΙΚΟΣ" becomes "plastikos" and "πλαστικός" is "plastikoz". So......somewhere an "i" has gone missing which may also be an insight we're missing. Anyone who thinks that a single vowel doesn't make a difference should be aware that the words "complement" and "compliment" have different meanings and connotations. In Chinese, a dot can be the difference between us writing an actual word and complete gibberish.
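For what it's worth, the letter-for-letter swap described above can be made mechanical. A toy sketch (my own minimal mapping, covering only the letters of this one word; note that Greek sigma has three forms, Σ/σ/ς, and the standard Latin value for all of them is "s", so both spellings come out as "plastikos"):

```python
# Toy Greek-to-Latin transliteration covering only this word's letters.
# All three sigma forms map to "s"; the accented ό is included so the
# lower-case spelling works too.
GREEK_TO_LATIN = {
    "Π": "p", "π": "p", "Λ": "l", "λ": "l", "Α": "a", "α": "a",
    "Σ": "s", "σ": "s", "ς": "s", "Τ": "t", "τ": "t", "Ι": "i",
    "ι": "i", "Κ": "k", "κ": "k", "Ο": "o", "ο": "o", "ό": "o",
}

def transliterate(word):
    # Unmapped characters pass through unchanged.
    return "".join(GREEK_TO_LATIN.get(ch, ch) for ch in word)

print(transliterate("ΠΛΑΣΤΙΚΟΣ"))  # -> plastikos
print(transliterate("πλαστικός"))  # -> plastikos
```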
Now, if we search for the Anglicization "plaistikos" with Google (Greek) it only shows 2 results and both refer to plastic surgery procedures. However, if we search with the Anglicization "plastikos" in Google (Greek) there are over 50 pages of results --- some referring to brain formation, some to plastics moulding and some to plastic surgery. Meanwhile, if we search for "plastikos" under Google (Greek) we only arrive at the links to the biological connections to the word's Greek origins under "PLAISTIKOS".
Interestingly, whilst I was surfacing this information it sparked in my brain that perhaps the "ai" might be like the way the German umlaut works: ü is shorthand for "ue". If so, then "ai" could just be "a" also.
IT WOULD BE HELPFUL IF WE COULD HAVE A GREEK ETYMOLOGIST HERE TO HELP US.
Now, the classical Greek word ΠΛΑΣΤΙΚΟΣ means "to form". At some contemporary point, it's also acquired the meanings of "to shape, to mould" and then been mutated into a connection with brain flexibility via the advent of plastic surgery --- probably principally in observance of the elasticity of our largest organ to be stretched and re-shaped.
Whilst searching with the Greek version of Google, I came across a dissertation which ties the word "plaistikos" in with "tartuensis form". If anyone can explain the semiotics of "tartuensis" that would be really helpful.
Yes and Nicholas Carr might be pleased to know that Google is not making us stupid. It just facilitated my ability to connect some language derivations and semiotic mutations of word DNA.
Comment number 16.
At 15th Sep 2009, A_PERSON_NOT_A_BOT wrote:As ENGLISHFOLKFAN and SHEFFTIM rightly highlight, gaining some insights into the language barriers involved online would be useful for Program 4, particularly since language is a key contributor to our cultural identities.
Plasma, that's the other word that immediately sprang to mind when I read "plasticity", because the prefix is the same. Then I remembered that a plasma is essentially a collection of free-moving electrons and ions (atoms that have lost electrons). Could this be a proxy for our brains? Some of our electrons are free-moving along the synapses and can engage in some form of electrophilic or nucleophilic exchange (addition or substitution) with the nodes, whilst there are other zones with no electrical activity (either electron loss or some sort of blockage/barrier) which means a thought can't be transmitted there --- i.e., the person doesn't receive it.
People may notice that sometimes I spell with English English, sometimes with American English, sometimes I inject pure SMS lexicon into paragraphs, sometimes I utilize grammar and punctuation that Lynne Truss would probably disapprove of, and sometimes I slingshot in some foreign idioms.
Rest assured, I have a Double A in English Language and Literature, my teachers (English, Chinese, French and Italian) were all pedants and I gained that education before grammar checking in MS Word became the norm.
It's just interesting creative communication to be able to do this:
O!
M!
Newtons!
The other noticeable way in which our brains are evolving online, linguistically, is the proliferation and dispersion of acronyms, e.g. LOL, WTG and the Net Lingo repertoire.
Recently, in my typical "What if?" mode, I derived an acronym for the Web: GUNK (Great Universal Neural Kinesis).
In mereology, it’s the philosophical term for any whole whose parts all have further proper parts. In hair care, it’s a British colloquialism and denigration fired at teenagers when they put too much product on their hair which makes it look greasy, sticky or OTT. The product is referred to as “gunk”. It’s also a wordplay compound on “junk” and “goo”.
So now that I’ve used it as an acronym to cover Great Universal Neural Kinesis, we have to LOL — particularly since some of us believe humans have too much junk in our heads (from tennis scores to family memories to recipes), our brains are nothing but goo-ey matter, and yet that so-called “junk” and “goo” can be kinetically converted into the “gunk” of the mereological variety.
In other words, silo pieces of data can be and are naturally connected and cross-pollinated in our brains. It's our own responsibility and collective intelligence to drive the frequency and velocity (direction and speed) of those connections so that we're utilizing more than this urban-legend 10 percent of our brainpower.
Yes, there is indeed an awful lot of junk online --- from spam to accidental misinformation to deliberate mischief.
Hopefully, those of us who can code, like me, will derive not only another snazzy acronym but actual smart tools which can contribute contextualization filters that lead to a more inclusive and democratic society online.
It's not going to be an easy or overnight fix; I was explaining my solution to a Spanish astrophysicist (30+ years post-graduate experience) and he highlighted that in our axiomatic heritage we can't even agree on the definitions of "time", so it will be a challenge to develop a tool able to differentiate the connotations elicited in us by every word, every sentence, every symbol, every paragraph, every audio-visual and every other sensory intake we experience online.......OVER TIME.
Language, that's where it's @. Setting aside the socio-economic politics of the Web for a second, it is the language which defines online culture.
Comment number 17.
At 15th Sep 2009, A_PERSON_NOT_A_BOT wrote:Now I have to go and chip away some more @code.
@NICHOLAS CARR --- welcome to our streams of consciousness. We mean no harm or offense to anyone. We're just being human (contemplating, communicating and cross-pollinating etc.).
Comment number 18.
At 15th Sep 2009, A_PERSON_NOT_A_BOT wrote:Just a quick thought: can we use "PLASMACITY" instead of "plasticity" to denote the brain's adaptability?
Please can the physicists, biologists and etymologists help us on this? Thanks.
Comment number 19.
At 15th Sep 2009, A_PERSON_NOT_A_BOT wrote:I forgot to say that my German teacher was also a pedant, and the pedantry of all of my teachers is still greatly appreciated because even when I experimentally play fast and loose with languages, the sciences, the humanities, the arts, the Web etc. I am conscious of differentiating between what is classically established and what is contemporaneously evolving.
I should also add the proviso that I believe it's possible for individuals to flex or traverse between deep contemplation and info-superficiality. If our brain's electrical activity were plotted in a topological way, we might see areas of negligible height/depth, which would be equivalent to superficial information accumulation, and also areas of high accumulations of knowledge. There might also be natural valleys where there is no information build-up.
Our neural electrons would be traveling along this landscape and picking up or depositing matter within it over time. Some of this matter would have markers; some of it wouldn't, because we subjectively deem it relatively insignificant and not worthy of markers.
Part of my fun+games with "plasticity" shows that there is clearly a dissonance gap between the connotative associations of the word for the biologist, the physicist, the chemist, the etymologist, the Webber, the Generation Googler and the Generation Gutenberg, plus the "lost in translation" cultural effects.
Interestingly, "plasticity" translated into Chinese is: 可塑性,柔软性,粘性,成形性. Some of these refer to plastic's nature to be a "spear attached to a long staff" (archaic), to become a shape, to "stick" and tenderness.
What will be truly insightful is when our species manages to locate the junctions at which multi-disciplinary and multi-cultural definitions are aligned and complementary.
Comment number 20.
At 15th Sep 2009, A_PERSON_NOT_A_BOT wrote:Oh and when I googled "Nicholas Carr" 140 times last night --- I'm kidding, I've never googled any word more than 3 times --- there was this link to a witty way of looking at whether the Web is changing us or whether we are defining the Web:
"We're not defined by our own creation!" --- Stephen Colbert (02:20 to 02:50 in the video).
That's the crux of it. WE are defining the Web, its culture and its effects --- not vice versa. We're defining it in terms of the code we put into tags and protocols, whether that's in OWL or AJAX or whatever acronym. We're defining it in the way we design the UI and navigation. We're defining it in terms of the content, communication and contextualization exchanges we have on threads, in IMs, in metaverses and in Cloud repositories. We're defining it in terms of the intellectuals writing about it. We're defining the Web......EVERYWHERE.
The computers aren't.
This is why the culture of the Web is still very much human and led by the way human brains work. It's human language, ultimately, which governs what the machines do. We're the ones who thought up binary (1s and 0s), COBOL, ASCII, every OOP imaginable and evolving.
Since we're ultimately responsible for the language, we're responsible for the culture and the future.
WE are changing the Web and its ability to proxy our consciousness rather than the Web changing our contemplation.
Comment number 21.
At 15th Sep 2009, A_PERSON_NOT_A_BOT wrote:Yes, and this is why I believe that SATIRIZATION also needs to be included in our more holistic modeling of the human brain.
Colbert cut to the crux of it for the funny bone in us and yet behind this is some profound contemplation and wisdom.
Comment number 22.
At 15th Sep 2009, TaiwanChallenges wrote:I hate to come across as 'thick' but I can't help wondering what the point is of most of the above posts.
We're making a TV program for the general public. I'm as guilty as anyone of going off-topic or talking too much, but I suspect that the themes in question are not going to be discussed in any great detail in a one-hour round-up of the last twenty years.
Google may empower a few individuals to do some fairly abstruse research, but the article was about the majority - the ones who don't write the majority of the stuff but just consume it.
Comment number 23.
At 15th Sep 2009, A_PERSON_NOT_A_BOT wrote:The Bigger Picture point is that we are changing the way we use Google and other search engines, and the economics of our online participation is changing too.
Yesterday the French announced some proposals by an economic advisory body that includes Joseph Stiglitz, to include HAPPINESS and WELL-BEING in the GDP:
To-date what we've been able to contextualize and contemplate online has been determined by statistical surfacing according to click counts. This is because Google originated with Bayesian principles. This is a quantitative approach.
What seems to be emerging is a desire to add some QUALITATIVE or contextual elements. Hence the Semantic Web.
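To make that quantitative-versus-qualitative contrast concrete, a toy sketch (entirely hypothetical pages, click counts, tags and weighting; not any real engine's algorithm): rank once by raw popularity, then again blending in overlap with a reader's own interest tags:

```python
# Hypothetical pages with click counts (quantitative) and topic tags
# (a stand-in for the qualitative/semantic markers discussed above).
pages = [
    {"url": "/celebrity-gossip", "clicks": 2500, "tags": {"gossip"}},
    {"url": "/neuroplasticity-primer", "clicks": 300,
     "tags": {"neuroscience", "learning"}},
    {"url": "/multitasking-study", "clicks": 1200,
     "tags": {"neuroscience", "attention"}},
]

def quantitative_rank(pages):
    # Pure click-count surfacing: popularity is all that matters.
    return sorted(pages, key=lambda p: p["clicks"], reverse=True)

def qualitative_rank(pages, interests):
    # Blend popularity with subjective relevance: each shared tag is
    # arbitrarily worth 1,500 clicks in this toy weighting.
    def score(p):
        return p["clicks"] + 1500 * len(p["tags"] & interests)
    return sorted(pages, key=score, reverse=True)

my_interests = {"neuroscience", "attention", "learning"}
print([p["url"] for p in quantitative_rank(pages)])  # gossip first
print([p["url"] for p in qualitative_rank(pages, my_interests)])  # studies first
```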
The future will be less about us zipping and leapfrogging from one information source to another based on time shortage and attention deficiencies. It will be more about us having online tools to enable us to coalesce towards content that has subjective meaning to us and hooks into our personal propensity for deep and connected thoughts (or otherwise).
It still all boils down to LANGUAGE --- whether we're talking about the technology of the Web, the way the brain works or how GDP is going to be valued (rather than calculated) moving forward.
Google and other search engines aren't making us stupid or attention poor. They just might not surface the content that makes us concentrate enough. We need different markers for that content.
Comment number 24.
At 15th Sep 2009, SheffTim wrote:A_PERSON_NOT_A_BOT. You clearly have a very quick, curious mind; a wonderful thing – remember when designing for the Web that many minds are not as quick or as curious as yours; one reason I keep bringing up the notions of an educated digital elite, the digitally excluded and those in-between. Many struggle when using the Web.
PLASMACITY: the problem with using a word based on the term ‘plasma’ is that in medicine there is already ‘blood plasma’, which has nothing to do with brain adaptability, and in physics the term ‘plasma’ is used to describe ionised gases.
If you’ve seen a plasma ball with its streaks of ‘lightning’ (sold in shops as a novelty) I can understand why you’d connect it with mental connections, synaptic connections, thought processes etc; however the brain doesn’t rely on ionised gas for this.
The other problem is that ‘neural plasticity’ and ‘neuroplasticity’ are already widely used by those that study the brain and mind. (The term ‘plasticity’ in relation to the brain was first used in 1948 [by Jerzy Konorski].)
We’re stuck with plasticity I’m afraid, but as with many specialized terms, as long as everyone knows what concept it refers to, people can communicate effectively.
The same applies to the great many names and terms that form the jargon of technology and computing, and serve to baffle and put off those that haven’t come across them before.
Creative communication can be enjoyable and fun, particularly the playful creative wordplay side of it, but it can result in ambiguity and confuse those that don’t connect with the thought processes (or context – I don’t use SMS, for example) that preceded it.
Is it language which defines online culture? I’m not convinced. There are generational differences in language use, probably regional ones too, as well as those of specific disciplines, e.g. hardcore coders. There are dangers that as different sub-sets evolve their own terminologies they only serve to isolate and exclude.
To get back on topic, I don’t think that Carr is saying that we can learn nothing from the Web, but that it can reduce our ability to concentrate for sustained reading.
As an avid reader from an early age who still gets through an average of two non-fiction books a week, I do think that gaining in-depth knowledge from books requires effort, time and concentration.
Learning from the Web (e.g. when Googling) is different; we tend to flit from page to page, to quickly scan text or parts of it, to make judgements very quickly and take in the minimum needed to make a case. I think of this as getting info-bites; for an info-meal I turn to a book. But I realise I’m in a minority.
Comment number 25.
At 15th Sep 2009, A_PERSON_NOT_A_BOT wrote:@SHEFFTIM --- Thanks, the "plasmacity" sanity-check is most helpful.
The thing about the Web is that there are areas of it which are amorphous, areas which facilitate deep concentration and areas which only require gnat attention. Likewise in everyday existence: sometimes and somewhere we simply don't know and evolve knowhow and discipline as we explore. In other time-spaces we can be profound (maybe late at night and when feeling a little bit wistful on certain social networks). Other times all we want to do is babble about random, disjointed chit-chat (like what the IM channels of the Twitter variety facilitate).
Most people are intelligent. I define "intelligence" as having the ability to learn and adapt to experiences.
When people surf online they soon work out what's of value to them, which is not necessarily the same as the way Google's search engine values it in terms of search ranking. Sure, they flip through results pages and scan the content, but when they want to they can concentrate on that content and give it some consideration, contemplation and engagement.
Precisely what we're doing on this Web thread.
Comment number 26.
At 15th Sep 2009, A_PERSON_NOT_A_BOT wrote:Batting aside our online ping-pong over "plasticity" --- which funnily enough highlights that "It's all semantics. You say pOHtato, and I say potAHto" etc. --- we should FOCUS on what would be helpful in Program 4 which is.......
THE PERSPECTIVES FROM THOSE WHO ARE NOT ONLINE OR IMMERSED IN THE WEB AT ALL.
As per EnglishFolkFan and SheffTim's comments earlier.
That's the way I was hoping the thread would develop (suggestions of likely interviewees and questions to ask them) before I woke up and saw Nicholas Carr had joined with his "plastikos".
Purely anecdotally, some of my friends are in their late sixties and have only recently started their Web adventures. They like the rapidity of the information because they say it stimulates their thinking. The other day I demonstrated an online dictionary to my mother and now she's asking for a Netbook so that she can "click and get translations quicker than if I flip the pages in the dictionary".
Comment number 27.
At 15th Sep 2009, Dan Biddle wrote:@TaiwanChallenges has a valid point here. While I wouldn't want to truncate a train of thought that led to the link to the Colbert report with Nick Carr, I am also aware that this density of comment posting and the sometimes incredibly intricate and esoteric content may become difficult to consume.
I agree with @A_PERSON_NOT_A_BOT that language has a great significance on the web - it recurs in the debates we've had across the blog - but perhaps we don't need a complete etymological investigation into a word in a blog post's text. 'Plasticity', I believe, as others - including Nick Carr himself - have pointed out, has an understood meaning in the context of the piece. It may rankle with some as a misappropriation, but I think to dissect it much further does start to detract from the arguments posed by the blog itself.
The Digital Revolution project is an exciting one, not only in its open and collaborative aims, but in its very subject matter, so I absolutely understand the way it can carry a comment down into the rabbit hole - you only have to read some of my posts (or indeed, as he admits, @TaiwanChallenges' posts) to see that!
The last thing I would do is discourage participation on the blog - I hope that the weekly round-ups illustrate the joy and fascination you are bringing to myself and this project as a whole - but I would ask that we try to keep the density and relevance of the comments in check.
Right - talking of the weekly round-up...
Many thanks,
Dan
Comment number 28.
At 15th Sep 2009, nicholascarr wrote:One of the great services provided by the unedited blog comment stream is that it reminds us why the profession of editing developed in the first place.
Comment number 29.
At 15th Sep 2009, paulmorriss wrote:(As an aside the comment above me says:
"At 3:02pm on 15 Sep 2009, nicholascarr wrote:
This comment is awaiting moderation. Explain."
The original author's comment isn't automatically allowed through. Funny.
Anyway. I skimmed the article and then jumped to the bottom because I couldn't be bothered to read the comments. I was going to put "this will only happen when children born in the last few years become adults, because adults' brains are fixed". Then I went back to the top and saw that the beginning counters what I was going to say anyway. I think the fact that I couldn't be bothered to read the original article carefully shows that I'm wrong and he's right. If you reply to this then I won't read it anyway. No time. Sorry. I chuck my words into the cloud and then get on with something else.
Comment number 30.
At 15th Sep 2009, EnglishFolkfan wrote:Interesting theme in the comments here on literacy and literature, on this day when Samuel Johnson is being revered for his dictionary and etymological work.
I tend to the view that the placing of a keypad and screen in the hands of a poorly performing paper based scholar doesn't necessarily limit their online abilities.
The point being that what is typed and sent to others may not be conventionally correct, but the meaning can well be understood. The use of txt speak is an example of how usage can be creative within space constraints.
A limited ability to read does not prevent a person from using a computer. Rather, it opens up a bigger world where they will assimilate language in a context meaningful to them. It matters not what typography it is that spurs the mind to make the connection of symbols into words. Universally, it would seem the golden arches are known to infants and non-English speakers as an 'm' and the brand it signifies. The scattering of words in comic strips, computer games etc. is the kick-start to the process of reading for many, and this applies in non-English languages too. Having watched Infant Class children using top-of-the-range Mac laptops with appropriate software, it is the need to get the hardware into people's hands that is a main priority.
Youth has no fear of technology, just lacks the means to acquire it early enough.
Whilst editing I received this tweet from a well educated and articulate person: 'I m enjoi ing badd speling 2day.' It made sense to me because I know their personality. Likewise, when reading tweets from a dyspraxic friend I can appreciate the inadvertent errors. Myself, I find 140 characters is brilliant for curtailing my verbosity & unearthing the précis grammar lessons of my youth!
More on the Twitter theme:
@BBC_HaveYourSay
Are you in your teens or early 20s, not studying and unemployed? We would like to talk to you. Pls @ us or email silvia.costeloe@bbc.co.uk
This tweet, posted about 3pm Tues 15 Sep 09, is targeted to reach a very specific audience: young, online and following @BBC_HaveYourSay. My guess is this will be a very small group, if the demographics of Twitter users are to be believed, plus my own supposition about the people who are likely to follow @BBC_HaveYourSay. Will this be made clear when the HYS team use any information gleaned this way and post it on the web, or will it just be made out as the (totally incorrect) generic views of youngsters online?
Comment number 31.
At 15th Sep 2009, A_PERSON_NOT_A_BOT wrote:@NICHOLAS CARR
One of the great advantages of the unedited blog comment stream is the “democratization of ideas” as Professor Eric von Hippel of MIT Sloan School of Management would point out.
One of the great Achilles’ heels of editorship is when the editor is insufficiently informed or open enough to explore associated contemplations and invest in the wisdom of crowds. It’s interesting that the Atlantic article seems to have had no thread facilities………
That's A-OK.
I know about editorship: I was Editor and principal on strategy reports, investment proposals and newsletters disseminated across Bloomberg, Thomson Financial, Multex, Reuters and a Tier 1 bank’s Intranet as well as responsible for 15-country coverage on M+A analysis. As Editor, I welcomed open feedback via email, IM and online surveys. As Editor, consensus suggestions from readers were incorporated into reports and newsletters and helped shape the products for relevance to the audience.
In fact, I originated the products in my 20s so, “Thanks for the history info-byte on editorship."
This BBC DigRev space is for our open, democratic, organic BRAINSTORMING for a Web-terrestrial project. It’s not the Domesday Book or the Magna Carta Libertatum.
“Will the Web mean the end of the contemplative mind?” Only if ill-informed hierarchical editors dictate the editing of our free streams of consciousness.
If instead we have equivalent access and tools to contextualize and cross-pollinate our ideas and content with others in a holistic way, then……the DNA and ecosystem of the Web will…….FLOURISH.
Comment number 32.
At 15th Sep 2009, A_PERSON_NOT_A_BOT wrote:Specifically about Program 4……..
It’s worth examining the point raised by EnglishFolkFan above about the MBTI and our PREFERENCES towards certain types of behavior which may explain our online choices of click+concentrate or flip+be distracted.
Some of us are naturally oriented towards skimming pages and being able to grasp all the key concepts readily. Some of us prefer to take our time to read letter by letter, word by word, sentence by sentence, minutia by minutia. Some of us have photographic memories: content can flash by and yet be fully captured, detailed and analyzed lightning-fast. Some of us cope better cogitating snippets and others prefer to consume by the volume. We each belong somewhere within this spectrum.
Our concentrations probably vary and adapt according to the situation and from previous experience. If we know we’ve read an article thoroughly before, then we’re more likely to scan it and click through it rather than painstakingly re-read it.
Educational, linguistic and generational abilities probably play a part.
TAIWANCHALLENGES and I have covered concentration developed under Western education systems vis-à-vis Oriental ones on some of the other threads. Ditto language barriers and, as SHEFFTIM points out, the jargon.
The generational one would be interesting to have empirical analysis on. This is why I included “silver surfers” (55+ years of age) on my list previously.
@BBC DigRev team --- maybe send a help request out to a site like Saga to get some interviewee or test case perspectives on the differences in concentration they’re finding online compared with their lives off-line?
Here’s another paradox about how the Web is changing our brains: when writing about teenage surfing there are always expressions of concern about how it will make them stupid / lose focus / become zombies or something similar. Yet when silver surfing is written about it usually mentions how it “sharpens the brain”.
On the empirical side, there is material from online metrics companies (ComScore, Nielsen, Compete, etc.) about the average stay per visit on any site. It would be useful if this data could be demographically analyzed, to see whether we could extrapolate how long each age band (18-24, 25-29, 30-34, 35-39 etc.) stays focused on a website’s page.
That would be one way to gain some clearer insights into online concentration. For this we'd need the assistance of some Internet analysts to pull the data, graph it and make sense of it.
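As a minimal sketch of the kind of analysis being proposed (made-up visit logs standing in for the metrics companies' licensed panel data; the age bands follow the ones suggested above):

```python
from collections import defaultdict
from statistics import mean

# Made-up visit logs: (age_band, seconds_on_page).
visits = [
    ("18-24", 45), ("18-24", 30), ("25-29", 80), ("25-29", 95),
    ("30-34", 120), ("30-34", 140), ("35-39", 200), ("35-39", 170),
]

# Group dwell times by age band, then report the mean stay per band.
dwell = defaultdict(list)
for age_band, seconds in visits:
    dwell[age_band].append(seconds)

for age_band in sorted(dwell):
    stays = dwell[age_band]
    print(f"{age_band}: mean stay {mean(stays):.0f}s over {len(stays)} visits")
```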
Comment number 33.
At 15th Sep 2009, A_PERSON_NOT_A_BOT wrote:We should also ask these questions:
(1.) Was our species becoming concentration-deficient PRIOR to the Web anyway?
Think about the radio era, when people could turn the frequency knobs and tune in/out of stations. Think about the TV, the zapping of control buttons and the channel/program hopping. Think about the CD and DVD and our ability to fast-forward or rewind.
(2.) The book format of and in itself does not necessarily increase our concentrations. Some people can make it through Tolstoy in one sitting. Others only have the natural attention span to skim-read a novella or a book of prose.
(3.) What's the empirical evidence on electronic and audiovisual books? Which age group's buying them? What evidence is there on the review pages about how long it took people to read those books? Is there any comparison between how much more concentration they apply to read those than paper versions?
I'm younger than the Google founders and have been surfing the Web for over half of my life to-date. All those years of clicking, electronic game-playing and the immediacy of information streams don't mean I can't sit still and read Cloud Atlas, cover to cover, in two sittings.
Comment number 34.
At 16th Sep 2009, TaiwanChallenges wrote:I'm going to discuss this project with my class of 20 x 15yo Taiwanese kids tomorrow morning. So far, it looks like the only way you're going to get any input from this age-group.
Suggestions for questions, topics, etc are welcome. I leave home at 02:30 GMT, so give me time to process before then. APNAB, if your output is as prodigious as usual, you'll need to stop by midnight. But your perspective would be valuable.
Open questions like "how do you use the internet" are probably a bit meaningless to people who can't remember the world before it changed. Need to be more specific, without pre-defining the terms of the debate to the point that we ask the wrong questions.
I'm going to take a couple of hours and go back through some of the discussions about the other programs. I'm sure that some of the things said recently will trigger new thoughts about power, money, etc. After all, if individuals change and human interactions change then that impacts how we do business and how we organise ourselves.
Where should I post the result?
Comment number 35.
At 16th Sep 2009, TaiwanChallenges wrote:Where should I post the result? Here will have to do.
Just spent some time looking back at some of what was said earlier. Here are some ideas as they come up:
* I see the format of the weekly round-up has changed, and the round-up itself didn't appear last week. I'd be interested to know what the DigRev team is learning from this experience. As mentioned, the approach taken appears 'top-down' but nobody had a lot to say about better ways of doing things. After all, the production team are people with an agenda, a job to do, attempting to... what? Crowd-source the research process? Better connect with the audience? Whatever, have you learned anything that may be relevant to the business of running social media for profit? There are questions to be answered about motivation, economics, quality and usefulness, serendipity, control, management, self-organising systems, business models and lots lots more.
* Some time ago we talked about the idea that the web is not 'a thing' but is more analogous to an ecosystem or a body. But even so, we're treating it as being separate from us or the world we live in. Surely the most revolutionary thing about all this is that it's an indispensable integral component of our real lives? Looking back, it seems that many contributors have touched on this topic but it hasn't really been explicitly stated.
Why talk about the economy of the web when you can talk about the web in the economy? One obvious approach is the whole 'flat world' topic as described in Thomas Friedman's The World Is Flat. Have you looked at sites like elance.com that enable a global playing-field for anyone in informational industries?
* Another would be this quote from week one: 'I am curious whether if in a few years people can still afford not to have a clear online identity, I guess employers will feel much more uncomfortable with someone who cannot be found in Google than with someone who documents every minute of his holidays in Spain.'
Your online self becomes so much a part of your identity that if you haven’t got one then people think there's something amiss. Is a potential employer going to spend hours trawling through your flickr account, or just check that you do in fact have a real life (as evidenced online) and leave it at that? Aleks also talked early on about this and I saw it come up again later. Why not do a short feature about online CVs and identities for job-hunters and headhunters?
* I decided to hold myself back from posting so much here after a discussion in another forum where I am active, forumosa. It's a community discussion board for people living in Taiwan and was originally conceived as a way for people to share advice and experiences, to solve problems. Nowadays the common accusation is that newbies are discouraged from posting – intimidated even – by the existence of a hard-core group of heavy users who spend their days talking about whatever is important to them, even if it's not relevant to the original purpose of the site.
I'm one of the people who tends to go off the deep end, so I felt that it would be a good idea to make a conscious effort to avoid being the guy who monopolised the DigRev conversation and discouraged more timid souls from contributing. It's kind of funny that APNAB appeared at the same time. As interesting and enlightening as her posts are, they do kind of illustrate the point that most of the internet is dominated by a relatively small number of people.
* Whether this is a bad thing is still debatable. Going back to forumosa, someone asked why they had any obligation to create helpful and interesting content for people they had never had any contact with previously. The site (or the whole internet) is simply a blank piece of paper and if you leave people free to write anything they want then there's a fair chance they will help others as they go about their daily banality, but it's not an obligation. In fact, the help and advice given is usually a form of egoising. People help because they want to feel good, or they want to be seen helping, or they want to show off or win approval within the group. (Or because the question inspires them to think out loud and they need an audience.)
It reminded me of an interview I saw with Linus Torvalds where he stated that the reason for sharing Linux was basically so that he could impress other programmers: “Look what I've done.”
It's not really any different from tweeting information about your socks, is it? And was that quote about 40% of Twitter being pointless a reference to 40% of output or 40% of what people listen to?
* We share what we can, at whatever level we happen to be functioning, and the rest of the world has to sort through the mess to get what they need. The person who shares the most mundane, boring aspects of their life may also be the person who answers your query about php, immigration visas, recipes for apple pie, the effects of the internet, how to write a good CV, etc. on one of the millions of online help forums around the world. The banality, the egoising, the waffle, the mindboggling life-changing information: they're all aspects of the same basic human drive.
I think this is relevant to our economic models, in real life as well as online. Is the web weakening the global money economy, not just the nation state? Will social currency garnered online, or some form of electronic kudos, one day take its place alongside cash and personal carbon allowances as the must-have resource? File-sharing services sometimes discriminate against people who don't have files to share. Do friends lists, page ranks, or popularity ratings carry any weight in our online worlds yet?
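To make the file-sharing point concrete: private sharing communities typically enforce a minimum upload/download ratio, so your standing there is literally a number. Here's a rough sketch of the idea in Python (the threshold and rules are invented for illustration, not taken from any real site):

# Illustrative sketch of ratio-based "social currency" on a file-sharing site.
# MIN_RATIO is a made-up policy value, not any real tracker's rule.
MIN_RATIO = 0.5  # must have uploaded at least half of what you've downloaded

def sharing_ratio(uploaded_bytes, downloaded_bytes):
    """Return the upload/download ratio; pure seeders rank highest."""
    if downloaded_bytes == 0:
        return float("inf")
    return uploaded_bytes / downloaded_bytes

def may_download(uploaded_bytes, downloaded_bytes):
    """Gate access on accumulated sharing 'currency'."""
    return sharing_ratio(uploaded_bytes, downloaded_bytes) >= MIN_RATIO

print(may_download(0, 10000000))        # False: a pure taker is refused
print(may_download(8000000, 10000000))  # True: a contributor is welcomed

The same shape of rule could apply to kudos, friends lists or popularity ratings: accumulate enough of the currency and doors open.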
* Surprised this one didn't get more responses. Too early in the game?
/blogs/digitalrevolution/2009/07/the-pirates-dilemma.shtml
It's a fascinating idea, that piracy is a natural, healthy (for society) response to monopolistic practices, although I don't think that's a universal truth.
Yesterday I found myself looking through a shop window at a variety of Peter Rabbit (tm) kitsch, asking myself why this stuff is still subject to intellectual property law so long after the death of Beatrix Potter. (It's like Kate Bush expecting to still get paid for music she made thirty years ago, despite apparently not having worked since.) What benefit does society gain from the apparently perpetual right to ownership of intellectual property? I have always understood that these laws were created to promote creativity and innovation, by allowing anyone coming up with something new to have exclusive rights to it for a reasonable period, after which anyone has the right to use it and create derivative works which again benefit us all. I couldn't help thinking that while Peter Rabbit is competing for shelf space with Spongebob, there isn't room for anything new and original in the marketplace. So the individual is disadvantaged by the corporation using the law as a club to defend a monopoly.
* And going back to the article above, I’ve seen it claimed that 2/3 of the cost of drugs to the NHS is the cost of maintaining the sales organisations that promote the products to doctors. Drug patents may be an important incentive to companies developing new treatments (for conditions which are common in rich countries, at least, although we have to leave it to Bill Gates to deal with malaria at his own expense), but they’re certainly not the whole story when discussing the economics of health care.
Perhaps it's time to step back and look again at how we reward innovation? Not just on the web, or in the realm of digital content, but in the light of what we're learning about our real-world selves as we experiment with this new medium.
* Will your prog about the web and nation-states include any info on open government projects? How many people know they can put petitions on the Prime Minister's website?
* Porn only gets a brief mention here and there even though it's credited with playing a major role in creating the demand that pushed a lot of early investment. Here's a silly question, related. I have a friend, ahem, who downloads movies a lot and has observed that very often he gets porn disguised as popular movies. So after waiting patiently for The Incredibles he gets some young lady entertaining her three masked friends and their big red rubber thingy. Can anyone explain what this is all about? And why do all photos and movies available p2p have incredibly long filenames listing every perversion known to man along with an exhortation to share? What is the psychological drive that makes people put this stuff out there in this manner?
Complain about this comment (Comment number 35)
Comment number 36.
At 19th Sep 2009, oxfordyorick wrote:I would like to add some new considerations about what effect the web is having, or will have, on us: some quite general, and one to do with the key notion of Artificial Companions as the kind of web interface we are going to need if everyone, not just typists, is going to deal with the Internet seriously and intimately over a lifetime. The idea is that we will each have a personal Companion, for long periods, as our interface: one that talks to us, knows all about us, and manages our digital lives on the web, where all our information will be, up in the Cloud/web (we don't care where!), and far too much of it for us to manage ourselves, with perhaps millions of photos and documents over a whole life. The Companion will change us by being our personal memory, and by being what is left of us when we are gone. It will also be the chief way in which we change by establishing long-term relations with the web-as-an-artificial-personality, even if one of our own choosing.
This big international project at the Oxford Internet Institute is at www.companions-project.org, where there are links to the current project movie, which gives you the general idea. There are also early demonstrators there, and a project blog with lots of press cuttings on the development and uptake worldwide of web Companion-like entities.
This personalized web is going to have a profound effect on us psychologically. Sherry Turkle at MIT has written a great deal on what dealing with such entities is going to do to us, much of it pessimistic in tone, because she argues that the young are coming to prefer relationships with artificial entities to real ones in some settings. A more positive contributor on this side of the Atlantic is David Levy, who wrote the recent book that got him on all the US chat shows, but which is really about relationships rather than sex, and where he argues that we will begin to have serious relationships with Companions within a decade. He may well be right, if only because people clearly had relationships of a sort with their Tamagotchis, which could not speak; the barrier is not all that high for relationships! Reeves and Nass showed many years ago (see http://portal.acm.org/citation.cfm?id=236605) that people had emotional relationships of a sort with their PCs, even without knowing it. Below (at a Manchester graphics company) is what an extraordinarily plausible web Companion may look like soon:
All this is speculation, but one plausible outcome is that people may come to have different kinds of relationships with Companions than they do with other folk: just as they do now with pets, or as mistresses once had with their Victorian lady companions. These were people, but semi-servants, and not at all like slaves, yet their own feelings and wants did not have to be considered in the same way as one's own or one's fellows'. This is a more optimistic outcome than Turkle's fear that Companions will dehumanize us.
Other writers here and bloggers have stated forms of this fear: Baroness Greenfield's fears rest on no real evidence and are the kinds of worries that capture the attention of newspapers. Similarly, the "loss of contemplation" with the web is a fear that rests on a slim observation about multi-tasking. But do we have any reason to think it will have any deleterious effects on real "contemplators", writers and academics for example? A talented ex-student of mine wrote on Facebook today: "Watching talk TV while ripping CD collection. Had to stop. Losing too many IQ points. Can't use big words good." But I am not sure this is serious evidence rather than comedy.
Academics now sit through conference presentations while doing mail, back-channeling and twittering, and have done for many years; have we any evidence that their own "contemplative" work has suffered (their next written paper, say)? None of this is to deny that the web is influencing us all the time, and not in wholly positive ways. One not mentioned so far in this blog is the odd result that extensive Facebook exposure seems to be contracting the list of first names used for newborns: it is as if linking to larger intimate groups, and seeing what names they use, is causing a higher degree of mutual name-copying, so that the pool of first names in use in the English-speaking world is shrinking. This effect, if transferred to the realm of ideas, could be one we might not like: peer pressure to conform might grow in a way that would not need to be enforced by any government or law.
Complain about this comment (Comment number 36)
Comment number 37.
At 20th Sep 2009, A_PERSON_NOT_A_BOT wrote:@OxfordYorick --- thanks for the link to Oxford Internet Institute. I mentioned them in another thread and am hoping some lecturers from there will be interviewed for the docu-series.
* /blogs/digitalrevolution/2009/09/revolution-roundup-week-nine.shtml
As for virtual assistants and robot pets, everyone should be aware these are on the horizon or are already present:
* SIRI ---
  - It’s a spin-off from DARPA and will appear in iPhones soon ---
* EDD (Second Life) ---
* AIBO ---
* Shaggy D ---
In case anyone doesn’t know Japanese/Chinese, “AI” means “love”, so the Japanese inventors effectively called it a “love-me robot”.
Also watch these movie trailers and get a glimpse of a virtual alternative where humans are the virtual pets for avatars:
* Gamer ---
* Avatar ---
As for getting us away from text and typing, there is of course:
* MS Surface ---
  - MS 2019 ---
* iPhone with Autocad ---
* Apple tablet? ---
* Google Earth Holographic ---
* Air-writing ---
I saw an interesting UI based around the Cloud about 16 months ago. The UI is essentially a holographic projection from a button on our clothing. The button is our unique identifier that enables us to beam up to the Cloud and access files. Whatever we access is projected onto whichever surface is closest to us: floor, wall, table, hand etc. Unfortunately, I can’t find that link in my files right now.
HUMAN RELATIONSHIPS >>> VIRTUAL ONES
====================================
Here’s why: whilst we do need to analyze, in a sensible way, the effects of the Web and other digital phenomena (including virtual relationships) on our brains, our concentration and our behavior, there’s no need to be irrationally wild with worry.
In the real world, humans are intelligent. We can work out over time when the other party is being AUTHENTIC, TWO-WAY CARING and MORALLY ACCEPTABLE to us and by each of our unique standards. We can detect artificiality, falseness, insincerity and faux affections in others and --- unless we are masochists --- we tend to reject these and go in search of authenticity, two-way caring and people with similar moral values.
This is why, whilst we may be temporarily “into tech”, we have a propensity to revert to and prefer human interactions.
In case it’s not obvious: ALL Web technologies are ARTIFICIAL DIGITAL CONSTRUCTS and not flesh, blood and feelings. Yes, those tools facilitate our ability to communicate. Remember, though, it’s called “Twitter” not “CONVERSATION”. Yes, they provide us with resources and links to read/track/flip pages through; it’s called “Google” instead of “LIFE DISCOVERY”. Yes, they connect us to people all over the world in a virtual phonebook; it’s called “Facebook” not “FLESH+BLOOD”. Yes, they even calculate price comparisons to buy us the cheapest car insurance.
Here’s the thing: there is no digital equivalent or substitute for these:
* our loved ones and friends hugging us in joy and in despair;
* the sound of their voices laughing at our silly jokes, sharing secrets or commiserating and comforting when Life falls apart;
* the smell of their hair and happy skins or food made with their kind hands;
* the pupil dilations in their eyes which reassure us we’re knockouts;
* the taste of kisses.
All of these features, aspects and characteristics are unique to humans, and they induce in us releases of specific EXPERIENTIAL CHEMICALS that socialize us (teach us about love, about kindness, about reciprocity, about emotions). These simply can’t be induced by artificial digital constructs in the same way, or at least are unlikely to be.
Yes, some people may point to the neural nanotechnology currently being developed to simulate and stimulate chemical releases in our brains, which could proxy that natural induction. Still, we are CONSCIOUS that that’s artificial, whereas our interactions with family and friends (flesh+blood+DNA) are not registered as such in our own awareness.
For anyone interested in neural nanotechnology, please watch the series ‘Visions of the Future’ with Dr. Michio Kaku.
I thought of the things I posted in this comment because this weekend I was out+about, people watching as well as having dim sum with my family. For the uninitiated, dim sum is like the Chinese equivalent of Sunday lunch / Spanish tapas all rolled into one.
Whilst eating, this popped into my head: “We can’t eat and survive on social networks or digital streams. We can eat our mothers’ cooking and survive with their love, though!”
Later, the answers to why human socialization will always compel us more than virtual relationships appeared everywhere:
* spontaneous laughter
* a guy brushing an eyelash from the cheek of his girlfriend
* two young ballerinas on a bus discussing the morals and mores of their friends
* couples snuggling closer as twilight fell and the temperature on their skins dropped
* three young kids (a boy and two girls no older than six) teasing each other and playing “slappy hands”
These regular everyday sights induce in us moments of profound contemplation and reflection that don't need to be hours long to be powerful or to affect our own socialization or psyche. They immediately make what matters in Life REAL to us, even if some of us spend a lot of time in the digital space because of work, research or leisure.
Unless the Web is suddenly going to be able to physically mimic us and our five senses perfectly, in a flesh+blood+DNA way, we may temporarily allocate some time to these tools, but we’ll prefer to seek out the authentic sources of socialization (the touch, taste, sight, smell and sound of other humans).
Complain about this comment (Comment number 37)
Comment number 38.
At 21st Sep 2009, A_PERSON_NOT_A_BOT wrote:President Obama: "I am concerned that if the direction of the news is all blogosphere, all opinions, with no serious fact-checking, no serious attempts to put stories in context, that what you will end up getting is people shouting at each other across the void but not a lot of mutual understanding."
(Source:
Henry Kissinger is well-known for his emphasis on the need for "context". He's cited in this Google Tech Talk about a shift towards Global Consciousness in tackling climate change issues. Vint Cerf and Larry Brilliant are the keynote panelists.
Taking this content, the content of all the other links I've posted, and the increasing insights I have into the Semantic Web into consideration, there is an emerging notion that the Web isn't making us lose our contemplative minds; rather, we may now have, or should be building, smarter CONTEXTUALIZATION TOOLS.
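To make CONTEXTUALIZATION TOOLS slightly less abstract, here is a toy sketch in Python using the rdflib library. Every URI and property name below is invented for the example; the point is only the shape of the idea, i.e. context (source, date, topic) travelling with content as machine-readable data:

# Toy sketch: Semantic Web style context attached to a claim.
# All URIs and property names are invented for illustration.
from rdflib import Graph, Literal, Namespace, URIRef
from rdflib.namespace import RDF

EX = Namespace("http://example.org/")
claim = URIRef("http://example.org/claims/multitasking-harms-attention")

g = Graph()
g.add((claim, RDF.type, EX.Claim))
g.add((claim, EX.statedBy, Literal("Stanford multitasking study")))
g.add((claim, EX.publishedIn, Literal("2009")))
g.add((claim, EX.topic, EX.attention))

# A tool (or a reader) can query the context, not just the content.
for _, _, source in g.triples((claim, EX.statedBy, None)):
    print("Source:", source)

Once claims carry their own context like this, contemplation can start from the whole picture rather than a bare assertion.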
I also want to quote Bertrand Russell here: "The habit of looking to the future and thinking that the whole meaning of the present lies in what it will bring forth is a pernicious one. There can be no value in the whole unless there is value in the parts." --- The Conquest of Happiness.
The question then is, "What authentic value is there to be discerned from the parts of Web skeptics' postulations? If those piecemeal approaches are anecdotal rather than empirical and philosophical, can they amount to a proof-of-concept whole?"
I mentioned before my concept of GUNK, an acronym for Great Universal Neural Kinesis, which might be achievable by harnessing the Web. In mereology, "gunk" is the philosophical term for any whole whose parts all have further proper parts.
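For the formally minded, the mereological definition is compact. Writing \leq for "is a part of" and < for "is a proper part of", a standard rendering is:

\text{Gunk}(x) \iff \forall y \,\big( y \leq x \rightarrow \exists z \,(z < y) \big)

That is, x is gunky just in case every one of its parts has a further proper part, so the decomposition never bottoms out in atoms.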
So what happens if multi-tasking, instead of making us lose concentration, actually means that we're able to SYNERGIZE SOURCES simultaneously until the value of the proper parts generates the value of holistic solutions?
Now, THAT's worth applying our contemplation and concentration to rather than irrelevancies.
Besides which, I've been clear about my position on context:
"IF CONTENT IS KING, CONTEXT IS QUEEN AND CONSCIOUSNESS ARE THEIR PROGENY."
The more the Web can facilitate our ability to contextualize, the better we will be at contemplation and consciousness about serious issues (economic equivalence, democratic choice, climate change, educational parity, etc.).
Complain about this comment (Comment number 38)
Comment number 39.
At 21st Sep 2009, GaryGSCC wrote:I was going to say that my contemplative mind is still here, but by the time I got to the comments in the 20s on this blog post I had managed to contradict myself by skim-reading the rest of them ;-)
I don't come from a psychology/scientific background. I can only offer personal examples of how I use the web and possibly how others do too...
I do sometimes get distracted by things on the net - flitting around, finding I've lost the thread of what I was looking for - but when I am 'distracted' it's often because I've been led down the "that's interesting" path, and this interesting path leads me on to the "what if" and "if you can do that, why can't we do this?" paths. Isn't the distraction in this case stimulating my contemplative mind? I generally use it as a route to expand my knowledge of a particular area, and it isn't a bad thing, although dipping in and out of sites can obviously sometimes mean that you don't pick up the full picture of what someone is trying to say on their little patch of the net.
Actually, if anybody wanted me to focus on a particular website and not go wandering off via tempting links, the easiest way to achieve this is to mark all the outgoing links on a site with references to canhazcheezburger cats or Des O'Connor. I won't be following any of those links! ;-) Sorry Des... I'm not going to apologise to the cats though.
Complain about this comment (Comment number 39)
Comment number 40.
At 23rd Sep 2009, Dan Biddle wrote:@TaiwanChallenges - firstly, apologies for taking so long to reply to your comments. You raise some good points.
Secondly - Molly would like me to tell you that your offer to connect with a Taiwanese school was greatly appreciated, but it came in the middle of the night for us :( so we missed that window.
We missed a Weekly round-up (or rather it was displaced somewhat) because of my holiday and there being no one to dedicate the time to writing it. It's a hefty, time-consuming process to collate and do justice to the blog posts and comments here - there simply wasn't time to take it on. I did attempt a redemptive round-up the following week and we'll be back on the horse this Friday.
'I'd be interested to know what the DigRev team is learning from this experience. As mentioned, the approach taken appears 'top-down' but nobody had a lot to say about better ways of doing things. After all, the production team are people with an agenda, a job to do, attempting to... what? Crowd-source the research process? Better connect with the audience? Whatever, have you learned anything that may be relevant to the business of running social media for profit?'
We're on a learning curve, that's for sure :) I think the top-down point is taken (Dan Gluckman replied to this in a previous post), and hopefully we're learning to make a decent offer to engage people.
It's interesting that APNAB refers to this process as brainstorming. 'This BBC DigRev space is for our open, democratic, organic BRAINSTORMING for a Web-terrestrial project. It’s not the Domesday Book or the Magna Carta Libertatum.'
That concept is a nice way of considering aspects of the process here on Digital Revolution - @TaiwanChallenges, you often (quite rightly) consider what this process actually offers, wants, delivers etc. Why do people donate (even dedicate) their time to (any) project for which the reward isn't financial? This isn't a complete answer for you, but I know that one of my favourite activities at the BBC is the opportunity to join a brainstorm about a project that I'm not actively involved in: be it a Gardening, Food or WWII project, I relish the opportunity to join a group of intelligent, creative individuals to riff ideas around a 'problem' or concept.
When I accept the invitation I have only a loose idea what the final product will be; I am unlikely to be involved in the actual production of the product; I'm certainly not going to be invited to make the acceptance speech should any awards be forthcoming... I join in to be creative, hear new ideas, be inspired, perhaps even come away with a new idea of my own to develop and brainstorm at a later date.
(And, if I'm being completely honest, the 'being there' and offering ideas may impress people in the room and I may gain some recognition within the organisation - if you will...)
I'm not saying that this is a brainstorm process, but I think there are some interesting comparisons that could be made.
---------------------
Wondering what we're getting from the process - how this is affecting the documentary - the effects so far have ranged from the subtle (a link followed, a train of thought from comments inspiring discussion) to the more obvious (Wikimania filming, or a recent call for examples of Amazon reviews leading to major sales, with the Three Wolf Moon T-shirt being a perfect example given)...
An immediate example of the difference the Digital Revolution community has made to the programme and process is @Leoncych's recent comment linking to a Howard Rheingold interview; Molly watched it, and from this viewing changed the questions she asked Howard in the recently conducted interview (clips and rushes coming soon on site).
Molly (who's crazy busy, so only able to comment through me, her humble conduit) wanted @SheffTim to know that she's taken on board your links and direction re ADHD, but is as yet unsure that it fits into the (already chock-a-block) programme. It's not off the cards, but she can't make promises it will make the script.
--------------------
Likewise APNAB '(1.) Was our species becoming concentration-deficient PRIOR to the Web anyway?' - Clearly very apropos, as I think David Nicholas' blog post considers that very issue. Great minds...!
Moreover, as the process has evolved and the blogs have been written, they have more and more been written in less of a vacuum, and with respect to (if not in direct reply to) the discussions and comments around the blog - the ideas contributed by you. David Nicholas has been reading our content; Maggie Jackson contacted us on account of reading the blog; and I think that our guest bloggers are writing their pieces in full knowledge that it is for an intelligent and engaged audience. Which is fantastic for (I hope) everybody here.
--------------------
Also @APNAB, that Obama quote re the blogosphere is something of a departure from his early days in office. Interesting where the administration will go with Social Media and the web, having initially been its champion (and arguably having gained victory by its application).
---------------------
This has almost become a Revolution Round-up Lite, and I have answered some, but only a few, of the comments here. I'm aware that I haven't commented recently; I have been reading your comments and following your links, but other matters (incoming video rushes) have had us all tied up this past week. Hopefully I will be able to do your comments and input more justice on Friday.
Many thanks,
Dan
Complain about this comment (Comment number 40)