Facebook shrinks your brain?

Comments of Baroness Susan Greenfield, a well-known British neuroscientist, during a debate in the House of Lords on the need to control social networking sites like Facebook and Bebo.

From Hansard's Debates: http://www.publications.parliament.uk/pa/ld200809/ldhansrd/text/90212-0010.htm

Baroness Greenfield: My Lords, I should like to congratulate the noble Lord, Lord Harris, on drawing attention to this timely problem. The social networking site Facebook turned five years old last week. Arguably, it marks a milestone in a progressive and highly significant change in our culture as tens to hundreds of millions of individuals worldwide, including the very young, are signing up for friendship through a screen. Other noble Lords may follow the noble Lord, Lord Harris, and speak on specific regulatory measures that may be taken to ensure that children come to no physical harm. We hope that personal safety and privacy are soon to be improved in the light of the recommendations made in the report on personal internet security from the Science and Technology Committee and in the Byron Review Action Plan. However, as a neuroscientist, I think that there are still two more basic and, if you like, brain-based questions that ultimately need to be addressed. First, why are social networking sites growing? Secondly, what features of the young mind, if any, are being threatened by them? Only when we have insights into these two issues can we devise more general safeguards, rooted not so much in regulation as in education, culture and society.

I turn to the first question, surely the most telling of all. What precisely is the appeal of social networking sites? First, there is the simple issue of the constraints of modern life, where unsupervised playing outside or going for walks is now perceived as too dangerous. A child confined to the home every evening may find at the keyboard the kind of freedom of interaction and communication that earlier generations took for granted in the three-dimensional world of the street. But even given a choice, screen life can still be more appealing. As Phillip Hodson, fellow of the British Association for Counselling and Psychotherapy, suggests:

“Building a Facebook profile is one way that individuals can identify themselves, making them feel important and accepted”.

Continuing that train of thought, I recently had a fascinating conversation with a young devotee who proudly claimed to have 900 friends. Clearly, there would be no problem here in satisfying that basic human need to belong, to be part of a group, as well as the ability to experience instant feedback and recognition, at least from someone, somewhere, 24 hours a day.

At the same time this constant reassurance—that you are listened to, recognised, and important—is coupled with a distancing from the stress of face-to-face, real-life conversation. Real-life conversations are, after all, far more perilous than those in the cyber world. They occur in real time, with no opportunity to think up clever or witty responses, and they require a sensitivity to voice tone, body language and perhaps even to pheromones, those sneaky molecules that we release and which others smell subconsciously. Moreover, according to the context and, indeed, the person with whom we are conversing, our own delivery will need to adapt. None of these skills are required when chatting on a social networking site.

Although it might seem an extreme analogy, I often wonder whether real conversation in real time may eventually give way to these sanitised and easier screen dialogues, in much the same way as killing, skinning and butchering an animal to eat has been replaced by the convenience of packages of meat on the supermarket shelf. Perhaps future generations will recoil with similar horror at the messiness, unpredictability and immediate personal involvement of a three-dimensional, real-time interaction. In the words of one user:

“The fact that you can't see or hear other people makes it easier to reveal yourself in a way that you might not be comfortable with. You become less conscious of the individuals involved (including yourself), less inhibited, less embarrassed and less concerned about how you will be evaluated”.

It is hard to see how living this way on a daily basis will not result in brains, or rather minds, different from those of previous generations. We know that the human brain is exquisitely sensitive to the outside world. This so-called “plasticity” has been most famously illustrated by London taxi drivers, who, as we know, need to remember all the streets of the city, and whose brain scans correspondingly revealed in one study that the part of the brain related to memory is bigger in them than it is in the rest of us.

One of the most exciting concepts in neuroscience is that all experience, every single moment, leaves its mark almost literally on your brain. So you have a unique configuration of brain cell circuits, even if you are a clone—an identical twin. It is this evolving personalisation of the brain that we could view as the mind, and it is this “mind” that could therefore be radically changed by prolonged exposure to a new and unprecedented type of ongoing environment, that of the screen.

So, we come to the second basic question: what might now be in jeopardy? First, I would suggest that it is attention span. If the young brain is exposed from the outset to a world of fast action and reaction, of instant new screen images flashing up with the press of a key, such rapid interchange might accustom the brain to operate over such timescales. Perhaps when in the real world such responses are not immediately forthcoming, we will see such behaviours and call them attention deficit disorder. It might be helpful to investigate whether the near total submersion of our culture in screen technologies over the last decade might in some way be linked to the threefold increase over this period in prescriptions for methylphenidate, the drug prescribed for ADHD.

Related to this change might be a second area of potential difference in the young 21st century mind—a much more marked preference for the here-and-now, where the immediacy of an experience trumps any regard for the consequences. After all, whenever you play a computer game, you can always just play it again; everything you do is reversible. The emphasis is on the thrill of the moment, the buzz of rescuing the princess in the game. No care is given for the princess herself, for the content or for any long-term significance, because there is none. This type of activity, a disregard for consequence, can be compared with the thrill of compulsive gambling or compulsive eating. Interestingly, and as an aside, one study has shown that obese people are more reckless in gambling tasks. In turn, the sheer compulsion of reliable and almost immediate reward is being linked to similar chemical systems in the brain that may also play a part in drug addiction. So we should not underestimate the “pleasure” of interacting with a screen when we puzzle over why it seems so appealing to young people; rather, we should be paying attention to whether such activities may indeed result in a more impulsive and solipsistic attitude.

This brings us to a third possible change—in empathy. One teacher of 30 years’ standing wrote to me that she had witnessed a change over the time she had been teaching in the ability of her pupils to understand others. She pointed out that previously, reading novels had been a good way of learning about how others feel and think, as distinct from oneself. Unlike the game to rescue the princess, where the goal is to feel rewarded, the aim of reading a book is, after all, to find out more about the princess herself.

Perhaps we should therefore not be surprised that those within the spectrum of autism are particularly comfortable in the cyber world. The internet has even been likened to sign language, considered as beneficial for autistic people as sign language proved for the deaf. Of course, we do not know whether the current increase in autism is due more to increased awareness and diagnosis of autism, or whether it can—if there is a true increase—be in any way linked to an increased prevalence among people of spending time in screen relationships. Surely it is a point worth considering.

Finally, I draw your Lordships’ attention to a fourth issue: identity. It seems strange that in a society recoiling from the introduction of ID cards, we are at the same time enthusiastically embracing the possible erosion of our identity through social networking sites. One 16 year-old intern who worked in my lab last summer summed it up as follows:

“I can see that Facebook makes you think about yourself differently when all your private thoughts and feelings can be posted on the internet for all to see. Are we perhaps losing a sense of where we ourselves finish and the outside world begins?”

With fast-paced, instant screen reactions, perhaps the next generation will define themselves by the responses of others; hence the baffling current preoccupation with posting an almost moment-by-moment, flood-of-consciousness account—I believe it is called Twitter—of your thoughts and activities, however banal.

In summary, I suggest that social networking sites might tap into the basic brain systems for delivering pleasurable experience. However, these experiences are devoid of cohesive narrative and long-term significance.

As a consequence, the mid-21st century mind might almost be infantilised, characterised by short attention spans, sensationalism, inability to empathise and a shaky sense of identity.

When talking about safeguards, surely we need also to think about safeguarding the mindset of the next generation so that they may realise their potential as fully-fledged adult human beings. Of course we cannot turn back the clock, nor would that be any solution to maximising the individual’s potential in this new century. However, surely the Government could consider investing in some kind of initiative, the goal of which would be the identification of realistic alternatives—be it in the classroom, on the screen, in conjunction with the media, or in society as a whole—for developing a sense of privacy and identity and, above all, a real appreciation of friendship.