I started using Twitter as an experiment, and it was the first and only social network I really participated in. It was great because I actually made new friends that I went on trips with, got the opportunity to follow the thoughts of interesting people, and whenever I was in a quandary, I just had to holler a question and would get plenty of answers and advice in return.
But I was uneasy because I was feeling jaded. I thought it was the typical “overdoing it” problem, but there was more to it: Twitter was affecting my ability to think critically and deeply about a subject.
Why am I thinking so much about a social network? As David Allen once said, “Pay attention to what has your attention.” And clearly, Twitter had more of my attention than it should have.
Since my attention span was shrinking from books to blogs, and then from blogs to tweets, I was being converted “from a thinker to a clicker”.
Getting Your Fix
I think of this situation as getting your fix. Think smoking vs. coffee. Both are stimulants, and both are legal. But because smoking affects the people around you, you have to step outside to indulge in it, which makes it less convenient. That barrier is probably one reason more people are addicted to coffee: it is simply more convenient.

The analogy may be imperfect, but consider reading blog posts vs. reading books. Books demand a sustained investment of attention, so more people prefer blog posts; they are more convenient. The same goes for blogs vs. tweets: the latter is more convenient still. Keep going down this path and your ability to think becomes restricted to 140 characters. Twitter gives you an instant high for having published or read something, so you lose the persistence that longer reading requires, you tend to think a lot less, and quick wins keep you from going after bigger wins.
The problem with the shorter fix is that you indulge in it more often, and it delivers less stimulation in the long run. Consider the difference between, say, a five-day work week of nine-hour days with two-day weekends vs. six hours of work every day with no weekends and no holidays. Which would you prefer? This is how I argue that a book once in a while will give you more stimulation than a hundred tweets. Consider the signal-to-noise ratio: only tools like filtrr.com can filter out the #ipl chatter, whereas a book gives you a broad understanding of a particular subject. In the long run, it is more enriching to go deeper into subjects than to be “restricted” to a buffet of them.
As a sort-of substitute for Twitter, I’ve shifted to a del.icio.us network. After all, most of Twitter is sharing links, and Delicious doesn’t have the downside of frivolous tweets. Delicious also shows how many people have bookmarked a link, which is another indicator of whether something is worth reading. Even better, links are tagged appropriately, so I immediately know what topic to expect from an article, instead of “This is cool <insert link>.”
The Attention Psychology
Let’s think about attention in terms of psychology, which I am trying to learn a little about from The Mouse Trap blog:
U = E x V (where U is utility of act; E is expectancy as to whether one would be able to carry out the act and if so whether the act would result in desired outcome; and V is the Value (both subjective and objective) that one has assigned to the outcome.)

Maximizing Predictability
While selecting an action we maximize reward and minimize punishment, basically we choose the maximal utility function; while choosing which stimuli to attend to we maximize our foreknowledge of the world and minimize surprises, basically we choose the maximal predictability function; we can even write an equivalent mathematical formula: Predictability P = E x R, where P is the increase in predictability due to attending to stimulus 1; E is the probability that stimulus 1 correctly leads to prediction of stimulus 2; and R is the Relevance of stimulus 2 (information) to us. Thus the stimulus one would attend to is the one that leads to the maximum gain in predictability. Also, similar to the general energy level of the organism that biases whether, and how much, the organism acts; there is a general arousal level of the organism that biases whether and how much it would attend to stimuli.
As I understand it, the first part means that when we expect a lot from something, its perceived utility rises, and with it its value. And because Twitter supplies that dash of randomness we crave, its perceived utility is much higher than its real utility.
The second part means that we want to know more about the world in order to face fewer surprises, and hence we tend to read more and more, especially information that we perceive as relevant to us.
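To make the two quoted formulas concrete, here is a toy sketch in Python. All the numbers are made-up illustrations of the argument above (a tweet feels like a near-certain small reward, a book an uncertain large one), not measurements from the Mouse Trap post.

```python
def utility(expectancy: float, value: float) -> float:
    """U = E x V: expected payoff of performing an act."""
    return expectancy * value

def predictability_gain(probability: float, relevance: float) -> float:
    """P = E x R: gain in predictability from attending to a stimulus."""
    return probability * relevance

# Checking Twitter: almost guaranteed to deliver a small hit of novelty.
tweet = utility(expectancy=0.9, value=2)
# Reading a book chapter: less certain to pay off, but worth far more.
book = utility(expectancy=0.4, value=10)

# The book wins on utility, yet the tweet's high expectancy is what
# makes it feel compelling moment to moment.
print(tweet, book)
```

The point of the sketch is that inflating E (the “dash of randomness” making every check of Twitter feel likely to pay off) inflates U without any change in V, which is one way to read the post’s claim that Twitter’s utility seems higher than it really is.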
Bottom line: I question whether ever more information, delivered ever more immediately, is really necessary for us.
Think of all the great things that have been achieved, whether the motor engine or the music stereo. Would they have been created if their to-be-creators had been constantly distracted, with low attention spans? Where is the time to get inspired if we’re always mentally tired?
Why Can’t We Concentrate?
I will finish up with excerpts from this excellent article on Salon called “Why Can’t We Concentrate?”:
Here’s a fail-safe topic when making conversation with everyone from cab drivers to grad students to cousins in the construction trade: Mention the fact that you’re finding it harder and harder to concentrate lately. The complaint appears to be universal, yet everyone blames it on some personal factor: having a baby, starting a new job, turning 50, having to use a Blackberry for work, getting on Facebook, and so on. Even more pervasive than Betty Friedan’s famous “problem that has no name,” this creeping distractibility and the technology that presumably causes it has inspired such cris de coeur as Nicholas Carr’s much-discussed “Is Google Making Us Stupid?” essay for the Atlantic Monthly and diatribes like “The Dumbest Generation: How the Digital Age Stupefies Young Americans and Jeopardizes Our Future,” a book published last year by Mark Bauerlein.
You don’t have to agree that “we” are getting stupider, or that today’s youth are going to hell in a handbasket (by gum!) to mourn the withering away of the ability to think about one thing for a prolonged period of time. Carr (whose argument was grievously mislabeled by the Atlantic’s headline writers as a salvo against the ubiquitous search engine) reported feeling the change “most strongly” while he was reading. “Immersing myself in a book or a lengthy article used to be easy,” he wrote. “Now my concentration often starts to drift after two or three pages. I get fidgety, lose the thread, begin looking for something else to do. I feel as if I’m always dragging my wayward brain back to the text.” For my own part, I now find it challenging to sit still on my sofa through the length of a feature film. The urge to, for example, jump up and check the IMDB filmography of a supporting actor is well-nigh irresistible, and once I’m at the computer, why not check e-mail? Most of the time, I’ll wind up pausing the DVD player before the end of the movie and telling myself I’ll watch the rest tomorrow.
… What this commonplace crisis comes down to is our inability to control our own minds. You may, like Traister, need to buckle down and write, or you may, like Carr, pine for the deeply engaged style of reading we bring to books and New Yorker profiles. You may, like me, realize that your evening will be more enjoyable and more enriching if you commit to the full 110 minutes of “Children of Men” instead of obsessively checking out your friends’ Facebook updates or surveying borderline illiterate reader reviews — or, for that matter, browsing through the “Seinfeld” reruns in your Tivo Suggestions queue. In many cases, the thing we wish we would do is not only more interesting but ultimately more fun than the things we do instead, and yet it seems to require a Herculean effort to make ourselves do it.
What to do? For most people, bailing on the Web or e-mail or cellphones isn’t even feasible, let alone practical or ultimately desirable. (I shudder at the thought of living without my beloved Tivo.) Besides, modern life really isn’t making us stupider: IQ tests have to be regularly updated to make them harder; otherwise the average score would have climbed 3 percent per decade since the early 1930s. (The average score is supposed to remain at a constant 100 points.) And IQ measures problem-solving ability, rather than sheer data retained, which has grown even faster over the same interval. Each of us knows many more people and facts than our counterparts of 100 years ago; it’s just that the importance of those people and facts remains somewhat uncertain. Knowing a little bit about Lindsay Lohan and Simon Cowell (two people I recognize despite having no active interest in either one) can’t really be equated with knowing a bit about Marie Curie or Lord Mountbatten. We have more information, but it isn’t necessarily more valuable information.
Winifred Gallagher’s new book, “Rapt: Attention and the Focused Life,” argues that it’s high time we take more deliberate control of this stuff. “The skillful management of attention,” she writes, “is the sine qua non of the good life and the key to improving virtually every aspect of your experience, from mood to productivity to relationships.” Because we can only attend to a tiny portion of the sensory cacophony around us, the elements we choose to focus on — the very stuff of our reality — are a creation, adeptly edited, providing us with a workable but highly selective version of the world and our own existence. Your very self, “stored in your memory,” is the product of what you pay attention to, since you can’t remember what you never noticed to begin with.
… Gallagher deserves credit for calling our attention to attention itself, specifically to the way it works neurologically. In essence, attention is the faculty by which the mind selects and then zeroes in on the most “salient” aspect of any situation. The problem is that the brain is not a unified whole, but a collection of “systems” that often come into conflict with each other. When that happens, the more primitive, stimulus-driven, unconscious systems (the “reactive” and “behavioral” components of our brains) will usually override the consciously controlled “reflective” mind.
If you’ve read this far, thanks, because you’re one of the few people who still have a healthy attention span. Half of the people who didn’t read it probably found it “too boring” (which means it exceeded their attention span), and the other half skipped it because there were no bullet points or pictures.
BUT I’m still not saying that you should give up Twitter; you just have to be careful about how you use it (duh!). Every time you tweet, just remember when you last read a whole book :)