Saturday, June 7, 2014

More about why live tweeting conferences is bad

Yesterday, I blogged about why live tweeting from conferences is not worth it. (Link here.)

Such tweets are difficult to comprehend, lack context without a detailed explanation, and may be detrimental to the people doing the tweeting because they aren't paying attention to the lecture.

I have never had such a response to a blog post before. Live tweeters were highly indignant that I should question what they are certain is the greatest marvel in medical education since the invention of PowerPoint.

I said it took a minute to compose and type a tweet; many claimed they could do it much faster. Others said they use their tweets as notes for later reference.

A twitter colleague, Dr. John Mandrola (@drjohnm), unknowingly stepped into the conversation by posting a link to an article [link fixed 6/8/14] in The New Yorker about college teachers proposing to ban laptops in their classrooms.

It referenced a 2003 study from Cornell "wherein half of a class was allowed unfettered access to their computers during a lecture while the other half was asked to keep their laptops closed."

"The experiment showed that, regardless of the kind or duration of the computer use, the disconnected students performed better on a post-lecture quiz. The message of the study aligns pretty well with the evidence that multitasking degrades task performance across the board."

A New York Times piece about handwriting said, "For adults, typing may be a fast and efficient alternative to longhand, but that very efficiency may diminish our ability to process new information."

It cited a study showing "that in both laboratory settings and real-world classrooms, students learn better when they take notes by hand than when they type on a keyboard. Contrary to earlier studies attributing the difference to the distracting effects of computers, the new research suggests that writing by hand allows the student to process a lecture’s contents and reframe it—a process of reflection and manipulation that can lead to better understanding and memory encoding."

The New Yorker article concluded, "Institutions should certainly enable faculty to experiment with new technology, but should also approach all potential classroom intruders with a healthy dose of skepticism, and resist the impulse to always implement the new, trendy thing out of our fear of being left behind." [Emphasis mine.]

15 comments:

Anonymous said...

When I live tweet, I do it as part of a larger social media strategy, usually on behalf of a medical association. I don't do it when I want to learn.

Skeptical Scalpel said...

Anon, thanks for commenting. I think what you said reinforces my point. You readily admit that live tweeting is not conducive to learning.

Diane said...

Would this apply to doctors taking notes about a patient on their computers during an appointment? We have noticed that many doctors now face their laptops and type for the duration of the appointment rather than speaking directly to the patient and entering the data afterward.

Alexey said...

I agree with you that there is a problem with "live tweeting" a talk while paying enough attention to what the speaker says and means. You have to be very focused on a talk to capture and understand everything. Taking into account the usual weak wifi in the room, tweeting can be very distracting.

But I'd argue that "live tweeting" and "real-time tweeting" are not the same thing. I do it a lot at conferences, and I call it "live coverage." It all depends on the value of your tweets and your incentive. If you tweet to inform your professional community about exciting new things in the field, you don't need to tweet in real time during the talk. The community will appreciate your tweets even if they come later; it's the value of the tweets that matters, not a real-time feed. So I usually make notes during the talk and post 2-3 tweets in the 2-3 minute switch between speakers. During a 15-minute coffee break, I tweet what was in the talk an hour ago. During a one-hour lunch break, I tweet what I heard that morning. From my hotel room after 6 pm, I tweet what I learned that day. You don't need to be real-time to be valuable to the community.

You can also summarize all your notes in one post (on TwitLonger.com, Facebook, your blog, G+, etc.) instead of tweeting in real time; the value of the information would be the same.

I wrote about my experience and gave some advice here:
http://stemcellassays.com/2012/11/tweet-conference/

Gwynedd said...

I enjoyed the tweets during the International Stroke Conference a few years ago and the interaction they brought me with both my co-attendees and other people following along from home. I've often tweeted from meetings since (sorry, by the way, that you've probably had to mute me the last few days!)

I would not, however, classify it as a "great marvel of medical education", and I'm definitely aware of the problems with distraction if I use my laptop.

I like Alexey's comments regarding "real time" tweets. I do tend to live tweet during the session, but I generally don't read Twitter simultaneously, except for notifications, because it distracts me from listening. I catch up during breaks.

I do think it's possible to tweet salient points from conference sessions without making them overly terse or unreadable. It's the art of distilling it. I make a real effort to make each tweet self-contained, and not to use acronyms unless the tweet is such that only those who know the acronym would be interested in the tweet content anyway (e.g., people who don't know CCF means carotid-cavernous fistula probably don't care about the esoterics of treatment.)

Anonymous said...

I think your comment about the quality of tweets is valid. It would be helpful if there were some shorthand way to reference the speaker and session when tweeting comments.

Otherwise, tweeting from conferences allows valid opinions and discussions to occur in real time. There are often too many old, egotistical men appearing as key speakers at conferences, and the valid scientific arguments are lost in their self-centred rhetoric. Tweeting can democratize conferences, making people who aren't part of the majority feel a little more welcome.

I agree there can be improvements made in the quality of tweets, but improvements can also be made in the quality of conference speakers!

Anonymous said...

I am a patient, and I follow many rheumatologists on Twitter specifically because they live tweet from conferences. Not only are their tweets always of high quality, conveying complex thoughts/comments in relatively few words/characters, but I can understand quite a lot of them despite my lack of medical or other formal scientific education. These physicians offer me an opportunity to learn the latest thinking on various rheumatology topics that I, as a patient, would otherwise NEVER have a means of learning about, both because I am not welcome at these events and because I couldn't afford to attend them even if I were. And of course, for a variety of reasons, these are topics that would never be discussed with me during my office visits with my own personal physicians.
As for the rheumatologists who are doing the tweeting, I get the point of the New Yorker article, but when I see doctors discussing and debating among themselves the substances of the messages conveyed in the tweets, I can't help but think that if anything, this engagement in discourse, wherein they are hearing different perspectives on the same topic/issue, is invaluable and only *helping* them be better doctors, not hurting them.
Please do not discourage live tweeting from conferences.

Skeptical Scalpel said...

Great comments all.

Diane, I have no doubt that typing into a computer while taking a history affects the retention of information. I don't know if it has been studied. It should be.

Alexey, I agree that a summary of a few tweets or a blog post might be just as good a way to disseminate information. But some have commented that the discussion itself is valuable. I haven't had many of those experiences myself.

Gwynedd, I didn't mute you because the number of your tweets was reasonable. That, by the way, is another huge problem. Some people feel the need to tweet every word, starting with "Thank you, I'm happy to be here."

First Anon, I agree the quality of both the tweets and many speakers could be improved.

Second Anon, don't worry. I don't have the power to get people to do anything online (or anywhere else). I'm sure your Twitter feed will still see many live tweets from conferences.


hope said...

I'm part of the whole facebook-twitter-instagram generation and I don't have any of them. Personally, I feel like computers in the classroom, smartphones at conferences, and live tweeting are just ways for us to hide and indulge our ADD tendencies (let's face it, surgeons are all generally impatient when it comes to sitting and listening). I don't feel left out for not having facebook, because the people I really want to talk to, I do, and I'm not inundated every day with updates on the lives of people I don't *really* care about. I did have some of these social media tools for a while, but about 4 years ago I found that they were detracting from my life more than enriching it. In the medical school classroom, only about 2% of my classmates with open computers were actually doing anything useful. In daily conferences and at national conferences, the number of attendings on their phones is, well, appalling. We're all so disconnected from the present moment. We're constantly trying to plan the future or document the past. Our lives are lived through smartphones...it's pretty sad. Maybe I'll come around at some point, but for now I really feel that MORE information isn't BETTER information. Think about how much of the data you process daily from social media actually contributes positively to your life. It's actually quite little.

Mike McInnis MD said...

It seems to me that the New Yorker article addresses something altogether unlike live-tweeting a conference. That article is talking about students in a classroom who are using laptops to shop, read Facebook, etc. You could easily do that in a medical conference. But live-tweeting is NOT about disengaging from the speaker and playing Candy Crush. It's about paying MORE attention to what is being said so that you can share the "aha" moments with others.

Skeptical Scalpel said...

Hope, I think you are on to something.

Mike, that's not what I got from the New Yorker piece at all. It's not about paying MORE attention. It's about the fact that you don't absorb the information as well when you type.

Here's a quote: "Recent Princeton University and University of California studies took this into account while investigating the differences between note-taking on a laptop and note-taking by hand. While more words were recorded, with more precision, by laptop typists, more ended up being less: regardless of whether a quiz on the material immediately followed the lecture or took place after a week, the pen-and-paper students performed better. The act of typing effectively turns the note-taker into a transcription zombie, while the imperfect recordings of the pencil-pusher reflect and excite a process of integration, creating more textured and effective modes of recall."

And I suppose you're going to tell me that live tweeters are paying strict attention to the subject matter and never look to see if anyone RT'd, replied, or favorited.

hope said...

I definitely learn and process more when I write things down. Am I the only one out there who thinks the idea of "liking" or "favoriting" or "retweeting" anything on facebook, instagram, or twitter is so...childish in a way? Like something that a teenager would find gratifying? Why does the whole world need to see what you like or recommend and, further, what does that even mean?! Skeptical Scalpel, I laughed when I read your post about people re-tweeting things without even reading them. A prime example of how useless all this tweeting and liking is. It also sets up an inevitable comparison -- between things you've written or posted, between yourself and others, between yourself and society. How many likes you get for a photo or tweet seems to affect some people's days. It just seems so damn complicated to put that much effort into affirmation that comes externally. I went to three conferences this year, and the talks I found interesting I starred in my paper program, and then I emailed some people a few abstracts when I got home for further discussion. I also gave three talks this year, and was just laughing to myself before I went up to the podium each time, because I'd prepared so much beforehand, only to look around and see half the audience on smartphones/ipads/laptops as I was talking. Self-defeating!

Skeptical Scalpel said...

Hope, good points as usual.

I wish I could remember where I read it, but someone said social media was like a video game where you rack up points (followers, likes, retweets, etc.).

I gave a talk on social media in Sweden last year. When I began, someone in the back told me that every open computer displayed my blog. It is unclear how many people just started reading it and tuned me out.

Carolyn Thomas said...

Hallelujah! I've been griping about live-tweeting from conferences from the perspective of a person who has both done that (guilty as charged) AND been onstage as a conference speaker in front of an audience where it seems everybody's live-tweeting me.

As a recovering live-tweeter, I don't need any published studies to tell me that it is simply impossible to pay attention to a speaker while frantically composing, editing, and rewriting a tweet to fit.

When I was speaking at a recent Vancouver conference on social media in medicine hosted by the University of British Columbia, organizers actually asked me in advance to email them a list of tweets - with pertinent links - that matched my key slides. The conference team then live-tweeted my 140-character submissions in chronological order throughout my presentation. The advantage of this was the ability to offer Twitter followers a live link to an appropriate resource.

Most live tweets from conferences (the ones that don't contain a useful link to actual information or credible references) are inevitably inane, as you nicely describe here and in your first post on the topic.

But as a speaker, let me tell you how disconcerting and distracting it is to look out from the lectern over a sea of faces - all of them head down working away at their tablets or smart phones during my presentation. Call me old-fashioned, but I miss the good old days when audiences actually maintained some kind of eye contact with the person talking to them from the microphone.

Finally, here's a very useful tip for those who really feel they must live-tweet, courtesy of Joe McCarthy: http://gumption.typepad.com/blog/2013/11/a-modest-proposal-use-replies-and-hashtags-for-live-tweeting-and-tweet-chats.html

"I use the @reply mechanism to reference the event's Twitter handle at the START of each conference tweet - which hides the tweet from anyone who does not follow both me and the event - and then use the designated event hashtag so that anyone who is explicitly following the event hashtag can also see it."

If every conference tweeter followed Joe's advice, my own Twitter feed would not be clogged by an avalanche of time-wasters during conference season (like "Dr. A is about to speak next..." and other breathless prose).

regards,
C.

Skeptical Scalpel said...

Carolyn, thank you for agreeing with me and for the pertinent comments, especially about the inane nature of most live tweets.

The link and Mr. McCarthy's solution to the problem are spot on.
