Syllabus Week had been going fine, with no major surprises or glaring errors in judgment. I was almost in the clear, even halfway through my Friday 9 a.m., until – “You all have a Twitter, of course?” my professor chirped. “If not, get an account ASAP.”
My heart sank. I’d almost gone four years Twitter-free in the journalism college, where the motto is pretty much “Tweet or die.” I just didn’t get the appeal. When Twitter first launched, I predicted 60 days in the spotlight, max. It was like the Sarah Palin of the Internet: silly, ill-adapted to challenges, nice to look at but hardly compelling. If we ignore it, it’ll go away.
But it didn’t. Instead, Twitter celebrated its sixth birthday last week. Somehow, it’s become a legitimate tool loved by many, including professional journalists, not just D-list celebrities desperate for fans.
Those little words attached to number symbols are everywhere. People say, “Follow me on Twitter!” like it’s not weird to encourage stalking. In my senior capstone class, uploading a profile photo is an actual homework assignment. #smh
But my profession is dying (if you haven’t heard), so I did what I had to do. I chose a corny username, designed a really cool background and began documenting the experience for this column: The Twitter Chronicles.
For our first assignment, I felt like an alien in a new Twitterverse, surrounded by classmates already fluent in the native shorthand. It didn’t even seem convenient: Squeezing my message into 140 characters took way longer than writing a lengthier, normal sentence would have.
Finally, my first tweet was out there, but it felt incomplete. Why, I wondered, were budding journalists encouraged to sacrifice clarity and thoroughness for brevity? We’re taught to tell the whole story, to value accuracy above all else. Twitter’s mini-messages seemed counterproductive to journalism’s core goals.
People tell me they often find out about major news events via Twitter. At first, I scoffed. How could they trust the accuracy of hastily written tweets? In the rush to be first, journalists sacrifice double-checking. Facts are flubbed, phrases are misleading and people (Joe Paterno) are prematurely pronounced dead.
Then I explored a bit more, going on a “follow” rampage of anything that interested me. That was a mistake. My page was immediately clogged with the inner musings of athletes and musical artists. I’m interested in their talents, not the color of their snot. #Tellittoyourmother
After adding bona fide news organizations, however, I began to see Twitter’s utility. All the headlines were in one place, and no medium is updated as frequently. It saved me from slogging through piles of news, but also gobbled up my free time and intensified that stressful “there’s so much to read and watch” feeling.
Maybe the idea of Twitter itself didn’t bother me – rather, it was the strain of adding yet another site to my daily checklist. Luckily, I was introduced to iGoogle, which allows you to customize a homepage for your Internet browser. It provides an endless buffet of content: news headlines, photo collections and yes, gadgets connecting to your social media accounts. #Googleforpresident
These aggregator sites are the key to juggling everything the 21st century has to offer and to getting various Internet services to play nice together. Twitter is intended to enrich our knowledge while saving us time, but until I could seamlessly incorporate it into my routine, it did the opposite. Now, instead of wasting time flipping from page to page all day, I simply leave the iGoogle tab open. Twitter seems way less obnoxious and actually kind of cool.
Uh-oh. Am I actually enjoying this thing I’ve stubbornly snubbed for so long? I’ve definitely eaten a few of my words, but not all of them. I still don’t see the point of tweeting – if anyone’s interested in my ramblings, that’s what my columns are for – and it’s still weird not knowing whether your stalkers are robots or real people.
Alissa Gulin is a senior journalism major. She can be reached at gulin@umdbk.com.