Thursday, November 19, 2009

Rich Lowry Comes to Campus

National Review editor Rich Lowry came to campus yesterday, and his talk was a great supplement to Arianna Huffington's speech two weeks ago. Though Lowry and Huffington are on opposite sides of the political spectrum, both had similar sentiments about our current media climate.

Lowry noted that "legacy media, an artifact of institutions, corporations, and technological limitations," is quickly coming to an end. He pointed out many of the flaws of current corporate media, such as the selectivity of sound bites and the bias of video. While many would argue that video is proof of the truth, I particularly liked Lowry's comment disputing this claim: "Video can lie, depending on how it's edited and how much you see." As I shoot, edit, and produce a 30-minute documentary for my TV Workshop class, I know how true that statement is. I constantly have to consider the implications of sound bites taken out of the context of full interviews.

Lowry talked about how today's new media is actually returning to the ideals of the earliest journalism. Historically, newspapers were partisan. Somewhere down the road, the myth of objectivity was born. New media prides itself on being unabashedly partisan, and Lowry sees this as one of its assets: "They don't even try to be objective. And from where I sit, that's a profoundly good thing."

Why? Lowry referenced John Stuart Mill's marketplace of ideas and the theory that only through the collision of adverse opinions will truth be obtained. When asked during the question-and-answer session whether partisan new media will result in "splintering" and people reading only what they already agree with, Lowry admitted this is a possible problem, but added that alternate views are easily accessible for those who are curious. Our professor, Jeff Cohen, chimed in to say that internet journalism often brings both sides of an issue to the forefront because the subject of criticism is frequently linked to. Links, rather than videos, are becoming the new evidence of truth.

Wednesday, November 18, 2009

Word of the Year: Unfriend

A few weeks ago, I read about how being unfriended on social networking sites can bruise your digital ego. I laughed, but had to admit the complexities of social networking friendships. I personally refuse to friend anyone on Facebook whom I haven't actually met in person, but I've still had my share of internal debates over confirming "friendships" with people I have met. Is it okay to accept friend requests from campers I counseled at summer day camps who want to keep in touch? Professors I currently have for classes? High school classmates I know for a fact loathed me, but who are curious enough to want to keep tabs on me?

That last example was the first dilemma I faced, way back freshman year when I was a Facebook newbie. My boyfriend at the time convinced me to accept the "friendship," telling me, "You can't not friend someone on Facebook. It's rude." I gave in.

While I've faced various friending dilemmas since then, it's gotten easier for two reasons. First, Facebook's privacy settings are a lifesaver, allowing me to limit what certain friends can see. Second, I've taken down all questionable photos, making sure that everything on my profile would be acceptable for a potential employer to see. When my aunt recently joined Facebook and friended me, I had a mini heart attack -- until I remembered I don't have anything to hide. If it's okay for a potential employer to see, it's okay for my aunt. (Admittedly, I dread the day my mom joins Facebook. I rather like that she can't keep such close tabs on me at the moment.)

Since Facebook friendships are much more tenuous than real friendships, they sometimes come to a rather abrupt end. My roommate Hannah recently unfriended someone simply because that person kept stealing her points on Food Friendzy, a Facebook application that awards you coupons to local food places. While Hannah felt perfectly justified in defending her Food Friendzy points, she did feel a twinge of guilt over the stigma of unfriending someone.

Apparently Hannah and I are not alone in our friending dilemmas. The New Oxford American Dictionary has selected "unfriend" as its word of the year.

Tuesday, November 17, 2009

Can You Tweet Libel?

Can you tweet libel? Is Google Earth an invasion of privacy? Who owns the copyright to user-generated content on Facebook?

As social media continues to burgeon, these are some of the issues courts are currently debating. Courtney Love is being sued for libel after slamming a designer on Twitter. So is Amanda Bonnen, who tweeted about a realtor's moldy apartment. A couple took Google to court over Google Earth, claiming the Street View feature was an invasion of their privacy. And Facebook was put in the hot seat after amending its terms of use and sharing user info on partner sites.

Technology is evolving so fast that the laws can't keep up. Everything the Internet is praised for -- providing easy access to information, connecting people who are geographically far away, breaking down barriers between public and private domains -- complicates lawmaking. If two Facebook users sue each other, one in England and the other in Australia, which country's laws have jurisdiction? Can colleges deny degrees based on students' Facebook content? How far does Web anonymity extend?

Clearly, the courts have their work cut out for them as they wrestle to revise existing media laws. And by the time they decide what to do about Facebook and Twitter, other new media are sure to leave them newly stumped.

Sunday, November 15, 2009

Goodbye Objectivity, Hello Links

The idea of transparency as the new objectivity has come up repeatedly in my journalism classes this semester. In my Issues & the News class, we were comparing newspapers around the world, and our professor mentioned that the idea of an "objective" paper is not the status quo everywhere. In the United Kingdom, for instance, partisan papers are the norm, and people know exactly which papers to read if they want a liberal or conservative slant to their news.

In my Independent Media class, we just read David Weinberger's blog post on this subject. Weinberger argues that objectivity, a value highly prized in print media, will become obsolete as more and more of our news shifts to an online format. While newspaper readers may be content to take a journalist's claim at face value, online media consumers are more skeptical -- and more active. They want links so they can click away and find out more about the topic in question. Online news is a starting point rather than a stopping point, inviting further investigation and discussion. In Weinberger's words:

"Transparency prospers in a linked medium, for you can literally see the connections between the final draft’s claims and the ideas that informed it. Paper, on the other hand, sucks at links. You can look up the footnote, but that’s an expensive, time-consuming activity more likely to result in failure than success. So, during the Age of Paper, we got used to the idea that authority comes in the form of a stop sign: You’ve reached a source whose reliability requires no further inquiry."

In an era of citizen journalism, when anyone can contribute news content, the idea of journalists as authority figures is rapidly disappearing. Instead, they are now filling the role of conversation facilitators, providing readers with resources to learn more on their own.

Weinberger concluded his blog post by saying: "In short: Objectivity is a trust mechanism you rely on when your medium can’t do links. Now our medium can."

During her visit to Ithaca College two weeks ago, Arianna Huffington further elucidated Weinberger's point by saying, "We are now living in a linked economy." She then went on to talk about what this means: content locked behind walls no longer works -- online readers want their news for free and have no tolerance for subscription fees.

If the nightmares of technology-phobic teachers ever become a reality and textbooks go solely digital, links would take the place of the bold vocabulary words that hint to students that a concept is important and should be looked up in the glossary at their earliest convenience. In a digital world, links would signal to students that more information is available with the click of a mouse.

In many ways, this is already happening. On an average day, I conduct upwards of 10 Google searches, mainly to find out more information about current events I've just read about on CNN.com. Even more shocking? One classmate recently told our Issues professor that Wikipedia has been the sole source of her history lessons these past four years of college.

Tuesday, November 10, 2009

The Best - and Worst - Places to Be a Journalist

My sister sent me this article with the warning: "Choose wisely, my traveling sister." The Time article summarizes the 2009 Press Freedom Index's findings on which countries have the most - and least - press freedom. Following Obama's inauguration, the U.S. jumped up the list from 36th to 20th. Topping the list are Denmark, Finland, Norway, Sweden and Ireland. The bottom three slots are occupied by Turkmenistan, North Korea and Eritrea, where "the media are so suppressed they are non-existent." Other notable changes in this year's report include Europe losing three spots in the top 20, Israel facing increased censorship and five African countries making it into the top 50. The official Press Freedom Index 2009 can be found here.

And just for fun, while we are on the topic of countries censoring the press, I found this cartoon a few days ago that shows the emerging power of new media.

Friday, November 6, 2009

Smart Journalism

When Professor Vadim Isakov spoke as a guest in our class yesterday about emerging technology trends, I couldn't help but think of the Disney Channel movie Smart House. When the movie was released ten years ago, the idea of a house run by an internal computer system seemed little more than cool science fiction. Today it seems a very possible reality.

Professor Isakov talked about nine new technology trends: real-time Web, lightblogging, personalization, interactive TV, identity recognition, augmented reality, mobile life, geolocation, and an internet of things. Many of these trends entail tailoring media to your specific needs, be it finding the nearest pizza place by a voice command on your phone or ordering Pam's outfit with a click of the remote as you're watching The Office. Other trends facilitate life on the go: QR codes on cell phones that act as boarding passes and the ability to verbally dictate a blog post via your cell phone. Still others, like Roomba vacuums that run automatically, are robots that perform pesky human chores.

I particularly liked Professor Isakov's response to the question of whether all this new technology eliminates the need for journalists. "If you define a journalist as someone who gathers information, you don't need journalists anymore," he said. However, he immediately added that we continue to need analysis, explanation and fact-checking.

Journalists' jobs are certainly changing, but they remain crucial. Journalists are evolving from information gatherers into information synthesizers and, most importantly, analysts. As my high school teachers used to say, "A monkey can copy text from the book. I want you to tell me what it means."

Furthermore, the best journalistic writing has personality to it. It has sass. Humor. Attitude. It's littered with pop culture references. A computer or robot can be programmed to compile facts, but the end result will lack creativity and anecdotal evidence. It will be a fact sheet rather than a story.

Robots will not oust journalists from their jobs, but they may improve certain parts of the journalistic process. Personally, I'd love a gadget that would automatically transcribe all my interviews from my recorder onto my laptop. Anyone want to invent that?

Thursday, November 5, 2009

The Antiquity of TVs

"I hate movie theaters," I often told my high school boyfriends. "The seats are uncomfortable and the stupid armrests make it impossible to cuddle. Can't we just stay home and rent a movie?"

In college, I was quickly introduced to a whole new style of movie watching. Very few of my friends owned TVs, but all of us owned laptops. Hence, our laptops became our makeshift TVs.

Three years later, TV watching on my laptop has become so automatic that I see my laptop as half computer, half TV. It serves both functions, and best of all is portable so I can take it wherever I go.

I couldn't help but laugh as I read David Sarno's article "Wanna share ear buds?" Clearly not a member of the Net Gen, Sarno acknowledges that releasing a movie solely online has economic advantages, but struggles with the idea that people would actually want - dare I say, prefer - to watch a movie on a 3-inch by 2-inch iPod screen instead of a gigantic movie theater screen.

"Nothing says romance like sharing ear buds," Sarno facetiously writes. Even Eddy Cue, vice president of iTunes, seemed clueless of the full potential of online-only movie releases. He suggested people may watch movies on their iPods during long-distance traveling or at the gym - in other words, when they simply lack access to traditional, full-size TVs.

But why do we even need TVs anymore? To Baby Boomers who cling desperately to print newspapers and landline telephones, getting rid of TVs probably sounds catastrophic. (This summer, after fighting with cords for half an hour and still failing to hook up our massive TV, I suggested to my mom that we watch the rented movie on my laptop. She reeled back in shock and replied, "But the screen's so small!") But to the average Net Gener, TVs are obsolete. They're clunky and singularly functional. Laptops are mobile and multifunctional. There's no contest. And that's not to mention the unnecessary cost of cable versus free online viewing.

Maybe I've been living the life of a skint college student for too long. My viewing habits have, admittedly, largely been dictated by convenience, a shoestring budget, and the limited dimensions of college living space. But you know what? When I graduate in May and get my own grown-up apartment, I see no need to buy a TV. I'll use that money to buy a new iPod.