Posts Tagged ‘NYTimes’


Last night at about 8:45 pm PT I found out that Osama Bin Laden had been killed. Here’s how that went down: my brother picked up his iPhone, glanced through his Twitter feed, and announced the news as we were waiting for the opening credits of an action movie we had come to see to close out our Sunday night.

It being Twitter, I suspended disbelief, but felt reasonably confident that the news was true, given the groundswell of information being tweeted about it. And then I moved on with my evening.

Is there anything wrong with that?

Yes. And, no.

Yes, because it was truly a momentous event. It was, in many ways, the culmination of ten years of searching and frustration, of made and broken political careers, of physical demonstrations of strength and power, of alarmed admissions of weakness and ignorance, of aggression and intolerance, of inner turmoil and acceptance, of American tragedy and dark, dark American comedy. And all I did was continue to sit in my seat and watch a very sub-par movie.

No, because I knew that the next few days would unfurl themselves before me in a constant stream of information about his whereabouts for the last ten years, where he was killed, how he was killed, Obama’s thoughts on his death, everyone’s thoughts on his death, analyses of how this will affect the Presidential race, pronouncements of how this will affect Obama’s legacy in office, and general societal responses to the news of his death. And I would be there to read, watch, listen to, and ingest it all.

The thing about fast-breaking news these days is that it breaks, and it continues to break like a wave hitting the continental shelf over and over and over. This phenomenon gives modern news consumers time to digest that information from all chosen angles, from all chosen sources.

All of that, and all I’m really taking away from this news is that I am utterly relieved to see a contingent of contacts within my sphere who are conflicted about unabashedly cheering someone else’s death, even if that person is arguably the most hated man of the 21st century. This contingent includes my brother, a member of the U.S. Army Reserve, who was twice deployed to Iraq.

I continue to believe that the greatest American patriots in the world are those who continue to question, and, where fitting, condemn, the loss of life as a necessary price of freedom and security, and who ask our government whether the loss of life abroad is truly a necessary precondition for maintaining American democracy.

In related news, an obituary for Osama Bin Laden in the NYTimes? A poignant statement in the city that lost the most at his hands.


October 15, 2010 - http://www.gather.com/viewArticle.action?articleId=281474978605059

Oh goody, as the New York Times reported on October 10th, Twitter has finally come up with a plan to make money. Only, it’s the old new plan, which is to say it’s the same plan as everyone else.

As Twitter’s Evan Williams stepped down as CEO to make room for Dick Costolo, who previously headed Twitter’s advertising program, the tech industry remarked on how the shuffle represented Twitter’s renewed commitment to monetization.

As the New York Times reported, “Twitter’s startling growth — it has exploded to 160 million users, from three million, in the last two years — is reminiscent of Google and Facebook in their early days. Those Web sites are now must-buys for advertisers online, and the ad industry is watching Twitter closely to see if it continues to follow that path.”

But there still seems to be no real innovation in the advertising models of high-tech companies, the very companies from which the world expects a great deal of innovation. Why are high-tech social media and social news aggregation companies having such a hard time innovating on their monetization strategies?

At this point, each new social media platform that comes along seems to jump into the online advertising market that Google forged largely on its own. Now that Google has done the heavy lifting on education and we all speak and understand the language of “click-through rates,” “impressions,” and “search engine optimization,” newcomers like Twitter don’t have to pay or do very much in order to enter this monetization space. Not coincidentally, it would seem that they aren’t doing very much at all to evolve it.

As a result, the whole online ad framework is falling flat. After a few years of evangelizing for social media advertising and the use of new media platforms like Twitter and Hulu, are advertisers really making more money and seeing the benefits of these new media? It’s becoming an embarrassingly redundant question: “Yes, we know we are creating funny and entertaining media for our consumers to enjoy, but is it actually increasing sales?”

Interestingly, at this year’s gathering of the Association of National Advertisers, as the New York Times reported, a survey at the beginning of the opening session found that “marketers may still need some schooling on the dos and don’ts of social media. Asked to describe how its use has affected sales, 13 percent replied that they did not use social media at all. (Eleven percent said sales had increased a lot, 34 percent said sales increased ‘some’ and 42 percent said they had seen no change.)”

It would seem that media analysts continue to approach social media and search as a given element of any marketing strategy, without any hard evidence as to why every company needs to integrate social media into its marketing strategy. Instead, without the numbers to make the case, analysts and marketers still discuss the virtues of earned media versus paid media, the value of eyeballs and impressions, and earned equity.

One of this year’s smashing social media success stories has a particular ability to make marketers foam at the mouth. 2010’s Procter & Gamble “smell like a man” campaign for Old Spice helped increase the brand’s followers on Twitter by 2,700%, to where they “now total almost 120,000.”

Marc Pritchard, global marketing and chief branding officer at Procter & Gamble, had his moment in the sun for what was, undoubtedly, the most high-profile and successful example of how modern brands can use social media to promote themselves. But in the coverage of Pritchard’s talks, there is little to no mention of how the campaign is actually affecting the company’s bottom line. Instead, there is this: “The currency the campaign has earned in social media has pushed it into the popular culture. Mr. Pritchard showed the audience a spoof that was recently introduced by Sesame Workshop in which Grover suggests that his young viewers ‘smell like a monster on Sesame Street.’”

But an internet meme does not a year-over-year increase in sales make. There is no mention of how an increase in Twitter followers converts into a percentage increase in sales. It’s as if an equation is missing, or somehow we have all misunderstood how to connect the dots. At the conference, Joseph V. Tripodi, chief marketing and commercial officer for Coca-Cola, was interviewed, and his only contribution to this dilemma was to discuss how social media can sometimes save a company money on promotions through viral videos: “It cost less than $100,000 to produce the video,” he added, demonstrating that “you don’t need huge amounts of money to engage with consumers.” However, savings on a marketing budget also do not a sales increase make.

Refreshingly, one of the conference’s keynote speakers, Mark Baynes, vice president and global chief marketing officer at the Kellogg Company, did acknowledge the missing link in the social media to profits equation by proclaiming, “In God we trust; the rest of you bring data.”


October 08, 2010 - http://www.gather.com/viewArticle.action?articleId=281474978584382

The New York Times today posted a ReadWriteWeb story about Google’s recently launched contest to encourage young kids to begin learning to code: “The Google Open Source Program is announcing a new outreach effort, aimed at 13- to 18-year-old students around the world. Google Code-in will operate in a similar fashion to Google’s Summer of Code, giving students the opportunity to work in open-source projects.” While this is great PR for Google, and an admirable program to boot, it’s also a fascinating example of how today’s largest and most successful companies are assuming a significant role in the training and formation of their future workforce in the U.S.

A couple of years ago a viral video featuring a Flash-animated presentation titled “Did You Know?” made the rounds and introduced us to incredible factoids about the modern world we live in. One information nugget that stood out among the many others was “the top 10 in-demand jobs in 2010 didn’t exist in 2004… We are currently preparing students for jobs that don’t yet exist… Using technologies that haven’t been invented… In order to solve problems we don’t even know are problems yet.” It was a startling, yet very believable statement, and one that many people have since cited.

A now-dated 2006 Forbes article addressed this fact and listed jobs that don’t yet exist but should be in high demand within 20 years, jobs that will disappear within 20 years, and jobs that will always exist. For example, some jobs that are expected to disappear are booksellers, car technicians, miners, cashiers, and encyclopedia writers (if they haven’t already). The jobs of the future it presented were slightly ominous and depressing in a sort of sci-fi way: genetic screening technicians, quarantine enforcers, drowned city specialists (Atlantis, anyone?), robot mechanics, and space tour guides. Lastly, those jobs that will always be around? Pretty self-explanatory. Prostitution is always high on the list, as are politicians, religious leaders, barbers, and artists.

However, if everyone can’t be a hair stylist, how do we prepare the world’s children for an entire generation of jobs we don’t even know about? Among educators, the prevailing sentiment is that the best we can do is arm tomorrow’s kids with problem-solving skills, critical-thinking skills, and endless curiosity. However, since most teachers are dealing with a very archaic and traditionally designed curriculum, much of the responsibility for training and forming the world’s new thinkers may continue to fall upon the shoulders of tech giants like Google, Facebook, and Twitter. It is much easier to consider what future skills will be needed when your entire survival as a company depends upon being able to look into a technological crystal ball and anticipate the future needs of an entire world.


September 22, 2010 - http://www.gather.com/viewArticle.action?articleId=281474978539098

Hegel famously proclaimed that “history is a dialectic,” that is, a dialogue between people who may hold differing views but who seek to establish a basis of truth by debating together. In other words, history has no single discernible truth; it more closely approaches the overall goal of “truth” through discussion among all of the voices of history and their personal accounts of what happened.

This quotation of Hegel’s is often cited in the context of discussions about the literary canon, or the “Western canon,” as some refer to it. The term “Western canon” is used to denote the selection of books, music, art, and general culture that has most influenced the shape of Western civilization and culture over time.

A simple search on Wikipedia for either of these terms will tell you much about what they are. What Wikipedia doesn’t explicitly tell us, however, is that it is also holding the record of how the modern canon is determined, and of how the truth of history is being determined by the myriad voices which contribute to it every day.

A recent Bits blog post from the New York Times mentioned the trail of edits that the Internet provides to anyone who is looking for it. James Bridle, founder of BookTwo, is particularly interested in what the future of literature holds, but also in how that discussion is playing out and how we can track where the discussion has been. In one of his recent entries, Bridle points out that although an article on Wikipedia may tell a specific story, the edits show a process of opinion, correction, and the potential biases of each writer. In this respect Wikipedia, and every constantly updated website, represents an archive of information evolving over time. What interests Bridle is that this offers two distinct stories: one that is front-facing to the reader and one that reveals the behind-the-scenes editing, writing, and creative process.

To illustrate the point, Bridle selected the Iraq War entry from the Wikipedia canon and had the article’s entire edit history published as physical volumes. In his entry, Bridle writes, “This particular book — or rather, set of books — is every edit made to a single Wikipedia article, The Iraq War, during the five years between the article’s inception in December 2004 and November 2009, a total of 12,000 changes and almost 7,000 pages.” Bridle notes that the entire set comes to twelve volumes, nearly the size of a traditional encyclopedia.

Which brings us to the favorite comparison between Wikipedia and your parents’ encyclopedia. Is one or the other more reliable? Who gets to decide what is part of the overall Western canon? Shouldn’t we all be alarmed by a process in which a child may be permitted to contribute to an online encyclopedia that many now claim is an expert source?

In fact, Bridle’s point reminds us of a standard strategy employed to defend the credibility of Wikipedia and its process against its would-be detractors. The strategy is to cite a story central to the process under which the Oxford English Dictionary was compiled in the 19th century. Simon Winchester’s book, The Professor and the Madman: A Tale of Murder, Insanity, and the Making of The Oxford English Dictionary, details a Jekyll-and-Hyde story of the brilliant but clinically insane Dr. W.C. Minor, who provided thousands of entries to the editors of the OED while he was committed at the Broadmoor Criminal Lunatic Asylum. In other words, if a madman may contribute significantly to a tome of the English language which is still very much the authoritative text today, why can a perfectly sane pre-teen not contribute to the modern canon of information about frogs, Major League Baseball, or global warming? Should we be preventing anyone from contributing to the ever-evolving conversation about what is truth and what is history?

As sites such as Twournal, which offers the narcissistic boon of publishing your very own tweets through time in print form, begin to proliferate, each of us can possess our very own piece of the modern web canon, whether in print or online. As Twournal describes itself, “Over time tweets can tell a story or remind us of moments. In 20 years we don’t know whether twitter will be around but your Twournal will be. Who knows maybe your great grandkids will dig it up in the attic in the next century.”

That means that each of us can now print a credible-looking book of our own (often misspelled) musings and meanderings as representative of history, according to us. Yet in the absence of a forum in which people can engage with our tweeted observations, there’s no real dialectic. It therefore seems safe to conclude that Hegel would have preferred Wikipedia to Twitter, or to your Twournal.