Question: When is the new media not the new media?
Answer: When it is the new new media.
That’s both good news and bad news: It’s good if you’re media-savvy and eager to push the envelope; it’s bad if you’re still having trouble grasping the basics of e-mail and Microsoft Office.
Either way, we don’t have much time to dawdle if we want to keep up with the changes.
But does doubling the adjective of new media mean we’re just dealing in doubletalk? My friend Chuck, who teaches at Boston’s Northeastern University, would say no. Last August, he expressed his frustration to me when he asked, “Why do people insist on calling computers and the Net ‘new’ media? I mean, they’ve been around for a couple decades now.”
Chuck isn’t alone in his thinking. Paul Levinson, a Fordham professor and author, has just written a book called New New Media (http://blogcritics.org/books/article/book-review-new-new-media-by/) to describe the second layer of media that the Web 2.0 age has thrust upon us. Here’s how he distinguishes between the different media categories:
“One of the defining characteristics of new media – clear since they arose over a decade ago in the mid-1990s – is that people can use, enjoy and benefit from them on the user’s rather than the medium’s timetable, once the content has been posted online.
“… New new media give its users the same control of when and where to get text, sound, and audio-visual content as provided by the new media. Indeed, new new media package all the advantages that new media provide over old media. But new new media do more … the true or fully empowered new new media user has the option of producing content and consuming content produced by hundreds of millions of other new new media consumer-producers.”
Many people refer to these new new media simply as the social media that include blogs, Facebook, MySpace, Twitter, and the like. Levinson agrees the new new media are intrinsically social because of the way individuals use them to meet, greet, and communicate with each other.
But he notes, “The social aspect of new new media, though crucial, is not unique enough to new new media to warrant our use of the terms interchangeably … other primary elements of new new media – such as the consumer becoming a producer – can be easily practiced individually, not socially, as in writing a blog post or recording a podcast.”
Among Levinson’s categories of new new media are the following:
Print, Audio, Audio-Visual, Photographic. These elements play a role in all new new media.
News. Says Levinson, “News pertains to the purpose, not the media form, of the new new medium … Wikipedia, Digg, and blogging in general would be the leading examples of new new media whose primary purpose is to inform.”
Social Media. Again, while all new media are “social” to some extent, media like Facebook, MySpace, and Twitter are meant primarily to connect people to each other. So they become leading examples of the social media.
General vs. Specific Systems. Specific media systems devote themselves to just one application such as videos (YouTube), information (Wiki), or news headlines and links (Digg). General media systems have more than one application such as podcasting, blogging, vidcasting, and connecting people.
Politics and Entertainment. President Barack Obama became the first presidential candidate to understand fully the power of the new media, and the way his campaign staff used his website, e-mails, Facebook, and Twitter to connect with young voters set the standard for future candidates. Entertainers can and do accomplish the same thing to further their own promotion and careers.
Blogging and Microblogging. Blogs have become an important part of the new new media, and Levinson distinguishes between the short-burst blogs (microblogs) of Twitter and Facebook status updates, as opposed to the longer, more fully developed blogs – like the one you are reading now. He notes that the microblogs are more alluring to individuals engaging in personal chat on the Web than the longer blogs.
There are other categories (such as those defined by the hardware or software that delivers these media: smart phones vs. laptops, for instance), but this sampling gives you an idea of the range of purposes, applications, and uses that the newer media are providing us today. It will be interesting to see how these and other media forms develop in the year 2010.
It was New Year’s Eve 1999, and no one needed to tell me that new media technology was changing my life. I had a good first-hand indication of that by simply looking around my university office and noting what a strange place to spend the year’s best party night.
I was in Memphis at the time and, instead of joining the revelers down on Beale Street, I was nursing my computer through the dawn of Y2K and the cyber-War-of-the-Worlds invaders that could be poised to strike its hard drive.
What a waste of time.
No invaders struck that night; not even the building’s cleaning crew, who were probably down on Beale Street enjoying B.B. King.
What did I know, though, on this first night of what was to be the decade when virtual reality changed our daily — and nightly — rituals.
I wasn’t the only one wasting that night away, though. People a lot smarter than me thought something huge and scary would happen, and a lot of them were pulling overtime to nurse their computer systems through the night.
Even the White House had spent some $50 million of our tax dollars to set up a crisis room called the Y2K Center, just to be on the safe side. Our money might better have been spent on a project like the federal Office of Education once rolled out when it spent $220,000 for a “curriculum package” to show college students how to watch television.
Our foresight seldom matches our hindsight, however, and while we misjudged the Y2K threat, we just flat didn’t see other media tech changes coming. How’s this for a partial list of just three changes we’ve witnessed in this first decade alone, and of how these inventions have altered our lives:
Take Me to Your Tweeter. While some of us were getting our toes wet in the virtual world of relationships that chatrooms and online dating sites offered by 1999, we were only like the Pilgrims standing on the easternmost shore of a vast, unexplored continent. Up ahead were the vistas to be opened up by trail guides to be called MySpace (launched in 2003), Facebook (2004), and Twitter (2006). Individually and together, these three sites have given new meaning to the late Marshall McLuhan’s provocative observation from the 1970s that human beings go outside to be alone and stay inside to be with others. McLuhan was thinking about the amount of time Americans spend with television, but he realized that the “others” weren’t necessarily real flesh-and-blood people you could reach out and touch. We’ll return to this concept in detail in future posts.
The Sound of Music. The culture of the Internet is not only openness but also immediate gratification: There is so much we can get (often for free) almost the minute we recognize our desire for it. I’m thinking music here, and I’m remembering the frustration of losing albums, tapes, and CDs containing favorite songs of earlier times. I’m also thinking about the challenge of searching for those songs in old record stores and the utter joy of stumbling across them when I least expected it. That frustration, as well as that joy, was part of an age when music downloads over the Web didn’t exist.
It was before Shawn Fanning and Napster; before LimeWire, Kazaa, iTunes, BearShare, and YouTube. Apple would introduce its first iPod in 2001, other manufacturers would develop myriad other MP3 players, and today we have our favorite music when and where we want it. For music-lovers like me, that’s a definite improvement. But I do miss those unexpected finds in the corner bins of those vintage record stores.
It’s a Small, Small World. As 2000 dawned, most of America was still using desktop computers. Laptops like IBM’s ThinkPad were out there, but they were expensive and hadn’t yet permeated the country the way they soon would. Before the decade was over, we had not only switched largely from desktops to laptops, but we had also downsized from laptops to hand-held computer/phone/PDAs like the BlackBerry and the iPhone. Now the Net goes with us, and we can pull it out of our pocket virtually anytime we like. Wi-Fi has enabled that, freeing us from the Ethernet cable and that archaic phone cord I was still looping from room to room in my dial-up days as late as 2004. All this mobility means we can now isolate ourselves from real people anywhere we are. I suppose the ultimate tech-savvy date today has a guy and a girl sitting across a table from each other at the Cheesecake Factory, mute to each other but tweeting another soulmate via their BlackBerrys or iPhones.
I’m not sure what Marshall McLuhan would make of that. Would this be a virtual date or a real one?
One of my favorite scenes from the holiday classic A Christmas Story is when “the old man” and Mrs. Parker take Ralphie and his younger brother Randy to the big, crowded Higbee’s department store in downtown Hohman, Indiana (Cleveland in the real world) to visit a bad Santa and his elf, who wind up terrorizing both boys.
Still, there are days when one of my least favorite scenes is actually inserting myself into one of those big holiday shopping crowds. On those days, I stay home and shop online.
But I may be fooling myself in thinking I guard my privacy more by shopping at home rather than going out to the stores. More on that later.
A lot of us are letting our fingers do the walking, if the records of online shopping activity are accurate and if the smaller mall crowds of the past couple of years are any indication. Of course, the recession helps create a larger diameter of personal space at those malls, too. One hopes, for the sake of the economy, that this is a temporary downturn.
Many still like to shop the old-fashioned way, and it does help a person’s holiday spirit to get out with others who are engaging in the selfless activity of giving to others. All the Christmas trim, music, and even the bad Santas add to that merriment. More of us are even finding it alluring to climb out of bed at 4:30 a.m. the day after Thanksgiving to at least go see what Black Friday is all about down at the discount store.
Still … comScore, a publicly traded company that measures digital shopping activity, reports that online shopping continues to rise in popularity this holiday season. A Dec. 8 press release from comScore notes that, through Dec. 13, nearly $20 billion had been spent online, a 3-percent increase over the corresponding days last year.
And the American Research Group has noted that those planning to shop online in 2009 (42%) have surpassed those planning to shop from catalogs (36%) for the first time since these records were kept.
The sales stats for online shopping are still just a fraction of those for the retail stores; retail shoppers spent $41 billion on the Black Friday weekend alone this season, according to the National Retail Federation. But $20 billion in online sales shows that the virtual world of the Web is affecting the way we spend our money. Add that effect to the myriad other ways that computers and the Web are changing our lives. Like other changes, this one is a mixed bag. We can shop in a private environment, but ironically we give up privacy in the process.
Before getting to that touchy problem, it is fascinating to see how an online shopping conglomerate like Amazon.com, for example, works. The company, which has been the subject of several stories about the long hours its employees must work, is also the story of an innovator in e-commerce retailing.
In 2006, writer Julia Layton described Amazon’s growth and process on the site HowStuffWorks (http://money.howstuffworks.com/amazon). She wrote, “In 1995, Amazon.com sold its first book, which shipped from Jeff Bezos’ garage in Seattle. In 2006, Amazon.com sells a lot more than books and has sites serving seven countries, with 21 fulfillment centers around the globe totaling more than 9 million square feet of warehouse space.” And if you’ve ever wondered how Amazon and other e-retailers like it just seem to know the kind of stuff you’re interested in, Layton describes that, too:
“The embedded marketing techniques that Amazon employs to personalize your experience are probably the best example of the company’s overall approach to sales: Know your customer very, very well. Customer tracking is an Amazon stronghold. If you let the Web site stick a cookie on your hard drive, you’ll find yourself on the receiving end of all sorts of useful features that make your shopping experience pretty cool, like recommendations based on past purchases and lists of reviews and guides written by users who purchased the products you’re looking at.”
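The kind of tracking Layton describes can be sketched in a few lines of code. What follows is purely a hypothetical illustration, not Amazon’s actual system: a visitor ID (standing in for the browser cookie) keys a purchase history, and recommendations are drawn from the items most often bought alongside things the visitor has already purchased.

```python
from collections import Counter, defaultdict

# Hypothetical order log: visitor ID (from a browser cookie) -> items bought.
# An illustrative toy data set, not a real retailer's data model.
orders = {
    "cookie-1001": ["novel", "bookmark", "reading lamp"],
    "cookie-1002": ["novel", "bookmark"],
    "cookie-1003": ["novel", "reading lamp", "coffee mug"],
    "cookie-1004": ["coffee mug", "novel"],
}

# Count how often each pair of items appears in the same order.
co_purchased = defaultdict(Counter)
for items in orders.values():
    for item in items:
        for other in items:
            if other != item:
                co_purchased[item][other] += 1

def recommend(visitor_id, top_n=2):
    """Suggest items most often bought alongside this visitor's past purchases."""
    history = set(orders.get(visitor_id, []))
    scores = Counter()
    for item in history:
        scores.update(co_purchased[item])
    # Don't recommend what the visitor already owns.
    for owned in history:
        scores.pop(owned, None)
    return [item for item, _ in scores.most_common(top_n)]

print(recommend("cookie-1002"))  # → ['reading lamp', 'coffee mug']
```

Run against the toy order log, the shopper who bought a novel and a bookmark is nudged toward the reading lamp and coffee mug that other novel-buyers picked up. Real recommendation engines are far more sophisticated, but the principle is the same: the cookie links your visits together, and your history becomes the raw material for the pitch.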
That’s nice, and it gives you the feeling that a company really does know you. But it also has its downside: the further erosion of individual privacy, which will be the subject of future Virtual Unknown posts. It is much easier for a company to track our online shopping habits, thereby developing a “profile” of us we used to consider private until we chose to disclose it, than for an actual retail store to do the same. Unless you offer up your e-mail address, Zip code, or phone number to the sales clerk, there’s no reason to assume the store will know anything about you at all.
I used to have a professor at the University of Missouri who wondered if there might come a day when television (PCs didn’t factor in then) would be watching us more than we watch it. That day has arrived, although it is the computer and not the TV screen that is doing the watching. You could make the case that online retailers like Amazon may know us and our consumer desires better than we know ourselves.
So whether we choose to do our shopping online or by visiting the physical stores and mixing it up with the crowds — and that one particular person who may be reaching for the last George Foreman grill that you wanted — we are facing tradeoffs either way. It may be nice to shop from the privacy of our own homes, but we may be giving up that privacy to the online marketers who are gobbling up information about us even as we make our buys.
The witty CBS newsman Charles Osgood once noted, “The future is not what it used to be.”
Although he wasn’t speaking about computers, he well might have been. For in the world of communication technology, application runs at a slower speed than innovation. We never know what uses the world will make of new inventions.
I confronted my first computer at Northeastern University in Boston back in 1983. It was a large IBM desktop that our department was given to play around with. Trouble is, no one knew how to play. I had heard computers could take you to a realm known as cyberspace, so I sat down, punched the monitor’s “on” button, and braced myself to be swept away.
But the black screen with a little white cursor just blinked back at me. I had no idea you needed something called “software” to make this box come alive, or that you had to connect it to the phone line to let it take you anywhere. I was applying television logic to a computer, which took that logic a few steps further. What did I know.
So I sat there for more than a half-hour, expecting magic to happen. It never did, so I turned it off and gave up. The computer started collecting dust, and we eventually wound up using it as a shelf to stack books on. That was our application.
In 1985 I bought my first computer. It was one of those boxy Macs (still called by the full Macintosh name then), with the built-in nine-inch black-and-white monitor, all of 128K of random access memory (no hard drive in sight), and the processor all in the same cube. It had exactly two applications (MacWrite and MacPaint), but you could hook it to a phone line if you dared to navigate the infant Internet without the help of user-friendly browsers.
The price tag was about $2,000 for what many considered just a step up from an electronic typewriter, and which weighed about the same.
Still, this cube was destined to change the face of personal computing forever. Not only did it inspire the many Macs to come, it also inspired Microsoft Windows software. You can still see the early-day Macintosh in Jerry’s apartment if you are a Seinfeld addict as I am.
So here we are, 25 years later, and things have changed just a wee bit, no? Today the average selling price of a desktop is under $600. And the 128,000 bytes of memory has morphed into billions of bytes on today’s computers, at a fraction of the cost and size. That is especially so when you factor in laptops and the even-smaller notebooks. If you bring “smart phones” into the picture, you can stuff all this into your shirt pocket for a couple hundred bucks.
All of these upgrades, including lightning-fast processors and a mountain range of software, are today connected to a sophisticated Internet accessible through unbelievably user-friendly browsers like Explorer and Netscape. The computer applications have mushroomed geometrically from the early MacWrite and MacPaint to a myriad of uses in both office and home.
The creative ways we’ve found to use computers directly parallel the creative uses radio was put to, way back in the 20th century. Remember that millennium? Radio was created to be a point-to-point medium linking one ham operator to another, or one ship to one port. The idea of using it for broadcasting was unknown because radio’s pioneers were too busy inventing the medium itself. The same holds true for computers.
Maybe a personal story tells it best: I bought that Mac back in ’85 mainly to create and store words. Fifteen years later I used a computer to find and meet the woman who would become my wife.
The one use all of us make of today’s computers is, of course, surfing the Internet. In its short existence, the Web has grown from mainly an online research tool into an essential means of meeting and interacting with other people, sharing information about ourselves in the process. It’s the concept known familiarly as “social networking,” and the phenomenon is growing faster than our ability to think up words and phrases to describe it. It has affected all of us in very basic and expansive ways.
In short, the computer and the Web have changed our lives, and more changes are waiting in the wings. It is a virtual world we visit when we go to the Web, and it is a world we spend more and more of our time inhabiting. In so doing, we pull ourselves away from the real flesh-and-blood people – including our wives or husbands, kids, and friends – who may be sitting right across the room from us.
Anyone who has done any Internet dating knows that we can, and often do, create an alter ego in this virtual world. The persona we create may not be a lie, but it is something of a fantasy, both about the person we’re getting to know and about ourselves. We emphasize the positive, blurring their ideal and real selves and our own, to make us both sound just a touch more intriguing.
The fact that this kind of identity blurring occurs at all is interesting, and it is aided by the time we spend in a virtual world where the real and the unreal appear synonymous. For Baby Boomers, this confusion may have begun in the 1950s with Fess Parker’s version of Davy Crockett. But today’s world of CGI (computer-generated imagery) movies and the virtual realities of Facebook, Twitter, and MySpace have made it harder to tell the real from the unreal.
Even the idea that we come to “know” another person online is something of a fantasy. Why? Because most of the all-important nonverbal cues are missing, even in the age of Skype and webcams, where a persona can be propped up for a while just as it can on TV. The only way real nonverbal communication takes place is the old-fashioned way: face-to-face, real-life human contact.
We know some of the opportunities and dangers that await us in virtual reality, but many remain unknown. Charting these known and unknown waters is what A Virtual Unknown is all about. In the posts to come we will look at some of the impact that the digital world has thrust upon us, some of the changes in technology that are translating into changes in how we live our lives and relate to others, and do some futurizing on what changes lie ahead.
I look forward to this journey of discovery and I would love to hear your thoughts and stories about it, too.