“… it’s cloud illusions I recall; I really don’t know clouds at all.”
The great Joni Mitchell wrote these lyrics for “Both Sides Now” decades ago, and they are truer now than then. Especially when you apply them to cloud computing, which didn’t even exist when Joni put pen to paper.
Perhaps she had a premonition?
I’ve mentioned cloud computing before in this blog, but let’s go over its definition again for those of you not under 25 or members of Best Buy’s Geek Squad.
Cloud computing allows users to access applications and data hosted on remote servers from a computer, netbook, tablet, smartphone, or other device, anywhere, anytime. Applications are provided and managed by the cloud server, and data is stored remotely as well. Users do not download and install applications on their own devices; all processing and storage is handled by the cloud server, and the information lives online instead of on the device. These online services may be offered by a commercial cloud provider or by a private organization or company.
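For the technically curious, the whole idea fits in a few lines of Python. This is just my own illustrative sketch (the server name and API here are made up, not any particular provider’s): the device keeps nothing, and everything round-trips through the remote server.

```python
# A minimal sketch of the cloud model described above. The address
# "cloud.example.com" and its /files endpoint are hypothetical
# stand-ins for any real provider's storage API.
import urllib.request

BASE_URL = "https://cloud.example.com/files"

def upload(name, data):
    """Send the bytes to the remote server; nothing stays on the device."""
    req = urllib.request.Request(f"{BASE_URL}/{name}", data=data, method="PUT")
    urllib.request.urlopen(req)

def download(name):
    """Fetch the same bytes back, from any device, anywhere."""
    with urllib.request.urlopen(f"{BASE_URL}/{name}") as resp:
        return resp.read()

# A note saved from a phone at lunch...
upload("note.txt", b"Meet at the train station, 6 p.m.")
# ...is readable moments later from a laptop at home.
print(download("note.txt").decode())
```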
A familiar face
Enter (who else?) that ubiquitous company known by its signature fruit: Apple.
In case you haven’t heard this ancient news — announced last week — Apple unveiled its iCloud service, which will offer remote, wireless updates of music, photos, apps, and other data for iPhones, iPads, iPod touches, and computers. The company refers to this as “PC-free.”
USA Today writer Jefferson Graham noted recently that companies like Google and Amazon have been working on the “cloud,” the always-on computing model that lives on Internet servers. But Apple has taken this a step further, offering the same service to wireless-device users anywhere.
Leader of the pack
“The iCloud service, which will launch in the fall, replaces Apple’s failed $99 yearly MobileMe service, which is no longer accepting customers,” Graham wrote. “Reaction was swift: Apple’s move, and its soon-to-open $500 million new data center in North Carolina, puts it in a leadership position, analysts say.”
The service was demonstrated recently at Apple’s Worldwide Developers Conference, the mega-event for digital mavens held annually in San Francisco and Apple’s favorite venue for unveiling new products and services. In the demo, Apple VP Eddy Cue shot a photo on an iPhone, then opened an iPad and the iPhoto software on a MacBook (convenient that these are all Apple products, no?), and the photograph popped up on both screens within a few seconds.
Back it up
If that isn’t enough, Apple also notes that iCloud can be used as a back-up device for all your data.
“If you get a new iPhone, just type in your Apple ID and password, and everything will be downloaded to the new phone,” explained Apple CEO Steve Jobs.
Apple’s website simply notes, “iCloud stores your content and wirelessly pushes it to all your devices. And because it seamlessly integrates with your apps, everything happens automatically.”
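In programmer’s terms, that “pushes it to all your devices” line is the old observer pattern: one copy in the cloud, notifications to everyone signed in. Here’s a toy sketch, purely my own illustration, since Apple hasn’t published how iCloud actually does it:

```python
# A toy version of "push it to all your devices": when new content
# reaches the (hypothetical) cloud, every registered device gets a copy.
# An illustration of the idea, not Apple's actual mechanism.

class Cloud:
    def __init__(self):
        self.devices = []   # everything signed in with the same account
        self.content = {}   # what the cloud is holding

    def register(self, device):
        self.devices.append(device)

    def push(self, name, data):
        self.content[name] = data
        for device in self.devices:
            device.receive(name, data)   # no syncing chores for the user

class Device:
    def __init__(self, label):
        self.label = label
        self.local = {}

    def receive(self, name, data):
        self.local[name] = data
        print(f"{self.label} now has {name}")

cloud = Cloud()
for gadget in ("iPhone", "iPad", "MacBook"):
    cloud.register(Device(gadget))

cloud.push("vacation.jpg", b"...")   # shoot once, appears everywhere
```

The appeal for users is that the for loop is somebody else’s problem.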
Launching in fall
The iCloud service, which comes as part of Apple’s iOS 5 mobile operating system, is due to launch in the fall and will include 5GB of storage space. Apple says it has added over 200 new features to the updated system.
Tech writer Phil Goldenstein has probed the impact that iCloud will have on 3G networks, wondering if it will crush them. His answer: probably not.
“According to CCS Insight analyst John Jackson, Apple must have concluded that users of their products have access to Wi-Fi networks with sufficient regularity that the service will be broadly accessible,” Goldenstein writes. “But what happens if a user doesn’t have access to a Wi-Fi hotspot? Will traffic get routed over the cellular network? Or will the cloud upload just be put on hold until users get in range of a Wi-Fi access point? Apple isn’t saying.”
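If Apple chose the hold-it-for-Wi-Fi route, the logic might look something like this little sketch. To be clear, this is pure guesswork on my part; every name in it is hypothetical, since Apple isn’t saying:

```python
# Pure guesswork at a "hold it until Wi-Fi" policy. Nothing here is a
# real Apple API; the network state is simulated so the sketch can run.
from collections import deque

pending = deque()        # uploads waiting for a friendlier network
wifi_available = False   # simulated network state

def send_to_cloud(item):
    print(f"uploading {item} over Wi-Fi")

def queue_upload(item):
    if wifi_available:
        send_to_cloud(item)    # push right away over Wi-Fi
    else:
        pending.append(item)   # hold it; leave the 3G network alone

def wifi_came_back():
    global wifi_available
    wifi_available = True
    while pending:             # drain the backlog once in range
        send_to_cloud(pending.popleft())

queue_upload("photo_0142.jpg")  # shot on the road: deferred
queue_upload("photo_0143.jpg")  # deferred, too
wifi_came_back()                # home again: both photos go up
```

And if the answer turns out to be the cellular network instead?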
I think I just saw my cell phone bill increase. What else is new?
My grandfather was a Presbyterian minister who lived in the pastoral town of Aurora, Missouri, with his wife Janey. It’s a quaint little farming community (think Mayberry) in an area of the state now famous for the chaos of Nashville Lite, better known as Branson.
My dad would take us to visit Grandpa and Grandma once or twice a year, as our family boarded a Frisco rail car out of Oklahoma City that stopped in Aurora on its eastern run. If you have never experienced a night journey on a passenger train, you’ve missed something special.
Grandpa was one of the links in the chain of influence that led me into writing. Specifically, it was the sight of him and Grandma Janey sitting in their study at their big, facing mahogany desks and writing letters or sermons that stuck in my memory.
A quiet focus
Occasionally catching the other’s eye over their twin Royal manual typewriters, they were surrounded by rows of antique, glass-doored bookcases. There I was first introduced to James Fenimore Cooper’s books, like The Last of the Mohicans, and I found the whole experience of the books, Grandpa, Grandma, the desks, typewriters, and bookcases to be riveting.
Quiet reflections taking place in a cozy setting overlooking a vegetable garden on a sunny afternoon or moonlit evening.
As I retrace those memories tonight, I realize my wife Anne and I are replicating that pastoral setting of Grandpa’s study, but we’re doing it 21st-century style in our living room, writing and occasionally catching each other’s glance over the laptops resting on our outstretched legs.
No more Royals, just an HP and a Gateway instead. No more clacking of the cast iron key faces striking the paper and roller, just the nearly inaudible sound of our laptop keys striking … nothing. Not much need for rows of bookcases when you have the vast resources of the Internet at your command and resting upon your lap.
The magic of Grandpa’s study is gone.
The quandary of multitasking
Up until a few minutes ago, Anne and I were doing our laptop work while catching glances and while watching 20/20 on TV. All of us today know this scene as multitasking, and we’re getting pretty adept at it. The ability to do two or three things at once has become vitally important to many sojourners in the world of the virtual unknown.
In fact, I was leafing through an alumni magazine of Syracuse University’s Newhouse School of Journalism today and found an article alerting readers to “25 Newhouse Alums to Watch,” as their careers are apparently soaring. In an interview with one of these 25 best and brightest, the young grad was asked about the most valuable lesson she had learned in journalism school.
Her answer was multitasking.
Back to the scene tonight in my living room. I am reminded that there are times with Anne when multitasking is not such a great idea. It only works well when:
- Both of us are doing it at the same time.
- Anne doesn’t want to engage me in conversation.
If she wants to talk and I want to multitask (i.e., write and converse at the same time), things get dicey. Anne is big on eye contact, and I haven’t yet learned how to train one eye on the screen while letting the other wander over to her.
It’s at this point that Anne disagrees with me, however.
She believes she’s the better multitasker of the two of us, finding me too focused on the laptop to engage her. It’s a touchy debate, but I can see how we’re both right. Her idea of multitasking is to stop typing for a moment while she engages me in conversation; mine is to do both at the same time. I can see, though, how I’m sometimes less than convincing that I’m paying her as much attention as the computer.
In this, I doubt she and I are much different from other husbands and wives. I wonder how many arguments have erupted over multitasking. Maybe it isn’t so different from when our dads read the newspaper at the breakfast table while a frustrated wife sat on the other side of the large printed page.
It’s the principle
One might think that, since the laptop screen is much smaller and the spouse therefore easy to see, the problem would be solved. Alas, such is not the case. I suspect the friction would develop even if the multitasking involved something as small as a Droid or iPhone screen.
The principle is the same for the one insisting on eye contact: Without it, kiss the chat goodbye. And while you’re at it, kiss the good-night kiss goodbye.
Real interpersonal conversation remains old-fashioned. For best effect it requires the ability of each person to single-task. That’s not easy to do in this post-modern world.
Too many times we equate single-tasking with inefficiency.
Man in a hurry
If you’re a fan of TV Land’s ubiquitous The Andy Griffith Show, you may recall an episode called “Man in a Hurry,” where a cigar-puffing businessman, barreling his way to Raleigh, is delayed when his car breaks down in Mayberry on a Sunday and Gomer is asked to fix it quickly.
The multitasking executive is exasperated to think he has to endure a lazy afternoon on Andy’s front porch with Andy, Barney, Opie, and Aunt Bee instead of getting on with his important business in the city.
Exasperated, that is, until he falls under the charm of being rather than doing. He finds, in a 1960s way, that being requires the ease of single-tasking, even if that task is simply enjoying the moment.
The magic moment
When I see that episode, I think of my grandparents in their study where the most important time in the world to them seemed to be the moment they were in. And then I think about how far we’ve come from that; how we think the moment is wasted if we aren’t multitasking.
I know we can’t turn back the clock, but it would be nice just to turn back the laptop occasionally: leave it at home on weekends or while we’re on vacation, and spend a little time just enjoying the beauty of the moment, engaging family and friends.
I think we can do it if we can convince ourselves that, as smart as a computer is, we can be even smarter. At least as smart as my grandparents were back in Aurora.