Last week I caught the third and final Jeopardy episode where IBM’s supercomputer “Watson” took on the best of the show’s best contestants – Ken Jennings and Brad Rutter – and pummeled them into bits and bytes.
Fun stuff, but also sobering. It left me with the same uneasy feeling I had after first hearing Johnny Cash sing years ago about rail-splitter John Henry racing the steam-driven, spike-driving machine — and losing.
I’ll always pull for the human over the humanoid.
Experts hailed Watson’s decisive win ($77,147 vs. a paltry $24,000 for Jennings and $21,600 for Rutter) as a technological breakthrough in the race between artificial intelligence and the real deal, and I suppose it was.
Even Jennings, the all-time Jeopardy champ with winnings into the millions, expressed his awe of Watson, saying, “I for one welcome our new computer overlords.”
A humbling statement for Jennings and, in a larger vein, for the human race. Will we get to the point where computers can out-think us mere mortals? Will we arrive at a day when computers will not need humans to input data? Can they originate their own?
Could Watson invent a Watson? Pradeep Khosla, dean of engineering at Carnegie Mellon University, says no; at least not yet. Dr. Khosla believes it is the human ability to create that separates us from computers, and that we are not in danger of losing that unique ability to a computer any time soon.
But others say the separation between man and machine exists on other levels, too.
More than creativity
In an article they wrote, Seth Borenstein and Jordan Robertson of the Associated Press note, “Experts in the field say it is more than the spark of creation that separates man from his mechanical spawn. It is the pride creators can take, the empathy we can all have with the winners and losers, and that magical mix of adrenaline, fear and ability that kicks in when our backs are against the wall and we are in survival mode.”
Time Magazine did an interesting cover story in its Feb. 21 edition. Titled “2045: The Year Man Becomes Immortal,” it features writer Lev Grossman quoting experts who say we are only a few decades from the point where computers will become more intelligent than humans.
35 years and counting
Grossman quotes author/inventor/futurist Raymond Kurzweil in particular, and writes this: “According to his calculations, the end of human civilization as we know it is about 35 years away. Computers are getting faster …Also, (they) are getting faster, faster …. There might conceivably come a moment when they are capable of something comparable to human intelligence. Artificial intelligence.”
And when they hit that point, there is no reason to suspect they will not keep getting even faster and continue growing in intelligence, the prediction goes.
Time adds another voice to this prophecy by quoting author Vernor Vinge, who says, “Within 30 years, we will have the means to create superhuman intelligence. Shortly after, the human era will be ended.”
Is it just me, or is it hard to get excited over that?
These guys and others wrap this phenomenon up in a single word, and that word is – fittingly – Singularity. It means the moment when technological change becomes so rapid and profound, it represents a rupture in the fabric of human history.
Exciting or scary?
Again, unless you live your life in a computer lab, it seems hard to get excited about all this. Other words come to mind first. Words like depressed and maybe even fearful.
This dates me and sounds pretty low-tech by today’s standards, but does anyone remember HAL from Stanley Kubrick’s 1968 sci-fi classic, 2001? When the astronaut decides to take the arrogant computer down a peg, HAL asks in his sinister voice, “Just what do you think you’re doing, Dave? I really think I’m entitled to an answer to that question. I don’t think I’ll let you do that, Dave.”
In the metaphysical realm, you also have some obvious differences between man and machine. A lot of differences, starting with the existence of the human soul.
Could a computer somehow generate that? I know some auto enthusiasts who could swear their prized car has a soul, but are we all on the same page in defining what that is? And is it possible for a metallic box held together with screws to develop any kind of entity approaching a soul?
HAL came close to evidencing a moral – amoral would be more accurate – side, but we’re talking movies here, and a 43-year-old one at that, predicting what the future would look like 10 years ago. Oops. Got that one wrong.
As for Watson, the concept of progress is defined in different ways by different people. Both here and in the virtual unknown.
For anyone with a vested interest in the Internet, these are perplexing times.
They are troubling for a lot of reasons, but many of them have to do with the culture of total openness on the World Wide Web. The idea is to let everyone put everything out there for everyone to see, and to let those words and images do whatever good or damage they may.
But an overarching question is this: How much access do Americans need to information which – if revealed – could cause some serious problems?
Problems on the doorstep
We’re talking both micro-level problems and macro-level problems, ranging from individual humiliation to national security threats.
On one level we have teens committing suicide over unwanted personal disclosures tossed out like birdseed on social media or via texts. On the other level we have federally classified secrets being leaked at random via a site with that word in its title: WikiLeaks.
This blog has spoken on three occasions about the micro-level problem, so let’s talk a few minutes about that other one.
From the WikiLeaks website, we get this introduction to what it is all about:
“WikiLeaks is a non-profit media organization dedicated to bringing important news and information to the public. We provide an innovative, secure and anonymous way for independent sources around the world to leak information to our journalists.
“We publish material of ethical, political and historical significance while keeping the identity of our sources anonymous, thus providing a universal way for the revealing of suppressed and censored injustices.”
Sounds pretty good, no?
Its founder is Julian Paul Assange, an Australian journalist turned software developer turned (according to his own site) internet activist. He created WikiLeaks in 2006 and is editor-in-chief of the whistleblower website.
Since it began, WikiLeaks has been praised by some, deemed controversial by others, and condemned as traitorous by still others. In its short five years of existence it has published sensitive material about Guantanamo Bay practices and policies, Church of Scientology manuals, and – most recently – classified information about American involvement in the Iraq and Afghanistan wars. Even more recently, it has revealed contents of secret U.S. diplomatic cables, many of which were deemed classified.
Praise and prosecution
On its home page, WikiLeaks quotes Time Magazine as saying it “could become as important a journalistic tool as the Freedom of Information Act.” Assange himself has been recognized for his efforts by Amnesty International and was runner-up for Time Magazine’s Person of the Year in 2010.
But Assange also has some big problems. He has been charged by Swedish authorities with sexual misconduct and, after being detained by British police, is now under house arrest at an estate in England, pending possible extradition to Sweden.
A couple weeks ago, he was the focus of a 60 Minutes segment, and he believes he is being targeted by governments and their prosecutors because he allows secret information to be leaked over his site.
Apart from his personal legal problems, however, is the broader issue of WikiLeaks itself: what it is doing, and whether that is a healthy or unhealthy thing for the world. That is an issue that could be debated well into the next decade (and may well be, should the U.S. decide to prosecute Assange under the nearly hundred-year-old Espionage Act).
To the credit of WikiLeaks, no one doubts that people living in democracies need access to accurate and timely information if they are to play a meaningful role in the democratic process. That logic goes all the way back to Thomas Jefferson, if not before.
And no one doubts that whistleblowers who uncover dangerous, illegal, or corrupt practices should have protection from retaliation. Remember Dr. Jeffrey Wigand who exposed the practices by the big tobacco companies in the 1990s of making cigarettes more addictive through a secret ammonia-boosting process?
Further, we have seen over the past two weeks how unaware the West was about conditions that led to a people’s revolution in Egypt. One of the main reasons we didn’t know about it was that there were few foreign correspondents there to tell us about what was happening.
Why? Because media owners and managers have decimated the ranks of reporters, especially those covering international stories.
Bottom line: Without those boots on the ground discovering stories like that, how are we to know?
Okay, I’ll go ahead and say the obvious: “If a tree falls in the wilderness and no one is there to hear it, does it make a sound?”
Egypt made no sound for us, because we had no one there to hear it.
Filling a big hole
WikiLeaks can also help fill the gap left by investigative reporters who have been cut from newspapers and television stations. We’ve been living in some pretty perilous times without having many of these watchdogs guarding the premises.
Without them, the climate is more open for wrongdoers in business and government to practice corruption. But knowing sites like WikiLeaks can burst their secrecy bubble might make them behave just a tad better.
These are holes that WikiLeaks can and does fill. But does it also create other holes?
National security threat?
Is it punching holes in American national security? Should there be at least a few limits on the kinds of classified documents that are published? Shouldn’t we assume that the government has at least some need to guard state secrets, the revelation of which could compromise its people’s security?
The culture of the Internet is, again, one of openness. Very few controls exist on content published on the Web, and few leaders of governmental agencies relish the idea of being criticized for trying to establish information controls.
Deregulation is the trend
Even the FCC has been in a deregulatory mode since the Reagan years, backing off on regulating television, let alone the Internet. In fact, it has no mandate to control Internet content since the Web doesn’t come to us over the public airwaves.
But a culture of total openness exists within an American society where freedoms are not absolute nor limitless. We have laws regarding invasion of privacy and we have laws regarding libel.
Wrongful death claims coming?
And it may only be a matter of time before wrongful death claims are filed against individuals who leak humiliating information about other people who then hang themselves in their bedrooms because of it.
And, on that macro level, what happens if documents do get leaked that do have the power of compromising national security?
Against that reality stands the Internet and WikiLeaks. In a post-9/11 world, it’s not surprising that many people now think some limits should exist on what shows up on the Web. As always, though, the questions are who will regulate that content, and how do we keep politics out of it?