Monday, September 22, 2003

The Internet Reborn 

The Internet, as we know it, is an old network with old standards and protocols. Old in Internet time, anyway. Technology Review is carrying a great story about how researchers backed by big technology firms are dreaming it all up anew: a faster, more secure, and smarter network tentatively called PlanetLab. PlanetLab, if all goes well, will enable you to --

1. Forget about hauling your laptop around. No matter where you go, you’ll be able to instantly recreate your entire private computer workspace, program for program and document for document, on any Internet terminal.
2. Escape the disruption caused by Internet worms and viruses—which inflicted an average of $81,000 in repair costs per company per incident in 2002—because the network itself will detect and crush rogue data packets before they get a chance to spread to your office or home.
3. Instantly retrieve video and other bandwidth-hogging data, no matter how many other users are competing for the same resources.
4. Archive your tax returns, digital photographs, family videos, and all your other data across the Internet itself, securely and indestructibly, for decades, making hard disks and recordable CDs seem as quaint as 78 RPM records.


According to the article, some of these features are already in beta. I personally am very intrigued by these ideas and see a lot of business models evolving from these features. For example, I don't think anyone would begrudge paying a fee to replicate their machine across the network, provided their security and privacy weren't compromised.

OceanStore encrypts files—whether memos or other documents, financial records, or digital photos, music, or video clips—then breaks them into overlapping fragments. The system continually moves the fragments and replicates them on nodes around the planet. The original file can be reconstituted from just a subset of the fragments, so it’s virtually indestructible, even if a number of local nodes fail. PlanetLab nodes currently have enough memory to let a few hundred people store their records on OceanStore, says Kubiatowicz. Eventually, millions of nodes would be required to store everyone’s data. Kubiatowicz’s goal is to produce software capable of managing 100 trillion files, or 10,000 files for each of 10 billion people. To keep track of distributed data, OceanStore assigns the fragments of each particular file their own ID code—a very long number called the Globally Unique Identifier. When a file’s owner wants to retrieve the file, her computer tells a node running OceanStore to search for the nearest copies of fragments with the right ID and reassemble them.
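To make that fragment-and-replicate idea concrete, here is a minimal Python sketch of the scheme as the article describes it. To be clear, this is my own toy illustration, not OceanStore code: the XOR "cipher" stands in for real encryption, the node count and replica settings are made up, and plain replication substitutes for the erasure coding that lets the real system rebuild a file from any subset of fragments.

```python
import hashlib
import random

NUM_NODES = 16        # simulated PlanetLab nodes
FRAGMENT_SIZE = 64    # bytes per fragment (tiny, just for illustration)
REPLICAS = 4          # copies of each fragment scattered across the nodes

# each "node" is just a dict mapping (guid, fragment index) -> fragment bytes
nodes = [dict() for _ in range(NUM_NODES)]


def toy_cipher(data: bytes, key: bytes) -> bytes:
    """Placeholder for real encryption: XOR with a repeating key (its own inverse)."""
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))


def store(data: bytes, key: bytes) -> str:
    """Encrypt a blob, break it into fragments, replicate them, return its ID."""
    ciphertext = toy_cipher(data, key)
    guid = hashlib.sha256(ciphertext).hexdigest()   # the file's globally unique ID
    fragments = [ciphertext[i:i + FRAGMENT_SIZE]
                 for i in range(0, len(ciphertext), FRAGMENT_SIZE)]
    for index, fragment in enumerate(fragments):
        # scatter copies of this fragment onto several distinct nodes
        for node in random.sample(nodes, REPLICAS):
            node[(guid, index)] = fragment
    return guid


def retrieve(guid: str, key: bytes) -> bytes:
    """Reassemble a file from whichever fragment copies are still reachable."""
    found = {}
    for node in nodes:
        for (g, index), fragment in node.items():
            if g == guid:
                found[index] = fragment
    # if every copy of some fragment were lost, the file would be gone;
    # real erasure coding (rebuild from any subset) makes that far less likely
    ciphertext = b"".join(found[i] for i in sorted(found))
    return toy_cipher(ciphertext, key)


if __name__ == "__main__":
    key = b"not-a-real-key"
    guid = store(b"1998 tax return: refund of $0.02", key)

    # knock out a few nodes; copies on the surviving nodes still suffice
    for node in random.sample(nodes, 3):
        node.clear()

    print(retrieve(guid, key))
```

The point is just the shape of the protocol: the identifier is derived from the data itself, so any node holding a fragment can be asked for it by ID, and no single node ever needs to hold, or be trusted with, the whole file.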

Though it's a trifle long, I would highly recommend this article to anyone even moderately interested in the future of the Net and our interaction with it.