Network Security Illustrated


Peer-to-Peer Networking
Peer-to-peer (P2P) computing is the sharing of computer resources and services through direct exchange between systems. These resources and services include information, processing cycles, cache storage, and disk storage for files. Peer-to-peer computing takes advantage of existing desktop computing power and network connectivity, allowing inexpensive clients to pool their collective power for the benefit of the entire enterprise.

The peer-to-peer method of computing has been around for decades, but only in recent years has one aspect of it reached the mainstream. The reason for P2P's recent surge is simple: inexpensive computing power, storage, and bandwidth are now abundant, and all of this cheap technology is in the hands of the masses. One company in particular deserves credit for popularizing P2P: Napster. Yet modern P2P is only the latest implementation of a very old concept; as many as 30 years ago, companies were building architectures that would now be labeled P2P.

In a peer-to-peer architecture, computers that have traditionally served solely as clients communicate directly among themselves and can act as both clients and servers. These P2P nodes assume whatever role is most efficient for the network at any given time, because the architecture allows them to change roles on the fly.
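The dual client/server role described above can be sketched in a few lines of Python. The class name, the one-line wire format, and the socket details below are illustrative assumptions, not anything from the book; the point is simply that one node both listens (server role) and makes outbound requests (client role):

```python
import socket
import threading

class PeerNode:
    """A minimal peer that plays both roles: it listens for requests
    from other peers while also issuing its own requests (a sketch)."""

    def __init__(self, host="127.0.0.1", port=0):
        self.store = {}  # resources this peer shares with others
        self.server = socket.create_server((host, port))
        self.host, self.port = self.server.getsockname()
        # Server role runs in the background for the node's lifetime.
        threading.Thread(target=self._serve, daemon=True).start()

    def _serve(self):
        # Server role: answer each peer's request with the named resource.
        while True:
            conn, _ = self.server.accept()
            with conn:
                key = conn.recv(1024).decode().strip()
                conn.sendall(self.store.get(key, "").encode())

    def fetch(self, peer, key):
        # Client role: the very same node requests a resource from a peer.
        with socket.create_connection((peer.host, peer.port)) as s:
            s.sendall(key.encode())
            return s.recv(65536).decode()
```

Because every `PeerNode` runs both halves, any two nodes can exchange resources directly, with no central server deciding who serves whom.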

One result of this self-organizing efficiency is that the load on traditional servers can be relieved. Instead of doing it all, traditional servers can concentrate on specialized functions such as email, Web, and domain name services, and perform them more efficiently. At the same time, peer-to-peer computing can reduce the need for IT organizations to grow parts of their infrastructure to support certain services, such as backup storage.

Backup is seldom needed in a peer-to-peer environment: multiple versions of every file can be stored on multiple P2P nodes, so if any given node breaks down, copies of its files exist elsewhere. Another name for this pooled-resource environment is distributed computing. Many people feel that P2P and distributed computing will eventually replace all standard client/server environments; some even think it is the next step in computing's evolution. A closer look reveals that in real working environments it has tremendous benefits, and huge drawbacks.
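The "copies located elsewhere" property comes from replicating each file onto several peers. The sketch below models each node's storage as a plain dict; the replica count and the hash-based choice of which nodes hold the copies are assumptions made for illustration:

```python
import hashlib

def replicate(key, value, nodes, copies=3):
    """Store `copies` replicas of a value across a list of nodes,
    picking consecutive nodes starting from a hash of the key."""
    start = int(hashlib.sha256(key.encode()).hexdigest(), 16) % len(nodes)
    targets = [nodes[(start + i) % len(nodes)]
               for i in range(min(copies, len(nodes)))]
    for node in targets:
        node[key] = value  # each chosen peer keeps its own copy
    return targets

def lookup(key, nodes):
    # Any surviving replica can answer the request.
    for node in nodes:
        if key in node:
            return node[key]
    return None
```

If one of the replica-holding nodes fails, a lookup still succeeds against the remaining copies, which is exactly why routine backup becomes less pressing in such an environment.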

What people think: I want my MP3s.

What we think: P2P networking has a potentially bright future, but the days of the great MP3 free-for-all are coming to an end. Security and legal risks will eventually outweigh the benefits of illegal file sharing.

More Information

The above information is the start of a chapter in "Network Security Illustrated," published by McGraw-Hill and available from amazon.com as well as your local bookstore. The book goes into much greater depth on this topic.
