Network Security Illustrated
File Transfer (web bonus)
Remote file transfer is a group of protocols and client/server applications that allow users to exchange data with remote systems.

Technology Overview

As a concept, remote file transfer is quite easy to understand. The pain comes in the form of choices: there are many different methods for transferring files remotely. So which one should be used, and why? Which one is the most secure? As usual, there is no single answer. However, by examining a synopsis of the available protocols, the right solution for a given situation can be found.

This is Windows Calling UNIX, Do You Copy?

Windows deals with most things in broad strokes to make them easier for the user. File copying is a great example. Windows uses drag-and-drop copying for just about every situation: over a network, between two local directories, from a CD-ROM to a hard drive, and so on. Conversely, UNIX has a specific copy command for each situation. Each command is unique and needs to be used at the right time and in the right place. This can be a tremendous strain for the average user, as it demands extensive knowledge of the inner workings of the operating system. The advantage is that mistakes become very difficult to make: by the time a user has studied enough to execute even simple file transfer commands, he or she understands what those commands actually do. It also provides tremendous power, control, and flexibility to the advanced user.

Duplicating Files (CP and Drag and Drop)

The average computer user copies tens or even hundreds of files from one place to another on a daily basis. Copying is a function of computers that is taken for granted, and that very familiarity can result in security problems.

All operating systems incorporate the ability to copy files as part of their infrastructure, but different operating systems handle the copying process differently. Since networking often involves copying files across multiple platforms, knowledge of each platform's copying functions can reduce risk. Two platforms commonly networked together are UNIX and Windows.

Windows provides a very pleasant and simple method for copying that most people new to computers assume is the norm. "Drag and drop" is the phrase coined to describe copying or moving a file in a Windows desktop environment. It refers to the user's ability to take a file from one location (window) and drag it to another, at which point the file system moves or duplicates it in the new location.

UNIX shell environments use a more archaic, but more powerful, method to achieve the same results as Windows. The default program for copying in UNIX environments is called CP. The following snippet is taken directly from the manual pages of OpenBSD, a UNIX operating system:

“In the first synopsis form, the cp utility copies the contents of the source_file to the target_file. In the second synopsis form, the contents of each named source_file are copied to the destination target_directory. The names of the files themselves are not changed. If cp detects an attempt to copy a file to itself, the copy will fail.”

In short, the above jibber jabber is UNIX man page speak for "this program copies files from where you want to where you want, and will refuse to copy a file onto itself." Note that CP will happily overwrite an existing target file unless told otherwise.
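A minimal sketch of both synopsis forms, assuming a standard UNIX shell (the /tmp paths and file names are hypothetical):

```shell
# Set up a scratch area for the demonstration (hypothetical paths).
mkdir -p /tmp/cp_demo/backup
echo "quarterly figures" > /tmp/cp_demo/notes.txt

# First synopsis form: copy source_file to target_file.
cp /tmp/cp_demo/notes.txt /tmp/cp_demo/notes.bak

# Second synopsis form: copy one or more files into a target directory;
# the names of the files themselves are not changed.
cp /tmp/cp_demo/notes.txt /tmp/cp_demo/notes.bak /tmp/cp_demo/backup/

# By default cp silently overwrites an existing target; the -i flag
# prompts before overwriting, which is the safer habit.
```

The same two forms cover nearly every local copying situation; the remote variants discussed later keep this exact syntax and merely add a host name.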

File Transfer Protocol (FTP)

FTP, the File Transfer Protocol, is one of the best-known methods of basic remote file transfer. Historically, FTP has been a standard for file transfer over networks, including the Internet. As TCP/IP protocols go, it is relatively easy to set up, requires little maintenance, and was built with simplicity in mind. FTP is implemented as a client/server application, but a very lightweight one. Many different FTP clients and servers are available for every computing platform; in fact, most server-based operating systems include an FTP server by default.
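As a sketch only (the host name is hypothetical, and a reachable FTP server would be required, so the commands are shown as comments rather than executed), a classic command-line FTP session looks like this:

```shell
# Connect to an FTP server (hypothetical host):
#   ftp ftp.example.com
# Log in as a real user, or as "anonymous" if the server permits guests.
# Inside the interactive session:
#   ftp> get README.txt        # download a file from the server
#   ftp> put report.doc        # upload a file to the server
#   ftp> quit                  # close the connection
```

Note that the user name, password, and file contents all cross the network unencrypted, a point revisited in the security considerations below.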

TFTP, or Trivial File Transfer Protocol, is a simplified version of FTP. It uses a client/server model as well, but is designed to work in a scaled-down fashion. TFTP is generally used by servers to boot diskless workstations, by routers to receive firmware updates, and by X terminals to transfer small files.

Remote and Secure File Transfer (SCP, RCP, SFTP, RSYNC)

The Windows drag and drop method of file copying does not concern itself with details. It does not care where files come from or where they are going. This is both convenient and dangerous. From one perspective, it is nice to use the same copy method whether or not a network is involved; it is great not to have to think about the technology and to just get the copying done. However, as mentioned earlier, there are real security risks in being oblivious to a file's source and destination. The UNIX world treats file transfer in quite the opposite way, leaving almost nothing to guesswork. Unlike in Windows, not all copying is handled equally; each case is handled quite distinctly. This results in many scary acronyms and many long nights reading many long books.

The acronyms that pepper the title of this section are not as scary as they sound, however; chances are that if they are unfamiliar, UNIX is not your game. Learning the specific functions of these remote file transfer applications is beneficial to security planning in general. The nice part is that they are just enhanced versions of the protocols and applications already covered in this section.

Essentially, RCP and SCP are enhancements of CP: the former is remote copy and the latter is secure copy. These programs are part of most UNIX environments by default. They are used in situations where CP is not appropriate, namely when files must move between machines.

The RCP utility works just like the CP utility, except that it copies files between machines rather than within one machine. RCP, however, does not encrypt the files it transfers and relies on simple host-based trust, so on its own it provides very little security.

SCP provides far greater file security. SCP also copies files across hosts, but does so through an encrypted tunnel identical to that of SSH. The tunnel provides additional security because an authentication process must succeed before any transfer takes place. In essence, a remote user wishing to transfer files between machines must log in through an SSH prompt, successfully exchanging keys, before the encrypted transfer will begin.

In the same way that SCP is CP with SSH integration, SFTP is FTP with SSH integration. Confused? Don't be; it is rather straightforward. FTP, as described earlier, is a standard protocol for remote file transfer that provides little to no security of its own. SFTP provides the same client/server file transfer mechanism, but wraps the data in an encrypted tunnel identical to that of SSH. So instead of a basic login, a complex key-exchange login is required.
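Assuming an SSH server is reachable (the host and user below are hypothetical, so the commands are shown as comments rather than executed), SCP and SFTP are invoked like this:

```shell
# Copy a local file to a remote host over the encrypted SSH channel:
#   scp budget.xls alice@fileserver.example.com:/home/alice/docs/
# Copy a file back from the remote host to the current directory:
#   scp alice@fileserver.example.com:/home/alice/docs/budget.xls .
# Open an interactive SFTP session over the same encrypted channel;
# it accepts the familiar get/put commands of an FTP client:
#   sftp alice@fileserver.example.com
```

Notice that the syntax is just CP's source-and-target form with `user@host:` prefixed to the remote side; the encryption and key exchange happen transparently underneath.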

FTP and SFTP are taken a step further by a great utility called RSYNC. RSYNC adds a feature known as diffs, short for differences. While securely connected to a host, the RSYNC utility compares the source files against the destination files and identifies the differences between them. The transfer process is shortened because RSYNC copies only the portions of each file that have changed, saving time and valuable resources. RSYNC is the ultimate evolution of remote file transfer.

Security Considerations

FTP: FTP was designed with some inherent security, but it is based largely on permissions, not encryption. When an FTP server is set up, its user base determines who has access to which files. An outside user requesting guest or "anonymous" access may or may not be permitted, depending on the security settings. In most UNIX environments, FTP servers work off the user list of the server itself. In Windows environments, however, FTP server applications do not always integrate with the operating system's permission structure. In that circumstance, new users and groups need to be created explicitly for the purpose of FTP.

Beyond permissions, FTP and security have an inverse relationship. An FTP server is essentially an open invitation for malicious users to gain access to a private network. FTP servers and firewalls do not get along well: an FTP server requires the firewall to permit very extensive access from outside the network. FTP proxies can work around this problem.

TFTP: Unlike FTP, TFTP has no built-in security at all. There are no permission structures and no user base. This is explained by the fact that TFTP is generally used in local environments. A common use for TFTP is uploading a new version of firmware to a router. The firmware itself may be downloaded from the Internet and stored on a local workstation. The router runs a TFTP server in its operating system by default; a TFTP client is then installed on the workstation, and the IP address of the server is entered.

Once connected, the TFTP server accepts the upload and allows the user to install the new firmware on the router. In this type of localized environment, file theft between two points would be a difficult achievement. That’s why the trivial in trivial FTP is adequate for the job at hand.

Duplication Risks: Duplication of anything private always creates security risks. When you make a copy of your tax returns or some medical records there is more to keep track of and a greater chance that the extra copy could end up in the wrong hands. Duplication of digital files carries all the same risks and more.

Cotton-Eyed Joe: One final risk is a lack of awareness of the source and destination. When file copying occurs in networked drag-and-drop environments such as Windows, mapped drives are often involved. Mapped drives seem transparent to the end user and are easily mistaken for local drives. In reality, files may be copied from one remote source to another, or from a local source to a remote destination. In these cases the files travel, often unencrypted, over the network, where they are vulnerable to malicious users sniffing the traffic.

Making the Connection

Connecting Networks: The remote file transfer tools all layer over the core networking protocols.

Hardening Networks: Securing remote file transfer is a critical part of a network hardening strategy.

Hiding Information: Securing remote file transfer requires the use of encryption.

Best Practices

FTP servers should never exist in a private part of a network, as they require open ports on a firewall and pose a general security threat. Any truly secure or private part of a LAN should have as little exposure to the Internet as possible, and FTP is a service that greatly expands exposure. If an attacker can defeat the security of an FTP server, access can easily be gained to the rest of the network. If there is a need to serve files over the Internet, a VPN is the safest alternative: with a VPN in place, remote users downloading or uploading files to the FTP server do so within a completely encrypted tunnel. Another relatively safe place for an FTP server is a DMZ; however, that area of the network should never contain files that are private or critical to business operations.

When copying files, staying aware of what you are doing is a good rule of thumb. Be aware of what you are duplicating, of the source of the original file, and of the destination it is being sent to, whether remote or not. Do not blindly copy important files over a network, even a private LAN; specialized tools are available to perform these functions in a more secure fashion.

The applications that provide remote and secure file transfer had security in mind when they were created. These are the applications that should be used to move files to and from a network from a remote location. "Remote" does not necessarily mean outside the confines of company headquarters; even within the LAN, these secure applications can provide extra breathing room across subnets. How often and how extensively they are used should be dictated by a security philosophy. The most secure policies, however, will always provide for SCP, SFTP, and RSYNC, as they require both secure login and encryption before files move across hosts.

Final Thoughts

Educating the average user is the priority with remote file transfer. Education leads to awareness and awareness leads to better security. Forcing employees to use SCP or SFTP when they want to retrieve a file from the LAN may not make sense in every company. It may not make sense in most companies for that matter. However, the more knowledge users have about how their own desktops work, the more careful they can be when moving data around. This will result in better security, better privacy, and better control of critical information.

More Information

The above information is the start of a chapter in "Network Security Illustrated," published by McGraw-Hill and available from your local bookstore. The book goes into much greater depth on this topic.
