
For over a quarter of a century computer tape users have known the importance of maintaining their magnetic media. Instrumentation and video engineers have also come to realize the need for tape maintenance as information-packing densities continue to increase.

Intimate contact between the recording or playback head and the recording surface of the tape is critical to good performance. When an instantaneous loss of signal (dropout) occurs, it is usually caused by an imperfect contact between the head and the tape at that point.

Separations of only 7 millionths of an inch between the head and tape will often result in over 50% loss of signal (1 MHz @ 60 ips = 60 micro-inch wavelength, spacing loss ≈ 55 dB per wavelength, Wallace formula). Research has shown that most dropouts are only 4 to 10 thousandths of an inch long, but just one in the wrong place can affect the integrity of critical and possibly irretrievable data.
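The spacing-loss arithmetic above can be sketched in a few lines. This is a minimal illustration using the roughly 55 dB-per-wavelength figure quoted in the text (the Wallace formula is commonly stated as 54.6 dB per wavelength); the function names are illustrative, not from any tape-drive toolkit:

```python
# Sketch of the Wallace spacing-loss calculation described above.
# Assumption: ~55 dB of loss per recorded wavelength of head-to-tape
# separation, as quoted in the text.

def recorded_wavelength_uin(tape_speed_ips: float, frequency_hz: float) -> float:
    """Recorded wavelength in micro-inches: tape speed / signal frequency."""
    return tape_speed_ips / frequency_hz * 1_000_000

def spacing_loss_db(separation_uin: float, wavelength_uin: float,
                    db_per_wavelength: float = 55.0) -> float:
    """Signal loss in dB for a given head-to-tape separation."""
    return db_per_wavelength * separation_uin / wavelength_uin

def remaining_signal_fraction(loss_db: float) -> float:
    """Fraction of signal amplitude remaining after a given dB loss."""
    return 10 ** (-loss_db / 20)

# 1 MHz recorded at 60 ips gives a 60 micro-inch wavelength
wavelength = recorded_wavelength_uin(60, 1_000_000)     # 60.0

# 7 millionths of an inch (7 micro-inches) of separation
loss = spacing_loss_db(7, wavelength)                   # ~6.4 dB
print(f"loss: {loss:.1f} dB, "
      f"signal remaining: {remaining_signal_fraction(loss):.0%}")
```

A 6.4 dB loss leaves roughly 48% of the signal amplitude, which matches the "over 50% loss" cited above.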

New tapes need cleaning because abrasive debris from manufacturing (slitting) is left on the surface; the abrasiveness here comes not from dirt but from the manufacturing process itself. Even dirty used tapes are less abrasive than new tapes, and clean used tapes are better still. Since a dirty tape surface is itself abrasive, proper cleaning of both new and used tapes reduces abrasiveness and dirt, and gives expensive heads the best life expectancy.

While manufacturing defects can sometimes cause imperfections in the recording surface, contamination is the most common cause of performance problems. Some contaminants are elements of the tape's environment - dust, smoke, fibres, hair, fingerprints, cigarette ash, and alcohol residue. But most are from the tape itself - shedding particles, fractured base film (mylar), and debris from slitting. The majority (up to 80%) of these contaminating particles are loose and can be removed efficiently and quickly to eliminate transient errors. There are two principal sources responsible for the contamination found on magnetic tape.

The primary source of contamination is the computer tape itself. It is a “wear” product and is self-destructive by nature: approximately 80% of all contaminants found on magnetic tape are self-generated.

Origin of Self-Generated Contamination  

  1. Slitter Trash - Magnetic tape is bulk manufactured into “jumbo rolls” that range from 60cm to 180cm in width. These rolls are then slit to the widths required for the various end products to be made, e.g. LTO, DLT, 4mm, 8mm, and 1/4” data cartridges, 1/2” reels, video tape, audio tape etc. The slitting process actually fractures the edges of the tape, creating particles of oxide, binder and mylar which then adhere to the edges. The edges come in direct contact with tape guides, capstans, pinch rollers and read/write heads, dislodging the oxide, binder and mylar particles. These dislodged particles then tend to migrate onto the recording surface of the tape, the backing of the tape and the tape drive itself.
  2. Back-to-Front Transfer - when a tape is wound, one backing layer comes in direct contact with the next recording layer. This layer-to-layer wind generates considerable pressure and results in transfer of contamination to the recording surface. The longer a tape is stored or inactive, the more pronounced this back-to-front transfer phenomenon becomes.
  3. Binding Process – The chemical and mechanical process used by tape manufacturers to adhere oxide and metal particles to mylar. If something goes wrong in this process (and it can!), then oxide or binding agents may bleed from the tape and contaminate the whole system.

The second source of contamination is the sub-micron contaminants found in secondary air in data centres. These contaminants are not only responsible for media read/write errors, but also for ESD (Electrostatic Discharge), which causes electronic failures.

Since tapes are statically charged, air-borne sub-micron contaminants tend to seek out and adhere to the surface of the tape. In fact, the static charge generated by magnetic tape contributes greatly to the contamination process.



When a contaminant passes over a head, a head-to-tape separation occurs, reducing the signal strength. The reduction in signal strength is directly related to the height of the contaminant rather than to its surface dimensions.

When the signal falls below a certain level (threshold level) the tape drive can no longer read the signal, which results in “write skips” and “read failures”.

A particle of just .00025” (the size of a smoke particle) can be enough to eliminate the entire signal, resulting in an error. The key element to note is that the higher the packing density, the more sensitive the tape drive becomes to signal loss caused by contamination.
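The two claims above can be illustrated with the same spacing-loss rule used earlier. This is a rough extrapolation for illustration only (the Wallace formula describes small separations, and in practice the signal is simply gone well before these dB figures are reached); the 55 dB-per-wavelength constant comes from the text:

```python
# Illustration: applying the ~55 dB-per-wavelength spacing-loss rule from the
# text to a smoke-sized particle, at several recorded wavelengths. Shorter
# wavelength = higher packing density.

def spacing_loss_db(separation_uin: float, wavelength_uin: float,
                    db_per_wavelength: float = 55.0) -> float:
    """Signal loss in dB for a given head-to-tape separation."""
    return db_per_wavelength * separation_uin / wavelength_uin

SMOKE_PARTICLE_UIN = 250  # .00025 in = 250 micro-inches

for wavelength_uin in (120, 60, 30):
    loss = spacing_loss_db(SMOKE_PARTICLE_UIN, wavelength_uin)
    print(f"wavelength {wavelength_uin} uin -> loss ~{loss:.0f} dB")
```

At any of these wavelengths the nominal loss runs to hundreds of dB, i.e. total signal elimination, and the figure grows as the wavelength shrinks: exactly the density-sensitivity point made above.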

The effects of debris contamination may not be immediate and can take time to develop. Even one piece of debris can become embossed or ‘print through’ to adjacent layers of media by distorting the base film. The tape drive can error-correct around a debris defect during writing, but over time this ‘print through’ effect can distort areas of the tape that were previously unaffected, creating hundreds of errors. With the high track densities in today’s tapes, debris that would likely not cause problems on 18-track tapes can have an adverse impact on high-performance tapes with hundreds of tracks.

Figure 1


The Value of Data

The Gartner Group in the U.S. places the cost to the average data centre for an abnormal job termination at US$1000. If job abends increase by just one per week, it will cost the data centre more than US$50,000 a year in operating expenses.
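That projection is straightforward to reproduce. A back-of-envelope sketch using the Gartner per-abend figure cited above (the annualised amounts follow directly from it; no other data is assumed):

```python
# Back-of-envelope annual cost projection from the cited Gartner figure of
# US$1000 per abnormal job termination (abend).

COST_PER_ABEND_USD = 1000
WEEKS_PER_YEAR = 52

def annual_abend_cost(extra_abends_per_week: int) -> int:
    """Extra yearly operating cost for a given rate of additional abends."""
    return extra_abends_per_week * WEEKS_PER_YEAR * COST_PER_ABEND_USD

for rate in (1, 3, 7):
    print(f"{rate} extra abend(s)/week -> US${annual_abend_cost(rate):,}/year")
```

One extra abend per week comes to US$52,000 a year, the "more than US$50,000" figure above; at 3 and 7 abends per week the projection rises to US$156,000 and US$364,000 respectively.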

The figure below shows a way to project these costs as a function of job reruns, for rates of 3 abends/week or 7 abends/week.

Figure 2

The potential for degraded library performance with contaminated media can cost far more than the overall outlay for conditioning your tapes.



Cleaning alone will not guarantee the integrity of tapes; it must be coupled with re-tensioning. Long lengths of thin tape (e.g. 2800m x .0003mm and thinner) are susceptible to damage during shipping and handling. The U.S. Department of Defense devised a specification for conditioning tape prior to handling in order to preserve good physical characteristics. This specification, NSA L14-2, covers several aspects of re-tensioning.

Drive-induced damage often results from inadequate preventative maintenance procedures. The figure on the left (Figure 3) below shows a tape cinch created by a drive that had not been properly maintained. The figure on the right (Figure 4) illustrates how customer data, which should be continuous along the length of the tape, is unrecoverable in the area of the ‘Z’ fold.


Figure 3 Figure 4

Regular tape drive maintenance together with tape conditioning assists in the reduction of drive/tape problems.

Regular re-tensioning of the tape helps prevent the appearance of ‘Z’ folds, which render a tape useless for the storage of data.


Critical Questions

When considering tape conditioning, you should ask yourself some critical questions about the balance between media cost and the potential cost of data loss.

How many errors define end-of-life for a cartridge?
I.T. personnel should be aware that unconditioned media may have a shorter useful life than tapes that are regularly cleaned, and may deliver poorer performance across the library.

Do we monitor error rates for the tape system to identify drives that require service or media that may need to be retired?
The use of tapes which have not been conditioned may cause system error rates to increase, triggering excessive drive maintenance calls. For many data centres, these excess service calls are increasingly becoming chargeable items, even if basic maintenance is included in the service contract.

What is the value of a Disaster Recovery or Business Continuity Plan if you are not able to reliably restore the critical data files?
Is unconditioned tape suitable for these critical data sets? Some data sets must be retained for seven or more years due to legal or contractual reasons. What is the cost to the organization if you are unable to read/access these files? Would you want to store your critical data sets on what may well be unreadable media?





