A Simple Exercise to Measure the Extent of Storage Sprawl

This is an excerpt from a comprehensive guide on how to quantify the risks of storage sprawl. You can download it here.

Identifying the extent of storage sprawl and performing a thorough cleanup of the files almost always reduces storage requirements considerably. Companies that have been in business for five years or more can likely reduce their storage footprints by at least 50 percent. The following is a relatively quick audit process to help you discover the extent of your storage sprawl.

To start, pick a file server at random and generate a simple audit report of all the file names stored on that server. Auditing an entire server infrastructure at once can be overwhelming, so begin with just one file server to get the ball rolling. Then classify the files into four categories:

  • Valid business records that belong on the server
  • Valid files that belong on a different server
  • Files that are no longer useful
  • Files that cannot be identified

The file name and the file metadata will usually present solid clues as to which category is appropriate for each file. Files that are no longer useful can often be identified by the lack of associated metadata or by the date of last use. In many cases, 80 percent or more of the files in a given storage system are of no value to anyone in the company, which represents a major opportunity to reduce storage needs. Instead of 10 file servers, you may actually need only two, giving you the freedom to move both of them to the cloud and lower your infrastructure costs.
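
As a concrete first pass, this audit-and-classify step can be scripted. The sketch below (Python, standard library only) walks a single share, records each file's name, size, and last-access date, and makes a first guess at a category using a staleness cutoff. The share path and the three-year threshold are assumptions to adjust for your environment, the "valid here" versus "belongs on another server" call still requires human review, and last-access dates are only meaningful if the filesystem actually records them.

```python
import csv
import os
import time
from pathlib import Path

# Hypothetical values -- adjust for your environment.
AUDIT_ROOT = Path(r"\\fileserver01\share")   # the one server you picked
STALE_YEARS = 3                              # assumed "no longer useful" cutoff
STALE_CUTOFF = time.time() - STALE_YEARS * 365 * 24 * 3600

def audit(root: Path, out_csv: str = "audit_report.csv") -> None:
    """Write one row per file: path, size, last-access date, and a
    first-pass category guess for a human reviewer to confirm."""
    with open(out_csv, "w", newline="") as fh:
        writer = csv.writer(fh)
        writer.writerow(["path", "size_bytes", "last_access", "category_guess"])
        for dirpath, _dirnames, filenames in os.walk(root):
            for name in filenames:
                path = Path(dirpath) / name
                try:
                    st = path.stat()
                except OSError:
                    # Unreadable entries go straight to the
                    # "cannot be identified" bucket.
                    writer.writerow([str(path), "", "", "cannot be identified"])
                    continue
                if st.st_atime < STALE_CUTOFF:
                    guess = "no longer useful (stale)"
                else:
                    # Distinguishing "valid here" from "belongs on a
                    # different server" needs business context, so flag
                    # the row for manual review.
                    guess = "review: valid here / belongs elsewhere"
                writer.writerow([
                    str(path),
                    st.st_size,
                    time.strftime("%Y-%m-%d", time.localtime(st.st_atime)),
                    guess,
                ])

if __name__ == "__main__":
    audit(AUDIT_ROOT)
```

The resulting CSV is small enough to sort and triage in a spreadsheet, which keeps this first pass within the "one server to get the ball rolling" spirit of the exercise.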

After completing the classification phase, take file analysis to the next level:

  • Identify each file type (e.g., essential to running the business, eligible for historical archive)
  • Classify the type of data stored in each file (e.g., confidential/sensitive, public-facing)
  • Determine which end users and groups have access to each file
  • Document the process by which end users gain access (direct, indirect)
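
A second-pass profile along these lines can also be sketched in code. The extension list, keyword hints, and example path below are illustrative assumptions rather than a real classification policy; genuine sensitivity classification should inspect file contents, and on Windows shares the access question is answered by the ACLs (e.g., via icacls) rather than the POSIX mode bits shown here.

```python
import stat
from pathlib import Path

# Hypothetical mappings -- extend these to match your own business rules.
ARCHIVE_EXTENSIONS = {".bak", ".old", ".zip", ".tar", ".gz"}
SENSITIVE_HINTS = ("payroll", "ssn", "confidential", "contract")

def profile(path: Path) -> dict:
    """Second-pass profile of a single file: file type, a sensitivity
    hint from the name, and a rough view of who can reach it."""
    st = path.stat()
    file_type = ("eligible for historical archive"
                 if path.suffix.lower() in ARCHIVE_EXTENSIONS
                 else "essential / active")
    name = path.name.lower()
    sensitivity = ("confidential/sensitive?"
                   if any(hint in name for hint in SENSITIVE_HINTS)
                   else "unclassified")
    return {
        "path": str(path),
        "file_type": file_type,
        "sensitivity_hint": sensitivity,
        "owner_uid": st.st_uid,    # resolvable to a user name on POSIX
        "group_gid": st.st_gid,
        "world_readable": bool(st.st_mode & stat.S_IROTH),
    }

if __name__ == "__main__":
    # Hypothetical example file, for illustration only.
    print(profile(Path("/srv/share/payroll_2019.xlsx")))
```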

Analyzing just a single server to identify your risk areas gives you a general assessment of your overall storage infrastructure: you work from a random extract instead of trying to analyze everything at once. The extract is most likely representative of your overall situation and will give you an idea of just how big a project analyzing the rest of your storage environment will be. The approach may sound simplistic, yet very few businesses audit their storage systems even to this extent.

The next step is to expand this type of analysis to other systems. Cloud FastPath Analytics can profile the content on file servers by owner, type, size, location, and much more, so you can begin to plan the cleanup phase. Get a free trial here.