A computer is only as fast as its slowest component, and the disk is by far the slowest part of your computer. It is at least 100,000 times slower than your RAM (Random Access Memory) and over 2 million times slower than your CPU. When it comes to the speed of your PC, focus on disk performance first, before you opt for more memory or a faster CPU.
So what can be done to improve disk performance? Since fragmentation – the scattering or fragmenting of files on your hard drive from continually writing, deleting and resizing them – is a major cause of disk performance degradation, start with disk optimization.
So what causes fragmentation, and why does it hurt hard disk performance?
By the very nature of the way the Windows® operating system stores data on the hard disk – splitting each file into tens, hundreds, even thousands of pieces scattered around the disk – your PC naturally accumulates fragmentation.
PC speed is determined by three things: the CPU, the memory and the disk drive(s). The CPU completes an operation in a fraction of a billionth of a second. The memory completes an operation in about a hundred-millionth of a second. (That’s fast.)
The disk requires hundredths of a second for each operation. (That’s slow.) The disk is a million times slower than the next fastest component. So, if you’re assessing the reason for slow PC performance, you should first consider the disk.
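To make the scale of that gap concrete, here is a quick back-of-the-envelope comparison. The latency figures are illustrative assumptions consistent with the paragraph above, not measurements of any particular machine:

```python
# Illustrative, assumed per-operation latencies (order of magnitude only).
RAM_ACCESS_S = 1e-8   # ~a hundred-millionth of a second for memory
DISK_OP_S = 1e-2      # ~a hundredth of a second for a mechanical disk

ratio = DISK_OP_S / RAM_ACCESS_S
print(f"Disk is roughly {ratio:,.0f}x slower than memory per operation")
# → Disk is roughly 1,000,000x slower than memory per operation
```

Under these assumed numbers, halving CPU or memory speed changes the overall picture far less than even a small slowdown in the disk, which is why the disk is the first place to look.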
If CPU speed is cut in half, and memory speed is cut in half, PC speed will still seem downright peppy so long as the disk is running at full speed. But if that disk slows down, even a little bit, you—and the users you support—will notice it. And if it slows down a lot, you really notice it.
Disk defragmentation (“defrag”) is the process of reorganizing the data on a hard drive so that each file is stored contiguously, rather than scattered in hundreds or even thousands of pieces around the drive. When files are disorganized, stored in fragmented pieces, the system cannot retrieve them as quickly.
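Conceptually, defrag coalesces a file’s scattered on-disk extents into one contiguous run. A minimal Python sketch of that idea, using hypothetical (start, length) extent lists rather than real file-system metadata:

```python
# Minimal sketch of what defragmentation accomplishes. The extent lists
# are hypothetical; real NTFS cluster metadata is far more involved.
def defragment(extents):
    """Coalesce scattered (start_cluster, length) extents into a single
    contiguous run, preserving the file's total size."""
    start = extents[0][0]
    total_length = sum(length for _, length in extents)
    return [(start, total_length)]

scattered = [(900, 4), (20, 8), (455, 2)]  # three fragments across the disk
print(defragment(scattered))               # [(900, 14)] – one contiguous run
```

The point of the sketch is only the before/after shape: same total data, but one extent to seek to instead of three.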
The Windows operating system naturally creates file fragments every time a file is created, modified, or deleted.
The more places the hard drive head has to go to retrieve a file that is fragmented, the longer it will take to access that file. Accessing fragmented files substantially slows down PC speed and performance.
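The cost of those extra head movements can be sketched with a toy model: each fragment adds one seek (plus rotational latency) before any data transfers. The seek time and transfer rate below are assumed, round numbers for illustration:

```python
SEEK_S = 0.01          # assumed ~10 ms per seek + rotational latency
TRANSFER_MB_S = 100.0  # assumed sequential transfer rate, in MB/s

def read_time_s(file_mb: float, fragments: int) -> float:
    """Estimated seconds to read a file split into `fragments` pieces:
    one seek per fragment, plus the raw transfer time."""
    return fragments * SEEK_S + file_mb / TRANSFER_MB_S

print(read_time_s(10, 1))    # contiguous 10 MB file: ~0.11 s
print(read_time_s(10, 500))  # same file in 500 fragments: ~5.1 s
```

Under these assumptions, the fragmented read is dozens of times slower even though exactly the same amount of data moves; all of the extra time is head movement.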
Every time the computer is used, fragmentation increases. Left unhandled, it inevitably leads to sluggish performance, lags, slow boot times, and eventually an inability to boot up at all. Waiting a week or even a day to defrag means the PC is working harder and slower than it should be.
Slow boot-up is a productivity drain, never mind the frustration it causes IT staff and the users they support. Having applications and files stored contiguously is a key factor in keeping PCs stable and performing at peak efficiency.
The moment a file is fragmented and scattered across the disk drive, the door opens to a host of stability and reliability issues.
The most common problems caused by file fragmentation are slow boot times, inability to boot up at all, slow or aborted backups, file corruption and data loss, crashes and system hangs, memory problems, and hard drive failures.
Fragmentation is a major factor in slow boot times. While a system is booting up, there is a rush of data being transferred from the hard disk into the system memory to get the operating system ready for use. This initial burst of data during boot up is where fragmentation first becomes problematic.
Solid state drives (SSDs) are generally considered to be faster, more powerful, more efficient and in some respects more reliable than hard drives. The problem is, they start out fast but gradually lose speed and, over time, become subject to failure. Most SSDs experience a dramatic, noticeable deterioration in performance. They also have a limited number of erase/write cycles, which can mean a short lifespan under heavy write I/O.
SSDs require that old data be erased before new data is written in its place, rather than simply overwriting the old information as a traditional hard drive does. This erase-then-write cycle doubles the wear and tear and can cause major issues.
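A toy model of that doubling, assuming every rewrite of a used SSD block costs an erase plus a program (write) operation, while an HDD simply overwrites in place. This is a deliberate simplification; real SSD controllers batch erases and remap blocks:

```python
def ssd_block_ops(rewrites: int) -> int:
    """Operations to rewrite a used SSD block: erase, then program."""
    return 2 * rewrites

def hdd_block_ops(rewrites: int) -> int:
    """Operations to rewrite an HDD sector: overwrite in place."""
    return rewrites

print(ssd_block_ops(1000), hdd_block_ops(1000))  # 2000 1000
```

With a limited erase/write budget per flash cell, that extra erase on every rewrite is where the wear-and-tear doubling comes from – and why TRIM, which tells the drive which deleted blocks it can erase ahead of time, helps.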
HyperFast® technology, included with Condusiv’s Diskeeper®, produces faster performance in solid state drives running on Microsoft operating systems. The HyperFast feature includes TRIM functionality and is enabled only if a solid-state drive is recognized.
Slow Anti-virus Scans
If your organization’s anti-virus (AV) scans are running slowly, the culprit is most likely disk fragmentation, which has a severe impact on the speed of anti-virus and anti-spyware scans.
The basic principle is that the greater the degree of fragmentation, the worse PCs and laptops will perform. This holds true for anti-virus applications just as it does for any user interaction with a fragmented computer. With virus attacks still a serious issue in the workplace, efficient virus scans are critical.
If your systems are riddled with fragmentation, AV scans will take significantly longer to run, leaving your organization susceptible to new virus attacks as they emerge.
If defrag is not run regularly, PCs build up a significant level of fragmentation, and this constant accumulation makes AV scans progressively slower over time.
IT professionals face the challenge of protecting critical business data, yet conventional backup methodology leaves substantial gaps. Traditionally, the only way to recover a file has been from backup – a time-consuming chore that can only restore a file if it existed at the time of the last backup.
If the last snapshot was taken at 9am and the file had not yet been created, you’re out of luck. As you can see, this leaves a major hole in your data protection. But it doesn’t stop there. If a file is deleted or overwritten on a network drive, the Windows recycle bin will not catch it.
Accidental deletions and file overwrites happen all the time and they can cost your company a lot of time and money.
Why leave your organization vulnerable to data loss, when there is an easy solution that is specifically designed to provide true continuous data protection and instant, pain-free file recovery?