Does Windows clean its own registry?

Since many people seem to agree that registry cleaners are snake oil and that you should leave the registry alone, does Windows 7, 8, 8.1 or 10 (though 10 isn’t officially released yet) clean its own registry? Is there any proof of that (like a scheduled task or an autorun entry)?

Also, is it considered safe to use advanced uninstallers (since they delete registry entries too), like IObit Uninstaller or Revo Uninstaller, given that built-in uninstallers leave stuff behind? Or should we just use the built-in uninstallers, since the leftover stuff is usually quite small? But perhaps there are side effects to these leftovers?

What do you think?

I have no opinion on registry cleaners; I just run CCleaner now and then for the sake of it and don’t expect any miracles from it.
I don’t know if Windows cleans the registry automatically. I’d assume not; it probably relies on each application to deal with its own registry keys.

Personally, I’ve only had issues with advanced uninstallers. For me they’ve often removed more than was related to the program in question, sometimes left that program essentially broken in a half-uninstalled state where I could no longer uninstall it (and hence not install it again), and once one caused another, unrelated program to simply not launch anymore… So I stay away from them.

Windows probably does clean its own registry entries when they are no longer required, but it does not (and currently cannot) clean entries added by third-party programs. That said, 99.99% of the time redundant entries in the registry cause no problems at all. I have never come across a Windows problem that could be proven to be caused by redundant registry entries.
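
As for proof in the form of a task or autorun entry: the only built-in scheduled task I know of that touches the registry is RegIdleBackup (under \Microsoft\Windows\Registry), and it backs the registry hives up rather than cleans them. If you want to check for it yourself, here is a minimal Python sketch that just shells out to the standard schtasks tool; the task path is the one I’ve seen on Windows 7/8/10, so treat it as an assumption for your particular machine:

import subprocess

# RegIdleBackup backs up the registry hives on a schedule; it does not clean them.
TASK = r"\Microsoft\Windows\Registry\RegIdleBackup"

result = subprocess.run(
    ["schtasks", "/query", "/tn", TASK, "/v", "/fo", "LIST"],
    capture_output=True, text=True,
)

if result.returncode == 0:
    print(result.stdout)   # task exists; output shows its trigger and last run time
else:
    print("No such task found on this machine.")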

Intelligent uninstallers (IObit, Revo, etc.) scan for registry entries that reference the program being uninstalled. In my experience these programs only remove entries absolutely known to relate to the uninstalled program, though it’s wise to check what is going to be removed before you actually do so. Good uninstall programs take a restore point before the uninstall so you have a way to reverse the process. I use IObit Uninstaller and have found it to be reliable.
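
To give a rough idea of what that “scan for references” step looks like, here is a minimal Python sketch using only the standard winreg module. It lists the entries under the usual per-machine Uninstall key and flags any whose DisplayName contains a search string. It’s only an illustration of the idea, not how IObit or Revo actually implement it, and the search string is just a placeholder:

import winreg

# Where per-machine uninstall entries normally live.
UNINSTALL_KEY = r"SOFTWARE\Microsoft\Windows\CurrentVersion\Uninstall"
SEARCH = "some program"   # placeholder: the program you'd be looking for

with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, UNINSTALL_KEY) as root:
    subkey_count = winreg.QueryInfoKey(root)[0]
    for i in range(subkey_count):
        name = winreg.EnumKey(root, i)
        with winreg.OpenKey(root, name) as sub:
            try:
                display, _ = winreg.QueryValueEx(sub, "DisplayName")
            except FileNotFoundError:
                continue   # many subkeys have no DisplayName value
            if SEARCH in display.lower():
                print(f"Entry referencing the program: {name} -> {display}")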

IMO the problem with general registry cleaners is that they’re not as smart as they claim. Registry cleaner vendors seem to compete to see how many entries can be cleaned and unwise customers judge the effectiveness of a registry cleaner by how many entries it removes (and more is wrongly thought to be better). And when all is said and done it makes not a scrap of difference to Windows performance, despite what some people may claim. Basically the risks of removing a live registry key far outweigh the almost zero benefit of removing genuine redundant ones, so don’t run general registry cleaners at all.

If you really want to improve the performance of your PC then defrag and optimise your hard disk drive; that’s where your biggest performance hits come from!

I don’t know if Windows cleans its own registry, but I suspect it is unlikely. Having said that, it is generally considered that removing redundant registry entries provides insignificant performance benefits and risks causing problems when a supposedly redundant entry that gets deleted actually belongs to another program.

Probably the only situation where registry cleanup is necessary is when registry entries are left behind by a poorly written uninstaller or by an uninstall that failed to complete successfully, causing an attempted re-installation to fail.

See http://www.techsupportalert.com/content/do-we-really-need-windows-registry-cleaners.htm for a related article.

Windows doesn’t clean the registry itself. I use Eusing Free Registry Cleaner and Auslogics Registry Cleaner every now and then.

If registry cleaning in most cases doesn’t give any noticeable increase in performance but, depending on the cleaner, can damage the system, why then is it included in CCleaner, one of the most recommended and trusted (as far as I know) cleaning tools?

Is it a placebo thing?

Hi Maniak2000,
I think you need to ask Piriform.

My guess is that back in the earlier days (the XP era), Piriform, along with many others, thought cleaning the registry might improve performance, without any real evidence.
Windows has improved the registry structure, making cleaners even more redundant now than ever.
If CCleaner were to remove its mild registry cleaner now, they would probably be inundated with questions as to why.
That’s just my opinion.

Like I said, it is a question for Piriform IMO.

Kind regards.

Just scroll up a bit to the post by MartiusD for a link to an article about possible performance gains.

I spent most of my working life in large IBM mainframe system software support. When you have 5000 concurrent users online to the mainframe system you’re supporting, and they all want a sub-second response time, performance takes on a whole new meaning.

In ANY computer system the biggest impact on performance comes from the slowest and worst organised device, and on ANY computer system that is the hard disk - ALWAYS. The reason is that on a hard disk you’re moving metal (spinning metal platters, and read/write heads moving in and out). Access times for hard disks are (at best) measured in milliseconds (10^-3), yet RAM access times and CPU execution times are measured in nanoseconds (10^-9). So RAM and the CPU operate six orders of magnitude faster than your hard disk.

So if your hard disk is not performing optimally it will have a MASSIVE impact on system performance. Sadly, optimising the hard disk isn’t seen as terribly technically clever; nobody will see you as a computer guru because you advise them to optimise their hard disk - even though they will see a marked improvement afterwards.

No, the people seen as technical gurus are those who mess with technical stuff like the registry. Because most of us don’t understand the registry, those who do (or who pretend to) are seen as super-techies. But if your hard disk is so badly optimised that a regular read operation takes 1000 ms instead of 10 ms (and that is very easily possible), then mucking about with the registry trying to save 1 ms or 2 ms by “optimising” or “cleaning” it is frankly pointless.

There was a famous bank robber in the 1950s called Willie Lot. He’d rob a bank, they’d catch him and throw him in jail. When he got out he’d rob another bank, they’d catch him again, and on and on it went. Eventually somebody asked Willie why he kept robbing banks. “Because that’s where the money is,” Willie replied. So, why should you be focussing on hard disk performance? Because that’s where your performance problems are…

This is a bit off-topic, but how DO you optimize a hard drive (and, for that matter, an SSD)?
The only ways of “optimizing” an HDD I know of are defragging it, periodically running chkdsk on it … and making sure it’s connected to the fastest port.

  1. You could place unmovable files, files known to cause fragmentation, or very large files (e.g. the pagefile) on other partitions or physical drives. [guide]
  2. You could optimize file placement using zones (e.g. move boot and prefetch files to the first zone; move the Recycle Bin and System Volume Information to the last zone).
    etc.

Defragmentation of flash memory, floppy drives and solid-state drives (SSD) is not recommended, as it reduces their lifespan.

As qmarius said, you do not defrag SSDs, since it reduces their lifespan and doesn’t give much, if any, performance increase.

The proper way to optimize SSDs is to TRIM them periodically.
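
On Windows 8 and later you can trigger that retrim manually with the built-in defrag tool (defrag <volume> /L), which is essentially what the weekly “Optimize Drives” task does for SSDs anyway. A minimal sketch, assuming Python and an elevated prompt:

import subprocess

# Ask Windows to send TRIM hints for all free space on C:.
# "defrag /L" performs a retrim on Windows 8 and later; run from an elevated prompt.
result = subprocess.run(["defrag", "C:", "/L"], capture_output=True, text=True)
print(result.stdout or result.stderr)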

Yes, defragmenting an SSD won’t give you the same effect as defragmenting an HDD, but I found this info while gathering information about SSDs.

If an SSD gets too fragmented you can hit maximum file fragmentation (when the metadata can’t represent any more file fragments) which will result in errors when you try to write/extend a file. Furthermore, more file fragments means more metadata to process while reading/writing a file, which can lead to slower performance.

Source: http://www.hanselman.com/blog/TheRealAndCompleteStoryDoesWindowsDefragmentYourSSD.aspx

Hard disks suffer from two related problems.

When you write to a file, and more disk space is required to hold those contents, a new area of disk space is allocated. The first “problem” is that this new area is probably not contiguous with the previous area, it’s just in a convenient free area of the disk. The “second” problem is that over time, and as the disk becomes full, the distance between these fragments of your files (in terms of where on the disk surface they are) can become very large.

The reason these are problems is because we have to move the read/write heads over the start of the file (i.e. the first fragment) and then search through the file until we either find the data required or we reach the end of that fragment. (This movement of the read/write heads is called seek, and it’s measured in milliseconds (ms). Most modern disks take about 1 ms to seek to the next adjacent track and about 8 ms to seek halfway across the disk.) If we reach the fragment end the read/write heads have to seek back to the Master File Table (MFT) to find the location of the next fragment, then they have to seek to the start of that fragment. If we reach the end of that fragment before finding the data the heads have to seek back to the MFT again to find the next fragment, and so it goes…

So badly fragmented files build up quite large seek times and if the fragments are widely spaced on the disk surface and are a long way from the MFT these seek times can be large. All these seek times (and something called disk latency which is a function of the rotational speed) can add up to response times in the hundreds of milliseconds, sometimes into the seconds (remember the CPU and RAM operate in the nanosecond range).
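
To put rough numbers on that (purely illustrative figures: the ~8 ms average seek mentioned above, plus about 4 ms of rotational latency for a 7200 rpm drive), here is a quick back-of-the-envelope calculation in Python:

# Rough, illustrative numbers only.
SEEK_MS = 8.0        # average seek time
LATENCY_MS = 4.2     # average rotational latency at ~7200 rpm (half a revolution)
PER_FRAGMENT_MS = 2 * (SEEK_MS + LATENCY_MS)   # seek back to the MFT, then out to the fragment

for fragments in (1, 10, 100):
    overhead = fragments * PER_FRAGMENT_MS
    print(f"{fragments:4d} fragments -> about {overhead:7.1f} ms of seek/latency overhead")

# 100 fragments already comes to over two seconds of pure head movement,
# while the CPU and RAM sit idle in the nanosecond range.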

The quickest way to reduce these seek times is to defragment the disk. This means that all file fragments are physically moved so that all the fragments are located one after another. Thus a data search now requires only one seek from the MFT to the start of each file. Windows contains a defragmenter that does an adequate job of defragmenting files.

The best way to optimise your disk is to locate all of your critical high performance files close to the MFT and on the outer edges of the disk (the sectors on the outer tracks are slightly longer so performance is just a tiny bit better). Ideally you want the MFT in the middle of all your high performance files and they should all be regularly defragmented. Those files you use very rarely should all be moved to the centre tracks of the disk, out of the way so they don’t “pollute” your other files and make their seeks longer than necessary. Windows does not contain a tool that lets you do this. I use the Ultimate Defrag tool from http://www.disktrix.com which contains an excellent defragmenter and tools that allow you to place your files exactly where you want them. Regular use of a tool like this will give you optimal hard disk performance, and you really will be able to see the difference.

As has been mentioned, SSDs have no moving metal parts and they do not suffer from seek or fragmentation problems. Thus an SSD does not need to be defragmented and you do not need to do any file optimisation.

Also, though it was only indirectly mentioned, hard disk temperature will gradually rise above normal on a badly fragmented drive. It’s an important aspect which is often neglected.

I agree, other than the fact that it was Willie Sutton who robbed the banks and was the source of that quote. An intelligently defragged drive and I/O read times are a major factor in performance. No boost can be gained from simply cleaning the registry, and defragging the registry produces only minimal response improvements. True improvement can only be gained by disabling unneeded and unwanted services and features of the operating system.

A defrag and optimize with Auslogics Disk Defrag is like putting on a new pair of shoes for the operating system. :smiley:

Close enough for government work! 8)