Win7 Library Tool

Libraries are a really useful feature of Windows 7; unfortunately, they arrive in a slightly cut-down form out of the box. Microsoft decided against exposing some really useful capabilities to users, like adding network locations, which was pretty much the first thing I tried to do. You get this message:


Luckily, you can add network locations (and any other un-indexed locations), but it must be done programmatically. Microsoft supplies a command-line utility, slutil.exe, a candidate for the worst-named executable in history (I'm pretty sure it stands for shell_library_util). Anyway, I decided to write a tool to make it easy to add network locations, and added a few other features as well:

  • Add network (UNC or mapped drive) and any other un-indexed folders to libraries.
  • Back up library configuration, so that a saved set of libraries can be instantly restored at any point (for example after a re-install of the OS, or to transfer libraries between computers).
  • Create a mirror of all libraries (using symbolic links) in [SystemDrive]:\libraries. This means you can reference all your files using a much shorter path, and also provides another entry-point to your files in many places in the Operating System (e.g. file open/save dialogs).
  • Change a library’s icon.
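The mirror feature boils down to creating one symbolic link per library folder under a single root. Here is a minimal Python sketch of the idea (the folder names are invented, and the real tool of course does this natively on Windows):

```python
import os

def mirror_libraries(libraries, mirror_root):
    """Create a symbolic link in mirror_root for each (name, path) library folder."""
    os.makedirs(mirror_root, exist_ok=True)
    for name, target in libraries:
        link = os.path.join(mirror_root, name)
        if not os.path.lexists(link):  # skip links that already exist
            os.symlink(target, link, target_is_directory=True)
    return sorted(os.listdir(mirror_root))
```

Each link then provides a short, stable path to the underlying folder, which is exactly what makes the [SystemDrive]:\libraries entry-point convenient in file open/save dialogs.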


Hopefully it’s easy enough to use, so I don’t have to explain it :)

You can download it for free below. (Note: this will only run on Windows 7 or later.)

Download Installer | Source Code

I must give credit to Josh Smith for his TreeView CodeProject article, upon which this solution is modelled.

The application uses the Windows API Code Pack to manipulate libraries, which I encourage you to check out if you are writing software that integrates with or takes advantage of new features in Windows 7.

If you want to learn why and how libraries were introduced in Windows 7, including diving into the .library-ms file format, you can read this MSDN article.

Now featured on Tekzilla!

ResX comments and random access

If you work with .resx files in .NET, you will no doubt be familiar with the ResXResourceReader, ResXResourceWriter and possibly ResXResourceSet classes. These three classes allow reading and writing of resx files, but they have some annoying limitations:

  1. ResXResourceReader does not provide random access to values (no method taking a string name/key – you have to traverse all entries sequentially to find a value).
  2. ResXResourceReader requires use of resx data nodes to access value comments.
  3. ResXResourceSet provides random access (via the GetObject() method) but has no support for comments via resx data nodes.

These classes have been available since .NET 1.0, and they are another example of a software development principle I believe in more each day: ‘Your first attempt at designing [software entity] will be sub-optimal.’

Unfortunately, Microsoft has not done much work in this area; the only notable change was in .NET 2.0, which introduced resx data node support in ResXResourceReader, enabling you to work with entry comments.

If you want to work with resx comments as well as have random access to values, here is a class that derives from ResXResourceReader and provides that functionality:

using System.Collections;
using System.Collections.Generic;
using System.ComponentModel.Design;
using System.Resources;

namespace ResxUtils
{
	/// <summary>
	/// Extends the ResXResourceReader class with value and comment support.
	/// </summary>
	public class ResXReader : ResXResourceReader
	{
		/// <summary>
		/// Container for a ResX value and comment.
		/// </summary>
		private class ResXValue
		{
			public string Value;
			public string Comment;
		}

		Dictionary<string, ResXValue> _resxEntries;

		/// <summary>
		/// Initializes a new instance of the <see cref="ResXReader"/> class.
		/// </summary>
		/// <param name="resXStream">The resX stream.</param>
		public ResXReader(System.IO.Stream resXStream)
			: base(resXStream)
		{
			Initialize();
		}

		/// <summary>
		/// Initializes a new instance of the <see cref="ResXReader"/> class.
		/// </summary>
		/// <param name="filePath">The file path.</param>
		public ResXReader(string filePath)
			: base(filePath)
		{
			Initialize();
		}

		/// <summary>
		/// Reads all entries into a dictionary, keyed by resource name.
		/// </summary>
		private void Initialize()
		{
			UseResXDataNodes = true; // must turn this on to access resx comments

			_resxEntries = new Dictionary<string, ResXValue>();

			foreach (DictionaryEntry entry in this)
			{
				var node = (ResXDataNode)entry.Value;
				var value = (string)node.GetValue(null as ITypeResolutionService);
				_resxEntries[node.Name] = new ResXValue { Value = value, Comment = node.Comment };
			}
		}

		/// <summary>
		/// Gets the value.
		/// </summary>
		/// <param name="key">The key.</param>
		/// <returns>The value, or null if there is no entry matching the supplied key.</returns>
		public string GetValue(string key)
		{
			ResXValue entry;
			return _resxEntries.TryGetValue(key, out entry) ? entry.Value : null;
		}

		/// <summary>
		/// Gets the comment.
		/// </summary>
		/// <param name="key">The key.</param>
		/// <returns>The comment, or null if there is no entry matching the supplied key.</returns>
		public string GetComment(string key)
		{
			ResXValue entry;
			return _resxEntries.TryGetValue(key, out entry) ? entry.Comment : null;
		}
	}
}

Note: If you run Code Analysis / FxCop over this in .NET 4.0, it will complain about several things, in particular that LinkDemands are skipped and that the class is not named correctly. The naming rule fires because the base ResXResourceReader class implements the IEnumerable interface (so it can just be suppressed), and the security warnings are due to the CAS policy changes in .NET 4.0. I’m not sure there is any way to derive from a class with link demands in .NET 4.0 without suppressing security warnings (even adding the SecurityCritical attribute fails, and it should be unnecessary anyway).
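Incidentally, if you ever need the same data outside of .NET, it helps to remember that a .resx file is plain XML, so random access to values and comments takes only a few lines. Here is a hedged Python sketch (purely illustrative, not part of the tool above):

```python
import xml.etree.ElementTree as ET

def read_resx(xml_text):
    """Parse resx XML into {name: (value, comment)} for random access by key."""
    entries = {}
    for data in ET.fromstring(xml_text).iter("data"):
        name = data.get("name")
        value = data.findtext("value")
        comment = data.findtext("comment")  # None when the entry has no comment
        entries[name] = (value, comment)
    return entries
```

This mirrors what the ResXReader class above does: one pass over the entries up front, then dictionary lookups by name.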

FreeCommander + SuperCopier2

As fun as file systems are, you really don’t want to spend longer than you have to copying, moving and viewing files and navigating folders. I was saved years of grief fumbling about in Windows Explorer thanks to a mate’s advice to switch to TotalCommander back in 2003 (I still remember his look of horror when I said ‘Total-what?’). When it comes to speed and efficiency, the keyboard is king, and TotalCommander and all the other *commander variants take advantage of that.

Last year I switched to FreeCommander, and am very happy. The main benefits for me are:

  • a more modern / nicer UI
  • ‘type to navigate’
  • plenty of customisation options
  • it’s free :)

Unfortunately there are some serious drawbacks:

  • The file-copy/move dialog is the Windows built-in one (which is downright horrible)
  • The file-delete dialog is the Windows built-in one, and it’s modal to the application
  • The FTP client has a terrible time-out issue and is generally buggy

I’d like to solve all three of these, but I’m starting with the first.

Luckily, FreeCommander allows you to use an external copy/move program via an INI setting. I tried the free TeraCopy for a while, but when I discovered SuperCopier 2, it was perfect: it really is lightning quick at what it does, it doesn’t get in your way, and it has all the options you need (pause, error handling etc.). The only problem is that as of v2.2 the copy-interception code was re-written, and it could no longer be fully integrated with FreeCommander. To solve this problem, I have written a small C++ executable called SC2Integration.exe that uses the API provided by SuperCopier 2 (available on their SourceForge site) to re-unite FreeCommander with this excellent file-copy replacement.

Simply drop SC2Integration.exe into your FreeCommander installation folder (or anywhere else you would like to store it) and add the following two lines under the [Form] section of your FreeCommander.ini file:

FileCopyPrg=%FcSrcPath%\SC2Integration.exe Copy "%ActivSelAsFile%" "%InactivDir%"
FileMovePrg=%FcSrcPath%\SC2Integration.exe Move "%ActivSelAsFile%" "%InactivDir%"

If you didn’t copy SC2Integration.exe into your FreeCommander installation directory, replace %FcSrcPath% with the full correct path (no quotes). On Windows 7 your FreeCommander.ini file is here by default:


Download the SC2Integration utility for free here.

UPDATE: If you get a ‘Parsing arguments…’ error, make sure you add double quotes around %ActivSelAsFile% and %InactivDir% in your FreeCommander.ini.

Windows 7 disconnected network drives

Mapped network drives have always been a buggy area in Windows, probably at least in part because they are still linked to a DOS namespace. From MSDN:

On Windows Server 2003 and Windows XP, the WNet functions create and delete network drive letters in the MS-DOS device namespace associated with a logon session…

Whatever the reason, it is commonly reported that mapped network drives appear as ‘disconnected’ in Windows Explorer (or ‘unavailable’ via net use), and that programs that attempt to use these drive mappings will fail until the user manually clicks on the drive letter in Windows Explorer. Only this user-initiated action will restore the connection and allow other programs to read from the drive letter successfully.


This is a BIG problem!

I initially thought that the network drives were being mapped before some required services had started (such as the Workstation and Server services), so I wrote a program to run on startup that attempted to map a drive and, if it failed, logged all the currently running services. It kept trying to map the drive until it succeeded. I hoped this would show me which services needed to be running in order to map a drive, after which I would write a program to wait for those services before attempting to map anything. But, much to my amazement, I found that when the initial drive mapping failed and a subsequent attempt succeeded, the set of running services had not changed! I could only conclude that there was another element in the equation. After thinking about this for a while, and reading this post, which indicates that Microsoft is apparently trying to address the issue, I decided ‘stuff it’, I’ll just write something that I know will work and is simple.
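That ‘keep trying until it works’ approach is dead simple. Sketched in Python, with a pluggable map_drive callable standing in for whatever the real tool uses (on Windows that would be something like the WNetAddConnection2 API or shelling out to net use), the core loop looks like this:

```python
import time

def map_with_retry(map_drive, timeout_seconds, interval=1.0, sleep=time.sleep):
    """Call map_drive() until it succeeds or timeout_seconds elapses.

    map_drive is a callable returning True on success. The interval and
    sleep parameters are injectable here purely so the loop is testable.
    """
    deadline = time.monotonic() + timeout_seconds
    while True:
        if map_drive():
            return True           # drive mapped successfully
        if time.monotonic() >= deadline:
            return False          # gave up after the timeout
        sleep(interval)           # wait before the next attempt
```

No service dependencies, no guessing about startup ordering: just retry until the network is actually ready.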

So I’ve written a small executable called MapDrive.exe to ‘work around’ this problem, described on the following dialog:


As you may know, since Vista there has been the concept of a split user token, and mapped network drives apply to only one token: if you map drives as a standard user and then run an elevated process, those network drives are not available to the elevated process. This behaviour is documented by Microsoft; however, the solution they offer is both unsupported and unsafe. Other solutions have been proposed here. If you don’t wish to use any of these solutions, you can do this:

1. Run MapDrive.exe as a shortcut from your Startup folder. This will map drives for the standard user token.
2. Run MapDrive.exe as a local group policy logon script using gpedit.msc. This will map drives for the administrator token.

Note: After using this program you may still see the above balloon popup; simply click the spanner icon and select ‘Hide icon and notifications’ for ‘Windows Explorer’.

Download the utility for free here.

UPDATE: A few people have asked how to use this program. As stated above, there are two ways, depending on whether you need drives mapped for standard users or admins. If you don’t know what you need, you probably just need to do this for standard users. Here are the steps for both:

1. Copy MapDrive.exe to somewhere on your local hard drive.

Standard Users:
2. Right-click MapDrive.exe and choose ‘Create Shortcut’. This creates a file called ‘MapDrive.exe – Shortcut’ next to MapDrive.exe. Go to Start->All Programs, right-click the ‘Startup’ folder and choose ‘Open’, then copy the shortcut into this folder. Finally, right-click the shortcut, choose ‘Properties’, and add the drive letter and network share as arguments at the end of the ‘Target’ field, e.g. “C:\Users\joeblogs\Documents\MapDrive.exe s: \\server\share 20”. You are done; next time you reboot, your s: drive should be mapped successfully for standard users.

Admin Users:
3. Press Win+R, type gpedit.msc and go to User Configuration->Windows Settings->Scripts. Double-click Logon, click ‘Add’, enter the script name (no quotes): “C:\Users\joeblogs\Documents\MapDrive.exe” and the script parameters (no quotes): “s: \\server\share 20”, then OK your way out. You are done; next time you reboot, your s: drive should be mapped successfully for admin users (i.e. for elevated programs).

Network CHMs = Pain courtesy of IE

Despite being around since 1997, CHM files remain a popular format for program help and documentation, as well as ebooks. But opening them from a network drive was outlawed in 2005, when Microsoft released this security update. There was a registry hack to enable it again; however, this no longer works in Windows 7. The solution accepted on this thread involves two steps:

1. Add this registry key:


2. In Internet Explorer, go to Tools->Internet Options and add your network drive to the safe/trusted zones.

Just the thought of firing up IE gives me the security-shivers and I really don’t want to be messing about with ‘trusted zones’ – I don’t trust IE to do anything. I’m surprised the US government hasn’t yet advised its citizens against using IE, like Australia, France and Germany have.

Of course, the whole reason this has anything to do with IE is that HH.exe, the built-in HTML Help viewer on Windows, uses IE as the browser engine to display and navigate the HTML files inside the CHM. So we could avoid this whole problem if there were a CHM viewer that doesn’t use IE at all. Now to the motivation for this post:

Enter xCHM: an open-source project that predominantly provides support for viewing CHM files on non-Windows platforms. However, it has been ported to Windows, so you can kiss HH.exe goodbye and open your CHMs from any network location without having to hack your registry or mess with IE zones!

Download the Windows port of xCHM (look under xCHM for Win32).

Usenet vs Bittorrent

I switched from Bittorrent to Usenet about 2 years ago, and honestly it is hands-down the best way to get your movies, music, apps etc. Usenet has essentially all the same stuff that can be found on torrent sites (from what I understand, most content appears on Usenet first, before it’s uploaded as torrents), but instead of downloading from ‘peers’, you download everything off a central server, so you are not relying on:

a) Popularity of the file

b) Your ‘peers’ having their computers switched on with bittorrent client running

c) Your ‘peers’ having good upload bandwidth

All major Usenet providers support SSL, so your connection to their servers is secure, and you download at the maximum speed of your connection, so you know exactly when you will have your files. Furthermore, you don’t upload anything (on Usenet, you are actually encouraged to leech), so if your ISP counts uploads, that’s a big saving. In fact, that was one of the primary reasons I switched: one night I accidentally left my bittorrent client running, and by morning my 15Gb quota for the month was blown and I had only downloaded a 700Mb file >:(

So, security, constant download speed, no uploads, and no punching holes in your firewall to allow incoming connections. Sounds like a winner? Read on to get started.

NZB Files

To download from Usenet, you use NZB files (roughly the equivalent of .torrent files). These tiny files contain information on the posts that make up a set of files (usually a collection of .rar files) uploaded to a newsgroup. There are lots of sites where you can get NZB files, but I pretty much only use BinSearch, which is fast, free and secure. Some tips for beginners:

  • After searching, you need to tick/check the items you want and then click the ‘Create NZB’ button.
  • Focus on collections (the ones with green text) and pay attention to the file size.
  • Use the Advanced search to search only for collections containing NFO files and within a specific file-size range (very useful for narrowing down searches that return lots of results).
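If you are curious what an NZB file actually contains, it is just a small XML document listing, for each file, the Usenet segments that make it up. A hedged Python sketch of reading one (tolerating the NZB XML namespace, and glossing over the real format's other attributes):

```python
import xml.etree.ElementTree as ET

def list_nzb_files(nzb_xml):
    """Return (subject, segment_count) for each <file> entry in an NZB document."""
    files = []
    for elem in ET.fromstring(nzb_xml).iter():
        if elem.tag.split("}")[-1] == "file":  # strip any namespace prefix
            segments = [s for s in elem.iter() if s.tag.split("}")[-1] == "segment"]
            files.append((elem.get("subject"), len(segments)))
    return files
```

Your news reader does exactly this, then fetches each listed segment from the news server and reassembles the files.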

An alternative to searching is just browsing (like PB’s Top 100). Probably the best site for this is NewzBin; however, you have to be invited to join. This site categorizes everything on Usenet as if you were browsing the shelves of your local video store. It’s free to browse, but you have to pay to download NZB files (or view NFO files), so I just switch back to BinSearch after I find something interesting.

News Readers

Once you have your NZB file(s), you need a news reader application (the equivalent of a bittorrent client) to actually download the files. I used to use the free GrabIt, but it crashes almost every time on shutdown on Windows 7 (no data is lost, it’s just friggen annoying), so I’m currently investigating the other free options out there, in particular SABnzbd, an open-source browser-based client with a schmick (and very powerful) user interface. There are stacks of readers, including many you can pay for (e.g. NewsLeecher).

News Servers

Your news reader app must connect to a news server, which hosts essentially a mirror of all newsgroups, and for which you must pay. I use and recommend AstraWeb: $25 for 180Gb, never expires, SSL, 541 days retention. Giganews has a 14-day free trial, and Binverse has a 30-day free trial with a whopping 329Gb of downloads, if you want to try out Usenet before spending any ca$h.

Par Files

Occasionally you may find that one or two of the files in a set of 50 .rar files are missing, and several may be damaged or corrupted. Luckily, collections come with .par files, which enable repair or recovery of damaged or missing files without having to download anything further. Simply open the .par2 file with QuickPar and watch it work its magic.
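The magic is parity data. Real PAR2 uses Reed-Solomon coding, which is well beyond a blog snippet, but the core idea of rebuilding a missing block from the surviving blocks plus parity can be shown with simple XOR (a toy illustration only, nothing like the actual QuickPar internals):

```python
from functools import reduce

def xor_parity(blocks):
    """Compute a parity block: the byte-wise XOR of all equal-length data blocks."""
    return bytes(reduce(lambda a, b: a ^ b, column) for column in zip(*blocks))

def recover_missing(blocks_present, parity):
    """Rebuild the single missing block: XOR of the surviving blocks and the parity."""
    return xor_parity(blocks_present + [parity])
```

With one parity block you can survive any one missing data block; PAR2 generalises this so N recovery blocks can repair any N missing or damaged files.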


Although it may seem from the above that using Usenet is complicated, it’s really just that there are a few things you need to do at the start (getting the apps, signing up to a news server). Once those are in place, Usenet is a breeze and provides numerous benefits over torrenting. Go try it!!

Cutting your chances of data loss

Data loss is probably the single most traumatic experience possible for those who store their data electronically. In recent years I have had the good fortune to be spared this fate, touch wood, but I will forever mourn the loss of the first computer game I wrote, back in the early 90’s on a Mac SE/30. It was a HyperCard stack, 2.1 Mb in size, so I couldn’t back it up on a single floppy. Eventually the 40Mb HDD in that great machine died, and I was crushed. Years of work, all that creativity, a window on my teenage years. Gone.

But it was just a game, after all. Recently a friend of mine lost pretty much all the photos of their first-born child (3 months’ worth) due to a botched home server upgrade. Crushing.

We all know the answer is backup, but many people just don’t get around to it, thinking ‘yeah, I should do that’. Like many people, I have a server at home (Gentoo Linux) that is my firewall, DHCP server etc., and of course file server. It has a 4.5Tb RAID5 array (7 x 750Gb drives), which is about 80% full of data I’ve amassed over the years. At the moment there are only two clients: a desktop for everyday use, and a media centre hooked up to the TV. Even though RAID5 gives me some level of redundancy, data loss will still occur if more than one HDD fails.

Ok, so finally to the point of this post – to boost my data integrity (and aid in learning WPF), I have written a ‘folder mirror’ utility.


As you can see, the folder mirroring process can be paused and resumed at will, log files are maintained in a directory of your choice, and a system tray icon dynamically displays the percent complete. To configure which folders to mirror, the program reads a file called folderMirror.config, in which you can define as many folders to copy as desired. The utility is designed to be run as part of a scheduled backup, such as a scheduled task. I have it running every day, creating a mirror of the data I care most about on the two client computers. So now my chances of losing data due to HDD failure have been cut down to manageable odds, and I can sleep much easier.
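Conceptually, each mirror pass is just a recursive walk that copies new or changed files. A minimal Python sketch of that step (the real utility is a WPF app; this ignores pausing, logging and deletions, and only copies one way):

```python
import os
import shutil

def mirror_folder(source, destination):
    """Copy new/changed files from source to destination (one-way mirror)."""
    copied = []
    for dirpath, _dirs, files in os.walk(source):
        rel = os.path.relpath(dirpath, source)
        target_dir = os.path.join(destination, rel) if rel != "." else destination
        os.makedirs(target_dir, exist_ok=True)
        for name in files:
            src = os.path.join(dirpath, name)
            dst = os.path.join(target_dir, name)
            # copy if the file is missing at the destination, or the source is newer
            if not os.path.exists(dst) or os.path.getmtime(src) > os.path.getmtime(dst):
                shutil.copy2(src, dst)  # copy2 preserves timestamps
                copied.append(dst)
    return copied
```

Because timestamps are preserved, a second pass over unchanged data copies nothing, which is what makes a daily scheduled run cheap.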

You can download the utility for free here.

Internet over copper sucks

I am subscribed to the fastest available ADSL2+ (24Mbps) plan here in Sydney, and my connection has been unreliable at all three houses I’ve lived in. Where I’m living currently, the connection actually goes down whenever it starts raining! :? It also frequently drops out at random when it’s not raining. P2P seems to worsen the situation (not surprisingly), but Usenet is superior anyway, so that’s not a problem anymore. Granted, I am pretty far from the exchange (~4km); hopefully copper will be replaced with optic fibre in the next decade.

So a while back I wrote a small utility that shows an icon in the system tray reflecting whether you can contact (via ping) a remote host (google, or your ISP’s DNS server, for example).

Online icon: lookalive-up

Offline icon: lookalive-down

Clicking the icon shows you the history of ping times:


It can be run on startup, passing the name or IP of the remote host as an argument. For example:


You can also specify the period (in seconds) between pings (default is 1 second), so to ping every 5 seconds:

"lookalive.exe 5"

If you have multiple hosts you wish to ping, you can run the application multiple times, and a separate icon is displayed in the system tray for each host. To distinguish between them, you can pass a colour:

"lookalive.exe 5 blue"

Additional arguments allow you to call an external application and/or show a balloon notification whenever the state changes, and to customise the system tray icon.
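The heart of the utility is a simple polling loop that only reacts when the up/down state flips. A hedged Python sketch (the probe is injectable here so the loop is testable; the real tool probes via ICMP ping and swaps the tray icon on each change):

```python
import time

def watch_host(probe, on_change, polls, interval=1.0, sleep=time.sleep):
    """Poll probe() `polls` times; call on_change(is_up) whenever the state flips.

    probe returns True when the host is reachable. on_change is where the
    real tool would swap the tray icon, show a balloon, or launch a program.
    """
    state = None
    history = []
    for _ in range(polls):
        up = probe()
        history.append(up)       # keep ping history for the click-through view
        if up != state:          # only notify on transitions, not every poll
            on_change(up)
            state = up
        sleep(interval)
    return history
```

Keying the notifications off transitions rather than raw poll results is what keeps the tray icon quiet while the link is stable.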

Download the utility for free as an installer or as a standalone EXE.