That was easier than I thought...

After not nearly as much time in front of Visual Studio as I thought I'd need, I now have a tool to test out the big-directory case. I'm running a few tests to see whether the output makes sense, but the early returns look pretty reasonable. I'm still not 100% certain that I have my units right, but at least it gives me something to compare against, and it should show whether large directory listings are subject to linear or curved response times.

Some sample output:

Iterations, MkDirOpTime(ms), EnumerationTimeDOS(ms), EnumerationOpTimeDOS(ms)
500, 0.213460893024992, 4.33778808035625, 0.0086755761607125
1000, 0.205388901117948, 8.56917296772817, 0.00856917296772817
1500, 0.206062279938047, 12.6200889185171, 0.00841339261234476
2000, 0.203543292746338, 16.5182862916397, 0.00825914314581986
2500, 0.202069268714127, 20.5898861176478, 0.00823595444705914
3000, 0.201296393468305, 24.786919106575, 0.00826230636885834


That's 3000 directories being enumerated in that bottom line. I'm also not 100% sure the time unit is really milliseconds, though the same conversion from the arbitrary (?) system units to real units is applied to all of the columns, so at least they're comparable.
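For what it's worth, here's a minimal sketch of the kind of unit conversion I mean, assuming the timer is QueryPerformanceCounter: the counter returns ticks in an arbitrary, hardware-dependent unit, and QueryPerformanceFrequency gives the ticks-per-second needed to turn that into milliseconds. The function name is just illustrative, not the actual utility's code.

#include <windows.h>

// Convert a pair of raw QueryPerformanceCounter readings into milliseconds.
double CountsToMilliseconds(LONGLONG startCounts, LONGLONG endCounts)
{
    LARGE_INTEGER frequency;                 // ticks per second on this machine
    QueryPerformanceFrequency(&frequency);
    return (double)(endCounts - startCounts) * 1000.0 / (double)frequency.QuadPart;
}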

The utility takes two arguments:
  • Number of directories to create.
  • [optional] Stepping factor.
The above output was generated with "bigdir 3000 500": create 3000 subdirectories and run a directory enumeration after every 500 of them. The step defaults to 1.
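Here's a rough sketch of the loop such a utility might run. I'm assuming CreateDirectory for the mkdir step and FindFirstFile/FindNextFile for the DOS-style enumeration; the real bigdir code may differ, and the mkdir timing column is omitted here for brevity. It's only meant to illustrate the "create N directories, enumerate every <step>" shape described above.

#include <windows.h>
#include <cstdio>
#include <cstdlib>

int main(int argc, char* argv[])
{
    int total = (argc > 1) ? atoi(argv[1]) : 1000;  // number of directories to create
    int step  = (argc > 2) ? atoi(argv[2]) : 1;     // enumerate after every <step> creations

    LARGE_INTEGER freq;
    QueryPerformanceFrequency(&freq);               // ticks per second, for unit conversion

    printf("Iterations, EnumerationTimeDOS(ms), EnumerationOpTimeDOS(ms)\n");

    for (int i = 1; i <= total; ++i)
    {
        char name[32];
        snprintf(name, sizeof(name), "dir%06d", i);
        CreateDirectoryA(name, NULL);               // mkdir timing omitted in this sketch

        if (i % step == 0)
        {
            LARGE_INTEGER start, end;
            QueryPerformanceCounter(&start);

            // DOS-style enumeration of everything in the current directory
            WIN32_FIND_DATAA data;
            HANDLE h = FindFirstFileA("*", &data);
            int found = 0;
            if (h != INVALID_HANDLE_VALUE)
            {
                do { ++found; } while (FindNextFileA(h, &data));
                FindClose(h);
            }

            QueryPerformanceCounter(&end);
            double totalMs = (double)(end.QuadPart - start.QuadPart) * 1000.0
                             / (double)freq.QuadPart;
            printf("%d, %f, %f\n", i, totalMs, found ? totalMs / found : 0.0);
        }
    }
    return 0;
}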
