Not too long ago I discovered that my QFinder indexing service had not been working correctly, and since then I have been trying to regenerate the index file. After a couple of weeks of work and several hard drives of increasing capacity, I have identified part of my issue but no resolution. About 300,000 files in (roughly 5 hours), I keep hitting what appears to be a corrupted file that derails the rest of the indexing process: as soon as that file is read, the index starts growing by about 1 GB every few seconds until I either stop the indexing task manually or the hard drive runs out of space.

Since discovering this specific file I have taken it off the network for repair, and now I just want to let the index rebuild so I can get an idea of how large it will be when completed. However, the only option I can find for regenerating the index is to create a new one, which deletes the previous index and starts over from scratch. There has to be a more efficient way to build the index.

Ideally, since I can stop index generation, I would like the ability to resume it rather than do a complete rebuild. I'm already 5 hours into the process when the problem shows up, with another million or so files to go, so starting over each time is turning into a painful way to build the index. I'm indexing the volume that contains all of my Department folder shares, and at this point I want it to capture everything it can, so I would rather not add exclusions. Any advice would be appreciated. Thanks!
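To illustrate what I mean by "resume" (since QFinder doesn't seem to offer it), here is a rough sketch of the kind of checkpointing I have in mind: record each path as it is indexed, and on restart skip anything already recorded. This is not how QFinder actually works internally; the checkpoint file, the `index_file` function, and the share path below are all just placeholders for the idea.

```python
import os

CHECKPOINT = "indexed_paths.txt"  # hypothetical checkpoint file, one indexed path per line

def load_checkpoint():
    """Return the set of paths that were already indexed in a previous run."""
    if not os.path.exists(CHECKPOINT):
        return set()
    with open(CHECKPOINT, "r", encoding="utf-8") as f:
        return {line.rstrip("\n") for line in f}

def index_file(path):
    """Placeholder for whatever per-file work the indexer actually does."""
    pass

def build_index(root):
    done = load_checkpoint()
    with open(CHECKPOINT, "a", encoding="utf-8") as ckpt:
        for dirpath, _dirnames, filenames in os.walk(root):
            for name in filenames:
                path = os.path.join(dirpath, name)
                if path in done:
                    continue             # already indexed in an earlier run; skip it
                index_file(path)
                ckpt.write(path + "\n")  # checkpoint after each file so a stop or crash can resume
                ckpt.flush()

if __name__ == "__main__":
    build_index("/share/Departments")    # example share path, not my actual mount point
```

With something like that, stopping at the corrupted file would only cost the work since the last checkpoint instead of the whole 5 hours.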