I am looking for feedback on file analysis software that has been used on activities or projects to reduce redundant, obsolete or trivial (ROT) information.
Keen to hear about applications such as Nuix, HP Control Point, Active Navigation etc.
I too am interested in Andy's question. We routinely use the DuplicateFileFinder tool to identify duplication, coupled with a drive analysis tool that shows us when information was last touched. Recently we've started experimenting with FileFacets to see what it can do for us. We want to use smarter tools to help with this type of content analysis, as well as with preparing for document migrations, including auto-classification tools.
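For anyone curious what these duplicate-finder tools are doing under the hood, the core technique is usually content hashing: files with identical byte content produce identical digests, regardless of filename or location. Here is a minimal sketch (not the implementation of any product named above; the function name and parameters are illustrative):

```python
import hashlib
from collections import defaultdict
from pathlib import Path

def find_duplicates(root: str, chunk_size: int = 1 << 20) -> dict[str, list[Path]]:
    """Group files under `root` by SHA-256 content hash.

    Any group with more than one path contains exact byte-for-byte duplicates.
    Files are read in chunks so large files do not exhaust memory.
    """
    groups: dict[str, list[Path]] = defaultdict(list)
    for path in Path(root).rglob("*"):
        if not path.is_file():
            continue
        digest = hashlib.sha256()
        with path.open("rb") as f:
            for chunk in iter(lambda: f.read(chunk_size), b""):
                digest.update(chunk)
        groups[digest.hexdigest()].append(path)
    # Keep only groups that actually contain duplicates.
    return {h: paths for h, paths in groups.items() if len(paths) > 1}
```

Commercial tools typically add a cheaper pre-filter (grouping by file size first, so only same-sized files get hashed) and near-duplicate detection, but exact-match hashing like this is the baseline.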
If you haven't already done so, you may want to review the presentation/recording from earlier this year about the "Dark Matter - shared drive file analysis" case study via Lockheed Martin / Nuix:
If you are interested, I can connect you to the presenter.
Another tool we commonly encounter for de-duplication and basic file analysis is TreeSize Pro.
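The "last touched" analysis mentioned above is also easy to illustrate: ROT cleanup projects often start by flagging files whose modification time is older than some retention threshold. A minimal sketch (the function name and the three-year default are illustrative assumptions, not any vendor's rule):

```python
import time
from pathlib import Path

def stale_files(root: str, max_age_days: int = 365 * 3) -> list[Path]:
    """Return files under `root` whose last-modified time is older than
    `max_age_days`. Uses the filesystem mtime, which is only a proxy for
    "last touched" -- migrations and backups can reset it.
    """
    cutoff = time.time() - max_age_days * 86400
    return [
        p for p in Path(root).rglob("*")
        if p.is_file() and p.stat().st_mtime < cutoff
    ]
```

In practice you would combine this with the duplicate report and access logs before deleting anything, since mtime alone can misclassify actively read reference material as stale.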
We at Venture have a proprietary file analysis tool that carries out de-duplication analysis (with 99.9% accuracy in duplicate identification) and several pre-migration tasks. We also have another product that performs auto-classification and metadata generation based on content-level scanning. I could put you in contact with one of our team to discuss your requirements further.
As a solution-oriented business, we also keep abreast of comparable products, so we would be happy to discuss them from a vendor-neutral perspective as well.