If your project requires zero exceptions, run for the hills. No deployment of even basic imaging technology is free of exceptions. If by chance your deployment appears to be, you have done one of two things: either you never created an exception handling process and your exception documents are sitting in some alley somewhere all alone, or you have made the unwise decision to allow false positives into your system. Neither of these scenarios is good.
Exceptions do not have to be a bad thing. They are an opportunity to gain a better understanding of your documents and improve the system for even greater success. How companies choose to deal with those exceptions is often what makes or breaks that success. A company that lets the exception process grow organically often finds it is throwing professional services money out the window on fine-tuning, while other organizations design exception processes that negate any automation the system provides. Here are the five biggest mistakes companies make when dealing with exceptions.
-
In data capture, exceptions happen in three phases: template matching, field location, and field recognition. Each phase has to be treated separately from the others. This is the first mistake organizations make: not classifying or recording the results of exceptions. At larger volumes, each phase requires its own process in order to be most efficient. If the exception is a template-matching failure, it does not make sense to simply send the whole image for manual entry. Instead, an operator should find out why it did not match and, in the worst case, manually rubber-band only the affected fields.
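As a minimal sketch of what classifying and routing by phase could look like: the three phase names come from this article, but the queue names, the route_exception helper, and the logging structure below are assumptions for illustration, not any vendor's API.

```python
from collections import Counter
from enum import Enum

class ExceptionPhase(Enum):
    TEMPLATE_MATCHING = "template_matching"
    FIELD_LOCATION = "field_location"
    FIELD_RECOGNITION = "field_recognition"

# Hypothetical queue names; substitute whatever your capture platform actually provides.
ROUTING = {
    ExceptionPhase.TEMPLATE_MATCHING: "template_review_queue",   # operator investigates why it did not match
    ExceptionPhase.FIELD_LOCATION: "rubber_band_queue",          # operator re-zones the field only
    ExceptionPhase.FIELD_RECOGNITION: "key_from_image_queue",    # operator keys the low-confidence field
}

exception_log = Counter()  # record every exception so repeats can be analyzed later

def route_exception(doc_id: str, phase: ExceptionPhase, detail: str) -> str:
    """Record the exception by phase and return the queue it should land in."""
    exception_log[(phase, detail)] += 1
    return ROUTING[phase]

# Example: a field-location failure is sent for re-zoning, not full manual entry.
queue = route_exception("INV-1042", ExceptionPhase.FIELD_LOCATION, "total_amount not found")
print(queue)  # -> "rubber_band_queue"
```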
-
No exception left behind. It seems like a great idea: for every exception, spend some time fine-tuning. In reality, this is a very bad idea. Fine-tuning time should only be spent on exceptions that repeat at some meaningful percentage of total volume. No wonder so many projects never hit their ROI goals: organizations are spending time and money on exceptions that repeat once in a million documents, or never again. Not only that, the nature of advanced data capture on semi-structured documents is such that if you add fine-tuning for one specific variation, you stand a chance of breaking the logic for another. The more variations you add, the greater the risk. The only way to mitigate this is regression testing, which is itself very expensive. I understand your ROI can be not just good but great; I saw the same demo and calculation that knocked you over. At some point, though, you are just chasing it and never benefiting from it. In my experience with data capture deployments, the success rate is about 30%. A large number of deployments seem VERY successful for the first one to three months. This is the honeymoon period of data capture production. What is different after month three? Exceptions have started to accumulate, dollars are put into play to get the honeymoon back, and no planning was ever done to ensure longevity. Cue frustration, fear, and disaster. It's not the technology's fault.
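A minimal sketch of the "repeat x% of volume" rule, assuming you have been logging exception signatures as suggested above. The signatures, counts, and threshold here are invented for illustration; pick your own x% based on the cost of fine-tuning plus regression testing.

```python
from collections import Counter

# Hypothetical tally of exception signatures (e.g. document variation + failing field)
# accumulated over a month of production. The numbers are made up.
exception_counts = Counter({
    ("acme_invoice_v2", "invoice_total"): 1250,
    ("acme_invoice_v2", "po_number"): 40,
    ("one_off_statement", "balance"): 1,
})

TOTAL_VOLUME = 100_000          # documents processed in the same period
FINE_TUNE_THRESHOLD = 0.005     # only tune exceptions above 0.5% of volume (your x% will differ)

def worth_fine_tuning(counts: Counter, total: int, threshold: float) -> list:
    """Return only the exception signatures frequent enough to justify fine-tuning
    and the regression testing that must follow it."""
    return [sig for sig, n in counts.items() if n / total >= threshold]

print(worth_fine_tuning(exception_counts, TOTAL_VOLUME, FINE_TUNE_THRESHOLD))
# -> [('acme_invoice_v2', 'invoice_total')]  -- the once-in-a-million cases are left alone
```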
-
Use the tools the vendors give you. So often I see organizations build their own data verification processes when all the top vendors ship great verification tools. Not only that, they offer ways of using the technology in an iterative approach that acts as a first level of verification. Data capture software should rarely be used as a black-box solution; the verification tools are there for a reason.
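One common iterative pattern is to run two recognition passes over the same fields and only send disagreements or low-confidence results to the vendor's verification station. The sketch below assumes invented field results and a made-up confidence threshold; it is an illustration of the idea, not a particular vendor's workflow.

```python
# Hypothetical results from two recognition passes over the same document
# (e.g. two engines, or two image pre-processing settings): {field: (value, confidence)}.
pass_one = {"invoice_total": ("1,234.56", 0.97), "po_number": ("PO-8841", 0.62)}
pass_two = {"invoice_total": ("1,234.56", 0.95), "po_number": ("PO-8B41", 0.71)}

MIN_CONFIDENCE = 0.90  # assumed cut-off for auto-acceptance

def first_level_verify(a: dict, b: dict) -> tuple:
    """Auto-accept fields where both passes agree with high confidence;
    everything else goes to the vendor's verification tool."""
    accepted, needs_review = {}, []
    for field in a:
        (val_a, conf_a), (val_b, conf_b) = a[field], b[field]
        if val_a == val_b and min(conf_a, conf_b) >= MIN_CONFIDENCE:
            accepted[field] = val_a
        else:
            needs_review.append(field)
    return accepted, needs_review

accepted, needs_review = first_level_verify(pass_one, pass_two)
print(accepted)      # -> {'invoice_total': '1,234.56'}
print(needs_review)  # -> ['po_number']  sent to the verification station, not re-keyed from scratch
```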
-
Don’t neglect QA resources you already own. One of the best ways to boost the accuracy of a data capture deployment is to use existing, 100% accurate data. If you have a customer database and are recognizing customer data from paper documents, use that supporting data in your QA processes. It’s very common to already have some of the information, or at least a firm understanding of its structure. During the setup of the capture system, it’s important to inventory these resources and use them.
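A minimal sketch of this kind of database cross-check: the customer records, field names, and qa_against_database helper are assumptions for illustration, but the principle is exactly the one above, using data you already trust to catch recognition errors before export.

```python
# Hypothetical extract from an existing customer database (the "100% accurate" data).
customer_db = {
    "C-1001": {"name": "Acme Corp", "zip": "30301"},
    "C-1002": {"name": "Globex Inc", "zip": "94105"},
}

def qa_against_database(recognized: dict) -> list:
    """Flag recognized fields that contradict data we already trust."""
    issues = []
    known = customer_db.get(recognized.get("customer_id"))
    if known is None:
        issues.append("customer_id not found in master data")
        return issues
    for field in ("name", "zip"):
        if recognized.get(field) and recognized[field] != known[field]:
            issues.append(f"{field}: OCR read {recognized[field]!r}, database has {known[field]!r}")
    return issues

# Example: the ZIP was misread as '80301'; the existing data catches it before export.
print(qa_against_database({"customer_id": "C-1001", "name": "Acme Corp", "zip": "80301"}))
# -> ["zip: OCR read '80301', database has '30301'"]
```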
-
Don’t push it! Don’t start off by feeding your document scanners every document you can find. I know it’s fun to watch the paper go in one end and out the other and become a pretty icon on the screen, but there is no benefit in feeding the system garbage. There is benefit, however, in finding the documents that will be a clear win and using their success as a foundation when advancing to the more complex or lower-quality documents.
Only with an exceptional exception handling process will you have an exceptional data capture system and ROI.
#Exceptions #datacapture #qualityassurance #ScanningandCapture