PRESS RELEASE—FOR IMMEDIATE PUBLICATION

Sightech Vision Systems, Inc.

San Jose, CA 95120

(408) 282-3770

Attn:  Tom Seitzler or Art Gaffin

San Jose, CA.  Sightech Vision Systems, Inc. announces the availability of its new “texture” inspection mode for the PC Eyebot.

Until now, texture inspection and pattern analysis have been elusive targets for machine vision processors.  Using patented feature-recognition technology, the Sightech PC Eyebot stores the pattern or texture features it “learns” while good products are presented to the camera.  During inspection or recognition runs, the features of the object under inspection are compared against the stored feature types from the training (learning) session at a rate of millions of comparisons per second.  The system reports unrecognized features as a score from 0 to 99.99.  The user sets the “sensitivity” threshold and flags objects as “pass” or “fail” according to criteria appropriate to the product being inspected.
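
For illustration only, the pass/fail decision works roughly as in the Python sketch below.  The 0-to-99.99 score range comes from the description above; the function names, threshold value, and feature-matching details are hypothetical and are not part of the actual PC Eyebot software.

    # Minimal sketch of the scoring and pass/fail logic described above.
    # The 0-99.99 score range comes from the release; everything else
    # (names, threshold, feature matching) is a hypothetical illustration.

    def novelty_score(object_features, learned_features):
        """Fraction of features not found in the learned set, scaled to 0-99.99."""
        unrecognized = sum(1 for f in object_features if f not in learned_features)
        return 99.99 * unrecognized / max(len(object_features), 1)

    def classify(score, sensitivity=5.0):
        """Flag an object as 'pass' or 'fail' against the user-set sensitivity."""
        return "fail" if score > sensitivity else "pass"

    # Example: one unrecognized stray-fiber feature among three raises the
    # score past the threshold, so the object is flagged as a fail.
    learned = {"weave_0deg", "weave_90deg", "seam_horizontal"}
    sample = ["weave_0deg", "weave_90deg", "stray_fiber_45deg"]
    print(classify(novelty_score(sample, learned)))   # -> fail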

An early application of the texture inspection mode was detecting occasional stray fibers on a customer’s biomedical filters.  The pattern of the filter fibers was “learned” by the PC Eyebot.  A horizontal seam present in all filters was also learned as acceptable.  In “Inspect” mode, the PC Eyebot found loose fibers on a few of the filters as they were rotated in front of the 640 x 480 pixel camera.  The stray fibers were about the thickness of a human hair and similar in color to the filter material; only their direction differed from the rest of the filter texture.

Training the PC Eyebot on the good filters’ pattern and texture took only a few minutes.  If patterns or textures change, or if additional filter types need to be inspected automatically, the PC Eyebot can store the existing training, learn the new patterns, and then inspect the additional products.  The PC Eyebot also provides outputs to annunciators and ejectors (PLCs) for a physical response to detected conditions.

Sightech.com has short video examples of the texture inspection function and several others, as well as briefs on many applications.  The company reports success in several application areas where older technologies have proven inadequate.

Sightech welcomes new and challenging machine vision applications as tests of its robust system.