Optical sensors get even smarter
New optical sensors, which combine age-old mathematical theories with new construction methods, can achieve unparalleled performance. Tom Shelley reports
New optical thickness gauges, which make use of long-forgotten advanced mathematical concepts, can be used to monitor the production of optically rough and imperfect coatings and films. Transducers based on closely related optical technologies can also be used to measure refractive index changes in films to better than one part per million, with major applications in biotechnology process control.
The primary outcome of this development is that optical sensors, widely used in many branches of automated manufacturing and process control, are about to become a whole lot smarter. Both sets of developments come from a team led by Dr Bob Jones at Cambridge Consultants.
The optics of the coating thickness gauge constitute a relatively conventional Michelson interferometer. Because light is reflected from both the front and the back of the film, the device produces two sets of interference lobes. If the thickness of the film is less than the coherence length of the light, the two sets of lobes overlap. However, if the surfaces of the coating are rough, the lobes become very small and very difficult to distinguish from noise.
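To make the overlap concrete, here is a rough numerical sketch of the two fringe packets such a gauge sees; the wavelength, coherence length, film index and thickness are illustrative assumptions, not figures from Cambridge Consultants.

```python
# Illustrative only: a low-coherence interferogram from a thin film.
# Each surface reflection produces an envelope-limited fringe packet;
# the back-surface packet is delayed by the film's optical thickness
# 2*n*t, so when 2*n*t is smaller than the coherence length the two
# packets overlap, and roughness shrinks them towards the noise floor.
import numpy as np

def fringe_packet(z, z0, wavelength=0.8, coherence_length=10.0):
    """Gaussian-enveloped fringe packet centred at scan position z0.
    All lengths are in microns."""
    envelope = np.exp(-((z - z0) / coherence_length) ** 2)
    return envelope * np.cos(4 * np.pi * (z - z0) / wavelength)

z = np.linspace(-30.0, 30.0, 4000)   # reference-mirror scan positions
n, t = 1.5, 4.0                      # assumed film index and thickness
signal = fringe_packet(z, 0.0) + 0.3 * fringe_packet(z, 2 * n * t)
```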
Bayesian breakthrough
Here’s the clever bit. Thomas Bayes, a Presbyterian minister who lived in Tunbridge Wells until his death in 1761, developed a unique way of looking at probability. Cambridge Consultants researcher Andrew Diston explained it to Eureka: “It’s quite simple really. You modify your beliefs about something based on the knowledge that you have. For example, a rolled die should have a one in six chance of coming up six. But if six comes up often, it must be a loaded die, with a greater than one in six chance of coming up six.”
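Diston’s die makes a tidy worked example of Bayes’ rule, P(hypothesis|data) = P(data|hypothesis)P(hypothesis)/P(data). The sketch below is purely illustrative; the prior and the loaded-die probability are assumed.

```python
# Update the belief that a die is loaded after observing a run of sixes.
def p_loaded_given_sixes(n_sixes, prior_loaded=0.05, p_six_loaded=0.5):
    like_loaded = p_six_loaded ** n_sixes      # P(data | loaded)
    like_fair = (1.0 / 6.0) ** n_sixes         # P(data | fair)
    evidence = like_loaded * prior_loaded + like_fair * (1 - prior_loaded)
    return like_loaded * prior_loaded / evidence

for n in (1, 3, 6):
    print(n, round(p_loaded_given_sixes(n), 3))
# Even starting from a sceptical 5% prior, six sixes in a row push the
# probability that the die is loaded above 97%.
```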
Simple in essence it may be, but the integrals that describe the basis of the idea cover a page. Despite the peaceful nature of their original inventor, Bayes-derived algorithms have proved particularly useful in missile tracking, because of the assistance they provide in analysing trajectories and predicting where a missile is likely to go next. This is a matter of crucial interest both to those trying to destroy moving targets and to those trying to avoid being hit.

Bayesian algorithms also underlie the information mining technologies being pioneered by Autonomy, a company also based in Cambridge. Its technology enables computers to form an automatic understanding of text, web pages, e-mails, voice recordings, documents and images. By automatically tracking individual expertise profiles, and allowing real-time matching of interests and documents, the software can put company members in touch with experts able to solve their problem. Microsoft is known to have a very active interest in using the idea to mine data and to find documents and data samples similar to those a user has already found interesting. It is said that the initial version of the Word paper clip was based on Bayesian algorithms, although not the version finally shipped.
In the case of the coating sensor outputs, the algorithms extract the best estimate from the available prior knowledge, which accumulates as the sensor is scanned along the coating. Precision is about 0.1 microns in coatings a few microns to tens of microns thick. One of the first applications is in measuring the thicknesses of the various layers in coated angioplasty balloons and stents for a leading US manufacturer. The device is able to highlight defects invisible to the naked eye.
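The article does not publish the algorithm itself, but the flavour of the approach can be sketched: score each candidate thickness by how well its predicted interferogram explains the noisy measurement, and carry the resulting posterior forward as the prior at the next scan position. Everything below, including the names and the Gaussian noise model, is hypothetical.

```python
import numpy as np

def posterior_over_thickness(measured, z, candidates, prior, model,
                             noise_sd=0.2):
    """Bayesian update of a thickness belief from one noisy interferogram.
    `model(z, t)` predicts the interferogram for candidate thickness t;
    `prior` is the posterior carried over from the previous scan position."""
    log_post = np.log(np.asarray(prior, dtype=float))
    for i, t in enumerate(candidates):
        residual = measured - model(z, t)        # misfit of this hypothesis
        log_post[i] += -0.5 * np.sum((residual / noise_sd) ** 2)
    log_post -= log_post.max()                   # guard against underflow
    post = np.exp(log_post)
    return post / post.sum()
```

Run position by position, weak fringes that would be lost in noise at any single point still accumulate into a confident thickness estimate.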
Monolithic ‘lump’
In the case of the second set of interferometric sensors, the innovation is more in the hardware than the software, although the initial application is also biomedical and the software is still highly sophisticated. Here, the illuminating and reference beams are obtained by reflecting the input beam from the front and back of an optical flat, and the same flat recombines the returning beams to produce the interference. The reference and measurement mirrors reside in a single thin gold coating on the far side of a prism. What is really clever is the way the flat and mirror are bonded together into a single monolithic element, making the device unusually rugged and compact.
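A back-of-envelope calculation shows why the geometry is so stable: the two reflections from a flat of thickness d and refractive index n leave with a fixed optical path difference of 2nd·cos(θ), set by the glass itself rather than by adjustable mirrors. The numbers below are assumed purely for illustration.

```python
import math

n, d = 1.45, 2.0e-3              # assumed fused-silica flat, 2mm thick
theta_i = math.radians(10.0)     # assumed angle of incidence
theta_r = math.asin(math.sin(theta_i) / n)   # refraction inside the flat
opd = 2 * n * d * math.cos(theta_r)          # fixed reference delay
print(f"optical path difference: {opd * 1e3:.2f} mm")   # about 5.76 mm
```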
The device is being developed to image protein binding and other biomolecular interactions, with the crucial molecules attaching themselves to the metallic film. The intensity and phase of the reflected light depend on the refractive index of the medium in contact with that film. Binding between the molecules attached to the surface of the film and protein molecules in a secondary fluid gives rise to tiny changes in refractive index, typically one part in a million or even one part in ten million.
The phase images are extracted using Fourier techniques. Calibration results demonstrate a noise floor of 10⁻⁶ refractive index units and a spatial resolution of 50 microns.
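The article does not spell out the processing chain, but a standard Fourier technique for this job isolates the fringe carrier in the spectrum and reads the phase from the demodulated result. The sketch below assumes a one-dimensional fringe line with a known spatial carrier (carrier_bin larger than halfwidth); a refractive index change in the fluid then appears directly as a change in the recovered phase.

```python
import numpy as np

def extract_phase(fringes, carrier_bin, halfwidth):
    """Fourier-transform phase recovery from a fringe line."""
    spectrum = np.fft.fft(fringes)
    window = np.zeros_like(spectrum)
    lo, hi = carrier_bin - halfwidth, carrier_bin + halfwidth
    window[lo:hi] = spectrum[lo:hi]          # keep only the +carrier sideband
    demodulated = np.roll(window, -carrier_bin)   # shift carrier to DC
    return np.unwrap(np.angle(np.fft.ifft(demodulated)))
```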
A monolithic Fourier Transform spectrometer based on the same technology has also been developed. It can be configured for ultraviolet (250 to 500nm), visible (400 to 950nm) or near-infrared (1,100 to 1,600nm) spectral ranges through choice of detector array. Light input is via a 1.75mm fibre bundle, spectral acquisition takes 0.1ms, and the dimensions are 80 x 60 x 30mm. Conventional Fourier Transform spectrometers generally occupy laboratory bench tops, so these instruments are remarkably small, and have the potential to be made smaller still. They are potentially low cost, and could be developed for mass sample testing, initially for medical applications but equally for any kind of chemical analysis and for environmental, liquid or gas process monitoring.
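The principle is simple enough to sketch: the detector records intensity as a function of optical path difference, and the Fourier transform of that interferogram is the spectrum. The function below is an illustrative sketch of the idea, not the instrument’s actual processing.

```python
import numpy as np

def interferogram_to_spectrum(interferogram, opd_step_m):
    """Recover a spectrum from an interferogram sampled at fixed
    optical-path-difference steps (opd_step_m, in metres)."""
    centred = interferogram - interferogram.mean()   # remove the DC pedestal
    spectrum = np.abs(np.fft.rfft(centred))
    # Bin k corresponds to wavenumber k / (N * opd_step_m), cycles per metre
    wavenumbers = np.fft.rfftfreq(len(centred), d=opd_step_m)
    return wavenumbers, spectrum
```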
The consequences of all this are that interferometric sensors are going to become much more compact and rugged, and much cheaper, bringing them down to the cost of the simple photoelectric systems used today. They are also going to become much smarter, allowing them to make accurate ‘best guesses’ in circumstances too difficult for present-day devices. As for the wider implications, it is probable that future individuals will be unable even to sneeze without some system drawing meaningful conclusions.