Time-resolved spectroscopy is changing materials analysis to the point where the spectrometer is more sensor than camera, and where visible light is no longer the only wavelength range under study. Advances in detectors, in lasers and their pulse rates, and in timing components and their associated electronics all mean that researchers can give time-resolved spectroscopy more serious consideration.
‘Time-resolved spectrometry comes traditionally from the life sciences area, but we see it also a lot for online or inline process monitoring where you need to have a very fast feedback signal acting more like a sensor,’ says Benno Oderkerk, chief executive officer at Avantes.
The way to get sensor-like behaviour from a spectrometer is to avoid using all the pixels or wavelengths on offer, according to Oderkerk. ‘You don’t need all of the pixels or wavelengths of the spectrometer, but you just look at a few specific wavelengths. You need very fast measurement,’ he says. The idea is to process a few pixels at very high speed, and this can be done in different ways. ‘It can be monitored in a very short time like a burst and you store the data in the spectrometer and read it out later on when you have time enough to look at certain phenomena, such as a flash or a spark. It could also be that you monitor in a continuous environment, in a process control-type application.’
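Oderkerk’s sensor-style approach can be illustrated in a few lines. The sketch below is purely illustrative, not Avantes’ firmware: the pixel count, wavelength range, and channel-selection helper are all assumptions.

```python
import numpy as np

# Hypothetical sketch of sensor-style monitoring: instead of handling a full
# 2,048-pixel spectrum, keep only a few wavelength channels of interest.
rng = np.random.default_rng(0)

n_pixels = 2048                                     # assumed detector size
wavelengths = np.linspace(200.0, 1100.0, n_pixels)  # nm, illustrative range

# Wavelengths we actually care about, e.g. an emission line plus references
targets_nm = [589.0, 650.0, 700.0]
channels = [int(np.argmin(np.abs(wavelengths - t))) for t in targets_nm]

def read_channels(frame, channels):
    """Extract only the channels of interest from one full detector frame."""
    return frame[channels]

# Simulated burst of 1,000 frames: only three values are kept per frame,
# so the data stored (or transferred) shrinks from 2,048 points to 3.
burst = rng.normal(1000.0, 30.0, size=(1000, n_pixels))
monitored = np.array([read_channels(f, channels) for f in burst])
print(monitored.shape)  # (1000, 3)
```

Whether the selection happens on the host or, as Oderkerk describes, inside the spectrometer itself, the principle is the same: fewer data points per frame means faster feedback.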
For continuous measurements, a spectrometer with no moving parts has the advantage: it uses CCDs or photodiode arrays, which can operate at very high speed. With nothing that needs to rotate and a detector array that can be read out quickly, acquisition time can be brought down to 60 microseconds for a 128-pixel spectrum, rather than reading more pixels that take longer to process.
Because a select number of pixels is all that is needed for many applications, the processing power of computers is not the obstacle. Instead, it is all about transferring the relevant data points fast enough. ‘As long as you can transfer the relevant data points quickly enough to the computer or process it in the spectrometer itself you don’t need to transfer all of the data,’ says Oderkerk.
He explains that this is the philosophy Avantes follows. It has developed small spectrometers with enough processing power to carry out this high-speed processing onboard the device. ‘You can get more like a smart sensor approach than a spectrometer approach,’ says Oderkerk. But there is a downside to the faster computing. ‘What you see in general as things get faster is more power consumption, and heat dissipation is an issue. The faster things go, the more power they consume. You can feel it with your iPhone; the iPhone is faster, but they are warmer. There is something to do with power dissipation and speed.’ However, Avantes’ devices are power limited because they take power from the USB port. ‘We take the power from the USB port and that is only 500 milliamps, so that is a limiting factor,’ says Oderkerk.
With such power limitations, pre-processing is important. ‘If you don’t need to have many data points to start with, you should reduce the data as quickly as possible, on the sensor itself or the electronics that run on the spectrometer, thus reducing the data that needs to be processed.’ He gives the example of reducing 2,000 pixels to perhaps 250 or 300; transferring that smaller number of pixels is much faster.
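The kind of reduction Oderkerk describes, from 2,000 pixels down to 250 or 300, could be as simple as averaging adjacent pixels before transfer. A minimal sketch, with the bin factor chosen only to match his numbers:

```python
import numpy as np

def bin_spectrum(spectrum, factor):
    """Average each run of `factor` adjacent pixels into one value,
    trimming any remainder that does not fill a whole bin."""
    n = (len(spectrum) // factor) * factor
    return spectrum[:n].reshape(-1, factor).mean(axis=1)

full = np.linspace(0.0, 1.0, 2000)   # stand-in for a 2,000-pixel frame
reduced = bin_spectrum(full, 8)      # 2000 / 8 = 250 points to transfer
print(len(reduced))  # 250
```

Binning like this trades spectral resolution for an eight-fold cut in the data that must cross the power-limited USB link.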
You don’t get much faster than picoseconds and at Edinburgh Instruments they use a technique called time-correlated single photon counting. ‘We use that technique to measure fluorescence. With the lasers and detectors available, this allows us to measure phenomena down to 5ps and that limit is pretty much fixed by the fastest detectors available,’ says Roger Fenske, operations manager at Edinburgh Instruments.
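In outline, time-correlated single photon counting records the delay between an excitation pulse and the arrival of a single emitted photon, builds a histogram of many such delays, and reads the fluorescence lifetime off the resulting decay curve. The simulation below is purely illustrative (synthetic arrival times, with bin width, time window, and fit threshold all assumed), not Edinburgh Instruments’ implementation:

```python
import numpy as np

rng = np.random.default_rng(42)

# Synthetic TCSPC data: one photon delay per excitation pulse, drawn from
# an exponential decay with a 5 ns fluorescence lifetime.
true_lifetime_ns = 5.0
arrivals_ns = rng.exponential(true_lifetime_ns, size=100_000)

# Histogram the delays (25 ps bins over a 50 ns window, both assumed).
bin_width_ns = 0.025
edges = np.arange(0.0, 50.0, bin_width_ns)
counts, _ = np.histogram(arrivals_ns, bins=edges)

# For a mono-exponential decay, log(counts) falls linearly with time, so a
# straight-line fit over well-populated bins recovers the lifetime.
centres = edges[:-1] + bin_width_ns / 2
mask = counts > 50
slope, _ = np.polyfit(centres[mask], np.log(counts[mask]), 1)
lifetime_ns = -1.0 / slope
print(f"estimated lifetime: {lifetime_ns:.2f} ns")  # close to the true 5 ns
```

In a real instrument the histogram comes from timing electronics rather than a random-number generator, and the fit must also account for the instrument response function, but the decay-to-lifetime logic is the same.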
While 5ps is the technological limit, there are now detectors available that are not quite as fast, measuring lifetimes of 10-15ps, but that are significantly cheaper. ‘There’s also a movement towards the infrared; new detectors that allow fast measurements into the infrared have allowed picosecond measurements up to 1.7μm.’
Fenske explains that, ten years ago, only a few instruments able to detect at up to 1.7μm would be sold per year; now one or two are sold every month. Fenske is also seeing more and more applications using the infrared. In his experience, the areas of interest to scientists have been shifting towards the IR, where maybe 10-15 years ago the emphasis was on the visible spectrum. Now the emphasis is on the IR and near-IR and related materials.
‘A lot of it is material research. In the infrared an example would be solar accelerators,’ says Fenske. A solar accelerator is a layer of a solar cell that will capture wavelengths that silicon doesn’t absorb and emit the same light at a wavelength absorbed by silicon. ‘The ability to measure the lifetime [of this process] allows you to see what is happening at the quantum level, because the decay lifetime you measure is equivalent to the probability of the [solar accelerator] material relaxing back to the ground state.’
Other examples Fenske gives are biochemistry and nanoparticles research. Here the lifetime measurement aspect of time-resolved spectroscopy adds another dimension to what can be recorded about the processes at hand. ‘It allows you to see what is happening to the molecule and its surrounding area and this has many applications; to see the interaction between drugs and proteins.’
While nanoparticles may not seem to be related to biochemistry, Fenske explains that a nanoparticle acts in the same way as a single molecule. ‘A lot of people are measuring all sorts of nanoparticles from gold to copper, organic [molecules] and particle clusters,’ says Fenske.
Faster detection and a wider range of applications is all well and good, but accuracy still has to be maintained. Uwe Ortmann, head of marketing at PicoQuant, comments: ‘The time resolution opens new accuracy levels. What is key to that is the electronics. We just brought out timing electronics that use the USB 3.0 transfer port, and this gives almost a factor of 10 higher photon throughput.’ Instead of 10 mega counts per second, PicoQuant’s product can deliver close to 150 mega counts per second to the computer, widening the potential applications for spectroscopy.
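A back-of-envelope calculation shows why the move to USB 3.0 matters; the 4-byte event record size here is an assumption for illustration, not a PicoQuant specification:

```python
# Assumed 32-bit (4-byte) timestamp record per photon event, for
# illustration only.
events_per_s = 150e6          # ~150 mega counts per second
bytes_per_event = 4
gbit_per_s = events_per_s * bytes_per_event * 8 / 1e9
print(gbit_per_s)  # 4.8

# ~4.8 Gbit/s sits near USB 3.0's 5 Gbit/s signalling rate and far beyond
# USB 2.0's 480 Mbit/s, which could carry only a fraction of those events.
```

Under these assumptions, the interface, not the detector, is what caps the count rate delivered to the computer.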
Parallelisation of the measurement and timing devices is another area that PicoQuant is working on. The use of multiple timing devices, linked through a router to the laser, is considered another key to faster throughput. But the router can be a bottleneck. ‘To make this really parallel, it is getting really tricky,’ says Ortmann. ‘We are also parallelising the timing modules. For example, right now, the best module is eight timing devices plus one sync for the laser. And this could be multiples of what you want. Parallelisation on the timing electronics is also key to faster throughput.’
Another factor is the speed of the source light. Lasers are pulsing faster and getting cheaper. ‘Now maybe you have multiple diode lasers instead of one laser. It allows you to measure multiple colours at the same time with the same sample, or multiple wells with an array of diode lasers. In this way, it is a big advantage to make faster measurements,’ says Ortmann.
Ortmann is not as confident as others about the infrared’s use in time-resolved spectroscopy. While he recognises that the lasers and timing electronics are there to do the job, he sees few detectors, and those that do exist are not user-friendly for system builders. ‘It is an important spectral range, not for biochemistry though, but for material sciences,’ says Ortmann. He sees UV excitation as a more commercial wavelength, and PicoQuant has a 266nm laser available, able to do lifetime imaging. ‘It is also partly driven by detector development; for two years there have been very sensitive UV detectors in this area.’
Another development Ortmann sees for time-resolved measurement is that it is becoming easier. ‘Ten years ago people were afraid of the complexity of the measurement and now we have machines that are driven by measurement wizards, so the logic of doing the measurement is built into the machine and it is much simpler to do those measurements. You don’t have to think about operating the device.’ According to Ortmann this gives users the freedom and time to think about how to analyse the data that they are getting and not worry about how to collate it.