Software-driven instrumentation: the new wave



M. L. Salit, Department of Chemistry, Arizona State University, Tempe, Ariz. 85287
M. L. Parsons, Los Alamos National Laboratory, CHM-1, MS G-740, Los Alamos, N.M. 87545

Recent advances in computing technology (1-3) and related drops in the cost of computing capability have spawned a new generation of analytical instrumentation. This instrumentation makes measurements that demand a computer as an integral part of control, data acquisition, or data reduction. The symbiosis of the computer and measurement in these systems has developed into what might be termed software-driven instrumentation.

Depicted in Figure 1 is a generic instrument. The goal of this instrument is to make an observation of the phenomenon of interest. This phenomenon may occur under controlled conditions, with feedback through the control and sensor hardware, through the measurement hardware, to the software drivers that structure the experiment. Although the modular boundaries of such systems vary, the functionality of the modules is distinct.

This instrumentation offers the analyst a tool that is capable of making measurements otherwise impossible or prohibitively expensive to make. It is a flexible tool that can be customized to specific tasks through modification of the software instructions that drive the processing, control, and data acquisition hardware. New computing technology has made the shaping of our software-driven tool easier than ever, allowing the nature of the measurement to be defined by software. The instrumentation for the measurement can be fixed, whereas a software "skeleton" defines the way in which the instrument components behave together. This permits radical changes in the experiment while minimizing costly changes in the hardware.

Another important area in which software-driven measurement systems have found acceptance is in environments that call for versatility of computer control, data acquisition, and data treatment, although the measurement system by its nature does not demand it.

0003-2700/85/0357-715A$01.50/0 © 1985 American Chemical Society

Figure 1. A generic instrument

The features offered by such a system greatly enhance the utility of standard analytical instrumentation. This type of application might be a network of instruments integrated into a laboratory information management system (LIMS). Each instrument need not be a software-driven instrument, but computerization allows the advantages of a cohesive, sophisticated sample-tracking and report-generating system to be effected. Generally the computerization of an instrument creates an easy-to-use measurement system, eliminating the need for the attention of a skilled analyst and reducing the time demands

ANALYTICAL CHEMISTRY, VOL. 57, NO. 6, MAY 1985

715A

on a technician. Data reduction and data archiving are simplified and made more accurate with the calculating power of the computer, liberating more analyst time. Often the sample throughput of a measurement system can be dramatically increased through the use of computers in a time-consuming phase of the measurement, be it measurement time, instrument preparation, or data processing. The reliability of measurements obtained from such a system will of course be improved by eliminating the human element in the measurement process.

The history of software-driven instrumentation stems from the time that the first minicomputers became available for dedication to instrumentation in a laboratory environment. These computers were expensive, difficult to program, limited in memory and storage, and, perhaps most important, limited in their ability to handle real-world input and output (I/O). Design sacrifices were necessary to make the instrumentation conform to the capabilities and limitations of the computer. Despite these limitations, the computer offered a far more efficient method for precise control, data acquisition, and processing than manual or analog electronic methods, and analytical chemists took advantage of these improvements.

There have been refinements in processing capability, memory integration, and I/O capacity of hardware. The level of integration (functionality per component) of computer ICs is such that today’s most complex computers may have a lower part count than their simpler predecessors. This decreases the complexity and cost of the hardware system and enhances its speed and reliability. The new processors not only process more information in less time, but they often support sophisticated architectures or programming structures to enhance capability and programmability. This enhanced speed and programmability not only make it easier to do the things that were done with the earlier technology, but open up applications that were previously impossible. The new memories not only quadruple the amount of information stored on a chip but are smaller and consume less power. Inexpensive, intelligent peripherals allow system input or output in many forms without burdening the processor or other resources. New measurement schemes have been envisioned in many areas as a result of the availability of this new technology. The high level of control over real-world parameters affecting the phenomena to be observed offers precision otherwise unobtainable, and

the data acquisition and timing capabilities offer the analyst unique flexibility in determining how much data to obtain and when to obtain it. The speed of data acquisition hardware is such that extremely fast, transient physical phenomena can be "stroboscopically" stopped in time, allowing for direct observation of a signal of interest while often discriminating against other signals. These abilities have resulted in new techniques in such areas as atomic, molecular, and mass spectrometry; surface analysis; polymer characterization; and electrochemistry. Today's data-processing capabilities, in both hardware and software, offer the analyst the opportunity to apply sophisticated data reduction to masses of data too large to be treated manually. New techniques have grown out of the ability to use pattern recognition on difficult-to-understand data. Useful information can often be recognized and quantified by using chemometric techniques to reduce large, multivariate data sets to their meaningful relationships. This approach is vital when it is the nature of the analytical probe to disorganize the sample, as in a combustion or pyrolysis experiment. Trends in the laboratory are toward automation of sample handling and


measurement through both fixed (autosamplers, autotitrators) and flexible (robotic) automation systems. Integration of automation systems with instrumentation is most effectively achieved when the experiment is under the control of a computer; the software drivers can be optimized to perform in the required manner for coordination with the automation system. In addition to the measurement advantages, software-driven instrumentation incorporates features brought to analytical instrumentation from computing, ranging from self-diagnostics to expert systems for automatic tuning or optimization to software integration, the ability of programs to share data and offer the user a consistent environment in which to operate on that data. Software integration in the laboratory is a requirement for implementation of a widely accepted, flexible, and efficient LIMS. The electronic laboratory is an achievable concept, with a sample entering the laboratory and being tracked throughout several analyses, the data collated and analyzed, and a report generated without manual data handling. Instrumentation developers are incorporating these sophisticated computing concepts and tools during the design phases of their projects. This

allows for optimization of design, tailoring the computing strengths and limitations to the particular needs of the instrumentation. In many cases the tolerance of the physical measurement apparatus can be significantly relaxed by using feedback techniques in the measurement and control systems. To take full advantage of such features of a computerized instrument, the design of the measurement apparatus, the measurement hardware, and the software drivers must be concurrent and integrated. It is the integration of these three areas of a computerized instrument that allows full exploitation of the capabilities of all subsystems. With tools currently available to the developer, this is a feasible, cost-effective approach with benefits to all phases of the project.

Architecture

Developments in computing have kept the developers of analytical instrumentation actively upgrading their systems (2, 3). For example, an architecture that is having a major impact on software-driven instrumentation is distributed processing, a system in which multiple processors work in cooperation to effect a solution to a problem. This configuration offers increased speed (by sharing tasks), lower cost (several microprocessors are

less costly than a minicomputer), increased modularity of both hardware and software (offering greater reliability and easier modification), and increased standardization in the laboratory. Typically these multiple-processor systems are segregated into host and slave computers, with specific tasks assigned to each. Host systems generally encompass the user interface, data processing, and system output functions. These computers range from the popular personal computers (PCs) to highly specialized machines tailored to the management of the instrument. It is the task of the host computer to provide control of the slave system, to provide mass storage for archiving data, and to handle the processing and reporting of analytical results. A host computer need not be dedicated to a particular instrument; it can function as a general-purpose computer, at the disposal of laboratory personnel to improve productivity. Specialized host systems may use array processors for performing Fourier transforms at high speed, floating-point calculation hardware, and image digitizers for the acquisition of video data. Slave computers are dedicated to the instrumentation, providing control and data acquisition capability. These computers either control or directly





Figure 2. A typical distributed-processing environment

interface with the measurement electronics. They contain very simple user interfaces (start-stop switches or indicators), if any, and appear to the analyst as black boxes. The level of programmability of the slave system varies from none (the control of the slave is from instructions in read-only memory, ROM) to complex programs reacting to feedback from external inputs, programmed from a host system with instructions compiled at the slave system. The slave may "buffer" or hold the data it collects in random access memory (RAM), freeing the host from I/O during the experiment, and may preprocess data according to algorithms in ROM or downloaded to program RAM, again offloading a task from the host. A typical distributed-processing measurement system is depicted in Figure 2. Note the slave's capability for independent action once it has been programmed by the host. This configuration allows a single host to service multiple slave computers. Slaves can be generic; they can be moved from instrument to instrument or from one technique to another with only a change in the downloaded software. This provides an easy upgrade path for the development of a LIMS in a laboratory, permitting slave measurement systems to be added as needs arise and resources become available.
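The host-slave division of labor described above can be sketched in a few lines. This is an illustrative model only: the class names, the dictionary-based "downloaded program," and the sensor callback are hypothetical stand-ins, not any real instrument's interface.

```python
class GenericSlave:
    """A hypothetical 'generic' slave: its behavior is fixed only by the
    program the host downloads, so the same unit can serve different
    instruments or techniques."""
    def __init__(self):
        self.program = None
        self.buffer = []          # RAM buffer frees the host from I/O mid-run

    def download(self, program):
        self.program = program    # e.g. point count and preprocessing step

    def run(self, read_sensor):
        # Acquire and preprocess independently of the host.
        self.buffer = [self.program["preprocess"](read_sensor())
                       for _ in range(self.program["n_points"])]

class Host:
    """Host: programs slaves, then collects and archives their buffers."""
    def __init__(self):
        self.archive = {}

    def collect(self, name, slave):
        self.archive[name] = slave.buffer   # one bulk transfer per run

# Usage: a host programs a slave, lets it run unattended, then collects.
host = Host()
slave = GenericSlave()
slave.download({"n_points": 100, "preprocess": lambda v: v * 2.0})
slave.run(lambda: 1.0)                      # stand-in sensor readout
host.collect("furnace", slave)
```

Because the slave holds its own buffer, a single host can service several such slaves, polling each for its accumulated data between experiments.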


Hardware

Different measurement systems have different requirements for processing capability. The range of microprocessors available today offers performance from inexpensive (less than $5) eight-bit chips to highly integrated

16/32-bit microcomputers (several hundred dollars). The nature of the application defines the level of sophistication required by the processor to meet certain performance criteria. A decision on a given architecture and processor is, therefore, a necessary and important segment of the design phase of the measurement system. Availability of software development capability for a given processor may be as valid a reason for choosing a processor as its performance. Compatibility with installed equipment, cost, availability, and compatibility with peripheral devices are all important factors in the selection of a processor. Often an elegant solution to a measurement problem can be implemented with a simple processor, with lower development overhead than with more complex chips.

The purpose of a measurement system is to provide control outputs to a phenomenon and to input information about the phenomenon for further analysis. Clearly the I/O capability of the computer system is critical in a measurement environment. Today's I/O capabilities are impressive, and it can be presumed that trends to increase the amount of information that can be gathered and the speed at which it is transferred will continue. For example, developments in imaging technology have been exciting; the first commercial example in analytical instrumentation is linear photodiode arrays for the multiplexed acquisition of UV-VIS spectra. This technology extends to 512 × 320-pixel imaging devices, which demand high data transfer rates for the large amount of information (160 Kbits) to be passed and processed for each image.
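The I/O load such an imager implies is easy to estimate. In the sketch below, the per-pixel depth and frame rate are illustrative assumptions, not specifications from the text; the quoted 160-Kbit figure corresponds to one bit per pixel.

```python
# Back-of-the-envelope I/O load for a 512 x 320-pixel imaging device.
pixels = 512 * 320                      # 163,840 pixels per frame
bits_per_pixel = 1                      # assumption matching ~160 Kbits/frame
bits_per_frame = pixels * bits_per_pixel
frames_per_second = 30                  # assumed video-rate acquisition
transfer_rate = bits_per_frame * frames_per_second  # bits/s the bus must move

print(pixels, bits_per_frame // 1024, transfer_rate)
```

At greater bit depths or frame rates the required transfer rate scales proportionally, which is why such devices burden the processor unless intelligent peripherals handle the transfer.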


Software

An important current trend in software involves making computer use transparent to the operator. New systems architectures support this trend, incorporating sophisticated user interfaces (pointing devices, pull-down menus, integrated environments) and high-level support for the control and programming of these interfaces. Additional features included in the instrumentation software package may include limited data base capabilities, report generation capability, network support, and integration with standard PC software. This integration will provide a key link between the electronic laboratory and the electronic office. Instrumentation software packages have been affected by these trends; no longer is the software coded in native assembly language and proprietary development languages, a practice that results in software that is difficult to maintain and almost impossible to integrate with other packages. New high-level languages are available with good structure and efficiency suitable for the creation of instrument software packages. Time-critical routines are often still coded in native assembly language for efficiency, but the bulk of the software is written in high-level languages. This has significantly lowered the overhead involved in the development of software-driven instrumentation in addition to increasing the complexity, capability, and flexibility of the package.


Instruments

Examples of software-driven instrumentation include a wavelength-modulated continuum source multielement atomic absorption spectrometry (AAS) system (4, 5), a commercially available Fourier transform infrared (FT-IR) spectrometer (6) that breaks new ground in qualitative and quantitative IR measurements through the use of innovative software, and a pyrolysis gas chromatograph with parallel mass spectral and flame ionization detection for the characterization of polymer systems (7, 8).

The multielement AAS system uses a high-resolution echelle spectrometer, a high-pressure xenon arc lamp continuum source for excitation over a large spectral range (200-600 nm), a refractor plate mounted on a computer-controlled galvanometric torque motor for wavelength modulation, and a high-speed analog-to-digital (A/D) converter for the synchronous measurement of 16 absorbances (Figure 3). The nature of the measurement scheme (the relationship between data acquisition and wavelength modulation) requires a computer system to implement.
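The coordination between wavelength modulation and data acquisition might be organized as in the sketch below: step the refractor plate through a waveform via a D/A converter, wait for the motor to settle, then read all 16 channels. The function names, waveform values, and settling time are hypothetical stand-ins for real driver calls, not the actual system's interface.

```python
import time

WAVEFORM = [0.0, 0.5, 1.0, 0.5, 0.0]    # illustrative modulation waveform
SETTLE_S = 0.001                        # assumed motor settling time

def set_dac(value):
    pass                                # stand-in: position the refractor plate

def read_adc_channel(channel):
    return 1.0                          # stand-in: multiplexed A/D read

def acquire_cycle():
    """One modulation cycle: 5 plate positions x 16 channels of intensity."""
    cycle = []
    for setpoint in WAVEFORM:
        set_dac(setpoint)               # move to the next wavelength
        time.sleep(SETTLE_S)            # acquire only after motor settling
        cycle.append([read_adc_channel(ch) for ch in range(16)])
    return cycle

data = acquire_cycle()                  # 5 x 16 intensity readings
```

In the real instrument this loop runs under a timed interrupt rather than a sleep, but the ordering (position, settle, read synchronously) is the essential constraint that demands computer control.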


Figure 3. A multielement atomic absorption spectrometer (PMT = photomultiplier tube). Labeled components include the xenon arc continuum source, oscillating refractor plate, multielement cassette, and PMT array

A five-step waveform is used to drive the torque motor, allowing data acquisition to be performed at five wavelengths across the absorbance profile. These wavelengths correspond to two background intensity measurements, two off-center intensity measurements, and a line center intensity measurement. The capability to measure the intensity at an off-line wavelength permits extension of the dynamic range of an absorbance measurement to higher concentration through the calculation of a secondary, less sensitive absorbance. The background intensity measurements allow for dynamic background correction, a technique effective in discriminating against the nonspecific absorbance commonly observed in the graphite furnace atomizer. To implement this measurement scheme a timed, interrupt-driven control architecture is established, with digital-to-analog conversion from a waveform array to control the refractor plate position (wavelength) and high-speed multiplexed A/D conversion being performed after motor settling. The resultant array of multielement intensity data (~192 Kbytes) is


then converted to arrays of absorbance data, one per element, an approximately 3-min task on a PDP 11/23 with a hardware floating-point processor. Such a measurement scheme and data reduction method are far too unwieldy to implement in an analog fashion.

The FT-IR system with vector-based software from Beckman Instruments is one of the new generation of low-cost Michelson-interferometer-based instruments. This system uses a distributed-processing architecture to implement control and data acquisition from the interferometer, with a processor devoted to managing the sophisticated user interface (light-pen-driven graphics and analysis) and data processing. Specialized hardware permits rapid transforms of the data between the frequency and time domains as well as between the absorbance and transmittance domains. The phenomenal growth in FT-IR instrumentation is directly attributable to the availability of suitable, low-cost, computer-driven measurement systems. The control requirements for the mirror drive are exacting, as is the data acquisition timing.
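The reduction of the five AAS intensity measurements described above can be sketched as follows. This is illustrative only: averaging the bracketing background and wing points is an assumption for the sketch, not the published algorithm, and the function name is hypothetical.

```python
import math

def absorbances(i_bg1, i_wing1, i_center, i_wing2, i_bg2):
    """Sketch of a five-point reduction: two background points bracket the
    line and provide a dynamic background estimate, the line-center point
    gives the primary absorbance, and the two wing points give a secondary,
    less sensitive absorbance that extends the dynamic range."""
    i0 = 0.5 * (i_bg1 + i_bg2)              # dynamic background estimate
    a_primary = math.log10(i0 / i_center)   # line-center absorbance
    i_wing = 0.5 * (i_wing1 + i_wing2)
    a_secondary = math.log10(i0 / i_wing)   # less sensitive, wider range
    return a_primary, a_secondary

# With 90% absorption at line center and 50% at the wings:
a_pri, a_sec = absorbances(1.0, 0.5, 0.1, 0.5, 1.0)
```

Because the background is re-estimated from the same modulation cycle as the line measurement, nonspecific (broadband) absorbance is subtracted dynamically rather than from a separate blank run.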
