Data Handling

You can sum the intensities of all ions in each scan and plot the result as a function of time (chromatographic retention time) to produce a total ion chromatogram (TIC), which looks much like the output of a spectrophotometric detector such as a UV detector. In an MS data set, one axis represents ion intensity; the other can be time or the digital sample (scan) taken at a particular time, i.e., a spectrum. You can also display each spectrum separately, much like the series of images acquired by a modern digital video camera, which is, in essence, a series of high-speed still photos.
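The TIC described above can be sketched in a few lines of Python. This is a minimal illustration, assuming each scan is held as a small (m/z, intensity) array; the values and retention times here are made up, not a vendor file format:

```python
import numpy as np

# Hypothetical example: three scans, each a (m/z, intensity) array.
# In practice these would be read from an instrument data file.
scans = [
    np.array([[100.0, 5.0], [150.0, 3.0]]),
    np.array([[100.0, 8.0], [200.0, 2.0]]),
    np.array([[150.0, 1.0]]),
]
retention_times = [0.1, 0.2, 0.3]  # minutes (illustrative)

# Total ion chromatogram: sum all ion intensities in each scan,
# giving one point per retention time.
tic = [scan[:, 1].sum() for scan in scans]
```

Plotting `tic` against `retention_times` gives the familiar chromatogram trace.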

Simple but very useful techniques follow from this data structure: for example, reducing the array of data to a selected ion chromatogram, or applying digital filters to reduce noise, as you could by displaying only the most intense peak of each scan (a base peak ion chromatogram, or BPI).
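Both reductions mentioned above are one-liners over the same scan arrays. A sketch, again with invented data and an assumed m/z tolerance for the selected ion trace:

```python
import numpy as np

# Hypothetical scans: each row is (m/z, intensity).
scans = [
    np.array([[100.0, 5.0], [150.0, 3.0]]),
    np.array([[100.0, 2.0], [200.0, 8.0]]),
]

# Base peak ion chromatogram (BPI): keep only the most
# intense peak of each scan.
bpi = [scan[:, 1].max() for scan in scans]

# Selected ion chromatogram: sum intensity only for ions
# near a target m/z (the 0.5 tolerance is an assumption).
def sic(scans, target_mz, tol=0.5):
    return [scan[np.abs(scan[:, 0] - target_mz) <= tol, 1].sum()
            for scan in scans]
```

Here `bpi` traces the dominant ion in each scan, while `sic(scans, 100.0)` follows a single m/z value through the run.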

Data output, storage and retrieval

Software design has become a separate specialty over the years, not simply a means to set acquisition parameters. Today, operating and data systems permit intricate control of an instrument by its operator.

Significantly, these specialty software packages have evolved:

  • Workflow controls such as open access (also called 'walk-up' systems) - a fully trained operator can make complete LC or GC/MS methods available to a large number of non-specialist users, giving them access to advanced technology without the requirement for extensive training. A non-specialist may need to use an instrument only occasionally, to determine a compound's identity or purity; the system allows that access without their first becoming proficient operators themselves.
  • Data reduction applications - These packages may, for instance, help identify metabolites or develop biomarkers among the thousands of unique chemical entities in complex mixtures. The applications are often augmented by "expert" systems such as principal component analysis (PCA) software, which reveals trends not otherwise visible in the extensive output.
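The PCA mentioned above can be sketched without any specialist package: center the intensity matrix, decompose it, and project each sample onto the top components. The data here is random with an artificial group offset, purely to illustrate the mechanics:

```python
import numpy as np

# Hypothetical data matrix: rows = samples, columns = intensities
# of detected chemical entities (random stand-in data).
rng = np.random.default_rng(0)
X = rng.normal(size=(6, 50))
X[:3] += 2.0  # pretend the first three samples form a distinct group

# Minimal PCA via SVD: center the data, decompose, and keep
# the two directions of greatest variance.
Xc = X - X.mean(axis=0)
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
scores = Xc @ Vt[:2].T  # each sample reduced to two coordinates
```

Plotting the two columns of `scores` against each other is the usual way such trends become visible; samples that cluster together behave similarly across all 50 measured entities.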

The demands of data management are fast outstripping the ability to meet them. High-resolution, mass-accurate instruments can generate data at a prodigious 1 GB/h. Such enormous quantities of data are generated not only by life science investigators but, increasingly, by those working in industries that depend on high-volume processes like characterizing the presence of metabolites and their biotransformations. After 180 days of operation, five mass spectrometers, each producing 24 GB of data per day, will present you with the need to store, retrieve, sort, and otherwise make sense of 21.6 terabytes (TB).
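The arithmetic behind that estimate is worth making explicit, since the same three numbers (instrument count, daily output, days of operation) let you size storage for any lab:

```python
# Worked version of the storage estimate in the text:
# five instruments, 24 GB/day each, for 180 days.
instruments = 5
gb_per_day = 24
days = 180

total_gb = instruments * gb_per_day * days  # 21,600 GB
total_tb = total_gb / 1000                  # decimal TB, as in the text
```

Swapping in your own instrument count and per-day figure gives a first-order capacity plan; at the 1 GB/h rate cited above, a single continuously running instrument alone approaches 24 GB/day.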

The first question in any data scenario must address what we intend to do with the data we collect. Unlike e-mail, which imparts its message and thereafter serves little purpose, online data increases in value over time as biological, pharmaceutical, and physicochemical measurements continue to amass within a data file. But this increase in value comes with the cost of ensuring the data's accessibility. In view of the increasing size of data files, and the length of time over which they must be accessed, a solution might include some form of hierarchical storage management. Thus, some smaller percentage of the data would be immediately accessible, or "active," while the remainder, in successive stages, is in-process or earmarked for long-term archiving.
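One way to picture hierarchical storage management is as a rule that maps a file's age to a tier. The thresholds below are assumptions for illustration, not values from the text:

```python
from datetime import date, timedelta

def storage_tier(acquired: date, today: date) -> str:
    """Assign a data file to a storage tier by age (illustrative rule)."""
    age = (today - acquired).days
    if age <= 30:
        return "active"    # immediately accessible, fast media
    if age <= 180:
        return "nearline"  # in-process, slower or cheaper media
    return "archive"       # earmarked for long-term storage

today = date(2008, 6, 1)
recent = storage_tier(today - timedelta(days=10), today)   # "active"
older = storage_tier(today - timedelta(days=400), today)   # "archive"
```

Real hierarchical storage systems add migration, indexing, and retrieval on demand, but the core idea is this staged demotion of aging data.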

See MS - The Practical Art, LCGC:

  • Profiles in Practice Series: The High Speed State of Information and Data Management, Vol. 23, No. 6, June 2005
    • Why this is important: As data output becomes more complex and voluminous, archiving and retrieving and structured storage emerge as critical issues.

  • Hardware and software challenges for the near future: Structure elucidation concepts via hyphenated chromatographic techniques, Vol. 26, No. 2, February 2008
    • Why this is important: The amount of data being developed by modern experiments, which often include MS and orthogonal or hyphenated analytical systems, is discussed.


