Computational processing and error reduction strategies for standardized quantitative data in biological networks

FEBS J. 2005 Dec;272(24):6400-11. doi: 10.1111/j.1742-4658.2005.05037.x.

Abstract

High-quality quantitative data generated under standardized conditions are critical for understanding dynamic cellular processes. We report strategies for error reduction and algorithms for automated data processing, establishing the widely used techniques of immunoprecipitation and immunoblotting as highly precise methods for quantifying protein levels and modifications. To determine the stoichiometry of cellular components and to ensure comparability of experiments, relative signals are converted to absolute values. A major source of error in blotting techniques is inhomogeneity of the gel and of the transfer procedure, which leads to correlated errors. These correlations are prevented by randomized gel loading, which significantly reduces standard deviations. Further error reduction is achieved by using housekeeping proteins as normalizers, or by adding purified proteins as calibrators in immunoprecipitations in combination with criteria-based normalization. Additionally, we developed a computational tool for automated normalization, validation and integration of data derived from multiple immunoblots. In this way, large sets of quantitative data for dynamic pathway modeling can be generated, enabling the identification of systems properties and the prediction of targets for efficient intervention.
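To illustrate the two error-reduction ideas named above, here is a minimal Python sketch. It is not the authors' actual tool: the function names, the spiked-in calibrator values and the example intensities are hypothetical assumptions chosen only to show the principle of randomized lane assignment (decorrelating gel/transfer inhomogeneities from the time course) and per-lane calibrator normalization.

import random

# Hypothetical time course (minutes); values below are illustrative only.
time_points = [0, 5, 10, 20, 40, 60]

def randomized_loading_order(samples, seed=1):
    """Shuffle the lane assignment so that gel and transfer
    inhomogeneities are no longer correlated with the time course."""
    order = list(samples)
    random.Random(seed).shuffle(order)
    return order

def normalize_to_calibrator(signals, calibrator_signals):
    """Divide each lane's target signal by the calibrator signal
    measured in the same lane (e.g. a purified protein added to an
    immunoprecipitation), making lanes and blots comparable."""
    return [s / c for s, c in zip(signals, calibrator_signals)]

lane_order = randomized_loading_order(time_points)
raw = [1.00, 1.35, 2.10, 1.80, 1.40, 1.10]         # target band intensities
calibrator = [0.95, 1.05, 1.10, 0.90, 1.00, 1.02]  # calibrator band intensities
print(lane_order)
print([round(v, 2) for v in normalize_to_calibrator(raw, calibrator)])

The same per-lane division would apply with a housekeeping protein in place of the spiked-in calibrator; the abstract's criteria-based normalization and multi-blot integration are not reproduced here.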

Publication types

  • Research Support, Non-U.S. Gov't

MeSH terms

  • Algorithms
  • Computational Biology / methods
  • Computational Biology / standards*
  • Electronic Data Processing
  • Immunoblotting
  • Immunologic Techniques
  • Immunoprecipitation
  • Proteins / analysis
  • Reference Standards

Substances

  • Proteins