Author

Hongwei Tong

Date of Award

8-1995

Degree Type

Thesis

Degree Name

Doctor of Philosophy (PhD)

Department

Chemical Engineering

Supervisor

Professor Cameron M. Crowe

Abstract

Measurements such as flow rates from a chemical process are inherently inaccurate. They are contaminated by random errors and possibly gross errors such as process disturbances, leaks, departure from steady state, and biased instrumentation. These measurements violate conservation laws and other process constraints. The goal of data reconciliation is to resolve the contradictions between the measurements and their constraints, and to process contaminated data into consistent information. Data reconciliation aims at estimating the true values of measured variables, detecting gross errors, and solving for unmeasured variables.
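The thesis itself contains no code; purely as an illustration of the reconciliation step described above, the following sketch adjusts measured flows by the minimum (covariance-weighted) amount needed to satisfy linear balance constraints. The three-stream splitter, the measured values, and the covariance matrix are invented for the example; the projection formula is the standard least-squares solution for linearly constrained reconciliation.

```python
import numpy as np

# Hypothetical splitter: stream 1 divides into streams 2 and 3,
# giving one mass-balance constraint A x = 0 with A = [1, -1, -1].
A = np.array([[1.0, -1.0, -1.0]])

# Measured flow rates (illustrative values with random error).
y = np.array([10.3, 5.8, 4.3])

# Measurement covariance; a general (non-diagonal) Sigma is also allowed.
Sigma = np.diag([0.04, 0.04, 0.04])

# Least-squares reconciliation: project y onto the constraint manifold,
#   x_hat = y - Sigma A^T (A Sigma A^T)^{-1} A y
V = A @ Sigma @ A.T
x_hat = y - Sigma @ A.T @ np.linalg.solve(V, A @ y)

print(x_hat)       # reconciled flow estimates
print(A @ x_hat)   # ~ 0: the balance constraint is now satisfied
```

Note that the raw measurements violate the balance (10.3 − 5.8 − 4.3 = 0.2), while the reconciled estimates close it exactly.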

This thesis presents a modified model of bilinear data reconciliation that can handle any measurement covariance structure, followed by a construction of principal component tests that are sharper at detecting gross errors, and have substantially greater power in correctly identifying them, than the statistical tests currently used in data reconciliation. Sequential Analysis is combined with Principal Component Analysis to provide a procedure for detecting persistent gross errors.
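As a rough sketch of the idea behind principal component tests (the network, measurements, and critical value below are invented for illustration, not taken from the thesis): the constraint residuals r = A y have covariance V = A Σ Aᵀ under the no-gross-error hypothesis, so rotating r into the eigenbasis of V yields uncorrelated standard-normal components, each of which can be tested individually.

```python
import numpy as np

# Hypothetical 4-stream network with two balance constraints (A x = 0):
#   node 1: x1 - x2 - x3 = 0
#   node 2: x3 - x4 = 0
A = np.array([[1.0, -1.0, -1.0,  0.0],
              [0.0,  0.0,  1.0, -1.0]])

Sigma = 0.01 * np.eye(4)               # assumed measurement covariance
y = np.array([10.0, 6.05, 3.95, 4.6])  # stream 4 carries a simulated bias

# Constraint residuals; r ~ N(0, V) if only random errors are present.
r = A @ y
V = A @ Sigma @ A.T

# Principal component tests: rotate r into uncorrelated N(0,1) components.
lam, U = np.linalg.eigh(V)             # V = U diag(lam) U^T
p = (U.T @ r) / np.sqrt(lam)           # each p_i ~ N(0,1) under H0

crit = 1.96                            # illustrative ~5% level per component
flags = np.abs(p) > crit               # flagged components suggest gross error
print(p, flags)
```

The simulated bias in stream 4 inflates the second residual (3.95 − 4.6 = −0.65), so at least one principal component exceeds the critical value and is flagged.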

The concept of zero accumulation is used to determine the applicability of the established linear/bilinear data reconciliation model and algorithms. A two-stage algorithm is presented to detect zero accumulation in the presence of gross errors.

An interesting finding is that the univariate and the maximum power tests can be quite poor in detecting gross errors and can lead to confounding in their identification.
