Automated Electrocardiography and Arrhythmia Monitoring
Janice M. Jenkins
From the Department of Electrical and Computer Engineering, The University of Michigan, Ann Arbor. Reprint requests should be addressed to Janice M. Jenkins, Department of Electrical and Computer Engineering, The University of Michigan, Ann Arbor, MI 48109.

COMPUTER PROCESSING of the electrocardiographic signal (ECG) began over two decades ago when Dr. Hubert V. Pipberger undertook a project for the Veterans Administration in which he employed a digital computer for the automated detection of ECG waveforms. The original attempt at automation merely delineated and measured the P, QRS, and T waves, detecting the onset and termination of each wave, and measured intervals between waves. Contour analysis of the waveforms followed and, as this technique became more highly developed, Pipberger and others began to apply decision tree logic to the results in order to arrive at a specific diagnostic interpretation. At a later stage, second generation programs were designed that employed statistical methods for diagnosis. Clinical implementation of computerized electrocardiography occurred in the early 1970s and has continued to develop at a rapid rate.

Computerized electrocardiography falls into two broad categories: computer-assisted interpretation of the diagnostic ECG and computer monitoring of cardiac arrhythmias. In the first category, pattern recognition techniques are applied to an ECG signal that has been previously acquired and stored in a digital computer for examination at length. In arrhythmia monitoring the dynamic ECG signal is analyzed on-line, as in coronary intensive care monitoring, or long-term recordings are processed at speeds faster than real time, as in Holter analysis. In both categories, a feature extraction stage detects waveforms, determines boundaries, examines morphology, computes amplitudes and durations, and measures interwave intervals. A contour-analysis stage applies clinical criteria to these measurements to arrive at a diagnostic classification, and a contextual string is then examined for rhythm analysis.

This paper will describe the data acquisition and signal processing techniques that have been applied to computer-assisted electrocardiography since its advent in 1957. Methods for contour analysis and interval measurement will be described, and the application of diagnostic criteria to these results will be examined. Rhythm analysis and serial comparison, which are undergoing further development, will be discussed. A historical review of the evolutionary stages of computerized electrocardiography will be presented, leading to a discussion of the present state of the art and future trends.

Rhythm analysis represents a particularly difficult aspect of computer interpretation. Because the QRS complex is the most easily detected waveform of the ECG, QRS morphology and RR interval measurements constitute the major features for rhythm determination in both the computer-assisted diagnostic ECG and computerized arrhythmia monitoring. Logic exists in most systems for P-wave information to be incorporated into the rhythm decision, but the frequent failure of P-wave detection represents a serious flaw in the accuracy of rhythm interpretation. New techniques for reliable P-wave measurements, such as more optimally located electrodes, particularly those which hold promise for improved arrhythmia classification, will be presented.

The problem of testing and evaluation of existing ECG systems continues to present difficulties and will be examined in light of recommendations that have been advanced. A library of tape recorded arrhythmias has been collected, diagnosed, and annotated by experts to serve as an instrument for the assessment of rhythm monitoring systems, but, to date, no such data exist for testing diagnostic systems.

COMPUTER INTERPRETATION OF THE DIAGNOSTIC ECG

The heart beat is an electrical process; that is, currents flowing within the heart are the cause of the beat itself.¹ This electrical impulse is initiated in the sinoatrial (S-A) node, spreads through the atria, and coincidentally produces an atrial contraction.
The impulse, after traversing the atrial chambers, reaches the atrioventricular (A-V) node, which constitutes a pathway between the electrically insulated upper and lower chambers. The A-V node temporarily inhibits the impulse, after which it is rapidly transmitted via the His-Purkinje network to all regions of the ventricles. The global depolarization of the ventricles causes a synchronous muscular contraction that propels blood into the arteries of the body. These macroscopic events result from ionic currents operating microscopically at the cellular level.²

The electrocardiogram is a recording or graphical representation of the electrical activity of the heart. The electric current distributions within the human body that result from the spontaneous depolarization of the heart can be detected by sensors located at various positions on or within the body. Electrocardiography had its birth in 1903 when Einthoven, a Dutch physiologist, developed a special string galvanometer for recording minute variations of current or electric potential.³ He employed three leads in his electrocardiographic investigations, and required that the galvanometer be connected such that the string be deflected upwards when the "base of the heart was negative with respect to the apex."

There are two fundamental lead systems employed for conventional electrocardiography as well as for computer processing: (1) the standard 12-lead configuration consisting of the Einthoven limb leads (I, II, and III), the augmented leads (aVR, aVL, and aVF), and the precordial leads (V1 through V6); and (2) the 3-lead orthogonal sets such as the Frank leads (X, Y, and Z) or corrected sets (McFee or Schmitt).⁴,⁵

The classical electrocardiogram is plotted sequentially with vertical deflections representing the potential difference between electrodes, and a horizontal axis that represents time. The deflections that are registered on the electrocardiogram are each associated with a particular electrical event in the heart, and the overall tracing provides a wealth of diagnostic clues to cardiac structural and functional abnormalities.

During the two decades of computer-assisted electrocardiography, numerous investigators have studied the accuracy of the 12-lead versus the 3-lead sets. In general it was concluded that neither the 12-lead nor the 3-lead set demonstrated a significant improvement of performance over the other. The orthogonal system, given that it contains the same information as the 12-lead system, offers an advantage for computer analysis in that it effects data reduction by a factor of 4:1.⁴ Nevertheless, at present, systems that employ 12 leads constitute 95% of all computer-assisted ECGs, while 3-lead systems comprise less than 4%. This is probably due to the familiarity most physicians have with the 12-lead set.

Signal Processing

The standard direct-writing electrocardiograph is the primary and most widely used instrument in electrocardiography. The device consists of multiple electrode connections to an amplifier that provides a signal to drive a strip chart recorder or hot stylus recorder. Electrocardiographs are generally one channel or three channel, with switching mechanisms for selecting a particular lead or lead set. To comply with American Heart Association recommendations, an electrocardiograph must have a frequency response of 0.05 Hz to 100 Hz (3 db down) in order to insure accurate reproduction of clinically used measurements such as wave amplitudes and durations. Data acquisition methods for computer processed ECGs include the electrocardiographic instrument as a first stage. The analog signal is detected and amplified, generally with a gain factor of 1000. This transforms the electrocardiographic signal, which falls in the 1 to 10 mV range, into a 1-10 V range for further processing.
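The passband and gain figures cited above can also be mimicked in software when a raw digitized recording must be conditioned before analysis. The following sketch is illustrative only and is not part of any of the systems reviewed here; the filter order, sampling rate, and use of a zero-phase Butterworth filter are assumptions.

```python
# Illustrative sketch: band-limit a digitized ECG to the 0.05-100 Hz passband
# recommended by the American Heart Association and apply a nominal gain of 1000.
import numpy as np
from scipy.signal import butter, filtfilt

def condition_ecg(raw_volts, fs=500.0, band=(0.05, 100.0), gain=1000.0):
    """Return a band-limited, amplified copy of a raw ECG trace (in volts)."""
    # Second-order Butterworth band-pass; filtfilt gives zero phase distortion.
    b, a = butter(2, band, btype="bandpass", fs=fs)
    filtered = filtfilt(b, a, np.asarray(raw_volts, dtype=float))
    return gain * filtered  # a 1-10 mV signal becomes roughly 1-10 V

# Example: one second of a synthetic 1 mV, 5 Hz test signal sampled at 500 Hz
t = np.arange(0, 1.0, 1.0 / 500.0)
print(condition_ecg(0.001 * np.sin(2 * np.pi * 5 * t), fs=500.0)[:5])
```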
A variety of storage media have appeared throughout the years for automated electrocardiography. Early methods included manually determined measurements keypunched into computer cards, storage of the entire ECG in analog form on magnetic tape or on magnetic strips bonded to computer cards, microfilm storage on computer aperture cards, and the most common method in present use, storage in digital form on magnetic disks. The availability of disk packs that contain three million bytes of storage with access times of nanoseconds has made instant retrieval and serial comparison of multiple ECGs a reasonable task.
Since it is not feasible to have all electrocardiograms recorded in a location that is adjacent to the computer facility, the second stage of the system generally consists of magnetic tape recording of the data or telephone transmission to a central computer. The analog signal is frequency modulated and, in the case of telephone transmission, three channels of data are multiplexed onto the standardized carrier frequencies of 1075, 1935, and 2365 Hz. This analog transmission over voice-grade lines can be plagued by noise that badly distorts the original signal. (The signal-to-noise level is about 40 db.) Even in optimal FM transmission the bandwidth of the transmitted signal is limited to 100 Hz due to the frequency response of the electrocardiographic recording device.

Digital transmission has frequently been advanced as an alternative to analog transmission. While this provides an improved signal-to-noise ratio (about 52 db), bandwidth limitations of voice-grade lines have made widespread use of digital transmission of electrocardiograms unfeasible at present. At 12-bit conversion precision and typical 500-Hz sampling rates, each 1-sec segment of 3-channel ECG data would constitute 18,000 bits of data. At baud rates of 2400, which are common, a 1-sec segment would require over 7.5 sec to transmit. At 9600 baud this would be reduced to less than 2 sec but still cannot be achieved in real time. New techniques are emerging that hold promise for rapid development in the direction of improved digital transmission. These include schemes for data compression before transmission, and dedicated transmission lines and transmission links with broader bandwidth capabilities.
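The timing figures quoted above follow directly from the bit count; the short calculation below simply reproduces them, assuming (as the text does) that line throughput in bits per second equals the quoted baud rate and ignoring framing overhead.

```python
# Reproduce the transmission-time arithmetic quoted in the text:
# 3 channels x 500 samples/sec x 12 bits = 18,000 bits per second of ECG.
channels, sample_rate_hz, bits_per_sample = 3, 500, 12
bits_per_second_of_ecg = channels * sample_rate_hz * bits_per_sample
print(bits_per_second_of_ecg)                        # 18000 bits

for line_rate_bps in (2400, 9600):
    seconds_to_send = bits_per_second_of_ecg / line_rate_bps
    print(line_rate_bps, round(seconds_to_send, 2))  # 7.5 sec and 1.88 sec
```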
Prior to any computer processing or analysis, the analog signal must be converted to a digital representation by an analog-to-digital converter. Techniques for direct digital data acquisition are being developed, but at present the analog signal is presented via magnetic tape or telephone transmission to the remote computer site. The electrocardiographic signal (a voltage varying as a function of time) is converted at uniform, preselected time intervals into discrete numerical values representing the magnitude of the signal at each sampling point. There are two components that determine whether the original signal can be accurately represented and reconstructed: the sampling rate and the number of quantizing levels. Typical analog-to-digital (A/D) converters range from 8-bit to 12-bit precision. If the dynamic range of the input signal is 10 mV, an 8-bit A/D converter would have a precision (or maximum quantization error) of 80 μV with respect to electrode potential. A 10-bit converter provides a precision of 20 μV, and a 12-bit converter, a precision of 5 μV.
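The precisions quoted above correspond to one converter step over the full input span. A small check is given below; it assumes (as the figures in the text imply) that the 10 mV dynamic range is bipolar, i.e., a ±10 mV span of 20 mV.

```python
# Quantization step (least significant bit) of an A/D converter, assuming a
# bipolar +/-10 mV input span (20 mV total), which reproduces the 80 uV,
# 20 uV, and 5 uV figures quoted in the text.
span_uv = 20_000  # 20 mV expressed in microvolts

for bits in (8, 10, 12):
    levels = 2 ** bits
    step_uv = span_uv / levels
    print(f"{bits}-bit: {levels} levels, about {step_uv:.0f} uV per step")
# 8-bit: about 78 uV, 10-bit: about 20 uV, 12-bit: about 5 uV
```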
While the major information content of the signal falls within a 0-100 Hz bandwidth, high frequency components are sometimes present. Sampling rates in all of the current computerized ECG systems range from 100 Hz to 1000 Hz. Although a 250-Hz sampling rate may be adequate for an electrocardiographic signal that has already been bandlimited by the electrocardiograph or the limitations of telephone transmission, the Nyquist sampling theorem holds that a sampling rate that is at least twice the highest frequency content in the signal is necessary for accurate reproduction. The American Heart Association recommendations for digitally sampled data are as follows: "As a minimum requirement, reconstruction of the original electrocardiographic waveform with a fidelity comparable to that of a direct writer can be accomplished with equal interval sampling of 500 per second, digitized with a precision of 10 microvolts, referred to electrode potential." Thus an 11-bit A/D converter and a 500 Hz sample rate should be provided as a minimal configuration for accurate computer processing of the electrocardiogram.
The beginning and end of a significant waveform of the electrocardiogram (P, QRS, T) can be defined numerically by the rate of voltage change. This rate of voltage change can be expressed by first differences between consecutive ECG data points. If, for instance, input data are sampled at 500 points per second (a temporal resolution of 2 msec), then the rate of voltage change can be expressed in terms of digital conversion units. Suppose an analog-to-digital converter has a 12-bit precision with an input range of 10 mV. In this case the electrocardiogram can span 4096 conversion units with a resultant resolution of 5 μV. Requiring a voltage change that exceeds a preset threshold (specified in conversion units) is a method commonly employed for locating waveforms within the ECG. Obviously the threshold in conversion units is directly related to sampling rate and A/D precision. The QRS complex is relatively simple to detect by this method, while P and T waves, because of their lower amplitude and slower rate of change, are more difficult to recognize. Figure 1 shows a geometrical representation of a first difference, and Fig. 2 demonstrates the type of measurements that can be obtained through the application of digital differentiation to the electrocardiographic signal.

Fig. 1. Geometrical representation of the first difference: the slope of the curve is approximated by (y(t+Δt) - y(t))/Δt. (Reproduced by permission from Charles C. Thomas Publishers.¹³)

Fig. 2. Examples of points that can be detected on the electrocardiographic signal through the application of digital differentiation: point of steepest positive slope, point of steepest negative slope, maximum, minimum, corner, and zero crossing. (Reproduced by permission from Charles C. Thomas Publishers.¹³)
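A minimal sketch of this thresholding idea is shown below. The threshold value and refractory interval are illustrative assumptions, not the parameters of any particular program described in this review.

```python
# First-difference threshold detection: mark samples where the change between
# consecutive points (in A/D conversion units) exceeds a preset threshold.
# The threshold and refractory period are illustrative values only.
def detect_onsets(samples, threshold=20, refractory=100):
    """samples: ECG in conversion units; returns indices of detected onsets."""
    onsets, last = [], -refractory
    for i in range(1, len(samples)):
        first_difference = samples[i] - samples[i - 1]
        # A large positive or negative step suggests a steep wavefront (QRS).
        if abs(first_difference) > threshold and i - last >= refractory:
            onsets.append(i)
            last = i
    return onsets

# Usage: detect_onsets(digitized_lead) on a lead sampled at 500 Hz.
```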
Pattern Recognition

All computer programs for diagnosis of electrocardiograms contain two major stages: a waveform recognition section which, after detection of significant waves, measures the amplitudes and durations of complexes and the durations of intervals; and a diagnostic section that classifies the ECG into normal or various disease or arrhythmia states. There are two general approaches to pattern recognition of the ECG waveforms. In the first, a cardiologist or group of cardiologists determines the desired pattern of amplitudes and durations that represent normal and abnormal states, and then associates specific combinations of these measurements with certain diagnostic statements. In the second approach, pattern matching is established by mathematical techniques such as cross correlation, or by fitting a mathematical expression such as a Fourier series to the ECG waveform.

Diagnostic Interpretation

After the initial waveform detection, pattern recognition, and measurement algorithms have been applied, the diagnostic stage is entered. There are two major strategies employed to arrive at the diagnostic interpretation: decision tree logic (a deterministic approach), or maximum likelihood (a statistical approach). In the first scheme, measurements fall within or without certain ranges, and Boolean combinations of each of these results determine whether the criteria for a certain diagnostic state are met. As an example, a QRS deflection exceeding a certain value in a V lead might elicit a diagnosis of left ventricular hypertrophy. In the second method, Bayesian statistical techniques are applied, and the outcome, or diagnostic statement, has a probability associated with it. The diagnosis with
the highest probability is selected, and this is determined not only by the indices of the electrocardiographic measurements, but by the prior probability of the condition existing within the population under observation. The results are highly dependent upon a priori probabilities; thus a large population that accurately represents the incidence of disease states is required in order for a priori probabilities to be determined. The Bayesian approach is sometimes combined with decision tree logic for final classification.
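The two strategies can be contrasted with a toy sketch. The voltage criterion, the candidate diagnoses, and all of the probabilities below are invented for illustration; they are not the criteria or priors used by any program discussed in this review.

```python
# Two diagnostic strategies in miniature (all numbers are hypothetical).

# 1. Decision tree logic: Boolean range tests on measurements.
def rule_based_diagnosis(measurements):
    # Hypothetical criterion: a large S in V1 plus a large R in V5
    # elicits a statement of left ventricular hypertrophy.
    if measurements["s_v1_mv"] + measurements["r_v5_mv"] > 3.5:
        return "left ventricular hypertrophy"
    return "normal"

# 2. Maximum likelihood (Bayesian): pick the diagnosis with the largest
#    posterior, i.e. prior times likelihood of the observed measurements.
def bayesian_diagnosis(likelihoods, priors):
    posteriors = {dx: priors[dx] * likelihoods[dx] for dx in priors}
    return max(posteriors, key=posteriors.get)

print(rule_based_diagnosis({"s_v1_mv": 2.1, "r_v5_mv": 1.9}))
print(bayesian_diagnosis({"normal": 0.02, "LVH": 0.05},
                         {"normal": 0.9, "LVH": 0.1}))
```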
Historical Review

In 1957 the electrocardiogram was chosen for a pilot study in automatic processing of medical data because of its widespread use as a diagnostic aid. A system developed by Pipberger at the Veterans Administration Hospital, Washington, D.C., was capable of automatic recognition of electrocardiographic waves by digital computer. The original system for sampling and converting ECG data into digital form for entry into a computer was developed specifically for this project. The computer program was capable of accurate determination of the beginning and end of P waves, QRS complexes, and the end of T waves. The Frank orthogonal lead system was employed for the majority of the ECGs and the Schmitt SVEC III lead system for the remainder. Technically poor records were selected for processing from a tape recorded electrocardiogram library containing 2500 cases because it was felt that an automatic wave recognition program should be tested with tracings such as might be encountered under unfavorable clinical conditions. A total of 395 electrocardiograms were analyzed by digitizing the signal at 1000 Hz, applying a digital filter with a high frequency cutoff of 60 Hz, and computing the spatial velocity. It was found that spatial velocities exceeding 3 μV per msec occurred only in the significant waveforms: P, QRS, and T. Computation time for the entire wave recognition program averaged 15 sec per record. The beginning and end of each electrocardiographic wave was identified and the durations of waves and intervals were determined. Failures in measurement were encountered only in cases with cardiac arrhythmias. The program represented a major advance in computer technology and served as a model for many programs which followed.

Caceres began work on computer analysis of electrocardiograms in 1959 at the Medical Systems Development Laboratory in Washington, D.C. The intention was to demonstrate the feasibility of a computer program to extract clinically useful measurements of electrocardiographic parameters. The selection of the ECG for computer analysis was based on the availability of a backlog of electrocardiographic data on subjects known to be normal or abnormal, thus providing the capability of statistical analysis of results. The initial system utilized tape recorded data that was analog-to-digital (A/D) converted at 625 Hz. The data were converted to punched cards for input to the computer. Thirty-six leads were analyzed and each lead took 64 min of computer time to process. Parameters selected for measurement were the P, Q, R, S, T, and U waves and the PQ, ST, QT, and RR intervals. Three characteristics were determined: amplitude, duration, and slope. The system was later rewritten in numerous languages and modified to run on a variety of computers. The version eventually distributed by the U.S. Public Health Service was known as ECAN (ECg ANalysis Program) and employed the 12 classical leads.

These two early attempts at computer measurement of electrocardiographic waveforms provided the basis for a second stage in which pattern recognition techniques were applied to the results of the waveform detection; these pattern recognition techniques were combined with the employment of diagnostic criteria in order to arrive at an electrocardiographic interpretation. The first successful program to interpret an ECG diagnostically was an outgrowth of the early Pipberger work.

In 1966 another computer program was reported that was capable of diagnosing the electrocardiogram.¹⁹ Decision tree logic applied to measurements of waveform amplitude and duration produced reports of right and left ventricular hypertrophy, bundle branch block, intraventricular conduction disturbance, and posterior and anterior myocardial infarction. Unlike the Pipberger and Caceres programs, the ECG measurements were done manually by technicians and keypunched onto IBM cards (four cards per ECG) for computer input. The 3-yr project to write a workable program, test its validity, and establish its usefulness in a private nonuniversity hospital was undertaken to demonstrate that computer-assisted interpretation
could be valuable, especially to the noncardiologist physician. In a total of 4469 electrocardiograms processed by the system, all but 19 were in agreement with the cardiologist who overread them.

At the same time, a hybrid computer system for both measurement and interpretation of electrocardiograms appeared. Threshold detectors were employed by an analog editor to detect P, QRS, and T waves. A template representing a typical heart cycle was generated to serve as a standard with which to compare successive heart cycles. For each heart cycle, a binary word (match word) was produced, which described how well the heart cycle matched the template. Figure 3 shows a schematized version of an electrocardiographic passage and the associated template. The numbered points in the representation of the template are the significant points of the waveform that are recognized and flagged by the analog circuit. The program was divided into two parts: contour and arrhythmia. The contour interpretations were "electrical" rather than clinical in nature, and rhythm analysis posed problems because of inaccuracy of waveform measurements, particularly P waves. But the system advanced computerized electrocardiography dramatically and served as a predictor of things to come.

Fig. 3. Schematized version of an electrocardiographic passage and the associated template of a normal beat. The numbered points on the template are the significant points of the waveform which are recognized and flagged by an analog circuit. (Reproduced by permission from the Annals of the New York Academy of Sciences.)

During this early period, while numerous systems emerged using decision tree logic for analysis, Klingeman and Pipberger turned to statistical classification techniques for the assignment of electrocardiograms into various diagnostic categories.²¹ An initial study in which ECGs were classified into normal and left ventricular hypertrophy (LVH) types was reported in 1967. Four statistical methods were applied to measurements taken from orthogonal electrocardiographic record samples. Amplitude measurements of various selected points were summed, and vector differences in three-dimensional space were calculated both with and without weight factors. A class-separating transformation was tested as well. These four statistical techniques were applied to three sample groups containing 100 ECGs from normal subjects and 100 from patients with clinically documented left ventricular hypertrophy. Best results were obtained with weighted vector differences based on 8 amplitude measurements (84% correct classification). The class-separating procedure produced an 80% result, while the method of summed amplitudes led to 67% separation. Table 1 shows results of separating normal (N) from left ventricular hypertrophy (LVH) records for each of the four procedures tested. These early results demonstrated the practicality of utilizing the computer for the automatic measurement of numerous ECG amplitudes and the application of complex statistical procedures for data analysis.
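The vector-difference procedures amount to assigning a record to the class whose mean measurement vector it lies closest to, with optional per-measurement weights. The sketch below illustrates that idea only; the measurement values, weights, and class means are hypothetical and are not the quantities used by Klingeman and Pipberger.

```python
# Nearest-class-mean classification by (weighted) vector difference.
# Class means and weights below are hypothetical placeholders.
import numpy as np

def classify(record, class_means, weights=None):
    """record: vector of amplitude measurements; returns the closest class."""
    w = np.ones_like(record) if weights is None else weights
    distances = {
        label: float(np.sqrt(np.sum(w * (record - mean) ** 2)))
        for label, mean in class_means.items()
    }
    return min(distances, key=distances.get)

# Eight hypothetical amplitude measurements (mV) per record.
means = {"N":   np.array([0.1, 1.0, 0.3, 0.2, 0.9, 0.4, 0.2, 0.1]),
         "LVH": np.array([0.1, 2.4, 0.9, 0.3, 2.1, 1.1, 0.4, 0.2])}
weights = np.array([0.5, 2.0, 1.0, 0.5, 2.0, 1.0, 0.5, 0.5])
print(classify(np.array([0.1, 2.2, 0.8, 0.3, 1.9, 1.0, 0.3, 0.2]), means, weights))
# -> "LVH"
```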
Table 1. Classification of ECG Records Based on Record Samples

                                           Correctly Classified (%)    Estimate of
Classification Procedure*                  N           LVH             Equal Error (%)
Experiment 1 (100 N and 100 LVH records)
  1                                        74          75              74.5
  2                                        85          76              80.5
  3                                        93          78              85.5
  4                                        83          78              82
Experiment 2 (new samples of 100 N and 100 LVH records)
  1                                        69          59              64
  2                                        86          69              77.5
  3                                        90          73              81.5
  4                                        89          69              79
Experiment 3 (new sample of 100 N records, LVH sample of experiment 2 retained)
  1                                        89          59              74
  2                                        88          69              78.5
  3                                        95          73              84
  4                                        86          69              77.5
Average recognition rates derived from experiments 1 to 3
  1                                        77.3        64.3            70.8
  2                                        86.3        71.3            78.8
  3                                        92.7        74.7            83.6
  4                                        87          72              79.5

Evaluation of four statistical methods applied to measurements taken from orthogonal electrocardiograms for the purpose of separating normal subjects from those with left ventricular hypertrophy.
*Procedure 1 = sum of amplitude measurements; procedure 2 = vector differences; procedure 3 = weighted vector differences; procedure 4 = class-separating transformations. The weight factors for the vector differences were determined on the basis of all records. The matrices for procedure 4 were obtained from the samples of experiment 1. For further details see text.
Adapted and reproduced by permission from Computers and Biomedical Research.²¹

Table 2 summarizes the systems that were in use in the late 1960s and some of the characteristics of each.

Table 2. A Summary of the Characteristics of the Early Systems for Computer Detection and Analysis of Electrocardiograms

Pipberger
  Analog data recorded on FM tape
  Analog-to-digital conversion at 1000 Hz
  15 sec ECG, XYZ leads
  IBM 704 computer
  Spatial velocity to detect waveforms
Caceres
  Analog data recorded on FM tape
  Analog-to-digital conversion at 625 Hz
  Data stored on punched cards
  Control Data 160A computer and DEC PDP8
  64 min to process one lead
Staples
  Human measurements of waveforms
  Decision logic for RVH, LVH, BBB, ICD, IRBBB, CRBBB, CLBBB, PMI, AMI, T-wave changes
  IBM 1441 computer
  4469 ECGs processed
Wortzman
  Analog data recorded on FM tape
  Sampled only at points of significant changes in ECG
  IBM 1401 computer
  Low pass filtering at 60 Hz

Two separate groups of investigators, each collaborating with International Business Machines (IBM), began serious development of computerized electrocardiography in the late 1960s. Bonner and Schwetman from the Advanced Systems Development Division at IBM worked with Pordy at Mt. Sinai Hospital, New York City, on the development of a 12-lead computer system. The ECG signal was sampled at 400 Hz by an A/D converter with 10-bit resolution. An analog preprocessor employed noise rejection, derived features, and flagged points of exceptional interest. Digital filtering was applied to each flagged point using 20 adjacent points before and after the point of interest. The filtered value of the flagged point was compared to the unfiltered amplitude. If it fell within .025 mV of the original point it was retained; if not, the original value was retained. The rationale was that any point that was not greatly affected by filtering was assumed to be in a region of low frequency. If there was a large discrepancy between the filtered and unfiltered value, it was assumed that the point fell within a region of high frequency, i.e., QRS, and filtering was not justified.
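A minimal sketch of this selective smoothing rule is given below. The ±20-point window and 0.025 mV acceptance band follow the description above, but the implementation details (simple moving-average smoothing, edge handling) are assumptions rather than the Bonner program itself.

```python
# Selective smoothing of flagged points: keep the filtered value only when it
# stays within 0.025 mV of the raw sample, otherwise keep the raw value
# (large discrepancies are taken to indicate high-frequency regions, i.e. QRS).
def selectively_filter(signal_mv, flagged, half_window=20, tolerance_mv=0.025):
    cleaned = list(signal_mv)
    for i in flagged:
        lo = max(0, i - half_window)
        hi = min(len(signal_mv), i + half_window + 1)
        filtered = sum(signal_mv[lo:hi]) / (hi - lo)   # simple moving average
        if abs(filtered - signal_mv[i]) <= tolerance_mv:
            cleaned[i] = filtered   # low-frequency region: accept the smoothing
        # else: leave the original sample untouched (likely within the QRS)
    return cleaned
```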
The waveform was divided into a series of overlapping segments which were defined as starting or ending whenever the slope-difference changed sign. The high slope parts of the waveform were taken to be reliable indicators for identification of the QRS complex. The frequency distribution of segment slope-difference for each lead was found and a threshold applied to produce a slope constant (8 converter units per millisecond). A segment with a slope-difference greater than this was considered to represent a QRS complex. Since the highest slope on aberrant QRS complexes might be markedly lower than that of predominant QRSs, bizarre beats were often overlooked. Additional logic to avoid this
was employed which lowered the threshold to include aberrant beats. This threshold caused peaked T waves to be detected as well; therefore, logic was included that detected the proximity of a new waveform to the preceding wave and permitted rejection of T waves. The widths of QRS complexes were determined by a complicated logic, after which all beats were compared. For each beat found to match a certain type, points were added to that type's rating, and the type with the highest rating was considered to represent the dominant rhythm. All QRS complexes not identical to the dominant type were remeasured to correct possible measurement errors. This procedure was important so that arrhythmia analysis would not be in error.

Each segment in the interval between a pair of QRS complexes was given a P or T rating depending upon weighting factors applied to slope, amplitude, and duration. The segments with the largest P and T ratings were candidates for P and T waves. If these segments coincided, the P rating was compared to the T rating and the larger value was the determinant. In all, a total of 288 measurements were extracted from the 12 leads and stored in a measurement matrix for further analysis.

The arrhythmia analysis program examined the information from the measurement matrix, i.e., a table of the time of onset and termination of each complex and its type. For arrhythmia analysis, the information was organized as shown in Fig. 4. The widths, positions in time, and types of all the waves were known in addition to their measurements. As the first step in the determination of rhythm, a P-train test was performed in which a search was made for a series of regularly spaced P waves. The representative interval of a group of P waves was used as the hypothetical P spacing in order to search for possible P waves buried in T or QRS complexes. If no dominant interval could be found, it was judged that there was no discernible P train. Each interval flanked by two adjacent QRS complexes was considered to be a unit, and was classified by the left-hand-side and right-hand-side QRS types [first QRS type (Q1), second QRS type (Q2), etc.] and the interval between. Units which contained only the dominant QRS type were used to determine the dominant rhythm.

Fig. 4. Passage of an electrocardiogram that has been coded for arrhythmia analysis. Each QRS type is designated Q1, Q2, and so on, where Q1 represents the dominant beat and Q2 an ectopic beat. P and T waves of different morphologies are similarly coded. The example shows atrial fibrillation with a ventricular ectopic beat. (Reproduced by permission from Computers and Biomedical Research.²⁴)
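The P-train test amounts to asking whether the detected P waves fall at roughly regular intervals and, if so, adopting that interval as the hypothetical P spacing. The sketch below illustrates one way such a test could look; the 10% regularity tolerance and the use of the median interval are assumptions, not the published Bonner logic.

```python
# P-train test sketch: decide whether P-wave times form a regular train.
# The median-interval estimate and the 10% tolerance are illustrative choices.
def p_train_interval(p_times_ms, tolerance=0.10):
    """Return the representative P-P interval, or None if no train is found."""
    if len(p_times_ms) < 3:
        return None
    intervals = [b - a for a, b in zip(p_times_ms, p_times_ms[1:])]
    intervals.sort()
    representative = intervals[len(intervals) // 2]      # median interval
    regular = all(abs(i - representative) <= tolerance * representative
                  for i in intervals)
    return representative if regular else None

print(p_train_interval([0, 810, 1605, 2410, 3210]))   # ~800 ms train
print(p_train_interval([0, 400, 1500, 1700]))         # None: no dominant spacing
```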
Each lead was examined separately for rhythm. Two kinds of basic rhythms were presumed possible: type A, which one would expect to find exhibited in many leads (such as a sinus rhythm); and type B, which could be expected to appear in only a few leads (such as the Wenckebach phenomenon or atrial flutter). A sum was accumulated for each lead in which a type A diagnosis was seen, with an additional four points added for a diagnosis derived from the rhythm strip. The rhythm with the largest total, provided it exceeded six, was selected. If a type B rhythm was found in more than two leads, that diagnosis would supersede the type A diagnosis unless the type A diagnosis accumulated 10 or more points. Table 3 shows the rhythms that comprise the type A and type B diagnoses. The special statements reflect arrhythmias that are difficult to distinguish; therefore, a composite list of possibilities was provided for a cardiologist to review.
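The lead-voting rules just described reduce to a small scoring procedure, sketched below. The one-point-per-lead weight is an assumption (the text specifies only the 4-point rhythm-strip bonus and the thresholds of 6 and 10 points); the rest follows the description above.

```python
# Type A / type B rhythm selection by lead voting, per the rules described in
# the text: +1 point per lead showing a type A rhythm (assumed weight), +4 if
# the rhythm strip shows it, winner must exceed 6 points; a type B rhythm seen
# in more than two leads supersedes unless the type A total reaches 10 points.
def select_rhythm(type_a_by_lead, rhythm_strip_dx, type_b_by_lead):
    scores = {}
    for dx in type_a_by_lead:                       # one entry per lead
        scores[dx] = scores.get(dx, 0) + 1
    if rhythm_strip_dx is not None:
        scores[rhythm_strip_dx] = scores.get(rhythm_strip_dx, 0) + 4

    best_a = max(scores, key=scores.get) if scores else None
    best_a = best_a if best_a is not None and scores[best_a] > 6 else None

    type_b_counts = {}
    for dx in type_b_by_lead:
        type_b_counts[dx] = type_b_counts.get(dx, 0) + 1
    for dx, count in type_b_counts.items():
        if count > 2 and (best_a is None or scores.get(best_a, 0) < 10):
            return dx                               # type B supersedes
    return best_a

print(select_rhythm(["sinus rhythm"] * 8, "sinus rhythm", ["atrial flutter"] * 2))
# -> "sinus rhythm" (12 points; flutter seen in only 2 leads)
```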
Table 3. Rhythm Statements from the Bonner Program*

Type A rhythms
  Normal sinus rhythm
  Sinus tachycardia
  Sinus bradycardia
  Atrial fibrillation with A-V block
  Atrial fibrillation
  Atrial fibrillation with A-V dissociation
  Nodal tachycardia
  Supraventricular paroxysmal tachycardia
  Nodal rhythm
  Nodal paroxysmal tachycardia
  Nodal paroxysmal tachycardia with I-V block
  Ventricular paroxysmal tachycardia
  Ventricular tachycardia
  Ventricular rhythm
  Normal sinus rhythm with the Wolff-Parkinson-White syndrome
Type B rhythms
  Atrial flutter
  Atrial flutter with 2-1 response
  Atrial flutter with 4-1 response
  Atrial flutter with 2-1 to 4-1 response
  Atrial flutter with A-V block
  Wandering pacemaker between the SA and A-V node
  Third-degree or complete A-V block
  Isorhythmic A-V dissociation
  Second-degree A-V block
Special statements
  "This tracing could reflect any of the following: ventricular paroxysmal tachycardia, supraventricular tachycardia with I-V block or aberrant ventricular conduction, atrial fibrillation with WPW. Deciding between these alternatives is important but difficult. A cardiologist should be consulted."
  "Supraventricular tachycardia. The tachycardia could be due to: paroxysmal supraventricular tachycardia, atrial flutter with regular 2:1 A-V conduction, sinus tachycardia. Consult a cardiologist."

*Two types of rhythms which were recognized by the early Bonner program. See text for details.
Reproduced by permission from Computers and Biomedical Research.²⁴

Results of rhythm analysis from this system were reasonably successful for commonly seen rhythms (sinus rhythm, sinus tachycardia, sinus bradycardia), but not so promising in other arrhythmias. The diagnostic errors were seen to arise from the inability of the measurement program to find P waves superimposed on T waves, or in discriminating against noise that might mimic a P wave. Initial testing on 2060 electrocardiograms that were processed over a period of 1 yr showed a 91% success rate in contour analysis. Of the contour statements that were in error, 48% were due to measurement logic, and the remainder due to logic following the measurement program. Computer analysis of rhythm resulted in an 8% false positive rate (124 of 1731 normals were misclassified as abnormal), and a 9% false negative rate (26 of 329 abnormal rhythms were classified normal). In the case of rhythm abnormalities other than the basic rhythm (i.e., ectopic beats), 50% were missed completely. As a screening device to separate normal from abnormal electrocardiograms, the system performance was found to be 91% accurate using both contour and rhythm.
At this same period in the late 1960s, Smith and Hyde at Mayo Clinic collaborated with IBM in the development of a computerized ECG system that processed data acquired from three simultaneously recorded orthogonal leads. The lead set used was a modified Frank set in which (1) a neck electrode was used instead of the Frank head electrode, (2) V4 and V6 were used instead of the C and A electrodes specified by Frank, and (3) the patient was supine instead of sitting. The leads were recorded for 8 sec on FM magnetic tape and transmitted by telephone to the central computer (IBM 7040). A/D sampling at a rate of 350 Hz was performed at one-eighth the recording speed and digital filtering applied. A time difference function was derived and a fiducial mark established when the function exceeded a preset threshold for 15 msec. Data from 250 msec following the fiducial mark were considered to contain the onset, peak, and offset of R, and were stored for further analysis. The region after the R wave was searched for a T wave, and the region from the R back to the previous T wave was examined for P waves and other indications of atrial activity.
Regular rhythm was defined as that having intervals within a 10% tolerance of normal, and the following rhythms were reportedly recognized: sinus, ventricular, nodal, artificial pacemaker, and second or third degree block. Irregular rhythms diagnosed by the system were atrial or ventricular premature complexes, sinus arrhythmia, and atrial fibrillation. The system reported an overall agreement with cardiologists in 85% of the tracings, with a 98% agreement in normals, and a 14% variation in descriptions of abnormals.

At this stage there was serious deliberation about the need for objective analysis of the newly emerging systems for ECG analysis. Caceres had proposed a central pooling of all data in order to establish uniform national criteria. Pordy discussed the 12-lead system versus the Frank orthogonal system, and concluded that the 12-lead was probably preferable, but suggested simultaneous utilization of both. Standards were simply nonexistent and evaluation of available
systems was not possible, since there existed no library of ECG records with a suitable variety of cardiac abnormalities, and no method for testing existing systems against diagnoses established by non-ECG sources.

Another program for automatic interpretation of electrocardiograms utilizing the Frank orthogonal leads was developed and implemented at the Latter-Day Saints Hospital, Salt Lake City, Utah, and was run routinely on all elective admissions by 1969. Interpretive statements were made concerning only the QRS and ST-T waves and consisted of 11 QRS categories and five ST categories. The diagnoses that were possible are shown in Table 4. The system sampled the X, Y, and Z leads at 200 Hz and used decision logic to arrive at a classification. Preliminary results on 287 ECGs produced a 1% false positive rate (2 out of 195) and a 9% false negative rate (8 out of 92) in the QRS analysis. In the ST analysis the false positives ranked 4% (7 of 197) and the false negatives 16% (14 of 89). The underdiagnosis tendency was the result of a deliberate emphasis imposed by the desire of the cardiologists using the program. The criteria applied for the diagnostic logic were admittedly simplistic. Additional development of more sophisticated and appropriate criteria was planned, as was the proposed development of logic for P-wave determination and arrhythmia diagnosis.

Table 4. Diagnostic Statements Made by the Pryor Program*

QRS diagnosis
  Normal
  Left bundle branch block
  Right bundle branch block
  Intraventricular conduction defect
  Left ventricular hypertrophy
  Right ventricular hypertrophy
  Anterior myocardial infarction
  Inferior myocardial infarction
  Lateral wall myocardial infarction
  Left axis deviation
  Right axis deviation
ST diagnosis
  Normal
  Injury pattern
  Subendocardial injury pattern
  Ischemia pattern
  Digitalis effect pattern

*The Pryor program analyzed electrocardiograms from patients admitted electively at the Latter-Day Saints Hospital, Salt Lake City, 1979.
Reproduced by permission from Computers and Biomedical Research.

By 1969 computer recognition of complex waveform patterns such as those seen in electrocardiography was considered routine. It was recognized that the major task facing further clinical application lay in the area of establishing uniform diagnostic criteria. The difficult pattern recognition problem of arrhythmia diagnosis was as yet unsolved. During this year the Medical Systems Development Laboratory (MSDL) was absorbed into the newly formed National Center for Health Services Research and Development. The ECG analysis system developed during the previous 10 yr by MSDL under the auspices of the U.S. Public Health Service (the Caceres program) was translated by the National Center into computer languages with more universal application and released publicly as ECAN Version D in October

Table 5. Listing of Certified Programs: Health Care Technology Division, EKG Data Pool, EKG Analysis Programs (Version "D"), Certified by the Health Care Technology Division

Computer and language                     Organization                              Date
CDC 160-A                                 Medical Systems Development Laboratory    10/1/69
CDC 8090                                  Medical Systems Development Laboratory    10/1/69
CDC 1700                                  Control Data Corporation                  11/21/69
Universal FORTRAN IV (minus C I/O)        Data Pool Task Force                      12/18/69
Sigma 5/6/7 FORTRAN IV-H                  Xerox Data Systems                        1/9/70
IBM 360 (Model 40 and larger)             Beckman Instruments                       8/11/70
SIGMA 2/3 FORTRAN IV*                     Xerox Data Systems                        10/1/70
DEC PDP 8 (pseudo assembly language)*     Berkeley Scientific Laboratories          11/3/70
DEC PDP 9*                                MEDAC                                     4/14/71
DEC PDP 8                                 Searle Medidata                           5/14/71
DEC PDP 8*                                Phone-A-Gram Systems                      10/5/71
IBM 360/50 (DOS)                          Touro Infirmary                           11/1/71
IBM 360/30 (DOS)                          Space Age Computer Systems                4/5/72

Adaptations of the ECAN Version D program which were certified by July 1972. Certification verified that the modified versions functioned identically to the original program.
*Segmented programs.
Reproduced by permission from Academic Press.
Description:waveform of the ECG, QRS morphology and RR .. 'Two types of rhythms which were recognized by the early. Banner program. See text for details.