History of invasive and interventional cardiology

Overview
The history of invasive and interventional cardiology is complex, with multiple groups working independently on similar technologies. While invasive and interventional cardiology is currently closely associated with cardiologists (physicians who treat the diseases of the heart), a great deal of the early research and procedures were performed by radiologists and cardiac surgeons.

Cardiac Catheterization: A brief history
Catheterization of internal organs has been practiced for almost 5,000 years.

3000 B.C.: Egyptians performed bladder catheterizations using ultra-thin metal pipes.

400 B.C.: Catheters fashioned from hollow reeds and pipes are used in cadavers to study the function of cardiac valves.

1711: Hales conducts the first cardiac catheterization of a horse using brass pipes, a glass tube, and the trachea of a goose.

1844: French physiologist Bernard coins the term "cardiac catheterization" and uses catheters to record intracardiac pressures in animals. Obrastzow and Stracheschenko in 1910, and Herrick in 1912, described sudden obstruction of a coronary artery as the cause of AMI. In the following decades, controversy raged as to whether the clot formed after death and was merely a postmortem finding, or was a cause of AMI. This question remained unsolved until the late 1970s.

1929: First documented human cardiac catheterization is performed by Dr. Werner Forssmann in Eberswalde, Germany.

1941: A. Cournand and D. Richards employ the cardiac catheter as a diagnostic tool for the first time, utilizing catheter techniques to measure cardiac output.

1947: Louis Dexter expanded the clinical use of right heart catheterization with studies in patients with congenital heart disease and identified the pulmonary capillary wedge pressure.

1958: Mason Sones introduced diagnostic coronary angiography after he accidentally engaged the RCA and injected dye during an imaging procedure of the aortic valve; he then realized that he had discovered the key to selective imaging of the heart: the diagnostic coronary angiogram. In the same year, Fletcher and colleagues first reported intravenous administration of thrombolytic drugs in patients with AMI.

1960s: Coronary care units were introduced. At the beginning of the 1960s, treatment of patients with STEMI consisted of bed rest for up to a month; mortality was reduced with the emergence of coronary care units and treatment of the associated arrhythmias.

1964: Charles T. Dotter introduced the concept of remodeling the arteries, known as Transluminal Angioplasty.

1967: Rene Favaloro conducts first coronary artery bypass graft surgery with saphenous vein in Cleveland, Ohio.

1967: Dotter and Judkins were the first to propose the concept of reperfusion of the coronary artery by a catheter technique.

1969: Chazov administered the first intracoronary streptokinase in Russia.

1970s: DeWood and colleagues reported that a thrombus was observed in the infarct-related artery [IRA] in nearly 90% of patients undergoing acute coronary artery surgery in the first few hours after the onset of AMI.

1974: Andreas Gruentzig performed the first peripheral human balloon angioplasty, and presented results of animal studies of coronary angioplasty at the AHA meeting in 1976.

1977: First human coronary balloon angioplasty performed intraoperatively by Andreas Gruentzig and Richard Myler. In the same year, Gruentzig performed the first PTCA in a catheterization laboratory without the use of general anesthesia.

Late 1970s: Chazov et al. were the first group to demonstrate angiographically the clot-lysing effect of intracoronary thrombolytic therapy in the occluded IRA in patients with AMI. These findings were confirmed by Rentrop and coworkers in Göttingen [94].

1978: First PTCA cases performed in America by Myler and Hanna in San Francisco, and Stertzer in New York.

1979: The concept of catheter-based reperfusion for STEMI was introduced by Rentrop and colleagues

1980s: Guiding catheters and pharmacological reperfusion for AMI were introduced.

1982: Over-the-wire coaxial balloon systems are introduced, along with the development of brachial guiding catheters and steerable guide wires.

1983: Hartzler and colleagues described and introduced the first percutaneous transluminal balloon angioplasty performed without fibrinolytic drugs for AMI.

1984: Thrombolysis In Myocardial Infarction [TIMI] Study Group was created by Braunwald and colleagues.

1986: First coronary artery stents [Wallstents] were implanted by Jacques Puel and Ulrich Sigwart in Toulouse. These were introduced with the objective of tacking down dissection flaps and providing mechanical support. Stents also helped reduce the elastic recoil and remodeling associated with restenosis.

1986-1993: A large number of new interventional devices were invented and introduced to medical practice [lasers, rotational atherectomy devices [Rotablator], intravascular ultrasound [IVUS], and stents].

1990s: Catheter-based revascularization procedures became widespread, reducing in-hospital mortality rates. Stent use increased worldwide with better understanding of its pathophysiology and improvements in availability and design [e.g., stents that were more flexible and pre-mounted on balloon catheters]. The term Percutaneous Coronary Intervention was first used to distinguish stent deployment procedures from Percutaneous Transluminal Coronary Angioplasty [PTCA], and later came to cover all coronary artery related therapeutic procedures.

Mid-1990s: Glycoprotein IIb-IIIa inhibitors were introduced.

Mid to late 1990s: Stents become commonplace and reduce many of the complications seen with balloon angioplasty.

Different combinations of fibrinolytic drugs, antiplatelet agents, and PCI were used for the treatment of AMI. The combination of prehospital fibrinolytic administration, transfer for diagnostic coronary angiography, and then PCI at a tertiary center [a pharmacoinvasive approach, or facilitated coronary intervention] became a popular but questionable strategy where coronary angiography and PCI were available within an acceptable transfer distance.

After Gibson CM and colleagues introduced the TIMI Frame Count [TFC] and Myocardial Perfusion Grade [MPG], key advances in the treatment of AMI arose from the open artery theory: timely, complete, and sustained myocardial reperfusion of the IRA territory (open myocardium and microvasculature) is a major determinant of outcome.
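As a purely illustrative aside, the frame-count idea lends itself to a short calculation. The sketch below follows the published corrected TIMI Frame Count (CTFC) conventions of normalizing counts to 30 cine frames per second and dividing LAD counts by 1.7 to account for that vessel's greater length; the function name and interface are hypothetical, not part of the original method's description:

```python
def corrected_timi_frame_count(frames, frame_rate=30.0, artery="RCA"):
    """Corrected TIMI frame count: cine frames for dye to reach a
    standardized distal landmark, normalized to 30 frames per second."""
    ctfc = frames * (30.0 / frame_rate)  # normalize acquisition speed
    if artery.upper() == "LAD":
        ctfc /= 1.7  # LAD correction for the vessel's longer course
    return ctfc

# Example: 51 frames counted in the LAD at 30 frames/s
print(round(corrected_timi_frame_count(51, 30.0, "LAD"), 1))  # → 30.0
```

Lower corrected counts correspond to faster epicardial flow, which is why the index became a continuous surrogate for reperfusion quality.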

2001: Restoring coronary blood flow by thrombolysis, mechanical reperfusion, or a combination of the two, using various primary PCI strategies with or without coronary stents, is referred to as recanalization of the IRA; it became the treatment of choice in AMI, and the sooner it could be achieved, the better the outcome.

2002: The first drug-eluting stent was introduced in Europe, and then approved by the Food and Drug Administration.

2004: Bioabsorbable coronary stents were introduced, and drug-eluting stents came into use in primary PCI.

2006: There is still a need for widely available revascularization strategies that achieve the combination of an open epicardial IRA, an open microvasculature in the IRA territory, and early and complete ST-segment resolution with preserved LV function in patients with STEMI.

The birth of invasive cardiology
The history of invasive cardiology begins with the development of cardiac catheterization in 1711, when Stephen Hales placed catheters into the right and left ventricles of a living horse. Variations on the technique were performed over the subsequent century, with formal study of cardiac physiology being performed by Claude Bernard in the 1840s.

Catheterization of humans
The first catheterization of a human was attributed to Werner Forssmann who, in 1929, created an incision in one of his left antecubital veins and inserted a catheter into his venous system. He then guided the catheter by fluoroscopy into his right atrium. Subsequently he walked up a flight of stairs to the radiology department and documented the procedure by having a chest roentgenogram performed. Over the next year, catheters were placed in a similar manner into the right ventricle, and measurements of pressure and cardiac output (using the Fick principle) were performed.
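The Fick principle mentioned above can be illustrated with a short worked calculation. This is a minimal sketch assuming the standard oxygen-content formula (1.36 mL O2 carried per gram of hemoglobin); the function name and parameter choices are illustrative, not taken from the original accounts:

```python
def fick_cardiac_output(vo2_ml_min, hb_g_dl, sao2, svo2):
    """Cardiac output (L/min) by the Fick principle: oxygen consumption
    divided by the arteriovenous oxygen content difference.

    vo2_ml_min: oxygen consumption (mL O2/min)
    hb_g_dl:    hemoglobin concentration (g/dL)
    sao2, svo2: arterial and mixed-venous O2 saturations (fractions)
    """
    # O2 content in mL O2 per liter of blood:
    # 1.36 mL O2/g Hb * Hb (g/dL) * 10 (dL/L) * saturation
    cao2 = 1.36 * hb_g_dl * 10.0 * sao2
    cvo2 = 1.36 * hb_g_dl * 10.0 * svo2
    return vo2_ml_min / (cao2 - cvo2)

# Example: VO2 250 mL/min, Hb 13.6 g/dL, SaO2 98%, SvO2 71%
print(round(fick_cardiac_output(250, 13.6, 0.98, 0.71), 2))  # → 5.01
```

A wider arteriovenous saturation gap implies that each liter of blood gives up more oxygen, so a lower cardiac output suffices for the same consumption, which is the physiological core of the principle.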

In the early 1940s, André Cournand, in collaboration with Dickinson Richards, performed more systematic measurements of the hemodynamics of the heart. For their work in the discovery of cardiac catheterization and hemodynamic measurements, Cournand, Forssmann, and Richards shared the Nobel Prize in Physiology or Medicine in 1956.

Development of the diagnostic coronary angiogram
In 1958, Charles Dotter began working on methods to visualize the coronary anatomy via sequential radiographic films. He invented a method known as occlusive aortography in an animal model. Occlusive aortography involved the transient occlusion of the aorta and subsequent injection of a small amount of radiographic contrast agent into the aortic root and subsequent serial x-rays to visualize the coronary arteries. This method produced impressive images of the coronary anatomy. Dotter later reported that all the animals used in the procedure survived.

Later that same year, while performing aortic root aortography, Mason Sones, a pediatric cardiologist at the Cleveland Clinic, noted that the catheter had accidentally entered the patient's right coronary artery. Before the catheter could be removed, 30 cc of contrast agent had been injected. Although the patient went into ventricular fibrillation, Dr. Sones promptly terminated the dangerous arrhythmia with a precordial thump, which restored sinus rhythm. This became the world's first selective coronary arteriogram. Until that time, it was believed that even a small amount of contrast agent within a coronary artery would be fatal.

Until the 1950s, placing a catheter into either the arterial or venous system involved a "cut down" procedure, in which the soft tissues were dissected out of the way until the artery or vein was directly visualized and could be cannulated; in coronary angiography this cut-down approach became known as the Sones technique. The percutaneous approach that is widely used today was developed by Sven-Ivar Seldinger in 1953. This method was used initially for the visualization of the peripheral arteries. Percutaneous access of the artery or vein is still commonly known as the Seldinger technique. The use of the Seldinger technique for visualizing the coronary arteries was described by Ricketts and Abrams in 1962 and Judkins in 1967.

By the late 1960s, Melvin Judkins had begun work on creating catheters that were specially shaped to reach the coronary arteries and perform selective coronary angiography. His initial work involved shaping stiff wires and comparing those shapes to radiographs of the ascending aorta to determine if the shape appeared promising. Then he would place the stiff wire inside a flexible catheter and use a heat-fixation method to permanently shape the catheter. In the first use of these catheters in humans, each catheter was specifically shaped to match the size and shape of the aorta of the subject. His work was documented in 1967, and by 1968 the Judkins catheters were manufactured in a limited number of fixed tip shapes. Catheters in these shapes carry his name and are still used to this day for selective coronary angiography.

Dawn of the interventional era
The use of a balloon-tipped catheter for the treatment of atherosclerotic vascular disease was first described by Charles Dotter and Melvin Judkins in 1964, when they used it to treat a case of atherosclerotic disease in the superficial femoral artery of the left leg. Building on their work and his own research involving balloon-tipped catheters, Andreas Gruentzig performed the first successful percutaneous transluminal coronary angioplasty (known as PTCA or percutaneous coronary intervention (PCI)) on a human on September 16, 1977 at University Hospital, Zurich. The results of the procedure were presented at the American Heart Association meeting two months later to a stunned audience of cardiologists. In the subsequent three years, Dr. Gruentzig performed coronary angioplasties in 169 patients in Zurich, while teaching the practice of coronary angioplasty to a field of budding interventional cardiologists. It is interesting to note that ten years later, nearly 90 percent of these patients were still alive. By the mid 1980s, over 300,000 PTCAs were being performed on a yearly basis, equaling the number of bypass surgeries being performed for coronary artery disease.

Soon after Andreas Gruentzig began performing percutaneous interventions on individuals with stable coronary artery disease, multiple groups described the use of catheter-delivered streptokinase for the treatment of acute myocardial infarction (heart attack).

In the early years of coronary angioplasty, there were a number of serious complications. Abrupt vessel closure after balloon angioplasty occurred in approximately 1% of cases, often necessitating emergency bypass surgery. Vessel dissection was a frequent issue as a result of improper sizing of the balloon relative to the arterial diameter. Late restenosis occurred in as many as 30% of individuals who underwent PTCA, often causing recurrence of symptoms necessitating repeat procedures.

Development of the intracoronary stent
From the time of the initial percutaneous balloon angioplasty, it was theorized that devices could be placed inside the arteries as scaffolds to keep them open after a successful balloon angioplasty. This did not become a reality in the cardiac realm until Ulrich Sigwart reported the first implantation of an intracoronary stent in March of 1986. The first stents used were self-expanding Wallstents. The use of intracoronary stents was quickly identified as a method to treat some complications of PTCA, and their use decreased the incidence of emergency bypass surgery for acute complications after balloon angioplasty.

It was quickly realized that restenosis rates were significantly lower in individuals who received an intracoronary stent when compared to those who underwent just balloon angioplasty. A damper on the immediate use of intracoronary stents was subacute thrombosis. Subacute thrombosis rates with intracoronary stents proved to be about 3.7 percent, higher than the rates seen after balloon angioplasty. Post-procedure bleeding was also an issue, due to the intense combination of anticoagulation and anti-platelet agents used to prevent stent thrombosis.

Stent technology improved rapidly, and in 1989 the Palmaz-Schatz balloon-expandable intracoronary stent was developed. Initial results with the Palmaz-Schatz stents were excellent when compared to balloon angioplasty, with a significantly lower incidence of abrupt closure and peri-procedure heart attack. Late restenosis rates with Palmaz-Schatz stents were also significantly improved when compared with balloon angioplasty. However, mortality rates were unchanged compared to balloon angioplasty. Although the rates of subacute thrombosis and bleeding complications associated with stent placement were high, by 1999 nearly 85% of all PCI procedures included intracoronary stenting.

In recognition of the focused training required by cardiologists to perform percutaneous coronary interventions and the rapid progression in the field of percutaneous coronary interventions, specialized fellowship training in the field of Interventional Cardiology was instituted in 1999.

Changes in post-procedure medications
Through the 1990s and beyond, various incremental improvements were made in balloon and stent technology, as well as newer devices, some of which are still in use today while many more have fallen into disuse. As important as balloon and stent technology had been, it was becoming obvious that the anticoagulation and anti-platelet regimen that individuals received post-intervention was at least as important. Trials in the late 1990s revealed that anticoagulation with warfarin was not required post balloon angioplasty or stent implantation, while intense anti-platelet regimens and changes in procedural technique (most importantly, making sure that the stent was well apposed to the walls of the coronary artery) improved short-term and long-term outcomes. Many different antiplatelet regimens were evaluated in the 1990s and at the turn of the 21st century, and the optimal regimen for an individual patient is still a matter of debate.

The drug eluting stent era
With the high use of intracoronary stents during PCI procedures, the focus of treatment changed from procedural success to prevention of recurrence of disease in the treated area (in-stent restenosis). By the late 1990s it was generally acknowledged among cardiologists that the incidence of in-stent restenosis was between 15 and 30%, and possibly higher in certain subgroups of individuals. Stent manufacturers experimented with (and continue to experiment with) a number of chemical agents to prevent the neointimal hyperplasia that is the cause of in-stent restenosis.

One of the first products of the new focus on preventing late events (such as in-stent restenosis and late thrombosis) was the heparin-coated Palmaz-Schatz stent. These coated stents were found to have a lower incidence of subacute thrombosis than bare metal stents.

At approximately the same time, Cordis (a division of Johnson & Johnson) was developing the Cypher stent, a stent that would release sirolimus (a chemotherapeutic agent) over time. The first study of the stent in humans revealed a remarkable absence of restenosis (zero percent) at six months. This led to approval of the stent for use in Europe in April of 2002. Further trials with the Cypher stent revealed that restenosis did occur in some individuals with high-risk features (such as long areas of stenosis or a history of diabetes mellitus), but that the restenosis rate was significantly lower than with bare metal stents (3.2 percent compared to 35.4 percent). About a year after approval in Europe, the United States FDA approved the Cypher stent as the first drug-eluting stent for use in the general population in the United States.

With the significantly lower restenosis rates of drug-eluting stents compared to bare metal stents, the interventional cardiology community began using these stents as soon as they became available. Cordis, the manufacturer of the Cypher drug-eluting stent, was not able to keep up with the demand when the stents first entered the market. This led to rationing of Cypher stents: they were reserved for difficult anatomy and high-risk individuals. At the time there was public concern that these drug-eluting stents would not be offered to individuals who could not afford them (as they cost significantly more than the bare metal stents of the era).

Concurrent with the development of the Cypher stent, Boston Scientific began development of the Taxus stent. The Taxus stent was based on the Express2 metal stent, which was in general use for a number of years, with a copolymer coating that released paclitaxel, an agent that inhibits cell replication. As with the Cypher stent before it, the first trials of the Taxus stent revealed no evidence of in-stent restenosis at six months after the procedure, while later studies showed some restenosis, at a rate much lower than with its bare metal counterpart. Based on these trials, the Taxus stent was approved for use in Europe in 2003. With further study, the FDA approved the use of the Taxus stent in the United States in March of 2004.

By the end of 2004, drug eluting stents were used in nearly 80 percent of all percutaneous coronary interventions.

Trials of heparin-coated stents could not match the significant decrease in restenosis rates seen with the Cypher and Taxus stents. As the supply of drug-eluting stents increased, the use of heparin-coated stents waned.

Modern controversies in interventional cardiology
The field of interventional cardiology has had a number of controversies since its inception, in part because of the rise of the randomized controlled trial as the marker of a successful procedure, compounded by the rapid pace of change in the field. Procedures would be used soon after they were described in the literature or at conferences, while trial data determining whether a procedure improved outcomes lagged behind by years, owing to the strict protocols and long patient follow-up needed to test it. By the time the trials were published, they would be considered out of date, as they no longer reflected current practice in the field. This led to the adoption of a number of procedures and devices in the interventional realm that later fell out of practice when formal trials found they did not improve outcomes.

Roles of bypass surgery and intracoronary stents for coronary artery disease
Another source of controversy in the field of interventional cardiology is the overlapping roles of PCI and coronary artery bypass surgery for individuals with coronary artery disease. This area has been studied in a number of trials since the early 1990s. Unfortunately, due to the rapid changes in technique in both bypass surgery and PCI, added to the better understanding of the role of intensive pharmacologic therapy in individuals with coronary artery disease, questions remain about the best form of therapy in many subgroups of patients. Multiple ongoing studies hope to tease out which individuals do better with PCI and which do better with CABG, but in general each case is individualized to the patient and the relative comfort level of the interventional cardiologist and the cardiothoracic surgeon.

The role of PCI in individuals without symptoms of ischemic heart disease
In the vast majority of cases, percutaneous coronary interventions do not improve mortality when compared to optimal medical therapy in the stable individual. This is, of course, not true in the unstable individual, such as in the setting of a recent myocardial infarction (heart attack). Even among stable individuals, however, there are a number of subsets in which a mortality benefit is attributed to PCI.

At the 2007 meeting of the American College of Cardiology (ACC), data from the COURAGE trial were presented, suggesting that the combination of PCI and intensive (optimal) medical therapy did not reduce the incidence of death, heart attack, or stroke compared to intensive medical therapy alone. Critics of the trial note that it did not take into account the improvement in symptoms attributed to PCI, that the data presented were an intention-to-treat analysis, and that there was a (possibly) significant crossover from the medical therapy arm to the PCI arm of the study. It should also be noted that the optimal medical therapy used in the COURAGE trial was significantly more aggressive than the current ACC guidelines and is not commonly seen in the general cardiology clinic. As with any large clinical trial, the therapies available had changed between the trial's design and the presentation of its results. In particular, drug-eluting stents, while commonly used in practice by the time the results were presented, were used in less than 5 percent of individuals in the trial.

The safety of drug-eluting stents
When the results of the first trials of drug-eluting stents were published, there was a general feeling in the interventional cardiology community that these devices would be part of the perfect revascularization regimen for coronary artery disease. With the very low restenosis rates of the RAVEL and SIRIUS trials, interventions were performed on more complex blockages in the coronary arteries, under the assumption that the results in real life would mimic the results in the trials. The antiplatelet regimens that were advised for the drug eluting stents were based on the early trials of these stents. Based on these trials, the antiplatelet regimen was a combination of aspirin and clopidogrel for 3 months when Cypher stents were used, and 9 months when Taxus stents were used, followed by aspirin indefinitely.

Soon, case reports started being published regarding late stent thrombosis. At the 2006 annual meeting of the American College of Cardiology, preliminary results of the BASKET-LATE trial were presented, showing a slight increase in late thrombosis associated with drug-eluting stents over bare metal stents. However, this increase was not statistically significant, and further data would have to be collected. Further data published over the following year were conflicting, and it was unclear whether stent thrombosis was truly higher than with bare metal stents. During this time of uncertainty, many cardiologists began extending the dual antiplatelet regimen of aspirin and clopidogrel in these individuals, as some data suggested that it might prevent late thrombosis.

The FDA held an expert panel in December of 2006 to review the data presented by Cordis and Boston Scientific and determine whether drug-eluting stents should be considered less safe than bare metal stents. It became evident at the meeting that, across the published data, there were varied definitions of late thrombosis and key differences in the types of lesions studied, hampering analysis of the data. It was also noted that with the advent of drug-eluting stents, interventional cardiologists began performing procedures on more complex lesions, using the stents in "off label" coronary artery lesions that would otherwise have gone untreated or been referred for bypass surgery. The FDA advisory board reiterated the ACC guidelines that clopidogrel should be continued for 12 months after drug-eluting stent placement in individuals who are at low risk for bleeding.