Medical Innovation: Concept to Commercialization
Ebook · 575 pages · 6 hours


About this ebook

Medical Innovation: Concept to Commercialization is a practical, step-by-step guide to moving a novel concept through development to realize a commercially successful product. Real-world cases and knowledgeable contributors provide lessons drawn from the practices of diverse organizations and multiple products. This important reference will help translational researchers, entrepreneurs, medical school educators, biomedical engineering students and faculty, and aspiring physicians improve their odds of success and avoid innovation failure.

  • Provides multiple considerations and comprehensive lessons from varied organizations, researchers, and products
  • Designed to address topics that improve success and avoid the high cost of innovation failure
  • Recommends the practical steps needed to move a novel, undeveloped concept into a tangible, realistic, and commercially successful product
Language: English
Release date: May 24, 2018
ISBN: 9780128149270


    Book preview

    Medical Innovation - Kevin E. Behrns


    Preface

    Bruce Gingles; Kevin E. Behrns; Michael G. Sarr

    This book is a primer for those in many disciplines interested in improving health care through medical technology innovation. The intended readership comprises aspiring entrepreneurs, especially physicians and medical students, as well as leaders in the fields of translational research, medical education, technology transfer, hospital and medical school administration, and health care policymaking. The book was born out of a themed innovation series in the journal SURGERY that presented 24 topical articles between November 2016 and November 2017. This book describes the technical, legal, regulatory, business, educational, economic, and policy predicates that enable technology innovation in medicine and health care broadly. The material is generally presented in chronological order, from invention conception through the various essential gates that transform novel ideas into tangible and useful products in clinical settings. The chapters combine professional information with horizon scanning and specific lessons. We have made a concerted effort to expand beyond purely technical knowledge and to present perspectives not normally associated with developing a medical innovation. Workforce development, federal and institutional policy, the medical school curriculum, and technology adoption are key among them. With the exception of two recent medical school and residency graduates, the authors are nationally recognized experts, selected for high career achievement and their practical rather than theoretical orientation. Each author has directly nourished innovation, most of them through dozens of cycles.

    We hope this book accelerates the reader’s path from a curious problem solver to a successful entrepreneur, with patients as the ultimate beneficiary.

    Chapter 1

    Introduction to Medical Innovation

    Thomas P. Stossel    BioAegis Therapeutics, Inc., Belmont, MA, United States

    Abstract

    This introductory chapter summarizes three health care innovation eras: (1) nonexistence throughout most of history; (2) major contributions from the mid-19th to mid-20th centuries; and (3) the current system characterized by government-sponsored academic research. It defines innovation as research discoveries with practical benefits and reviews cultural and economic factors that encouraged innovation in the second era but impede it in the present one. It emphasizes the dominant contribution of private industry to innovation and how academic discounting of that fact impedes innovation. It encourages aspiring innovators to consider innovation's history as they learn the lessons of this book.

    Keywords

    Health care innovation history; Government-academic biomedical complex; Pharmaphobia

    The purpose of this book is to serve as a users' manual for aspiring physician-entrepreneurs and other professionals who want to exert their energy toward improving health care by advancing its technology. As the readers will discover, the health care professional schools in which they are enrolled or from which they have graduated impart little of the know-how required to achieve such a goal. This book attempts to remedy that deficiency, and this introductory chapter provides some historic background concerning health care innovation (defined narrowly as discoveries, inventions, or devices that benefit patients) to explain why this book is necessary.

    Throughout most of recorded history—excepting the construction of instruments for removing teeth, amputating limbs, trephining skulls, and performing rudimentary surgery—little innovation in health care took place. Indeed, the adage "Do no harm," falsely ascribed to Hippocrates, notwithstanding, devices for bleeding and nostrums for purging patients reliably did more harm than good, an outcome masked by spontaneous recoveries and the desperation of the sick.

    Discoveries much celebrated by medical historians concerning the circulation of blood, anatomic structure, observations using microscopes, or the invention of the stethoscope contributed to the diagnosis of disease but had minimal impact on clinical outcomes. One exception, smallpox vaccination, was controversial for a long time despite eventually eradicating the disease.

    Even as innovation in health care germinated in the 19th century, the application of advances we consider truly important today, such as anesthesia, antisepsis, and the germ theory of disease that informed and amplified vaccination against infection, had to overcome resistance by the public as well as by the professionals involved in health care at the time. For example, dexterous surgeons able to complete procedures rapidly to minimize the agonizing pain suffered by their patients scoffed at anesthesia, considering it a crutch for their less-skilled counterparts. Also, many religious scholars of the time believed that pain was God's punishment and that anesthesia should not be developed. Considering that their abysmal track record relegated health care professionals—such as they were—to the dregs of social status, the fact that health entrepreneurship centered mainly on quackery is hardly surprising [1].

    These circumstances changed in the latter half of the 19th century, as discoveries of chemistry, microbiology, and physiology—predominantly occurring in Western Europe—transformed medical schools into bastions for health care research and accelerated the appearance of drugs and medical devices that began to supplant improved nutrition, sanitation, and housing as modalities for increasing longevity and improving the quality of life.

    As similar institutions evolved in North America over the ensuing years, their innovators relied heavily on the European precedents. Many physicians and surgeons routinely obtained training in European medical centers, although they often amplified what they learned. When the Mayo brothers, founders of the famed clinic in Minnesota, learned a new surgical procedure in, say, Vienna, they would return the following year having performed many more of the operations at their American headquarters in the interval than had ever been done where the procedure originated. The Doctors Mayo and many other American surgeons invented surgical devices that bear their names to this day.

    The 1910 Flexner Report served to legitimize health care: it catalyzed the concentration of American health care education into universities, made science a requirement for legitimacy of health care training, and raised organized allopathic health care to a dominant position. Although only a minority of health care professionals actually engaged in research, this institutionalization of American health care and the establishment of professional societies facilitated innovation by elevating its social status and providing venues for communicating innovations and recognizing their important contributions [2].

    During this era, researchers tended to focus on solving immediate problems in health care. Specific innovations that resulted from this effort have their unique and idiosyncratic histories, but one can generalize that most arose from astutely observing sick patients or experimental work in animals, leading to hypotheses on which to base clinical interventions. Another generalization is that innovations appear iteratively, because the solution to one problem informs approaches to solving others [3].

    One example of the former generalization was the work of George Whipple at the University of Rochester who in the early decades of the 20th century addressed the clinical problem of red blood cell deficiency—anemia. He made dogs anemic by bleeding them and tested the effect of feeding them different diets on their recovery. When he found that raw liver was the most effective treatment, he had patients with the then prevalent and often fatal condition of pernicious anemia ingest large amounts of liver, and he indeed observed clinical improvement. George Minot and William Murphy working at Boston's Peter Bent Brigham Hospital subsequently documented that liver extracts were even more therapeutically effective.

    Although Whipple, Minot, and Murphy received the 1934 Nobel Prize for their accomplishments, their story illustrates how the practical outcome—a treatment for a disease—does not necessarily result from what the researchers believe to be the underlying mechanism. Minot and Murphy credited iron for the therapeutic effect of liver, which is not true. The observed improvement in anemia actually resulted from high concentrations of folic acid in the liver, and subsequent research revealed that pernicious anemia results from a deficiency of vitamin B12. Folic acid can partially overcome this deficit, because folic acid is a component of a biochemical pathway related to vitamin B12, but interestingly and unfortunately, giving folic acid alone actually makes another manifestation of vitamin B12 deficiency—the neurologic complications—worse. The elucidation of vitamin B12 deficiency as the true basis of pernicious anemia was only achieved by the 1950s. Many innovations of this era, such as X-ray imaging or aspirin for fever, came on line with no idea as to how they worked, illustrating the discrepancy between effective treatment and mechanistic understanding.

    An example of the other generalization—how therapies evolve from one problem-solving challenge to another—started with the invention in 1929 of a urinary catheter by the Boston surgeon Frederic Foley. The Foley catheter is a device consisting of a flexible, hollow tube with a very pliable tip that includes an external balloon incorporated into the wall of the tube that can be expanded by injection of air after insertion of the catheter into the bladder via the urethra. Initially devised to impart local pressure to stop bleeding after prostatic surgery, it came into wide use to relieve diverse causes of lower urinary tract obstruction by serving as an outlet for continuous bladder decompression. Building on that principle, Thomas Fogarty developed a similar catheter (the Fogarty catheter) in 1961 that could be inserted into a blood vessel (vein or artery) with a clot upstream or downstream and threaded past the blood clot, its external balloon at its tip then inflated, and then the catheter retracted to remove the clot. The next iteration of this principle was the application of a balloon catheter by Andreas Grüntzig in the 1970s to dilate critically narrowed coronary arteries to restore adequate blood flow to the myocardium supplied by that artery. The proof of principle that such arteries are amenable to safe instrumentation led to the development of intraluminal coronary stents and subsequently the drug-eluting varieties that minimize thrombosis within the stent.

    During this era, basic science research enriched knowledge concerning biochemical pathways and the immune system involved in certain disease processes that ultimately bore on health. Facilitating such linkage was the study of patients with what became recognized as genetically based disorders—that is, experiments of nature. In a few of these cases, it was possible to achieve therapeutic benefits. For example, the understanding that some patients inherited one of many dysfunctional metabolic pathways that convert chemicals in certain foods into toxins enabled dietary measures that avoid this consequence. Appreciation that other patients were unable to synthesize certain types of antibodies against infection led to therapeutic antibody infusions and improved infection control.

    One of the few benefits of armed conflicts and other catastrophes is that the injuries inflicted by war uncover new problems that teach improvements in clinical management and introduce new coping technologies. World War II was no exception and, by necessity, served to elicit practical approaches to pain control, imaginative ideas in the treatment of trauma, and both supportive care and new therapies for infections endemic in the war theater. Military physicians working in the field or in government laboratories spearheaded these accomplishments.

    Also during this time, health care innovators were not considered physician-entrepreneurs. In fact, the culture of the health care professional had long declared medical practice a sacred calling as part of a—largely unsuccessful—effort to prop up its low status. This theologic imperative spilled over into health care research in universities and research institutes, which actively frowned on researchers obtaining patents and profiting from their discoveries. But, as therapeutic opportunities accumulated, manufacturers were needed to develop and improve innovators' responses to them and to bring drugs and medical devices to the market and to the patient, so patents were required for development to proceed. Given that infectious disease was the leading challenge for health care in the first half of the 20th century, the introduction of antibiotic therapy not only placed the innovators of new, clinically effective antibiotics into partnerships with industry but also stimulated industry's interest and involvement in health care innovation. Prime examples included the collaboration of Paul Ehrlich with the Hoechst dyestuffs company leading to the discovery of Salvarsan, Gerhard Domagk's association with Bayer to develop sulfonamides, Howard Florey and Ernst Chain's work with ICI and subsequent efforts at Pfizer that enabled mass production of penicillin, and Selman Waksman's connection with Merck that resulted in the availability of streptomycin.

    The post-World War II period ushered in a huge uptick in federal funding for biomedical research. This largesse not only eventuated in the construction of the world's largest biomedical research facility, the NIH intramural facility in Bethesda, Maryland, but also stimulated a marked increase in research grant awards to universities and independent research institutes that enabled these institutions to expand their research activities.

    Without question, the seven decades that have transpired since the onset of this expansion of government subsidy of biomedical research have witnessed the introduction of stunning new innovations in health care that have extended longevity and improved our quality of life. Mortality due to cardiovascular disease, while still the principal lethal disorder, has decreased by 60% from its peak in the 1950s. Cancer deaths have declined, and extreme pain and immobility formerly inflicted by chronic joint syndromes are rare. More and more therapies are emerging to treat the experiments of nature.

    Some analysts, however, including myself, have questioned whether this heavily government-subsidized system (which I have named The Government-Academic-Biomedical Complex—GABC) actually inhibits innovation [4,5]. For example, the GABC assumes credit for the emergence in the 1970s of the genetic engineering industry that delivered potent and beneficial biologic drugs, such as erythropoietin, the interferons, and clot-dissolving fibrinolytic agents. But when, after two decades of the GABC's existence, relatively little practical innovation was arising from it, Congress enacted legislation designed to encourage patenting of the research discoveries of GABC grantees and their licensing to startup companies—the Bayh-Dole Act of 1980. Although Bayh-Dole did catalyze innovation by reassuring industry that the government would not appropriate the profits of the licensees, the principal engine behind the biotechnology revolution was insights into genetics, many of which predated the GABC.

    One reason the GABC does not necessarily promote innovation is economic. The largesse the GABC initially provided generated an entitled attitude, afflicting researchers and the administrators of their institutions with the fantasy that the party would last forever. The inflationary nature of high-tech research budgets and competing federal priorities conspired to produce a widening gap between the supply of and demand for research funding. The gap has resulted in a progressively aging workforce of GABC-funded academic researchers forced to compete for survival by conforming to the prejudices and risk aversion of grant review committees—hardly a formula for innovation.

    Other deficiencies of the GABC are cultural. One that shapes the aforementioned deliberative prejudices is that the problem-solving emphasis of the pre-GABC era of innovation represented a deviation from the behavioral norms that have traditionally characterized science. Historians and sociologists have documented that the principal motivation of scientists—in contrast to technologists—has been to impress influential peers with novelty: making discoveries of previously unknown entities and engaging in what has been designated puzzle solving. Puzzle solving includes creating theories that bring natural phenomena into an order that allows for quantitative predictions. The fact that organized science lavishes its awards, such as academic promotion and prizes, on persons accomplishing such achievements is evidence of the primacy of these motives.

    Moreover, medically trained researchers populated the pre- and early GABC workforce and were therefore intimately familiar with many of the unmet health care needs. But contemporaneous with the establishment of the GABC, riveting basic science discoveries, such as the biochemical basis of genetics, the elucidation of cellular signaling pathways, and deep insights into subcellular structures, shifted the research emphasis away from the more descriptive physiologic and pathologic topics pursued previously and toward these reductionist subjects.

    Although schools teach the rudiments of these reductionist basic science subjects to health care professionals in training, the education is insufficient to prepare them for hands-on experimental work. Adequate preparation has therefore required students to take time off from more practical training or else—more commonly—has defaulted to basic science PhD degree programs in which students without health care experience do science coursework followed by several years of mentored research work. This development conspired with the piling on of ever more lengthy requirements for certification in the subspecialty practices to render combining clinical and research training increasingly unattractive. As a result, individuals with basic science but not medical training have come to dominate the biomedical research population.

    Another perverse influence affecting health care innovation is the piling on of burdensome, unjustified regulations affecting the ethics of human and animal experimentation and the financing, conduct, and reporting of research. These rules, however well intended, inflict expense to support the bureaucracies that manage them and cause inordinate time delays. The dominant federal bankrolling of academic research has been used to justify this heavy government oversight and these inflexible regulations. Performing an experiment as simple as taking a blood sample from a patient and measuring something to test the plausibility of an idea concerning that individual's condition—a frequent procedure in the pre-GABC era—is no longer possible. All too often, by the time an ethics review board sanctions the experiment, the patient is long gone.

    These topical emphases and workforce shifts are not categorically counterproductive and have contributed to progress. But capitalizing on basic science discoveries to produce practical innovation requires skill sets and pursuits—described in this book—not taught or rewarded in academic institutions.

    Of the long list of underappreciated subjects covered in this book, one that I mention here is the economics of innovation, the linchpin of which is unlocking investment. Most academics do not comprehend, even in the most abstract form, the difficulty and high failure rate of innovation. Attracting investment is extremely daunting, yet most academic researchers do not appreciate that evaluating market potential and devising strategies to diminish risk are obligatory activities for obtaining such investment. When a few highly successful biotechnology products licensed by universities delivered large royalties, most academic health care institutions created patenting and licensing bureaucracies. Economic illiteracy and the lack of business acumen of both the academic researchers and the institutional offices originally designed to help them explain why the inadequately funded and staffed technology transfer offices of these institutions rarely succeed in bringing their bright ideas to practical fruition.

    Moreover, as illustrated by the stories of Whipple, Minot, and Murphy recounted above and others, the deep-seated academic belief that only the elusive, complete comprehension of underlying mechanisms leads to health care benefits is simply false. The tyranny of this belief, however, is that potentially useful inventions may not advance, because the investigators and the institutional offices of technology transfer lack the resources for the time-consuming and expensive choreography required to define such mechanisms.

    The GABC hardly possesses a monopoly on behaviors and misaligned incentives counterproductive to health care innovation. The aspiring innovator must work with private industry to achieve innovation, but this investigator also needs to understand the difficulties that beset that sector. The extraordinary risks referred to above have divided it into two camps. One camp consists of the behemoths that have merged to try to maintain a sufficiently large pipeline to eke out some success despite the high failure rate of drug development. The size of these leviathans guarantees that they are Balkanized, poorly responsive to the need for change, and populated with risk-averse employees. They consistently follow competitors' successes rather than break new ground.

    The other camp comprises the many small- to medium-sized companies developing early technologies. They are more energetic, nimble, and accepting of risk than the battleship industries that these smaller companies hope will acquire them or their technologies, and they probably represent the most exciting homes for aspiring health care entrepreneurs. Because entrepreneurship is all about taking risk, the aspirant is likely to live in several such homes before succeeding.

    To conclude this introductory chapter, I have emphasized this history, because the impact of what we do today on future events is speculation. The past reveals what has and has not worked to promote meaningful innovation in health care.

    Based on the lessons of history, I begin my conclusion by summarizing what I am not advocating. I am not recommending that every health care professional engage in innovation. Skilled and compassionate delivery of health care within the confines of today's technologies is an extremely important and noble endeavor—and one all too often not well performed. Researchers with the interests and aptitudes that conform to the traditional status-seeking motives of scientists summarized above should be encouraged to pursue research for research's sake. Economic realities suggest the number of such individuals in academic institutions going forward will decline. Because most research discoveries have little or no impact on innovation, such downsizing may not materially impact health care innovation. That said, individuals committed to careers devoted solely to clinical care or basic science are probably not reading this book.
