“If there were no past, science would be a myth; the human mind a desert. Evil would preponderate over good, and darkness would overspread the face of the moral and scientific world.”
Samuel D. Gross (Louisville Review 1:26–27, 1856)
From earliest recorded history through late in the 19th century, the manner of surgery changed little. During those thousands of years, surgical operations were always frightening, often fatal, and frequently infected. In this prescientific, preanesthetic, and preantiseptic time, procedures were performed only for the direst of necessities and were unlike anything seen today; fully conscious patients were held or tied down to prevent their fleeing the surgeon’s unsparing knife. When a surgeon, or at least a person who used the sobriquet “surgeon,” performed an operation, it was inevitably for an ailment that could be visualized (i.e., on the skin and just below the surface, on the extremities, or in the mouth).
Through the 14th century in Europe, most surgical therapy was delivered by minimally educated barber-surgeons and other itinerant adherents of the surgical cause. These faithful but obscure followers of the craft of surgery, although ostracized by aristocratic, university-educated physicians who eschewed the notion of working with one’s hands, ensured the ultimate survival of what was then a vocation passed on from father to son. The roving “surgeons” mainly lanced abscesses; fixed simple fractures; dressed wounds; extracted teeth; and, on rare occasions, amputated a digit, limb, or breast. Around the 15th century, the highborn physicians began to show an interest in the art of surgery. As surgical techniques evolved, knife bearers, whether privileged physicians or wandering vagabonds, ligated arteries for readily accessible aneurysms, excised large visible tumors, performed trephinations, devised ingenious methods to reduce incarcerated and strangulated hernias, and created rudimentary colostomies and ileostomies by simply incising the skin over an expanding intraabdominal mass that represented the end stage of an intestinal blockage. The more entrepreneurial scalpel wielders widened the scope of their activities by focusing on the care of anal fistulas, bladder stones, and cataracts. Notwithstanding the growing boldness and ingenuity of “surgeons,” surgical operations on the cavities of the body (i.e., abdomen, cranium, joints, and thorax) were generally unknown and, if attempted, fraught with danger.
Despite the terrifying nature of surgical intervention, operative surgery in the prescientific era was regarded as an important therapy within the whole of Medicine. (In this chapter, “Medicine” signifies the totality of the profession, and “medicine” indicates internal medicine as differentiated from surgery, obstetrics, pediatrics, and other specialties.) This seeming paradox, in view of the limited technical appeal of surgery, is explained by the fact that surgical procedures were performed for disorders observable on the surface of the body: There was an “objective” anatomic diagnosis. The men who performed surgical operations saw what needed to be fixed (e.g., inflamed boils, broken bones, bulging tumors, grievous wounds, necrotic digits and limbs, rotten teeth) and treated the problem in as rational a manner as the times permitted.
For individuals who practiced medicine, care was rendered in a more “subjective” manner involving diseases whose etiologies were neither seen nor understood. It is difficult to treat the symptoms of illnesses such as arthritis, asthma, diabetes, and heart failure when there is no scientific understanding as to what constitutes their pathologic and physiologic underpinnings. It was not until the 19th century, with its advances in pathologic anatomy and experimental physiology, that practitioners of medicine were able to embrace a therapeutic viewpoint more closely approximating that of surgeons. There was no longer a question of treating signs and symptoms in a blind manner. Similar to surgeons who operated on maladies that could be physically described, physicians now cared for patients using clinical details based on “objective” pathophysiologic findings.
Surgeons never needed a diagnostic and pathologic/physiologic revolution in the style of the physician. Despite the imperfection of their knowledge, prescientific surgeons, with their unwavering amputation/extirpation approach to treatment, sometimes did cure, and with technical confidence. Notwithstanding their dexterity, it required the spread of the revolution in Medicine during the 1880s and 1890s and the implementation of aseptic techniques, along with other soon-to-come discoveries, including the x-ray, blood transfusion, and frozen section, to allow surgeons to emerge as specialists. It would take several more decades, well into the 20th century, for administrative and organizational events to occur before surgery could be considered a bona fide profession.
The explanation for the slow rise of surgery was the protracted elaboration of four key elements (knowledge of anatomy, control of bleeding, control of pain, and control of infection) that were more critical than technical skills when it came to the performance of a surgical procedure. These prerequisites had to be understood and accepted before a surgical operation could be considered a viable therapeutic option. The first two elements started to be addressed in the 16th century, and, although surgery greatly benefited from the breakthroughs, its reach was not extended beyond the exterior of the body, and pain and infection continued to be issues for the patient and the surgical operation. Over the ensuing 300 years, there was little further improvement until the discovery of anesthesia in the 1840s and recognition of surgical antisepsis during the 1870s and 1880s. The subsequent blossoming of scientific surgery brought about managerial and socioeconomic initiatives (standardized postgraduate surgical education and training programs; experimental surgical research laboratories; specialty journals, textbooks, monographs, and treatises; and professional societies and licensing organizations) that fostered the concept of professionalism. By the 1950s, the result was a unified profession that was practical and scholarly in nature. Some of the details of the rise of modern surgery follow—specifically how the four key elements that allowed a surgical operation to be viewed as a practical therapeutic choice came to be acknowledged.
Although knowledge of anatomy is the primary requirement of surgery, it was not until the mid-1500s and the height of the European Renaissance that the first great contribution to an understanding of the structure of the human body occurred. This came about when Popes Sixtus IV (1414–1484) and Clement VII (1478–1534) reversed the church’s long-standing ban on human dissection and sanctioned the study of anatomy from the cadaver. Andreas Vesalius (1514–1564) (Fig. 1.1) stepped to the forefront of anatomic studies along with his celebrated treatise, De Humani Corporis Fabrica Libri Septem (1543). The Fabrica broke with the past and provided more detailed descriptions of the human body than any of its predecessors. It corrected errors in anatomy that were propagated over a thousand years earlier by Greek and Roman authorities, especially Claudius Galen (AD 129–199), whose misleading and later church-supported views were based on animal rather than human dissection. Just as groundbreaking as his anatomic observations was Vesalius’ blunt assertion that dissection had to be completed hands-on by physicians themselves. This was a direct repudiation of the long-standing tradition that dissection was a loathsome task to be performed only by individuals in the lower class while the patrician physician sat on high reading out loud from a centuries-old anatomic text.
Vesalius was born in Brussels to a family with extensive ties to the court of the Holy Roman Emperors. He received his medical education in France at universities in Montpellier and Paris and for a short time taught anatomy near his home in Louvain. Following several months’ service as a surgeon in the army of Charles V (1500–1558), the 23-year-old Vesalius accepted an appointment as professor of anatomy at the University of Padua in Italy. He remained there until 1544, when he resigned his post to become court physician to Charles V and later to Charles’ son, Philip II (1527–1598). Vesalius was eventually transferred to Madrid, but for various reasons, including supposed trouble with authorities of the Spanish Inquisition, he planned a return to his academic pursuits. First, however, in 1563, Vesalius set sail on a year-long pilgrimage to the Holy Land. On his return voyage, Vesalius’ ship was wrecked, and he and others were stranded on the small Ionian island of Zakynthos. Vesalius died there as a result of exposure, starvation, and the effects of a severe illness, probably typhoid.
The 7 years that Vesalius spent in Padua left an indelible mark on the evolution of Medicine and especially surgery. His well-publicized human dissections drew large crowds, and Vesalius was in constant demand to provide anatomic demonstrations in other Italian cities, all of which culminated in the publication of the Fabrica. Similar to most revolutionary works, the book attracted critics and sympathizers, and the youthful Vesalius was subjected to vitriolic attacks by some of the most renowned anatomists of that era. To his many detractors, the impassioned Vesalius often responded with intemperate counterattacks that did little to further his cause. In one fit of anger, Vesalius burned a trove of his own manuscripts and drawings.
The popularity of Vesalius’ Fabrica rested on its outstanding illustrations. For the first time, detailed drawings of the human body were closely integrated with an accurate written text. Artists, believed to be from the school of Titian (1477–1576) in Venice, produced pictures that were scientifically accurate and creatively beautiful. The woodcuts, with their majestic skeletons and flayed muscled men set against backgrounds of rural and urban landscapes, became the standard for anatomic texts for several centuries.
The work of Vesalius paved the way for wide-ranging research into human anatomy, highlighted by a fuller understanding of the circulation of blood. In 1628, William Harvey (1578–1657) showed that the heart acts as a pump and forces blood along the arteries and back via veins, forming a closed loop. Although Harvey was not a surgeon, his research had enormous implications for the evolution of surgery, particularly its relationship with anatomy and the conduct of surgical operations. As a result, in the 17th century, links between anatomy and surgery intensified as skilled surgeon-anatomists arose.
During the 18th century and first half of the 19th century, surgeon-anatomists made some of their most remarkable observations. Each country had its renowned individuals: In The Netherlands were Govard Bidloo (1649–1713), Bernhard Siegfried Albinus (1697–1770), and Pieter Camper (1722–1789); Albrecht von Haller (1708–1777), August Richter (1742–1812), and Johann Friedrich Meckel (1781–1833) worked in Germany; Antonio Scarpa (1752–1832) worked in Italy; and in France, Pierre-Joseph Desault (1744–1795), Jules Cloquet (1790–1883), and Alfred Armand Louis Marie Velpeau (1795–1867) were the most well-known. Above all, however, were the efforts of numerous British surgeon-anatomists who established a well-deserved tradition of excellence in research and teaching.
William Cowper (1666–1709) was one of the earliest and best known of the English surgeon-anatomists, and his student, William Cheselden (1688–1752), established the first formal course of instruction in surgical anatomy in London in 1711. In 1713, Cheselden’s Anatomy of the Human Body was published and became so popular that it went through at least 13 editions. Alexander Monro (primus) (1697–1767) was Cheselden’s mentee and later established a center of surgical-anatomic teaching in Edinburgh, which was eventually led by his son Alexander (secundus) (1737–1817) and grandson Alexander (tertius) (1773–1859). In London, John Hunter (1728–1793) (Fig. 1.2), who is considered among the greatest surgeons of all time, gained fame as a comparative anatomist-surgeon, while his brother, William Hunter (1718–1783), was a successful obstetrician who authored the acclaimed atlas, Anatomy of the Human Gravid Uterus (1774). Another brother duo, John Bell (1763–1820) and Charles Bell (1774–1842), worked in Edinburgh and London, where their exquisite anatomic engravings exerted a lasting influence. By the middle of the 19th century, surgical anatomy as a scientific discipline was well established. However, as surgery evolved into a more demanding profession, the anatomic atlases and illustrated surgical textbooks were less likely to be written by the surgeon-anatomist and instead were written by the full-time anatomist.
Although Vesalius brought about a greater understanding of human anatomy, one of his contemporaries, Ambroise Paré (1510–1590) (Fig. 1.3), proposed a method to control hemorrhage during a surgical operation. Similar to Vesalius, Paré is important to the history of surgery because he also represents a severing of the final link between the surgical thoughts and techniques of the ancients and the push toward a more modern era. The two men were acquaintances, both having been summoned to treat Henry II (1519–1559), who sustained what proved to be a fatal lance blow to his head during a jousting match.
Paré was born in France and, at an early age, apprenticed to a series of itinerant barber-surgeons. He completed his indentured education in Paris, where he served as a surgeon’s assistant/wound dresser in the famed Hôtel Dieu. From 1536 until just before his death, Paré worked as an army surgeon (he accompanied French armies on their military expeditions) while also maintaining a civilian practice in Paris. Paré’s reputation was so great that four French kings, Henry II, Francis II (1544–1560), Charles IX (1550–1574), and Henry III (1551–1589), selected him as their surgeon-in-chief. Despite being a barber-surgeon, Paré was eventually made a member of the Paris-based College of St. Côme, a self-important fraternity of university-educated physician/surgeons. On the strength of Paré’s personality and the enormity of his clinical triumphs, a rapprochement between the two groups ensued, which set a course for the rise of surgery in France.
In Paré’s time, applications of a cautery or boiling oil or both were the most commonly employed methods to treat a wound and control hemorrhage. Their use reflected belief in a medical adage dating back to the age of Hippocrates: Those diseases that medicines do not cure, iron cures; those that iron cannot cure, fire cures; and those that fire cannot cure are considered incurable. Paré changed such thinking when, on a battlefield near Turin, his supply of boiling oil ran out. Not knowing what to do, Paré blended a concoction of egg yolk, rose oil (a combination of ground-up rose petals and olive oil), and turpentine and treated the remaining injured. Over the next several days, he observed that the wounds of the soldiers dressed with the new mixture were neither as inflamed nor as tender as the wounds treated with hot oil. Paré abandoned the use of boiling oil not long afterward.
Paré sought other approaches to treat wounds and staunch hemorrhage. His decisive answer was the ligature, and its introduction proved a turning point in the evolution of surgery. The early history of ligation of blood vessels is shrouded in uncertainty, and whether it was the Chinese and Egyptians or the Greeks and Romans who first suggested the practice is a matter of historical conjecture. One thing is certain: The technique was long forgotten, and Paré considered his method of ligation during an amputation to be original and nothing short of divine inspiration. He even designed a predecessor to the modern hemostat, a pinching instrument called the bec de corbin, or “crow’s beak,” to control bleeding while the vessel was handled.
As with many groundbreaking ideas, Paré’s suggestions regarding ligatures were not readily accepted. The reasons given for the slow embrace range from a lack of skilled assistants to help expose blood vessels to the large number of instruments needed to achieve hemostasis—in preindustrial times, surgical tools were hand-made and expensive to produce. The result was that ligatures were not commonly used to control bleeding, especially during an amputation, until other devices were available to provide temporary hemostasis. This did not occur until the early 18th century when Jean-Louis Petit (1674–1750) invented the screw compressor tourniquet. Petit’s device placed direct pressure over the main artery of the extremity to be amputated and provided the short-term control of bleeding necessary to allow the accurate placement of ligatures. Throughout the remainder of the 18th and 19th centuries, the use of new types of sutures and tourniquets increased in tandem as surgeons attempted to ligate practically every blood vessel in the body. Nonetheless, despite the abundance of elegant instruments and novel suture materials (ranging from buckskin to horsehair), the satisfactory control of bleeding, especially in delicate surgical operations, remained problematic.
Starting in the 1880s, surgeons began to experiment with electrified devices that could cauterize. These first-generation electrocauteries were ungainly machines, but they did quicken the conduct of a surgical operation. In 1926, Harvey Cushing (1869–1939), professor of surgery at Harvard, experimented with a less cumbersome surgical device that contained two separate electric circuits, one to incise tissue without bleeding and the other simply to coagulate. The apparatus was designed by a physicist, William Bovie (1881–1958), and the two men collaborated to develop interchangeable metal tips, steel points, and wire loops that could be attached to a sterilizable pistol-like grip used to direct the electric current. As the electrical and engineering snags were sorted out, the Bovie electroscalpel became an instrument of trailblazing promise; almost a century later, it remains a fundamental tool in the surgeon’s armamentarium.
In the prescientific era, the inability of surgeons to perform pain-free operations was among the most terrifying dilemmas of Medicine. To avoid the horror of the surgeon’s merciless knife, patients often refused to undergo a needed surgical operation or repeatedly delayed the event. That is why a scalpel wielder was more concerned about the speed with which he could complete a procedure than the effectiveness of the dissection. Narcotic and soporific agents, such as hashish, mandrake, and opium, had been used for thousands of years, but all were for naught. Nothing provided any semblance of freedom from the misery of a surgical operation. This was among the reasons why the systematic surgical exploration of the abdomen, cranium, joints, and thorax had to wait.
As anatomic knowledge and surgical techniques improved, the search for safe methods to render a patient insensitive to pain became more pressing. By the mid-1830s, nitrous oxide had been discovered, and so-called laughing gas frolics were coming into vogue as young people amused themselves with the pleasant side effects of this compound. After several sniffs, individuals lost their sense of equilibrium, carried on without inhibition, and felt little discomfort as they clumsily knocked into nearby objects. Some physicians and dentists realized that the pain-relieving qualities of nitrous oxide might be applicable to surgical operations and tooth extractions.
A decade later, Horace Wells (1815–1848), a dentist from Connecticut, had fully grasped the concept of using nitrous oxide for inhalational anesthesia. In early 1845, he traveled to Boston to share his findings with a dental colleague, William T.G. Morton (1819–1868), in the hopes that Morton’s familiarity with the city’s medical elite would lead to a public demonstration of painless tooth-pulling. Morton introduced Wells to John Collins Warren (1778–1856), professor of surgery at Harvard, who invited Wells to show his discovery before a class of medical students, one of whom volunteered to have his tooth extracted. Wells administered the gas and grasped the tooth. Suddenly, the supposedly anesthetized student screamed in pain. An uproar ensued as catcalls and laughter broke out. A disgraced Wells fled the room followed by several bystanders who hollered at him that the entire spectacle was a “humbug affair.” For Wells, it was too much to bear. He returned to Hartford and sold his house and dental practice.
However, Morton understood the practical potential of Wells’ idea and took up the cause of pain-free surgery. Uncertain about the reliability of nitrous oxide, Morton began to test a compound that one of his medical colleagues, Charles T. Jackson (1805–1880), suggested would work better as an inhalational anesthetic—sulfuric ether. Armed with this advice, Morton studied the properties of the substance while perfecting his inhalational techniques. In fall 1846, Morton was ready to demonstrate the results of his experiments to the world and implored Warren to provide him a public venue. On October 16, with the seats of the operating amphitheater of Massachusetts General Hospital filled to capacity, a tense Morton, having anesthetized a 20-year-old man, turned to Warren and told him that all was ready. The silent crowd fixed its gaze on the surgeon’s every move. Warren grabbed a scalpel, made a 3-inch incision, and excised a small vascular tumor on the patient’s neck. For 25 minutes, the spectators watched in stunned disbelief as the surgeon performed a painless surgical operation.
Whether the men in the room realized that they had just witnessed one of the most important events in Medical history is unknown. An impressed Warren, however, slowly uttered the five most famous words in American surgery: “Gentlemen, this is no humbug.” No one knew what to do or say. Warren turned to his patient and repeatedly asked him whether he felt anything. The answer was a definitive no—no pain, no discomfort, nothing at all. Few medical discoveries have been so readily accepted as inhalational anesthesia. News of the momentous event spread swiftly as a new era in the history of surgery began. Within months, sulfuric ether and another inhalational agent, chloroform, were used in hospitals worldwide.
The acceptance of inhalational anesthesia fostered research on other techniques to achieve pain-free surgery. In 1885, William Halsted (1852–1922) (Fig. 1.4), professor of surgery at the Johns Hopkins Hospital in Baltimore, announced that he had used cocaine and infiltration anesthesia (nerve-blocking) with great success in more than 1000 surgical cases. At the same time, James Corning (1855–1923) of New York carried out the earliest experiments on spinal anesthesia, which were soon expanded on by August Bier (1861–1939) of Germany. By the late 1920s, spinal anesthesia and epidural anesthesia were widely used in the United States and Europe. The next great advance in pain-free surgery occurred in 1934, when the introduction of an intravenous anesthetic agent (sodium thiopental [Sodium Pentothal]) proved tolerable to patients, avoiding the sensitivity of the tracheobronchial tree to anesthetic vapors.
Anesthesia helped make the potential for surgical cures more seductive. Haste was no longer of prime concern. However, no matter how much the discovery of anesthesia contributed to the relief of pain during surgical operations, the evolution of surgery could not proceed until the problem of postoperative infection was resolved. If ways to deaden pain had never been conceived, a surgical procedure could still be performed, although with much difficulty. Such was not the case with infection. Absent antisepsis and asepsis, surgical procedures were more likely to end in death than in pain alone.
In the rise of modern surgery, several individuals and their contributions stand out as paramount. Joseph Lister (1827–1912) (Fig. 1.5), an English surgeon, belongs on this select list for his efforts to control surgical infection through antisepsis. Lister’s research was based on the findings of the French chemist Louis Pasteur (1822–1895), who studied the process of fermentation and showed that it was caused by the growth of living microorganisms. In the mid-1860s, Lister hypothesized that these invisible “germs,” or, as they became known, bacteria, were the cause of wound healing difficulties in surgical patients. He proposed that it was feasible to prevent suppuration by applying an antibacterial solution to a wound and covering the site in a dressing saturated with the same germicidal liquid.
Lister was born into a well-to-do Quaker family from London. In 1848, he received his medical degree from University College and was appointed a fellow of the Royal College of Surgeons 4 years later. He soon moved to Edinburgh, where he became an assistant to James Syme (1799–1870). Their mentor/mentee relationship was strengthened when Lister married Syme’s daughter Agnes (1835–1896). At the urging of his father-in-law, Lister applied for the position of professor of surgery in Glasgow. The 9 years that he spent there were the most important period in Lister’s career as a surgeon-scientist.
In spring 1865, a colleague told Lister about Pasteur’s research on fermentation and putrefaction. Lister was one of the few surgeons of his day who, because of his familiarity with the microscope (his father designed the achromatic lens and was one of the founders of modern microscopy), had the ability to understand Pasteur’s findings about microorganisms on a first-hand basis. Armed with this knowledge, Lister showed that an injury was already full of bacteria by the time the patient arrived at the hospital.
Lister recognized that the elimination of bacteria by excessive heat could not be applied to a patient. Instead, he turned to chemical antisepsis and, after experimenting with zinc chloride and sulfites, settled on carbolic acid (phenol). By 1866, Lister was instilling pure carbolic acid into wounds and onto dressings and spraying it into the atmosphere around the operative field and table. The following year, he authored a series of papers on his experience in which he explained that pus in a wound (these were the days of “laudable pus,” when it was mistakenly believed the more suppuration the better) was not a normal part of the healing process. Lister went on to make numerous modifications in his technique of dressings, manner of applying them, and choice of antiseptic solutions—carbolic acid was eventually abandoned in favor of other germicidal substances. He did not emphasize hand scrubbing but merely dipped his fingers into a solution of phenol and corrosive sublimate. Lister was incorrectly convinced that scrubbing created crevices in the palms of the hands where bacteria would proliferate.
A second major advance by Lister was the development of sterile absorbable sutures. Lister believed that much of the suppuration found in wounds was created by contaminated ligatures. To prevent the problem, Lister devised an absorbable suture impregnated with phenol. Because it was not a permanent ligature, he was able to cut it short, closing the wound tightly and eliminating the necessity of bringing the ends of the suture out through the incision, a surgical practice that had persisted since the days of Paré.
For many reasons, the acceptance of Lister’s ideas about infection and antisepsis was an uneven and slow process. First, the various procedural changes that Lister made during the evolution of his method created confusion. Second, listerism, as a technical exercise, was complicated and time-consuming. Third, early attempts by other surgeons to use antisepsis were abject failures. Finally, and most importantly, acceptance of listerism depended on an understanding of the germ theory, a hypothesis that many practical-minded scalpel wielders were loath to accept.
As a professional group, German-speaking surgeons were the earliest to grasp the importance of bacteriology and Lister’s ideas. In 1875, Richard von Volkmann (1830–1889) and Johann Nussbaum (1829–1890) commented favorably on their treatment of compound fractures with antiseptic methods. In France, Just Lucas-Championnière (1843–1913) was not far behind. The following year, Lister traveled to the United States, where he spoke at the International Medical Congress held in Philadelphia and gave additional lectures in Boston and New York. Lister’s presentations were memorable, sometimes lasting more than three hours, but American surgeons remained unconvinced about his message. American surgeons did not begin to embrace the principles of antisepsis until the mid-1880s. The same was also true in Lister’s home country, where he initially encountered strong opposition led by the renowned gynecologist Lawson Tait (1845–1899).
Over the years, Lister’s principles of antisepsis gave way to principles of asepsis, or the complete elimination of bacteria. The concept of asepsis was forcefully advanced by Ernst von Bergmann (1836–1907), professor of surgery in Berlin, who recommended steam sterilization (1886) as the ideal method to eradicate germs. By the mid-1890s, less clumsy antiseptic and aseptic techniques had found their way into most American and European surgical amphitheaters. Any lingering doubts about the validity of Lister’s concepts of wound infection were eliminated on the battlefields of World War I. Aseptic technique was virtually impossible to attain on the battlefield, but the invaluable principle of wound treatment by means of surgical debridement and mechanical irrigation with an antiseptic solution was developed by Alexis Carrel (1873–1944) (Fig. 1.6), the Nobel Prize-winning French-American surgeon, and Henry Dakin (1880–1952), an English chemist.
Once antiseptic and aseptic techniques had been accepted as routine elements of surgical practice, it was inevitable that other antibacterial rituals would take hold, in particular, the use of caps, hats, masks, drapes, gowns, and rubber gloves. Until the 1870s, surgeons did not use gloves because the concept of bacteria on the hands was not recognized. In addition, no truly functional glove had ever been designed. This situation changed in 1878, when an employee of the India-Rubber Works in Surrey, England, received British and U.S. patents for the manufacture of a surgical glove that had a “delicacy of touch.” The identity of the first surgeon who required that flexible rubber gloves be consistently worn for every surgical operation is uncertain. Halsted is regarded as the individual who popularized their use, although the idea of rubber gloves was not fully accepted until the 1920s.
In 1897, Jan Mikulicz-Radecki (1850–1905), a Polish-Austrian surgeon, devised a single-layer gauze mask to be worn during a surgical operation. An assistant modified the mask by placing two layers of cotton-muslin onto a large wire frame to keep the gauze away from the surgeon’s lips and nose. This modification was crucial because a German microbiologist showed that bacteria-laden droplets from the mouth and nose enhanced the likelihood of wound infection. Silence in the operating room became a cardinal feature of surgery in the early 20th century. At approximately the same time, when it was also determined that masks provided less protection if an individual was bearded, the days of surgeons sporting bushy beards and droopy mustaches went by the wayside.