Did you swallow a penny as a child? If so, your pediatrician likely used an X-ray to confirm its location. Nearly everyone in North America has received an X-ray at some point, whether to confirm a broken bone or at a routine dental exam. Even going through airport security can expose you to X-ray imaging. X-rays have been used in medicine since the late 1800s, but constant innovation has made the technology more accurate and much safer than in the past, delivering a significantly reduced dose of radiation while providing clearer images and better diagnoses. Today X-ray is by far the biggest segment of the medical imaging field, with $16 billion in equipment-level revenue in 2019.
X-rays were discovered accidentally, and named with an “X” because they were an unknown form of radiation. In the late 1800s Wilhelm Röntgen, a physicist and professor in Germany, was experimenting with cathode-ray tubes and noticed that a fluorescent material in the room would glow when a voltage was running through the tube, even though the tube was wrapped in black paper. The new rays could pass through most substances; denser materials showed up as shadows when the image was captured on a fluorescent screen or photographic plate. In one of his early experiments, he imaged the bones of his wife’s hand by exposing it to X-rays in front of a glass photographic plate; bones, which are denser than soft tissue, absorb more X-rays, so fewer rays pass through to expose the plate, creating a shadow image of the bones. The medical applications of X-rays were immediately clear, and there was considerable excitement in the scientific and medical community.
From Exotic Discovery to Everyday Medical Procedure
Within months of their discovery, X-rays were being used for medical diagnosis and scientific research. Doctors used them to visualize broken bones and tumors, and they were even used in field medicine to locate bullets during war. In fact, the first published medical use of X-rays came in 1896, when a doctor used an X-ray to find a lead bullet in the wrist of a boy who had accidentally shot himself. What would have been a long and invasive surgical search for the bullet instead became a faster and easier procedure, because the physician could locate the bullet first and then extract it. X-ray imaging was simple and cost-effective, and it led to significant advancements in diagnostics and better patient outcomes.
Initially X-rays were believed to be harmless, so they were used for all sorts of applications, from hair removal to shoe sizing. We are surrounded by things that emit radiation, from the sun to radon gas to naturally occurring uranium in the soil. This is referred to as background radiation, and most people in North America receive about 2.4 mSv of it per year. Early X-ray imaging technology, on the other hand, delivered around 75 mSv per session! The young boy with a bullet in his wrist in 1896 was likely exposed to more than 12,000 times the radiation that would be used in a clinic today. Radiation from early X-ray work caused burns among technicians and the scientists studying X-rays, and for some it even led to death – but these harms were not immediately attributed to X-rays. By 1929 there was mounting evidence that X-rays could cause significant harm, and the United States Advisory Committee on X-Ray and Radium Protection was formed. This group went on to create exposure standards, the foundation for the radiation exposure limits in place today; reducing radiation exposure while still allowing the use of an important medical tool has been the driver of significant innovation in X-ray imaging.
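To put rough numbers on that dose reduction, here is a minimal back-of-the-envelope sketch. The 75 mSv and 2.4 mSv figures come from the text above; the modern extremity X-ray dose of about 0.006 mSv is an assumption chosen for illustration, consistent with the "more than 12,000 times" claim, not a clinical reference value:

```python
# Illustrative dose comparison; the modern-dose figure is an assumption.
early_session_msv = 75.0       # early X-ray session dose, as cited above
modern_extremity_msv = 0.006   # assumed modern hand/wrist X-ray dose
background_per_year_msv = 2.4  # typical annual background dose

ratio = early_session_msv / modern_extremity_msv
years_of_background = early_session_msv / background_per_year_msv

print(f"Early session ~ {ratio:,.0f}x a modern extremity X-ray")
print(f"Early session ~ {years_of_background:.1f} years of background radiation")
```

In other words, a single early session was equivalent to roughly three decades of background exposure, which explains why burns appeared in technicians long before limits were set.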
Early X-ray imaging and advancement
X-ray imaging initially consisted of an X-ray emitter (a high-voltage cathode-ray tube) producing X-rays on one side of the subject (or sample), which would pass through and be captured by a detector on the opposite side. The detector could be a fluorescent screen for live imaging, or the image could be developed on film. However, early detectors were quite inefficient and the images they produced were very faint, so physicians had to dark-adapt their eyes for 10–15 minutes before using the screens. This inconvenience quickly spurred innovation in X-ray generation and detection, which simultaneously lowered radiation doses.
Procedures involving motion, like placing a screw in a broken bone or implanting a pacemaker, are challenging and require live X-ray imaging – a kind of X-ray video. An early advancement in X-ray imaging was the development of image intensifiers. Image intensifier technology has improved substantially over the years and is now coupled with digital imaging, but the basis remains similar: X-rays are absorbed by a phosphor (a solid material that luminesces when exposed to radiation, also called a scintillator), photons are emitted and converted to electrons, and those electrons are accelerated. The accelerated electrons then strike an output phosphor, producing a much larger amount of light – up to 5,000 times brighter than the light emitted from the input phosphor. This light is then captured by a detection device that allows the image to be viewed. Reducing the field of view on the input phosphor and mapping it to a larger field of view on the output phosphor also allows for image magnification, though this comes at a cost in image brightness. As computing came online in the 1970s, X-ray detectors were digitized, and optical lenses coupled the emitted light to a TV camera to create a video signal that could be viewed on a TV monitor.
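The magnification-versus-brightness trade-off can be sketched with the commonly used approximation that total brightness gain is the product of the electronic flux gain and the "minification gain" (the squared ratio of input to output phosphor diameters). The specific diameters and flux gain below are illustrative assumptions, not specifications of any real device:

```python
# Approximate image-intensifier brightness gain (numbers are illustrative).
def brightness_gain(flux_gain, d_input_mm, d_output_mm):
    """Total gain ~ flux gain x minification gain, where
    minification gain = (input diameter / output diameter)^2."""
    minification_gain = (d_input_mm / d_output_mm) ** 2
    return flux_gain * minification_gain

# Full field of view: a 230 mm input phosphor mapped to a 25 mm output phosphor.
full_fov = brightness_gain(flux_gain=60, d_input_mm=230, d_output_mm=25)
# Magnification mode uses a smaller input field, trading brightness for zoom.
mag_mode = brightness_gain(flux_gain=60, d_input_mm=150, d_output_mm=25)
print(round(full_fov), round(mag_mode))  # the magnified view is dimmer
```

Shrinking the active input area from 230 mm to 150 mm more than halves the gain here, which is exactly why zooming in on an image intensifier costs brightness (or requires a higher dose).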
From analog to digital: The modernization of X-ray imaging by adding CCD and CMOS cameras to image intensifiers
In another technological leap, pulsed fluoroscopy was introduced in the 1980s, allowing a further decrease in radiation dose at the cost of some temporal resolution in the captured video. More recently, traditional analog video recording has been replaced by charge-coupled device (CCD) cameras, which are more stable over time and produce less electronic noise. A CCD camera contains a silicon chip with photosensitive sites (sometimes only a single site) where photons are converted into electrons, so that a voltage can be read for each pixel of an image; this charge is shifted across the chip and read out at a single transistor. The chip produces an analog signal, but the CCD contains an analog-to-digital converter that converts it to a digital signal and allows for digital image reconstruction.
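The shift-and-digitize readout described above can be illustrated with a toy sketch: charge values are moved out of the array one pixel at a time through a single output node, then quantized. This is a conceptual model only – real CCDs shift charge packets through physical wells, and all numbers here are made up:

```python
# Toy sketch of CCD-style readout: charge shifts toward a single
# output node, then each value is digitized (all numbers illustrative).
def ccd_readout(charge_rows, full_scale=255):
    """Shift each row of collected charge out through one readout
    node in turn and quantize each analog value to a digital number."""
    digital_image = []
    for row in charge_rows:        # rows move toward the serial register
        digital_row = []
        while row:                 # pixels shift out one at a time
            charge = row.pop(0)    # read at the single output node
            adu = min(full_scale, round(charge))  # analog-to-digital step
            digital_row.append(adu)
        digital_image.append(digital_row)
    return digital_image

frame = [[10.2, 300.0], [55.7, 0.4]]
print(ccd_readout(frame))  # -> [[10, 255], [56, 0]]
```

The serial, one-node readout in this sketch is also the key contrast with CMOS sensors discussed next, where each pixel converts its own charge to a voltage and rows can be read out in parallel.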
CMOS (Complementary Metal-Oxide Semiconductor) sensors have been rapidly replacing CCDs in imaging. CMOS-based cameras are similar in principle to CCDs, but they convert the charge from a photosensitive pixel into a voltage reading at each pixel site – which makes them much faster at capturing images, though their image quality may not be quite as precise as a CCD's. While both technologies were invented in the late 1960s and 1970s, CCDs long dominated the market due to easier manufacturability. With the proliferation of smartphones, however – bringing both embedded cameras and improved manufacturing technology – CMOS sensors have risen in popularity for their faster imaging capabilities and lower power requirements.
Current state and the future of X-ray imaging in medicine
One challenge with the image intensifier is the production of image artifacts, along with a loss of spatial and contrast resolution as the phosphor ages. This has led to the development of flat-panel detectors. They operate either by indirect conversion, in which X-rays are converted to light by a scintillator and then to a proportional charge by a photodiode, or by direct conversion of X-rays into an electrical charge using semiconductor materials. Direct flat-panel detectors have superior image quality, but they are slower at taking images and unstable at room temperature, which complicates medical applications and shortens their lifetime. Indirect conversion detectors are most commonly used today, except in mammography, where direct conversion detectors are still prevalent. One major advantage of the flat-panel detector is simply reduced bulk: flat-panel detectors on C-arm diagnostic machines allow better patient access thanks to a more agile system with smaller components.
Improvements in detectors have been fueled in part by the discovery of novel scintillator materials. Those materials, however, are synthesized using methods that yield relatively small, inflexible detectors. New materials are now being investigated – perovskites in particular – that may lead to detectors with high sensitivity, larger image-capture area, flexibility, and a lower cost. X-ray sensors have undergone significant improvements too: new technologies like IGZO (Indium Gallium Zinc Oxide) and CMOS are replacing traditional amorphous silicon sensors, bringing faster image acquisition and increased sensitivity, which leads to better images at a lower dose.
If you have a broken bone, or kidney stones, today’s X-rays are an extremely safe and effective way to get a diagnosis and find a path to health. But what if you’ve got a torn ligament? Soft, non-calcified tissues don’t show up well on a traditional X-ray image. Contrast agents may be used intravenously or orally (most often in Computed Tomography, or CT, scans), but these only show blood vessels or the digestive tract accurately, so they wouldn’t help diagnose your ligament tear. Other imaging techniques, like MRI and ultrasound, are used for soft tissue – but work at MIT in the last decade and, more recently, at Tohoku University in Japan has led to techniques for imaging soft tissue using X-rays. The major advantage? Significantly improved image resolution – which means much smaller features of the soft tissue can be visualized. This is an exciting new area for X-ray imaging that could lead to better diagnostics and patient outcomes for soft tissue disease and injury.
It’s incredible how much of X-ray imaging remains true to the original technology of 1895; equally as impressive are the vast and varied ways that X-ray imaging has improved over the last 127 years. With new materials and techniques being developed and discovered, X-ray imaging will continue to be a valued tool for the medical community.