Radiation therapy has been in use as a cancer treatment for more than 100 years, with its earliest roots traced to the discovery of x-rays in 1895 by Wilhelm Röntgen.[6]
The field of radiation therapy began to grow in the early 1900s, largely due to the groundbreaking work of Nobel Prize-winning scientist Marie Curie, who discovered the radioactive elements polonium and radium. This began a new era in medical treatment and research.[6] Radium was used in various forms until the mid-1900s, when cobalt and caesium units came into use. Medical linear accelerators have also been used as sources of radiation since the late 1940s.
With Godfrey Hounsfield’s invention of computed tomography (CT) in 1971, three-dimensional planning became possible, creating a shift from 2-D to 3-D radiation delivery; CT-based planning allows physicians to determine the dose distribution more accurately using axial tomographic images of the patient’s anatomy. Orthovoltage and cobalt units have largely been replaced by megavoltage linear accelerators, valued for their penetrating energies and their lack of a physical radiation source.
The advent of new imaging technologies, including magnetic resonance imaging (MRI) in the 1970s and positron emission tomography (PET) in the 1980s, has moved radiation therapy from 3-D conformal delivery to intensity-modulated radiation therapy (IMRT) and image-guided radiation therapy (IGRT). These advances have allowed radiation oncologists to better see and target tumors, which has resulted in better treatment outcomes, more organ preservation, and fewer side effects.