The magnitude of the risks from low doses of radiation is one of the central questions in radiological protection. It is particularly relevant when discussing the justification and optimization of diagnostic medical exposures. Medical X-rays can undoubtedly confer substantial benefits in the healthcare of patients, but not without exposing them to effective doses ranging from a few microsieverts to a few tens of millisieverts. Do we have any evidence that these levels of exposure result in significant health risks to patients? The current consensus held by national and international radiological protection organizations is that, for these comparatively low doses, the most appropriate risk model is one in which the risk of radiation-induced cancer and hereditary disease is assumed to increase linearly with increasing radiation dose, with no threshold (the so-called linear no-threshold (LNT) model). However, the LNT hypothesis has been challenged both by those who believe that low doses of radiation are more damaging than the hypothesis predicts and by those who believe that they are less harmful, and possibly even beneficial (an effect often referred to as hormesis). This article reviews the evidence for and against both the LNT hypothesis and hormesis, and explains why the general scientific consensus currently favours the LNT model as the most appropriate dose-response relationship for radiation protection purposes at low doses. Finally, the article discusses the impact of the LNT model on the assessment of the risks from medical X-rays, and how this affects the justification and optimization of such exposures.