Conventional medicine is a system in which medical doctors and other healthcare professionals (such as nurses, pharmacists, and therapists) treat symptoms and diseases using drugs, radiation, or surgery. Conventional medicine may also be called Western medicine.