What is Western medicine?
Pronunciation: /ˈwɛstərn ˈmɛdəsən/
Western medicine
Definition
A system in which medical doctors and other health care professionals (such as nurses, pharmacists, and therapists) treat symptoms and diseases using drugs, radiation, or surgery. Also called allopathic medicine, biomedicine, conventional medicine, mainstream medicine, and orthodox medicine.