tropical medicine

noun

the branch of medicine dealing with the study and treatment of diseases occurring in the tropics.
