Naturism


naturism

(Science: medicine) The belief or doctrine that attributes everything to nature as a sanative (healing) agent.