Anatomism Definition
noun
1. The application of the principles of anatomy, as in art. (Wiktionary)
2. The doctrine that the anatomical structure explains all the phenomena of the organism or of animal life. (Wiktionary)