Dominionism Definition
də-mĭnyə-nĭzəm
noun
The theory or doctrine that Christians have a divine mandate to assume positions of power and influence over all aspects of society and government.
American Heritage
The belief that God gave humans the right to exercise control over the natural world.
American Heritage