Wiktionary
dominionism

n. A tendency among some conservative Christians, especially in the USA, to seek influence or control over secular civil government through political action.