American Studies Definition
noun
An interdisciplinary field dealing with the study of the Americas, with a historical emphasis upon the United States. It traditionally incorporates the study of history, literature, and critical theory, but also includes fields as diverse as law, art, the media, film, religious studies, urban studies, women's studies, gender studies, anthropology, sociology, foreign policy, and the culture of the United States, among others.
Wiktionary