Corporate America
"Corporate America" is an informal phrase describing the world of corporations within the United States not under government ownership.
Categories: American political terms, Capitalism, Vocabulary and usage stubs