United States of America
proper noun
or United States
Learner's definition of UNITED STATES OF AMERICA
the United States of America
or the United States
: country in North America
— American
adjective or noun