Meaning of United States Army

Definition of United States Army

(noun) the army of the United States of America; the service branch that organizes, trains, and equips soldiers for land warfare
