American Association of University Women

The American Association of University Women (AAUW) is a national organization founded in 1881. It was created at a time when women were beginning to earn college degrees but were unable to find jobs, and AAUW helped them secure employment. The organization's mission is to promote equity for...