Why women’s colleges are still needed
27 April 2016

By Jo Allen. The role of women’s colleges—far beyond their origins in offering access to college degrees—is to help women flourish. Some women’s colleges have focused on women’s leadership; some on career preparation in STEM and other fields where women have been under-represented; and still others on health care, education, and other areas where women excel.