A new study finds the U.S. medical field is less dominated by white men than it used to be, but there are still few Black and Hispanic doctors, dentists and pharmacists.