Encouraging Diversity in Data Analytics and What It Means To You feat. Dr. William Southerland

Season 1 | Episode 10
22m | Jan 30, 2024

How you get a mortgage, what type of health insurance you qualify for, or what school your child gets into is based on how data and statistics are interpreted. But the people behind those numbers matter just as much as the numbers themselves.

In the past five years, the field of data science has skyrocketed in the United States, from 1,700 jobs in 2016 to more than 10,000 jobs in 2021. But Black scientists, scholars, and researchers make up only 3% of the professionals who interpret data and analytics.

Today, we sit down with Dr. William Southerland. He is a professor of biochemistry and molecular biology, principal investigator of the Howard University Research Centers in Minority Institutions (RCMI) program, and the interim director of the new Center for Applied Data Science and Analytics at Howard.

Dr. Southerland and host Frank Tramble chat about the biases we all carry, where data and analytics are used in everyday life, the new Center for Applied Data Science and Analytics at Howard, and how we can fix all this by changing the demographics of data scientists.

From HU2U is a production of Howard University and is produced by University FM.

Episode Quotes:

The importance of data awareness

[19:14] Data impacts everyone. People from all disciplines and all backgrounds are impacted by data, but not everyone is aware of that. And if you're not aware of it, then you can't take advantage of it. Whether you are aware of it or not, you are a consumer of data, and if you're not aware of it, then the data you consume is being constructed for you by somebody else.

Equipping the community with data science exposure

[17:32] Data science is everywhere. It affects everybody. So we want to be proactive in making sure that we equip the Howard undergraduate community, and undergraduate communities at HBCUs around the country, with some data science exposure.

Data and its inherent biases

[05:24] A lot of the bias that winds up in algorithms isn't necessarily there intentionally. It's almost like an inadvertent inclusion. The way I look at it sometimes is that in order to build algorithms, there are two components of information that go in. One goes in by active inclusion, and that's the technical specifications. And the other inclusion is passive.
