Content provided by MIT OpenCourseWare. All podcast content, including episodes, graphics, and podcast descriptions, is uploaded and provided directly by MIT OpenCourseWare or their podcast platform partner. If you believe someone is using your copyrighted work without permission, you can follow the process outlined here: https://pt.player.fm/legal.
The Human Element in Machine Learning with Prof. Catherine D’Ignazio, Prof. Jacob Andreas & Harini Suresh

16:03
 

When computer science was in its infancy, programmers quickly realized that though computers are astonishingly powerful tools, the results they achieve are only as good as the data you feed into them. (This principle was quickly formalized as GIGO: “Garbage In, Garbage Out.”) What was true in the era of the UNIVAC has proved still to be true in the era of machine learning: among other well-publicized AI fiascos, chatbots that have interacted with bigots have learned to spew racist invective, while facial-recognition software trained solely on images of white people sometimes fails to recognize people of color as human. In this episode, we meet Prof. Catherine D’Ignazio of MIT’s Department of Urban Studies and Planning (DUSP) and Prof. Jacob Andreas and Harini Suresh of the Department of Electrical Engineering and Computer Science. In 2021, D’Ignazio, Andreas, and Suresh collaborated as part of the Social and Ethical Responsibilities of Computing initiative from the Schwarzman College of Computing in a project to teach computer science students in 6.864 Natural Language Processing to recognize how deep learning systems can replicate and magnify the biases inherent in the data sets that are used to train them.
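The bias-replication effect described above can be illustrated with a toy sketch (not from the episode or the course materials; the corpus, the word "groupX", and the scoring scheme are all invented for illustration). A naive word-sentiment model that scores words by the labels of the sentences they appear in will assign a negative score to a neutral term that, by sampling accident, co-occurs only with negative examples:

```python
# Toy illustration of "Garbage In, Garbage Out" in sentiment modeling:
# a word's score is the average label of the sentences containing it, so
# a neutral identity term seen only in negative training sentences
# inherits a negative score.
from collections import defaultdict

def train_word_scores(corpus):
    """Score each word by the mean label (+1/-1) of its sentences."""
    totals = defaultdict(float)
    counts = defaultdict(int)
    for text, label in corpus:
        for word in text.split():
            totals[word] += label
            counts[word] += 1
    return {w: totals[w] / counts[w] for w in totals}

# Skewed corpus: the sentiment-neutral token "groupX" happens to appear
# only in negatively labeled sentences.
corpus = [
    ("service was great", +1),
    ("food was great", +1),
    ("groupX event was awful", -1),
    ("groupX meeting was awful", -1),
]

scores = train_word_scores(corpus)
print(scores["great"])   # 1.0
print(scores["awful"])   # -1.0
print(scores["groupX"])  # -1.0, despite the term itself being neutral
```

Real deep learning systems are far more complex, but the mechanism is the same: the model has no access to ground truth beyond its training data, so skew in the data becomes skew in the predictions.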

Relevant Resources:

MIT OpenCourseWare

The OCW Educator Portal

Share your teaching insights

Social and Ethical Responsibilities of Computing (SERC) resource on OpenCourseWare

Case Studies in Social and Ethical Responsibilities of Computing

SERC website

Professor D’Ignazio’s faculty page

Professor Andreas’s faculty page

Harini Suresh’s personal website

Desmond Patton’s paper on analysis of communications on Twitter

Music in this episode by Blue Dot Sessions

Connect with Us

If you have a suggestion for a new episode or have used OCW to change your life or those of others, tell us your story. We’d love to hear from you!

Call us @ 617-715-2517

On our site

On Facebook

On Twitter

On Instagram

Stay Current

Subscribe to the free monthly "MIT OpenCourseWare Update" e-newsletter.

Support OCW

If you like Chalk Radio and OpenCourseWare, donate to help keep these programs going!

Credits

Sarah Hansen, host and producer

Brett Paci, producer

Dave Lishansky, producer

Script writing assistance by Aubrey Calaway

Show notes by Peter Chipman

