Content provided by Soroush Pour. All podcast content, including episodes, graphics, and podcast descriptions, is uploaded and provided directly by Soroush Pour or their podcast platform partner. If you believe someone is using your copyrighted work without your permission, you can follow the process outlined here: https://pt.player.fm/legal.

Ep 12 - Education & advocacy for AI safety w/ Rob Miles (YouTube host)

1:21:26
Manage episode 405391218 series 3428190

We speak with Rob Miles. Rob is the host of the “Robert Miles AI Safety” channel on YouTube, the single most popular AI alignment video series out there: he has 145,000 subscribers, and his top video has ~600,000 views. He goes much deeper than most educational resources on alignment, covering important technical topics such as the orthogonality thesis, inner misalignment, and instrumental convergence.
Through his work, Robert has educated thousands on AI safety, including many now working on advocacy, policy, and technical research. His work has been invaluable for teaching and inspiring the next generation of AI safety experts and deepening public support for the cause.
Prior to his AIS education work, Robert studied Computer Science at the University of Nottingham.
We talk to Rob about:
* What got him into AI safety
* How he started making educational videos for AI safety
* What he's working on now
* His top advice for people who also want to do education & advocacy work, really in any field, but especially for AI safety
* How he thinks AI safety is currently going as a field of work
* What he wishes more people were working on within AI safety
Hosted by Soroush Pour. Follow me for more AGI content:
Twitter: https://twitter.com/soroushjp
LinkedIn: https://www.linkedin.com/in/soroushjp/
== Show links ==
-- About Rob --
* Rob Miles AI Safety channel - https://www.youtube.com/@RobertMilesAI
* Twitter - https://twitter.com/robertskmiles
-- Further resources --
* Channel where Rob first started making videos: https://www.youtube.com/@Computerphile
* Podcast ep w/ Eliezer Yudkowsky, who first convinced Rob to take AI safety seriously through reading Yudkowsky's writings: https://lexfridman.com/eliezer-yudkowsky/
Recording date: Nov 21, 2023


15 episodes

