Hacking Deepfake Image Detection System with White and Black Box Attacks | A SecTor Cybersecurity Conference Toronto 2024 Conversation with Sagar Bhure | On Location Coverage with Sean Martin and Marco Ciappelli

Episode duration: 22:40
 
Guest: Sagar Bhure, Senior Security Researcher, F5 [@F5]

On LinkedIn | https://www.linkedin.com/in/sagarbhure/

At SecTor | https://www.blackhat.com/sector/2024/briefings/schedule/speakers.html#sagar-bhure-45119

____________________________

Hosts:

Sean Martin, Co-Founder at ITSPmagazine [@ITSPmagazine] and Host of Redefining CyberSecurity Podcast [@RedefiningCyber]

On ITSPmagazine | https://www.itspmagazine.com/sean-martin

Marco Ciappelli, Co-Founder at ITSPmagazine [@ITSPmagazine] and Host of Redefining Society Podcast

On ITSPmagazine | https://www.itspmagazine.com/itspmagazine-podcast-radio-hosts/marco-ciappelli

____________________________

Episode Notes

The authenticity of audio and visual media has become an increasingly significant concern. This episode digs into the issue with hosts Sean Martin and Marco Ciappelli and guest Sagar Bhure, Senior Security Researcher at F5.

Sean Martin and Marco Ciappelli engage with Bhure to discuss the challenges and potential solutions related to deepfake technology. Bhure reveals intricate details about the creation and detection of deepfake images and videos. He emphasizes the constant battle between creators of deepfakes and those developing detection tools.

The conversation highlights several alarming instances of deepfakes being used maliciously. Bhure recounts a 2020 case in which a 17-year-old student fooled Twitter’s verification system with an AI-generated image of a non-existent political candidate. In another incident, a Hong Kong firm lost $20 million to a deepfake video impersonating its CFO on a Zoom call. These examples underline the serious implications of deepfake technology for misinformation and financial fraud.

One core discussion point centers on the challenge of distinguishing real content from artificial content. Bhure explains that advances in AI and hardware make it increasingly difficult for the naked eye to tell genuine images from fakes. Even so, he notes that algorithms focusing on minute details such as skin texture, mouth movements, and audio sync can still identify deepfakes with varying degrees of success.
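
For readers who want a concrete sense of what such a detector looks like in code, here is a minimal Python sketch, assuming a pretrained real-vs-fake image classifier is available; the model file, input size, and 0.5 decision threshold are illustrative placeholders, not details from the episode.

```python
# Hypothetical artifact-based deepfake image scoring, not Bhure's actual detector.
# "deepfake_detector.pt" is a placeholder for any binary real-vs-fake classifier.
import torch
from torchvision import transforms
from PIL import Image

preprocess = transforms.Compose([
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406],
                         std=[0.229, 0.224, 0.225]),
])

def score_image(path: str, model: torch.nn.Module) -> float:
    """Return the model's estimated probability that the image is synthetic."""
    img = Image.open(path).convert("RGB")
    batch = preprocess(img).unsqueeze(0)   # shape: (1, 3, 224, 224)
    with torch.no_grad():
        logit = model(batch)               # assumed single real-vs-fake logit
    return torch.sigmoid(logit).item()

if __name__ == "__main__":
    model = torch.jit.load("deepfake_detector.pt").eval()  # placeholder model file
    prob_fake = score_image("suspect_face.jpg", model)
    print(f"fake probability: {prob_fake:.2f}")
```

Detectors of this kind lean on the subtle texture and blending artifacts Bhure mentions, which is also why their accuracy erodes as generators improve.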

Marco Ciappelli raises the question of how effective detection mechanisms can be integrated into social media platforms like Twitter, Facebook, and Instagram. Bhure proposes a 'secure by design' approach, advocating for verification of media content before it is uploaded. He argues that generative AI should be regulated to prevent misuse, while recognizing that artificially generated content also has beneficial applications.
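
As a rough illustration of what 'secure by design' pre-upload verification could look like on a platform's ingest path, the sketch below gates uploads on a detector score; the detector interface, thresholds, and labeling policy are assumptions for illustration and do not describe any real platform's API.

```python
# Hypothetical pre-upload verification gate ("secure by design"), illustrative only.
from dataclasses import dataclass

FAKE_THRESHOLD = 0.8   # placeholder rejection threshold
LABEL_THRESHOLD = 0.5  # placeholder "possibly AI-generated" labeling threshold

@dataclass
class UploadDecision:
    accepted: bool
    reason: str
    fake_score: float

def verify_before_publish(media_bytes: bytes, detector) -> UploadDecision:
    """Score media before it goes live; reject or label likely synthetic content."""
    score = detector.score(media_bytes)  # assumed detector interface
    if score >= FAKE_THRESHOLD:
        return UploadDecision(False, "rejected: likely synthetic media", score)
    if score >= LABEL_THRESHOLD:
        return UploadDecision(True, "accepted with 'possibly AI-generated' label", score)
    return UploadDecision(True, "accepted", score)
```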

The discussion shifts towards audio deepfakes, highlighting the complexity of their detection. According to Bhure, combining visual and audio detection can improve accuracy. He describes a potential method for audio verification, which involves profiling an individual’s voice over an extended period to identify any anomalies in future interactions.
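
The voice-profiling idea can be sketched in a few lines, assuming speaker embeddings are already extracted by an off-the-shelf speaker-verification model; the averaging step and the cosine-similarity threshold below are assumptions, not Bhure's specific method.

```python
# Hypothetical long-term voice-profile check; the embedding extractor is assumed
# to exist elsewhere, and the 0.75 threshold is an arbitrary placeholder.
import numpy as np

def build_voice_profile(embeddings: list[np.ndarray]) -> np.ndarray:
    """Average speaker embeddings collected over time into one profile vector."""
    return np.mean(np.stack(embeddings), axis=0)

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def is_anomalous(profile: np.ndarray, new_embedding: np.ndarray,
                 threshold: float = 0.75) -> bool:
    """Flag a new recording whose embedding drifts too far from the profile."""
    return cosine_similarity(profile, new_embedding) < threshold
```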

Businesses are not immune to the threat of deepfakes. Bhure notes that media outlets, financial institutions, and any organization that relies on digital communication must stay vigilant. He warns that deepfake technology can be weaponized to bypass security controls, spread misinformation, and carry out sophisticated phishing attacks.

As the technology advances, Bhure calls for continuous improvement in detection techniques and the development of robust systems to mitigate the risks deepfakes pose. He points to his upcoming session at SecTor in Toronto, 'Hacking Deepfake Image Detection System with White and Black Box Attacks,' where he will offer more comprehensive insights into combating this pressing issue.

____________________________

This Episode’s Sponsors

HITRUST: https://itspm.ag/itsphitweb

____________________________

Follow our SecTor Cybersecurity Conference Toronto 2024 coverage: https://www.itspmagazine.com/sector-cybersecurity-conference-2024-cybersecurity-event-coverage-in-toronto-canada

On YouTube: 📺 https://www.youtube.com/playlist?list=PLnYu0psdcllSCvf6o-K0forAXxj2P190S

Be sure to share and subscribe!

____________________________

Resources

Hacking Deepfake Image Detection System with White and Black Box Attacks: https://www.blackhat.com/sector/2024/briefings/schedule/#hacking-deepfake-image-detection-system-with-white-and-black-box-attacks-40909

Learn more about SecTor Cybersecurity Conference Toronto 2024: https://www.blackhat.com/sector/2024/index.html

____________________________

Catch all of our event coverage: https://www.itspmagazine.com/technology-cybersecurity-society-humanity-conference-and-event-coverage

Are you interested in sponsoring our event coverage with an ad placement in the podcast?

Learn More 👉 https://itspm.ag/podadplc

Want to tell your Brand Story as part of our event coverage?

Learn More 👉 https://itspm.ag/evtcovbrf

To see and hear more Redefining CyberSecurity content on ITSPmagazine, visit: https://www.itspmagazine.com/redefining-cybersecurity-podcast

To see and hear more Redefining Society stories on ITSPmagazine, visit:
https://www.itspmagazine.com/redefining-society-podcast
