Beyond the Hype: Science Communication in an AI-Driven World with Alison Kershaw
In this episode, our host Dr Zena Assaad is joined by Alison Kershaw, an award-winning science communicator. The conversation explores the challenges of science communication in an age of generative AI. Together they unpack the layered marketing behind artificial intelligence and why its promises resonate so widely. From claims about productivity and economic growth to the sweeping narratives that shape public perception, they explore how AI is being sold to us and why that matters. They also delve into the limitations of generative AI, such as ChatGPT, in science communication and explain why critical thinking is so important.
More about our guest:
📘LinkedIn: https://www.linkedin.com/in/alisonkershaw/
More about our host:
📸Instagram: https://www.instagram.com/zena_assaad/
📘LinkedIn: https://www.linkedin.com/in/dr-zena-assaad/
🔵BlueSky: https://web-cdn.bsky.app/profile/zenaassaad.bsky.social
🌐Website: https://www.zenaassaad.com/
Watch or listen to more episodes:
🍎 https://podcasts.apple.com/au/podcast/responsible-bytes/id1800177424
🔊 https://open.spotify.com/show/2x2yNvU6OY1LBLX1TsuAfB
Or find us anywhere else you get your podcasts!
Follow us online:
📸Instagram: https://www.instagram.com/responsiblebytespodcast/
📘LinkedIn: https://www.linkedin.com/company/responsible-bytes-podcast/
🌐Website: https://www.zenaassaad.com/responsible-bytes-podcast
--------
50:50
Responsible by design: An overview of the GC REAIM strategic guidance report with Sofia Romansky
In this episode of the Responsible Bytes podcast, Dr. Zena Assaad interviews Sofia Romansky, an analyst at The Hague Centre for Strategic Studies (HCSS) and project coordinator for the Global Commission on Responsible AI in the Military Domain (GC REAIM). They discuss the complexities of integrating AI into military practices, the challenges of achieving consensus among diverse stakeholders, and the importance of human-centred design in AI development. The conversation also covers the GC REAIM strategic guidance report, which outlines recommendations for responsible AI use in military contexts, and emphasises the need for multidisciplinary collaboration and reflection in technology engagement.
Resources mentioned in this episode:
Link to the GC REAIM strategic guidance report: https://hcss.nl/report/gc-reaim-responsible-by-design-strategic-guidance-report/
Link to the IEEE SA white paper: https://ieeexplore.ieee.org/document/10707139
More about our guest:
📘LinkedIn: https://www.linkedin.com/in/sofia-romansky/
🌐Website: https://hcss.nl/expert/sofia-romansky/
More about our host:
📸Instagram: https://www.instagram.com/zena_assaad/
📘LinkedIn: https://www.linkedin.com/in/dr-zena-assaad/
🔵BlueSky: https://web-cdn.bsky.app/profile/zenaassaad.bsky.social
🌐Website: https://www.zenaassaad.com/
Watch or listen to more episodes:
🍎 https://podcasts.apple.com/au/podcast/responsible-bytes/id1800177424
🔊 https://open.spotify.com/show/2x2yNvU6OY1LBLX1TsuAfB
Or find us anywhere else you get your podcasts!
Follow us online:
📸Instagram: https://www.instagram.com/responsiblebytespodcast/
📘LinkedIn: https://www.linkedin.com/company/responsible-bytes-podcast/
🌐Website: https://www.zenaassaad.com/responsible-bytes-podcast
--------
43:33
The Evolution of Human Control in Warfare with Lena Trabucco
In this episode of the Responsible Bytes podcast, Dr. Zena Assaad speaks with Dr. Lena Trabucco about the implications of AI and autonomous systems in military contexts. They explore the concept of human control in warfare, the differences between AI-enabled and traditional weapons systems, and the evolving role of operators. The conversation also touches on the challenges of academic writing, the impact of parenthood on career ambitions, and the importance of interdisciplinary dialogue.
Link to Lena's three-part article series on human control: https://digital-commons.usnwc.edu/ils/
More about our guest:
📘LinkedIn: https://www.linkedin.com/in/lena-trabucco-8075a340/
More about our host:
📸Instagram: https://www.instagram.com/zena_assaad/
📘LinkedIn: https://www.linkedin.com/in/dr-zena-assaad/
🔵BlueSky: https://web-cdn.bsky.app/profile/zenaassaad.bsky.social
🌐Website: https://www.zenaassaad.com/
Watch or listen to more episodes:
🍎 https://podcasts.apple.com/au/podcast/responsible-bytes/id1800177424
🔊 https://open.spotify.com/show/2x2yNvU6OY1LBLX1TsuAfB
Or find us anywhere else you get your podcasts!
Follow us online:
📸Instagram: https://www.instagram.com/responsiblebytespodcast/
📘LinkedIn: https://www.linkedin.com/company/responsible-bytes-podcast/
🌐Website: https://www.zenaassaad.com/responsible-bytes-podcast
--------
47:22
Navigating the future of military AI with Jessica Dorsey
In this episode of the Responsible Bytes podcast, Dr. Zena Assaad speaks with Jessica Dorsey, an Assistant Professor of International Law, about the recent meeting of the CCW Group of Governmental Experts (GGE) on Lethal Autonomous Weapons Systems (LAWS) held in Geneva. They discuss the complexities of defining lethal autonomous weapon systems, the importance of interdisciplinary collaboration, and the challenges posed by industry narratives and technical expertise in regulatory discussions. The conversation emphasises the need for responsible engagement with technology, particularly in military contexts, and the importance of asking questions across disciplines to foster better understanding and collaboration.
More about our guest:
📘LinkedIn: https://www.linkedin.com/in/jessicadorsey/
🌐Website: https://www.uu.nl/staff/JLDorsey
More about our host:
📸Instagram: https://www.instagram.com/zena_assaad/
📘LinkedIn: https://www.linkedin.com/in/dr-zena-assaad/
🔵BlueSky: https://web-cdn.bsky.app/profile/zenaassaad.bsky.social
🌐Website: https://www.zenaassaad.com/
Watch or listen to more episodes:
🍎 https://podcasts.apple.com/au/podcast/responsible-bytes/id1800177424
🔊 https://open.spotify.com/show/2x2yNvU6OY1LBLX1TsuAfB
Or find us anywhere else you get your podcasts!
Follow us online:
📸Instagram: https://www.instagram.com/responsiblebytespodcast/
📘LinkedIn: https://www.linkedin.com/company/responsible-bytes-podcast/
🌐Website: https://www.zenaassaad.com/responsible-bytes-podcast
--------
51:59
Dignity in Technology with Lorenn Ruster
In this episode of the Responsible Bytes podcast, Dr. Zena Assaad chats with Lorenn Ruster, a responsible AI researcher, to explore the concept of dignity in technology. They discuss the importance of placing dignity at the centre of AI systems, the intersection of dignity and human rights, and how entrepreneurship can honour dignity. Lorenn shares insights on the cultural perspectives of dignity, the challenges of defining it, and the need to move beyond checkbox culture in organisations. The conversation emphasises the importance of reflective practice in fostering responsibility in technology development and usage.
More about our guest:
📘LinkedIn: https://www.linkedin.com/in/lorennruster/
🌐Website: https://lorennruster.com/
More about our host:
📸Instagram: https://www.instagram.com/zena_assaad/
📘LinkedIn: https://www.linkedin.com/in/dr-zena-assaad/
🔵BlueSky: https://web-cdn.bsky.app/profile/zenaassaad.bsky.social
🌐Website: https://www.zenaassaad.com/
Watch or listen to more episodes:
🍎 https://podcasts.apple.com/au/podcast/responsible-bytes/id1800177424
🔊 https://open.spotify.com/show/2x2yNvU6OY1LBLX1TsuAfB
Or find us anywhere else you get your podcasts!
Follow us online:
📸Instagram: https://www.instagram.com/responsiblebytespodcast/
📘LinkedIn: https://www.linkedin.com/company/responsible-bytes-podcast/
🌐Website: https://www.zenaassaad.com/responsible-bytes-podcast
Welcome to the Responsible Bytes podcast, where we talk about all things safe and responsible technology. Technology is changing our world and the way we live in it. In this podcast we explore what those changes look like and what they mean for our present and our future. Listen in as I chat to people working in and around the technology sector, unpacking our increasingly complex and evolving digital world. Responsible Bytes is created and hosted by Dr Zena Assaad.