<?xml version="1.0" encoding="UTF-8"?>
<?xml-stylesheet href="/stylesheet.xsl" type="text/xsl"?>
<rss version="2.0" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:sy="http://purl.org/rss/1.0/modules/syndication/" xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:itunes="http://www.itunes.com/dtds/podcast-1.0.dtd" xmlns:podcast="https://podcastindex.org/namespace/1.0">
  <channel>
    <atom:link rel="self" type="application/rss+xml" href="https://feeds.transistor.fm/siacast-preview" title="MP3 Audio"/>
    <atom:link rel="hub" href="https://pubsubhubbub.appspot.com/"/>
    <podcast:podping usesPodping="true"/>
    <title>SIAcast Preview</title>
    <generator>Transistor (https://transistor.fm)</generator>
    <itunes:new-feed-url>https://feeds.transistor.fm/siacast-preview</itunes:new-feed-url>
    <description>SIAcast Preview offers a sampling of discussions shaping research integrity and meta-research today. Listen to open episodes here, and become a subscriber at www.sci-integrity.com for access to the full library.</description>
    <copyright>© 2026 Science Integrity Alliance</copyright>
    <podcast:guid>6618ed99-be61-5cf4-82a6-7f760e668f9a</podcast:guid>
    <podcast:locked>yes</podcast:locked>
    <language>en</language>
    <pubDate>Sat, 02 May 2026 13:00:07 -0400</pubDate>
    <lastBuildDate>Sat, 02 May 2026 13:01:49 -0400</lastBuildDate>
    <link>https://www.sci-integrity.com/</link>
    <image>
      <url>https://img.transistorcdn.com/sOC_lCvla4QBWxYelik1P4270ppf0izcrkQOgiaBUP4/rs:fill:0:0:1/w:1400/h:1400/q:60/mb:500000/aHR0cHM6Ly9pbWct/dXBsb2FkLXByb2R1/Y3Rpb24udHJhbnNp/c3Rvci5mbS9kMjBl/N2ZkZmZmNGFhYmZm/YTRiN2M4OGYyNzE3/YWUzZS5wbmc.jpg</url>
      <title>SIAcast Preview</title>
      <link>https://www.sci-integrity.com/</link>
    </image>
    <itunes:category text="Science"/>
    <itunes:category text="Society &amp; Culture"/>
    <itunes:type>episodic</itunes:type>
    <itunes:author>Science Integrity Alliance</itunes:author>
    <itunes:image href="https://img.transistorcdn.com/sOC_lCvla4QBWxYelik1P4270ppf0izcrkQOgiaBUP4/rs:fill:0:0:1/w:1400/h:1400/q:60/mb:500000/aHR0cHM6Ly9pbWct/dXBsb2FkLXByb2R1/Y3Rpb24udHJhbnNp/c3Rvci5mbS9kMjBl/N2ZkZmZmNGFhYmZm/YTRiN2M4OGYyNzE3/YWUzZS5wbmc.jpg"/>
    <itunes:summary>SIAcast Preview offers a sampling of discussions shaping research integrity and meta-research today. Listen to open episodes here, and become a subscriber at www.sci-integrity.com for access to the full library.</itunes:summary>
    <itunes:subtitle>SIAcast Preview offers a sampling of discussions shaping research integrity and meta-research today.</itunes:subtitle>
    <itunes:keywords>research integrity, meta-research, open research, open science, research ethics, reproducibility, scholarly communication, Science Integrity Alliance</itunes:keywords>
    <itunes:owner>
      <itunes:name>Science Integrity Alliance</itunes:name>
    </itunes:owner>
    <itunes:complete>No</itunes:complete>
    <itunes:explicit>No</itunes:explicit>
    <item>
      <title>Research Self-Correction and Sleuthing</title>
      <itunes:episode>1</itunes:episode>
      <podcast:episode>1</podcast:episode>
      <itunes:title>Research Self-Correction and Sleuthing</itunes:title>
      <itunes:episodeType>full</itunes:episodeType>
      <guid isPermaLink="false">3b77dae7-597e-4e24-98c9-a2365532e798</guid>
      <link>https://www.sci-integrity.com/siacast-preview</link>
      <description>
        <![CDATA[<p><strong>What happens when the system designed to safeguard scientific truth starts to crack?<br></strong> In this episode of <em>SIAcast</em>, Jonny Coates and co-host Yagmur Ozturk dig into a growing crisis in research integrity, from AI-generated “papers” and hallucinated citations to peer-review overload and stealth edits in published work. With real-world examples (including a famously impossible lab rat image that passed peer review), the episode pulls back the curtain on how flawed science can slip through and why it’s happening more often.</p><p>But this isn’t a doom-and-gloom story. Featuring insights from research integrity experts Joanna Diong and René Aquarius, the conversation explores the human, institutional, and systemic forces shaping modern science. From perverse incentives and publication pressure to grassroots sleuthing communities fighting back, this episode asks a critical question: can we still trust science, and if so, what will it take to protect that trust?</p><p><strong>Key Takeaways</strong></p><ul><li><strong>The scale of the problem has changed</strong>: Scientific errors and fraud aren’t new, but AI and paper mills are amplifying them dramatically.</li><li><strong>Peer review is under strain</strong>: Overworked reviewers and overwhelmed editors are struggling to maintain quality control.</li><li><strong>AI is a double-edged sword</strong>: It helps detect fraud (e.g., image duplication tools) but also enables large-scale fabrication.</li><li><strong>Incentive systems are broken</strong>: Metrics like publication count and H-index can encourage questionable research practices.</li><li><strong>“Stealth corrections” undermine trust</strong>: Papers can be altered post-publication without any formal notice to readers.</li><li><strong>Responsibility is diffuse</strong>: No single actor can fix the system; change requires collective effort across the 
ecosystem.</li></ul><p><strong>Biographies</strong></p><p><br></p><p>Co-host</p><p><br></p><p><strong>Yagmur Ozturk</strong> is a PhD candidate in the ERC Nanobubbles project at Grenoble Alpes University, France, where she works on building semi-automatic systems to support post-publication peer review. She also serves as a maintainer for the Collection of Open Science Integrity Guides (COSIG), curating accessible guides that help anyone do forensic peer review.</p><p><br></p><p>Guests</p><p><br></p><p><strong>René Aquarius</strong> is a biomedical scientist in the Neurosurgery department of the Radboud University Medical Center in Nijmegen, The Netherlands. Alongside his regular work, which focuses mainly on systematic reviews and on in vivo, in vitro, and clinical research in vascular neurosurgery, he has been active as a "science sleuth" for the past two years, concentrating on image-related issues in scientific articles.</p><p><br></p><p><strong>Dr. Joanna Diong</strong> is a Senior Lecturer at The University of Sydney. Her research program aims to improve the value of health and medical research by using epidemiological methods to find and test ways to improve research quality. She serves as Research Integrity Advisor for the Faculty of Medicine and Health, champions good research practices through the Association for Interdisciplinary Meta-Research and Open Science (AIMOS), and has received awards for her contributions to peer review and improving research quality.</p><p><br></p><p><strong>Chapter Timestamps</strong></p><ul><li><strong>00:00</strong> – The “impossible rat” and why trust in science is under pressure</li><li><strong>01:44</strong> – AI in research: tool or threat?</li><li><strong>03:04</strong> – AI-generated fraud and hallucinated citations</li><li><strong>05:17</strong> – Peer review under strain &amp; publishing incentives</li><li><strong>07:43</strong> – Why research integrity matters (historical context)</li><li><strong>08:53</strong> – René’s origin story: spotting widespread issues</li><li><strong>11:20</strong> – How institutions handle research integrity</li><li><strong>15:28</strong> – What happens when misconduct is reported</li><li><strong>18:05</strong> – Why researchers cut corners</li><li><strong>21:23</strong> – Introducing COSIG and open integrity guides</li><li><strong>24:24</strong> – What are stealth corrections?</li><li><strong>27:48</strong> – Risks and backlash faced by sleuths</li><li><strong>29:05</strong> – Can we tell fraud from honest mistakes?</li><li><strong>31:28</strong> – How the research community responds</li><li><strong>35:45</strong> – Why integrity tools need to be accessible</li><li><strong>36:32</strong> – Who is responsible for fixing science?</li><li><strong>41:20</strong> – Rethinking peer review and trust systems</li><li><strong>46:24</strong> – What listeners can do to help</li><li><strong>48:39</strong> – Final reflections: the future of trust in science</li></ul><p>Music: Upbeat Corporate by JP Bianchini. Produced by <a href="https://ripplingideas.org">Rippling Ideas</a> for the <a href="https://www.sci-integrity.com/">Science Integrity Alliance</a>.</p>]]>
      </description>
      <content:encoded>
        <![CDATA[<p><strong>What happens when the system designed to safeguard scientific truth starts to crack?<br></strong> In this episode of <em>SIAcast</em>, Jonny Coates and co-host Yagmur Ozturk dig into a growing crisis in research integrity, from AI-generated “papers” and hallucinated citations to peer-review overload and stealth edits in published work. With real-world examples (including a famously impossible lab rat image that passed peer review), the episode pulls back the curtain on how flawed science can slip through and why it’s happening more often.</p><p>But this isn’t a doom-and-gloom story. Featuring insights from research integrity experts Joanna Diong and René Aquarius, the conversation explores the human, institutional, and systemic forces shaping modern science. From perverse incentives and publication pressure to grassroots sleuthing communities fighting back, this episode asks a critical question: can we still trust science, and if so, what will it take to protect that trust?</p><p><strong>Key Takeaways</strong></p><ul><li><strong>The scale of the problem has changed</strong>: Scientific errors and fraud aren’t new, but AI and paper mills are amplifying them dramatically.</li><li><strong>Peer review is under strain</strong>: Overworked reviewers and overwhelmed editors are struggling to maintain quality control.</li><li><strong>AI is a double-edged sword</strong>: It helps detect fraud (e.g., image duplication tools) but also enables large-scale fabrication.</li><li><strong>Incentive systems are broken</strong>: Metrics like publication count and H-index can encourage questionable research practices.</li><li><strong>“Stealth corrections” undermine trust</strong>: Papers can be altered post-publication without any formal notice to readers.</li><li><strong>Responsibility is diffuse</strong>: No single actor can fix the system; change requires collective effort across the 
ecosystem.</li></ul><p><strong>Biographies</strong></p><p><br></p><p>Co-host</p><p><br></p><p><strong>Yagmur Ozturk</strong> is a PhD candidate in the ERC Nanobubbles project at Grenoble Alpes University, France, where she works on building semi-automatic systems to support post-publication peer review. She also serves as a maintainer for the Collection of Open Science Integrity Guides (COSIG), curating accessible guides that help anyone do forensic peer review.</p><p><br></p><p>Guests</p><p><br></p><p><strong>René Aquarius</strong> is a biomedical scientist in the Neurosurgery department of the Radboud University Medical Center in Nijmegen, The Netherlands. Alongside his regular work, which focuses mainly on systematic reviews and on in vivo, in vitro, and clinical research in vascular neurosurgery, he has been active as a "science sleuth" for the past two years, concentrating on image-related issues in scientific articles.</p><p><br></p><p><strong>Dr. Joanna Diong</strong> is a Senior Lecturer at The University of Sydney. Her research program aims to improve the value of health and medical research by using epidemiological methods to find and test ways to improve research quality. She serves as Research Integrity Advisor for the Faculty of Medicine and Health, champions good research practices through the Association for Interdisciplinary Meta-Research and Open Science (AIMOS), and has received awards for her contributions to peer review and improving research quality.</p><p><br></p><p><strong>Chapter Timestamps</strong></p><ul><li><strong>00:00</strong> – The “impossible rat” and why trust in science is under pressure</li><li><strong>01:44</strong> – AI in research: tool or threat?</li><li><strong>03:04</strong> – AI-generated fraud and hallucinated citations</li><li><strong>05:17</strong> – Peer review under strain &amp; publishing incentives</li><li><strong>07:43</strong> – Why research integrity matters (historical context)</li><li><strong>08:53</strong> – René’s origin story: spotting widespread issues</li><li><strong>11:20</strong> – How institutions handle research integrity</li><li><strong>15:28</strong> – What happens when misconduct is reported</li><li><strong>18:05</strong> – Why researchers cut corners</li><li><strong>21:23</strong> – Introducing COSIG and open integrity guides</li><li><strong>24:24</strong> – What are stealth corrections?</li><li><strong>27:48</strong> – Risks and backlash faced by sleuths</li><li><strong>29:05</strong> – Can we tell fraud from honest mistakes?</li><li><strong>31:28</strong> – How the research community responds</li><li><strong>35:45</strong> – Why integrity tools need to be accessible</li><li><strong>36:32</strong> – Who is responsible for fixing science?</li><li><strong>41:20</strong> – Rethinking peer review and trust systems</li><li><strong>46:24</strong> – What listeners can do to help</li><li><strong>48:39</strong> – Final reflections: the future of trust in science</li></ul><p>Music: Upbeat Corporate by JP Bianchini. Produced by <a href="https://ripplingideas.org">Rippling Ideas</a> for the <a href="https://www.sci-integrity.com/">Science Integrity Alliance</a>.</p>]]>
      </content:encoded>
      <pubDate>Sun, 26 Apr 2026 14:56:25 -0400</pubDate>
      <dc:creator>Science Integrity Alliance</dc:creator>
      <enclosure url="https://media.transistor.fm/1e988378/49a678aa.mp3" length="71015691" type="audio/mpeg"/>
      <itunes:author>Science Integrity Alliance</itunes:author>
      <itunes:duration>2958</itunes:duration>
      <itunes:summary>
        <![CDATA[<p><strong>What happens when the system designed to safeguard scientific truth starts to crack?<br></strong> In this episode of <em>SIAcast</em>, Jonny Coates and co-host Yagmur Ozturk dig into a growing crisis in research integrity, from AI-generated “papers” and hallucinated citations to peer-review overload and stealth edits in published work. With real-world examples (including a famously impossible lab rat image that passed peer review), the episode pulls back the curtain on how flawed science can slip through and why it’s happening more often.</p><p>But this isn’t a doom-and-gloom story. Featuring insights from research integrity experts Joanna Diong and René Aquarius, the conversation explores the human, institutional, and systemic forces shaping modern science. From perverse incentives and publication pressure to grassroots sleuthing communities fighting back, this episode asks a critical question: can we still trust science, and if so, what will it take to protect that trust?</p><p><strong>Key Takeaways</strong></p><ul><li><strong>The scale of the problem has changed</strong>: Scientific errors and fraud aren’t new, but AI and paper mills are amplifying them dramatically.</li><li><strong>Peer review is under strain</strong>: Overworked reviewers and overwhelmed editors are struggling to maintain quality control.</li><li><strong>AI is a double-edged sword</strong>: It helps detect fraud (e.g., image duplication tools) but also enables large-scale fabrication.</li><li><strong>Incentive systems are broken</strong>: Metrics like publication count and H-index can encourage questionable research practices.</li><li><strong>“Stealth corrections” undermine trust</strong>: Papers can be altered post-publication without any formal notice to readers.</li><li><strong>Responsibility is diffuse</strong>: No single actor can fix the system; change requires collective effort across the 
ecosystem.</li></ul><p><strong>Biographies</strong></p><p><br></p><p>Co-host</p><p><br></p><p><strong>Yagmur Ozturk</strong> is a PhD candidate in the ERC Nanobubbles project at Grenoble Alpes University, France, where she works on building semi-automatic systems to support post-publication peer review. She also serves as a maintainer for the Collection of Open Science Integrity Guides (COSIG), curating accessible guides that help anyone do forensic peer review.</p><p><br></p><p>Guests</p><p><br></p><p><strong>René Aquarius</strong> is a biomedical scientist in the Neurosurgery department of the Radboud University Medical Center in Nijmegen, The Netherlands. Alongside his regular work, which focuses mainly on systematic reviews and on in vivo, in vitro, and clinical research in vascular neurosurgery, he has been active as a "science sleuth" for the past two years, concentrating on image-related issues in scientific articles.</p><p><br></p><p><strong>Dr. Joanna Diong</strong> is a Senior Lecturer at The University of Sydney. Her research program aims to improve the value of health and medical research by using epidemiological methods to find and test ways to improve research quality. She serves as Research Integrity Advisor for the Faculty of Medicine and Health, champions good research practices through the Association for Interdisciplinary Meta-Research and Open Science (AIMOS), and has received awards for her contributions to peer review and improving research quality.</p><p><br></p><p><strong>Chapter Timestamps</strong></p><ul><li><strong>00:00</strong> – The “impossible rat” and why trust in science is under pressure</li><li><strong>01:44</strong> – AI in research: tool or threat?</li><li><strong>03:04</strong> – AI-generated fraud and hallucinated citations</li><li><strong>05:17</strong> – Peer review under strain &amp; publishing incentives</li><li><strong>07:43</strong> – Why research integrity matters (historical context)</li><li><strong>08:53</strong> – René’s origin story: spotting widespread issues</li><li><strong>11:20</strong> – How institutions handle research integrity</li><li><strong>15:28</strong> – What happens when misconduct is reported</li><li><strong>18:05</strong> – Why researchers cut corners</li><li><strong>21:23</strong> – Introducing COSIG and open integrity guides</li><li><strong>24:24</strong> – What are stealth corrections?</li><li><strong>27:48</strong> – Risks and backlash faced by sleuths</li><li><strong>29:05</strong> – Can we tell fraud from honest mistakes?</li><li><strong>31:28</strong> – How the research community responds</li><li><strong>35:45</strong> – Why integrity tools need to be accessible</li><li><strong>36:32</strong> – Who is responsible for fixing science?</li><li><strong>41:20</strong> – Rethinking peer review and trust systems</li><li><strong>46:24</strong> – What listeners can do to help</li><li><strong>48:39</strong> – Final reflections: the future of trust in science</li></ul><p>Music: Upbeat Corporate by JP Bianchini. Produced by <a href="https://ripplingideas.org">Rippling Ideas</a> for the <a href="https://www.sci-integrity.com/">Science Integrity Alliance</a>.</p>]]>
      </itunes:summary>
      <itunes:keywords>research integrity, meta-research, open research, open science, research ethics, reproducibility, scholarly communication, Science Integrity Alliance</itunes:keywords>
      <itunes:explicit>No</itunes:explicit>
      <podcast:transcript url="https://share.transistor.fm/s/1e988378/transcript.txt" type="text/plain"/>
    </item>
  </channel>
</rss>
