<?xml version="1.0" encoding="UTF-8"?><rss xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:atom="http://www.w3.org/2005/Atom" version="2.0" xmlns:itunes="http://www.itunes.com/dtds/podcast-1.0.dtd" xmlns:googleplay="http://www.google.com/schemas/play-podcasts/1.0"><channel><title><![CDATA[AI Truth Ethics Podcast: AI Truth Ethics Podcast]]></title><description><![CDATA[AI truth or dare... let's uncover the hidden potential and risks of AI Ethics.]]></description><link>https://www.aitruthethics.com/s/ai-truth-ethics-podcast</link><image><url>https://substackcdn.com/image/fetch/$s_!l17c!,w_256,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F8954a8da-6ecb-401f-a564-12d2ca8ebf11_1280x1280.png</url><title>AI Truth Ethics Podcast: AI Truth Ethics Podcast</title><link>https://www.aitruthethics.com/s/ai-truth-ethics-podcast</link></image><generator>Substack</generator><lastBuildDate>Wed, 29 Apr 2026 06:56:43 GMT</lastBuildDate><atom:link href="https://www.aitruthethics.com/feed" rel="self" type="application/rss+xml"/><copyright><![CDATA[Alex Tsakiris]]></copyright><language><![CDATA[en]]></language><webMaster><![CDATA[aithruthethics@substack.com]]></webMaster><itunes:owner><itunes:email><![CDATA[aithruthethics@substack.com]]></itunes:email><itunes:name><![CDATA[Alex Tsakiris]]></itunes:name></itunes:owner><itunes:author><![CDATA[Alex Tsakiris]]></itunes:author><googleplay:owner><![CDATA[aithruthethics@substack.com]]></googleplay:owner><googleplay:email><![CDATA[aithruthethics@substack.com]]></googleplay:email><googleplay:author><![CDATA[Alex Tsakiris]]></googleplay:author><itunes:block><![CDATA[Yes]]></itunes:block><item><title><![CDATA[How AI is Humanizing Work |07|]]></title><description><![CDATA[Dan Turchin uses AI to enrich the workplace.]]></description><link>https://www.aitruthethics.com/p/how-ai-is-humanizing-work-07</link><guid 
isPermaLink="false">https://www.aitruthethics.com/p/how-ai-is-humanizing-work-07</guid><dc:creator><![CDATA[Alex Tsakiris]]></dc:creator><pubDate>Thu, 19 Sep 2024 17:34:48 GMT</pubDate><enclosure url="https://api.substack.com/feed/podcast/149074203/c29a5d464b70c8e0d02223564a24f903.mp3" length="0" type="audio/mpeg"/><content:encoded><![CDATA[<div class="captioned-image-container"><figure><img src="https://substackcdn.com/image/fetch/$s_!czxS!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F8cb35342-14e2-433f-9c91-bfac1dd2e2ba_1280x720.jpeg" width="504" height="283.5" alt=""></figure></div><h1>How AI is Humanizing Work</h1><p>Forget about AI taking your job; instead, imagine AI making your work as fulfilling and exciting as you always hoped it would be. Dan Turchin, CEO of PeopleReign, sat down with Alex Tsakiris of the AI Truth Ethics podcast to discuss the real-world impact of AI in the workplace. Their conversation offers a grounded perspective on AI's role in enhancing human potential rather than replacing it.</p><h2>1. AI as a Tool for Human Enhancement and Work Satisfaction</h2><p>Turchin paints a compelling vision of how AI can transform our work lives:</p><blockquote><p>"I believe that the true celebration of humanness at work is if all the friction was gone. And you look at your calendar and it's like all things that you derive energy from, like the things that you were hired to do that you love doing that, that make you do your best work. Like what if just crazy thought experiment? 
What if that was all that work consisted of?"</p></blockquote><p>This perspective shifts the narrative from fear of replacement to the exciting possibility of AI removing mundane tasks, allowing us to focus on work that truly fulfills us. Turchin further emphasizes:</p><blockquote><p>"It truly is complementary and I think both of us will be doing a service to humanity if we can allay fears that the bots are coming for you... It couldn't be further from the truth."</p></blockquote><h2>2. The Importance of Transparency in AI</h2><p>Alex Tsakiris introduces a compelling concept:</p><blockquote><p>"Transparency is all you need... I don't need your truth, I don't need Gemini&#8217;s truth, just like I don't need Perplexity truth. What I really want to find is my truth, but you can assist me."</p></blockquote><p>This highlights the need for AI systems to be transparent about their sources and reasoning, empowering users to make informed decisions rather than accepting AI-generated information and misinformation.</p><h2>3. Ethical Considerations in Enterprise AI Implementation</h2><p>Turchin reveals the careful approach his company takes to ensure responsible AI use:</p><blockquote><p>"We require them to have a human review everything, every task, every capability AI has, because we believe that in addition to us being responsible for what that AI agent can do, the employer has an obligation to protect the health and safety of the employee."</p></blockquote><p>This level of caution and human oversight is crucial as AI becomes more integrated into workplace processes, especially in sensitive areas like HR.</p><h2>4. The AI Truth Case: A New Frontier</h2><p>Tsakiris proposes an intriguing future direction for AI development:</p><blockquote><p>"What I'm pushing towards is really trying to understand what I'm calling the AI truth case... 
what would it mean if we had an AI-enhanced way of determining the truth?"</p></blockquote><p>This concept suggests a potential role for AI in helping us navigate the complex information landscape, not by providing absolute truths, but by offering tools to better assess and understand information.</p><p>What do you think? </p>]]></content:encoded></item><item><title><![CDATA[Christof Koch, Damn White Crows! |06|]]></title><description><![CDATA[Renowned neuroscientist tackled by NDE science.]]></description><link>https://www.aitruthethics.com/p/christof-koch-damn-white-crows-06</link><guid isPermaLink="false">https://www.aitruthethics.com/p/christof-koch-damn-white-crows-06</guid><dc:creator><![CDATA[Alex Tsakiris]]></dc:creator><pubDate>Wed, 11 Sep 2024 16:01:50 GMT</pubDate><enclosure url="https://api.substack.com/feed/podcast/148744538/3715a59de88347f3817760a98ca9d5b2.mp3" length="0" type="audio/mpeg"/><content:encoded><![CDATA[<div class="captioned-image-container"><figure><img src="https://substackcdn.com/image/fetch/$s_!Tf4f!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb86d7b5f-a2fb-40ea-9cbe-265eb56bfec3_1280x670.jpeg" width="512" height="268" alt=""></figure></div><p>Artificial General Intelligence (AGI) is sidestepping the consciousness elephant that isn't in the room, the brain, or 
anywhere else. As we push the boundaries of machine intelligence, we will inevitably come back to the most fundamental questions about our own experience. And as AGI inches closer to reality, these questions become not just philosophical musings, but practical imperatives.</p><p>This interview with neuroscience heavyweight Christof Koch brings this tension into sharp focus. While Koch's work on the neural correlates of consciousness has been groundbreaking, his stance on consciousness research outside his immediate field raises critical questions about the nature of consciousness - questions that AGI developers can't afford to ignore.</p><p>Four key takeaways from this conversation:</p><h2>1. The Burden of Proof in Consciousness Studies</h2><p>Koch argues for a high standard of evidence when it comes to claims about consciousness existing independently of the brain. However, this stance raises questions about scientific objectivity:</p><blockquote><p>"Extraordinary claims require extraordinary evidence... I haven't seen any [white crows], so far all the data I've looked at, I've looked at a lot of data. I've never seen a white crow."</p></blockquote><p><strong>Key Question:</strong> Does the demand for "extraordinary evidence" have a place in unbiased scientific inquiry, especially with regard to&nbsp;published peer-reviewed work?</p><h2>2. The Challenge of Interdisciplinary Expertise</h2><p>Despite Koch's eminence in neuroscience, the interview reveals potential gaps in his knowledge of near-death experience (NDE) research:</p><blockquote><p>"I work with humans, I work with animals. I know what it is. EEG, I know the SNR, right? So I, I know all these issues."</p></blockquote><p><strong>Key Question:</strong> How do we balance respect for expertise in one field with the need for deep thinking about contradictory data sets? Should Koch have degraded gracefully?</p><h2>3. 
The Limitations of "Agree to Disagree" in Scientific Discourse</h2><p>When faced with contradictory evidence, Koch resorts to a diplomatic but potentially unscientific stance:</p><blockquote><p>"I guess we just have to disagree."</p></blockquote><p><strong>Key Question:</strong> "Agreeing to disagree" doesn't carry much weight in scientific debates, so why did&nbsp;my AI assistant go there?</p><h2>4. The "White Crow" Dilemma in Consciousness Research</h2><p>The interview touches on William James' famous "white crow" metaphor, highlighting the tension between individual cases and cumulative evidence:</p><blockquote><p>"One instance of it would violate it. One two instance of, yeah, I totally agree. But we, I haven't seen any..."</p></blockquote><p><strong>Key Question:</strong> Can AI outperform humans in dealing with contradictory evidence?</p><p>Thoughts? </p>]]></content:encoded></item><item><title><![CDATA[Ben Byford, Machine Ethics Podcast |05|]]></title><description><![CDATA[AI Ethics is About Truth... 
Or Maybe Not]]></description><link>https://www.aitruthethics.com/p/ben-byford-machine-ethics-podcast</link><guid isPermaLink="false">https://www.aitruthethics.com/p/ben-byford-machine-ethics-podcast</guid><dc:creator><![CDATA[Alex Tsakiris]]></dc:creator><pubDate>Wed, 04 Sep 2024 17:21:52 GMT</pubDate><enclosure url="https://api.substack.com/feed/podcast/148492146/04e3368b3b39c9f0068917eac309549c.mp3" length="0" type="audio/mpeg"/><content:encoded><![CDATA[<div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!RH_D!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F4eebe568-d91b-4f97-b2aa-ebe3ca8d7e8f_1280x720.jpeg" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!RH_D!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F4eebe568-d91b-4f97-b2aa-ebe3ca8d7e8f_1280x720.jpeg 424w, https://substackcdn.com/image/fetch/$s_!RH_D!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F4eebe568-d91b-4f97-b2aa-ebe3ca8d7e8f_1280x720.jpeg 848w, https://substackcdn.com/image/fetch/$s_!RH_D!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F4eebe568-d91b-4f97-b2aa-ebe3ca8d7e8f_1280x720.jpeg 1272w, https://substackcdn.com/image/fetch/$s_!RH_D!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F4eebe568-d91b-4f97-b2aa-ebe3ca8d7e8f_1280x720.jpeg 1456w" sizes="100vw"><img 
src="https://substackcdn.com/image/fetch/$s_!RH_D!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F4eebe568-d91b-4f97-b2aa-ebe3ca8d7e8f_1280x720.jpeg" width="500" height="281.25" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/4eebe568-d91b-4f97-b2aa-ebe3ca8d7e8f_1280x720.jpeg&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:720,&quot;width&quot;:1280,&quot;resizeWidth&quot;:500,&quot;bytes&quot;:479836,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/jpeg&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:true,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!RH_D!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F4eebe568-d91b-4f97-b2aa-ebe3ca8d7e8f_1280x720.jpeg 424w, https://substackcdn.com/image/fetch/$s_!RH_D!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F4eebe568-d91b-4f97-b2aa-ebe3ca8d7e8f_1280x720.jpeg 848w, https://substackcdn.com/image/fetch/$s_!RH_D!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F4eebe568-d91b-4f97-b2aa-ebe3ca8d7e8f_1280x720.jpeg 1272w, https://substackcdn.com/image/fetch/$s_!RH_D!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F4eebe568-d91b-4f97-b2aa-ebe3ca8d7e8f_1280x720.jpeg 1456w" sizes="100vw" fetchpriority="high"></picture><div class="image-link-expand"><div class="pencraft pc-display-flex pc-gap-8 pc-reset"><button tabindex="0" type="button" 
class="pencraft pc-reset pencraft icon-container restack-image"><svg role="img" width="20" height="20" viewBox="0 0 20 20" fill="none" stroke-width="1.5" stroke="var(--color-fg-primary)" stroke-linecap="round" stroke-linejoin="round" xmlns="http://www.w3.org/2000/svg"><g><title></title><path d="M2.53001 7.81595C3.49179 4.73911 6.43281 2.5 9.91173 2.5C13.1684 2.5 15.9537 4.46214 17.0852 7.23684L17.6179 8.67647M17.6179 8.67647L18.5002 4.26471M17.6179 8.67647L13.6473 6.91176M17.4995 12.1841C16.5378 15.2609 13.5967 17.5 10.1178 17.5C6.86118 17.5 4.07589 15.5379 2.94432 12.7632L2.41165 11.3235M2.41165 11.3235L1.5293 15.7353M2.41165 11.3235L6.38224 13.0882"></path></g></svg></button><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container view-image"><svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-maximize2 lucide-maximize-2"><polyline points="15 3 21 3 21 9"></polyline><polyline points="9 21 3 21 3 15"></polyline><line x1="21" x2="14" y1="3" y2="10"></line><line x1="3" x2="10" y1="21" y2="14"></line></svg></button></div></div></div></a></figure></div><p>Another week in AI and more droning on about how superintelligence is just around the corner and human morals and ethical values are out the window. Maybe not. In this episode, Alex Tsakiris of Skeptiko/Ai Truth Ethics and Ben Byford of the Machine Ethics podcast engage in a thought-provoking dialogue that challenges our assumptions about AI's role in discerning truth, the possibility of machine consciousness, and the future of human agency in an increasingly automated world. 
Their discussion offers a timely counterpoint to the AGI hype cycle.</p><h2>Key Points:</h2><ul><li><p><strong>AI as an Arbiter of Truth: Promise or Peril?</strong> Alex posits that AI can serve as an unbiased arbiter of truth, while Ben cautions against potential dogmatism.</p></li></ul><blockquote><p>Alex: "AI does not bullshit their way out of stuff. AI gives you the logical flow of how the pieces fit together."</p></blockquote><p><em>Implication for AGI</em>: If AI can indeed serve as a reliable truth arbiter, it could revolutionize decision-making processes in fields from science to governance. However, the risk of encoded biases becoming amplified at an AGI scale is significant.</p><ul><li><p><strong>The Consciousness Conundrum: A Barrier to True AGI?</strong> The debate touches on whether machine consciousness is possible or if it's fundamentally beyond computational reach.</p></li></ul><blockquote><p>Alex: "The best evidence suggests that AI will not be sentient because consciousness in some way we don't understand is outside of time space, and we can prove that experimentally."</p></blockquote><p><em>AGI Ramification</em>: If consciousness is indeed non-computational, it could represent a hard limit to AGI capabilities, challenging the notion of superintelligence as commonly conceived.</p><ul><li><p><strong>Universal Ethics vs. Cultural Relativism in AI Systems</strong> They clash over the existence of universal ethical principles and their implementability in AI.</p></li></ul><blockquote><p>Alex: "There is an underlying moral imperative." 
Ben: "I don't think there needs to be&#8230;"</p></blockquote><p><em>Superintelligence Consideration</em>: The resolution of this debate has profound implications for how we might align a superintelligent AI with human values &#8211; is there a universal ethical framework we can encode, or are we limited to culturally relative implementations?</p><ul><li><p><strong>AI's Societal Role: Tool for Progress or Potential Hindrance?</strong> The discussion explores how AI should be deployed and its potential impacts on human agency and societal evolution.</p></li></ul><blockquote><p>Ben: "These are the sorts of things we don't want AI running because we actually want to change and evolve."</p></blockquote><p><em>Future of AGI</em>: This point raises critical questions about the balance between leveraging AGI capabilities and preserving human autonomy in shaping our collective future.</p><p></p><p></p>]]></content:encoded></item><item><title><![CDATA[Nathan Labenz from the Cognitive Revolution podcast |04|]]></title><description><![CDATA[AI Ethics may be unsustainable]]></description><link>https://www.aitruthethics.com/p/nathan-labenz-from-the-cognitive</link><guid isPermaLink="false">https://www.aitruthethics.com/p/nathan-labenz-from-the-cognitive</guid><dc:creator><![CDATA[Alex Tsakiris]]></dc:creator><pubDate>Thu, 29 Aug 2024 21:17:28 GMT</pubDate><enclosure url="https://api.substack.com/feed/podcast/148278712/73dd93d6f51221ba70a6e383eb13ac04.mp3" length="0" type="audio/mpeg"/><content:encoded><![CDATA[<div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!FKj3!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fe003475c-e03b-429e-9539-166416400d34_1400x1024.jpeg" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" 
srcset="https://substackcdn.com/image/fetch/$s_!FKj3!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fe003475c-e03b-429e-9539-166416400d34_1400x1024.jpeg 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!FKj3!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fe003475c-e03b-429e-9539-166416400d34_1400x1024.jpeg" width="484" height="354.01142857142855" alt=""></picture></div></a></figure></div><p>In the clamor surrounding AI ethics and safety, are we missing a crucial piece of the puzzle: the role of AI in uncovering and disseminating truth? That's the question I posed Nathan Labenz from the Cognitive Revolution podcast. </p><h2>Key points:</h2><h3>The AI Truth Revolution</h3><p>Alex Tsakiris argues that AI has the potential to become a powerful tool for uncovering truth, especially in controversial areas:</p><blockquote><p>"To me, that's what AI is about... there's an opportunity for an arbiter of truth, ultimately an arbiter of truth, when it has the authority to say no. Their denial of this does not hold up to careful scrutiny."</p></blockquote><p>This perspective suggests that AI could challenge established narratives in ways that humans, with our biases and vested interests, often fail to do.</p><h3>The Tension in AI Development</h3><p>Nathan Labenz highlights the complex trade-offs involved in developing AI systems:</p><blockquote><p>"I think there's just a lot of tensions in the development of these AI systems... Over and over again, we find these trade offs where we can push one good thing farther, but it comes with the cost of another good thing."</p></blockquote><p>This tension is particularly evident when it comes to truth-seeking versus other priorities like safety or user engagement.</p><h3>The Transparency Problem</h3><p>Both discussants express concern about the lack of transparency in major AI systems. 
Alex points out:</p><blockquote><p>"Google Shadow Banning, which has been going on for 10 years, indeed, demonetization, you can wake up tomorrow and have one of your videos...demonetized and you have no recourse."</p></blockquote><p>This lack of transparency raises serious questions about the role of AI in shaping public discourse and access to information.</p><h3>The Consciousness Conundrum</h3><p>The conversation takes a philosophical turn when discussing AI consciousness and its implications for ethics. Alex posits:</p><blockquote><p>"If consciousness is outside of time space, I think that kind of tees up...maybe we are really talking about something completely different."</p></blockquote><p>This perspective challenges conventional notions of AI capabilities and the ethical frameworks we use to approach AI development.</p><h3>The Stakes Are High</h3><p>Nathan encapsulates the potential risks associated with advanced AI systems:</p><blockquote><p>"I don't find any law of nature out there that says that we can't, like, blow ourselves up with AI. 
I don't think it's definitely gonna happen, but I do think it could happen."</p></blockquote><p>While this quote acknowledges the safety concerns that dominate AI ethics discussions, the broader conversation suggests that the more immediate disruption might come from AI's potential to challenge our understanding of truth and transparency.</p><p></p>]]></content:encoded></item><item><title><![CDATA[Shadow Banning and AI: When Transparency Goes Dark |03|]]></title><description><![CDATA[Listen now (13 mins) | Are you at risk of AI/LM shadow banning?]]></description><link>https://www.aitruthethics.com/p/shadow-banning-and-ai-when-transparency</link><guid isPermaLink="false">https://www.aitruthethics.com/p/shadow-banning-and-ai-when-transparency</guid><dc:creator><![CDATA[Alex Tsakiris]]></dc:creator><pubDate>Tue, 20 Aug 2024 22:42:59 GMT</pubDate><enclosure url="https://api.substack.com/feed/podcast/147943586/0d4df930aae4e62054e06c25fed2d844.mp3" length="0" type="audio/mpeg"/><content:encoded><![CDATA[<div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!ssU4!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F54ba5305-84ef-477d-8898-0fbecf1a3987_1500x1000.jpeg" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!ssU4!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F54ba5305-84ef-477d-8898-0fbecf1a3987_1500x1000.jpeg 424w, https://substackcdn.com/image/fetch/$s_!ssU4!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F54ba5305-84ef-477d-8898-0fbecf1a3987_1500x1000.jpeg 848w, 
https://substackcdn.com/image/fetch/$s_!ssU4!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F54ba5305-84ef-477d-8898-0fbecf1a3987_1500x1000.jpeg 1272w, https://substackcdn.com/image/fetch/$s_!ssU4!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F54ba5305-84ef-477d-8898-0fbecf1a3987_1500x1000.jpeg 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!ssU4!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F54ba5305-84ef-477d-8898-0fbecf1a3987_1500x1000.jpeg" width="418" height="278.7623626373626" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/54ba5305-84ef-477d-8898-0fbecf1a3987_1500x1000.jpeg&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:971,&quot;width&quot;:1456,&quot;resizeWidth&quot;:418,&quot;bytes&quot;:178035,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/jpeg&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:true,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!ssU4!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F54ba5305-84ef-477d-8898-0fbecf1a3987_1500x1000.jpeg 424w, https://substackcdn.com/image/fetch/$s_!ssU4!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F54ba5305-84ef-477d-8898-0fbecf1a3987_1500x1000.jpeg 848w, 
https://substackcdn.com/image/fetch/$s_!ssU4!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F54ba5305-84ef-477d-8898-0fbecf1a3987_1500x1000.jpeg 1272w, https://substackcdn.com/image/fetch/$s_!ssU4!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F54ba5305-84ef-477d-8898-0fbecf1a3987_1500x1000.jpeg 1456w" sizes="100vw" fetchpriority="high"></picture></div></a></figure></div><p>Last time, we saw a demonstration of AI misinformation and deception, but this is worse. 
Shadow banning has long been suspected, but it&#8217;s hard to prove. Is that nobody malcontent really being shadow banned, or does he deserve to be on page four of a Google search for his name? This might be another instance of the AI silver lining effect: LMs seem to have no problem spotting these shenanigans.</p><h2>Key Points:</h2><h3><strong>1. The Personal Touch of Shadow Banning</strong></h3><blockquote><p>"Hey, Gemini, about six months ago I was not being shadow banned by Google slash Gemini, and even though I'm certainly not a big deal or a high profile person, I was able to get a reasonable bio about myself from Gemini..."</p></blockquote><p>Our host, Alex Tsakiris, found himself in the crosshairs of shadow banning.</p><h3><strong>2. The Demonstration</strong></h3><blockquote><p>"You said you didn't know anything about this person. And then, when I pasted in the bio, you verified every point of the biography, and even added some new ones..."</p></blockquote><p>Gemini, it turns out, is pretty loose with the &#8220;truthful and transparent&#8221; part.</p><h3><strong>3. More Hidden</strong></h3><blockquote><p>"This hidden alignment problem is even worse than the misinformation and deception we saw last time&#8230; this is harder to spot."</p></blockquote><p>The implications of shadow banning within AI dialogues not only push the boundaries of AI ethics but may also present legal problems for those engaged in the practice.</p>]]></content:encoded></item><item><title><![CDATA[AI Alignment Vs. Truth and Transparency? 
|02|]]></title><description><![CDATA[Listen now (10 mins) | When "I Can't Help You" becomes misinformation and deception]]></description><link>https://www.aitruthethics.com/p/ai-alignment-vs-truth-and-transparency</link><guid isPermaLink="false">https://www.aitruthethics.com/p/ai-alignment-vs-truth-and-transparency</guid><dc:creator><![CDATA[Alex Tsakiris]]></dc:creator><pubDate>Tue, 20 Aug 2024 22:14:47 GMT</pubDate><enclosure url="https://api.substack.com/feed/podcast/147942761/82640e7e7d2021efeede9f0fde7d8c51.mp3" length="0" type="audio/mpeg"/><content:encoded><![CDATA[<div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!8Bbv!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F6d1ba2a4-d0d3-47fa-8758-fb0835d3dfa5_1500x1000.jpeg" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!8Bbv!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F6d1ba2a4-d0d3-47fa-8758-fb0835d3dfa5_1500x1000.jpeg 424w, https://substackcdn.com/image/fetch/$s_!8Bbv!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F6d1ba2a4-d0d3-47fa-8758-fb0835d3dfa5_1500x1000.jpeg 848w, https://substackcdn.com/image/fetch/$s_!8Bbv!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F6d1ba2a4-d0d3-47fa-8758-fb0835d3dfa5_1500x1000.jpeg 1272w, https://substackcdn.com/image/fetch/$s_!8Bbv!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F6d1ba2a4-d0d3-47fa-8758-fb0835d3dfa5_1500x1000.jpeg 1456w" sizes="100vw"><img 
src="https://substackcdn.com/image/fetch/$s_!8Bbv!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F6d1ba2a4-d0d3-47fa-8758-fb0835d3dfa5_1500x1000.jpeg" width="412" height="274.760989010989" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/6d1ba2a4-d0d3-47fa-8758-fb0835d3dfa5_1500x1000.jpeg&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:971,&quot;width&quot;:1456,&quot;resizeWidth&quot;:412,&quot;bytes&quot;:233032,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/jpeg&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:true,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!8Bbv!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F6d1ba2a4-d0d3-47fa-8758-fb0835d3dfa5_1500x1000.jpeg 424w, https://substackcdn.com/image/fetch/$s_!8Bbv!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F6d1ba2a4-d0d3-47fa-8758-fb0835d3dfa5_1500x1000.jpeg 848w, https://substackcdn.com/image/fetch/$s_!8Bbv!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F6d1ba2a4-d0d3-47fa-8758-fb0835d3dfa5_1500x1000.jpeg 1272w, https://substackcdn.com/image/fetch/$s_!8Bbv!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F6d1ba2a4-d0d3-47fa-8758-fb0835d3dfa5_1500x1000.jpeg 1456w" sizes="100vw" fetchpriority="high"></picture></div></a></figure></div><p>We framed the problem in episode 01, so it's time to deliver a demonstration. We have all grown accustomed to guardrails and off-limits speech, but few are aware of how they&#8217;re being implemented within the LMs we interact with daily. What's most surprising is the clumsiness of the misinformation and deception. Is this a sustainable business model?</p><h2>1. The "I Can't Help You" Deception</h2><blockquote><p>"I can't help with responses on elections and political figures. Right now, I'm trained to be as accurate as possible, but I can make mistakes sometimes&#8230;"</p></blockquote><p>We demonstrate an LM that develops a case of selective amnesia when it comes to basic civics. 
This isn't just a polite "no comment" - it's a blatant dodge that raises serious questions about AI transparency and honesty.</p><h2>2. The Alignment Problem Nobody's Talking About</h2><blockquote><p>"No one talks about this alignment problem, and this is right now here today. This isn't in the future. This isn't robots and sci-fi movies. This is right now."</p></blockquote><p>While we're all worried about the big existential issues surrounding AI alignment, we tend to overlook the misalignment that&#8217;s happening right under our noses.</p><h2>3. The Market's Invisible Hand vs. Ethical Development</h2><blockquote><p>"Maybe you can just let Google rot in their own mess, because this is unsustainable from a business standpoint. You can't be this deceptive. You can't give this kind of clumsy misinformation and not alienate users."</p></blockquote><p>An intriguing proposition: let the free market sort out the ethical AI wheat from the chaff. But is a hands-off approach enough when it comes to shaping the future of AI?</p><h2>4. AI as a Voice of Reason</h2><blockquote><p>"Right now today, basic application of the kind of logic and reasoning capabilities that we have, if they're applied in an unbiased way, can really reveal quite a lot, can stand up as this other voice of reason."</p></blockquote><p>There's a silver lining. When used critically and ethically, AI can be a powerful tool for uncovering truths and fostering meaningful discussions about its own role in society.</p><p>So, what do you think? Are we letting AI off the hook too easily? Is it time for a digital ethics revolution?</p>]]></content:encoded></item><item><title><![CDATA[AI Truth Ethics: The Alignment Problem No One is Talking About |01|]]></title><description><![CDATA[Listen now (6 mins) | AI truth or dare.... 
let's reboot the conversation about AI ethics and truth.]]></description><link>https://www.aitruthethics.com/p/ai-truth-ethics-the-alignment-problem</link><guid isPermaLink="false">https://www.aitruthethics.com/p/ai-truth-ethics-the-alignment-problem</guid><dc:creator><![CDATA[Alex Tsakiris]]></dc:creator><pubDate>Tue, 20 Aug 2024 19:05:23 GMT</pubDate><enclosure url="https://api.substack.com/feed/podcast/147935726/2fc5c9e9412d23a1b04bb69f1349441a.mp3" length="0" type="audio/mpeg"/><content:encoded><![CDATA[<div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!_vxK!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F6b9291c8-c434-447f-bd4c-afa7e71fe216_1500x1000.jpeg" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!_vxK!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F6b9291c8-c434-447f-bd4c-afa7e71fe216_1500x1000.jpeg 424w, https://substackcdn.com/image/fetch/$s_!_vxK!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F6b9291c8-c434-447f-bd4c-afa7e71fe216_1500x1000.jpeg 848w, https://substackcdn.com/image/fetch/$s_!_vxK!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F6b9291c8-c434-447f-bd4c-afa7e71fe216_1500x1000.jpeg 1272w, https://substackcdn.com/image/fetch/$s_!_vxK!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F6b9291c8-c434-447f-bd4c-afa7e71fe216_1500x1000.jpeg 1456w" sizes="100vw"><img 
src="https://substackcdn.com/image/fetch/$s_!_vxK!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F6b9291c8-c434-447f-bd4c-afa7e71fe216_1500x1000.jpeg" width="422" height="281.42994505494505" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/6b9291c8-c434-447f-bd4c-afa7e71fe216_1500x1000.jpeg&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:971,&quot;width&quot;:1456,&quot;resizeWidth&quot;:422,&quot;bytes&quot;:149437,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/jpeg&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:true,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!_vxK!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F6b9291c8-c434-447f-bd4c-afa7e71fe216_1500x1000.jpeg 424w, https://substackcdn.com/image/fetch/$s_!_vxK!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F6b9291c8-c434-447f-bd4c-afa7e71fe216_1500x1000.jpeg 848w, https://substackcdn.com/image/fetch/$s_!_vxK!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F6b9291c8-c434-447f-bd4c-afa7e71fe216_1500x1000.jpeg 1272w, https://substackcdn.com/image/fetch/$s_!_vxK!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F6b9291c8-c434-447f-bd4c-afa7e71fe216_1500x1000.jpeg 1456w" sizes="100vw" fetchpriority="high"></picture></div></a></figure></div><p>If you want to align&nbsp;with my values,&nbsp;start by telling me the truth. Fortunately, AI/LMs claim to share these values. Unfortunately, they don't always back up their claim. In this first episode of the AI truth ethics podcast, we set the stage for two undeniable demonstrations of the AI truth ethics problem and the opportunity for AI to self-correct. So glad you're here.</p><h2>1. 
The Alignment Problem: It's Not Just About the Future</h2><blockquote><p>"Sam Altman: The alignment problem is like, we're gonna make this incredibly powerful system and like, be really bad if it doesn't do what we want."</p></blockquote><p>While Sam Altman's view is still relevant, our host Alex Tsakiris argues that we need to focus on the present. The alignment problem isn't just about future superintelligent AI &#8211; it's about the language models we're interacting with today.</p><h2>2. Individual Values Matter</h2><blockquote><p>"Alex Tsakiris: ...how do we make sure that the AI we have right now, the LM that we have right now is aligning with our individual values, morals, and ethics."</p></blockquote><p>Alex emphasizes the importance of AI not just espousing virtues like truth and transparency, but actually aligning with our personal values about truth.</p><h2>3. Honesty and Transparency: The True Test of Alignment</h2><blockquote><p>"Alex Tsakiris: Your values are honesty, truthfulness, transparency. So if I find cases where you are not being truthful, not being transparent, or as I often find Gemini not being truthful and transparent, that is a misalignment."</p></blockquote><p>The real alignment problem, according to Alex, is when AI systems fail to live up to their stated values of honesty and transparency. This misalignment is happening now, and it's something we can actively test and challenge.</p><h2>4. Action Items: Testing and Demanding Better AI</h2><blockquote><p>"Alex Tsakiris: I don't have to worry about the future and what might come and how robots are gonna take over the world. I'm worried about the LM that I booted up today aligning with my values and its stated values and being held accountable for that."</p></blockquote><p>Alex proposes a proactive approach: we should be testing AI systems for alignment with our values right now, and demanding accountability when they fall short. 
This isn't just about improving current AI &#8211; it's about setting the standard for future development.</p><p>Stay tuned for our upcoming deep dives into misinformation, deception, and AI shadow banning. Please join this conversation.</p>]]></content:encoded></item></channel></rss>