The Foresight Institute Podcast

Foresight Institute

Welcome to the Foresight Institute’s podcast! Since 1986, Foresight has been advancing technologies for the long-term benefit of life and the biosphere. We focus on three areas: molecular machine nanotechnology for building better things, biotechnology for health extension, and computer science and crypto commerce for intelligent cooperation. This podcast is where we share a portion of our public work, adapted for listening. We mix longer discussions with shorter segments, and new episodes with all-time favorites. There is much more: to view presentations of our technical work and to stay up to date on new content, subscribe to Foresight Institute on YouTube and follow us on Twitter.

  • 48 minutes 7 seconds
    Amanda Ngo | Innovating With AI for Wellbeing

    Speaker

    Amanda Ngo is a 2024 Foresight Fellow. Recently, she built Elicit.org from inception to 100k+ monthly users while leading a team of five engineers and designers; presented on forecasting, safe AI systems, and LLM research tools at conferences including EAG and Foresight Institute events; ran a 60-person hackathon with FiftyYears on using LLMs to improve our wellbeing; analyzed Ideal Parent Figure transcripts and built an automated IPF chatbot; and co-organized a 400-person retreat for Interact, a technology-for-social-good fellowship.


    Session Summary

    “Imagine waking up every day in a state of flow, where all the knots and fears are replaced with a deep sense of ease and joy.”


    This week we are dropping another special episode of the Existential Hope podcast, featuring Amanda Ngo, a Foresight Institute Existential Hope fellow specializing in AI innovation for wellbeing. Amanda speaks about her work on leveraging AI to enhance human flourishing, sharing insights on the latest advancements and their potential impacts. Her app: https://www.mysunrise.app/


    Full transcript, list of resources, and art piece: https://www.existentialhope.com/podcasts


    Existential Hope was created to collect positive and possible scenarios for the future so that we can have more people commit to creating a brighter future, and to begin mapping out the main developments and challenges that need to be navigated to reach it. Existential Hope is a Foresight Institute project.


    Hosted by Allison Duettmann and Beatrice Erkers


    Follow Us: Twitter | Facebook | LinkedIn | Existential Hope Instagram


    Explore every word spoken on this podcast through Fathom.fm.


    Hosted on Acast. See acast.com/privacy for more information.

    13 September 2024, 8:00 am
  • 10 minutes 4 seconds
    Jeffrey Ladish | AI Threat Models, Hacking, Deception, and Manipulation

    Jeffrey Ladish

    Before starting Palisade, Jeffrey helped build out the information security program at Anthropic through his security consulting company, Gordian. Jeffrey has also helped dozens of tech companies, philanthropic organizations, and existential-risk-focused projects get started with secure infrastructure.


    Summary

    Ladish discusses the increasing sophistication and proliferation of deepfake technology, which allows AI to mimic human voices and faces, and its potential for widespread deception. He argues that this increasingly capable technology is and will be used to spread fake information, manipulate elections or markets, create deepfake pornography, and generate fake endorsements from actors or organizations.


    About Foresight Institute

    Foresight Institute is a research organization and non-profit that supports the beneficial development of high-impact technologies. Since our founding in 1987 on a vision of guiding powerful technologies, we have continued to evolve into a many-armed organization that focuses on several fields of science and technology that are too ambitious for legacy institutions to support.


    Allison Duettmann

    The President and CEO of Foresight Institute, Allison Duettmann directs the Intelligent Cooperation, Molecular Machines, Biotech & Health Extension, Neurotech, and Space Programs, alongside Fellowships, Prizes, and Tech Trees. She has also been pivotal in co-initiating the Longevity Prize, pioneering initiatives like Existentialhope.com, and contributing to notable works like "Superintelligence: Coordination & Strategy" and "Gaming the Future".


    Get Involved with Foresight:


    Follow Us: Twitter | Facebook | LinkedIn


    Note: Explore every word spoken on this podcast through Fathom.fm, an innovative podcast search engine.


    Hosted on Acast. See acast.com/privacy for more information.

    6 September 2024, 1:15 pm
  • 1 hour 4 minutes
    Kristian Rönn | The Darwinian Trap That Explains Our World

    Speaker

    Kristian Rönn is the CEO and co-founder of Normative. He has a background in mathematics, philosophy, computer science and artificial intelligence. Before he started Normative he worked at the University of Oxford’s Future of Humanity Institute on issues related to global catastrophic risks.


    Session Summary

    When people talk about today’s biggest challenges they tend to frame the conversation around “bad people” doing “bad things.” But is there more to the story?


    In this month’s Hope Drop we speak to Kristian Rönn, an entrepreneur formerly affiliated with the Future of Humanity Institute. Kristian argues that many of today’s problems stem less from bad actors than from deeply rooted evolutionary impulses he calls “Darwinian demons.” These forces, a by-product of natural selection, can lead us to act in shortsighted ways that harm others and even imperil our survival as a species. In our latest episode, Kristian explains how we can escape these evolutionary traps through cooperation and innovative thinking. Kristian's new book, The Darwinian Trap, will be published on September 24th. Be sure to preorder it today!


    Full transcript, list of resources, and art piece: https://www.existentialhope.com/podcasts


    Existential Hope was created to collect positive and possible scenarios for the future so that we can have more people commit to creating a brighter future, and to begin mapping out the main developments and challenges that need to be navigated to reach it. Existential Hope is a Foresight Institute project.


    Hosted by Allison Duettmann and Beatrice Erkers


    Follow Us: Twitter | Facebook | LinkedIn | Existential Hope Instagram


    Explore every word spoken on this podcast through Fathom.fm.


    Hosted on Acast. See acast.com/privacy for more information.

    29 August 2024, 1:12 pm
  • 59 minutes 48 seconds
    Divya Siddarth | Collective Intelligence for Collective Progress

    Speaker 

    Divya is the co-founder of the Collective Intelligence Project, which advances collective intelligence capabilities for the democratic and effective governance of transformative technologies. She serves as Associate Political Economist and Social Technologist at Microsoft’s Office of the CTO. She also holds positions as a research director at Metagov and a researcher in residence at the RadicalXChange Foundation.


    Key Highlights

    In today’s rapidly evolving world, effective collaboration between humans and artificial intelligence is more important than ever. In this talk, we explore the concept of collective intelligence and discuss how it can be harnessed to drive collective progress across domains including science, technology, and social innovation.


    About Foresight Institute

    Foresight Institute is a research organization and non-profit that supports the beneficial development of high-impact technologies. Since our founding in 1987 on a vision of guiding powerful technologies, we have continued to evolve into a many-armed organization that focuses on several fields of science and technology that are too ambitious for legacy institutions to support.


    Allison Duettmann

    The President and CEO of Foresight Institute, Allison Duettmann directs the Intelligent Cooperation, Molecular Machines, Biotech & Health Extension, Neurotech, and Space Programs, alongside Fellowships, Prizes, and Tech Trees. She has also been pivotal in co-initiating the Longevity Prize, pioneering initiatives like Existentialhope.com, and contributing to notable works like "Superintelligence: Coordination & Strategy" and "Gaming the Future".


    Get Involved with Foresight:


    Follow Us: Twitter | Facebook | LinkedIn


    Note: Explore every word spoken on this podcast through Fathom.fm, an innovative podcast search engine.


    Hosted on Acast. See acast.com/privacy for more information.

    23 August 2024, 12:00 pm
  • 52 minutes 15 seconds
    Siméon Campos | Governing AI for Good

    Speaker

    Siméon Campos is president and founder of SaferAI, an organization developing the infrastructure for general-purpose AI auditing and risk management. He has worked on large language models for the past two years and is highly committed to making AI safer.


    Session Summary

    “I think safe AGI can both prevent a catastrophe and offer a very promising pathway into a eucatastrophe.”


    This week we are dropping a special episode of the Existential Hope podcast, where we sit down with Siméon Campos, president and founder of SaferAI and a Foresight Institute fellow in the Existential Hope track. Siméon shares his experience working on AI governance, discusses the current state and future of large language models, and explores crucial measures needed to guide AI for the greater good.


    Full transcript, list of resources, and art piece: https://www.existentialhope.com/podcasts


    Existential Hope was created to collect positive and possible scenarios for the future so that we can have more people commit to creating a brighter future, and to begin mapping out the main developments and challenges that need to be navigated to reach it. Existential Hope is a Foresight Institute project.


    Hosted by Allison Duettmann and Beatrice Erkers


    Follow Us: Twitter | Facebook | LinkedIn | Existential Hope Instagram


    Explore every word spoken on this podcast through Fathom.fm.


    Hosted on Acast. See acast.com/privacy for more information.

    16 August 2024, 8:00 am
  • 49 minutes 56 seconds
    Dean Woodley Ball | Humanity’s Next Leap: Thoughts on the Frontiers of Neural Technology

    Speaker 

    Dean Woodley Ball is a Research Fellow at George Mason University’s Mercatus Center and author of the Substack Hyperdimensional. His work focuses on artificial intelligence, emerging technologies, and the future of governance. Previously, he was Senior Program Manager for the Hoover Institution's State and Local Governance Initiative. 


    Key Highlights

    Based on engagement with the neuroscience and machine learning literatures, this talk will focus on how technologies such as virtual reality, large language models, AI agents, neurostimulation, and neuromonitoring may converge in the coming decade into the first widespread consumer neural technology. It will address technical feasibility, public policy, and broader societal implications.

     

    “In terms of the challenge, I think the big one for me is probably building the datasets we’ll need for the foundational AI models undergirding all of this.”


    About Foresight Institute

    Foresight Institute is a research organization and non-profit that supports the beneficial development of high-impact technologies. Since our founding in 1987 on a vision of guiding powerful technologies, we have continued to evolve into a many-armed organization that focuses on several fields of science and technology that are too ambitious for legacy institutions to support.


    Allison Duettmann

    The President and CEO of Foresight Institute, Allison Duettmann directs the Intelligent Cooperation, Molecular Machines, Biotech & Health Extension, Neurotech, and Space Programs, alongside Fellowships, Prizes, and Tech Trees. She has also been pivotal in co-initiating the Longevity Prize, pioneering initiatives like Existentialhope.com, and contributing to notable works like "Superintelligence: Coordination & Strategy" and "Gaming the Future".


    Get Involved with Foresight:


    Follow Us: Twitter | Facebook | LinkedIn


    Note: Explore every word spoken on this podcast through Fathom.fm, an innovative podcast search engine.



    Hosted on Acast. See acast.com/privacy for more information.

    9 August 2024, 12:00 pm
  • 52 minutes 9 seconds
    Dana Watt | A Neuroscientist's Guide to Starting a Company

    Dr. Watt is an investment associate at Ascension Ventures, an investment firm specializing in healthcare technology. She previously co-founded and served as CSO of Pro-Arc Diagnostics, a personalized medicine company operating in St. Louis.


    Key Highlights

    Watt discusses her career journey and insights into venture capital investing in neuroscience and neurotech companies. She explains her role as a VC, which involves making profitable investments, underwriting risk, and structuring deals. Dana highlights key attributes of venture-backable companies, such as exceptional teams, large addressable markets, defensibility, and differentiation. She also discusses challenges and biases in neuroscience investing, including the complexity of brain science, hardware difficulties, long clinical timelines, and subtle readouts.


    About Foresight Institute

    Foresight Institute is a research organization and non-profit that supports the beneficial development of high-impact technologies. Since our founding in 1987 on a vision of guiding powerful technologies, we have continued to evolve into a many-armed organization that focuses on several fields of science and technology that are too ambitious for legacy institutions to support.


    Allison Duettmann

    The President and CEO of Foresight Institute, Allison Duettmann directs the Intelligent Cooperation, Molecular Machines, Biotech & Health Extension, Neurotech, and Space Programs, alongside Fellowships, Prizes, and Tech Trees. She has also been pivotal in co-initiating the Longevity Prize, pioneering initiatives like Existentialhope.com, and contributing to notable works like "Superintelligence: Coordination & Strategy" and "Gaming the Future".


    Get Involved with Foresight:


    Follow Us: Twitter | Facebook | LinkedIn


    Note: Explore every word spoken on this podcast through Fathom.fm, an innovative podcast search engine.


    Hosted on Acast. See acast.com/privacy for more information.

    2 August 2024, 10:20 am
  • 51 minutes 16 seconds
    Existential Hope Podcast: James Pethokoukis | Conservatism Meets Futurism

    James Pethokoukis is a senior fellow and the DeWitt Wallace Chair at the American Enterprise Institute, where he analyzes US economic policy, writes and edits the AEIdeas blog, and hosts AEI’s Political Economy podcast. He is also a contributor to CNBC and writes the Faster, Please! newsletter on Substack. He is the author of The Conservative Futurist: How to Create the Sci-Fi World We Were Promised (Center Street, 2023). He has also written for many publications, including the Atlantic, Commentary, Financial Times, Investor’s Business Daily, National Review, New York Post, the New York Times, USA Today, and the Week.


    Session Summary

    In this episode, James joins us to discuss his book, The Conservative Futurist, and his perspectives on technology and economic growth. James explores his background, the spectrum of 'upwing' (pro-progress) versus 'downwing' (anti-progress), and the role of technology in solving global challenges. He explains his reasoning for being pro-progress and pro-growth, and highlights the importance of positive storytelling and education in building a more advanced and prosperous world.


    Full transcript, list of resources, and art piece: https://www.existentialhope.com/podcasts


    Existential Hope was created to collect positive and possible scenarios for the future so that we can have more people commit to creating a brighter future, and to begin mapping out the main developments and challenges that need to be navigated to reach it. Existential Hope is a Foresight Institute project.


    Hosted by Allison Duettmann and Beatrice Erkers


    Follow Us: Twitter | Facebook | LinkedIn | Existential Hope Instagram


    Explore every word spoken on this podcast through Fathom.fm.


    Hosted on Acast. See acast.com/privacy for more information.

    25 July 2024, 8:05 am
  • 58 minutes 43 seconds
    Existential Hope: The Flourishing Foundation at the Transformative AI Hackathon

    The Flourishing Foundation

    In February 2024, we partnered with the Future of Life Institute on a hackathon to design institutions that can guide and govern the development of AI. The winner of the hackathon was the Flourishing Foundation, which focuses on our relationship with AI and other emerging technologies. They challenge innovators to envision and build life-centered products, services, and systems; specifically, they work to ensure that TAI-enabled consumer technologies promote human well-being by developing new norms, processes, and community-driven ecosystems.


    At their core, they explore the question of "Can AI make us happier?"


    Connect: https://www.flourishing.foundation/


    Read about the hackathon: https://foresight.org/2024-xhope-hackathon/


    Existential Hope was created to collect positive and possible scenarios for the future so that we can have more people commit to creating a brighter future, and to begin mapping out the main developments and challenges that need to be navigated to reach it. Existential Hope is a Foresight Institute project.


    Hosted by Allison Duettmann and Beatrice Erkers


    Follow Us: Twitter | Facebook | LinkedIn | Existential Hope Instagram


    Explore every word spoken on this podcast through Fathom.fm.


    Hosted on Acast. See acast.com/privacy for more information.

    5 July 2024, 2:56 pm
  • 47 minutes 8 seconds
    Existential Hope Podcast: Roman Yampolskiy | The Case for Narrow AI

    Dr. Roman Yampolskiy holds a PhD from the Department of Computer Science and Engineering at the University at Buffalo, where he was a recipient of a four-year National Science Foundation IGERT (Integrative Graduate Education and Research Traineeship) fellowship. His main areas of interest are behavioral biometrics, digital forensics, pattern recognition, genetic algorithms, neural networks, artificial intelligence and games, and he is the author of over 100 publications, including multiple journal articles and books.


    Session Summary

    We discuss all things AI safety with Dr. Roman Yampolskiy. As AI technologies advance at a breakneck pace, the conversation highlights the pressing need to balance innovation with rigorous safety measures. Contrary to many other voices in the safety space, Yampolskiy argues for keeping AI as narrow, task-oriented systems: “I'm arguing that it's impossible to indefinitely control superintelligent systems.” Nonetheless, he is optimistic about the future capabilities of narrow AI, from politics to longevity and health.


    Full transcript, list of resources, and art piece: https://www.existentialhope.com/podcasts


    Existential Hope was created to collect positive and possible scenarios for the future so that we can have more people commit to creating a brighter future, and to begin mapping out the main developments and challenges that need to be navigated to reach it. Existential Hope is a Foresight Institute project.


    Hosted by Allison Duettmann and Beatrice Erkers


    Follow Us: Twitter | Facebook | LinkedIn | Existential Hope Instagram


    Explore every word spoken on this podcast through Fathom.fm.


    Hosted on Acast. See acast.com/privacy for more information.

    26 June 2024, 8:56 am
  • 54 minutes 29 seconds
    Existential Hope Worldbuilding: 1st place | Cities of Orare

    This episode features an interview with the 1st place winners of our 2045 Worldbuilding challenge!


    Why Worldbuilding?

    We consider worldbuilding an essential tool for creating inspiring visions of the future that can help drive real-world change. Worldbuilding helps us explore crucial 'what if' questions about the future by constructing detailed scenarios that prompt us to ask: what actionable steps can we take now to realize these desirable outcomes?


    Cities of Orare – our 1st place winners

    Cities of Orare imagines a future where AI-powered prediction markets called Orare amplify collective intelligence, enhancing liberal democracy, economic distribution, and policy-making. Their adoption across Africa and globally has fostered decentralized governance, democratized decision-making, and spurred significant health and economic advances.


    Read more about the 2045 world of Cities of Orare: https://www.existentialhope.com/worlds/beyond-collective-intelligence-cities-of-orare

    Access the Worldbuilding Course: https://www.existentialhope.com/existential-hope-worldbuilding


    Existential Hope was created to collect positive and possible scenarios for the future so that we can have more people commit to creating a brighter future, and to begin mapping out the main developments and challenges that need to be navigated to reach it. Existential Hope is a Foresight Institute project.


    Hosted by Allison Duettmann and Beatrice Erkers


    Follow Us: Twitter | Facebook | LinkedIn | Existential Hope Instagram


    Explore every word spoken on this podcast through Fathom.fm.


    Hosted on Acast. See acast.com/privacy for more information.

    21 June 2024, 2:56 pm