The Edtech Podcast

Sophie Bailey, @soph_bailey

Improving the dialogue between 'ed' & 'tech' through storytelling, for better innovation and impact. #edchat #edtech

  • 25 minutes 50 seconds
    #282 - Risk Assessments for AI Learning Tools, a conversation, Part 2

    In the second episode of a two-part miniseries on risk management, risk mitigation and risk assessment in AI learning tools, Professor Rose Luckin is away in Australia, speaking internationally, so Rowland Wells takes the reins to chat with Dr Rajeshwari Iyer of sAInaptic to hear her perspective on risk as a developer and CEO.

View our Risk Assessments here: https://www.educateventures.com/risk-assessments

    In the studio:

    • Rowland Wells, Creative Producer, EVR
    • Rajeshwari Iyer, CEO and Cofounder, sAInaptic

    Talking points and questions include:

• Who are these for?  What's the profile of the person we want to engage with these risk assessments?  They're concise, easy to read, with no technical jargon.  But it's still an analysis, for people with a research/evidence mindset.  Many people ignore it: we know that even learning tool developers who put research on their tools ON THEIR WEBSITES do not actually have it read by the public.  So how do we get this in front of people?  Do we lead the conversation with budget concerns?  Safeguarding concerns?  Value for money?
• What's the end goal of this?  Are you trying to raise the sophistication of the conversation around evidence and risk?  Many developers who you critique might just think you're trying to make a name by pulling apart their tools.  Surely the market will sort itself out?
• What's the process involved in making judgements about a risk assessment?  If we're trying to demonstrate to the buyers of these tools, the digital leads in schools and colleges, what to look for, what's the first step?  Can this be done quickly?  Many who might benefit from AI tools won't have the time to exhaustively hunt out all the little details of a learning tool and interpret them themselves.
• Schools aren't testbeds for intellectual property or tech interventions.  Why is it practitioners' responsibility to make these kinds of evaluations, even with the aid of these kinds of assessments?  Why is the tech and AI sector not capable of regulating its own practices?
• You've all worked with schools and learning and training institutions using AI tools.  Although this episode is about using the tools wisely, effectively and safely, please tell us how you've seen teaching and learning enhanced with the safe and impactful use of AI.
    7 November 2024, 10:14 am
  • 36 minutes 54 seconds
    #281 - Risk Assessments for AI Learning Tools, a conversation, Part 1

In today’s episode, we have the first part of a two-part miniseries on risk management, risk mitigation and risk assessment in AI learning tools.  Professor Rose Luckin is away in Australia, speaking internationally, so Rowland Wells takes the reins to chat with Educate Ventures Research team members about their experience managing risk as teachers and developers.  What does a risk assessment look like, and whose responsibility is it to take on board its insights?  Rose joins our discussion group towards the end of the episode, and in the second instalment of the conversation, Rowland sits down with Dr Rajeshwari Iyer of sAInaptic to hear her perspective on risk and testing features of a tool as a developer and CEO herself.

    View our Risk Assessments here: https://www.educateventures.com/risk-assessments

    In the studio:

    • Rowland Wells, Creative Producer, EVR
    • Dave Turnbull, Deputy Head of Educator AI Training, EVR
    • Ibrahim Bashir, Technical Projects Manager, EVR
    • Rose Luckin, CEO & Founder, EVR

    Talking points and questions include:

• Who are these for?  What’s the profile of the person we want to engage with these risk assessments?  They’re concise, easy to read, with no technical jargon.  But it’s still an analysis, for people with a research/evidence mindset.  Many people ignore it: we know that even learning tool developers who put research on their tools ON THEIR WEBSITES do not actually have it read by the public.  So how do we get this in front of people?  Do we lead the conversation with budget concerns?  Safeguarding concerns?  Value for money?
• What’s the end goal of this?  Are you trying to raise the sophistication of the conversation around evidence and risk?  Many developers who you critique might just think you’re trying to make a name by pulling apart their tools.  Surely the market will sort itself out?
• What’s the process involved in making judgements about a risk assessment?  If we’re trying to demonstrate to the buyers of these tools, the digital leads in schools and colleges, what to look for, what’s the first step?  Can this be done quickly?  Many who might benefit from AI tools won’t have the time to exhaustively hunt out all the little details of a learning tool and interpret them themselves.
• Schools aren’t testbeds for intellectual property or tech interventions.  Why is it practitioners’ responsibility to make these kinds of evaluations, even with the aid of these kinds of assessments?  Why is the tech and AI sector not capable of regulating its own practices?
• You’ve all worked with schools and learning and training institutions using AI tools.  Although this episode is about using the tools wisely, effectively and safely, please tell us how you’ve seen teaching and learning enhanced with the safe and impactful use of AI.
    7 November 2024, 10:06 am
  • 52 minutes 15 seconds
    #280 - What are Student Expectations for AI in Education?

In today's rapidly evolving educational landscape, Artificial Intelligence is emerging as a transformative force, offering both opportunities and challenges. As AI technologies continue to advance, it's crucial to examine their impact on student expectations, learning experiences, and institutional strategies. One pressing question is: what do students truly want from AI in education? Are they reflecting on the value of their assessments and assignments when AI tools can potentially complete them? This raises the deeper question of what we mean by student success in higher education and the purpose of knowledge in an AI-driven economy.  Professor Rose Luckin is joined by three wonderful guests in the studio to discuss what tools we need to support students and how we explore the potential and the limitations of AI for education.

    Guests:

    • Michael Larsen, CEO & Managing Director, Studiosity
    • Sally Wheeler, Professor, Vice-Chancellor, Birkbeck, University of London
    • Ant Bagshaw, Executive Director, Australian Technology Network of Universities

    Talking points and questions include:

    • Student expectations and perspectives on using AI for assessments/assignments and the role of knowledge in an AI economy
    • The potential of AI to enhance learning through features like instant feedback, error correction, personalized support, learning analytics
    • How AI could facilitate peer support systems and student community, and the research on the value of this
    • The lack of robust digital/AI strategies at many institutions as a barrier to effective AI adoption
    • The evidence-base for AI in education - challenges with research being highly specific/contextual, debating the value of in-house research vs general studies
    • Whether evidence on efficacy truly drives institutions' buying decisions for AI tools or if other factors/institutional challenges are stronger influences
    • How challenges facing the education sector can inhibit capacity for innovative deployments like AI
    • The growing need for proven, supportive AI tools for students despite institutional constraints


    10 July 2024, 10:38 am
  • 46 minutes 37 seconds
    #279 - Can We Trust in AI for Education? (AI in Ed Miniseries)

    Coming to the fifth and final episode of our miniseries on AI for education, host Professor Rose Luckin is joined by Timo Hannay, Founder of SchoolDash, and Lord David Puttnam, Independent Producer, Chair of Atticus Education, and former member of the UK parliament's House of Lords.  This episode and our series have been generously sponsored by Nord Anglia Education.

Today we’re going to look ahead to the near and far future of AI in education, and ask what might be on the horizon that we can’t even predict, and what we can do as humans to future-proof ourselves against disruptions and innovations that have, like the Covid pandemic and ChatGPT's meteoric rise, rocked our education systems and demanded we do things differently.

Guests:

• Timo Hannay, Founder, SchoolDash
• Lord David Puttnam, Independent Producer and Chair of Atticus Education

    Talking points and questions include: 

    • Slow Reaction to AI: Despite generative AI's decade-long presence and EdTech's rise, the education sector's response to tools like ChatGPT has been surprisingly delayed. Why?
    • Learning from Our AI Response: Can our current reaction to generative AI serve as a case study for adapting to future tech shifts? It's a test of our educational system's resilience
    • AI's Double-Edged Sword: With ChatGPT's rapid rise, are EdTech companies risking harm by using AI without fully understanding it? Think Facebook's data misuse in the Rohingya massacre
    • Equipping Teachers for AI: Who can educators trust for AI knowledge? We need frameworks to guide them, as AI literacy is now as crucial as internet literacy
    • Digital Natives ≠ AI-Ready: Today's youth grew up online, but does that prepare them for sophisticated, accessible AI? Not necessarily
    11 June 2024, 11:48 am
  • 52 minutes 49 seconds
    #278 - AI as a Tool for Equity of Learning

Continuing our miniseries on AI in education with the fourth episode centred around AI's potential for equity of learning, host Professor Rose Luckin is joined by Richard Culatta of ISTE, Professor Sugata Mitra, and Emily Murphy of Nord Anglia Education.  This episode and our series are generously sponsored by Nord Anglia Education.

    In our fourth instalment of this valuable series, we look at AI’s potential to address various challenges and bridge the educational gaps that exist among different groups of students around the world.  AI can analyse vast amounts of data, provide early interventions, and enhance accessibility, and as long as the deployment of the technology is appropriate to the unique context of the school, the learners, the location, and the access to devices, AI can transform education for those who need the most support.

Guests:

• Richard Culatta, ISTE
• Professor Sugata Mitra
• Emily Murphy, Nord Anglia Education

    Talking points and questions include: 

• What do we mean by equity of learning, and how can we understand context?  Is there a danger that AI will simply be used to reinforce or replace existing conventional methods of assessing learning, despite its great potential?
    • What needs to fall into place for AI to be the promise for education we know it could be?
    • What needs to happen to have AI be the magic bullet for equity of learning from a teacher and headteacher perspective?  If the technology is there, and it has the potential it has, how can teachers build on that? 
    • How have different practices and innovations in the classroom been adopted and rejected… is AI going to succeed where other initiatives and technologies have either failed to be adopted, or plateaued and fallen by the wayside?  How is AI different?
    • How do we talk about getting school infrastructure in place to use AI?
• How do we convince educationalists, budget holders and local governance that AI and other emerging technologies are worth their investment?
    • There is some understandable fear about revolutionary technology disrupting existing practice in the classroom, but are we underestimating our students and teachers?
    7 May 2024, 11:36 am
  • 53 minutes 10 seconds
    #277 - AI from a Global Perspective

    Continuing our miniseries on AI in education with the third episode centred around a global perspective on AI, host Professor Rose Luckin is joined by Andreas Schleicher of the OECD, Dr Elise Ecoff of Nord Anglia Education, and Dan Worth of Tes.  This episode and our series are generously sponsored by Nord Anglia Education.

    In our third instalment of this valuable series, we head out beyond the UK and the English-speaking world to get a global perspective on AI, and ask how educators and developers around the world build and engage with AI, and what users, teachers and learners want from the technology that might tell people back home a thing or two. We examine how international use of AI might change the way we engage with AI, and we also ask why they might be doing things differently.

    Guests:

    • Dr Andreas Schleicher, Director for the Directorate of Education & Skills, OECD
    • Dr Elise Ecoff, Chief Education Officer, Nord Anglia Education
    • Dan Worth, Senior Editor, Tes

    Talking points and questions include: 

• What are other countries' tech and education ecosystems doing to develop and implement AI?
    • International considerations of ethics and regulation
    • Is the first world imposing a way of looking at technology and its innovation on the third world? What assumptions are we making, and are we mindful of the context?
• Is the first world restricting innovation through specific regulation, changing what technology is being built, how it is built, and who it might benefit?
    • Skills and competencies development can be driven by the needs of business - what priorities for AI education exhibited by international models could the UK adopt or consider?
    2 April 2024, 2:16 pm
  • 55 minutes 51 seconds
    #276 - AI, Metacognition, and Neuroscience

    What's in this episode?

    Continuing our new 5-episode miniseries on AI in education with the second episode on AI's relationship to neuroscience and metacognition, host Professor Rose Luckin is joined by Dr Steve Fleming, Professor of Cognitive Neuroscience at UCL, UK, and Jessica Schultz, Academic & Curriculum Director at the San Roberto International School in Monterrey, Mexico.  This episode and our series are generously sponsored by Nord Anglia Education.

Metacognition, neuroscience and AI aren’t just buzzwords but areas of intense research and innovation that will help learners in ways that until now have been unavailable to the vast majority of people. The technologies and approaches unlocked by study in these domains, however, must not be siloed or made inaccessible to public understanding. Real work must be done to bring these areas together, and we are tremendously excited that this podcast presents a great opportunity to showcase what inroads have been made, where, why, and how.

Guests:

• Dr Steve Fleming, Professor of Cognitive Neuroscience, UCL
• Jessica Schultz, Academic & Curriculum Director, San Roberto International School, Monterrey, Mexico

    Talking points and questions include: 

    • Neuroscience and AI are well-respected fields with a massive amount of research underpinning their investigation and practices, but they are also two very shiny buzzwords that the public likely only understands in the abstract (and the words may even be misapplied to things that aren't based in neuroscience or AI). Can you tell our listeners what they are, how they intersect with one another, and what benefits their crossover can provide in the realms of skills and knowledge?

• Can we use one field, AI or neuroscience, to talk about the other and better 'sell' the idea of that other field of study, and in this way drastically raise the bar of what it is possible to detect, uncover and assess in education using these domains?

• In practical terms, how do we use AI and neuroscience to measure what might be considered 'unmeasurable' in learning?  What data is required, what expertise within the team or in a partner organisation can be leveraged, and who can be responsible for doing this in an educational or training institution?  What data, competencies or human resources do they need access to?

    Sponsorship

    Thank you so much to this series' sponsor: Nord Anglia Education, the world’s leading premium international schools organisation.  They make every moment of your child’s education count.  Their strong academic foundations combine world-class teaching and curricula with cutting-edge technology and facilities, to create learning experiences like no other.  Inside and outside of the classroom, Nord Anglia Education inspires their students to achieve more than they ever thought possible.

    "Along with great academic results, a Nord Anglia education means having the confidence, resilience and creativity to succeed at whatever you choose to do or be in life." - Dr Elise Ecoff, Chief Education Officer, Nord Anglia Education

    6 March 2024, 2:45 pm
  • 55 minutes 16 seconds
    #275 - Preparing Young People for their Future with AI

    What's in this episode?

    Delighted to launch this new 5-episode miniseries on AI in education, sponsored by Nord Anglia Education, host Professor Rose Luckin kicks things off for the Edtech Podcast by examining how we keep education as the centre of gravity for AI. 

    AI has exploded in the public consciousness with innovative large language models writing our correspondence and helping with our essays, and sophisticated images, music, impersonations and video generated on-demand from prompts.  Whilst big companies proclaim what this technology can achieve and how it will affect work, life, play and learning, the consumer and user on the ground and in our schools likely has little idea how it works or why, and it seems like a lot of loud voices are telling us only half the story.  What's the truth behind AI's power?  How do we know it works, and what are we using to measure its successes or failures?  What are our young people getting out of the interaction with this sophisticated, scaled technology, and who can we trust to inject some integrity into the discourse?  We're thrilled to have three guests in the Zoom studio with Rose this week:

    Talking points and questions include: 

    • We often ask of technology in the classroom 'does it work'?  But when it comes to AI, preparing people to work, live, and play with it will be more than just whether or not it does what the developers want it to.  We need to start educating those same people HOW it works, because that will not only protect us as consumers out in the world, as owners of our own data, but help build a more responsible and 'intelligent' society that is learning all of the time, and better able to support those who need it most.  So if we want that 'intelligence infrastructure', how do we build it?
    • What examples of AI in education have we got so far, what areas have been penetrated and has anything radically changed for the better?  Can assessment, grading, wellbeing, personalisation, tutoring, be improved with AI enhancements, and is there the structural will for this to happen in schools?
    • The ‘white noise’ surrounding AI discourse: we know the conversation is being dominated by larger-than-life personalities and championed by global companies who have their own technologies and interests that they're trying to glamourise and market. What pushbacks, what reputable sources of information, layman's explanations, experts and opinions should we be listening to to get the real skinny on AI, especially for education?

Sponsorship

    Thank you so much to this series' sponsor: Nord Anglia Education, the world’s leading premium international schools organisation.  They make every moment of your child’s education count.  Their strong academic foundations combine world-class teaching and curricula with cutting-edge technology and facilities, to create learning experiences like no other.  Inside and outside of the classroom, Nord Anglia Education inspires their students to achieve more than they ever thought possible.

    "Along with great academic results, a Nord Anglia education means having the confidence, resilience and creativity to succeed at whatever you choose to do or be in life." - Dr Elise Ecoff, Chief Education Officer, Nord Anglia Education


    21 February 2024, 9:35 am
  • 57 minutes 59 seconds
    #274 - Managing Your School's Digital Transformation

Digital Transformation!  Digital Strategy!  Professional Education!  What do they mean, and how do we implement them in a school?  In today's episode we’re very lucky to be joined by three wonderful guests who operate at the intersection of educational practice and the leveraging of technology for a better learning experience.  They are:

    • James Symons, CEO, LocknCharge 
    • Katie Novak, Education Strategist, Smart Technologies 
    • Associate Professor Jane Hunter, School of International Studies and Education, University of Technology, Sydney 

    Each of these guests has a long history of working within the education space, from engineering and installing the hardware and catering to the evolving demands of schools, to leveraging the technology as a communal bridge between parents, teachers and students, and finally to researching and understanding the added value such technologies provide for teachers and learners and how they might successfully incorporate their use into daily practice. 

    Talking points and questions include: 

    • The evolving demands of the classroom – what futureproofing and future planning exists in each of your spaces to accommodate new trends and developments?  For those catering to the hardware, does the school or college determine what you make, or are they, and the ways their teachers and learners perform, conditioned by you?  What space is there for reciprocity between the EdTech maker and the EdTech user? 
• Teacher professional education – how important is this?  Surely a learning tool lives and dies by the amount of training and ‘after-sales support’ provided to practitioners?  What is the extent of the refusal by a teacher or department to adopt the technology, and how is this overcome?  Is it just waving statistics about time-saving and cost-saving in their faces, or is there a form of trust that must be engendered?
• Digital strategy – this means different things to different stakeholders.  What are the commonalities that should be agreed upon for successful rollout of technology?  Obviously contextual factors are key to each school, but what are the non-negotiables?  And with regard to developments like generative AI and other future trends we can’t even predict yet, what kinds of guardrails need to be in place with teachers, leaders, and the developers of the tech to ensure ongoing supportive relationships with stakeholders?  What foundations should be in place to support digital transformation no matter the bumps in the road ahead?

    Material discussed in today's episode includes:

    6 December 2023, 10:29 am
  • 52 minutes 13 seconds
#273 - How to Stay in Love with Science

SCIENCE!  Under discussion today are the ways in which students who were switched off the sciences at school manage to retain their curiosity about the subjects and can even reengage with them later in life.  Professor Rose Luckin is very lucky to have in the online studio this week Dr Andrew Morris, Honorary Associate Professor at UCL, former president of the Education Section of the British Science Association, and author, whose book, Bugs, Drugs, and Three-Pin Plugs: Everyday Science, Simply Explained, is now available wherever books are sold.

    Dr Morris has an interest in serving learners and the public through scientific and evidence-based outreach.  The discussion in the studio centred around science, technology, research and practice in education.

    Talking points and questions:

    • The ways in which people who were switched off the sciences at school retain their curiosity and can reengage with science at a later point in life
    • Examples of topics and ways of approaching science that have been revealed by Dr Morris’ science discussion groups
    • Research-informed educational practice, and research-informed educational policy
    • Ways in which research can be transformed and mediated for use

    Material discussed in today's episode includes:

    25 October 2023, 11:56 am
  • 56 minutes 54 seconds
    #272 - Is Attention the Currency of Learning?

Rose hosts Daisy Christodoulou, Director of Education at No More Marking, in the EdTech Podcast Zoom studio this week, discussing AI regulation, evidence and effectiveness, student outcomes in AI assessment, and what we think the future of AI-powered education might look like, and why!

In late March of this year, Professor Rose Luckin and Daisy Christodoulou spoke at the UK parliament’s Governance of Artificial Intelligence oral evidence session for education, and the discussion that took place was passionate and exciting.  A link to the video of the session is below in the Show Notes if you’d like to watch it yourself.  A lot of ground was covered, yet not as much as they would have wished!

The interest in AI and its governance is very intense at the moment.  The UK government had published a white paper setting out its proposed approach to the governance of AI, and the indication from the paper was that rather than give responsibility for AI governance to a single new AI regulator, it intended to empower existing regulators, several of which already exist in the education sector.  Other points raised during the session included the idea of teaching the public a degree of scepticism about AI, meaning, for instance, that the public should not believe everything that something like ChatGPT, a large language model, returns when queried.  Concerns about the speed of AI development were raised, and there were questions on safeguarding, ethics, transparency, explainability, access to the technology, autonomy, adaptivity and more.

    In today’s episode, we’d like to revisit those thoughts on AI regulation, evidence and effectiveness, student outcomes in AI assessment, and what we think the future of AI-powered education might look like and why… 

    Talking points and questions include: 

    • Quality of evidence for improved student outcomes using AI 
    • The value of assessment: how, when, why, and in what form 
    • More discussion around the future of education with AI’s inclusion, and what we can do now 

    Material discussed in today’s episode includes:

    14 September 2023, 3:47 pm