ReporticaAI

Implementation Guide

February 2026

AI in Professional Education: A Practical Implementation Guide

For Nurse, Midwifery and Social Work Educators

Informed by RCN Congress 2025 discussions on digital literacy, the Ada Lovelace Institute research on AI risks in social work (February 2026), and emerging guidance on structured AI use in professional education.

Guide Contents

  1. Introduction: The State of AI in Professional Education
  2. The Structuring vs Generating Distinction
  3. Why This Matters Now: The Ada Lovelace Findings
  4. Ethical Framework for AI Use in Portfolio Development
  5. Practical Implementation Models
  6. Addressing Common Concerns
  7. Educator Resources and Templates
  8. Further Reading and References

1. Introduction: The State of AI in Professional Education

May 2025: The Royal College of Nursing Congress formally discussed artificial intelligence in nurse education, with RCN Wales advocating for Higher Education Institutions to "equip students with the skills to continually enhance their digital and biotechnological literacy."

February 2026: The Ada Lovelace Institute published findings from an eight-month study across 17 English and Scottish councils, revealing that AI transcription tools used in social work are producing hallucinations, inaccuracies, and "potentially harmful misrepresentations of people's experiences" in official care records.

The consensus emerging across health and social care education is clear:

Position | Implication
AI is already in healthcare and social work practice | Students need preparation for tech-enabled workplaces
Digital literacy is a professional requirement | Not optional, but a core competency
Ethical use must be taught | Bans don't work; guidance does
Not all AI tools carry equal risk | The type of AI matters as much as its use
Student wellbeing matters | Reducing administrative burden supports learning

The question is no longer "should we use AI?" but "how do we use it responsibly and which type of AI is safe to use?"

2. The Structuring vs Generating Distinction

This distinction is central to ethical integration and to understanding why recent research has raised serious concerns about certain AI approaches in social care.

Generating AI (e.g. transcription tools) | Structuring AI (e.g. ReporticaAI)
Creates new content from prompts or audio | Organises user-provided content
Risk of hallucinations and fabricated content | No content generation means no hallucinations
AI "writes" summaries and assessments | User provides all professional content
Can misinterpret accents and speech patterns | User pastes their own notes; no transcription errors
Opaque: unclear how the AI reached its conclusions | Transparent: the user's notes plus a professional structure
Can replace professional thinking | Preserves professional judgement

"Like teaching students to use SBAR for handovers: the framework is provided, the clinical content is theirs."

3. Why This Matters Now: The Ada Lovelace Findings

In February 2026, the Ada Lovelace Institute published research based on an eight-month study across 17 councils in England and Scotland, examining AI transcription tools used by social workers. The findings, reported by The Guardian, raise fundamental questions about AI safety in professional practice.

Key findings from the research:

  • An AI transcription tool incorrectly indicated suicidal ideation in a case summary; the topic had never been discussed by the service user

  • AI-generated transcriptions included "gibberish" and references to "fishfingers or flies or trees" when a child was discussing parental conflict

  • Some social workers spent as little as two minutes checking AI-generated transcripts before entering them into official records

  • Reports of disciplinary action for failing to properly check AI outputs and missing obvious errors

  • BASW is calling for regulators to issue clear guidance on how and when AI tools should be used

Separately, LSE research published in August 2025 found that AI summarisation tools used by English councils introduced gender bias into care assessments: men's needs were described as "complex" while the same needs in women were described as "independent" and "able to manage".

Additional concerns with transcription-based AI tools:

  • Client consent: recording conversations requires verbal consent from vulnerable service users. Some clients may refuse or feel pressured.

  • Data retention: government transparency records show some tools retain sensitive case data with third-party providers for up to 120 days, shared across multiple sub-processors.

  • Procurement barriers: transcription tools typically require council-level procurement, DPIAs, and council-issued devices, limiting access for individual practitioners, students, and smaller organisations.

These findings reinforce why the distinction between generating and structuring AI is not academic; it is a safeguarding issue. When AI fabricates content, introduces bias, or retains sensitive data, the consequences fall on vulnerable service users and on the professionals responsible for those records.

Structuring tools that organise the practitioner's own notes, without generating, transcribing, or interpreting content, eliminate these specific risks entirely. No recording means no consent issues. No generation means no hallucinations or bias. No retention means no data exposure.

4. Ethical Framework for AI Use in Portfolio Development

We recommend this four-pillar framework for students:

Transparency

Disclose AI use where required. Save original notes alongside final output.

Ownership

You are responsible for all final content. Review, edit, and verify.

Privacy

Never enter patient-identifiable or service-user-identifiable information. Use anonymised notes only.

Development

Use AI to learn documentation standards, not bypass them.

Suggested learning outcomes:

  • Demonstrate ability to structure clinical or practice observations appropriately
  • Critically evaluate AI-structured drafts against professional standards
  • Maintain confidentiality while using digital tools
  • Articulate rationale for using/not using AI assistance

5. Practical Implementation Models

Model A: Structured Support (Low Integration)

  • When: Optional drop-in sessions before portfolio deadlines
  • How: Students shown structuring tools as an optional resource
  • Safeguards: Clear guidance on ethical use
  • Assessment: No change to submission requirements
  • Best for: Early adoption, building familiarity

Model B: Embedded Digital Literacy (Medium Integration)

  • When: Part of a professional skills module
  • How: 2-hour workshop: "Structuring Reflections with AI"
  • Safeguards: Students submit original notes + AI output + reflection
  • Assessment: Quality of reflection assessed, not tool use
  • Best for: Programmes ready to teach responsible AI use

Model C: Curriculum Integration (High Integration)

  • When: Throughout placement blocks
  • How: Structuring AI integrated into portfolio guidance
  • Safeguards: Departmental policy on AI use; educator oversight
  • Assessment: Portfolio criteria updated to include digital literacy
  • Best for: Forward-thinking departments, technology-enhanced learning

6. Addressing Common Concerns

Concern | Educator Response
"Students will stop reflecting deeply" | Reflection happens before and after structuring. The tool organises; the student reflects.
"It's unfair to students without tech access" | A mobile-friendly, pay-per-use model with one free use removes barriers.
"How do I know it's their work?" | Require submission of original notes alongside the final output.
"The university has no AI policy" | Use this as a catalyst to develop one. Start with a pilot.
"It might breach confidentiality" | Build it into teaching: "Never enter patient-identifiable or service-user-identifiable data."
"What about AI hallucinations?" | Structuring tools organise your content; they don't generate new content, so hallucination risk is eliminated at source.

7. Educator Resources and Templates

Resource 1: Student Guidance Handout

  • What is structuring AI?
  • When to use it (and when not to)
  • Step-by-step: From notes to portfolio
  • Ethical use checklist

Resource 2: Sample Session Plan (90 minutes)

  • Introduction to AI in healthcare and social care (15 mins)
  • The structuring vs generating distinction (15 mins)
  • Hands-on practice with structuring tools (30 mins)
  • Reflection and discussion (20 mins)
  • Q&A and next steps (10 mins)

Resource 3: Assessment Integration Guide

  • Sample marking criteria for AI-assisted submissions
  • How to verify student input
  • Framework for discussing AI use in tutorials

Resource 4: Policy Template

  • Draft university policy on AI in professional education
  • Adaptable for departmental handbooks
  • Includes student declaration form

All resources are available on request: email support@reporticaai.co.uk

8. Further Reading and References

  • RCN Congress 2025: AI in Nurse Education (debate summary)
  • Ada Lovelace Institute (2026): research on AI transcription tools in social work across 17 councils
  • Robert Booth (2026), "AI tools make potentially harmful errors in social work records, research says", The Guardian, 11 February 2026
  • Standards Framework for Nursing and Midwifery Education, Part 1 (NMC, 2018, updated 2023)
  • Social Work England: Guidance on the Professional Standards (2020)
  • Sharif Haider et al. (2025), "Emerging use of AI in social work education and practice: A rapid evidence assessment of the literature" (Open University)
  • Sarah Rothera and Mairi-Anne Macdonald (2025), "Understanding the emerging use of artificial intelligence (AI) in social work education and practice in England", Research in Practice, National Children's Bureau
  • Jessica Murray (2025), "AI tools used by English councils downplay women's health issues, study finds", The Guardian, 11 August 2025
  • Jo Stephenson and Mithran Samuel (2025), "AI in social work: opportunity or risk?" (Community Care)
  • Romy Duckett (2025), "Why student nurses need critical AI literacy", Nursing Times
  • Department for Education: Generative AI in Education (2025)

About ReporticaAI

ReporticaAI is a UK-based, GDPR-compliant structuring tool for professional documentation. It helps students and practitioners structure placement notes, meeting records, and policy documents into professionally formatted, regulation-aligned output without generating, transcribing, or interpreting content.

  • UK/EU data residency; no data stored or retained
  • NMC, SWE, CQC, and Ofsted frameworks embedded
  • One free use across all tools; no subscription required
  • Governed by the PAIDS professional documentation standard

Free educator access available. Contact: support@reporticaai.co.uk

This guide is provided for educational purposes. Individual institutions retain responsibility for determining appropriate AI use in their context.

ReporticaAI

reporticaai.co.uk

support@reporticaai.co.uk

© 2026 ReporticaAI. All rights reserved.