As educational establishments wrestle with how, or whether, Generative AI (GenAI) should be formally integrated into their curricula, the conversation circles around a familiar tension: education versus training (I’d love to hear opinions from people embedded in the education space).
Should STEM degrees remain focused on deep technical foundations, or adapt to include the practical AI skills employers will expect? One promising middle ground is adding humanities courses that sharpen critical thinking, ethics, and communication – capabilities essential for using AI responsibly. The challenge is finding the right balance so educational establishments can preserve their mission to educate while preparing graduates for the realities of an AI-enabled workplace.
What is it?
STEM graduates are entering the workforce with strong technical skills – but many may have had little formalised, structured exposure to the practical, ethical, and contextual use of GenAI tools like large language models (LLMs).
While universities may still be debating policies and experimenting with curricula, the business world is already adopting AI at increasing speed. As a result, it’s reasonable for employers to expect new hires to use GenAI tools effectively, validate outputs, and apply them in real-world workflows.
The result? A skills gap: graduates who can code, model, or analyse – but may not know how to integrate GenAI tools into business processes, communicate GenAI-derived results to stakeholders, or navigate the ethical and compliance boundaries of AI use.
What does this mean from a business perspective?
For employers, this gap carries real risks:
- Productivity loss: Without structured guidance, new hires take longer to integrate GenAI into their work.
- Inconsistent results: Without shared standards, employees use GenAI with varying levels of quality and safety.
- Ethical and compliance risks: Misuse of GenAI could expose the company to data privacy breaches, IP violations, or biased outputs.
- Missed opportunities: Under-prepared employees may overlook ways to apply GenAI for innovation, cost savings, or customer experience improvements.
In short, the onus is on businesses to close the gap, or risk losing competitive advantage.
Advice for STEM Students
If you’re in a STEM program now, you can prepare yourself for an AI-enabled workplace by adding humanities courses that build the human skills AI can’t yet replace – judgement, ethics, and clear communication (I truly wish we had these in my undergrad – and that I had seen the need more clearly for the blend).
Areas to explore include:
- Ethics and Philosophy of Technology: Understand the moral and societal impact of AI tools.
- Linguistics and Semantics: Learn how language works to ultimately design better prompts and interpret AI outputs.
- Cognitive Science or Psychology: Gain insight into how humans think, decide, and err, and how GenAI compares.
- Rhetoric and Communication: Develop the ability to explain AI-derived results in clear, persuasive ways.
- History of Science and Technology: See how past innovations shaped industries and society, and apply those lessons to AI.
Tip: Connect these courses back to your technical work – use assignments and projects to explore GenAI use cases in your field. This makes your learning relevant and portfolio-ready.
What do I do about it?
Forward-thinking businesses will already be taking steps to make GenAI fluency part of their workplace culture. Here’s how:
- Integrate AI into on-boarding: Cover basic prompt design (engineering), output validation techniques, acceptable-use and adoption policies, IP rules, and bias detection as part of new employee orientation.
- Create role-specific AI up-skilling: Tailor training to the needs of each department, from GenAI-assisted code review for engineers to GenAI-powered content ideation for marketing.
- Appoint AI champions: Pair new hires with GenAI-fluent mentors who can model safe and effective tool use.
- Encourage cross-disciplinary exposure: Host internal sessions where different teams share how they use GenAI, and the lessons learned (facilitated via a Community of Practice).
- Standardise validation and documentation: Make it policy to cross-check AI outputs and record how results were verified.
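The "standardise validation and documentation" point can even be made machine-checkable. Below is a minimal sketch of what a shared validation record might look like in code; the class name, check list, and fields are all hypothetical illustrations, not a prescribed standard.

```python
# Hypothetical sketch: a standardised record for documenting how a
# GenAI-assisted output was validated before use. All names here
# (ValidationRecord, REQUIRED_CHECKS) are illustrative assumptions.
from dataclasses import dataclass, field

# An example organisational checklist; each team would define its own.
REQUIRED_CHECKS = [
    "facts cross-checked against a primary source",
    "no confidential or personal data in the prompt",
    "output reviewed for bias and IP concerns",
]

@dataclass
class ValidationRecord:
    task: str                 # what the GenAI output was used for
    model: str                # which tool produced it
    checks_passed: list = field(default_factory=list)

    def is_compliant(self) -> bool:
        # Compliant only when every required check has been signed off.
        return all(check in self.checks_passed for check in REQUIRED_CHECKS)

# A fully signed-off record passes; a partial one does not.
record = ValidationRecord(task="draft Q3 summary", model="some-llm")
record.checks_passed = list(REQUIRED_CHECKS)
print(record.is_compliant())  # True
```

Even a lightweight structure like this gives teams a shared vocabulary for what "validated" means, and produces an audit trail for the compliance risks described above.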
Businesses can’t assume AI skills will arrive with new graduates. By making AI literacy part of on-boarding, embedding ethics, and providing role-specific training, companies can turn the skills gap into a competitive advantage, and ensure their workforce is ready for the GenAI-enabled future.
Further Reading
The Converging Paths of Computer Science and the Humanities in the Age of GenAI (Netta Ehrlich and Orit Hazzan)
