AI is now embedded in teaching and learning. As educators, how do we help students benefit from AI without slipping into dependency, surface-level work, or ethical misconduct? I’ve found it helpful to clarify conversations with students (and with myself) by thinking of AI as a teammate (or teammates) with clearly defined roles: the Tasker, the Draftsmith, and the Facilitator.
Imagine a team meeting: someone handles logistics, someone sketches ideas, someone pushes the discussion forward. Framing AI this way has made it easier to talk with students about remaining responsible for how, and for what purposes, they use AI. My approach draws on recent scholarship on how AI is reshaping teamwork (Dell’Acqua et al., 2025).
The AI Teammates
The Tasker
The Tasker handles repetitive or procedural work such as managing our calendars, formatting citations, or cleaning datasets. These tasks are often tedious but essential, and handing them to AI frees time for deeper thinking. To use a Tasker, we can set up Robotic Process Automation (RPA) with AI, use embedded AI tools such as Copilot in Microsoft Word or Claude in Excel or Google Sheets, or create what is currently referred to as an LLM “agent” (a minimal sketch appears below). As Farri and Rosani (2025) describe in their HBR guide to generative AI for managers, we can use AI to lighten the load of procedural chores, freeing the time and energy to engage in deeper thinking and learning.
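Here is one minimal sketch of a citation-formatting Tasker, assuming the OpenAI Python client and an API key; the model name and prompt wording are illustrative, not a recommendation:

```python
# A citation-formatting Tasker: a minimal sketch, assuming the OpenAI
# Python client (pip install openai) and an API key in OPENAI_API_KEY.
# The model name and prompt wording are illustrative.
from openai import OpenAI

client = OpenAI()

def format_citations(raw_citations: list[str]) -> list[str]:
    """Reformat rough citations into APA style; output still needs human review."""
    formatted = []
    for raw in raw_citations:
        response = client.chat.completions.create(
            model="gpt-4o-mini",  # illustrative choice of model
            messages=[
                {"role": "system",
                 "content": "Reformat this citation into APA 7 style. "
                            "Return only the citation. Do not invent missing fields."},
                {"role": "user", "content": raw},
            ],
        )
        formatted.append(response.choices[0].message.content.strip())
    return formatted

# The Tasker drafts; the student validates every line before using it.
for citation in format_citations(
    ["farri rosani 2025 hbr guide to generative ai for managers"]
):
    print(citation)
```

However the Tasker is built, the design choice that matters is the last step: the agent drafts, and a human validates.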
Because AI output may contain mistakes, students must learn to decide what level of risk is acceptable for a given task before turning to AI, and they must validate its output. How much validation is required depends on that risk level, the accuracy and quality needed, and what they know about the tool’s track record.
We approach the Tasker differently than we do the Draftsmith and Facilitator: we can set up our RPA or agent, supervise its output until we’re sure it works independently and accurately, and then let it run on its own; or we can trigger the Tasker each time we need it. Students must learn to budget sufficient validation time when designing a Tasker.
The Draftsmith
The Draftsmith can spot passive voice in students’ writing and explain why it’s problematic, convert an outline into a draft PowerPoint deck, and produce study tools such as multiple-choice or essay questions, podcasts, and videos from course notes. My engineering students have prompted their Draftsmiths using a prompt library I created for them to convert their design clients’ stated needs into project requirements; a sketch of one such entry appears below. Bussgang (2025) has pointed out how small steps like these can dramatically boost productivity without sacrificing learning.
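For instance, one entry in such a library might look like the following sketch; the template wording is illustrative, not my actual library:

```python
# One Draftsmith prompt-library entry: translating a design client's stated
# needs into draft project requirements. A sketch; the wording is illustrative.
REQUIREMENTS_PROMPT = """\
You are helping an engineering student translate client needs into requirements.
Client statement: {client_statement}
For each need you identify, draft one requirement that is specific, measurable,
and solution-neutral. Flag any need that is too vague to translate, and list
clarifying questions the student should ask the client rather than guessing.
"""

def build_prompt(client_statement: str) -> str:
    """Fill the template; the student critiques and rewrites whatever comes back."""
    return REQUIREMENTS_PROMPT.format(client_statement=client_statement)

print(build_prompt("We want the device to be easy to use outdoors."))
```

Note that the template asks the AI to flag vagueness and pose questions rather than guess; the student, not the Draftsmith, still owns the requirements.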
While AI may generate text or ideas, the final product must be the student’s own (Duffy, 2025). I teach my students that, much to their dismay, they must develop their own writer’s voice before relying on AI, and that this process can take years and many pieces of writing! Our writer’s voice is how we present ourselves to bosses, colleagues, and clients, since so much of our communication at work happens over email. College is the best time to invest in this effort because opportunities to do so are rare later.
From students’ comments, I believe they use AI in the Draftsmith role more frequently than in the other two. Few realize that the Draftsmith increases cognitive load when used correctly and decreases learning when used incorrectly!
It increases the load because students must give the AI significant context before it produces anything other than banal, generic prose that flattens their voice (Purohit, 2025) or “creative” prose that is meaningless. It decreases learning when students use it to write first and second drafts, because writing sharpens our thinking and embeds concepts in long-term memory.
The Facilitator
Where the Draftsmith generates content, the Facilitator uses AI as a thinking partner (Solis, 2024; Harvard Business Publishing, 2025). A student might ask AI to play devil’s advocate, suggest counterarguments, or pose probing questions when a paper’s argument needs strengthening. AI might simulate a stakeholder with whom the student role-plays to prepare for a meeting, or help them consider how their ideas might land with peers in their student group. My students have used AI in the Facilitator role to pressure-test design ideas, review research plans, and prepare for presentations. When students use the Facilitator, they’re reflecting (Harbridge, 2025) and thereby deepening their learning.
To get the best value from AI as a Facilitator, students must expect the process to take considerable time. The Facilitator provides meaningful interaction only when students give it their full attention, ask probing questions, supply significant context, and are patient. They may need to iterate, restating prompts or adding information, and ask the LLM to be candid and critical. And when it heads in the wrong direction, they may need to step away, try a different model, or come back another day. A minimal sketch of such a session follows.
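Here is one illustration of a devil’s-advocate session, again assuming the OpenAI Python client; the system prompt, model name, and thesis are illustrative:

```python
# A Facilitator sketch: an iterative devil's-advocate session. Assumes the
# OpenAI Python client and an API key; prompts and model are illustrative.
from openai import OpenAI

client = OpenAI()

SYSTEM = ("Act as a candid, critical thinking partner. Do not rewrite my text. "
          "Name my weakest claim, give the strongest counterargument, and end "
          "with one probing question. Be direct rather than polite.")

messages = [
    {"role": "system", "content": SYSTEM},
    {"role": "user", "content": "My thesis: remote teams innovate faster than "
                                "co-located teams."},
]

# The value is in the back-and-forth: read the full reply, think, respond.
while True:
    reply = client.chat.completions.create(model="gpt-4o-mini", messages=messages)
    text = reply.choices[0].message.content
    print(text)
    messages.append({"role": "assistant", "content": text})
    follow_up = input("Your response (press Enter to stop): ")
    if not follow_up:
        break
    messages.append({"role": "user", "content": follow_up})
```

The loop matters more than any single reply: the student reads the full response, thinks, and answers, rather than copying anything out.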
We have become too used to scanning an LLM’s output. Just as I would not expect a student who came to me for career advice to start scrolling on their phone while I’m talking, I must teach them to give their full attention to the Facilitator.
What I’m suggesting here is not a list of use cases. Instead, I’m giving students an approach to AI in which they plan before typing into the chat box, see its best use as a Facilitator that partners with them in learning and development, and recognize that validating and reading output will be time-consuming and cognitively demanding.
What This Means for Teaching
I’ve found that when I teach these roles to students before assigning homework, they make better choices regarding when and how to use AI. Here are some tactics I’ve tried that have netted positive results:
- Design assignments that invite different roles. For example, ask students to begin with AI as a Tasker (perhaps organizing sources or cleaning a dataset), then shift to the Draftsmith to find passive voice or other syntax issues in a paper, and finally draw on the Facilitator to find the holes in their arguments before submitting the final paper. Naming the role each assignment calls for, and requiring students to state when they’re using the Tasker, Draftsmith, or Facilitator, solidifies their understanding that their approach to AI differs by role.
- Be clear about where AI helps and where it doesn’t. AI can help with structure, suggestions, and ideation, but it cannot replace revision, critical thinking, or the development of one’s own voice. This must be taught intentionally: any assignment that allows AI use should include questions requiring students to reflect on what worked and where the AI led them astray.
- Encourage disclosure. I talk extensively about “Total AI Transparency,” assuring students that I will note any use of AI in my communications with them and that I expect the same from them. Trust has broken down: students and instructors each suspect the other of using AI when they are not, and fail to recognize it when they are. Owning up to my own use, and being clear about why I used AI for a given task, encourages students to do the same.
Closing Thoughts
We need to steer students away from the extremes of refusing to use AI at all and letting it rob them of learning. By distinguishing the roles of Tasker, Draftsmith, and Facilitator, we clarify our expectations of students. More importantly, we help them become more deliberate in their approach, preparing them to be AI-fluent.
Illysa Izenberg is an Associate Teaching Professor for the Center for Leadership Education in the Whiting School of Engineering at The Johns Hopkins University. She has been teaching management and business ethics to graduate and undergraduate students in face-to-face, online, and blended courses since 2006. Izenberg earned her MBA from the Harvard Graduate School of Business and has won both the Alumni and Pond Excellence in Teaching Awards (2016 and 2020, respectively).
References
Bussgang, J. (2025, May 20). 3 small steps to 10x your productivity with AI this week. The Experimentation Machine. https://experimentationmachine.com/p/10x-this-week
Duffy, A. (2025, April 11). Your CEO just said ‘Use AI or else.’ Here’s what to do next. Every. https://every.to/p/your-ceo-just-said-use-ai-or-else-here-s-what-to-do-next
Farri, E., & Rosani, G. (2025). HBR guide to generative AI for managers. Harvard Business Review Press.
Gross, A. (2025). AI Crash Course [Online course]. SectionAI. https://www.sectionschool.com/courses/ai-crash-course-workshop
Harbridge, R. (2025). AI for Team Leaders [Online course]. SectionAI.
Harvard Business Publishing. (2025). Help your team harness generative AI (Lesson 1). In Leading with generative AI [Online course]. Harvard ManageMentor.
Purohit, R. (Host). (2025, June 18). The man inside the minds of the people building AGI [Audio podcast episode]. In AI & I. Every.
Solis, B. (2024, November). Train your brain to work creatively with gen AI. Harvard Business Publishing.