Anthropic published a report focused on how educators, not just students, use Claude at university: what they do with the tool day to day, and what concerns come up. The document was published on August 27, 2025 and combines conversation analysis with qualitative interviews with professors. (anthropic.com)
Main findings
The team analyzed roughly 74,000 Claude conversations from university email accounts during May and June, complementing that data with interviews with 22 faculty members from Northeastern University. These numbers give an empirical snapshot of how professors use the tool, though there are sampling limits. (anthropic.com)
The three most common uses that emerged from the data were:
- Developing course plans and materials: 57% of identified conversations. (anthropic.com)
- Academic research: 13%.
- Assessing student performance: 7%.
Additionally, the report cites a Gallup survey suggesting that AI tools save educators time: respondents reported saving an average of 5.9 hours per week. (anthropic.com)
How educators use Claude in practice
Have you ever imagined turning an idea into an interactive simulation in an afternoon? Some professors are already doing that. They use the Artifacts feature to build educational games, lab simulations, quizzes with automatic feedback, and dashboards that help visualize complex data, creations that used to demand significant time or technical resources. (anthropic.com)
You also see clear administrative uses: drafting recommendation letters, preparing meeting agendas, generating minutes, and even supporting budget planning. In class, AI shows up as an idea partner, helping explain concepts from different angles or designing materials tailored to various levels.
Augmentation versus automation
A key takeaway from the report is the distinction between using AI as a partner (augmentation) and delegating whole tasks to it (automation). For creative, high-context tasks like designing lessons or writing research proposals, educators prefer to keep control and use AI as support.
By contrast, routine tasks like financial management or student record keeping show stronger trends toward automation. Notably, 48.9% of grading-related interactions were classified as heavy automation, even though educators rate AI as less effective in that area. (anthropic.com)
Does this mean AI will replace the teacher? Not necessarily. What we see is partial delegation for administrative tasks and collaborative use for work that requires professional judgment.
Ethical dilemmas and study limitations
The report is candid about its limits: the sample comes only from accounts with university emails, focuses on higher education, and likely captures educators already inclined toward technology. It doesn’t include K–12 and doesn’t necessarily reflect what happens on other platforms. This limits how much we can generalize the results. (anthropic.com)
Also, automating grading raises ethical and pedagogical questions: who is accountable for the evaluation? How do we ensure accuracy and fairness in model-generated feedback? Several professors in the study expressed skepticism about fully delegating assessment.
What you can do if you’re an educator, administrator, or policymaker
- Try AI on time-consuming tasks that don’t compromise core pedagogical values.
- Stay in the loop: use AI to draft or analyze, but review and contextualize the results.
- Redesign assessments to promote skills AI can’t easily replace, like critical thinking and teamwork.
- Develop clear guidelines on transparency and responsible AI use at your institution.
These recommendations don’t eliminate complexity, but they help make AI adoption practical and ethical.
Looking ahead
The report shows educators in an experimental mode: building tools, rethinking assignments, and negotiating ethical boundaries. This isn’t an instant shift; it’s an adaptation process where technology expands capabilities but also demands responsibility.
If you want to read the full report, Anthropic published it on their site with more details and charts. (anthropic.com)
The relevant question is no longer whether AI will come to the classroom. The question is how the education community integrates it to improve learning without losing quality or ethics.