Learn how to make informed, responsible decisions about using AI tools like ChatGPT and Microsoft Copilot for studying or completing academic work.
AI can be a helpful learning partner when used with intention and integrity. These tips will help you decide when and how to use AI effectively, ethically and in line with U of T’s academic expectations.
How learning works
Key principles to keep in mind about learning when using AI:
- Learning gains don’t always match our perceptions
It’s easy to feel like you’ve learned something, but true understanding usually requires testing yourself or applying what you’ve learned on your own, without any aids.
- Effective learning requires time and practice
Shortcuts might be appealing but often lead to more work later on. Investing time and effort up front ensures a deeper understanding and long-term retention.
- Learning benefits from desirable difficulty
Experiencing some struggle while studying is crucial for durable, long-term learning. This effort helps build the deep understanding necessary for success at U of T and beyond.
- Learning is iterative and built upon reflection
By evaluating what you understand and identifying gaps in your knowledge, you can build deeper comprehension over time – especially with complex ideas.
Using AI can bypass these valuable principles; use it to support, not replace, your own deep, active learning.
How to use AI tools to support your learning
1. Be intentional
Before using an AI tool, take a moment to ask what kind of academic support you really need. That could be understanding a concept, organizing your ideas or doing research.
Ask yourself why you’re using an AI tool to make sure it aligns with your learning goals.
Tips
- Skill-building vs. deskilling
Are you developing the skill your course requires (e.g., summarizing, analyzing, comparing), or are you offloading it to the AI?
- Focus on process, not outcome
Use AI to support your learning process (e.g., breaking down a task or generating practice questions), not to produce an outcome (e.g., a completed paper or a finished problem set).
- Practice prompting
The more time you spend refining prompts (the instructions or questions you give an AI tool) in conversation with AI, the more relevant the output is likely to be.
2. Know the limits
AI tools can be helpful, but they have limits. They rely on patterns and predictive language models, not actual thinking, which can produce biased or incorrect information.
Understand how AI works to help you use it more critically and responsibly.
Tips
- Think critically
AI tools can fabricate answers and citations (called “AI hallucinations”). Maintain a healthy skepticism, verify the information provided and draw your own conclusions.
- Evaluate outputs
Check AI’s output against more reliable sources, such as your course materials, readings and instructor’s guidance. This is especially important when dealing with new or complex topics.
- Engage in discussion
Challenge the AI tool by asking it to justify its answers, cite its sources and explain its reasoning. Then, critically engage with the information by considering your own counterarguments and ideas.
3. Act with integrity
Using AI in your coursework comes with responsibilities. It’s up to you to know what’s allowed in your courses, departments and programs, and to ask questions when you’re unsure.
Know the rules around academic integrity and AI to help you avoid academic misconduct.
Tips
- Know what is permitted
Check U of T’s Code of Behaviour on Academic Matters, graduate and undergraduate guidelines and your course syllabi for specific permissions and expectations regarding AI use.
- Ask for clarification
Unsure about the rules on AI in your course? Talk to your instructors before you submit any work. Being upfront can help you avoid unintentional misconduct.
- Be transparent
Ensure that all work submitted for tests, exams and assignments is your own. Using AI tools to generate your work can be academic misconduct, as it’s an offence to present any idea, expression of an idea, or work of another — including that created by AI — as your own.
4. Protect your privacy
Be careful about what you share with AI tools, especially free or open-source ones. Some tools collect and use your input to train their models.
Protect your privacy and intellectual property by keeping personal, academic or sensitive information—yours or someone else’s—out of your prompts.
Tips
- Share data with caution
Avoid using identifiable or private data in your prompts. Anything you post or upload, e.g., your own research and writing, may remain online indefinitely or be used to train AI models.
- Be mindful of intellectual property
Avoid uploading or sharing content that belongs to your instructors without their permission. This includes lecture notes, assignments, journals and any other copyrighted material.
- Use approved tools
At U of T, you can use Microsoft Copilot to create learning and study aids. When you log in with your UTORid, you access a private enterprise license, so your prompts are not used to train the AI model, helping to protect your privacy and intellectual property.
5. Check your learning
AI can’t replace the benefits of summarizing, solving problems or analyzing on your own.
Check or test your learning, independent of technology.
Tips
- Embrace desirable difficulty
If using AI made the learning process feel too easy, that might indicate that durable learning didn’t occur. Some struggle with difficult concepts can enhance your long-term retention.
- Be the “human in the loop”
Use AI as a tool for exploration, but make sure you’re driving the process. You have the final say in what AI-generated content you choose to use or dismiss.
- Practice without AI
After using AI for support, try to complete the academic task or activity on your own to reinforce your learning.
6. Seek support
Struggling with academic work can feel frustrating, but you’re not alone in working through it.
Reach out to your instructors or campus support instead of relying on AI to do the work for you when you’re feeling stuck or overwhelmed.
Tips
- Connect with a Learning Strategist
A Learning Strategist’s insights can help ensure you’re using AI tools ethically, effectively and in a way that supports your learning.
- Attend an AI and Learning workshop
Want some hands-on practice using AI tools? Our workshops focus on practicing learning strategies using AI.
- Consult resources
Like good learning, using AI tools effectively takes time, curiosity and practice. Stay informed by experimenting with AI tools and engaging with new ideas and perspectives.
Resources on using AI in your academic work
- “AI can do your homework. Now what?” (YouTube, Vox)
- AI education and use of GenAI for students (University of Toronto)
- Best Prompt Engineering Courses (Coursera)
- ChatGPT and Generative AI in the Classroom FAQ (Vice-Provost, Innovations in Undergraduate Education, University of Toronto)
- Citing Artificial Intelligence (AI) Generative Tools (including ChatGPT) (University of Toronto Libraries)
- “Getting started with AI: Good enough prompting.” (Ethan Mollick, One Useful Thing)
- Guidance on the Appropriate Use of Generative Artificial Intelligence in Graduate Theses (School of Graduate Studies, University of Toronto)
- Using ChatGPT or other generative AI tools on a marked assessment (Academic Integrity, University of Toronto)
Additional resources:
- Using GenAI: Students (UTSC)
- Generative AI Guide for Students (UTM Teaching and Learning Resource Hub)
- Use artificial intelligence intelligently (U of T Information Security)
- Generative AI tools and Copyright Considerations (U of T Libraries)
The ideas presented on this page were developed by a team of learning strategists at the Centre for Learning Strategy Support, drawing on both research and practice. Some examples and wording were refined or expanded using AI tools like Copilot, then edited and proofed by our team to ensure accuracy and clarity.