
AI at University: Smart Help vs Plagiarism and Cheating

Let’s be real, figuring out what you can and can’t do with AI at university is confusing. One lecturer says it’s okay to use for some things. Another threatens disciplinary action if you even think about ChatGPT.

In general, AI can be used to brainstorm ideas and explain complicated university jargon or assignment instructions, but it crosses the line into cheating when you use it to source information or write your assessments.

So where exactly is the line between smart help and plagiarism or cheating? The truth is, it depends on what you’re using AI for, and your university’s policy (which can change between subjects). But to make it easier, I’ve broken it down into three categories: ✅ things that are definitely okay, ⚠️ things in the grey zone, and 🚫 things that will get you flagged for plagiarism or academic misconduct.

You can also watch my video on using AI at University here:

What You Can Use AI For

AI is an amazing thinking tool. The best way I can explain it is this: use AI like you’d use a study buddy. Talk to chatbots, bounce ideas off them, and ask them to explain anything you don’t understand (yet).

Here are things that are 100% okay at most universities (I say ‘most’ because there are always some people out there who think a zero-AI policy is the way to go, but let’s be honest, they’re dinosaurs ignoring technology at this point):

1. Brainstorming Ideas

One of the best ways to use AI tools like ChatGPT at uni is for brainstorming. If you’ve been given an open-ended question and don’t know where to start, AI can help you come up with possible angles, key points, or even real-world examples to explore.

Think of it like bouncing ideas off a friend, only this one is available 24/7 and won’t get distracted halfway through. It’s a great way to:

  • narrow down a topic
  • look at different ways to approach a question, or come up with different angles
  • kickstart your thinking when you’re feeling stuck

Example:

Just remember: AI doesn’t know your specific assignment. You still need to double-check that your approach fits the assessment your lecturer wants you to complete.

2. Understanding Assignment Instructions

Let’s be honest, assignment instructions aren’t always easy to follow. I often found them confusing or unclear, and never felt like I knew exactly what I needed to do for an assessment. Uni-speak like “critically analyse,” “evaluate,” or “discuss with reference to scholarly sources” can seriously leave you staring at the page wondering what you’re actually meant to do (at least it did for me).

AI tools like ChatGPT can help translate those confusing instructions into everyday language so you know exactly what’s being asked, and then you can get started writing faster.

It’s especially helpful if:

  • your assignment brief is vague or full of technical jargon
  • you’re not sure what level of detail is expected
  • English isn’t your first language
  • you just want a second opinion before getting started

Example:

This kind of breakdown helps you actually understand what the question is asking, so you’re not guessing and then finding out when you get your marks back that you misread what your lecturer wanted. The other bonus: once you get used to what these uni terms mean, you’ll be much quicker at figuring out what your lecturer wants in future assignments.

The more you practise decoding assignment language, the easier it gets, I promise.

3. Explaining Academic Terms Or Concepts

Journal articles are full of academic (i.e. confusing and wordy) terms that no one ever really explains, but you’re expected not only to understand them, but also to use them in assignments. Whether it’s systematic reviews, epistemology, inductive reasoning, or discourse analysis, academic language can make it seriously hard to get started, let alone finish an assignment on time and get a good mark.

This is where AI can really help.

You can copy and paste in a term or a whole paragraph from a reading, journal article or lecture and ask AI to explain it like you’re five, or like you’re a tired uni student who just wants a straight answer.

It’s especially useful for:

  • making sense of tricky readings
  • understanding feedback from lecturers
  • checking you’ve got the right meaning before using a term in your work

Example:

This kind of breakdown helps you understand concepts, making it easier to follow what you’re reading and to use terms correctly in your own work.

4. Study Questions Or Prompts

AI is a real game changer when it comes to studying. It makes active recall study (the best study method – read more on it here) easy, so you’re not just passively reading your notes or highlighting a textbook and wasting your time.

You can use it to:

  • generate your own practice questions after a lecture
  • test yourself on key ideas and terms
  • reframe notes into flashcard-style Q&As
  • prepare for exam topics using different question types (short answer, multiple choice, etc.)
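If you want to take it one step further, you can even drop AI-generated questions into a tiny script and quiz yourself from the terminal. Here’s a minimal Python sketch (the cards and keywords are placeholders I made up; swap in your own material):

```python
# Minimal self-quiz sketch: store AI-generated questions alongside the
# keywords a good answer should mention, then score your own attempts.
# The cards below are placeholders: replace them with your own material.
flashcards = [
    ("What does 'critically analyse' mean in an assignment brief?",
     ["strengths", "weaknesses"]),
    ("What is active recall?",
     ["testing", "memory"]),
]

def score_attempt(attempt, keywords):
    """Count how many expected keywords the attempt mentions (case-insensitive)."""
    attempt = attempt.lower()
    return sum(1 for kw in keywords if kw in attempt)

demo = score_attempt("Testing yourself from memory, not re-reading notes",
                     flashcards[1][1])
print(f"Mentioned {demo}/2 key ideas")  # prints: Mentioned 2/2 key ideas
```

Crude keyword matching, sure, but it forces you to produce an answer from memory before you see the marking criteria, which is the whole point of active recall.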

It’s like having a study partner who throws you questions to help you check what you understand and what you need to go over again.

Example:

This helps you actively recall what you know and gives you something way more useful than just re-reading notes (I don’t know about you, but I always feel better when my study session is productive rather than trying to stay awake while reading endless notes).

The Grey Areas Of AI (AKA: Use With Caution)

This is where things get murky with AI at university. The generative tools are helpful, but how you use them or how much you use them matters.

1. Asking AI To Find Sources For You

Getting chatbots to find sources of information (e.g., journal articles) sounds like a good idea, but AI is actually completely unreliable for this. It can make up citations and information (a phenomenon called hallucination) or pull from dodgy websites. And yes, I’m talking about ChatGPT, Bard, Copilot, Gemini, etc. It doesn’t matter if they can search the web in real time; they’re all as bad as each other.

In fact, these AI tools have a track record of about 67% accuracy (at best) when it comes to getting sources correct. If you’re asking a generative AI chatbot to give you references or journal articles, you need to be smart about it:

  • Always ask for URLs or DOIs so you can check them yourself.
  • Limit searches to trusted databases like Google Scholar, PubMed, or JSTOR.
  • Never copy and paste references directly into your bibliography without checking every detail.
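On the DOI point, a quick format check can filter out obviously fake references before you waste time hunting for them. A minimal Python sketch (the regex follows Crossref’s published recommendation for modern DOIs; a match only means the format is plausible, not that the paper exists, so you still need to resolve it at doi.org):

```python
import re

# Crossref's recommended pattern for modern DOIs (shape: 10.NNNN/suffix).
# Matching only means the string is *shaped* like a DOI; a hallucinated
# citation can still pass, so always resolve it at https://doi.org too.
DOI_PATTERN = re.compile(r"^10\.\d{4,9}/\S+$", re.IGNORECASE)

def looks_like_doi(candidate: str) -> bool:
    """Cheap first-pass filter for AI-suggested references."""
    return bool(DOI_PATTERN.match(candidate.strip()))

print(looks_like_doi("10.1037/0003-066X.59.1.29"))  # True: plausible format
print(looks_like_doi("doi:123/made-up"))            # False: wrong shape
```

Anything that fails this check is definitely fake; anything that passes still has to survive a real lookup.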

Some universities or lecturers might be fine with you using AI to suggest places to look, while others see it as outsourcing the research process, which, let’s be honest, it kind of is.

My recommendation is to always check your assignment brief or subject outline, or ask your tutor directly whether using AI to source information is okay for your specific assessment. I’ve been researching for years, so for me it’s actually faster to do my own research than to use AI. I know what I’m looking for and how to find it quickly, which is a better use of my time than wrestling with ChatGPT’s false information and getting angry at it (yes, I know it’s silly to get mad at tech…). But researching takes practice, and that’s something you’ll never get if you always use AI (just something to think about).

Example:

That’s why you must double-check everything, which is when it can start taking longer than if you did the research yourself.

ScholarGPT is slightly better than some other AIs, and one way to use it more effectively for this task is to ask for a list of authors who are authorities on the topic you’re studying, researching, or working on. That way you can just look up those authors on Google Scholar, get immediate access to all of their work, and start your assignment faster.

2. Getting AI To Rewrite Your Work

Using something like Grammarly to check spelling, grammar, or sentence clarity is usually fine. Most universities treat these tools like spellcheckers, a way to polish your work, not replace it. It’s no different to using the Editor in Microsoft Word for spelling and grammar.

But when you go that step further and hit that tempting “Rewrite for me” button, the game changes.

Even if you wrote the original text yourself, if AI is doing the rewriting for you, it’s no longer entirely your work. The line between helpful editing and ghostwriting gets blurry real fast. In terms of university (or academia) this is basically bordering on plagiarism.

This matters because:

  • The rewritten version might no longer reflect your own voice
  • You might not fully understand or be able to explain the writing
  • Some tools (like Grammarly or Quillbot) can completely restructure your paragraph, which can breach assessment policies

Example:

It also creates a bigger issue. If you can’t explain what you wrote or why you said it that way (especially in presentations, exams or future assessments), it becomes clear you didn’t fully understand or write it yourself.

The safest approach is to use tools that highlight suggestions and give you the option to improve your writing, rather than rewriting it for you. Then you can make the changes you want to make, and have it still sound like you.

3. Using AI Tools In Coding Or Technical Assessments

I’ll be the first to admit that I use ChatGPT to help me write code for my website. But that’s for personal use, and it helps me in the same way that hiring a person to code it would (I’m not expected to know how to do everything!).

But while using AI tools like ChatGPT, Codex, or GitHub Copilot feels like the smart, fast way to get your coding assessments done, the tasks you give them can either be ‘okay’ or cross into red flag territory 🚩. If AI is doing the logic, structure, and syntax for you, that can quickly become academic misconduct. This matters because:

  • You’re submitting work that may not reflect your actual understanding
  • If asked to explain it, you might not be able to
  • It undermines the whole point of the assignment, which is to develop problem-solving skills you’ll need in the real world (and in your real job that’s paying you $$$ for your skills)
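One workable rule of thumb for staying on the right side of the line: the logic comes from you, and AI only explains concepts. For instance (a hypothetical data-structures exercise; the code and comments are my own sketch, not from any real assignment), you might hand-write something like this and only ask AI to explain *why* it works:

```python
def binary_search(items, target):
    """Return the index of target in a sorted list, or -1 if absent.

    Written by hand for the exercise; AI was only asked to explain
    why halving the search range guarantees the loop terminates.
    """
    lo, hi = 0, len(items) - 1
    while lo <= hi:
        mid = (lo + hi) // 2          # midpoint of the remaining range
        if items[mid] == target:
            return mid
        if items[mid] < target:
            lo = mid + 1              # discard the lower half
        else:
            hi = mid - 1              # discard the upper half
    return -1

print(binary_search([1, 3, 5, 7, 9], 7))  # 3
print(binary_search([1, 3, 5, 7, 9], 4))  # -1
```

If your tutor asks you to walk through any line of that and you can, you’re fine. If the whole function came out of a chatbot and you can’t, that’s the red flag.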

Example:

I know it’s annoying, but try not to get AI to write your code. Especially if you plan on coding in your career, you’ll want to build those skills while you’ve still got the safety net of uni.

4. Not Checking Assignment Rules

This is one of the easiest ways to get into trouble. Some assignments or classes will clearly state “AI use is not permitted” or something similar, like this one:

Others might say it’s okay for brainstorming but not for writing. And some won’t mention AI at all, leaving you to figure it out. That’s the frustrating part. There’s no consistent rule across universities, or even within the same degree. One subject might encourage AI use, while another will treat it as academic misconduct.

So here’s what you can do:

  • When in doubt, play it safe. If you’re unsure, only use AI for idea generation, clarification, or spell checking, not writing or sourcing.
  • Read your assignment instructions carefully. Look for anything that mentions AI use, originality, or authorship.
  • Check your subject outline or course guide. Some now include an official section on generative AI.
  • Ask your lecturer or tutor directly. It’s better to be clear upfront than assume something’s okay and get flagged later.

Example:

Unfortunately the onus is on you to double-check every class and every assessment. It’s annoying, but it’s the only way to stay safe while AI rules are still evolving.

What You Shouldn’t Use AI For

If you use AI for any of the following in university, you’re risking plagiarism, academic penalties (like reduced grades, or even getting kicked out), and worse, missing out on the learning component (which is actually important when it comes to progressing your career after uni):

  • Asking AI to write full paragraphs or entire assessments
  • Feeding in dot-points and asking for “a good version”
  • Copying and pasting AI-generated summaries, explanations, or citations
  • Submitting answers you don’t understand or couldn’t explain
  • Getting it to rewrite your thoughts so heavily they’re no longer yours

Using AI-generated Text Or Code In Your Assignments

This one’s simple: if you copy something written by an AI tool and submit it as your own work, that’s plagiarism.

Even if you change some words or tweak the structure, trust me: as a marker, we can usually tell. Even if AI-generated writing “sounds academic,” it often lacks depth, context, or relevance (we call it fluff). If the original wording came from ChatGPT, Copilot, or any other AI tool and you don’t clearly cite it, you’re essentially lying about who wrote it. That’s the textbook definition of plagiarism. It’s the same as sharing content on social media: you always need to credit whoever came up with it.

BUT here’s the catch: even if you do cite it, many universities still don’t allow AI-written content in assessments. So even being honest about it won’t necessarily save you.

Why it’s a problem:

  • It misrepresents the work as your own original thinking
  • It bypasses the learning process your assignment was designed to develop
  • AI content can be vague, generic, or incorrect, and you’ll still be responsible for it (I added some to this post – see if you can find it… it’s pretty obvious!)
  • Most universities consider using generative AI to write any part of your assessment academic misconduct unless explicitly allowed

Example:

Citing AI tools

You can cite generative AI tools using APA or MLA (e.g. “OpenAI, 2023”), but that doesn’t mean your uni allows it. You still have to check your assignment brief or ask your tutor.

For example, in APA (7th edition) your citation would be (OpenAI, 2023) or according to OpenAI (2023), and the reference would be:

Reference list:

OpenAI. (2023). ChatGPT (Mar 14 version) [Large language model]. https://chat.openai.com/chat

Unless you’ve been told it’s okay to include AI-generated text, the safest option is don’t copy and paste at all. Use AI to think, chat and help, not to write.

✅ Okay

  • Brainstorming
  • Explaining terms
  • Clarifying tasks
  • Study questions

⚠️ Grey Area

  • Asking for sources
  • Heavy rewriting/paraphrasing
  • Coding help (depends on use)
  • Not checking your unit’s AI rules

🚫 Not Okay

  • Copy/pasting AI-generated text
  • Getting AI to write for you
  • Submitting answers you didn’t write or understand
  • Using fake AI citations

The real cost

Using generative AI to write or create your assessments isn’t just risking misconduct penalties. You’re also missing the chance to build essential skills like writing, critical thinking, and analysis. Those are the exact skills employers expect when you graduate, and AI won’t always be allowed in the workplace, especially in government or confidential roles, because of privacy laws.

The best way to use AI is like a friend or tutor, not a ghostwriter. If you’re not doing the thinking, you’re not doing the learning (and yes, I know that makes me sound like an old fogey, but I’m not wrong).

Here are some free tools you should be using, though.
