As a student, there are both practical and ethical considerations around using any technology in school, including AI like ChatGPT.
Is my use of AI tools like ChatGPT ethical?
Using AI tools to generate text that you copy and paste as your own is unethical and considered plagiarism in an academic setting, as well as in many other contexts.
However, using AI tools as part of the stages of your own writing process can be ethical. For example:
When prompted with a topic, AI can provide ideas which give writers a place to start when staring at a blank page.
It can perform basic tasks that help with organizing your writing, like creating an outline or helping you develop a research question.
AI tools can be very helpful with mechanical work, such as reviewing grammar and punctuation.
Is using AI (like ChatGPT) the best use of my time?
The use of AI tools such as ChatGPT will never be the best use of your time if the use is unethical.
It often doesn't produce very good or interesting text because it is not being "creative" or "thinking critically" but predicting the most probable response.
It can be wildly inaccurate and biased, which can lead to a poor written product. These inaccuracies are called "hallucinations" and can include fabricated citations, nonexistent sources, or specific facts that are simply wrong.
Universities, and many other institutions, are creating policies for how the use of AI should be disclosed and cited. These are some helpful guidelines from Scribbr.
You should always review your course syllabus for guidelines on using AI in your classes. You may need to disclose your use of AI for any part of your writing or research process.
In general, if you use AI to gather primary sources for evidence, you should absolutely cite your use of a tool like ChatGPT.
In general, if you use AI to do simple things like identify a synonym for a word or locate a fact, you most likely do not need to cite that use.
Template for Disclosing AI Use
Heinrich Niemann, from the Department of Schools and Education of the State of North Rhine-Westphalia, Germany, has created a template for how to potentially disclose the use of AI:
In producing this text [or image or programming code, etc.], X [= the name of the AI-assisted tool] was used. I controlled the AI with the following prompts: 1. ______________ 2. __________
There are concerns about who becomes the “author” of information created by artificial intelligence tools.
These are just some questions society is asking about information ownership and AI:
Does crafting a prompt make someone an author?
How does the authorship of the person who created the material that the AI is trained on factor into copyright?
Is AI artwork an infringement on the art used to create a “new” image?
How much assistance from AI would qualify as a co-author on scholarly research?
Check out the links below for some commentary and examples around these questions.