Copilot: An AI Tool Meant to Assist Individuals with Writing
What is Copilot?
Microsoft Copilot is an AI-driven writing assistant that can help with drafting papers, emails, and other documents. It can create drafts of content, suggest different ways to word something that's been written, suggest and insert images or banners, and create PowerPoint presentations from the content of Word documents.
How can I use Copilot?
Copilot has many uses, including:
- Text Generation: Copilot can be used to compose a short poem, an introduction to a cover letter, or text for an email to a coworker.
- Coding: Tell Copilot what you want your code to do and which language you want it in, and it will return formatted code that you can copy and paste elsewhere.
- Image Generation: Copilot is able to generate images as well. Tell it what you want a picture of, and in what style, and after a few moments you'll get a choice of four options—together with follow-up ideas you can use to change the output.
- Assistance with App Use: One basic task Copilot can do is open programs for you—just type "open" followed by the app name, and your bidding is done. Besides opening the apps in question, Copilot can also tell you how to use them or help with troubleshooting problems. A few suggestions along these lines pop up every time you open a program in Windows 11, so it can be a useful way of exploring what a particular software tool can do for you.
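To illustrate the Coding use above, a prompt such as "write a Python function that reverses a string" might produce output along these lines. This is a hypothetical sketch of the kind of code Copilot could return—the function name and details are illustrative, and actual output will vary:

```python
# Hypothetical example of the kind of code Copilot might return
# for the prompt "write a Python function that reverses a string".
def reverse_string(text: str) -> str:
    """Return the input string with its characters in reverse order."""
    # Python's slice syntax with a step of -1 walks the string backward.
    return text[::-1]

print(reverse_string("Copilot"))  # prints "tolipoC"
```

As with any generated code, you would still want to read and test the result before using it in your own work.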
Why should I take care when using Copilot?
While generative AI tools like Copilot can help users with such tasks as brainstorming new ideas, organizing existing information, mapping out scholarly discussions, or summarizing sources, the outputs they provide are not always entirely factual or grounded in the best research strategies. On the contrary, Copilot has been known to hallucinate, or "describe false information created by the AI system to defend its statements." Oftentimes, Copilot generates outputs without qualifying the accuracy of the information it provides, and it has been known to confidently respond to queries with partially or fully fabricated citations or facts.
The use of generative AI tools like Copilot in academia has also raised many concerns about academic integrity and respect for intellectual property. One area of academic integrity affected by generative AI tools like Copilot is plagiarism.
Plagiarism is typically defined as taking someone else's work or ideas and passing them off as one's own. While Copilot is not considered an author in the traditional sense of the word, using the outputs of Copilot or any other AI tool in one's own work without giving proper credit in the form of a citation is still regarded as plagiarism, because the work is still not one's own. Because different professors may have different policies on what constitutes acceptable use of generative AI tools, every student should consult the syllabus for each of their classes so that it is clear when they are expected to cite the outputs of AI.
Where can I find more information about Copilot?
For more information on Copilot AI, please refer to the FAQ page for this generative AI tool.
References
Georgetown University Library. (2024, November 21). Artificial intelligence (generative) resources: Ethics in AI.
Last updated December 4, 2024