Plagiarism and Use of AI

The College of Idaho maintains that academic honesty and integrity are essential values in the educational process. Earning a Doctor of Medical Science degree requires the development of scholarly writing and critical appraisal skills, among other competencies. Engaging in plagiarism is an example of academic dishonesty and violates the College’s Honor Code.

Plagiarism is the presentation of another's product, words, ideas, or data as one's own work. When a student submits work for credit that includes the product, words, ideas, or data of others, the student must use proper and complete citation and reference in accordance with AMA style guidelines.

Students may not use content sourced from generative artificial intelligence (AI) tools and applications as a substitute for their own academic work, nor may they claim ownership or authorship of any content generated by these tools. However, for purposes of the College’s DMSc program, students may use AI applications, including but not limited to ChatGPT, Claude, and Microsoft Copilot, to augment development of doctorate-level competencies but must do so with appropriate caution.

The following outlines the DMSc program’s policies concerning plagiarism and use of generative AI applications, including circumstances when student AI use is allowable and forbidden, warnings about the limits of AI, and requirements for disclosing AI use.

Allowable AI Use

AI applications may be used to complement personal creativity.  Students may use AI applications in the following circumstances:

  • To brainstorm and generate ideas for assignment topics or components that students develop further through critical assessment and creativity.
  • To edit written material using communication assistance programs like Grammarly.
  • To supplement literature reviews using AI Research Assistants that search across a defined body of academic papers, like Elicit or Consensus.
  • To create images for use within assignments using image-generating AI applications.
  • To create slide shows for use within assignments using slide-generating AI applications like Canva.

Basic tools used to check grammar and spelling and to manage citations are allowed as they are not considered generative AI applications.

When AI applications are used to develop allowable content for DMSc assignments, students must disclose which AI applications were used, and if applicable, include a summary of the associated prompts (see “Disclosing AI Use” below for more information).

It is NOT permitted for students to use AI-generated content as a substitute for personally generated written content.  In other words:

  • Students are not permitted to use AI applications to generate written content to submit for credit for any assignments associated with DMSc degree completion.
  • While students may use applications to edit written material, students may not use applications like Grammarly to paraphrase content created by others.  For example, students may not copy a sentence from a publication and use Grammarly to paraphrase it.  Students must competently paraphrase this type of material on their own.

Using AI-generated text within written assignments constitutes academic dishonesty and shall be considered plagiarism. The reasons for this policy are as follows:

  • Cultivating doctorate-level writing and critical analysis skills requires effort and practice.  Replacing this effort with AI-generated content denies students the opportunity to develop and execute doctorate-level competencies. 
  • Individuals who take credit for the work of others demonstrate a lack of integrity.  Integrity requires giving appropriate credit.  AI-generated text commonly lacks reliable source information, making appropriate attribution of credit difficult and sometimes impossible.
  • To maintain academic integrity, students may claim credit only for personally generated work and must provide appropriate attribution for the products, words, ideas, or data created by others in accordance with the American Medical Association (AMA) Manual of Style.

AI Limitations

Generative AI applications have three key limitations that are important for all students to understand:

  • Source Content – Text created by generative AI applications like ChatGPT, Claude, and Copilot is assembled from the indiscriminate assimilation of material found on the internet. Because the internet contains unregulated and potentially biased material of questionable quality and accuracy, AI-generated text should be treated as potentially inaccurate, low quality, and biased. Students are encouraged to confirm the accuracy of all AI-generated ideas and to critically assess the quality and risk of bias of all AI-generated content.
  • Hallucinations – The term “AI hallucination” refers to the tendency of AI applications to generate fabricated information in response to prompts or inquiries. Examples include citing nonexistent articles when asked to provide references for generated text. General-purpose generative AI applications, such as ChatGPT, Claude, and Copilot, carry a high risk of hallucinations. Generative AI applications with a lower risk of hallucinations include AI Research Assistants like Consensus and Elicit, because these draw from a defined body of academic papers. Regardless of the type of AI application used, students should always confirm the accuracy of AI-generated output.
  • Confidentiality – Students must be diligent in avoiding disclosure of confidential information when interacting with an AI application. In accordance with the Health Insurance Portability and Accountability Act (HIPAA), DMSc students should not enter patient protected health information into any AI applications. It is also important to avoid sharing content from a student’s educational record, in accordance with the Family Educational Rights and Privacy Act (FERPA). Violating HIPAA or FERPA regulations can result in serious legal consequences.

Disclosing AI Use

Students should maintain appropriate transparency and disclose when AI applications were used to create allowable assignment content. Students who use AI to brainstorm assignment ideas or edit written material do not need to disclose that use. AI use must be disclosed in the following circumstances:

  • When AI Research Assistants or other AI applications are used to conduct a literature review or find publications for use within a DMSc assignment.
  • When AI applications were used to create images, slide shows, or other allowable content for use within a DMSc assignment.

How AI applications are credited depends on how students deploy them. In general, students should err on the side of over-disclosure. When disclosure is required, students should include a description of the content that was created (e.g., literature searches, images, slide shows), along with the name, version, and manufacturer of the AI application. Specific circumstances are as follows:

  • AI applications used to augment literature reviews should be disclosed within the appropriate section of the written assignment. For example, if the assignment includes a methods section, students should include the name, version, and manufacturer of the AI application and an example of the research prompt used. If disclosure requirements for literature reviews are unclear, consult the associated faculty member for clarification.
  • The name, version, and manufacturer of AI applications used to create images should be credited in a footnote to the image.
  • AI applications used to create slide shows should be credited on the reference page of the slide show. The name of the AI application, version, and manufacturer should be AMA-formatted as a reference and should appear as the last entry on the reference list (see Section 3.14.3 of the AMA Manual of Style for appropriate reference entries for apps).

Student Accountability

When a student submits work for credit that includes the product, words, ideas, or data of others, the source must be acknowledged by the use of complete, accurate, and specific references, such as footnotes. By placing one's name on work submitted for credit, the student certifies the originality of all work not otherwise identified by appropriate acknowledgments. A student will be charged with plagiarism if there is not an acknowledgment of indebtedness. Acknowledgment must be made whenever:

  • One quotes another person's actual words or replicates part of another's product.
  • One uses another person's ideas, opinions, work, data, or theories, even if they are completely paraphrased in one's own words.
  • One borrows facts, statistics, or other illustrative materials, unless the information is common knowledge (already published in at least three other sources without citation).
  • One uses an AI application to create allowable assignment content.

While programs exist that detect AI-generated text within student assignments, their accuracy is limited, and the risk of falsely accusing students of using AI-generated material is too great to justify their use. As such, The College of Idaho’s DMSc program will not use AI detection programs on student assignments. Faculty members may still apply professional expertise and critical assessment when evaluating the originality of student assignments and may report concerns about AI-generated text within written assignments to DMSc program leadership for consideration.

Whether they use AI applications or not, DMSc students are ultimately held responsible for the accuracy, originality, and appropriate attribution of material contained within every assignment that they submit for credit.

Penalties

Both the burden of proof of student academic dishonesty or misconduct and the imposition of penalties rest with the faculty member. The consequences of plagiarism vary based on whether the incident is a first, second, or third occurrence. Faculty members have the discretion to require that the student repeat the assignment or exam, give a failing grade for the assignment, exam, or course, or otherwise address the academic dishonesty in a manner they determine to be appropriate within the context of their course.

A first occurrence of plagiarism is generally presumed to result from inexperience and/or a lack of familiarity with AMA guidelines or with the proper use of AI-generated material and is treated as a misuse of sources; the sanctions for a first offense generally include, but are not limited to, a grade of zero on the assignment or resubmission of the assignment for a reduced grade. A second occurrence of plagiarism is a more serious academic offense and is not attributed to naiveté, ignorance of guidelines, or a misunderstanding of what constitutes acceptable graduate scholarship at the College; the sanction therefore includes, but is not limited to, a failing grade in the course. A third occurrence of plagiarism is seen as a student’s chronic inability or refusal to produce acceptable graduate-level scholarship. In such cases, the matter is referred to the Dean of Graduate Studies or their designee for disciplinary proceedings pursuant to the College’s “General Student Conduct Procedures” as articulated in the Student Handbook, and the sanction includes, but is not limited to, dismissal from the program.