Generative AI at Stetson University

Stetson recognizes that generative artificial intelligence (GenAI) technologies present both risks and rewards for the educational environment. On the one hand, GenAI can enhance teaching and learning and is likely to become an integral technology in many workplaces, making it imperative that employees and students become literate, experienced, and even sophisticated with it. On the other hand, GenAI’s potential to interfere with learning by producing and reproducing unfair bias and false information, invading privacy, causing environmental harm, and exploiting human labor calls for its wise use, even in light of its rewards. The three sets of guidelines below—one for all employees, one for faculty, and one for students—are meant to balance those risks and rewards.



TIPS

Before getting started, you should know the following about using Generative AI at Stetson.

  1. Your faculty member has the final say over what is allowed in your coursework. Make sure to read your syllabus! Read the official guidance for students.
  2. Before you enter any data into GenAI, make sure you are not sharing any personal data, such as health data, grades, etc. Read more data tips.
  3. Thinking of subscribing to a GenAI product that Stetson does not have? Consider reviewing the AI tool usage guidance provided by IT.
  4. Know when to attribute or cite your use of Generative AI. The Writing Center or library may be able to help. Chat with the library or check the Artificial Intelligence Disclosure framework.
  5. Avoid overreliance on Generative AI so that you get the most from your courses at Stetson. When using GenAI, verify the information it produces for accuracy and critically evaluate the output.

Student Guidelines

These guidelines will help students use generative artificial intelligence tools safely and responsibly while at Stetson. They will assist students in developing AI literacy, which is important for both learning and preparation for a future career. As GenAI technology evolves, Stetson will update these guidelines.

  1. Generative artificial intelligence, or GenAI, is a type of artificial intelligence that uses deep learning to generate new and unique content (including text, images, sound, and video) in response to a prompt, based on its training data. GenAI includes both stand-alone AI tools (e.g., OpenAI’s ChatGPT and Anthropic’s Claude) and artificial intelligence embedded in technology tools you may already be using (e.g., Copilot within Microsoft products and Gemini within Google products).
  2. GenAI can assist with completing various course assignments, such as papers, reports, essays, discussion posts, and presentations. It can also assist in creating resumes, cover letters, and materials for student organizations. Its ability to generate human-like communication is extensive.
  3. GenAI is rapidly evolving and advancing. Importantly, it offers both advantages and disadvantages for your education. You will also encounter GenAI in your career, personal life, and public interactions.

While at Stetson, students should learn as much as they can about the responsible and effective use of GenAI. When using GenAI as part of the educational experience, students should adhere to the following guidelines.

  1. To uphold academic integrity, follow the instructor's directions for GenAI use.
    • Syllabus statements, assignment instructions, or both establish what is permitted or prohibited in the course and will be relevant to accusations of academic misconduct in situations involving GenAI misuse. For example, one course or assignment may prohibit copying material directly from a GenAI tool for course assignments. Another course or assignment may require that students cite AI-generated text when it is used. In another course, an assignment’s instructions might permit using GenAI for research activities or revision.
    • The scope of permissible GenAI use may vary from course to course, so always follow the specific instructions provided for each class and seek clarification from the instructor if you do not fully understand the permissible scope of use on any assignment. Students should not assume that GenAI use permitted or encouraged in one course is permitted in other courses. Always look for a statement in the syllabus or assignment instructions—or ask the instructor for guidance!
    • Honor System
    • Academic Honor Code (College of Law)
  2. Learn about GenAI while at Stetson. GenAI has both advantages and limitations; thus, you should approach this technology thoughtfully and use it responsibly. Students might learn about GenAI in coursework, and Stetson provides additional resources to help students learn about how to use GenAI sensibly, safely, and responsibly. Visit the GenAI website for more.
  3. Choose GenAI tools wisely. Not all GenAI tools are the same. Some require subscriptions, while others are free. Some are provided by Stetson while others are not. Each tool has specific terms and conditions for use. Before using any GenAI tool, understand how it works and how it handles the information you put into and generate with it. To help with thoughtful choices about using GenAI tools, Stetson offers information about the features of the GenAI tools it provides and some other commonly used GenAI tools.
  4. Do not share sensitive, private, or confidential information with GenAI tools. In some instances, GenAI can store your information, use your information for future training, or both. To keep your GenAI use safe, avoid sharing personal details, such as campus ID numbers, social security numbers, and health information, with GenAI tools. Additionally, it’s best to use University-provided AI tools that adhere to proper data protection protocols or opt out of allowing AI tools to use your data for training purposes.
    • Everyone in the University community, including students, must also protect University information from unauthorized disclosure and follow the University's Acceptable Use Policy and Generative AI Use and Privacy Policy when sharing University data with AI tools. Before sharing University information with a GenAI tool, you should know how the tool handles and stores the data you provide.
    • Tool Usage & Data Classification Guidelines
    • Data Classification Policy
  5. Avoid over-relying on GenAI tools in your University studies. Overreliance on GenAI means using it to create content without (1) understanding GenAI’s limitations and (2) understanding the generated content yourself. Examples of overreliance include:
    • Using AI to summarize reading assignments instead of reading them yourself.
    • Relying only on AI-generated ideas without exploring other sources of inspiration and information.
    • Submitting AI-generated answers without critically evaluating them.
    Overreliance can hinder learning, stifle your critical and creative thinking, and interfere with learning foundational knowledge and skills. So, don’t let GenAI do the work you should do yourself. You can also review more information on how to avoid overreliance.
  6. Be aware that GenAI can produce inaccurate or biased content and take steps to address this. GenAI tools are trained on data that may include biased or false information. As a result, they may produce biased, misleading, harassing, or discriminatory information. They are also notorious for producing inaccurate content, including false information, faulty reasoning, and “deepfake” images. Your job is to verify the accuracy and appropriateness of the content generated by GenAI. You must also follow Stetson’s Acceptable Use Policy when using GenAI. You can also review more information on how to evaluate GenAI output for false, biased, and harmful information.
  7. Avoid violating others’ intellectual property rights when using GenAI tools. For example, some publishers and faculty may not allow their materials to be used with AI tools, and using their content could violate copyright. Many AI tools are trained on copyrighted information, raising concerns about whether this training infringes on copyright. Those legal issues are still unsettled. At a minimum, students must understand the terms of use for AI tools, know publisher limitations on using their intellectual property, and avoid copyright violations. You can also review more information about copyright protections.
  8. When using GenAI for research, faculty, staff, and students should do so responsibly. Researchers should:
    • Disclose plans for GenAI use for research in the Institutional Review Board Protocol Description form for methods that are relevant for the IRB protocol (e.g., stimulus generation, transcription, experimentation, etc.).
    • Avoid uploading information on research subjects without IRB approval. For example, some GenAI tools pose a risk that qualitative research data (e.g., interview data) could become public and potentially allow research subjects to be identified. As part of your IRB submission, you will be required to confirm that any use of AI for transcription does not risk breaching the confidentiality of data, releasing sensitive information, or violating participants’ privacy.
    • Be transparent in reporting GenAI use and acknowledge that use where appropriate.
    • Verify the accuracy of AI-generated content and exercise caution when incorporating GenAI output into research.
    • Follow the policies of publication outlets and funding agencies when publishing research or applying for research grants.
    • Follow University IRB policies.
    • Avoid uploading unpublished research or other confidential information into GenAI tools that do not protect that research from disclosure, store the research data, or use that research for training the tool.

Faculty Guidelines

These guidelines provide direction to faculty on the safe, sensible, and responsible use of generative artificial intelligence tools, or GenAI. In addition, they provide a framework for the University’s intentional efforts to provide resources for faculty GenAI literacy development and support. Stetson acknowledges that society is in the early period of GenAI development and that these guidelines will likely need regular review and revision as the technology changes.

Faculty should also review the Employee Guidelines for GenAI Use that apply to all employees.

  1. GenAI will never replace outstanding teaching, service, and scholarship. Faculty, however, can effectively use GenAI to support and enhance teaching, learning, research, service, and administrative activities. Students can use GenAI to help them produce an array of University course assignments and co-curricular and personal materials. In the educational context, GenAI’s capacity to create human-like communication is extensive.
  2. GenAI is a new and rapidly evolving technology. Its use presents both benefits and risks to the educational experience. In addition, GenAI is a technology that students will find in the professional world they enter upon graduation.

Faculty using GenAI as part of the educational experience should adhere to the following guidelines.

  1. Because of its potential to aid teaching, learning, research, service, and administration, and because students will likely need to use GenAI in the workplace, faculty are encouraged to proactively, safely, sensibly, and responsibly use, experiment with, and learn about GenAI. Among other things, faculty may find GenAI tools helpful in assisting with
    • Preparing for class;
    • Creating classroom demonstrations and learning activities that can inspire engagement, critical thinking, and reflection;
    • Creating learning, evaluation, and feedback materials;
    • Creating assessment materials, including rubrics;
    • Drafting letters, emails, reports, and other administrative and service materials; and
    • Supporting scholarly and creative activities such as research, writing, and editing.
  2. Faculty should be well-informed about responsible GenAI use. Stetson provides faculty resources for learning how to use GenAI sensibly, safely, and responsibly on its GenAI website. The Brown Center for Faculty Excellence and Innovation also supports faculty use of GenAI.
  3. Faculty should include a GenAI policy in every course syllabus or, alternatively, every assignment. Faculty are responsible for deciding how students may use GenAI to complete coursework in their courses. Faculty may sometimes want to prohibit the use of any GenAI tools. In other instances, faculty may wish to permit students unlimited use of GenAI. In most instances, faculty will want to choose a “middle road,” allowing or prohibiting the use of GenAI depending on the goals of each assignment and the degree to which GenAI aids in or interferes with learning. Students should receive clear directions about how they may use GenAI in their coursework. This clarity fosters trust between faculty and students and empowers students to engage with and learn about the strengths and weaknesses of GenAI. Accordingly, faculty should have a GenAI policy in their syllabus that:
    • Clearly defines “generative artificial intelligence” for course purposes;
    • Describes what GenAI use is permitted and prohibited;
    • Explains the reasons behind the policy; and
    • Announces if GenAI detection tools will be used to police academic dishonesty. (See #8 below for guidance that strongly discourages using GenAI detectors.)
    • In the absence of a GenAI policy in a syllabus, the Stetson Honor Pledge, the Law School Honor Code, or Conduct Code will apply as appropriate. For more on how to craft a GenAI syllabus or assignment policy, see the faculty information on the Stetson GenAI website.
  4. When requiring students to use a specific GenAI tool in a course, faculty should follow University policy and ensure students have adequate instruction. Faculty may require students to purchase and use GenAI tools as part of the learning experience if such use benefits students in accomplishing the course's learning outcomes. When selecting AI tools, faculty should prefer GenAI tools with a strong commitment to data privacy, avoidance of bias, harm mitigation, and inaccuracy mitigation. Faculty should also consider equity and accessibility when choosing required GenAI tools. If possible, faculty should choose University-owned GenAI tools or GenAI tools that have been reviewed and approved by the University for specific uses. Stetson provides information about Stetson-owned GenAI tools on its GenAI website. If another GenAI tool is desired, it is strongly recommended that faculty contact Information Technology to review the product and ensure it meets Stetson’s standards (including data privacy standards) for GenAI tools. Purchasing review guidelines for faculty can be found on the Stetson GenAI Resources website. 
    • If a GenAI tool is suggested (but not required) for students, faculty should ensure that the tool meets the same standards as if they were to require the tool and should also ensure that the tool is reasonably accessible to all students.
    • As with other course materials, faculty remain responsible for choosing GenAI tools as course materials and ensuring that these tools meet minimum quality and data protection standards.
  5. When allowing or requiring the use of GenAI tools in their courses, faculty should actively instruct students on the safe, sensible, and responsible use of these tools. Faculty should guide students on:
    • Protecting personal, sensitive, and University information when using AI tools.
    • Verifying the accuracy and reliability of AI-generated outputs.
    • Recognizing and addressing biases in AI-generated outputs.
    • Ethically using GenAI tools, including avoiding academic dishonesty and respecting intellectual property rights.
    • Responsibly using AI tools, understanding their limitations and appropriate contexts within the course and field of study.
    • Understanding the impact of AI on learning outcomes and the importance of developing independent critical thinking skills and avoiding overreliance on GenAI.
  6. Faculty should learn about and guide students on the safe, sensible, and responsible use of GenAI within the context of their academic discipline and associated work environments. Faculty prepare students for success in the workplace. Students now face a future that includes GenAI in their workplaces, and they will be expected to use GenAI tools effectively and ethically. Faculty should educate themselves on the use of GenAI in their discipline and incorporate appropriate instruction for students on this use in their classrooms.
  7. Faculty should model transparency in their GenAI use. Whenever faculty use GenAI to produce course materials, they should disclose how it was used. This may include oral or written disclosure, such as informing students in class or including a notice on the materials that explains how they incorporate GenAI-produced content. For more on appropriate disclosures, see the Stetson GenAI website.
  8. Faculty should generally avoid using GenAI detection tools on student assignments. GenAI detection tools are far from perfect in detecting the presence of GenAI in student assignments. Currently, AI detectors do not provide conclusive proof of academic misconduct; they only provide statements of probability regarding the presence of AI-generated content. AI detectors are known to have problematic rates of false positives and false negatives. The false positives are particularly worrisome; misidentifying the presence of GenAI can put students in the position of defending against accusations that are not supported by actual evidence but instead by insinuation only. In addition, putting student information into detection tools that are not University-provided may present a data privacy risk, including the risk of disclosing FERPA-protected information. Due to these shortcomings, Stetson does not currently provide GenAI detection tools and strongly discourages their use.
    • Faculty who, in any event, choose to use an external AI detection tool remain responsible for complying with the University’s Acceptable Use Policy and Generative AI Use and Privacy Policy. (See item #9 below for more guidance on protecting data privacy.)
    • If a faculty member determines that it is important to gather data from an AI detection tool, they should speak with their Chair, Dean, or Associate Dean before using an AI detection tool on student assignments. Faculty should treat any detection tool's results appropriately—as a source of information, not as conclusive proof of GenAI misuse.
  9. Faculty must guard against the disclosure of confidential and private information to GenAI tools. GenAI tools can store information, and that information may be used to train the tool in the future. Thus, faculty should use caution when sharing personal, confidential, or sensitive information with GenAI tools. In addition, faculty must protect University information from unauthorized disclosure and follow the University’s Acceptable Use Policy and Generative AI Use and Privacy Policy when prompting GenAI with University information. Examples of this kind of information include FERPA and HIPAA-protected information and other sensitive information protected by University policy. All student work (including grades, graded work, and ungraded assignments) is FERPA-protected content and should not be used as input for GenAI tools without the student’s written consent or unless the GenAI tool is Stetson-provided and approved explicitly for FERPA-protected inputs. Review the information on the Stetson GenAI website for more guidance on sharing information with provided and approved GenAI tools. In addition to following the University’s data protection policies, faculty should minimize the risk of inadvertent use or disclosure of other sensitive, personal, or private information by either (1) using University-provided or approved GenAI tools with appropriate data protection levels or (2) using tools that allow the user to opt out of allowing their data to train future iterations of a GenAI tool.
    • Faculty are responsible for knowing how GenAI tools use and store the data faculty provide.
  10. Faculty should avoid violating others’ intellectual property rights when using GenAI tools. GenAI tools can implicate others’ intellectual property rights, such as copyright ownership. Some publishers may not permit their materials to be used with some or all GenAI tools, and a faculty member using their protected content in a GenAI prompt could violate their copyright. For example, a book publisher may expressly prohibit readers from sharing an electronic version of the book with a GenAI tool. Further, students may hold intellectual property rights in the content they produce in the educational context. Finally, many GenAI tools have been trained on information that has copyright protection, and it is still unsettled whether that training violates the copyright owners’ rights. Faculty must know the terms and conditions under which they use GenAI tools, know the limitations a publisher places on using their intellectual property with GenAI, and avoid violating others’ intellectual property rights. Note that open educational resources (OER) have terms and conditions for their use, and faculty should not assume that they are copyright-free.
  11. When using GenAI for research, faculty, staff, and students should do so responsibly. Researchers should:
    • Disclose plans for GenAI use for research in the Institutional Review Board Protocol Description form for methods that are relevant for the IRB protocol (e.g., stimulus generation, transcription, experimentation, etc.).
    • Avoid uploading information on research subjects without IRB approval. For example, some GenAI tools pose a risk that qualitative research data (e.g., interview data) could become public and potentially allow research subjects to be identified. As part of your IRB submission, you will be required to confirm that any use of AI for transcription does not risk breaching the confidentiality of data, releasing sensitive information, or violating participants’ privacy.
    • Be transparent in reporting GenAI use and acknowledge that use where appropriate.
    • Verify the accuracy of AI-generated content and exercise caution when incorporating GenAI output into research.
    • Follow the policies of publication outlets and funding agencies when publishing research or applying for research grants.
    • Follow University IRB policies.
    • Avoid uploading unpublished research or other confidential information into GenAI tools that do not protect that research from disclosure, store the research data, or use that research for training the tool.

Consider these topics before reviewing a student’s suspected use of Generative AI in a course.

  • Do you have a Generative AI statement in your syllabus?
  • GenAI can be difficult to detect. What specific clues do you have?
  • Have you reviewed the Honor System Council’s procedures?

Contact the Honor System Council

Before acquiring an AI tool for University use, review and complete the checklist below.  Please consult with Jose Bernier, Senior AVP for Information Technology and Chief Information Officer ([email protected]).

  1. Data Security and Privacy
    • Does the tool comply with university data privacy standards (FERPA, HIPAA, etc.)?
    • Has the AI/GenAI tool been reviewed by Stetson’s IT office?
    • Is any confidential or sensitive data being processed?  If yes, how is it protected?
    • Are the vendor’s privacy practices aligned with university policy?
  2. Vendor Risk Assessment
    • Has the vendor been vetted for security and compliance risks?
    • Is the vendor transparent about handling data, including storage, processing, and sharing?
    • Is the contract clear on data ownership and access control?
    • Does the vendor provide plans for ongoing support, updates, and security patches?
  3. Ethical and Legal Considerations
    • Does the AI tool respect copyright and intellectual property laws?
    • Does the tool have safeguards against bias and harmful outputs?
    • Is there a clear policy for the responsible use of AI, including guidelines for accuracy and avoiding “hallucinations” (i.e., incorrect or fabricated content)?
  4. Academic Integrity and Usage Guidelines
    • Are there guidelines for responsible use by students, faculty, and staff as needed?
    • Does the tool align with Stetson’s academic integrity policies?
  5. Cost and Licensing
    • Is the AI tool’s cost structure clear (e.g., upfront cost, subscription, usage fees)?
    • Are there any licensing restrictions or vendor lock-in risks?
  6. Interoperability and Compatibility
    • Is the AI technology compatible with existing university systems and software?
    • Does the tool support open standards, ensuring it can be integrated with other technologies?
  7. Ongoing Evaluation and Support
    • Is there a plan to continuously assess the AI tool’s performance and ethical usage?
    • Will the vendor provide training for university staff and users?
    • Is there a clear process for reporting issues or misuse?
  8. Regulatory Compliance
    • Does the AI tool meet federal, state, and local regulations?
    • Has the institution verified that the AI tool complies with relevant industry standards?

Employee Guidelines

These guidelines direct all employees on the safe, sensible, and responsible use of generative artificial intelligence tools. In addition, they provide a framework for the University’s intentional efforts to provide resources for employee GenAI literacy development and support. Stetson acknowledges that society is in the early stages of GenAI development and that these guidelines will likely need regular review and revision as the technology changes. For employees serving in teaching or faculty roles, please see the Stetson University Faculty Guidelines for GenAI Use for additional guidance pertaining to teaching, service, and scholarship.

  1. Generative Artificial Intelligence, or GenAI, is a type of artificial intelligence that uses deep learning to produce new and unique content (including text, images, sound, and video) in response to a prompt, based on its training data. GenAI includes both stand-alone tools (e.g., OpenAI’s ChatGPT and Anthropic’s Claude) and artificial intelligence embedded in already existing tools (e.g., Copilot within Microsoft products and Gemini within Google products).
  2. GenAI is a new and rapidly evolving technology. It can produce reports, emails, news releases, videos, podcasts, and other materials. In the workplace, GenAI’s capacity to generate human-like communication is extensive.

Employees using GenAI in the workplace should adhere to the following guidelines.

  1. Before implementing GenAI tools for University business, employees must be trained to safely, sensibly, and responsibly use these tools. Department heads are responsible for ensuring staff receive adequate training.
  2. Employees should use only those GenAI tools that are University-provided or otherwise approved for use in their department. If unsure whether a tool is University-provided or approved, employees should contact their department supervisor or see the information about University-approved GenAI tools on the Stetson GenAI Resources website. Faculty members should also refer to the Faculty Guidelines for additional information on using GenAI tools in their faculty roles.
  3. Employees must guard against disclosure of confidential information to GenAI tools. GenAI tools may store information and use it for training the tool in the future. Thus, employees should use caution when prompting GenAI tools with personal, confidential, or sensitive information. In addition, employees must protect University information from unauthorized disclosure and follow the University’s Data Classification Policy and Generative AI Information Protection Policy when prompting GenAI with University information. Examples include FERPA and HIPAA-protected information and other sensitive information protected by University policy. Faculty should also refer to the Faculty Guidelines for more information related to using GenAI tools in their faculty roles.
  4. Employees should avoid overreliance on GenAI tools. Overreliance on GenAI means depending too heavily on GenAI to produce content, particularly when one does not fully understand GenAI’s limitations. It means failing to engage with and oversee the work a GenAI tool produces or substituting the tool’s judgment for human judgment. Employees should always apply their own critical analysis and contextual knowledge to validate and refine AI contributions before use. In the workplace, human judgment should remain paramount, and AI tools should be used to aid, not replace, human expertise.
  5. Employees should adhere to Stetson’s Acceptable Use Policy when using GenAI, check for and address biased outputs, avoid using harmful outputs, and verify the factual accuracy of GenAI outputs. GenAI tools are trained on data obtained from a wide variety of sources, which may contain biased information. Because of this, biases in the data may appear in GenAI’s outputs and perpetuate unfair cultural biases. GenAI can also produce misleading, harassing, and discriminatory content. GenAI is also well-known for its capacity to generate false or inaccurate content like incorrect information, “deepfake” images, or faulty reasoning. Employees should be aware of these shortcomings and moderate their use accordingly. Faculty should also refer to the Faculty Guidelines for more information related to mitigating the harms of GenAI in their faculty roles.
  6. Employees should avoid violating others’ intellectual property rights when using GenAI tools. GenAI tools can implicate others' intellectual property rights, such as copyright ownership. Some data providers may not allow their content or materials to be used with some or all GenAI tools. Further, faculty and students may hold intellectual property rights in the content they produce in the educational context. An employee inputting this protected content into a GenAI tool could violate their copyright. Many GenAI tools have been trained on information that is protected by copyright, and it is still unsettled whether that training violates copyright owners’ rights. Employees should exercise caution when using generative-AI-produced materials in University publications. Employees should consult with their department supervisors to learn the terms and conditions under which they may use GenAI tools for University business, know the limitations a publisher places on using their intellectual property with GenAI, and avoid violating others’ intellectual property rights. Faculty should also refer to the Faculty Guidelines for more information related to intellectual property rights and FERPA.
  7. When using GenAI for research, faculty, staff, and students should do so responsibly. Researchers should:
    • Disclose plans for GenAI use in the Institutional Review Board Protocol Description form for methods that are relevant to the IRB protocol (e.g., stimulus generation, transcription, experimentation, etc.).
    • Avoid uploading information on research subjects without IRB approval. For example, some GenAI tools pose a risk that qualitative research data (e.g., interview data) could become public and potentially allow research subjects to be identified. As part of your IRB submission, you will be required to confirm that any use of AI for transcription does not risk the confidentiality of data, release sensitive information, or violate participants’ privacy.
    • Be transparent in reporting GenAI use and acknowledge that use where appropriate.
    • Verify the accuracy of AI-generated content and exercise caution when incorporating GenAI output into research.
    • Follow the policies of publication outlets and funding agencies when publishing research or applying for research grants.
    • Follow University IRB policies.
    • Avoid uploading unpublished research or other confidential information into GenAI tools that do not protect that research from disclosure, store the research data, or use that research for training the tool.
  8. Employees should learn as much as possible about responsible GenAI use. Stetson offers resources for learning how to use GenAI sensibly, safely, and responsibly.

Before acquiring an AI tool for University use, review and complete the checklist below and consult with Jose Bernier, Senior AVP for Information Technology and Chief Information Officer ([email protected]).

  1. Data Security and Privacy
    • Does the tool comply with university data privacy standards (FERPA, HIPAA, etc.)?
    • Has the AI/GenAI tool been reviewed by Stetson’s IT office?
    • Is any confidential or sensitive data being processed? If yes, how is it protected?
    • Are the vendor’s privacy practices aligned with university policy?
  2. Vendor Risk Assessment
    • Has the vendor been vetted for security and compliance risks?
    • Is the vendor transparent about handling data, including storage, processing, and sharing?
    • Is the contract clear on data ownership and access control?
    • Does the vendor provide plans for ongoing support, updates, and security patches?
  3. Ethical and Legal Considerations
    • Does the AI tool respect copyright and intellectual property laws?
    • Does the tool have safeguards against bias and harmful outputs?
    • Is there a clear policy for the responsible use of AI, including guidelines for accuracy and avoiding “hallucinations” (i.e., incorrect or fabricated content)?
  4. Academic Integrity and Usage Guidelines
    • Are there guidelines for responsible use by students, faculty, and staff as needed?
    • Does the tool align with Stetson’s academic integrity policies?
  5. Cost and Licensing
    • Is the AI tool’s cost structure clear (e.g., upfront cost, subscription, usage fees)?
    • Are there any licensing restrictions or vendor lock-in risks?
  6. Interoperability and Compatibility
    • Is the AI technology compatible with existing university systems and software?
    • Does the tool support open standards, ensuring it can be integrated with other technologies?
  7. Ongoing Evaluation and Support
    • Is there a plan to continuously assess the AI tool’s performance and ethical usage?
    • Will the vendor provide training for university staff and users?
    • Is there a clear process for reporting issues or misuse?
  8. Regulatory Compliance
    • Does the AI tool meet federal, state, and local regulations?
    • Has the institution verified that the AI tool complies with relevant industry standards?

Additional Resources

The following additional resources can help students engage with Generative AI:

AI Tool Usage & Data Classification Guidelines

Stetson University classifies institutional data into three levels: Public (Level 1), Private (Level 2), and Restricted (Level 3). Use of AI tools must comply with these classifications. This chart describes the level of institutional data that may be shared with AI tools.

| AI Service | University Provided? | Public (Level 1) | Private (Level 2) | Restricted (Level 3) |
| --- | --- | --- | --- | --- |
| Microsoft Copilot (M365) | Yes | Allowed | Allowed | Allowed |
| OpenAI ChatGPT | No | Allowed | Not Allowed | Not Allowed |
| Google Gemini / Bard | No | Allowed | Not Allowed | Not Allowed |
| Anthropic Claude | No | Allowed | Not Allowed | Not Allowed |
| Copilot Studio (Power Platform) | Yes | Allowed | Allowed | Allowed |
| GitHub Copilot | No | Allowed | Caution* | Not Allowed |
| Other Third-Party AI Tools (e.g., Jasper, Perplexity) | No | Allowed | Not Allowed | Not Allowed |
| Zoom integration | Yes | Allowed | Allowed | Allowed |
| Third-party tools purchased for departmental-level use | Yes | Allowed (if the IT purchasing checklist was followed) | Not Allowed | Not Allowed |

*If the application will be used with level 2 data, an Enterprise license should be purchased and configured instead of an individual one. This ensures that security policies can be implemented across the entire university.
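The chart above amounts to a simple lookup: given a tool and a data-classification level, it answers "allowed" or "not allowed." As a minimal illustrative sketch (the `ALLOWED_LEVELS` dictionary and `is_use_allowed` helper are hypothetical, not an official Stetson API), the rule can be encoded so that unknown tools default to "not allowed, ask IT":

```python
# Hedged sketch: encode the data-classification chart as a lookup table.
# Tool names mirror the chart; the helper and dictionary are illustrative only.

ALLOWED_LEVELS = {
    "Microsoft Copilot (M365)": {1, 2, 3},
    "OpenAI ChatGPT": {1},
    "Google Gemini / Bard": {1},
    "Anthropic Claude": {1},
    "Copilot Studio (Power Platform)": {1, 2, 3},
    "GitHub Copilot": {1},  # Level 2 only with caution (Enterprise license)
    "Zoom integration": {1, 2, 3},
}

def is_use_allowed(tool: str, data_level: int) -> bool:
    """Return True if the chart permits sharing data of this level with the tool.

    Tools not in the table default to False: consult IT before use.
    """
    return data_level in ALLOWED_LEVELS.get(tool, set())
```

Defaulting unknown tools to `False` mirrors the policy's stance that unvetted third-party tools may only be used after IT review.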


DATA TIPS FOR USING GEN AI

Do's

  • Use Microsoft Copilot for working with confidential or sensitive data.
  • Confirm data classification before using any AI tool.
  • Consult IT if unsure whether an AI tool is approved.

Don'ts

  • Do not input student records or financial data into public AI tools like ChatGPT.
  • Avoid using AI tools that are not approved or integrated into Stetson's secure systems.
  • Don't assume AI-generated output is accurate or policy-compliant.
  • Never paste code or content from restricted systems into external AI tools.