“The difficulty lies not so much in developing new ideas as in escaping from old ones.” John Maynard Keynes
The College of Professional Studies’ goal is to ethically and responsibly embed Generative AI and Large Language Models (LLMs) in our work: New Product Development, Curriculum Development, Teaching, Assessments, and Workplace Tasks. Whether you are embarking on your journey to integrate AI into teaching, curriculum development, assessments, the development of CPS administrative and learning products, or simply into work-related tasks, this document will support you with guiding principles and standards (note that University policies supersede this document).
This document was developed by Joan Giblin, Uwe Hohgrawe, Ilka Kostka, Yvonne Leung, Prashant Mittal, Allison Ruda, Balazs Szelenyi, John Wilder, and Shachi Winton. Please feel free to contact these colleagues while you are exploring your use of AI in the areas outlined here; they will be able to provide guidance and/or put you in the right direction for a successful implementation of AI at the College of Professional Studies.
Our Purpose
We are teaching and working in an era of significant disruption due to rapid developments in AI. With this disruption, however, come tremendous opportunities for enhancing teaching, research, collaboration, and work, allowing the College of Professional Studies (CPS) to serve as an innovative thought leader for the University. The guidance in this document aims to support faculty, administrators, and staff as they integrate AI into educational endeavors in the College. This living document serves as a reference and recommendation (i.e., a guideline); it is not set forth as policy or as binding rules.
Who Should Use These Guidelines?
As Ethan Mollick aptly noted, “The only bad way to react to AI is to pretend it doesn’t change anything.” Whether you are a power user of AI, an early experimenter, a curious beginner, or anywhere in between, you are taking important steps to understand the impact of AI on teaching, learning, and work. This living document offers guidance and support to ensure that CPS colleagues with a range of AI experience have the knowledge they need on their journey into AI integration. Each use case and question is built around creating an AI-supported environment that promotes efficiency, innovation, and accountability. You are encouraged to refer to these guidelines as you teach and work with AI and to regularly share insights, questions, and concerns with colleagues.
Playground or Sandbox Approach. Embracing a playground or sandbox approach fosters a culture of innovation where experimentation with new technologies is encouraged and valued. Creating an environment that welcomes trying out new AI applications through small-scale pilot programs and projects allows us to test viability and impact without significant risk.
Safety First/Human in the Loop. This principle emphasizes the ethical and responsible use of AI to ensure that AI systems are designed to mitigate biases, protect privacy, and promote transparency. We all maintain oversight and accountability by keeping humans involved in critical decision-making processes such as admissions, grading, and credentialing. Regular monitoring of AI outputs safeguards against errors and upholds quality standards.
Smart Resource Allocation. Adopting a rational and practical stance on the economics of AI use is crucial. Conducting thorough cost-benefit analyses for CPS at large and for individual AI projects allows us to make informed decisions about investing in AI technologies that align with institutional goals. Resources should be allocated to AI solutions that offer clear benefits, considering factors like scalability and integration with existing systems to maximize utility. CPS’ commitment to ongoing professional training also enhances our ability to effectively use and manage AI tools, facilitating smoother transitions to AI-enhanced workflows and ensuring effective applications of AI to professional endeavors at CPS.
Continuous Improvement and Measurement. Quality assessment and performance metrics are vital to ensuring the long-term success of all AI implementations. This involves establishing clear benchmarks and KPIs (Key Performance Indicators) to measure the performance and effectiveness of AI systems. By continuously monitoring metrics such as accuracy, efficiency, and user satisfaction, we can assess whether AI applications meet desired outcomes and adhere to institutional standards. Ongoing evaluation also helps to identify areas for improvement and optimize AI performance, ensuring that we not only meet immediate needs but also anticipate future demands. Quality assessment ensures that AI tools provide consistent value, both operationally and academically, in the long term.
Learning About and Engaging with AI
Experimenting with AI to complete one simple task is an excellent way to begin exploring its affordances. As always, all AI users must look carefully at the output and ensure accuracy and ethical and responsible use. From there, they can adapt the output and tailor it to their needs.
Northeastern offers opportunities for professional learning about AI in new product development, curriculum development, teaching, assessments, and workplace use. There is also a listserv in which faculty across the University can share insights and ideas for teaching. Lastly, the Center for Advancing Teaching and Learning through Research (CATLR) periodically offers programming related to AI. A list of resources can also be found on the CATLR website.
Potential inequities may arise if individuals are left to manage AI tools and data practices without adequate resources, training, and support. CPS will, where possible, offer structured professional development and training to help faculty, staff, and students learn how to effectively use AI tools and navigate data privacy concerns. These endeavors can create equitable opportunities for those who are already skilled at using AI and those who can benefit from additional training and support.
One of the best ways to keep up to date is to use AI and experiment with new features. When we “play” with AI, we can see firsthand what is possible and how tools work, thus increasing our familiarity and confidence in using AI. Below are other suggestions for engaging in continual professional learning:
Peruse resources designed specifically for instructors (e.g., EDUCAUSE articles and reports)
Earn micro credentials focused on AI in teaching and learning
The AI Tools Directory provides a useful collection of AI tools that are organized by name (in alphabetical order), category (e.g., coding, education, music, social media), and price (i.e., free, “freemium”, and paid).
The University published a position statement and a set of guidelines for faculty in August 2023. This document includes decision-making guidance, practical considerations for leveraging AI, and syllabus expectations. As far as we know, this document has not been updated since.
Similar to University guidance for faculty, there is a document for students titled “An Insider’s Guide to Learning with AI.” This document is available at CATLR’s site about AI in higher education.
AI for Curriculum and Product Development
Use AI tools to analyze industry trends, job market demands, and student performance data. AI can suggest new courses or updates to existing ones to keep programs relevant. However, always review the AI’s suggestions with faculty and CPS administration to ensure alignment with institutional goals and expertise. Most importantly, always approach AI suggestions with caution and verify their accuracy before relying on them.
Begin with your idea, however big or small. Always discuss your idea with your area dean, including how you plan to test new AI tools or approaches within a specific context and with specific users. Gather feedback from faculty and staff to refine the tools and approaches, and measure impact based on users’ experiences. Discuss how best to scale the solutions that demonstrate the greatest impact.
AI for Teaching
CPS encourages the integration of AI, and of all technologies, into teaching and learning. Below are considerations for students and faculty.
General Considerations – Students
While AI can provide quick answers and feedback, it is essential to engage with tutoring systems that encourage problem solving and critical thinking. AI tools can be programmed to ask open-ended questions, provide scenarios that require deeper analysis, or give hints instead of direct answers. Human tutors can supplement this process by reviewing AI driven activities to ensure students are not just memorizing solutions but developing strong analytical and reasoning skills.
Each discipline has essential skills that students must learn, even if AI can perform those tasks. Identify these tasks and skills and explain to students why it is important for them to master them. Highlight the learning opportunities they will miss if they rely solely on AI.
It is also helpful to identify processes in your discipline that AI can enhance. Faculty should guide students in using AI to enhance their work while avoiding its pitfalls, and they are encouraged to make students experts in leveraging AI effectively in their field. When they are empowered to be co-creators of knowledge with their instructors, there are endless possibilities for teaching and learning with AI.
One essential skill across disciplines is learning how to ask generative AI good questions. Your curriculum should include “prompt engineering,” that is, how to engage with AI meaningfully to achieve the learning outcomes of an assignment. Often, the first response from AI is not the best. Teach students to refine their questions and have a conversation with AI to improve the answers; for example, a vague prompt such as “Explain project risk” can be sharpened into “List three risks for a software rollout in a hospital and one mitigation for each.”
Each field carries different risks. When using AI models, text input can be used to train the model, potentially leaking information into future responses. Make sure students are aware of this. Teach them how your field handles private data and intellectual property rights (see the NU Privacy Principles).
Some AI models also ensure information is not used for training. If your discipline has important privacy and security concerns, teach students which AI models keep data private and which do not. Faculty and students are also encouraged to use Copilot, which ensures that their data is not used to train its AI model. See Appendix II for more information about Copilot at Northeastern for faculty, staff, and students.
As an instructor, the choice is yours, but consider the following: How can you support and guide students in their AI use in your course? How can you ensure that students are prepared to safely and ethically use AI in your class? What are the learning goals of the assignment, and will using AI undermine those goals?
Students often rely on specific and explicit faculty guidance about AI use and want to know what appropriate AI use entails. Including AI in course assignment descriptions and in-class activities, as well as examining AI-generated output with students, can foster transparency and build trust between faculty and students. One way is to generate output and critically analyze it with students, either in class or in out-of-class assignments, discussing questions such as:
Is AI-generated content accurate? Why or why not?
What misinformation and/or biases may be present in this output?
How could this output be misleading or incorrect?
What other credible sources would support this output?
Is there any important information missing from the output? Why or why not?
How does this output align with what I already know about this topic?
When designing your class, identify tasks that cannot be done by AI. Focus on what is important for a human to know and what AI cannot do. If certain topics can be easily handled by AI, spend less class time on those and more on complex problems that AI cannot solve. It is essential to verify AI output for accuracy and relevance before using it in decision-making or integrating it into educational practices. Incorporating AI can reduce the emphasis on minute details, allowing a broader focus on the “big picture” and the more meaningful content of the course.
General Considerations – Faculty
Faculty can leverage AI to create customized content for their courses, design creative project-based learning, use interactive case studies and role-playing simulations, personalize learning for students, and build students’ critical AI literacy skills (i.e., the ability to use, understand, and evaluate AI platforms). They can also leverage AI to save time preparing for class, providing feedback on student work, and creating teaching materials.
While there are opportunities to innovate teaching and learning, faculty must also be aware of the risks involved in using AI. These risks include data privacy violations, algorithmic bias, misinformation and disinformation in AI output, copyright and intellectual property issues, inequities with access to AI, hallucinations, and human labor costs. There are also major ethical implications to consider, as well as the impact of AI on the environment. Faculty need to be aware of all these risks and make every effort to mitigate them. Engaging in open and on-going conversations with colleagues and with students can deepen our collective understanding of these risks, especially as AI continues to rapidly develop.
Faculty and staff can use AI to record and take notes in meetings with students, with all participants’ consent. One program used in the GSE is Fathom, which generates summaries, action items, and next steps that can be shared with students. Another platform is NotebookLM, which allows users to upload various document types, including Google Docs, PDFs, and website URLs (see Appendix II). Once uploaded, the system analyzes these sources and provides:
Engaging “podcast-style” audio overviews of a course or student service, generated from the uploaded sources
Identification of key topics
Suggested questions to explore the content further
Analyses and reference images embedded in documents
Another example is Perplexity, which searches the internet in real time to gather information and generate a first draft. Users enter a topic and select their target audience (e.g., anyone, beginners, or experts). The Pages function will transform a researched topic into a well-structured and beautifully formatted article.
Customization: Users can edit and customize every element of the Pages, including text, photos, sub-headlines, and tone.
Source traceability: The tool provides easy access to view and manage sources for each section, ensuring information accuracy.
Interactive Q&A: Readers can ask follow-up questions directly on the Pages, enhancing the personalized learning experience.
Assigning tasks for students that leverage their higher-order thinking skills (e.g., analysis, synthesis, evaluation) can shift the focus from simple task completion to deep engagement in learning. Assignments could also include an AI component in which students need to use AI then reflect on output.
AI can be used for a range of tasks customized to course content, program learning outcomes, and course objectives. For instance, some CPS programs are using chatbots to provide students with additional feedback on key assignments (see Appendix II). Other examples include generating realistic project scenarios (e.g., risk factors and stakeholder challenges) for students to analyze and develop management plans. Another example is dynamic question generation. AI can generate personalized questions based on learning objectives, making it harder for students to share answers. Below is an example workflow (a brief code sketch follows the list):
Create a TA chatbot to interact with students using a platform such as https://poe.com
Prompt the LLM to ask students a few specific questions regarding their interests related to the course materials
Prompt the LLM to generate questions based on a topic of interest and the student’s answers
Prompt the LLM to evaluate the student’s answers to the generated questions
Prompt the LLM to regenerate the questions for further personalization based on the student’s interests and answers
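The core of this loop can be prototyped in a few lines of Python. What follows is a minimal sketch only, assuming a locally hosted model reached through the ollama package; the model name, course topic, and prompt wording are illustrative placeholders, not a prescribed implementation.

    import ollama

    MODEL = "llama3"  # placeholder: any locally pulled model

    def ask(prompt):
        # Send one prompt to the locally hosted model and return the reply text.
        response = ollama.chat(model=MODEL, messages=[{"role": "user", "content": prompt}])
        return response["message"]["content"]

    topic = "project risk management"  # hypothetical course topic
    interest = input("What interests you most about this topic? ")

    # Generate a personalized, open-ended question from the topic and the student's interest.
    question = ask(f"Write one open-ended quiz question on {topic} "
                   f"for a student interested in {interest}.")
    print(question)

    # Evaluate the student's answer and suggest one improvement.
    answer = input("Your answer: ")
    print(ask(f"Question: {question}\nStudent answer: {answer}\n"
              "Evaluate this answer briefly and suggest one improvement."))

In practice, the same chain of prompts could be configured in a hosted chatbot platform such as poe.com rather than run as a script; the point is the sequence of prompts, not the particular tool.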
Suggestions for Assessment
There are several ways in which faculty can assess student learning, many of which they likely already implement. Faculty are encouraged to:
Redesign assessments to emphasize critical thinking, focusing on analysis, synthesis, and evaluation instead of tasks requiring simple recall or content creation.
Use skill-based approaches that mirror real-world professional scenarios and encourage critical thinking.
Incorporate reflective assessments by requiring students to document their learning process with multiple drafts, research notes, and reflections.
Utilize real-time assessment including oral examinations and presentations to assess understanding and limit overreliance on AI.
Encourage collaboration and peer assessment, making it harder for students to rely solely on AI.
Use a variety of assessment methods to evaluate student learning instead of relying on a few major assignments.
Ask students to explain and evaluate their AI usage, which promotes transparency.
Faculty can design assessments that require students to demonstrate higher-order thinking skills such as problem-solving, synthesis, and evaluation. They can also develop rubrics that evaluate students’ process and reasoning, not just final outputs.
Host fun events (e.g., a hackathon) to encourage participants to explore the intricacies of problems and their solutions in a playful manner.
Assessments that focus on developing critical thinking skills and promoting originality are less susceptible to AI-generated responses. Instructors should also create assignments that require students to draw from personal experiences, specific class discussions, and/or specific course materials (e.g., readings, lectures).
Another strategy is to incorporate process-based assignments. Instructors can implement a submit-revise-resubmit cycle with feedback at each stage and contextualize questions by crafting prompts that require applying knowledge to specific scenarios or case studies. This makes it harder for AI to generate appropriate responses without deep subject understanding.
We suggest using this (evolving) Bloom’s Taxonomy table as a reference for evaluating and aligning course activities (or, where possible, learning outcomes) so that they highlight characteristically human skills and/or integrate generative AI tools to supplement the learning process.
Project-based learning (PBL) can be integrated into assessments by creating tasks that require students to apply theoretical knowledge to solve practical real-world problems.
Design assessments around real-world scenarios: These scenarios should reflect the complexities of professional practice, requiring students to engage in tasks that mirror the challenges they will face in their careers. This also encourages the development of problem-solving and critical thinking skills.
Require professional presentation of findings: Encouraging students to present their final projects in a professional format prepares them for the expectations of their future careers. This could involve written, oral, or multimedia presentations, depending on the context.
Encourage collaboration: PBL assessments can foster teamwork and communication by requiring students to work together to solve complex problems. This approach mirrors real-world settings, where teamwork and negotiation are essential skills.
Focus on problem-solving and decision-making: Assessments should prompt students to analyze problems, make informed decisions, and justify their reasoning. This process helps students develop higher-order thinking skills, moving beyond simple content recall or replication.
Emphasize the iterative process of feedback: PBL should include opportunities for students to revise and improve their work based on feedback. This iterative process helps them develop resilience, adaptability, and a deeper understanding of the material as they refine their approaches.
Use technology to simulate real-world environments: Integrating technology, such as simulations or virtual environments, into assessments can enhance the PBL experience by replicating the complexities of professional work environments, helping students develop practical skills.
Assess both the process and the final product: PBL emphasizes not just the final outcome but also the learning process. By evaluating how students approach problems, handle challenges, and adapt their strategies, faculty can gain deeper insights into their critical thinking and problem-solving skills and provide support as needed.
Promote creativity and innovation: PBL assessments should allow for flexibility, encouraging students to approach problems in creative and innovative ways. This flexibility fosters independent thinking and helps students develop unique solutions to complex issues.
Faculty can design reflective-based assessments to focus on evaluating the learning process instead of the final product. By requiring students to submit drafts, research notes, and reflections, faculty gain insights into how students arrive at their conclusions, which promotes a deeper understanding of their work. These reflections should encourage students to both examine how they used AI during the learning process and allow them to analyze the effectiveness of AI in learning. For instance, students could reflect on how AI helped them learn, what worked well using AI and what did not, and how using AI either helped or harmed their learning.
Oral exams and presentations require students to verbally explain concepts, defend their work, or engage in discussions in real time, all of which help faculty assess their understanding and ability to articulate knowledge without relying on AI tools or other sources of support.
Project Management: Use customized case studies, reflective journals, or live project simulations.
Analytics: Provide unique datasets, conduct coding assessments with in-person components, and require project defense presentations.
Education: Assign curriculum development projects, teaching practicums, lesson demonstrations, and reflections on real-world educational challenges (see Appendix II).
Digital Media: Have students create original works and document the creative process through portfolios. Incorporate peer critiques.
Science and Regulatory Affairs: Assign regulatory strategy development tasks, simulate regulatory meetings, and require analytical essays on regulatory changes.
Automated Grading with Human Oversight
Use AI for efficient grading and assessment while maintaining quality and ensuring equity.
Faculty can leverage tools that are currently available at Northeastern to assist with certain types of assessment tasks, such as developing detailed grading criteria, including rubrics, or creating formative feedback opportunities that help learners improve their work before submitting it. Appendix II includes an example of Harmonize, which provides rubric tools and a discussion coach to support engagement. Investigate tools that assist with developing objective assessments, such as multiple-choice exams with automated, scripted feedback. For subjective assessments such as open-ended questions or essays, use AI to offer initial evaluations but provide transparent final grading and personalized feedback. As one example, Ollama, an open-source platform for hosting LLMs locally, offers a completely offline and private way for faculty to run automated first-pass grading of student work without breaching data privacy. For more information, please visit https://community.hetzner.com/tutorials/ai-chatbot-with-ollama-and-openwebui
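As a minimal sketch of what such an offline first pass might look like (assuming the ollama Python package and a local Ollama server; the model name, folder, and file names are hypothetical):

    import pathlib
    import ollama  # Python client for a locally hosted Ollama server

    MODEL = "llama3"  # placeholder: any locally pulled model
    rubric = pathlib.Path("rubric.txt").read_text()  # hypothetical rubric file

    for essay_path in sorted(pathlib.Path("submissions").glob("*.txt")):
        essay = essay_path.read_text()
        prompt = (f"Rubric:\n{rubric}\n\nStudent essay:\n{essay}\n\n"
                  "Give a draft score for each rubric criterion, "
                  "each with a one-sentence rationale.")
        result = ollama.chat(model=MODEL, messages=[{"role": "user", "content": prompt}])
        # Draft output only: a human grader reviews every score before release.
        print(essay_path.name, "->", result["message"]["content"])

Because the model runs entirely on local hardware, no student work leaves the machine, which is the privacy advantage noted above.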
AI grading tools should be carefully calibrated to avoid biases. Regularly audit the AI’s grading patterns by comparing them to human evaluations. Ensure that AI tools are trained on diverse data sets to mitigate bias and have instructors review a sample of AI-graded assessments to ensure accuracy and fairness, especially for subjective assignments like essays or projects.
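One simple way to run such an audit, sketched below with hypothetical scores, is to grade a sample of assignments both ways and compare the two sets of numbers; Python’s standard statistics module (3.10+) is enough for a first look.

    from statistics import correlation, mean  # correlation requires Python 3.10+

    ai_scores = [85, 78, 92, 64, 70, 88]      # hypothetical AI-assigned scores
    human_scores = [82, 80, 90, 70, 68, 85]   # same assignments, human-graded

    gaps = [abs(a - h) for a, h in zip(ai_scores, human_scores)]
    print("Mean absolute gap:", round(mean(gaps), 1))  # average disagreement, in points
    print("Pearson r:", round(correlation(ai_scores, human_scores), 2))
    # Large or one-sided gaps (e.g., the AI consistently scoring one group of
    # students lower) signal that prompts or rubrics need recalibration.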
AI grading is most effective for objective tasks like quizzes, multiple-choice tests, or assignments that have clear, structured answers. It can also be used for initial evaluations of essays, coding exercises, or even discussion posts, where AI can check for basic elements such as grammar, structure, or keyword usage. However, for complex tasks requiring deeper analysis or creativity, human oversight remains essential for providing nuanced feedback.
While AI can speed up the grading process by handling the bulk of assessments, it is important to ensure that students still receive personalized feedback. For example, instructors can use AI to handle homework and repetitive grading tasks, freeing up time to focus on crafting meaningful and detailed feedback for more subjective or high-stakes assignments. Faculty can also review AI feedback to ensure it aligns with the learning objectives and personalize it where necessary. With prompt engineering, faculty can submit a rubric to an LLM chatbot and ask for specific feedback on student assignments based on that rubric, then use the LLM’s output as a scaffold for more personalized suggestions to improve the assignment. For more information, please visit: https://hongkongtesol.com/blog/how-use-ai-generate-studentfeedback
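A minimal sketch of that rubric-driven feedback step follows; the rubric text, file name, and model are illustrative assumptions, and the same prompt could equally be pasted into a chat tool such as Copilot rather than sent through the ollama package.

    import ollama

    rubric = ("Criterion 1: Thesis clarity (0-5)\n"
              "Criterion 2: Use of evidence (0-5)\n"
              "Criterion 3: Organization (0-5)\n")  # illustrative rubric
    draft = open("student_draft.txt").read()        # hypothetical student submission

    prompt = ("Using ONLY the rubric below, give formative feedback on the draft: "
              "for each criterion, quote one passage, explain how it meets or misses "
              "the criterion, and suggest one concrete revision.\n\n"
              f"Rubric:\n{rubric}\nDraft:\n{draft}")
    reply = ollama.chat(model="llama3", messages=[{"role": "user", "content": prompt}])
    # The instructor edits this output, adding personalized suggestions, before sharing.
    print(reply["message"]["content"])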
Academic Integrity
Please visit the Academic Integrity resources pages on the CPS SharePoint site for ideas and guidance related to educating learners about academic integrity and AI, and to learn about tools and strategies for investigating and responding to potential lapses.
Transparency is key. You should encourage students to use AI tools for brainstorming or as learning aids but require them to clearly document how and why they used AI. This can be done by tracking AI usage during the research and creation process. For instance, some instructors ask that students submit the AI prompts they used and/or submit the output they received from AI. It is also important to discuss ethical considerations around AI use with students, so they are fully aware of AI’s opportunities and risks.
Here are resources on how to discuss ethical considerations with students.
When teaching about AI-powered tools, discuss issues related to copying, plagiarism, and citing sources. For example, you might recommend that students cite AI if it generates ideas instead of the student coming up with ideas on their own. Faculty expectations for AI use should be clear to students for each specific assignment.
Creating transparency and trust among faculty and students is a critical first step in minimizing inappropriate AI use. For example, specifying when and how students may use AI in their work, as well as critically analyzing AI output in class, can help students see that AI is merely one tool that can support their learning. Nevertheless, they may still rely too heavily on AI for varying reasons (e.g., poor time management, lack of confidence in their skills or content knowledge). Detection programs are known to be problematic, so they are not recommended for addressing student AI use. The CPS Guidelines for Generative AI Teaching and Learning Expectations, excerpted below, offer additional guidance for reducing heavy use of AI in assignments and for responding to students:
Do not accuse students of cheating or of using AI.
Do not rely on AI detectors; bear in mind that they are inaccurate and may falsely flag the writing of multilingual students and neurodivergent students.
Use the rubric to note missing expectations, citations, or demonstration of learning.
If evidence suggests a student may have used GenAI, you may note: “Your work demonstrates patterns of error that are consistent with text produced by GenAI.”
The style guides that students use now have information about citing AI in written work. Below are guidelines that can be shared with students to encourage transparent use of AI.
Guidelines from the Chicago Manual of Style
Guidelines from the APA (American Psychological Association)
Guidelines from MLA (Modern Language Association)
AI for Research, Experimentation, and Continuous Feedback Loops
Use pilot programs to test AI tools before full integration into designed learning experiences. Before implementing new pilot initiatives, faculty should consult with area deans and engage with the digital learning specialist supporting their program area. The Applied Research team hosts a monthly Research Collaboration Series to highlight faculty research projects of any size or stage and facilitates workshops, symposia, and other events on a range of topics related to research and innovation. Explore your ideas with the College’s Center for the Future of Higher Education and Talent Strategy (CFHETS), which studies trends and pioneers next-generation learning models.
There are AI-powered tools that focus specifically on research (see Appendix II). One example is ResearchRabbit, designed to streamline the academic research process. ResearchRabbit allows users to create collections of papers, similar to playlists in Spotify. The AI learns from user preferences to provide personalized recommendations, making it easier to discover relevant research. One of ResearchRabbit’s standout features is its interactive visualization capabilities. Users can generate citation graph visualizations that help them understand the relationships among papers and authors, providing a comprehensive view of their field of study. The visualization tools can reveal research teams or related publications that might not be discovered through traditional search methods. The platform offers recommendations in two main modes:
People: Recommending publications by specific authors or suggested authors in the field
Papers: Suggesting similar work, earlier work, or later work related to the user’s interests.
Encourage students, staff and faculty to provide structured feedback. Use surveys or focus groups to assess how well the AI tools function. Make improvements based on this feedback before scaling up the tool’s use in full programs.
Design pilot programs to mirror the diversity and scale of the actual environment where the AI tool will be used. Involve a range of end users, use contexts, and faculty and staff in pilots to ensure the tool performs well under various conditions. Simulate real-world challenges such as differing learning paces and approaches, end-user needs, and technical issues to better gauge how the AI tool will perform when fully implemented.
Establish clear success criteria before launching the pilot, such as improvements in efficiency, student/staff satisfaction, and outcomes. Collect both quantitative data (e.g., assessment completion times or grading accuracy) and qualitative feedback from end users. If the AI tool meets the predefined goals and improves educational processes without significant drawbacks, it can be considered a successful pilot ready for broader adoption.
AI for (Daily) Work and Collaboration
Faculty and staff collaboration on AI initiatives has taken many forms at CPS. Many faculty conduct research and disseminate their findings in presentations and publications, offer workshops for other instructors, apply for research funding, co-present at conferences, and publish about AI. An easy way to collaborate, however, is to simply exchange ideas and resources. For instance, small working groups formed among colleagues who are interested in AI can provide a supportive and collaborative environment in which to discuss discipline-specific issues regarding AI.
It is important to understand and consult the University Decision Support’s high-level guidance document: How to Safeguard Data. If you have questions about specific use cases or technologies, the CPS Decision Support & AQA team will field your question and guide you to appropriate resources.
It is important to ensure that all meeting participants feel comfortable with AI Notetakers, or other types of AI Assistants, in online meetings. Create frameworks for establishing acceptable norms or conventions around AI use, particularly concerning sensitive data. If meetings are recorded, attendees should be made aware ahead of time. They should also have the option to participate in whichever way is comfortable for them. See Appendix 2 to learn how other universities are incorporating guidance around AI-based “netiquette.”
The University is exploring a variety of platforms to support AI experimentation in ways that encourage innovation while also aligning with broader business strategies and infrastructure requirements. There are important infrastructure considerations tied to these decisions, so talking with your supervisor about your idea is an essential first step.
The process of establishing AI use conventions and guidance will be forged through collective decision-making and accountability across the CPS community. Faculty, staff and student voices are all critically important to this work, and policies and guidelines are regularly updated to keep pace with technological advancements and regulatory changes. Consider this a “living” guidance resource and return to it – and offer your feedback on it – regularly.
Appendix I: AI Training and Infrastructure at CPS
This section provides strategic framework considerations for building a fully integrated and sustainable AI workplace, ensuring that our practices align with the College’s goals and mission and support the ongoing development of faculty, staff, and students.
Tailored AI training programs: These will focus on interpreting AI-generated insights for curriculum development, understanding AI tools, and integrating AI into administrative processes. Faculty can learn how to effectively use AI for grading, curriculum updates, and even personalized learning systems.
Continuous learning opportunities: These include workshops, webinars, and peer mentoring programs, to keep faculty and staff updated on the latest AI advancements. Offering refresher courses or advanced training on new AI tools will ensure that everyone stays current with evolving technologies. Additionally, creating a support network where staff can share challenges and solutions will foster collaborative learning.
The CPS initiative aims to cultivate a supportive environment for innovation by leveraging AI to enhance faculty, staff, and student success. Key strategies include fostering experimentation through pilot programs and promoting a culture of creativity and responsible risk-taking. It also emphasizes systematic evaluation of AI applications across various domains like program development, teaching, and operational efficiency, using KPIs to gauge effectiveness and scalability. Collaborative efforts with internal and external stakeholders ensure AI solutions address real-world educational challenges. Additionally, CPS plans to enhance infrastructure and student support services to integrate AI tools effectively, thereby supporting personalized learning and continuous improvement within the institution.
While there are many AI tools on the market, the free version of Microsoft Copilot is IT Services’ recommended AI chat tool. To ensure university data stays private, always sign in to Microsoft Copilot with your Northeastern credentials. This ensures university data remains within our Microsoft environment and is not used to train the AI model. Members of the university community should refrain from using third-party AI tools such as meeting recorders, note-taking tools, and writing proofreaders. Third-party AI tools can use this data to train their models and, potentially, sell the data provided to them to advertisers and others.
Harmonize is a CPS-licensed tool available to all faculty and students. Harmonize’s Rubric Coach can take instructions that you developed for your discussions and build a set of focus areas, or an entire rubric, that syncs back to your course and is available in the Canvas SpeedGrader. You can associate existing Canvas rubrics with Harmonize discussion assignments as well. Learn more
Scite.ai is an AI-powered research platform that analyzes and provides citation context for scientific papers, helping researchers evaluate the credibility and impact of scholarly articles. Learn more
Cycle 1 Fieldwork report chatbot provides Advanced Research Methods students with additional feedback on their first assignment (a fieldwork report). This required working with developers to tune the bot specifically to EdD course rubrics, expectations, and templates. The bot was trained with EdD faculty feedback.
AI notetaker for meetings with students. Instructors are using the free Fathom Notetaker tool to record and take notes for meetings with dissertation students. It provides excellent summaries, action items, and next steps that instructors can forward to students.
Course introduction audio/podcast. Dr. Joan Giblin uses the freely available Google NotebookLM to create a podcast-style course introduction. To generate the podcast, Dr. Giblin uploaded the EDU7310 R3.5 syllabus, her personal introduction to the course email, the Cycle 1 fieldwork report template, and a document on how and why the program is introducing AI in this course.