
Partner Post: Anthology's Blog on the Importance of Empowering Instructors for Responsible AI Adoption

By Jim Chalex and Nicolaas Matthijs

With the recent widespread availability of generative AI technology, many institutional leaders are faced with a fundamental strategic question: how can the efficiencies of AI be leveraged without jeopardizing academic integrity?

We are actively discussing this challenge with clients worldwide and have released our Trustworthy AI Approach to provide full transparency on Anthology’s approach to AI, including how it informs product development. A central pillar of this is what we call “humans in control”, meaning our global education community can choose whether to adopt AI-powered capabilities within our solutions—and the timeline for adoption—based on their institution’s policies and preferences.

It is our belief that a responsible and innovative approach to AI in the classroom starts with instructors and instructional designers. Our recent announcement of the AI Design Assistant, an initiative from Anthology in partnership with Microsoft to make Blackboard® Learn the first LMS on the market with generative AI capabilities, represents a crucial step in this direction. By supporting faculty and informing them of AI’s possibilities, not just its risks, institutions can promote authentic assessment, engaging learning experiences, and academic integrity, while also providing efficiency for teachers and, as a result, improved outcomes for students.

Tackling the Assessment Challenge

Let us cut to the chase: a core concern around AI relates to plagiarism and academic integrity. With programs like ChatGPT able to produce essays and other material in seconds, and many exams and other tasks now completed remotely, both instructors and institutional leaders are concerned about the damage that AI could do to fair assessment practices. And not without reason, as many approaches to student evaluation that have been honed over decades must now be reviewed in the world of generative AI.

Embracing authentic assessment processes is more important than ever. With AI able to distill existing information with such efficacy, assessment needs to focus on critical thinking, personal perspectives, and self-reflection rather than the accrual of knowledge. Activities might also look to explore subject areas where these tools do not have as much historical data to work with, such as current and local events, personal experiences, and future predictions. As Jacob Spradlin, director of online instructional development and support at Sam Houston State University shared at our recent Anthology Together ’23 conference, “ChatGPT has sparked a whole new level of awareness and proactivity from our faculty with regards to authentic assessment. Knowing that students can easily generate responses to standard questions and tasks has placed greater emphasis on looking for original ways to test different subject areas. It might even be that AI ends up being a net-positive for assessment in higher ed.”

The role of learning technology, by extension, is to make it easier for instructors to adopt these practices. The AI Design Assistant, which will be available to all Learn institutions in September, will allow instructors to automatically generate formative test questions and grading rubrics based on course content, making Learn the first LMS on the market to offer this functionality. Eliminating the grind of starting an assessment from scratch frees up more time for instructors to review the task through an authentic assessment lens, creating an efficient process that maintains academic integrity.

Automatic generation within these tasks is not meant to replace the user or the process as a whole, but rather to provide a starting point. Maintaining autonomy for instructors and instructional designers is fundamental to our development approach for Learn, and the AI Design Assistant reflects this: it provides helpful tools to inspire the process of test and rubric creation without requiring faculty to cede any control over the final product. Again, with Anthology it is always the human, in this case an instructor or instructional designer, who is in the driver’s seat, with full visibility and ownership of all high-stakes processes throughout the learning experience.

You may well be questioning where anti-plagiarism software and AI detection fit into all this. It is our opinion that reliable detection is not a viable approach and at best provides a false sense of security, a view informed not only by our extensive testing with clients but also by broader research, including concerns that AI detection disadvantages students with disabilities and those learning outside their native language. Inclusion and accessibility are, and always will be, central pillars of all Learn product development, and we believe the most responsible approach is to focus first on empowering instructors and authentic assessment, and then to employ anti-plagiarism tools as a last line of defense. In the coming weeks we will publish a whitepaper that brings together the results of our own testing and the broader research.

The Benefits of AI Extend Beyond Assessment

The AI Design Assistant has additional benefits for instructors outside of assessment. As the name implies, a core part of this is giving instructors inspiration for how to structure their course in alignment with the course syllabus and learning outcomes. This is also beneficial for instructional designers, as it allows them to focus their time at a higher, strategic level rather than on time-consuming production tasks. To see this market-leading functionality in action, check out the short video below, in which Nicolaas Matthijs, VP of Product Management for Blackboard Learn, uses AI to generate modules for a course on the Evolution of Country Music.

The excitement from clients was palpable when this was announced at Anthology Together ’23:

  • “Anthology is the first EdTech giant to focus on strategically supporting faculty and instructors.” – Frederick T. Wehrle, Associate Dean for Academic Affairs, University of California, Berkeley
  • “I usually walk away from these product roadmap sessions with four or five great things – this time there are 25! I think the biggest impact, though, will come from the AI Design Assistant.” – Mary Kleps, Director of Academic Computing, Fairfield University
  • “The new Blackboard Learn feature that is most intriguing is the AI Design tool. I am really interested in exploring what that has to offer.” – Roy Cloate, Learn Manager, The Independent Institute of Education, South Africa

And this is just the beginning. The advantage of our integrated partnership with Microsoft, whose commitment to innovation in AI is well established, is that it allows us to stay at the forefront of this technology while collaborating with our institutional partners to ensure it is used responsibly in higher education.

Enhancing Support Systems for Instructors

A study of instructors at Harvard found that 47% felt AI would negatively impact education, compared with just 21% who expected it to be beneficial. AI is going to demand changes of instructors, and any major change will inevitably be met with a degree of trepidation.

This, too, has been a major theme in our recent conversations with clients. "My experience has been that faculty come to us usually from one of three perspectives,” said Suzanne Tapp, assistant vice provost of faculty success and executive director of the Teaching, Learning, and Professional Development Center at Texas Tech University, in a recent Anthology webinar. “A ‘fight it’ perspective where they're really concerned, with good reason, about what happens to academic dishonesty with the entry of easily accessed AI tools. Or maybe they come to us with the opposite ‘use it’ perspective and they're ready to jump in. Or maybe they're in the open-minded middle, where they're watching to see what happens. [...] We have this new paradigm that we all have to learn about [AI] and figure out what to do with it, and where we enter into the conversation."

A major source of instructors’ concerns is that they do not feel they are receiving the guidance and direction they need to apply AI within their courses. As recently reported in Inside Higher Ed, a survey of instructors across the United States found that only 14% had been provided with guidelines for the use of AI in the classroom and, consequently, only 18% had set guidelines for their students. Furthermore, the conversation around AI has so often been negative and defensive (“how to protect your courses against plagiarism in the ChatGPT era!”) that it is little wonder so few instructors feel positive about it.

We need to change the tune on generative AI. (Sorry, we are fresh off a week in Nashville – music puns have become mandatory around camp Anthology!). There is clearly a role for strong policy to lead responsible adoption, something we are committed to in the development of our technologies. But there is also a need to empower instructors, by demonstrating to them that AI can provide inspiration and make the process of designing engaging courses easier, including for online, hybrid, and other emerging modalities. Instructors who embrace change and innovation provide richer learning experiences to their students, and we are dedicated to working with institutions to ensure that generative AI becomes the latest example of technology improving, rather than inhibiting, meaningful education that changes lives.
