Professor Doyle Reflects on AI
- Kyle Tabor
- Sep 25
Kyle Tabor | Contributing Author
For Part 1 of this series, I spoke with Dr. Doyle, Associate Professor and Department Chair of Art History, about how the rise of Artificial Intelligence has impacted their field. Dr. Doyle’s specialty is the art of medieval Europe, which they have focused on since undergrad, when they double-majored in Art History and Medieval and Renaissance Studies. Straight away, I was told that the impact is not yet fully clear, as AI tools cannot yet engage directly with the kinds of research done in this field. The teaching angle, however, is more pronounced, and they are developing a lesson plan examining AI use and learning outcomes.
“Art History,” they say, “analyzes images and words, and generative AI has in many ways impacted the way we engage with these mediums, especially online … many others in my discipline are disappointed by the volume of AI writing their students are submitting.”
Cheating with AI is an increasing issue across all classrooms, especially for paper and essay submissions. Dr. Doyle assures me that paper-writing is not assigned for its own sake. “Through writing papers, you develop critical thinking skills necessary for your discipline, and life going forward. And so the ease with which an AI tool can churn out a seemingly polished art history paper is a serious challenge to learning.”
Dr. Doyle stressed the concept of information literacy: teaching students how to assess and contextualize the information they find, especially online. Each source should be evaluated for usefulness, appropriateness, and authority. “We need to empower students to approach this technology critically to their best advantage, and support learning and development rather than replace it.”
They recently worked on a group study with three other ECSU professors: Dr. Ferruci, Dr. Garcia, and Dr. Hwang. The study took place under an NEH “Spotlight on the Humanities” grant, with Dean Emily Todd as Principal Investigator, and explored the relationship between generative AI and the Humanities, since much of the public attention on AI centers on STEM and the further development of AI itself. The group went in with different initial attitudes, and emerged with different takeaways. Dr. Doyle remarked, “when you use something without understanding its purpose or approach, you can’t use that thing very well, or adapt to it.”
Then we went big-picture. I asked them about the future of AI and its place, if any, in the Humanities as a whole. Were they optimistic? “The notion that technology allows us to be more human is somewhat undercut by the companies developing and marketing these tools. They’re arguing that what AI is best suited to do is replace the humanist effort. And that is a huge concern, for us as an academic community and as a society.”
Dr. Doyle offered a final nugget of wisdom at the end of our interview, addressing students considering Art History as a field. “One of the key skills is looking closely, and looking critically at what we see.” As we enter an age increasingly filled with digital misinformation and disinformation, let’s keep our eyes open.
The work necessarily continues. This article is the first part in a series.