Digital Citizenship: Exploring the Potential of AI

Drawing on the same skills and mindset teachers have helped students develop since Lower School, Ravens explore how to navigate both the benefits and the risks of using AI tools.

Karen Lewis Taylor
Nov 18 2024

As the Information Age continues to dramatically redefine the ways in which we learn, work and engage, Ravenscroft faculty are committed to ensuring that students in all three divisions build the knowledge and skills they need to thrive and lead in the online world.

In this three-part series about digital citizenship, we explore how our youngest Ravens master the basics of computer and tablet use, how middle-grade students develop a sense of responsibility alongside their burgeoning digital voices and how both teachers and students explore the opportunities and challenges presented by artificial intelligence as this latest technological advancement enters the mainstream.

By the time they’re in Upper School, students have access to Innovation, Design and Engineering offerings that include college-level computer science courses and an advanced seminar in artificial intelligence and machine learning. Ravenscroft continues guiding students in the responsible use of technology even as they begin to create new programs and tools themselves.

In fact, learning about AI provides many of the same growth opportunities teachers have helped students explore since Lower School. Successfully navigating both the benefits and the risks of these tools requires a deliberate strategy, which Ravenscroft is building through clear guidelines, teacher training and hands-on classroom experience.

Above, Rehan Khan ’25 and Zach Peverall ’25 present their AI prototype, Absolutely Momentous Origin for a New Generation of Understanding and Success (AMO) — which will support math education by creating homework and review exercises — as part of their work in Advanced Seminar in AI & Machine Learning.


Lead From Here facilitator Chris Harper and Lower School Spanish teachers Carmen Hernandez and Margaret Irish ’13 work on prototypes for an interactive lesson using AI as part of recent training by the school’s Innovation Task Force.

“I really see the benefits of AI”

Developing a thoughtful approach to the use of AI begins with the Innovation Task Force, whose members are working to identify the skills students need to use AI effectively and the strategies teachers need to incorporate AI tools into course curricula.

Comprising Chief Information Officer Louis Tullo, Director of Educational Technology Mitch Carraway, Director of Library Services Angela Finn and two teachers from each division, the task force is developing guidelines for the academic use of AI, providing teacher training and assessing the ongoing impact of AI on learning outcomes.

“Our meetings are centered around questions including ‘How can various learning activities be reimagined to help students develop the skills they need?’ and ‘How do AI tools address the skills we’ve identified?’” Tullo explained.

Chief Information Officer Louis Tullo’s presentation on using AI tools to gamify lessons included an opportunity for faculty to practice using Flint to adapt a lesson.


“We also want to be sure we’re encouraging students to use critical thinking in evaluating the quality and reliability of information generated by AI,” Finn said. “In addition, we’re focused on maintaining a balance between AI-powered tools and face-to-face instruction — prioritizing human interaction.”

In support of this work, the task force is offering teacher training on AI. On the Sept. 20 workday, for example, Tullo walked faculty through a presentation on how to use AI to help build gamified activities. Teachers had time to consider ways to augment or adapt their lessons and discuss ideas with peers before using the tools to design a prototype for a lesson they had brought to the training.

Upper School math teacher Karen Tinnesz enters information into the AI tool Flint to create a gamified lesson for her algebra classes.


“I found the guidance about [being specific in] prompt writing to be extremely helpful,” Middle and Upper School Health and P.E. teacher Frank Regalbuto said. “Whether it be building onto some of the stuff I’m already doing or finding new ideas, I really see the benefits of AI as an additional resource to my teaching.”

 

Xander Lamond ’27 presents his AI-generated prom attire, part of a unit on shopping, to his Honors Mandarin II class.

“It is here to stay”

Other teachers who are new to AI agreed, saying the session opened their eyes to the many ways they can use these tools in curriculum development.

“I did not have any experience using AI prior to the training,” Lower School General Music teacher Katie O’Neill said. “I found it helpful to know that I could double-check tasks for music, such as chord progressions for a song, before teaching the song to a class. I see myself finding new activities with the gaming AI database that will help with creating various other ways of learning music in the classroom.”

Yan Zhou’s AI-generated images depicting her Honors Mandarin II students — including Sais Smith ’27, shown here — as young children exploring art in the Fine Arts Center inspired the class to use the tools themselves in subsequent projects.


Innovation Task Force member Lali Fisher, who teaches Science 6 and Science Olympiad, said, “I have begun using Flint for homework to reinforce the learning going on in the classroom. I am really excited to get to know AI better and utilize the tools to enrich student learning.”

But these AI tools aren’t just for teachers. When Middle and Upper School Mandarin teacher Yan Zhou used ChatGPT to create fun images of her students viewing artwork in the gallery in Pugh Lobby last year, the results “sparked their interest in incorporating more AI-generated images into their learning process,” she said. In a subsequent unit on shopping, students used AI tools to create their dream prom attire, which they then shared in an oral presentation.

Upper School English teacher Karen Cruz used the training to adapt a lesson to include ChatGPT, allowing the seniors in her 20th-Century American Drama elective to see both the benefits and the limitations of the technology.

To create prom fashions as part of their shopping unit in Honors Mandarin II, Jaeden Jordan ’27 and Azera LaGuerre ’27 first brainstormed descriptive terms to guide the AI, with these images as the result.


“The objective was to write a new scene for a canonical play we had read as a class,” she explained. “I had the students brainstorm on paper what they envisioned for their scene, as I wanted the seed of the writing to be their own. Next, they created two sets of input for ChatGPT. I wanted them to see what the different terms and information they input would yield.” Students then adapted the results to fulfill their vision for the scene. As a final step, they wrote a reflection that included noting where AI had helped — and hindered — their work.

“ChatGPT did not understand everything about the play,” Cruz noted. “In one output, the AI did not understand an emotional factor of the drama that a human reader would have easily understood.”

Innovation Task Force member Angela Moser, an Upper School English teacher who is pursuing a doctorate in educational technology, said she uses AI “all the time,” for everything from rote tasks to brainstorming ideas for lessons. She said she applauds her colleagues’ work to understand both the power of AI and its downsides.

Students “don’t understand that AI is often wrong and cannot do a lot of the tasks we ask of it, at least not in the way humans can,” she said. “I think it’s important for all teachers to be trained on AI, as it is here to stay, and we need to be able to recognize it, use it and teach students about it.”

 

Advanced Seminar in AI & Machine Learning students Jonas Lisson ’25 and Sundesh Donthi ’25 collaborate on work for their prototype.

“It’s deeply connected to human values”

As the Upper School computer science teacher and robotics co-coach, Innovation Task Force member Mariam Elias is on the front lines of preparing students to use these emerging technologies. Her work with Ravens in the Advanced Seminar in AI & Machine Learning highlights the responsibilities educators have to prepare students to use AI tools successfully.

“Currently, I’m focusing heavily on the intersection of AI and ethics, which is one of the most important discussions we need to have as we continue to develop and implement AI technologies in real-world applications,” she said. “I’m teaching students not just how to build AI models but how to advocate for ethical principles that address bias in AI, fairness, privacy concerns and the broader societal impacts of AI systems.”

To ground their work in this mindset, Elias assigned a group project that tasked students with prototyping an AI product that would benefit schools. They were required to build an ethical matrix and a framework that considers fairness and privacy, ensuring that the AI product operates transparently and equitably.

Students in Mariam Elias’s Advanced Seminar in AI & Machine Learning course created logos for their AI prototypes (left to right): Matthew Madewell ’26 and Cole Welborn ’26’s Activity FinderAI; Jonas Lisson ’25 and Sundesh Donthi ’25’s ScheduAIder; Tyler Artinger ’25 and Eleanor Mowat ’25’s InnovaEd; Zach Peverall ’25 and Rehan Khan ’25’s Absolutely Momentous Origin (AMO).


Proposals ranged from a scheduling tool that weighs graduation requirements, student preferences and even friend groups when building course schedules to another that matches students with extracurricular and community activities that nurture their interests and round out their profiles for college applications.

Students said the unit informed their understanding of AI in important ways.

“The biggest thing I learned about AI from this project is that seemingly simple and harmless products, like an AI that makes schedules, have many ethical issues that can affect many people,” Jonas Lisson ’25, who created the ScheduAIder with Sundesh Donthi ’25, said. “During the research process, I was shocked by the multitude of ethical issues that needed to be avoided. It was crucial for us to consider how every decision we made could affect the user.”

“As an informed user, I will research how AI systems generate their outputs to avoid relying on ‘black box’ models — models that hide their inputs, operations or decision-making processes,” Matthew Madewell ’26, who worked with Cole Welborn ’26 on the Activity FinderAI prototype, said. “As a potential creator, I will prioritize transparency in my models by making their logic visible and using a well-rounded and comprehensive dataset to ensure the most accurate and beneficial outcomes for users.”

Advanced Seminar in AI & Machine Learning student Eleanor Mowat ’25 presents the InnovaEd prototype, an AI tool developed with classmate Tyler Artinger ’25 that will use data from prior assignments to predict how much time students need for future work and generate a schedule for completing tasks. As part of the assignment, the pair spelled out metrics for ensuring fairness as well as privacy and data protection.


“My goal is to ensure students understand that AI isn’t just a technical subject — it’s deeply connected to human values and social justice,” Elias concluded. “We are equipping students not only with the skills to build AI technologies but also with the understanding of how to use them ethically and responsibly.”

Be sure to read our entire three-part series on digital citizenship! Learn how Ravens develop the skills and mindset for technology use in part one, Learning Foundational Technology Skills, and prepare for the challenges and opportunities of the online world in part two, Preparing Students for the Online World.