Library sessions teach students to ‘leverage’ AI

By Zyheim Bell

Starting in October, the Moore Library will host workshop sessions open to the Rider community in hopes of empowering students across all majors to learn how to “leverage” artificial intelligence, according to a Sept. 25 studentwide email from the university library.

The workshops will be offered in four in-person sessions within the library. Two sessions will be held on Oct. 8 while the remaining two will take place on Oct. 9 and 21. 

Students have the opportunity to attend sessions that teach them how to utilize Google NotebookLM, use AI models that can help with the planning and development of group work, and explore career options.

Dean of the Library Sharon Whitfield said, “We’re kind of looking at AI more as what it’s supposed to be, as an assistant that works with you.”

Whitfield is an avid user of AI, employing the software for tasks where she believes creativity is not required. “They tend to be very automated things in my life. For instance, if I want to just make sure that an email is coming across effectively,” she said.

The Oct. 8 sessions will both cover Google NotebookLM and how AI can be used to create study tips while staying in compliance with the university’s AI policies. The Oct. 9 session, “From Doodles to Done: AI Tools for Smarter Group Work,” intends to teach students how to use AI models like Google Gemini and Napkin to turn “doodles” and prompts into images.

The Career Development and Success department will lead the Oct. 21 workshop on how students can use AI tools like Google Career Dreamer for career path ideas and skill building, the email stated. Students will be able to use Google Workspace AI features like Career Dreamer and NotebookLM for free through Rider.

The university library has begun hosting sessions to equip students to properly utilize artificial intelligence. (Gail Demeraski/The Rider News)

With the workshops, the library team wants students to understand that human input and collaboration with AI are needed for the best results — a “centaur-based” relationship, according to Whitfield, where the human and the technology have to work hand in hand with one another.

Whitfield is hoping that the sessions provide students with the necessary skills to use AI effectively, and that starts with prompting and user input.

According to her, AI writes at a seventh-grade level, and when students do not take the time to properly format their prompts or let models write entirely for them, they run the risk of their work having a high level of “banality.” Whitfield wants students to see how AI produces these unoriginal responses so they understand that the programs will not produce writing that is “creative” or capable of making a “great argument.”

Whitfield combats this herself in two ways: using AI only to proofread her original work and writing prompts that focus on the audience she wants to adapt her writing for.

“I will kind of take my prompt and say from a certain lens, what do you think about this?” she said.

Trevor Janusas, a junior communications major, believes it is important that the school has opted to host programs about AI through the library.

“There’s so much rampant use of AI within every part of people’s school lives … Why not learn how to make ourselves better from it instead of just completely cheating?” he said.

Janusas uses generative AI systems like ChatGPT for creating study guides and helping to complete assignments. However, he also notices that the convenience of AI has been a “slight detriment” to him.

“It might be taking me back as a learner because I’m not doing as much of the thinking as I used to do in high school,” Janusas said.

The library workshops are working to correct this and bring students to that “next level” by using AI to enhance written notes, provide new lenses of thinking and remain critical of the technology, according to Whitfield.

Cheating is not the only issue that can be raised when it comes to the use of AI; Whitfield wants students to understand the ethics behind the software.

Whitfield said, “There’s a bias to AI, and that typically happens with all technologies. So we want them to also think about that.”

AI is also fed by the content that is put into it. Whitfield noted that each time someone submits work to AI programs, they lose their intellectual property as the writing is used to feed and improve AI databases.

Whitfield hoped more students would enroll in the workshops and urged them to take control over the shaping and use of AI. She said, “Make sure that you continue to prompt … to ask more questions to use this tool more effectively.”
