
Academic approach to AI maturing as technology evolves


Higher education moving beyond initial fears of artificial intelligence (AI) to focus on practical, technology-specific opportunities was a recurring theme at the Digital Universities US conference, which concluded Wednesday in St. Louis.

The conference, co-hosted this week by Inside Higher Ed and Times Higher Education in collaboration with Washington University in St. Louis, brought together hundreds of university administrators and educational technology company employees to explore the possibilities and challenges of digital transformation in higher education.

“I have been in digital transformation for over 20 years; the first lesson is that this doesn’t happen overnight,” said Lev Gonick, chief information officer at Arizona State University, who kicked off the event’s second day by describing a science-focused virtual reality lab and a partnership with OpenAI.

Gonick said that although ASU’s digital transformation has taken decades, there is no time to waste when it comes to AI. ASU needs to go from “online to AI” in about three to four years, he said.

Artificial intelligence

Unsurprisingly, generative AI was a topic of discussion at many event sessions. At a packed workshop on “Why universities are slow to adopt technology,” participants identified AI as a key technological trend in higher education that will continue to shape the sector over the next five years.

When it comes to AI speeding up academic processes, “a lot of the new work for us is figuring out what we want to evaluate in the process rather than just at the end,” said Douglas Harrison, associate dean and clinical professor at New York University. “The end has been so reliable, for so long, as a measure of learning. But now we have to evaluate in the middle – what we have been saying for decades, but now our hands are forced.”

In another session, Robbie Melton, interim provost and vice president for academic affairs at Tennessee State University, warned about the dangers of bias in AI results. She described how AI-generated images of underrepresented groups can provide negative portrayals, even in subtle ways, with images that tend to be sad or serious. Creating positive, happy AI-generated images may require several prompts, she said.

“There is a digital divide, and there will be an even greater digital divide if underrepresented groups don’t have a seat at the table,” said Melton, who is also vice president of technology innovations for SMART’s Global Innovative Technologies Division.

Badri Adhikari, associate professor of computer science at the University of Missouri in St. Louis, emphasized the importance of human checks on AI, including providing “context” to AI models to reduce bias when they are trained on inevitably biased data. Adhikari also emphasized that AI is not yet reliable enough to go without human verification in any consequential application.

“There is somewhere between solving the problem of bias and avoiding Terminator, and I continue to work in that space,” he said.

The University of Florida is working hard to train faculty members to help students use AI ethically and practically, but is holding off on using AI to construct assessments due to similar concerns, said David Reed, associate dean for strategic initiatives. He said even a promising short-term predictive analytics program was halted while his team explored its possible implications.

Neil Richards, Koch Distinguished Professor of Law at Washington University in St. Louis, joked that he was invited as the “resident contrarian” on a panel discussing the ethical and legal implications of AI. Richards opposed the idea that technology regulation and innovation are in conflict, arguing that technology and law have long been intertwined and that any strong technology finds ways to adapt to strong ethical and legal barriers.

ASU’s Gonick said a key way to quickly adopt AI is to have a few employees focus exclusively on implementing the technology, whether it’s a team of two or 20 people.

“They wake up in the morning and go to bed at night thinking only about AI acceleration at ASU,” Gonick said of his AI acceleration team. “If you tack that onto someone’s existing schedule, it’s hard to imagine you’re dedicating the necessary resources.”

Equity and Inclusion

AI was not the only theme of the event, where the official theme was “Digital First: Access, Equity, Innovation.”

“One of the really great things about online spaces is that they give us the opportunity to really think about creating learning experiences with diversity in mind,” said Tiffany Townsend, vice president of organizational culture and chief diversity officer at Purdue Global.

“What we’re doing with technology is really thinking, from the beginning, ‘How do our students show up? How do they learn in different ways? And how are we incorporating that into the way we structure our courses from the start?’” she said.

When it comes to defining access and equity in online spaces, boiling ideas down to single definitions can be limiting, said Racheal Brooks, director of quality assurance implementation solutions at Quality Matters, a nonprofit focused on online and blended learning.

“Rather than ensuring that no student faces challenges, we need to rely on the experience of students to help clarify how we can continue to expand this definition,” she said. “Keep in mind that we will grow and learn – keeping this in mind can help us expand whatever definition we choose.”

Accounting for COVID Losses

University leaders also addressed the importance of addressing the learning and emotional loss that has occurred during and after the COVID-19 pandemic.

In a session with leaders of minority-serving institutions, Maurice Tyler, vice president for information technology and chief information officer at Bowie State University, a historically Black institution in Maryland, said current students are behind in their development of social relationships compared to students before the pandemic.

“We can clearly see the cliff, but we’re not sure how to handle it,” Tyler said. “How do you fast-forward someone’s brain six years into the future without overwhelming them with too much social interaction that we’re not really ready for?”

Part of Bowie State’s response has been to increase the frequency of outreach from its student support team, moving checkpoints for potential interventions from the fifth week of the semester to the second.

“Reducing these deadlines has helped because it helps us get ahead of these problems rather than playing catch-up,” he said.

Wendy DuCassé, director of field education and assistant clinical professor at Saint Louis University, noted in another session the impact of the pandemic on student mental health – how some students have thrived in online learning environments while it has exacerbated existing mental health challenges for others. Young people’s tendency to access a near-constant stream of news about global events on social media can create experiences of “vicarious trauma,” she said.

Tameka Herrion, senior director of programs at the Scholarship Foundation of St. Louis, where 85% of funded students are eligible for Pell Grants, told educators, “The one thing we can do to help our students the most is pay for mobile mental health apps – like Headspace or Calm – so they can access them anywhere.”

Sara Custer, Colleen Flaherty, and David Ho contributed to this article.


