Two George Mason faculty appointed to Virginia AI task force

Amarda Shehu, a professor of computer science and associate dean in the College of Engineering and Computing, and Padhu Seshaiyer, a professor of mathematical sciences and the former associate dean for academic affairs in the College of Science, were named to the State Council of Higher Education for Virginia (SCHEV) EO 30 task force, created in response to Executive Order (EO) 30 issued by Virginia Governor Glenn Youngkin.

Both Shehu and Seshaiyer see this as a timely and important initiative by SCHEV, and they are excited that George Mason University is an integral part of the process, supporting SCHEV and, by extension, educational institutions across the commonwealth.

Shehu's research is on AI and machine learning. Photo by the Office of University Branding

The EO “recognizes the dual nature—both the opportunities and risks—of this developing technology in education. K-12 schools and postsecondary institutions must embrace innovation, experimentation, and new educational opportunities for students as well as ensure appropriate guardrails and necessary constraints exist to safeguard individual data privacy and mitigate discriminatory outcomes.”

SCHEV leaders chose Shehu and Seshaiyer for the task force because of their longstanding work and leadership in artificial intelligence (AI), STEM, and education. Both acknowledge there is a lot to consider around AI, and they are glad Virginia is putting forth this effort.

Shehu said a broad conception of AI is necessary. “We need to think a little bit more holistically regarding the different opportunities and potential risks with regards to integrating AI in education. What kind of guidelines should we have for universities and for community colleges, not just for educators, but also for all the different organizations that they work with across Virginia?”

Seshaiyer explained that the task force has broad representation. “While we talk about promoting innovation and economic growth, there's also this whole aspect of ethical boundaries and considerations in AI development and deployment,” he said. “It requires government, industry, and academia coming together to understand AI research, workforce development, and responsible AI.”

Shehu said the task force’s work and function are evolving. “A lot of the meetings I imagine will be thought experiments and the members feeding off one another based on our expertise. Some of us may be more positioned toward exciting applications, while some will want to caution, ‘Hey, hold on a second—you can't really do acquisition like this because you have to go through so many regulatory processes.’”

Shehu said the task force members have been asked to consider and build on examples and guidelines from other states, countries, and regions so their approach stays forward-thinking. One option would be for the task force to sort AI applications into those that are clearly prohibited, those that are clearly acceptable, and those that fall into a gray area.

Seshaiyer has encouraged students to experiment with AI in assignments. Photo provided

“For instance, AI assessing your emotions and your reactions and anything like that used for surveillance—that's a ‘no’ unless in situations of national security,” she said. “Consider personalized learning settings that are popping up, where you might see clear value for an AI agent that sits and monitors you to read facial features so it can figure out if you're confused or if you're understanding. But for the task force, that’s a no, because that is surveillance. Please keep in mind we are thinking about children and students, a highly vulnerable population.”

Seshaiyer said, “For higher ed, we have K-12 on one side and the workforce on the other, so we are theoretically coming up with all sorts of academic ideas about AI and its implications. We test it on a nice cookbook problem, while the workforce side is looking at it from a business perspective, and K-12 is of course thinking of the learning implications. This is an example of the tension different stakeholders face, between impact, thoughtfulness, and speed, in understanding AI’s implications.”

Those dealing with children have multiple considerations, he added. “So, the question for the principal or the chief information technology officer of the school system is, ‘How do I navigate this space? Which software is feasible, appropriate, and at the right price point? Is it safe to use, and are there any backdoors in terms of data privacy?’ These are really practical questions for which they don't have answers, because nothing is formally written, but they also don't quite know how to evaluate AI and its implications even where answers do exist.”

The recommendations and guidance provided will affect rank-and-file faculty members but must also be understood by deans, presidents, and provosts in higher ed, both professors said.

Seshaiyer said, “This could have an impact across the curriculum. Should I change the content or the way I teach a class? Do I need to go back because maybe I was doing it all wrong, or I thought I was doing it right? So, if I were to put my money into the recommendations, it would be training, training, training.”

The task force members are currently drafting a set of emergent themes covering opportunities, issues, and the responsibilities laid out in the executive order. They have not yet been told the timeframe for their work or their expected length of service.