OpenAI has released a teacher's guide to ChatGPT, but some educators remain skeptical.

For example, OpenAI envisions teachers using its AI to draft lesson plans and interactive tutorials for students. But other educators are wary of the technology — and its potential to go awry.

OpenAI has released a new free online course designed to help K-12 teachers bring the company's AI chatbot platform, ChatGPT, into their classrooms. The one-hour, nine-module program was created in collaboration with the nonprofit Common Sense Media, with which OpenAI has an active partnership, and covers the basics of AI and its pedagogical applications.

OpenAI reports that the course has already been used in "dozens" of schools, including the Agua Fria School District in Arizona, the San Bernardino School District in California, and the charter school system Challenger Schools. According to the company's internal research, 98% of participants said the program gave them new ideas or strategies they could apply to their work.

"Schools across the country are tackling new opportunities and challenges as AI reshapes education," Robbie Torney, senior director of AI programs at Common Sense Media, said in a statement. "With this course, we're being proactive: supporting and educating teachers at the heart of it all and building for this future."

But not everyone is convinced. Some educators believe the course isn't helpful — and could even be misleading.

Lance Warwick, a sports lecturer at the University of Illinois Urbana-Champaign, is concerned resources like OpenAI’s will normalize AI use among educators unaware of the tech’s ethical implications. While OpenAI’s course covers some of ChatGPT’s limitations, like that it can’t fairly grade students’ work, Warwick found the modules on privacy and safety to be “very limited” — and contradictory.

"In the example prompts [OpenAI gives], one tells you to incorporate grades and feedback from past assignments, while another tells you to create a prompt for an activity to teach the Mexican Revolution," Warwick noted. "In the next module on safety, it tells you to never input student data, and then talks about the bias inherent in generative AI and the issues with accuracy. I'm not sure those are compatible with the use cases."

Sin á Tres Souhaits, an artist and instructor at The University of Arizona, calls AI tools helpful for guides and other adjunct course material, but said he is worried about the lack of transparency around how OpenAI might seek to exert control over content teachers create through its services.

"If educators are making courses and coursework based on a program that grants the company the power to recreate and sell that data, that would destabilize a lot," Tres Souhaits told TechCrunch. "It's unclear to me how OpenAI will use, package, or sell whatever is generated by their models."lo

In its ToS, OpenAI says that it "does not sell user data," and that users of its services, including ChatGPT, "own the outputs they generate 'to the extent permitted by applicable law.'" Absent stronger assurances, however, Tres Souhaits isn't satisfied that OpenAI won't quietly alter its policies down the line.

"For me, AI is like crypto," said Tres Souhaits. "It's new, so it offers a lot of possibility — but it's also so deregulated that I wonder how much I would trust any guarantee."

Late last year, UNESCO called on governments to consider overseeing AI usage in education, including regulatory boundaries around who is allowed to use AI tools and guardrails around data protection and user privacy. But on that front, and on AI policy more broadly, little progress has been made since then.

Tres Souhaits also complains that the program, which OpenAI touts as a roadmap to "AI, generative AI, and ChatGPT," doesn't mention any tools besides OpenAI's own. "It feels like this reinforces the idea that OpenAI is the AI company," he said. "It's a smart idea for OpenAI as a business. But we already have a problem with these tech-opolies — companies that have an outsize influence because, as the tech was developed, they put themselves at the center of innovation and made themselves synonymous with the thing itself."

Josh Prieur, a classroom teacher-turned-product director at educational games company Prodigy Education, had a more upbeat take on OpenAI’s educator outreach. Prieur argues that there are “clear upsides” for teachers if school systems adopt AI in a “thoughtful” and “responsible” way, and he believes that OpenAI’s program is transparent about the risks.

"There are still concerns from teachers over using AI to plagiarize content and dehumanize the learning experience, and also risks around being overly reliant on AI," said Prieur. "However, education is often key to overcoming fears around the adoption of new technology in schools, while also ensuring the right safeguards are in place to ensure students are protected and teachers remain in full control."

OpenAI is aggressively going after the education market, which it sees as a key area of growth.

In September, OpenAI hired former Coursera chief revenue officer Leah Belsky as its first GM of education, and charged her with bringing OpenAI's products to more schools. And in the spring, the company launched ChatGPT Edu, a version of ChatGPT built for universities.

The AI in education market could be worth $88.2 billion within the next decade, according to Allied Market Research. Growth, however, is off to a sluggish start, in large part thanks to skeptical pedagogues.

In a Pew Research Center survey conducted this year, a quarter of public K-12 teachers said that using AI tools in education does more harm than good. A separate poll by the Rand Corporation and the Center on Reinventing Public Education found that just 18% of K-12 educators are using AI in their classrooms.

Educational leaders have been similarly reluctant to try AI themselves, or introduce the technology to the educators they oversee. According to educational consulting firm EAB, few district superintendents view addressing AI as a "very urgent" need this year — especially in light of other issues that parents and elected officials are more apt to fuss about, like understaffing and chronic absenteeism.

Mixed research about the educational impact of AI hasn't done much to win over skeptics. University of Pennsylvania researchers found that Turkish high school students who used ChatGPT did worse on a math test than peers who didn't use it. Another study concluded that German students who used ChatGPT were able to locate materials more easily, but tended to synthesize those materials less skillfully than peers who didn't use the chatbot.

As OpenAI writes in its guide, ChatGPT isn't a replacement for interaction with students. Some instructors and schools may never be convinced it belongs in any part of the teaching process.

Blog | 2024-11-20 19:57:10