It’s time to establish boundaries for AI use
Students must understand how to use artificial intelligence within education
January 20, 2023
As the Midway sees it…
Everyone’s seen the dystopian movies — the ones with robot takeovers, machines gone rogue and the human subjugation that follows. They succeed in stirring great fear, or at least caution, in viewers. One thing is for certain: movement toward that bleak future is getting increasingly difficult to avoid.
Since their creation, Google and library search engines have provided a powerful resource for students to guide and enhance their work. They’ll continue to do so, especially as the programs behind such sources learn, improve and grow more “human.” The proliferation of AI is somewhat inevitable — according to GlobeNewswire, the global artificial intelligence market size is projected to more than triple to almost $1.4 trillion by 2029.
The questions are whether those in academia will rise to this evolving challenge to human ingenuity, and how they’ll walk the fine line between convenience and contamination. As society looks for faster and more efficient means of progress, a choice between speed and integrity is coming — and soon.
As artificial intelligence continues to develop, evolve and popularize, its existence must be addressed, and distinct boundaries must be established in academia to prevent confusion and crossover into academic dishonesty.
Deliberating the use of AI in an academic context is essential to even begin to define any boundaries, especially as they will differ by department or course.
According to the Student and Family Handbook: “Each teacher may have course-specific expectations regarding academic integrity. Students are responsible for knowing and understanding their teachers’ expectations about shared work, use of outside help, assessment and assignment protocols, source citation, and plagiarism.”
If a student is responsible for understanding the expected use of AI writing software, the boundaries should be made consistent or clearly established across all courses. Teachers should bear the responsibility, at least initially, of clearly defining how AI can be used in their classrooms, especially when a breach of academic honesty, according to the student handbook, can be as dire as a failing grade.
The handbook defines plagiarism as “words, ideas, opinions, compositions, or images deriving from or belonging to another person or source.” By this definition, using AI writing software to generate an entire paper is inherently plagiarism, but there are still a multitude of circumstances where boundaries are unclear. For example, without restrictions on AI use, a student may use AI writing software to simplify a complicated concept when a teacher would prefer the student to rely on a more conventionally accepted source or their own understanding. Whether the student should be held liable for academic dishonesty could become ambiguous because not only did the teacher not clearly establish expectations for the course, but no universal “course-specific expectations regarding academic integrity” exist. Teachers must establish these boundaries.
Outside academic settings, it is up to the student or individual to set their own boundaries and to control their reliance on AI. Long-term use of AI could create overdependence on the software, disrupt the learning process and, beyond the classroom, diminish human authenticity.
Another concern with AI is misinformation and disinformation. Users should regard AI-generated text with the same caution they would apply to any other source, prioritizing trustworthiness and credibility.
AI writing software may eventually perfect its capacity to incorporate human idiosyncrasies and nuances into its output, and lower quality may no longer be an adequate argument against using AI. We must begin to consider what it means to preserve the integrity of learning and growth. But more immediately, teachers need to clearly establish boundaries in the classroom instead of ignoring or dismissing an inevitably developing tool.
This reflects the opinion of the U-High Midway editorial board.