
Does Artificial Intelligence Belong In Classrooms?


It's A Conversation That Won't Wait


The topic of Artificial Intelligence (AI) has been a controversial one for the last decade. It is no different in schools, where many are questioning whether AI should be used in classrooms. There are many people on both sides of the issue.


When I stopped teaching in 2013, each student in our school had a laptop computer. We used technology quite a bit in record keeping and in our instruction, but artificial intelligence had not really come on the scene. There were sites where students could generate citations for research papers and some sites that helped with grammar. Other than that, nothing major.


Educators were able to use some sites to detect plagiarism in student work.


On Wednesday, August 20, the Talbot County, Maryland, public schools tackled the issue of AI through a presentation by Robin Werner, the district's Director of Teaching and Learning. Talbot County is a small system on Maryland's Eastern Shore that has approximately 4500 students.


Werner's position on AI seemed similar to many other people in the education business. She stated, “It should be enhancing our instruction, but human oversight is imperative. We are not using it to replace any positions or staff members, but it is really a guiding tool that can help teachers.”


That statement echoes, almost verbatim, the statements coming from many districts regarding the use of AI.


The move to address AI usage in schools began with a Presidential Executive Order signed on April 23, 2025.



The Executive Order established the White House Task Force on Artificial Intelligence Education and focused on improving education through AI by providing resources via public-private partnerships.


The U.S. Department of Education has issued a policy and official guidance to state and local education agencies on Artificial Intelligence:



“Artificial intelligence has the potential to revolutionize education and support improved outcomes for learners,” said U.S. Secretary of Education Linda McMahon. “It drives personalized learning, sharpens critical thinking, and prepares students with problem-solving skills that are vital for tomorrow’s challenges. Today’s guidance also emphasizes the importance of parent and teacher engagement in guiding the ethical use of AI and using it as a tool to support individualized learning and advancement. By teaching about AI and foundational computer science while integrating AI technology responsibly, we can strengthen our schools and lay the foundation for a stronger, more competitive economy.”


It's a notable statement, but like most statements from school officials across the state, it doesn't address many of the concerns that parents have about AI usage in schools.

One of the main concerns is the protection of student privacy, since AI may be able to collect data and information on students who use it. On August 1, Rafael Lopez, the parent of an Escambia Schools, Florida, student, filed formal FERPA (Family Educational Rights and Privacy Act) and Title VI civil rights violation complaints against the school district with the Florida Department of Education Office of Inspector General.



“The basis of these complaints is that ECPS parents were not properly informed or asked for consent before third-party AI education platforms (including Waterford.org, Age of Learning, and others) began collecting and tracking children’s data. This includes personally identifiable information, behavioral and usage data, learning profile data and potentially biometric/indirect data,” said Lopez, whose son attends Jim Allen Elementary School.


In July, Lopez received a parent's welcome packet from his child's school. The packet included information from an AI company, the Waterford Institute (Early Learning Software for PreK-2 - Waterford).

Lopez filed a complaint with the Department of Education.


“This flyer encouraged the use of the digital learning platform for early education but failed to disclose any of the data collection practices associated with it. At no point was I provided with a FERPA-compliant parental consent form, nor was I informed about any data-sharing policies or offered the opportunity to opt out prior to enrollment," Lopez wrote.


The concern over privacy is bolstered by the fact that many of these AI platforms can track students who use them throughout the day.



Parents are also concerned about the potential influence of AI on their children. They feel that school districts are not doing enough to inform them about which Artificial Intelligence platforms are being used with their children and how any information gathered on their child is being stored.


Many parents are divided on the issue.



And, of course, there is the issue of students using AI to cheat in school, bully others, and harass teachers and students via fake videos, audio files, and pictures created by AI. School systems are currently scrambling to create policies that will address these issues, but it is hard for them to keep up with the ever-changing AI landscape. For example, Wichita Public Schools has taken a proactive stance by:


  • Creating a dedicated AI specialist role to guide implementation.

  • Training teachers to use AI as a learning companion, not a shortcut.

  • Ensuring tools comply with FERPA and COPPA, avoiding student data collection.

  • Using AI in creative ways, like character interviews in literature classes to spark discussion and critical thinking.


Their approach emphasizes responsible use and prepares students for a future where AI is ubiquitous.


Still, even in Wichita, questions remain on how districts will deal with student misuse.

Districts will have to expand their definitions of bullying and harassment, possibly to include off-campus behavior. They will have to provide supervision and reporting mechanisms and create appropriate consequences that fit into the student code of conduct in the school system.


Policies will have to be consistently updated and changed. As many of us found out with the increasing use of technology in the classroom, its changing nature can cause issues many of us have not imagined. Think about the advent of smartphones in the last twenty years and their impact on what goes on in our classrooms and hallways as students text, video, and use phones for purposes schools never dreamed of. Schools now have detailed cell phone policies, and yet they have to change them periodically.



Of course, teachers are concerned with students using AI to cheat or, even more damaging, leaning on it as a crutch that will stunt critical thinking and the development of problem-solving and communication skills. A recent study finds that 20% of students admit their AI use clearly constitutes cheating, while another 25% acknowledge operating in ethical gray areas. Only 20% report avoiding AI for schoolwork entirely.



Seventy percent of teachers also admit that they don't feel fully trained to address AI usage in their classrooms, which compounds the problem. It's hard to get students to use AI responsibly when the adults in the room don't know how to use it themselves.


There's also a cultural, human component to the problem of AI in the schools. Many of us acknowledge that technologies such as smartphones and social media have changed how we interact with other people. Many claim they have made people angrier, ruder, and more dismissive of each other.


Will AI exacerbate these problems? Only if we let it.


AI is a tool, and we need to use it as a tool. It is not a friend, a companion, or the fount of all truth. It is not a religion or belief system. Like any other technology tool, it will make some jobs easier, make some more challenging, and eradicate others. It is also a reality. It can be an intrusion into our privacy, and we have to have guardrails for its use.


In Talbot County, Maryland, the schools will be wrestling with how best to implement AI. They have initiated a study group which visited schools to talk with teachers about using AI. Werner said that the county would wait to set policy on AI until the State creates one.

Board President Emily Jackson wasn't comfortable with that approach. “If we don’t have solid policy to lean back on … I find it gets stickier and into that gray area,” she said. She asked that the study committee bring back some policy ideas or guidelines for the board to review.

Werner stated that the county will be using a rubric to assess AI usage in the classroom. The rubric is a scoring guide that defines how AI is currently being used on certain assignments in Talbot classrooms. It runs from 0 to 4: a "0" means AI is not used at all, and a "4" means it can be fully used. The district got the idea from North Carolina.


She said she hoped that teachers/administrators would use potential student misuse of AI as a teachable moment without worrying about discipline of student offenders.


Talbot, like every other county in the country, is trying to figure out how to implement Artificial Intelligence so that it fulfills its promise and helps our students learn.


It's a fine balance between doing what is needed and acting quickly to prevent problems because AI won't wait. It's already here in our classrooms and our lives.



Here is an article about Artificial Intelligence by Dr. Aaron Poynton, president of the Harford County Board of Education in Maryland and chairman of the American Society for AI.






Jan Greenhawk, Author

August 22, 2025


Jan Greenhawk is a former teacher and school administrator with over thirty years of experience. She has two grown children and lives with her husband in Maryland. She also spent over twenty-five years coaching/judging gymnastics and coaching women’s softball.


This article was originally featured on the Easton Gazette.  

 

Please consider joining the Delmarva Parent Teacher Coalition and follow us on Facebook to stay informed of what's really happening with education in our schools.

 
 
 