P. Hockicko
Artificial intelligence (AI) offers many possibilities today. When solving problems, students often turn to AI, ask a question, and read the answer, accepting the result of the task as correct. But do students stop to consider whether the answer actually is correct? From my experience as a physics teacher, I can confirm that AI solutions to physics problems are fast, practically instantaneous, but not always correct.
The following article analyses the answers of two artificial intelligences to a question posed to them by students in a physics class: “What is the average power of a cat that jumps 1 meter high? (We assume that the cat has a mass of 4 kg.)”
The two artificial intelligences provided different answers, indicating that not all of the AI responses could be accurate. Subsequently, a video analysis of a jumping cat was performed with the students, which confirmed the correctness of one AI's solution and refuted that of the other.
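For context, a hedged sketch of the underlying calculation (not a reproduction of the AIs' actual outputs, which are analysed in the article): the mechanical work needed to raise the cat's centre of mass by h = 1 m is

$$W = mgh \approx 4\,\text{kg} \times 9.81\,\text{m\,s}^{-2} \times 1\,\text{m} \approx 39.2\,\text{J}, \qquad \bar{P} = \frac{W}{t},$$

and the average power depends on which time interval t is used. Taking the rise time of free flight, $t = \sqrt{2h/g} \approx 0.45\,\text{s}$, gives $\bar{P} \approx 87\,\text{W}$, whereas taking the much shorter push-off duration measured from video gives a considerably larger value; the choice of t is one plausible source of differing AI answers.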
Video analysis in Tracker is an interactive method that, in terms of Bloom's taxonomy, allows tasks to be examined at the "analysis" level and then, through mathematical analysis, leads to the desired physical parameters. During the physics class, students worked with the Tracker program, determined the required physical parameters, and answered the given question.
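As an illustration of how such a measurement could feed the calculation, the following is a minimal sketch (not the procedure actually used in class) that processes hypothetical time and vertical-position data exported from Tracker; the file name, column layout, and take-off criterion are assumptions.

```python
import numpy as np

# Constants from the task statement.
m = 4.0   # cat mass, kg
g = 9.81  # gravitational acceleration, m/s^2
h = 1.0   # jump height, m

# Hypothetical export from Tracker (Tracker can export tracked data as
# delimited text); columns assumed to be time t and vertical position y.
data = np.loadtxt("cat_jump.csv", delimiter=",", skiprows=1)
t, y = data[:, 0], data[:, 1]

# Vertical velocity by finite differences.
v = np.gradient(y, t)

# Approximate take-off as the frame of maximum upward velocity (after take-off
# only gravity acts, so the upward velocity cannot increase further). The clip
# is assumed to start at the beginning of the push-off.
i_takeoff = int(np.argmax(v))
t_push = t[i_takeoff] - t[0]

# Average power during the push-off, from the potential energy gained at the
# top of the jump (air resistance neglected).
W = m * g * h
P_avg = W / t_push
print(f"push-off time ≈ {t_push:.2f} s, average power ≈ {P_avg:.0f} W")
```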
This article points out that not everything AI tells us is correct. Especially when solving physics and mathematics problems, it is necessary to check and analyse the solution. In case of doubt, the task posed to the AI can be modified and the problem defined more precisely or in more detail; however, a final check is always necessary. Ultimately, it is up to the student to decide whether to accept the AI's solution or to propose an alternative solution of their own.
Keywords: AI, Tracker, STEM education, physics teaching.