
Friday, November 01, 2024

3 Critical Problems Gen AI Poses for Learning

 


The Limits of AI “Educators” in Higher Education
August 6, 2024

Since the widely acclaimed release of ChatGPT 4, generative AI has been touted by many as the savior of education. Case in point: British education expert Sir Anthony Seldon has predicted that by 2027, AI will replace human teachers on a global scale.

Unfortunately, more than 40 years of academic research exploring human cognition suggests that generative AI could also harm learning at all levels, from online tutoring to employee training, for three reasons.

Problem one: Empathy

Intellectual heavyweights from Bill Gates to Sal Khan have argued that the personalized tutoring enabled by ChatGPT and other generative AI tools based on large language models will close achievement gaps across education. However, individualized instruction is not the most important driver of learning. After analyzing data from thousands of studies, educational researcher John Hattie recently reported that a strongly empathetic learner-teacher relationship has two and a half times greater impact on learning than personalization.

“Using AI to help learners avoid the tedious process of memorizing facts is the best way to ensure higher-order thinking skills will never emerge.”

The hormone oxytocin is the foundation of empathy. When two individuals connect and release oxytocin simultaneously, their brain activity begins to synchronize—a process known as “neuronal coupling” that leads them to not only learn from one another but to quite literally think alike. Given that algorithms have neither a brain nor oxytocin, it is biologically impossible for humans and AI to develop an empathetic relationship: the transpersonal nature of empathy precludes its emergence in the digital realm.

This is one major reason why students operating in purely digital environments perform worse and are significantly less likely to graduate than comparable students engaged in face-to-face instruction. Without empathy, students become passive receivers of information with little impetus to push through the requisite struggles inherent in the learning process.

Even among highly skilled human educators, failure to cultivate an empathetic relationship inevitably hinders learning. And this only serves as a further warning against AI, as it reveals that neither knowledge nor pedagogy (presumably the forte of digital tutors) is sufficient for effective teaching.

Problem two: Knowledge

University College London Professor Rose Luckin recently argued that, since ChatGPT can access and organize all the world’s knowledge, learners need no longer waste time learning “facts.” Instead, they can focus on higher-order thinking skills like creative and critical thinking.

Unfortunately, much of what we term “creative” and “critical” thinking occurs via subconscious processes that rely on internalized knowledge. When we consciously think about a problem, we can only actively consider a small, finite amount of information due to the cognitive limits of working memory.

However, once we stop consciously thinking about a problem, we enter into an incubation period whereby our brains subconsciously sort through our memory stores by seeking out relevant ideas. It’s during this sorting process (known as reconsolidation) that novel connections are made and better thinking emerges.

“Even among highly skilled human educators, failure to cultivate an empathetic relationship inevitably hinders learning.”

Here’s the problem: Subconscious reconsolidation only works with information that is stored within a person’s long-term memory, which means it cannot leverage information that is externally accessed or stored. This explains why experts almost always demonstrate stronger problem-solving skills than novices within their field of expertise, but rarely outside of it. This also explains why semantic dementia (whereby patients lose long-term memories but maintain cognitive faculties) impairs creativity nearly twice as much as frontotemporal dementia (whereby patients lose cognitive faculties but maintain long-term memory stores).

Simply put, using AI to help learners avoid the tedious process of memorizing facts is the best way to ensure higher-order thinking skills will never emerge.

But, you may be asking, what about learners who use AI merely to assist with fact memorization? Well, consider that textbooks have historically been written by experts—people with enough deep knowledge to aptly vet and organize information into a meaningfully structured curriculum. Large language models (at least in their current form) have neither oversight nor vetting. This means learners who use AI are very likely to encounter wrong, oddly sequenced, or irrelevant information which—if memorized—might very well derail their path to mastery.

Of course, AI models will improve and information will surely increase in accuracy. Unfortunately, this won’t address the issue of vetting. Just as with Wikipedia today, users will only ever be able to work up to their current level of knowledge: Anything beyond that must be taken on faith. When learning relies on faith, it’s imperative that faith is placed where the likelihood of success is highest; this is why having the assurance that an expert has evaluated and organized key information remains invaluable.

Problem three: Multitasking

It has long been known that multitasking harms accuracy, speed, memory formation, and even enjoyment. In fact, I have no qualms calling this the single worst thing human beings can do for learning.

A pre-COVID survey revealed that students across the United States spent nearly 200 hours annually using digital devices for learning purposes. However, they spent 10 times as long—more than 2,000 hours—using these same devices to rapidly jump between divergent media content. Other studies have shown that, when people use a computer for self-guided learning, they typically last fewer than six minutes before engaging with digital distractions and, when using a laptop in the classroom, students typically spend 38 minutes of every hour off-task. In other words, the digital devices learners use to access and engage with ChatGPT have become veritable multitasking machines.

It’s not that computers can’t be used for learning; it’s that they so often aren’t used for learning that whenever we attempt to shoehorn this function in, we place a very large (and unnecessary) obstacle between the learner and the desired outcome—one many struggle to overcome.

What does work?

There is one area of learning where generative AI may prove beneficial: cognitive offloading. This is a process whereby people employ an external tool to manage “grunt work” that would otherwise sap cognitive energy.

However, as noted above, when novices try to offload memorization and organization, learning is impaired, the emergence of higher-order thinking skills is stifled, and, lacking deep knowledge and skill, they're unable to adequately vet the outputs.

“When we regularly offload certain tasks, our related skills and mental faculties can atrophy, making external support a requirement in the future.”

Experienced learners or experts can benefit from cognitive offloading. Imagine a mathematician using a calculator to avoid arithmetic, an event planner using a digital calendar to organize a busy conference schedule, or a lawyer using a digital index to alphabetize case files. In each of these scenarios, the individual has the requisite knowledge and skill to ensure the output meaningfully matches the desired outcome.

But there is still the risk of digital reliance. When we regularly offload certain tasks, our related skills and mental faculties can atrophy, making external support a requirement in the future. For instance, I’ve used digital programs to run statistical analyses for over a decade. Although I have the relevant knowledge to vet the output, I can no longer remember the specific equations each statistical test employs. Accordingly, unless I return to my textbooks, I’m now reliant upon these programs.

Consider the costs

Whenever we employ digital tools to amplify, hasten, or circumvent aspects of a particular process, something is inevitably lost along the way. Or, in the words of Thomas Sowell, “There are no solutions, only trade-offs.”

Sometimes this trade-off is worthwhile—such as discarding complex equations to run statistical analyses in seconds rather than hours. However, when we use AI to supplement education, what is lost is the very essence of the endeavor itself: learning.
Whenever the primary reason for using a tool is negated by its own adoption, we are well justified in questioning its continued use.



Monday, October 21, 2024

Tool for Creating the APA Version 7 Reference list

Key in the following into the command prompt, all on one line (include the - symbol; the -o switch tells curl where to save the style file):

curl https://raw.githubusercontent.com/briankavanaugh/APA-7th-Edition/main/APASeventhEdition.xsl -o "%appdata%\Microsoft\Bibliography\Style\APASeventhEdition.xsl"

To access the command prompt, type "command prompt" into the Start menu search on Windows.

Information from:  https://raw.githubusercontent.com/briankavanaugh/APA-7th-Edition/main/APASeventhEdition.xsl
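If the Bibliography Style folder has never been used on your machine, it may not exist yet. Below is a minimal sketch that creates the folder first and then runs the same download; it assumes curl is available (it ships with Windows 10 version 1803 and later) and that the style belongs in the per-user %appdata% location shown above:

rem Create the Style folder if it is missing (the "already exists" message can be ignored),
rem then download the APA 7th Edition style sheet into it.
mkdir "%appdata%\Microsoft\Bibliography\Style"
curl https://raw.githubusercontent.com/briankavanaugh/APA-7th-Edition/main/APASeventhEdition.xsl -o "%appdata%\Microsoft\Bibliography\Style\APASeventhEdition.xsl"

After restarting Word, the APA Seventh Edition style should appear in the Style dropdown under References > Citations & Bibliography.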

Monday, October 07, 2024

An Inspiring Quote for Engineering Students

“Precision and focus are the twin pillars of engineering excellence. In every problem lies an opportunity to innovate, and it is through meticulous attention to detail and unwavering concentration that we transform challenges into groundbreaking solutions.”

Copilot


Friday, April 05, 2024

UCS1001 CVE Tri2 2023-24 S16 Talent Titans



Team QueueSync

Team Rain4Use

Team SGSecure

Guardian Eyes

Team Phase Changers

Monday, January 15, 2024

The Uses of gAI for University Students, per poe.com

I can provide you with a description of the various ways that AI can be used by university students.  Here are some examples:

  1. Research Assistance: Students can use AI to assist in their research by automating tasks such as literature reviews, data analysis, and information gathering.

  2. Writing Support: AI tools can help students improve their writing skills by offering grammar and style suggestions, providing feedback on essays and papers, and assisting with citation formatting.

  3. Study Aid: AI-powered educational platforms can offer personalized study plans, adaptive learning materials, and interactive quizzes to help students enhance their understanding of various subjects.

  4. Language Learning: AI language models can be used to practice and improve language skills through conversation simulations, vocabulary building exercises, and grammar correction.

  5. Virtual Tutoring: AI-powered tutoring systems can provide one-on-one assistance to students, offering explanations, answering questions, and guiding them through difficult concepts.

  6. Time Management: AI-based apps and tools can help students manage their schedules, set goals, prioritize tasks, and track productivity to make the most of their time.

  7. Virtual Laboratories: AI simulations and virtual reality can provide students with immersive laboratory experiences, allowing them to conduct experiments and explore scientific concepts in a digital environment.

  8. Plagiarism Detection: AI plagiarism detection tools can help students ensure the originality of their work by comparing it to a vast database of sources and identifying any potential instances of plagiarism.

  9. Personalized Education: AI algorithms can analyze student data and provide personalized learning paths, recommendations for additional resources, and adaptive assessments tailored to individual needs.

  10. Career Guidance: AI career platforms can assist students in exploring career options, providing insights into job market trends, suggesting relevant skills to develop, and offering guidance on career paths.


 

from GPT for Teachers (a Facebook group)

Friday, December 08, 2023

The Paul-Elder Framework for Critical Thinking


 

Key Weblinks for Blackstone's UCS1001 CVE S16 for Tri2 2023-24

                                                                       

Symbaloo Page (This requires you to sign up for a free Symbaloo account.)


https://www.symbaloo.com/mix/es1102ontheweb

CVE S16 Zoom: 

Topic: Blackstone 1 (S16 CVE)
Time: This is a recurring meeting. Meet anytime

Join Zoom Meeting

Meeting ID: 940 0893 3569
Passcode: 239538




Key Weblinks for Blackstone's UCS1001 MDME S20 & S21 for Tri2 2023-24

 




Symbaloo Page (This requires you to sign up for a free Symbaloo account.)



Google Docs Folder

https://drive.google.com/drive/folders/1uVlB4zbNUTfn5fElO2mzyDITwjRMXkZS


 MDME S20 Zoom: 

Topic: Blackstone 2 (S20) MDME
Time: This is a recurring meeting. Meet anytime

Join Zoom Meeting

Meeting ID: 962 0783 5746
Passcode: 695425

MDME S21 Zoom:

Topic: Blackstone 3 (S21) MDME
Time: This is a recurring meeting. Meet anytime

Join Zoom Meeting

Meeting ID: 999 3027 3306
Passcode: 139919








Friday, November 24, 2023

UCS1001 MEC Tri1 2023-24 S21 Team Photos




The Think Tank Titans

The Pancake Masters

Team Smart Garbager

Team 4ME (minus 1)

Cart Genius Crew (minus 1)

UCS1001 MEC Tri1 2023-24 S17 Team Photos

Team Plant Bonne

Team SmartWatch

Team TDU

Team Rescue

Team Fridge