College professors are going back to paper exams and handwritten essays to fight students using ChatGPT

The growing number of students using the AI program ChatGPT as a shortcut in their coursework has led some college professors to reconsider their lesson plans for the upcoming fall semester.

  • Mtrad@lemm.ee · 31 points · 1 year ago

    Wouldn’t it make more sense to find ways to utilize AI as a tool, and to set up criteria that incorporate its use?

    There could still be classes / lectures that cover the more classical methods, but I remember being told “you won’t have a calculator in your pocket”.

    My point is, they should be prepping students for the skills to succeed with the tools they will have available, and then give them the education to cover the gaps that AI can’t solve. For example, you basically need to review whatever the AI outputs for accuracy. So maybe a focus on reviewing output and better prompting techniques? Training on how to spot inaccuracies? Spotting possible bias in a system that’s skewed by its training data?

    • Atomic@sh.itjust.works · 15 points · 1 year ago

      That’s just what we tell kids so they’ll learn to do basic math on their own. Otherwise you’ll end up with people who can’t even do 13+24 without having to use a calculator.

      • Overzeetop@lemmy.world · 7 points · 1 year ago

        people who can’t even do 13+24 without having to use a calculator

        More importantly, you end up with people who don’t recognize that 13+24=87 is incorrect. Math->calculator is not about knowing the math, per se, but knowing enough to recognize when it’s wrong.

        I don’t envy professors/teachers who are having to figure out novel ways of determining the level of mastery of a class of 30, 40, or 100 students in the era of online assistance. Because, really, we still need people who can turn out top-level, accurate, well-researched documentation. If we lose them, who will we train the next-gen LLMs on? ;-)

      • Arthur_Leywin@lemmy.world · 5 points · 1 year ago

        When will people need to do basic arithmetic in their head? The difficulty jumps dramatically between 13+24 and 169+742. Yeah, it makes your life convenient if you can add simple numbers, but is it necessary when everyone has a calculator?

        • Atomic@sh.itjust.works · 3 points · 1 year ago

          Like someone said. It’s not just about knowing what something is, but having the ability to recognize what something isn’t.

          The ability to look at a result and be skeptical if it doesn’t look reasonable.

          Take 169+742. Just by looking I can tell it has to be pretty close to 900, because 160+740 is 900. That gives me a good estimate to go by. So when I arrive at 911, I can look at it and say: yeah, that’s probably correct, it looks reasonable.
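That estimation trick can be sketched in code (a minimal illustration, rounding to the nearest ten rather than truncating as above; the function names are made up for this sketch):

```python
def estimate_sum(a: int, b: int) -> int:
    # Quick mental estimate: round each addend to the nearest ten.
    return round(a, -1) + round(b, -1)

def looks_reasonable(result: int, a: int, b: int, tolerance: int = 20) -> bool:
    # A sum is plausible if it lands close to the rounded estimate.
    return abs(result - estimate_sum(a, b)) <= tolerance

print(estimate_sum(169, 742))           # 910: close to 911, so 911 looks right
print(looks_reasonable(911, 169, 742))  # True
print(looks_reasonable(87, 13, 24))     # False: the estimate is 30, nowhere near 87
```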

        • Mtrad@lemm.ee · 1 point · 1 year ago

          That sounds like it could be a focused lesson. Why skirt around what the desired goal is?

          That could also be applied to detecting when something is wrong with AI output. Teach people the things that help them spot these errors.

          In my experience, it’s so much more effective to learn how to find the answers and spot the issues than to memorize how to do everything. There’s too much now to know it all yourself.

    • Revv@lemmy.blahaj.zone · 13 points · 1 year ago

      Training in how to use “AI” (LLMs demonstrably possess zero actual reasoning ability) feels like it should be a separate pursuit from (or subset of) general education to me. In order to effectively use “AI”, you need to be able to evaluate its output and reason for yourself whether it makes any sense or simply bears a statistical resemblance to human language. Doing that requires solid critical reasoning skills, which you can only develop by engaging personally with countless unique problems over the course of years and working them out for yourself. Even prior to the rise of ChatGPT and its ilk, there was emerging research showing diminishing reasoning skills in children.

      Without some means of forcing students to engage cognitively, there’s little point in education. Pen and paper seems like a pretty cheap way to get that done.

      I’m all for tech and using the tools available, but without a solid educational foundation (formal or not), I fear we end up a society of snake-oil users in search of the blinker fluid.

    • settxy@lemmy.world · 12 points · 1 year ago

      There are some universities looking at AI from this perspective, finding ways to teach proper usage of AI, then building testing methods around the knowledge that students are using it.

      Your point on checking for accuracy is on point. AI doesn’t always puke out good information, and ensuring students don’t just blindly believe it NEEDS to be taught. Otherwise you end up like these guys: https://apnews.com/article/artificial-intelligence-chatgpt-courts-e15023d7e6fdf4f099aa122437dbb59b

    • Snekeyes@lemmy.world · 6 points · 1 year ago

      It’s like the calculator in the 80s and 90s. Teachers would constantly tell us, “no job’s just gonna let you use a calculator, they’re paying you to work”…

      I graduated, and really thought companies were gonna make me do stuff by hand, ’cause calculators made it easy. Lol.