• 0 Posts
  • 30 Comments
Joined 1 year ago
Cake day: July 16th, 2023

  • If they’re really just after data at all costs, they could easily spin up an instance with no apparent link to Threads and federate secretly. I agree with the other arguments about not federating with them, but all these data-privacy arguments against federating with Threads are so dumb. If they want the data, they’ll get it, because getting it is absurdly easy.




  • Where it gets really challenging is that LLMs can take the assignment input and generate an answer that is actually more educational for the student than what they learned in class.

    That’s if the LLM is right. If you don’t know the material, you have no idea whether what it’s spitting out is correct. That’s especially dangerous once you reach the undergrad level and start learning more specialized subjects. Also, how can reading a paper be more informative than doing the research and reading the relevant sources? The paper is just a summary of the research.

    and get a level of engagement equal to a private tutor for every student.

    Eh. Even assuming it’s always 100% correct, there’s so much more value in talking to a knowledgeable human being about the subject. There’s so much more nuance to in-person conversation than to speaking with an AI.

    Look, again, I do think that LLMs can be great resources and should be taken advantage of. Where we disagree is that I think the point of the assignment is to gain the skills to do research, analysis, and generally think critically about the material. You seem to think that the goal is to hand something in.


  • It can definitely be a good tool for studying or for organizing your thoughts, but it’s also easily abused. School is there to teach you how to take in and analyze information, and chat AIs can basically do that for you (whether or not their analysis is correct is another story). I’ve heard a lot of people compare it to the advent of the calculator, but I think that’s wrong. A calculator spits out an objective truth and will always give the same answer. ChatGPT can take your input and add analysis and context in a way that circumvents the point of the assignment, which is to figure out what you personally learned.








  • Neve8028@lemm.ee to Technology@lemmy.world · *Permanently Deleted*

    Because some people are good at what they do. In online communities you often have people talking out of their ass, so it’s interesting to hear what actual experts on certain topics have to say. Not to mention following friends and family. Not sure what’s hard to understand there.