• Phanatik@kbin.social · 11 months ago

    There’s a difference between using ChatGPT to help you write a paper and having ChatGPT write the paper for you. The latter constitutes plagiarism, which schools and universities strongly prohibit.

    The problem is differentiating between a paper written by a human (with or without ChatGPT’s assistance) and a paper written entirely by ChatGPT and presented as the student’s own work.

    I want to strongly stress that the latter situation is plagiarism, and that’s without even getting into the plagiarism ChatGPT itself commits. The definition is simple: ChatGPT wrote the paper, you the student did not, and you are presenting ChatGPT’s paper as your own; ergo, plagiarism.

    • LukeMedia@lemmy.world · 10 months ago

      I’m entirely with you on this, but schools are notorious for banning tools outright instead of teaching how to use them properly. This is just going to be another tool added to that pile.

      When you never teach students how to use tools properly, the most frequent use you see will be improper use or cheating, which only makes schools feel more justified in banning them.

      I personally think it would be far more beneficial not to ban tools but to make them part of the education itself. Tools of many kinds will be present throughout life, and while it’s important to understand the core material without assistive tools, it’s just as important to know how to use them.

      But I don’t have any evidence for this, it’s just an opinion formed by what I’ve seen, experienced, and heard corroborated by others.

      • olmec@lemm.ee · 10 months ago

        Correct me if I am wrong about current teaching methods, but I feel like what you’ve outlined is how school is already taught. Calculators were “banned” until about 6th grade because we were learning the rules of math. Sure, we could give calculators to 3rd graders, but they would learn that 2 + 2 = 4 because the calculator said so, not because they worked it out. Calculators were allowed once you got into geometry and algebra, where the actual calculation is merely a mechanism for the logical reasoning you are learning. Computing 5/7 is trivial compared to finding the value of X that makes Y = 0.

        I am not close to the education sector, but I imagine LLMs are going to be used similarly; we just don’t have the best approach laid out yet. I can totally see a scenario where, in 2030, students have to write and edit their own papers until they reach grade 6 or so. Then, rather than writing a paper that tests all of your language arts skills, you proofread three different papers written by an LLM, with a hyper-focus on one skill set: one week it may be active vs. passive voice, another week using gerunds correctly. Just like with math and the calculator, you move beyond learning the mechanics of reading and writing and focus on composing thoughts in a clear manner. This doesn’t seem like a reach; we just don’t have curriculum ready to take advantage of it yet.