[Tutor] louis renton

Oscar Benjamin oscar.j.benjamin at gmail.com
Sun Jan 22 20:35:31 EST 2023


On Mon, 23 Jan 2023 at 01:03, Mats Wichmann <mats at wichmann.us> wrote:
>
> On 1/22/23 17:50, Oscar Benjamin wrote:
> > On Mon, 23 Jan 2023 at 00:35, Mats Wichmann <mats at wichmann.us> wrote:
>
> > I think StackOverflow's reasons were in part very particular to their
> > situation. From what I understand, some people had promoted the idea
> > that anyone could use ChatGPT to quickly gain "reputation". Then a lot
> > of people blindly posted outputs from ChatGPT that were not helpful,
> > so SO banned it. I've seen something similar here:
> > https://github.com/sympy/sympy/issues/24524#issuecomment-1383545797
> > That usage of ChatGPT is bad but isolated (I haven't seen any other
> > occurrence), so it is not massively problematic in the way that I
> > think it was for SO.
>
> I think the situation is a little related to what was claimed earlier in
> this thread (about "googling for stuff"), which is the only reason I
> brought it up: the answers may well be right, even of high quality.
> It's even possible that that's the case in a large majority of cases.
> But if you don't have the background to make that determination, how can
> you trust it? There have been lots of examples, as people play around
> with the AI, of the system being essentially "faked out" and getting
> something pretty dramatically wrong.

To be clear, in the context of the GitHub issue I linked above, the
answer was pure gibberish. I don't know what input was given to the AI,
but the output was worthless or completely irrelevant (I honestly
don't understand what it is referring to). I also don't understand the
thought process of the person submitting it, who later admitted their
own lack of knowledge.

The danger (and this was sort of SO's concern) is that actual human
communication could become overwhelmed by AI-generated output like this.

The flip side is that ChatGPT can be useful for various things. I've
tested it, and it can work okay and in some ways be more useful than a
basic web search, but when it's wrong it will confidently assert the
wrong answer with no caveats or citations. That seems like a very
dangerous thing to put in the hands of people who don't know what
they're doing.

--
Oscar

