And the quality of the AI output sucks. I was recently looking for information about the positive sign convention for yaw, pitch, and roll in aircraft. I got reasonable results from the AI for az and yaw, but when I looked at pitch and el, all of the results were about elevator pitches. Even when I spelled out “elevation” it insisted on elevator pitches. I scroll past the AI results as a matter of principle, but I usually look at them so I have something specific to complain about when people ask why I am so virulently anti-AI.
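(For anyone landing here with the same question: the usual flight-dynamics convention is right-handed body axes with x forward, y out the right wing, z down, giving positive roll = right wing down, positive pitch = nose up, positive yaw = nose right. A minimal Python sanity check of that, assuming the standard ZYX Euler sequence and a NED world frame; the function name is just mine:)

```python
# Standard aerospace body-axis convention (right-handed, NED world frame):
#   x forward, y out the right wing, z down.
#   Positive roll  (phi)   = right wing down
#   Positive pitch (theta) = nose up
#   Positive yaw   (psi)   = nose right (clockwise seen from above)
# Body-to-NED rotation is the ZYX sequence Rz(psi) @ Ry(theta) @ Rx(phi).
import math

def body_to_ned(psi, theta, phi):
    """3x3 body-to-NED rotation matrix for ZYX Euler angles (radians)."""
    cps, sps = math.cos(psi), math.sin(psi)
    cth, sth = math.cos(theta), math.sin(theta)
    cph, sph = math.cos(phi), math.sin(phi)
    return [
        [cps * cth, cps * sth * sph - sps * cph, cps * sth * cph + sps * sph],
        [sps * cth, sps * sth * sph + cps * cph, sps * sth * cph - cps * sph],
        [-sth,      cth * sph,                   cth * cph],
    ]

# Sanity check: a positive 10-degree pitch should tilt the nose up,
# i.e. give the body x-axis a negative (upward) NED z-component.
R = body_to_ned(0.0, math.radians(10), 0.0)
nose_z = R[2][0]   # NED z-component of the body x-axis
assert nose_z < 0  # negative z is "up" in NED, so the nose points up
```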
The other day I tried to have it help me with a programming task on a personal project. I am an experienced programmer, but I only “get by” in Python (typically just by looking up the documentation for the standard library). I thought, “OK. This is it. I will ask Llama 3.3 and GPT-4 for help.”
That shit literally set me back a weekend. It gave me such bad approaches and answers, ones I could tell were bad (aforementioned experience in programming, degree in comp sci, etc.), that I got confused about writing Python. Had I just done what I usually do, which is to look up the documentation and use my brain, I would have gotten my weekend task done a whole weekend sooner.
It scares me to think what people are doing to themselves by relying on this, especially if they’re novices.
AI is useful for basic, mundane tasks, and that’s about it. Trying to force it to be some sort of über search engine is such a bad idea.
Yeah, that’s a bad example of what to use AI for, at least right now. You’re going to get bad results with that question.
It’s good for some things, if you pay.
I don’t want to ask AI. Google automatically gives me AI search results that are piss poor. Those useless results still use energy to generate.
And these results are taken at face value by a shocking number of people. I’ve gotten into niche academic arguments where someone just copy-and-pasted the AI’s completely hallucinated response as “evidence.”
I experimented with using AI to generate basic quizzes for students on concepts like atomic theory or conservation of energy, but maybe 2 out of 20 questions it came up with were at all accurate or useful. Even when it’s not making shit up entirely, the information is so shallow as to be useless.