• شاهد على إبادة@lemm.ee
    2 days ago

    When I used to do image and signal processing for embedded systems in C and C++, AI was useless. Now that I do backend web development in Python and Ruby, AI is better than me. It really depends on the problem area and how much sample code and how many answers are out there for it to steal from.

    • spooky2092@lemmy.blahaj.zone
      2 days ago

      I do a ton of PowerShell scripting, and AI is either a half-competent programmer or someone who let grandpa respond with syntax from nineteen dickety two.

    • Lucy :3@feddit.org
      2 days ago

      And as soon as you enter corporate work, LLMs are useless again, because most things are integrated into existing ecosystems the LLMs don't know, and/or the libraries are used only in closed-source code.

    • gwilikers@lemmy.ml
      2 days ago

      Really? I had an app that would auto-generate time sheets for work in Google Sheets. I decided to minimise API calls by doing a single call to Google Drive, then parsing the HTML and reuploading. Not a big Python project, but ChatGPT hit a wall pretty fast on that one. Though, to be fair, the documentation was surprisingly opaque, so I suppose that goes back to your point.

      That project also produced my finest pile of spaghetti code since I had to account for stretched cells in the HTML parsing. I still have a piece of paper with my innumerate math scribbles. The paper makes sense to me. The code does not.
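      For what it's worth, the stretched-cell bookkeeping can be kept out of spaghetti territory by expanding colspan/rowspan into a plain 2D grid while parsing. This is a minimal stdlib-only sketch, not the commenter's actual code, and it assumes the exported HTML is a simple well-formed table:

      ```python
      # Hypothetical sketch: flatten an HTML table with "stretched" cells
      # (colspan/rowspan) into a rectangular grid, so every logical cell
      # lands at its real row/column index. Stdlib only.
      from html.parser import HTMLParser

      class GridParser(HTMLParser):
          def __init__(self):
              super().__init__()
              self.grid = []       # finished rows
              self.row = None      # row under construction
              self.pending = {}    # col index -> (rows_left, text) from rowspans
              self.in_cell = False
              self.text = ""
              self.colspan = 1
              self.rowspan = 1

          def _carry(self):
              # Fill columns claimed by rowspan cells from earlier rows.
              while len(self.row) in self.pending:
                  col = len(self.row)
                  left, txt = self.pending[col]
                  self.row.append(txt)
                  if left > 1:
                      self.pending[col] = (left - 1, txt)
                  else:
                      del self.pending[col]

          def handle_starttag(self, tag, attrs):
              a = dict(attrs)
              if tag == "tr":
                  self.row = []
              elif tag in ("td", "th"):
                  self._carry()  # skip columns occupied by rowspans
                  self.in_cell = True
                  self.text = ""
                  self.colspan = int(a.get("colspan", 1))
                  self.rowspan = int(a.get("rowspan", 1))

          def handle_data(self, data):
              if self.in_cell:
                  self.text += data

          def handle_endtag(self, tag):
              if tag in ("td", "th"):
                  self.in_cell = False
                  txt = self.text.strip()
                  start = len(self.row)
                  # A colspan-stretched cell repeats across its columns.
                  for i in range(self.colspan):
                      self.row.append(txt)
                      if self.rowspan > 1:
                          self.pending[start + i] = (self.rowspan - 1, txt)
              elif tag == "tr":
                  self._carry()  # rowspans hanging past the last cell
                  self.grid.append(self.row)
                  self.row = None

      def table_to_grid(html: str):
          """Return the table as a list of rows with spans expanded."""
          p = GridParser()
          p.feed(html)
          return p.grid
      ```

      Once the spans are expanded, the per-cell math lives in the grid indices rather than in the parser, which is where most of the scribbled-on-paper arithmetic tends to come from.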