Tech guy invents the concept of giving instructions
With clear requirements and outcome expected
Why did no one think of this before
Who does that? What if they do everything right and it doesn’t work and then it turns out it’s my fault?
I wrote a shell script like this (IT admin, not a dev) for private use.
The prompt took me like 5 hours of rewriting the instructions.
Don’t even know yet if it works (lol)
OP just chatting with themselves so they can screenshot it?
That’s a Telegram group, and both messages show on the left with profile icons (which got cropped). The person who took the screenshot sent the last message, which shows double ticks
Could be a group chat but we all know they’re a twat
Neural network: for when saying LLM doesn’t sound smart enough
Calling GPT a neural network is pretty generous. It’s more like a Markov chain
it legitimately is a neural network, I’m not sure what you’re trying to say here. https://en.wikipedia.org/wiki/Generative_pre-trained_transformer
You’re right, my bad.