pleroma.debian.social

LOL, my job is safe. This is exactly why LLMs are so dangerous: they don't think, and they don't know what they are suggesting. This can lead to extreme danger, as shown by the short circuit here. #electronics
[Image: a ChatGPT conversation in which the LLM admits it made a mistake by shorting 9 V across an inductor to GND.]
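(Rough back-of-the-envelope for why this is effectively a short circuit, assuming an illustrative winding resistance of about 1 Ω rather than a value from the screenshot: at DC an inductor looks like just its winding resistance, so tying 9 V across it to GND draws roughly I = V / R ≈ 9 V / 1 Ω = 9 A, far more than a small inductor or a 9 V supply can safely deliver.)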
Replies: 2 · Announces: 1 · Likes: 0

@werdahias they're a rubber duck which talks back.


But a rubber duck is not an engineer.

@grillchen It might not even be half bad at explaining code, but it can *never* know what a given part of a circuit does. And if it doesn't, it will make shit up.

@werdahias same with low-level code tbh.

@werdahias@pleroma.debian.social you're right, but ... you know