LOL, my job is safe. This is exactly why LLMs are so dangerous: they don't think, and don't know what they are suggesting. This can lead to extreme danger, as indicated by the short circuit here. #electronics
@grillchen It might not even be half bad at explaining code, but it can *never* know what a given part of a circuit does. And if it doesn't know, it will make shit up.
@werdahias same with low-level code tbh.
@werdahias@pleroma.debian.social you're right, but ... you know