pleroma.debian.social

Public reply to [this] unlisted toot in reply to [this] public toot in [this] thread:

@mirabilos @amin @sotolf @thedoctor

> I have this vague feeling that I don’t want any new software, at this point.
>
> Well, almost, but defaulting to scepticism.

"Ooo, look at this new shiny that I wrote in ${meme_language} that has built-in git integration, makes coffee for you, and only requires 24GiB RAM to run!!!!!!!!!!!!!!!!"

I'm convinced that Computer Science is now the least disciplined, least principled of all the sciences.

I think it should be demoted from the Sciences and placed under Business and Finance where it belongs. 😭

F*** my ${discipline}.

Oh, and of course, now the Humanities are trying to cozy up under STEM (at least at my Alma Mater), so it's not like that's any kind of bastion of sanity anymore.

So, let's see...

The hard sciences are now ${mathematically_plausible_but_completely_unprovable_paper_with_very_alluring_title}-writing and grant-attracting factories, Computer Science is now an arm of the Battalion-of-M*st*rb*t*ng-Chimpanzees Speculative Finance sector, the Humanities are trying to find a place within STEM/STEAM to find a reason to continue existing, and of course, Theology is trying to become a justification arm of fascism, kinda like the Russian Orthodox Church.

Have I missed anything? 🤦‍♂️

@rl_dane @mirabilos @amin @thedoctor

There is a difference between computer science and software engineering. Computer science is theory, driving the field forward: think things like Haskell, Idris and the like, crazy math, bit twiddling, and so on.

Then you have software engineering, which is building things from the pieces and theories made by the computer science wizards, or at least the most practical parts of them.

I think it's parts of this second group that are degrading, or rather falling for the promises and siren song of the capitalist class to use the fasch tools to do stuff.

@sotolf @mirabilos @amin @thedoctor

But even the theory arm is highly entrenched in finding new tools and techniques to empower terawatt-guzzling data center type activities. :'(

@rl_dane @mirabilos @amin @thedoctor

Are you sure? I haven't seen that, what would they gain from that anyway?

@thedoctor @sotolf @rl_dane @mirabilos @amin CS theory with its O-notation, which is all about "be more efficient for large n, even if the constants make it prohibitive for reasonably sized problems"?

The entire field of IT exploded when it was industrialised, with the promise to industrialise everything else, and that flows back to CS because even researchers gotta eat sometimes and IT<->CS is the natural pairing.

Add finance bros with more money than sense who want that "industrialise everything else" promise to come to fruition and here we are.
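A toy sketch of the big-O point a few toots up, that a "better" asymptotic class can still lose for reasonably sized problems when its constants are bad. The cost functions and constants here are made up purely for illustration, not measurements of any real algorithm:

```python
import math

def cost_quadratic(n, c=1):
    # e.g. a simple O(n^2) algorithm with a small constant factor
    return c * n * n

def cost_nlogn(n, c=500):
    # e.g. a "clever" O(n log n) algorithm with a large (hypothetical)
    # constant factor, say from cache-unfriendly pointer chasing
    return c * n * math.log2(n)

for n in (100, 1_000, 100_000, 10_000_000):
    q, f = cost_quadratic(n), cost_nlogn(n)
    winner = "n^2" if q < f else "n log n"
    print(f"n={n:>10}: quadratic={q:.3g}  nlogn={f:.3g}  -> {winner} wins")
```

With these invented constants, the quadratic algorithm wins up to fairly large n before the asymptotics finally take over, which is exactly the "prohibitive for reasonably sized problems" complaint.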

@sotolf @mirabilos @amin @thedoctor

It always ends up being a money game, even in the deepest parts of academia, at this point.

Research exists to get grants first, to further the field second.

@patrick @thedoctor @sotolf @mirabilos @amin

I mourn the death of personal computing.

@thedoctor @sotolf @rl_dane @mirabilos @amin I don't: the personal computing space today is probably larger than the personal computing space in the 90s, and it was sustainable back then ("proof by history", I guess?). It's a _much_ smaller slice of the pie, but only because the pie grew so much.

As for parts availability for PCs, they won't be at the wavefront anymore (chasing those ever-newer, ever-smaller, ever-faster designs), but there are tons of fabs out there cranking out 45nm, 90nm, ... parts for embedded purposes and the like: that used to be "high performance computing" cutting-edge stuff, with the ability to build seemingly infinite capacity in storage and computing power. The finance and AI bros probably don't even realize that this stuff still exists, and the fabs won't all retool because the costs are prohibitive (and also, some things just _need_ to be done at that scale, even for the 2nm AI bro gear, so there will always be a market for the larger processes).

So personal computing's best bet is probably to move away from the consumer electronics that it paired with for the last decades, and align with permacomputing and embedded.

@amin @patrick @sotolf @thedoctor @rl_dane that will also give us 4:3 screens and sensible resolutions back!

@mirabilos @amin @patrick @sotolf @thedoctor

You can get reasonably-priced and new 1024x768 and 1280x1024 LCD screens today (with modern connectors), but I don't know how good they are.

@rl_dane @mirabilos @patrick @sotolf @thedoctor

How did I get into this thread?

@sotolf @rl_dane @patrick @amin @thedoctor modern connectors means VGA for me, and maybe DVI ;)

@mirabilos @sotolf @rl_dane @patrick @amin @thedoctor
DVI (single Link) is a subset of HDMI BlobCatClown
edit: As in, passive adapter

@patrick @amin @thedoctor @kabel42 @rl_dane @sotolf HDMI is consumer-tech crap and needs to die in computers

@mirabilos @sotolf @rl_dane @patrick @amin @thedoctor neocat_think does that mean you can technically have DVI over USB-C?

@rl_dane @sotolf @thedoctor @patrick @amin yes, that, except I assumed he knew and I could save the image file overhead

@mirabilos @sotolf @patrick @amin @thedoctor

Try HDMI. And maybe DisplayPort.

@mirabilos @sotolf @patrick @amin @thedoctor

But it's a plausible successor to DVI.
HDMI is just not meant for computers.

@rl_dane @sotolf @patrick @amin @thedoctor that. is. precisely. the. problem.

@sotolf @rl_dane @amin @thedoctor I’m neither, I’m just a programmer *shrug* I consider myself 60% craftsman/manu-facturer, 30% artist, and only 10% academic.

@mirabilos @sotolf @amin @thedoctor

I'm 5% programmer, 5% sysadmin, 10% crusty ever-so-over-it infosec veteran, 20% shell script monkey, and 60% snark. XD

@rl_dane @mirabilos @amin @thedoctor I'm just a janitor trying my best to repair stuff I see that is broken, and then sometimes making new stuff to fix an issue someone has, which then starts breaking down too :p

@sotolf @mirabilos @amin @thedoctor

Janitor is such a nice analogy.

In my last years in I.T., I felt like the last janitor mopping up the floors while Rome fell. XD