One thing I wish folks understood better about "Linux", which the annoying evangelists never seem to bother mentioning.
One of the most important differences from other platforms is *how you get your software*.
You don't download it from the author/publisher who might be (these days, is) bundling malware.
You don't get it from a walled garden with commercial incentives to let publishers hurt you.
You don't have to fumble around Google trying to figure out whether the site offering it is reputable.
You get it from a party, usually made up of dedicated volunteers, who believe in the platform and who vet all the software they build and package for you. Usually the same party you got your base system from.
@dalias This is true as a first approximation, but one of the guiding principles of free software is user choice. Having a distro you can trust is a good thing, but being able to, if you have the knowledge and inclination, get the software from the source you want, is essential. It's how power is in the hands of users, not in the hands of any gatekeeping cabal.
quite a few times wifey needed a package that wasn't packaged; i wrote an ebuild in about a minute or so and asked her to sync my repo and install it
i could set up something similar for other distros too, but arch's makepkg being completely detached from the package manager makes that painful to distribute outside the aur, and i don't really want to maintain an infra of binary packages for those small one-off things
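for a sense of scale, here's a minimal sketch of what such a one-off ebuild looks like (package name and upstream url are made up):

    # hello-tool-1.0.ebuild -- hypothetical one-off package
    EAPI=8

    DESCRIPTION="small tool packaged for a personal overlay"
    HOMEPAGE="https://example.org/hello-tool"
    SRC_URI="https://example.org/dist/${P}.tar.gz"

    LICENSE="MIT"
    SLOT="0"
    KEYWORDS="~amd64"

    src_install() {
        dobin hello-tool
        dodoc README
    }

the default phases handle unpack and (if there's a Makefile) build, so for trivial packages you mostly just write src_install.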
The threat of Mozilla looks very different when you're running Windows with Mozilla's auto-updater installed, or using snaps from Mozilla on Ubuntu (a distro that abdicated its role), or whatever on MacOS, etc.
Versus running a real Linux distro, where the same people you trusted to put together a base system that works in your interests are also the ones building and shipping Firefox from source, and able to omit anything exceedingly harmful before it gets to you.
but my question is more, how do i distribute that? a gentoo ebuild repository is basically a directory in the filesystem, often synced with git or rsync
i don't need specific hosting software nor any special setup other than a git host; i write a bash-based ebuild file, git push, wifey syncs and emerges it
afaik most other distros only really distribute binary packages, for which i would then need to set up a build server, sign things, etc, etc
(note, i don't want to send wifey packages as .deb or .rpm that i build locally; i want it to be just her doing a usual package manager install, not having to do anything special for dependencies or what-not, and also to let me easily update the ebuild for a new version if needed)
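for reference, the receiving side is just a repos.conf entry pointing at the git host (overlay name and url below are made up):

    # /etc/portage/repos.conf/my-overlay.conf
    [my-overlay]
    location = /var/db/repos/my-overlay
    sync-type = git
    sync-uri = https://git.example.org/my-overlay.git
    auto-sync = yes

after that, 'emaint sync -r my-overlay && emerge app-misc/hello-tool' is the whole update-and-install flow on her end.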
@dalias when Apple’s App Store came out, I was like “that’s a repository! When we did repositories, users complained that they couldn’t just grab random .exes from shady webpages, but now Apple does a repository and it’s a great innovation???”
@JamesWidman @maco @dalias and they don't care if what is uploaded to the repository is user-hostile, as long as they aren't financially responsible for it
even *moar* innovation! /s
Long time ago, a friend introduced me to Ubuntu. I didn’t know you could boot from a USB, I didn’t know many things and my friend’s initial guidance was paramount.
The introduction is best done person to person.
Edit: I know the Ubuntu of the past and the Ubuntu of the present are not the same. But it was baby’s first Linux, and easy to install again as a casual user without my friend’s help.
Due to the criticisms of Ubuntu, I figured I’d try something not-Ubuntu and tried to install Debian over the summer. Got stuck. Had to abandon it for lack of time.
My personal circle is too busy these days, and I would probably appreciate joining a volunteer party. Or here’s hoping I can make it work over Christmas.
A deb/rpm repository isn't much more than that. You dump packages in a directory and run a single command to extract the metadata into an index file: 'createrepo' (or its C reimplementation, createrepo_c) for rpm, 'dpkg-scanpackages' for deb. That's all that's *required*. You then export that directory over http, or mount it, and you can install these packages with full dependency tracking.
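A rough sketch of both, with made-up paths (this is the flat-repo form; a signed pool layout takes more ceremony):

    # rpm: index a directory of packages ('createrepo_c' is the C rewrite)
    createrepo_c /srv/repo/rpm      # writes the repodata/ index next to the rpms

    # deb: flat repository indexed with dpkg-scanpackages
    cd /srv/repo/deb
    dpkg-scanpackages . /dev/null | gzip -9c > Packages.gz

    # client side, e.g. one line in /etc/apt/sources.list.d/local.list:
    #   deb [trusted=yes] http://repo.example.org/deb ./

Note that [trusted=yes] skips signature checks, so reserve it for repos you control end to end.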
@dalias @draeath @ska @SRAZKVT
I mean yes, if you're going to be serious about building a binary repository, then higher-level tools like reprepro are definitely useful: they track packages and their versions across suites, so you get automatic cleanup of old versions and easy metadata signatures. But they're absolutely not required.
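For reference, a minimal reprepro setup is itself only a few lines; the suite name, key id, and paths below are made up:

    # conf/distributions
    Codename: stable
    Components: main
    Architectures: amd64 source
    SignWith: 0xABCD1234

    # add a package; reprepro places it in the pool, updates and signs the indices
    reprepro -b /srv/repo includedeb stable hello-tool_1.0_amd64.deb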
@draeath @ska @navi @wouter @dalias @SRAZKVT there’s also https://debr.mirbsd.org/repos/wtf/mkdebidx.sh
I take care of cleaning up old versions myself by using a different structure (not the dists-vs-pool structure of dak): symlinking entire source package directories to make them show up in another suite, and hardlinking shared origtgz files (rough sketch below).
Much, much easier.
That script also generates a package index, which takes a relatively long time. But it’s been battle-tested during the m68k revival 2012-2015… and on a company-local mirror of lenny (IIRC) back then, when we still needed updates but the signature on archive.d.o had already expired.
What it also lacks is the ability to cache hashes for unchanged files, but even so, it works fine for its scope (smaller or personal repos).
It can output for multiple architectures at once, rebuild only a subset of releases, do Release.gpg vs. InRelease, and… huh, I forgot while writing this.
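The promotion and sharing tricks look roughly like this, with made-up package names (the real layout has a bit more to it):

    # promote a source package directory into another suite via symlink
    ln -s ../unstable/hello-tool-1.0 testing/hello-tool-1.0

    # share the orig tarball between suites via hardlink instead of a copy
    ln unstable/hello-tool_1.0.orig.tar.gz testing/hello-tool_1.0.orig.tar.gz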
@draeath @SRAZKVT @wouter @ska @navi @dalias (one practical tip: run echo | gpg --clearsign before it, twice, verifying that the second run doesn’t need a passphrase, to prime the agent so you won’t have to enter the PGP password several times while it runs; I had to disable pinentry there due to some problems)
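Spelled out, the priming step is just:

    echo | gpg --clearsign >/dev/null    # prompts for the passphrase once
    echo | gpg --clearsign >/dev/null    # should now pass without a prompt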