(Replying to PARENT post)
squid-deb-proxy can do most of this - it fetches and locally caches package lists and packages. Clients install squid-deb-proxy-client, which uses multicast DNS to discover the local proxy.
Packages are fetched and cached by the proxy the first time they're requested and thereafter served locally (subject to squid's lifetime/space/etc. configuration).
There are ways to pre-populate the cache but you're always going to have situations where package updates have to pull from remote archives.
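If it helps anyone, the basic setup is just two packages (sketch assumes the stock Debian/Ubuntu packaging; the proxy advertises itself over mDNS so clients need no manual pointing):

    # on the box that will hold the cache
    sudo apt-get install squid-deb-proxy

    # on every client; apt then finds the proxy via mDNS
    sudo apt-get install squid-deb-proxy-client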
apt-mirror is designed for a similar purpose: it lets you create a partial or complete mirror of one or more upstream archives.
Clients can then be pointed at the local mirror, which can live on the same host or elsewhere on the LAN.
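A minimal apt-mirror setup looks roughly like this (archive URL, release name and mirror hostname below are placeholders - adjust to whatever your boards actually run):

    # /etc/apt/mirror.list on the mirror host
    set base_path /var/spool/apt-mirror
    deb http://archive.raspbian.org/raspbian buster main contrib non-free
    clean http://archive.raspbian.org/raspbian

    # run `apt-mirror`, serve the mirror/ subdirectory of base_path over HTTP,
    # then on the clients, in /etc/apt/sources.list:
    deb http://mirror.lan/raspbian buster main contrib non-free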
Generally, 'packages' on GitHub are source-only, so you'd need a fairly complex build server, or at least per-package links in your local cache rules for any pre-built binaries.
(Replying to PARENT post)
One solution is to do it once only: grab the base image, install all the packages you need, and then make an image of it so you don't start from scratch again next time.
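For the "make an image of it" step, plain dd is enough (the device name is a placeholder - check with lsblk before running anything):

    # capture the configured card once
    sudo dd if=/dev/sdX of=rpi-golden.img bs=4M status=progress

    # later, write it back to a fresh card
    sudo dd if=rpi-golden.img of=/dev/sdX bs=4M status=progress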
(Replying to PARENT post)
Easily arguable that the RPi isn't made with such use cases in mind. I'm all in favor of removing unnecessary bloat if only a small percentage of the user base is ever going to use it, especially when the software is readily available.
>If those repos are inaccessible for any reason, I have a bunch of hardware that's very hard to do anything useful with.
How so? They're just Linux boxes; you can download the source code and compile the binaries you need. Pre-built packages are not necessary for a functional OS.
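For most projects the from-source route is tedious rather than hard; the generic shape (URL, name, version and prefix are all placeholders):

    wget https://example.org/foo-1.2.tar.gz
    tar xf foo-1.2.tar.gz && cd foo-1.2
    ./configure --prefix=/usr/local   # or cmake/meson, depending on the project
    make -j4
    sudo make install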
(Replying to PARENT post)
a) define your entire system state and dependencies in a single, declarative file
b) prebake an image based on this file
It's all done and ready, available now, no hacking around required.
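Assuming this is describing something NixOS-like (my reading, not stated explicitly), the "single declarative file" part looks roughly like:

    # /etc/nixos/configuration.nix (abridged sketch)
    { config, pkgs, ... }: {
      environment.systemPackages = with pkgs; [ git tmux python3 ];
      services.openssh.enable = true;
    }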
(Replying to PARENT post)
(Replying to PARENT post)
As to caching, I tried setting one up and found out a few things.
I originally tried apt-cacher-ng:
https://www.unix-ag.uni-kl.de/~bloch/acng/
but had trouble (can't remember exactly what) and muddled through with polipo instead.
I found out:
- it was a lot easier and faster than copying cached packages around
- when doing "apt-get update; apt-get upgrade" the cache really sped up multi-machine updates. The first machine was slow, the rest were fast.
- by far, the majority of cache traffic was for updates to base stuff: foo-1.2, foo-1.2.1, foo-1.2.2, etc.
- second was prerequisites for packages I needed.
- the few specialized packages I used were not updated as frequently.
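For anyone repeating this: once apt-cacher-ng (or whichever cache you land on) is running, pointing a client at it is a single apt setting - the hostname below is a placeholder and 3142 is apt-cacher-ng's default port:

    # /etc/apt/apt.conf.d/01proxy on each client
    Acquire::http::Proxy "http://cachebox.lan:3142";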
(Replying to PARENT post)
(Replying to PARENT post)
(Replying to PARENT post)
I've built most of the base repository of Arch Linux on my RPi4 without cross compiling. I'm buying a few more so that I can go 'full Gentoo' and pretty much rebuild the world.
Have a recording of a container booting from a rebuilt userland: https://asciinema.org/a/283303
It might feel like the ARM packages are somehow special, but really only the blobby bits like the RPi firmware are. Everything else is just bog-standard armv7/aarch64 ELF binaries.
So yeah, back up a GCC binary or bootstrap, I guess? I can probably email you a tmux binary in a pinch, plus I have a full local mirror of the repo for the apocalypse? :P
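For concreteness, a native rebuild of a single package on the Pi is roughly this (where you get the PKGBUILD from - asp, ABS, a git checkout - is elided here):

    sudo pacman -S --needed base-devel
    cd ~/builds/tmux        # directory containing the package's PKGBUILD
    makepkg -si             # build with the native toolchain, then install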
(Replying to PARENT post)
(Replying to PARENT post)
They help you maintain a stable software stack over time, and they generate a fully self-contained software image that can be flashed without needing the internet afterwards (all downloads happen at build time).
(Replying to PARENT post)
Personally I find it almost magical that I can install and update almost any software I need with a short command.
(Replying to PARENT post)
Imagine being able to build your rootfs in a Docker container and export it to an IMG file.
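Something like the following, I'd guess (image/container names are placeholders, and turning the extracted tree into a flashable .img - partitioning, bootloader - is left out):

    docker build -t rpi-rootfs .                  # Dockerfile describes the rootfs
    docker create --name rootfs-export rpi-rootfs
    docker export rootfs-export | sudo tar -xf - -C /mnt/img-root   # a mounted loopback image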
(Replying to PARENT post)
If those repos are inaccessible for any reason, I have a bunch of hardware that's very hard to do anything useful with.
I know there are such things as apt caches and squid caches and so on, but I could really use a thing that goes through every apt-get I've ever done and the top 50,000 packages on GitHub, stuffs 'em all onto an SD card, and shows me how to use them from my command line.
OP mentions this as a future direction for the project, but I think it's one of the most important.
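In the meantime, a rough sketch of the apt half of that wish (paths are placeholders; this assumes a Debian/Ubuntu box and skips the GitHub half entirely):

    # every package currently installed, dependencies included
    dpkg-query -W -f '${binary:Package}\n' > ~/all-packages.txt

    # pull the matching .debs into a directory on the SD card
    cd /media/sdcard/debs
    xargs -a ~/all-packages.txt apt-get download   # later: dpkg -i *.deb on the offline box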