
Am I misunderstanding, or does apt-get download and unpack sequentially?

It seems to download each dependency one by one and then unpack them one by one. Presumably it could stream several downloads at once and, as each one finishes, unpack it on the CPU while the remaining downloads are still in progress.
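
Roughly the kind of pipeline I'm picturing (a minimal Python sketch with placeholder package URLs and paths, not how apt-get is actually implemented): start the downloads in a small thread pool and unpack each archive as soon as it lands, while the rest keep downloading.

import os
import tarfile
import urllib.request
from concurrent.futures import ThreadPoolExecutor, as_completed

# Placeholder package URLs, purely for illustration.
PACKAGES = [
    "https://example.org/pool/libfoo_1.0.tar.gz",
    "https://example.org/pool/libbar_2.3.tar.gz",
    "https://example.org/pool/app_0.9.tar.gz",
]

def download(url, dest_dir="/tmp/downloads"):
    # Stream one archive to disk and return its local path.
    os.makedirs(dest_dir, exist_ok=True)
    path = os.path.join(dest_dir, url.rsplit("/", 1)[-1])
    urllib.request.urlretrieve(url, path)
    return path

def unpack(path, target="/tmp/unpacked"):
    # CPU/disk work that can overlap with downloads still in flight.
    os.makedirs(target, exist_ok=True)
    with tarfile.open(path) as tar:
        tar.extractall(target)

with ThreadPoolExecutor(max_workers=4) as pool:
    futures = [pool.submit(download, url) for url in PACKAGES]
    for fut in as_completed(futures):
        # Unpack each archive as soon as its download completes,
        # while the other downloads continue in the pool.
        unpack(fut.result())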

@weberc2 Yeah, AFAIK that's correct.
There's apt-fast, a wrapper around apt-get that does parallel chunked downloads: xmodulo.com/speed-slow-apt-get

Other package managers do what you suggested, e.g. emerge (Gentoo) fetches in the background and starts compilation as soon as the first package is downloaded (while continuing to fetch the other ones).

@redacted cool. Thanks for the link! I guess I've just been spoiled by ; I expect every piece of software to use resources efficiently.

@weberc2 There's still an argument for doing everything strictly sequentially: It prevents partial installs.
Suppose pkg X depends on Y and Z, but Z is from a different source, which is currently unreachable, so the install fails. If you already unpacked & configured Y, it's basically dead weight on the system.

Then again, apt-get has autoremove and most other package managers have something similar...so it wouldn't be that horrible either.


@redacted seems like you could extract everything to a temporary location and, when the full install succeeds, promote it to permanent storage; otherwise delete it. If a dependency is unreachable, you'd also fail faster by downloading in parallel.
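
Rough sketch of that idea (Python, with a made-up install prefix; not claiming this is apt's mechanism): unpack everything into a staging directory, and only move it into place once every archive unpacked cleanly. Otherwise the staging directory is thrown away and nothing is left half-installed.

import os
import shutil
import tarfile
import tempfile

INSTALL_ROOT = "/tmp/installed"  # placeholder for the real install prefix

def install_all(archive_paths):
    # Unpack all archives into a staging dir; promote only if all succeed.
    staging = tempfile.mkdtemp(prefix="pkg-staging-")
    try:
        for path in archive_paths:
            with tarfile.open(path) as tar:
                tar.extractall(staging)
        # Every dependency unpacked fine: promote the staged files.
        os.makedirs(INSTALL_ROOT, exist_ok=True)
        for name in os.listdir(staging):
            shutil.move(os.path.join(staging, name),
                        os.path.join(INSTALL_ROOT, name))
    finally:
        # On success this removes the now-empty staging dir; on any
        # failure it discards the partial unpack, so no half-installed
        # packages are left on the system.
        shutil.rmtree(staging, ignore_errors=True)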