@weberc2 Yeah, AFAIK that's correct.
There's apt-fast, a wrapper around apt-get that does parallel chunked downloads: http://xmodulo.com/speed-slow-apt-get-install-debian-ubuntu.html
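The core trick is just splitting one file into byte ranges and fetching them concurrently. A minimal Go sketch of that idea (the URL, chunk count, and output filename are made up, and it assumes the server supports `Range` requests):

```go
package main

import (
	"fmt"
	"io"
	"net/http"
	"os"
	"strconv"
	"sync"
)

func main() {
	const url = "http://example.com/pkg.deb" // placeholder URL
	const chunks = 4

	// HEAD request to learn the total size.
	resp, err := http.Head(url)
	if err != nil {
		panic(err)
	}
	size, _ := strconv.ParseInt(resp.Header.Get("Content-Length"), 10, 64)

	// Fetch each byte range in its own goroutine.
	buf := make([][]byte, chunks)
	var wg sync.WaitGroup
	for i := 0; i < chunks; i++ {
		wg.Add(1)
		go func(i int) {
			defer wg.Done()
			start := int64(i) * size / chunks
			end := int64(i+1)*size/chunks - 1
			req, _ := http.NewRequest("GET", url, nil)
			req.Header.Set("Range", fmt.Sprintf("bytes=%d-%d", start, end))
			r, err := http.DefaultClient.Do(req)
			if err != nil {
				panic(err)
			}
			defer r.Body.Close()
			buf[i], _ = io.ReadAll(r.Body)
		}(i)
	}
	wg.Wait()

	// Stitch the chunks back together in order.
	out, _ := os.Create("pkg.deb")
	defer out.Close()
	for _, b := range buf {
		out.Write(b)
	}
}
```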
Other package managers do what you suggested, e.g. emerge (Gentoo) fetches in the background and starts compiling as soon as the first package has finished downloading (while continuing to fetch the others).
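That pipeline maps pretty directly onto a producer/consumer pair. A rough Go sketch (`fetch` and `install` are hypothetical stand-ins for the real download and unpack/compile steps):

```go
package main

import "fmt"

func fetch(pkg string) string { // placeholder: pretend to download
	fmt.Println("fetched", pkg)
	return pkg
}

func install(pkg string) { // placeholder: pretend to unpack/compile
	fmt.Println("installed", pkg)
}

func main() {
	pkgs := []string{"X", "Y", "Z"}
	fetched := make(chan string)

	// Producer: keep downloading in the background.
	go func() {
		for _, p := range pkgs {
			fetched <- fetch(p)
		}
		close(fetched)
	}()

	// Consumer: install each package as soon as it lands,
	// instead of waiting for the whole download queue.
	for p := range fetched {
		install(p)
	}
}
```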
@weberc2 There's still an argument for doing everything strictly sequentially: It prevents partial installs.
Suppose pkg X depends on Y and Z, but Z is from a different source, which is currently unreachable, so the install fails. If you already unpacked & configured Y, it's basically dead weight on the system.
Then again, apt-get has autoremove and most other package managers have something similar... so it wouldn't be that horrible either.
@redacted seems like you could extract everything to a temporary location and, once the full install succeeds, promote it to permanent storage; otherwise delete it. And if a dependency is unreachable, downloading in parallel would make you fail faster anyway.
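Something like this, very roughly (a minimal Go sketch of the stage-then-promote idea; `unpack` is a hypothetical placeholder, and the final rename assumes staging and destination live on the same filesystem):

```go
package main

import (
	"fmt"
	"os"
	"path/filepath"
)

func unpack(pkg, dir string) error { // placeholder for real unpacking
	return os.WriteFile(filepath.Join(dir, pkg), []byte("files"), 0o644)
}

func installAll(pkgs []string, dest string) error {
	stage, err := os.MkdirTemp("", "stage-")
	if err != nil {
		return err
	}
	// Any failure removes the whole staging area, so nothing
	// partially-installed is left behind.
	for _, p := range pkgs {
		if err := unpack(p, stage); err != nil {
			os.RemoveAll(stage)
			return fmt.Errorf("install aborted, nothing changed: %w", err)
		}
	}
	// Everything succeeded: promote the staging dir in one rename.
	return os.Rename(stage, dest)
}

func main() {
	if err := installAll([]string{"X", "Y", "Z"}, "installed"); err != nil {
		fmt.Println(err)
	}
}
```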
@redacted cool. Thanks for the link! I guess I've just been spoiled by #golang; I expect every piece of software to use resources efficiently.