Am I misunderstanding, or does apt-get download and unpack sequentially?
It seems to download each dependency one by one and then unpack them one by one. Presumably it could be streaming several downloads at once, and once one finishes, it could be unpacked by the CPU while other downloads are in progress.
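Roughly what I have in mind, as a Python sketch (the package names and URLs are made up, and apt itself obviously isn't structured like this): fetch several .debs concurrently and hand each one to dpkg-deb for extraction the moment its download completes, while the rest are still in flight.

```python
import concurrent.futures
import os
import subprocess
import urllib.request

# Hypothetical package name -> .deb URL mapping.
PACKAGES = {
    "libfoo": "http://example.com/pool/libfoo_1.0_amd64.deb",
    "libbar": "http://example.com/pool/libbar_2.1_amd64.deb",
}

def download(name, url):
    path = f"/tmp/{name}.deb"
    urllib.request.urlretrieve(url, path)
    return name, path

with concurrent.futures.ThreadPoolExecutor(max_workers=4) as pool:
    futures = [pool.submit(download, n, u) for n, u in PACKAGES.items()]
    for fut in concurrent.futures.as_completed(futures):
        name, path = fut.result()
        # CPU-bound unpack runs while the remaining downloads continue.
        dest = f"/tmp/stage/{name}"
        os.makedirs(dest, exist_ok=True)
        subprocess.run(["dpkg-deb", "-x", path, dest], check=True)
```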
@weberc2 There's still an argument for doing everything strictly sequentially: It prevents partial installs.
Suppose pkg X depends on Y and Z, but Z is from a different source, which is currently unreachable, so the install fails. If you already unpacked & configured Y, it's basically dead weight on the system.
Then again, apt-get has autoremove and most other package managers have something similar... so it wouldn't be that horrible either.
@redacted Seems like you could extract everything to a temporary location and, once the full install succeeds, promote it to permanent storage; otherwise delete it. If a dependency is unreachable, you'd also fail faster by downloading in parallel.
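Something like this promote-or-delete pattern (a rough Python sketch; dpkg-deb -x is the standard way to extract a .deb, but the paths are illustrative, and a real install also has to run maintainer scripts, which this ignores):

```python
import os
import shutil
import subprocess
import tempfile

def install_all(debs, dest):
    """Unpack every .deb into a staging dir; promote only if all succeed.

    Assumes dest does not exist yet and the staging dir is created on the
    same filesystem, so the final rename is all-or-nothing.
    """
    stage = tempfile.mkdtemp(prefix="pkg-stage-", dir=os.path.dirname(dest))
    try:
        for deb in debs:
            subprocess.run(["dpkg-deb", "-x", deb, stage], check=True)
        os.rename(stage, dest)  # promote the whole staged tree at once
    except Exception:
        shutil.rmtree(stage, ignore_errors=True)  # no partial install left behind
        raise
```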
@weberc2 Yeah, AFAIK that's correct.
There's apt-fast, a wrapper around apt-get that does parallel chunked downloads: http://xmodulo.com/speed-slow-apt-get-install-debian-ubuntu.html
Other package managers do what you suggested, e.g. emerge (Gentoo) fetches in the background and starts compilation as soon as the first package is downloaded (while continuing to fetch the other ones).