
Would you buy/use a computer that ran 3x slower than modern machines if it were more secure (less vulnerable to side-channel attacks)?

Interestingly, the numbers are very different on here than they are on birdsite. Many more people here seem willing to take a speed hit for security than over there. Not surprising, I suppose: selection bias.

@cwebber According to my official and unbiased mastodon poll Hillary should have won the election too :)

@cwebber There wasn't a "3x slower than modern machines would probably still be faster than my nine-year-old desktop machine." 😆

@cwebber

FWIW, I wouldn’t make the tradeoff (except maybe on a second computer) because I’m confident in my ability to avoid those kinds of attacks.

@cwebber

Rarely. I use NoScript and only enable scripts for sites I trust.

(Yes, I realize that’s not bulletproof because how much can I *really* trust a site but I expect actual wide-ranging in-the-wild side channel attacks to get on the public radar before they can reach me.)

(And that assumes such attacks are actually feasible and profitable.)

@cwebber Nope not as my main computer.

But I would buy one in addition to my other computers and use it specifically for any high security needs I might have.

@cwebber The notebook computer I still use regularly is over six years old and was a budget model. I think it is already 3x slower than a modern machine, so I would certainly appreciate a security upgrade with similar performance :blobcheeky:

@cwebber I assumed you meant 1/3 the speed. I already do - a Gluglug Libreboot X60 from the days before they became minifree.org/

@cwebber Yes, assuming the modern machine you're referring to is a Ryzen 7 gen 2.

In other words, if it is not slower than a Pentium N4200, it will be fine.

@cwebber what does "3x slower" mean? 3x longer to do anything?

@cwebber and how much more secure is it?

acceptable: processor does things at 1/3 speed but no one can break in without actively using the machine under my (or my executor's) credentials

unacceptable: processor unpredictably slows waaayyy down periodically to do things for an average of 1/3 speed, machine is perfectly silent

@cwebber It really depends on how much more security you get for your 3x speed sacrifice.

@aran Indeed. In this case I'm suggesting security against things such as Rowhammer, Spectre, etc. (BTW, my understanding is that the slowdown would be more like 1.5x-2x, but I decided to give a significantly more conservative estimate.)
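For anyone unfamiliar with the attacks named above: in caricature, they let code observe which memory got touched by watching cache state. This is a toy simulation only (the "cache" is a Python set, and real attacks measure access latency, not set membership), but it shows the shape of a Flush+Reload-style leak:

```python
# Toy simulation of a cache side channel. NOT a real exploit:
# real attacks infer "hot" cache lines from timing, not membership.

CACHE_LINES = 256  # one line per possible byte value

def flush(cache):
    """Attacker evicts everything (the 'flush' step)."""
    cache.clear()

def victim(cache, secret_byte):
    """Victim touches one cache line whose index depends on a secret."""
    cache.add(secret_byte)

def probe(cache):
    """Attacker checks which line became hot, recovering the secret."""
    for line in range(CACHE_LINES):
        if line in cache:
            return line
    return None

cache = set()
flush(cache)
victim(cache, secret_byte=0x42)
print(hex(probe(cache)))  # 0x42
```

The real-world versions (Spectre, Meltdown) add speculative execution to make the victim touch that secret-dependent line even when the access is architecturally forbidden.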

@cwebber It depends how you define modern machine. If it were 3x slower than my current computer, no. It might technically be possible, but it'd be very unpleasant. If it were 3x slower than, say, a brand new, top of the line, ridiculously overpowered machine, then maybe.

@shadowfacts It's interesting that people feel this way, and I wonder how much is "anchoring"... I guess having run computers that were 300x slower than modern computers and getting a lot done on them makes me less bothered by it.

By "anchoring", I mean: if Moore's law hadn't died out and I were asking you this after computers had become 3x faster than today, asking whether you'd be willing to drop from that back to "today's speeds", would you object?
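The "anchoring" framing is easy to sanity-check numerically. Assuming the classic 18-month doubling cadence (an assumption; quoted periods range from 18 to 24 months), a 3x difference corresponds to only a couple of years of progress:

```python
import math

# How many months of Moore's-law progress does a 3x speedup represent?
# 18-month doubling period is an assumption for illustration.
doubling_period_months = 18
factor = 3.0

months = math.log2(factor) * doubling_period_months
print(f"{months:.1f} months")  # about 28.5 months, i.e. under 2.5 years
```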

@cwebber If Moore's Law still held, it would be a much different question. A 3x decrease would only be a few years old, so despite being objectively much slower, I'd be much more used to it because it would be so recently in the past.

A great deal of the reluctance to change, I think, is not just that the hardware's gotten better, but the software that we use _expects_ that modern, fast hardware. My (and I suspect other people's) answer was based on the assumption of continuing to use the software we already do as normal. But, if something like your hypothetical actually happened, and lots of people started using computers that were several times slower, I think we'd see the software adapt to that.

If the software adapted to work about equally well on 3x slower hardware as current software does on current hardware, I'd be much more willing to take the slower hardware, knowing that I'd still be able to do everything I currently do and be similarly productive. But, current software designed for modern computers, running on a 3x slower device, would for a great many tasks be unbearable.

@shadowfacts

It sort of comes down to having programmes properly use multiple cores. Because then you can compensate for the individual core slowdown by putting more of them in the machine.

And more cores is already a trend in order to manage power and battery life.

@cwebber
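The "compensate with more cores" idea runs into Amdahl's law: extra cores only recover the slowdown for the parallel part of a workload. A minimal sketch (the 95%-parallel figure and 9-core count are arbitrary illustrations):

```python
def effective_speed(parallel_fraction, cores, per_core_speed):
    """Amdahl's law: throughput relative to a single full-speed core."""
    serial = 1.0 - parallel_fraction
    return per_core_speed / (serial + parallel_fraction / cores)

# A 95%-parallel workload on 9 cores, each at 1/3 speed,
# still beats one full-speed core:
print(effective_speed(0.95, 9, 1/3))   # ≈ 2.14

# But a fully serial workload stays stuck at 1/3 speed,
# no matter how many cores you add:
print(effective_speed(0.0, 64, 1/3))   # ≈ 0.33
```

So the strategy works exactly to the extent that software is written to use those cores, which is the point being made above.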

@cwebber but I would say that part of that is probably due to perceived speed being very, very different than just, like, raw processor speed. it's dependent on a lot of things like OS programming, cache size, workflow, amount of RAM, etc.

when you said "3x slower" my gut reaction was "you know the amount of times your macbook completely freezes up when trying to do some trivial task like alt-tab or open a new text file? what if those freezes took three times as long or happened three times as frequently"

@nightpool Sure. "Software is a gas, and expands to fill all available space."

The question is whether it can be compressed again.

@nightpool I'm also aware that many people in this generation have been using computers *primarily since* Moore's law leveled off, so unlike people from my generation, they have gotten used to an approximate baseline of speed.

@cwebber sure. I'm probably willing to give up "having 100 chrome tabs open" but not willing to give up "having a super high DPI laptop display". trade-offs are going to be different for different people

@nightpool If it's any consolation, I think that high DPI displays aren't a concern in themselves (but the amount of processing necessary to display to them might be, dunno).

@cwebber my understanding is that there's some complicated performance cost for the thing I'm doing, which is configuring a high DPI display with a non-integer scaling factor. I dunno though.

@cwebber
> Sure.  "Software is a gas, and expands to fill all available space." The question is whether it can be compressed again.

Only after freezing :D

@nightpool

@cwebber and of course that's complete nonsense, that's all based on stuff like "I always forget to close programs I'm not using" and "retina displays with high scaling are hard for even top-line laptop graphics cards with current implementations"

@cwebber I don't mind slower computers, but there would have to be other factors for me to buy one (i.e. price/repairability/etc, a fairly theoretical branch of security isn't worth that much to me)

@elomatreb It's not a theoretical branch of security? It's been demonstrated: general attacks that run in your browser, on any page you could load, that can read from your memory.

That doesn't seem "theoretical" to me...

@cwebber And yet most security breaches are caused by phishing emails

@elomatreb Sure, because that's a low hanging fruit. Also worth addressing, but a different topic.

@cwebber basically the only computationally expensive task I do is software development (okay, and Guix).
If people realised that their phones are sometimes 1.5-2x slower than their desktops, more would answer positively, I believe.
If not for development, I would probably say "yes". Not sure.

@cwebber
Asked differently:

• today's "own" machine, with whatever performance and security it's got

• subscription to a "cloud computing" provider, including equipment for the access (lease)

For the same price, period of time, availability, etc., but with several times higher performance in the latter case.

@cwebber It depends how you measure speed, but in general yes.

My requirements for development are not very significant. I can work easily on a 4 core A53. I don't care about games or 3D graphics, other than that browsers often make use of it.

@bob
Generally the #GUI makes use of it. But the demands there aren't very high either; an embedded graphics core can cope with it.

I'd say the only place where performance is needed in s/w development and maintenance is reproducible builds.
And the applications for content creators, of course: CAD, graphic design, music creation, etc.
@cwebber

@cwebber sidebar: if we all got high we wouldn't notice it was any slower ;)

@cwebber I'd honestly like to see VLIW resurrected. Maybe, with Spectre and Meltdown and whatnot as motivations, the compiler technology can catch up to give decent performance.

@Azure
With explicit parallelism (#EPIC), you can sometimes reach the same performance at slower clock speed, but it costs way more than conventional hardware, because it requires more circuitry, and because such machines aren't mass-produced yet.
@cwebber

@amiloradovsky @cwebber The 'more circuitry' surprises me, only because I recall one of the motivations for Merced being to simplify the on-die apparatus associated with out of order and speculative execution. (Of course, this could very well have just been bad information.)

@Azure
No, that's correct, but you still need more computing units/devices to process information as fast as those working at higher clock rates.
Also there is such a thing as "intrinsic parallelism" in programs, limiting the minimum clock speed needed for the performance.
@cwebber

@cwebber "no i need all the speed" but also "i would actively choose to make my computer less secure if it became faster"

but only because my threat model is basically nonexistent. like, "security" is abstract, but "speed" is immediately tangible. there are far greater risks.

@trwnh
Are you sure most of that speed isn't used for making proof-of-work for somebody, or for DDoS'ing somebody you don't necessarily hate?
@cwebber

@cwebber things run poorly already, and I'm not a likely candidate for whatever computer shenanigans. I want a snappy computer!

@cwebber
I've been buying used business systems and installing Linux for personal use for a decade and a half, so use... sure. Buy? Hmm...

I'm good with the performance of an i5-6600 (Skylake, 2015) for a lot of demanding applications. I couldn't afford a current generation i9 to get that, but make that a business requirement today and we'll see what's on the online auction sites in 3 years

@cwebber I'm posting this on a secondhand ThinkPad T60 manufactured in 2007. It runs fine, even if it lacks the raw power of a high-end laptop manufactured in 2019.

@cwebber
Yes.
I already run off a 10? year old laptop and manage OK. Anything new would be faster

@cwebber
Beyond side channels I'm even more worried about the ever present issues of poor security design/architecture and of security-critical components being written in unsafe languages. The fact that there is always another buffer overflow waiting in the kernel, in the browser, etc is nonsense. Who knows when someone will find a critical vulnerability in libjpeg and start manipulating images to take over the browser, then call a vulnerable syscall to install a rootkit.

@cwebber
I really want to run a microkernel (so poorly written driver code doesn't compromise the whole system) written in a safe language with arbitrarily nestable security contexts (eg. beyond users having different privileges, I want any program to be able to spawn processes, threads, etc in more restricted contexts, which can also spawn more restricted children, etc).

Also I want a modern Lisp machine...
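The nestable-contexts idea above can be sketched in a few lines. This is a toy model, not any real kernel's API: the invariant is simply that a child context can drop privileges but never regain ones its parent lacks.

```python
class Context:
    """Toy nestable security context: children may only drop privileges."""

    def __init__(self, privileges, parent=None):
        if parent is not None:
            # Intersect with the parent: a child can never gain
            # a privilege its parent doesn't hold.
            privileges = set(privileges) & parent.privileges
        self.privileges = frozenset(privileges)

    def spawn(self, privileges):
        """Create a further-restricted child context."""
        return Context(privileges, parent=self)

root = Context({"net", "fs", "gui"})
app = root.spawn({"net", "gui"})     # app drops filesystem access
plugin = app.spawn({"net", "fs"})    # "fs" is silently stripped
print(sorted(plugin.privileges))     # ['net']
```

Capability systems (and things like seL4 or Capsicum) make essentially this guarantee at the OS level, enforced by the kernel rather than by convention.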

@Tryphon
I don't know. For desktop use I would want good performance per thread. I don't know that I would care too much about massively parallel workloads. But I'm certainly open to the idea of massively parallel lisp machines.
@cwebber @alcinnz

@willghatch @cwebber @alcinnz I wasn't thinking threads, actually. More like actors or coroutines, the kind Erlang/Elixir uses. See if there is another programming system that would benefit from parallel hardware, not just a faster sequential Lisp machine. Not sure I am making sense.

@cwebber I actually wonder what 3x slower means... CPU? Disk access? Network speed? Wall time? Would it "feel sluggish" or just do some tasks more slowly?

@cwebber
It would be unbearable. Depending on your threat model, you can always make new hardware secure enough.

@cwebber I would!

But I'm questioning whether such a machine would actually be slower. It would require us to recompile, and occasionally rewrite, all our software though.
