Ahmed FASIH @22

“I find it hilarious when people think of Pascal as some old, slow, useless language that would be a terrible choice for anything, yet still think of C as a realistic language to choose for development today.”

“We want so badly to believe that the language/OS situation today is based to a large degree on some contest of technical merit, but from my experience, it’s mostly an accident of social dynamics.”

blog.regehr.org/archives/1393# (hat tip + references at twitter.com/wallingf/status/91)


@22 The comment "We want so badly to believe" is a strange one to me. I thought most people would consider technical merit to be only a small factor. Programming language communities are, after all, intensely tribal.

But the point about Pascal is often made about Fortran too, usually by people who don't know the language at all.

I also find it interesting that so many people claim to love C and hate C++.

@wim_v12e “I thought most people would consider technical merit to be only a small factor”—wow, you’re very reflective! I agree but most people correlate a thing’s success with its quality (cue Duncan Watts’ MusicLab).

About Pascal/Fortran—I took the original comment to be more about the agedness of C for writing “modern apps”. I am a little familiar with eking Fortran-grade performance out of C (restricted pointers…), but haven’t really looked at Fortran—I keep hoping for something newer.
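(Since I brought up restricted pointers: a minimal, hypothetical C99 sketch of the idea. The restrict qualifier promises the compiler that the arrays don't overlap, which is what Fortran assumes by default, so the loop can be vectorized the same way. Names here are made up for illustration.)

#include <stddef.h>

/* axpy-style loop: out[i] = alpha * a[i] + b[i].
   Without restrict, the compiler must assume out might alias a or b and
   reload memory on every iteration; with restrict it can vectorize freely. */
void axpy(size_t n, double alpha,
          const double *restrict a,
          const double *restrict b,
          double *restrict out) {
  for (size_t i = 0; i < n; ++i) {
    out[i] = alpha * a[i] + b[i];
  }
}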

@wim_v12e I’m familiar with the “C rules, C++ sucks” people also. I explain that to myself as, well, you can teach someone (or learn) to write C with a pretty skinny K&R book, but StackOverflow has a legitimate book list, containing multiple door-stoppers, for C++ (stackoverflow.com/q/388242/500). It’s harder to dismiss C with a semi-serious Pascal (or Fortran) argument since they’ve been adding to it. But by gosh, it’s so… terrible.

@wim_v12e I really liked that Alan Kay interview recently (last few months) on Hacker News or somesuch filthy website, where he said something like “We really could use about three good languages” and someone asked “Which languages are good?” He replied, “I said we could use three good languages, not that there existed three good languages.” 🤣

I hope Rust, and JavaScript alternatives like PureScript or even TypeScript, get there in a couple of years.

@22 I'd say that there are no doubt good languages out there, and also languages I like a lot, but my ideal language doesn't exist.

You mentioned Julia a while ago, how do you like it?

@wim_v12e I’ve come close to learning Fortran at least twice before :)

I guess I’m looking for a language that lets me do data wrangling & scientific computing (build algorithms out of SVD, FFT, etc.), but also databasing, CSV parsing, and web serving. With a proper Haskell-grade type system (though I’m only somewhat familiar with what I’m asking for). And amenable to compiling, SIMD, and multithreading.

Python has shown itself to be a good general-purpose language, but falls short on type system and speed.

@wim_v12e Julia aims to be this language, and whatever languages we get down the road will certainly learn a lot from Julia, but Julia is not that comfortable yet. The type system is neat but not ergonomic, the compiler could use a good couple of decades of optimization, and it can’t yet compile to a static library for calling from other languages/applications. We use it at work and it’s ok. It’s better than Matlab and faster than Python. Hopefully it gets a lot better :)

@wim_v12e Er, I wrote “it’s hard to dismiss C” but I meant “C++”. Sorry.

@22 Me, reflective? Guess I'm just old ^_^
Actually, Fortran 2015 will be out next year, so it's actively developed. Still, hardly a candidate for next year's cool new system language.

Obligatory Duncan Watts reference: from a review of his “Everything is Obvious” in NYT: nyti.ms/2jXp2mx

“We think the Mona Lisa is famous because of its traits, but we think those traits are significant only because they belong to the Mona Lisa, which we know to be famous.”

This kind of circular logic is as hyperfine and subtle as it is endemic and destructive. When asked to explain *why* anything is the way it is, you just describe *it* and make a (vacuous) intellectual leap from there to “why”.

Here’s a tidbit from Duncan Watts’ “Everything is obvious: once you know the answer” showing what this looks like in the wild, with a real review of Harry Potter:

‘Although it is rarely presented as such, this kind of circular reasoning—X succeeded because X had the attributes of X—pervades commonsense explanations for why some things succeed and others fail. For example, an article on the success of the Harry Potter books explained it this way: …’

octodon.social/media/GvQXNv1ny