Welp, I've contacted a favourite podcast to correct their statement that neural networks are impossible to understand. I'll do more of this in the future, I think, but I'll probably wait until I'm done with the paper I'm currently writing, since it has some new approaches not in my previous publications.
@Vanessa Oh gosh. Everyone has to do that at some point u.u
For serious - ANNs have been around far too long for them to be a mystery.
@Vanessa Ikr. It's like the "Holy fluff, Google translate unexpectedly invented a language! Machines are evil o_o" thing last year.
Idk if it helps, but the tank thing as I learned it was that they were using a very basic in-memory machine vision system (not ANNs), and taught it friendlies on runways and enemies on sand. Because the tanks themselves were only minor artefacts in the images, it couldn't tell friend from foe in the field, just airbases from sand. So it was never put near a tank, and it was never a mystery.
@Nafrondel But, yeah, the fear of them becoming evil also comes up a lot. I have a looot to say about that, but the short version is that an artificial consciousness of advanced capabilities is unlikely to care that we exist. Humans just aren't that important or interesting or useful, really. My friend Damien writes about ethics and machine learning a fair bit, if you're into that. You can find some of his work here:
http://afutureworththinkingabout.com
I think he's decidedly excellent.
@Vanessa Doesn't help that Hawking keeps saying "Moore's law means if it can happen, it will happen".
@Nafrondel also, "holy fluff" is a pretty charmingly amusing exclamatory choice. :)
@Nafrondel oh, yeah, but I've actually had people (including a machine learning prof criticizing neural nets) tell me that story as being about ANNs trained to classify photos of enemy tanks, where the ANN instead learns to recognize sunny days, because all the enemy tank photos were taken on sunny days and all the other photos on cloudy days. I don't know how it mutated from what you've said into this other story, but it has been floating around in that form for decades.
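For anyone curious what that failure mode actually looks like, here's a minimal toy sketch of it in Python. To be clear, every feature name and number here is invented for illustration; this is not the system from either version of the story. The idea is just: if "sunny" perfectly correlates with "tank" in the training data, a classifier will happily learn the weather, and then fall apart the moment that correlation goes away.

# Toy sketch of the "tank story" failure mode: spurious correlation,
# a.k.a. shortcut learning. All names and numbers are made up.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

def make_images(n, tanks_are_sunny):
    # Each "image" is reduced to two features: overall brightness
    # (the shortcut) and a weak, noisy tank-shaped signal (the thing
    # the classifier is actually supposed to learn).
    has_tank = rng.integers(0, 2, size=n)
    if tanks_are_sunny:
        # Training conditions: brightness perfectly tracks the label.
        brightness = has_tank * 1.0 + rng.normal(0, 0.1, size=n)
    else:
        # Field conditions: weather is uninformative about tanks.
        brightness = rng.normal(0.5, 0.3, size=n)
    tank_signal = has_tank * 0.2 + rng.normal(0, 0.5, size=n)
    X = np.column_stack([brightness, tank_signal])
    return X, has_tank

# Every training tank photo is sunny; every non-tank photo is cloudy.
X_train, y_train = make_images(1000, tanks_are_sunny=True)
# In the field, the weather no longer correlates with tanks.
X_test, y_test = make_images(1000, tanks_are_sunny=False)

clf = LogisticRegression().fit(X_train, y_train)
print("train accuracy:", clf.score(X_train, y_train))  # near 1.0: looks great
print("field accuracy:", clf.score(X_test, y_test))    # near chance: it learned the weather

The punchline is the same in both versions of the story: the evaluation data shared the confound with the training data, so the system looked great right up until it met the real world.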