@janellecshane Hypothesis: the lower layers of the net (closer to the semantics) did set up the foundations of a horse on a bicycle, and the upper layers were like "lol that's a weird looking finch buddy" and just overrode it
@janellecshane The laziness is strong with this network!
@cathal After some experimentation, I think it tried even less hard than that! There are 1000 things it knows how to do, and it will haphazardly map any input onto one of those 1000 things.
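(A minimal sketch of that "1000 things" behavior, assuming a standard ImageNet-style classifier: the class names, sizes, and logits below are made up for illustration. The point is that softmax forces all probability mass onto the fixed set of trained classes, so even a nonsense input gets confidently mapped to one of them.)

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(logits):
    # Subtract the max for numerical stability before exponentiating.
    z = logits - logits.max()
    e = np.exp(z)
    return e / e.sum()

# Hypothetical final-layer logits for an out-of-distribution image
# (say, a horse riding a bicycle). The network has no such class,
# but its output layer only has the 1000 classes it was trained on,
# so the probabilities must sum to 1 across those 1000 options.
logits = rng.normal(size=1000)
probs = softmax(logits)

best = int(np.argmax(probs))
print(f"picked class index {best} with p={probs[best]:.4f}")
print(f"total probability over 1000 classes: {probs.sum():.4f}")
```

Whatever the input, `argmax` always lands on one of the 1000 class indices, which is the "haphazard mapping" described above.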