More specifically, polarization changes with the angle of reflection from the surface toward the detector (eye/camera), so every bump in the surface gets a color gradient different from its surroundings when seen by a polarization-sensitive eye.
Cryptography nerd
Fediverse accounts:
Natanael@slrpnk.net (main)
Natanael@infosec.pub
Natanael@lemmy.zip
Lemmy moderation account: @TrustedThirdParty@infosec.pub - !crypto@infosec.pub
@Natanael_L@mastodon.social
Bluesky: natanael.bsky.social
Polarization filters on retinal photoreceptors won’t make light wavelength (color) be perceived differently; they just change the conditions under which it’s detected. If those polarized cells covered unique colors compared to the rest, it would kinda resemble the highlight effect in Mirror’s Edge, where something at a different angle than its surroundings stands out (a sudden color gradient).
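The angle dependence mentioned above comes from the Fresnel equations: reflected light is partially polarized, and the degree of polarization peaks at Brewster's angle. A minimal sketch in Python (assuming an air-to-glass interface with refractive index 1.5 as an illustrative example, not something stated in the comment):

```python
import math

def fresnel_reflectance(theta_deg, n1=1.0, n2=1.5):
    """Power reflectance for s- and p-polarized light at an
    interface from medium n1 (air) to n2 (glass, assumed 1.5)."""
    ti = math.radians(theta_deg)
    # Snell's law gives the transmitted angle
    tt = math.asin(n1 * math.sin(ti) / n2)
    rs = (n1 * math.cos(ti) - n2 * math.cos(tt)) / (n1 * math.cos(ti) + n2 * math.cos(tt))
    rp = (n2 * math.cos(ti) - n1 * math.cos(tt)) / (n2 * math.cos(ti) + n1 * math.cos(tt))
    return rs ** 2, rp ** 2

def degree_of_polarization(theta_deg):
    """0 = unpolarized reflection, 1 = fully polarized."""
    Rs, Rp = fresnel_reflectance(theta_deg)
    return (Rs - Rp) / (Rs + Rp)

# Reflection is most strongly polarized at Brewster's angle,
# arctan(n2/n1) ~ 56.3 degrees for glass, where Rp drops to zero.
brewster = math.degrees(math.atan(1.5 / 1.0))
print(round(brewster, 1))                        # 56.3
print(round(degree_of_polarization(brewster), 3))  # 1.0 (fully polarized)
print(degree_of_polarization(10.0) < 0.1)        # True: weak near normal incidence
```

Because the degree of polarization varies smoothly with the reflection angle, a curved or bumpy surface presents a gradient of polarization states, which is the "color gradient" a polarization-sensitive eye would see.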
He probably told Netanyahu, “please stop it, it’s hurting my PR now,” and was shocked when he said no.
While sound is not nearly as dominant, it’s absolutely not just a backup sense. It’s the fastest perception we have (the best rhythm game players can play blind but not deaf), it covers all directions, and even in our sleep we still respond to loud sounds.
Sound perception is so fast that it’s often what directs you to look in the right direction, even if what you’re reacting to happened in your field of vision.
Funny enough, even our peripheral vision is faster than our central field of vision, to help us avoid predators coming from behind! Our forward-directed vision is for tracking and understanding what’s in front of us; sound and peripheral vision are in large part for environmental awareness. They’re co-dependent!
Humans can even learn echolocation!