All: I know the website is annoying but please follow the HN guideline which asks commenters not to post about annoying websites: "Please don't complain about tangential annoyances—e.g. article or website formats, name collisions, or back-button breakage. They're too common to be interesting." - https://news.ycombinator.com/newsguidelines.html
The reason is that otherwise we get a thread full of comments about website annoyances—which is even more annoying. There's no good solution here but let's at least work on a local optimum.
I'm curious how the parrots perceive the video images, with cameras, codecs and screens all tuned to the human visual system.
While our eyes have three primaries (red, green, blue), birds have four and can see into the ultraviolet, a channel the video simply lacks. The cones in their retinas also carry additional colour filters, which lets them notice differences in hue, and thereby quantisation artefacts in the codec's colour planes, more easily than humans.
Birds' eyes are also faster, so they might find the frame rate irritatingly low, and a PWM-driven backlight would need to run at high frequencies so as not to be perceived as flickering.
The paper does mention these issues and finds that the birds seem to cope — but I anticipate that they would give criticism if they could. :)
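For a rough sense of those numbers, here is a back-of-envelope check. All the thresholds are assumptions for illustration: bird flicker-fusion rates are often cited around 140 Hz, humans around 60 Hz, and species vary widely.

```python
# Back-of-envelope flicker check. All thresholds are rough assumptions:
# bird flicker fusion is often cited around 140 Hz, humans around 60 Hz.
bird_fusion_hz = 140
human_fusion_hz = 60

display_refresh_hz = 60   # a common LCD refresh rate
pwm_backlight_hz = 240    # a plausible low-end PWM backlight frequency

def flickers_for(fusion_hz, source_hz):
    """A light source slower than the eye's fusion threshold reads as flicker."""
    return source_hz < fusion_hz

print("60 Hz refresh flickers for a bird:", flickers_for(bird_fusion_hz, display_refresh_hz))    # True
print("60 Hz refresh flickers for a human:", flickers_for(human_fusion_hz, display_refresh_hz))  # False
print("240 Hz PWM flickers for a bird:", flickers_for(bird_fusion_hz, pwm_backlight_hz))         # False
```

So under these assumed thresholds, a standard 60 Hz display sits right at the human limit but well below a bird's.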
If anything, I would say their ability to cope with the poor medium indicates an even more complex level of understanding.
They are able to reason that it is a real bird, that it is not physically present, that it doesn't sound perfect and doesn't look right, and they still engage despite all of that friction.
What’s worse: My parrot will happily attack the screen when I’m talking on FaceTime and he deems it’s been long enough or doesn’t like the human on the other end.
For example he lets me talk to my sister but attacks my mom’s video immediately.
They did, but not without a lot of tricks to compensate for the uneven color sensitivity of early black and white film. Standard makeup didn’t look right on film so they adopted some very extreme styles just so things would look “normal” on screen [0].
The point is, there’s no “objective” version of black and white, or full-color, or full-color-except-for-ultraviolet. They’re all tuned for our specific visual perception and may look bizarre past the point of recognition for other species.
You can go further back to the art created by our ancestors: cave paintings, carved stone figurines and cubism can all be understood by modern humans who spend 99% of their time in a high-resolution environment.
Other humans used red-blue glasses to watch 3D movies. If you've ever tried this, after a while your eyes sort of adapt the red and blue out of each eye and you stop seeing the wildly contrasting colors.
This got me thinking: what would we see if we implanted (with a futuristic tech) cones that can see ultraviolet? Would we see a new color? Or perhaps our brain would recalibrate and ultraviolet would be the new purple?
> if we implanted (with a futuristic tech) cones that can see ultraviolet?
The information also needs to make it from the retina to the brain. Surprisingly, there are no separate red/green/blue channels for the different cone types. Instead, there is a channel for the difference between red vs green, and another for the difference between blue vs (red+green).
Egan wrote a short story ("Seventh Sight", collected in "Instantiation") about a subculture of otherwise blind people that hack their optic prostheses to see ultraviolet.
Our cones can already perceive ultraviolet. The lenses of our eyes filter UV out. That is why we are prone to cataracts; the UV light the lens absorbs clouds it over time.
People who lack eye lenses have been reported to see ultraviolet as a light, bright purple. Maybe if you had a tetrachromat with no lens, she would see it differently, I don't know.
Color vision is based on two signals, one that varies between red and green and one that varies between yellow and blue.
So a cone integrated normally would probably just come across as further blue and a better purple, not something notably distinct from existing colors. Getting the full use out of more cones would require a significant rework of how our optic nerves work.
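A toy sketch of that opponent encoding (the channel weights are illustrative, not physiological constants): a hypothetical UV cone feeding the "blue" side can only push the blue-yellow channel further in the direction blue already occupies, not create a new axis.

```python
# Toy opponent-process encoding of cone responses. The weights are
# illustrative only, not physiological constants.
def opponent_signals(L, M, S, UV=0.0):
    red_green = L - M                      # "red vs green" channel
    blue_yellow = (S + UV) - (L + M) / 2   # "blue vs (red+green)" channel;
                                           # a hypothetical UV cone only
                                           # pushes this further "blue"
    return red_green, blue_yellow

violet = opponent_signals(L=0.3, M=0.1, S=0.9)
uv_violet = opponent_signals(L=0.3, M=0.1, S=0.9, UV=0.5)
# Same red/green signal, just a stronger "blue": a better purple, not a
# genuinely new color.
print(violet, uv_violet)
```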
Where can I learn more about this stuff? Also for other animals.
I remember reading that a fly's "framerate" is so high, it doesn't see an image on TV, just the dot created by the electron beam slowly making its way across.
> A new study shows that their rapid vision may be a result of their photoreceptors - specialised cells found in the retina - physically contracting in response to light. The mechanical force then generates electrical responses that are sent to the brain much faster than, for example, in our own eyes, where responses are generated using traditional chemical messengers.
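A back-of-envelope check of the CRT claim (numbers are assumptions: an NTSC-ish 60 fields/s with ~262 scanlines per field, and a fly flicker-fusion rate of very roughly 250 Hz):

```python
# How much of a CRT image falls within one fly "moment"? All numbers are
# rough assumptions for illustration.
fields_per_s = 60        # NTSC-ish field rate
lines_per_field = 262    # scanlines per field
fly_fusion_hz = 250      # ballpark fly flicker-fusion rate

line_time_s = 1 / (fields_per_s * lines_per_field)  # time to paint one line
fly_window_s = 1 / fly_fusion_hz                    # fly's integration window
lines_seen = fly_window_s / line_time_s

# Only ~60 of ~262 lines get painted within one fly integration window, and
# the phosphor dot is sweeping the whole time, so the fly plausibly sees a
# moving band or dot rather than a stable full frame.
print(f"{lines_seen:.0f} of {lines_per_field} lines per fly 'moment'")
```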
They don't have language per se, more like expressions of emotion. They also imitate human expressions (and maybe learn some of our language, in the case of bigger birds [1]).
>A few significant findings emerged. The birds engaged in most calls for the maximum allowed time. They formed strong preferences—in the preliminary pilot study, Cunha’s bird Ellie, a Goffin’s cockatoo, became fast friends with a California-based African grey named Cookie. “It’s been over a year and they still talk,” Cunha says.
It would be fun if they get some of the birds to actually meet the birds on the other side!
Birds see very differently to humans [0] and standard RGB displays aren't able to reproduce the full experience for them. I wonder if the results would be any different if we could produce something more realistic to them.
Budgies will try to socialize (in vain) with extremely unrealistic plastic budgies. Parrots actually recognize each other by voice; the colors of their feathers hardly matter at all (except maybe for mating). We know that because certain species have different colors in different breeds.
I think RGB displays are fine for this particular purpose.
Now get some footage of their communications, and let's finally put it through an unsupervised learning algorithm so it distills some patterns in their audio/visual communication and builds parrot2vec. Then you perform cluster analysis and obtain some characteristic patterns. At least we'd have the vocabulary size with some precision: the vocabulary of a bored domestic animal, and therefore somewhat reduced.
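As a sketch of that pipeline (entirely hypothetical: the embeddings below are synthetic stand-ins for what a learned parrot2vec encoder would produce from call recordings), plain k-means plus an elbow check gives a crude vocabulary-size estimate:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for parrot2vec embeddings: 300 "call recordings" drawn
# from 4 hidden call types. A real pipeline would embed actual audio.
centers_true = rng.normal(size=(4, 16)) * 5.0
calls = np.vstack([c + rng.normal(size=(75, 16)) for c in centers_true])

def kmeans(x, k, iters=50):
    """Plain k-means with farthest-first init; returns (inertia, labels)."""
    # Farthest-first init is reliable when clusters are well separated.
    centers = [x[0]]
    for _ in range(k - 1):
        d = np.linalg.norm(x[:, None] - np.array(centers)[None], axis=2)
        centers.append(x[d.min(axis=1).argmax()])
    centers = np.array(centers)
    for _ in range(iters):
        d = np.linalg.norm(x[:, None] - centers[None], axis=2)
        labels = d.argmin(axis=1)
        for j in range(k):
            if (labels == j).any():
                centers[j] = x[labels == j].mean(axis=0)
    inertia = ((x - centers[labels]) ** 2).sum()
    return inertia, labels

# Within-cluster variance drops sharply until k reaches the true number of
# call types, then flattens: that "elbow" is the vocabulary-size estimate.
inertias = {k: kmeans(calls, k)[0] for k in range(1, 9)}
elbow = max(range(2, 8), key=lambda k: inertias[k - 1] / inertias[k])
print("estimated vocabulary size:", elbow)
```

The elbow heuristic only gives a rough count, and with real recordings the hard part would be the encoder, not the clustering.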
I'm not sure if you mean this as a joke or not, because it actually sounds plausible; I'm sure we'll see AI tools used to try to learn to communicate with animals.
0: https://cosmeticsandskin.com/aba/max-and-the-tube.php
https://www.ncbi.nlm.nih.gov/pmc/articles/PMC2826135/
So adding a new cone type would not be enough; it would also have to be represented in one of these channels, or in a new channel.
Excellent as usual, if you like Egan.
Others argue that this wasn’t the case, but it’s interesting either way.
https://www.quora.com/Could-Monet-really-see-Ultraviolet-lig...
https://bookshop.org/p/books/an-immense-world-how-animal-sen...
https://phys.org/news/2012-10-eye-mystery-insight-flies-fast...
https://en.wikipedia.org/wiki/Impossible_color?wprov=sfti1
Also don’t miss the Mantis Shrimp eyes:
https://www.radiolab.org/podcast/211178-rip-rainbow
https://en.wikipedia.org/wiki/Mantis_shrimp?wprov=sfti1
[1]: https://en.wikipedia.org/wiki/Alex_(parrot)
[0]: https://www.reddit.com/r/parrots/comments/7itlyx/can_budgies...
- other budgies
- my green cheek conure (who isn't sure what they're doing exactly, but still tries to humor them)
- cuttlebones
- my fingers and tip of my nose
- my wife
- various parts of their cage
- toys
Honestly, I think budgies are happy-go-lucky little birds that like to play with anything and everything.