The idea of a "whistle robot" really sticks with me. I wonder what it would take to build a whistle robot that could reliably whistle even when given fallible, varied, or outright defective lips. Maybe this sort of robustness is what we should be striving for when we try to build intelligent machines. There's a certain merit to a computer vision system that figures out how to function even when you provide it with random webcams dredged out of the bargain bin.
Great talk!
> my point is sort of all knowledge is conditioned on its terms of invariance
Have you read Ian Hacking's "The Social Construction of What?" -- in it he talks about "interactive kinds," which is similar to what you're getting at (though I think distinct). I'd say he focuses more on "interaction" meaning "interaction with humans," but your leafcutter ant example is nice because it shows the issue is even more pervasive than that.
>And eventually, your hard drive needs to be so big and so wet, that it's got to be the ocean.
yes.