Someone challenged this idea, but after thinking about it more I'm going to take it in a different direction.
Consciousness seems to be an emergent behavior of at least some complex systems (which systems qualify is unknown). Just sticking with my own neurons: each neuron simply reacts to the signals sent to it and then sends out its own signal. No neuron has the full context or is necessarily even aware that it's playing a part in my consciousness. Even I don't have the full context of what's happening in my brain.
If we extrapolate this to group behaviors, then we can't assume any greater consciousness is any smarter than its parts.
It would be really funny if they asked for latitude and longitude instead of asking them to point on a map. Suddenly the data becomes very impressive.