In the coming years, advanced whole-brain mapping and interfaces linking the brain to external computers will likely change the world as we know it. Such technologies should eventually allow hands-free, remote control of devices, orchestrated through thought alone.
Similar technology has also demonstrated rudimentary detection of the brain waves generated when an individual focuses on a particular word, whether hearing it spoken or simply being asked to think it. The eventual hope is to develop software that can interpret such brain activity reliably enough to restore speech to disabled individuals like the physicist Stephen Hawking, who must rely on a crude, sight-oriented computer system to slowly formulate words and sentences.
Arguably, as technology capable of recognizing the connections between distinct abstract thoughts and speech improves, devices will eventually exist that allow “speechless” communication using the faculties of the mind alone. But might the same technology also enable simple levels of communication for animals?
Many would argue that this is simply impossible, since most animals lack both the depth of thought and the capacity for abstract thinking that humans rely on to formulate complex communication. I would argue, however, that certain rudimentary levels of animal communication might indeed be accessible through such interfaces, because simple commands issued by the animal brain may nonetheless have what humans would recognize as conceptual, or even emotional, counterparts.
To explain, I’ll use a dog as an example. The family mutt may whine, growl, or pant to indicate that there is something he or she wants. A short, stern bark may be used to get one’s attention, while a similar “command” may signal a desire to be let outside. A timid whine, or a posture associated with sadness, shame, or displeasure, may become distinct enough for the dog’s master to begin reading the animal’s mood in other situations as well. Altogether, owners who are particularly close to their pets will, with time, begin to discern the associations between sounds, postures, and other behaviors and what the animal is feeling at a given moment.
But at other times, what an animal is feeling can be harder to discern. Having worked for a number of years as a professional dog handler and sitter, I know that every animal behaves differently. Even dogs I’ve grown very close to will occasionally act in ways I’m unaccustomed to, and it can be difficult to tell exactly what they’re after, or what they’re trying to convey. Granted, those experienced in working with animals know that many behaviors generalize “across the board,” so to speak: the tail-wagging of an excited or happy dog is a more or less universal display, with a few minor circumstantial exceptions. So what if the brain functions of animals that communicate desires and other concepts in this way could similarly be networked with brain-computer interfaces? Might this allow certain concepts to be expressed by an animal?
Imagine how annoying a sound-emitting device would be in this case. “Let me outdoors!” or “Feed me!” would sound off at all hours of the day like the incessant ringing of a broken alarm. On the other hand, a small LED display conveying the same messages silently might prove far less intrusive.