Elephants rarely get cancer: less than 5% of captive elephants die of cancer, compared to around 20% of humans. Elephant genomes have at least 20 copies of the tumour suppressor gene p53, which may explain their low cancer rates relative to humans, who have only one copy.
The number of times I’ve tripped over or hit something while walking around in the dark is unfathomable; it’s got to the point where I’ve somewhat accepted my inability to manoeuvre in the dark. However, I am fortunate enough to be able to eliminate this problem by simply turning on a light switch. Yet what about the people who live in constant darkness? Would they ever truly be independent if they cannot see any of their obstacles? Perhaps if you had asked me this before, I would have agreed that blind people wouldn’t be able to achieve true independence without some sort of technological advancement allowing them to see. Of course, that’s where my thought process was flawed. I had reduced the idea of spatial awareness to simply using the eyes. Many animals, such as bats and dolphins, use echolocation (biosonar) as a way to locate and identify different objects. They do this by emitting a call and then listening for the echoes that return from the various objects around them. Dolphins use this technique to see better underwater.
Blind humans have also been known to use echolocation. By making a sound of some sort, such as clicking noises with their mouth, they are able to accurately judge where an object is and how big it is by interpreting the sound waves that have been reflected off it.
If the echo returning from one side is louder and arrives sooner than from the other, the sound has taken a shorter route, indicating a closer object or obstacle on that side. Functional magnetic resonance imaging (fMRI) studies were conducted on blind echolocation experts, with sighted humans as controls. The studies demonstrated that the primary visual cortex is activated during echolocation in these blind experts, while there was no such activation in the control subjects. This is unusual because the primary visual cortex is normally used to process visual information in sighted people. Moreover, the parts of the brain associated with processing auditory information showed little difference in activity during echolocation in either group. This implies that the auditory-processing areas of the brain are not necessarily what is vital when blind people carry out echolocation. Since the original function of the primary visual cortex is not very useful in blind people, it seems to have rearranged itself so that it can process spatial information from echolocation.
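The physics behind these cues can be sketched in a few lines of code. This is a minimal illustration, not a model of how the brain actually does it: it assumes a fixed speed of sound in air (roughly 343 m/s at room temperature) and uses the round-trip rule that an echo delayed by t seconds comes from an object about (speed × t) / 2 metres away, together with a simple left/right loudness comparison. All function and variable names here are my own for illustration.

```python
SPEED_OF_SOUND = 343.0  # metres per second in air at ~20 °C (assumed constant)

def echo_distance(delay_s, speed=SPEED_OF_SOUND):
    """Estimate distance to an object from the round-trip echo delay.

    The sound travels out to the object and back, so the one-way
    distance is half of speed * delay.
    """
    return speed * delay_s / 2

def louder_side(left_amplitude, right_amplitude):
    """Guess which side an object is on by comparing echo loudness at each ear."""
    if left_amplitude > right_amplitude:
        return "left"
    if right_amplitude > left_amplitude:
        return "right"
    return "ahead"

# An echo heard 20 milliseconds after the click:
print(echo_distance(0.02))          # 3.43 (metres)
# A much louder echo in the left ear suggests an obstacle on the left:
print(louder_side(0.9, 0.4))        # left
```

A 20 ms delay corresponds to an object roughly 3.4 metres away, which gives a feel for how fine the timing differences are that expert echolocators interpret.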
This ability to rearrange and adapt the functions of different parts of the brain is known as neuroplasticity.
Daniel Kish is a blind echolocator who teaches other blind people to use echolocation so that they can be ‘active participants in society’; here is a link to his TED Talk, where he explains how he uses echolocation to achieve independence and how even sighted people can be trained to use it too. https://www.ted.com/talks/daniel_kish_how_i_use_sonar_to_navigate_the_world
Year 12 at Boston Grammar School
My name is Rayna Koshy and I am in Year 12 studying Chemistry, Physics, Biology and Maths at A level. I enjoy the science subjects and am an affiliate of the Royal Society of Biology. I'm very passionate about healthcare and volunteer at a local care home. In the future, I'm hoping to study medicine.